Research Labs

Our Human-Centered Computing PhD program is home to 11 cutting-edge research labs redefining how people and technology connect. Our labs pioneer breakthroughs in VR, AR, and XR experiences; design intelligent agents and dialogue-driven tutors that learn from and adapt to users; and push the boundaries of child–computer interaction and K–12 computing education. We lead advances in immersive visualization and analytics, safeguard privacy and trust in digital systems, and create lifelike virtual humans, spatial audio environments, and health-focused simulations that improve lives. What sets us apart is our impact beyond the lab: our researchers co-design and collaborate with educators, clinicians, policymakers, and communities to deliver technologies that not only work in theory but transform the way people live, learn, and thrive.
Research Categories:
Across our Human-Centered Computing division, our 11 labs span the following research areas, along with a cross-cutting machine learning effort that powers modeling, perception, adaptation, and evaluation across all of them:
- VR/AR/XR (virtual humans, AR task guidance, immersive analytics, spatial audio)
- Child–computer interaction (natural user interfaces for children and families)
- Human–AI interaction and intelligent agents
- Learning sciences and computer science education
- Visualization and visual analytics
- Privacy, security, and trust
- Accessibility and equity
- Audio and sonification
- Health and well-being
The Computing for Social Good Lab (CFSG) designs, builds, and evaluates computational technologies as they relate to the human condition, and reflects on how these technologies affect society. Addressing national social issues within the CFSG Lab enables us to work with various departments, such as political science, psychology, business, engineering, sociology, history, and athletics. We also collaborate beyond the university’s borders, working with state and federal government offices and organizations. Our research covers a variety of areas, including voting and elections technologies, fairness and bias in AI, advanced learning technologies, culturally relevant computing (ethnocomputing), privacy and security, usability, and accessibility. Our goal is to build innovative solutions to real-world problems by integrating people, information, culture, policy, and technology.
Director: Juan E. Gilbert, Ph.D.
The Embodied Learning & Experience (ELX) Lab conducts research in two broad areas: learning technologies and technologies for health and well-being. The guiding principle of the lab is to conduct research for real users and in authentic settings. The ELX lab engages in human-centered design and evaluation of both existing and new technologies. Example projects include wearable technologies for learning, Maker-based learning, culturally-relevant computing in education, and motion tracking in learning environments. The ELX group is diverse and very interdisciplinary.
Director: Sharon Lynn Chu, Ph.D.
We are an enthusiastic group of faculty, graduate, and undergraduate researchers in the Computer & Information Science & Engineering department at the University of Florida focused on creating fun and engaging learning experiences for K-12 and undergraduate students. We hold degrees in Computer Science, Computer Engineering, Human Centered Computing, Mechanical Engineering, Mathematics, Learning Sciences, and Early Childhood Education. We work in collaboration with researchers and practitioners in Education, Digital Arts and Sciences, and State Departments of Education. We conduct research in Computer Science Education (CS Ed), Learning Technology Design & Evaluation, and Curriculum Development & Assessment.
Director: Christina Gardner-McCune, Ph.D.; Co-Director: Jeremiah Blanchard, Ph.D.
The Intelligent Agents Research Group (IARG) explores how intelligent agents can be designed, developed, and applied to support human learning and collaboration. Our work ranges from building agent architectures and decision-making models to deploying agents in real-world educational settings. We build and study agents that adapt to learners and provide intelligent feedback through tutoring. We also design agent-based training environments for interpersonal skills such as collaborative problem solving and negotiation, equipping learners with strategies to work together effectively. In addition, we develop resources for low-resource languages, extending the reach of agent technologies to global communities. We advance the science of intelligent agents while designing technologies that make education more adaptive and impactful.
Director: Emmanuel Dorley, Ph.D.
The Intelligent Natural Interaction Technology (INIT) lab focuses on advanced interaction technologies such as touch, speech, and gesture, especially for children in the context of educational interfaces. INIT Lab projects advance human-computer interaction (HCI) research questions of how users want to interact with these natural modalities, and computer science research questions of how to build recognition algorithms that can understand user input in these ambiguous modalities. Research in the INIT Lab both integrates and contributes to research in human-computer interaction, child-computer interaction, multimodal interaction, machine learning and artificial intelligence, cognitive science, and interaction design.
Director: Lisa Anthony, Ph.D.
The Interactive Data and Immersive Environments (Indie) research lab conducts human-centered research on interactive visualizations. The Indie lab focuses on the design and evaluation of applications and techniques that support effective interaction with and understanding of data, information, and virtual environments. Indie research lies under the umbrella of human-computer interaction (HCI), and projects address topics including information visualization, visual analytics, virtual reality, and 3D interaction.
Director: Eric Ragan, Ph.D.
The Jain Lab works on problems at the intersection of computer graphics, virtual reality, user attention modeling, and privacy and security.
Director: Eakta Jain, Ph.D.
The LearnDialogue group focuses its research on dialogue for teaching and learning. The group builds computational models of dialogue and learning, and these models drive the adaptivity of intelligent systems. They also design, develop, and investigate learning environments for individuals and collaborative groups, including tutorial dialogue, intelligent tutoring, virtual agents for learning, and game-based learning. Additionally, the LearnDialogue group aims to transform the way that young learners experience computer science and engages in evidence-based multidisciplinary research and computer science curriculum development.
Director: Kristy Elizabeth Boyer, Ph.D.
The Ruiz Human-Computer Interaction Lab at the University of Florida focuses on tackling problems in the field of human-computer interaction. The lab’s current research focuses on developing new Natural User Interfaces (NUI) that support non-verbal communication in novel ways and developing new gestural interactions for virtual reality and augmented reality.
Director: Jaime Ruiz, Ph.D.
The SoundPAD Lab focuses on the Perception, Application, and Development of 3D audio systems. The lab’s research lies at the intersection of human-centered computing, electrical engineering, and auditory perception. As virtual reality (VR) and augmented reality (AR) become more commonplace, significant research is needed to investigate 3D audio’s role in the systems of the future. Broadly, the lab’s goal is to elucidate the human factors that must be taken into account in the design and use of 3D audio systems.
Director: Kyla McMullen, Ph.D.
The Virtual Experiences Research Group (VERG) is on a mission to transform how people learn, connect, and grow through experiences with virtual humans. We explore the cutting edge of virtual reality, mixed reality, and AI-driven characters to design training tools that don’t just teach skills; they inspire empathy, improve communication, and change lives. Our current focus is helping people navigate emotionally complex, high-stakes interpersonal scenarios by simulating real-world interactions in safe, controlled virtual environments. From tough patient conversations to high-pressure team dynamics, we use technology to prepare students for the human side of healthcare. VERG is a highly interdisciplinary group. Students work closely with experts in medicine, nursing, communication, psychology, speech pathology, and education, gaining hands-on experience while contributing to impactful research that directly affects practice and policy. If you’re passionate about building technologies that move hearts and minds, and want to be part of a collaborative team making a difference, come explore more at our website.
Director: Benjamin Lok, Ph.D.