We collaborate closely with the UA Holodeck, a National Science Foundation MRI Track 2 Development project directed by Dr. Winslow Burleson. The UA Holodeck is an immersive, collaborative, virtual/physical research environment accelerating the convergence of science and innovation at the University of Arizona.


This project is developing a distributed instrument that seamlessly integrates the physical with the virtual, aiming to create a unique experiential supercomputer: an immersive, collaborative, virtual/physical research environment with unparalleled tools for intellectual and creative output, a Holodeck that moves Star Trek science fiction into scientific practice. The work should advance next-generation experiences in human interaction and the deep integration of virtual and physical settings, creating rich, actualizing environments that support research and the discovery of new paradigms. The flexible, modular, reconfigurable infrastructure will connect researchers, research facilities, and educational facilities across the university, as well as external researchers, communities, and industry partners worldwide. The instrument will enable the exploration of myriad research questions involving virtual environments, telepresence, collaborative engagement, and remote interaction, and will create a strong foundation for extended collaborations. The project integrates qualitative and quantitative assessment of affect and motivation with foundations of learning science, motion science, acoustics, modeling and simulation, robotics, and fabrication to improve research effectiveness and scalability in addressing real-world challenges.


The work is accomplished by integrating 3D printing to physically realize simulated forms, robots that allow virtual models to act on and interact with the physical world, haptics that give human collaborators a tactile sense of virtual objects, and physiological state monitors that aim to get "inside" the human experience. This "experiential supercomputing" serves as a paradigm for human collaboration that translates across disciplines, spanning computer science, engineering, music, psychology, nursing, radiology, applied mathematics, biology, and medicine. The effort transcends visualization to approach human collaboration in its totality and has the potential to transform our fundamental understanding of collaboration, learning, creativity, discovery, and innovation. The NSF Holodeck, developed through a collaboration between UArizona and NYU, will be a well-integrated software/hardware instrument incorporating visual, audio, and physical components and novel technologies that enhance social interactions (human-human, human-agent, and human-robot). By incorporating rapid prototyping and fabrication tools, this unique instrument fosters creative capacity and a tight coupling of interactive visual, audio, and physical experience. Its research capabilities support comprehensive capture of behavioral, physiological, affective, and cognitive data, along with real-time visualization and analysis of those data. By creating innovative environments that bridge simulation, cyberlearning, scientific visualization, human-computer interaction, and applied physical science research, the effort enables new types of science in which researchers from diverse disciplines can interact with theoretical models, real objects, robots, and agents, engendering insights not possible using 2D, 3D, or other currently available representations.


For students interested in joining the lab: it is possible to collaborate with the Sensor Lab through the Holodeck's Vertically Integrated Projects (VIP) program.

Find out more about the Holodeck VIP!