Exploring and Modifying the Sense of Time in Virtual Environments
VIRTUALTIMES combines insights from psychology, psychopathology, cognitive neuroscience, engineering, computer science and philosophy.
To do this, we use a range of different technologies and collect a rich array of empirical data including experimental, behavioral, physiological and neural data.
At the IGPP we are testing changes in time perception and the induction of flow states during video gaming. For this we use games that are played both on a screen and in virtual reality (VR), such as Thumper (Drool 2016). A set of different questionnaires is employed to assess participants' subjective states during gaming and to compare them with behavioral success in the game.
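As a minimal sketch of such a comparison, one could correlate per-participant questionnaire ratings with in-game performance. The data values and variable names below are invented placeholders, not project results; the correlation is computed in plain Python for illustration.

```python
from statistics import mean

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length samples."""
    mx, my = mean(x), mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    var_x = sum((a - mx) ** 2 for a in x)
    var_y = sum((b - my) ** 2 for b in y)
    return cov / (var_x * var_y) ** 0.5

# Hypothetical example data: mean flow ratings from a questionnaire
# and final game scores for six participants.
flow_ratings = [3.1, 4.5, 2.8, 4.9, 3.7, 4.2]
game_scores = [5200, 8100, 4300, 9000, 6100, 7600]

r = pearson_r(flow_ratings, game_scores)
print(f"flow vs. performance: r = {r:.2f}")
```

A positive r under this toy data would be consistent with the common finding that stronger flow experiences accompany better performance, though the real analysis would of course use the collected study data and appropriate significance testing.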
Complementing our research on flow, we are conducting an experiment on boredom in virtual environments. To this end, we have created a VR waiting room that emulates a real waiting room at the IGPP. The real room has previously been used in an experiment that will now be conducted in the VR room in order to compare the effects that waiting has on the subjective experience of participants in real and virtual scenarios.
We are conducting studies with electroencephalography (EEG) to search for neural correlates of subjective gaming experience, such as the felt passage of time and flow. EEG is a non-invasive electrophysiological monitoring method for recording and exploring the spontaneous electrical activity of the brain. With electrodes placed along the participant's scalp, the system measures voltage fluctuations resulting from the activity of neurons in the brain.
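A common first step when relating such recordings to subjective experience is to extract spectral band power (e.g., alpha, 8-12 Hz) from an EEG channel. The sketch below is illustrative only: the sampling rate, band limits, and synthetic signal are assumptions, and real analyses would use an established toolbox rather than a raw FFT.

```python
import numpy as np

def band_power(signal, fs, f_lo, f_hi):
    """Mean power of `signal` in the [f_lo, f_hi] Hz band via the FFT."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2 / len(signal)
    mask = (freqs >= f_lo) & (freqs <= f_hi)
    return psd[mask].mean()

# Synthetic one-second "EEG" trace: a 10 Hz alpha rhythm plus noise.
fs = 250  # sampling rate in Hz (illustrative)
rng = np.random.default_rng(0)
t = np.arange(fs) / fs
eeg = np.sin(2 * np.pi * 10 * t) + 0.2 * rng.standard_normal(fs)

alpha = band_power(eeg, fs, 8, 12)   # alpha band
beta = band_power(eeg, fs, 13, 30)   # beta band
print(f"alpha power: {alpha:.2f}, beta power: {beta:.2f}")
```

For this synthetic trace the alpha band dominates by construction; in a study, such band-power features computed per trial could then be tested against questionnaire ratings of flow or time perception.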
The UH team has a fully equipped VR laboratory, with an electrically shielded room and two 40-channel Brain Products QuickAmp 40 amplifiers for measuring EEG, facial EMG, EDA, and ECG signals. For the implementation of immersive VR experiences, the lab has pairs of Oculus Rift and HTC Vive head-mounted displays with eye-tracking devices integrated into the headsets. In addition, we have built a desktop setup with motion tracking sensors that allows projection of the participant's body into the VR environment. The VR environments are created using the Unity and Unreal engines. Finally, our lab has created a diverse set of highly realistic virtual human agents with facial emotional expressions and other types of nonverbal emotional behaviors (e.g., the ability to touch the participant via a haptic link) to allow investigation of time perception in social interactions.
For over 20 years, the HCI group has explored novel forms of human-computer interaction using Virtual Reality, Augmented Reality, and Mixed Reality, taking into account the requirements that the physical, cognitive, and perceptual skills of users place on such systems. In total, the scientific findings have resulted in more than 300 publications and several best paper and best poster awards. Currently, more than 50 people are working on over 30 projects.
Challenges range from the understanding and design of high-level concepts and models of human cognition, communication, and collaboration to the development of engineering principles and techniques necessary for the creation of rich, interactive, and intelligent user interfaces for computerized real, virtual, and blended media environments.
The group’s facilities include more than 15 laboratories, which are fully equipped with state-of-the-art hardware and software. These feature various body tracking systems, a scanner, two Cave Automatic Virtual Environments (CAVEs), several Head-Mounted Displays (HMDs) and multiple tailored touch displays.