Exploring and Modifying the Sense of Time in Virtual Environments
VIRTUALTIMES combines insights from psychology, psychopathology, cognitive neuroscience, engineering, computer science and philosophy.
To do this, we use a range of different technologies and collect a rich array of empirical data including experimental, behavioral, physiological and neural data.
At the IGPP we are testing changes in time perception and the induction of flow states during video gaming. For this we use games, such as Thumper (Drool 2016), that can be played both on a screen and in virtual reality (VR). A set of questionnaires is employed to assess participants' subjective states during gaming and to compare them to behavioral success in the game.
Complementing our research on flow, we are conducting an experiment on boredom in virtual environments. To this end, we have created a VR waiting room that emulates a real waiting room at the IGPP. The real room was previously used in an experiment that will now be repeated in the VR room, allowing us to compare the effects of waiting on participants' subjective experience in real and virtual settings.
We are conducting studies with electroencephalography (EEG) to search for neural correlates of subjective gaming experience, such as the felt passage of time and flow. EEG is a non-invasive electrophysiological monitoring method for recording and exploring the spontaneous electrical activity of the brain. With electrodes placed along the participant's scalp, the system measures voltage fluctuations resulting from neuronal activity in the brain.
The UH team has a fully equipped VR laboratory, with an electrically shielded room and two 40-channel Brain Products QuickAmp 40 amplifiers for measuring EEG, facial EMG, EDA, and ECG signals. For the implementation of immersive VR experiences, the lab has pairs of Oculus Rift and HTC Vive head-mounted displays with eye-tracking devices integrated into the headsets. In addition, we have built a desktop setup with motion-tracking sensors that allows the participant's body to be projected into the VR environment. The VR environments are created using the Unity and Unreal engines. Finally, our lab has created a diverse set of highly realistic virtual human agents with emotional facial expressions and other types of nonverbal emotional behavior (e.g., the ability to touch the participant via a haptic link), allowing the investigation of time perception in social interactions.