Deep Reality

MIT Media Lab

Deep Reality is an interactive virtual reality (VR) experience that uses biometric information for reflection and relaxation. We monitor brain activity in real time using a modified Muse EEG headband, and track heart rate (HR) and electrodermal activity (EDA) using an Empatica E4 wristband. From this data we procedurally generate 3D creatures and adjust the lighting of an underwater audiovisual composition so that the scene reflects the viewer's internal state. The creatures are designed to unconsciously influence the observer's body signals through subtle pulses of light, movement, and sound: almost imperceptible light flickering, sound pulsations, and slow creature movements aim to decrease heart rate and respiration, increasing relaxation.
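A minimal sketch of the kind of biofeedback mapping described above, assuming a simple scheme in which a creature's light pulses slightly slower than the measured heart rate to gently guide the viewer toward a slower rhythm. The function names, the `slowdown` factor, and the flicker depth are hypothetical illustrations, not the authors' implementation.

```python
import math

def pulse_frequency_hz(heart_rate_bpm: float, slowdown: float = 0.9) -> float:
    """Return a light-pulse frequency slightly below the heart rate.

    heart_rate_bpm: current heart rate in beats per minute.
    slowdown: fraction of the heart rate to pulse at (hypothetical value).
    """
    return (heart_rate_bpm / 60.0) * slowdown

def light_intensity(t: float, freq_hz: float, depth: float = 0.1) -> float:
    """Sinusoidal brightness in [1 - depth, 1]: a subtle, shallow flicker."""
    return 1.0 - depth * 0.5 * (1.0 + math.sin(2.0 * math.pi * freq_hz * t))

# Example: at 72 bpm (1.2 Hz) the creature would pulse at 1.08 Hz,
# just under the viewer's heart rate.
freq = pulse_frequency_hz(72.0)
```

In a full system the same idea would extend to sound pulsations and creature movement speed, with EDA and EEG features modulating scene lighting in the same direction.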

Amores, J., Fusté, A., and Richer, R. "Deep Reality: Towards Increasing Relaxation in VR by Subtly Changing Light, Sound and Movement Based on HR, EDA, and EEG." Extended Abstracts of the 2019 CHI Conference on Human Factors in Computing Systems. ACM, 2019.

Fernandez, J. A., Maes, P., Fusté, A., and Richer, R. "Deep Reality: An Underwater VR Experience to Promote Relaxation by Unconscious HR, EDA, and Brain Activity Biofeedback." ACM SIGGRAPH 2019 Virtual, Augmented, and Mixed Reality. ACM, 2019.