This ongoing project investigates the application of immersive theatre and improvisation-based devising methods to the development of room-scale virtual reality experiences.
The project allows a participant to put on a Vive head-mounted display and interact in real time with a virtual environment and a virtual avatar performed by a live actor. Each environment is associated with a rough story idea, and the participant can improvise interactions and dialogue with the live actor. Some variations of this setup also introduce an additional character pre-recorded/captured by the same or another actor. Most environments include physical props whose positions match those of certain virtual objects, creating the possibility of haptic feedback. Some of the props carry attached optical markers and can therefore be physically manipulated. Through this work we seek a better understanding of how to develop innovative VR experiences that involve co-presence and cooperation among multiple participants and haptics based on real objects, with foreseeable applications in the arts as well as in education, various types of training, and multi-player simulation.
The technical setup takes place inside a 40x40’ volume with a 20x20’ trackable area and a projection screen for the audience and the actor. Physical furniture and props provide haptic feedback for the participant. Besides the furniture and the screen, spike-tape marks on the floor guide the actor. We combine optical tracking of the live actor and physical props with tracking of the HTC Vive headset and controllers via the Lighthouse base stations. Vicon Blade, in combination with Unity 3D or MotionBuilder, is used for prototyping and developing the experience. Immersive sound is optionally used in the experiences, allowing the participant to hear the actor’s voice through headphones as the actor speaks through a wireless microphone.
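Combining Vicon optical tracking with Lighthouse-based Vive tracking requires registering the two coordinate spaces. The paper does not specify how this registration is done; one common approach is to record corresponding calibration points in both systems and solve for the rigid transform with the Kabsch algorithm. A minimal sketch in Python/NumPy, where the function name and sample data are illustrative assumptions:

```python
import numpy as np

def rigid_align(vicon_pts, vive_pts):
    """Estimate the rigid transform (R, t) mapping Vicon-space points
    onto Vive-space points via the Kabsch algorithm.
    Both inputs are (N, 3) arrays of corresponding calibration points."""
    a = np.asarray(vicon_pts, dtype=float)
    b = np.asarray(vive_pts, dtype=float)
    ca, cb = a.mean(axis=0), b.mean(axis=0)   # centroids of each point set
    h = (a - ca).T @ (b - cb)                 # 3x3 cross-covariance matrix
    u, _, vt = np.linalg.svd(h)
    d = np.sign(np.linalg.det(vt.T @ u.T))    # guard against a reflection
    r = vt.T @ np.diag([1.0, 1.0, d]) @ u.T   # optimal rotation
    t = cb - r @ ca                           # translation
    return r, t

# Synthetic check: a Vive frame rotated 90 degrees about the vertical
# axis and shifted relative to the Vicon frame.
theta = np.pi / 2
r_true = np.array([[np.cos(theta), 0.0, np.sin(theta)],
                   [0.0, 1.0, 0.0],
                   [-np.sin(theta), 0.0, np.cos(theta)]])
t_true = np.array([0.5, 0.0, -1.2])
vicon = np.array([[0.0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1]])
vive = vicon @ r_true.T + t_true
r, t = rigid_align(vicon, vive)
print(np.allclose(r, r_true), np.allclose(t, t_true))  # prints: True True
```

Once (R, t) is estimated, Vicon-tracked prop and actor positions can be transformed into the Vive coordinate frame before being applied to objects in the virtual scene.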