PROVOCATION
Get moving in VR! BirdBot grew out of an early Sandbox Collaboration using the Kinect to get good full-body interaction in virtual reality (rather than only moving or manipulating objects with controllers). It is also a response to one of our core research interests in this project: creating more physically active and stimulating virtual reality experiences.
The resulting prototype is what we call a "movement toy." We targeted a few movements specifically, including balance, level changes, and gross motor action (in this case, flapping the arms), but really any desired movement could become a mechanic of this "toy."
MAKING
We created a series of virtual environments for the Oculus Rift using a Kinect as our sensor. One of our creative interests was to see what happens when we start with a movement idea and let the virtual world grow from there: a movement creates a story, and the story creates the world. It was a very intuitive, emergent process that evolved through many iterations in the collaborative space between our minds/bodies. We had some fantastic brainstorming sessions with visual artist Isla Hansen about making a physical installation to experience while in VR and will continue that going forward. The nature imagery and the heron came from our discussions about de-centering the human and making non-mirrored interfaces.

When you put the headset on and enter the world of BirdBot, you are in a peaceful room with grids on the walls, yet it is filled with trees and your shadow is a heron. If you flap your arms, a hidden world is revealed; as you balance on one foot (a challenge in VR), you rise up into a bright pink tunnel where you can make music with light-up chimes. Finally, you enter a flyover world where you soar over a collage of compassionate landscapes created by students in our Teaching Clusters, including a tapestry of family photographs compiled from our research team.
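For readers curious how gestures like flapping or balancing can become mechanics, here is a minimal sketch in Python of the kind of heuristic involved, working from skeleton joint positions of the sort the Kinect reports. The joint names, thresholds, and frame format below are illustrative assumptions for explanation only, not the prototype's actual engine code:

    # Minimal sketch: detecting "flap" and "one-foot balance" from Kinect-style
    # skeleton joints. Joint names, thresholds, and the frame format are
    # illustrative assumptions, not the BirdBot engine code.

    from dataclasses import dataclass

    @dataclass
    class Joint:
        x: float  # meters, sensor space
        y: float
        z: float

    def is_flapping(prev: dict, curr: dict, dt: float, min_speed: float = 0.8) -> bool:
        """True if both hands are moving vertically faster than min_speed (m/s)."""
        def vertical_speed(name: str) -> float:
            return abs(curr[name].y - prev[name].y) / dt
        return (vertical_speed("hand_left") > min_speed and
                vertical_speed("hand_right") > min_speed)

    def is_balancing_on_one_foot(curr: dict, min_lift: float = 0.15) -> bool:
        """True if one ankle is at least min_lift (m) higher than the other."""
        return abs(curr["ankle_left"].y - curr["ankle_right"].y) > min_lift

    if __name__ == "__main__":
        # Two fake frames 1/30 s apart: hands rising, left foot lifted.
        prev = {"hand_left": Joint(-0.3, 1.0, 2.0), "hand_right": Joint(0.3, 1.0, 2.0),
                "ankle_left": Joint(-0.1, 0.1, 2.0), "ankle_right": Joint(0.1, 0.1, 2.0)}
        curr = {"hand_left": Joint(-0.3, 1.05, 2.0), "hand_right": Joint(0.3, 1.05, 2.0),
                "ankle_left": Joint(-0.1, 0.3, 2.0), "ankle_right": Joint(0.1, 0.1, 2.0)}
        print(is_flapping(prev, curr, dt=1 / 30))   # True: hands rising at ~1.5 m/s
        print(is_balancing_on_one_foot(curr))       # True: left ankle lifted 0.2 m

In practice the thresholds are tuned by watching people play, and the same pattern (compare a few joints against a simple rule each frame) extends to other movements you might want to turn into mechanics.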
REFLECTION
As always in the iterative design process, some of the things we tried out but didn't use provided us with fun learning experiences and made the work stronger. Computer recognition of particular motions is a long-standing challenge, but the Kinect has made things easier, and it is fantastic to see people moving, laughing, and feeling good in VR.
Further reflection by Alice Grishchenko at http://www.humanetechosu.org/humaneblog/2017/5/12/bird-thoughts
Collaborators: Norah Zuniga Shaw (Dance, Principal Investigator); Alice Grishchenko (Lead Designer); Isla Hansen (Art); Maria Palazzi (Design), and students in Palazzi's Design 6400 class: Breanne Butters, Stacey Sherrick, Sarah Lawler, Zachary Winegardner, Kevin Bruggeman, Devin Ensz, Bruce Evans, Dreama Cleaver, Kien Hong. Demo Location: SIM Lab @ accad.osu.edu.