Current smartphone Virtual Reality solutions fail to engage the user in a truly realistic way. They are mostly static experiences in which the environment moves around the user rather than the user moving through the environment. Additional peripherals, such as Google's Daydream controller, try to improve this static experience by adding motion-based input to make the user feel more immersed, but they still do not allow the user to move through the environment.
Step is designed to free the user and allow them to explore virtual worlds in a natural and intuitive way. The user is able to interact with, move through, and feel the environment surrounding them without wearing any additional hardware beyond the VR headset. The user's hand and arm motion is detected using a Microsoft Kinect while the user moves on an elliptical. Motion on the elliptical is translated into motion in the virtual environment. The user's hip movement is used to rotate the platform holding the elliptical, so that in-game turning corresponds to real-world rotation. The virtual environment is developed using the Unity game development suite and deployed to a smartphone placed inside the user's headset.
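The mapping from elliptical motion to in-game movement can be sketched as a simple velocity model: pedal cadence sets forward speed, and hip yaw sets heading. The sketch below is a hypothetical illustration in Python; the stride length, speed cap, and function names are assumptions, not the actual Step or Unity implementation.

```python
import math

# Hypothetical calibration constants -- the real values used by Step
# are not stated in the text; these are illustrative assumptions.
STRIDE_LENGTH_M = 0.5   # assumed virtual distance covered per pedal revolution
MAX_SPEED_MPS = 3.0     # assumed cap on in-game walking speed

def forward_speed(pedal_rpm):
    """Map elliptical pedal cadence (revolutions per minute) to in-game speed (m/s)."""
    speed = (pedal_rpm / 60.0) * STRIDE_LENGTH_M
    return min(speed, MAX_SPEED_MPS)

def step_displacement(position, heading_deg, pedal_rpm, dt):
    """Advance the player's 2-D position along the current heading over dt seconds.

    heading_deg is the yaw derived from the user's hip orientation:
    0 degrees faces +y, 90 degrees faces +x.
    """
    speed = forward_speed(pedal_rpm)
    heading = math.radians(heading_deg)
    x, y = position
    return (x + speed * dt * math.sin(heading),
            y + speed * dt * math.cos(heading))
```

For example, pedaling at 60 rpm while facing heading 0 would move the player 0.5 m forward per second under these assumed constants; in the real system the equivalent update would run each frame inside Unity's update loop.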