Bringing a Real Human into a VR Environment

Oculus + Kinect prototype

Prototype of an Oculus Rift and Kinect-like depth sensor integration (image from http://www.roadtovr.com).

Background: 

The Microsoft Kinect and similar depth cameras have been available for about two years. These sensors let users control and interact with a game scene without touching a game controller, through a natural user interface based on gestures and full-body movement. Kinect-like depth sensors can therefore be regarded as full-body markerless motion capture (MoCap) devices. Our lab recently received developer kits of the latest head-mounted display (HMD), the Oculus Rift. It provides a 1280x800 display and contains a 3-axis accelerometer that can be used to detect head orientation.

Project idea:

The objective of this project is to integrate a Kinect-like depth sensor with the Oculus Rift HMD. The depth sensor, fixed in front of the Oculus Rift HMD, detects real-time human movement in the real world, which is then mapped onto a virtual character (e.g. a cartoon character) in a virtual environment. The resulting virtual environment, including the character's movement, is rendered from a first-person perspective in the Oculus Rift HMD. For example, a subject wearing the Kinect + Oculus Rift setup perceives the real person in front of him through different character representations. As an advanced goal, two subjects, each wearing an HMD + Kinect set, would perceive each other in the same virtual environment and carry out a collaborative task.
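As a rough illustration of the mapping step, the sketch below shows how tracked joint data could drive the avatar's bones and how the Rift's head orientation could drive the rendered view. The script, its method name, and the way the sensor data arrives are assumptions for illustration only; the actual integration will depend on the depth-sensor SDK that is chosen.

using UnityEngine;

// Minimal sketch of the mapping step, assuming some sensor plugin (not
// specified here) delivers joint positions and the Rift head orientation
// each frame by calling UpdateFromSensor(). All names below are
// illustrative placeholders, not part of any real SDK.
public class BodyToAvatarMapper : MonoBehaviour
{
    public Transform avatarHead;      // head bone of the virtual character
    public Transform avatarLeftHand;  // left hand bone
    public Transform avatarRightHand; // right hand bone
    public Transform hmdCamera;       // camera rendered into the Oculus Rift

    // Called by the sensor integration layer once per tracked frame.
    public void UpdateFromSensor(Vector3 head, Vector3 leftHand,
                                 Vector3 rightHand, Quaternion headOrientation)
    {
        // Map the tracked joints onto the avatar's bones.
        avatarHead.position = head;
        avatarLeftHand.position = leftHand;
        avatarRightHand.position = rightHand;

        // Head orientation from the Rift's inertial sensors orients the
        // first-person view presented in the HMD.
        hmdCamera.rotation = headOrientation;
    }
}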

Developments:

The development of this project is based on the interactive rendering engine Unity 3D version 4.x together with the Oculus Rift SDK. The latest version of Unity ships with a new animation system, Mecanim, a powerful and flexible system that brings human and non-human characters to life. Unity also exposes the Mecanim API, which will be used together with Inverse Kinematics (IK) in this project.
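For the IK part, the sketch below uses Unity's standard Mecanim IK callback (OnAnimatorIK together with Animator.SetIKPosition and Animator.SetIKPositionWeight) to pull a humanoid avatar's hands towards target transforms, which in this project would be fed from the depth-sensor joints. The target fields are placeholders; the Animator calls are part of the Unity 4.x Mecanim API and require a humanoid rig with the "IK Pass" option enabled on the animator layer.

using UnityEngine;

// Sketch of driving a Mecanim humanoid with the built-in IK goals.
// The hand targets would be set from the tracked depth-sensor joints.
[RequireComponent(typeof(Animator))]
public class AvatarIKController : MonoBehaviour
{
    public Transform leftHandTarget;   // e.g. tracked left-hand joint
    public Transform rightHandTarget;  // e.g. tracked right-hand joint

    private Animator animator;

    void Start()
    {
        animator = GetComponent<Animator>();
    }

    // Unity calls this during the IK pass of the Mecanim update.
    void OnAnimatorIK(int layerIndex)
    {
        if (leftHandTarget != null)
        {
            animator.SetIKPositionWeight(AvatarIKGoal.LeftHand, 1.0f);
            animator.SetIKPosition(AvatarIKGoal.LeftHand, leftHandTarget.position);
        }
        if (rightHandTarget != null)
        {
            animator.SetIKPositionWeight(AvatarIKGoal.RightHand, 1.0f);
            animator.SetIKPosition(AvatarIKGoal.RightHand, rightHandTarget.position);
        }
    }
}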

Requirement:

3D character animation
Programming in Unity3D (C# or JavaScript)

Contact:

Ronan BOULIC (Ronan.boulic@epfl.ch, INJ 141)

Nan WANG (nan.wang@epfl.ch, INJ 138)