“Was that me?” Human perception of guided interaction

 

Background

When producing a movement, the central nervous system maintains a copy of the motor command (efference copy) used to predict the sensory outcome of the movement, which is then compared with the sensed consequences (re-afference) of that movement. This comparison mechanism helps distinguish self-produced movements from those of other agents, and it can adapt to accommodate consistent distortions of sensory input. Furthermore, studies that experimentally manipulated the agreement between (re-afferent) sensory signals have shown that vision dominates the other senses; that is, discrepant sensory feedback (e.g. visuo-proprioceptive and visuo-vestibular) tends to be resolved in favor of vision.

Therefore, by manipulating the visual feedback of a movement we can make subjects believe that the movement they see is the movement they performed, provided that the sensory incongruence and the difference between re-afference and efference copy remain unnoticeable. Such manipulations are useful in Virtual Reality to overcome technological limitations, such as a limited tracking space or the lack of haptic feedback.
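
As a minimal illustration (an assumption for this description, not the provided platform), one such manipulation is a small translation gain on the hand: the virtual hand travels slightly farther than the real one, letting the user reach beyond the physical tracking space while the discrepancy remains plausible as long as the gain stays close to 1. All names below (trackedHand, virtualHand, gain) are illustrative.

using UnityEngine;

public class HandTranslationGain : MonoBehaviour
{
    public Transform trackedHand;   // pose coming from the motion tracking system
    public Transform virtualHand;   // hand of the avatar rendered in the HMD
    public float gain = 1.1f;       // 1.0 = faithful mapping; >1 amplifies the movement

    private Vector3 origin;         // reference point from which displacement is scaled

    void Start()
    {
        origin = trackedHand.position;   // e.g. calibrated at a resting pose
    }

    void LateUpdate()
    {
        // Scale the real displacement before applying it to the visible hand.
        Vector3 displacement = trackedHand.position - origin;
        virtualHand.position = origin + gain * displacement;
        virtualHand.rotation = trackedHand.rotation;
    }
}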

 

Fig. 1: (left) Example of a movement redirected towards a target; the movement that was actually performed is shown in red. (right) ISO multidirectional pointing task.

Project Idea

This project proposes using virtual reality to measure perception thresholds of distorted movements in an ISO-standard multidirectional reaching task. More specifically, tolerance to three types of movement distortion is to be measured: (i) guiding movements – the task is made easier; (ii) constricting movements – the task is made harder; (iii) orthogonally distorted movements – nearly irrelevant to task difficulty.
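
A sketch of how the three distortion types could be expressed as an offset of the visible hand relative to the hand–target direction is given below. This is an assumption about the implementation, not the provided platform; all names and magnitudes are illustrative.

using UnityEngine;

public enum DistortionType { None, Guiding, Constricting, Orthogonal }

public class MovementDistortion : MonoBehaviour
{
    public Transform trackedHand;     // pose from the motion tracking system
    public Transform virtualHand;     // avatar hand rendered in the HMD
    public Transform target;          // current reaching target
    public DistortionType type = DistortionType.Guiding;
    public float amount = 0.02f;      // metres of displacement applied to the visible hand

    void LateUpdate()
    {
        Vector3 toTarget = (target.position - trackedHand.position).normalized;
        // Any direction perpendicular to the reaching direction serves for the orthogonal case.
        Vector3 sideways = Vector3.Cross(toTarget, Vector3.up).normalized;

        Vector3 offset = Vector3.zero;
        switch (type)
        {
            case DistortionType.Guiding:      offset =  amount * toTarget; break; // task made easier
            case DistortionType.Constricting: offset = -amount * toTarget; break; // task made harder
            case DistortionType.Orthogonal:   offset =  amount * sideways; break; // difficulty nearly unchanged
        }

        virtualHand.position = trackedHand.position + offset;
        virtualHand.rotation = trackedHand.rotation;
    }
}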

 

Student work: 

1. The virtual body motion control platform will be provided, but some modifications to the implemented distortions may have to be made.

2. Implement the task.

3. Logging scripts are provided, but it is likely that they will have to be adapted to accommodate proposed measurements.

4. Conduct an experiment.

5. Analyze collected data – perception threshold (“did the avatar move like you?”) and performance (coefficients of a Fitts’s law fit and throughput); see the sketch after this list.
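
For the performance analysis, a minimal, self-contained sketch of a Fitts’s law fit and throughput computation over logged trials is given below. It uses the Shannon formulation ID = log2(D/W + 1), fits MT = a + b·ID with ordinary least squares, and reports throughput as the mean of ID/MT; the effective-width correction is omitted for brevity, and the Trial structure is assumed rather than taken from the provided logging scripts.

using System;
using System.Linq;

public struct Trial
{
    public double Distance;       // D: distance to the target (m)
    public double Width;          // W: target width (m)
    public double MovementTime;   // MT: time to acquire the target (s)
}

public static class FittsAnalysis
{
    // Shannon formulation of the index of difficulty, in bits.
    public static double IndexOfDifficulty(Trial t) =>
        Math.Log(t.Distance / t.Width + 1.0, 2.0);

    // Ordinary least squares fit of MT = a + b * ID; returns the coefficients (a, b).
    public static (double a, double b) Fit(Trial[] trials)
    {
        double[] id = trials.Select(IndexOfDifficulty).ToArray();
        double[] mt = trials.Select(t => t.MovementTime).ToArray();
        double meanId = id.Average();
        double meanMt = mt.Average();
        double b = id.Zip(mt, (x, y) => (x - meanId) * (y - meanMt)).Sum()
                 / id.Sum(x => (x - meanId) * (x - meanId));
        double a = meanMt - b * meanId;
        return (a, b);
    }

    // Throughput in bits per second, averaged over trials.
    public static double Throughput(Trial[] trials) =>
        trials.Average(t => IndexOfDifficulty(t) / t.MovementTime);
}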

Development framework:

The development of this project is based on the Unity 3D game engine.

Hardware:

Head-mounted display – Oculus Rift DK2;

Motion tracking – PhaseSpace / PlayStation Move controllers

The student will develop a Unity package to allow easy integration of this functionality into any Unity project.

Requirement:

Programming (C# or JavaScript in Unity 3D).

Contact:

Ronan BOULIC (Ronan.boulic at epfl.ch   INJ141 )

Henrique GALVAN DEBARBA (henrique.galvandebarba at epfl.ch INJ139 )