
Eye-Tracking in Virtual Reality


If you have a question about this talk, please contact David MacKay.

Collaborative virtual environments (CVEs) allow co-located or remote participants to communicate within a rich social, spatial and informational context. Co-presence amongst remote participants is achieved by giving each local site access to a shared virtual environment.

We have developed the first CVE system, EyeCVE, that provides eye-tracking for each participant and uses it to drive avatar gaze in real time in a CAVE-like environment. Remote participants can see and interact with their counterparts at the other sites in the form of avatars, while changes to the virtual environment are updated in real time. In our research we are interested in analyzing how gaze affects communication in immersive CVEs. One of the methods used is ‘conversation analysis’, which exposes systematic practices, verbal and non-verbal, during human interaction and analyzes how actions are routinely carried out in this context [Murgia et al., IEEE DS-RT ’08] [Wolff et al., IEEE DS-RT ’08].
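As a rough illustration only (not the EyeCVE implementation, whose details the abstract does not give), driving an avatar's gaze from a tracked eye direction amounts to converting each head-frame gaze sample into eye-joint rotations that are then applied to the remote avatar. A minimal sketch, assuming the tracker reports a normalized gaze direction in the viewer's head coordinate frame:

```python
import math
from dataclasses import dataclass

@dataclass
class GazeSample:
    """Hypothetical per-frame tracker output: a normalized gaze
    direction in the viewer's head frame (x right, y up, z forward)."""
    x: float
    y: float
    z: float

def gaze_to_avatar_angles(sample: GazeSample) -> tuple[float, float]:
    """Convert a head-frame gaze direction into (yaw, pitch) in radians,
    which could be applied to a remote avatar's eye joints each frame."""
    yaw = math.atan2(sample.x, sample.z)           # left/right rotation
    pitch = math.asin(max(-1.0, min(1.0, sample.y)))  # up/down rotation
    return yaw, pitch
```

For example, a gaze sample pointing straight ahead, `GazeSample(0.0, 0.0, 1.0)`, yields zero yaw and pitch, while samples off to one side produce a proportionate eye rotation; in a networked system such angles would be streamed to the other sites along with head and body pose.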

The work is funded by EPSRC and is carried out in collaboration with the School of Systems Engineering, University of Reading; the Centre for Virtual Environments, University of Salford; the Department of Computer Science, UCL; and the School of Human and Life Sciences, Roehampton University.

This talk is part of the Inference Group series.


