
Mapping facial gestures to control cursors and switches


If you have a question about this talk, please contact Carl Scheffler.

The OpenGazer project has recently been revived! OpenGazer is an open-source application that tracks a user’s gaze in real time directly from webcam images. The ultimate goal is to let users write in Dasher using only their eyes. I will discuss work in progress on two problems that we are trying to solve:

1) A binary gesture switch: Here the problem is to learn “yes” and “no” gestures from facial images captured from an ordinary webcam. The learning is done in a short setup phase, after which these signals are automatically detected. I will show some preliminary results. We hope that this switch can be used as a communication device for patients with locked-in syndrome.
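The talk does not specify the learning algorithm, but the setup-then-detect workflow can be illustrated with a deliberately simple sketch: record labelled feature vectors (extracted from webcam frames by some upstream step, assumed here) during the setup phase, then classify new frames against per-class mean vectors. This is a hypothetical illustration, not the OpenGazer implementation.

```python
# Hypothetical sketch of a binary gesture switch: learn "yes"/"no" centroids
# from feature vectors captured in a setup phase, then classify new frames.
# Feature extraction from webcam images is assumed to happen elsewhere.

def train_centroids(examples):
    """examples: dict mapping label -> list of equal-length feature vectors."""
    centroids = {}
    for label, vectors in examples.items():
        dim = len(vectors[0])
        centroids[label] = [sum(v[i] for v in vectors) / len(vectors)
                            for i in range(dim)]
    return centroids

def classify(centroids, features):
    """Return the label whose centroid is nearest (squared Euclidean)."""
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda label: dist2(centroids[label], features))

# Toy setup phase: "yes" gestures cluster near (1, 1), "no" near (-1, -1).
examples = {
    "yes": [[0.9, 1.1], [1.2, 0.8]],
    "no":  [[-1.0, -0.9], [-0.8, -1.2]],
}
centroids = train_centroids(examples)
print(classify(centroids, [1.0, 0.9]))   # -> yes
```

In practice the detector would also need a rejection threshold, so that frames resembling neither gesture produce no switch event.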

2) Robust head pose estimation: Here the problem is to control a mouse cursor with one’s head. Again, there is a short training phase (specific to the user), after which a regression algorithm maps the estimated head pose to cursor coordinates.
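The regression step can be sketched with a minimal example: collect head-pose angles while the user looks at known screen points, then fit a least-squares map from pose to cursor position. The per-axis linear model below (yaw to cursor x, pitch to cursor y) is an assumption for illustration; the actual head-pose estimation happens upstream.

```python
# Hypothetical sketch: learn a per-user linear map from head-pose angles
# (yaw, pitch, in degrees) to screen cursor coordinates via least squares.

def fit_line(xs, ys):
    """Least-squares fit ys ~ a*xs + b; returns (a, b)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return a, my - a * mx

def make_cursor_mapper(yaws, pitches, cursor_xs, cursor_ys):
    """Training phase: fit yaw->x and pitch->y from calibration data."""
    ax, bx = fit_line(yaws, cursor_xs)
    ay, by = fit_line(pitches, cursor_ys)
    def to_cursor(yaw, pitch):
        return ax * yaw + bx, ay * pitch + by
    return to_cursor

# Toy calibration: the user faces known screen points while pose is recorded.
to_cursor = make_cursor_mapper(
    yaws=[-20, 0, 20], pitches=[-10, 0, 10],
    cursor_xs=[0, 400, 800], cursor_ys=[0, 300, 600],
)
print(to_cursor(10, 5))   # -> (600.0, 450.0)
```

A robust version would use a richer pose representation and a regression method that tolerates tracking noise, but the train-then-map structure is the same.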

This talk is part of the Inference Group series.



