
Mapping facial gestures to control cursors and switches


If you have a question about this talk, please contact Carl Scheffler.

The OpenGazer project has recently been revived! OpenGazer is an open source application that tracks a user's gaze directly from webcam images in real time. The final goal is to write in Dasher using only one's eyes. I will discuss work in progress on two problems that we are trying to solve:

1) A binary gesture switch: Here the problem is to learn "yes" and "no" gestures from facial images captured from an ordinary webcam. The learning is done in a short setup phase, after which these signals are automatically detected. I will show some preliminary results. We hope that this switch can be used as a communication device for patients with locked-in syndrome.
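As a rough illustration of the setup-then-detect idea, a minimal two-class switch can average feature vectors collected during the setup phase into one centroid per gesture, then label new frames by nearest centroid. This is only a sketch under assumed conventions: the feature vectors (e.g. tracked facial-landmark displacements), function names, and the ambiguity margin are all hypothetical, not the project's actual method.

```python
from math import dist

def train_switch(yes_examples, no_examples):
    """Average setup-phase feature vectors into one centroid per class.

    Each example is a list of floats extracted from a webcam frame
    (hypothetical features; the real system's features may differ).
    """
    def centroid(vectors):
        n = len(vectors)
        return [sum(v[i] for v in vectors) / n for i in range(len(vectors[0]))]
    return centroid(yes_examples), centroid(no_examples)

def detect_gesture(features, centroids, margin=0.0):
    """Return 'yes', 'no', or None when the frame is too ambiguous."""
    yes_c, no_c = centroids
    d_yes, d_no = dist(features, yes_c), dist(features, no_c)
    if abs(d_yes - d_no) <= margin:
        return None  # too close to call; wait for a clearer frame
    return "yes" if d_yes < d_no else "no"
```

In practice the margin lets the switch stay silent on borderline frames rather than emit a wrong signal, which matters for a communication device.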

2) Robust head pose estimation: Here the problem is to control a mouse cursor with one's head. Again, there is a short training phase involved (specific to the user), after which a regression algorithm computes the corresponding cursor coordinates.
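The regression step can be sketched as fitting a map from head-pose features to screen coordinates on the user-specific training pairs. The sketch below uses a regularised linear (ridge) fit purely as a stand-in for whatever regression algorithm the project actually uses; the feature encoding and all names are assumptions.

```python
import numpy as np

def fit_cursor_map(poses, cursors, lam=1e-3):
    """Fit W so that [pose, 1] @ W approximates (x, y) cursor targets.

    poses: (n, d) array of head-pose feature vectors from the training
    phase; cursors: (n, 2) array of the on-screen points the user looked
    at. lam is a small ridge penalty for numerical stability.
    """
    X = np.hstack([poses, np.ones((len(poses), 1))])  # append bias column
    A = X.T @ X + lam * np.eye(X.shape[1])            # ridge normal equations
    return np.linalg.solve(A, X.T @ cursors)

def pose_to_cursor(pose, W):
    """Predict cursor (x, y) for a new pose feature vector."""
    return np.append(pose, 1.0) @ W
```

A linear map is the simplest user-specific calibration; a real system would likely need something nonlinear and robust to tracking jitter, which is presumably where the "robust" in the title comes in.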

This talk is part of the Inference Group series.


© 2006-2025 Talks.cam, University of Cambridge.