BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Talks.cam//talks.cam.ac.uk//
X-WR-CALNAME:Talks.cam
BEGIN:VEVENT
SUMMARY:Multimodal Gaze-Supported Interaction - Sophie Stellmach\, Technis
 che Universität Dresden
DTSTART:20131112T100000Z
DTEND:20131112T110000Z
UID:TALK48113@talks.cam.ac.uk
CONTACT:Microsoft Research Cambridge Talks Admins
DESCRIPTION:While our eye gaze is an important medium for perceiving our
  environment\, it also serves as a fast and implicit way of signaling in
 terest in somebody or something. This could also support flexible and co
 nvenient interaction with diverse computing systems\, ranging from small
  handheld devices to multiple large-sized screens. Considerable research
  has already been devoted to gaze-only interaction\, which is\, however\,
  often described as error-prone\, imprecise\, and unnatural. To overcome
  these challenges\, multimodal combinations of gaze with additional inpu
 t modalities show high potential for fast\, fluent\, and convenient huma
 n-computer interaction in diverse user contexts. A promising example of
  this novel style of multimodal gaze-supported interaction is the seaml
 ess selection and manipulation of graphical objects on distant screens
  using a combination of a mobile handheld device (such as a smartphone)
  and gaze input.\n\nIn my talk\, I will give a brief introduction to gaz
 e-based interaction in general and present insights into my research at
  the Interactive Media Lab. In particular\, I will emphasize the high po
 tential of the emerging area of multimodal gaze-supported interaction.
LOCATION:Auditorium\, Microsoft Research Ltd\, 21 Station Road\, Cambridge
 \, CB1 2FB
END:VEVENT
END:VCALENDAR
