BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Talks.cam//talks.cam.ac.uk//
X-WR-CALNAME:Talks.cam
BEGIN:VEVENT
SUMMARY:Intelligent Inclusive Human Machine Interaction - Dr Pradipta Bisw
 as (University of Cambridge)
DTSTART:20160210T110000Z
DTEND:20160210T120000Z
UID:TALK64359@talks.cam.ac.uk
CONTACT:Mari Huhtala
DESCRIPTION:How can we design an inclusive intelligent interface or inte
 raction? Say\, for example\, you have decided to launch a new tablet th
 at will be easier for people with disabilities and elderly users to us
 e than existing devices\, and will thus conquer a wider market segmen
 t than competitors. \nOne possibility is to enlighten the development t
 eam about the problems that people with different ranges of abilities f
 ace with existing devices and ask them to develop a solution that doe
 s not have such problems. Another possibility is to take an existing ta
 blet\, load it with a few profiles tuned for different ranges of abilit
 ies – such as profiles for low vision\, colour blindness\, Parkinson’
 s Disease and so on – and ask end users to activate an appropriate pro
 file after buying the tablet. The third possibility is more ambitio
 us – we can have an online intelligent agent observing human-machine in
 teraction as part of the tablet’s operating system and adapting user in
 terfaces as users keep using their devices.\nI invented the *Inclusiv
 e User Model* that supports all these possibilities. It has a simulato
 r that can help designers visualize\, understand and measure the effec
 t of physical impairment on designs. I published a few of these model
 s as web services that can be used to create an online user profile th
 at can adapt the user interface irrespective of device and application
 . Part of the user profile has been standardized through publication i
 n an ITU-T report. I have also investigated eye gaze and hand and fing
 er movements of users and developed a few models using neural networks
 \, polynomial regression and Bayesian logic that can predict users’ po
 inting target in a graphical user interface and can significantly redu
 ce pointing and selection times. This talk will present a few applicat
 ions and demonstrations of the Inclusive User Model and *multimodal ey
 e gaze tracking* technology.
LOCATION:Sir Arthur Marshall Room\, Engineering Design Centre\, CUED
END:VEVENT
END:VCALENDAR
