EYEPHONE

Ms. A. Sivasankari, Mrs. G. Sangeetha Lakshmi, M. Saranya

Abstract: As smartphones evolve, researchers are studying new techniques to ease human-mobile interaction. We propose EyePhone, a novel "hand-free" interfacing system capable of driving mobile applications/functions using only the user's eye movements and actions (e.g., a wink). EyePhone tracks the user's eye movement across the phone's display using the camera mounted on the front of the phone; more specifically, machine learning algorithms are used to (1) track the eye and infer its position on the mobile phone display as the user views a particular application, and (2) detect eye blinks that emulate mouse clicks to activate the target application under view.

We present a prototype implementation of EyePhone on a Nokia N810, which is capable of tracking the position of the eye on the display and mapping this position to an application that is activated by a wink. At no time does the user have to physically touch the phone display.
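For illustration only, the sketch below shows the general idea described in the abstract: detect the eye with the front-facing camera, map its position to a grid of on-screen application regions, and treat a brief disappearance of the eye (a blink or wink) as a click on the currently selected region. This is not the authors' N810 implementation or their learning algorithm; it assumes a Python environment with OpenCV and its bundled Haar eye cascade, and the grid layout and blink threshold are hypothetical values chosen for the example.

```python
# Illustrative sketch only (not the EyePhone implementation): maps a detected
# eye position to a 3x3 grid of on-screen "apps" and treats a short
# disappearance of the eye (a blink/wink) as a click on the selected cell.
import cv2

GRID_ROWS, GRID_COLS = 3, 3   # hypothetical layout of application icons
BLINK_FRAMES = 3              # frames with no eye detected => emulate a click

# OpenCV ships a pre-trained Haar cascade for eye detection.
eye_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_eye.xml")

def cell_for(x, y, frame_w, frame_h):
    """Map an eye centre in camera coordinates to a grid cell (row, col)."""
    col = min(int(x / frame_w * GRID_COLS), GRID_COLS - 1)
    row = min(int(y / frame_h * GRID_ROWS), GRID_ROWS - 1)
    return row, col

cap = cv2.VideoCapture(0)     # front-facing camera
missed, current_cell = 0, None
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    eyes = eye_cascade.detectMultiScale(gray, 1.3, 5)
    if len(eyes) > 0:
        x, y, w, h = eyes[0]
        current_cell = cell_for(x + w / 2, y + h / 2,
                                frame.shape[1], frame.shape[0])
        missed = 0
    else:
        missed += 1
        if missed == BLINK_FRAMES and current_cell is not None:
            # Emulate a "tap" on the application under view.
            print("click on app at grid cell", current_cell)
    cv2.imshow("camera", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
cv2.destroyAllWindows()
```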

Keywords:  Human-Phone Interaction, Mobile Sensing Systems, Machine Learning, Mobile Phones.

Title: EYEPHONE

Author: Ms. A. Sivasankari, Mrs. G. Sangeetha Lakshmi, M. Saranya

International Journal of Computer Science and Information Technology Research

ISSN 2348-120X (online), ISSN 2348-1196 (print)

Research Publish Journals

Vol. 2, Issue 3, July 2014 - September 2014
