Eye-based HCI with Full Specification of Mouse and Keyboard Using Pupil Knowledge in the Gaze Estimation
Citation impact: top 19% of 2011 papers
Abstract
An eye-based Human-Computer Interaction (HCI) system with full specification of mouse and keyboard, operated by the eyes alone, is proposed, in particular for disabled persons and as an input device for wearable computing. It utilizes pupil knowledge in the gaze estimation process. Existing eye-mouse and gaze-mouse systems are not robust against users whose eyes differ in color, shape, and size, and are also affected by illumination changes, user movement (attitude changes), etc. Using knowledge about these features, such influences are eliminated. The proposed eye-based HCI system also allows simultaneous input of more than three keys, for instance "Ctrl+Alt+Del" for initiating the task manager, as well as left/right click and drag-and-drop mouse events. Currently available commercial screen keyboards allow simultaneous input of two keys, such as "Shift+*" or "Alt+*", but do not allow three-key input at once. This full specification of mouse events and keyboard functions is available in the proposed HCI system. Experimental results with users of six different nationalities, with differing eye features, show the effectiveness of using this knowledge to improve gaze estimation accuracy.
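The abstract's claim of three-key simultaneous input (e.g. "Ctrl+Alt+Del") can be pictured as a chord accumulator over gaze-selected on-screen keys. The paper does not publish its implementation; the class name, key labels, and dwell/commit logic below are illustrative assumptions, not the authors' method:

```python
class GazeChord:
    """Hypothetical sketch: collect gaze-dwelled keys into a chord,
    so that more than two keys (e.g. Ctrl+Alt+Del) can be combined,
    which two-key screen keyboards cannot do."""

    MODIFIERS = {"Ctrl", "Alt", "Shift"}

    def __init__(self):
        self.held = []  # keys selected so far, in gaze order

    def dwell(self, key):
        """Called when gaze dwells on an on-screen key long enough
        to count as a selection (dwell threshold is assumed)."""
        if key not in self.held:
            self.held.append(key)
        # A non-modifier key completes and emits the chord.
        if key not in self.MODIFIERS:
            return self.commit()
        return None  # modifiers latch until a regular key arrives

    def commit(self):
        chord = "+".join(self.held)
        self.held = []
        return chord


kb = GazeChord()
kb.dwell("Ctrl")
kb.dwell("Alt")
print(kb.dwell("Del"))  # -> Ctrl+Alt+Del
```

The design choice here is that modifier keys latch (stay "held") after a single dwell, since a user cannot fixate on three keys at once; a regular key then commits the whole chord.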
Related Papers
- Human Eye Pupil Detection System for Different IRIS Database Images (2021), 21 citations
- Hand Gestures for Mouse Movements with Hand and Computer Interaction model (2023), 2 citations
- Cursorkeyboard: An input method for small touch screen devices (2012), 2 citations
- Research of bright pupil effect based on Chinese human eye model (2014)
- Short range fingertip pointing interface using hand detection by depth camera (2018)