Flexi-modal and multi-machine user interfaces
2003, pp. 343–348
Brad A. Myers, Robert Malkin, M. Bett, Alex Waibel, Benjamin Bostwick, Robert C. Miller, Jie Chi Yang, Matthias Denecke, Edgar Seemann, Jie Zhu, Choon Hong Peck, Desheng Kong, Jeffrey Nichols, Bill Scherlis
Abstract
We describe a system that facilitates collaboration using multiple modalities, including speech, handwriting, gestures, gaze tracking, direct manipulation, large projected touch-sensitive displays, laser pointer tracking, regular monitors with a mouse and keyboard, and wireless networked handhelds. The system allows multiple, geographically dispersed participants to simultaneously and flexibly mix different modalities, using the right interface at the right time on one or more machines. We discuss each of the modalities provided, how they were integrated into the system architecture, and how the user interface enables one or more people to flexibly use one or more devices.