Multimodal Interfaces that Flex, Adapt, and Persist [Special Issue].
S. Oviatt, T. Darrell, and M. Flickner, eds. 2004. Communications of the ACM 47(1): 30-75.
"The articles in this section attest [that] strength in a multimodal interface derives from a number of factors, including their compatibility with users' abilities and existing work practices, and the flexibility these hybrid interfaces permit.... Computer speech and vision processing are two component technologies that are fundamental and pervasively used in developing these new systems, as discussed in the articles by [L. Deng and X. Huang] and by [M. Turk]. Some very recent multibiometric multimodal interfaces actually process three heterogeneous information sources together, including behavioral input based on speech and vision processing (voice, face) along with physiological input (fingerprint), as described by [A. Jain and A. Ross]. All of these multimodal interfaces are hybrids, and they are beginning to inject a new level of hybrid vigor into next-generation interface design.... [According to L. Reeves, J. Lai, J. Larson, S. Oviatt, T. S. Balaji, S. Buisine, P. Collings, P. Cohen, B. Kraal, J. Martin, M. McTear, T. Raman, K. Stanney, H. Su, and Q. Ying Wangl, multimodal interface designs based on this human communication model are able to achieve a new hybrid vigor that increases their uniqueness, robustness, and resistance to damage, while also extending their utility to more challenging mobile environments and larger groups of diverse users.... [They examine] related principle-driven multimodal interface guidelines.... Flexible multimodal interfaces that combine speech with touch or stylus input for selection are now being commercialized for invehicle, smart phone, and other applications, as illustrated by the multimodal interface created by SpeechWorks and Ford at the 2003 North American International Auto Show and described by [R. Pieraccini, K. Dayanidhi, J. Bloom, J. Gui Dahan, M. Phillips, B. Goodman, and K. Venkatesh Prasad].... 
To improve their coverage, reliability, and usability, multimodal interfaces likewise are being designed that can automatically learn and adapt to important user, task, and environmental parameters." Articles by [H. Nock, G. Iyengar, and C. Neti] and [A. Jain and A. Ross] "emphasize the challenges involved in collecting large amounts of data needed to support successful adaptive processing, as well as the development of scalable systems that can handle large diverse user groups and challenging field settings.... [P. Cohen and D. McGee describe] new tangible multimodal interfaces designed to minimize or avoid loss of valuable functionality for medical, air traffic control, military, and similar applications. By embedding multimodal processing techniques into a familiar and portable paper-based interface, these authors also outline how Multimodal Interaction with Paper (MIP) systems can model users' existing work practices and assist in overcoming resistance to computer adoption.... This special section ... illustrate[s] how multimodal interfaces are rapidly beginning to incorporate new strategies to improve their performance while also providing new forms of computational functionality, especially for field and mobile applications."
Title Annotation: Information Management
Author: Eble, Michelle F.
Date: Aug 1, 2004