COGAIN: COmmunication by GAze INteraction

Reading the latest RTDinfo (Magazine for European Research), I learned about the COGAIN project, a consortium of European universities and industry. COGAIN focuses on improving the quality of life for people whose lives are impaired by motor-control disorders such as amyotrophic lateral sclerosis (ALS) or cerebral palsy (CP):

COGAIN assistive technologies will empower the target group to communicate by using the capabilities they have and by offering compensation for capabilities that are deteriorating. The users will be able to use applications that help them to be in control of the environment, or achieve a completely new level of convenience and speed in gaze-based communication. Using the technology developed in the network, text can be created quickly by eye typing, and it can be rendered with the user’s own voice. In addition to this, the network will provide entertainment applications for making the life of the users more enjoyable and more equal. COGAIN believes that assistive technologies serve best by providing applications that are both empowering and fun to use.

One of the outputs of the project is an application called Dasher. Dasher is an information-efficient communication system driven by continuous pointing gestures. Instead of using a keyboard, the user writes by continuous steering, zooming into a landscape painted with letters. Dasher can be driven by a regular mouse, by touch screen, or by gaze direction. Dasher uses a language model to reduce the number of gestures needed: anything can be written, and well-predicted phrases can be written fastest. The language model can be trained on any set of documents, it learns as the user writes, and Dasher works in any of the languages of Europe. With practice, users can write at 25 words per minute via a gaze tracker.
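The core idea behind Dasher's zooming interface can be sketched in a few lines: likely next letters get proportionally larger on-screen targets, so well-predicted text needs smaller, coarser gestures. The toy model below (my own illustrative sketch, not Dasher's actual implementation, which uses a more sophisticated adaptive model) trains a simple bigram character model and splits a screen region among candidate next characters in proportion to their predicted probability:

```python
from collections import Counter, defaultdict

def train_bigram(text):
    """Count, for each character, how often each character follows it."""
    counts = defaultdict(Counter)
    for prev, nxt in zip(text, text[1:]):
        counts[prev][nxt] += 1
    return counts

def allocate_regions(counts, context, height=1.0):
    """Split a region of the given height among candidate next characters,
    proportional to predicted probability -- the principle behind Dasher:
    likely letters get bigger targets, so they are faster to steer into."""
    follow = counts.get(context, Counter())
    total = sum(follow.values())
    regions, top = [], 0.0
    for ch, n in follow.most_common():
        size = height * n / total
        regions.append((ch, top, top + size))
        top += size
    return regions

# Toy example: after "t", the letter "h" dominates the training text,
# so it receives by far the largest region.
model = train_bigram("the theory that the thing is there")
for ch, lo, hi in allocate_regions(model, "t"):
    print(repr(ch), round(hi - lo, 3))
```

Because the model is just frequency counts, updating it as the user writes (as Dasher does) amounts to adding each newly typed character pair back into the counts.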


Copyright notice: the present content was taken from the following URL, the copyrights are reserved by the respective author/s.

