Showing posts with label cogain. Show all posts

Tuesday, January 26, 2010

ETRA 2k10 Program Announced!

The awaited program for this year's Eye Tracking Research and Applications (ETRA) symposium, held in Austin, Texas, on March 22nd-24th, has been announced. The biennial get-together for leading research on eye movements, targeting computer scientists, engineers, and behavioral researchers, is organized in conjunction with the European Communication By Gaze Interaction (COGAIN) association, which brings a certain focus on gaze-based interaction for individuals with physical motor control disabilities. This year's keynote will be given by Scott MacKenzie, Associate Professor of Computer Science and Engineering at York University, Canada.

The long papers section contains 18 entries reflecting the various areas of eye gaze research: eye tracking, data analysis, visualization, cognitive studies, and interaction & control. In addition, a long list of short papers and a full poster section will ensure a worthwhile event for anyone interested in eye movement research.

Update: Official detailed program now available.
Update: The papers are now available online.

Looking forward to meeting you there!

Eye tracking & technical achievements

Full papers:
  • Homography Normalization for Robust Gaze Estimation in Uncalibrated Setups
    Dan Witzner Hansen, Javier San Agustin, and Arantxa Villanueva. Full paper.

  • Head-Mounted Eye-Tracking of Infants’ Natural Interactions: A New Method
    John Franchak, Kari Kretch, Kasey Soska, Jason Babcock, and Karen Adolph. Full paper.

  • User-Calibration-Free Remote Gaze Estimation System
    Dmitri Model and Moshe Eizenman. Full paper.

Short papers:
  • The Pupillometric Precision of a Remote Video Eye Tracker
    Jeff Klingner. Short paper.

  • Biometric Identification via an Oculomotor Plant Mathematical Model
    Oleg Komogortsev, Sampath Jayarathna, Cecilia Aragon, and Mechehoul Mahmoud. Short paper.

  • SemantiCode: Using Content Similarity and Database-driven Matching to Code Wearable Eyetracker Gaze Data
    Daniel Pontillo, Thomas Kinsman, and Jeff Pelz. Short paper.

  • Gaze Estimation Method based on an Aspherical Model of the Cornea: Surface of Revolution about the Optical Axis of the Eye
    Takashi Nagamatsu, Yukina Iwamoto, Junzo Kamahara, Naoki Tanaka, and Michiya Yamamoto. Short paper.

  • User-calibration-free Gaze Tracking with Estimation of the Horizontal Angles between the Visual and the Optical Axes of Both Eyes
    Takashi Nagamatsu, Ryuichi Sugano, Yukina Iwamoto, Junzo Kamahara, and Naoki Tanaka. Short paper.

Posters:

  • Evaluation of a Low-Cost Open-Source Gaze Tracker
    John Hansen, Dan Witzner Hansen, Emilie Møllenbach, Martin Tall, Javier San Agustin, Maria Barrett, and Henrik Skovsgaard. Poster.

  • Measuring Vergence Over Stereoscopic Video with a Remote Eye Tracker
    Brian Daugherty, Andrew Duchowski, Donald House, and Celambarasan Ramasamy. Poster.

  • Learning Relevant Eye Movement Feature Spaces Across Users
    Zakria Hussain, Kitsuchart Pasupa, and John Shawe-Taylor. Poster.

  • Interactive Interface for Remote Administration of Clinical Tests Based on Eye Tracking
    Alberto Faro, Daniela Giordano, Concetto Spampinato, Davide De Tommaso, and Simona Ullo. Poster.

  • Robust Optical Eye Detection During Head Movement
    Jeffrey Mulligan and Kevin Gabayan. Poster.

  • Estimating 3D Point-of-regard and Visualizing Gaze Trajectories under Natural Head Movements
    Kentaro Takemura, Yuji Kohashi, Tsuyoshi Suenaga, Jun Takamatsu, and Tsukasa Ogasawara. Poster.

  • Eye Tracking with the Adaptive Optics Scanning Laser Ophthalmoscope
    Scott Stevenson, Austin Roorda, and Girish Kumar. Poster.

  • A Depth Compensation Method for Cross-Ratio Based Eye Tracking
    Flavio L. Coutinho and Carlos H. Morimoto. Poster.

  • Pupil Center Detection in Low Resolution Images
    Detlev Droege and Dietrich Paulus. Poster.

  • Development of Eye-Tracking Pen Display Based on Stereo Bright Pupil Technique
    Michiya Yamamoto, Takashi Nagamatsu, and Tomio Watanabe. Poster.

  • The Use of Eye Tracking for PC Energy Management
    Vasily Moshnyaga

  • Listing's and Donders' Laws and the Estimation of the Point-of-Gaze
    Elias Guestrin and Moshe Eizenman

Data processing & eye movement detection

Full papers:
  • A Vector-Based, Multi-Dimensional Scanpath Similarity Measure
    Halszka Jarodzka, Kenneth Holmqvist, and Marcus Nyström. Full paper.

  • Match-Moving for Area-Based Analysis of Eye Movements in Natural Tasks
    Andrew Duchowski, Wayne Ryan, Ellen Vincent, and Dina Battisto. Full paper.

  • Fixation-Aligned Pupillary Response Averaging
    Jeff Klingner. Full paper.

Posters:

  • Qualitative and Quantitative Scoring and Evaluation of the Eye Movement Classification Algorithms
    Oleg Komogortsev, Sampath Jayarathna, Do Hyong Koh, and Sandeep Munikrishne Gowda. Poster.

  • Group-Wise Similarity and Classification of Aggregate Scanpaths
    Thomas Grindinger, Andrew Duchowski, and Michael Sawyer. Poster.

Visualization

Full papers:
  • Visual Scanpath Representation
    Joseph Goldberg and Jonathan Helfman. Full paper.

  • Scanpath Comparison Revisited
    Andrew Duchowski, Jason Driver, Sheriff Jolaoso, Beverly Ramey, Ami Robbins, and William Tan. Full paper.

  • Scanpath Clustering and Aggregation
    Joseph Goldberg and Jonathan Helfman. Full paper.

  • Space-Variant Spatio-Temporal Filtering of Video for Gaze Visualization and Perceptual Learning
    Michael Dorr, Halszka Jarodzka, and Erhardt Barth. Full paper.

Posters:

  • Adapted Gaze Visualizations for Three-dimensional Virtual Environments
    Sophie Stellmach, Lennart Nacke, and Raimund Dachselt. Poster.

  • Visual Span and Other Parameters for the Generation of Heatmaps
    Pieter Blignaut. Poster.

Cognitive studies & HCI

Full papers:
  • Interpretation of Geometric Shapes - An Eye Movement Study
    Miquel Prats, Iestyn Jowers, Nieves Pedreira, Steve Garner, and Alison McKay. Full paper.

  • Understanding the Benefits of Gaze Enhanced Visual Search
    Pernilla Qvarfordt, Jacob Biehl, Gene Golovchinsky, and Tony Dunnigan. Full paper.

  • Image Ranking with Implicit Feedback from Eye Movements
    David Hardoon and Kitsuchart Pasupa. Full paper.

  • How the Interface Design Influences Users’ Spontaneous Trustworthiness Evaluations of Web Search Results: Comparing a List and a Grid Interface
    Yvonne Kammerer and Peter Gerjets. Full paper.

Short papers:

  • Have You Seen Any of These Men? Looking at Whether Eyewitnesses Use Scanpaths to Recognize Suspects in Photo Lineups
    Sheree Josephson and Michael Holmes. Short paper.

  • Contingency Evaluation of Gaze-Contingent Displays for Real-Time Visual Field Simulations
    Margarita Vinnikov and Robert Allison. Short paper.

  • Estimation of Viewer's Response for Contextual Understanding of Tasks Using Features of Eye-movements
    Minoru Nakayama and Yuko Hayashi. Short paper.

Posters:
  • Gaze-based Web Search: The Impact of Interface Design on Search Result Selection
    Yvonne Kammerer and Wolfgang Beinhauer. Poster.

  • Visual Search in the (Un)Real World: How Head-Mounted Displays Affect Eye Movements, Head Movements and Target Detection
    Tobit Kollenberg, Alexander Neumann, Dorothe Schneider, Tessa-Karina Tews, Thomas Hermann, Helge Ritter, Angelika Dierker, and Hendrik Koesling. Poster.

  • Quantification of Aesthetic Viewing Using Eye-Tracking Technology: The Influence of Previous Training in Apparel Design
    Juyeon Park, Marilyn DeLong, and Emily Woods. Poster.

  • Visual Attention for Implicit Relevance Feedback in a Content Based Image Retrieval
    Concetto Spampinato, Alberto Faro, Daniela Giordano, and Carmelo Pino. Poster.

  • Eye and Pointer Coordination in Search and Selection Tasks
    Hans-Joachim Bieg, Lewis Chuang, Roland Fleming, Harald Reiterer, and Heinrich Bülthoff. Poster.

  • Natural Scene Statistics at Stereo Fixations
    Yang Liu, Lawrence Cormack, and Alan Bovik. Poster.

  • Measuring Situation Awareness of Surgeons in Laparoscopic Training
    Geoffrey Tien, Bin Zheng, Stella Atkins, and Colin Swindells

  • Saliency-Based Decision Support
    Roxanne Canosa. Poster.

  • Inferring Object Relevance from Gaze in Dynamic Scenes
    Melih Kandemir, Veli-Matti Saarinen, and Samuel Kaski. Poster.

  • Using Eye Tracking to Investigate Important Cues for Representative Creature Motion
    Meredith McLendon, Ann McNamara, Tim McLaughlin, and Ravindra Dwivedi

  • Estimating Cognitive Load Using Remote Eye Tracking in a Driving Simulator
    Oskar Palinko, Andrew Kun, Alexander Shyrokov, and Peter Heeman

Computer and machine control

Full papers:
  • Alternatives to Single Character Entry and Dwell Time Selection on Eye Typing
    Mario Urbina and Anke Huckauf. Full paper.

  • Designing Gaze Gestures for Gaming: an Investigation of Performance
    Howell Istance, Aulikki Hyrskykari, Lauri Immonen, Santtu Mansikkamaa, and Stephen Vickers. Full paper.

  • ceCursor, a Contextual Eye Cursor for General Pointing in Windows Environments
    Marco Porta, Alice Ravarelli, and Giovanni Spagnoli. Full paper.

  • BlinkWrite2: An Improved Text Entry Method Using Eye Blinks
    Behrooz Ashtiani and Scott MacKenzie. Full paper.

Short papers:

  • Eye Movement as an Interaction Mechanism for Relevance Feedback in a Content-Based Image Retrieval System
    Yun Zhang, Hong Fu, Zhen Liang, Zheru Chi, and Dagan Feng. Short paper.

  • Gaze Scribing in Physics Problem Solving
    David Rosengrant. Short paper.

  • Content-based Image Retrieval Using a Combination of Visual Features and Eye Tracking Data
    Zhen Liang, Hong Fu, Yun Zhang, Zheru Chi, and Dagan Feng. Short paper.

  • Context Switching for Fast Key Selection in Text Entry Applications
    Carlos H. Morimoto and Arnon Amir. Short paper.

Posters:

  • Small-Target Selection with Gaze Alone
    Henrik Skovsgaard, Julio Mateo, John Flach, and John Paulin Hansen. Poster.

  • What You See is Where You Go: Testing a Gaze-Driven Power Wheelchair for Individuals with Severe Multiple Disabilities
    Erik Wästlund, Kay Sponseller, and Ola Pettersson. Poster.

  • Single Gaze Gestures
    Emilie Møllenbach, Alastair Gale, Martin Lillholm, and John Paulin Hansen. Poster.

  • Using Vision and Voice to Create a Multimodal Interface for Microsoft Word 2007
    Tanya Beelders and Pieter Blignaut. Poster.

  • Towards Task-Independent Person Authentication Using Eye Movement Signals
    Tomi Kinnunen, Filip Sedlak, and Roman Bednarik

  • An Open Source Eye-gaze Interface: Expanding the Adoption of Eye-Gaze in Everyday Applications
    Craig Hennessey and Andrew Duchowski. Poster.

  • Pies with EYEs: The Limits of Hierarchical Pie Menus in Gaze Control
    Mario Urbina, Maike Lorenz, and Anke Huckauf. Poster.

  • Low-Latency Combined Eye and Head Tracking System for Teleoperating a Robotic Head in Real-Time
    Stefan Kohlbecher, Klaus Bartl, Stanislavs Bardins, and Erich Schneider. Poster.

Tuesday, August 18, 2009

COGAIN Student Competition Results

Lasse Farnung Laursen, a Ph.D. student at the Department of Informatics and Mathematical Modeling, Technical University of Denmark, won this year's COGAIN student competition with a leisure application called GazeTrain.

"GazeTrain (illustrated in the screenshot below) is an action oriented puzzle game, that can be controlled by eye movements. In GazeTrain you must guide a train by placing track tiles in front of it. As you guide the train, you must collect various cargo and drop them off at the nearest city thereby earning money. For further details regarding how to play the game, we encourage you to read the tutorial accessible from the main menu. The game is quite customizable as the dwell time and several other parameters can be adjusted to best suit your play-style." (Source)

The GazeTrain game.

The runners-up, sharing second place, were:

Music Editor, developed by Ainhoa Yera Gil, Public University of Navarre, Spain. Music Editor is a gaze-operated application that allows the user to compose, edit, and play music with eye movements. The reviewers appreciated that "a user can not only play but can actually create something" and that "Music Editor is well suited for gaze control".

Gaze Based Sudoku, developed by Juha Hjelm and Mari Pesonen, University of Tampere, Finland. The game can be operated by eye movements and it has three difficulty levels. Reviewers especially appreciated how "the separation between viewing and controlling and between sudoku grid and number selection panel is solved" and that the game "has no time constraints" so it is "relaxing" to play.

Monday, June 29, 2009

Video from COGAIN2009

John Paulin Hansen has posted a video showing some highlights from the annual COGAIN conference. It demonstrates three available gaze interaction solutions: the COGAIN GazeTalk interface, Tobii Technologies' MyTobii, and Alea Technologies' IG-30. These interfaces rely on dwell-activated on-screen keyboards (i.e., same procedure as last year).
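The dwell-activation principle behind these keyboards is simple: a key fires once the gaze has rested inside its bounds for a set duration. A minimal sketch of the idea (class and parameter names are my own, not taken from any of these products):

```python
from dataclasses import dataclass

@dataclass
class DwellButton:
    """A screen region that triggers once gaze has rested on it long enough."""
    x: int
    y: int
    w: int
    h: int
    dwell_ms: int = 500       # typical dwell thresholds are a few hundred ms
    _elapsed: float = 0.0

    def contains(self, gx: float, gy: float) -> bool:
        return self.x <= gx < self.x + self.w and self.y <= gy < self.y + self.h

    def update(self, gx: float, gy: float, dt_ms: float) -> bool:
        """Feed one gaze sample; returns True when the dwell completes."""
        if self.contains(gx, gy):
            self._elapsed += dt_ms
            if self._elapsed >= self.dwell_ms:
                self._elapsed = 0.0   # reset so the key does not re-fire immediately
                return True
        else:
            self._elapsed = 0.0       # gaze left the key: restart the timer
        return False
```

In use, each gaze sample from the tracker is fed to every on-screen key (e.g., at 50 Hz, dt_ms=20); the tension discussed throughout these posts is choosing a dwell time long enough to avoid accidental selections yet short enough to keep typing fluid.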


Monday, June 1, 2009

COGAIN 2009 Proceedings now online

There is little reason to doubt the vitality of the COGAIN network. This year's proceedings present an impressive 18 papers spread over one hundred pages, covering a wide range of areas: low-cost eye tracking, text entry, gaze input for gaming, multimodal interaction, environment control, clinical assessments, and case studies. Unfortunately, I was unable to attend the event this year (having recently relocated), but with the hefty proceedings available online there is plenty of material to read through (program and links to authors here). Thanks to Arantxa Villanueva, John Paulin Hansen, and Bjarne Kjaer Ersboll for the editorial effort.

Wednesday, May 6, 2009

COGAIN 2009 Program announced

This year's Communication By Gaze Interaction conference will be held on the 26th of May in Lyngby, Denmark, in connection with VisionDay (a four-day event on computer vision). Registration should be made on or before May 14th. Download the program as a PDF.

Update: the proceedings can be downloaded as pdf.


The program for May 26th
  • 08.00 Registration, exhibition, demonstrations, coffee, and rolls
SESSION I
  • 09.00 Welcome and introduction (Lars Pallesen, Rector @ DTU)
  • 09.10 Eye guidance in natural behaviour (B. W. Tatler)
  • 09.50 Achievements and experiences in the course of COGAIN (K. Raiha)
  • 10.30 Coffee, exhibition, demonstrations
SESSION II
  • 11.00 Joys and sorrows in communicating with gaze (A. Lykke-Larsen)
  • 11.30 An introduction to the 17 papers presented in the afternoon
  • 12.00 Lunch, exhibition, demonstrations, posters
SESSION III Track 1
SESSION III Track 2
  • 14.50 Coffee, exhibition, demonstrations, posters

SESSION IV Track 1
  • 15.30 Gameplay experience in a gaze interaction game (L. Nacke, S. Stellmach, D. Sasse & C. A. Lindley)
  • 15.50 Select commands in 3D game environments by gaze gestures (S. Vickers, H. Istance & A. Hyrskykari)
  • 16.10 GazeTrain: A case study of an action oriented gaze-controlled game (L. F. Laursen & B. Ersbøll)
  • 16.30 Detecting Search and Rescue Targets in Moving Aerial Images using Eye-gaze (J. Mardell, M. Witkowski & R. Spence)
  • 16.50 Feasibility Study for the use of Eye-Movements in Estimation of Answer Correctness (M. Nakayama & Y. Hayashi)
SESSION IV Track 2
  • 15.30 Eye Tracker Connectivity (G. Daunys & V. Vysniauskas)
  • 15.50 SW tool supporting customization of eye tracking algorithms (P. Novák & O. Štepánková)
  • 16.10 Multimodal Gaze-Based Interaction (S. Trösterer & J. Dzaack)
  • 16.30 Gaze Visualization Trends and Techniques (S. Stellmach, L. Nacke, R. Dachselt & C. A. Lindley)
  • 19.00 COGAIN2009 dinner at Brede Spisehus

Tuesday, February 10, 2009

COGAIN 2009 (26th May) "Gaze interaction for those who want it most".

"The 5th international COGAIN conference on eye gaze interaction emphasises user needs and future applications of eye tracking technology. Robust gaze interaction methods have been available for some years, with substantial amounts of applications to support communication, learning and entertainment already being used. However, there are still some uncertainties about this new technology among communication specialists and funding institutions. The 5th COGAIN conference will focus on spreading the experiences of people using gaze interaction in their daily life to potential users and specialists who have yet to benefit from it. Case studies from researchers and manufacturers working on new ways of making gaze interaction available for all, as well as integrating eye gaze with other forms of communication technology are also particularly welcome. We also encourage papers and posters which reach beyond the special case of eye control for people with disabilities into mainstream human-computer interaction development, for instance using eye tracking technology to enhance gaming experience and strategic play."

Themes:

  • Gaze-based access to computer applications
  • Gaze and environmental control
  • Gaze and personal mobility control
  • User experience studies
  • Innovations in eyetracking systems
  • Low cost gaze tracking systems
  • Attentive interfaces and inferring user intent from gaze
  • Gaze-based interaction with virtual worlds
  • Gaze and creativity
  • Gaming using gaze as an input modality
  • Gaze interaction with wearable displays
  • Using gaze with other modalities including BCI

"Papers which deal with the use of eye gaze to study the usability of mainstream applications and websites are not normally considered for inclusion in the conference". For more information see the COGAIN 2009 Call for Papers

Important dates:

Paper submission: 28th February. Notification of acceptance: 15th April. The conference will be held on the 26th of May at the Technical University of Denmark, in connection with the VisionDay event.

Monday, September 15, 2008

COGAIN 2008 Proceedings now online




Contents

Overcoming Technical Challenges in Mobile and Other Systems
  • Off-the-Shelf Mobile Gaze Interaction
    J. San Agustin and J. P. Hansen, IT University of Copenhagen, Denmark
  • Fast and Easy Calibration for a Head-Mounted Eye Tracker
    C. Cudel, S. Bernet, and M. Basset, University of Haute Alsace, France
  • Magic Environment
    L. Figueiredo, T. Nunes, F. Caetano, and A. Gomes, ESTG/IPG, Portugal
  • AI Support for a Gaze-Controlled Wheelchair
    P. Novák, T. Krajník, L. Přeučil, M. Fejtová, and O. Štěpánková, Czech Technical University, Czech Republic
  • A Comparison of Pupil Centre Estimation Algorithms
    D. Droege, C. Schmidt, and D. Paulus, University of Koblenz-Landau, Germany

Broadening Gaze-Based Interaction Techniques
  • User Performance of Gaze-Based Interaction with On-line Virtual Communities
    H. Istance, De Montfort University, UK, A. Hyrskykari, University of Tampere, Finland, S. Vickers, De Montfort University, UK and N. Ali, University of Tampere, Finland

  • Multimodal Gaze Interaction in 3D Virtual Environments
    E. Castellina and F. Corno, Politecnico di Torino, Italy
  • How Can Tiny Buttons Be Hit Using Gaze Only?
    H. Skovsgaard, J. P. Hansen, IT University of Copenhagen, Denmark. J. Mateo, Wright State University, Ohio, US
  • Gesturing with Gaze
    H. Heikkilä, University of Tampere, Finland
  • NeoVisus: Gaze Driven Interface Components
    M. Tall, Sweden

Focusing on the User: Evaluating Needs and Solutions
  • Evaluations of Interactive Guideboard with Gaze-Communicative Stuffed-Toy Robot
    T. Yonezawa, H. Yamazoe, A. Utsumi, and S. Abe, ATR Intelligent Robotics and Communications Laboratories, Japan
  • Gaze-Contingent Passwords at the ATM
    P. Dunphy, A. Fitch, and P. Oliver, Newcastle University, UK
  • Scrollable Keyboards for Eye Typing
    O. Špakov and P. Majaranta, University of Tampere, Finland
  • The Use of Eye-Gaze Data in the Evaluation of Assistive Technology Software for Older People.
    S. Judge, Barnsley District Hospital Foundation, UK and S. Blackburn, Sheffield University, UK
  • A Case Study Describing Development of an Eye Gaze Setup for a Patient with 'Locked-in Syndrome' to Facilitate Communication, Environmental Control and Computer Access.
    Z. Robertson and M. Friday, Barnsley General Hospital, UK

Friday, September 12, 2008

COGAIN 2008 Video

Some highlights from the visit to COGAIN 2008 last week in Prague, which was a great event. The video demonstrates the mobile solution integrating a head-mounted display and an eye tracker by Javier San Agustín, a sneak peek of the NeoVisus iTube interface running on the SMI iView X RED, a demonstration of the Neural Impulse Actuator from OCZ Technologies by Henrik Skovsgaard, and a demo of the gaze-controlled wheelchair developed by Falck Igel and Alea Technologies. Thanks to John Paulin Hansen for creating the video.

Wednesday, July 9, 2008

GazeTalk 5

The GazeTalk system is one of the most comprehensive open solutions for gaze interaction today. It has been developed with disabled users in mind and supports a wide range of everyday tasks, which can dramatically increase the quality of life for users suffering from ALS or similar conditions. The following information is quoted from the COGAIN website.

Information about Gazetalk 5 eye communication system

GazeTalk is a predictive text entry system that has a restricted on-screen keyboard with ambiguous layout for severely disabled people. The main reason for using such a keyboard layout is that it enables the use of an eye tracker with a low spatial resolution (e.g., a web-camera based eye tracker).
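The point of an ambiguous layout is that several letters share one large on-screen key, so the tracker only needs to distinguish a handful of targets; language knowledge then resolves which word was meant. A toy illustration of that general idea (the key grouping, word list, and simple dictionary lookup here are invented for illustration and are not GazeTalk's actual layout or prediction algorithm):

```python
# Six large keys, each carrying a group of letters (invented grouping).
KEYS = ["abcde", "fghij", "klmno", "pqrst", "uvwxy", "z.,?!"]

# Map every letter to the index of the key that carries it.
LETTER_TO_KEY = {ch: i for i, group in enumerate(KEYS) for ch in group}

def to_key_sequence(word: str) -> tuple:
    """The sequence of key presses that could produce this word."""
    return tuple(LETTER_TO_KEY[ch] for ch in word)

def disambiguate(key_seq: tuple, dictionary: list) -> list:
    """All dictionary words whose key sequence matches the input."""
    return [w for w in dictionary if to_key_sequence(w) == key_seq]

words = ["gaze", "hand", "face", "game"]
print(disambiguate(to_key_sequence("gaze"), words))  # → ['gaze']
```

With only six targets instead of thirty, each key can be huge on screen, which is exactly what makes a low-resolution (e.g., web-camera based) tracker usable; the cost is that some key sequences match several words and the user must pick from candidates.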

The goal of the GazeTalk project is to develop an eye-tracking based AAC system that supports several languages, facilitates fast text entry, and is both sufficiently feature-complete to be deployed as the primary AAC tool for users, yet sufficiently flexible and technically advanced to be used for research purposes. The system is designed for several target languages, initially Danish, English, Italian, German and Japanese.

Main features

  • type-to-talk
  • writing
  • email
  • web browser
  • multimedia player
  • PDF reader
  • letter and word prediction, and word completion
  • speech output
  • can be operated by gaze, headtracking, mouse, joystick, or any other pointing device
  • supports step-scanning (new!)
  • supports users with low precision in their movements, or trackers with low accuracy
  • allows the user to use Dasher inside GazeTalk and to transfer the text written in Dasher back to GazeTalk

GazeTalk 5.0 has been designed and developed by the Eye Gaze Interaction Group at the IT University of Copenhagen and the IT-Lab at the Royal School of Library and Information Science, Copenhagen.


Screenshots: GazeTalk v5, and GazeTalk v5 linked with Dasher.

Read more about GazeTalk or view the GazeTalk manual (PDF).

Short manual on data recording in GazeTalk (PDF).

GazeTalk videos.

Download GazeTalk.

Thursday, March 13, 2008

COGAIN 2008: Communication, Environment and Mobility Control by Gaze

This year's COGAIN conference will be held in Prague, 2nd-3rd September. Detailed information on paper submission, program, dates, venues, etc. can be found at http://www.cogain.org/cogain2008

Themes this year are:
  • Text entry by means of gaze
  • Gaze and environmental control
  • Gaze and personal mobility control
  • Direct interaction with gaze aware real world objects
  • User experience studies
  • Innovations in eyetracking systems
  • Low cost gaze tracking systems
  • Attentive interfaces and inferring user intent from gaze
  • Gaze-based interaction with virtual worlds
  • Gaze and creativity
  • Gaming using gaze as an input modality
  • Using gaze with other modalities including BCI

Wednesday, February 20, 2008

Inspiration: StarGazer (Skovsgaard et al., 2008)

A major area of research for the COGAIN network is enabling communication for the disabled. The Innovative Communications group at the IT University of Copenhagen works continuously on making gaze-based interaction technology more accessible, especially in the field of assistive technology.

The ability to enter text is crucial for communication, and without hands or speech this is somewhat problematic. The StarGazer software aims to solve this by introducing a novel 3D approach to text entry. In December I had the opportunity to visit ITU and try StarGazer (among other things) myself, and it is astonishingly easy to use: within just a minute I was typing with my eyes. Rather than describing what it looks like, see the video below.
The associated paper is to be presented at the ETRA08 conference in March.



This introduces an important solution to the problem of eye tracker inaccuracy, namely zooming interfaces. Fixating on a specific region of the screen displays an enlarged version of that area, in which objects can be more easily discriminated and selected.

The eyes are incredibly fast but, from the perspective of eye trackers, not very precise. This is due to the physiological properties of our visual system, specifically the foveal region of the eye. This retinal area produces the sharp, detailed part of our visual field, which in practice covers about the size of a thumbnail at arm's length. To bring another area into focus, a saccade moves the eye, and thus our gaze, and this is what the eye tracker registers. Hence the accuracy of most eye trackers is in the 0.5-1 degree range (in theory, at least).
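That angular error translates into a surprisingly large on-screen target size. A back-of-the-envelope calculation, assuming a 60 cm viewing distance and a 96 dpi display (both figures are illustrative assumptions, not from the paper):

```python
import math

def angle_to_pixels(deg: float, distance_cm: float = 60.0, dpi: float = 96.0) -> float:
    """On-screen extent (in pixels) subtended by a visual angle at a given distance."""
    size_cm = 2 * distance_cm * math.tan(math.radians(deg) / 2)
    return size_cm / 2.54 * dpi   # cm -> inches -> pixels

# A 1-degree tracker error at 60 cm covers roughly a centimetre on screen,
# i.e. around 40 pixels at 96 dpi -- far wider than a typical menu item or icon.
print(round(angle_to_pixels(1.0)))
```

This is why gaze targets either have to be large, or the interface has to compensate, which is exactly what the zooming approach below does.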

A feasible way to deal with this limited accuracy is to use the display space dynamically and zoom into areas of interest upon glancing. The zooming interaction style solves some of the issues with eye tracker inaccuracy and jitter, but it has to be carefully balanced so that the interface still feels quick and responsive.
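A gaze-contingent zoom needs little more than a fixation detector: once recent gaze samples cluster within a small dispersion for long enough, magnify that region. A minimal dispersion-based sketch (the window size and pixel threshold are illustrative assumptions, not StarGazer's actual parameters):

```python
from collections import deque

class FixationZoom:
    """Trigger a zoom when gaze samples stay within a small window for ~n samples."""

    def __init__(self, window: int = 15, max_dispersion: float = 40.0):
        self.samples = deque(maxlen=window)   # e.g. 15 samples at 50 Hz ~ 300 ms
        self.max_dispersion = max_dispersion  # pixels; roughly 1 degree of error

    def update(self, x: float, y: float):
        """Feed one gaze sample; return the fixation centre when a zoom should fire."""
        self.samples.append((x, y))
        if len(self.samples) < self.samples.maxlen:
            return None
        xs = [p[0] for p in self.samples]
        ys = [p[1] for p in self.samples]
        # Dispersion = horizontal spread + vertical spread (dispersion-threshold style).
        if (max(xs) - min(xs)) + (max(ys) - min(ys)) <= self.max_dispersion:
            return (sum(xs) / len(xs), sum(ys) / len(ys))  # zoom centred here
        return None
```

The balancing act mentioned above lives in these two numbers: a longer window or tighter dispersion means fewer accidental zooms but a more sluggish interface.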

However, to me the novelty of StarGazer is the notion of traveling through a 3D space; the sensation of movement really catches one's attention and streamlines the interaction. Since text entry is inherently linear, character by character, flying through space from character to character is a suitable interaction style. And since the interaction is nowhere near the speed of two-handed keyboard entry, linguistic prediction algorithms such as those found in cellphones would be very beneficial (i.e., type two or three letters and the most likely words appear in a list). Overall, I find the spatial arrangement of gaze interfaces to be a somewhat unexplored area. Our eyes are made to navigate a three-dimensional world, while traditional desktop interfaces mainly present a flat 2D view. This is something I intend to investigate further.

Inspiration: COGAIN

Much of the development in the field of gaze interaction stems from assistive technology, where users who are unable to use regular computer interfaces are given tools that empower their everyday life in a wide range of activities such as communication, entertainment, and home control. For example, they can use their eyes to type words and sentences, which are then synthetically translated into spoken language by software, enabling communication beyond blinking. A major improvement in the quality of life.

"COGAIN (Communication by Gaze Interaction) integrates cutting-edge expertise on interface technologies for the benefit of users with disabilities. COGAIN belongs to the eInclusion strategic objective of IST. COGAIN focuses on improving the quality of life for those whose life is impaired by motor-control disorders, such as ALS or CP. COGAIN assistive technologies will empower the target group to communicate by using the capabilities they have and by offering compensation for capabilities that are deteriorating. The users will be able to use applications that help them to be in control of the environment, or achieve a completely new level of convenience and speed in gaze-based communication. Using the technology developed in the network, text can be created quickly by eye typing, and it can be rendered with the user's own voice. In addition to this, the network will provide entertainment applications for making the life of the users more enjoyable and more equal. COGAIN believes that assistive technologies serve best by providing applications that are both empowering and fun to use."

A short introduction by Dr Richard Bates, a research fellow at the School of Computing Sciences at the De Montfort University in Leicester, can be downloaded either as presentation slides or paper.

The COGAIN network is a rich source of information on gaze interaction. A set of tools developed within the network has been made publicly available for download. Make sure to check out the video demonstrations of various gaze interaction tools.

Participating organizations within the COGAIN network.