
Monday, May 9, 2011

"Read my Eyes" - A presentation of the ITU Gaze Tracker

During the last month the guys at the IT University of Copenhagen have been involved in making a video intended to introduce the ITU Gaze Tracker, an open-source eye tracker, to a wider audience. The production was carried out in collaboration with the Communication Department at the university and features members of the group, students of the HCI class, and Birger Bergmann Jeppesen, who has had ALS since 1996. Many thanks to all involved, especially Birger & co. for taking an interest and participating in the evaluation of the system.

Friday, April 29, 2011

GazeGroup's Henrik Skovsgaard wins "Stars with brains" competition

During the Danish Research Day 2011, Henrik Skovsgaard, PhD candidate at ITU Copenhagen, won the competition "Stars with Brains" (Stjerner med hjerner). Several high-profile individuals (the stars) were present, including the Minister of Science, Princess Marie and Mayor Frank Jensen. The competition consisted of eight doctoral students (the brains) from universities across Denmark who presented their research in layman's terms. The audience voted for their favorite candidate by SMS while a panel of judges evaluated the participants. Later in the day Henrik was invited to an interview on the Aftenshow on national TV. Henrik's research at the IT University of Copenhagen focuses primarily on gaze-based interaction as a communication tool for people with disabilities, and he has participated in the development of the Gazegroup.org software. A big congrats to Henrik for the award, the excellent public outreach and the associated stardom!

PhD student Henrik Skovsgaard won the "Stars with brains" competition. Photo: Tariq Mikkel Khan (source)

From right: Mayor Frank Jensen, HRH Princess Marie and Minister of Science Charlotte Sahl-Madsen. Photo: Tariq Mikkel Khan (source)



Wednesday, March 2, 2011

Head-mounted eye-tracking application for driving

For his master's thesis, Nicolas Schneider has modified the ITU Gaze Tracker for eye tracking in an automotive setting. The modification adds a scene camera and the software to calibrate it and integrate it into the platform. The project was carried out at the Schepens Eye Research Institute at Harvard, and there is a good chance it will be released as open source. A fine piece of work and an awesome addition to the framework; we're impressed by the results. More info to follow, for now enjoy this video.



  • Nicolas Schneider, Peter Bex, Erhardt Barth, and Michael Dorr. 2011. An open-source low-cost eye-tracking system for portable real-time and offline tracking. In Proceedings of the 1st Conference on Novel Gaze-Controlled Applications (NGCA '11). ACM, New York, NY, USA, Article 8, 4 pages. (Full text: PDF online)


Thursday, October 28, 2010

Gaze Tracker 2.0 Preview

On my 32nd birthday I'd like to celebrate by sharing this video highlighting some of the features in the latest version, GT2.0, that I've been working on with Javier San Agustin and the GT forum. Open source eye tracking has never looked better. Enjoy!


HD video available (click 360p and select 720p)

Wednesday, April 14, 2010

Open-source gaze tracker awarded Research Pearls of ITU Copenhagen

The open-source eye tracker ITU Gaze Tracker, primarily developed by Javier San Agustin, Henrik Skovsgaard and myself, has been awarded the Research Pearls of the IT University of Copenhagen. A presentation will be held at ITU on May 6th at 2pm. The software, released one year ago, has seen more than 5,000 downloads by students and hobbyists around the world. We're rapidly approaching a new release which will offer better performance and stability for remote tracking, plus many general bug fixes. The new version adds support for a whole range of new HD web cameras, which provide vastly improved image quality and finally bring hope for a low-cost, open, flexible and reasonably performing solution. The ambitious goal is to make eye tracking technology available to everyone, regardless of resources. Follow the developments at the forum. Additional information is available at the ITU Gaze Group.

"The Open-Source ITU Gaze Tracker"

Abstract:
Gaze tracking offers people with severe motor-skill disabilities the possibility of interacting with a computer by using eye movements alone, thereby making them more independent. However, many of these users are excluded from access to gaze interaction due to the high prices of commercial systems (above €10,000). Gaze tracking systems built from low-cost and off-the-shelf components have the potential of facilitating access to the technology and bringing prices down.

The ITU Gaze Tracker is an off-the-shelf system that uses an inexpensive web cam or a video camera to track the user's eye. It is free and open source, offering users the possibility of trying out gaze interaction technology for a cost as low as €20, and of adapting and extending the software to suit specific needs.

In this talk we will present the open-source ITU Gaze Tracker and show the different scenarios in which the system has been used and evaluated.

Friday, December 11, 2009

PhD Defense: Off-the-Shelf Gaze Interaction

Javier San Agustin will defend his PhD thesis, "Off-the-Shelf Gaze Interaction", at the IT University of Copenhagen on the 8th of January from 13.00 to (at most) 17.00. The program for the event consists of a one-hour presentation followed by a discussion with the committee, formed by Andrew Duchowski, Bjarne Kjær Ersbøll and Arne John Glenstrup. Afterwards, a traditional reception with snacks and drinks will be held.

Update: The thesis is now available as PDF, 179 pages, 3.6MB.

Abstract of the thesis:


People with severe motor-skill disabilities are often unable to use standard input devices such as a mouse or a keyboard to control a computer, and they are therefore in strong need of alternative input devices. Gaze tracking offers them the possibility of using the movements of their eyes to interact with a computer, thereby making them more independent. A big effort has been put toward improving the robustness and accuracy of the technology, and many commercial systems are nowadays available on the market.

Despite the great improvements that gaze tracking systems have undergone in recent years, high prices have prevented gaze interaction from becoming mainstream. The use of specialized hardware, such as industrial cameras or infrared light sources, increases the accuracy of the systems, but also the price, which prevents many potential users from having access to the technology. Furthermore, the different components are often required to be placed in specific locations, or are built into the monitor, thus decreasing the flexibility of the setup.

Gaze tracking systems built from low-cost and off-the-shelf components have the potential to facilitate access to the technology and bring the prices down. Such systems are often more flexible, as the components can be placed in different locations, but also less robust, due to the lack of control over the hardware setup and the lower quality of the components compared to commercial systems.

The work developed for this thesis deals with some of the challenges introduced by the use of low-cost and off-the-shelf components for gaze interaction. The main contributions are:
  • Development and performance evaluation of the ITU Gaze Tracker, an off-the-shelf gaze tracker that uses an inexpensive webcam or video camera to track the user's eye. The software is readily available as open source, offering the possibility to try out gaze interaction for a low price and to analyze, improve and extend the software by modifying the source code.
  • A novel gaze estimation method based on homographic mappings between planes. No knowledge about the hardware configuration is required, allowing for a flexible setup where camera and light sources can be placed at any location (a rough sketch of the idea follows after this list).
  • A novel algorithm to detect the type of movement that the eye is performing, i.e. fixation, saccade or smooth pursuit. The algorithm is based on eye velocity and movement pattern, and allows the signal to be smoothed appropriately for each kind of movement, removing jitter due to noise while maximizing responsiveness (see the second sketch below).
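
For readers curious how a homography-based mapping works in practice, here is a minimal sketch. It is my own illustration, not the thesis code: given a handful of calibration targets and the pupil positions observed while looking at them, OpenCV can estimate the plane-to-plane mapping and project new pupil positions onto the screen. A real implementation would use pupil-glint features rather than raw pupil centres, and all coordinates below are made up.

```python
# Minimal sketch of homography-based gaze estimation (illustration only).
# Assumes pupil centres in image coordinates and known screen targets shown
# during calibration; the thesis method uses pupil-glint features instead.
import numpy as np
import cv2

# Calibration: pupil centres observed while the user fixated four screen targets.
pupil_pts = np.array([[312, 248], [398, 251], [395, 310], [308, 306]], dtype=np.float32)
screen_pts = np.array([[100, 100], [1820, 100], [1820, 980], [100, 980]], dtype=np.float32)

# Estimate the plane-to-plane mapping (image plane -> screen plane).
H, _ = cv2.findHomography(pupil_pts, screen_pts)

def estimate_gaze(pupil_xy):
    """Map a new pupil centre to an estimated on-screen gaze point."""
    p = np.array([pupil_xy[0], pupil_xy[1], 1.0])
    gx, gy, w = H @ p
    return gx / w, gy / w

print(estimate_gaze((355, 278)))  # a pupil position near the calibration centre
```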
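
The eye-movement classification can be sketched in a similar spirit as a simple velocity-threshold rule separating fixations, smooth pursuits and saccades. The thresholds and sample rate below are illustrative guesses, not the values used in the thesis, which also takes movement patterns into account.

```python
# Sketch of a velocity-based eye-movement classifier (fixation / pursuit / saccade).
# Threshold values are illustrative assumptions, not the thesis parameters.
import math

FIXATION_MAX_DEG_S = 30.0   # below this velocity: treat as fixation (assumed)
SACCADE_MIN_DEG_S = 100.0   # above this velocity: treat as saccade (assumed)

def classify(samples, sample_rate_hz=30.0):
    """samples: list of (x, y) gaze positions in degrees of visual angle."""
    labels = []
    for (x0, y0), (x1, y1) in zip(samples, samples[1:]):
        velocity = math.hypot(x1 - x0, y1 - y0) * sample_rate_hz  # deg/s
        if velocity < FIXATION_MAX_DEG_S:
            labels.append("fixation")        # smooth heavily to suppress jitter
        elif velocity < SACCADE_MIN_DEG_S:
            labels.append("smooth pursuit")  # smooth along the movement direction
        else:
            labels.append("saccade")         # do not smooth; keep responsiveness
    return labels
```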

Monday, November 23, 2009

ITU GazeTracker in the wild

Came across these two YouTube videos from students out there using the ITU Gaze Tracker in their HCI projects. By now the software has been downloaded 3,000 times and the forum has seen close to three hundred posts. It's been a good start, and better yet, a new version is in the making. It offers a complete network API for third-party applications, improved tracking performance, better camera control and a number of bug fixes (thanks for your feedback). It will be released when it's ready.







Thanks for posting the videos!

Wednesday, May 13, 2009

Hi-fi eyetracking with a lo-fi eyetracker: An experimental usability study of an eyetracker built from a standard web camera (Barret, M., 2009)

Marie Barret, a master's student at ITU Copenhagen, has now finished her thesis. It evaluates eye typing performance using the ITU Gaze Tracker (a low-cost webcam eye tracker) in the StarGazer and GazeTalk interfaces. The thesis is written in Danish (113 pages), but I took the liberty of translating two charts from it, found below. The results will be presented in English at the COGAIN 2009 conference on May 26th (session three, track one, at 1:50 PM). For now I quote the abstract:

"Innovation has facilitated sufficient mainstream technology to build eyetrackers from off-the-shelf components. Prices for standard eyetrackers start at around € 4000. This thesis describes an experimental usability study of gazetyping with a new input device built from a standard web camera without hardware modifications. Cost: € 20. Mainstreaming of assistive technologies holds potential for faster innovation, better service, lower prices and increased accessibility. Off-the-shelf eyetrackers must be usability-competitive with standard eyetrackers in order to be adopted, as eyetracking - even with expensive hardware - presents usability issues. Usability is defined as effectiveness, efficiency and user satisfaction (ISO 9241-11, 1998).

Results from the 2 × 2 factorial experiment significantly indicate how the new input device can reach the usability standards of expensive eyetrackers. This study demonstrates that the off-the-shelf eyetracker can achieve efficiency similar to an expensive eyetracker, with no significant effect from any of the tested factors. All four factors have a significant impact on effectiveness. A factor that can eliminate the effectiveness difference between the standard hardware and an expensive eyetracker is identified. Another factor can additionally improve effectiveness.

Two gazetyping systems specifically designed for noisy conditions, e.g. due to bad calibration and jolting, are tested. StarGazer uses a zooming interface and GazeTalk uses large buttons in a static graphical user interface. GazeTalk is significantly more effective than StarGazer. The large on-screen buttons and static interface of GazeTalk with dwell-time activation absorb the noise from the input device, and the typing speeds obtained are comparable to prior research with a regular eyetracker. Click activation has for years (Ware & Mikaelian, 1987) been shown to improve the efficiency of gaze-based interaction. This experiment demonstrates that this result significantly applies to off-the-shelf eyetrackers as well. The input device relies on the user to compensate for offset with head movements. The keyboards should support this task with a static graphical user interface." Download thesis as pdf (in Danish)

Friday, May 1, 2009

Gaze Controlled Driving

This is the paper on using eye trackers for remote robot navigation that I had accepted for the CHI09 conference. It has now appeared on the ACM website. Note that the webcam tracker referred to in the paper is an earlier incarnation of the ITU Gaze Tracker. The main issue while using it is that head movements affect the gaze position and create an offset. This is easier to correct and counterbalance on a static background than on a moving image (while driving!)

Abstract
"We investigate if the gaze (point of regard) can control a remote vehicle driving on a racing track. Five different input devices (on-screen buttons, mouse pointing, a low-cost webcam eye tracker and two commercial eye tracking systems) provide heading and speed control on the scene view transmitted from the moving robot. Gaze control was found to be similar to mouse control. This suggests that robots and wheelchairs may be controlled "hands-free" through gaze. Low-precision gaze tracking and image transmission delays had a noticeable effect on performance."

  • Tall, M., Alapetite, A., San Agustin, J., Skovsgaard, H. H., Hansen, J. P., Hansen, D. W., and Møllenbach, E. 2009. Gaze-controlled driving. In Proceedings of the 27th international Conference Extended Abstracts on Human Factors in Computing Systems (Boston, MA, USA, April 04 - 09, 2009). CHI EA '09. ACM, New York, NY, 4387-4392. DOI= http://doi.acm.org/10.1145/1520340.1520671
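
To give a feel for what "heading and speed control" from a gaze point might look like, here is a hypothetical sketch, not the implementation used in the paper: the horizontal offset from the centre of the scene view steers the robot, while the vertical position sets the forward speed. The view size and gains are assumptions for illustration only.

```python
# Hypothetical mapping from a gaze point on the video view to drive commands.
# Coordinate conventions, view size and gains are illustrative assumptions.

VIEW_W, VIEW_H = 640, 480  # size of the transmitted scene view (assumed)

def gaze_to_drive(gx, gy, max_speed=1.0, max_turn=1.0):
    """Return (speed, turn_rate) from a gaze point (gx, gy) in view pixels."""
    turn = max_turn * (gx - VIEW_W / 2) / (VIEW_W / 2)    # look right -> turn right
    speed = max_speed * (VIEW_H / 2 - gy) / (VIEW_H / 2)  # look up -> drive forward
    # Clamp both commands to their valid range.
    return (max(-max_speed, min(max_speed, speed)),
            max(-max_turn, min(max_turn, turn)))

print(gaze_to_drive(480, 120))  # gaze to the upper right: forward while turning right
```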

Monday, April 27, 2009

ITU Gaze Tracker: Low-cost gaze interaction: ready to deliver the promises (San Agustin, J. et al., 2009)

The research paper on the ITU Gaze Tracker that Javier San Agustin presented at CHI09 is now available on the ACM website. It evaluates a previous version of the gaze tracker in two tasks, target acquisition and eye typing, in comparison with a mouse, the SMI iView X RED and the Tobii 1750.

Abstract
"Eye movements are the only means of communication for some severely disabled people. However, the high prices of commercial eye tracking systems limit the access to this technology. In this pilot study we compare the performance of a low-cost, web cam-based gaze tracker that we have developed with two commercial trackers in two different tasks: target acquisition and eye typing. From analyses on throughput, words per minute and error rates we conclude that a low-cost solution can be as efficient as expensive commercial systems."


  • San Agustin, J., Skovsgaard, H., Hansen, J. P., and Hansen, D. W. 2009. Low-cost gaze interaction: ready to deliver the promises. In Proceedings of the 27th international Conference Extended Abstracts on Human Factors in Computing Systems (Boston, MA, USA, April 04 - 09, 2009). CHI EA '09. ACM, New York, NY, 4453-4458.
    Download at ACM website.

A brief user's guide to the ITU Gaze Tracker

Today we release a short user's guide for the open-source eye tracker we presented some weeks ago. Hopefully it will assist first-time users in configuring the software and understanding the limitations of the initial version. Comments and suggestions are appreciated.


Friday, April 17, 2009

IDG Interview with Javier San Agustin

During CHI09 in Boston last week, Nick Barber from the IDG Network stopped by to record an interview with Javier San Agustin, member of the ITU GazeGroup. The video has now surfaced on several IDG sites around the world; clearly there is an interest in easy-to-use, low-cost eye tracking. After the initial release of the ITU Gaze Tracker we have set up a community forum at forum.gazegroup.org, with the ambition of connecting users of open-source eye tracking. If you would like to be part of the project, please join in promoting and developing an alternative. It's open and accessible for all (platform documentation to be released next week).

Hopefully, ideas and contributions from the community will make the platform take off. Considering the initial release a beta version, there are of course additional improvements to make: more cameras need to be verified and bugs in the code need to be handled.

If you experience any issues or have ideas for improvements please post at http://forum.gazegroup.org



Computerworld.com.au

WebWereld.nl

PCAdvisor.co.uk

TechWorld.nl

IDG.no/ComputerWorld

ComputerWorld.dk

ComputerWorld.hu

ARNnet.com.au

Tuesday, March 31, 2009

Radio interview with DR1

Thomas Behrndtz from the Danish Radio (DR1) came by the other day to do an interview about the upcoming ITU gaze interaction platform. It resulted in a five-minute segment on "Videnskaben kort", a radio program on interesting progress in science. Lately we have been working hard on the software package, which is to be released at CHI09 in Boston next week. It includes a number of applications and tools that will be released for free download, including source code under the GPL licence. In short, these are exciting times for low-cost eye tracking and gaze interaction. Stay tuned..

Click on image to hear the radio interview (in Danish/Swedish)

Friday, November 21, 2008

Eye movement control of remote robot

Yesterday we demonstrated our gaze-navigated robot at the Microsoft Robotics event here at ITU Copenhagen. The "robot" transmits a video stream which is displayed on a client computer. By using an eye tracker we can direct the robot towards where the user is looking. The concept allows for human-machine interaction with a direct mapping of the user's intention. The Danish National TV (DR) came by today and recorded a demonstration, which will be shown tonight on the nine o'clock news. Below is a video that John Paulin Hansen recorded yesterday which demonstrates the system. Please note that the frame rate of the video stream was well below average at the time of recording; it worked better today. In the coming week we'll look into alternative solutions (suggestions appreciated). The project has been carried out in collaboration with Alexandre Alapetite from DTU. His low-cost, LEGO-based rapid mobile robot prototype gives interesting possibilities for testing human-computer and human-robot interaction.



The virgin tour around the ITU office corridor (on YouTube)



Available on YouTube

Friday, September 12, 2008

COGAIN 2008 Video

Some highlights from the visit to COGAIN 2008 in Prague last week, which was a great event. The video demonstrates the mobile solution integrating a head-mounted display and an eye tracker by Javier San Agustín, a sneak peek of the NeoVisus iTube interface running on the SMI iView X RED, a demonstration of the Neural Impulse Actuator from OCZ Technologies by Henrik Skovsgaard, and a demo of the gaze-controlled wheelchair developed by Falck Igel and Alea Technologies. Thanks to John Paulin Hansen for creating the video.

Friday, March 7, 2008

Inspiration: All Eyes on the Monitor (Mollenbach et al., 2008)

Going further with the Zooming User Interface (ZUI) is the prototype described in "All Eyes on the Monitor: Gaze Based Interaction in Zoomable, Multi-Scaled Information-Space" (E. Mollenbach, T. Stefansson, J. P. Hansen), developed at Loughborough University in the UK and the ITU INC, Denmark. It employs a gaze-based pan/zoom interaction style, which suits gaze interaction because it resolves the inaccuracy problem (target sizes increase when zooming in on them). Additionally, the results indicate that for certain tasks gaze-based interaction is faster than traditional mouse operation.



ABSTRACT
The experiment described in this paper shows a test environment constructed with two information spaces: one large, with 2000 nodes ordered in semi-structured groups, in which participants performed search and browse tasks; the other smaller and designed for precision zooming, where subjects performed target selection simulation tasks. For both tasks, modes of gaze- and mouse-controlled navigation were compared. The results of the browse and search tasks showed that the performances of the most efficient mouse and gaze implementations were indistinguishable. However, in the target selection simulation tasks the most efficient gaze control proved to be about 16% faster than the most efficient mouse control. The results indicate that gaze-controlled pan/zoom navigation is a viable alternative to mouse control in inspection and target exploration of large, multi-scale environments. However, supplementing mouse control with gaze navigation also holds interesting potential for interface and interaction design. Download paper (pdf)

The paper was presented at the annual International Conference on Intelligent User Interfaces (IUI), held in Maspalomas, Gran Canaria, 13-16 January 2008.

Wednesday, February 20, 2008

Inspiration: ZoomNavigator (Skovsgaard, 2008)

Following up on the StarGazer text entry interface presented in my previous post, another approach to using zooming interfaces is employed in ZoomNavigator (Skovsgaard, 2008). It addresses the well-known issue of using gaze as input on traditional desktop systems, namely inaccuracy and jitter. It is an interesting solution which relies on dwell-time execution, compared to the EyePoint system (Kumar & Winograd, 2007), which is described in the next post.

Abstract
The goal of this research is to estimate the maximum amount of noise of a pointing device that still makes interaction with a Windows interface possible. This work proposes zoom as an alternative activation method to the more well-known interaction methods (dwell and two-step-dwell activation). We present a magnifier called ZoomNavigator that uses the zoom principle to interact with an interface. Selection by zooming was tested with white noise in a range of 0 to 160 pixels in radius on an eye tracker and a standard mouse. The mouse was found to be more accurate than the eye tracker. The zoom principle applied allowed successful interaction with the smallest targets found in the Windows environment even with noise up to about 80 pixels in radius. The work suggests that the zoom interaction gives the user a possibility to make corrective movement during activation time eliminating the waiting time found in all types of dwell activations. Furthermore zooming can be a promising way to compensate for inaccuracies on low-resolution eye trackers or for instance if people have problems controlling the mouse due to hand tremors.


The sequence of images shows screenshots from ZoomNavigator zooming towards a Windows file called ZoomNavigator.exe.

The principles of ZoomNavigator are shown in the figure above. Zooming is used to focus on the attended object and eventually make a selection (an unambiguous action). ZoomNavigator allows actions similar to those found with a conventional mouse (Skovsgaard, 2008). The system is described in a conference paper titled "Estimating acceptable noise-levels on gaze and mouse selection by zooming". Download paper (pdf)
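
As a toy illustration of why zooming helps under pointing noise, here is a small simulation of my own, not taken from the paper: with heavy noise a tiny target is almost never hit, but once the region under the cursor is magnified, the effective target outgrows the noise and selections succeed. The numbers are loosely inspired by the paper's 12 x 12 pixel targets and noise radii of up to ~80 pixels, but they are not the experimental conditions.

```python
# Toy simulation of zoom-based selection under pointing noise (illustration only).
# Target size and noise radius are assumptions loosely inspired by the paper.
import random

def hit_rate(target_radius_px, noise_radius_px, zoom_factor=1.0, trials=10_000):
    """Fraction of attempts landing inside the target, with uniform circular noise."""
    effective_target = target_radius_px * zoom_factor  # zooming enlarges the target
    hits = 0
    for _ in range(trials):
        r = noise_radius_px * random.random() ** 0.5   # radius uniform over a disc
        if r <= effective_target:
            hits += 1
    return hits / trials

print(hit_rate(6, 80, zoom_factor=1))   # tiny target, heavy noise: rarely hit
print(hit_rate(6, 80, zoom_factor=16))  # after zooming in: nearly always hit
```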

Two-step zoom
The two-step zoom activation is demonstrated in the video below by IT University of Copenhagen (ITU) research director prof. John Paulin Hansen. Notice how the error rate is reduced by the zooming style of interaction, making it suitable for applications that need detailed discrimination. It might be slower, but error rates drop significantly.



"Dwell is the traditional way of making selections by gaze. In the video we compare dwell to magnification and zoom. While the hit rate is 10% with dwell on a 12 x 12 pixel target, it is 100% for both magnification and zoom. Magnification is a two-step process, though, while zoom only takes one selection. In the experiment, the initiation of a selection is done by pressing the spacebar. Normally, the gaze tracking system will do this automatically when the gaze remains within a limited area for more than approx. 100 ms."

For more information see the publications of the ITU.
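
The dwell behaviour described in the quote above, automatic selection once the gaze stays within a small area for roughly 100 ms, can be sketched as follows. This is a minimal sketch of the general idea; the 30-pixel tolerance and the class interface are my own assumptions, not the ITU implementation.

```python
# Sketch of dwell-time activation: select when the gaze stays within a small
# area for a minimum duration. The 100 ms figure follows the quote above;
# the 30 px tolerance is an assumption.
import math
import time

class DwellSelector:
    def __init__(self, dwell_ms=100, radius_px=30):
        self.dwell_ms = dwell_ms
        self.radius_px = radius_px
        self.anchor = None  # (x, y) where the current dwell started
        self.start = None   # timestamp of the current dwell

    def update(self, x, y, now=None):
        """Feed a gaze sample; returns the selection point when the dwell completes."""
        now = time.monotonic() if now is None else now
        moved_away = (self.anchor is None or
                      math.hypot(x - self.anchor[0], y - self.anchor[1]) > self.radius_px)
        if moved_away:
            self.anchor, self.start = (x, y), now  # gaze moved: restart the dwell
            return None
        if (now - self.start) * 1000 >= self.dwell_ms:
            self.anchor, self.start = None, None   # reset after firing a selection
            return (x, y)
        return None
```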

Inspiration: StarGazer (Skovsgaard et al, 2008)

A major area of research for the COGAIN network is enabling communication for people with disabilities. The Innovative Communication group at the IT University of Copenhagen continuously works on making gaze-based interaction technology more accessible, especially in the field of assistive technology.

The ability to enter text into the system is crucial for communication, and without hands or speech this is somewhat problematic. The StarGazer software aims at solving this by introducing a novel 3D approach to text entry. In December I had the opportunity to visit ITU and try StarGazer (among other things) myself; it is astonishingly easy to use. Within just a minute I was typing with my eyes. Rather than describing what it looks like, see the video below.
The associated paper is to be presented at the ETRA08 conference in March.



This introduces an important solution to the problem of eye tracker inaccuracy, namely zooming interfaces. Fixating on a specific region of the screen will display an enlarged version of that area, in which objects can be more easily discriminated and selected.

The eyes are incredibly fast but, from the perspective of eye trackers, not really precise. This is due to the physiological properties of our visual system, specifically the foveal region of the eye. This retinal area produces the sharp, detailed part of our visual field, which in practice covers about the size of a thumbnail at arm's length. To bring another area into focus a saccade takes place, which moves the pupil and thus our gaze; this is what is registered by the eye tracker. Hence the accuracy of most eye trackers is in the range of 0.5-1 degree (in theory, that is).
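
To put that 0.5-1 degree figure in perspective, here is a quick back-of-the-envelope conversion to on-screen pixels. The 60 cm viewing distance and 96 DPI monitor are my own illustrative assumptions, not measured values.

```python
# Back-of-the-envelope: how large is a 0.5-1 degree tracking error on screen?
# Assumes a 60 cm viewing distance and a 96 DPI monitor (illustrative values).
import math

def degrees_to_pixels(angle_deg, distance_cm=60.0, dpi=96.0):
    size_cm = 2 * distance_cm * math.tan(math.radians(angle_deg) / 2)
    return size_cm / 2.54 * dpi  # cm -> inches -> pixels

for angle in (0.5, 1.0):
    print(f"{angle:.1f} deg ~ {degrees_to_pixels(angle):.0f} px")
# Roughly 20-40 px of uncertainty: why small targets need zooming or large buttons.
```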

A feasible solution to this limitation in accuracy is to use the display space dynamically and zoom into the areas of interest upon glancing at them. The zooming interaction style solves some of the issues with the inaccuracy and jitter of eye trackers, but it has to be carefully balanced so that it still provides a quick and responsive interface.

However, the to me the novelty in the StarGazer is the notion of traveling through a 3D space, the sensation of movement really catches ones attention and streamlines the interaction. Since text entry is really linear character by character, flying though space by navigating to character after character is a suitable interaction style. Since the interaction is nowhere near the speed of two hand keyboard entry the employment of linguistic probabilities algorithms such as those found in cellphones will be very beneficial (ie. type two or three letters and the most likely words will display in a list) Overall, I find the spatial arrangement of gaze interfaces to be a somewhat unexplored area. Our eyes are made to navigate in a three dimensional world while the traditional desktop interfaces mainly contains a flat 2D view. This is something I intend to investigate further.