Friday, December 11, 2009

PhD Defense: Off-the-Shelf Gaze Interaction

Javier San Agustin will defend his PhD thesis, "Off-the-Shelf Gaze Interaction", at the IT University of Copenhagen on the 8th of January from 13:00 to (at most) 17:00. The program consists of a one-hour presentation followed by a discussion with the committee, formed by Andrew Duchowski, Bjarne Kjær Ersbøll, and Arne John Glenstrup. Afterwards, a traditional reception with snacks and drinks will be held.

Update: The thesis is now available as a PDF (179 pages, 3.6 MB).

Abstract of the thesis:


People with severe motor-skill disabilities are often unable to use standard input devices, such as a mouse or a keyboard, to control a computer, and they are therefore in strong need of alternative input devices. Gaze tracking offers them the possibility of using the movements of their eyes to interact with a computer, thereby making them more independent. A great deal of effort has gone into improving the robustness and accuracy of the technology, and many commercial systems are now available on the market.

Despite the great improvements that gaze tracking systems have undergone in recent years, high prices have prevented gaze interaction from becoming mainstream. The use of specialized hardware, such as industrial cameras or infrared light sources, increases the accuracy of the systems, but also the price, which keeps many potential users from gaining access to the technology. Furthermore, the different components must often be placed in specific locations, or are built into the monitor, decreasing the flexibility of the setup.

Gaze tracking systems built from low-cost and off-the-shelf components have the potential to facilitate access to the technology and bring the prices down. Such systems are often more flexible, as the components can be placed in different locations, but also less robust, due to the lack of control over the hardware setup and the lower quality of the components compared to commercial systems.

The work developed for this thesis deals with some of the challenges introduced by the use of low-cost and off-the-shelf components for gaze interaction. The main contributions are:
  • Development and performance evaluation of the ITU Gaze Tracker, an off-the-shelf gaze tracker that uses an inexpensive webcam or video camera to track the user's eye. The software is freely available as open source, offering the possibility to try out gaze interaction at low cost and to analyze, improve and extend the software by modifying the source code.
  • A novel gaze estimation method based on homographic mappings between planes. No knowledge about the hardware configuration is required, allowing for a flexible setup where camera and light sources can be placed at any location.
  • A novel algorithm to detect the type of movement the eye is performing, i.e. fixation, saccade or smooth pursuit. The algorithm is based on eye velocity and movement pattern, and allows the signal to be smoothed appropriately for each kind of movement, removing jitter due to noise while maximizing responsiveness.
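
The velocity-based part of that last contribution can be illustrated with a minimal sketch. The thresholds and the structure below are illustrative assumptions on my part, not the algorithm from the thesis (which also exploits movement patterns):

```python
# Minimal velocity-threshold classifier for eye movement events.
# Thresholds are illustrative; the thesis algorithm additionally uses
# movement patterns, which this sketch omits.

def classify_samples(positions, dt, saccade_thresh=300.0, pursuit_thresh=30.0):
    """Label each gaze sample as 'fixation', 'pursuit' or 'saccade'.

    positions: list of (x, y) gaze points in degrees of visual angle
    dt: sampling interval in seconds
    """
    labels = ["fixation"]  # first sample has no velocity estimate
    for (x0, y0), (x1, y1) in zip(positions, positions[1:]):
        # Instantaneous angular velocity in degrees per second.
        velocity = ((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5 / dt
        if velocity >= saccade_thresh:
            labels.append("saccade")
        elif velocity >= pursuit_thresh:
            labels.append("pursuit")
        else:
            labels.append("fixation")
    return labels
```

Once each sample is labeled, a smoothing filter can be chosen per label, e.g. heavy averaging during fixations and none during saccades.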

Tuesday, December 8, 2009

Scandinavian Workshop on Applied Eye-Tracking (SWAET) 2010

The first call for papers has been announced for the annual Scandinavian Workshop on Applied Eye-Tracking (SWAET), organized by Kenneth Holmqvist and the team at the Lund University Humanities Laboratory. SWAET 2010 will be held in Lund, Sweden, on May 5-7. The invited speakers are Gerry Altmann from the Department of Psychology at the University of York, UK, and Ignace Hooge from the Department of Psychology at Utrecht University, the Netherlands.

Visit the SWAET website for more information.

Update: Download the abstracts (PDF, 1 MB).

Tuesday, November 24, 2009

Remote tracker and 6DOF using a webcam

The following video clips demonstrate a Master's thesis project from the AGH University of Science and Technology in Cracow, Poland. The method provides 6 degrees of freedom (6DOF) head tracking and 2D eye tracking using a simple, low-resolution 640×480 webcam. Under the hood it is based on Lucas-Kanade optical flow and POSIT. A great start, as the head tracking seems relatively stable. Imagine it with IR illumination, a camera with slightly higher resolution and a narrow-angle lens, and, of course, pupil + glint tracking algorithms for calibrated gaze estimation.
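
For readers unfamiliar with Lucas-Kanade, the core of the tracking is a small least-squares problem per feature patch. The sketch below is a generic single-patch step in NumPy, not code from the project:

```python
import numpy as np

def lucas_kanade_step(prev, curr):
    """Estimate the (dx, dy) translation of a small patch between two frames.

    prev, curr: 2-D float arrays of equal shape (one image patch each).
    Solves the Lucas-Kanade least-squares flow equations for the patch.
    """
    # Spatial gradients of the previous frame, temporal gradient between frames.
    Iy, Ix = np.gradient(prev)
    It = curr - prev
    # Normal equations A^T A v = -A^T b, with A = [Ix Iy] and b = It.
    ATA = np.array([[np.sum(Ix * Ix), np.sum(Ix * Iy)],
                    [np.sum(Ix * Iy), np.sum(Iy * Iy)]])
    ATb = -np.array([np.sum(Ix * It), np.sum(Iy * It)])
    dx, dy = np.linalg.solve(ATA, ATb)
    return dx, dy
```

In practice this step is iterated over an image pyramid for larger motions (as in OpenCV's `calcOpticalFlowPyrLK`), and POSIT then recovers the 6DOF head pose from the tracked 2D feature positions and a rough 3D face model.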


Monday, November 23, 2009

ITU GazeTracker in the wild

Came across these two YouTube videos from students using the ITU GazeTracker in their HCI projects. By now the software has been downloaded 3,000 times and the forum has seen close to three hundred posts. It has been a good start, and better yet, a new version is in the making. It offers a complete network API for third-party applications, improved tracking performance, better camera control and a number of bug fixes (thanks for your feedback). It will be released when it's ready.







Thanks for posting the videos!

Wednesday, October 21, 2009

Nokia near-eye display gaze interaction update

The Nokia near-eye gaze interaction platform that I tried in Finland last year has been further improved. The cap used to support the weight has been replaced with a sturdy frame, and the overall prototype seems lighter and now incorporates headphones. The new gaze-based navigation interface supports photo browsing based on the Image Space application, allowing location-based access to user-generated content. See the video at the bottom for their futuristic concept. More information is available on the Nokia Research website. The prototype will be displayed at the International Symposium on Mixed and Augmented Reality (ISMAR) in Orlando, October 19-22.






Medical Image Perception Society 2009 - Day three

Session 10. Displays and Tools. Chair: Kevin Berbaum, PhD
  • Objective methodology to compare clinical value of computed tomography artifact reduction algorithms. G Spalla, C Marchessoux, M Vaz, A Ricker, & T Kimpe
  • LCD Spatial Noise Suppression: Large-field vs. ROI Image Processing. WJ Dallas, H Roehrig, J Fan, EA Krupinski, & J Johnson
Session 11. Displays and Tools. Chair: Miguel Eckstein, PhD
  • Stereoscopic Digital mammography: Improved Accuracy of Lesion Detection in Breast Cancer Screening. DJ Getty, CJ D’Orsi, & RM Pickett
  • Detectability in tomosynthesis projections, slices and volumes: Comparison of human observer performance in a SKE detection task. I Reiser, K Little, & RM Nishikawa
Thanks Craig, Miguel and Elizabeth for a wonderful event; I learned so much in just three days. Plenty of inspiration for future research.

Medical Image Perception Society 2009 - Day two

Session 6. Performance Measurement II. Chair: Matthew Freedman, MD, MBA
  • Coding of FDG Intensity as a 3-D Rendered Height Mapping to Improve Fusion Display of Co-Registered PET-CT Images. RM Shah, C Wood, YP Hu, & LS Zuckier
  • Estimation of AUC from Normally Distributed Rating Data with Known Variance Ratio. A Wunderlich & F Noo
  • Using the Mean-to-Variance Ratio as a Diagnostic for Unacceptably Improper Binormal ROC Curves. SL Hillis & KS Berbaum
Session 7. Performance Measurement II. Chair: Stephen Hillis, PhD
  • BI-RADS Data Should Not be Used to Estimate ROC Curves. Y Jiang & CE Metz
  • Estimating the utility of screening mammography in large clinical studies. CK Abbey, JM Boone, & MP Eckstein
  • Issues Related to the Definition of Image Contrast. DL Leong & PC Brennan
Session 8. Models of Perceptual processing. Chair: Yulei Jiang, PhD
  • Channelized Hotelling Observers for Detection Tasks in Multi-Slice Images. L Platiša, B Goossens, E Vansteenkiste, A Badano & W Philips
  • Channelized Hotelling observers adapted to irregular signals in breast tomosynthesis detection tasks. I Diaz, P Timberg, CK Abbey, MP Eckstein, FR Verdun, C Castella, & FO Bochud
  • Detecting Compression Artifacts in Virtual Pathology Images Using a Visual Discrimination Model. J Johnson & EA Krupinski
  • Automatic MRI Acquisition Parameters Optimization Using HVS-Based Maps. J Jacobsen, P Irarrázabal, & C Tejos
  • Parametric Assessment of Lesion Detection Using a Pre-whitened Matched Filter on Projected Breast CT Images. N Packard, CK Abbey, & JM Boone
  • Model Observers for Complex Discrimination Tasks: Deployment Assessment of Multiple Coronary Stents. S Zhang, CK Abbey, X Da, JS Whiting, & MP Eckstein
Session 9. Special Invited Session on Neuroscience and Medical Image Perception. Chair: Miguel Eckstein, PhD
  • Decoding Information Processing When Attention Fails: An Electrophysiological Approach. B Giesbrecht
  • Some Neural Bases of Radiological Expertise. SA Engel

Tuesday, October 20, 2009

Medical Image Perception Society 2009 - Day one

The first day of the Medical Image Perception Society conference, held biennially and this year in Santa Barbara, was filled with interesting talks, including plenty of research utilizing eye tracking as a means of obtaining data. The conference is hosted by Craig Abbey and Miguel Eckstein of the Department of Psychology at the University of California, Santa Barbara, in cooperation with Elizabeth Krupinski from the University of Arizona, who has performed extensive research on eye movements (among other things) in relation to medical imaging and radiology.

Session 1. Visual Search. Chair: Claudia Mello-Thoms, PhD
Session 2. Visual Search. Chair: Elizabeth Krupinski, PhD
  • Visual Search Characteristics of Pathology Residents Reading Dermatopathology Slides. J Law & C Mello-Thoms
  • Are you a good eye-witness? Perceptual differences between physicians and lay people. C Mello-Thoms
  • Eye movements and computer-based mammographic interpretation training. Y Chen & A Gale
Session 3. Perceptual Effects. Chair: David Manning, PhD
  • Nuisance levels of noise effects Radiologists Performance. MF Mc Entee, A O'Beirne, J Ryan, R Toomey, M Evanoff, D Chakraborty, D Manning, & PC. Brennan
  • Observer Performance in Stroke Interpretation: The Influence of Experience and Clinical Information in Multidimensional Magnetic Resonance Imaging. L Cooper, A Gale, J Saada, S Gedela, H Scott, & A Toms
  • Interpretation of wrist radiographs: A comparison between final year medical and radiography students. L Hutchinson, P Brennan & L Rainford
  • Tumor measurement for revised TNM staging of lung cancer. FL Jacobson, A Sitek, D Getty, & SE Seltzer
  • Does Reader Visual Fatigue Impact Performance? EA Krupinski & KS Berbaum
  • Ambient Temperature is an Important Consideration in the Radiology Reading Room. MF Mc Entee & S Gafoor
Session 4. Performance Measurement I. Chair: Dev Chakraborty, PhD
  • Perceptual indicators of the holistic view in pulmonary nodule detection. MW Pietrzyk, DJ Manning, T Donovan, & Alan Dix
  • An e-learning tutorial demonstrates significant improvements in ROC performance amongst naive observers in breast image interpretation. PBL Soh, PC Brennan, A Poulos, W Reed
  • Is an ROC-type Response Truly Always Better than a Binary Response? D Gur, AI Bandos, HE Rockette, ML Zuley, CM Hakim, DM Chough, MA Ganott
  • Recognition of Images in Reader Studies: How Well Can We Predict Which Will Be Remembered? T Miner Haygood, P O’Sullivan, J Ryan, E Galvan, J-M Yamal, M Evanoff, M McEntee, J Madewell, C Sandler, E Lano, & P Brennan
Session 5. Performance Measurement I. Chair: Alastair Gale, PhD
  • New classes of models with monotonic likelihood ratios. F Samuelson
  • Sample size estimation procedure for free-response (FROC) studies. DP Chakraborty & M Bath
  • Comparison of Four Methods (ROC, JAFROC, IDCA, and ROI) for Analysis of Free Response Clinical Data. F Zanca, DP Chakraborty, J Jacobs, G. Marchal, and H Bosmans
Feel free to post additional links in the comments. Slides will be posted as they become available.

Thursday, October 8, 2009

DoCoMo EOG update

While eye movement detection using EOG is nothing new, the latest demonstration by Japan's NTT DoCoMo illustrates recent developments in the field. The innovation here is the form factor, which is quite impressive. Typically, EOG is detected using electrodes placed around the eyes, as in Andreas Bulling's prototype demonstrated at CHI 09 in Boston. Now it can be done using tiny sensors inside the ear. Just compare it to the prototype demonstrated last year!







Thanks Roman for the links!

Monday, September 28, 2009

Wearable Augmented Reality System using Gaze Interaction (Park, Lee & Choi)

Came across this paper on a wearable system that employs a small eye tracker and a head-mounted display for augmented reality. I've previously posted a video on the same system. It's a future technology with great potential; only imagination sets the limit here. There is a lot of progress in image/object recognition and location awareness taking place right now (with all the associated non-trivial problems to solve!)


Abstract
"Undisturbed interaction is essential to provide immersive AR environments. There have been a lot of approaches to interact with VEs (virtual environments) so far, especially in hand metaphor. When the user's hands are being used for hand-based work such as maintenance and repair, necessity of alternative interaction technique has arisen. In recent research, hands-free gaze information is adopted to AR to perform original actions in concurrence with interaction [3, 4]. There has been little progress on that research, still at a pilot study in a laboratory setting. In this paper, we introduce such a simple WARS (wearable augmented reality system) equipped with an HMD, scene camera, eye tracker. We propose 'Aging' technique improving traditional dwell-time selection, demonstrate AR gallery – dynamic exhibition space with wearable system."
  • Park, H. M., Seok Han Lee, and Jong Soo Choi 2008. Wearable augmented reality system using gaze interaction. In Proceedings of the 2008 7th IEEE/ACM international Symposium on Mixed and Augmented Reality - Volume 00 (September 15 - 18, 2008). Symposium on Mixed and Augmented Reality. IEEE Computer Society, Washington, DC, 175-176. DOI= http://dx.doi.org/10.1109/ISMAR.2008.4637353
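
The traditional dwell-time selection that the paper's "Aging" technique builds on can be sketched as follows. This is a generic illustration of the baseline scheme, not the authors' implementation:

```python
# Generic dwell-time selection: a target is activated once the gaze has
# rested on it for a continuous dwell period. Illustrative only; the
# paper's "Aging" technique modifies this basic scheme.

class DwellSelector:
    def __init__(self, dwell_time=0.8):
        self.dwell_time = dwell_time   # seconds of continuous gaze required
        self.current = None            # target the gaze is currently on
        self.entered = None            # timestamp when the gaze entered it

    def update(self, target, timestamp):
        """Feed one gaze sample; return the target id on selection, else None."""
        if target != self.current:
            # Gaze moved to a new target (or off all targets): restart the clock.
            self.current, self.entered = target, timestamp
            return None
        if target is not None and timestamp - self.entered >= self.dwell_time:
            self.entered = float("inf")  # prevent repeated triggering
            return target
        return None
```

A common refinement, and the problem "Aging" addresses, is that a fixed dwell threshold trades off speed against accidental "Midas touch" selections.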