
Tuesday, August 11, 2009

ALS Society of British Columbia announces Engineering Design Awards (Canadian students only)

"The ALS Society of British Columbia has established three Awards to encourage and recognize innovation in technology to substantially improve the quality of life of people living with ALS (Amyotrophic Lateral Sclerosis, also known as Lou Gehrig’s Disease). Students at the undergraduate or graduate level in engineering or a related discipline at a post-secondary institution in British Columbia or elsewhere in Canada are eligible for the Awards. Students may be considered individually or as a team. Mentor Awards may also be given to faculty supervising students who win awards" (see Announcement)


Project ideas:
  • Low-cost eye tracker
    • Issue: Current commercial eye-gaze tracking systems cost thousands to tens of thousands of dollars. This high cost prevents potential users from accessing eye-gaze tracking tools. The hardware components required for eye-gaze tracking do not justify the price, and a lower-cost alternative is desirable. Webcams may be used for low-cost imaging, along with simple infrared diodes for system lighting. Alternatively, visible-light systems may also be investigated. Open-source eye-gaze tracking software is also available. (ed: ITU GazeTracker, OpenEyes, Track Eye, OpenGazer and MyEye (free, no source))
    • Goal: The goal of this design project is to develop a low-cost and usable eye-gaze tracking system based on simple commercial off-the-shelf hardware.
    • Deliverables: A working prototype of a functional, low-cost (< $200), eye-gaze tracking system.
  • Eye-glasses compensation
    • Issue: The use of eye-glasses can cause considerable problems in eye-gaze tracking. The issue stems from reflections off the eye-glasses due to the use of controlled infrared lighting (on and off axis light sources) used to highlight features of the face. The key features of interest are the pupils and glints (or reflections of the surface of the cornea). Incorrectly identifying the pupils and glints then results in invalid estimation of the point-of-gaze.
    • Goal: The goal of this design project is to develop techniques for either: 1) avoiding image corruption with eye-glasses on a commercial eye-gaze tracker, or 2) developing a controlled lighting scheme to ensure valid identification of the pupil and glints in the presence of eye-glasses.
    • Deliverables: Two forms of deliverables are possible: 1) A working prototype illustrating functional eye-gaze tracking in the presence of eye-glasses with a commercial eye-gaze tracker, or 2) A working prototype illustrating accurate real-time identification of the pupil and glints using controlled infrared lighting (on and off axis light sources) in the presence of eye-glasses.
  • Innovative selection with ALS and eye gaze
    • Issue: As mobility steadily decreases in the more advanced stages of ALS, alternative techniques for selection are required. Current solutions include head switches, sip-and-puff switches and dwell-time activation, depending on the degree of mobility loss, to name a few. The use of dwell time requires no mobility other than eye motion; however, this technique suffers from 'lag', in that the user must wait the dwell-time duration for each selection, as well as the 'Midas touch' problem, in which unintended selection occurs if the gaze point remains stationary for too long.
    • Goal: The goal of this design project is to develop a technique for improved selection with eye-gaze for individuals with only eye-motion available. Possible solutions may involve novel HCI designs for interaction, including various adaptive and predictive technologies, the consideration of contextual cues, and the introduction of ancillary inputs, such as EMG, EEG.
    • Deliverables: A working prototype illustrating eye-motion only selection with a commercial eye-gaze tracking system.
  • Novel and valuable eye-gaze tracking applications and application enhancements
    • Issue: To date, relatively few gaze-tracking applications have been developed. These are mostly simplistic applications, such as the tedious typing of words, and even in such systems little is done to ease the effort required; e.g., systems typically do not allow for the saving and reuse of words and sentences.
    • Goal: The goal of this design project is to develop one or more novel applications or application enhancements that take gaze as input, and that provide new efficiencies or capabilities that could significantly improve the quality of life of those living with ALS.
    • Deliverables: A working prototype illustrating one or more novel applications that take eye motion as an input. The prototype must be developed and implemented to the extent that the potential efficiencies and/or reductions in effort can be evaluated by persons living with ALS and others on an evaluation panel.

    See the Project Ideas for more information. For contact information see page two of the announcement.

Wednesday, March 26, 2008

Last week of prototyping

Back from a short Easter holiday, the final week of developing the prototype is here. The functionality of the interface is OK, but there is major tweaking and bug testing to be performed. The deadline for any form of new features is Monday, March 31st. I intend to use all of April for setting up the evaluation experiments and procedures. As always, there is so much more that I would like to incorporate; every day brings new ideas.

The final version will include:
  • A configurable dwell-button component. This enables drag-and-drop dwell buttons in Windows projects, with individual configuration of dwell time, icons, etc.

  • A novel gaze-based menu system that uses saccade selection (two-step dwell). This highly configurable interface component displays itself when activated. It aims at solving the Midas touch problem while making better use of the screen real estate. The two steps in the activation process can be set to specific activation speeds (dwell times). More on this later...

  • A gaze-based memory game using a 36-card layout. Performing a dwell "turns" a card over and shows its symbol (a flag). The user then selects another card: if the two match, they are removed; if they differ, they are turned back over. It has some nice graphical effects. A sketch of the matching logic follows this list.

  • A gaze-based picture viewer that zooms into photos at the user's point of gaze. More on this later.

  • A gaze-based media/music player. This component will scan the computer for artists, albums and songs. These items are then accessible through a gaze-driven interface where the user can create playlists and perform the usual functions (volume up/down, pause, stop, next, previous, etc.).

  • Perhaps just one little surprise more..
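
Here is that sketch of the memory game's matching logic in bare-bones C#. The class and member names (MemoryGame, Card, Select) are placeholders of my own; in the real prototype selection is of course driven by a dwell on a card rather than a direct method call, and the graphics are handled separately.

using System;
using System.Collections.Generic;
using System.Linq;

// Bare-bones sketch of the matching logic; names are placeholders.
public class MemoryGame
{
    private class Card
    {
        public string Symbol;   // e.g. a flag
        public bool FaceUp;
        public bool Removed;
    }

    private readonly List<Card> cards;
    private Card firstPick;

    public MemoryGame(IEnumerable<string> symbols)
    {
        // 18 symbols, each used twice, shuffled into the 36-card layout.
        var rng = new Random();
        cards = symbols.Concat(symbols)
                       .OrderBy(_ => rng.Next())
                       .Select(s => new Card { Symbol = s })
                       .ToList();
    }

    // Called when a dwell "turns" the card at the given index.
    public void Select(int index)
    {
        var card = cards[index];
        if (card.Removed || card.FaceUp) return;
        card.FaceUp = true;

        if (firstPick == null)
        {
            firstPick = card;                            // wait for the second card
        }
        else if (firstPick.Symbol == card.Symbol)
        {
            firstPick.Removed = card.Removed = true;     // match: remove both
            firstPick = null;
        }
        else
        {
            firstPick.FaceUp = card.FaceUp = false;      // no match: turn both back
            firstPick = null;
        }
    }
}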

Eight weeks so far. Curiosity, dedication and plain ol' hard work, nothing else to it.

Monday, March 17, 2008

Inspiration: Takehiko Ohno

Working out of NTT Cyber Solutions Laboratories in Japan, Takehiko Ohno's interests lie mainly in eye-tracking technology and human-computer interaction. He has published several papers on eye-tracking technology and interaction methods. The QuickGlance selection method aims to solve the well-known Midas touch problem. The interface contains a specific selection area next to each choice/item, which must be fixated to activate the function. There are two major advantages to this. First, the user can look around at the menu items without worrying about accidentally activating something. Second, advanced users can go for the activation area directly without even reading the menu text, just as most people know the order/location of items on the Windows Start menu. On the downside, this means that all options are displayed on the screen all the time.
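
To make the layout idea concrete, here is a rough C# sketch of my own interpretation of it (not Ohno's implementation): each entry pairs a freely readable label with a separate activation area, and only a dwell on that area, simulated below with mouse hover, invokes the command.

using System;
using System.Windows;
using System.Windows.Controls;
using System.Windows.Media;
using System.Windows.Threading;

// Rough illustration of a QuickGlance-style menu entry: the label can be
// read freely, only the separate activation area triggers the command.
public static class QuickGlanceItem
{
    public static StackPanel Create(string text, Action command)
    {
        var panel = new StackPanel { Orientation = Orientation.Horizontal };

        // The label: looking at it does nothing.
        panel.Children.Add(new TextBlock { Text = text, Margin = new Thickness(4) });

        // The activation area: a dwell here (mouse hover stands in for
        // gaze in this sketch) is what actually invokes the command.
        var activationArea = new Border
        {
            Width = 24,
            Height = 24,
            Background = Brushes.LightGray
        };

        var dwellTimer = new DispatcherTimer { Interval = TimeSpan.FromMilliseconds(400) };
        dwellTimer.Tick += (s, e) => { dwellTimer.Stop(); command(); };
        activationArea.MouseEnter += (s, e) => dwellTimer.Start();
        activationArea.MouseLeave += (s, e) => dwellTimer.Stop();

        panel.Children.Add(activationArea);
        return panel;
    }
}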

Takehiko has additionally published several articles on FreeGaze, a remote system which allows the user to move his head around freely. The FreeGaze eye tracker at NTT has an accuracy of 0.28 degrees and is based on a rather wide stereoscopic corneal-reflection method using several image-processing algorithms described in the papers, which are well written and worth reading.

His research highlights the importance of providing feedback to the user as the major method of reducing error rates. Something that I've taken to heart.

Monday, March 10, 2008

Inspiration: Professor Rob Jacob (Mr. Midas Touch)


Rob Jacob, currently at Tufts University, took an early interest in gaze-based interaction. Jacob has a long list of honors, publications and presentations in the HCI field. Having coined the "Midas touch" analogy and established many other fundamental aspects of gaze interaction, he clearly deserves an introduction.


From his homepage
"Robert Jacob is a Professor of Computer Science at Tufts University, where his research interests are new interaction media and techniques and user interface software. He is currently also a visiting professor at the Universite Paris-Sud, and he was a visiting professor at the MIT Media Laboratory, in the Tangible Media Group, and continues collaboration with that group. Before coming to Tufts, he was in the Human-Computer Interaction Lab at the Naval Research Laboratory. He received his Ph.D. from Johns Hopkins University, and he is a member of the editorial board of Human-Computer Interaction and the ACM Transactions on Computer-Human Interaction. He was Papers Co-Chair of the CHI 2001 conference, Co-Chair of UIST 2007, and Vice-President of ACM SIGCHI. He was elected to the ACM CHI Academy in 2007, an honorary group of people who have made extensive contributions to the study of HCI and have led the shaping of the field."

Research topics:
Human-Computer Interaction
New Interaction Techniques and Media
Tangible User Interfaces
Virtual Environments
User Interface Software
Information Visualization

Must-read:

R.J.K. Jacob, “The Use of Eye Movements in Human-Computer Interaction Techniques: What You Look At is What You Get,” ACM Transactions on Information Systems, vol. 9, no. 3, pp. 152-169 (April 1991) [link]

L.E. Sibert and R.J.K. Jacob, “Evaluation of Eye Gaze Interaction,” Proc. ACM CHI 2000 Human Factors in Computing Systems Conference, pp. 281-288, Addison-Wesley/ACM Press (2000). [link]

Thursday, February 7, 2008

Midas touch, Dwell time & WPF Custom controls

How do you make a distinction between users merely glancing at objects and fixations with the intention to start a function? This is the well-known problem usually referred to as the "Midas touch problem". Interfaces that rely on gaze as the only means of input must support this style of interaction and be capable of distinguishing between glances, when the user is just looking around, and fixations with the intention to start a function.

There are some solutions to this. A frequently occurring one is the concept of "dwell time", where you can activate functions simply by a prolonged fixation on an icon/button/image, usually in the range of 400-500 ms or so. This is a common solution when developing gaze interfaces as assistive technology for users suffering from Amyotrophic Lateral Sclerosis (ALS) or other "paralyzing" conditions where no modality other than gaze input can be used. It does come with some issues: the prolonged fixation means that the interaction is slower, since the user has to sit through the dwell time for each selection, but mainly it adds stress to the interaction, since everywhere you look seems to activate a function.

As part of my intention to develop a set of components for the interface, a dwell-based interaction style should be supported. It may not be the best method, but I do wish to have it in the drawer, just to have the opportunity to evaluate it and perform experiments with it.

The solution I've started working on goes something like this: upon a fixation on the object, an event is fired. The event launches a new thread which aims at determining whether the fixation stays within the area long enough to be considered a dwell (the gaze data is noisy). Halfway through, it checks whether the area has received enough fixations to continue; otherwise it aborts the thread. At the end, it checks whether fixations have resided within the area for more than 70% of the time; in that case, it activates the function.
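
In rough C# the idea looks something like the sketch below. The names (DwellDetector, ReportSample, BeginDwell) and the exact halfway threshold are simplifications of my own for illustration; the real component is fed live gaze samples from the tracker.

using System;
using System.Threading;

// Sketch of the dwell-detection logic described above; names and
// thresholds are illustrative only.
public class DwellDetector
{
    private int samplesInside;   // gaze samples that fell inside the area
    private int samplesTotal;    // all gaze samples seen during this dwell

    // Called by the gaze loop for every new sample while a dwell is pending.
    public void ReportSample(bool insideArea)
    {
        Interlocked.Increment(ref samplesTotal);
        if (insideArea) Interlocked.Increment(ref samplesInside);
    }

    // Fired when a fixation first lands on the object: a worker thread
    // decides whether the fixation turns into a dwell.
    public void BeginDwell(TimeSpan dwellTime, Action activate)
    {
        samplesInside = 0;
        samplesTotal = 0;

        new Thread(() =>
        {
            // Wait half the dwell time, then abort early if the area has
            // clearly lost the user's gaze (the data is noisy, so only a
            // rough majority is required here).
            Thread.Sleep(TimeSpan.FromMilliseconds(dwellTime.TotalMilliseconds / 2));
            if (samplesTotal == 0 || samplesInside < samplesTotal / 2)
                return;

            // Wait out the second half, then require that at least 70% of
            // the samples stayed within the area before activating.
            Thread.Sleep(TimeSpan.FromMilliseconds(dwellTime.TotalMilliseconds / 2));
            if (samplesTotal > 0 && samplesInside >= 0.7 * samplesTotal)
                activate();
        }) { IsBackground = true }.Start();
    }
}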

Working with threads can be somewhat tricky, and some time for tweaking remains. In addition, getting the interaction to feel right and providing suitable feedback is important. I'm investigating means of a) providing feedback on which object is selected, b) indicating that the dwell process has started and what state it is in, and c) animations that help the fixation remain in the center of the object.

--------------------------------------------------------------
Coding> Windows Presentation Foundation and Custom Controls

Other progress has been made in learning Windows Presentation Foundation (WPF) user interface development. Microsoft Expression Blend is a tool that enables graphical design of components. By creating generic objects such as gaze buttons, the overall process will benefit in the long run. Instead of having all of the objects defined in a single file, it is possible to break them into separate projects and later just reference the component DLL and use one line of code to include the control in the design.

Additional styling and modification of the object's size can then be performed as if it were a default button. Furthermore, by creating dependency properties in the C# code-behind of each control, component-specific functionality can be easily accessed from the main XAML design layout. It does take a bit longer to develop but will be worth it tenfold later on. More on this further on.
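
As a small illustration of the dependency-property part, the sketch below shows how a hypothetical GazeButton control could expose a DwellTimeMs setting (the names are placeholders, not the actual component) so that it can be set straight from the XAML design layout once the component DLL is referenced.

using System.Windows;
using System.Windows.Controls;

// Sketch of the dependency-property pattern; GazeButton and DwellTimeMs
// are placeholder names for illustration.
public class GazeButton : Button
{
    public static readonly DependencyProperty DwellTimeMsProperty =
        DependencyProperty.Register(
            "DwellTimeMs",
            typeof(int),
            typeof(GazeButton),
            // The callback lets the code-behind react when the value is
            // changed from XAML or from code.
            new PropertyMetadata(500, OnDwellTimeMsChanged));

    public int DwellTimeMs
    {
        get { return (int)GetValue(DwellTimeMsProperty); }
        set { SetValue(DwellTimeMsProperty, value); }
    }

    private static void OnDwellTimeMsChanged(
        DependencyObject d, DependencyPropertyChangedEventArgs e)
    {
        // e.g. restart an internal dwell timer with the new duration.
    }
}

// In XAML the control is then used as a single element, for example:
//   <gaze:GazeButton DwellTimeMs="650" Content="Play" />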

Microsoft Expression Blend. Good companion for WPF and XAML development.

Windows Presentation Foundation (WPF) has proven to be more flexible and useful than it seemed at first glance. You can really do complex things swiftly with it. The following links have proven to be great resources for learning more about WPF.

Lee Brimelow, an experienced developer who took part in the new Yahoo Messenger client. Excellent screencasts that take you through the development process.
http://www.contentpresenter.com and http://www.thewpfblog.com

Kirupa Chinnathambi, introduction to WPF, Blend and a nice blog. http://www.kirupa.com/blend_wpf/index.htm and http://blog.kirupa.com/

Microsoft MIX07, 72 hours of talks about the latest tools and tricks. http://sessions.visitmix.com/