
Monday, November 1, 2010

ScanMatch: A novel method for comparing fixation sequences (Cristino et al., 2010)

Using algorithms designed to compare DNA sequences for eye movement comparison. Radical, and with MATLAB source code, fantastic! The method appears to be noise tolerant and to outperform the traditional Levenshtein distance.

Abstract
We present a novel approach to comparing saccadic eye movement sequences based on the Needleman–Wunsch algorithm used in bioinformatics to compare DNA sequences. In the proposed method, the saccade sequence is spatially and temporally binned and then recoded to create a sequence of letters that retains fixation location, time, and order information. The comparison of two letter sequences is made by maximizing the similarity score computed from a substitution matrix that provides the score for all letter pair substitutions and a penalty gap. The substitution matrix provides a meaningful link between each location coded by the individual letters. This link could be distance but could also encode any useful dimension, including perceptual or semantic space. We show, by using synthetic and behavioral data, the benefits of this method over existing methods. The ScanMatch toolbox for MATLAB is freely available online (www.scanmatch.co.uk).
  • Filipe Cristino, Sebastiaan Mathôt, Jan Theeuwes, and Iain D. Gilchrist
    ScanMatch: A novel method for comparing fixation sequences
    Behav Res Methods 2010 42:692-700; doi:10.3758/BRM.42.3.692
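
To get a feel for the core idea, here is a minimal Needleman–Wunsch sketch in C# (not the authors' MATLAB toolbox). The substitution score and gap penalty are left as parameters, since ScanMatch's actual contribution is a substitution matrix that encodes the relation between the spatial/temporal bins the letters stand for:

using System;

// Minimal Needleman-Wunsch alignment sketch for two letter-coded fixation
// sequences. Illustration only; not the ScanMatch implementation.
static double Align(string seqA, string seqB,
                    Func<char, char, double> substitutionScore,
                    double gapPenalty)
{
    int n = seqA.Length, m = seqB.Length;
    double[,] score = new double[n + 1, m + 1];

    // Border cells: aligning against an empty sequence costs gaps only.
    for (int i = 1; i <= n; i++) score[i, 0] = i * gapPenalty;
    for (int j = 1; j <= m; j++) score[0, j] = j * gapPenalty;

    // Each cell keeps the best of a (mis)match or a gap in either sequence.
    for (int i = 1; i <= n; i++)
        for (int j = 1; j <= m; j++)
        {
            double match = score[i - 1, j - 1]
                           + substitutionScore(seqA[i - 1], seqB[j - 1]);
            double gapA = score[i - 1, j] + gapPenalty;
            double gapB = score[i, j - 1] + gapPenalty;
            score[i, j] = Math.Max(match, Math.Max(gapA, gapB));
        }

    return score[n, m]; // maximal similarity score for the two sequences
}

With a toy score such as (a, b) => a == b ? 1.0 : -1.0 and a gap penalty of -1.0 this behaves like classic sequence alignment; in ScanMatch the score for each letter pair instead reflects, e.g., the distance between the screen regions the letters encode.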

Friday, September 11, 2009

An Adaptive Algorithm for Fixation, Saccade, and Glissade Detection in Eye-Tracking Data (Nyström M. & Holmqvist K, 2009)

From Marcus Nyström and Kenneth Holmqvist at the Lund University Humanities Lab (HumLab) in Sweden comes an interesting paper on a novel algorithm that is capable of detecting glissades (aka dynamic overshoot) in eye tracker data. These are wobbling eye movements often found at the end of saccades and have previously been considered errors in saccadic programming with limited value. Whatever their function, the phenomenon does exist and should be accounted for. The paper reports glissades following about half of all saccades during reading and scene viewing, with an average duration of 24 ms. This work is important as it extends the default categorization of eye movements, e.g. fixation, saccade, smooth pursuit, and blink. The algorithm is based on velocity saccade detection and is driven by the data while containing only a limited number of subjective settings. It contains a number of improvements, such as separate thresholds for peak detection and saccade onset/offset detection, adaptive threshold adjustment based on local noise levels, physical constraints on eye movements to exclude noise and jitter, and new recommendations for the minimum allowed fixation and saccade durations. Also important to note is that the data was obtained using a high-speed 1250 Hz SMI system; how the algorithm performs on a typical remote tracker running at 50-250 Hz has yet to be determined.
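
The data-driven flavor of the algorithm can be illustrated with a small sketch of an iteratively adapted peak-velocity threshold. Note that the initial threshold, the six-standard-deviation margin, and the 1 deg/s convergence criterion below are my reading of the approach; treat them as assumptions rather than a reimplementation:

using System;
using System.Linq;

// Sketch: adapt the peak-velocity threshold to the noise level of the data.
static double AdaptivePeakThreshold(double[] velocities) // velocities in deg/s
{
    double threshold = 100.0; // assumed starting point

    for (int iteration = 0; iteration < 100; iteration++)
    {
        // Treat samples below the current threshold as fixation noise.
        double[] below = velocities.Where(v => v < threshold).ToArray();
        double mean = below.Average();
        double sd = Math.Sqrt(below.Select(v => (v - mean) * (v - mean)).Average());

        // New candidate threshold: noise mean plus six standard deviations.
        double updated = mean + 6.0 * sd;
        if (Math.Abs(updated - threshold) < 1.0)
            return updated; // converged to within 1 deg/s
        threshold = updated;
    }
    return threshold;
}

Because the threshold is derived from the recording itself, noisier data automatically yields a higher, more conservative threshold, which is what makes the detection adaptive.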

Monday, March 3, 2008

Zooming and Expanding Interfaces / Custom components

The reviewed papers on zooming interaction styles inspired me to develop a set of zoom-based interface components. The interaction style is well suited for gaze, since it helps overcome the inaccuracy and jitter of eye movements. My intention is that the interface components should be completely standalone, customizable, and straightforward to use, ideally included in new projects by importing one file and writing one line of code.

The first component is a dwell-based menu button that, upon fixation, will a) provide a dwell-time indicator by animating a small glow effect surrounding the button image and b) after 200 ms expand an ellipse that houses the menu options. This produces a two-step dwell activation while making use of the display area in a much more dynamic way. The animation is put in place to keep the user's fixation on the button for the duration of the dwell time. The items in the menu are displayed when the ellipse has reached its full size.

This gives the user feedback in the parafoveal region, while at the same time the stopped glow of the button icon indicates that the dwell time has fully executed. (This is a bit hard to describe in words; it is perhaps easier to understand from the images below.) The parafoveal region of our visual field is located just outside the foveal region (where full-resolution vision takes place). The foveal area is about the size of a thumbnail at arm's-length distance; items in the parafoveal region can still be seen, but their resolution/sharpness is reduced. We do see them, but have to make a short saccade for them to come into full resolution. In other words, the menu items pop out at a distance that attracts a short saccade, which is easily discriminated by the eye tracker. (Just for fun: test your visual field.)
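
A bare-bones sketch of the two-step dwell logic is shown below; the timings, names, and event-based hookup are illustrative assumptions, not the finished component:

using System;
using System.Windows.Threading;

// Two-step dwell activation: a glow phase followed by the expanding menu.
public class DwellMenuButton
{
    private readonly DispatcherTimer dwellTimer = new DispatcherTimer();
    private DateTime fixationStart;
    private readonly TimeSpan glowDuration = TimeSpan.FromMilliseconds(200);

    public event EventHandler GlowStarted;  // hook the glow animation up here
    public event EventHandler MenuExpanded; // grow the ellipse, show the items

    public DwellMenuButton()
    {
        dwellTimer.Interval = TimeSpan.FromMilliseconds(20); // polling tick
        dwellTimer.Tick += OnDwellTick;
    }

    // Called when a fixation lands on the button.
    public void OnFixationEnter()
    {
        fixationStart = DateTime.Now;
        if (GlowStarted != null) GlowStarted(this, EventArgs.Empty);
        dwellTimer.Start();
    }

    // Called when the gaze leaves the button before the dwell completes.
    public void OnFixationLeave()
    {
        dwellTimer.Stop();
    }

    private void OnDwellTick(object sender, EventArgs e)
    {
        // Once the glow phase has run its course, expand the menu ellipse.
        if (DateTime.Now - fixationStart >= glowDuration)
        {
            dwellTimer.Stop();
            if (MenuExpanded != null) MenuExpanded(this, EventArgs.Empty);
        }
    }
}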

Before the button has received focus


Upon fixation, the button image displays an animated glow effect indicating the dwell progress. The image above illustrates how the menu items pop out on the ellipse surface at the end of the dwell. Note that the ellipse grows in size over a 300 ms period; the exact timing is configurable by passing a parameter in the XAML design page.
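
The "configurable from XAML" part boils down to exposing the timing as a dependency property. A hypothetical fragment (GazeMenuButton and ExpandDuration are assumed names):

// Fragment from a hypothetical GazeMenuButton control. Exposes the expand
// duration as a dependency property so it can be set in XAML, e.g.
// <local:GazeMenuButton ExpandDuration="0:0:0.3" />.
public static readonly DependencyProperty ExpandDurationProperty =
    DependencyProperty.Register(
        "ExpandDuration",
        typeof(Duration),
        typeof(GazeMenuButton),
        new PropertyMetadata(new Duration(TimeSpan.FromMilliseconds(300))));

public Duration ExpandDuration
{
    get { return (Duration)GetValue(ExpandDurationProperty); }
    set { SetValue(ExpandDurationProperty, value); }
}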

The second prototype I have been working on is also inspired by the use of expanding surfaces. It is a gaze-driven photo gallery where thumbnail-sized image previews become enlarged upon glancing at them. The enlarged view displays an icon which can be fixated on to make the photo appear in full size.

Displaying all the images in the user's "My Pictures" folder.


Second step: glancing at the photos. They are dynamically resized and can optionally be enlarged further.

Upon glancing at the thumbnails, they become enlarged, which activates the icon at the bottom of each photo. This enables the user to make a second fixation on the icon to bring the photo into a large view. The large view has two icons to navigate back and forth (previous/next photo). By fixating outside the photo, the view returns to the overview.
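
Under the hood the gallery is essentially a three-state machine (overview, enlarged thumbnail, full view). A rough sketch with assumed names:

// Rough state machine for the gaze photo gallery; all names are illustrative.
public enum GalleryState { Overview, Enlarged, FullView }

public class GazeGallery
{
    public GalleryState State { get; private set; }

    public GazeGallery()
    {
        State = GalleryState.Overview;
    }

    // Glancing at a thumbnail enlarges it and activates its icon.
    public void OnThumbnailGlance()
    {
        if (State == GalleryState.Overview) State = GalleryState.Enlarged;
    }

    // A second fixation on the icon brings the photo into the large view.
    public void OnIconFixation()
    {
        if (State == GalleryState.Enlarged) State = GalleryState.FullView;
    }

    // Fixating outside the photo returns to the overview.
    public void OnFixationOutsidePhoto()
    {
        State = GalleryState.Overview;
    }
}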

Tuesday, February 5, 2008

Day 3:1 - Quick fix for noisy data

Yesterday I was able to plot the recorded gaze data and visually illustrate it by drawing small rectangles where my gaze was positioned. As mentioned, the gaze data was rather noisy. Upon fixating on a specific area, the points would still spray out over an area about the size of a soda cap. I decided to have a better look at the calibration process, which is done in the iView program supplied by SMI. By running the process in a full-size window I was able to get a better calibration; increasing the number of calibration points also gives a better result.

Still, the gaze data is rather noisy, and this is a well-known problem within the field. It has previously been addressed by applying an algorithm to smooth the X and Y positions. My initial solution is to compare the received data with the last reading: if it is within a radius of 20 pixels, it is considered to be the same spot as the previous fixation.


if (isSmoothGazeDataOn)
{
    // If the new gaze X point is outside plus/minus the smooth radius,
    // accept it as the new gaze position; otherwise keep the previous one.
    if (newGazeX > this.gazePositionX + this.smoothRadius ||
        newGazeX < this.gazePositionX - this.smoothRadius)
    {
        this.gazePositionX = newGazeX;
    }

    // Same test for the Y coordinate.
    if (newGazeY > this.gazePositionY + this.smoothRadius ||
        newGazeY < this.gazePositionY - this.smoothRadius)
    {
        this.gazePositionY = newGazeY;
    }
}
else // No filtering: the gaze position equals the raw data.
{
    this.gazePositionX = newGazeX;
    this.gazePositionY = newGazeY;
}


It is a very simple function for stabilizing the gaze data (somewhat). A solution with a buffer and a function that averages over several readings might be better, but for now this does the trick (the gaze plotter became more stable when fixating on one specific area).
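
For reference, the buffered alternative could look something like the sketch below. It keeps the last N samples in a queue and returns their mean; the window size is an arbitrary assumption and has not been tuned against real data.

using System.Collections.Generic;
using System.Linq;

// Minimal moving-average smoother over the last N gaze samples.
public class GazeSmoother
{
    private readonly Queue<double> xBuffer = new Queue<double>();
    private readonly Queue<double> yBuffer = new Queue<double>();
    private readonly int windowSize;

    public GazeSmoother(int windowSize)
    {
        this.windowSize = windowSize; // e.g. 5-10 samples; assumed, not tuned
    }

    // Push a new raw sample and get the averaged position back.
    public void Smooth(double newGazeX, double newGazeY,
                       out double smoothX, out double smoothY)
    {
        xBuffer.Enqueue(newGazeX);
        yBuffer.Enqueue(newGazeY);
        if (xBuffer.Count > windowSize)
        {
            xBuffer.Dequeue();
            yBuffer.Dequeue();
        }
        smoothX = xBuffer.Average();
        smoothY = yBuffer.Average();
    }
}

A larger window gives a steadier cursor but adds lag, so the window size is a trade-off between stability and responsiveness.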