Showing posts with label evaluation.

Wednesday, May 13, 2009

Hi-fi eyetracking with a lo-fi eyetracker: An experimental usability study of an eyetracker built from a standard web camera (Barret, M., 2009)

Marie Barret, a master's student at the IT University of Copenhagen, has now finished her thesis. It evaluates eye typing performance using the ITU Gaze Tracker (a low-cost webcam eye tracker) in the StarGazer and GazeTalk interfaces. The thesis is written in Danish (113 pages), but I took the liberty of translating two charts from it, found below. The results will be presented in English at the COGAIN 2009 conference on May 26th (session three, track one, at 1:50 PM). For now I quote the abstract:

"Innovation has facilitated sufficient mainstream technology to build eyetrackers from off-the-shelf-components. Prices for standard eyetrackers start at around € 4000. This thesis describes an experimental usabilty study of gazetyping with a new input device built from a standard web camera without hardware modifications. Cost: € 20. Mainstreaming of assistive technologies holds potential for faster innovation, better service, lower prices and increased accessibility. Off-the-shelf-eyetrackers must be usability competitive to standard eyetrackers in order to be adopted, as eyetracking - even with expensive hardware - presents usability issues. Usability is defined as effectiveness, efficiency and user satisfaction (ISO 9242-11, 1998).

Results from the 2 × 2 factor experiment indicate that the new input device can reach the usability standards of expensive eyetrackers. This study demonstrates that the off-the-shelf eyetracker can achieve efficiency similar to an expensive eyetracker, with no significant effect from any of the tested factors. All four factors have a significant impact on effectiveness. A factor that can eliminate the effectiveness difference between the standard hardware and an expensive eyetracker is identified; another factor can additionally improve effectiveness.

Two gazetyping systems specifically designed for noisy conditions, e.g. due to bad calibration and jolting, are tested. StarGazer uses a zooming interface, while GazeTalk uses large buttons in a static graphical user interface. GazeTalk is significantly more effective than StarGazer. The large on-screen buttons and static interface of GazeTalk with dwell-time activation absorb the noise from the input device, and the typing speeds obtained are comparable to prior research with a regular eyetracker. Click activation has for years (Ware & Mikaelian, 1987) been shown to improve the efficiency of gaze-based interaction; this experiment demonstrates that the result applies significantly to off-the-shelf eyetrackers as well. The input device relies on the user to compensate for offset with head movements, and the keyboards should support this task with a static graphical user interface." Download thesis as pdf (in Danish)
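As an aside, the dwell-time activation that GazeTalk relies on is conceptually simple: a key is typed once the gaze has rested on its button for a continuous dwell period, which is exactly what lets large buttons absorb noise. Here is a minimal Python sketch of the principle; the class, the 0.5-second threshold and the coordinate handling are my own illustrative assumptions, not taken from the thesis:

```python
DWELL_TIME = 0.5  # seconds of continuous gaze needed to trigger (illustrative)

class DwellButton:
    """One on-screen key activated by dwell: gaze must stay inside the
    button's bounds for DWELL_TIME seconds without leaving."""
    def __init__(self, label, x, y, w, h):
        self.label = label
        self.bounds = (x, y, w, h)
        self._entered = None            # time the gaze entered, if inside

    def contains(self, gx, gy):
        x, y, w, h = self.bounds
        return x <= gx <= x + w and y <= gy <= y + h

    def update(self, gx, gy, now):
        """Feed one gaze sample; returns the label when dwell completes."""
        if not self.contains(gx, gy):
            self._entered = None        # gaze left the button: reset timer
            return None
        if self._entered is None:
            self._entered = now         # gaze entered: start the clock
        elif now - self._entered >= DWELL_TIME:
            self._entered = None        # fire once, then re-arm
            return self.label
        return None
```

Large buttons, as in GazeTalk, make the `contains()` test forgiving of exactly the offset and jitter the abstract describes.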

Tuesday, November 11, 2008

Gaze vs. Mouse in Games: The Effects on User Experience (Gowases T, Bednarik R, Tukiainen M)

Tersia Gowases, Roman Bednarik (blog) and Markku Tukiainen at the Department of Computer Science and Statistics, University of Joensuu, Finland got a paper published in the proceedings of the 16th International Conference on Computers in Education (ICCE).

"We did a simple questionnaire-based analysis. The results of the analysis show some promises for implementing gaze-augmented problem-solving interfaces. Users of gaze-augmented interaction felt more immersed than the users of other two modes - dwell-time based and computer mouse. Immersion, engagement, and user-experience in general are important aspects in educational interfaces; learners engage in completing the tasks and, for example, when facing a difficult task they do not give up that easily. We also did analysis of the strategies, and we will report on those soon. We could not attend the conference, but didn’t want to disappoint eventual audience. We thus decided to send a video instead of us. " (from Romans blog)




Abstract
"The possibilities of eye-tracking technologies in educational gaming are seemingly endless. The question we need to ask is what the effects of gaze-based interaction on user experience, strategy during learning and problem solving are. In this paper we evaluate the effects of two gaze based input techniques and mouse based interaction on user experience and immersion. In a between-subject study we found that although mouse interaction is the easiest and most natural way to interact during problemsolving, gaze-based interaction brings more subjective immersion. The findings provide a support for gaze interaction methods into computer-based educational environments." Download paper as PDF.


Some of this research has also been presented within the COGAIN association, see:
  • Gowases Tersia (2007) Gaze vs. Mouse: An evaluation of user experience and planning in problem solving games. Master’s thesis May 2, 2007. Department of Computer Science, University of Joensuu, Finland. Download as PDF

Sunday, August 24, 2008

GaCIT in Tampere, day 5.

On Friday, the last day of GaCIT, Ed Cutrell from Microsoft Research gave a talk on usability evaluation and how eye tracking can deliver a deeper understanding. While it has sometimes been abused to convince management with pretty heat-map pictures, it adds value to a design inquiry as an additional source of behavioral evidence. Careful consideration of the experiment design is needed: studies in the lab sometimes lack the ecological validity of real in-the-field research, more on this further on.

The ability to manipulate independent variables and to enforce consistency and control are important concerns. For example, running a web site test against the live online site may produce faulty data, since the content of the site may change between visits. This is referred to as stimulus sensitivity; using a fixed snapshot increases between-subjects power, since all subjects are exposed to exactly the same stimuli. Another issue is task sensitivity: the task must reflect what the results are supposed to illustrate (e.g., simply reading a text does not involve any manipulation). People are in general very task oriented; instructed to read, they will ignore certain elements (e.g., banners).

A couple of real-world examples were introduced, including the Fluent UI (Office 2007), Phlat, and Search Engine Results Pages (SERPs).

The Fluent UI is the new interface introduced in Office 2007. It represents a big change from the traditional Office interface: it is task and context dependent, compared to the rather static traditional setup of menu bars and icons cluttering the screen.

Example of the Fluent UI (Microsoft, 2008)

The use of eye trackers illustrated how users interacted with the interface, which is not always in the manner the designer intended. Visualizations of eye movements give developers and designers a lot of instant aha experiences.

At Microsoft it is common to work with personas in multiple categories. These are abstract representations of user groups that help illustrate the lives and needs of "typical" users. For example, Nicolas is a tech-savvy IT professional, while Jennifer is a young hip girl who spends a lot of time on YouTube or hangs around town with her shiny iPod (err.. Zune, that is).

More information on the use of personas as a design method:
  • J. Grudin, J. Pruitt (2002) Personas, Participatory Design and Product Development: An Infrastructure for Engagement (Microsoft Research) Download as Word doc.

  • J. Grudin (2006) Why Personas Work: The Psychological Evidence (Microsoft Research) Download as Word doc.
Moving on, the Phlat project aims at solving the issues surrounding navigating and searching large amounts of personal data, sometimes up to 50 GB. Eye trackers were used to evaluate users' behavior with the interface. Since the information managed by the application is personal, there were several privacy issues, and copying all of it onto the lab computers was not a feasible solution. Instead, the participants used Remote Desktop, which allowed the lab computers to be hooked up to the participants' personal computers. The eye trackers then recorded the local monitor, which displayed the remote computer screen. This gives much higher ecological validity, since the information used has personal/affective meaning.

Phlat - interface for personal information navigation and search (Microsoft)
The use of eye trackers for evaluating websites has been demonstrated in several projects, such as J. Nielsen's F-Shaped Pattern For Reading Web Content and Enquiro's Search Engine Results (Golden Triangle). Ed Cutrell decided to investigate how search engine results pages are viewed and what strategies users follow. The results gave some interesting insight into how the decision-making process unfolds and which links are seen vs. clicked. Much of the remaining part of the talk concerned the design, execution and results of the study; great stuff!

Unfortunately I had to catch a flight back home in the afternoon, so I missed Howell Istance's last talk. However, I'll get a new opportunity to hear one of his excellent presentations in a week's time at COGAIN 2008.

Saturday, August 23, 2008

GaCIT in Tampere, day 3.

In the morning Howell Istance of De Montfort University, currently at the University of Tampere, gave a very interesting lecture on gaze interaction. It was divided into three parts: 1) games, 2) mobile devices, and 3) stereoscopic displays.

Games
This is an area of gaze interaction with high potential: the gaming industry has grown into a huge industry, and it may help make eye trackers accessible and affordable. The development would also benefit users with motor impairments. A couple of example implementations were then introduced. The first one was a first-person shooter running on an Xbox 360.
The experimental evaluation contained 10 repeated trials (6 subjects) to study learning effects. Three different configurations were used: 1) gamepad moving and aiming (no gaze), 2) gamepad moving and gaze aiming, and 3) gamepad moving forward only, with gaze aiming and steering the movement.
Results: twice as many shots were fired and missed in the gaze condition, which can be described as a "machine gun" approach. Noteworthy is that no filtering was applied to the gaze position.
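Since no filtering was applied, it is worth sketching the kind of smoothing commonly used to tame a noisy gaze signal before aiming with it. A minimal sketch, assuming a plain moving average; the window size is my own illustrative choice, and real systems often use proper fixation detection or saccade-aware filters instead:

```python
from collections import deque

class GazeSmoother:
    """Moving-average filter over the last N raw gaze samples.
    Too small a window leaves jitter; too large a window adds lag."""
    def __init__(self, window=5):
        self.xs = deque(maxlen=window)  # most recent x coordinates
        self.ys = deque(maxlen=window)  # most recent y coordinates

    def update(self, x, y):
        """Feed one raw sample, get back the smoothed position."""
        self.xs.append(x)
        self.ys.append(y)
        return sum(self.xs) / len(self.xs), sum(self.ys) / len(self.ys)
```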
Howell has conducted an analysis of common tasks in gaming; below is a representation of the number of actions in the Guild Wars game. The two bars indicate 1) novices and 2) experienced users.

Controlling all of these different actions requires switching of task mode, which is very challenging with only one input modality (gaze) and no method of "clicking".

There are several ways a gaze interface can be constructed, from the bottom up. First, the position of gaze can be used to emulate the mouse cursor (at the system level). Second, a transparent overlay can be placed on top of the application. Third, a specific gaze interface can be developed (which has been my own approach); this requires a modification of the original application, which is not always possible.

The Snap/Clutch interaction method, developed by Stephen Vickers who works with Howell, operates at the system level to emulate the mouse. It allows specific gaze gestures to be interpreted and used to switch mode. For example, a quick glance off the left side of the screen activates a left-mouse-button click mode; when an eye fixation is then detected in a specific region, a left mouse click is issued to that area.
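To make the idea concrete, here is a rough Python sketch of an edge-glance mode switch combined with fixation-triggered clicking. This is my own illustration of the concept, not Vickers' implementation; the pyautogui calls and all thresholds are assumptions:

```python
import time
import pyautogui  # assumed here for OS-level mouse emulation

EDGE = 50              # px: glance zone at the left screen edge (illustrative)
FIXATION_TIME = 0.4    # s: dwell needed to count as a fixation (illustrative)
FIXATION_RADIUS = 40   # px: max spread of samples within one fixation

mode = "idle"          # becomes "left_click" after a glance off the left edge
fix_start = fix_pos = None

def on_gaze_sample(x, y):
    """Feed raw gaze samples; emulates a moded, Snap/Clutch-style click."""
    global mode, fix_start, fix_pos
    if x < EDGE:                          # quick glance to the left edge...
        mode = "left_click"               # ...arms the left-click mode
        fix_start = fix_pos = None
        return
    now = time.time()
    if fix_pos is not None and \
       abs(x - fix_pos[0]) < FIXATION_RADIUS and \
       abs(y - fix_pos[1]) < FIXATION_RADIUS:
        # Still fixating: issue a click once the dwell time is reached.
        if mode == "left_click" and now - fix_start >= FIXATION_TIME:
            pyautogui.click(fix_pos[0], fix_pos[1])
            mode, fix_start, fix_pos = "idle", None, None
    else:
        fix_start, fix_pos = now, (x, y)  # new candidate fixation
```

The moded design is what sidesteps the Midas touch problem: gaze only clicks after an explicit gesture has armed the click mode.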

When this is applied to games such as World of Warcraft (demo), specific regions of the screen can be used to issue movement actions in that direction. The image below illustrates these regions overlaid on the screen: when a fixation occurs in the A region, an action to move in that direction is issued to the game itself.

Stephen Vickers' gaze-driven World of Warcraft interface.

After lunch we had a hands-on session with the Snap/Clutch interaction method, where eight Tobii eye trackers were used for a multiplayer round of WoW! Very different from a traditional mouse/keyboard setup, and it takes some time to get used to.

  • Istance, H.O., Bates, R., Hyrskykari, A. and Vickers, S. Snap Clutch, a Moded Approach to Solving the Midas Touch Problem. Proceedings of the 2008 Symposium on Eye Tracking Research & Applications; ETRA 2008. Savannah, GA, 26th-28th March 2008. Download
  • Bates, R., Istance, H.O., and Vickers, S. Gaze Interaction with Virtual On-Line Communities: Levelling the Playing Field for Disabled Users. Proceedings of the 4th Cambridge Workshop on Universal Access and Assistive Technology; CWUAAT 2008. University of Cambridge, 13th-16th April 2008. Download


The second part of the lecture concerned gaze interaction for mobile phones. This allows for ubiquitous computing, where the eye tracker is integrated with a wearable display. As a new field it is surrounded by certain issues (stability, processing power, variation in lighting, etc.), but all of these will be solved over time. The big question is what the "killer application" will be (entertainment?). A researcher from Nokia attended the lecture and introduced a prototype system. Luckily I had the chance to visit their research department the following day for a hands-on with their head-mounted display with an integrated eye tracker (more on this in another post).

The third part was about stereoscopic displays, which add a third dimension (depth) to the traditional X and Y axes. There are several projects around the world working towards making this an everyday reality. However, tracking the depth of a gaze fixation is limited: the vergence eye movements (as seen in the distance between the pupils) are hard to measure when the distance to objects grows beyond two meters.

Calculating convergence angles (assuming a half interpupillary distance of 3.3 cm):
d = 100 cm: tan θ = 3.3 / 100, θ ≈ 1.89°
d = 200 cm: tan θ = 3.3 / 200, θ ≈ 0.95°
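These numbers follow from simple trigonometry. A quick sketch that reproduces them, assuming the same 3.3 cm half interpupillary distance as in the example above:

```python
import math

def convergence_half_angle(distance_cm, half_ipd_cm=3.3):
    """Half convergence angle in degrees for a target at distance_cm."""
    return math.degrees(math.atan(half_ipd_cm / distance_cm))

for d in (100, 200, 400):
    print(f"d = {d} cm: theta = {convergence_half_angle(d):.2f} deg")
# d = 100 cm: theta = 1.89 deg
# d = 200 cm: theta = 0.95 deg
# d = 400 cm: theta = 0.47 deg
```

The angle roughly halves every time the distance doubles, so the differences to be measured quickly shrink below tracker accuracy; this is why depth estimation from vergence becomes unreliable beyond a couple of meters.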


The afternoon was spent on a guided tour of Tampere, followed by a splendid dinner at a "viking"-themed restaurant.

Tuesday, July 15, 2008

Passive eye tracking while playing Civilization IV

While the SMI iView X RED eye tracker used in this video is not driving the interaction, it showcases how eye tracking can be used for usability evaluations in interaction design. (Civilization does steal my attention on occasion; Sid Meier is just a brilliant game designer.)

Saturday, May 3, 2008

Interface evaluation procedure

The first evaluation of the prototype is now completed. The procedure was designed to test the individual components as well as the prototype as a whole (for playing music, etc.). Raw data on selection times, error rates, etc. was collected by custom software developed on the Microsoft .NET platform. Additional forms were displayed on-screen between the steps of the procedure, combined with standardized questionnaires, to gather a rich set of data.
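The actual collection code ran on .NET; the sketch below shows, in Python for brevity, the kind of per-trial logging this involves. The field names and CSV output are my own illustrative assumptions, not the prototype's actual format:

```python
import csv
import time

class TrialLogger:
    """Collects per-trial selection times and error counts, then writes CSV."""
    def __init__(self, path):
        self.path = path
        self.rows = []

    def start_trial(self, participant, component, target):
        self._t0 = time.time()
        self._row = {"participant": participant, "component": component,
                     "target": target, "errors": 0}

    def log_error(self):
        self._row["errors"] += 1          # e.g. an unintended selection

    def end_trial(self):
        self._row["selection_time_s"] = round(time.time() - self._t0, 3)
        self.rows.append(self._row)

    def save(self):
        with open(self.path, "w", newline="") as f:
            writer = csv.DictWriter(f, fieldnames=self.rows[0].keys())
            writer.writeheader()
            writer.writerows(self.rows)
```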

A more subjective approach was taken during the usage of the prototype as a whole, to capture the aspects that could not be covered by automatic data collection or forms: simply observing participants and asking simple questions in normal spoken language. While perhaps less scientifically valid, this type of information is very valuable for understanding how users think and react, and it is crucial for improving the design in the next iteration. There is sometimes a difference between the results gained from verbal utterances, questionnaires and measured performance. For example, an interface can be very efficient and fast but at the same time extremely demanding and stressful; just measuring one performance factor would not tell the whole story.

This is why I have chosen to use several methods, combining raw performance data, form-based questionnaires and unconstrained verbal interviews, hoping that this can provide multiple perspectives and a rich source of data.

For the evaluation I used a basic on-screen demographic form gathering age, sex, computer experience, conditions of vision, etc. Between the evaluations of the individual components I used the NASA Task Load Index as a quick reflection form, and at the end of the session I handed out both an IBM Computer Usability Satisfaction Questionnaire and a Q.U.I.S. Generic User Interface Questionnaire. The only modification I made was to remove a few questions that would not apply to my prototype (why ask about help pages when the prototype contains none?).

I've found the 230 Tips and Tricks for Better Usability Testing guide to be really useful; it should be read by anyone conducting HCI evaluation.


Monday, April 21, 2008

Open invitation to participate in the evaluation

The invitation to participate in the evaluation of the prototype is out. If you are able to participate, I would be most grateful for your time. The entire test takes roughly 30 minutes.


The invitation can be downloaded as pdf.

If you wish to participate send me a message. Thank you.

More information on the procedure and structure of the evaluation, as well as the experience gained, will be posted once the testing is completed. The final results will be published in my master's thesis by the end of May. Stay tuned.