Tuesday, October 2, 2012

Panasonic in-flight eye control demo

From APEX 2012, here's a video where Steve Sizelove from Panasonic demonstrates their eye and gesture control systems for future in-flight entertainment systems. Even though this is a futuristic concept, it is clear that Panasonic is pushing the envelope on in-flight systems. Their X-series system is state-of-the-art; just take a look at the upcoming eX3, a touch-enabled Android platform with an associated app store, support for the Unity 3D engine, fast internet access and more. Great stuff for those transatlantic flights that seem to take forever.

Thursday, August 16, 2012

Tough decisions, big plans and a bright future

I browsed through my blog today and realized I hadn't written much about what I've been up to. There's been a reason for that. One year ago I left my position at Duke University. It wasn't an easy decision. The Radiology eye tracking project I was involved with (and still am) was making good progress. I had been working long days since it started at Stanford in 2009, and we were doing pretty neat stuff with volumetric medical image datasets.

The Stanford/Duke Radiology eye tracking project and our novel approach to volumetric gaze data.

At the same time, I spent nights and weekends working on the open source ITU Gaze Tracker together with Javier San Agustin. Somehow I always had the feeling that we should get back together; great things just seemed to happen when we did. So after my grand tour of the US and countless Skype meetings over six months, we had a plan: the four former PhD students from the ITU GazeGroup were to start an eye tracking company. At first we called it Senseye but later changed it to The Eye Tribe due to trademark issues.

The Eye Tribe as of Spring 2012 at the US embassy reception. 

We decided early on not to go for the established market. It's a red ocean with a couple of fairly big players that have been working on their high tech creations for years; it's a low-volume/high-margin game with intricate and expensive solutions aimed primarily at the research and assistive technology markets.

The Eye Tribe intends to innovate and disrupt by bringing eye tracking to post-PC devices in the consumer market. That just doesn't happen with devices that cost several thousand dollars.

After twelve months of executing our plan, we recently raised funds from a group of European investors to accelerate (as covered by The Next Web). The team has grown and we are looking to make additional hires in the near future. Perhaps you would like to join the tribe and be part of something great? There are some very interesting things happening in the near future, and for the skilled it's always best to get on board early.

One year ago I traded a warm North Carolina for a cold Copenhagen, a relationship for loneliness, a big house for a small apartment and a sports car for a bicycle. Time will tell if that was the right thing to do; with big plans, full commitment and funding in place, it is, so far, so good.

Monday, July 30, 2012

What the mind can conceive, it can achieve.

Today marks a historic day as the ITU GazeGroup.org open source Gaze Tracker has been downloaded over 30,000 times. Although the current version was released in October 2010, we're still seeing approximately 1,000 downloads per month. We're really happy to see how widely distributed it has become, reaching all corners of the planet. When we released the first version back in 2009, we had no idea it would reach distant places such as Kyrgyzstan, Suriname or Burkina Faso. The objective was, and still is, to "democratize and provide access to eye tracking technology regardless of means or nationality". I'd like to thank everyone involved for helping us reach this milestone.


Top 10 Countries
10. Brazil 720  
9. Denmark 803  
8. France 865  
7. China 888  
6. India 1,063  
5. Italy 1,170  
4. Japan 1,226  
3. United Kingdom 1,359  
2. Germany 2,266  
1. United States 4,647

Top 10 total: 15,007 (50%). Full stats.

Thursday, June 28, 2012

Dual scene-camera head-mounted eye tracking rig from Lancaster Uni.

From the Pervasive 2012 conference held last week in Newcastle comes a demo of a dual scene-camera head-mounted eye tracking rig that enables users to move objects between two displays using their gaze position. The larger display acts as the "public" display (digital signage etc.) while the smaller one represents a personal handheld tablet/smartphone. A nifty idea from Jayson Turner, Andreas Bulling and Hans Gellersen, all from the Embedded Interactive Systems group at Lancaster University.

Tuesday, June 19, 2012

The Eye Tribe presents world's first eye-controlled Windows 8 tablet

It slices, it dices! The Eye Tribe from Copenhagen introduces the world's first Windows 8 eye tracking tablet. The small, lightweight add-on connects via USB; no additional cables or batteries are needed. For the time being the specs are a 30 Hz sample rate, an accuracy of 0.5 degrees and an exceptionally large tracking range. More info to follow.

 


The Eye Tribe, formerly known as Senseye, have made significant progress in recent months. In January they won the Danish Venture Cup. They then went on to participate in the Rice RBPC, the world's premier business plan competition, made it to the semi-finals and were awarded "Most Disruptive Technology" while being mentioned in Fortune Magazine and the Houston Chronicle. In May the team won the eHealth Innovation Contest, followed by the audience award at the Danish Accelerace, whereby they were selected to participate in the Tech All Stars event, which gives the most promising European startups the opportunity to pitch at the LeWeb conference in London on June 20th.

Friday, June 8, 2012

Eyecatcher - A 3D prototype combining Eyetracking with a Gestural Camera

Eyecatcher is a prototype combining eye tracking with a gestural camera on a dual screen setup. Created for the oil rig process industry, this project was a collaborative exploration between ABB Corporate Research and Interactive Institute Umeå (blog).


Sunday, June 3, 2012

Copenhagen Business School: PhD position available

Copenhagen Business School invites applications for a vacant PhD scholarship in empirical modeling of eye movements in reading, writing and translation. The position is offered at the Department of International Business Communication at Copenhagen Business School (CBS). The Department of International Business Communication is a new department at CBS whose fields of interest include the role of language(s) in interlingual and intercultural communication, the role of language and culture competences in organizations, the role of language and culture in communication technology and social technologies, as well as the teaching of language skills. The Department is dedicated to interdisciplinary and problem-oriented research.

Considerable progress has been made in eye-tracking technology over the past decade, making it possible to capture gaze behavior with free head movements. However, the imprecision of the measured signal makes it difficult to analyze eye-gaze movements in reading tasks, where a precise local resolution of the gaze samples is required to track the reader's gaze path over a text. The PhD position will investigate methods to cancel out the noise from the gaze signal. The PhD candidate will investigate, design and implement empirically-based models of eye-gaze movements in reading which take into account physical properties of the visual system in addition to background information, such as the purpose of the reading activity, the structure of the text, the quality of the gaze signal, etc. The PhD candidate should have:
  • an interest in cognitive modeling of human reading, writing and translation processes
  • a basic understanding of browser and eye-tracking technology
  • knowledge of probability theory and statistical modeling
  • advanced programming skills
More information available here.
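
To give a feel for the kind of signal-processing problem the position describes, here is a minimal sketch (my own illustration, not part of the CBS call) of smoothing a noisy gaze stream with a simple moving average before any fixation or reading analysis. The (x, y) sample format is an assumption; real reading research uses far more elaborate models.

```python
# Minimal sketch: moving-average smoothing of noisy (x, y) gaze samples.
# Illustrative only; not code from the CBS project.
from collections import deque

def smooth_gaze(samples, window=5):
    """Smooth (x, y) gaze samples with a moving average over `window` samples."""
    buf = deque(maxlen=window)
    smoothed = []
    for x, y in samples:
        buf.append((x, y))
        sx = sum(p[0] for p in buf) / len(buf)
        sy = sum(p[1] for p in buf) / len(buf)
        smoothed.append((sx, sy))
    return smoothed

if __name__ == "__main__":
    # Synthetic jittery gaze around a single point on a line of text
    noisy = [(100 + (i % 3) * 4, 200 + (i % 2) * 6) for i in range(20)]
    print(smooth_gaze(noisy)[:5])
```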

Friday, June 1, 2012

Temporal Control In the EyeHarp Gaze-Controlled Musical Interface


The EyeHarp, which I wrote about last summer, is a gaze controlled musical instrument built by Zacharias Vamvakousis. In the video below he demonstrates how the interface is driven by the ITU Gaze Tracker and used to compose a loop, which he then improvises upon. On the hardware side, a modified PS3 camera is used in combination with two infrared light sources. This setup was presented at the New Interfaces for Musical Expression (NIME 2012) conference in Ann Arbor a week ago, and it will be exhibited at Sónar in Barcelona on June 14-16, 2012. Great to see such an innovative interface being made open source and combined with the ITU tracker.

  • Vamvakousis, Z. and Ramirez, R. (2012). Temporal Control In the EyeHarp Gaze-Controlled Musical Interface. In Proceedings of the 12th International Conference on New Interfaces for Musical Expression (NIME 2012), 21-23 May 2012, Ann Arbor, Michigan, USA. (PDF)

Monday, April 23, 2012

Noise Challenges in Monomodal Gaze Interaction (Skovsgaard, 2011)


Henrik Skovsgaard of the ITU Gaze Group successfully defended his PhD thesis “Noise Challenges in Monomodal Gaze Interaction” at the IT University of Copenhagen on December 13th, 2011. The thesis can be downloaded here.

ABSTRACT
Modern graphical user interfaces (GUIs) are designed with able-bodied users in mind. Operating these interfaces can be impossible for some users who are unable to control the conventional mouse and keyboard. An eye tracking system offers possibilities for independent use and improved quality of life via dedicated interface tools especially tailored to the users’ needs (e.g., interaction, communication, e-mailing, web browsing and entertainment). Much effort has been put towards robustness, accuracy and precision of modern eye-tracking systems and there are many available on the market. Even though gaze tracking technologies have undergone dramatic improvements over the past years, the systems are still very imprecise. This thesis deals with current challenges of monomodal gaze interaction and aims at improving access to technology and interface control for users who are limited to the eyes only. Low-cost equipment in eye tracking contributes toward improved affordability but potentially at the cost of introducing more noise in the system due to the lower quality of hardware. This implies that methods of dealing with noise and creative approaches towards getting the best out of the data stream are most wanted. The work in this thesis presents three contributions that may advance the use of low-cost monomodal gaze tracking and research in the field:
  • An assessment of a low-cost open-source gaze tracker and two eye tracking systems through an accuracy and precision test and a performance evaluation. 
  • Development and evaluation of a novel innovative 3D typing system with high tolerance to noise that is based on continuous panning and zooming.
  • Development and evaluation of novel selection tools that compensate for noisy input during small-target selections in modern GUIs. 
This thesis may be of particular interest for those working on the use of eye trackers for gaze interaction and how to deal with reduced data quality. The work in this thesis is accompanied by several software applications developed for the research projects that can be freely downloaded from the eyeInteract appstore (http://www.eyeinteract.com).
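
To illustrate the kind of problem the third contribution tackles, here is a hedged sketch (my own, not code from the thesis or the eyeInteract appstore) of a dwell-time selection that tolerates jittery gaze input by accepting samples within a radius around a small target instead of requiring the raw gaze point to stay exactly on it.

```python
# Hedged illustration: dwell-time selection tolerant to noisy gaze input.
# Not taken from the thesis; parameter values are assumptions.
import math

def dwell_select(gaze_stream, target, radius=40.0, dwell_ms=500, sample_ms=33):
    """Return True if gaze dwells within `radius` px of `target` for `dwell_ms`.

    gaze_stream: iterable of (x, y) samples arriving every `sample_ms` ms.
    """
    needed = dwell_ms / sample_ms      # samples required for a selection
    inside = 0
    for x, y in gaze_stream:
        if math.hypot(x - target[0], y - target[1]) <= radius:
            inside += 1
            if inside >= needed:
                return True
        else:
            inside = 0                 # noise pushed gaze outside; restart the dwell
    return False

# Example: jittery gaze hovering near a small button centred at (300, 300)
samples = [(300 + dx, 300 + dy) for dx, dy in [(-8, 5), (12, -9), (3, 7)] * 10]
print(dwell_select(samples, target=(300, 300)))
```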


Monday, March 12, 2012

SMI RED-M

Well, well, look here. A constellation of eye tracking manufacturers is joining in on the affordable market, perhaps defined some time ago by Mirametrix, who launched at $5k. Tobii has the PCEye, perfectly fine but at a cool $7k, and is showcasing the new IS2 chipset but apparently couldn't do demos at CeBIT 2012. The new player is SensoMotoric Instruments, known for their high quality hardware and finely tuned algorithms. Their new contribution is the RED-M (M for mini?). Even though the price hasn't been announced, I would assume it's less than its high-speed FireWire sibling, perhaps similar to the PCEye pricing?

The M version is a small device made out of plastic that connects via USB 2.0 (presumably two plugs, one of them for power). It measures 240x25x33 mm and weighs only 130 grams - that's pretty small. This is a big difference from their prior models, which have been very solid and made out of high quality materials and professional components. The accuracy is specified at 0.5 degrees over a 50-75 cm operating distance, with a head box of 320x210 mm at 60 cm and a sample rate of 60/120 Hz. In essence, it's the low-end version of the RED series, whose top model is the super fast RED500. Although it has yet to be demonstrated in an operational state, some material has appeared online. Below is the animated setup guide; you can find more information on their website. Looking good!
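
As a side note, to put the 0.5-degree figure in perspective, here is a quick back-of-the-envelope conversion from angular accuracy to on-screen error (my own estimate, not SMI's; the monitor size is an assumption).

```python
# Rough conversion of angular accuracy to on-screen error (illustrative estimate).
import math

def gaze_error_mm(accuracy_deg, distance_mm):
    """On-screen offset corresponding to an angular error at a viewing distance."""
    return distance_mm * math.tan(math.radians(accuracy_deg))

err_mm = gaze_error_mm(0.5, 600)   # 0.5 degrees at 60 cm
px_per_mm = 1920 / 510             # assuming a ~23" 1920x1080 monitor, ~510 mm wide
print(f"{err_mm:.1f} mm  (~{err_mm * px_per_mm:.0f} px)")
```

So at a typical 60 cm viewing distance, 0.5 degrees works out to roughly 5 mm, or on the order of 20 pixels on a common desktop display.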