Showing posts with label technology. Show all posts

Tuesday, April 26, 2011

Development of a head-mounted, eye-tracking system for dogs (Williams et al, 2011)

Fiona Williams, Daniel Mills and Kun Guo at the University of Lincoln have developed a head-mounted eye-tracking system for our four-legged friends. Using a special construct based on a head strap and a muzzle, the device was mounted on the dog's head, where a dichroic mirror placed in front of one eye reflects the IR image back to the camera.


The device was adapted from a VisionTrak system by ISCAN/Polhemus and contains two miniature cameras, one for the eye and one for the scene, connected to a host workstation. When used with human subjects, such a setup provides 0.3 deg. of accuracy according to the manufacturer. Williams et al. obtained an accuracy of 2-3 deg. from a single dog using a special calibration method consisting of five points located on a cross mounted at the tip of the muzzle. Using positive reinforcement, the dog was gradually trained to wear the device and fixate on the targets, which I'm sure wasn't an easy task.
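For readers curious what such a point-based calibration boils down to, here is a minimal sketch (my own illustration, not the authors' code): fit a simple affine model from pupil-center coordinates in the eye camera to point-of-gaze coordinates in the scene, using a handful of known targets such as five points on a cross. The affine model choice and all variable names are assumptions.

```python
import numpy as np

def fit_affine_calibration(pupil_xy, target_xy):
    """Least-squares fit of target = [px, py, 1] @ coeffs for each calibration point."""
    pupil_xy = np.asarray(pupil_xy, dtype=float)
    target_xy = np.asarray(target_xy, dtype=float)
    design = np.hstack([pupil_xy, np.ones((len(pupil_xy), 1))])  # (n, 3)
    coeffs, *_ = np.linalg.lstsq(design, target_xy, rcond=None)  # (3, 2)
    return coeffs

def map_gaze(coeffs, pupil_xy):
    """Map new pupil coordinates through the fitted model."""
    pupil_xy = np.asarray(pupil_xy, dtype=float)
    design = np.hstack([pupil_xy, np.ones((len(pupil_xy), 1))])
    return design @ coeffs

# Five calibration targets arranged as a cross (center, up, down, left, right).
targets = np.array([[0.0, 0.0], [0.0, 1.0], [0.0, -1.0], [-1.0, 0.0], [1.0, 0.0]])
# Simulated pupil positions recorded while the dog fixated each target.
pupils = targets * 0.05 + np.array([0.3, 0.2])

coeffs = fit_affine_calibration(pupils, targets)
recovered = map_gaze(coeffs, pupils)
```

Real systems often use higher-order polynomial models and must cope with noisy fixations, but the least-squares structure is the same.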


Abstract:
Growing interest in canine cognition and visual perception has promoted research into the allocation of visual attention during free-viewing tasks in the dog. The techniques currently available to study this (i.e. preferential looking) have, however, lacked spatial accuracy, permitting only gross judgements of the location of the dog’s point of gaze and are limited to a laboratory setting. Here we describe a mobile, head-mounted, video-based, eye-tracking system and a procedure for achieving standardised calibration allowing an output with accuracy of 2–3°. The setup allows free movement of dogs; in addition the procedure does not involve extensive training skills, and is completely non-invasive. This apparatus has the potential to allow the study of gaze patterns in a variety of research applications and could enhance the study of areas such as canine vision, cognition and social interactions.

  • Fiona J. Williams, Daniel S. Mills, Kun Guo, Development of a head-mounted, eye-tracking system for dogs, Journal of Neuroscience Methods, Volume 194, Issue 2, 15 January 2011, Pages 259-265, ISSN 0165-0270, DOI: 10.1016/j.jneumeth.2010.10.022. (available from ScienceDirect)

Wednesday, April 20, 2011

Fraunhofer CMOS-OLED Headmounted display with integrated eye tracker

"The Fraunhofer IPMS has worked on the integration of sensors and microdisplays on CMOS backplanes for several years now. For example, the researchers have developed a bidirectional microdisplay which could be used in Head-Mounted Displays (HMD) for gaze-triggered augmented-reality (AR) applications. The chips contain both an active OLED matrix and photodetectors integrated therein. The combination of both matrices in one chip is an essential possibility for system integrators to design smaller, lightweight and portable systems with both functionalities." (Press release)
"Rigo Herold, PhD student at Fraunhofer IPMS and member of the development team, declares: This unique device enables the design of a new generation of small AR-HMDs with advanced functionality. The OLED-microdisplay-based eye-tracking HMD enables the user on the one hand to overlay the view of the real world with virtual content, for example to watch videos while jogging. And on the other hand, the user can select the next video triggered only by his gaze, without using his hands." (Press release)

The sensor integrates both an OLED display and a CMOS imaging sensor.

Rigo Herold will present the system at the SID 2011 exhibitor forum on May 17, 2011, at 4:00 p.m.: Eyecatcher: The Bi-Directional OLED Microdisplay, with the following specs:
  • Monochrome 
  • Special Eyetracking-Algorithm for HMDs based on bidirectional microdisplays
  • Front brightness: > 1500 cd/m²

The poster was presented at ISSCC 2011: Industry Demonstration Session (IDS).

In addition, there is a paper titled "Bidirectional OLED microdisplay: Combining display and image sensor functionality into a monolithic CMOS chip", published with the following abstract:

"Microdisplays based on organic light-emitting diodes (OLEDs) achieve high optical performance with excellent contrast ratio and large dynamic range at low power consumption. The direct light emission from the OLED enables small devices without additional backlight, making them suitable for mobile near-to-eye (NTE) applications such as viewfinders or head-mounted displays (HMD). In these applications the microdisplay acts typically as a purely unidirectional output device [1–3]. With the integration of an additional image sensor, the functionality of the microdisplay can be extended to a bidirectional optical input/output device. The major aim is the implementation of eye-tracking capabilities in see-through HMD applications to achieve gaze-based human-display-interaction." Available at IEEE Xplore

Wednesday, February 16, 2011

A self-calibrating, camera-based eye tracker for the recording of rodent eye movements (Zoccolan et al, 2010)

Came across an interesting methods article in Frontiers in Neuroscience, published in late November last year, which involves the development of a fully automated eye-tracking system that is calibrated without requiring co-operation from the subject. This is done by fixing the location of the eye and moving the camera to establish a geometric model (also see Stahl et al., 2000, 2004). Apparently they first attempted to use a commercial EyeLink II device but found it unsuitable for rodent eye tracking due to its thresholding implementation, the illumination conditions, and corneal reflection tracking failing when the rodent was chewing. So the authors built their own solution using a Prosilica camera and a set of algorithms (depicted below). Read the paper for implementation details. I find it to be a wonderful piece of work; different from human eye tracking for sure, but still relevant and fascinating.
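To give a flavor of the kind of image processing involved, here is a minimal sketch (my own illustration, not the authors' implementation) of the classic dark-pupil thresholding step used by video eye trackers: threshold the IR image, keep the dark blob, and take its centroid as the pupil center. The threshold value and the synthetic image are assumptions for demonstration.

```python
import numpy as np

def pupil_centroid(image, threshold):
    """Return the (row, col) centroid of pixels darker than `threshold`, or None."""
    mask = image < threshold
    rows, cols = np.nonzero(mask)
    if len(rows) == 0:
        return None  # no dark pupil candidate found
    return rows.mean(), cols.mean()

# Synthetic IR eye image: bright background with a dark circular "pupil".
img = np.full((120, 160), 200.0)
rr, cc = np.ogrid[:120, :160]
img[(rr - 60) ** 2 + (cc - 90) ** 2 <= 15 ** 2] = 20.0  # pupil centered at (60, 90)

center = pupil_centroid(img, threshold=100)
```

The corneal reflection is typically found the same way with an inverted threshold (a small bright blob), and the pupil-to-reflection vector is what gets mapped to gaze direction.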

Schematic diagram of the eye-tracking system

Illustration of the algorithm to track the eye’s pupil and corneal reflection spot

Eye coordinate system and measurements

Horizontal and vertical alignment of the eye with the center of the camera’s sensor


Abstract:

"Much of neurophysiology and vision science relies on careful measurement of a human or animal subject’s gaze direction. Video-based eye trackers have emerged as an especially popular option for gaze tracking, because they are easy to use and are completely non-invasive. However, video eye trackers typically require a calibration procedure in which the subject must look at a series of points at known gaze angles. While it is possible to rely on innate orienting behaviors for calibration in some non-human species, other species, such as rodents, do not reliably saccade to visual targets, making this form of calibration impossible. To overcome this problem, we developed a fully automated infrared video eye-tracking system that is able to quickly and accurately calibrate itself without requiring co-operation from the subject. This technique relies on the optical geometry of the cornea and uses computer-controlled motorized stages to rapidly estimate the geometry of the eye relative to the camera. The accuracy and precision of our system was carefully measured using an artificial eye, and its capability to monitor the gaze of rodents was verified by tracking spontaneous saccades and evoked oculomotor reflexes in head-fixed rats (in both cases, we obtained measurements that are consistent with those found in the literature). Overall, given its fully automated nature and its intrinsic robustness against operator errors, we believe that our eye-tracking system enhances the utility of existing approaches to gaze-tracking in rodents and represents a valid tool for rodent vision studies."


  • Zoccolan DF, Graham BJ, Cox DD (2010) A self-calibrating, camera-based eye tracker for the recording of rodent eye movements. Frontiers in Neuroscience 4:193. doi:10.3389/fnins.2010.00193

Thursday, February 3, 2011

EyeTech Digital Systems

Arizona-based EyeTech Digital Systems offers several interesting eye trackers, and the new V1 caught my attention with its extended track-box of 25 x 18 x 50 cm. The rather large depth range is provided by a custom autofocus mechanism developed in cooperation with the Brigham Young University Dept. of Mechanical Engineering. This makes the device particularly suitable for larger displays such as public displays/digital signage. Still, I'd imagine the calibration procedure remains; ideally you'd want to walk up and interact/collect data automatically, without any wizards or intervention. In any case, a larger track-box is always welcome and it certainly opens up new opportunities. EyeTech's V1 offers 20 cm more than most.





Tuesday, December 14, 2010

Method for Automatic Mapping of Eye Tracker Data to Hypermedia Content

Came across United States Patent Application 20100295774, filed by Craig Hennessey of Mirametrix. Essentially, the system creates regions of interest based on the HTML code (div tags) to perform an automatic mapping between gaze X/Y coordinates and the locations of page elements. This is done by accessing the Microsoft Document Object Model of an Internet Explorer browser page to establish the "content tracker", a piece of software that generates the list of areas, their sizes and locations on-screen, which are then tagged with keywords (e.g., logo, ad, etc.). This software also keeps track of several browser windows, their positions and interaction states.
"A system for automatic mapping of eye-gaze data to hypermedia content utilizes high-level content-of-interest tags to identify regions of content-of-interest in hypermedia pages. User's computers are equipped with eye-gaze tracker equipment that is capable of determining the user's point-of-gaze on a displayed hypermedia page. A content tracker identifies the location of the content using the content-of-interest tags and a point-of-gaze to content-of-interest linker directly maps the user's point-of-gaze to the displayed content-of-interest. A visible-browser-identifier determines which browser window is being displayed and identifies which portions of the page are being displayed. Test data from plural users viewing test pages is collected, analyzed and reported."
To conclude, the idea is to have multiple clients equipped with eye trackers that communicate with a server. The central machine coordinates studies and stores the gaze data from each session (in the cloud?). Overall, a strategy that makes perfect sense if your differentiating factor is low cost.
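The core mapping step described in the application is essentially a hit test: given regions of interest extracted from the page layout and a gaze sample in screen coordinates, find which tagged region the point-of-gaze falls in. Here is a minimal sketch of that idea; the hard-coded rectangles stand in for div bounding boxes, and the region names and data layout are my own illustrative assumptions, not the patent's.

```python
# Regions of interest as (left, top, right, bottom) rectangles in pixels,
# keyed by content-of-interest tag.
regions = {
    "logo": (0, 0, 200, 80),
    "ad": (600, 0, 800, 200),
    "article": (0, 100, 580, 900),
}

def region_at(gaze_x, gaze_y, regions):
    """Return the tag of the first region containing the gaze point, else None."""
    for tag, (left, top, right, bottom) in regions.items():
        if left <= gaze_x < right and top <= gaze_y < bottom:
            return tag
    return None

hit = region_at(650, 50, regions)  # this sample falls inside the "ad" rectangle
```

In a real implementation the rectangles would be refreshed from the browser DOM as the page scrolls or windows move, which is exactly what the patent's "content tracker" and "visible-browser-identifier" components take care of.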

Tuesday, November 2, 2010

Optimization and Dynamic Simulation of a Parallel Three Degree-of-Freedom Camera Orientation System (T. Villgrattner, 2010)

Moving a camera at 2,500 degrees per second is such an awesome accomplishment that I cannot help myself; a shamelessly long quote from IEEE Spectrum:


German researchers have developed a robotic camera that mimics the motion of real eyes and even moves at superhuman speeds. The camera system can point in any direction and is also capable of imitating the fastest human eye movements, which can reach speeds of 500 degrees per second. But the system can also move faster than that, achieving more than 2500 degrees per second. It would make for very fast robot eyes. Led by Professor Heinz Ulbrich at the Institute of Applied Mechanics at the Technische Universität München, a team of researchers has been working on superfast camera orientation systems that can reproduce the human gaze.

In many experiments in psychology, human-computer interaction, and other fields, researchers want to monitor precisely what subjects are looking at. Gaze can reveal not only what people are focusing their attention on but it also provides clues about their state of mind and intentions. Mobile systems to monitor gaze include eye-tracking software and head-mounted cameras. But they're not perfect; sometimes they just can't follow a person's fast eye movements, and sometimes they provide ambiguous gaze information.

In collaboration with their project partners from the Chair for Clinical Neuroscience, Ludwig-Maximilians Universität München, Dr. Erich Schneider, and Professor Thomas Brandt, the Munich team, which is supported in part by the CoTeSys Cluster, is developing a system to overcome those limitations. The system, propped on a person's head, uses a custom-made eye-tracker to monitor the person's eye movements. It then precisely reproduces those movements using a superfast actuator-driven mechanism with yaw, pitch, and roll rotation, like a human eyeball. When the real eye moves, the robot eye follows suit.

The engineers at the Institute of Applied Mechanics have been working on the camera orientation system over the past few years. Their previous designs had 2 degrees of freedom (DOF). Now researcher Thomas Villgrattner is presenting a system that improves on the earlier versions and features not 2 but 3 DOF. He explains that existing camera-orientation systems with 3 DOF that are fast and lightweight rely on model-aircraft servo actuators. The main drawback of such actuators is that they can introduce delays and require gearboxes.

So Villgrattner sought a different approach. Because this is a head-mounted device, it has to be lightweight and inconspicuous -- you don't want it rattling and shaking on the subject's scalp. Which actuators to use? The solution consists of an elegant parallel system that uses ultrasonic piezo actuators. The piezos transmit their movement to a prismatic joint, which in turn drives small push rods attached to the camera frame. The rods have spherical joints on either end, and this kind of mechanism is known as a PSS, or prismatic, spherical, spherical, chain. It's a "quite nice mechanism," says Masaaki Kumagai, a mechanical engineering associate professor at Tohoku Gakuin University, in Miyagi, Japan, who was not involved in the project. "I can't believe they made such a high speed/acceleration mechanism using piezo actuators."

The advantage is that it can reach high speeds and accelerations with small actuators, which remain on a stationary base, so they don't add to the inertial mass of the moving parts. And the piezos also provide high forces at low speeds, so no gear box is needed. Villgrattner describes the device's mechanical design and kinematics and dynamics analysis in a paper titled "Optimization and Dynamic Simulation of a Parallel Three Degree-of-Freedom Camera Orientation System," presented at last month's IEEE/RSJ International Conference on Intelligent Robots and Systems.




The current prototype weighs in at just 100 grams. It was able to reproduce the fastest eye movements, known as saccades, and also perform movements much faster than what our eyes can do. The system, Villgrattner tells me, was mainly designed for a "head-mounted gaze-driven camera system," but he adds that it could also be used "for remote eye trackers, for eye-related 'Wizard of Oz' tests, and as artificial eyes for humanoid robots." In particular, this last application -- eyes for humanoid robots -- appears quite promising, and the Munich team is already working on that. Current humanoid eyes are rather simple, typically just static cameras, and that's understandable given all the complexity in these machines. It would be cool to see robots with humanlike -- or superhuman -- gaze capabilities.

Below is a video of the camera-orientation system (the head-mount device is not shown). First, it moves the camera in all three single axes (vertical, horizontal, and longitudinal) with an amplitude of about 30 degrees. Next it moves simultaneously around all three axes with an amplitude of about 19 degrees. Then it performs fast movements around the vertical axis at 1000 degrees/second and also high dynamic movements around all axes. Finally, the system reproduces natural human eye movements based on data from an eye-tracking system." (source)

Wednesday, October 21, 2009

Nokia near-eye display gaze interaction update

The Nokia near-eye gaze interaction platform that I tried in Finland last year has been further improved. The cap used to support the weight has been replaced with a sturdy frame, and the overall prototype seems lighter and also incorporates headphones. The new gaze-based navigation interface supports photo browsing based on the Image Space application, allowing location-based access to user-generated content. See the video at the bottom for their futuristic concept. More information on the Nokia Research website. The prototype will be displayed at the International Symposium on Mixed and Augmented Reality conference in Orlando, October 19-22.






Thursday, October 8, 2009

DoCoMo EOG update

While eye movement detection using EOG is nothing new, the latest demonstration by Japan's NTT DoCoMo illustrates recent developments in the field. The innovation here is the form factor, which is quite impressive. Typically, EOG is detected using electrodes placed around the eyes, as in Andreas Bulling's prototype demonstrated at CHI 09 in Boston. Now it can be done using tiny sensors inside the ear. Just compare it to the prototype demonstrated last year!







Thanks Roman for the links!

Tuesday, May 26, 2009

Toshiba eye tracking for automotive applications

Seen this one coming for a while. Wonder how stable it would be in a real-life scenario..
Via Donald Melanson at Engadget:
"We've seen plenty of systems that rely on facial recognition for an interface, but they've so far been a decidedly rarer occurrence when it comes to in-car systems. Toshiba looks set to change that, however, with it now showing off a new system that'll not only let you control the A/C or radio with the glance of your eye, but alert you if you happen to take your eyes off the road for too long. That's done with the aid of a camera mounted above the steering wheel that's used to identify and map out the driver's face, letting the car (or desktop PC in this demonstration) detect everything from head movement and eye direction to eyelid blinks, which Toshiba says could eventually be used to alert drowsy drivers. Unfortunately, Toshiba doesn't have any immediate plans to commercialize the technology, although it's apparently busily working to make it more suited for embedded CPUs." (source)

Wednesday, January 21, 2009

SMI gets the International Forum Design Award

Congratulations to the guys at SensoMotoric Instruments (SMI) for winning the International Forum 2009 Product Design Award with their iView X™ RED eye tracker.

"The unobtrusive yet elegant design for the stand-alone as well as for the monitor-attached configuration of the eye tracking system convinced the jury. "

The award will be presented on the first day of CeBIT (March 3) in Hanover. The system will also be on display for those of you attending CeBIT. More information on the International Forum Award.

Sunday, August 24, 2008

Nokia Research: Near Eye Display with integrated eye tracker

During my week in Tampere I had the opportunity to visit Nokia Research to get hands-on with a prototype that integrates a head-mounted display with an eye tracker. Due to an NDA I am unable to reveal the contents of the discussion, but it does work, and it was a very neat experience with great potential. I would love to see a commercial application down the road. For more information there is a paper available:
Hands-On with the Nokia NED w/ integrated eye tracker

Paper abstract:
"Near-to-Eye Display (NED) offers a big screen experience to the user anywhere, anytime. It provides a way to perceive a larger image than the physical device itself is. Commercially available NEDs tend to be quite bulky and uncomfortable to wear. However, by using very thin plastic light guides with diffractive structures on the surfaces, many of the known deficiencies can be notably reduced. These Exit Pupil Expander (EPE) light guides enable a thin, light, user friendly and high performing see-through NED, which we have demonstrated. To be able to interact with the displayed UI efficiently, we have also integrated a video-based gaze tracker into the NED. The narrow light beam of an infrared light source is divided and expanded inside the same EPEs to produce wide collimated beams out from the EPE towards the eyes. Miniature video camera images the cornea and eye gaze direction is accurately calculated by locating the pupil and the glints of the infrared beams. After a simple and robust per-user calibration, the data from the highly integrated gaze tracker reflects the user focus point in the displayed image which can be used as an input device for the NED system. Realizable applications go from eye typing to playing games, and far beyond."

Friday, March 7, 2008

Technology: Consumer-grade EEG devices

Not exactly eye tracking, but interesting as a combined modality, are the upcoming consumer-grade electroencephalography (EEG) devices, sometimes referred to as a "brain-mouse". The devices are capable of detecting brain activity via electrodes placed on the scalp.

The company OCZ Technology, mainly known for its computer components such as memory and power supplies, has announced the "Neural Impulse Actuator" (NIA). While the technology itself is nothing new, the novelty lies in the accessibility of the device, priced somewhere around $200-250 when introduced next week.

Check out the quick mini-demo by the guys at AnandTech from the CeBIT exhibition in Hannover, 2008.
This technical presentation (in German) goes into a bit more detail.


From the press release:
"Recently entering mass production, the final edition of the Neural Impulse Actuator (NIA) will be on display for users to try out the new and electrifying way of playing PC games. The final version of the NIA uses a sleek, metal housing, a USB 2.0 interface, a streamlined headband with carbon interface sensors, and user-friendly software. The NIA is the first commercially available product of its kind, and gamers around the world now have access to this forward-thinking technology that’s had the industry buzzing since its inception."




These devices do have the potential to take hands-free computing to the next level. They could be a feasible candidate for solving the Midas touch problem by providing a device that enables the user to perform activations through subtle facial gestures, etc. I have yet to discover how sensitive the device is and what its latency is. Additionally, does it come with an API?

I've tried research-grade EEG devices as a means of interaction while at the University of California, San Diego, and pure thoughts of actions are hard to detect in a stable manner. It is well known in the neuroscience community that thinking of an action activates the same regions of the brain as actually performing it. We even have mirror neurons that are activated by observing other people performing goal-directed actions (picking up that banana). The neural activation of pure thought alone is subtle and hard to detect compared to actual movements, such as lifting one's arm. So I do not expect it to be an actual Brain-Computer Interface (BCI) capable of detecting thoughts (i.e., thinking of kicking, etc.) but rather a detector of subtle motions of my forehead muscles (eye and eyebrow movements, facial expressions, etc.).

The firm Emotiv has its own version of a consumer-grade EEG, named the EPOC neuroheadset; it has been around for a little longer and seems to have more developed software, but still mainly demonstration applications.


The Emotiv EPOC neuroheadset