
An Eye Tracking Computer User Interface

Arie E. Kaufman, Amit Bandopadhay, and Bernard D. Shaviv


Computer Science Department
State University of New York at Stony Brook, NY 11794-4400

Abstract

We describe an inexpensive eye movement controlled user interface for 2D and 3D interaction. It is based on electro-oculography (EOG) rather than the very expensive reflectance-based methods. We have built the hardware and software to demonstrate the viability of EOG for human-computer communication. Our experiments indicate that EOG provides the basis for an adequate input interaction device. Being very inexpensive, the system is applicable to many virtual reality systems and video games, as well as to the handicapped.

1. Introduction

Many computer interface hardware and software designs, such as mice and touch screens, are a nice improvement over the keyboard for some tasks, but they cannot be utilized by quadriplegics. Yet, it is the physically disabled who have the most to gain from, and the greatest dependence on, computer and electronic aids for work, recreation, environmental control, and even the most basic communication needs. Although several computer interfaces have been devised for the handicapped (e.g., Erica [Hut89], EWP [YaF87]), there are no inexpensive and non-intrusive systems that deliver the true power and ease of today's computers.

Our main goal is to develop an inexpensive hardware-software system for use in the most challenging cases: the estimated 150,000 disabled persons able to control only the muscles of their eyes. This encompasses the construction of EOG eye-tracking hardware and its fine-tuning in software, as well as the definition of acknowledgeable eye behavior and the establishment of basic protocols governing on-screen object selection and manipulation. Such a device can also be used for many virtual reality systems and video games.

Electro-oculography depends on the corneoretinal potential, which creates an electrical field in the front of the head. This field changes in orientation as the eyeballs rotate. The electrical changes can be detected by electrodes placed on the skin near the eyes. In clinical practice, the detected voltage changes are amplified and used to drive a plotting device, whereby a tracing of eye position is obtained.

It is possible to obtain independent measurements from the two eyes. However, the two eyes move in conjunction in the vertical direction. Hence, it is sufficient to measure the vertical motion of only one eye and the horizontal motion of both eyes. If the orientation of the eyes is measured, it is possible to locate the 3D position of a fixated target object by triangulation.
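
To make the triangulation remark concrete: if the horizontal orientation of each eye is known in a common head-fixed frame, the fixation point lies at the intersection of the two gaze rays. The following Python sketch assumes a specific geometry (eyes on a horizontal axis separated by an interocular distance, angles measured from straight ahead); none of these specifics come from the paper, so it is an illustration rather than the authors' method.

    import math

    def fixation_point(theta_left_deg, theta_right_deg, iod=0.065):
        """Illustrative triangulation under an assumed geometry.

        Both eyes lie on the x-axis, separated by the interocular distance
        `iod` (metres), and look into the +z half-space. Each angle is the
        horizontal gaze direction measured from straight ahead, positive to
        the viewer's right. Returns (x, z) of the fixated point.
        """
        tl = math.tan(math.radians(theta_left_deg))
        tr = math.tan(math.radians(theta_right_deg))
        xl, xr = -iod / 2.0, iod / 2.0          # eye positions on the x-axis
        if abs(tl - tr) < 1e-9:                 # parallel rays: no finite fixation
            raise ValueError("gaze rays do not converge")
        # Gaze rays: x = xl + z*tl and x = xr + z*tr; solve for their intersection.
        z = (xr - xl) / (tl - tr)
        return xl + z * tl, z

    # Example: eyes converging symmetrically by 3.7 degrees fixate about 0.5 m away.
    print(fixation_point(3.7, -3.7))
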
The signal quality of the EOG output data has been well documented in neurophysiology and in electronystagmography (ENG), the study of eye movements [Coh86]. Straightforward signal processing steps have been devised to condition the data so that it can be reliably interpreted by a technician. Some of the noise patterns, such as the line frequency, can be easily removed using a notch filter. Other noise artifacts are mostly transient, caused, for example, by the turning of an electrical switch, contraction of the facial or neck muscles, slippage of an electrode due to sweat, and eye blinking.
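
As a concrete illustration of removing the line-frequency component with a notch filter, the sketch below conditions one digitized EOG channel. The sampling rate, mains frequency, and quality factor are assumptions made for the example; the paper does not specify them.

    import numpy as np
    from scipy.signal import iirnotch, filtfilt

    FS = 250.0      # assumed sampling rate (Hz); not stated in the paper
    F_LINE = 60.0   # assumed mains frequency to suppress
    Q = 30.0        # assumed notch quality factor

    def remove_line_noise(eog_samples):
        """Suppress mains interference in one digitized EOG channel.

        A narrow IIR notch at F_LINE is applied forwards and backwards
        (filtfilt), so the filtering adds no phase lag of its own.
        """
        b, a = iirnotch(F_LINE, Q, fs=FS)
        return filtfilt(b, a, eog_samples)

    # Example: a slow ramp (simulated eye drift) buried in 60 Hz hum.
    t = np.arange(0, 2.0, 1.0 / FS)
    noisy = 2.0 * t + 0.5 * np.sin(2 * np.pi * F_LINE * t)
    clean = remove_line_noise(noisy)
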
Our research has been focusing on creating a workable eye-controlled user interface. On a 19" monochrome display, a typical pixel configuration is 1024×768 at 72 dpi, for an active display area of 14.22×10.67 inches. When centrally viewed from a distance of 2 feet, this region subtends an angle of 25° vertically and 33° horizontally. The maximum EOG or reflectometry resolution is about 1-2 degrees [YoS75]. With menu boxes generously separated by 3 degrees, the 19" display still has sufficient room for a 10×4 matrix of directly selectable keys, leaving the bottom half of the screen available for a text display area and other controls.
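
The subtended angles quoted above follow from elementary trigonometry, angle = 2*atan(extent / (2*distance)); the short calculation below simply reproduces the paper's figures from the stated display size and viewing distance.

    import math

    def subtended_angle_deg(extent_in, distance_in):
        """Visual angle (degrees) of an extent viewed centrally from a distance."""
        return math.degrees(2.0 * math.atan(extent_in / (2.0 * distance_in)))

    WIDTH_IN, HEIGHT_IN = 14.22, 10.67   # active display area from the paper
    DISTANCE_IN = 24.0                   # 2 feet

    print(subtended_angle_deg(WIDTH_IN, DISTANCE_IN))   # about 33 degrees horizontally
    print(subtended_angle_deg(HEIGHT_IN, DISTANCE_IN))  # about 25 degrees vertically
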
Recognizing blinks as legitimate actions distinct from eye movement would also allow their use for rapid invocation of important global commands, such as calling an attendant, and, within each module, as context-sensitive command shortcuts. The EOG system can potentially recognize "eye gestures," such as left and right winking and blinking, or any combination thereof. The eye gesture command language could even be extensible and programmable by the user. For example, during text entry or while scanning read-only text, a left blink rapidly followed by a right blink could be a page-up command; a right blink followed by a left blink would be page-down, and so on.
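
A user-programmable gesture language of this kind can be modelled as a lookup from short token sequences to commands. The sketch below is purely hypothetical: the token names, commands, and the 400 ms pairing window are invented for illustration and do not appear in the paper.

    # Hypothetical gesture table: sequences of blink/wink tokens -> commands.
    GESTURES = {
        ("LEFT_BLINK", "RIGHT_BLINK"): "PAGE_UP",
        ("RIGHT_BLINK", "LEFT_BLINK"): "PAGE_DOWN",
        ("LEFT_WINK", "LEFT_WINK"):    "CALL_ATTENDANT",
    }

    def match_gesture(recent_tokens, max_gap_ms=400, table=GESTURES):
        """Return the command for the last two timestamped tokens if they form
        a known gesture and occurred close enough together, else None."""
        if len(recent_tokens) < 2:
            return None
        (t0, name0), (t1, name1) = recent_tokens[-2], recent_tokens[-1]
        if t1 - t0 > max_gap_ms:
            return None
        return table.get((name0, name1))

    # Example: a left blink quickly followed by a right blink pages up.
    print(match_gesture([(1000, "LEFT_BLINK"), (1250, "RIGHT_BLINK")]))  # PAGE_UP

Because the table is plain data, each user could edit it, which is what would make such a gesture language extensible and programmable.
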
2. System Design

We have produced in-house electro-oculographic equipment from inexpensive, off-the-shelf components and set it up to detect horizontal and vertical eye movement. The potential across two electrodes placed posteriolaterally to the outer canthi is measured relative to a ground lead strapped around the wrist or clipped to the auricle. The resulting voltage is amplified and sent through a custom-built, 8-bit analog-to-digital converter, filtered to remove high-frequency electrical noise. The converter fits into an IBM PC expansion slot and transmits the digitized data through the PC serial port to a SUN workstation for X-window-based display.

The principal software modules in the system and their functions are:

1. Signal smoothing and filtering to eliminate noise, and calculation of quantitative parameters from the signal channels (two for horizontal movements, one for each eye, and one for the vertical movement of the eyes). These parameters are the angular positions, angular velocities, and angular accelerations of the eyes.

2. Extraction of symbolic tokens from the signal. These tokens indicate the direction of the movement of the gaze (e.g., North, South, NE, etc.), the magnitude of the move, and the type of eye movement, such as smooth pursuit, a saccade, or a blink; a rough sketch of this step follows the list.

3. Graphical user interface. This includes control algorithms to manipulate cursor motion and decision algorithms to drive the overall interface. It automatically decides when the user is engaged or disengaged in interacting with the system. The graphical user interface has been developed within the framework of our Cube 3D user interface [KaY90].
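
As a rough sketch of module 2, the type of eye movement can be told apart by thresholding the angular velocity computed in module 1. The thresholds below are common orders of magnitude for saccades and smooth pursuit, not values taken from the paper, and blink detection is deliberately left out.

    import numpy as np

    def classify_movement(angle_deg, fs_hz, saccade_thresh=100.0, pursuit_thresh=5.0):
        """Label each sample of one angular-position channel.

        angle_deg : 1-D array of eye angle in degrees.
        fs_hz     : sampling rate in Hz.
        Velocity above `saccade_thresh` deg/s is labelled a saccade, velocity
        between the two thresholds smooth pursuit, and anything slower a fixation.
        """
        velocity = np.gradient(angle_deg) * fs_hz   # deg/s by finite differences
        speed = np.abs(velocity)
        labels = np.where(speed > saccade_thresh, "saccade",
                 np.where(speed > pursuit_thresh, "pursuit", "fixation"))
        return velocity, labels
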
3. Experimental Results

The experiments were performed using a 3×2 boxed menu driven by eye selections. We experimented with a two-level menu. There are several parameters which give the software the ability to make a correct choice: the number of calibration repetitions, the number of data samples necessary for an absolute choice determination, the various thresholds, etc. These parameters can be set manually, or "automatically" by an auto-calibration mode. Once the parameters are set, a second calibration mechanism is invoked before a menu selection: the user follows a box which moves horizontally back and forth on the screen until calibration occurs.
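
One simple way to realize such a follow-the-box calibration is to fit a linear map from the averaged channel reading to the known box position by least squares. The assumption of a linear reading-to-position relationship is ours, made for the sketch; the paper does not describe the fitting procedure.

    import numpy as np

    def fit_calibration(readings, box_positions_px):
        """Least-squares fit of screen position = gain * reading + offset,
        using samples collected while the user tracks the moving box."""
        gain, offset = np.polyfit(np.asarray(readings, dtype=float),
                                  np.asarray(box_positions_px, dtype=float), deg=1)
        return gain, offset

    def reading_to_pixel(reading, gain, offset):
        return gain * reading + offset

    # Example with made-up readings taken at three known box positions.
    g, o = fit_calibration([40, 128, 216], [100, 512, 924])
    print(round(reading_to_pixel(128, g, o)))   # 512
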
The following performance measures of correct selections have been recorded after repeated use of this program by two experienced subjects (accuracy within 5%):

- Menu selections: 73%
- Menu selections (4 corners only): 90%
- Horizontal detection: 75%
- Horizontal detection (4 corners only): 99%
- Vertical detection: 92%
- Vertical detection (2 centers only): 92%

Although accurate results are difficult to achieve (73%), many of the errors are tied to each other: when a wrong choice is made, there is a high tendency for both a horizontal and a vertical selection error. Note that the results improve when only the four corner squares are looked at, as opposed to only the two center squares.

There are several problems, related to head and muscle movement interference, signal drift, and channel cross-talk, which must be overcome to improve the performance measures reported above. Whether the user makes a choice or sits idle, there is always some unavoidable minor head movement. In applications where only a rough resolution is needed, such as driving a wheelchair (e.g., forward, left, right, stop), or when gazing at the computer screen, head movements are negligible. However, in other applications head tracking can be incorporated into the system to compensate for the head movement.

4. Conclusions

There are many ways to measure eye movement, some far more accurate than EOG, but they are expensive. Furthermore, the eye tracking method is just a means, one in which pinpoint accuracy is not really necessary; the service provided and the ease of use of the eye-controlled interface are the true goal. Our experiments have shown that EOG is a viable and inexpensive method for human-computer interaction.

Acknowledgements

This project has been supported by a grant from the New York State Science and Technology Foundation and by National Science Foundation grant IRI-9008109. We wish to express special thanks to George Piligian, MD, for his help with this project.

5. References

[Coh86] Cohen, A., Biomedical Signal Processing, CRC Press, Inc., Boca Raton, FL, 1986.

[Hut89] Hutchinson et al., "Human-Computer Interaction Using Eye Gaze Input", IEEE Trans. on Systems, Man & Cybernetics, 19, 6 (1989), 1527-1534.

[KaY90] Kaufman, A. and Yagel, R., "Towards a Three-Dimensional User Interface", in Augmented Interactive Computer Capabilities, A. Klinger (ed.), Plenum Publishing, 1990, 255-267.

[YaF87] Yamada, M. and Fukuda, T., "Eye Word Processor and Peripheral Controller for the ALS Patient", IEEE Proceedings A, 134, 4 (1987), 328-330.

[YoS75] Young, L. R. and Sheena, D., "Survey of Eye Movement Recording Methods", Behavior Research Methods & Instrumentation, 7, 5 (1975), 397-429.
