[Drawing sheets 1-15 of U.S. Patent 5,867,587 (Feb. 2, 1999) accompany this text. The sheets contain: FIG. 1, a schematic of the system showing the infrared source, the imaging apparatus, the corroborating alertness unit 24, and the impaired operator detection unit 18; FIG. 2, the overall eye finding and tracking flow; FIG. 3, the potential eye location identification flow (steps 302 and following); FIG. 5, the eye location tracking and blink detection flow (steps 502-516); FIG. 7, the eye location confidence monitoring flow; FIGS. 8A-8E, the impaired operator detection flows (steps 802-848); FIGS. 9A-9B, graphs of correlation coefficient versus time for an awake and a drowsy operator; and FIG. 10, the impaired operator warning flow.]
IMPAIRED OPERATOR DETECTION AND WARNING SYSTEM EMPLOYING EYEBLINK ANALYSIS

BACKGROUND

1. Technical Field

This invention relates to a system and method for detecting when an operator performing tasks which require alertness, such as a vehicle operator, air traffic controller, and the like, is impaired due to drowsiness, intoxication, or other physical or mental conditions. More particularly, the present invention employs an eyeblink analysis to accomplish this impaired operator detection. Further, this system and method includes provisions for providing a warning when an operator is determined to be impaired.

2. Background Art

Heretofore, a detection system employing an analysis of a blink of an operator's eye to determine impairedness has been proposed which uses an eyeblink waveform sensor. The sensor is of a conventional type, such as an electrode presumably attached near the operator's eye, which produces electrical impulses whenever the operator blinks. Thus, the sensor produces a signal indicative of an eyeblink. The proposed system records an eyeblink parameter pattern derived from the eyeblink waveform of an alert individual, and then monitors subsequent eyeblinks. Parameters derived from the eyeblink waveforms generated during the monitoring phase are compared to the recorded awake-state parameters, and an alarm signal is generated if an excessive deviation exists.

Another impaired operator detection system has been proposed which uses two illuminator and reflection sensor pairs. Essentially, the eye of the operator is illuminated from two different directions by the illuminators. The sensors are used to detect reflection of the light from the illuminated eye. A blink is detected by analyzing the amount of light detected by each sensor. The number and duration of the detected blinks are used to determine whether the monitored operator is impaired.

Although these prior art systems may work for their intended purpose, it is a primary object of the present invention to provide an improved and more reliable impaired operator detection and warning system which can provide a determination of whether an operator is impaired on the basis of eyeblink characteristics and thereafter initiate an impaired operator warning with greater reliability and accuracy, and via less intrusive methods, than has been achieved in the past.

SUMMARY

The above-described objectives are realized with embodiments of the present invention directed to a system and method for detecting and warning of an impaired operator. The system and method employ an imaging apparatus which produces consecutive digital images including the face and eyes of an operator. Each of these digital images has an array of pixels representing the intensity of light reflected from the face of the subject. There is also an eye finding unit which determines the location of the operator's eyes within each digital image, and generates correlation coefficients corresponding to each eye. Each correlation coefficient quantifies the degree of correspondence between pixels associated with the location of the operator's eye in an immediately preceding image in comparison to pixels associated with the location of the operator's eye in a current image. The system and method employ an impaired operator detection unit to average the first N consecutive correlation coefficients generated to generate a first average correlation coefficient, where N corresponds to at least the number of images required to image a blink of the operator's eyes. After the production of the next image by the imaging apparatus, the impaired operator detection unit averages the previous N consecutive correlation coefficients generated to create a next average correlation coefficient. This process is repeated for each image frame produced by the imaging apparatus. Next, the impaired operator detection unit analyzes the average correlation coefficients associated with each eye to extract at least one parameter attributable to an eyeblink of the operator's eyes. These extracted parameters are compared to an alert operator threshold associated with that parameter. This threshold is indicative of an alert operator. An impaired operator warning unit is used to indicate that the operator may be impaired if any extracted parameters deviate from the associated threshold in a prescribed way.

Preferably, the aforementioned analyzing step performed by the impaired operator detection unit includes extracting parameters indicative of one or more of the duration, frequency, and amplitude of an operator's eyeblinks. The subsequent comparing process can then include comparing an extracted duration parameter to an alert operator duration threshold which corresponds to a maximum eyeblink duration expected to be exhibited by an alert operator's eye, comparing an extracted frequency parameter to an alert operator frequency threshold which corresponds to a minimum eyeblink frequency expected to be exhibited by an alert operator's eye, and comparing an extracted amplitude parameter to an alert operator amplitude threshold which corresponds to a minimum eyeblink amplitude expected to be exhibited by an alert operator's eye. Further, the comparing process can include determining the difference between at least one of the extracted parameters associated with a first eye and a like extracted parameter associated with the other eye, to establish a consistency factor for the extracted parameter. Then, the established parameter consistency factor is compared to an alert operator consistency threshold associated with that parameter.

Preferably, the impaired operator warning unit operates such that an indication is made that the operator may be impaired whenever one or more of the following is determined:

(1) the extracted duration parameter exceeds the alert operator duration threshold;
(2) the extracted frequency parameter is less than the alert operator frequency threshold;
(3) the extracted amplitude parameter is less than the alert operator amplitude threshold; and
(4) an established parameter consistency factor exceeds an associated alert operator consistency threshold.

The system and method can also involve the use of a corroborating operator alertness indicator unit which generates a corroborating indicator of operator impairedness whenever measured operator control inputs are indicative of the operator being impaired. If such a unit is employed, the impaired operator warning unit can be modified such that an indication is made that the operator is impaired whenever at least one of the extracted parameters deviates from the associated threshold in the prescribed way, and the corroborating indicator is generated.

In addition to the just described benefits, other objectives and advantages of the present invention will become apparent from the detailed description which follows hereinafter when taken in conjunction with the drawing figures which accompany it.
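To make the summarized decision rule concrete, the sketch below shows how the four threshold comparisons and the optional corroborating indicator might be combined. It is only an illustration; the class names, field names, and the idea of representing the thresholds as a single record are assumptions, not details taken from the patent.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class BlinkParameters:
    duration: float     # extracted eyeblink duration parameter
    frequency: float    # extracted eyeblink frequency parameter
    amplitude: float    # extracted eyeblink amplitude (completeness) parameter
    consistency: float  # inter-eye consistency factor for a chosen parameter

@dataclass
class AlertThresholds:
    max_duration: float     # maximum duration expected of an alert operator
    min_frequency: float    # minimum frequency expected of an alert operator
    min_amplitude: float    # minimum amplitude expected of an alert operator
    max_consistency: float  # maximum inter-eye difference expected of an alert operator

def may_be_impaired(p: BlinkParameters, t: AlertThresholds) -> bool:
    """Conditions (1)-(4): any parameter deviating in the prescribed way."""
    return (p.duration > t.max_duration
            or p.frequency < t.min_frequency
            or p.amplitude < t.min_amplitude
            or p.consistency > t.max_consistency)

def warn(p: BlinkParameters, t: AlertThresholds,
         corroborated: Optional[bool] = None) -> bool:
    """Without a corroborating unit, warn on any deviation; with one, require both."""
    deviation = may_be_impaired(p, t)
    return deviation if corroborated is None else (deviation and corroborated)
```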
DESCRIPTION OF THE DRAWINGS

The specific features, aspects, and advantages of the present invention will become better understood with regard to the following description, appended claims, and accompanying drawings where:

FIG. 1 is a schematic diagram showing one embodiment of an impaired operator detection and warning system in accordance with the present invention.

FIG. 2 is a preferred overall flow diagram of the process used in the eye finding and tracking unit of FIG. 1.

FIG. 3 is a flow diagram of a process for identifying potential eye locations (and optionally actual eye locations) within an image frame produced by the imaging apparatus of FIG. 1.

FIG. 4 is an idealized diagram of the pixels in an image frame including various exemplary pixel block designations applicable to the process of FIG. 3.

FIG. 5 is a flow diagram of a process for tracking eye locations in successive image frames produced by the imaging apparatus of FIG. 1, as well as a process of detecting a blink at a potential eye location to identify it as an actual eye location.

FIG. 6 is a diagram showing a cut-out block of an image frame applicable to the process of FIG. 5.

FIG. 7 is a flow diagram of a process for monitoring potential and actual eye locations and to reinitialize the eye finding and tracking system if all monitored eye locations are deemed low confidence locations.

FIGS. 8A-E are flow diagrams of the preferred processes used in the impaired operator detection unit of FIG. 1.

FIGS. 9A-B are graphs representing the average correlation coefficients determined via the process of FIG. 8A over time for the right eye of an alert operator (FIG. 9A) and the same operator when drowsy (FIG. 9B).

FIG. 10 is a flow diagram of the preferred process used in the impaired operator warning unit of FIG. 1.

DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS

In the following description of the preferred embodiments of the present invention, reference is made to the accompanying drawings which form a part hereof, and in which is shown by way of illustration specific embodiments in which the invention may be practiced. It is understood that other embodiments may be utilized and structural changes may be made without departing from the scope of the present invention.

The present invention preferably employs at least a portion of a unique eye finding and tracking system and method as disclosed in a co-pending application entitled EYE FINDING AND TRACKING SYSTEM, having the same inventors as the present application and assigned to a common assignee. This co-pending application was filed on May 19, 1997 and assigned Ser. No. 08/858,841. The disclosure of the co-pending application is hereby incorporated by reference. Generally, as shown in FIG. 1, this eye finding and tracking system involves the use of an imaging apparatus 10 which may be a digital camera, or a television camera connected to a frame grabber device as is known in the art. The imaging apparatus 10 is located in front of a subject 12, so as to image his or her face. Thus, the output of the imaging apparatus 10 is a signal representing digitized images of a subject's face. Preferably, the digitized images are provided at a rate of about 30 frames per second. Each frame preferably consists of a 640 by 480 array of pixels, each having one of 256 (i.e. 0 to 255) gray tones representative of the intensity of reflected light from a portion of the subject's face. The output signal from the imaging apparatus is fed into an eye finding and tracking unit 14. The unit 14 processes each image frame produced by the imaging apparatus 10 to detect the position of the subject's eyes and to track these eye positions over time. The eye finding and tracking unit 14 can employ a digital computer to accomplish the image processing task, or alternately, the processing could be performed by logic circuitry specifically designed for the task.

Optionally, there can also be an infrared light source 16 positioned so as to illuminate the subject's face. The eye finding and tracking unit 14 would be used to control this light source 16. The infrared light source 16 is activated by the unit 14 whenever it is needed to effectively image the subject's face. Specifically, the light source would be activated to illuminate the subject's face at night or when the ambient lighting conditions are too low to obtain an image. The unit 14 includes a sensor capable of determining when the ambient lighting conditions are inadequate. In addition, the light source would be employed when the subject 12 is wearing non-reflective sunglasses, as these types of sunglasses are transparent to infrared light. The subject could indicate that sunglasses are being worn, such as by depressing a control switch on the eye finding and tracking unit 14, thereby causing the infrared light source 16 to be activated. Alternately, the infrared light source 16 could be activated automatically by the unit 14, for example, when the subject's eyes cannot be found otherwise. Of course, if an infrared light source 16 is employed, the imaging apparatus 10 would be of the type capable of sensing infrared light.

The above-described system also includes an impaired operator detection unit 18 connected to an output of the eye finding and tracking unit 14, and an impaired operator warning unit 20 connected to an output of the detection unit 18. The impaired operator detection unit 18 processes the output of the eye finding and tracking unit 14, which, as will be discussed in detail later, includes an indication that an actual eye location has been identified and provides correlation data associated with that location for each successive image frame produced by the imaging apparatus 10. This output is processed by the impaired operator detection unit 18 in such a way that eyeblink characteristics are identified and compared to characteristics associated with an alert operator. This comparison data is provided to the impaired operator warning unit 20 which makes a determination whether the comparison data indicates the operator being monitored is impaired in some way, e.g. drowsy, intoxicated, or the like. The impaired operator detection unit 18 and impaired operator warning unit 20 can employ a digital computer to accomplish their respective processing tasks, or alternately, the processing could be performed by logic circuitry specifically designed for these tasks. If a computer is employed, it can be the same one potentially used in connection with the eye finding and tracking unit 14. It is noted that the detection of an impaired operator may also involve processing inputs from at least one other device, specifically a corroborating operator alertness indicator unit 24, which provides additional "non-eyeblink determined" indications of the operator's alertness level. For example, a device which provides an indication of a vehicle or machine operator's alertness level based on an analysis of the operator's control actions could be employed in the appropriate circumstances.
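To illustrate how these units relate in software, the following sketch organizes the flow of FIG. 1 as a simple per-frame pipeline. The class and method names are hypothetical and the internals are left as stubs; the patent allows each unit to be realized either on a digital computer or in dedicated logic circuitry.

```python
class EyeFindingAndTrackingUnit:
    """Locates and tracks eye positions and reports per-eye correlation data (unit 14)."""
    def process(self, frame):
        # ... find/track eyes, correlate against the preceding frame ...
        return {"actual_eye_found": False, "correlation_matrices": {}}

class ImpairedOperatorDetectionUnit:
    """Extracts eyeblink characteristics and compares them to alert-operator values (unit 18)."""
    def process(self, tracking_output):
        # ... rolling averages, blink duration/frequency/amplitude extraction ...
        return {"possible_impairment": False}

class ImpairedOperatorWarningUnit:
    """Decides whether to drive the warning device 22 (unit 20)."""
    def decide(self, detection_output, corroborating_indicator=None):
        impaired = detection_output["possible_impairment"]
        if corroborating_indicator is None:       # no corroborating unit installed
            return impaired
        return impaired and corroborating_indicator

def monitor(frames, corroborating_unit=None):
    """Run the pipeline over a stream of roughly 30 frame/s images from the imaging apparatus 10."""
    tracker = EyeFindingAndTrackingUnit()
    detector = ImpairedOperatorDetectionUnit()
    warner = ImpairedOperatorWarningUnit()
    for frame in frames:
        tracked = tracker.process(frame)
        detected = detector.process(tracked)
        corroborated = corroborating_unit(frame) if corroborating_unit else None
        if warner.decide(detected, corroborated):
            yield "activate warning device"       # audible alarm, flashing lights, etc.
```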
The warning unit 20 also controls a warning device 22 used to warn the operator, or some other cognizant authority, of the operator's impaired condition. If the warning device 22 is used to warn the operator of his or her impairedness, it could be an alarm of any type which will rouse the operator, and can be directed at any one or more of the operator's senses. For example, an audible alarm might be sounded alone or in conjunction with flashing lights. Other examples of alarm mechanisms that might be used include those producing a vibration or shock to the operator. Even smells might be employed; it is known that certain scents induce alertness. The warning device 22 could also be of a type that alerts someone other than the operator of the operator's impaired condition. For example, the supervisor in an air traffic control center might be warned of a controller's inability to perform adequately due to an impaired condition. If such a remote alarm is employed it can be of any type which attracts the attention of the person monitoring the operator's alertness, e.g. an audible alarm, flashing lights, and the like.

FIG. 2 is an overall flow diagram of the preferred process used to find and track the location of a subject's eyes. At step 202, a first image frame of the subject's face is inputted from the imaging apparatus to the eye finding and tracking unit. At step 204, the inputted image frame is processed to identify potential eye locations. This is preferably accomplished, as will be explained in detail later, by identifying features within the image frame which exhibit attributes consistent with those associated with the appearance of a subject's eye. This process is implemented in a recursive manner for efficiency. However, in the context of the present invention non-recursive, conventional processing techniques could be employed to determine eye locations, as long as the process results in an identification of potential eye locations within a digitized video image frame. Next, in step 206, a determination is made as to which of the potential eye locations is an actual eye of the subject. This is generally accomplished by monitoring successive image frames to detect a blink. If a blink is detected at a potential eye location, it is deemed an actual eye location. This monitoring and blink detection process will also be described in detail later. At step 208, the now determined actual eye locations are continuously tracked and updated using successive image frames. In addition, if the actual eye locations are not found or are lost, the process is reinitialized by returning to step 202 and repeating the eye finding procedure.

FIG. 3 is a flow diagram of the preferred process used to identify potential eye locations in the initial image frame, as disclosed in the aforementioned co-pending application. The first step 302 of the preferred process involves averaging the digitized image values which are representative of the pixel intensities of a first Mx by My block of pixels for each of three My high rows of the digitized image, starting in the upper left-hand corner of the image frame, as depicted by the solid line boxes 17 in FIG. 4. The three averages obtained in step 302 are used to form the first column of an output matrix. The Mx variable represents a number of pixels in the horizontal direction of the image frame, and the My variable represents a number of pixels in the vertical direction of the image frame. These variables are chosen so that the resulting Mx by My pixel block has a size which just encompasses the minimum expected size of the iris and pupil portions of a subject's eye. In this way, the pixel block would contain an image of the pupil and at least a part of the iris of any subject's eye.

Once the first column of the output matrix has been created by averaging the first three Mx by My pixel blocks in the upper left-hand portion of the image frame, the next step 304 is to create the next column of the output matrix. This is accomplished by averaging the intensity-representing values of an Mx by My pixel block which is offset horizontally to the right by one pixel column from the first pixel block for each of the three aforementioned My high rows, as shown by the broken line boxes 18 in FIG. 4. This process is repeated, moving one pixel column to the right during each iteration, until the ends of the three My high rows in the upper portion of the image frame are reached. The result is one completed output matrix. The next step 306 in the process is to repeat steps 302 and 304, except that the Mx by My pixel blocks being averaged are offset vertically downward from the previous pixel blocks by one pixel row, as depicted by the dashed and dotted line boxes 19 in FIG. 4. This produces a second complete output matrix. This process of offsetting the blocks vertically downward by one pixel row is then continued until the bottom of the image frame is reached, thereby forming a group of output matrices.

In step 308, each element of each output matrix in the group of generated output matrices is compared with a threshold range. Those matrix elements which exceed the lower limit of the threshold range and are less than the upper limit of this range are flagged (step 310). The upper limit of the threshold range corresponds to a value which represents the maximum expected average intensity of an Mx by My pixel block containing an image of the iris and pupil of a subject's eye for the illumination conditions that are present at the time the image was captured. The maximum average intensity of a block containing the image of the subject's pupil and at least a portion of the iris will be lower than the same size portion of most other areas of the subject's face because the pupil absorbs a substantial portion of the light impinging thereon. Thus, the upper threshold limit is a good way of eliminating portions of the image frame which cannot be the subject's eye. However, it must be noted that there are some things that could be in the image of the subject's face which do absorb more light than the pupil. For example, black hair can under some circumstances absorb more light. In addition, if the image is taken at night, the background surrounding the subject's face could be almost totally black. The lower threshold limit is employed to eliminate these portions of the image frame which cannot be the subject's eye. The lower limit corresponds to a value which represents the minimum expected average intensity of an Mx by My pixel block containing an image of the pupil and at least a portion of the subject's iris. Here again, this minimum is based on the illumination conditions that are present at the time the image is captured.

Next, in step 312, the average intensity value of each Mx by My pixel block which surrounds the Mx by My pixel block associated with each of the flagged output matrix elements is compared to an output matrix threshold value. In one embodiment of the present invention, this threshold value represents the lowest expected average intensity possible for the pixel block sized areas immediately adjacent the portion of an image frame containing the subject's pupil and iris. Thus, if the average intensity of the surrounding pixel blocks exceeds the threshold value, then a reasonably high probability exists that the flagged block is associated with the location of the subject's eye. Thus, the pixel block associated with the flagged element is designated a potential eye location (step 314). However, if one or more of the average intensity values for the blocks surrounding the flagged block falls below the threshold, then the flagged block is eliminated as a potential eye location (step 316).
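A minimal NumPy sketch of this block-averaging search (steps 302-312) is shown below. It is simplified to a single map of block averages indexed by each block's upper-left pixel rather than the row-organized group of output matrices described above, and the block size, the threshold range, and the single surrounding-block threshold are illustrative values, not figures from the patent.

```python
import numpy as np

def block_average_map(image, mx, my):
    """Average every Mx-by-My pixel block at one-pixel horizontal and vertical
    offsets (the combined effect of steps 302-306)."""
    h, w = image.shape
    averages = np.empty((h - my + 1, w - mx + 1))
    for r in range(h - my + 1):
        for c in range(w - mx + 1):
            averages[r, c] = image[r:r + my, c:c + mx].mean()
    return averages

def candidate_eye_blocks(image, mx=8, my=8, lo=10.0, hi=60.0, surround_min=70.0):
    """Flag pupil-sized blocks whose average lies inside the threshold range
    (steps 308-310) and whose eight neighboring blocks are all brighter than a
    single surrounding-block threshold (the simple variant of step 312)."""
    avg = block_average_map(image.astype(float), mx, my)
    candidates = []
    for r in range(my, avg.shape[0] - my):
        for c in range(mx, avg.shape[1] - mx):
            if lo < avg[r, c] < hi:
                neighbors = [avg[r + dr, c + dc]
                             for dr in (-my, 0, my) for dc in (-mx, 0, mx)
                             if (dr, dc) != (0, 0)]
                if all(n > surround_min for n in neighbors):
                    candidates.append((r, c))   # upper-left corner of a candidate block
    return candidates
```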
This comparison concept is taken further in a preferred embodiment of the present invention where a separate threshold value is applied to each of the surrounding pixel block averages. This has particular utility because some of the areas immediately surrounding the iris and pupil exhibit unique average intensity values which can be used to increase the confidence that the flagged pixel block is a good prospect for a potential eye location. For example, the areas immediately to the left and right of the iris and pupil include the white parts of the eye. Thus, these areas tend to exhibit a greater average intensity than most other areas of the face. Further, it has been found that the areas directly above and below the iris and pupil are often in shadow. Thus, the average intensity of these areas is expected to be less than many other areas of the face, although greater than the average intensity of the portion of the image containing the iris and pupil. Given the aforementioned unique average intensity profile of the areas surrounding the iris and pupil, it is possible to choose threshold values to reflect these traits. For example, the threshold value applied to the average intensity value of the pixel blocks directly to the left and right of the flagged block would be just below the minimum expected average intensity for these relatively light areas of the face, and the threshold value applied to the average intensity values associated with the pixel blocks directly above and below the flagged block would be just above the maximum expected average intensity for these relatively dark regions of the face. Similarly, the pixel blocks diagonal to the flagged block would be assigned threshold values which are just below the minimum expected average intensity for the block whenever the average intensity for the block is generally lighter than the rest of the face, and just above the maximum expected average intensity for a particular block if the average intensity of the block is generally darker than the rest of the face. If the average intensity of the "lighter" blocks exceeds the respectively assigned threshold value, or the "darker" blocks are less than the respectively assigned threshold value, then the flagged pixel block is deemed a potential eye location. If any of the surrounding pixel blocks do not meet this thresholding criteria, then the flagged pixel block is eliminated as a potential eye location.

Of course, because the output matrices were generated using the previously-described "one pixel column and one pixel row" offset approach, some of the matrices will contain rows having identical elements as others because they characterize the same pixels of the image frame. This does not present a problem in identifying the pixel block locations associated with potential eye locations, as the elements flagged by the above-described thresholding process in multiple matrices which correspond to the same pixels of the image frame will be identified as a single location. In fact, this multiplicity serves to add redundancy to the identification process. However, it is preferred that the pixel block associated with a flagged matrix element correspond to the portion of the image centered on the subject's pupil. The aforementioned "offset" approach will result in some of the matrices containing elements which represent pixel blocks that are one pixel column or one pixel row removed from the block containing the centered pupil. Thus, the average intensity value of these blocks can be quite close, or even identical, to that of the block representing the centered pupil. Thus, the matrix elements representing these blocks may also be identified as potential eye locations via the above-described thresholding process. To compensate, the next step 318 in the process of identifying potential eye locations is to examine flagged matrix elements associated with the previously-designated potential eye locations which correspond to blocks having pixels in common with pixel blocks associated with other flagged elements. Only the matrix element representing the block having the minimum average intensity among the examined group of elements, or which is centered within the group, remains flagged. The others are de-selected and no longer considered potential eye locations (step 320).

Actual eye locations are identified from the potential eye locations by observing subsequent image frames in order to detect a blink, i.e. a good indication a potential eye location is an actual eye location. A preliminary determination in this blink detecting process (and, as will be seen, the eye tracking process) is to identify the image pixel in the original image frame which constitutes the center of the pupil of each identified potential eye location. As the pixel block associated with the identified potential eye location should be centered on the pupil, finding the center of the pupil can be approximated by simply selecting the pixel representing the center of the pixel block. Alternately, a more intensive process can be employed to ensure the accuracy of the identified pupil center location. This is accomplished by first comparing each of the pixels in an identified block to a threshold value, and flagging those pixels which fall below this threshold value. The purpose of applying the threshold value is to identify those pixels of the image which correspond to the pupil of the eye. As the pixels associated with the pupil image will have a lower intensity than the surrounding iris, the threshold value is chosen to approximate the highest intensity expected from the pupil image for the illumination conditions present at the time the image was captured. This ensures that only the darker pupil pixels are selected and not the pixels imaging the relatively lighter surrounding iris structures. Once the pixels associated with the pupil are flagged, the next step is to determine the geographic center of the selected pixels. This geographic center will be the pixel of the image which represents the center of the pupil, as the pupil is circular in shape. Determining the geographic center of the selected pixels can be accomplished in a variety of ways. For example, the pixel block associated with the potential eye location can be scanned horizontally, column by column, until one of the selected pixels is detected within a column. This column location is noted and the horizontal scan is continued until a column containing no selected pixels is found. This second column location is also noted. A similar scanning process is then conducted vertically, so as to identify the first row in the block containing a selected pixel and the next subsequent row containing no selected pixels. The center of the pupil is chosen as the pixel having a column location in-between the noted columns and a row location in-between the noted rows. Any noise in the image or spots in the iris which are dark enough to be selected in the aforementioned thresholding step can skew the results of the just-described process. However, this possibility can be eliminated in a number of ways, for example by requiring there be a prescribed number of pixel columns or rows following the first detection before that column or row is noted as the outside edge of the pupil.

A blink at a potential eye location represents itself as a brief period where the eyelid is closed, e.g. about 2-3 image frames in length based on an imaging system producing about 30 frames per second. This would appear as a "disappearance" of a potential eye at an identified location for a few successive frames, followed by its "reappearance" in the next frame. The eye "disappears" from an image frame during the blink because the eyelid which covers the iris and pupil will exhibit a much greater average pixel intensity. Thus, the closed eye will not be detected by the previously described thresholding process.
Further, it is noted that when the eye opens again after the completion of the blink, it will be in approximately the same location as identified prior to the blink if a reasonable frame speed is employed by the imaging system. For example, a 30 frames per second rate is adequate to ensure the eye has not moved significantly in the 2-3 frames it takes to blink. Any slight movement of the eye is detected and compensated for by a correlation procedure to be described shortly.

The subsequent image frames could be processed as described above to re-identify potential eye locations which would then be correlated to the locations identified in previous frames in order to track the potential eyes in anticipation of detecting a blink. However, processing the entire image in subsequent frames requires considerable processing power and may not provide as accurate location data. FIG. 5 is a flow diagram of the preferred eye location tracking and blink detection process used to identify and track actual eye locations among the potential eye locations identified previously (i.e. steps 302 through 320 of FIG. 3). However, as will be discussed later, this process also provides correlation data which will be employed to detect an impaired operator. This preferred process uses cut-out blocks in the subsequent frames which are correlated to the potential eye locations in the previous frame to determine a new eye location. Processing just the cut-out blocks rather than the entire image saves considerable processing resources. The first step 502 in the process involves identifying the aforementioned cut-out blocks within the second image frame produced by the imaging system. This is preferably accomplished by identifying cut-out pixel blocks 20 in the second frame, each of which includes the pixel block 22 corresponding to the location of the block identified as a potential eye location in the previous image frame, and all adjacent Mx by My pixel blocks 24, as shown in FIG. 6. Next, in step 504, a matrix is created from the first image for each potential eye location. This matrix includes all the represented pixel intensities in an area surrounding the determined center of a potential eye location. Preferably, this area is bigger than the cut-out block employed in the second image. For example, an area having a size of 100 by 50 pixels could be employed. The center element of each matrix (which corresponds to the determined center of the pupil of the potential eye) is then "overlaid" in step 506 on each pixel in the associated cut-out block in the second image frame, starting with the pixel in the upper left-hand corner. A correlation procedure is then performed between each matrix and the overlaid pixels of its associated cut-out block. This correlation is accomplished using any appropriate conventional matrix correlation process. As these correlation processes are known in the art, no further detail will be provided herein. The result of the correlation is a correlation coefficient representing the degree to which the pixel matrix from the first image frame corresponded to the overlaid position in the associated cut-out block. This process is repeated for all the pixel locations in each cut-out block to produce a correlation coefficient matrix for each potential eye location. In step 508, a threshold value is compared to each element in the correlation coefficient matrices, and those which exceed the threshold are flagged. The flagged element in each of these correlation coefficient matrices which is larger than the rest of the elements corresponds to the pixel location in the second image which most closely matches the intensity profile of the associated potential eye location identified in the first image, and represents the center of the updated potential eye location in the second image frame. If such a maximum value is found, the corresponding pixel location in the second image is designated as the new center of the potential eye location (step 510). The threshold value was applied to ensure the pixel intensity values in the second frame were at least "in line" with those in the corresponding potential eye locations in the first image. Thus, the threshold is chosen so as to ensure a relatively high degree of correlation is observed. For example, a threshold value of at least 0.5 could be employed.

If none of the correlation coefficients exceeded the correlation threshold in a given iteration of the tracking procedure, then this is an indication the eye has been "lost", or perhaps a blink is occurring. This "no-correlation" condition is noted. Subsequent frames are then monitored and the number of consecutive times the "no-correlation" condition occurs is calculated in step 512. Whenever a no-correlation condition exists for a period of 2-3 frames, and then the potential eye is detected once again, this is indicative of a blink. If a blink is so detected, the status of the potential eye location is upgraded to a high confidence actual eye location (step 514). This is possible because an eye will always exhibit this blink response, and so the location can be deemed that of an actual eye with a high degree of confidence. The eye tracking and blink detection process (of FIG. 5) is repeated for each successive frame generated by the imaging apparatus with the addition that actual eye locations are tracked as well as the remaining potential eye locations (step 516). This allows the position of the actual and potential eye locations to be continuously updated. It is noted that the pixel matrix from the immediately preceding frame is used for the aforementioned correlation procedure whenever possible. However, where a no-correlation condition exists in any iteration of the tracking process, the present image is correlated using the pixel matrix from the last image frame where the affected eye location was updated.

Referring now to FIG. 7, if a potential eye location does not exhibit a blink response within 150 image frames, it is still tracked but assigned a low confidence status (i.e. a low probability it is an actual eye location) at step 702. Similarly, if a potential eye location becomes "lost" in that there is a no-correlation condition for more than 150 frames, this location is assigned a low confidence status (step 704). Further, if a blink has been detected at a potential eye location and its status upgraded to an actual eye location, but then this location is "lost", its status will depend on a secondary factor. This secondary factor is the presence of a second actual eye location having a geometric relationship to the first, as was described previously. If such a second eye location exists, the high confidence status of the "lost" actual eye does not change. If, however, there is no second eye location, then the "lost" actual eye is downgraded to a low confidence potential eye location (step 706). The determination of high and low confidence is important because the tracking process continues for all potential or actual eye locations only for as long as there is at least one remaining high confidence actual eye location or an un-designated potential eye location (i.e. a potential eye location which has not been assigned a low confidence status) being monitored (step 708). However, if only low confidence locations exist, the system is re-initialized and the entire eye finding and tracking process starts over (step 710).
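A sketch of the per-frame correlation and blink test described for FIG. 5 follows. It is written in the standard template-matching form, sliding the stored eye-area matrix over a search region extracted around the prior eye location and using a normalized correlation measure; the patent only requires "any appropriate conventional matrix correlation process". The 0.5 threshold and the 2-3 frame blink window come from the text, while everything else here is an assumption.

```python
import numpy as np

def correlation_coefficient_matrix(eye_matrix, search_region):
    """Correlate the stored eye-area matrix against every position of the search
    region, producing one coefficient per overlaid position (steps 504-506)."""
    th, tw = eye_matrix.shape
    t = (eye_matrix - eye_matrix.mean()) / (eye_matrix.std() + 1e-9)
    rows = search_region.shape[0] - th + 1
    cols = search_region.shape[1] - tw + 1
    cc = np.empty((rows, cols))
    for r in range(rows):
        for c in range(cols):
            patch = search_region[r:r + th, c:c + tw]
            p = (patch - patch.mean()) / (patch.std() + 1e-9)
            cc[r, c] = float((t * p).mean())    # roughly in [-1, 1]
    return cc

def updated_eye_center(cc, threshold=0.5):
    """Steps 508-512: return the offset of the largest coefficient above the
    threshold, or None to signal a 'no-correlation' condition."""
    r, c = np.unravel_index(int(np.argmax(cc)), cc.shape)
    return (int(r), int(c)) if cc[r, c] > threshold else None

def blink_detected(lost_flags):
    """True when the eye reappears after being 'lost' for 2-3 consecutive frames
    (lost_flags lists the no-correlation condition per frame, most recent last)."""
    if len(lost_flags) < 3 or lost_flags[-1]:
        return False
    run = 0
    for lost in reversed(lost_flags[:-1]):
        if not lost:
            break
        run += 1
    return 2 <= run <= 3
```

In use, a None return from updated_eye_center would be recorded as a no-correlation frame, and blink_detected would then promote the potential eye location to a high confidence actual eye location once the 2-3 frame pattern is seen.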
Once at least one actual eye location has been identified, the impaired operator detection process, depicted in FIGS. 8A-E, can begin. As shown in FIG. 8A, the first step 802 in the process is to begin monitoring the correlation coefficient matrix associated with an identified actual eye location as derived for each subsequent image frame produced by the imaging apparatus. It will be remembered that the center element of each pixel matrix corresponding to a potential or actual eye location in a previous image frame was "overlaid" (step 506 of FIG. 5) onto each pixel in the associated cut-out block in a current image frame, starting with the pixel in the upper left-hand corner. A correlation procedure was performed between the matrix and the overlaid pixels of its associated cut-out block. The result of the correlation was a correlation coefficient representing the degree to which the pixel matrix from the first image frame corresponded to the overlaid position in the associated cut-out block. The correlation process was then repeated for all the pixel locations in each cut-out block to produce a correlation coefficient matrix for each potential or actual eye location. This is the correlation coefficient matrix, as associated with an identified actual eye location, that is employed in step 802. In the next step 804, the correlation coefficient having the maximum value within a correlation coefficient matrix is identified and stored. The maximum correlation coefficient values from each image frame are then put through a recursive analysis. Essentially, when the first N consecutive maximum correlation coefficient values for each identified actual eye location have been stored, these values are averaged (step 806). N is chosen so as to at least correspond to the number of image frames it would take to image the longest expected duration of a blink. For example, in a tested embodiment of the present invention, N was chosen as seven frames, which corresponded to 0.25 seconds based on an imaging frame rate of about 30 frames per second. This averaging process is then repeated for each identified actual eye location upon the production of each subsequent image frame (and so each new correlation coefficient matrix), except that the immediately preceding N maximum correlation coefficient values are employed rather than the first N values (step 808). Thus, an updated average is provided every frame. The above-described process is performed simultaneously for each identified actual eye location, so that both eyes can be analyzed independently.

FIGS. 9A-B graph the maximum correlation coefficients identified and stored over a period of time for the right eye of an alert operator and a drowsy operator, respectively, as derived from a tested embodiment of the present invention. The dips in both graphs toward the right-hand side represent blinks. It is evident from these graphs that the average correlation coefficient value associated with an alert operator's blink will be significantly higher than that of a drowsy operator's blink. It is believed that a similar divergence will exist with other "non-alert" states such as when an operator is intoxicated. Further, it is noted that the average correlation coefficient value over N frames which cover a complete blink of an operator, alert or impaired, will be lower than any other N frame average. Therefore, as depicted in FIG. 8B, one way of detecting an impaired operator would be to compare the average maximum correlation coefficient value (as derived in step 806) to a threshold representing the average maximum correlation coefficient value which would be obtained for N image frames covering an alert operator's blink (step 810). If the derived average is less than the alert operator threshold, then this would be an indication that the operator may be impaired in some way, and in step 812 an indication of such is provided to the impaired operator warning unit (of FIG. 1). Further, the threshold can be made applicable to any operator by choosing it to correspond to the minimum expected average for any alert operator. It is believed the minimum average associated with an alert operator will still be significantly higher than even a maximum average associated with an impaired operator. The average maximum correlation coefficient value associated with N frames encompassing an entire blink is related to the duration of the blink. Namely, the longer the duration of the blink, the lower the average. This is consistent with the phenomenon that an impaired operator's blink is slower than that of an alert operator. This eyeblink duration determination and comparison process is repeated for each image frame produced subsequent to the initial duration determination.

Another blink characteristic which tends to distinguish an alert operator from an impaired operator is the frequency of blinks. Typically, an impaired individual will blink less often than an alert individual. The frequency of an operator's blinks can be imputed from the change in derived average maximum correlation coefficient values over time. Referring to FIG. 8C, this is accomplished by counting the number of minimum average values associated with a blink for each identified actual eye location that occurs over a prescribed period of time, and dividing this number by that period to determine the frequency of blinks (step 814). The occurrence of a minimum average can be determined by identifying an average value having averages associated with the previous and subsequent few frames which are greater, and which is below an expected blink average. The expected blink average is an average corresponding to the maximum that would still be consistent with a blink of an alert operator. Requiring the identified minimum average to fall below this expected average ensures the minimum average is associated with a blink and not just slight movement of the eyelid between blinks. The prescribed period of time is chosen so as to average out any variations in the time between blinks. For example, counting the number of minimums that occur over 60 seconds would provide a satisfactory result. Once the frequency of blinks has been established, it is compared to a threshold value representing the minimum blink frequency expected from an alert operator (step 816). If the derived frequency is less than the blink frequency threshold, then this would be an indication that the operator was impaired, and in step 818 an indication of such is provided to the impaired operator warning unit. This frequency determination and comparison process would be continually repeated for each image frame produced subsequent to the initial frequency determination, except that only those average coefficient values derived over a preceding period of time equivalent to the prescribed period would be analyzed.
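The sliding N-frame average and the duration and frequency tests of FIGS. 8A-8C can be sketched as follows. N = 7, the roughly 30 frame/s rate, and the 60 second counting period come from the text; the threshold arguments and the two-frame neighborhood used to detect a local minimum are illustrative assumptions.

```python
from collections import deque

def rolling_max_coefficient_average(max_coefficients, n=7):
    """Yield the average of the most recent N maximum correlation coefficients,
    one updated value per frame once N frames are available (steps 804-808)."""
    window = deque(maxlen=n)
    for value in max_coefficients:
        window.append(value)
        if len(window) == n:
            yield sum(window) / n

def duration_indicates_impairment(averages, alert_duration_threshold):
    """Steps 810-812: a low N-frame average corresponds to a long, slow blink."""
    return any(a < alert_duration_threshold for a in averages)

def blink_frequency(averages, expected_blink_average, period_seconds=60.0, fps=30.0):
    """Steps 814-816: count local minima of the averaged coefficients that fall
    below the expected blink average, then divide by the observation period."""
    values = list(averages)[-int(period_seconds * fps):]
    blinks = 0
    for i in range(2, len(values) - 2):
        neighborhood = values[i - 2:i] + values[i + 1:i + 3]
        if values[i] < expected_blink_average and values[i] < min(neighborhood):
            blinks += 1
    observed_seconds = len(values) / fps
    return blinks / observed_seconds if observed_seconds else 0.0
```

A frequency below the alert-operator minimum, like a duration average below its threshold, would then be reported to the warning unit.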
5,867.587
13 14
blink amplitude expected from an alert operator (step 826). of operator impairedness. For this reason, it is preferred that
If the derived amplitude is less than the blink amplitude other corroborating indications that the operator is impaired
threshold, then this would also be an indication that the be employed. For example, Some impaired operator moni
operator is impaired, and in Step 828, an indication of Such toring Systems operate by evaluating an operator's control
is provided to the impaired operator warning unit. Here too, actions. One Such example is the disclosed in a co-pending
the blink amplitude determination and comparison process is application entitled IMPAIRED OPERATOR DETECTION
repeated for each image frame produced Subsequent to the AND WARNING SYSTEM EMPLOYING ANALYSIS OF
initial frequency determination So that each Subsequent blink OPERATOR CONTROL ACTIONS, having the same
is analyzed. assignee as the present application. This co-pending appli
Still another blink characteristic that could be utilized to cation was filed on Apr. 2, 1997 and assigned serial number
distinguish an alert operator from an impaired operator is the 08/832,397 now U.S. Pat. No. 5,798,695. As shown in FIG.
consistency of blink characteristics between the left and 10, when a corroborating impairedness indicator is
right eyes of an operator. It has been found that the duration, employed, a warning would not be initiated by the warning
frequency and/or amplitude of an alert individual’s contem unit unless, at least one of the analyzed eyeblink character
poraneously occurring blinks will be be apparent consistent 15 istics indicated the operator may be impaired, and the
between eyes, whereas this consistency is leSS apparent in an corroborating indicator also indicated impairedness (Step
impaired individual’s blinks. When two actual eye locations 848).
have been identified and are being analyzed, the difference Another way of increasing the confidence that an operator
between like characteristics can be determined and com is actually impaired based on an analysis of his or her
pared to a consistency threshold. Preferably, this is done by eyeblinks, would be to require more than one of the afore
determining the difference between a characteristic occur mentioned indicators to point to an impaired operator before
ring in one eye and the next like characteristic occurring in initiating a warning. An extreme example would be a
the other eye. It does not matter which eye is chosen first. If requirement that all the impairedneSS indicators, i.e. blink
two actual eye locations have not been identified, the con duration, frequency, amplitude, and inter-eye consistency (if
Sistency analysis is postponed until both locations are avail 25 available), indicate the operator is impaired before initiating
able for analysis. Referring to FIG. 8E, the aforementioned a warning. Of course, Some indicators can be more definite
consistency analysis process preferably includes determin than others, and thus should be given a higher priority.
ing the difference between the average maximum correlation Accordingly, a voting logic could be employed which will
coefficient values (which are indicative of the duration of a assist in the determination whether an operator is impaired,
blink) for the left and right eyes (step 830) and then or not. This voting logic could result in an immediate
comparing this difference to a duration consistency thresh indication of impairedness if a more definite indicator is
old (step 832). This duration consistency threshold corre detected, but require two or more of lesser indicators to be
sponds to the expected maximum difference between the detected before a determination of impairedness is made.
average coefficient values for the left and right eye of an alert The particular indicator or combination of indicators which
individual. If the derived difference exceeds the threshold, 35 should be employed to increase the confidence of the System
then there is an indication that the operator is impaired, and could be determined empirically by analyzing alert and
in step 834, an indication of Such is provided to the impaired impaired operators in Simulated conditions. Additionally,
operator warning unit. Similar differences can be calculated evaluating changes in an indicator over time can be advan
(steps 836 and 838) and threshold comparisons made (steps tageous because temporary effects which affect the accuracy
840, 842) for the eyeblink frequency and amplitude derived 40 of the detection process, Such as the aforementioned
from the average coefficient values for each eye as described example of Squinting caused by the glare of oncoming
previously. If the differences exceed appropriate frequency headlights, can be filtered out. For example, if an indicator
and/or amplitude consistency thresholds, this too would be Such as blink duration where determined to indicate an
an indication of an impaired operator, and an indication of impaired driver over a Series of image frames, but then
Such is provided to the impaired operator warning unit (steps 45 change to indicate and alert driver, this could indicate a
844, 846). This consistency determining process is also temporary skewing factor had been present. Such a problem
repeated for each image frame produced Subsequent to the could be resolved by requiring an indicator to remain in a
respective initial characteristic determination, as long as two State indicating impairedness for Some minimum amount of
actual eye locations are being analyzed. time before the operator is deemed impaired and a decision
Whenever one or more of the analyzed blink characteristics (i.e. duration, frequency, amplitude, and inter-eye consistencies) indicate that the operator may be impaired, a decision is made as to whether a warning should be initiated by the impaired operator warning unit. A warning could be issued when any one of the analyzed blink characteristics indicates the operator may be impaired. However, the indicators of impairedness, when viewed in isolation, may not always give an accurate picture of the operator's alertness level. From time to time, circumstances other than impairedness might cause the aforementioned characteristics to be exhibited. For example, in the case of an automobile driver, the glare of headlights from an oncoming car at night might cause the driver to squint, thereby affecting his or her eyelid position, blink rate, and other eye-related factors, which might cause one or more of the indicators to falsely indicate that the driver is impaired. Accordingly, when viewed alone, any one indicator could result in a false determination

mentioned indicators to point to an impaired operator before initiating a warning. An extreme example would be a requirement that all the impairedness indicators, i.e. blink duration, frequency, amplitude, and inter-eye consistency (if available), indicate the operator is impaired before initiating a warning. Of course, some indicators can be more definite than others, and thus should be given a higher priority. Accordingly, voting logic could be employed to assist in determining whether or not an operator is impaired. This voting logic could result in an immediate indication of impairedness if a more definite indicator is detected, but require two or more lesser indicators to be detected before a determination of impairedness is made. The particular indicator or combination of indicators which should be employed to increase the confidence of the system could be determined empirically by analyzing alert and impaired operators in simulated conditions. Additionally, evaluating changes in an indicator over time can be advantageous because temporary effects which affect the accuracy of the detection process, such as the aforementioned example of squinting caused by the glare of oncoming headlights, can be filtered out. For example, if an indicator such as blink duration were determined to indicate an impaired driver over a series of image frames, but then changed to indicate an alert driver, this could indicate that a temporary skewing factor had been present. Such a problem could be resolved by requiring an indicator to remain in a state indicating impairedness for some minimum amount of time before the operator is deemed impaired and a decision is made to initiate a warning. Here again, the particular time frames can be established empirically by evaluating operators in simulated conditions. The methods of requiring more than one indicator to indicate impairedness, employing voting logic, and/or evaluating changes in the indicators can be employed with or without the additional precaution of the aforementioned corroborating "non-eyeblink derived" impairedness indicator.
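For purposes of illustration only, the voting and persistence strategies just described lend themselves to a simple software sketch. The weights, the vote threshold, and the minimum persistence time below are placeholders standing in for the empirically determined values discussed above, and the names are assumptions.

import time

WEIGHTS = {'duration': 2, 'frequency': 1, 'amplitude': 1, 'consistency': 2}  # assumed priorities
VOTE_THRESHOLD = 2        # assumed: one "definite" indicator or two lesser ones
MIN_PERSISTENCE_S = 3.0   # assumed minimum time an indication must persist

_first_seen = None        # state for the persistence filter

def warning_should_issue(active_indicators, now=None):
    """active_indicators: names of indicators currently pointing to impairment,
    e.g. {'duration', 'consistency'}. Returns True only when the weighted vote
    meets the threshold and has done so continuously for MIN_PERSISTENCE_S."""
    global _first_seen
    now = time.monotonic() if now is None else now
    votes = sum(WEIGHTS.get(name, 1) for name in active_indicators)
    if votes < VOTE_THRESHOLD:
        _first_seen = None        # indication lapsed; treat it as a temporary effect
        return False
    if _first_seen is None:
        _first_seen = now
    return (now - _first_seen) >= MIN_PERSISTENCE_S

Resetting the timer whenever the vote lapses is one way of filtering out the temporary skewing factors mentioned above, such as squinting caused by oncoming headlights.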
While the invention has been described in detail by reference to the preferred embodiment described above, it is understood that variations and modifications thereof may be made without departing from the true spirit and scope of the invention.

Wherefore, what is claimed is:

1. A method of detecting an impaired operator, comprising the steps of:
(a) employing an imaging apparatus which produces consecutive digital images including the face and eyes
of an operator, each digital image comprising an array of pixels representing the intensity of light reflected from the face of the subject;
(b) determining the location of a first one of the operator's eyes within each digital image;
(c) generating correlation coefficients, each of which quantifies the degree of correspondence between pixels associated with the location of the operator's eye in an immediately preceding image in comparison to pixels associated with the location of the operator's eye in a current image;
(d) averaging the first N consecutive correlation coefficients generated to generate a first average correlation coefficient, wherein N corresponds to at least the number of images required to image a blink of the operator's eye;
(e) after the production of the next image by the imaging apparatus, averaging the previous N consecutive correlation coefficients generated to create a next average correlation coefficient;
(f) repeating step (e) for each image frame produced by the imaging apparatus;
(g) analyzing said average correlation coefficients to extract at least one parameter attributable to an eyeblink of said operator's eye;
(h) comparing each extracted parameter to an alert operator threshold associated with that parameter, said threshold being indicative of an alert operator; and
(i) indicating that the operator may be impaired if any extracted parameter deviates from the associated threshold in a prescribed way.
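For illustration only, and not as a limitation of the claim, steps (d) through (f) amount to maintaining a running average over the N most recent correlation coefficients, one new coefficient arriving per image frame. The Python sketch below uses assumed names.

from collections import deque

class AverageCorrelation:
    def __init__(self, n):
        self.window = deque(maxlen=n)   # holds the N most recent coefficients

    def update(self, coefficient):
        """Add the coefficient from the newest image frame and return the current
        average correlation coefficient, or None until N coefficients exist."""
        self.window.append(coefficient)
        if len(self.window) < self.window.maxlen:
            return None
        return sum(self.window) / len(self.window)

In such a scheme the average stays high while the imaged eye region changes little from frame to frame and dips while a blink is being imaged; that behavior is what the analyzing, comparing, and indicating steps (g) through (i) operate on.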
2. The method of claim 1, further comprising the steps of:
(j) determining the location of the other of the operator's eyes within each digital image; and
(k) performing steps (c) through (i) for the location of the other of the operator's eyes.
3. The method of claim 1, wherein:
the analyzing step comprises extracting a parameter indicative of the duration of an operator's eyeblinks;
the comparing step comprises comparing the extracted duration parameter to said associated alert operator threshold wherein the threshold corresponds to a maximum eyeblink duration expected to be exhibited by an alert operator's eye; and
the indicating step comprises indicating the operator may be impaired whenever the extracted duration parameter exceeds the alert operator duration threshold.
4. The method of claim 3, wherein:
the step of extracting the duration parameter comprises identifying each average correlation coefficient generated;
the step of comparing the extracted duration parameter to said associated alert operator duration threshold comprises comparing each average correlation coefficient to a minimum average correlation coefficient value which would be obtained by averaging the correlation coefficients generated for N images capturing a complete blink of an alert operator's eye; and wherein
the extracted duration parameter exceeds the alert operator threshold whenever the average correlation coefficients are less than the minimum average correlation coefficient value which would be obtained by averaging the correlation coefficients generated for N images capturing a complete blink of an alert operator's eye.
5. The method of claim 1, wherein:
the analyzing step comprises extracting a parameter indicative of the frequency of an operator's eyeblinks;
the comparing step comprises comparing the extracted frequency parameter to said associated alert operator threshold wherein the threshold corresponds to a minimum eyeblink frequency expected to be exhibited by an alert operator's eye; and
the indicating step comprises indicating the operator may be impaired whenever the extracted frequency parameter is less than the alert operator frequency threshold.
6. The method of claim 1, wherein the step of extracting the frequency parameter comprises the steps of counting the number of minimum average correlation coefficients occurring over a prescribed preceding period of time which are less than a prescribed value, and thereafter dividing the counted number of minimum average correlation coefficients by the prescribed period of time to determine an eyeblink frequency, wherein said prescribed value corresponds to a maximum average correlation coefficient value which would be obtained by averaging the correlation coefficients generated for N images capturing a complete blink of an alert operator's eye, said analyzing step being repeated each time an average correlation coefficient is generated.
7. The method of claim 1, wherein:
the analyzing step comprises extracting a parameter indicative of the amplitude of an operator's eyeblinks;
the comparing step comprises comparing the extracted amplitude parameter to said associated alert operator threshold wherein the threshold corresponds to a minimum eyeblink amplitude expected to be exhibited by an alert operator's eye; and
the indicating step comprises indicating the operator may be impaired whenever the extracted amplitude parameter is less than the alert operator amplitude threshold.
8. The method of claim 1, wherein the step of extracting the amplitude parameter comprises the steps of identifying each occurrence of a minimum average correlation coefficient which is less than a prescribed value and for each such occurrence determining the absolute value of the difference between the minimum average correlation coefficient and the next occurring maximum average correlation coefficient, wherein said absolute value is indicative of the amplitude of an operator's eyeblink, and wherein said prescribed value corresponds to a maximum average correlation coefficient value which would be obtained by averaging the correlation coefficients generated for N images capturing a complete blink of an alert operator's eye.
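By way of illustration only, the extraction details recited in claims 4, 6, and 8 might be coded as follows. In this Python sketch, c_blink stands for the reference value obtained by averaging the correlation coefficients of N images capturing a complete blink of an alert operator's eye; the argument names, and the assumption that local extrema of the averaged signal have already been identified, are made solely for clarity.

def blink_duration_exceeded(avg_coefficients, c_blink):
    """Claim 4: the duration parameter exceeds the alert-operator threshold
    whenever the average correlation coefficients fall below c_blink."""
    return any(c < c_blink for c in avg_coefficients)

def blink_frequency(local_minima, c_blink, period_s):
    """Claim 6: count the minima of the averaged signal that are less than c_blink
    over the preceding period, then divide by that period."""
    blinks = sum(1 for m in local_minima if m < c_blink)
    return blinks / period_s

def blink_amplitudes(extrema, c_blink):
    """Claim 8: for each minimum below c_blink, the amplitude is the absolute
    difference between that minimum and the next occurring maximum.
    extrema: chronological list of (kind, value) pairs with kind 'min' or 'max'."""
    amplitudes = []
    for i, (kind, value) in enumerate(extrema):
        if kind == 'min' and value < c_blink:
            for later_kind, later_value in extrema[i + 1:]:
                if later_kind == 'max':
                    amplitudes.append(abs(later_value - value))
                    break
    return amplitudes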
9. The method of claim 2, wherein:
the comparing step comprises first determining the difference between an extracted parameter associated with the first eye and a like extracted parameter associated with the other eye to establish a parameter consistency factor for the extracted parameter, and thereafter comparing the established parameter consistency factor to an alert operator consistency threshold associated with that parameter.
10. The method of claim 9, wherein:
the analyzing step comprises extracting a parameter indicative of the duration of an operator's eyeblinks;
the comparing step comprises determining the difference between the duration of each of the operator's eyeblinks associated with the first eye to the duration of a next occurring eyeblink associated with the other eye to establish a duration consistency factor, and thereafter
comparing the determined duration consistency factor to said associated alert operator consistency threshold wherein the threshold corresponds to a minimum difference in the eyeblink duration expected to be exhibited by an alert operator's eyes; and
the indicating step comprises indicating that the operator may be impaired whenever the determined duration consistency factor exceeds the associated alert operator duration consistency threshold.
11. The method of claim 9, wherein:
the analyzing step comprises extracting parameters indicative of the frequency of an operator's eyeblinks;
the comparing step comprises determining the difference between the frequency of the operator's eyeblinks associated with the first eye as calculated over a prescribed period of time to the contemporaneous frequency of the eyeblinks associated with the other eye as calculated for the prescribed period of time to establish a frequency consistency factor, and thereafter comparing the determined frequency consistency factor to said associated alert operator consistency threshold wherein the threshold corresponds to a minimum difference in the eyeblink frequency expected to be exhibited by an alert operator's eyes; and
the indicating step comprises indicating that the operator may be impaired whenever the determined frequency consistency factor exceeds the alert operator frequency consistency threshold.
12. The method of claim 9, wherein:
the analyzing step comprises extracting parameters indicative of the amplitude of an operator's eyeblinks;
the comparing step comprises determining the difference between the amplitude of each of the operator's eyeblinks associated with the first eye to the completeness of the next occurring eyeblink associated with the other eye to establish an amplitude consistency factor, and thereafter comparing the determined amplitude consistency factor to said associated alert operator consistency threshold wherein the threshold corresponds to a minimum difference in the amplitude of eyeblinks expected to be exhibited by an alert operator's eyes; and
the indicating step comprises indicating that the operator may be impaired whenever the determined amplitude consistency factor exceeds the alert operator amplitude consistency threshold.
13. The method of claim 2, wherein plural parameters attributable to an eyeblink of said operator's eye are extracted, and wherein:
the analyzing step comprises extracting parameters indicative of the duration, frequency, and amplitude of an operator's eyeblinks;
the comparing step further comprises:
comparing the extracted duration parameter to an alert operator duration threshold wherein the duration threshold corresponds to a maximum eyeblink duration expected to be exhibited by an alert operator's eye,
comparing the extracted frequency parameter to an alert operator frequency threshold wherein the frequency threshold corresponds to a minimum eyeblink frequency expected to be exhibited by an alert operator's eye,
comparing the extracted amplitude parameter to an alert operator amplitude threshold wherein the amplitude threshold corresponds to a minimum eyeblink amplitude expected to be exhibited by an alert operator's eye, and
determining the difference between at least one of the extracted parameters associated with the first eye and a like extracted parameter associated with the other eye to establish a consistency factor for the extracted parameter, and thereafter comparing the established parameter consistency factor to an alert operator consistency threshold associated with that parameter.
14. The method of claim 13, wherein the indicating step comprises indicating the operator may be impaired whenever at least one of (1) the extracted duration parameter exceeds the alert operator duration threshold, (2) the extracted frequency parameter is less than the alert operator frequency threshold, (3) the extracted amplitude parameter is less than the alert operator amplitude threshold, and (4) an established parameter consistency factor exceeds an associated alert operator consistency threshold.
15. The method of claim 1, further comprising the steps of:
(j) generating a corroborating indicator of operator impairedness whenever operator control inputs are indicative of the operator being impaired; and
(k) indicating that the operator is impaired if at least one of the extracted parameters deviates from the associated threshold in a prescribed way, and the corroborating indicator is generated.
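For illustration only, the combination recited in claim 15 (and, for the system, in claim 27) reduces to a conjunction of two indications. How the corroborating indicator is derived from the operator control inputs is not sketched here; the function and argument names below are assumptions.

def operator_impaired(parameter_deviates, corroborating_indicator):
    """parameter_deviates: True when at least one extracted eyeblink parameter
    deviates from its alert-operator threshold in the prescribed way.
    corroborating_indicator: True when operator control inputs independently
    indicate impairment. The operator is indicated as impaired only when both hold."""
    return parameter_deviates and corroborating_indicator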
16. An impaired operator detection and warning system, comprising:
an imaging apparatus which produces consecutive digital images including the face and eyes of an operator, each digital image comprising an array of pixels representing the intensity of light reflected from the face of the subject;
an eye finding unit comprising an eye finding processor, said eye finding processor comprising:
a first processor portion capable of determining the location of a first one of the operator's eyes within each digital image,
a second processor portion capable of generating correlation coefficients each of which quantifies the degree of correspondence between pixels associated with the location of the operator's eye in an immediately preceding image in comparison to pixels associated with the location of the operator's eye in a current image;
an impaired operator detection unit comprising an impaired operator detection processor, said impaired operator detection processor comprising:
a first processor portion capable of averaging the first N consecutive correlation coefficients generated to generate a first average correlation coefficient, wherein N corresponds to at least the number of images required to image a blink of the operator's eye,
a second processor portion capable of, after the production of the next image by the imaging apparatus, averaging the previous N consecutive correlation coefficients generated to create a next average correlation coefficient, and repeating the averaging for each image frame produced by the imaging apparatus,
a third processor portion capable of analyzing said average correlation coefficients to extract at least one parameter attributable to an eyeblink of said operator's eye,
a fourth processor portion capable of comparing each extracted parameter to an alert operator threshold
associated with that parameter, said threshold being indicative of an alert operator; and
an impaired operator warning unit comprising an impaired operator warning processor, said impaired operator warning processor capable of indicating that the operator may be impaired if any extracted parameter deviates from the associated threshold in a prescribed way.
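For illustration only, the unit decomposition recited in claim 16 could map onto a software structure such as the following. The class and method names are assumptions, and the numerical work performed by each processor portion is elided; the earlier sketches suggest what those bodies might contain.

class EyeFindingUnit:
    def locate_first_eye(self, image):                                    # first processor portion
        raise NotImplementedError
    def correlation_coefficient(self, previous_image, current_image, location):  # second processor portion
        raise NotImplementedError

class ImpairedOperatorDetectionUnit:
    def __init__(self, n):
        self.n = n                                    # N >= frames needed to image one blink
    def average_last_n(self, coefficients):           # first and second processor portions
        window = coefficients[-self.n:]
        return sum(window) / len(window)
    def extract_parameters(self, averaged_series):    # third processor portion
        raise NotImplementedError                     # duration, frequency, amplitude, ...
    def deviating_parameters(self, parameters, deviates):  # fourth processor portion
        # 'deviates' maps each parameter name to a predicate standing in for
        # "deviates from the associated alert-operator threshold in a prescribed way".
        return {name for name, value in parameters.items() if deviates[name](value)}

class ImpairedOperatorWarningUnit:
    def indicate(self, deviating):
        if deviating:
            print("operator may be impaired:", sorted(deviating))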
17. The system of claim 16, wherein:
the third processor portion of the impaired operator detection processor is capable of extracting a parameter indicative of the duration of an operator's eyeblinks;
the fourth processor portion of the impaired operator detection processor is capable of comparing the extracted duration parameter to said associated alert operator threshold wherein the threshold corresponds to a maximum eyeblink duration expected to be exhibited by an alert operator's eye; and
the impaired operator warning processor is capable of indicating the operator may be impaired whenever the extracted duration parameter exceeds the alert operator duration threshold.
18. The system of claim 16, wherein:
the third processor portion of the impaired operator detection processor is capable of extracting a parameter indicative of the frequency of an operator's eyeblinks;
the fourth processor portion of the impaired operator detection processor is capable of comparing the extracted frequency parameter to said associated alert operator threshold wherein the threshold corresponds to a minimum eyeblink frequency expected to be exhibited by an alert operator's eye; and
the impaired operator warning processor is capable of indicating the operator may be impaired whenever the extracted frequency parameter is less than the alert operator frequency threshold.
19. The system of claim 16, wherein:
the third processor portion of the impaired operator detection processor is capable of extracting a parameter indicative of the amplitude of an operator's eyeblinks;
the fourth processor portion of the impaired operator detection processor is capable of comparing the extracted amplitude parameter to said associated alert operator threshold wherein the threshold corresponds to a minimum eyeblink amplitude expected to be exhibited by an alert operator's eye; and
the impaired operator warning processor is capable of indicating the operator may be impaired whenever the extracted amplitude parameter is less than the alert operator amplitude threshold.
20. The system of claim 16, wherein:
the first processor portion of the eye finding processor is further capable of determining the location of an other one of the operator's eyes within each digital image;
the second processor portion of the eye finding processor is further capable of generating correlation coefficients each of which quantifies the degree of correspondence between pixels associated with the location of the operator's other eye in an immediately preceding image in comparison to pixels associated with the location of the operator's other eye in a current image;
the first processor portion of the impaired operator detection processor is further capable of averaging the first N consecutive correlation coefficients generated to generate a first average correlation coefficient associated with the operator's other eye, wherein N corresponds to at least the number of images required to image a blink of the operator's other eye;
the second processor portion of the impaired operator detection processor is further capable of, after the production of the next image by the imaging apparatus, averaging the previous N consecutive correlation coefficients generated and associated with the operator's other eye, to create a next average correlation coefficient associated with the operator's other eye, and repeating the averaging for each image frame produced by the imaging apparatus;
the third processor portion of the impaired operator detection processor is further capable of analyzing said average correlation coefficients associated with the operator's other eye to extract at least one parameter attributable to an eyeblink of said operator's other eye;
the fourth processor portion of the impaired operator detection processor is further capable of comparing each extracted parameter associated with the operator's other eye to an alert operator threshold associated with that parameter, said threshold being indicative of an alert operator; and
the impaired operator warning processor is further capable of indicating that the operator may be impaired if any extracted parameter associated with the operator's other eye deviates from the associated threshold in a prescribed way.
21. The system of claim 20, wherein:
the fourth processor portion of the impaired operator detection processor is further capable of first determining the difference between an extracted parameter associated with the operator's first eye and a like extracted parameter associated with the operator's other eye to establish a parameter consistency factor for the extracted parameter, and thereafter comparing the established parameter consistency factor to an alert operator consistency threshold associated with that parameter.
22. The system of claim 21, wherein:
the third processor portion of the impaired operator detection processor is further capable of extracting a parameter indicative of the duration of an operator's eyeblinks;
the fourth processor portion of the impaired operator detection processor is further capable of determining the difference between the duration of each of the operator's eyeblinks associated with the first eye to the duration of a next occurring eyeblink associated with the other eye to establish a duration consistency factor, and thereafter comparing the determined duration consistency factor to said associated alert operator consistency threshold wherein the threshold corresponds to a minimum difference in the eyeblink duration expected to be exhibited by an alert operator's eyes; and
the impaired operator warning processor is further capable of indicating that the operator may be impaired whenever the determined duration consistency factor exceeds the associated alert operator duration consistency threshold.
23. The system of claim 21, wherein:
the third processor portion of the impaired operator detection processor is further capable of extracting parameters indicative of the frequency of an operator's eyeblinks;
the fourth processor portion of the impaired operator detection processor is further capable of determining
the difference between the frequency of the operator's eyeblinks associated with the first eye as calculated over a prescribed period of time to the contemporaneous frequency of the eyeblinks associated with the other eye as calculated for the prescribed period of time to establish a frequency consistency factor, and thereafter comparing the determined frequency consistency factor to said associated alert operator consistency threshold wherein the threshold corresponds to a minimum difference in the eyeblink frequency expected to be exhibited by an alert operator's eyes; and
the impaired operator warning processor is further capable of indicating that the operator may be impaired whenever the determined frequency consistency factor exceeds the alert operator frequency consistency threshold.
24. The system of claim 21, wherein:
the third processor portion of the impaired operator detection processor is further capable of extracting parameters indicative of the amplitude of an operator's eyeblinks;
the fourth processor portion of the impaired operator detection processor is further capable of determining the difference between the amplitude of each of the operator's eyeblinks associated with the first eye to the completeness of the next occurring eyeblink associated with the other eye to establish an amplitude consistency factor, and thereafter comparing the determined amplitude consistency factor to said associated alert operator consistency threshold wherein the threshold corresponds to a minimum difference in the amplitude of eyeblinks expected to be exhibited by an alert operator's eyes; and
the impaired operator warning processor is further capable of indicating that the operator may be impaired whenever the determined amplitude consistency factor exceeds the alert operator amplitude consistency threshold.
25. The system of claim 20, wherein plural parameters attributable to an eyeblink of said operator's eyes are extracted, and wherein:
the third processor portion of the impaired operator detection processor is further capable of extracting parameters indicative of the duration, frequency, and amplitude of an operator's eyeblinks;
the fourth processor portion of the impaired operator detection processor is further capable of:
comparing each extracted duration parameter associated with the operator's eyes to an alert operator duration threshold wherein the duration threshold corresponds to a maximum eyeblink duration expected to be exhibited by an alert operator's eyes,
comparing each extracted frequency parameter associated with the operator's eyes to an alert operator frequency threshold wherein the frequency threshold corresponds to a minimum eyeblink frequency expected to be exhibited by an alert operator's eyes,
comparing each extracted amplitude parameter associated with the operator's eyes to an alert operator amplitude threshold wherein the amplitude threshold corresponds to a minimum eyeblink amplitude expected to be exhibited by an alert operator's eyes, and
determining the difference between at least one of the extracted parameters associated with the first eye and a like extracted parameter associated with the other eye to establish a consistency factor for the extracted parameter, and thereafter comparing the established parameter consistency factor to an alert operator consistency threshold associated with that parameter.
26. The system of claim 25, wherein the impaired operator warning processor is further capable of indicating the operator may be impaired whenever at least one of (1) any extracted duration parameter exceeds the alert operator duration threshold, (2) any extracted frequency parameter is less than the alert operator frequency threshold, (3) any extracted amplitude parameter is less than the alert operator amplitude threshold, and (4) an established parameter consistency factor exceeds an associated alert operator consistency threshold.
27. The system of claim 16, further comprising:
a corroborating operator alertness indicator unit capable of generating a corroborating indicator of operator impairedness whenever operator control inputs are indicative of the operator being impaired; and wherein
the impaired operator warning processor indicates that the operator is impaired whenever at least one of the extracted parameters deviates from the associated threshold in a prescribed way, and the corroborating indicator is generated.

* * * * *