Apparatus for monitoring movement of a person's eye, e.g., to monitor drowsiness. The system includes a frame that is worn on a person's head, an array of emitters on the frame for directing light towards the person's eye, and an array of sensors on the frame for detecting light from the array of emitters. The sensors detect light that is reflected off of respective portions of the eye or its eyelid, thereby producing output signals indicating when the respective portions of the eye are covered by the eyelid. The emitters project a reference frame towards the eye, and a camera on the frame monitors movement of the eye relative to the reference frame. This movement may be correlated with the signals from the array of sensors and/or with signals from other sensors on the frame to monitor the person's level of drowsiness.

Patent
   RE42471
Priority
Aug 19 1996
Filed
Aug 27 2008
Issued
Jun 21 2011
Expiry
Aug 19 2016
Entity
Small
0. 42. A system for controlling a computing device, comprising:
a frame configured to be worn on a person's head such that the frame does not interfere substantially with the person's vision along an axis extending directly ahead of a first eye of the person;
a sensor on the frame comprising a lens offset from the axis and oriented directly towards the region surrounding the first eye when the frame is worn, the sensor generating output signals representing video images of the first eye; and
a processor coupled to the sensor for processing the output signals to monitor movement of the first eye relative to a reference frame, the processor communicating with an electronic device remote from the frame and interpreting the output signals to control the electronic device.
23. A method for monitoring movement of a person's eye using a detection device including an array of emitters that are directed towards an eye of the person when the detection device is worn, and a camera oriented towards the eye away from the person when the detection device is worn, the method comprising:
emitting light from the array of one or more emitters towards the eye to project a reference frame onto the eye;
monitoring movement of the eye relative to the reference frame the person's surroundings with the camera; and
generating a graphical output of the movement monitored by the camera relative to the reference frame;
wherein the detection device further comprises one or more sensors, and wherein the method further comprises detecting light from the array of one or more emitters reflected off of the eye with the one or more sensors, the one or more sensors producing a light intensity signal indicating when the eye is open or closed.
0. 39. A method for monitoring movement of a person's eye using a detection device including one or more emitters and an array of sensors that are directed towards a first eye of the person when the detection device is worn, the method comprising:
placing the detection device on a person's head such that the detection device does not interfere substantially with the person's vision along an axis extending directly ahead of the first eye and the array of sensors are offset from the axis and oriented directly towards a region surrounding the first eye;
emitting light from one or more emitters towards the first eye;
detecting light from the one or more emitters reflected off of the first eye with the array of sensors, the array of sensors producing light intensity signals indicating when the first eye is open or closed; and
interpreting the light intensity signals to control one or more devices.
0. 29. A system for monitoring movement of a person's eye, comprising:
a device configured to be worn on a person's head such that the device does not interfere substantially with the person's vision along an axis extending directly ahead of a first eye of the person;
one or more emitters on the device for directing light towards the first eye when the device is worn;
an array of sensors on the device directed towards the first eye when the device is worn, the sensors configured for converting images of the first eye into output signals; and
a processor coupled to the sensors for interpreting the output signals to control one or more devices,
wherein the one or more emitters and the sensors are provided on the device at locations offset from the axis to generally minimize interference with the person's vision and such that the array of sensors are oriented directly towards the region surrounding the first eye.
0. 48. A system for controlling a computing device, comprising:
a frame configured to be worn on a person's head such that the device does not interfere substantially with the person's vision;
a sensor on the frame comprising a lens directed towards the eye of the person when the frame is worn, the sensor generating output signals representing video images of the eye; and
a processor coupled to the sensor for processing the output signals to monitor movement of the eye relative to a reference frame, the processor communicating with an electronic device remote from the frame and interpreting the output signals to control the electronic device,
wherein the processor is configured for monitoring the output signals to detect video images indicating that the person wearing the device has blinked in a predetermined sequence, the processor configured for executing the command on the electronic device based upon the predetermined sequence.
0. 49. A system for monitoring movement of a person's eye, comprising:
an eyeglass frame comprising a nose bridge configured to be placed on the person's nose, and a pair of ear supports configured to be placed over the person's ears such that the frame, when worn on the person's head, does not interfere substantially with the person's vision along an axis extending directly ahead of a first eye of the person;
one or more emitters on the frame for directing light towards the first eye when the frame is worn, the one or more emitters provided on the frame at one or more locations that generally minimize interference with the person's vision along the axis;
a sensor on or adjacent the nose bridge such that the sensor is offset from the axis to generally minimize interference with the person's vision along the axis and oriented directly towards the region surrounding the first eye, the sensor configured for converting images of the first eye into output signals; and
a processor coupled to the sensor for interpreting the output signals to control one or more devices.
0. 1. A system for monitoring movement of a person's eye, comprising:
a device configured to be worn on a person's head;
an array of emitters on the device for directing light towards an eye of the person when the device is worn, the array of emitters configured for projecting a reference frame towards the eye; and
a camera oriented towards the eye for monitoring movement of the eye relative to the reference frame; and
one or more sensors on the device for detecting light from the array of emitters that is reflected off of the eye or its eyelid, the one or more sensors producing an output signal indicating when the eye is open or closed.
0. 2. The system of claim 1, wherein the one or more sensors comprise an array of sensors in a predetermined relationship with the array of emitters for detecting light from the array of emitters that is reflected off of respective portions of the eye or its eyelid, each sensor producing an output signal indicating when the respective portion of the eye is covered or not covered by the eyelid.
0. 3. The system of claim 1, wherein the array of emitters and the one or more sensors are disposed separately and substantially laterally from one another.
0. 4. The system of claim 1, wherein the array of emitters and the one or more sensors comprise solid state devices capable of operating both as an emitter and as a sensor.
0. 5. The system of claim 1, wherein the camera is configured for producing a video signal, and wherein the system further comprises a processor for correlating the output signal from the one or more sensors with the video signal from the camera for determining the person's level of drowsiness.
0. 6. The system of claim 5, further comprising a warning indicator on the device, the warning indicator being activated when the processor determines a predetermined level of drowsiness has occurred.
0. 7. The system of claim 1, wherein the array of emitters comprises a plurality of emitters disposed in a substantially vertical arrangement on the device.
0. 8. The system of claim 7, wherein the array of emitters further comprises a plurality of emitters disposed in a substantially horizontal arrangement on the device.
0. 9. The system of claim 1, wherein the array of emitters is configured for projecting a set of crossed bands towards the eye for dividing a region including the eye into four quadrants.
0. 10. The system of claim 1, further comprising a transmitter on the device for wireless transmission of video output signals from the camera to a remote location.
0. 11. The system of claim 1, wherein the array of emitters comprise infrared emitters configured to emit pulses of infrared light.
0. 12. The system of claim 11, wherein the camera comprises an infrared camera.
0. 13. The system of claim 1, wherein the camera is mounted on the device.
0. 14. The system of claim 13, wherein the camera comprises a fiberoptic assembly.
0. 15. The system of claim 13, wherein the camera comprises at least one of a CCD and CMOS detector.
0. 16. The system of claim 1, further comprising a sensor on the device for detecting one or more physiological characteristics of the person.
0. 17. The system of claim 16, wherein the sensor comprises at least one of an EEG electrode, an EKG electrode, an oximetry sensor, a pulse sensor, an airflow sensor, and a temperature sensor.
0. 18. The system of claim 1, further comprising at least one of an orientation sensor for detecting the spatial orientation of the device and an actigraphic sensor.
0. 19. The system of claim 1, wherein the device comprises at least one of an eyeglass frame, a hat, a helmet, a visor, and a mask.
0. 20. A system for monitoring movement of a person's eye, comprising:
a frame configured to be worn on a person's head;
an array of emitters on the frame for directing light towards an eye of the person when the frame is worn, the array of emitters configured to project a reference frame towards the eye;
an array of sensors on the frame in a predetermined relationship with the array of emitters for detecting light from the array of emitters that is reflected off of respective portions of the eye or its eyelid, each sensor producing an output signal indicating when the respective portion of the eye is covered or not covered by the eyelid;
a camera on the frame for monitoring movement of the eye relative to the reference frame, the camera configured for producing a video signal of a region of the eye and the reference frame; and
a transmitter coupled to the sensor for wireless transmission of the output signal and the video signal to a remote location.
0. 21. The system of claim 20, further comprising a processor for correlating the output signal and the video signal to determine the person's level of drowsiness.
0. 22. The system of claim 21, further comprising a display for providing a graphical output of the output signal simultaneous with the video signal.
24. The method of claim 23, wherein the array of sensors is disposed in a predetermined relationship with the array of one or more emitters for detecting light from the array of one or more emitters that is reflected off of respective portions of the eye or its eyelid, each sensor producing an output signal indicating when the respective portion of the eye is covered or not covered by the eyelid.
25. The method of claim 24, further comprising correlating the output signal from the one or more sensors with video signals produced by the camera monitoring movement of the eye relative to the reference frame, thereby determining the person's level of alertness.
0. 26. The method of claim 23, wherein the monitoring step comprises measuring movement of the eye's pupil relative to the reference frame.
0. 27. The method of claim 26, further comprising graphically displaying the movement of the eye's pupil relative to the reference frame.
28. The method of claim 25, further comprising providing a warning to the person when the determined level of alertness falls below a predetermined level.
0. 30. The system of claim 29, further comprising a control system communicating with the processor, the processor configured for directing the control system to control one or more devices based upon the output signals.
0. 31. The system of claim 29, further comprising a computer communicating with the processor, the processor configured for controlling the computer based upon the output signals.
0. 32. The system of claim 31, wherein the processor is configured to interpret the output signals to operate the computer as an eye-activated mouse.
0. 33. The system of claim 29, wherein the one or more emitters comprise a plurality of infrared emitters configured to emit pulses at a predetermined frequency.
0. 34. The system of claim 29, wherein the device comprises a frame comprising a bridge piece extending between a pair of ear supports, the bridge piece including a nose bridge configured to be placed on the person's nose, and wherein the array of sensors are provided on or adjacent the nose bridge lateral from the first eye when the frame is worn to generally minimize interference with the person's vision along the axis.
0. 35. The system of claim 34, wherein the one or more emitters comprise an array of emitters provided on the frame.
0. 36. The system of claim 29, wherein the device comprises a nose bridge configured to be placed on the person's nose, and wherein the array of sensors are provided on or adjacent the nose bridge lateral from the first eye when the device is worn to generally minimize interference with the person's vision along the axis.
0. 37. The system of claim 36, wherein the device comprises a mask.
0. 38. The system of claim 36, wherein the device comprises an eyeglass frame.
0. 40. The method of claim 39, wherein the array of sensors comprises a two-dimensional array of sensors.
0. 41. The method of claim 39, wherein the light intensity signals are used to control a computer.
0. 43. The system of claim 42, wherein the sensor comprises a detector coupled to the lens by a fiberoptic cable.
0. 44. The system of claim 42, wherein the electronic device comprises a computer, and wherein the processor is configured to interpret the output signals to operate the computer as an eye-activated mouse.
0. 45. The system of claim 42, wherein the frame comprises a bridge piece extending between a pair of ear supports, the bridge piece including a nose bridge configured to be placed on the person's nose, the sensor provided on or adjacent the nose bridge at a location lateral from the first eye when the frame is worn to generally minimize interference with the person's vision along the axis.
0. 46. The system of claim 42, further comprising a plurality of infrared emitters on the frame for directing light towards the first eye of the person when the frame is worn, the emitters configured to emit pulses at a predetermined frequency.
0. 47. The system of claim 46, wherein the emitters are configured for projecting the reference frame onto the first eye of the person wearing the frame.
0. 50. The system of claim 49, wherein the sensor comprises a lens oriented directly towards the region surrounding the first eye when the frame is worn, and a detector coupled to the lens for converting the images of the first eye into digital video signals comprising the output signals.
0. 51. The system of claim 49, wherein the one or more devices comprise a computer.
0. 52. The system of claim 49, wherein the one or more devices comprise a wheelchair.

The user 10 may blink to create a transmitted stream of data 553 that includes commands to turn off and on, or otherwise control, selected appliances using the control unit 550 and control modules 552a-552f, such as a radio 554, a television 556, a light 558a, a light 562 controlled by a wall switch 560, a fan 566 plugged into a wall socket 564, and the like.
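
By way of illustration only (this is not part of the patent's disclosure), the sketch below shows one way such a blink-coded data stream might be decoded into appliance commands; the timing threshold, pattern length, and command table are all assumptions introduced here.

```python
# Illustrative sketch only: decoding a stream of blink durations into appliance
# commands. The threshold, pattern length, and command table are assumptions,
# not values taken from the patent.

LONG_BLINK_S = 0.5          # blinks at least this long count as "long"
COMMAND_TABLE = {           # hypothetical blink patterns -> appliance commands
    ("short", "short"): "radio_toggle",
    ("short", "long"): "television_toggle",
    ("long", "long"): "light_toggle",
}

def classify(duration_s):
    return "long" if duration_s >= LONG_BLINK_S else "short"

def decode_blinks(blink_durations_s, pattern_len=2):
    """Group consecutive blink durations into fixed-length patterns and look
    each pattern up in the command table."""
    commands = []
    for i in range(0, len(blink_durations_s) - pattern_len + 1, pattern_len):
        pattern = tuple(classify(d) for d in blink_durations_s[i:i + pattern_len])
        if pattern in COMMAND_TABLE:
            commands.append(COMMAND_TABLE[pattern])
    return commands

# Two short blinks, then a short blink followed by a long blink:
print(decode_blinks([0.2, 0.25, 0.2, 0.7]))   # ['radio_toggle', 'television_toggle']
```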

Alternatively, as shown in FIG. 11B, the receiver 554 may be coupled to other systems, such as a computer 570 and printer 572, a vehicle integration system 574, a lifeline unit 576, a GPS or other satellite transmitter 578, and the like. The transmitted stream of data 553 may be processed alone or along with additional data, such as other vehicle sensor information 573, to further enhance monitoring a user, such as a long-distance truck driver.

Turning to FIG. 13, yet another embodiment of a system 810 for monitoring eye movement is shown. Generally, the system 810 includes a frame 812 that may include a bridge piece 814 and a pair of ear supports 816. The frame 812 may include a pair of lenses (not shown), such as prescription, shaded, or protective lenses, although they are not necessary for operation of the invention. Alternatively, the system may be provided on other devices that may be worn on a user's head, such as a pilot's oxygen mask, protective eye gear, a patient's ventilator, a scuba or swimming mask, a helmet, a hat, a head band, a head visor, and the like (not shown). The components of the system may be provided at a variety of locations on the device that generally minimize interference with the user's vision and/or normal use of the device.

An array of emitters 820 are provided on the frame 812, preferably in a vertical array 820a and a horizontal array 820b. In a preferred embodiment, the emitters 820 are infrared emitters configured to emit pulses at a predetermined frequency, similar to the embodiments described above. The emitters 820 are arranged on the frame such that they project a reference frame 850 onto the region of the user's eye 300. In a preferred embodiment, the reference frame includes a pair of crossed bands 850a, 850b dividing the region into four quadrants. The intersection of the crossed bands is preferably disposed at a location corresponding substantially to the eye's pupil during primary gaze, i.e., when the user is looking generally straight forward along axis 310 extending directly ahead of the user's eye 300. Alternatively, other reference frames may be provided, generally including a vertical component and a horizontal component.

An array of sensors 822 are also provided on the frame 812 for detecting light from the emitters 820 that is reflected off of the user's eyelid. The sensors 822 preferably generate output signals having an intensity identifying whether the eyelid is closed or open, similar to the embodiments described above. Preferably, the sensors 822 are disposed adjacent to respective emitters 820 for detecting light reflected off of respective portions of the eyelid. Alternatively, sensors 822 may only be provided in a vertical array, e.g., along the bridge piece 814, for monitoring the amount of eyelid closure, similar to the embodiments described above. In a further alternative, the emitters 820 and sensors 822 may be solid state biosensors (not shown) that provide both the emitting and sensing functions in a single device.

Circuitry may be provided for measuring PERCLOS or other parameters using the signals generated by the array of sensors. For example, FIG. 17 shows an exemplary schematic that may be used for processing signals from a five element array, e.g., to obtain PERCLOS measurements or other alertness parameters.
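
For illustration, here is a minimal software analogue (not the circuit of FIG. 17) of a PERCLOS calculation over the thresholded outputs of a five-element sensor array; the sampling rate, reflectance threshold, and closure rule are assumed values.

```python
import numpy as np

# Minimal sketch: estimating PERCLOS from a five-element sensor array.
# Sampling rate, threshold, and the "eye closed" rule are assumptions.

SAMPLE_RATE_HZ = 100
CLOSED_THRESHOLD = 0.6      # normalized intensity above which a sensor "sees" eyelid
SENSORS_FOR_CLOSED = 4      # eye treated as closed when this many sensors are covered

def perclos(sensor_signals, window_s=60.0):
    """sensor_signals: array of shape (n_samples, 5) of normalized intensities.
    Returns the fraction of the most recent window during which the eye was closed."""
    window = int(window_s * SAMPLE_RATE_HZ)
    recent = np.asarray(sensor_signals)[-window:]
    covered = (recent > CLOSED_THRESHOLD).sum(axis=1)     # covered sensors per sample
    closed = covered >= SENSORS_FOR_CLOSED
    return float(closed.mean())

# Synthetic example: eye closed for the first 10% of a 60 s window.
rng = np.random.default_rng(0)
signals = rng.uniform(0.0, 0.5, size=(6000, 5))
signals[:600] = 0.9
print(f"PERCLOS ~ {perclos(signals):.2f}")                # ~0.10
```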

Returning to FIG. 13, the system 810 also includes a camera 830 provided on the frame 812. Preferably, the camera 830 is mounted on or adjacent the bridge piece 814, offset from the axis 310, such that the camera 830 is oriented towards the region surrounding one of the user's eyes 300 while minimizing interference with the user's vision. The camera 830 preferably includes a bundle of fiberoptic cables 832 that terminate in a lens 834 on a first end mounted adjacent the bridge piece 814, as shown in FIG. 14, and in a second end 837 that is connected to a detector 838, e.g., a CCD or CMOS sensor, such as those used in endoscopes, that may convert an image into a digital video signal. The camera 830 is configured to detect the frequency of light emitted by the emitters 820, e.g., infrared light. The camera 830 may rely on the light projected by the emitters 820, or the fiberoptic cables 832 may include emitters 836 for projecting light, e.g., infrared light, onto the user's eyes and/or face. In addition, the system 810 may include a second camera 840 oriented away from the user's head, e.g., to monitor the user's surroundings.

One of the ear supports 816 may include a panel 818 for mounting a controller or other processor 842, a transmitter 844, an antenna 845, and a battery 846. Preferably, the processor 842 is coupled to the emitters 820, the sensors 822, and/or the camera 830 for controlling their operation. The transmitter 844 may be coupled to the processor 842 for receiving the output signals from the sensors 822 and/or the video signals from the camera 830, e.g., to transmit the signals to a remote location, as described below. Alternatively, the transmitter 844 may be coupled directly to output leads from the sensors 822 and the camera 830. The frame 812 may also include manual controls (not shown), e.g., on the ear support 816, for example, to turn the power off and on, or to adjust the intensity and/or threshold of the emitters 820, the sensors 822, and/or the camera 830.

If desired, the system 810 may also include one or more additional sensors on the frame 812. The sensors may be coupled to the processor 842 and/or to the transmitter 844 so that the signals from the sensors may be monitored, recorded, and/or transmitted to a remote location. For example, one or more position sensors 852a, 852b may be provided, e.g., for determining the spatial orientation of the frame 812, and consequently the user's head. For example, actigraphic sensors may be provided to measure tilt or movement of the head, e.g., to monitor whether the user's head is drooping forward or tilting to the side. Acoustic sensors, e.g., a microphone 854 may be provided for detecting environmental noise or sounds produced by the user.

In addition or alternatively, the frame 812 may include one or more sensors for measuring one or more physical characteristics of the user. For example, EEG electrodes 856 may be provided on the ear support 816, above or below the nasion, and/or at another region that may contact the patient's skin to measure brain activity, e.g., waking, drowsy, or other sleep-related brain activity. An EKG electrode (not shown) may be provided that is capable of measuring cardiac activity through a skin contact site. A pulse sensor (not shown) may be used to measure cardiovascular pulsations, or an oximetry sensor 858 may be used to measure oxygen saturation levels. A thermistor or other sensor may measure respiratory air flow, e.g., through the user's nose. A thermistor, thermocouple, or other temperature sensor (not shown) may be provided for measuring the user's skin temperature. A sweat detector (not shown) may be provided for measuring moisture on the user's skin.

In addition, the system 810 may include one or more feedback devices on the frame 812. These devices may provide feedback to the user, e.g., to alert and/or wake the user, when a predetermined condition is detected, e.g., a state of drowsiness or lack of consciousness. The feedback devices may be coupled to the processor 842, which may control their activation. For example, a mechanical vibrator device 860 may be provided at a location that may contact the user, e.g., on the ear support 816, that may provide tactile vibrating stimuli through skin contact. An electrode (not shown) may be provided that may produce relatively low power electrical stimuli. A light emitter, such as one or more LEDs, may be provided at desired locations, e.g., above the bridge piece 814. Alternatively, audio devices 862, such as a buzzer or other alarm, may be provided, similar to the previous embodiments. In a further alternative, aroma-emitters may be provided on the frame 812, e.g., on or adjacent to the bridge piece 814.

Alternatively, the feedback devices may be provided separate from the frame, but located in a manner capable of providing a feedback response to the user. For example, audio, visual, tactile (e.g., vibrating seat), or olfactory emitters may be provided in the proximity of the user, such as any of the devices described above. In a further alternative, heat or cold generating devices may be provided that are capable of producing thermal stimuli to the user, e.g., a remotely controlled fan or air conditioning unit.

The system 810 may also include components that are remote from the frame 812, similar to the embodiments described above. For example, the system 810 may include a receiver, a processor, and/or a display (not shown) at a remote location from the frame 812, e.g., in the same room, at a nearby monitoring station, or at a more distant location. The receiver may receive signals transmitted by the transmitter 844, including output signals from the sensors 822 or any of the other sensors provided on the frame 812 and/or the video signals from the camera 830.

A processor may be coupled to the receiver for analyzing signals from the components on the frame 812, e.g., to prepare the signals for graphical display. For example, the processor may prepare the video signals from the camera 830 for display on a monitor, thereby allowing personal monitoring of the user. Simultaneously, other parameters may be displayed, either on a single monitor or on separate displays. For example, FIGS. 15A-15I show signals indicating the output of various sensors that may be on the frame 812, which may be displayed along a common time axis or otherwise correlated, e.g., to movement of the user's eye and/or level of drowsiness. The processor may superimpose or otherwise simultaneously display the video signal in conjunction with the other sensed parameters to allow a physician or other individual to monitor and personally correlate these parameters to the user's behavior.

In a further alternative, the processor may automatically process the signals to monitor or study the user's behavior. For example, the processor may use the output signals to monitor various parameters related to eye movement, such as eye blink duration (EBD), eye blink frequency, eye blink velocity, eye blink acceleration, interblink duration (IBD), PERCLOS, PEROP (percentage eyelid is open), and the like.
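
As a rough sketch of how such parameters could be derived in software, the example below computes eye blink duration, interblink duration, blink frequency, and PERCLOS from a binary open/closed eyelid signal; the sampling rate and the example signal are assumptions, not the patent's implementation.

```python
import numpy as np

# Illustrative sketch: deriving EBD, IBD, EBF, and PERCLOS from a binary eyelid
# signal (True while the eyelid covers the eye). Sampling rate is assumed.

def blink_metrics(closed, sample_rate_hz=100):
    closed = np.asarray(closed, dtype=bool)
    durations, onsets = [], []          # blink durations and closure onset times (s)
    i, n = 0, len(closed)
    while i < n:
        if closed[i]:
            start = i
            while i < n and closed[i]:
                i += 1
            if i < n:                   # count only closures that end within the record
                durations.append((i - start) / sample_rate_hz)
                onsets.append(start / sample_rate_hz)
        else:
            i += 1
    total_s = n / sample_rate_hz
    return {
        "EBD_mean_s": sum(durations) / len(durations) if durations else 0.0,
        "IBD_mean_s": (sum(b - a for a, b in zip(onsets, onsets[1:])) / (len(onsets) - 1)
                       if len(onsets) > 1 else 0.0),
        "EBF_per_min": 60.0 * len(durations) / total_s,
        "PERCLOS": float(closed.mean()),
    }

# Two blinks (0.2 s and 0.3 s) over a 4 s record sampled at 100 Hz:
signal = np.array([0] * 50 + [1] * 20 + [0] * 200 + [1] * 30 + [0] * 100, dtype=bool)
print(blink_metrics(signal))
```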

The video signals from the camera 830 may be processed to monitor various eye parameters, such as pupillary size, location, e.g., within the four quadrants defined by the crossed bands 850a, 850b, eye tracking movement, eye gaze distance, and the like. For example, because the camera 830 is capable of detecting the light emitted by the emitters 820, the camera 830 may detect a reference frame projected onto the region of the user's eye by the emitters. FIG. 16 shows an exemplary video output from a camera included in a system having twenty emitters disposed in a vertical arrangement. The camera may detect twenty discrete regions of light arranged as a vertical band. The camera may also detect a “glint” point, G, and/or a moving bright pupil, P. Thus, the movement of the pupil may be monitored in relation to the glint point, G, and/or in relation to the vertical band 1-20.

Because the emitters 820 are fixed to the frame 812, the reference frame 850 remains substantially stationary. Thus, the processor may determine the location of the pupil in terms of orthogonal coordinates (e.g., x-y or angle-radius) relative to the reference frame 850. Alternatively, if the reference frame is eliminated, the location of the pupil may be determined relative to any stationary “glint” point on the user's eye. For example, the camera 830 itself may project a point of light onto the eye that may be reflected and detected by the camera. This “glint” point remains substantially stationary since the camera 830 is fixed to the frame 812.
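
Purely as an illustration of the geometry described above (and not the patent's own image-processing method), the sketch below expresses a bright-pupil centroid in coordinates relative to a stationary glint or reference point and assigns it to one of the four quadrants of the projected reference frame; the thresholding and centroid detection are simplified assumptions.

```python
import numpy as np

# Illustrative sketch: pupil location relative to a stationary reference point
# (a "glint" or the intersection of the crossed bands) in one video frame.
# The bright-pupil threshold and centroid detection are assumed, simplified steps.

def bright_centroid(frame, threshold):
    """Return the (x, y) centroid of pixels brighter than threshold, or None."""
    ys, xs = np.nonzero(frame > threshold)
    if xs.size == 0:
        return None
    return float(xs.mean()), float(ys.mean())

def pupil_relative_position(frame, reference_xy, pupil_threshold=200):
    """Pupil centroid expressed relative to the reference point, plus the
    quadrant of the crossed-band reference frame in which it falls."""
    pupil = bright_centroid(frame, pupil_threshold)
    if pupil is None:
        return None
    dx = pupil[0] - reference_xy[0]
    dy = pupil[1] - reference_xy[1]          # image y grows downward
    quadrant = {(True, True): "upper-right", (False, True): "upper-left",
                (True, False): "lower-right", (False, False): "lower-left"}[(dx >= 0, dy <= 0)]
    return {"dx": dx, "dy": dy, "quadrant": quadrant}

# Synthetic frame with a bright patch up and to the right of the reference point:
frame = np.zeros((480, 640), dtype=np.uint8)
frame[200:210, 400:410] = 255
print(pupil_relative_position(frame, reference_xy=(320, 240)))
```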

In addition, the video signals from a remote camera that may view the user's face from a distance may be used to monitor various facial measures, such as facial expression, yawning frequency, and the like, in addition to or instead of the projected light reference frame from the emitters. In addition or alternatively, the parameters from other sensors may be processed and correlated, such as head orientation, tilt, body movement, physiological parameters, and the like. Preferably, the processor may correlate these parameters to generate a composite fatigue index (CFI) that is a function of two or more of these parameters. When a predetermined CFI is detected, the system 810 may activate an alarm or other notice to the user and/or to another party at a remote location. Thus, the system 810 may provide a more effective way to monitor the user's fatigue, drowsiness, alertness, mental state, and the like. In a further alternative, the system 810 may be used to generate predetermined outputs, e.g., to activate or deactivate equipment, such as a vehicle being operated by the user, when a predetermined condition, e.g., CFI value, is determined by the system 810.
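
The patent describes the CFI only as a function of two or more of the monitored parameters; the following is a hypothetical sketch of one such combination, with parameter names, weights, normalization, and the alarm threshold chosen purely for illustration.

```python
# Hypothetical sketch of a composite fatigue index (CFI). The parameter names,
# weights, and alarm threshold are assumptions; the patent only requires that
# the CFI be a function of two or more monitored parameters.

CFI_WEIGHTS = {
    "perclos": 0.4,       # fraction of time the eye is closed, 0..1
    "blink_rate": 0.2,    # blink frequency, normalized to 0..1
    "head_droop": 0.2,    # head nod/tilt measure, normalized to 0..1
    "gaze_wander": 0.2,   # eye-tracking instability, normalized to 0..1
}
CFI_ALARM_THRESHOLD = 0.6

def composite_fatigue_index(params):
    """Weighted sum of normalized parameters; missing parameters count as zero."""
    return sum(w * params.get(name, 0.0) for name, w in CFI_WEIGHTS.items())

def check_alarm(params):
    cfi = composite_fatigue_index(params)
    return cfi, cfi >= CFI_ALARM_THRESHOLD

print(check_alarm({"perclos": 0.7, "blink_rate": 0.9, "head_droop": 0.8, "gaze_wander": 0.6}))
# -> approximately (0.74, True): would trigger an alarm or other predetermined output
```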

Alternatively, the processor may be provided on the frame 812, e.g., as part of processor 842, for monitoring the parameters for the occurrence of a predetermined event, such as a predetermined CFI value. Although only a single lens and set of emitters, sensors, and cameras are shown, it will be appreciated that another set may be provided for the other eye of the user of the system 810. In a further alternative, the eye tracking parameters described above may be monitored by a remote camera, e.g., in a fixed position in front of the user, such as on the dashboard of a vehicle and the like. The remote camera may be coupled to the processor, either directly or via its own transmitter, as will be appreciated by those skilled in the art.

Thus, a system in accordance with the present invention may monitor or detect one or more parameters, such as those listed below in Table 1.

TABLE 1
Potential Biometric Measures

EYELID MEASURES
Percentage of time (t) and the amount the palpebral fissure is opened (PEROP-t, -d, -dt) or closed (PERCLOS-t, -d, -dt), lid droop
Eye Blink Duration (EBD)
Eye Blink Frequency (EBF)
Eye Blink Velocity (EBV)
Eye Blink Acceleration (EBAc) and Deceleration (EBDc)
Interblink duration (IBD)
Eye blink flurries

PUPIL MEASURES
Pupillary Appearance or Disappearance (with eyelid movement)
Pupillary Size Measurement (PSM)
Presence and quality of Pupillary Dilation or Constriction (including Hippus)

EYEGAZE MEASURES
Eye Tracking Movements (ETM), including Directional Nystagmus
Eye Gaze Distance (EGD) and Direction
Eye Movement Distance
Eye Movement Velocity (EMV)
Eye Movement Acceleration (EMA) and Deceleration (EMD)
Eye Movement Frequency (EMF)
Phoria/eye Drift Measures (PDM)

HEAD ORIENTATION MEASURES
Head Direction or Orientation (HDir)

HEAD MOVEMENT MEASURES
Head Nodding Frequency (HNF)
Head Tilt (HT)

OTHER NON-VIDEO SENSOR METRICS
EEG, EKG, pulse, oxygen saturation, respiration rate, body temp, skin conductance, actigraphic movements, head tilt sensors

While the invention is susceptible to various modifications and alternative forms, specific examples thereof have been shown in the drawings and are herein described in detail. It should be understood, however, that the invention is not to be limited to the particular forms or methods disclosed, but to the contrary, the invention is to cover all modifications, equivalents and alternatives falling within the spirit and scope of the appended claims.

Torch, William C.

Patent Priority Assignee Title
10206591, Oct 14 2011 Flint Hills Scientific, LLC Seizure detection methods, apparatus, and systems using an autoregression algorithm
10220211, Jan 22 2013 LivaNova USA, Inc Methods and systems to diagnose depression
10258291, Nov 10 2012 The Regents of the University of California; The Salk Institute for Biological Studies Systems and methods for evaluation of neuropathologies
10448839, Apr 23 2012 LivaNova USA, Inc Methods, systems and apparatuses for detecting increased risk of sudden death
11086473, Jul 28 2016 Tata Consultancy Services Limited System and method for aiding communication
11103707, Jan 22 2013 LivaNova USA, Inc. Methods and systems to diagnose depression
11559243, Feb 06 2016 System and method for evaluating neurological conditions
11596314, Apr 23 2012 LivaNova USA, Inc. Methods, systems and apparatuses for detecting increased risk of sudden death
11615688, Dec 22 2017 ResMed Sensor Technologies Limited Apparatus, system, and method for motion sensing
11707197, Dec 22 2017 ResMed Sensor Technologies Limited Apparatus, system, and method for physiological sensing in vehicles
11813081, Jun 15 2020 BEIJING XIAOMI MOBILE SOFTWARE CO., LTD.; Lumos Health Inc. Intelligent glasses and glasses box
11867915, Mar 31 2021 Microsoft Technology Licensing, LLC Head mounted display with obscured light emitting diodes
12144992, Jan 22 2013 LivaNova USA, Inc. Methods and systems to diagnose depression
8337404, Oct 01 2010 Flint Hills Scientific, LLC Detecting, quantifying, and/or classifying seizures using multimodal data
8382667, Oct 01 2010 Flint Hills Scientific, LLC Detecting, quantifying, and/or classifying seizures using multimodal data
8452387, Sep 16 2010 FLINT HILLS SCIENTIFIC, L L C Detecting or validating a detection of a state change from a template of heart rate derivative shape or heart beat wave complex
8562536, Apr 29 2010 LivaNova USA, Inc Algorithm for detecting a seizure from cardiac data
8571643, Sep 16 2010 Flint Hills Scientific, LLC Detecting or validating a detection of a state change from a template of heart rate derivative shape or heart beat wave complex
8641646, Jul 30 2010 LivaNova USA, Inc Seizure detection using coordinate data
8649871, Apr 29 2010 Cyberonics, Inc Validity test adaptive constraint modification for cardiac data used for detection of state changes
8684921, Oct 01 2010 Flint Hills Scientific LLC; FLINT HILLS SCIENTIFIC, L L C Detecting, assessing and managing epilepsy using a multi-variate, metric-based classification analysis
8725239, Apr 25 2011 LivaNova USA, Inc Identifying seizures using heart rate decrease
8831732, Apr 29 2010 Cyberonics, Inc Method, apparatus and system for validating and quantifying cardiac beat data quality
8852100, Oct 01 2010 Flint Hills Scientific, LLC Detecting, quantifying, and/or classifying seizures using multimodal data
8888702, Oct 01 2010 Flint Hills Scientific, LLC Detecting, quantifying, and/or classifying seizures using multimodal data
8945006, Oct 01 2010 Flint Hills Scientific, LLC Detecting, assessing and managing epilepsy using a multi-variate, metric-based classification analysis
8948855, Sep 16 2010 Flint Hills Scientific, LLC Detecting and validating a detection of a state change from a template of heart rate derivative shape or heart beat wave complex
9020582, Sep 16 2010 Flint Hills Scientific, LLC Detecting or validating a detection of a state change from a template of heart rate derivative shape or heart beat wave complex
9177202, Jul 11 2011 Toyota Jidosha Kabushiki Kaisha Red-eye detection device
9220910, Jul 30 2010 LivaNova USA, Inc Seizure detection using coordinate data
9241647, Apr 29 2010 LivaNova USA, Inc Algorithm for detecting a seizure from cardiac data
9265458, Dec 04 2012 NEUROSYNC, INC Application of smooth pursuit cognitive testing paradigms to clinical drug development
9380976, Mar 11 2013 NEUROSYNC, INC Optical neuroinformatics
9402550, Apr 29 2011 LivaNova USA, Inc Dynamic heart rate threshold for neurological event detection
9418617, Mar 13 2013 GOOGLE LLC Methods and systems for receiving input controls
9489817, Jan 29 2015 Vigo Technologies, Inc. Infrared sensing of eye and eyelid movements to detect drowsiness
9504390, Mar 04 2011 GLOBALFOUNDRIES Inc. Detecting, assessing and managing a risk of death in epilepsy
9681836, Jul 25 2012 LivaNova USA, Inc Methods, systems and apparatuses for detecting seizure and non-seizure states
9700256, Apr 29 2010 LivaNova USA, Inc Algorithm for detecting a seizure from cardiac data
9883814, May 05 2016 System and method for evaluating neurological conditions
Patent Priority Assignee Title
3689135,
3798599,
3863243,
3966310, Feb 15 1974 Pupillometer and method of use thereof
4102564, Apr 18 1975 Portable device for the accurate measurement of eye movements both in light and obscurity
4359724, Apr 28 1980 Ronald R., Zimmerman Eyelid movement detector
4815839, Aug 03 1987 Infrared/video electronystagmographic apparatus
4850691, Mar 18 1987 University of Illinois Method and apparatus for determining pupillary response parameters
4852988, Sep 12 1988 Applied Science Laboratories; APPLIED SCIENCE LABORATORIES, 335 BEAR HILL ROAD WALTHAM, MASSACHUSETTS 02154 Visor and camera providing a parallax-free field-of-view image for a head-mounted eye movement measurement system
4894777, Jul 28 1986 CANON KABUSHIKI KAISHA, A CORP OF JAPAN Operator mental condition detector
4953111, Feb 12 1987 OMRON TATEISI ELECTRONICS CO Doze detector
4967186, Aug 18 1989 Method and apparatus for fatigue detection
4988183, Jun 13 1988 Konan Camera Research Institute, Inc. Eye movement inspection device
5070883, Dec 16 1988 Konan Camera Research Institute Inc. Eye movement analyzing device utilizing pupil center-of-gravity data
5093567, Jul 14 1989 GEC-MARCONI LIMITED, A BRITISH COMPANY Helmet systems with eyepiece and eye position sensing means
5183512, Aug 29 1991 Aquotech, Inc.; AQUOTECH, INC , A CORP OF DE Method of cleaning semiconductor wafer and computer disks by purified water treated with electric AC signal
5214456, Oct 09 1991 TOMEY CO , LTD Mapping of corneal topography with display of pupil perimeter
5341181, Nov 20 1992 Systems and methods for capturing and presenting visual information
5345281, Dec 17 1992 Eye tracking system and method
5402109, Apr 29 1993 Sleep prevention device for automobile drivers
5447166, Sep 26 1991 Neurocognitive adaptive computer interface method and system based on on-line measurement of the user's mental effort
5469143, Jan 10 1995 Sleep awakening device for drivers of motor vehicles
5478239, Dec 21 1993 NIKE, Inc Dynamic visual acuity training method and apparatus
5481622, Mar 01 1994 Rensselaer Polytechnic Institute Eye tracking apparatus and method employing grayscale threshold values
5566067, Mar 23 1995 The President and Fellows of Harvard College Eyelid vigilance detector system
5570698, Jun 02 1995 Siemens Medical Solutions USA, Inc System for monitoring eyes for detecting sleep behavior
5583795, Mar 17 1995 The United States of America as represented by the Secretary of the Army Apparatus for measuring eye gaze and fixation duration, and method therefor
5682144, Nov 20 1995 Eye actuated sleep prevention devices and other eye controlled devices
5689241, Apr 24 1995 Sleep detection and driver alert apparatus
5704369, Jul 25 1994 BETH ISRAEL HOSPITAL ASSOCIATION, INC Non-invasive method for diagnosing Alzheimer's disease in a patient
5726916, Jun 27 1996 The United States of America as represented by the Secretary of the Army Method and apparatus for determining ocular gaze point of regard and fixation duration
5748113, Aug 19 1996 GOOGLE LLC Method and apparatus for communication
5778893, Apr 01 1991 HUNTINGTON POTTER Method of diagnosing and monitoring a treatment for Alzheimer's disease
5795306, Mar 10 1994 Mitsubishi Denki Kabushiki Kaisha Bodily state detection apparatus
5861936, Jul 26 1996 SORENSEN RESEARCH AND DEVELOPMENT Regulating focus in accordance with relationship of features of a person's eyes
5867587, May 19 1997 MEDFLEX, LLC Impaired operator detection and warning system employing eyeblink analysis
5956125, Jun 19 1997 BIOPROBES, INC System and method for screening for dementia
6003991, Feb 17 1996 VIIRRE, ERIK SCOTT Eye examination apparatus and method for remote examination of a patient by a health professional
6087941, Sep 01 1998 Warning device for alerting a person falling asleep
6088470, Jan 27 1998 Sensar, Inc.; Sarnoff Corporation Method and apparatus for removal of bright or dark spots by the fusion of multiple images
6090051, Mar 03 1999 EYETRACKING LLC Method and apparatus for eye tracking and monitoring pupil dilation to evaluate cognitive activity
6091378, Jun 17 1998 EYE CONTROL TECHNOLOGIES, INC Video processing methods and apparatus for gaze point tracking
6091546, Oct 30 1997 GOOGLE LLC Eyeglass interface system
6097295, Jan 28 1998 OL SECURITY LIMITED LIABILITY COMPANY Apparatus for determining the alertness of a driver
6116736, Apr 23 1999 NEUROPTICS, INC Pupilometer with pupil irregularity detection capability
6163281, Nov 25 1997 GOOGLE LLC System and method for communication using eye movement
6246344, Aug 19 1996 GOOGLE LLC Method and apparatus for voluntary communication
6246779, Dec 12 1997 Kabushiki Kaisha Toshiba Gaze position detection apparatus and method
6247813, Apr 09 1999 IRITECH, INC Iris identification system and method of identifying a person through iris recognition
6252977, Dec 01 1997 Sensar, Inc. Method and apparatus for illuminating and imaging eyes through eyeglasses using multiple sources of illumination
6260968, Apr 23 1999 Neuroptics, Inc. Pupilometer with pupil irregularity detection capability
6334683, Oct 23 1997 General Electric Capital Corporation Eye illumination system and method
6346929, Apr 22 1994 Canon Kabushiki Kaisha Display apparatus which detects an observer body part motion in correspondence to a displayed element used to input operation instructions to start a process
6373961, Mar 26 1996 Eye Control Technologies, Inc. Eye controllable screen pointer
6388639, Dec 18 1996 Toyota Jidosha Kabushiki Kaisha Stereoscopic image display apparatus, method of displaying stereoscopic image, and recording medium
6611618, Nov 13 1997 SCHEPENS EYE RESEARCH INSTITUTE, INC Wide-band image enhancement
6775060, Jul 26 2001 Schepens Eye Research Institute Bioptic telescope system embedded into a spectacle lens
6820979, Apr 23 1999 NEUROPTICS, INC PUPILOMETER WITH PUPIL IRREGULARITY DETECTION, PUPIL TRACKING, AND PUPIL RESPONSE DETECTION CAPABILITY, GLAUCOMA SCREENING CAPABILITY, INTRACRANIAL PRESSURE DETECTION CAPABILITY, AND OCULAR ABERRATION MEASUREMENT CAPABILITY
6864473, Dec 07 2000 The United States of America as represented by the United States National Aeronautics and Space Administration Dynamic optical filtration
6867752, Aug 31 1998 SEMICONDUCTOR ENERGY LABORATORY CO , LTD Portable information processing system
6997556, Oct 01 2001 VIEWPOINT SICHERHEITSFORSCHUNG - BLICKFORSCHUNG GMBH Method for detecting, evaluating, and analyzing look sequences
7046215, Mar 01 1999 BAE SYSTEMS, plc Head tracker system
7071831, Nov 08 2001 SDIP HOLDINGS PTY LTD Alertness monitor
7120880, Feb 25 1999 Tobii AB Method and system for real-time determination of a subject's interest level to media content
7206435, Mar 26 2002 Honda Giken Kogyo Kabushiki Kaisha Real-time eye detection and tracking under various light conditions
7374284, Dec 17 2003 SCHEPENS EYE RESEARCH INSTITUTE, INC , THE Peripheral field expansion device
7391888, May 30 2003 Microsoft Technology Licensing, LLC Head pose assessment methods and systems
20010028309,
20020024633,
20040061680,
20050007552,
EP679984,
EP984347,
GB2284582,
JP2000137792,
JP2000201289,
JP2002309925,
WO2006092022,
WO9849028,
Executed on | Assignor | Assignee | Conveyance | Reel/Frame
Mar 28 2013 | TORCH, WILLIAM C., DR. | EYE-COM CORPORATION | ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS) | 030964/0128 (pdf)
Aug 06 2013 | EYE-COM CORPORATION | EYEFLUENCE, INC. | ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS) | 030964/0386 (pdf)
Date Maintenance Fee Events
Sep 29 2014 M2553: Payment of Maintenance Fee, 12th Yr, Small Entity.


Date Maintenance Schedule
Jun 21 2014: 4 years fee payment window open
Dec 21 2014: 6 months grace period start (w surcharge)
Jun 21 2015: patent expiry (for year 4)
Jun 21 2017: 2 years to revive unintentionally abandoned end (for year 4)
Jun 21 2018: 8 years fee payment window open
Dec 21 2018: 6 months grace period start (w surcharge)
Jun 21 2019: patent expiry (for year 8)
Jun 21 2021: 2 years to revive unintentionally abandoned end (for year 8)
Jun 21 2022: 12 years fee payment window open
Dec 21 2022: 6 months grace period start (w surcharge)
Jun 21 2023: patent expiry (for year 12)
Jun 21 2025: 2 years to revive unintentionally abandoned end (for year 12)