Bioacoustic sensors and wireless technologies are used to control electronic devices such as wireless telephones, pagers, music/media players, laptop computers and personal digital assistant (PDA) devices. Such control is implemented with systems, methods and apparatus that include bioacoustic sensors, a processor coupled to the bioacoustic sensors, and a transmitter coupled to the processor. The present invention is operative to detect particular hand and/or finger gestures and to transmit control signals corresponding to those gestures for operative control of an electronic device.

Patent: 9395819
Priority: Jul 06 2000
Filed: Oct 25 2006
Issued: Jul 19 2016
Expiry: Apr 26 2028
Extension: 2489 days
Status: EXPIRED
1. A method comprising:
receiving, by a wearable apparatus, a bioacoustic signal in a plurality of bioacoustic signals, each particular bioacoustic signal in the plurality of bioacoustic signals related to a particular hand gesture in a plurality of hand gestures and each particular hand gesture in the plurality of hand gestures associated with a particular command in a plurality of commands, each particular command in the plurality of commands associated with activating a particular function in a plurality of functions performed by an electronic device, wherein the bioacoustic signal is conducted, by way of at least one bone, to the wearable apparatus;
identifying, by the wearable apparatus, the particular hand gesture related to the bioacoustic signal based on a positive correlation between the bioacoustic signal and predetermined hand gesture data;
encrypting, by the wearable apparatus, the particular command associated with the particular hand gesture prior to wirelessly transmitting the particular command to the electronic device; and
wirelessly transmitting, by the wearable apparatus, the particular command associated with the particular hand gesture to the electronic device.
2. The method of claim 1, wherein the particular hand gesture reflects contact between a thumb and an index finger of a human hand.
3. The method of claim 1, wherein the particular hand gesture reflects contact between a thumb and a middle finger of a human hand.
4. The method of claim 1, wherein the particular hand gesture reflects contact between a thumb and a ring finger of a human hand.
5. The method of claim 1, wherein the particular hand gesture reflects contact between a finger and a fingernail.
6. The method of claim 1, further comprising emitting, via the wearable apparatus, audio feedback associated with the particular hand gesture.
7. The method of claim 1, wherein the at least one bone comprises at least one bone of a hand.
8. A wearable apparatus comprising:
a processor; and
a memory that stores instructions that, when executed by the processor, cause the processor to perform operations comprising
receiving a bioacoustic signal in a plurality of bioacoustic signals, each particular bioacoustic signal in the plurality of bioacoustic signals related to a particular hand gesture in a plurality of hand gestures and each particular hand gesture in the plurality of hand gestures associated with a particular command in a plurality of commands, each particular command in the plurality of commands associated with activating a particular function in a plurality of functions performed by an electronic device, wherein the bioacoustic signal is conducted, by way of at least one bone, to the wearable apparatus,
identifying the particular hand gesture related to the bioacoustic signal based on a positive correlation between the bioacoustic signal and predetermined hand gesture data,
encrypting the particular command associated with the particular hand gesture prior to wirelessly transmitting the particular command to the electronic device, and
wirelessly transmitting the particular command associated with the particular hand gesture to the electronic device.
9. The wearable apparatus of claim 8, wherein the particular hand gesture reflects contact between a thumb and an index finger of a human hand.
10. The wearable apparatus of claim 8, wherein the particular hand gesture reflects contact between a thumb and a middle finger of a human hand.
11. The wearable apparatus of claim 8, wherein the particular hand gesture reflects contact between a thumb and a ring finger of a human hand.
12. The wearable apparatus of claim 8, wherein the particular hand gesture reflects contact between a finger and a fingernail.
13. The wearable apparatus of claim 8, wherein the operations further comprise emitting audio feedback associated with the particular hand gesture.
14. The wearable apparatus of claim 8, wherein the at least one bone comprises at least one bone of a hand.
15. A non-transitory storage component that stores instructions that, when executed by a processor of a wearable apparatus, cause the processor to perform operations comprising:
receiving a bioacoustic signal in a plurality of bioacoustic signals, each particular bioacoustic signal in the plurality of bioacoustic signals related to a particular hand gesture in a plurality of hand gestures and each particular hand gesture in the plurality of hand gestures associated with a particular command in a plurality of commands, each particular command in the plurality of commands associated with activating a particular function in a plurality of functions performed by an electronic device, wherein the bioacoustic signal is conducted, by way of at least one bone, to the wearable apparatus;
identifying the particular hand gesture related to the bioacoustic signal based on a positive correlation between the bioacoustic signal and predetermined hand gesture data;
encrypting the particular command associated with the particular hand gesture prior to wirelessly transmitting the particular command to the electronic device; and
wirelessly transmitting the particular command associated with the particular hand gesture to the electronic device.
16. The non-transitory storage component of claim 15, wherein the particular hand gesture reflects contact between a thumb and an index finger of a human hand.
17. The non-transitory storage component of claim 15, wherein the particular hand gesture reflects contact between a thumb and a middle finger of a human hand.
18. The non-transitory storage component of claim 15, wherein the particular hand gesture reflects contact between a thumb and a ring finger of a human hand.
19. The non-transitory storage component of claim 15, wherein the particular hand gesture reflects contact between a finger and a fingernail.
20. The non-transitory storage component of claim 15, wherein the operations further comprise emitting audio feedback associated with the particular hand gesture.

This application is a continuation of prior U.S. patent application Ser. No. 09/898,108, filed Jul. 3, 2001, now U.S. Pat. No. 7,148,879, which claimed priority to U.S. Provisional Application No. 60/216,207, filed Jul. 6, 2000, and U.S. Provisional Application No. 60/265,212, filed Jan. 31, 2001, which are hereby incorporated herein by reference.

The present invention relates to the field of user interfaces for portable electronic devices, and more particularly to a system, method and apparatus for sensing and interpreting finger gestures and movements to control and provide input to electronic devices.

Portable electronic devices have become increasingly popular. Examples of these devices include wireless or cellular telephones, personal digital assistants (PDAs), pagers and audio or music delivery devices. Some devices have become increasingly small such that they are now deemed “pocketable” and/or “wearable.”

A portable electronic device typically has a user interface for operative control. Most, if not all, conventional user interfaces for such portable electronic devices employ physical buttons, a stylus, or voice control. In some devices, a large number of operations or functions are possible with the user interface.

One major shortcoming of these prior art user interfaces is that the user must physically retrieve and position the portable electronic device appropriately for physical contact therewith, for example, by utilizing a stylus to provide commands upon a touch-sensitive screen of a PDA or by manually depressing function buttons on a portable media player. In addition, as the size of a device becomes smaller, the interface becomes increasingly inappropriate from an ergonomic standpoint. Voice-controlled systems may alleviate some of these problems; however, the major shortcoming of a voice-controlled interface is that the user must speak openly, in such a way that other nearby people may hear. Many voice-controlled systems are also extremely sensitive to environmental noise and interference.

Accordingly, it would be desirable to have a system, method and apparatus for improving the shortcomings of prior art electronic device control systems.

The present invention is a system, method and apparatus for controlling and providing data, signals and commands to electronic devices such as wireless phones, Personal Digital Assistants (PDAs), music players/recorders, media players/recorders, computers such as laptops or other portable computers, public telephones and other devices. As described herein, the inventive systems, methods and apparatus involve the use of bioacoustic or contact sensing technology, adaptive training methods and wireless technology for the control of such electronic devices.

In one embodiment, the present invention is a method for controlling an electronic device which includes receiving one or more bioacoustic signals, each signal related to one or more hand gestures; determining the identity of the one or more hand gestures based on a positive correlation between the received signals and predetermined hand gesture data; and selectively issuing one or more commands associated with the identified hand gesture for activating one or more functions of the electronic device.

In another embodiment, the present invention is a wrist-adaptable wireless apparatus for invoking functions of a portable wireless device, including: a processor coupled to at least one piezo-electric contact microphone which receives sensor signal data; a storage facility for storing a plurality of gesture patterns, wherein the processor is operative to compare sensor signal data with the plurality of gesture patterns, to detect a substantial match between the sensor signal data and one of the plurality of gesture patterns, and to select one of a plurality of user input commands associated with the match, wherein the plurality of user input commands correspond to a plurality of functions of the portable wireless device; and a wireless transmitter coupled to said processor and operative to wirelessly transmit the user input command to the portable wireless device.

In yet another embodiment, the present invention is a wireless control system including: a bioacoustic sensor component; a digital processor coupled to the sensor component; a storage component for storing gesture pattern data indicative of a plurality of gestures, each gesture corresponding to a unique one of a plurality of electronic device commands, wherein the processor is operative to compare acoustic sensor signals with the gesture pattern data and to select the electronic device command corresponding to a gesture that correlates with the acoustic sensor signals; and a wireless transmitter and antenna coupled to the processor and operative to transmit the electronic device command.

FIG. 1 illustrates an exemplary system of the present invention.

FIG. 2 illustrates an exemplary system configuration of the present invention.

FIG. 3 illustrates an exemplary method of the present invention.

FIG. 4 is an exemplary training scenario screen of the present invention.

Referring to FIG. 1, one embodiment of a system 10 for controlling electronic devices is shown. In this embodiment, system 10 includes a wireless bioacoustic apparatus 20 which is in communication with an electronic device 100. As used herein, the term “electronic device” may include laptops, mobile phones, personal digital assistants (PDAs), handhelds, PCs, pagers, music players/recorders, media players/recorders and other electronic devices. For example, the present invention may be used to activate certain play functions such as stop, pause, play, rewind and record on a media player/recorder device without having the user manually activate the device in a conventional fashion. In the present invention, bioacoustic apparatus 20 is operative to communicate with electronic device 100 via over-the-air signals 60, such as radio frequency (RF) signals, infrared signals, microwave signals or other suitable over-the-air signals, using a variety of wireless standards and protocols, such as IEEE 802.11 or Bluetooth, or other methodologies, such as a capacitive body network. Accordingly, it is contemplated that electronic device 100 will have a facility for receiving and processing such signals in order to translate the signals into the corresponding device commands or sequences. Such a facility, in one exemplary embodiment, may be an RF receiver and processor for receiving RF signals and effectuating the commands corresponding to the signals on the electronic device.

In one embodiment, apparatus 20 includes a band 40 having bioacoustic sensor material formed therein or attached thereto. Band 40 has a signal processing component 50 attached thereto which may include components such as a signal amplifier, a digital processor, a memory, a broadcast component, an encryption module and an antenna, as discussed in more detail later herein. Bioacoustic sensor material may be one or more piezo-electric contact materials or films (also referred to herein as microphones). Preferably, band 40 is sized and configured to fit around a human arm 30. More preferably, band 40 is sized and configured to fit around a distal end 34 of human arm 30, proximate to a human hand 38. In one embodiment, the material of band 40 may be constructed of fabric, elastic, links, or other structure capable of incorporating bioacoustic sensor material, such as bioacoustic material incorporating one or more piezo-electric contact microphones therein. In a preferred embodiment, band 40 has an outer surface and an inner surface, wherein signal processing component 50 is attached or affixed to the band's outer surface and the bioacoustic sensor material is formed or attached to an inner surface of band 40. In this configuration, the bioacoustic material is positioned to receive bioacoustic signals from the user. In the present invention, the bioacoustic or piezo-electric material is optimized for sensing vibration in human skin over the ulna bone at the wrist. The internal sound is conducted by the bones of the hand and wrist to the ulna below the wristband sensor. Airborne sound does not register in the wristband.

As discussed in more detail later herein, signal processing component 50 may be configured in a wristwatch or wristwatch like configuration and incorporate one or more of a signal amplifier, digital processor, broadcast facility, memory and other components which are operative to receive, process and provide bioacoustic signals.

Referring to FIG. 2, an exemplary apparatus configuration 200 is shown. In this embodiment, configuration 200 includes a bioacoustic sensor 210, a signal amplifier 220, a digital processor 230, and a wireless broadcast component 240, such as a small-field or narrowcast broadcast device, which is coupled to an antenna 250. In one embodiment, signal amplifier 220, digital processor 230, wireless broadcast component 240 and antenna 250 are embedded or integrated in a wrist mount or wristwatch-like configuration, such as shown in FIG. 1 with respect to signal processing component 50. Configuration 200 may also include a memory 260 which is coupled to digital processor 230. A power source, such as a battery, not shown, may also be integrated within apparatus configuration 200. In another embodiment, apparatus configuration 200 may include an audio feedback mechanism, not shown, for emitting audio feedback to the user when a user gesture is sensed. In yet another embodiment, apparatus configuration 200 may include an auxiliary information component, not shown, which can receive and display small amounts of information such as sports scores, stock quotes, weather and appointments received from third-party providers. In a further embodiment, apparatus configuration 200 may include an encryption module for encrypting signals, for example, signals that are transmitted via wireless broadcast component 240. In one embodiment, these signals are narrowcast from the apparatus, which encrypts the signals via the encryption module and then broadcasts commands to an electronic device such as a wireless phone, a handheld computer or any nearby device equipped with appropriate receiver and decrypting facilities for decrypting the encrypted signals.
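
For illustration only, the following is a minimal sketch of how a gesture command might be encrypted before narrowcast transmission. It uses Python's cryptography package (Fernet symmetric encryption); the command strings, key handling and pairing step are assumptions, not the patent's implementation of the encryption module.

```python
from cryptography.fernet import Fernet

# Hypothetical shared key, provisioned when the wearable is paired with the device.
PAIRING_KEY = Fernet.generate_key()

def encrypt_command(command: str, key: bytes = PAIRING_KEY) -> bytes:
    """Encrypt a gesture command (e.g. "PLAY") before handing it to the broadcast component."""
    return Fernet(key).encrypt(command.encode())

def decrypt_command(token: bytes, key: bytes = PAIRING_KEY) -> str:
    """Run on the receiving electronic device; only holders of the key can decrypt."""
    return Fernet(key).decrypt(token).decode()
```

An authorized receiver holding the same key recovers the command, while any other nearby device sees only ciphertext, which is the property the encryption module is intended to provide.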

In the present invention, it is contemplated that one or more of the various apparatus components or elements, such as digital processor 230, wireless broadcast device 240, antenna 250, memory 260 and other components such as an encryption module, may be located remotely from bioacoustic sensor 210. For example, such components or elements may be integrated into a container which is placed in a region other than the user's arm, such as in a belt configuration or other remote configuration.

In another embodiment, the present invention may be configured with one or more piezo-electric contact microphones, signal amplifier, digital processor, a small-field wireless broadcast device and antenna embedded in a finger ring plus a wristwatch amplifier/repeater, not shown. In this embodiment, the wristwatch contains a larger battery than the ring and rebroadcasts gesture commands to the user's wearable devices once received from the ring. The wristwatch can also receive and display small amounts of information such as sports scores, stock quotes, weather and appointments.

In such an embodiment, the finger ring senses the bone-conducted sound of index and middle finger contacts with the thumb. Once sensed, these signals are narrowcast from the sensing ring to the wristwatch, which encrypts and broadcasts commands to worn cell phones, handheld computers or any nearby digital devices equipped with an appropriate receiver. In such an embodiment, however, the broadcast signals can only be decrypted by authorized digital devices.

In yet another embodiment of the invention, a fingernail-mounted (or thumbnail-mounted) or ring-mounted touch-edge and touch-surface device is used that emits coded audio tones into the finger (or thumb), which are picked up by a wrist unit controller and relayed forward to the controlled wearable device. In still another embodiment, a narrow-casting infra-red remote control embedded in a watch configuration is used to control devices in any environment. In another embodiment, coded audio is emitted from the wrist through the hand to grasped objects, such as door knobs, for unlocking and locking doors. In still another embodiment, coded audio is received from objects grasped by the hand, and the audio signals are relayed from the wrist to the wearable device.

Referring to FIG. 3, an exemplary method of the present invention is shown. In this embodiment, the system receives one or more bioacoustic signals from the user, wherein each signal is related to one or more hand gestures, step 310. Once the bioacoustic signals are received, the identity of the one or more hand gestures is determined based on a positive correlation between the received signals and predetermined hand gesture data, step 320. Once a gesture is identified, one or more commands associated with the identified hand gesture are issued which activate a corresponding function of an electronic device, such as the user's PDA, laptop, music player, media player, wireless phone or other similar device.
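
As a rough sketch of steps 310 and 320 only, the following illustrates identification by correlation against stored gesture templates. The normalized cross-correlation measure, threshold and template names are illustrative assumptions, not the algorithm stated in the patent.

```python
import numpy as np

def correlation_score(signal: np.ndarray, template: np.ndarray) -> float:
    """Peak normalized cross-correlation between a received signal and a stored gesture template."""
    s = (signal - signal.mean()) / (signal.std() + 1e-9)
    t = (template - template.mean()) / (template.std() + 1e-9)
    return float(np.max(np.correlate(s, t, mode="valid")) / len(t))

def identify_gesture(signal: np.ndarray, templates: dict, threshold: float = 0.6):
    """Return the best-matching gesture name, or None if no positive correlation is found."""
    scores = {name: correlation_score(signal, tpl) for name, tpl in templates.items()}
    best = max(scores, key=scores.get)
    return best if scores[best] >= threshold else None
```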

The invention includes a user-specific training module for machine learning of gesture classification from the finger-gesture audio patterns. During the device training session, the system asks the user to repeatedly perform hand gestures such as “touch index finger to thumb”, “touch middle finger to thumb”, or “snap your fingers”. At the same time, the learning component learns the mapping from signal to gesture for that individual user. Training and adaptation for gesture classification may be performed using a discriminative algorithm. The learning algorithm first maps the high-dimensional recordings into a compact internal representation. It then uses a machine learning technique called boosting to find a set of discriminative features. Finally, these features are combined into a single highly accurate yet compact gesture classifier. For example, in one embodiment, a state machine or hidden Markov model (HMM) may be used to classify quantized voltages into gesture classes to control the desired devices.
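
To make the training step concrete, here is a hedged sketch of the boosting idea using scikit-learn's AdaBoostClassifier, whose default weak learner is a single-split decision stump. The windowing, hand-crafted features and label names are assumptions for illustration, not the discriminative algorithm actually used.

```python
import numpy as np
from sklearn.ensemble import AdaBoostClassifier

def compact_features(window: np.ndarray) -> np.ndarray:
    """Map a high-dimensional recording window to a small internal representation."""
    spectrum = np.abs(np.fft.rfft(window))
    return np.array([
        float(window.std()),                                    # overall signal energy
        float(np.argmax(spectrum)),                             # dominant frequency bin
        float(spectrum[:10].sum() / (spectrum.sum() + 1e-9)),   # low-band energy ratio
    ])

def train_gesture_classifier(windows, labels):
    """Boost weak learners over compact features into a single gesture classifier.

    labels might be strings such as "index_tap", "middle_tap" or "snap",
    collected during the repeated-gesture training session."""
    X = np.vstack([compact_features(w) for w in windows])
    clf = AdaBoostClassifier(n_estimators=50)
    clf.fit(X, labels)
    return clf
```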

The present invention is designed to listen for or otherwise sense (via wrist, forearm, fingernail or ring-mounted sensors) naturally occurring fingertip or hand gestures. Exemplary detectable gestures include fingertip taps, double-taps and rubs, as described below.

In one exemplary embodiment, a finger/thumb tap means select, a finger/thumb double-tap means operate, and a money gesture (a rub of the thumb against the fingertips) means scroll. In the present invention, sound made by fingertip gestures, worn rings or fingernail-mounted devices, or grasped objects such as doorknobs, light switches or wall-mounted name plates may also be sensed by the bioacoustic sensors or microphones.

In another exemplary embodiment, a user wearing a wireless phone and earpiece or headset might listen to voicemails or music using VCR-like “forward”, “play”, “stop”, and “rewind” commands mapped to the tapping of thumb to index finger for “play”, thumb to middle finger for “stop”, thumb sliding from middle finger to index finger for “forward” and thumb sliding from index finger to middle finger for “rewind”. In a public area, the user can make these gestures silently and in a visually concealed and private manner without unholstering any controlled digital device.
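
A minimal sketch of such a gesture-to-command mapping follows; the gesture and command identifiers are made up for illustration, and the actual mapping is user- and device-specific.

```python
# Hypothetical mapping for the VCR-like media control described above.
GESTURE_COMMANDS = {
    "thumb_index_tap": "PLAY",
    "thumb_middle_tap": "STOP",
    "thumb_slide_middle_to_index": "FORWARD",
    "thumb_slide_index_to_middle": "REWIND",
}

def command_for(gesture):
    """Look up the device command for an identified gesture; None means no command is issued."""
    return GESTURE_COMMANDS.get(gesture)
```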

Referring to FIG. 4, an exemplary screen display 400 implementing an exemplary gesture classifier of the present invention is shown. In this embodiment, the classifier is shown in the process of receiving the bioacoustic signature of a specific fingertip gesture. Within screen display 400, a top line 410 shows the quantized signal, sampled at 8000 samples per second and displayed in windows of one tenth of a second. A second line 420 within screen display 400 provides the state of the state-machine classifier algorithm. A third line 430 shows the output classification: either R, for a rub of the thumbpad against the fingertips, or T, for a tap of the thumb against the index or ring finger.
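
For orientation only, here is a toy sketch of the framing and labeling suggested by lines 410-430. The sampling rate comes from the figure description, but the energy thresholds and the simple tap/rub decision rule are assumptions standing in for the actual state-machine classifier.

```python
import numpy as np

SAMPLE_RATE = 8000          # samples per second, per the description of line 410
FRAME = SAMPLE_RATE // 10   # one tenth of a second per displayed frame

def label_frames(signal: np.ndarray, tap_peak: float = 0.5, rub_rms: float = 0.1):
    """Label each 0.1 s frame 'T' (tap), 'R' (rub) or '-' (idle)."""
    labels, prev = [], "-"
    for start in range(0, len(signal) - FRAME + 1, FRAME):
        frame = signal[start:start + FRAME]
        peak = float(np.abs(frame).max())
        rms = float(np.sqrt(np.mean(frame ** 2)))
        if peak >= tap_peak and prev != "T":
            label = "T"      # sharp transient: thumb-to-finger tap
        elif rms >= rub_rms:
            label = "R"      # sustained energy: thumbpad rubbed against fingertips
        else:
            label = "-"
        labels.append(label)
        prev = label
    return labels
```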

One or more readout panels may also be provided, as shown in FIG. 4. Panel 440 shows a cellular phonebook application controlled with fingertip gestures. The application lists the names of contacts and their respective phone numbers. In this embodiment, for example, rubbing one's fingertips scrolls the application, while tapping selects the highlighted name and number for dialing. Of course, other variations and options for controlling the application may be set as desired by the user or as required by the application.
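
As an illustrative sketch of how classifier output could drive such an application, the controller below maps 'R' labels to scrolling and 'T' labels to selection. The phonebook contents and the returned dial command are assumptions for the example.

```python
class PhonebookController:
    """Drives a highlighted phonebook entry from classifier labels: 'R' scrolls, 'T' dials."""

    def __init__(self, entries):
        self.entries = entries          # list of (name, number) pairs
        self.highlight = 0

    def on_label(self, label: str):
        if label == "R":                # rub: advance the highlight
            self.highlight = (self.highlight + 1) % len(self.entries)
        elif label == "T":              # tap: select and dial the highlighted entry
            name, number = self.entries[self.highlight]
            return f"DIAL {number}"     # hypothetical command sent to the phone
        return None
```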

One of the many advantages of the present invention is that users can control nearby digital devices without speaking and without physically manipulating those devices. Users control nearby digital devices merely through simple finger gestures whose concurrent audio signals are sensed, learned, interpreted, encrypted and broadcast to the devices. Furthermore, while the present system, method and apparatus are ideally suited to provide able-bodied users with more convenient, intuitive and efficient ways to control electronic devices, they would also greatly benefit people with special needs, such as people with severe speech articulation problems or other ailments or handicaps that make conventional user interface controls difficult or even impossible to use. It is contemplated that a wearable communication device of the present invention could significantly improve the quality of life of such people.

While the invention has been described in conjunction with specific embodiments thereof, it is evident that many alternatives, modifications and variations will be apparent to those skilled in the art. Accordingly, preferred embodiments of the invention as set forth herein are intended to be illustrative, not limiting. Various changes may be made without departing from the spirit and scope of the invention.

Amento, Brian, Terveen, Loren Gilbert, Hill, William Colyer

Patent Priority Assignee Title
3629521,
4421119, Jul 28 1978 Massachusetts Institute of Technology Apparatus for establishing in vivo, bone strength
4720607, Jan 22 1986 Boussois S.A. Tactile screen for determining the coordinates of a point on a surface
4799498, Nov 01 1984 Kent Scientific; Industrial Project Ltd.; KENT SCIENTIFIC AND INDUSTRIAL PROJECT LIMITED Apparatus for detecting the resonant frequency of a bone
5073950, Apr 13 1989 Personnel Identification & Entry Access Control, Inc. Finger profile identification system
5319747, Apr 02 1990 U.S. Philips Corporation Data processing system using gesture-based input data
5327506, Apr 05 1990 Voice transmission system and method for high ambient noise conditions
5368044, Oct 24 1989 The Adelaide Bone and Joint Research Foundation, Inc. Vibrational analysis of bones
5615681, Dec 22 1994 ALOKA CO , LTD Method for measuring speed of sound in tissue and tissue assessment apparatus
5766208, Aug 09 1994 Lawrence Livermore National Security LLC Body monitoring and imaging apparatus and method
5810731, Nov 13 1995 SARVAZYAN, ARMEN P, DR Method and apparatus for elasticity imaging using remotely induced shear wave
5836876, Mar 03 1993 Washington University Method and apparatus for determining bone density and diagnosing osteoporosis
6115482, Feb 13 1996 Ascent Technology, Inc.; ASCENT TECHNOLOGY, INC Voice-output reading system with gesture-based navigation
6135951, Jul 30 1997 Living Systems, Inc. Portable aerobic fitness monitor for walking and running
6151208, Jun 24 1998 HEWLETT-PACKARD DEVELOPMENT COMPANY, L P Wearable computing device mounted on superior dorsal aspect of a hand
6234975, Aug 05 1997 Research Foundation of State University of New York, The Non-invasive method of physiologic vibration quantification
6380923, Aug 31 1993 Nippon Telegraph and Telephone Corporation Full-time wearable information managing device and method for the same
6396930, Feb 20 1998 Gentex Corporation Active noise reduction for audiometry
6409684, Apr 19 2000 Medical diagnostic device with multiple sensors on a flexible substrate and associated methodology
6507662, Sep 11 1998 GR Intellectual Reserve, LLC Method and system for biometric recognition based on electric and/or magnetic properties
6589287, Apr 29 1997 Handevelop AB Artificial sensibility
6631197, Jul 24 2000 GN Resound North America Corporation Wide audio bandwidth transduction method and device
6754472, Apr 27 2000 Microsoft Technology Licensing, LLC Method and apparatus for transmitting power and data using the human body
6783501, Jul 19 2001 NIHON SEIMITSU SOKKI CO , LTD Heart rate monitor and heart rate measuring method
6844660, Mar 23 2000 Cross Match Technologies, Inc Method for obtaining biometric data for an individual in a secure transaction
6898299, Sep 11 1998 GR Intellectual Reserve, LLC Method and system for biometric recognition based on electric and/or magnetic characteristics
7010139, Dec 02 2003 Bone conducting headset apparatus
7123752, Dec 19 2001 Sony Corporation Personal identification apparatus and method
7148879, Jul 06 2000 AT&T Corp Bioacoustic control system, method and apparatus
7198607, Dec 21 2001 MEDSIS FINLAND OY Detector unit, an arrangement and a method for measuring and evaluating forces exerted on a human body
7206423, May 10 2000 UNIVERSITY OF ILLINOIS, THE Intrabody communication for a hearing aid
7370208, Mar 07 2002 LEVIN, SHMUEL Method and apparatus for automatic control of access
7405725, Jan 31 2003 Olympus Corporation Movement detection device and communication apparatus
7536557, Mar 22 2001 Halo Wearables, LLC Method for biometric authentication through layering biometric traits
7539533, May 16 2006 KONINKLIJKE PHILIPS N V Mesh network monitoring appliance
7615018, Feb 19 2001 Vibrant Medical Limited Leg ulcer, lymphoedema and DVT vibratory treatment and device
7625315, Dec 14 1995 ICON HEALTH & FITNESS, INC Exercise and health equipment
7648471, May 22 2003 Merlex Corporation Pty Ltd. Medical apparatus, use and methods
7671351, Sep 05 2003 Apple Inc Finger sensor using optical dispersion sensing and associated methods
7708697, Apr 20 2000 KARMELSONIX AUSTRALIA PTY LTD Method and apparatus for determining conditions of biological tissues
7760918, Aug 06 2003 CLICK-INTO INC ; BEZVERSHENKO, ZINAYIDA; TCHOUIKEVITCH, VOLODYMYR; GHORAYEB, JOHN Identification of a person based on ultra-sound scan analyses of hand bone geometry
7778848, May 31 2000 Med-DataNet, LLC Electronic system for retrieving, displaying, and transmitting stored medical records from bodily worn or carried storage devices
7796771, Sep 28 2005 THE REVOCABLE TRUST AGREEMENT OF ROBERTA A CALHOUN Bone conduction hearing aid fastening device
7878075, May 18 2007 University of Southern California Biomimetic tactile sensor for control of grip
7914468, Sep 22 2004 SVIP 4 LLC Systems and methods for monitoring and modifying behavior
8023669, Jun 13 2005 Technion Research and Development Foundation LTD Shielded communication transducer
8023676, Mar 03 2008 SONITUS MEDICAL SHANGHAI CO , LTD Systems and methods to provide communication and monitoring of user status
8031046, Aug 02 2006 Apple Inc Finger sensing device with low power finger detection and associated methods
8098129, Nov 16 2004 Koninklijke Philips Electronics N.V. Identification system and method of operating same
8196470, Mar 01 2006 3M Innovative Properties Company Wireless interface for audiometers
8200289, Mar 16 2007 LG Electronics Inc. Portable terminal
8253693, Feb 17 2005 Koninklijke Philips Electronics N V Device capable of being operated within a network, network system, method of operating a device within a network, program element, and computer-readable medium
8270637, Feb 15 2008 SONITUS MEDICAL SHANGHAI CO , LTD Headset systems and methods
8270638, May 29 2007 SONITUS MEDICAL SHANGHAI CO , LTD Systems and methods to provide communication, positioning and monitoring of user status
8312660, May 09 2008 Corydoras Technologies, LLC Firearm
8348936, Dec 09 2002 THE TRUSTEES OF DARTMOUTH COLLEGE Thermal treatment systems with acoustic monitoring, and associated methods
8421634, Dec 04 2009 Microsoft Technology Licensing, LLC Sensing mechanical energy to appropriate the body for data input
8467742, Mar 17 2009 Denso Corporation Communications apparatus
8482488, Dec 22 2004 Oakley, Inc. Data input management system for wearable electronically enabled interface
8491446, Oct 02 2009 KIIO INC Exercise devices with force sensors
8500271, Oct 09 2003 IpVenture, Inc. Eyewear supporting after-market electrical components
8521239, Dec 27 2010 FINEWELL CO , LTD Mobile telephone
8540631, Apr 14 2003 Remon Medical Technologies, Ltd. Apparatus and methods using acoustic telemetry for intrabody communications
8542095, Feb 22 2008 NEC Corporation Biometric authentication device, biometric authentication method, and storage medium
8594568, May 08 2006 Koninklijke Philips Electronics N V Method of transferring application data from a first device to a second device, and a data transfer system
8750852, Oct 27 2011 Qualcomm Incorporated Controlling access to a mobile device
8922427, Jun 29 2011 BAE Systems Information and Electronic Systems Integration Inc. Methods and systems for detecting GPS spoofing attacks
20010013546,
20010051776,
20030048915,
20030066882,
20060018488,
20070012507,
20080223925,
20080260211,
20090149722,
20090234262,
20090287485,
20090289958,
20100016741,
20100066664,
20100137107,
20100162177,
20100168572,
20100286571,
20100316235,
20110125063,
20110134030,
20110135106,
20110137649,
20110152637,
20110227856,
20110245669,
20110255702,
20110269601,
20110282662,
20120010478,
20120011990,
20120058859,
20120065506,
20120212441,
20120290832,
20130034238,
20130041235,
20130119133,
20130120458,
20130135223,
20130142363,
20130171599,
20130173926,
20130215060,
20130225915,
20130278396,
20130288655,
20140009262,
20140028604,
20140035884,
20140097608,
20140099991,
20140168135,
20140174174,
20140188561,
20140210791,
20140240124,
20150084011,
20150199950,
AU2003257031,
AU2007200415,
CA1207883,
EP712114,
EP921753,
EP1436804,
EP2312997,
EP2483677,
EP2643981,
GB2226931,
JP2003058190,
JP2005142729,
JP2010210730,
JP2249017,
JP4317638,
KR20100056688,
TW200946887,
WO3033882,
WO2006094372,
WO2010045158,
WO2012168534,
WO8201329,
WO9601585,
Executed on: Oct 25 2006
Assignee: AT&T Intellectual Property II, L.P.
Conveyance: Assignment on the face of the patent

Date Maintenance Schedule
Jul 19 2019: 4-year fee payment window opens
Jan 19 2020: 6-month grace period starts (with surcharge)
Jul 19 2020: patent expiry (for year 4)
Jul 19 2022: end of 2-year period to revive unintentionally abandoned patent (for year 4)
Jul 19 2023: 8-year fee payment window opens
Jan 19 2024: 6-month grace period starts (with surcharge)
Jul 19 2024: patent expiry (for year 8)
Jul 19 2026: end of 2-year period to revive unintentionally abandoned patent (for year 8)
Jul 19 2027: 12-year fee payment window opens
Jan 19 2028: 6-month grace period starts (with surcharge)
Jul 19 2028: patent expiry (for year 12)
Jul 19 2030: end of 2-year period to revive unintentionally abandoned patent (for year 12)