A mobile terminal is used to assist individuals with disabilities. The mobile terminal (e.g., a “smartphone” or other commercially available wireless handheld device) may be loaded with software. The software may be configured to: (i) receive information about a sensory deficit associated with a user, (ii) receive information about a sensory proficiency associated with the user, (iii) determine whether an event associated with a sensory deficit satisfies a criterion, and if so (iv) provide an assistive output based on a sensory proficiency and the event.
19. A mobile terminal comprising:
a housing, wherein the housing comprises eyeglasses;
at least one sensor positioned within the housing, the sensor operable to detect at least one of: light, sound, motion, temperature, geographic location, blood glucose level, blood pressure, electrodermal skin response, and heart rate;
a user interface, wherein the user interface includes at least one of a display screen, an audio speaker, and a vibration unit as a user interface component; and
a control system operatively coupled to the at least one sensor and the user interface, and adapted to:
receive profile information associated with a user of the mobile terminal, wherein the profile information is stored electronically in a database and defines:
a sensory disability associated with the user of the mobile terminal, wherein the sensory disability comprises at least one among a vision, hearing, locomotion, balance, smell, and touch disability;
a sensory proficiency associated with the user of the mobile terminal, wherein the sensory proficiency comprises a vision, hearing, or touch proficiency;
an assistive output rule, which associates the sensory disability with at least one sensor, associates the sensory proficiency with at least one mobile terminal user interface component or peripheral eyeglasses user interface component, and associates the sensory disability with the sensory proficiency, such that at least one user interface component may be employed in a manner that compensates for the sensory disability; and
an event comprising a threshold criterion associated with the at least one sensor from a plurality of sensors;
compare a measurement from the at least one sensor to the threshold criterion to determine if the event has occurred; and
execute the assistive output rule by employing the at least one user interface component to compensate for the sensory disability and allow the user to perceive the event.
1. A system comprising:
a mobile terminal and peripheral eyeglasses communicatively coupled with the mobile terminal;
at least one sensor, the sensor operable to detect at least one of: light, sound, motion, temperature, geographic location, blood glucose level, blood pressure, electrodermal skin response, and heart rate;
a mobile terminal user interface, wherein the mobile terminal user interface includes at least one of a touch-sensitive display screen, an audio speaker, and a vibration unit as a mobile terminal user interface component;
a peripheral eyeglasses user interface, wherein the peripheral eyeglasses user interface includes at least one of a display screen and an audio speaker as a peripheral eyeglasses user interface component; and
a control system operatively coupled to the at least one sensor, the mobile terminal user interface, and the peripheral eyeglasses user interface, and adapted to:
receive profile information associated with a user of the mobile terminal, wherein the profile information is stored electronically in a database and defines:
a sensory disability associated with the user of the mobile terminal, wherein the sensory disability comprises at least one among a vision, hearing, locomotion, balance, smell, and touch disability;
a sensory proficiency associated with the user of the mobile terminal, wherein the sensory proficiency comprises a vision, hearing, or touch proficiency;
an assistive output rule, which associates the sensory disability with at least one sensor, associates the sensory proficiency with at least one mobile terminal user interface component or peripheral eyeglasses user interface component, and associates the sensory disability with the sensory proficiency, such that the at least one mobile terminal user interface component or peripheral eyeglasses user interface component may be employed in a manner that compensates for the sensory disability; and
an event comprising a threshold criterion associated with the at least one sensor from a plurality of sensors;
compare a measurement from the at least one sensor to the threshold criterion to determine if the event has occurred; and
execute the assistive output rule by employing the at least one mobile terminal user interface component or peripheral eyeglasses user interface component to compensate for the sensory disability and allow the user to perceive the event.
2. The system of
3. The system of
4. The system of
5. The system of
6. The system of
7. The system of
8. The system of
9. The system of
10. The system of
11. The system of
12. The system of
13. The system of
14. The system of
15. The system of
16. The system of
17. The system of
18. The system of
The present application is a continuation application of U.S. patent application Ser. No. 13/540,148, filed Jul. 2, 2012.
The '148 application is a divisional application of U.S. patent application Ser. No. 12/704,119, filed Feb. 11, 2010, now abandoned.
The '119 application claimed the benefit of and priority to U.S. Provisional Patent Application Ser. No. 61/152,915 filed Feb. 16, 2009.
The contents of the applications listed above are hereby incorporated by reference in their entireties.
The present application is directed to assisting those with sensory deficits through the use of a mobile terminal.
As the general population grows, so does the number of developmentally disabled children. Additionally, the rates at which children have been diagnosed as developmentally disabled, and particularly diagnosed with autism spectrum disorders (ASD), have steadily increased. Individuals with developmental disabilities often have several challenges in common, including but not limited to speech and language impairments, cognitive deficits, social problems, behavioral problems, memory problems, attention deficits, and sensory processing dysfunction. The developmentally disabled population extends beyond those with ASD to include those with Down syndrome, cerebral palsy, Fragile X syndrome, and other disabilities which may involve sensory impairments. In addition, many individuals develop sensory deficits at later stages in life, from trauma, old age, or other neuropathy. More than eight million people over the age of five in the United States alone are hard of hearing. More than seven million are visually impaired. Sensory dysfunction can result from stroke, brain injury, cancer, and other ailments.
For individuals with sensory processing dysfunction, any number of conditions such as: hypersensitivity or hyposensitivity with respect to senses of sight; hearing; touch; smell; taste; pain (nociception); balance (equilibrioception); spatial orientation, joint motion and acceleration (proprioception and kinesthesia); time; and temperature (thermoception) may pose significant challenges in simple daily tasks, particularly when such tasks involve new environments. Many take for granted the ability to visit a new place, walk in sunlight, cope with the general auditory din of a public location, interact with a pet, and so on, though these tasks may be particularly daunting for hypersensitive individuals. For the hyposensitive, difficulty hearing, feeling, seeing, or balancing presents a constant battle, forcing many to overcompensate for their deficits (e.g., aggressively bang an object simply to feel it), or exhaust themselves straining a particular sense. Many disabled individuals resort to self-stimulatory or “stereotypy” activity (e.g., echolalic or perseverative speech, repetitive body motions, etc.) as a mechanism to cope with or regulate sensory irregularities.
Described herein are techniques for assisting individuals with disabilities. In some embodiments, a mobile terminal (e.g., a “smartphone” or other commercially available wireless handheld device described herein) may be loaded with software of the present disclosure. The software may be configured to: (i) receive information about a sensory deficit associated with a user, (ii) receive information about a sensory proficiency associated with the user, (iii) determine whether an event associated with a sensory deficit satisfies a criterion in that an event has occurred that relates to or implicates the sensory deficit, and if so (iv) provide an assistive output based on a sensory proficiency and the event. In one embodiment, a sensory input is received by the software and translated into a sensory output that is more easily understood or processed by a user of a mobile terminal. For example, a user with auditory hypersensitivity may receive information about a loud environment (the presence of the loud environment implicates the auditory hypersensitivity) by way of a text shown on a display screen. In another example, a user with visual hypersensitivity may receive information about an overly bright environment through audio speakers or a peripheral headset. As an alternative to (or in addition to) such sensory translations, the software may provide therapeutic or instructional output to help the disabled individual cope with the event (e.g., soothing music, a vibration, advice for how to cope with the situation, a reassurance that a caregiver is nearby, or the like).
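The four-step flow described above can be illustrated with a short sketch. All structure here (the field names, the 85 dB threshold, the output strings) is hypothetical and for illustration only; the disclosure does not prescribe a particular data model:

```python
# Illustrative sketch of the four-step flow: (i) receive deficit info,
# (ii) receive proficiency info, (iii) test whether an event implicating
# the deficit satisfies a criterion, (iv) output via a proficient sense.
# All profile fields and threshold values are hypothetical examples.

def assistive_output(profile, event):
    """Translate an event implicating a deficit into an output
    that relies on one of the user's proficient senses."""
    deficit = profile["deficit"]          # step (i): e.g. auditory hypersensitivity
    proficiency = profile["proficiency"]  # step (ii): e.g. "vision"
    # step (iii): does the event implicate the deficit and satisfy its criterion?
    if event["sense"] == deficit["sense"] and event["level"] >= deficit["threshold"]:
        # step (iv): render the event via the proficient sense
        if proficiency == "vision":
            return f"DISPLAY: {event['description']}"
        if proficiency == "hearing":
            return f"SPEAK: {event['description']}"
        return f"VIBRATE: {event['description']}"
    return None

profile = {
    "deficit": {"sense": "hearing", "threshold": 85},  # hypersensitive above 85 dB
    "proficiency": "vision",
}
loud_event = {"sense": "hearing", "level": 95, "description": "Loud environment nearby"}
```

For the auditory-hypersensitivity example in the text, a 95 dB event would be rendered as text on the display screen, while a quiet environment produces no output.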
An overview of some of the hardware elements is provided before addressing some of the methods and potential screen configurations associated with embodiments of the present disclosure. A mobile terminal 10 is illustrated in
A block diagram of the mobile terminal 10 is illustrated in
Peripheral eyeglasses 58a may communicate with the mobile terminal 10, and may comprise an optical camera 60, a RFID transponder (not shown), an audio microphone (not shown), accelerometer (not shown), and/or an infrared transmitter (not shown). Eyeglasses 58a may be used to provide information related to (i) the environment surrounding an individual wearing the eyeglasses (e.g., still images or video are taken from outward-facing optical camera 60 and transmitted to mobile terminal 10 through eyeglasses 58a), (ii) pupil dilation of the individual wearing the glasses (e.g., optical-camera 60 is oriented inward to directly monitor the user's eyes), or (iii) the direction of the individual's gaze with respect to the mobile terminal (e.g., the orientation of eyeglasses 58a relative to mobile terminal 10 is determined using RFID, and thus the orientation of the individual's gaze relative to mobile terminal 10 may be inferred). As an example of eye-tracking technology used in conjunction with a mobile terminal, the 3D Eyetracking UI as developed by TAT THE ASTONISHING TRIBE, AB of Sweden for the T-MOBILE G1 mobile terminal alters the user-interface presented by the mobile terminal's display screen based on the orientation of the device to the user's eyes.
Peripheral skin patch 58b may comprise (i) a sensor enabling the measurement of biological properties, and (ii) a transmitter enabling wireless communication with mobile terminal 10. Biological properties measured by skin patch 58b may include temperature, blood glucose (the GUARDIAN® REAL-Time Continuous Glucose Monitoring System, as produced by MEDTRONIC MINIMED, INC. of Northridge, Calif., comprises a sensor embedded within a skin patch that monitors blood sugar levels and a transmitter to send data wirelessly to a receiver via BLUETOOTH™), blood pressure, Galvanic Skin Response (GSR) or electrodermal skin response, heart rate, and cardiac electrical activity (e.g., a wireless ECG patch such as the one developed by the IMEC research center in Belgium). Thus, a user of mobile terminal 10 may also wear skin patch 58b such that biological measurements may be taken in a continuous, periodic, or otherwise specified manner.
Wristwatch 58c may measure and transmit biological data in a manner similar to that of skin patch 58b. Thus, a user of mobile terminal 10 who enjoys wearing a watch may at the same time provide a variety of biological data used by the software of the present disclosure. A variety of commercially available digital wristwatches comprise sensors for measuring such data (e.g., the IRONMAN® Heart Rate Monitor Watch by TIMEX®). In an alternate embodiment, wristwatch 58c may instead take the form of a basic wristband, arm band or ankle band (i.e., comprising a sensor and transmitter but no clock or stopwatch functionality). Likewise, while not shown, the wristwatch 58c may also include an accelerometer, an RFID transponder, or the like to detect movement of the wristwatch 58c or its position relative to the mobile terminal 10.
Ring 58d may also measure and transmit biological data and/or positional data in a manner similar to that of skin patch 58b. The ring 58d and its components (e.g., sensors, electronics) may be fashioned from a variety of metals, plastics, composites, and/or semiconductors. In one embodiment, a commercially available ring, such as the MYBEAT Heart Rate Ring by LIFESPAN FITNESS, may be adapted for use with software of the present disclosure.
In one embodiment (not shown), a biosensor may be permanently affixed to or implanted beneath the user's skin. For example, a subcutaneous electrochemical sensor may be used to continuously measure and report on blood (or interstitial fluid) glucose level. An example of a device employing such a semi-permanent sensor is the FREESTYLE NAVIGATOR® Continuous Glucose Monitoring System by ABBOTT LABORATORIES of Abbott Park, Ill.
Clip-on microphone 58e may be used to receive audio data and re-transmit it to the mobile terminal 10. Thus, if the clip-on microphone 58e is oriented toward the user's mouth (as it appears in
Headphones 58f may be employed as a mechanism to hear audio produced by mobile terminal 10 (e.g., as an alternate to, or in addition to, speaker 14). In one embodiment, headphones may take the form of earbuds, and along with ergonomic plastic 64, may be positioned in a manner that is secure, comfortable, and effective for listening to sounds produced by the mobile terminal 10. In other embodiments, a full set of ear-encompassing headphones may be used. As alluded to above, headphones 58f may also be incorporated into a single device with a microphone 58e (not shown). Suffice it to say that headphones of a variety of sorts are widely available in the commercial marketplace and may be used in conjunction with software of the present disclosure. In some embodiments, such headphones may incorporate other sensors (e.g., RFID, an infrared sensor, a motion sensor) to facilitate embodiments of the present disclosure (e.g., determination of the physical orientation of the user's head).
Worth discussing at this point is the manner in which mobile terminal 10 may be carried or held. In various embodiments, a mobile terminal 10 may be carried or held by a disabled individual and/or a physically present caregiver. At times, mobile terminal 10 may be carried by hand, but this may not always be desirable, convenient, or possible. While the above described peripheral devices 58 may allow for the placement of mobile terminal 10 inside of a pocket or a purse, some embodiments (e.g., those requiring substantially continuous use of components integrated within mobile terminal 10 itself) may benefit from the placement of mobile terminal 10 outside of a pocket or a purse. In one embodiment, as depicted in
In some embodiments, a user of the mobile terminal 10 may be reminded or encouraged to appropriately position the mobile terminal 10 or any associated peripheral devices on or around his or her person. For example, in some embodiments, placement of the mobile terminal 10 around the neck of a disabled user through the use of lanyard 62 may be desirable. Thus, if the mobile terminal's integrated light sensor detects lack of light for a prolonged period of time during a specified coverage period (e.g., suggesting that the device may be in the user's pocket or purse), a vibration unit, speaker 14, or other output device may be actuated to alert the user and encourage him or her to appropriately position the mobile terminal 10 (e.g., the mobile terminal vibrates, and the speaker outputs a ringtone or other alarm). An accelerometer 34 and/or light sensor may then detect that the device has been picked up, at which point instructions for effectively positioning the device may be output (e.g., the display screen 16 outputs a graphical indication of the device hung around a neck, while speaker 14 outputs an audio instruction to “Hang around your neck, please!”). Placement of the mobile terminal 10 or associated peripheral devices in other locations may be encouraged in a similar manner (e.g., a user is encouraged to position peripheral microphone 58e outward such that it captures the speech of those engaged in conversation with the user).
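The positioning reminder described above amounts to a simple check over light-sensor readings: prolonged darkness during a coverage period suggests the device is in a pocket or purse. The lux and duration values below are invented for illustration; the disclosure does not specify thresholds:

```python
# Hypothetical sketch of the positioning reminder described above.
# Sensor readings arrive as (timestamp_seconds, lux) pairs; the
# darkness limit and lux threshold are invented for illustration.

DARK_LUX = 5            # below this, assume the device is covered
DARK_LIMIT_S = 600      # "prolonged" darkness: ten minutes

def needs_reposition_alert(readings):
    """Return True if the light sensor shows continuous darkness
    longer than DARK_LIMIT_S during the coverage period."""
    dark_since = None
    for t, lux in readings:
        if lux < DARK_LUX:
            if dark_since is None:
                dark_since = t
            if t - dark_since >= DARK_LIMIT_S:
                return True   # actuate vibration unit / speaker alarm
        else:
            dark_since = None  # light seen: device likely positioned well
    return False
```

Once the alert fires and the accelerometer detects the device being picked up, the positioning instructions described above would be output.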
Against this backdrop of hardware, an overview of an exemplary method is presented starting with reference to
The caregiver may then configure elements of the software, so as to customize the software on behalf of a disabled individual. Particularly, the caregiver may provide the software with information about one or more sensory deficits associated with a user, and thus the software receives information about a sensory deficit associated with the user (block 104). For example, one or more particular hypersensitivities or hyposensitivities associated with a disabled user may be described. The software may further receive information about a sensory proficiency associated with the user (block 106). Further, the caregiver may provide, and the software may store, a criterion for providing an assistive output as well as the actual assistive output (block 108). The provided information is saved, and the software, through the sensors associated with the mobile terminal 10, monitors the activities of the disabled individual to determine if the criterion has been satisfied (block 110). If the criterion has been satisfied, the mobile terminal 10, or an associated peripheral device 44 or 58, may be used to provide an assistive output. Many of these steps will be further explained below with reference to various exemplary screen shots.
In an exemplary embodiment, after the software is loaded onto the mobile terminal (block 104,
Regardless of the particular technique, the caregiver may see a screen such as that illustrated in
If the user selects a pre-existing profile or template (block 152), the user may be shown a screen like that shown in
If the user is satisfied with the current parameters associated with the profile (i.e., elements 206, 208, 210, and 212), the user may select the load command 216 (block 154) in which case the software loads the profile and begins monitoring to see if the criterion is satisfied as described with reference to block 110 of
If, instead of loading an existing profile, the user selects the “+” command 204, a blank profile is created (block 158) and the user may see a screen like that in
As shown, the screen of
Continuing with the new profile example, and assuming the caregiver has selected the “Add/Edit Sensory Deficit Info” option (
Continuing the example, and assuming the caregiver has selected the “Take a survey” option, a screen such as that of
Returning to
Returning to
Returning again to
Worth describing at this juncture are some sensory deficit types for which information may be provided. Again, though
Some users may suffer from a visual impairment or may have difficulty seeing. In one example, a user has a loss of visual acuity (e.g., difficulty seeing clearly, whether near-sightedness, far-sightedness, astigmatism, imbalance of acuity between eyes, or overall lack of acuity or blindness). In another example, a user has a loss of visual field (e.g., a reduction in the total area that can be seen without moving the head or eyes, such as tunnel vision). In another example, a user suffers from a lack of ocular motor control (e.g., difficulty fixating on objects of particular types or objects in particular locations). In another example, a user has a hypersensitivity to bright environments (e.g., shields eyes in sunlight or when standing in fluorescent light) or to bright colors. In another example, a user is colorblind.
Some users may suffer from a hearing or auditory impairment. In one embodiment, a user has lost an ability (in part or in full) to hear certain frequencies of sound. For example, a user with mild hearing loss hears only sounds at 25 decibels and above, whereas a user with profound hearing loss hears only sounds louder than 90 decibels. A person's hearing sensitivity in association with particular ranges of frequencies may be plotted on an audiogram, as is known in the art. In another embodiment, a user may suffer from tinnitus, or the perception of sound within the human ear without a corresponding external sound. Tinnitus may be objective (e.g., so-called “pulsatile” tinnitus heard by others, such as a rumbling of blood flow emanating from a vascular source within the body), or subjective (e.g., ringing, buzzing, clicking, or beeping not heard by others). Some users with tinnitus may also suffer hearing loss, and/or may be more sensitive to sounds emitted at certain frequencies (e.g., a particular ringing tone is uncomfortable to hear). Some users may experience hyperacusis (i.e., oversensitivity to certain types of sounds or volume levels).
Some users may experience sensory deficits related to touch. For example, some users may exhibit tactile defensiveness (hypersensitivity to touch). Some users may avoid contact with certain types of textures or objects, such as wool, sand, hair combs, cardboard, or light human touch. In other examples, users may suffer from tactile anesthesia, or inability to register sensation through the skin. For example, a user may have lost, partially or completely, the ability to feel objects using a particular finger or hand.
Some users may experience impairments related to taste (gustatory sensations, as mediated by taste buds). For example, a user may experience ageusia, or the inability to detect sweetness, sourness, bitterness, or saltiness in association with foods or beverages. Other users may experience hypogeusia (partial loss of gustatory sensation) or dysgeusia (the distortion or alteration of taste).
Some users may have difficulty smelling or accurately processing olfactory sensations. For example, a user with anosmia may not be able to detect an odor. A user with hyposmia may experience a reduction in the sense of smell (e.g., in association with a particular odor, or with all odors). Some may perceive odors which are not present (so-called “olfactory hallucinations”), or may be hypersensitive to certain types of odors (e.g., the smell of metal or rubber makes a particular user feel sick).
Some users may experience difficulty sensing temperature. For example, hyposensitive thermoception may result in the inability or impaired ability to register or sense different temperatures (e.g., a user may “under-feel” heat, and so may be prone to kitchen or fireplace burns). Some users may be hypersensitive to temperature (e.g., moderate warmth is perceived as excessive heat; a somewhat cold object is perceived as extremely cold).
Some users may experience problems with balance or equilibrioception. In some examples, balance-related challenges may stem from the vestibular system, resulting in an impaired ability to sense body movement or acceleration, or to maintain postural equilibrium. Further challenges related to equilibrioception may result in poor coordination, motor planning, or sequencing of actions.
Some users may suffer from challenges related to proprioception, or the ability to interoceptively sense where body parts are located in relation to one another. Some users may experience impairment related to the sensation of joint position, resulting in difficulty determining the location of a body part in space. Some users may experience kinesthetic impairment, resulting in difficulty sensing that a certain body part has moved. Some users with proprioceptive and/or kinesthetic impairments may exhibit clumsiness, insensitivity to pain, or sensory-seeking behaviors (e.g., walk, push objects, write, play with objects, or touch people with excessive force).
Returning to the flowchart of
Given this framework, perhaps the easiest way of receiving information related to sensory proficiencies (block 106 of
Of course, in some situations, a caregiver or other user may wish to expressly provide information related to one or more functional sensory capacities (e.g., perhaps a particular sense associated with a user is exceptionally well developed, such as acute “20/10” vision). In such a case, a user may first set the switch shown by
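One way to read blocks 104 and 106 together is that senses not reported as deficient may be treated as functionally proficient unless the caregiver expressly provides otherwise. A minimal sketch of that default; the sense list and function name are invented for illustration:

```python
# Hypothetical default: every sense not reported as deficient is
# presumed functionally proficient, subject to express override.

SENSES = {"vision", "hearing", "touch", "smell", "taste", "balance"}

def inferred_proficiencies(deficits):
    """Return the senses presumed proficient: the complement of the
    reported deficits within the known sense set."""
    return SENSES - set(deficits)
```

A caregiver's express entries (e.g., exceptional “20/10” vision) would then refine or replace these inferred defaults.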
Once information related to sensory deficits and proficiencies is received by the software, the process depicted by
In some embodiments, software of the present disclosure automatically creates assistive output rules based on sensory proficiency and deficiency information provided by users. For example, sensory proficiency and deficiency data associated with a user may be organized into a database (an example database is described below), and the database may be scanned by the software. The purpose of the scan may be to identify trends within the data that would suggest certain types of assistive outputs may be beneficial to the user. For example, programmed logic within the software code may indicate that assistive output rules are to be automatically created when certain fields within the database indicate certain data. For example, if one or more database records associated with a user's visual acuity describe vision as proficient, and yet other records indicate the user suffers moderate to severe hearing loss at high frequencies (4000 Hz and above), a rule may automatically be created whereby sounds (detected by integrated microphone 26 or clip-on microphone 58e) occurring above 4000 Hz are automatically “translated” into images and output via display screen 16 (e.g., a chime from a clock is detected, and so a picture of a clock is output by the screen). Other examples of assistive output rules are described below, but suffice it to say that any rule may be so created by the software upon analyzing data provided by users; through time and experience, creators of the software may learn that users with particular deficits commonly program certain assistive output rules, and such information may feed the software's logic for automatically creating assistive output rules.
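The automatic rule-creation scan described above might be sketched as follows. The record fields, status labels, and rule structure are hypothetical; only the 4000 Hz cutoff and the sound-to-image translation follow the example in the text:

```python
# Sketch of the automatic rule-creation scan described above.
# Record and rule field names are invented for illustration.

def auto_create_rules(records):
    """Scan profile records and emit assistive output rules
    suggested by trends in the data."""
    vision_ok = any(
        r["sense"] == "vision" and r["status"] == "proficient" for r in records
    )
    hf_hearing_loss = any(
        r["sense"] == "hearing"
        and r["status"] in ("moderate_loss", "severe_loss")
        and r.get("min_freq_hz", 0) >= 4000
        for r in records
    )
    rules = []
    if vision_ok and hf_hearing_loss:
        rules.append({
            "trigger": "sound_above_4000hz",  # detected by microphone 26 or 58e
            "output": "image_on_display_16",  # e.g. picture of the sounding object
        })
    return rules
```

As the text notes, the logic driving such scans could itself be refined over time, as the software's creators learn which rules users with particular deficits commonly program.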
In other embodiments, assistive output rules may be created by users of the software. Turning to
Once the user selects “Create,” a second screen for configuring an assistive output rule may be presented, an example of which is shown by
For example, assuming the user selects the option labeled “Sounds, noises or speech,” a screen such as that of
Using a screen such as this, the caregiver may begin to configure a criterion such that an identified sound (i.e., as detected by integrated microphone 26 or clip-on microphone 58e) will be used to trigger an assistive output. Using one option, a user may search or browse an already-existing “Library” to select a particular sound to be used as a trigger. Using another option, the user may choose to record and save a custom sound.
If the caregiver selects the latter option, a screen such as that of
The screen of
Continuing with the example and assuming the user has selected the “Save” control 68a, the user has now specified a criterion (i.e., the detection of a sound produced by a tea kettle) for the provision of an assistive output. To complete programming the assistive output rule, the user may now define the assistive output that should be produced should the criterion be satisfied. To do so, one or more screens such as that of
The user may then select one or more types of assistive outputs, including but not limited to: (i) visual outputs, (ii) audio outputs, and (iii) tactile outputs (e.g., a vibration). As shown by
If the user were alternately or additionally to select the “Audio output” option of
If the user were alternately or additionally to select the “Tactile output” option of
After configuring such preferences, the user selects the “Save Assistive Output Rule” option such that the new rule is created and added to the list of saved rules shown by
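The saved rule from the tea-kettle example pairs a trigger criterion with one or more configured outputs. A hypothetical representation, with invented field names and output content:

```python
# Hypothetical representation of the saved assistive output rule
# from the tea-kettle example; all field names are invented.

tea_kettle_rule = {
    "trigger": {"type": "sound", "label": "tea kettle"},
    "outputs": [
        {"type": "visual", "component": "display_16", "content": "kettle.png"},
        {"type": "tactile", "component": "vibration_unit", "content": "short_pulse"},
    ],
}

def execute_rule(rule, detected_label, ui_queue):
    """If the detected sound matches the rule's trigger, enqueue
    each configured output for its user interface component."""
    if detected_label == rule["trigger"]["label"]:
        for out in rule["outputs"]:
            ui_queue.append((out["component"], out["content"]))
        return True
    return False
```

When the microphone identifies the kettle sound, each configured output (visual, audio, and/or tactile) is dispatched to its component.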
With the rule now programmed, the process depicted by the flowchart of
A variety of sensors, peripherals, peripheral devices, and other technologies associated with the mobile terminal 10 may be employed when determining whether a specified event has occurred. Some examples will now be set forth in which particular technologies are paired with the detection of particular behaviors, environmental changes or properties, occurrences, and the like. Though an example is not provided for every possible pairing, this should not be construed as limiting the capacity of the software to employ all available technologies in detecting whether criteria have been satisfied.
In some embodiments, a criterion may be satisfied if a sound, a noise or human speech is detected. Either or both of an integrated microphone 26 and a peripheral microphone, such as clip-on microphone 58e, may be employed to help detect such audio occurrences. In one example of detecting speech, an integrated microphone 26 is used to detect that a particular word or phrase has been spoken. In another example, a clip-on microphone 58e is used to detect the word or phrase. In any case, an uttered word or phrase may be “matched” to a voiceprint (or waveform) stored within electronic memory. The voiceprint may be generic (e.g., the speech criterion is satisfied if the word or phrase is spoken by any person), or associated with a particular person (e.g., the speech criterion is satisfied only if the word or phrase is spoken by a particular person). Various technologies for so-called biometric voiceprint matching using mobile terminals are known in the art (e.g., the PHONEFACTOR system manufactured by PHONEFACTOR, INC of Overland Park, Kans. utilizes such technology). Of course, other sounds besides speech may be detected (e.g., human sounds such as crying or screaming, animal sounds such as a dog's bark or a bird's chirp, sounds occurring in an urban environment such as a car's horn, sounds occurring at home such as those emitted by home appliances). In one example, as described, an audio waveform detected by a microphone may be matched to a prerecorded waveform for purposes of identifying the detected waveform, using technologies similar to those incorporated by reference above. In some embodiments, a database of sounds may be stored locally by the mobile terminal 10. In other embodiments, a central server may store a database of sounds, and the mobile terminal 10 may communicate with such a database for the purposes of identifying detected sounds. One such database is maintained by COMPARISONICS® CORPORATION of Grass Valley, Calif.
Another such system, the SOLAR (Sound Object Localization and Retrieval) system as developed by researchers at the University of Illinois at Urbana-Champaign, compares sounds detected by microphones to a vast database of sound types, so as to distinguish car horns, dog barks, trains, human voices, etc. Software of the present disclosure may communicate with such a database using an API. In some embodiments, in order for a criterion to be satisfied, noise must be above a threshold decibel level (e.g., as specified by a user). In another embodiment, for a criterion to be satisfied, a sound must occur a certain number of times within a specified time period (e.g., a repeated sound suggests a user may be engaged in auditory sensory-seeking behavior).
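The two audio criteria just mentioned, a decibel floor and a repetition count within a time window, can be sketched directly. The default threshold, count, and window values are illustrative assumptions, not values from the disclosure:

```python
# Sketch of the two audio criteria described above. The 80 dB floor,
# count of 3, and 10-second window are invented example defaults.

def decibel_criterion(level_db, threshold_db=80):
    """Satisfied only when the detected noise exceeds the threshold."""
    return level_db > threshold_db

def repetition_criterion(timestamps, min_count=3, window_s=10):
    """Satisfied when the same sound occurs at least min_count times
    within any window_s-second span (possible sensory-seeking)."""
    ts = sorted(timestamps)
    for i in range(len(ts) - min_count + 1):
        if ts[i + min_count - 1] - ts[i] <= window_s:
            return True
    return False
```

Either criterion (or both in combination) could serve as the trigger half of an assistive output rule.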
In some embodiments, a criterion may be satisfied if physical movement is detected. In some embodiments, physical motion is associated with a user of the mobile terminal 10. For example, certain bodily motions performed by a user may be detected using peripheral devices 58a-f. In one example, an arm flapping motion is detected using ring 58d, watch 58c or other peripheral device attached to an arm. Waving of the hand, punching and hitting may be so detected. In one embodiment, covering of the ears or eyes may be detected using a combination of sensors (e.g., eyeglasses 58a and ring 58d), with such actions suggesting the user may be engaged in sensory-avoidant behavior (e.g., the environment is too bright or too loud for the user to effectively process). In another example, a peripheral anklet may transmit data to the mobile terminal 10 that suggests the user may be running, kicking, or shaking his or her leg. In another example, rocking in place or shaking of the head are detected using eyeglasses 58a (e.g., a motion sensor, infrared sensor, RFID transponder, or other sensor is embedded within the eyeglasses, communicating relative head position to the mobile terminal 10). In one example, eyeglasses 58e are used to determine that the user's gaze is oriented toward the mobile terminal (e.g., using camera 60, an RFID transponder, an infrared sensor, or the like).
In another example, a separate peripheral device associated with an object, person or area (e.g., an RFID receiver or other sensor placed in a particular location) may be used in conjunction with eyeglasses 58e (e.g., worn by a disabled individual) to determine that a disabled individual's gaze is indeed oriented in the direction of the object, person or area (e.g., for at least a threshold percentage of time during a particular time period); for example, if a user's gaze is affixed in a direction away from a source of light, sound, or other stimulus, it may be determined that the user is engaged in sensory-avoidant behavior.
Further, in some embodiments, any of the above motions may be detected without the use of a separate peripheral device. For example, an integrated camera 24 or motion sensor may detect motion activity (e.g., as facilitated by the hanging of mobile terminal 10 around a user's neck using lanyard 62, or by the attachment of the mobile terminal 10 to a belt clip, the sensor is oriented to detect motion activity occurring in front of the user). In some embodiments, motion must last for a predetermined duration, or occur at a particular velocity, for a criterion to be satisfied. Accelerometer 34 and/or an integrated altimeter may detect that the mobile terminal 10 (and perhaps thus its user) has fallen to the ground. Also, direct physical interaction with the mobile terminal 10 may be considered. For example, a pressure sensor associated with a button or screen of the mobile terminal 10 may detect an excessive amount of force, an accelerometer 34 may detect excessive motion, or a light sensor may detect a pattern indicative of tactile sensory-seeking behavior (e.g., the light sensor is repeatedly covered and uncovered). In another example, a user repeatedly attempts to lower the volume associated with speaker 14 or a peripheral headset, suggesting auditory overstimulation. In some embodiments, combinations of several of the above behaviors may lead the software to conclude that criteria have been satisfied. For example, simultaneous crying above a threshold decibel level and physical mishandling of the mobile terminal 10 are illustrative of tactile sensory-seeking. In another example, tactile interaction with the mobile terminal 10 is first detected (e.g., using a pressure sensor), and then accelerometer 34 determines the device has been dropped or put down, suggesting tactile avoidance.
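Two of the motion criteria above, sustained motion lasting a predetermined duration, and a fall inferred from an acceleration spike followed by stillness, might be sketched as below. All thresholds and names are illustrative assumptions.

```python
def sustained_motion(samples, threshold=1.5, min_duration=2.0):
    """samples: list of (timestamp_s, acceleration_magnitude_g) tuples.
    True if acceleration stays at or above `threshold` g continuously
    for at least `min_duration` seconds."""
    start = None
    for t, a in samples:
        if a >= threshold:
            if start is None:
                start = t  # beginning of a continuous motion episode
            if t - start >= min_duration:
                return True
        else:
            start = None  # motion interrupted; reset the episode
    return False


def fall_detected(samples, spike=3.0, still=0.2):
    """True if a spike above `spike` g is later followed by a near-still
    reading at or below `still` g (a crude fall signature)."""
    saw_spike = False
    for _, a in samples:
        if a >= spike:
            saw_spike = True
        elif saw_spike and a <= still:
            return True
    return False
```

In practice the software might run both checks over a rolling buffer of accelerometer 34 readings, with the thresholds configurable per user profile.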
In some embodiments, the detection of a nearby person or object may satisfy a criterion. In one example, such a person or object may wear or carry a device equipped with a sensor, and the sensor may in turn communicate the presence of the object or person to the mobile terminal 10. For example, one or more sensors or other technologies associated with a first mobile terminal 10 (e.g., carried by a disabled individual) may be used to determine that a second mobile terminal 10 or peripheral device (e.g., carried by another individual) is present. Many technologies may be employed to accomplish such a goal, including a motion sensor, an infrared sensor, RFID, GPS, and triangulation of geolocation using WiFi or cellular network nodes. In one such embodiment, a first and second mobile terminal 10 may be registered (e.g., using software of the present disclosure) as “buddy” or “friend” devices (e.g., to facilitate the determination that a specific person or “friend” is near a disabled individual at a particular time). In another embodiment, a microphone may be used to detect a voiceprint associated with another person (e.g., a specific caregiver) or sound associated with a particular object.
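The “buddy” registration idea above might be modeled minimally as follows: the terminal keeps a set of registered friend-device identifiers, and the proximity criterion is satisfied when any detected device identifier belongs to that set. The class name, identifiers, and detection feed are hypothetical.

```python
class BuddyRegistry:
    """Registry of 'friend' device IDs for proximity criteria."""

    def __init__(self, friend_ids=()):
        self.friend_ids = set(friend_ids)

    def register(self, device_id):
        """Register another terminal or peripheral as a buddy device."""
        self.friend_ids.add(device_id)

    def friend_nearby(self, detected_ids):
        """True if any device detected nearby is a registered buddy."""
        return bool(self.friend_ids & set(detected_ids))
```

The `detected_ids` feed would come from whichever underlying technology is in use (RFID, WiFi, etc.); the criterion check itself is independent of that choice.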
In another embodiment, an optical camera, such as an integrated camera 24, or a camera associated with a peripheral device, such as camera 60, may be used to detect the presence of a particular person or object. Either or both of a still camera and video camera may be so utilized, and such cameras may capture images periodically, continuously, upon demand of a user, or in response to instructions stored within the software (e.g., an instruction states to actuate a camera in response to a loud spike in ambient volume as detected by a microphone). For example, image comparison search may be employed, allowing for images captured by a camera to be matched with those included within a database. In one embodiment, the database is stored locally. For example, a user takes a digital photograph and stores it within memory of the mobile terminal 10, such that a camera may capture a new image and compare it to the stored digital photograph (e.g., if a match is detected, a criterion is satisfied). In another embodiment, the database is maintained by a central server communicatively coupled with the mobile terminal 10. An example of a large online database of images used for image-comparison purposes is the TINEYE REVERSE IMAGE SEARCH ENGINE created by IDEE, INC of Toronto, Canada, which includes a commercial API allowing third party software applications (such as the software of the present disclosure) to communicate with the database. In this manner, captured images may be analyzed to detect particular objects or people, and such detection may satisfy a criterion related to an assistive output rule. 
For example, the detection of a fire or stove may be beneficial to a person with a thermoceptive deficit, the detection of a dog may be beneficial to someone with a known tactile hypersensitivity to fur, the detection of a flower may be beneficial to someone with a poor sense of smell, or the detection of a particular type of food may be beneficial to someone with an impaired gustatory sense. Again, effective positioning of the mobile terminal and/or its peripherals may assist in such detections; for example, a peripheral camera 60 must be oriented outward from the user for objects in front of the user to be detected.
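The local-database image-match criterion described above might be sketched as below: captured images are reduced to small grayscale thumbnails and compared to stored reference thumbnails by mean absolute pixel difference. Real systems (such as the TinEye service mentioned above) use far more robust features; the representation and threshold here are illustrative assumptions only.

```python
def mean_abs_diff(a, b):
    """a, b: equal-length sequences of 0-255 grayscale pixel values."""
    return sum(abs(x - y) for x, y in zip(a, b)) / len(a)


def image_criterion_satisfied(captured, reference_db, max_diff=10.0):
    """True if `captured` matches any stored reference thumbnail to within
    an average per-pixel difference of `max_diff`."""
    return any(mean_abs_diff(captured, ref) <= max_diff for ref in reference_db)
```

In the locally stored variant, `reference_db` would hold thumbnails of the user's own photographs; in the server-backed variant, the comparison would instead be delegated to a remote API.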
In some embodiments, an optical camera associated with mobile terminal 10 may be used to recognize text, and a criterion may be satisfied upon the detection of text (whether any text, or a specific word or phrase). For example, Optical Character Recognition (OCR) technology may be used to identify letters. Methods for using OCR technology are well known in the mobile terminal art. For example, a user with a NOKIA phone (such as the N95) running the SYMBIAN operating system may take advantage of the NOKIA MULTISCANNER OCR feature, pointing the phone's optical camera at physical text, such that the text is recognized by the phone's software and output via the phone's display. Recognized text may then be saved, and the language library used to help identify words may be updated over the air. Such technology may be employed to assist a user with a vision deficit, who may encounter objects with physical text in his or her natural environment (e.g., books, signs, etc.), but may be challenged or unable to read such text.
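Once an OCR engine (such as the Nokia Multiscanner feature noted above) has produced recognized text, the criterion check itself can be simple. The sketch below assumes the OCR step is external and shows only the matching rule: any non-empty text, or a specific word or phrase matched case-insensitively on word boundaries.

```python
import re


def text_criterion_satisfied(recognized_text, target_phrase=None):
    """If `target_phrase` is None, any non-empty recognized text satisfies
    the criterion; otherwise the phrase must appear as whole words,
    case-insensitively."""
    if target_phrase is None:
        return bool(recognized_text.strip())
    pattern = r"\b" + re.escape(target_phrase) + r"\b"
    return re.search(pattern, recognized_text, re.IGNORECASE) is not None
```

For example, detecting the sign text “Baseball Field Ahead” against a configured phrase “baseball field” would satisfy the criterion and could trigger the routing output described below for a user with a vision deficit.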
In some embodiments, the detection of a specific amount of light in the environment proximate to the mobile terminal 10 may satisfy a criterion. For example, a light sensor may detect a particular abundance or lack of light during a period of time (e.g., indicating that the mobile terminal is in direct sunlight, or in a dark environment).
In some embodiments, the detection of a specific environmental temperature may satisfy a criterion. For example, a thermometer may be used to determine that the mobile terminal is in a particularly cold environment (e.g., beneath 40° F.). In another example, a directional temperature sensor may determine that a nearby object is at or above a threshold temperature (e.g., a nearby stove, fire, or other object is excessively hot). In another example, a Really Simple Syndication (RSS) feed may be monitored such that weather trends associated with the mobile terminal's current environment are known.
In one embodiment, the detection of an altitude or altitude change may satisfy a criterion. For example, an altimeter may be used to determine that an altitude associated with the mobile terminal has dropped rapidly within a short period of time. In another embodiment, an altimeter may detect that the mobile terminal 10 is currently at a high altitude, which in and of itself may affect the sensory capacities of a user.
In one embodiment, the detection of a barometric pressure or change in barometric pressure may satisfy a criterion. For example, a barometer detects a change in atmospheric pressure, which satisfies a criterion.
In some embodiments, data concerning the geographic location of the mobile terminal 10 may satisfy a criterion. As is known in the mobile terminal art, a current geographic location may be determined using GPS technology, triangulation using WiFi or cellular network nodes, or the like. In one example, a criterion may be satisfied if the mobile terminal 10 is within proximity of a particular geographic location at a particular time. In another example, a criterion may be satisfied if the mobile terminal 10 has strayed outside of a geographic “safe zone” expressly programmed by a user (e.g., a 10 mile radius surrounding school and home). In another example, a criterion is satisfied if the mobile terminal 10 has been detected within sufficient range of three separate geographic areas within a particular time (e.g., a child carrying the mobile terminal has gone from school, to an after school program, and finally to home on a particular afternoon). In another example, a criterion is satisfied if the mobile terminal 10 has been taken to a “new” geographic location as per a database storing previously visited locations. In another embodiment, using GPS, WiFi triangulation, and/or cellular triangulation, the current location of a mobile terminal 10 is determined to be in an area characterized by particular sensory properties (e.g., a bus station is loud and dark; a library is quiet and bright; an amusement park implicates vestibular stimulation; a beach is hot, wet, bright and sandy; etc.).
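The “safe zone” criterion above might be sketched with the standard haversine formula, which gives the great-circle distance between the terminal's current GPS fix and the zone center. The 10-mile radius mirrors the example in the text; the function names and coordinates are illustrative assumptions.

```python
import math

EARTH_RADIUS_MI = 3958.8  # mean Earth radius in miles


def haversine_mi(lat1, lon1, lat2, lon2):
    """Great-circle distance in miles between two (lat, lon) points."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2)
    return 2 * EARTH_RADIUS_MI * math.asin(math.sqrt(a))


def outside_safe_zone(current, center, radius_mi=10.0):
    """True (criterion satisfied) if the terminal has strayed outside the
    safe zone centered at `center`."""
    return haversine_mi(*current, *center) > radius_mi
```

A zone spanning both school and home could be handled by checking several centers, or by enlarging the radius around a midpoint.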
In some embodiments, a property or state associated with a user's body (e.g., a biometric reading), or a change associated therewith, may satisfy a criterion. For example, a change in a user's heart rate or other property may suggest the user is nervous or excited, which in conjunction with another occurrence (e.g., a loud noise), may suggest that the user is struggling with sensory regulation. In some embodiments, a criterion is satisfied based on heart rate data associated with a user of the mobile terminal 10. For example, a user's heart rate may be monitored using a patch 58b, wristwatch 58c, ring 58d, or other wearable device that may detect and transmit such data (e.g., a bracelet, anklet, wristband, etc.). A criterion may be satisfied by a specified increase in, decrease in, or consistent level of beats per minute (e.g., a user's heart rate remains beneath 60 beats per minute for a period of time). In some embodiments, a criterion is satisfied based on blood pressure data associated with a user of the mobile terminal 10. For example, a user's blood pressure may be monitored using a patch 58b, wristwatch 58c, ring 58d, or other wearable device, such as an armband. A criterion may be satisfied by a specified increase in, decrease in, or consistent level of pressure (e.g., a user's blood pressure spikes above 130/80). In some embodiments, a criterion is satisfied based on body temperature data associated with a user of the mobile terminal 10. For example, a user's body temperature may be monitored using a peripheral device described herein (e.g., a patch 58b, wristwatch 58c, or ring 58d comprising an infrared or heat sensor). A criterion may be satisfied by a specified increase in, decrease in, or consistent body or skin temperature. In some embodiments, a criterion is satisfied based on electrodermal or galvanic skin response (GSR) data associated with a user of the mobile terminal 10.
Such data are used in the art to monitor and infer human emotions (e.g., fear, anger, anxiety, startle response, orienting response). GSR data may be read and transmitted using a peripheral device, with the device enabled to measure electrical resistance between two points on the user's skin. Commercially available sensors such as the GSR 2™ as produced by THOUGHT TECHNOLOGY LTD of Plattsburgh, N.Y. may be adapted for purposes of this disclosure. A criterion may be satisfied based on an increase or decrease in electrodermal activity (e.g., a sharp increase may suggest a user has grown fearful, anxious or disoriented).
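Two of the biometric criteria above, a heart rate that remains beneath 60 beats per minute for a period of time, and a blood pressure spike above 130/80, might be sketched as follows. Names, thresholds, and the reading format are assumptions for illustration.

```python
def heart_rate_criterion(readings, below=60, min_duration_s=120):
    """readings: list of (timestamp_s, bpm) tuples.
    True if bpm stays beneath `below` for at least `min_duration_s`
    consecutive seconds."""
    start = None
    for t, bpm in readings:
        if bpm < below:
            if start is None:
                start = t  # start of a continuous low-rate episode
            if t - start >= min_duration_s:
                return True
        else:
            start = None  # episode broken; reset
    return False


def blood_pressure_spike(systolic, diastolic, limit=(130, 80)):
    """True if either value exceeds the example 130/80 threshold."""
    return systolic > limit[0] or diastolic > limit[1]
```

Analogous checks for body temperature or GSR would follow the same pattern: compare a stream of wearable-sensor readings against a per-user threshold from the profile.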
In some embodiments, a criterion may be satisfied based (in part or in whole) on the occurrence of a particular time or date. For example, a caregiver who knows that a user with SPD is scheduled to attend occupational therapy on Wednesday at 4:30 p.m. might create an assistive output in conjunction with this occasion. In one example, the assistive output occurs before the occasion, to prepare the user mentally for the upcoming experience (e.g., a display screen reads, “Almost time for OT! Remember, one leg at a time when you're on the ropes!” and presents a picture of a rope ladder). In another example, a specific time frame is set (e.g., Wednesday between 4:30 and 4:40), such that if a criterion is satisfied during the specified time period, an assistive output should be produced (e.g., if a camera detects a trampoline during the time period, or an accelerometer detects a bouncing motion during the time period, a previously recorded video of the user successfully jumping on the trampoline is output via display 16, to model to the user an appropriate behavior when using the trampoline). In other words, in some embodiments, both (i) a criterion related to input from a sensor and (ii) a time/date criterion must be satisfied for an assistive output to be produced. An internal clock or calendar coupled to CPU 28 may assist with the determination of a current time and/or date.
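The compound rule just described, where a sensor criterion AND a time-window criterion must both hold, might be sketched as below using Python's standard `datetime` module. The Wednesday 4:30-4:40 p.m. window mirrors the example in the text; all function names are illustrative.

```python
from datetime import datetime, time


def in_window(now, weekday, start, end):
    """True if `now` falls on `weekday` (0 = Monday) between `start` and `end`."""
    return now.weekday() == weekday and start <= now.time() <= end


def assistive_output_due(sensor_triggered, now,
                         weekday=2, start=time(16, 30), end=time(16, 40)):
    """Both conditions must hold: the sensor criterion fired AND the current
    time falls inside the configured window (default: Wed 4:30-4:40 p.m.)."""
    return sensor_triggered and in_window(now, weekday, start, end)
```

The sensor side of the conjunction (e.g., a camera detecting a trampoline, or an accelerometer detecting bouncing) would set `sensor_triggered`; the clock coupled to CPU 28 supplies `now`.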
Having described a wide array of criteria that may trigger an assistive output, and manners for detecting the satisfaction of such criteria, the discussion now turns to block 112 of the flowchart of
Generally speaking, certain communication types of assistive outputs may make particular sense when matched to certain sensory deficits and proficiencies received in blocks 104 and 106 of
In the first example, illustrated by
In the second example, illustrated by
In the third example, illustrated by
In the fourth example, illustrated by
Of course, many other examples exist of sensory-translational, therapeutic and instructional outputs in response to a variety of events. In one example, an individual with poor thermoceptive capability carries a mobile terminal, and a thermometer detects a sudden and extreme increase in heat; audio speaker 14 then indicates “It's hot in here—please be careful” while a display screen 16 shows an image of a thermometer, and a map of the environment indicating the location of “hot spots”. In another example, a user suffers from a visual impairment, resulting in an inability to read. An optical camera detects a sign that says “Baseball Field Ahead”. Headphones 58f may then output instructions routing the user to the baseball field (e.g., in conjunction with GPS technology). In another example, a caregiver specifies that a disabled user of a mobile terminal 10 is hypersensitive to certain types of sounds (e.g., sounds of crowds and heavy machinery). The mobile terminal 10 may then detect using GPS that it is approaching a geographic location known for such types of sounds (e.g., a shopping mall, a busy downtown area, a construction zone), and output an instruction to avoid the area. In another example, a sensor detects proximity to a nearby object on behalf of a user who is hyposensitive to visual-spatial sensation (depth perception), and alerts the individual to the presence of the object using audio. In another example, a user suffers impairments to the vestibular system, resulting in poor balance and motor planning. Over time, an accelerometer 34 and/or altimeter detect that the user seems to frequently lose his or her balance after eating. Accordingly, immediately after lunch every day, a “balance game” is output, focusing the user on balancing an object shown by the mobile terminal's display screen 16 (e.g., balance a bowling ball as it rolls down an alley), honing vestibular regulation.
In another example, a user is hyposensitive to touch, and commonly engages in tactile-sensory-seeking behavior by banging or otherwise aggressively manipulating objects. A pressure sensor detects the user is clasping the mobile terminal 10 and/or pressing buttons with unnecessary force, and so the mobile terminal 10 vibrates vigorously such that the user receives the tactile sensory stimulation he or she may have been seeking. In another example, a user may suffer from a proprioceptive deficit resulting in frequent fits of disorientation. An increased heart rate and increased level of electrodermal activity are detected. In response, an image of a caregiver is output via the display screen 16, along with a pre-recorded voice recording in which the caregiver says “It's OK, take a deep breath and sit down for a minute”. In another example, a user has impairments related to both vision and equilibrioception. Whenever a high degree of motion is detected by a motion sensor, a warning is output for the user to proceed with caution.
The methods described above lend themselves to a database of saved user profiles. Such a database may store software settings in association with particular users (e.g., assistive output rules in accordance with “Jimmy Smith”). One exemplary user profile database is depicted by
The methods described above may also lend themselves to a database of saved sensory deficiency and/or proficiency data associated with a user (i.e., a “sensory profile database”). Turning to
In some embodiments, one or more fields of such a database may be updated based on the user's interaction with a mobile terminal. For example, initially, a user may hear sounds at 2000 Hz only when they occur at or above a level of 40 dB (e.g., database fields for “LEFT EAR @ 2000 Hz” and “RIGHT EAR @ 2000 Hz” may both store an indication of “40 dB,” based on a first sensory evaluation conducted when the user began using the software). However, over time, the user's level of hearing may decline. Such a decline may be detected (i) in response to normal use of the mobile terminal (e.g., as sounds are periodically output at a frequency of 2000 Hz, the software determines that the user is less and less responsive to them), and/or (ii) as a result of a second sensory evaluation (e.g., one month after the user performs a first hearing test, a second hearing test is conducted, producing a new behavioral audiogram, such that the sensory profile database may be updated with new information on the user's hearing). For example, if the user's ability to hear sounds emitted at a frequency of 2000 Hz has decreased to the point where only sounds above a level of 50 dB are heard, the sensory profile database may be so updated. Of course, other sensory deficit and proficiency information besides hearing information may be so updated (e.g., a user is originally thought to be severely tactile defensive, but over time does not demonstrate dissatisfaction in response to the mobile terminal's vibrations (the user does not drop the device or express an emotional outburst), and so the user's tactile defensiveness may be downgraded from severe).
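The hearing-threshold update just described might be sketched as follows: the profile stores the quietest level (in dB) heard at each tested frequency, and a re-evaluation overwrites a field only when the new audiogram differs. The field names mirror the example in the text, but the schema itself is an assumption.

```python
def update_hearing_profile(profile, new_audiogram):
    """profile, new_audiogram: dicts mapping field name -> threshold in dB
    (e.g., "LEFT EAR @ 2000 Hz" -> 40).

    Updates `profile` in place and returns the list of fields that changed.
    """
    changed = []
    for field, new_db in new_audiogram.items():
        if profile.get(field) != new_db:
            profile[field] = new_db
            changed.append(field)
    return changed
```

The returned change list could also feed the activity log discussed below, so a caregiver can see when and how a user's sensory profile was revised.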
In other embodiments, one or more fields of a sensory profile database may be updated based on data specifically entered by a user related to a deficit. For example, a user may take a new survey, or otherwise enter or import new information.
In any case, as the data in the sensory profile database of
While the above discussion provides a particularly contemplated set of embodiments, the present disclosure is not so limited. Numerous alternate embodiments are also possible and within the scope of the present disclosure. These alternate embodiments are not necessarily mutually exclusive with one another or with the embodiments set forth above. Rather, components of the various embodiments may be mixed and matched as needed or desired. In one alternate embodiment, when a user configures an assistive output (e.g., as shown by the screen of
In some embodiments, a log of assistive output activity may be maintained. For example, each time an assistive output is triggered, an indication of the output may be stored in a database (not shown). The database may store: (i) a time/date when the assistive output was generated, (ii) the trigger for the assistive output, (iii) the type of assistive output, and (iv) information related to events that transpired before or after the assistive output was produced (e.g., the database stores an electronic audio or video file of the user reacting to the output, an indication of any buttons pressed on the mobile terminal 10 after the output was produced, the current temperature or level of lighting at the time of the output, etc.). Thus, those not present at the time of the assistive output (e.g., caregivers) may review the context in which a particular assistive output is produced, or may review the data for trends (e.g., certain types of assistive outputs are less common during the past month, perhaps indicating one of the user's sensory deficits is improving). For example, a child's log indicates he more frequently covers his ears when the source of the noise is not man-made (e.g., tractors, buses, televisions), and seems to struggle with his balance more in the morning than at night. In another example, a user is more commonly bothered by bright environments when the environment is also noisy, or it is winter time. Such data may be accessible through the user's mobile terminal 10, or may be uploaded or transmitted to another mobile terminal 10 or computer 54. For example, the message shown by the screen of
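The log structure above (time, trigger, output type, and surrounding context, plus trend review) might be modeled minimally as follows. The class and field names are assumptions for illustration.

```python
from datetime import datetime


class AssistiveOutputLog:
    """Minimal sketch of the assistive-output activity log."""

    def __init__(self):
        self.entries = []

    def record(self, trigger, output_type, context=None, when=None):
        """Store one log entry: what fired, what was output, and any
        surrounding context (temperature, lighting, button presses, etc.)."""
        self.entries.append({
            "time": when or datetime.now(),
            "trigger": trigger,
            "output_type": output_type,
            "context": context or {},
        })

    def count_by_trigger(self, trigger):
        """Support trend review, e.g. how often a given trigger fired."""
        return sum(1 for e in self.entries if e["trigger"] == trigger)
```

A caregiver reviewing trends would query such a log (locally or after upload to computer 54) rather than observing outputs in real time.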
In some embodiments, a remote observant may monitor the environment of a disabled individual. For example, a video or still camera transmits images from the disabled individual's mobile terminal 10 to a separate computing device (e.g., another mobile terminal or a personal computer), such that a remote observant (e.g., caregiver, therapist, call center employee) can view the images being sent in substantially real time. For example, an audio feed is transmitted to a remote parent, who can “listen in” to her child's environment. In another example, data from various sensors of the mobile terminal 10 may be uploaded to a central monitoring Web site allowing those with access to view current environmental data associated with a disabled individual.
Numerous embodiments are described in this disclosure, and are presented for illustrative purposes only. The described embodiments are not, and are not intended to be, limiting in any sense. The presently disclosed invention(s) are widely applicable to numerous embodiments, as is readily apparent from the disclosure. One of ordinary skill in the art will recognize that the disclosed invention(s) may be practiced with various modifications and alterations, such as structural, logical, software, and electrical modifications. Although particular features of the disclosed invention(s) may be described with reference to one or more particular embodiments and/or drawings, it should be understood that such features are not limited to usage in the one or more particular embodiments or drawings with reference to which they are described, unless expressly specified otherwise.
The present disclosure is neither a literal description of all embodiments nor a listing of features of the invention that must be present in all embodiments.
Neither the Title (set forth at the beginning of the first page of this disclosure) nor the Abstract (set forth at the end of this disclosure) is to be taken as limiting in any way as the scope of the disclosed invention(s).
The term “product” means any machine, manufacture and/or composition of matter as contemplated by 35 U.S.C. §101, unless expressly specified otherwise.
The terms “an embodiment”, “embodiment”, “embodiments”, “the embodiment”, “the embodiments”, “one or more embodiments”, “some embodiments”, “one embodiment” and the like mean “one or more (but not all) disclosed embodiments”, unless expressly specified otherwise.
The terms “the invention” and “the present invention” and the like mean “one or more embodiments of the present invention.”
A reference to “another embodiment” in describing an embodiment does not imply that the referenced embodiment is mutually exclusive with another embodiment (e.g., an embodiment described before the referenced embodiment), unless expressly specified otherwise.
The terms “including”, “comprising” and variations thereof mean “including but not limited to”, unless expressly specified otherwise.
The terms “a”, “an” and “the” mean “one or more”, unless expressly specified otherwise.
The term “plurality” means “two or more”, unless expressly specified otherwise.
The term “herein” means “in the present disclosure, including anything which may be incorporated by reference”, unless expressly specified otherwise.
The phrase “at least one of”, when such phrase modifies a plurality of things (such as an enumerated list of things) means any combination of one or more of those things, unless expressly specified otherwise. For example, the phrase at least one of a widget, a car and a wheel means either (i) a widget, (ii) a car, (iii) a wheel, (iv) a widget and a car, (v) a widget and a wheel, (vi) a car and a wheel, or (vii) a widget, a car and a wheel.
The phrase “based on” does not mean “based only on”, unless expressly specified otherwise. In other words, the phrase “based on” describes both “based only on” and “based at least on”.
Where a limitation of a first claim would cover one of a feature as well as more than one of a feature (e.g., a limitation such as “at least one widget” covers one widget as well as more than one widget), and where in a second claim that depends on the first claim, the second claim uses a definite article “the” to refer to the limitation (e.g., “the widget”), this does not imply that the first claim covers only one of the feature, and this does not imply that the second claim covers only one of the feature (e.g., “the widget” can cover both one widget and more than one widget).
Each process (whether called a method, algorithm or otherwise) inherently includes one or more steps, and therefore all references to a “step” or “steps” of a process have an inherent antecedent basis in the mere recitation of the term ‘process’ or a like term. Accordingly, any reference in a claim to a ‘step’ or ‘steps’ of a process has sufficient antecedent basis.
When an ordinal number (such as “first”, “second”, “third” and so on) is used as an adjective before a term, that ordinal number is used (unless expressly specified otherwise) merely to indicate a particular feature, such as to distinguish that particular feature from another feature that is described by the same term or by a similar term. For example, a “first widget” may be so named merely to distinguish it from, e.g., a “second widget”. Thus, the mere usage of the ordinal numbers “first” and “second” before the term “widget” does not indicate any other relationship between the two widgets, and likewise does not indicate any other characteristics of either or both widgets. For example, the mere usage of the ordinal numbers “first” and “second” before the term “widget” (1) does not indicate that either widget comes before or after any other in order or location; (2) does not indicate that either widget occurs or acts before or after any other in time; and (3) does not indicate that either widget ranks above or below any other, as in importance or quality. In addition, the mere usage of ordinal numbers does not define a numerical limit to the features identified with the ordinal numbers. For example, the mere usage of the ordinal numbers “first” and “second” before the term “widget” does not indicate that there must be no more than two widgets.
When a single device or article is described herein, more than one device or article (whether or not they cooperate) may alternatively be used in place of the single device or article that is described. Accordingly, the functionality that is described as being possessed by a device may alternatively be possessed by more than one device or article (whether or not they cooperate).
Similarly, where more than one device or article is described herein (whether or not they cooperate), a single device or article may alternatively be used in place of the more than one device or article that is described. For example, a plurality of computer-based devices may be substituted with a single computer-based device. Accordingly, the various functionality that is described as being possessed by more than one device or article may alternatively be possessed by a single device or article.
The functionality and/or the features of a single device that is described may be alternatively embodied by one or more other devices that are described but are not explicitly described as having such functionality and/or features. Thus, other embodiments need not include the described device itself, but rather can include the one or more other devices which would, in those other embodiments, have such functionality/features.
Devices that are in communication with each other need not be in continuous communication with each other, unless expressly specified otherwise. On the contrary, such devices need only transmit to each other as necessary or desirable, and may actually refrain from exchanging data most of the time. For example, a machine in communication with another machine via the Internet may not transmit data to the other machine for weeks at a time. In addition, devices that are in communication with each other may communicate directly or indirectly through one or more intermediaries.
A description of an embodiment with several components or features does not imply that all or even any of such components and/or features are required. On the contrary, a variety of optional components are described to illustrate the wide variety of possible embodiments of the present disclosure. Unless otherwise specified explicitly, no component and/or feature is essential or required.
Further, although process steps, algorithms or the like may be described in a sequential order, such processes may be configured to work in different orders. In other words, any sequence or order of steps that may be explicitly described does not necessarily indicate a requirement that the steps be performed in that order. The steps of processes described herein may be performed in any order practical. Further, some steps may be performed simultaneously despite being described or implied as occurring non-simultaneously (e.g., because one step is described after the other step). Moreover, the illustration of a process by its depiction in a drawing does not imply that the illustrated process is exclusive of other variations and modifications thereto, does not imply that the illustrated process or any of its steps are necessary to the invention, and does not imply that the illustrated process is preferred.
Although a process may be described as including a plurality of steps, that does not indicate that all or even any of the steps are essential or required. Various other embodiments within the scope of the described invention(s) include other processes that omit some or all of the described steps. Unless otherwise specified explicitly, no step is essential or required.
Although a product may be described as including a plurality of components, aspects, qualities, characteristics and/or features, that does not indicate that all of the plurality are essential or required. Various other embodiments within the scope of the described invention(s) include other products that omit some or all of the described plurality.
An enumerated list of items (which may or may not be numbered) does not imply that any or all of the items are mutually exclusive, unless expressly specified otherwise. Likewise, an enumerated list of items (which may or may not be numbered) does not imply that any or all of the items are comprehensive of any category, unless expressly specified otherwise. For example, the enumerated list “a computer, a laptop, a PDA” does not imply that any or all of the three items of that list are mutually exclusive and does not imply that any or all of the three items of that list are comprehensive of any category.
Headings of sections provided in this disclosure are for convenience only, and are not to be taken as limiting the disclosure in any way.
“Determining” something can be performed in a variety of manners and therefore the term “determining” (and like terms) includes calculating, computing, deriving, looking up (e.g., in a table, database or data structure), ascertaining, recognizing, and the like.
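As a brief illustration of this definition (all names and values here are hypothetical, not drawn from the disclosure), either of the following would count as "determining" an output intensity for a user — one looks it up in a table, the other calculates it:

```python
# Two interchangeable ways of "determining" an output intensity;
# both fall under the term as defined above. All names and values
# are hypothetical illustrations.

INTENSITY_TABLE = {"mild": 1, "moderate": 2, "severe": 3}

def determine_by_lookup(severity: str) -> int:
    """Determine intensity by looking it up in a table."""
    return INTENSITY_TABLE[severity]

def determine_by_calculation(severity_score: float) -> int:
    """Determine intensity by calculating from a numeric score,
    clamped to the same 1..3 range as the table."""
    return min(3, max(1, round(severity_score)))

# Both manners of determining yield the same result here.
assert determine_by_lookup("moderate") == determine_by_calculation(2.2) == 2
```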
A “display” as that term is used herein is an area that conveys information to a viewer. The information may be dynamic, in which case, an LCD, LED, CRT, LDP, rear projection, front projection, or the like may be used to form the display. The aspect ratio of the display may be 4:3, 16:9, or the like. Furthermore, the resolution of the display may be any appropriate resolution such as 480i, 480p, 720p, 1080i, 1080p or the like. The format of information sent to the display may be any appropriate format such as standard definition (SDTV), enhanced definition (EDTV), high definition (HD), or the like. The information may likewise be static, in which case, painted glass may be used to form the display. Note that static information may be presented on a display capable of displaying dynamic information if desired.
The present disclosure frequently refers to a “control system”. A control system, as that term is used herein, may be a computer processor coupled with an operating system, device drivers, and appropriate programs (collectively “software”) with instructions to provide the functionality described for the control system. The software is stored in an associated memory device (sometimes referred to as a computer readable medium). While it is contemplated that an appropriately programmed general purpose computer or computing device may be used, it is also contemplated that hard-wired circuitry or custom hardware (e.g., an application specific integrated circuit (ASIC)) may be used in place of, or in combination with, software instructions for implementation of the processes of various embodiments. Thus, embodiments are not limited to any specific combination of hardware and software.
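By way of a hedged sketch of what such software might look like — none of the class, rule, or function names below come from the disclosure, and a real control system could be structured quite differently — a control system could evaluate an assistive output rule as follows:

```python
# Minimal sketch of a control system evaluating an assistive output
# rule: a sensor event associated with a sensory deficit is checked
# against a criterion and, if satisfied, routed to a user interface
# component matching the user's sensory proficiency. All names
# (Profile, route_event, OUTPUT_RULES) are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class Profile:
    deficit: str        # e.g., "hearing"
    proficiency: str    # e.g., "touch"

# Hypothetical rule table: (deficit, proficiency) -> UI component.
OUTPUT_RULES = {
    ("hearing", "touch"): "vibration_unit",
    ("hearing", "vision"): "display_screen",
    ("vision", "hearing"): "audio_speaker",
}

def route_event(profile: Profile, sensor: str, level: float, threshold: float):
    """Return the UI component to employ, or None if the criterion
    is not met or the sensor is unrelated to the user's deficit."""
    related = {"hearing": "sound", "vision": "light"}
    if related.get(profile.deficit) != sensor:
        return None                      # event not tied to the deficit
    if level < threshold:
        return None                      # criterion not satisfied
    return OUTPUT_RULES[(profile.deficit, profile.proficiency)]

user = Profile(deficit="hearing", proficiency="touch")
print(route_event(user, "sound", level=85.0, threshold=70.0))  # vibration_unit
```

The same logic could equally be realized in hard-wired circuitry or an ASIC, as noted above; the sketch only illustrates the software path.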
A “processor” means any one or more microprocessors, CPU devices, computing devices, microcontrollers, digital signal processors, or like devices. Exemplary processors are the INTEL PENTIUM or AMD ATHLON processors.
The term “computer-readable medium” refers to any medium that participates in providing data (e.g., instructions) that may be read by a computer, a processor or a like device. Such a medium may take many forms, including but not limited to, non-volatile media, volatile media, and transmission media. Non-volatile media include, for example, optical or magnetic disks and other persistent memory. Volatile media include DRAM, which typically constitutes the main memory. Transmission media include coaxial cables, copper wire and fiber optics, including the wires that comprise a system bus coupled to the processor. Transmission media may include or convey acoustic waves, light waves and electromagnetic emissions, such as those generated during RF and IR data communications. Common forms of computer-readable media include, for example, a floppy disk, a flexible disk, hard disk, magnetic tape, any other magnetic medium, a CD-ROM, DVD, any other optical medium, punch cards, paper tape, any other physical medium with patterns of holes, a RAM, a PROM, an EPROM, a FLASH-EEPROM, a USB memory stick, a dongle, any other memory chip or cartridge, a carrier wave, or any other medium from which a computer can read.
Various forms of computer readable media may be involved in carrying sequences of instructions to a processor. For example, sequences of instruction (i) may be delivered from RAM to a processor, (ii) may be carried over a wireless transmission medium, and/or (iii) may be formatted according to numerous formats, standards or protocols. For a more exhaustive list of protocols, the term “network” is defined below and includes many exemplary protocols that are also applicable here.
It will be readily apparent that the various methods and algorithms described herein may be implemented by a control system, and that the instructions of the software may be designed to carry out the processes of the present disclosure.
Where databases are described, it will be understood by one of ordinary skill in the art that (i) alternative database structures to those described may be readily employed, and (ii) other memory structures besides databases may be readily employed. Any illustrations or descriptions of any sample databases presented herein are illustrative arrangements for stored representations of information. Any number of other arrangements may be employed besides those suggested by, e.g., tables illustrated in drawings or elsewhere. Similarly, any illustrated entries of the databases represent exemplary information only; one of ordinary skill in the art will understand that the number and content of the entries can be different from those described herein. Further, despite any depiction of the databases as tables, other formats (including relational databases, object-based models, hierarchical electronic file structures, and/or distributed databases) could be used to store and manipulate the data types described herein. Likewise, object methods or behaviors of a database can be used to implement various processes, such as those described herein. In addition, the databases may, in a known manner, be stored locally or remotely from a device that accesses data in such a database. Furthermore, while unified databases may be contemplated, it is also possible that the databases may be distributed and/or duplicated amongst a variety of devices.
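To illustrate the point that a described table is only one possible arrangement (the schema and field names below are hypothetical, not from the disclosure), the same profile record could be held in a relational table or in a plain in-memory structure:

```python
# The same user profile stored two ways: as a row in a relational
# database (sqlite3, Python stdlib) and as a plain in-memory
# dictionary. Schema and field names are hypothetical illustrations.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE profiles
                (user_id TEXT, deficit TEXT, proficiency TEXT)""")
conn.execute("INSERT INTO profiles VALUES ('u1', 'hearing', 'touch')")

row = conn.execute(
    "SELECT deficit, proficiency FROM profiles WHERE user_id = 'u1'"
).fetchone()

# Equivalent non-database memory structure holding the same data.
profiles = {"u1": {"deficit": "hearing", "proficiency": "touch"}}

assert row == ("hearing", "touch")
assert profiles["u1"]["deficit"] == row[0]
```

Either arrangement supports the processes described herein; the choice between them is an implementation detail.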
As used herein, a “network” is an environment wherein one or more computing devices may communicate with one another. Such devices may communicate directly or indirectly, via a wired or wireless medium such as the Internet, a Local Area Network (LAN), a Wide Area Network (WAN), Ethernet (IEEE 802.3), or Token Ring, or via any appropriate communications means or combination of communications means. Exemplary protocols include but are not limited to: BLUETOOTH™, TDMA, CDMA, GSM, EDGE, GPRS, WCDMA, AMPS, D-AMPS, IEEE 802.11 (WI-FI), IEEE 802.3, TCP/IP, or the like. Note that if video signals or large files are being sent over the network, a broadband network may be used to alleviate delays associated with the transfer of such large files; however, such is not strictly required. Each of the devices is adapted to communicate on such a communication means. Any number and type of machines may be in communication via the network. Where the network is the Internet, communications over the Internet may be through a website maintained by a computer on a remote server or over an online data network including commercial online service providers, bulletin board systems, and the like. In yet other embodiments, the devices may communicate with one another over RF, cellular networks, cable TV, satellite links, and the like. Where appropriate, encryption or other security measures such as logins and passwords may be provided to protect proprietary or confidential information.
Communication among computers and devices may be encrypted to ensure privacy and prevent fraud in any of a variety of ways well known in the art. Appropriate cryptographic protocols for bolstering system security are described in Schneier, APPLIED CRYPTOGRAPHY: PROTOCOLS, ALGORITHMS, AND SOURCE CODE IN C, John Wiley & Sons, Inc., 2d ed., 1996, which is incorporated by reference in its entirety.
The present disclosure provides, to one of ordinary skill in the art, an enabling description of several embodiments and/or inventions. Some of these embodiments and/or inventions may not be claimed in the present disclosure, but may nevertheless be claimed in one or more continuing applications that claim the benefit of priority of the present disclosure.
Inventors: Jorasch, James A.; Tedesco, Daniel E.; Tedesco, Robert C.
Executed on | Assignor | Assignee | Conveyance | Reel/Frame
Feb 08 2010 | Tedesco, Robert C. | HandHold Adaptive, LLC | Assignment of assignors interest (see document for details) | 030722/0354
Feb 09 2010 | Jorasch, James A. | HandHold Adaptive, LLC | Assignment of assignors interest (see document for details) | 030722/0354
Feb 11 2010 | Tedesco, Daniel E. | HandHold Adaptive, LLC | Assignment of assignors interest (see document for details) | 030722/0354
Jul 01 2013 | HandHold Adaptive, LLC (assignment on the face of the patent)
Date | Maintenance Fee Events |
Jul 21 2017 | M2551: Payment of Maintenance Fee, 4th Yr, Small Entity. |
Jul 21 2017 | M2554: Surcharge for late Payment, Small Entity. |
Jul 13 2021 | M2552: Payment of Maintenance Fee, 8th Yr, Small Entity. |