Systems and methods for perception-based tactile sensing in a robot are provided. The robot may include an external portion and an illuminator that outputs illumination. The robot may further include a receiving sensor that receives illumination. The robot may also include a pair of conduits, located at the external portion, that includes an injecting conduit that traverses one or more housings of the robot. The injecting conduit may be configured to receive the illumination from the illuminator and output the illumination to illuminate an object external to the robot. The pair of conduits may also include a receiving conduit, traversing one or more housings of the robot, configured to receive the illumination from the object external to the robot and output the illumination to the receiving sensor.
1. A robot having perception-based tactile sensing, comprising:
an external portion;
an illuminator configured to output illumination of a different color to each of a plurality of injecting conduits;
a receiving sensor configured to receive illumination;
conduits, located at the external portion, comprising:
the plurality of injecting conduits, traversing one or more housings of the robot, configured to:
receive the illumination from the illuminator; and
output the illumination to illuminate an object external to the robot; and
a receiving conduit, traversing one or more housings of the robot, configured to:
receive the illumination from the object external to the robot; and
output the illumination to the receiving sensor.
19. A robot having perception-based tactile sensing, comprising:
one or more external portions;
an illuminator configured to output illumination of a different color to each of a plurality of injecting conduits;
a receiving sensor configured to receive illumination;
a plurality of pairs of conduits each located at the one or more external portions, wherein each pair comprises:
an injecting conduit, traversing one or more housings of the robot, configured to:
receive the illumination from the illuminator; and
output the illumination to illuminate an object external to the robot,
wherein the injecting conduit is configured to directly provide illumination originating from the illuminator to an exterior point on the robot; and
a receiving conduit, traversing one or more housings of the robot, configured to:
receive the illumination from the object external to the robot; and
output the illumination to the receiving sensor, wherein the receiving conduit is configured to directly provide illumination received at the exterior point on the robot to the receiving sensor;
a plurality of conduit bundles, wherein:
at least two of the conduit bundles each comprise a plurality of injecting conduits and a plurality of receiving conduits; and
the injecting conduits in the plurality of bundles originate from the illuminator;
an external skin having exterior points each comprising a lens or window; and
a processor configured to calibrate the location of each of a plurality of exterior points with respect to exterior portions of the robot.
The present application generally relates to robotics, and, more particularly, to a robot that can perform tactile sensing utilizing fiber optics.
Robots can function by utilizing a variety of sensory perceptions to experience their surroundings. For example, robots may utilize cameras for visual perception and audio sensors to perceive sounds. However, it can be expensive and hardware intensive to cover a robot's body with sensors in order to give the robot a sense of touch.
In one embodiment, a robot having perception-based tactile sensing may include an external portion and an illuminator configured to output illumination. The robot may further include a receiving sensor configured to receive illumination. The robot may further still include a pair of conduits including an injecting conduit, traversing one or more housings of the robot, configured to receive the illumination from the illuminator and output the illumination to illuminate an object external to the robot. The pair of conduits may further include a receiving conduit, traversing one or more housings of the robot, configured to receive the illumination from the object external to the robot and output the illumination to the receiving sensor.
In another embodiment, a robot having perception-based tactile sensing may include one or more external portions and an illuminator configured to output illumination. The robot may further include a receiving sensor configured to receive illumination. The robot may also include a plurality of pairs of conduits each located at the one or more external portions, wherein each pair may include an injecting conduit, traversing one or more housings of the robot, configured to receive the illumination from the illuminator and output the illumination to illuminate an object external to the robot, wherein the injecting conduit may be configured to directly provide illumination originating from the illuminator to an exterior point on the robot. Each pair may further include a receiving conduit, traversing one or more housings of the robot, configured to receive the illumination from the object external to the robot and output the illumination to the receiving sensor, wherein the receiving conduit may be configured to directly provide illumination received at the exterior point on the robot to the receiving sensor. The robot may further include a plurality of conduit bundles, where at least two of the conduit bundles may each include a plurality of injecting conduits and a plurality of receiving conduits and the injecting conduits in the plurality of bundles originate from the illuminator. The robot may also further include an external skin having exterior points each comprising a lens or window. The robot may further still include a processor configured to calibrate the location of each of a plurality of exterior points with respect to exterior portions of the robot.
In yet another embodiment, a method for perception-based tactile sensing in a robot may include providing, by an illuminator within the robot, illumination to an injecting conduit and outputting illumination from the injecting conduit at an exterior point of the robot. The method may further include receiving, at the exterior point of the robot, illumination reflected back from an object external to the robot. The method may also include providing the received illumination through a receiving conduit to a receiving sensor within the robot.
These and additional features provided by the embodiments described herein will be more fully understood in view of the following detailed description, in conjunction with the drawings.
The embodiments set forth in the drawings are illustrative and exemplary in nature and not intended to limit the subject matter defined by the claims. The following detailed description of the illustrative embodiments can be understood when read in conjunction with the following drawings, where like structure is indicated with like reference numerals and in which:
Embodiments of the present disclosure are directed to systems and methods for robots that can perform tactile sensing utilizing fiber optics. For example, robots may lack any tactile sensing ability, or may have sensors requiring physical contact for tactile sensing. In some embodiments a robot having the ability to perform tactile sensing without the need for physical contact with external objects may be desirable. Particularly, embodiments described herein emit light onto an external object in close proximity, and detect reflected light to provide the robot with a sense of “touch.” The robot may utilize injecting conduits to provide illumination from an illuminator to an object external to the robot. The robot may then receive the illumination reflected back by the external object at receiving conduits. The receiving conduits may then provide the received illumination to a receiving sensor, which may be utilized to make tactile observations regarding the external object.
The computing device 100 may include non-volatile memory 108 (ROM, flash memory, etc.), volatile memory 110 (RAM, etc.), or a combination thereof. A network interface 112 can facilitate communications over a network 114 via wires, via a wide area network, via a local area network, via a personal area network, via a cellular network, via a satellite network, etc. Suitable local area networks may include wired Ethernet and/or wireless technologies such as, for example, wireless fidelity (Wi-Fi). Suitable personal area networks may include wireless technologies such as, for example, IrDA, Bluetooth, Wireless USB, Z-Wave, ZigBee, and/or other near field communication protocols. Suitable personal area networks may similarly include wired computer buses such as, for example, USB and FireWire. Suitable cellular networks include, but are not limited to, technologies such as LTE, WiMAX, UMTS, CDMA, and GSM. Network interface 112 can be communicatively coupled to any device capable of transmitting and/or receiving data via the network 114. Accordingly, the hardware of the network interface 112 can include a communication transceiver for sending and/or receiving any wired or wireless communication. For example, the network interface hardware may include an antenna, a modem, LAN port, Wi-Fi card, WiMax card, mobile communications hardware, near-field communication hardware, satellite communication hardware and/or any wired or wireless hardware for communicating with other networks and/or devices.
A computer readable storage medium 116 may comprise a plurality of computer readable mediums, each of which may be either a computer readable storage medium or a computer readable signal medium. A computer readable storage medium 116 may reside, for example, within an input device 106, non-volatile memory 108, volatile memory 110, or any combination thereof. A computer readable storage medium can include tangible media that is able to store instructions associated with, or used by, a device or system. A computer readable storage medium includes, by way of non-limiting examples: RAM, ROM, cache, fiber optics, EPROM/Flash memory, CD/DVD/BD-ROM, hard disk drives, solid-state storage, optical or magnetic storage devices, diskettes, electrical connections having a wire, or any combination thereof. A computer readable storage medium may also include, for example, a system or device that is of a magnetic, optical, semiconductor, or electronic type. Computer readable storage media and computer readable signal media are mutually exclusive. For example, a robot 200 and/or a server may utilize a computer readable storage medium to store data received from an illuminator 302 and/or a receiving sensor 312 in the robot 200.
A computer readable signal medium can include any type of computer readable medium that is not a computer readable storage medium and may include, for example, propagated signals taking any number of forms such as optical, electromagnetic, or a combination thereof. A computer readable signal medium may include propagated data signals containing computer readable code, for example, within a carrier wave. Computer readable storage media and computer readable signal media are mutually exclusive.
The computing device 100, such as a robot 200, may include one or more network interfaces 112 to facilitate communication with one or more remote devices, which may include, for example, client and/or server devices. In various embodiments the computing device (for example a robot) may be configured to communicate over a network with a server or other network computing device to transmit and receive data from a robot 200. A network interface 112 may also be described as a communications module, as these terms may be used interchangeably.
The processor 230 of the robot 200 may be any device capable of executing machine-readable instructions. Accordingly, the processor 230 may be a controller, an integrated circuit, a microchip, a computer, or any other computing device. The processor 230 may be communicatively coupled to the other components of the robot 200 by the communication path 228. This may, in various embodiments, allow the processor 230 to receive data from the illuminator 302 and/or receiving sensor 312, which may be part of the robot 200. Accordingly, the communication path 228 may communicatively couple any number of processors with one another, and allow the components coupled to the communication path 228 to operate in a distributed computing environment. Specifically, each of the components may operate as a node that may send and/or receive data.
The output device 234, if provided, is coupled to the communication path 228 and communicatively coupled to the processor 230. The output device 234 may, by way of non-limiting example, include displays and/or output devices 104 such as monitors, speakers, headphones, projectors, wearable-displays, holographic displays, printers, and/or anything else capable of being part of or coupled to the robot 200. For example, the output device 234 may provide a visual depiction of the pixels generated at the receiving sensor 312.
The inertial measurement unit 236, if provided, is coupled to the communication path 228 and communicatively coupled to the processor 230. The inertial measurement unit 236 may include one or more accelerometers and one or more gyroscopes. The inertial measurement unit 236 transforms sensed physical movement of the robot 200 into a signal indicative of an orientation, a rotation, a velocity, or an acceleration of the robot 200. The operation of the robot 200 may depend on an orientation of the robot 200 (e.g., whether the robot 200 is horizontal, tilted, and the like). Some embodiments of the robot 200 may not include the inertial measurement unit 236, such as embodiments that include an accelerometer but not a gyroscope, embodiments that include a gyroscope but not an accelerometer, or embodiments that include neither an accelerometer nor a gyroscope.
The speaker 240 (i.e., an audio output device) is coupled to the communication path 228 and communicatively coupled to the processor 230. The speaker 240 transforms audio message data from the processor 230 of the robot 200 into mechanical vibrations producing sound. For example, the speaker 240 may provide to the user navigational menu information, setting information, status information, information regarding the environment as detected by image data from the one or more cameras 244, and the like. However, it should be understood that, in other embodiments, the robot 200 may not include the speaker 240.
The microphone 242 is coupled to the communication path 228 and communicatively coupled to the processor 230. The microphone 242 may be any device capable of transforming a mechanical vibration associated with sound into an electrical signal indicative of the sound. The microphone 242 may be used as an input device 238 to perform tasks, such as navigating menus, inputting settings and parameters, and performing other tasks. It should be understood that some embodiments may not include the microphone 242.
The network interface hardware 246 is coupled to the communication path 228 and communicatively coupled to the processor 230. The network interface hardware 246 may be any device capable of transmitting and/or receiving data via a network 270. Accordingly, network interface hardware 246 can include a wireless communication module configured as a communication transceiver for sending and/or receiving any wired or wireless communication. For example, the network interface hardware 246 may include an antenna, a modem, LAN port, Wi-Fi card, WiMax card, mobile communications hardware, near-field communication hardware, satellite communication hardware and/or any wired or wireless hardware for communicating with other networks and/or devices. In one embodiment, network interface hardware 246 includes hardware configured to operate in accordance with the Bluetooth wireless communication protocol. In another embodiment, network interface hardware 246 may include a Bluetooth send/receive module for sending and receiving Bluetooth communications to/from a portable electronic device 280. The network interface hardware 246 may also include a radio frequency identification (“RFID”) reader configured to interrogate and read RFID tags.
In some embodiments, the robot 200 may be communicatively coupled to a portable electronic device 280 via the network 270. In some embodiments, the network 270 is a personal area network that utilizes Bluetooth technology to communicatively couple the robot 200 and the portable electronic device 280. In other embodiments, the network 270 may include one or more computer networks (e.g., a personal area network, a local area network, or a wide area network), cellular networks, satellite networks and/or a global positioning system and combinations thereof. Accordingly, the robot 200 can be communicatively coupled to the network 270 via wires, via a wide area network, via a local area network, via a personal area network, via a cellular network, via a satellite network, or the like. Suitable local area networks may include wired Ethernet and/or wireless technologies such as, for example, wireless fidelity (Wi-Fi). Suitable personal area networks may include wireless technologies such as, for example, IrDA, Bluetooth, Wireless USB, Z-Wave, ZigBee, and/or other near field communication protocols. Suitable personal area networks may similarly include wired computer buses such as, for example, USB and FireWire. Suitable cellular networks include, but are not limited to, technologies such as LTE, WiMAX, UMTS, CDMA, and GSM.
The tactile feedback device 248 is coupled to the communication path 228 and communicatively coupled to the processor 230. The tactile feedback device 248 may be any device capable of providing tactile feedback to a user. The tactile feedback device 248 may include a vibration device (such as in embodiments in which tactile feedback is delivered through vibration), an air blowing device (such as in embodiments in which tactile feedback is delivered through a puff of air), or a pressure generating device (such as in embodiments in which the tactile feedback is delivered through generated pressure). It should be understood that some embodiments may not include the tactile feedback device 248.
The location sensor 250 is coupled to the communication path 228 and communicatively coupled to the processor 230. The location sensor 250 may be any device capable of generating an output indicative of a location. In some embodiments, the location sensor 250 includes a global positioning system (GPS) sensor, though embodiments are not limited thereto. Some embodiments may not include the location sensor 250, such as embodiments in which the robot 200 does not determine a location of the robot 200 or embodiments in which the location is determined in other ways (e.g., based on information received from the camera 244, the microphone 242, the network interface hardware 246, the proximity sensor 254, the inertial measurement unit 236 or the like). The location sensor 250 may also be configured as a wireless signal sensor capable of triangulating a location of the robot 200 and the user by way of wireless signals received from one or more wireless signal antennas.
The motorized wheel assembly 258 is coupled to the communication path 228 and communicatively coupled to the processor 230. As described in more detail below, the motorized wheel assembly 258 includes motorized wheels (not shown) that are driven by one or more motors (not shown). The processor 230 may provide one or more drive signals to the motorized wheel assembly 258 to actuate the motorized wheels such that the robot 200 travels to a desired location, such as a location at which the user wishes to acquire environmental information (e.g., the location of particular objects at or near the desired location).
The proximity sensor 254 is coupled to the communication path 228 and communicatively coupled to the processor 230. The proximity sensor 254 may be any device capable of outputting a proximity signal indicative of a proximity of the robot 200 to another object. In some embodiments, the proximity sensor 254 may include a laser scanner, a capacitive displacement sensor, a Doppler effect sensor, an eddy-current sensor, an ultrasonic sensor, a magnetic sensor, an internal sensor, a radar sensor, a lidar sensor, a sonar sensor, or the like. Some embodiments may not include the proximity sensor 254, such as embodiments in which the proximity of the robot 200 to an object is determined from inputs provided by other sensors (e.g., the camera 244, the speaker 240, etc.) or embodiments that do not determine a proximity of the robot 200 to an object 215.
The temperature sensor 256 is coupled to the communication path 228 and communicatively coupled to the processor 230. The temperature sensor 256 may be any device capable of outputting a temperature signal indicative of a temperature sensed by the temperature sensor 256. In some embodiments, the temperature sensor 256 may include a thermocouple, a resistive temperature device, an infrared sensor, a laser sensor, a bimetallic device, a change of state sensor, a thermometer, a silicon diode sensor, or the like. Some embodiments of the robot 200 may not include the temperature sensor 256.
In this embodiment, an injecting bundle 304 is connected to the illuminator 302. An injecting bundle 304 in this embodiment may include two or more injecting conduits 306, where any suitable number of injecting conduits 306 may be utilized. An injecting bundle 304 may utilize any suitable mechanism, such as an outer layer to wrap at least two injecting conduits 306 together. In this embodiment injecting conduits 306 may be unbundled from an injecting bundle 304 at any suitable distance from the illuminator 302. In some embodiments injecting conduits 306 may be rebundled with other or the same injecting conduits 306. An injecting conduit 306 may be any suitable type of transport mechanism capable of transporting or delivering illumination, such as fiber optics, waveguides, etc. In various embodiments, an illuminator 302 may provide differing illumination to each injecting conduit 306. By way of non-limiting example, a different color may be provided to each of a plurality of injecting conduits 306.
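By way of a hedged illustration of the differing-color scheme just described, the following Python sketch assigns a distinct hue to each injecting conduit 306 in a bundle. The InjectingConduit record and assign_colors helper are illustrative assumptions and do not appear in the patent.

import colorsys
from dataclasses import dataclass

@dataclass(frozen=True)
class InjectingConduit:
    conduit_id: int
    color_rgb: tuple  # (R, G, B) illumination driven into this conduit by the illuminator

def assign_colors(num_conduits):
    """Give each injecting conduit in a bundle its own hue, evenly spaced on the color wheel."""
    conduits = []
    for i in range(num_conduits):
        r, g, b = colorsys.hsv_to_rgb(i / num_conduits, 1.0, 1.0)
        conduits.append(InjectingConduit(i, (int(r * 255), int(g * 255), int(b * 255))))
    return conduits

bundle = assign_colors(8)  # e.g., an injecting bundle 304 carrying eight conduits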
A robot 200 may feature one or more receiving sensors 312, which may be located in any suitable location(s) within the robot 200. In other embodiments, receiving sensors 312 may be located on the exterior of, or located remotely from, a robot 200. A receiving sensor 312 may be any suitable type of device, such as an image sensor (e.g., charge-coupled device, photovoltaic, photoresistor, photodiode, proximity sensor, an active distributed camera, etc.) and/or any other device capable of receiving and/or processing any type of illumination. Any suitable number of receiving sensors 312 may be utilized in various embodiments. In some embodiments a receiving sensor 312 may utilize non-uniform output from one or more illuminators 302. By way of non-limiting example, where an illuminator 302 provides differing colors to a plurality of injecting conduits 306, the receiving sensor 312 may receive at least a subset of those colors from the receiving conduits 310, where such color data may be utilized for three-dimensional imaging.
In this embodiment the receiving bundle 308 is connected to the receiving sensor 312. The receiving bundle 308 in this embodiment may include two or more receiving conduits 310, where any suitable number of receiving conduits 310 may be utilized. A receiving bundle 308 may utilize any suitable mechanism, such as an outer layer to wrap at least two receiving conduits 310 together. In this embodiment receiving conduits 310 may be unbundled from a receiving bundle 308 at any suitable distance from the receiving sensor 312. In some embodiments receiving conduits 310 may be rebundled with other or the same receiving conduits 310. A receiving conduit 310 may be any suitable type of transport mechanism capable of transporting or delivering illumination, such as fiber optics, waveguides, etc. In embodiments, a receiving sensor 312 may translate illumination from receiving conduits 310 into pixels, such that each receiving conduit 310 (and associated exterior point 402) corresponds to a pixel value. In some embodiments, the illumination carried by receiving conduits 310 may be combined to form combined pixel values. For example, multiple receiving conduits 310 from different exterior points 402 may converge into a receiving bundle 308, such that the receiving bundle 308 may be connected to a receiving sensor 312 utilizing an array of pixels, and the illumination received from each receiving conduit 310 may then be represented by a pixel value within the array of pixels. Although a pixel is utilized in this embodiment, any suitable type of representation of illumination carried by receiving conduits 310 may be utilized.
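As a concrete, assumed illustration of the pixel translation described above, the sketch below reads a per-conduit intensity out of a receiving sensor's pixel array. The frame layout, the conduit_to_pixel mapping, and the function name are hypothetical.

import numpy as np

def conduit_intensities(sensor_frame, conduit_to_pixel):
    """Return the pixel value recorded by the receiving sensor for each receiving conduit."""
    return {cid: float(sensor_frame[row, col])
            for cid, (row, col) in conduit_to_pixel.items()}

# Example: a 4x4 pixel array fed by two receiving conduits at different exterior points.
frame = np.zeros((4, 4))
frame[1, 2] = 0.8   # strong reflected illumination arriving through conduit 0
frame[3, 0] = 0.1   # faint ambient illumination arriving through conduit 1
print(conduit_intensities(frame, {0: (1, 2), 1: (3, 0)}))  # {0: 0.8, 1: 0.1}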
In some embodiments each injecting conduit 306 may terminate at a different exterior point 402. In other embodiments no injecting conduits 306, or multiple injecting conduits 306, may terminate at any given exterior point 402. For example, an exterior point 402 may have no injecting conduits, such that one or more receiving conduits 310 at the exterior point 402 receive ambient illumination from external objects and/or illumination reflected by external objects as received from injecting conduits 306 located at other exterior points 402 on the robot 200. In the illustrated embodiment, each injecting conduit 306 terminates at an exterior point 402 at an exterior surface 404 of the robot 200. In some embodiments each receiving conduit 310 may originate at a different exterior point 402. In other embodiments no receiving conduits 310, or multiple receiving conduits 310, may begin at any given exterior point 402. For example, an exterior point 402 may have no receiving conduits, such that one or more injecting conduits 306 at the exterior point 402 provide illumination to objects that may reflect such illumination, which may be received by receiving conduits 310 located at other exterior points 402 on the robot 200. In the illustrated embodiment, each receiving conduit 310 terminates at an exterior point 402 at an exterior surface 404 of the robot 200. The terms “originate” and “terminate” may be used interchangeably in some embodiments. In some embodiments, an injecting conduit 306 and a receiving conduit 310 in a pair 406 are each part of a different bundle. In various embodiments at least one of the injecting bundles 304 comprises a plurality of injecting conduits 306 and at least one of the receiving bundles 308 comprises a plurality of receiving conduits 310.
Injecting conduits 306 and corresponding receiving conduits 310 form pairs 406 at exterior points 402. As an example, a pair 406 may have one injecting conduit 306 and one corresponding receiving conduit 310. However, any other suitable quantities of injecting conduits 306 and receiving conduits 310 may be utilized to form a pair in other embodiments. An exterior point 402 may have one, multiple, or no pairs 406 in various embodiments.
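For illustration only, a pair 406 at an exterior point 402 could be represented as a simple record tying an injecting conduit 306 to its corresponding receiving conduit 310; the field names and point identifiers below are assumptions rather than language from the claims.

from dataclasses import dataclass

@dataclass(frozen=True)
class ConduitPair:                 # a pair 406
    exterior_point_id: str         # exterior point 402 where both conduits terminate
    injecting_conduit_id: int      # injecting conduit 306
    receiving_conduit_id: int      # receiving conduit 310

pairs = [ConduitPair("palm_left_00", injecting_conduit_id=0, receiving_conduit_id=0),
         ConduitPair("palm_left_01", injecting_conduit_id=1, receiving_conduit_id=1)]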
Some embodiments may only utilize ambient and/or reflected illumination, for example, by not utilizing illuminators 302, injecting bundles 304, or injecting conduits 306. In such embodiments, illumination may be passively received from other illumination sources external to the robot 200, which may include illumination emanating from and/or reflected off of an external object 502.
In another example, multiple external objects 502 may be located on opposite sides of the robot 200, such as in front and behind. The front-facing and rear-facing external points 402 may be distributed across portions of the robot 200 (arms, legs, head, torso, etc.), each of which may have its own receiving sensors 312 and associated pixel arrays. The external points 402 may provide for tactile sensing of these external objects 502 by providing illumination from one or more illuminators 302, through one or more injecting bundles 304 having multiple injecting conduits 306, to provide illumination at the front-facing and rear-facing external points 402. Illumination from each of the external objects 502 may be reflected back to the front-facing and rear-facing external points 402 on the robot 200, relative to the positioning of the external objects 502. The reflected illumination may enter the receiving conduits 310 at the external points 402, which may converge to form receiving bundles 308 that connect to one or more receiving sensors 312, forming a pixel array value for the illumination received from each receiving conduit 310. In this way, the pixel arrays in the receiving sensors 312 throughout the robot 200 may each generate an array of pixel values that may be used to generate pixel-based representations of each external object 502. In an alternative example, the external objects 502 may be detected utilizing only ambient illumination from the environment, without illumination generated by illuminators 302 within the robot 200. In yet another alternative example, the external objects 502 may be detected utilizing a combination of ambient illumination in the environment and illumination generated by illuminators 302 within the robot 200.
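One simple, assumed way to turn the per-point pixel values into a sense of "touch" is a thresholding pass over the readings gathered from the receiving sensors 312: exterior points 402 whose reflected intensity is high are treated as adjacent to an external object 502. The point names and the threshold below are illustrative assumptions.

def touch_map(intensity_by_point, threshold=0.5):
    """Mark each exterior point as near an external object when its reflected intensity is high."""
    return {point: value >= threshold for point, value in intensity_by_point.items()}

# Front-facing points see a strong reflection from a nearby object; others see only ambient light.
readings = {"torso_front_01": 0.91, "torso_front_02": 0.87,
            "torso_rear_01": 0.12, "arm_left_03": 0.05}
print(touch_map(readings))
# {'torso_front_01': True, 'torso_front_02': True, 'torso_rear_01': False, 'arm_left_03': False}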
At block 702, a calibration request may be received. At block 704, an illumination identifier may be provided for each unpaired injecting conduit 306 at one or more illuminators 302, although in some embodiments this may be provided to all injecting conduits 306. An illumination identifier may be anything that can identify injected illumination 504, such as a color, intensity, or blinking pattern. At block 706, an illumination identifier may be output from each unpaired injecting conduit 306 at an exterior point 402 on the robot 200. At block 708, illumination identifiers may be received by unpaired receiving conduits 310 at one or more receiving sensors 312 within the robot 200. For example, an illuminator 302 may output a particular color, such as red, which may be output by an injecting conduit 306 and received by a receiving conduit 310 to be provided as a pixel value at a sensor array within a receiving sensor 312. By contrast, another illuminator 302, or the same illuminator 302, may provide another color, such as blue. The color blue may be output by a different injecting conduit 306 and received by a different receiving conduit 310 to be provided as a different pixel value at a sensor array within the same, or a different, receiving sensor 312. In this way, the external points 402 on the robot 200 outputting and receiving the color red may be pinpointed and distinguished from those external points 402 on the robot 200 outputting and receiving the color blue. At block 710, pairs 406 and/or location mappings of injecting conduits 306 and/or corresponding receiving conduits 310 may be created and/or updated, where an output identifier may correspond to a received identifier. In various embodiments an output identifier may be anything that can identify reflected illumination 506, such as a color, intensity, or blinking pattern. Based upon the output of an injecting conduit 306 and/or a receiving conduit 310, a pixel (or pixels) may be registered and associated with a signal. The pixel (or pixels) may become associated with that injecting conduit 306 and/or that receiving conduit 310. At block 712, a determination may be made whether there are any unmatched pairs 406 and/or whether any receiving conduits 310 remain without a received identifier. If so, the flowchart may return to block 704. If not, then at block 714 the calibration may be completed.
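The calibration flow of blocks 702 through 714 can be summarized with the following hedged Python sketch. The illuminator.drive and sensor.read_frame calls are hypothetical stand-ins for the robot's hardware interfaces, and identifiers are driven one conduit at a time here for simplicity, whereas the flow described above may output identifiers to all unpaired injecting conduits.

def calibrate(illuminator, sensor, injecting_conduit_ids, identifier_colors):
    """Match each injecting conduit to the sensor pixel that reports its illumination identifier."""
    mapping = {}
    for conduit_id in injecting_conduit_ids:
        color = identifier_colors[conduit_id % len(identifier_colors)]
        illuminator.drive(conduit_id, color)   # blocks 704/706: output the identifier at an exterior point
        frame = sensor.read_frame()            # block 708: identifier values seen at the receiving sensor pixels
        matches = [pixel for pixel, seen in frame.items() if seen == color]
        if matches:
            mapping[conduit_id] = matches[0]   # block 710: register the pairing / location mapping
    return mapping  # block 714: calibration complete (block 712 would retry any still-unmatched conduits)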
It is noted that recitations herein of a component of the present disclosure being “configured” or “programmed” in a particular way, to embody a particular property, or to function in a particular manner, are structural recitations, as opposed to recitations of intended use. More specifically, the references herein to the manner in which a component is “configured” or “programmed” denote an existing physical condition of the component and, as such, are to be taken as a definite recitation of the structural characteristics of the component.
The order of execution or performance of the operations in examples of the disclosure illustrated and described herein is not essential, unless otherwise specified. That is, the operations may be performed in any order, unless otherwise specified, and examples of the disclosure may include additional or fewer operations than those disclosed herein. For example, it is contemplated that executing or performing a particular operation before, contemporaneously with, or after another operation is within the scope of aspects of the disclosure.
It is noted that the terms “substantially” and “about” and “approximately” may be utilized herein to represent the inherent degree of uncertainty that may be attributed to any quantitative comparison, value, measurement, or other representation. These terms are also utilized herein to represent the degree by which a quantitative representation may vary from a stated reference without resulting in a change in the basic function of the subject matter at issue.
While particular embodiments have been illustrated and described herein, it should be understood that various other changes and modifications may be made without departing from the spirit and scope of the claimed subject matter. Moreover, although various aspects of the claimed subject matter have been described herein, such aspects need not be utilized in combination. It is therefore intended that the appended claims cover all such changes and modifications that are within the scope of the claimed subject matter.
Stone, Kevin, Amacker, Matthew, Poursohi, Arshan, Zapolsky, Samuel