A vision system for a vehicle, such as a mobile robot (10), includes at least one radiation projector (14, 16) which projects a structured beam of radiation into the robot's environment. The structured beam of radiation (14a, 16a) preferably has a substantially planar pattern of sufficient width to encompass the immediate forward path of the robot and also to encompass laterally disposed areas in order to permit turning adjustments. The vision system further includes an imaging sensor (12), such as a CCD imaging device, having a two-dimensional field of view which encompasses the immediate forward path of the robot. An image sensor processor (18) includes an image memory (18A) coupled to a device (18D) which is operable for accessing the image memory. Image processing is accomplished in part by triangulating the stored image of the structured beam pattern to derive range and bearing, relative to the robot, of an object being illuminated. A navigation control system (20) of the robot inputs data from at least the vision system and infers therefrom data relating to the configuration of the environment which lies in front of the robot. The navigation control system generates control signals which drive propulsion and steering motors in order to navigate the robot through the perceived environment.

Patent
   4954962
Priority
Sep 06 1988
Filed
Sep 06 1988
Issued
Sep 04 1990
Expiry
Sep 06 2008
24. A vision system coupled to a mobile robot, comprising:
means for emitting at least one structured, substantially planar beam of light in a direction of travel of the robot, the beam being oriented for forming a substantially stripe-like pattern upon a surface disposed generally in the direction of travel of the robot;
means for imaging at least a portion of the planar beam which is reflected from at least one surface disposed within a region through which the robot is to travel;
means for associating the image of the reflected beam with a range and a bearing, relative to the robot, of the surface reflecting the beam; and
means, coupled to an output of said associating means, for determining robot navigation data as a function of the range and bearing to the surface.
1. Object detection apparatus carried by a vehicle which moves over a surface, comprising:
means for emitting at least one structured, substantially planar beam of light in a direction of travel of the vehicle, the beam being oriented for forming a substantially stripe-like pattern upon a surface disposed generally in the direction of travel of the vehicle;
means for imaging at least a portion of the planar beam of light which is reflected from at least one surface disposed within a region through which the vehicle is to travel;
means for associating the imaged reflection of the beam of light with at least a range and a bearing, relative to the vehicle, of the surface reflecting the planar beam of light; and
means, coupled to an output of said associating means, for determining vehicle navigation-related information as a function of the range and bearing to the surface.
6. Object detection apparatus carried by a vehicle which moves over a surface, comprising:
means for emitting at least one structured, substantially planar beam of light in a direction of travel of the vehicle;
means for imaging the planar beam of light which is reflected from at least one surface disposed within a region through which the vehicle is to travel, wherein said means for imaging comprises means for generating an image of a two-dimensional field of view, the image being comprised of a plurality of pixels, each of said pixels having an associated value which is a function of the amount of the reflected beam of light within an associated portion of the field of view; and
means for associating the imaged reflection of the beam with at least a range and a bearing, relative to the vehicle, of the surface reflecting the beam of light, wherein said means for associating comprises means for generating a range and a bearing, relative to the vehicle, for each pixel within the field of view.
2. Object detection apparatus as defined in claim 1 wherein said means for emitting comprises a pulsed source of light.
3. Object detection apparatus as defined in claim 1 wherein said emitting means emits light having wavelengths within a range of wavelengths of approximately 750 nm to approximately 1,000 nm.
4. Object detection apparatus as defined in claim 1 wherein said means for imaging comprises means for generating an image of a two-dimensional field of view, the image being comprised of a plurality of pixels, each of said pixels having an associated value which is a function of an amount of reflected light within an associated portion of the field of view.
5. Object detection apparatus as defined in claim 1 wherein said means for imaging comprises a CCD imaging device having a two-dimensional field of view, the CCD imaging device generating a plurality of pixels.
7. Object detection apparatus as defined in claim 6 wherein said associating means further comprises means for storing data expressive of the range and bearing, relative to the vehicle for each pixel within the field of view.
8. Object detection apparatus as defined in claim 6 wherein said means for emitting emits output wavelengths within a range of wavelengths of approximately 750 nm to approximately 1,000 nm.
9. Object detection apparatus as defined in claim 6 wherein said means for emitting includes flashlamp means.
10. Object detection apparatus as defined in claim 6 wherein said means for emitting includes one or more light emitting diode means.
11. Object detection apparatus as defined in claim 6 wherein said means for emitting includes one or more incandescent lamp means.
12. Object detection apparatus as defined in claim 6 wherein the beam is emitted substantially parallel to a surface over which the vehicle moves.
13. Object detection apparatus as defined in claim 6 wherein the beam is emitted obliquely downward towards a surface over which the vehicle moves.
14. Object detection apparatus as defined in claim 6 wherein the beam is emitted substantially perpendicularly relative to a surface over which the vehicle moves.
15. Object detection apparatus as defined in claim 6 wherein said means for imaging includes a plurality of light detectors comprised of semiconductor material.
16. Object detection apparatus as defined in claim 6 wherein said means for imaging is comprised of vidicon imaging means.
17. Object detection apparatus as defined in claim 6 wherein said emitting means includes means for emitting a plurality of structured, substantially planar light beams.
18. Object detection apparatus as defined in claim 6 wherein said imaging means includes means for identifying image pixels having a value greater than a threshold value.
19. Object detection apparatus as defined in claim 6 wherein said imaging means includes means for identifying image pixels having a value greater than a threshold value, said identifying means operating in a coarse search mode in a first portion of the field of view and operating in a fine search mode in a second portion of the field of view, wherein the first portion of the field of view encompasses objects at greater ranges than the second portion of the field of view.
20. Object detection apparatus as defined in claim 6 wherein said imaging means includes means for identifying image pixels having a value greater than a threshold value and further includes means for accessing look-up table means at a location corresponding to an identified image pixel to retrieve therefrom at least range and bearing information associated with the identified image pixel.
21. Object detection apparatus as defined in claim 6 wherein said means for emitting is comprised of a source of visible or infra-red light coupled to means for varying at least a pulse repetition rate of the source.
22. Object detection apparatus as set forth in claim 6 wherein said emitting means and/or said imaging means further includes optical filter means for passing light having wavelengths substantially within the infra-red range of wavelengths.
23. Object detection apparatus as set forth in claim 6 wherein said associating means includes means for associating the image of the reflected light beam with elevation information relating to the surface reflecting the light beam.
25. A vision system as defined in claim 24 wherein said means for emitting is comprised of a flashlamp.
26. A vision system as defined in claim 24 wherein said means for emitting is comprised of one or more light emitting diodes.
27. A vision system as defined in claim 24 wherein said means for emitting is comprised of one or more incandescent lamps.
28. A vision system as defined in claim 24 wherein said means for emitting is comprised of one or more laser means.
29. A vision system as defined in claim 24 wherein the beam is emitted substantially parallel to a substantially planar surface over which the robot moves.
30. A vision system as defined in claim 24 wherein the beam is emitted obliquely downward towards a substantially planar surface over which the robot moves.
31. A vision system as defined in claim 24 wherein the beam is emitted substantially perpendicularly relative to a substantially planar surface over which the robot moves.
32. A vision system as defined in claim 24 wherein said means for imaging is comprised of a CCD imaging means.
33. A vision system as defined in claim 24 wherein said means for imaging is comprised of a vidicon imaging means.

This invention relates generally to a navigation and obstacle avoidance vision system for a moving vehicle, such as a mobile robot and, in particular, to a vision system which includes at least one structured, substantially planar radiation pattern which is projected along a path of the vehicle and which further includes an image sensor for sensing reflected radiation.

An autonomous vehicle, such as a mobile robot, typically comprises some type of sensor system for sensing an environment through which the vehicle navigates. Preferably, the sensor system has the capability to detect obstacles within the path of the robot so that appropriate action may be taken. This action may include altering the path of the robot in order to steer around the obstacle. Alternatively, a sensed object may represent a navigation landmark, such as a support post, door frame, or wall, which the robot uses as a registration reference in following a preprogrammed trajectory. Systems employing ultrasonic detectors, mechanical contact devices and laser ranging apparatus are known in the art. Other systems which include a camera to observe the environment and a passive image processing system are also known.

A problem associated with ultrasonic detectors relates to the difficulty in obtaining reliable and consistent range signals in an environment which normally includes a number of objects having differing specular reflection characteristics. The objects also typically differ in size, surface characteristics and orientation relative to the ultrasound transmitter. A problem associated with mechanical contact devices relates at least to a lack of resolution and to a requirement that the obstacle actually be contacted in order to generate a signal. For some applications, such as navigation through a workplace or a hospital, the obstacle may be a human being. As can be appreciated, for these applications physical contact with the obstacle may be undesirable. Laser ranging systems are expensive, bulky, and consume substantial power. Traditional passive scene analysis vision systems require large amounts of computing power, are relatively slow and often yield erroneous results. Typically the interpretation of data is too slow to be useful for real time navigation, and may prove erroneous, such as interpreting a shadow as an object, which results in navigation errors.

It has also been known to provide visual markers or "beacons" within the robot's environment. Such beacons are undesirable in that they introduce additional cost and complexity to the system and constrain the motion of the robot to a region wherein the beacons are visible.

Commercial applications of mobile robots in the service sector include floor cleaning, aids to the handicapped, hospital delivery systems, mail carts, and security. These applications require robust, reliable navigation using sensors which are low in cost and power consumption while providing real-time maneuvering data.

It is therefore one object of the invention to provide a simplification of vision and vision processing for a mobile robot.

It is another object of the invention to provide a vision system for a mobile robot, the system requiring a minimum of image processing complexity while yet having an image resolution which is sufficient for guiding the robot through an environment.

It is a further object of the invention to provide a vision system for a mobile robot which does not require beacons or other environmental modification means to be disposed within the robot's environment.

It is another object of the invention to provide a vision system for a mobile robot which provides a complete and unambiguous interpretation of obstacles and landmarks relevant to navigation which lie in the path of the robot while having a minimal complexity, cost and power consumption as compared to conventional passive image analysis systems.

It is one still further object of the invention to provide a vision system for a mobile robot which operates in a high speed manner and which permits the continuous, adaptive motion of the robot through the robot's environment.

The aforedescribed problems are overcome and the objects of the invention are realized by an object detection or vision system for a vehicle, such as a mobile robot which, in accordance with methods and apparatus of the invention, includes at least one radiation projector which projects a structured beam of radiation into the robot's environment. The structured beam of radiation preferably has a substantially planar pattern of sufficient width to encompass the immediate forward path of the robot and also to encompass laterally disposed areas in order to permit turning adjustments. The brightness, spectral characteristics and pulse repetition rate of the structured beam are predetermined to maximize signal to noise ratio in an imaging sensor over a variety of ambient lighting conditions, while consuming minimal energy.

The object detection system of the invention further includes an imaging sensor which includes an electronic camera having a two-dimensional field of view which encompasses the immediate forward path of the robot. An image sensor processor may include a frame grabber, or image memory, coupled to a data processing device which is operable for accessing the image memory wherein the field of view of the camera is represented as binary data. Image processing is accomplished in part by triangulating the stored image of the structured beam pattern to derive at least range and bearing information, relative to the robot, of an object reflecting the substantially planar structured beam of radiation.

A motion control system of the robot inputs data from at least the vision system and infers therefrom data relating to the configuration of the environment which lies in front of the robot. The motion control system generates control signals which drive propulsion and steering motors in order to navigate the robot through the perceived environment.

The foregoing aspects of the invention will be made more apparent in the ensuing Detailed Description of the Invention read in conjunction with the accompanying Drawing wherein:

FIG. 1a is an illustrative block diagram showing a mobile robot, constructed and operated in accordance with one embodiment of the invention, which includes a camera having a downward pointing field of view and being disposed above two forwardly projecting structured beams of radiation;

FIG. 1b is a block diagram of the image processor 18 of FIG. 1a;

FIGS. 2a and 2b show a side view and a top view, respectively, of one embodiment of a structured beam projector, the projector comprising a flash tube, a cylindrical mirror and a plurality of cylindrical lens elements;

FIG. 2c shows a side view of another embodiment of a structured beam projector, the projector comprising a flashtube, a cylindrical mirror and a plurality of apertures;

FIGS. 3a and 3b are a lateral view and a top view, respectively, of structured beam patterns projected by the robot of FIG. 1;

FIG. 4 is an illustrative side view of a mobile robot constructed in accordance with another embodiment of the invention, the robot having an upper, downward pointing structured beam projector disposed above a camera, the robot further comprising a pair of beam projectors for projecting planar beams which are orientated substantially orthogonally with respect to a lower, horizontal beam projector;

FIG. 5 is a frontal view of the robot of FIG. 4;

FIG. 6 is a diagram which illustrates a processed field of view of the robot of FIGS. 4 and 5;

FIG. 7 is an illustrative view of the successive reflections of vertically orientated structured beam projectors from successively more distant vertical objects; and

FIG. 8 is an illustrative view of the reflections from objects within a robot's environment, the reflections being due to an obliquely projecting structured beam projector.

Referring now to FIG. 1a there is shown a side view of one embodiment of a mobile robot 10 comprising an electronic imaging device, such as a camera 12, and a plurality of structured beam projectors, namely an upper projector 14 and a lower projector 16. In accordance with the invention this optical configuration both detects and measures the position of objects lying within or closely adjacent to the forward path of the robot 10. These objects might be obstacles such as furniture or pedestrians. The objects may also be reference surfaces, such as walls and door frames.

The camera 12 preferably includes a CCD imaging device having a square or rectangular field of view (FOV) which is directed obliquely downward such that it encompasses the forward path of the robot 10 in the immediate maneuvering vicinity. The camera 12 generates a plurality of pixels, individual ones of which have a value indicative of an intensity of radiation incident upon a corresponding surface area of the camera radiation sensing device. The structured beams 14a and 16a which are projected by projectors 14 and 16, respectively, have the general form of a plane or slit of radiation disposed to intersect the field of view in a region most likely to be occupied by furniture, walls, pedestrians, or other obstacles.

Robot 10 further comprises an image processor 18 which is coupled to the output of camera 12. Image processor 18, as shown in greater detail in FIG. 1b, comprises a video memory 18A which stores a representation of one video frame output of camera 12. An input to video memory 18A may be provided by an analog to digital (A/D) convertor 18B which digitizes the analog output of camera 12. The digital output of A/D 18B may form an address input to a lookup table (LUT) 18C wherein pixel brightness values may be reassigned. The LUT 18C may also be employed for image thresholding and/or histogram correction. Image processor 18 further comprises an image processing device, such as a microcomputer 18D, which is coupled to the video memory 18A and which is operable for reading the stored video frame data therefrom. Image processor 18 further comprises memory 18E which includes memory for storing program data. This program data is operable for performing at least triangulation calculations upon the stored image frame data, this triangulation computation being described in detail hereinafter. Image processor 18 may further comprise memories 18F and 18G each of which stores a data structure, such as a lookup table, associated with a particular projector 14 or 16. Individual entries in each table correspond at least to range and bearing information associated with individual pixels of an image frame. This aspect of the invention will also be described in detail below. Image processor 18 may have a plurality of outputs coupled to projectors 14 and 16 for energizing the projectors for a predetermined period of time. As will be described, the operation of the projectors 14 and 16 is synchronized to the operation, or frame rate, of the camera 12 while being desynchronized to each other.
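The triangulation underlying the lookup tables 18F and 18G can be sketched as follows. This is an illustrative model only, not the patent's implementation: the geometry constants (camera height, tilt, focal length, beam height) and the function name are hypothetical, and a beam plane parallel to the floor is assumed. Each pixel's ray is intersected with the beam plane to yield a range and bearing, which may then be tabulated per pixel.

```python
import math

# Hypothetical geometry, for illustration only (not from the patent).
CAM_HEIGHT = 1.20           # camera height above the floor, metres
BEAM_HEIGHT = 0.10          # "ankle level" beam plane height, metres
TILT = math.radians(25.0)   # camera optical axis depression below horizontal
FOCAL_PX = 400.0            # focal length expressed in pixels
ROWS, COLS = 480, 512
CY, CX = ROWS / 2.0, COLS / 2.0

def range_bearing(row, col):
    """Intersect the ray through pixel (row, col) with the horizontal
    beam plane; return (range, bearing) in metres/radians, or None if
    the ray never meets the plane in front of the camera."""
    # Angle below horizontal of the ray through this pixel row.
    depression = TILT + math.atan((row - CY) / FOCAL_PX)
    if depression <= 0.0:
        return None  # ray points at or above the horizon
    rng = (CAM_HEIGHT - BEAM_HEIGHT) / math.tan(depression)
    bearing = math.atan((col - CX) / FOCAL_PX)
    return rng, bearing

# Precompute one table per projector, indexed by pixel, in the spirit
# of the patent's memories 18F and 18G.
table = {(r, c): range_bearing(r, c)
         for r in range(ROWS) for c in range(COLS)}
```

Because the camera looks obliquely downward, pixels lower in the image correspond to nearer intersections with the beam plane, so range decreases monotonically with pixel row.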
An output of image processor 18 which is expressive of position information relating to objects within the FOV of camera 12 may be supplied, via an RS-232 or parallel data link, to a navigation control processor 20 which derives navigation data based upon the perceived image of the environment. Such data may be employed to steer the robot down a straight path or may be employed to alter the path of the robot in order to avoid an obstacle within the path of the robot. An output of navigation control processor 20 is supplied to a drive and steering control 22 which has outputs coupled to drive and steering wheels 24. The wheels 24 are in contact with a supporting surface 26 which is typically a floor. Navigation control processor 20 may receive an output from the drive and steering control 22, the output being expressive of odometer readings which relate to the distance traveled by the robot 10. Navigation control processor 20 typically comprises a data processing device having associated memory and support circuitry. An enclosure is provided to contain the aforementioned apparatus and to provide protection therefor.

The camera 12 may be a model TM440 CCD camera manufactured by Pulnix. The camera 12 may have a relatively short focal length of, for example, 6.5 mm in order to maximize the field of view. Microcomputer 18D may be an 80286 microprocessor device manufactured by Intel. LUT 18C and video memory 18A may be contained within a frame grabber pc-board such as a type manufactured by Coreco or Imaging Technologies. In general, image processor 18 may conform to a standard computer architecture having printed circuit boards coupled to a common backplane and communicating over a bus. It should be realized that the invention may be practiced by a number of different means and should not be construed to be limited to only that disclosed herein.

Although the projectors 14 and 16 may be operable for projecting planar beams having any desired spectral characteristics, a preferred embodiment of the invention employs a broad, near infrared (IR) light source having wavelengths within the range of approximately 700 to approximately 1000 nanometers (nm). Near-IR radiation is preferable for a number of reasons. Near-IR radiation is unobtrusive to humans who may be sharing the environment with the robot 10. CCD imaging sensors, which are preferred because of low cost and power consumption, are sensitive to near-infrared radiation. In addition, and relating to projectors 14 and 16, infrared light emitting diodes (LEDs) are energy efficient and available at low cost. In this regard it has been found that laser diode devices consume more energy per unit of emitted power and typically provide a relatively narrow spectrum which may not optimally match the sensitivity of the camera 12. However, it should be realized that the invention may be practiced with any source, such as an incandescent lamp, laser, flashlamp or light emitting diode, having wavelengths which are efficiently detected by a radiation sensor. Furthermore it should be realized that the planar radiation pattern may be formed by any of a number of suitable techniques including, but not limited to, providing a knife-edged aperture, focussing and/or collimating the beam with a lens, or mechanically scanning either the source of radiation or a reflecting element.

The energy of the output radiation beams 14a and 16a are preferably of sufficient magnitude to be distinguishable from ambient lighting, while consuming minimal power. In indoor environments interference from fluorescent lighting, which peaks in the visible spectrum, may be minimized by employing an infrared pass filter 12a at the input to the camera 12; thereby improving the system signal to noise ratio. A low duty cycle of the projected planar beam further improves efficiency as well. That is, the light source may be energized for a few milliseconds, corresponding to the interval of image exposure, after which the light source may be de-energized to conserve energy. For example, if the vehicle is travelling at one meter per second, a relatively rapid rate for a mobile robot sharing space with humans, one flash per 100 milliseconds results in an image being obtained once every 10 centimeters of floor travel. Many normal sized obstacles, such as furniture, are larger than this increment of travel. Thus, this rate of image exposures is sufficient for avoiding most normal sized obstacles.

Another technique to improve signal to noise ratio while conserving energy is to acquire two images in quick succession, one flashed and one non-flashed, and then to subtract on a pixel-by-pixel basis the brightness values of the non-flashed image from those of the flashed image. This technique is known in the art as image subtraction and results in the reflected pattern due to the structured radiation projector being emphasized.
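A minimal sketch of the image subtraction technique, assuming 8-bit grey-scale frames represented as nested lists; the function name and pixel values are illustrative. The subtraction is clipped at zero so that regions lit only by ambient light go dark, leaving the projected stripe emphasized:

```python
def subtract_images(flashed, ambient):
    """Pixel-by-pixel subtraction of the non-flashed (ambient) image
    from the flashed image, clipped at zero, so that the reflected
    pattern due to the structured projector is emphasized."""
    return [[max(f - a, 0) for f, a in zip(frow, arow)]
            for frow, arow in zip(flashed, ambient)]

# Toy 1x4 frame: the stripe adds 120 counts to the two middle pixels.
ambient = [[40, 50, 50, 40]]
flashed = [[40, 170, 170, 40]]
print(subtract_images(flashed, ambient))  # [[0, 120, 120, 0]]
```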

A strobe light source having an output planar beam forming means, such as a knife-edge aperture, may be employed as a structured beam projector. The short duration of a typical strobe flash implies low duty cycle and hence an increased energy efficiency. If a xenon strobe source is employed it is desirable to include an infrared pass filter at the strobe output to reduce annoyance to humans sharing the maneuvering space with the robot.

In accordance with one embodiment of the invention there is illustrated in FIGS. 2a and 2b a beam projector, such as the beam projector 14 of FIG. 1a, which comprises an elongated, substantially cylindrical xenon flash tube 28 which is interposed between a circular cylindrical reflector 28a and an aspheric cylindrical lens 28b. Lens 28b may have a focal length of approximately 0.5 inches and operates to focus both the direct and reflected output of flash tube 28 in front of a second aspheric cylindrical lens 28c. The flashtube 28 preferably is positioned at the focal point of cylindrical reflector 28a so that direct and reflected light rays are co-aligned on entering lens 28b. The mirror reflector 28a thus increases the energy efficiency of the structured light system by gathering light emitted from the back of the flash tube and sending it back in the same direction as light emitted directly from the front of the tube. Lenses 28b and 28c may be Fresnel lenses, which are preferred to solid glass or plastic lenses in that they are lighter, thinner, and can accommodate shorter focal lengths without spherical aberration. Shorter focal lengths are preferred because they collect light from a wider angle, so less radiant energy is lost. Cylindrical lens 28c may also have a focal length of approximately 0.5 inches and operates to collimate the radiation and to provide a planar radiation beam output. As was previously stated, a pass band filter 28d may be provided for filtering out substantially all wavelengths except those in a desired range, such as a range of 700 to 1000 nm.

As shown in FIG. 2c lenses 28b and 28c may be replaced by slit apertures 28e and 28f which collimate emitted light from flash tube 28. This arrangement is more wasteful of energy, but is simpler in design and less costly than the provision of lenses to collimate the radiation.

In general, it has been determined that the width of the projected planar beam, or radiation stripe pattern, is preferably broad enough to span the path in front of the robot, but simple enough to afford unambiguous interpretation. Thus, a single radiation stripe is preferred for a single image capture, although several stripes may be flashed in succession. For example, two horizontal radiation stripes projected alternately and viewed in consecutive images, which project at approximately ankle level and chair seat level, have been found to be useful for indoor navigation to detect low and medium height obstacles within the environment. If there are no obstacles at these levels to reflect the radiation stripes the image viewed by the camera 12 is substantially blank. Thus a very simple "no image" condition can be readily detected without significant signal processing, allowing the robot to proceed at top speed.
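The "no image" condition described above amounts to asking whether any pixel in the frame exceeds a brightness threshold. A minimal sketch, with an illustrative threshold and toy pixel values (not values from the patent):

```python
def stripe_present(image, threshold=128):
    """Cheap "no image" test: the reflected stripe is considered
    present if any pixel exceeds the threshold, absent otherwise."""
    return any(p > threshold for row in image for p in row)

clear_path = [[10, 12, 9], [11, 10, 12]]
obstacle   = [[10, 12, 9], [11, 200, 12]]
print(stripe_present(clear_path))  # False -> proceed at top speed
print(stripe_present(obstacle))    # True  -> analyze the stripe image
```

Because `any` short-circuits on the first bright pixel, the common blank-frame case costs only a single pass over the image with no further processing.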

In presently preferred embodiments of the invention the structured beam projectors 14 and 16 and the camera 12 are mounted rigidly on the body of the robot 10 such that triangulation geometry processing which relates pixel position to an environmental position remains fixed in time. However, it is also possible to employ a movable camera and/or movable beam projectors whose relative positions and orientations may be varied. In this case, more complex imaging processing is required to account for the changes in position. It is also within the scope of the invention to provide for only one beam projector.

In accordance with one aspect of the invention relatively nearby objects within a range of 2-10 feet are illuminated with a structured radiation pattern, preferably a stripe of radiation. The image of the structured radiation reflecting to an image sensor, such as the CCD camera 12, is analyzed to determine the range, bearing and elevation geometry of objects relative to the robot 10 and the plane of the floor 26. The structure and pattern of light preferably provides azimuth coverage of approximately 90 degrees, leaving no gaps. With the span of the structured pattern being about 90 degrees the peripheral illuminance is preferably at least 50% of central illuminance. Illuminance fluctuations along the pattern boundary are generally tolerable to magnitudes of 25%, insofar as they may be compensated for by an intensity value lookup table. The cross section of the beam profile is preferably sharp enough such that there is a drop from substantially full illumination to substantially no illumination within a distance of approximately two inches on a perpendicularly illuminated surface at a distance of ten feet. This change in illumination decreases proportionally for closer surfaces, to one half inch at 2.5 feet. The thickness of the projected radiation beam at ten feet is preferably approximately four inches if perfectly collimated. If divergent, the angle of divergence should be less than approximately two degrees.

Inasmuch as the robot 10 typically operates in public areas it is desirable to minimize the visibility of the light to humans. Furthermore, since a silicon diode CCD camera 12 is presently preferred another consideration is the efficient use of the sensitivity of the camera 12. A wavelength range of 700 to 1000 nanometers achieves both of these goals. A filter on the source and a like filter on the camera maximizes signal to noise ratio over ambient light. If the beam projectors 14 and 16 have a sufficiently narrow output spectrum substantially within the range of 700-1000 nanometers, a filter is only required on the camera 12, the filter being matched to the spectrum of the source.

Preferably the measured brightness at the CCD camera 12 of the illuminated region, at a range of 2-10 feet and through a filter is two to five times greater than bright ambient light (corresponding to a brightly lit work area) from an incandescent light source, such as a 100 watt light bulb positioned five feet from a surface.

A maximum useful duration of a pulse of output radiation is 33 milliseconds for a typical CCD camera image acquisition. Durations as short as one millisecond may be employed if the camera 12 comprises an electronic shutter.

A pulse repetition rate of the beam projectors is preferably at least two flashes per second, and may be as high as 10 per second or more when gathering detailed information on nearby objects. At higher repetition rates, lower power flashes may be employed because of the shorter range to the object. Full power is generally required at repetition rates of four per second and slower. As was previously stated, control of the flash rate of the projectors 14 and 16 is preferably accomplished by the image processor 18.

In accordance with the invention, image processing performed by the image processor 18 and the navigation control processor 20 generally involves the following steps or operations:

(a) rapidly locating the images of the light stripes within the camera image;

(b) inferring the range and bearing of objects from the located stripe images;

(c) storing a geometric map representation of these object positions; and

(d) accessing and processing the map information with navigation algorithms and generating control signals which result in avoidance of obstacles or navigation to reference landmarks.

The first step (a) includes an image processing step of reducing the typically grey scale camera image to a binary image. If the structured beam is sufficiently bright to overpower ambient illumination, the image intensity may simply be thresholded. Generally, however, the structured beam is not sufficiently bright to overcome all ambient radiation, in which case the aforementioned image subtraction technique may be employed. Reducing the grey scale image to a binary image reduces the subsequent search for the structured radiation stripe within the image of the FOV to a less complex present/absent detection technique.
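The subtraction and thresholding operation may be sketched as follows. This is an illustrative Python sketch only; the list-of-lists frame representation, the function name, and the threshold value are not taken from the described embodiment, which captures one frame with the projector flashed on and one with it off so that the difference cancels ambient illumination.

```python
def to_binary(frame_on, frame_off, threshold):
    """Reduce a pair of grey scale frames to a binary image.

    frame_on:  grey levels captured with the structured beam flashed on
    frame_off: grey levels captured with the beam off (ambient only)
    Subtracting the two frames cancels ambient illumination; pixels
    whose difference exceeds the threshold are marked present (1).
    """
    return [
        [1 if (on - off) > threshold else 0
         for on, off in zip(row_on, row_off)]
        for row_on, row_off in zip(frame_on, frame_off)
    ]
```

The resulting binary image permits the stripe search that follows to be a simple present/absent test at each examined pixel.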

The first step (a) includes another image processing step which employs a search algorithm that successively subdivides the size of the steps or increments taken through the image during the search. That is, the algorithm scans the image rapidly at a coarse resolution and then searches at a finer resolution when a pixel above threshold is detected. This is one form of a binary search.
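A two-level simplification of this coarse-to-fine search may be sketched as follows; the function name, the `coarse_step` default, and the choice to refine only the neighbourhood surrounding each coarse detection (rather than successively halving the step, as the full binary search would) are illustrative assumptions.

```python
def find_stripe_pixels(binary, coarse_step=16):
    """Scan a binary image at coarse resolution, refining near detections.

    Pixels are sampled every `coarse_step` rows and columns; when a set
    pixel is found, the neighbourhood skipped by the coarse scan is
    re-examined at full resolution.
    """
    hits = set()
    height, width = len(binary), len(binary[0])
    for y in range(0, height, coarse_step):
        for x in range(0, width, coarse_step):
            if binary[y][x]:
                # Refine: full-resolution scan of the skipped neighbourhood.
                for fy in range(max(y - coarse_step + 1, 0),
                                min(y + coarse_step, height)):
                    for fx in range(max(x - coarse_step + 1, 0),
                                    min(x + coarse_step, width)):
                        if binary[fy][fx]:
                            hits.add((fy, fx))
    return sorted(hits)
```

Because only every nth pixel is read during the coarse pass, the expensive per-pixel memory accesses noted later in the text are confined to regions where the stripe actually appears.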

Step (b) above, inferring position within the environment from image position, exploits the fixed mounting of the camera 12 and projectors 14 and 16. Illumination of a particular pixel within the image for a particular projector output implies a unique position within the environment of an object reflecting the structured radiation. Each of these unique pixel related positions may be precalculated and stored in the lookup tables 18F and 18G to enhance real-time operation or each position may be calculated as an illuminated pixel is detected. One preferred method of calculating range and bearing associated with each pixel will be described in detail below.

Step (c) involves consolidating the individual determined range and bearing measurements into a geometric representation of the environment, which includes motion of the robot relative to the environment. One technique which may be employed is to represent the floor 26 as a two dimensional grid, to mark or designate grid cells which are occupied by detected objects, and to assign a degree of confidence in the visual measurement based on the persistence of detected objects at a fixed position within a grid cell or cells. FIG. 6 illustrates such a map wherein the robot is moving in the direction of the v axis. The map, which is stored in the navigation control processor 20 of the robot 10, is divided into cells which might typically be as small as one inch or as large as one foot on each side. When analysis of the image indicates the detection of an object at a particular (u,v) position, a confidence level C(u,v) is assigned to that position. This confidence level is increased as successive observations continue to detect a presence of the object at the same position. The confidence level ranges in value from 0.0 to 1.0. FIG. 6 illustrates that an object has been detected and confidence levels assigned for occupied cells as follows:

C(3,4) = .2, C(3,5) = .5, C(3,6) = .3,
C(4,5) = .8, C(3,7) = .3, C(4,6) = .8,
C(5,5) = .7.

Another geometric representation may be derived by considering contiguous detections as a single object, and defining the position and radius of an enclosing circle as object parameters for purposes of navigation. The circle in FIG. 6 illustrates this representation, the circle having parameters defined as a Center (3,5) and a Radius equal to two grid cells.
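The grid map with confidence levels, together with the enclosing-circle representation, may be sketched as follows. The per-observation confidence increment and the circle-fitting rule (centroid of occupied cells, radius to the farthest cell) are illustrative assumptions; the text specifies only that confidence grows with persistence and that contiguous detections may be enclosed by a circle.

```python
import math

class GridMap:
    """Two-dimensional floor map of detected objects (cf. FIG. 6).

    Each (u, v) cell holds a confidence level between 0.0 and 1.0 which
    is raised on each successive observation of an object at that cell.
    """

    def __init__(self, gain=0.2):
        self.confidence = {}   # (u, v) -> confidence level, 0.0 .. 1.0
        self.gain = gain       # increment per confirming observation

    def observe(self, u, v):
        c = self.confidence.get((u, v), 0.0)
        self.confidence[(u, v)] = min(1.0, c + self.gain)

    def enclosing_circle(self):
        """Treat all occupied cells as one object and return the center
        and radius of a circle enclosing them, for navigation purposes."""
        cells = list(self.confidence)
        if not cells:
            return None
        cu = sum(u for u, v in cells) / len(cells)
        cv = sum(v for u, v in cells) / len(cells)
        r = max(math.hypot(u - cu, v - cv) for u, v in cells)
        return (cu, cv), r
```

A navigation algorithm in step (d) would then treat the returned circle as an obstacle boundary when planning a path through free space.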

The optimum choice of coordinates for representing the map depends in part on the manner in which the map is to be used. Initial inference of object position from structured light vision in step (c) above yields polar coordinates. Other sensors, such as sonar, also yield polar coordinates, R and Theta. It may be advantageous to combine such multi-sensory data in the same polar coordinate representation to generate confidence levels, prior to converting to x-y coordinates. Cartesian (x, y) coordinates are computationally advantageous for representing motion of the robot, which can be computed by vector addition without altering the x-y relations between objects in the map.

Whatever coordinate system is chosen for the map, two dimensions of position are derivable for objects using structured light. There is also a third dimension, elevation, which is available implicitly from the elevation of the light plane which intersects the object. This may be useful in discriminating tall objects from short ones. However, since the physical envelope of the robot is substantially vertical, an object at any elevation is normally considered an obstruction to robot motion. Thus a two-dimensional map is generally sufficient for navigation purposes.

Step (d) above involves directing the robot 10 along a path which avoids obstacles or which corresponds in a prescribed reference frame to visually measured objects. A variety of well known path planning techniques can be used. For example, if there is a prescribed goal path which is obstructed by an obstacle one strategy is to find an alternate path through free space which is the shortest path between the present position and a desired, goal position.

Referring now to FIGS. 3a and 3b it can be seen that obstacle avoidance and/or reference surface recognition relies on structured light projection and detection. The reflected structured light planes are superimposed upon the horizontal pixel planes of the camera. As an object approaches the robot, it is first seen at the top of the field of view (FOV). As it moves closer to the robot, it moves down in the camera 12 FOV. Each pixel in the FOV corresponds to a range (R) and a bearing angle (Theta) from the robot 10 to the object.

Preferably, each R and Theta are pre-computed off-line and stored in a read only memory (ROM) which is permanently installed in the robot and which is accessed by microcomputer 18D. Alternatively, when the robot 10 is first energized, a lookup table is compiled by image processor 18 from equations that determine R and Theta for each individual camera pixel relative to the FOV of the environment. During operation the object detection algorithm searches the image of the FOV of the camera 12 for reflections from objects. The R and Theta of any pixels that are bright enough to exceed a predetermined threshold value are detected as locations of objects and stored in a data structure which defines a map of the robot's environment, such as that depicted in FIG. 6.

In a presently preferred embodiment of the invention the image is comprised of 512×480 pixels, resulting in approximately 250,000 total pixels. By example, it may require the microcomputer 18D approximately 33 microseconds to read the value of one pixel from the video memory 18A. To read all of the pixels would require in excess of eight seconds. Thus, in order to provide for operation in a real-time manner not all pixels within each image are searched, but only every nth pixel is searched for a pixel exceeding a threshold which may indicate a possible object. As was previously stated a coarse resolution search is preferably performed in the upper region of the FOV, wherein objects appear while they are still far off, and a finer resolution search is performed in the lower region of the FOV where objects are seen nearer to the robot 10.

A cylindrical coordinate system can be employed for plotting the position of objects with respect to the robot as illustrated in FIG. 3b, the origin of the coordinate system being the center of the robot. R is the distance or range from the center of the robot to the object and Theta is the angle to the object. A Theta of zero degrees corresponds to an axis which is normal to the front of the robot and which passes through the center of the robot. An intermediate X, Y cartesian coordinate system is used in the calculations for obtaining R and Theta. The origin of this intermediate coordinate system is a point on the floor directly under the camera sensor; Y is a vertical axis and X is a horizontal axis which extends straight out in front of the robot.

The first step in the analysis is to determine X, Y coordinates of the points where the centers of the light planes and multiple horizontal pixel planes intersect in the x-y plane. This can be seen in FIG. 3a as dots such as A and B along the lines labelled 14a and 16a which represent the upper and lower light planes, respectively.

These intersection points can be found by determining the equations for both lines of interest and solving them simultaneously. The basic equation for a line is y=m*x+b where * denotes multiplication.

It is known that the height from the floor of the lower projector 16 is Hl. The height of the upper projector 14 is Hu. The camera height is Hc and the slopes for the individual horizontal pixel planes are denoted by CAMS.

The equation for the lower projector 16 is:

y=Hl (1)

The equation for the camera 12 is

y=CAMS * x+Hc. (2)

Solving these equations simultaneously yields:

x=(-Hl+Hc)/(-CAMS), and (3)

y=CAMS * x+Hc. (4)

The above equations are solved for each value of CAMS, that is, the slope of each individual horizontal pixel plane of the image. Initially the slope of the center-line of the camera 12 (pixel 239.5) may be first determined, then the slopes of pixels 0 to 239, and 240 to 479 are found.

The slope of the center line of the camera 12 is

slope=-Hc/CAMD, (5)

where CAMD is a distance along the x-axis to a point where the center of the camera images the floor 26. The angle PHI of each pixel ray is

PHI=atan(-Hc/CAMD)+/-(atan(i/240 * 3.3/8.0)), (6)

where i varies from 1 to 240 and is the number of pixels from the center of the image. The term 3.3 is one half the sensor height in millimeters and the term 8.0 is the focal length of the camera lens in millimeters. Of course these terms, in addition to the number of the horizontal and vertical pixels, are specific to a particular camera and may have different values if another camera is employed.

The slope for each pixel plane is given by

CAMS=tan(PHI). (7)

Once the x, y coordinate of the intersection point is known, the hypotenuse (h) from each x, y point to the camera sensor is found using the Pythagorean Theorem, where

h=sqrt (x**2+(Hc-y)**2). (8)

where ** denotes exponentiation.

The distance h from the camera 12 to the obstacle and also the distance x along the floor from the robot to the obstacle is now known for an object directly in front of the robot 10 at a Theta equal to zero degrees.

Next the intersection points of the camera 12 and the structured beams are found for objects that are other than directly in front of the robot 10. The lateral distance from a centerline, where the pixel and light plane intersect, to points disposed to the left and the right is denoted by nx(i), where i is the number of pixels offset from the center. The slope to each intersection, as seen in the top view in FIG. 3b, is given by:

slope=((i/256) * 4.4/8.0), (9)

where i is the number of pixels from the center line, (up to 256), 4.4 is one half of the horizontal sensor dimension in millimeters, and 8.0 is the focal length. As before, the constant 4.4 is camera specific.

The slope to each pixel can also be represented as nx(i)/h, therefore:

nx(i)=((i/256) * 4.4/8.0) * h. (10)

R and Theta for each pixel in the FOV can thereafter be determined in accordance with the equations:

Theta=atan(nx(i)/(x+offset)), and (11)

R=nx(i)/sin(Theta), (12)

where offset is the distance along the x-axis from the camera image sensor plane to the center of the robot. As was previously stated, R and Theta for each pixel may be computed and stored in a lookup table prior to operation.
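Equations (1) through (12) may be combined into the lookup table construction described above. The following is an illustrative sketch only: the parameter values in the example, the signed treatment of the +/- term in equation (6), and the use of a hypotenuse form in place of equation (12) (which avoids division by zero at a Theta of zero degrees but is term-by-term equivalent, since sin(Theta)=nx/hypot) are implementation choices not specified in the text. Hc is assumed greater than Hl, as for the lower projector 16; the same construction applies to the upper projector 14 with Hu in place of Hl.

```python
import math

def build_range_bearing_table(Hl, Hc, CAMD, offset,
                              rows=480, cols=512,
                              half_height_mm=3.3, half_width_mm=4.4,
                              focal_mm=8.0):
    """Precompute (R, Theta) for each camera pixel per equations (1)-(12).

    Hl:     height of the projected light plane above the floor
    Hc:     camera height; CAMD: floor distance imaged by the camera axis
    offset: distance from the camera sensor plane to the robot center
    The sensor and lens constants default to the values given in the text.
    """
    table = {}
    center_row = (rows - 1) / 2.0            # pixel 239.5 in a 480-row image
    for row in range(rows):
        i = row - center_row                 # pixels from the vertical center
        # Equation (6): ray angle of this horizontal pixel plane.
        phi = math.atan(-Hc / CAMD) + math.atan(
            (i / (rows / 2.0)) * half_height_mm / focal_mm)
        cams = math.tan(phi)                 # equation (7)
        if cams >= 0:
            continue                         # ray never meets the light plane
        x = (-Hl + Hc) / (-cams)             # equation (3)
        y = Hl                               # equation (1): point lies in the plane
        h = math.sqrt(x**2 + (Hc - y)**2)    # equation (8)
        for col in range(cols):
            j = col - (cols - 1) / 2.0       # pixels from the vertical center line
            nx = (j / (cols / 2.0)) * half_width_mm / focal_mm * h  # eq. (10)
            theta = math.atan2(nx, x + offset)                      # eq. (11)
            R = math.hypot(nx, x + offset)   # eq. (12), hypotenuse form
            table[(row, col)] = (R, theta)
    return table
```

In operation such a table would be compiled once, or burned into ROM, so that detection of a bright pixel at (row, col) yields R and Theta by a single lookup.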

Referring now to FIGS. 4 and 5 there is shown another embodiment of a robot having a structured light visual navigation and obstacle avoidance system in accordance with the invention. Robot 30 has a plurality of structured light projectors including an upper projector 32, a lower projector 34, and a camera 36 which is disposed between the upper and lower projectors. Robot 30 further comprises a pair of structured light projectors 38 and 40 which are disposed on opposite sides of a camera 36 and in an elevated position therefrom. Projectors 38 and 40 provide a planar beam pattern which is projected orthogonally to the horizontally projected beam from projector 34. The planar beam pattern from upper projector 32 is projected obliquely downwards such that it intersects the floor 42 at a position in front of the robot 30. Other internal components of the robot 30 are as shown in FIG. 1. That is, the robot 30 comprises an image processor, a navigation control processor and a drive and steering controller. Drive and steering wheels 44 are provided for moving the robot over the floor 42.

The structured light planes 38a and 40a shown in FIGS. 4 and 5 are projected forward to intersect any objects in the two vertical planes bounding the robot's forward path through the environment. As seen in the illustrative field of view of FIG. 7, the vertical lines 38b and 40b indicate the loci of successive intersections of light planes with vertical objects at successive ranges, as seen from the camera. Thus range, bearing and elevation can be measured from pixel position using algorithms exactly analogous to those discussed previously with regard to horizontal planes of structured light, as is immediately obvious to those versed in the art.

It can also be seen in FIG. 8 that the camera view of the oblique structured light plane 32a of FIGS. 4 and 5 is reflected from the floor substantially uniformly (32b) and horizontally when there is no obstruction or other feature closely adjacent to the floor. The image stripe remains at a fixed position on the screen regardless of robot motion so long as the floor 42 is flat. This uniformity is broken by a depression, such as a hole within the floor, or by an obstacle closely adjacent to the floor.

The depression generates an image with a break in the stripe 32b having a bright portion 32c disposed below the break. An obstacle lying on the floor yields a break in the stripe having a bright portion 32d disposed above the break. Clearly, the magnitude of displacement of the bright portions 32c and 32d above and below the stripe 32b is a measure of range and elevation, and the position of the break is a measure of bearing, using algorithms exactly analogous to those discussed previously with regard to horizontal planes of structured light, as is also immediately obvious to those versed in the art.
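The classification of stripe breaks may be sketched as follows; the function name, the tolerance parameter, and the use of the topmost lit pixel in each column are illustrative assumptions. Since image row numbers grow downward, a bright portion displaced above the nominal stripe row (a smaller row number) indicates an obstacle, and one displaced below it indicates a depression.

```python
def classify_stripe_breaks(binary, nominal_row, tol=1):
    """Classify each column of an oblique-stripe image (cf. FIG. 8).

    binary:      binary image containing the reflected stripe 32b
    nominal_row: row where the stripe falls on a flat, unobstructed floor
    Returns, per column, a label and the displacement of the bright
    portion from the nominal row: the displacement is a measure of range
    and elevation, the column index a measure of bearing.
    """
    results = {}
    for col in range(len(binary[0])):
        lit = [row for row in range(len(binary)) if binary[row][col]]
        if not lit:
            continue                       # break with no bright portion
        row = lit[0]                       # topmost lit pixel in the column
        if abs(row - nominal_row) <= tol:
            results[col] = ("floor", 0)
        elif row < nominal_row:            # displaced above: obstacle
            results[col] = ("obstacle", nominal_row - row)
        else:                              # displaced below: depression
            results[col] = ("depression", row - nominal_row)
    return results
```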

When multiple planes of structured light are used, as illustrated in FIGS. 4 and 5, their timing should be desynchronized so that there is no ambiguity in interpretation of which beam is to be associated with any particular pixel location. Furthermore, a separate lookup table may be associated with each structured light source. These lookup tables such as 18F and 18G, are most conveniently stored in preprogrammed ROM's (read-only-memories).

For the embodiment of FIGS. 4 and 5 the determination of R and Theta as a function of pixel position is accomplished in a manner substantially identical to that disclosed above in reference to the robot of FIG. 1; it being realized that suitable adjustments are made for the height and position of the camera having a horizontal, forward looking FOV and for the slope and relative position of the upper beam projector 32.

It should be realized that the invention may be practiced with a variety of types of planar light projectors and with a variety of types of image sensors or cameras other than those disclosed above. For example, the invention may be practiced with a structured beam of visible light which is received by a vidicon camera. Furthermore, the exact nature of the image processing algorithm may be modified while still achieving a substantially identical result. Thus, it should be further realized that those having ordinary skill in the art may derive a number of modifications to the embodiments of the invention disclosed above. The invention is therefore not to be construed to be limited only to these disclosed embodiments but it is instead intended to be limited only as defined by the breadth and scope of the appended claims.

Evans, Jr., John M., Weiman, Carl F. R., King, Steven J.

7916898, Sep 15 2003 Deere & Company Method and system for identifying an edge of a crop
7983817, Jun 07 1995 AMERICAN VEHICULAR SCIENCES LLC Method and arrangement for obtaining information about vehicle occupants
8086419, Dec 17 2002 iRobot Corporation Systems and methods for adding landmarks for visual simultaneous localization and mapping
8095336, Dec 17 2002 iRobot Corporation Systems and methods for determining whether to add a landmark for visual simultaneous localization and mapping
8136404, Sep 19 2008 Denso Corporation Obstacle detection system
8150650, Dec 17 2002 iRobot Corporation Systems and methods for filtering potentially unreliable visual data for visual simultaneous localization and mapping
8185344, May 28 2008 Troxler Electronic Laboratories, Inc. Method of determining a dimension of a sample of a construction material and associated apparatus
8219274, Jul 02 2009 Robert Bosch GmbH 3-dimensional perception system and method for mobile platform
8229228, Sep 16 2008 Robert Bosch GmbH Image analysis using a pre-calibrated pattern of radiation
8239084, Sep 11 2006 Hitachi, LTD Moving device
8239992, May 09 2007 iRobot Corporation Compact autonomous coverage robot
8253368, Jan 28 2004 iRobot Corporation Debris sensor for cleaning apparatus
8266754, Feb 21 2006 iRobot Corporation Autonomous surface cleaning robot for wet and dry cleaning
8266760, Feb 18 2005 iRobot Corporation Autonomous surface cleaning robot for dry cleaning
8271129, Dec 02 2005 iRobot Corporation Robot system
8274406, Dec 17 2002 iRobot Corporation Systems and methods for using multiple hypotheses in a visual simultaneous localization and mapping system
8275482, Jan 24 2000 iRobot Corporation Obstacle following sensor scheme for a mobile robot
8359703, Dec 02 2005 iRobot Corporation Coverage robot mobility
8368339, Jan 24 2001 iRobot Corporation Robot confinement
8374721, Dec 02 2005 iRobot Corporation Robot system
8378613, Jan 28 2004 iRobot Corporation Debris sensor for cleaning apparatus
8380350, Dec 02 2005 iRobot Corporation Autonomous coverage robot navigation system
8382906, Feb 18 2005 iRobot Corporation Autonomous surface cleaning robot for wet cleaning
8386081, Sep 13 2002 iRobot Corporation Navigational control system for a robotic device
8387193, Feb 21 2006 iRobot Corporation Autonomous surface cleaning robot for wet and dry cleaning
8390251, Jan 21 2004 iRobot Corporation Autonomous robot auto-docking and energy management systems and methods
8392021, Feb 18 2005 iRobot Corporation Autonomous surface cleaning robot for wet cleaning
8396592, Jun 12 2001 iRobot Corporation Method and system for multi-mode coverage for an autonomous robot
8406923, Mar 12 2008 Denso Wave Incorporated Apparatus for determining pickup pose of robot arm with camera
8412377, Jan 24 2000 iRobot Corporation Obstacle following sensor scheme for a mobile robot
8417383, May 31 2006 iRobot Corporation Detecting robot stasis
8418303, May 19 2006 iRobot Corporation Cleaning robot roller processing
8422019, Mar 05 2010 NEC Corporation; Toyota Jidosha Kabushiki Kaisha Light measuring apparatus and light measuring method
8428778, Sep 13 2002 iRobot Corporation Navigational control system for a robotic device
8438695, May 09 2007 iRobot Corporation Autonomous coverage robot sensing
8456125, Jan 28 2004 iRobot Corporation Debris sensor for cleaning apparatus
8461803, Jan 21 2004 iRobot Corporation Autonomous robot auto-docking and energy management systems and methods
8463438, Jun 12 2001 iRobot Corporation Method and system for multi-mode coverage for an autonomous robot
8474090, Jan 03 2002 iRobot Corporation Autonomous floor-cleaning robot
8476861, Jan 28 2004 iRobot Corporation Debris sensor for cleaning apparatus
8478442, Jan 24 2000 iRobot Corporation Obstacle following sensor scheme for a mobile robot
8508388, Dec 17 2002 iRobot Corporation Systems and methods for using multiple hypotheses in a visual simultaneous localization and mapping system
8515578, Sep 13 2002 iRobot Corporation Navigational control system for a robotic device
8516651, Jan 03 2002 iRobot Corporation Autonomous floor-cleaning robot
8527141, Dec 25 2008 Toyota Jidosha Kabushiki Kaisha Sensor calibration device, and sensor calibration method
8528157, May 19 2006 iRobot Corporation Coverage robots and associated cleaning bins
8565920, Jan 24 2000 iRobot Corporation Obstacle following sensor scheme for a mobile robot
8572799, May 19 2006 iRobot Corporation Removing debris from cleaning robots
8584305, Dec 02 2005 iRobot Corporation Modular robot
8594840, Jul 07 2004 iRobot Corporation Celestial navigation system for an autonomous robot
8598829, Jan 28 2004 iRobot Corporation Debris sensor for cleaning apparatus
8600553, Dec 02 2005 iRobot Corporation Coverage robot mobility
8634956, Jul 07 2004 iRobot Corporation Celestial navigation system for an autonomous robot
8634958, Jul 07 2004 iRobot Corporation Celestial navigation system for an autonomous robot
8661605, Dec 02 2005 iRobot Corporation Coverage robot mobility
8670866, Feb 18 2005 iRobot Corporation Autonomous surface cleaning robot for wet and dry cleaning
8686679, Jan 24 2001 iRobot Corporation Robot confinement
8726454, May 09 2007 iRobot Corporation Autonomous coverage robot
8739355, Feb 18 2005 iRobot Corporation Autonomous surface cleaning robot for dry cleaning
8749196, Jan 21 2004 iRobot Corporation Autonomous robot auto-docking and energy management systems and methods
8761931, Dec 02 2005 iRobot Corporation Robot system
8761935, Jan 24 2000 iRobot Corporation Obstacle following sensor scheme for a mobile robot
8774966, Feb 18 2005 iRobot Corporation Autonomous surface cleaning robot for wet and dry cleaning
8781626, Sep 13 2002 iRobot Corporation Navigational control system for a robotic device
8782848, Feb 18 2005 iRobot Corporation Autonomous surface cleaning robot for dry cleaning
8788092, Jan 24 2000 iRobot Corporation Obstacle following sensor scheme for a mobile robot
8793020, Sep 13 2002 iRobot Corporation Navigational control system for a robotic device
8798840, Sep 30 2012 iRobot Corporation Adaptive mapping with spatial summaries of sensor data
8800107, Feb 16 2010 iRobot Corporation Vacuum brush
8818042, Apr 15 2004 MAGNA ELECTRONICS INC Driver assistance system for vehicle
8830091, Dec 17 2002 iRobot Corporation Systems and methods for using multiple hypotheses in a visual simultaneous localization and mapping system
8839477, May 09 2007 iRobot Corporation Compact autonomous coverage robot
8842176, May 22 1996 Donnelly Corporation Automatic vehicle exterior light control
8854001, Jan 21 2004 iRobot Corporation Autonomous robot auto-docking and energy management systems and methods
8874264, Mar 31 2009 iRobot Corporation Celestial navigation system for an autonomous robot
8886385, Nov 20 2009 Murata Machinery, Ltd Autonomous mobile body and control method of same
8917169, Jun 07 1995 MAGNA ELECTRONICS INC Vehicular vision system
8930023, Nov 06 2009 iRobot Corporation Localization by learning of wave-signal distributions
8948442, Aug 16 2000 Intelligent Technologies International, Inc. Optical monitoring of vehicle interiors
8948956, Nov 20 2009 Murata Machinery, Ltd Autonomous mobile body and control method of same
8950038, Dec 02 2005 iRobot Corporation Modular robot
8954192, Dec 02 2005 iRobot Corporation Navigating autonomous coverage robots
8958911, Feb 29 2012 AVA ROBOTICS, INC Mobile robot
8966707, Feb 18 2005 iRobot Corporation Autonomous surface cleaning robot for dry cleaning
8972052, Jul 07 2004 iRobot Corporation Celestial navigation system for an autonomous vehicle
8977008, Sep 30 2004 Donnelly Corporation Driver assistance system for vehicle
8978196, Dec 02 2005 iRobot Corporation Coverage robot mobility
8985127, Feb 18 2005 iRobot Corporation Autonomous surface cleaning robot for wet cleaning
8993951, Mar 25 1996 MAGNA ELECTRONICS INC.; MAGNA ELECTRONICS INC Driver assistance system for a vehicle
9002511, Oct 21 2005 iRobot Corporation Methods and systems for obstacle detection using structured light
9008369, Apr 15 2004 MAGNA ELECTRONICS INC Vision system for vehicle
9008835, Jun 24 2004 iRobot Corporation Remote control scheduler and method for autonomous robotic device
9020637, Nov 02 2012 iRobot Corporation Simultaneous localization and mapping for a mobile robot
9037396, May 23 2013 iRobot Corporation Simultaneous localization and mapping for a mobile robot
9038233, Jan 03 2002 iRobot Corporation Autonomous floor-cleaning robot
9092458, Mar 08 2005 iRobot Corporation System and method for managing search results including graphics
9104204, Jun 12 2001 iRobot Corporation Method and system for multi-mode coverage for an autonomous robot
9110470, Dec 17 2002 iRobot Corporation Systems and methods for using multiple hypotheses in a visual simultaneous localization and mapping system
9128486, Sep 13 2002 iRobot Corporation Navigational control system for a robotic device
9144360, Dec 02 2005 iRobot Corporation Autonomous coverage robot navigation system
9144361, Jan 28 2004 iRobot Corporation Debris sensor for cleaning apparatus
9144902, May 15 2012 KUKA Roboter GmbH Method for determining possible positions of a robot arm
9149170, Dec 02 2005 iRobot Corporation Navigating autonomous coverage robots
9167946, Jan 03 2002 iRobot Corporation Autonomous floor cleaning robot
9171217, May 03 2002 MAGNA ELECTRONICS INC. Vision system for vehicle
9182763, Nov 09 2007 Samsung Electronics Co., Ltd. Apparatus and method for generating three-dimensional map using structured light
9191634, Apr 15 2004 MAGNA ELECTRONICS INC. Vision system for vehicle
9215957, Jan 21 2004 iRobot Corporation Autonomous robot auto-docking and energy management systems and methods
9218003, Sep 30 2012 iRobot Corporation Adaptive mapping with spatial summaries of sensor data
9223749, Jul 07 2004 iRobot Corporation Celestial navigation system for an autonomous vehicle
9224054, Dec 21 2009 INDIAN INSTITUTE OF SCIENCE Machine vision based obstacle avoidance system
9229454, Jul 07 2004 iRobot Corporation Autonomous mobile robot system
9273951, Jun 06 2011 Troxler Electronic Laboratories, Inc Optical method and apparatus for determining a characteristic such as volume and density of an excavated void in a construction material
9286810, Sep 24 2010 iRobot Corporation Systems and methods for VSLAM optimization
9317038, May 31 2006 iRobot Corporation Detecting robot stasis
9320398, Dec 02 2005 iRobot Corporation Autonomous coverage robots
9329598, May 23 2013 iRobot Corporation Simultaneous localization and mapping for a mobile robot
9392920, Dec 02 2005 iRobot Corporation Robot system
9400501, Nov 02 2012 iRobot Corporation Simultaneous localization and mapping for a mobile robot
9404756, Sep 30 2011 iRobot Corporation Adaptive mapping with spatial summaries of sensor data
9428192, Apr 15 2004 MAGNA ELECTRONICS INC. Vision system for vehicle
9436880, Aug 12 1999 MAGNA ELECTRONICS INC Vehicle vision system
9440535, Aug 11 2006 MAGNA ELECTRONICS INC Vision system for vehicle
9445702, Feb 18 2005 iRobot Corporation Autonomous surface cleaning robot for wet and dry cleaning
9446521, Jan 24 2000 iRobot Corporation Obstacle following sensor scheme for a mobile robot
9480381, May 09 2007 iRobot Corporation Compact autonomous coverage robot
9486924, Jun 24 2004 iRobot Corporation Remote control scheduler and method for autonomous robotic device
9492048, May 19 2006 iRobot Corporation Removing debris from cleaning robots
9511494, Jun 18 2012 LG Electronics Inc. Robot cleaner and controlling method of the same
9555803, May 03 2002 MAGNA ELECTRONICS INC. Driver assistance system for vehicle
9582005, Jan 24 2001 iRobot Corporation Robot confinement
9587938, Jun 17 2003 Troxler Electronic Laboratories, Inc.; Troxler Electronic Laboratories, Inc Method and apparatus for determining a characteristic of a construction material
9599990, Dec 02 2005 iRobot Corporation Robot system
9609289, Apr 15 2004 MAGNA ELECTRONICS INC. Vision system for vehicle
9622635, Jan 03 2002 iRobot Corporation Autonomous floor-cleaning robot
9632505, Oct 21 2005 iRobot Corporation Methods and systems for obstacle detection using structured light
9643605, May 03 2002 MAGNA ELECTRONICS INC. Vision system for vehicle
9736435, Apr 15 2004 MAGNA ELECTRONICS INC. Vision system for vehicle
9786187, Jun 09 2015 Amazon Technologies, Inc Transportation network utilizing autonomous vehicles for transporting items
9834216, May 03 2002 MAGNA ELECTRONICS INC. Vehicular control system using cameras and radar sensor
9886037, Dec 17 2002 iRobot Corporation Systems and methods for using multiple hypotheses in a visual simultaneous localization and mapping system
9910444, Sep 24 2010 iRobot Corporation Systems and methods for VSLAM optimization
9928474, Dec 12 2014 Amazon Technologies, Inc Mobile base utilizing transportation units for delivering items
9948904, Apr 15 2004 MAGNA ELECTRONICS INC. Vision system for vehicle
9949608, Sep 13 2002 iRobot Corporation Navigational control system for a robotic device
9952053, Sep 30 2011 iRobot Corporation Adaptive mapping with spatial summaries of sensor data
9953287, Jul 01 2014 Amazon Technologies, Inc Utilizing automated aerial vehicles for transporting priority pick items
9955841, May 19 2006 iRobot Corporation Removing debris from cleaning robots
Patent Priority Assignee Title
3187185
3590258
3610754
3625618
3773422
4119900, Dec 21 1973 MITEC Moderne Industrietechnik GmbH Method and system for the automatic orientation and control of a robot
4188544, Aug 22 1977 APPLIED SCANNING TECHNOLOGY, 1988 LEGHORN STREET, MOUNTAIN VIEW, CALIFORNIA 94043 Method and apparatus for automatically processing a workpiece employing calibrated scanning
4335962, Jul 20 1979 Robotic Vision Systems, Inc. Method and apparatus for determining spatial information
4558215, Mar 30 1982 Agency of Industrial Science and Technology; Ministry of International Trade and Industry Object detecting apparatus
4575304, Apr 07 1982 Hitachi, Ltd. Robot system for recognizing three dimensional shapes
4611292, Oct 06 1982 Hitachi, Ltd. Robot vision system
4620285, Apr 24 1984 NEC Corporation Sonar ranging/light detection system for use in a robot
4627511, Oct 18 1984 Casio Computer Co., Ltd. Optical tracking robot system
4653316, Mar 14 1986 Kabushiki Kaisha Komatsu Seisakusho Apparatus mounted on vehicles for detecting road surface conditions
4658385, May 25 1984 Casio Computer Co., Ltd. Obstacle detection system
4668859, Jun 26 1984 Erwin Sick GmbH Optik-Elektronik Protective zone device for a vehicle
4687326, Nov 12 1985 General Electric Company Integrated range and luminance camera
4706195, Jun 15 1984 Nippon Soken, Inc.; Nippondenso Co., Ltd. Speed control system for a motor vehicle
4716298, Jun 01 1984 Nissan Motor Company, Limited System for automatically detecting presence or absence of a preceding vehicle and method therefor
4729660, Mar 22 1985 Position measuring apparatus of moving vehicle
4751658, May 16 1986 E T M REALTY TRUST Obstacle avoidance system
4843565, Jul 30 1987 ZIMMERMAN ASSOCIATES, INC Range determination method and apparatus
4849679, Dec 31 1987 Westinghouse Electric Corp. Image processing system for an optical seam tracker
4851661, Feb 26 1988 The United States of America as represented by the Secretary of the Navy Programmable near-infrared ranging system
Executed on | Assignor | Assignee | Conveyance | Frame Reel Doc
Sep 02 1988 | EVANS, JOHN M JR | TRANSITIONS RESEARCH CORPORATION, 15 GREAT PASTURE RD, DANBURY, CT | ASSIGNMENT OF ASSIGNORS INTEREST | 0049560097 pdf
Sep 02 1988 | WEIMAN, CARL F R | TRANSITIONS RESEARCH CORPORATION, 15 GREAT PASTURE RD, DANBURY, CT | ASSIGNMENT OF ASSIGNORS INTEREST | 0049560097 pdf
Sep 02 1988 | KING, STEVEN J | TRANSITIONS RESEARCH CORPORATION, 15 GREAT PASTURE RD, DANBURY, CT | ASSIGNMENT OF ASSIGNORS INTEREST | 0049560097 pdf
Sep 06 1988 | Transitions Research Corporation (assignment on the face of the patent)
Jun 14 1995 | Transitions Research Corporation | CONNECTICUT INNOVATIONS, INCORPORATED | SECURITY INTEREST SEE DOCUMENT FOR DETAILS | 0075210697 pdf
Dec 23 1999 | HELPMATE ROBOTICS, INC | Pyxis Corporation | ASSIGNMENT OF ASSIGNORS INTEREST SEE DOCUMENT FOR DETAILS | 0123500104 pdf
Date Maintenance Fee Events
Nov 10 1993  ASPN: Payor Number Assigned.
Mar 01 1994  M283: Payment of Maintenance Fee, 4th Yr, Small Entity.
Mar 03 1998  M284: Payment of Maintenance Fee, 8th Yr, Small Entity.
Mar 19 2002  REM: Maintenance Fee Reminder Mailed.
Jun 27 2002  M282: 11.5 yr surcharge - late pmt w/in 6 mo, Small Entity.
Jun 27 2002  M285: Payment of Maintenance Fee, 12th Yr, Small Entity.


Date Maintenance Schedule
Sep 04 1993  4 years fee payment window open
Mar 04 1994  6 months grace period start (w surcharge)
Sep 04 1994  patent expiry (for year 4)
Sep 04 1996  2 years to revive unintentionally abandoned end. (for year 4)
Sep 04 1997  8 years fee payment window open
Mar 04 1998  6 months grace period start (w surcharge)
Sep 04 1998  patent expiry (for year 8)
Sep 04 2000  2 years to revive unintentionally abandoned end. (for year 8)
Sep 04 2001  12 years fee payment window open
Mar 04 2002  6 months grace period start (w surcharge)
Sep 04 2002  patent expiry (for year 12)
Sep 04 2004  2 years to revive unintentionally abandoned end. (for year 12)