Aspects of the disclosure relate generally to detecting discrete actions by traveling vehicles. The features described improve the safety, use, driver experience, and performance of autonomously controlled vehicles by performing a behavior analysis on mobile objects in the vicinity of an autonomous vehicle. Specifically, an autonomous vehicle is capable of detecting and tracking nearby vehicles and is able to determine when these nearby vehicles have performed actions of interest by comparing their tracked movements with map data.
21. A method comprising:
receiving, by one or more sensors of a vehicle configured to operate in a fully autonomous mode, sensor data of an external environment of the vehicle;
filtering, by one or more computing devices of the vehicle, the received sensor data to identify one or more actions of interest of an object in the external environment, the filtering including filtering the received sensor data to only include instances where the object has performed an action of interest;
determining, by the one or more computing devices, whether the vehicle cannot currently operate in the fully autonomous mode based on the identified one or more actions of interest; and
upon determining by the one or more computing devices that the vehicle cannot currently operate in the fully autonomous mode, the one or more computing devices altering a control strategy;
wherein altering the control strategy includes either (i) delaying changing to the fully autonomous mode, or (ii) ceding control of the vehicle to a remote operator.
1. A method comprising:
controlling, by one or more computing devices, an autonomous vehicle in accordance with a first control strategy;
receiving, by the one or more computing devices, sensor data indicating a detection of a first object;
classifying, by the one or more computing devices, the first object based on the sensor data;
accessing, by the one or more computing devices, behavior data based on a classification of the first object, wherein the behavior data identifies potential actions of the first object that are to result in a change in control strategy, and wherein at least one of the potential actions identified in the behavior data is the action of changing from traveling on a first road element to travelling on a second road element;
determining, by the one or more computing devices, that the first object has performed an action identified in the behavior data; and
based on the determination, altering the control strategy of the autonomous vehicle by the one or more computing devices.
2. The method of
3. The method of
4. The method of
5. The method of
6. The method of
7. The method of
8. A method comprising:
controlling, by one or more computing devices, an autonomous vehicle;
receiving, by the one or more computing devices, sensor data indicating a position of a first object external to the autonomous vehicle;
classifying, by the one or more computing devices, the first object based on the sensor data;
accessing, by the one or more computing devices, map data having a plurality of road elements;
comparing the sensor data with the map data;
identifying, by the one or more computing devices, that the first object is travelling on a first road element from the plurality of road elements;
determining, by the one or more computing devices, based on the comparison of the sensor data with the map data, that the first object has travelled from the first road element to a second road element; and
altering, by the one or more computing devices, at least one of a position, heading, speed, and acceleration of the autonomous vehicle based on the determination that the first object has travelled from the first road element to the second road element.
9. The method of
10. The method of
11. The method of
receiving, by the one or more computing devices, a request to navigate between a first location and a second location; and
autonomously navigating, by the one or more computing devices, the autonomous vehicle along a path between the first location and the second location; and
wherein altering at least one of a position, heading, and speed of the autonomous vehicle occurs while the autonomous vehicle is travelling along the path.
12. The method of
determining a relative position of the autonomous vehicle with each of the one or more nearby vehicles, and
wherein associating each of the one or more nearby vehicles with a road graph element is based on the relative position.
13. A system for controlling an autonomous vehicle, the system comprising:
one or more sensors for detecting one or more vehicles in an autonomous vehicle's surroundings; and
one or more processors configured to:
control an autonomous vehicle in accordance with a first control strategy;
receive sensor data indicating a detection of a first object;
classify the first object based on the sensor data;
access behavior data based on a classification of the first object, wherein the behavior data identifies potential actions of the first object that are to result in a change in control strategy, and wherein at least one of the potential actions of the first object is the first object changing from traveling on a first road element to travelling on a second road element;
determine that the first object has performed an action identified in the behavior data; and
alter the control strategy of the autonomous vehicle based on the determination.
14. The system of
15. The system of
16. The system of
17. The system of
18. The system of
19. The system of
20. The system of
receive a request for navigation between a first location and a second location; and
autonomously navigate the autonomous vehicle along a path between the first location and the second location; and
wherein altering the control strategy occurs while the autonomous vehicle travels along the path.
22. The method of claim 21, wherein the object is another vehicle.
23. The method of claim 21, wherein delaying changing to the fully autonomous mode includes the one or more computing devices scanning the external environment to determine whether there are any obstacles affecting an ability of the vehicle to avoid a collision.
24. The method of claim 21, wherein delaying changing to the fully autonomous mode includes the one or more computing devices requiring a driver of the vehicle to control steering or accelerating before entering into the fully autonomous mode.
25. The method of claim 21, wherein ceding control of the vehicle to the remote operator includes sending the received sensor data to a remote party associated with the remote operator.
26. The method of claim 25, further comprising transmitting data or imagery to a remote computing device in conjunction with ceding control to the remote operator.
27. The method of claim 21, wherein the identified one or more actions of interest include another vehicle changing lanes.
28. The method of claim 21, wherein the identified one or more actions of interest include another vehicle changing its route.
The present application is a continuation reissue of application Ser. No. 15/847,390, filed Dec. 19, 2017 (now abandoned), which is an application for reissue of U.S. Pat. No. 9,216,737 (“the '737 patent”). The '737 patent is a continuation of U.S. patent application Ser. No. 14/190,621, filed on Feb. 26, 2014 and issued as U.S. Pat. No. 8,935,034, which is a continuation of U.S. patent application Ser. No. 13/446,518, filed on Apr. 13, 2012 and issued as U.S. Pat. No. 8,700,251 on Apr. 15, 2014, the disclosures of which are incorporated herein by reference. More than one reissue application has been filed for the reissue of U.S. Pat. No. 9,216,737. The reissue applications are application Ser. No. 17/138,339 (the present application); application Ser. No. 15/847,390, filed Dec. 19, 2017; application Ser. No. 15/847,064, filed Dec. 19, 2017 (now abandoned); and application Ser. No. 17/138,281, filed concurrently herewith, each of which is a reissue application of U.S. Pat. No. 9,216,737.
Autonomous vehicles use various computing systems to aid in the transport of passengers from one location to another. Some autonomous vehicles may require some initial input or continuous input from an operator, such as a pilot, driver, or passenger. Other systems, for example autopilot systems, may be used only when the system has been engaged, which permits the operator to switch from a manual mode (where the operator exercises a high degree of control over the movement of the vehicle) to an autonomous mode (where the vehicle essentially drives itself) to modes that lie somewhere in between.
In various aspects, the disclosure provides a vehicle having a steering device (e.g., wheels that turn in the case of an automobile and a rudder in the case of a boat) and an engine. The steering device may be controlled by a first user input controller (e.g., a steering wheel in the cockpit of a car), the engine may be controlled by a second user input controller (e.g., an accelerator in the case of a car or a throttle in the case of a boat), and both the engine and the steering device may be controlled by a processor capable of executing computer instructions. The vehicle includes one or more sensors (e.g., cameras, radar, laser range finders) for capturing information relating to the environment in which the vehicle is operating. The processor receives data from the sensors and, based in part on data from the sensors or received from external sources or both, issues a navigation command, where the navigation command comprises a command to the steering device relating to the intended direction of the vehicle (e.g., a command to turn the front wheels of a car 10 degrees to the left) or to the engine relating to the intended velocity of the vehicle (e.g., a command to accelerate). Navigation commands may also include commands to brakes to slow the vehicle down, as well as other commands affecting the movement of the vehicle.
In one aspect, sensors are used to detect one or more nearby vehicles surrounding the autonomous vehicle, and data corresponding to these vehicles are sent to a processor. The processor analyzes the data corresponding to the nearby vehicles by comparing the data with map data containing numerous road graph elements. The processor may then associate the nearby vehicles with a specific road graph element, and determine when a nearby vehicle has performed an action of interest. The autonomously controlled vehicle may then alter its control strategy based on the occurrence of the action of interest. The road graph elements may include lanes of a roadway, and the action of interest may include a nearby vehicle changing from a first lane to a second lane.
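For illustration only, the association and detection steps described above might be sketched as follows in Python; the RoadGraphElement type and the function names are assumptions made for this example, not part of the disclosure:

```python
from dataclasses import dataclass
from typing import List, Optional, Tuple

@dataclass
class RoadGraphElement:
    """A lane, intersection, or other element of the stored road graph."""
    element_id: str
    centerline: List[Tuple[float, float]]  # (x, y) points along the element

def nearest_element(position: Tuple[float, float],
                    road_graph: List[RoadGraphElement]) -> Optional[RoadGraphElement]:
    """Associate a detected vehicle's position with the closest road graph element."""
    def squared_dist(elem: RoadGraphElement) -> float:
        return min((position[0] - x) ** 2 + (position[1] - y) ** 2
                   for x, y in elem.centerline)
    return min(road_graph, key=squared_dist, default=None)

def is_action_of_interest(prev: RoadGraphElement, curr: RoadGraphElement) -> bool:
    """A change of associated element (e.g., a lane change) is an action of interest."""
    return prev.element_id != curr.element_id
```

Under this reading, the control strategy would be re-evaluated whenever is_action_of_interest returns True for a tracked vehicle.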
In another aspect, altering the control strategy includes positioning the autonomous vehicle relative to the nearby vehicle in a predefined manner based on the occurrence of the action of interest. The control strategy may also be further altered based on actions of interest being performed by other nearby vehicles. The autonomous vehicle may also receive a request to navigate between a first location and a second location, where the control strategy includes having the autonomous vehicle travel along a path between the first and second locations.
In another aspect, the autonomous vehicle may filter data collected regarding the position and movement of nearby vehicles. In particular, the data may be filtered to only include instances where a nearby vehicle has performed an action of interest. As provided below, this determination is made by comparing data for the nearby vehicle with map data accessed by the autonomous vehicle.
Aspects of the disclosure relate generally to detecting instances when a vehicle has performed a discrete action of interest. In particular, a device implementing the disclosed system is capable of detecting surrounding vehicles using one or more sensors. The device may then determine when the surrounding vehicles have performed one of several predefined actions by comparing the sensor data with stored road graph data. The system described below may be implemented as part of an autonomous driving vehicle. In turn, the autonomous vehicle may react to the behavior of nearby objects in a way that decreases the likelihood of an accident and increases the efficiency of travel.
As shown in FIG. 1, an autonomous driving system may include a vehicle 101 with an autonomous driving computer system 110, which in turn includes a processor 120 and memory 130.
The memory 130 stores information accessible by processor 120, including instructions 132 and data 134 that may be executed or otherwise used by the processor 120. The memory 130 may be of any type capable of storing information accessible by the processor, including a computer-readable medium, or other medium that stores data that may be read with the aid of an electronic device, such as a hard-drive, memory card, ROM, RAM, DVD or other optical disks, as well as other write-capable and read-only memories. Systems and methods may include different combinations of the foregoing, whereby different portions of the instructions and data are stored on different types of media.
The instructions 132 may be any set of instructions to be executed directly (such as machine code) or indirectly (such as scripts) by the processor. For example, the instructions may be stored as computer code on the computer-readable medium. In that regard, the terms “instructions” and “programs” may be used interchangeably herein. The instructions may be stored in object code format for direct processing by the processor, or in any other computer language including scripts or collections of independent source code modules that are interpreted on demand or compiled in advance. Functions, methods and routines of the instructions are explained in more detail below.
The data 134 may be retrieved, stored or modified by processor 120 in accordance with the instructions 132. For instance, although the system and method are not limited by any particular data structure, the data may be stored in computer registers, in a relational database as a table having a plurality of different fields and records, XML documents or flat files. The data may also be formatted in any computer-readable format. By way of further example only, image data may be stored as bitmaps comprised of grids of pixels that are stored in accordance with formats that are compressed or uncompressed, lossless (e.g., BMP) or lossy (e.g., JPEG), and bitmap or vector-based (e.g., SVG), as well as computer instructions for drawing graphics. The data may comprise any information sufficient to identify the relevant information, such as numbers, descriptive text, proprietary codes, references to data stored in other areas of the same memory or different memories (including other network locations) or information that is used by a function to calculate the relevant data.
The processor 120 may be any conventional processor, such as commercially available CPUs. Alternatively, the processor may be a dedicated device such as an ASIC or FPGA. Although FIG. 1 functionally illustrates the processor and memory as being within the same block, the processor and memory may actually comprise multiple processors and memories that may or may not be stored within the same physical housing.
In various of the aspects described herein, the processor may be located remote from the vehicle and communicate with the vehicle wirelessly. In other aspects, some of the processes described herein are executed on a processor disposed within the vehicle and others by a remote processor, including taking the steps necessary to execute a single maneuver.
Computer 110 may include all of the components normally used in connection with a computer, such as a central processing unit (CPU), memory 130 (e.g., RAM and internal hard drives) storing data 134 and instructions such as a web browser, an electronic display 142 (e.g., a monitor having a screen, a small LCD touch-screen or any other electrical device that is operable to display information), user input (e.g., a mouse, keyboard, touch screen and/or microphone), as well as various sensors (e.g., a video camera) for gathering explicit (e.g., a gesture) or implicit (e.g. “the person is asleep”) information about the states and desires of a person.
The vehicle may also include a geographic position component 144 in communication with computer 110 for determining the geographic location of the device. For example, the position component may include a GPS receiver to determine the device's latitude, longitude and/or altitude position. Other location systems such as laser-based localization systems, inertial-aided GPS, or camera-based localization may also be used to identify the location of the vehicle. The location of the vehicle may include an absolute geographical location, such as latitude, longitude, and altitude as well as relative location information, such as location relative to other cars immediately around it which can often be determined with less noise than absolute geographical location.
The device may also include other features in communication with computer 110, such as an accelerometer, gyroscope or another direction/speed detection device 146 to determine the direction and speed of the vehicle or changes thereto. By way of example only, acceleration device 146 may determine its pitch, yaw or roll (or changes thereto) relative to the direction of gravity or a plane perpendicular thereto. The device may also track increases or decreases in speed and the direction of such changes. The device's provision of location and orientation data as set forth herein may be provided automatically to the user, computer 110, other computers and combinations of the foregoing.
The computer 110 may control the direction and speed of the vehicle by controlling various components. By way of example, if the vehicle is operating in a completely autonomous mode, computer 110 may cause the vehicle to accelerate (e.g., by increasing fuel or other energy provided to the engine), decelerate (e.g., by decreasing the fuel supplied to the engine or by applying brakes) and change direction (e.g., by turning the front two wheels).
As shown in FIG. 2, vehicle 101 may include a vehicle interior with components for interacting with computer 110, such as a touch screen 217, button inputs 219, a gear shifter 220, and a user interface 225.
Computer 110 may use visual or audible cues to indicate whether computer 110 is obtaining valid data from the various sensors, whether the computer is partially or completely controlling the direction or speed of the car or both, whether there are any errors, etc. Vehicle 101 may also include a status indicating apparatus, such as status bar 230, to indicate the current status of vehicle 101. In the example of FIG. 2, status bar 230 displays “D” and “2 mph,” indicating that the vehicle is presently in drive mode and moving at 2 miles per hour.
In one example, computer 110 may be an autonomous driving computing system capable of communicating with various components of the vehicle. Returning to FIG. 1, computer 110 may send and receive information from the vehicle's various systems, such as the braking, acceleration, signaling, and navigation systems, in order to control the movement and speed of vehicle 101.
The vehicle may include components for detecting objects external to the vehicle such as other vehicles, obstacles in the roadway, traffic signals, signs, trees, etc. The detection system may include lasers, sonar, radar, cameras or any other detection devices. For example, if the vehicle is a small passenger car, the car may include a laser mounted on the roof or other convenient location. In one aspect, the laser may measure the distance between the vehicle and the object surfaces facing the vehicle by spinning on its axis and changing its pitch. The vehicle may also include various radar detection units, such as those used for adaptive cruise control systems. The radar detection units may be located on the front and back of the car as well as on either side of the front bumper. In another example, a variety of cameras may be mounted on the car at distances from one another which are known so that the parallax from the different images may be used to compute the distance to various objects which are captured by 2 or more cameras. These sensors allow the vehicle to understand and potentially respond to its environment in order to maximize safety for passengers as well as objects or people in the environment.
Many of these sensors provide data that is processed by the computer in real-time, that is, the sensors may continuously update their output to reflect the environment being sensed at or over a range of time, and continuously or as-demanded provide that updated output to the computer so that the computer can determine whether the vehicle's then-current direction or speed should be modified in response to the sensed environment.
The vehicle may also include various radar detection units, such as those used for adaptive cruise control systems. The radar detection units may be located on the front and back of the car as well as on either side of the front bumper. As shown in the example of FIG. 3, these radar detection units may be positioned at various points around vehicle 101.
In another example, a variety of cameras may be mounted on the vehicle. The cameras may be mounted at predetermined distances so that the parallax from the images of 2 or more cameras may be used to compute the distance to various objects. As shown in FIG. 3, the cameras may be mounted, for example, near the windshield and rear-view mirror.
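The parallax computation referred to above can be illustrated with the standard pinhole stereo relation; the following is a minimal sketch, and the particular camera geometry and values are assumptions for the example rather than figures from the disclosure:

```python
def depth_from_parallax(focal_length_px: float, baseline_m: float,
                        disparity_px: float) -> float:
    """Pinhole stereo relation: depth Z = f * B / d, where f is the focal
    length in pixels, B the known distance between the two cameras, and d
    the horizontal shift (parallax) of the same object between the images."""
    if disparity_px <= 0.0:
        raise ValueError("no disparity: object at infinity or bad feature match")
    return focal_length_px * baseline_m / disparity_px

# Example: f = 1000 px, cameras 0.5 m apart, 20 px of parallax -> 25 m away.
assert depth_from_parallax(1000.0, 0.5, 20.0) == 25.0
```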
Each sensor may be associated with a particular sensor field in which the sensor may be used to detect objects. FIG. 4 depicts the approximate sensor fields of these various sensors.
In another example, an autonomous vehicle may include sonar devices, stereo cameras, a localization camera, a laser, and a radar detection unit each with different fields of view. The sonar may have a horizontal field of view of approximately 60 degrees for a maximum distance of approximately 6 meters. The stereo cameras may have an overlapping region with a horizontal field of view of approximately 50 degrees, a vertical field of view of approximately 10 degrees, and a maximum distance of approximately 30 meters. The localization camera may have a horizontal field of view of approximately 75 degrees, a vertical field of view of approximately 90 degrees and a maximum distance of approximately 10 meters. The laser may have a horizontal field of view of approximately 360 degrees, a vertical field of view of approximately 30 degrees, and a maximum distance of 100 meters. The radar may have a horizontal field of view of 60 degrees for the near beam, 30 degrees for the far beam, and a maximum distance of 200 meters.
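The approximate figures above can be collected into a small lookup table; the following sketch (type and key names assumed for illustration) simply records the values from the preceding paragraph:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class SensorField:
    horizontal_fov_deg: float
    vertical_fov_deg: Optional[float]  # None where no figure is given above
    max_range_m: float

# Approximate fields of view, as listed in the preceding paragraph.
SENSOR_FIELDS = {
    "sonar": SensorField(60.0, None, 6.0),
    "stereo_cameras": SensorField(50.0, 10.0, 30.0),
    "localization_camera": SensorField(75.0, 90.0, 10.0),
    "laser": SensorField(360.0, 30.0, 100.0),
    "radar_near_beam": SensorField(60.0, None, 200.0),
    "radar_far_beam": SensorField(30.0, None, 200.0),
}
```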
The sensors described may be used to identify, track and predict the movements of pedestrians, bicycles, other vehicles, or objects in the roadway. For example, the sensors may provide the location and shape information of objects surrounding the vehicle to computer 110, which in turn may identify the object as another vehicle. The object's current movement may also be determined by the sensor (e.g., the component is a self-contained speed radar detector) or by the computer 110 based on information provided by the sensors (e.g., by comparing changes in the object's position data over time).
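The second approach, estimating movement from changes in position data over time, might be sketched as follows; the interface is an assumption for illustration:

```python
import math

def estimate_motion(pos_then, pos_now, dt_s):
    """Estimate speed (m/s) and heading (degrees) of a tracked object by
    differencing two sensed (x, y) positions taken dt_s seconds apart."""
    dx = pos_now[0] - pos_then[0]
    dy = pos_now[1] - pos_then[1]
    speed = math.hypot(dx, dy) / dt_s
    heading_deg = math.degrees(math.atan2(dy, dx))
    return speed, heading_deg
```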
The computer may change the vehicle's current path and speed based on the presence of detected objects. For example, the vehicle may automatically slow down if its current speed is 50 mph and it detects, by using its cameras and using optical-character recognition, that it will shortly pass a sign indicating that the speed limit is 35 mph. Yet further, if the computer determines that an object is obstructing the intended path of the vehicle, it may maneuver the vehicle around the obstruction.
In accordance with one aspect, the autonomous vehicle's computer system 110 may identify when another detected vehicle has performed a particular action of interest.
The position and movement data for the detected vehicles 510-550 may be stored in database 137 of the autonomous driving computer system, as shown in FIG. 1.
For example, database 138 may include a set of actions or behaviors of interest, such as the vehicle changing lanes or routes, and instructions 132 may allow for computer system 110 to identify when a detected vehicle has performed one or more of the behaviors of interest. In particular, computer system 110 may access the recorded position and movement data stored in database 137, as well as a road graph of the environment stored in database 136. By combining both sets of data, computer system 110 may then determine when one or more of the key behaviors have occurred.
Returning to FIG. 5, computer system 110 may compare the recorded position and movement data for each detected vehicle with the road graph data stored in database 136.
In this way, vehicle 101 may associate and track all surrounding vehicles with a particular road graph element, such as a lane of travel or intersection. For example, dashed line 630 in map 600 of FIG. 6 represents the path travelled by vehicle 510 along its associated road graph elements.
Vehicle 101 may also filter the data collected for vehicle 510 so that it only contains instances where vehicle 510 has performed an action of interest. As provided by dotted line 630 on map 600, vehicle 510 changes its heading around point 640, as it begins to travel from a north-west direction to a more south-west direction. While vehicle 101 will collect data regarding vehicle 510's change in heading, computer 110 will also determine that the change in heading does not correspond to an action of interest, as vehicle 510 merely travels along the same road graph element. Vehicle 101 may, in turn, exclude the data corresponding to vehicle 510's change in heading at point 640 from being recorded as an action of interest.
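A sketch of this filtering step, assuming (purely for illustration) that tracks are stored as time-ordered (timestamp, road-graph-element) samples:

```python
def filter_actions_of_interest(track):
    """Reduce a track to the moments where the vehicle changed road graph
    elements. Heading changes within the same element, such as vehicle
    510's heading change at point 640, produce no element change and are
    dropped.

    track: time-ordered list of (timestamp, element_id) samples.
    """
    actions = []
    for (_, prev_elem), (t, elem) in zip(track, track[1:]):
        if elem != prev_elem:
            actions.append((t, prev_elem, elem))
    return actions
```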
In another embodiment, autonomous vehicle 101 may transport itself, passengers, and/or cargo between two locations by following a route. For example, a driver may input a destination and activate an autonomous mode of the vehicle. In response, the vehicle's computer 110 may calculate a route using a map, its current location, and the destination. Based on the route (or as part of the route generation), the vehicle may determine a control strategy for controlling the vehicle along the route to the destination. In accordance with one embodiment, computer system 110 may control the autonomous vehicle 101 to take particular actions in response to the actions of the surrounding objects that have been identified as performing a behavior of interest. For example, upon determining that a nearby vehicle has changed lanes (the action designated by arrow B2), computer system 110 may alter the control strategy of vehicle 101 accordingly.
As another example, vehicle 520 may come to a stop for a period of time before making the left-hand turn designated by arrow A2. Computer system 110 may identify this action as a behavior of interest, depending on which road element vehicle 520 is travelling on. Specifically, if vehicle 520 is determined to be in a left-hand turn lane, computer system 110 may not identify the fact that vehicle 520 has stopped as a behavior of interest. However, if vehicle 520 were travelling one lane over to the right, the fact that it has stopped could indicate that there is a backup ahead. Accordingly, vehicle 101 may adjust its control strategy based on which road graph element (e.g., lane) a vehicle is currently travelling on.
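The lane-dependent interpretation of a stop might be sketched as follows; the element-type labels and the duration threshold are illustrative assumptions, not values from the disclosure:

```python
def stop_is_behavior_of_interest(element_type: str,
                                 stop_duration_s: float,
                                 min_stop_s: float = 2.0) -> bool:
    """Interpret a stop in the context of the occupied road graph element:
    expected in a left-turn lane, but possibly a backup in a through lane."""
    if stop_duration_s < min_stop_s:
        return False  # momentary slowdowns are ignored
    return element_type != "left_turn_lane"
```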
Flow diagram 700 of FIG. 7 provides an example by which autonomous vehicle 101 may identify when detected vehicles have performed a behavior of interest.
For each vehicle that has been detected, it may be determined whether the vehicle can be associated with a particular road graph element (e.g., a road, lane of traffic, intersection, or other map element) contained in the map data (Block 730). For example, based on a detected vehicle's state information, computer 110 may determine that the detected vehicle is travelling within a particular lane of traffic represented in the road graph. Computer 110 may then track the detected vehicles as they travel along the associated road graph element (Block 735) and may determine when one of the detected vehicles has performed a behavior of interest (Block 740). Based on determining that a detected vehicle has performed an action of interest, such as a lane change, computer 110 may then alter the control strategy of autonomous vehicle 101 (Block 745). Blocks 715 through 745 may then be repeated until autonomous vehicle 101 has reached its destination or the autonomous control has otherwise terminated (Block 750). In this way, vehicle 101 may further alter the control strategy upon any of the detected vehicles performing an action of interest.
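For illustration only, the loop of flow diagram 700 might look roughly like the following; every interface name here is an assumption rather than something taken from the disclosure, and only the block numbers cited above are used:

```python
def control_loop(vehicle, map_data, destination):
    """Illustrative outline of flow diagram 700. All interfaces here
    (vehicle, map_data, tracked-object methods) are assumed."""
    while not vehicle.reached(destination):            # repeat until Block 750
        for obj in vehicle.detect_nearby_vehicles():
            element = map_data.associate(obj.position)      # Block 730
            obj.track_along(element)                        # Block 735
            if obj.performed_behavior_of_interest():        # Block 740
                vehicle.alter_control_strategy(obj)         # Block 745
```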
Vehicle 101 may include one or more user input devices that enable a user to provide information to the autonomous driving computer 110. For example, a user, such as a passenger, may input a destination (e.g., 123 Oak Street) into the navigation system using touch screen 217 or button inputs 219. In another example, a user may input a destination by identifying the destination. In that regard, the computer system may extract the destination from a user's spoken command.
The various systems described above may be used by the computer to operate the vehicle and maneuver from one location to another. For example, a user may enter destination information into the navigation system, either manually or audibly. The vehicle may determine its location to a few inches based on a combination of the GPS receiver data, the sensor data, as well as the detailed map information. In response, the navigation system may generate a route between the present location of the vehicle and the destination.
When the driver is ready to relinquish some level of control to the autonomous driving computer, the user may activate the computer. The computer may be activated, for example, by pressing a button or by manipulating a lever such as gear shifter 220. Rather than taking control immediately, the computer may scan the surroundings and determine whether there are any obstacles or objects in the immediate vicinity which may prohibit or reduce the ability of the vehicle to avoid a collision. In this regard, the computer may require that the driver continue controlling the vehicle manually or with some level of control (such as the steering or acceleration) before entering into a fully autonomous mode.
Once the vehicle is able to maneuver safely without the assistance of the driver, the vehicle may become fully autonomous and continue to the destination. The driver may continue to assist the vehicle by controlling, for example, steering or whether the vehicle changes lanes, or the driver may take control of the vehicle immediately in the event of an emergency.
The vehicle may continuously use the sensor data to identify objects, such as traffic signals, people, other vehicles, and other objects, in order to maneuver the vehicle to the destination and reduce the likelihood of a collision. The vehicle may use the map data to determine where traffic signals or other objects should appear and take actions, for example, by signaling turns or changing lanes. Once the vehicle has arrived at the destination, the vehicle may provide audible or visual cues to the driver, for example, by displaying “You have arrived” on one or more of the electronic displays.
The vehicle may be only partially autonomous. For example, the driver may select to control one or more of the following: steering, acceleration, braking, and emergency braking.
The vehicle may also have one or more user interfaces that allow the driver to reflect the driver's driving style. For example, the vehicle may include a dial which controls the level of risk or aggressiveness that a driver would like the computer to use when controlling the vehicle. For example, a more aggressive driver may want to change lanes more often to pass cars, drive in the left lane on a highway, maneuver the vehicle closer to the surrounding vehicles, and drive faster than less aggressive drivers. A less aggressive driver may prefer for the vehicle to take more conservative actions, such as driving at or somewhat below the speed limit, avoiding congested highways, or avoiding populated areas in order to increase the level of safety. By manipulating the dial, the thresholds used by the computer to calculate whether to pass another car, drive closer to other vehicles, increase speed and the like may change. In other words, changing the dial may affect a number of different settings used by the computer during its decision making processes. A driver may also be permitted, via the user interface 225, to change individual settings that relate to the driver's preferences. In one embodiment, insurance rates for the driver or vehicle may be based on the style of the driving selected by the driver.
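One plausible reading of such a dial, sketched with assumed threshold names and ranges (none of these values come from the disclosure), is a single input that scales several decision thresholds together:

```python
from dataclasses import dataclass

@dataclass
class DrivingStyle:
    following_gap_s: float       # time gap kept behind the car ahead
    passing_threshold_s: float   # minimum time saved before passing another car
    speed_margin_mph: float      # offset applied relative to the speed limit

def style_from_dial(aggressiveness: float) -> DrivingStyle:
    """Map a dial position in [0.0, 1.0] to a bundle of thresholds."""
    a = min(max(aggressiveness, 0.0), 1.0)
    return DrivingStyle(
        following_gap_s=3.0 - 1.5 * a,        # 3.0 s conservative, 1.5 s aggressive
        passing_threshold_s=60.0 - 50.0 * a,  # pass only for large gains when low
        speed_margin_mph=-5.0 + 5.0 * a,      # below the limit when conservative
    )
```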
Aggressiveness settings may also be modified to reflect the type of vehicle and its passengers and cargo. For example, if an autonomous truck is transporting dangerous cargo (e.g., chemicals or flammable liquids), its aggressiveness settings may be less aggressive than a car carrying a single driver—even if the aggressiveness dials of both such a truck and car are set to “high.” Moreover, trucks traveling long distances over narrow, unpaved, rugged, or icy terrain may be placed in a more conservative mode in order to reduce the likelihood of a collision or other incident.
In another example, the vehicle may include sport and non-sport modes which the user may select or deselect in order to change the aggressiveness of the ride. By way of example, while in “sport mode”, the vehicle may navigate through turns at the maximum speed that is safe, whereas in “non-sport mode”, the vehicle may navigate through turns at the maximum speed which results in g-forces that are relatively imperceptible by the passengers in the car.
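The “non-sport mode” cornering limit follows from the steady-state relation a = v²/r between lateral acceleration, speed, and turn radius; a small worked example, where the acceleration limits are assumed for illustration:

```python
import math

def max_corner_speed_ms(radius_m: float, lateral_accel_ms2: float) -> float:
    """Steady-state cornering: a = v^2 / r, so v = sqrt(a * r)."""
    return math.sqrt(lateral_accel_ms2 * radius_m)

# A 50 m radius turn at a barely perceptible 1.5 m/s^2 allows ~8.7 m/s
# (~19 mph); a sportier 4.0 m/s^2 limit allows ~14.1 m/s (~32 mph).
print(round(max_corner_speed_ms(50.0, 1.5), 1))  # 8.7
print(round(max_corner_speed_ms(50.0, 4.0), 1))  # 14.1
```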
The vehicle's characteristics may also be adjusted based on whether the driver or the computer is in control of the vehicle. For example, when a person is driving manually the suspension may be made fairly stiff so that the person may “feel” the road and thus drive more responsively or comfortably, while, when the computer is driving, the suspension may be made somewhat softer so as to save energy and make for a more comfortable ride for passengers.
The driver may also select to have his or her vehicle communicate with other devices. As shown in FIG. 8, vehicle 101 may communicate over a network 820 with other devices, such as a remote server 810.
Vehicle 101 may also receive updated map or object data via network 820. For example, server 810 may provide vehicle 101 with new data relating to object classifications and behavior model information. Computer system 110 of FIG. 1 may then be updated, and the new data may be used when autonomously controlling the vehicle.
As the number and usage of autonomous vehicles increase, various sensors and features may be incorporated into the environment to increase the perception of the vehicle. For example, low-cost beacon transmitters may be placed on road signs, traffic signals, roads or other highway infrastructure components in order to improve the computer's ability to recognize these objects, their meaning, and state. Similarly, these features may also be used to provide additional information to the vehicle and driver, such as whether the driver is approaching a school or construction zone. In another example, magnets, RFID tags or other such items may be placed in the roadway to delineate the location of lanes, to identify the ground speed of a vehicle, or to increase the accuracy of the computer's location determination of the vehicle.
Autonomous vehicles may also be controlled remotely. For example, if the driver is asleep, the sensor data may be sent to a third party so that the vehicle may continue to have a responsive operator. While delay and latency may make this type of telemetry driving difficult, it may, for example, be used in emergency situations or where the vehicle has gotten itself stuck. The vehicle may send data and images to a central office and allow a third party to remotely drive the vehicle for a short period until the emergency has passed or the vehicle is no longer stuck.
As these and other variations and combinations of the features discussed above can be utilized without departing from the invention as defined by the claims, the foregoing description of exemplary embodiments should be taken by way of illustration rather than by way of limitation of the invention as defined by the claims. It will also be understood that the provision of examples of the invention (as well as clauses phrased as “such as,” “e.g.”, “including” and the like) should not be interpreted as limiting the invention to the specific examples; rather, the examples are intended to illustrate only some of many possible aspects.
Ferguson, David I., Zhu, Jiajun, Dolgov, Dmitri A.
Patent | Priority | Assignee | Title |
5765116, | Aug 28 1993 | Lucas Industries public limited company | Driver assistance system for a vehicle |
6385536, | Apr 11 2000 | Kabushikikaisha Equos Research | Navigation apparatus, method for map matching performed in the navigation apparatus, and computer-readable medium storing a program for executing the method |
6385539, | Aug 13 1999 | 21ST CENTURY GARAGE LLC | Method and system for autonomously developing or augmenting geographical databases by mining uncoordinated probe data |
6405132, | May 23 1994 | AMERICAN VEHICULAR SCIENCES LLC | Accident avoidance system |
6526352, | Jul 19 2001 | AMERICAN VEHICULAR SCIENCES LLC | Method and arrangement for mapping a road |
7072764, | Jul 18 2000 | University of Minnesota | Real time high accuracy geospatial database for onboard intelligent vehicle applications |
7102496, | Jul 30 2002 | Yazaki North America, Inc. | Multi-sensor integration for a vehicle |
7124027, | Jul 11 2002 | Yazaki North America, Inc. | Vehicular collision avoidance system |
7317973, | Mar 09 2002 | Robert Bosch GmbH | Automatic vehicle guidance method and system |
7499804, | Oct 22 2004 | iRobot Corporation | System and method for multi-modal control of an autonomous vehicle |
7510038, | Jun 11 2003 | Steering Solutions IP Holding Corporation | Steering system with lane keeping integration |
7894951, | Oct 21 2005 | iRobot Corporation | Systems and methods for switching between autonomous and manual operation of a vehicle |
8126642, | Oct 24 2008 | SAMSUNG ELECTRONICS CO , LTD | Control and systems for autonomously driven vehicles |
8364366, | Jun 24 2005 | Deere & Company | System and method for providing a safety zone associated with a vehicle |
8428820, | Dec 09 2005 | HELLA GMBH & CO KGAA | Path planning |
8437890, | Mar 05 2009 | Massachusetts Institute of Technology | Integrated framework for vehicle operator assistance based on a trajectory prediction and threat assessment |
8457827, | Mar 15 2012 | GOOGLE LLC | Modifying behavior of autonomous vehicle based on predicted behavior of other vehicles |
8589014, | Jun 01 2011 | GOOGLE LLC | Sensor field selection |
8660734, | Oct 05 2010 | GOOGLE LLC | System and method for predicting behaviors of detected objects |
8676487, | Feb 09 2009 | Toyota Jidosha Kabushiki Kaisha | Apparatus for predicting the movement of a mobile body |
8700251, | Apr 13 2012 | GOOGLE LLC | System and method for automatically detecting key behaviors by vehicles |
8718861, | Apr 11 2012 | GOOGLE LLC | Determining when to drive autonomously |
8935034, | Apr 13 2012 | GOOGLE LLC | System and method for automatically detecting key behaviors by vehicles |
8948955, | Oct 05 2010 | GOOGLE LLC | System and method for predicting behaviors of detected objects |
8983679, | Feb 27 2009 | Toyota Jidosha Kabushiki Kaisha | Movement trajectory generator |
9216737, | Apr 13 2012 | Waymo LLC | System and method for automatically detecting key behaviors by vehicles |
9381916, | Feb 06 2012 | GOOGLE LLC | System and method for predicting behaviors of detected objects through environment representation |
9495874, | Apr 13 2012 | GOOGLE LLC | Automated system and method for modeling the behavior of vehicles and other agents |
20020198632, | |||
20030191568, | |||
20040083037, | |||
20050015203, | |||
20050060069, | |||
20050149251, | |||
20080177470, | |||
20090118994, | |||
20090292468, | |||
20100194593, | |||
20100198491, | |||
20100332127, | |||
20120083960, | |||
20130060414, |