A system includes a processor. The processor is configured to derive one or more partitions of a field based on vehicle system data via a learning system. The processor is further configured to derive one or more turn features representative of vehicle turns in the field based on the vehicle system data via the learning system. The processor is also configured to derive an autonomous vehicle plan based on the partitions of the field and the turn features via a planning system, wherein the autonomous vehicle plan comprises a planned route of the autonomous vehicle in the field.
1. A system, comprising:
a processor configured to:
derive one or more partitions of a field based on vehicle system data via a learning system;
derive one or more turn features representative of vehicle turns in the field based on the vehicle system data via the learning system; and
derive an autonomous vehicle plan based on the partitions of the field and the turn features via a planning system, wherein the autonomous vehicle plan comprises a planned route of the autonomous vehicle in the field, and wherein the learning system comprises a forward-chained expert system, a backward-chained expert system, a neural network, a data mining system, a support vector machine (SVM), a decision tree learning system, an association rule learning system, a deep learning system, an inductive logic programming system, a genetic algorithm, or a combination thereof.
9. A method, comprising:
deriving one or more partitions of a field based on vehicle system data via a learning system;
deriving one or more turn features representative of vehicle turns in the field based on the vehicle system data via the learning system; and
deriving an autonomous vehicle plan based on the partitions of the field and the turn features via a planning system, wherein the autonomous vehicle plan comprises a planned route of the autonomous vehicle in the field, and wherein the learning system comprises a forward-chained expert system, a backward-chained expert system, a neural network, a data mining system, a support vector machine (SVM), a decision tree learning system, an association rule learning system, a deep learning system, an inductive logic programming system, a genetic algorithm, or a combination thereof.
15. A non-transitory, computer readable medium comprising instructions that, when executed by a processor, cause the processor to:
derive one or more partitions of a field based on vehicle system data via a learning system;
derive one or more turn features representative of vehicle turns in the field based on the vehicle system data via the learning system; and
derive an autonomous vehicle plan based on the partitions of the field and the turn features via a planning system, wherein the autonomous vehicle plan comprises a planned route of the autonomous vehicle in the field, and wherein the learning system comprises a forward-chained expert system, a backward-chained expert system, a neural network, a data mining system, a support vector machine (SVM), a decision tree learning system, an association rule learning system, a deep learning system, an inductive logic programming system, a genetic algorithm, or a combination thereof.
The invention relates generally to autonomous vehicle systems, and more specifically, to autonomous vehicle system planning applications.
Certain vehicles, such as agricultural tractors, may be operated in fields having a variety of soil conditions and obstacles. For example, an autonomous vehicle such as a tractor may be driven through a field having soft soil (e.g., due to a high moisture content of the soil), around ponds, in proximity to human structures and boundaries (e.g., fences, barns), and so on. Generally, the autonomous vehicle may be provided a plan that may be used by the autonomous vehicle to follow certain paths and to avoid certain terrain features, as well as to perform agricultural operations such as planting, fertilizing, and so on. It may be beneficial to improve the planning of autonomous vehicle system operations.
In one embodiment, a system includes a processor. The processor is configured to derive one or more partitions of a field based on vehicle system data via a learning system. The processor is further configured to derive one or more turn features representative of vehicle turns in the field based on the vehicle system data via the learning system. The processor is also configured to derive an autonomous vehicle plan based on the partitions of the field and the turn features via a planning system, wherein the autonomous vehicle plan comprises a planned route of the autonomous vehicle in the field.
In a further embodiment, a method includes deriving one or more partitions of a field based on vehicle system data via a learning system. The method also includes deriving one or more turn features representative of vehicle turns in the field based on the vehicle system data via the learning system. The method additionally includes deriving an autonomous vehicle plan based on the partitions of the field and the turn features via a planning system, wherein the autonomous vehicle plan comprises a planned route of the autonomous vehicle in the field.
In another embodiment, a non-transitory, computer readable medium comprises instructions that, when executed by a processor, cause the processor to derive one or more partitions of a field based on vehicle system data via a learning system. The instructions, when executed by the processor, further cause the processor to derive one or more turn features representative of vehicle turns in the field based on the vehicle system data via the learning system. The instructions, when executed by the processor, additionally cause the processor to derive an autonomous vehicle plan based on the partitions of the field and the turn features via a planning system, wherein the autonomous vehicle plan comprises a planned route of the autonomous vehicle in the field.
These and other features, aspects, and advantages of the present invention will become better understood when the following detailed description is read with reference to the accompanying drawings in which like characters represent like parts throughout the drawings, wherein:
Certain agricultural and other operations (mining, construction, and the like) may use an unmanned and/or manned vehicle such as a tractor or other vehicle. For agricultural operations, the vehicle may tow or include an agricultural implement such as a planter, seeder, fertilizer, and so on. In operations, the vehicle uses a plan suitable for defining vehicle operations. For example, the plan may include a map having field boundaries, as well as driving paths, row turns, agricultural operations (e.g., planting, seeding, fertilizing, plowing), and the like, that the autonomous vehicle system should follow. The vehicle may then autonomously operate based on the plan data. While plan data may be designed, for example, via a human operator, e.g., a human planner, it may be beneficial to provide techniques that may create and/or update a plan based on certain data, such as data derived from observations of a human operator (e.g., an agricultural vehicle operator) working on a field. Accordingly, the techniques described herein include a learning system that may observe a human operator drive a vehicle system and that may then learn how to operate within a certain field based on the observations.
In certain embodiments, the learning system may extract general features from the observations, such as a size of headlands, direction of rows, ordering of sections, turn types used, field partitioning, and so on. The extracted information may be stored and used when planning operations in the same field used to gather the observations. Learned parameters may be general so that they can be applied in a flexible way during planning. For example, recording the actual path may be too specific, because the recorded path may not be appropriate if the implement width changes or obstacles are introduced to the field. The learning system would derive certain learned parameters but still flexibly modify the learned parameters based on certain changes to operations. For example, if the turn angle has been adjusted such that the autonomous vehicle can no longer turn inside of certain angles, then the size of the headlands may be increased by the learning system. In this manner, a more efficient and productive plan may be realized, thus improving farming operations and increasing field yields.
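As a purely illustrative sketch of keeping a learned parameter general and adjusting it when operations change, the following Python snippet (hypothetical function and parameter names, not part of any described embodiment) widens a learned headland so that an end-of-row turn remains possible after the vehicle's minimum turning radius or the implement width changes:

```python
def adjust_headland_width(learned_width_m: float,
                          min_turn_radius_m: float,
                          implement_width_m: float) -> float:
    """Return a headland width that still permits an end-of-row turn.

    A U-turn roughly requires twice the minimum turning radius, and the
    headland should cover at least one implement width; the learned value
    is kept unless it no longer satisfies those constraints.
    """
    required_for_turn = 2.0 * min_turn_radius_m
    return max(learned_width_m, required_for_turn, implement_width_m)


# Example: a 12 m headland was learned from observation, but a vehicle that
# now needs an 8 m turning radius forces the headland out to 16 m.
print(adjust_headland_width(12.0, 8.0, 6.0))  # 16.0
```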
Turning now to
As the agricultural vehicle 10 and the agricultural implement 12 traverse the field during operations, the agricultural vehicle 10 and the agricultural implement 12 may be driven in certain directions 16 and at preferred speeds by the human operator. The human operator may additionally select and use certain operational settings for the agricultural vehicle 10 and/or the agricultural implement 12, such as trench depth, seed flow rate, fertilizer flow rate, pump rates, pressures, height of equipment (e.g., planter height), and so on. As the end of a field is reached, the operator may turn, leaving headlands 20 and 21 unplowed and/or unplanted, and continue operations in an adjacent or nearby row. Accordingly, turns 22 are shown. The turns 22 may include a turn direction (clockwise, counterclockwise), a turn angle, and so on.
The operator may partition the field into various sections. For example, two unplanted headland sections 20, 21 are shown. The field is also partitioned into two planting sections 24, 26 having field edges 28, 30, and 30, 32, respectively. The planting sections 24 and 26 may be created for any variety of reasons, such as to have multiple crops, to operate differently because of soil conditions, because of terrain, because of the geographical shape of the field, because of obstacles, and so on. While the two headland sections 20, 21 and the two planting sections 24, 26 are shown, it is to be understood that the field may be partitioned into one or more planting sections and/or one or more headland sections. Indeed, depending on the operations involved (e.g., fertilizing, seeding), the geography of the field 14, the field 14 conditions, and so on, the field 14 may include multiple other partitions in addition to or as an alternative to headland and planting partitions.
The agricultural vehicle 10 operator may choose various driving patterns with which to operate the agricultural vehicle 10 and the agricultural implement 12 in the field 14, such as a headland pattern, a circuitous pattern, a spiral pattern, and so on, as described in more detail below. As the agricultural vehicle 10 and the agricultural implement 12 are driven, the operator may encounter various field and/or soil conditions, as well as certain structures. Such field and/or soil conditions and structures may be defined as features for purposes of the description herein. For example, the agricultural vehicle 10 and the agricultural implement 12 may encounter features such as a pond 34, a tree stand 36, a building or other standing structure 38, fencing 40, points of interest 42, and so on. The points of interest 42 may include water pumps, above ground fixed or movable equipment (e.g., irrigation equipment, planting equipment), and so on. In certain embodiments, the agricultural vehicle 10 is communicatively coupled to a base station 44. Accordingly, geographic coordinates, agricultural vehicle 10 speed, agricultural vehicle 10 and/or agricultural implement 12 settings (e.g., fuel usage, fuel type, RPMs, trench depth, seed flow rate, fertilizer flow rate, pump rates, pressures, height of equipment), and so on, may be communicated to the base station 44 and/or to a local data logger disposed in the agricultural vehicle 10. Accordingly, drive and/or operational data may be gathered and then used, as described in more detail below, by a learning system suitable for improving planning applications.
Turning now to
Additionally, the vehicle data 54 may include on-board diagnostics II (OBDII) bus and/or controller area network (CAN) bus data such as oil pressure, miles per gallon (MPG), revolutions per minute (RPM), and so on. Implement data 56 may include data such as implement manufacturer, implement type (e.g., type of planter, type of row unit used to plant seed, type of closing system used to close ground trenches, type of fertilizer equipment), implement settings (height above ground, number of row units used in planting, spacing of row units), and/or implement operations data gathered during operations of the agricultural implement 12, such as seeding rates, fertilizer flow rates, pump rates, pressures including vacuum measures, and so on.
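For illustration only, observations of this kind could be represented as simple structured samples; the field names below are assumptions and are not drawn from any particular bus message or implement interface:

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class VehicleSample:
    """One logged observation of vehicle state (illustrative fields only)."""
    timestamp_s: float
    latitude_deg: float
    longitude_deg: float
    speed_mps: float
    heading_deg: float
    rpm: Optional[float] = None            # e.g., from the OBDII/CAN bus
    oil_pressure_kpa: Optional[float] = None


@dataclass
class ImplementSample:
    """One logged observation of implement state (illustrative fields only)."""
    timestamp_s: float
    implement_type: str                    # e.g., "planter", "fertilizer"
    height_above_ground_m: float
    seed_rate_per_ha: Optional[float] = None
    fertilizer_flow_lpm: Optional[float] = None
```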
The planning features 52 extracted or otherwise “learned” by the learning system 50 may include field partitions 58, turn features 60, driving features 62, operational setting features 64, operating patterns 66, and other features 68. Deriving the field partitions 58 may include learning how operators divide the field 14 into sections or partitions, such as headland sections, planting sections, obstacle sections, fertilizing sections, harvesting sections, and so on. Accordingly, the field 14 may be split into various portions, such as square portions, circular portions, or other geometric and/or geographic shapes. The field partitions 58 may overlap. Additionally, the field partitions 58 may be different depending on field 14 operations. For example, field partitions 58 derived for planting operations may be different from field partitions 58 derived for harvesting operations.
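One possible, purely illustrative representation of a derived field partition is a polygon tagged with the kind of section and the operation it was learned for, which allows partitions for different operations to coexist and overlap:

```python
from dataclasses import dataclass
from typing import List, Tuple

LatLon = Tuple[float, float]  # (latitude, longitude) in degrees


@dataclass
class FieldPartition:
    """A learned partition of the field (illustrative structure only)."""
    name: str               # e.g., "headland-north", "planting-east"
    kind: str               # "headland", "planting", "obstacle", ...
    operation: str          # operation the partition was learned for
    boundary: List[LatLon]  # polygon vertices; partitions may overlap


# Partitions learned for planting may differ from those learned for
# harvesting, so both sets can be stored and selected by operation.
planting_partitions = [
    FieldPartition("planting-east", "planting", "planting",
                   [(41.000, -111.900), (41.000, -111.890),
                    (41.010, -111.890), (41.010, -111.900)]),
]
```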
Deriving turn features 60 may include deriving a type of turn 22 (e.g., clockwise, counterclockwise), a turn 22 angle, a turn 22 speed, and the like. Deriving driving features 62 may include deriving how operators approach driving during certain field 14 operations (e.g., planting, seeding, fertilizing), such as speeds used, direction of travel, slowdowns/speedups, number of rows 18, spacing between the rows 18, pattern between the rows 18 (e.g., parallel rows, zig zag rows, spiral rows, circular rows), preferred route(s) around obstacles in the field 14, and so on. Deriving operational setting features 64 may include, for example, deriving how the operator uses the agricultural implement 12 while driving. For example, the operator may set certain equipment based on the field partitions 58, the type of operations (e.g., planting, seeding, fertilizing), the time of the year (e.g., spring planting, fall planting), conditions in the field 14 (e.g., wet conditions, dry conditions), and so on. Accordingly, the operational setting features 64 derived by the learning system 50 may include agricultural implement 12 settings for a variety of field 14 conditions and for the field partitions 58, as well as how the agricultural implement 12 is used during operations (e.g., seed rates, fertilizer rates, equipment height, pressures, flows, and so on).
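As a simplified sketch (not the claimed learning system), a turn direction and angle could be estimated from consecutive heading samples in a recorded drive:

```python
from typing import List, Optional, Tuple


def wrap_angle_deg(delta: float) -> float:
    """Wrap a heading difference into the range (-180, 180] degrees."""
    return (delta + 180.0) % 360.0 - 180.0


def extract_turn(headings_deg: List[float],
                 threshold_deg: float = 60.0) -> Optional[Tuple[str, float]]:
    """Return (direction, total_angle_deg) if the heading log contains a turn.

    Compass headings increase clockwise, so a positive accumulated change is
    treated as a clockwise turn. Returns None when no turn is detected.
    """
    total = 0.0
    for prev, curr in zip(headings_deg, headings_deg[1:]):
        total += wrap_angle_deg(curr - prev)
    if abs(total) < threshold_deg:
        return None
    direction = "clockwise" if total > 0 else "counterclockwise"
    return direction, abs(total)


# A 180 degree end-of-row turn sampled every few seconds:
print(extract_turn([0, 30, 60, 90, 120, 150, 180]))  # ('clockwise', 180.0)
```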
Deriving operating patterns 66 may include deriving driving patterns for the field 14 during operations of the agricultural vehicle 10 and the agricultural implement 12. For example, the operator may elect to drive in a spiral or circuit pattern, where operations may begin at an edge of a field partition 58 and the operator drives the agricultural vehicle 10 and the agricultural implement 12 towards a center of the partition 58 in a spiral or circle. Headland patterns may involve using some or all of the headlands 20, 21 for turning, and may further include one way patterns, gathering patterns, and/or casting patterns. In one way patterns, the agricultural vehicle 10 and the agricultural implement 12 may be driven on rows 18 parallel to each other starting, for example, at the headland 21, driving “up” towards the headland 20, turning “down” towards the headland 21, then turning back up towards the headland 20, and so on. Gathering patterns may involve beginning at the center of the field 14 and driving towards the edges 28, 32. Casting patterns may involve starting first at the edges 28 or 32 and then working towards the center of the field 14. It is to be understood that any number of operating patterns 66 may be used that result in spiral rows, parallel rows, zig zag rows, and so on. Other features 68 may be derived, including deriving that there are certain obstacles in the field 14 (e.g., pond 34, tree stand 36, building 38, fencing 40, points of interest 42).
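A one way pattern of parallel passes between the two headlands could be sketched as below; the coordinate frame, spacing, and dimensions are assumptions for illustration only:

```python
from typing import List, Tuple


def one_way_pattern(field_width_m: float, row_length_m: float,
                    implement_width_m: float) -> List[Tuple[float, float]]:
    """Generate (x, y) waypoints for alternating parallel passes.

    x runs across the field, y runs along the rows between the headlands;
    even passes drive "up" toward the far headland, odd passes drive "down".
    """
    waypoints: List[Tuple[float, float]] = []
    x = implement_width_m / 2.0
    pass_index = 0
    while x <= field_width_m:
        if pass_index % 2 == 0:
            waypoints += [(x, 0.0), (x, row_length_m)]   # drive up
        else:
            waypoints += [(x, row_length_m), (x, 0.0)]   # turn, drive down
        x += implement_width_m
        pass_index += 1
    return waypoints


# An 18 m wide section covered by a 6 m implement needs three passes.
print(one_way_pattern(18.0, 300.0, 6.0))
```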
It is also to be noted that the techniques described herein, in certain embodiments, may also use drawing data 70 in addition to or as an alternative to the vehicle data 54 and/or the implement data 56. For example, a user may “draw” on a map of the field 14 a preferred driving route for the agricultural vehicle 10 and the agricultural implement 12. The drawing may include the rows 18 and direction of travel for each row 18, as well as preferred turns 22. The user may also draw obstacles, headland sections, planting sections, fertilizer sections, field edges 28, 32, and so on. The user may further draw preferred row patterns (e.g., parallel rows 18, spiral rows, zig zag rows), turns, start/end points for the agricultural vehicle 10 and the agricultural implement 12, and so on. Additionally, the user may enter other data as drawing data 70, such as by typing in desired driving speeds, seed deposition rates, fertilizer rates, pressures, equipment type, equipment manufacturer, fuel type, and so on. In short, any and all data 54 and 56 may be manually entered by the user in addition to or as an alternative to observing operations of the agricultural vehicle 10 and the agricultural implement 12.
The learning system 50 may then apply certain artificial intelligence (AI) techniques, such as knowledge extraction techniques, neural network pattern matching techniques, data mining techniques, machine learning techniques, and so on, to derive the planning features 52. For example, rule-based systems (e.g., forward chained expert systems, backward chained expert systems) may use one or more rules to determine the field partitions 58, the turn features 60, the driving features 62, the operational setting features 64, the operating patterns 66, and other features 68. In certain embodiments, experts, such as expert operators of the agricultural vehicle 10 and the agricultural implement 12, may be interviewed and knowledge elicitation techniques used during the interview to create the one or more rules for the rule-based system.
Likewise, neural networks, support vector machines (SVMs), and the like, may be trained to recognize patterns that result in deriving the field partitions 58, the turn features 60, the driving features 62, the operational setting features 64, the operating patterns 66, and other features 68. Similarly, data mining techniques such as clustering, classification, pattern recognition, and the like, may be used to derive the field partitions 58, the turn features 60, the driving features 62, the operational setting features 64, the operating patterns 66, and other features 68. Additionally or alternatively, machine learning techniques such as decision tree learning, association rule learning, deep learning, inductive logic programming, genetic algorithms, and so on, may be used to learn from the data 54, 56, and/or 70 and identify the field partitions 58, the turn features 60, the driving features 62, the operational setting features 64, the operating patterns 66, and other features 68.
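As one concrete possibility among the techniques listed above (and not necessarily the approach of any particular embodiment), logged positions could be clustered to suggest candidate field partitions; the snippet assumes NumPy and scikit-learn are available:

```python
import numpy as np
from sklearn.cluster import KMeans  # any clustering technique could be substituted


def suggest_partitions(positions_xy: np.ndarray, n_partitions: int = 2) -> np.ndarray:
    """Cluster logged (x, y) positions into candidate partitions.

    positions_xy has shape (n_samples, 2) in field-local meters; the returned
    integer label per sample identifies a candidate partition.
    """
    model = KMeans(n_clusters=n_partitions, n_init=10, random_state=0)
    return model.fit_predict(positions_xy)


# Two bands of driving activity separated in x suggest two planting sections.
rng = np.random.default_rng(0)
east = rng.normal([50.0, 150.0], 10.0, size=(200, 2))
west = rng.normal([250.0, 150.0], 10.0, size=(200, 2))
labels = suggest_partitions(np.vstack([east, west]))
print(np.bincount(labels))  # roughly 200 samples per candidate partition
```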
The planning features 52 may then be provided to a planning system 72, which may use the planning features 52 as input to derive a plan 74. For example, the planning system 72 may use a map 76 of the field 14, along with a set of planning inputs 78. The planning inputs 78 may include the operation (e.g., planting, fertilizing, seeding) that will be executed, for example, via an autonomous embodiment of the vehicle 10. The planning inputs 78 may additionally include the type of agricultural implement 12 to be used, including desired settings for the agricultural implement 12, such as number of row units, type of row units, number of closing systems, type of closing systems, number of fertilizers, type of fertilizing equipment, type of seeds, type of crop to be planted, and so on.
The planning system 72 may then use one or more of the derived planning features 52, the map 76, and/or the planning inputs 78 to derive the plan 74. For example, the field partitions 58 may be respected, and the plan may include different approaches for each partition 58, including starting and stopping points, speeds, directions, turns, obstacle avoidance routes, and so on. In certain embodiments, the planning system 72 may also apply neural network pattern matching techniques, data mining techniques, machine learning techniques, and so on, to derive the plan 74. For example, the planning system 72 may execute an expert system that uses a set of rules to determine a more optimal driving route through the one or more field partitions 58, including speeds to use, where to turn, how to turn (e.g., turn degrees), seed deposition rates to use, fertilizer rates to use, preferred paths to take around obstacles, backup paths to take around obstacles, and so on. Similarly, neural networks, data mining, machine learning techniques, and other artificial intelligence (AI) techniques may be used in addition to or as an alternative to expert systems to derive (or to help derive) the plan 74. In this manner, the plan 74 may include operator knowledge and field 14 data that, when used, may result in more efficient and more optimal field 14 operations.
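A highly simplified, rule-based sketch of how a planner might combine learned partitions, turn features, and planning inputs into per-partition route segments is shown below; all of the dictionary keys and function names are illustrative assumptions, not the system's actual interfaces:

```python
from typing import Dict, List


def derive_plan(partitions: List[Dict], turn_features: Dict,
                planning_inputs: Dict) -> Dict:
    """Assemble a per-partition plan from learned features and planning inputs."""
    plan = {"operation": planning_inputs["operation"], "segments": []}
    for partition in partitions:
        plan["segments"].append({
            "partition": partition["name"],
            "speed_mps": partition.get("preferred_speed_mps", 2.0),
            "turn": turn_features,                        # learned end-of-row turn
            "row_spacing_m": planning_inputs["implement_width_m"],
        })
    return plan


plan = derive_plan(
    [{"name": "planting-east", "preferred_speed_mps": 2.5}],
    {"direction": "clockwise", "angle_deg": 180.0},
    {"operation": "planting", "implement_width_m": 6.0},
)
print(plan["segments"][0]["speed_mps"])  # 2.5
```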
In the illustrated embodiment, the agricultural vehicle 10 includes a spatial location system 108 which is configured to determine a geographic position of the agricultural vehicle 10. As will be appreciated, the spatial location system 108 may include any suitable system configured to determine the position of the agricultural vehicle 10, such as a global positioning system (GPS), for example, and/or GLONASS or other similar system. In certain embodiments, the spatial location system 108 may additionally or alternatively be configured to determine the position of the agricultural vehicle 10 relative to a fixed point within the field 14 (e.g., via a fixed radio transceiver). Accordingly, the spatial location system 108 may be configured to determine the position of the agricultural vehicle 10 relative to a fixed global coordinate system (e.g., via the GPS), a fixed local coordinate system, or a combination thereof. The spatial location system 108 may additionally use real time kinematic (RTK) techniques to enhance positioning accuracy.
The computing system 102 additionally includes a user interface 110 having a display 112. The user interface 110 may receive inputs from a vehicle operator suitable for recording observations of the field 14, among other inputs. The display 112 may provide for textual and/or graphical visualizations suitable for field operations, among others, of the vehicle 10 and/or agricultural implement 12. Data 54 and 56 may be gathered and/or communicated via a field observation system 114. The field observation system 114 may include, for example, software stored in a memory 116 and executed via one or more hardware processors 118 as described in more detail below. A storage device 120 may also be provided, suitable to store digital data, for example, the data 54 and 56.
The processor 118 may include multiple microprocessors, one or more “general-purpose” microprocessors, one or more special-purpose microprocessors, and/or one or more application specific integrated circuits (ASICs), or some combination thereof. For example, the processor 118 may include one or more reduced instruction set (RISC) processors. The memory 116 may include a volatile memory, such as random access memory (RAM), and/or a nonvolatile memory, such as read-only memory (ROM). The memory 116 may store a variety of information and may be used for various purposes. For example, the memory 116 may store processor-executable instructions (e.g., firmware or software) for the processor 118 to execute, such as instructions for the field observation system 114. The storage device 120 may include a hard drive, an optical drive, a flash drive, a solid state storage medium, or a combination thereof, suitable for storing digital data, such as driving data logs, agricultural implement 12 settings, and so on.
An inertial measurement unit (IMU) 122 may also be included, which may include one or more sensors, such as specific force sensors, angular rate sensors, and/or magnetic field change sensors, that may provide for inertial measurements as the vehicle 10 traverses the field 14. The IMU 122 may be used, for example, in at least two ways. One example of use is for terrain compensation, which accounts for motion of the spatial location system 108 antenna due to vehicle 10 pitch and roll motions. A second example of use is through a dead-reckoning algorithm (e.g., executable via the processor 118 and stored in the memory 116) that verifies motion of the derived GPS position against acceleration measurements obtained via the IMU 122 sensors. Dead-reckoning is used to detect GPS hops (e.g., rapid motion of the GPS position that may be caused by solar and/or atmospheric disturbances) to fine-tune the position of the vehicle 10. Another use of the IMU 122 is to take a heading measurement to orient the vehicle 10 properly in the mapping software and to compensate for poor heading information from the spatial location system 108 at slow speeds or when stopped. In this manner, more accurate positioning data may be provided as part of the data 54, 56.
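The dead-reckoning check described above can be illustrated with a small sketch that compares the GPS-reported displacement against the displacement predicted from the last known velocity and the IMU acceleration; the tolerance and all names are assumptions:

```python
from typing import Tuple

XY = Tuple[float, float]


def gps_hop_detected(prev_xy: XY, curr_xy: XY, velocity_xy: XY,
                     accel_xy: XY, dt_s: float,
                     tolerance_m: float = 2.0) -> bool:
    """Flag a GPS position jump that the dead-reckoned motion cannot explain.

    Positions are in field-local meters; velocity is in m/s and acceleration
    (from the IMU) in m/s^2. A hop is reported when the GPS displacement
    differs from the dead-reckoned displacement by more than tolerance_m.
    """
    gps_dx = curr_xy[0] - prev_xy[0]
    gps_dy = curr_xy[1] - prev_xy[1]
    dr_dx = velocity_xy[0] * dt_s + 0.5 * accel_xy[0] * dt_s ** 2
    dr_dy = velocity_xy[1] * dt_s + 0.5 * accel_xy[1] * dt_s ** 2
    error = ((gps_dx - dr_dx) ** 2 + (gps_dy - dr_dy) ** 2) ** 0.5
    return error > tolerance_m


# Vehicle moving about 2 m/s north; GPS suddenly reports a 10 m eastward jump.
print(gps_hop_detected((0.0, 0.0), (10.0, 2.0), (0.0, 2.0), (0.0, 0.0), 1.0))  # True
```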
In certain embodiments, an operator may drive the vehicle 10 into the field 14 for field operations (e.g., planting, seeding, fertilizing) and the field observation system 114 may then capture the data 54, 56. For example, as the vehicle 10 and the agricultural implement 12 are driven through the field 14, the spatial location system 108 and the IMU 122 may be used to provide geographic locations, driving direction, speed, and so on, and the field observation system 114 may continuously capture the location data, the driving directions, the speed, and so on. Likewise, the field observation system 114 may be communicatively coupled to an OBDII and/or CAN port of the vehicle 10, and data such as fuel usage, RPMs, oil level, speed, and so on, may be captured. Further, the data 54, 56 may be recorded and stored in the memory 116, in the storage device 120, and/or communicated via the communication system 104.
In embodiments where the vehicle 10 is an autonomous vehicle, a driving control system 124 is provided, suitable for autonomously driving the vehicle 10. As mentioned earlier, the spatial location system 108 may be used in conjunction with the IMU 122 to derive accurate position data representative of the position of the vehicle 10 in the field 14. The driving control system 124 may use the derived position and may control driving operations such as steering, acceleration, braking, and other agricultural operations (e.g., planting, tilling, fertilizing) of the vehicle 10 and implement 12 in the field 14. It is to be noted that in certain autonomous embodiments of the vehicle 10, the vehicle 10 may provide for a human operator to control the vehicle 10 as an alternative to the driving control system 124. In these autonomous embodiments, the human operator may drive the vehicle 10 while the field observation system 114 collects the data 54, 56.
The data 54, 56 may be transmitted to the communications system 106 of the base station 44. The base station 44 may then execute the learning system 50 to derive the features 52. Once the features 52 are extracted via the learning system 50, the planning system 72 may be executed to derive the plan 74. As mentioned earlier, the learning system 50 and the planning system 72 may include executable code or instructions. Accordingly, the code or instructions may be executable via processor(s) 126 and stored in a memory 128 included in a computing device 130 of a computing system 132. It is to be understood that the learning system 50 and/or the planning system 72 may be disposed in other systems in addition to or as an alternative to the base station 44. In embodiments where the systems 50 and 72 are software systems, the systems 50 and/or 72 may be stored and executed in a desktop computer, a laptop, a notebook, a tablet, a smart phone, and so on.
The computing system 132 may additionally include a user interface 134 having a display 136. The user interface 134 may receive inputs from a planning operator suitable for creating the plan 74, among other inputs. The display 136 may provide for textual and/or graphical visualizations suitable for learning and/or planning operations, among others, of the vehicle 10 and/or agricultural implement 12. Data 54 and 56 gathered and/or communicated via the field observation system 114 may be provided to the learning system 50 to derive the learned features 52. A storage device 138 may also be provided, suitable to store digital data, for example, the features 52 and/or the plan 74. It is to be noted that the drawing data 70 may be entered via the user interface 110 and/or the user interface 134.
The processor 126 may include multiple microprocessors, one or more “general-purpose” microprocessors, one or more special-purpose microprocessors, and/or one or more application specific integrated circuits (ASICs), or some combination thereof. For example, the processor 126 may include one or more reduced instruction set (RISC) processors. The memory 128 may include a volatile memory, such as random access memory (RAM), and/or a nonvolatile memory, such as read-only memory (ROM). The memory 128 may store a variety of information and may be used for various purposes. For example, the memory 128 may store processor-executable instructions (e.g., firmware or software) for the processor 126 to execute, such as instructions for the learning system 50 and the planning system 72. The storage device 138 may include a hard drive, an optical drive, a flash drive, a solid state storage medium, or a combination thereof, suitable for storing digital data, such as the features 52, the plan 74, driving data logs, agricultural implement 12 settings, and so on.
Operations data 56 may include data that captures agricultural operations during a drive, such as how much seed is deposited, seed deposition rates, fertilizer rates, planting rates, settings of the agricultural implement 12, types of equipment used, manufacturer of the equipment, and so on. The process 200 may then communicate (block 204) the collected data (e.g., data 54, 56, 70) to external systems, for example systems that may include the learning system 50 and/or the planning system 72.
The process 200 may then learn or otherwise derive (block 206) the features 52 of the field 14. For example, the process 200 may execute the learning system 50 to learn field partitions 58, turn features 60, driving features 62, operational setting features 64, operating patterns 66, and other features 68. As mentioned earlier, the learning system 50 may include artificial intelligence techniques such as machine learning, neural networks, data mining, expert systems, and so on, that may be used to derive the features 52. Once the features 52 are derived, for example, from observations of field 14 operations, the features 52 may be used by the planning system 72 to derive or otherwise create (block 208) the plan 74. By learning how field 14 operations are performed, for example, by an experienced farmer, the techniques described herein may result in a more efficient and productive plan 74 suitable for execution via an autonomous vehicle.
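In outline, the observe, learn, and plan flow of the process 200 could be expressed as below; every callable here is a placeholder standing in for the corresponding block of the process, not an actual API:

```python
def observe_learn_and_plan(field_map, planning_inputs, collect_observations,
                           communicate, learn_features, create_plan):
    """Outline of the observe -> learn -> plan flow using placeholder callables."""
    # Capture vehicle, implement, and/or drawing data during a drive.
    observations = collect_observations()
    # Block 204: send the collected data to the system hosting the learning system.
    communicate(observations)
    # Block 206: derive partitions, turn features, driving features, patterns, etc.
    features = learn_features(observations)
    # Block 208: combine learned features with the map and planning inputs.
    return create_plan(features, field_map, planning_inputs)
```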
Turning now to
An order of operations for the partitions shown may also be derived. For example, the operator may desire to plant inside of the partitions 228 first, followed by planting inside partition 236 second. Accordingly, the agricultural vehicle may follow certain headlands and row orientations in a desired planting order. When harvesting, the order may be different, thus resulting in a harvesting order. Indeed, different operations (e.g., planting, seeding, fertilizing, harvesting), implements 12, implement 12 settings, vehicle 10 types, and so on, may result in a different order of partition operations. By deriving multiple partitions, partition types, partition orders of operations, features, and so on, the techniques described herein may improve planning operations and result in a more efficient and productive plan 74.
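For instance, a learned order of partition operations could simply be stored per operation type; the partition identifiers follow the discussion above and the structure is otherwise an illustrative assumption:

```python
from typing import Dict, List

# Order in which partitions are worked may differ by operation (illustrative).
partition_order_by_operation: Dict[str, List[str]] = {
    "planting":   ["partition-228", "partition-236"],
    "harvesting": ["partition-236", "partition-228"],
}


def partitions_in_order(operation: str) -> List[str]:
    """Return the learned partition order for an operation, if one was derived."""
    return partition_order_by_operation.get(operation, [])


print(partitions_in_order("planting"))  # ['partition-228', 'partition-236']
```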
While only certain features of the invention have been illustrated and described herein, many modifications and changes will occur to those skilled in the art. It is, therefore, to be understood that the appended claims are intended to cover all such modifications and changes as fall within the true spirit of the invention.
Bunderson, Nathan Eric, Morwood, Daniel John, Turpin, Bret Todd, Foster, Christopher Alan, Posselius, John Henry