Systems and methods for development testing of vehicles and components are disclosed. In one embodiment, a system includes a position reference system and a command and control architecture. The position reference system is configured to repetitively measure one or more position and motion characteristics of one or more vehicles operating within a control volume. The command and control architecture is configured to receive the repetitively measured characteristics from the position reference system, and to determine corresponding control signals based thereon. The control signals are then transmitted to the one or more vehicles to control at least one of position, movement, and stabilization of the one or more vehicles in a closed-loop feedback manner. The system may further include a health monitoring component configured to monitor health conditions of the one or more vehicles, the control signals being determined based at least in part on the health conditions.

Patent: 7,813,888
Priority: Jul 24, 2006
Filed: Jul 24, 2006
Issued: Oct 12, 2010
Expiry: Jul 13, 2029
Extension: 1085 days
Entity: Large
24. A method of operating one or more vehicles within a control volume, comprising:
measuring one or more stability and control characteristics of the one or more vehicles using a position reference system including a plurality of measuring devices operatively disposed with respect to the control volume as the one or more vehicles operate within the control volume;
receiving the measured one or more stability and control characteristics from the position reference system;
determining a command signal based on the one or more stability and control characteristics measured using the position reference system;
transmitting the command signal to the one or more vehicles operating within the control volume to control the one or more stability and control characteristics of the one or more vehicles in a closed-loop system; and
simulating one or more simulated vehicles to enable determination of an interoperability of the one or more simulated vehicles with respect to the one or more vehicles operating within the control volume to enable testing of new prototypes before deployment of a physical vehicle.
1. A system for operating one or more vehicles, comprising:
a position reference system including a plurality of cameras operatively disposed along at least a portion of a perimeter of a control volume, the position reference system being configured to repetitively measure one or more stability and control characteristics of the one or more vehicles at least in part using the plurality of cameras as the one or more vehicles operate within the control volume;
a control architecture configured to receive the repetitively measured one or more stability and control characteristics from the position reference system, to determine corresponding command signals based on the repetitively measured one or more stability and control characteristics from the position reference system, and to transmit the corresponding command signals to the one or more vehicles operating within the control volume to control the one or more stability and control characteristics of the one or more vehicles in a closed-loop system; and
a simulation component configured to simulate one or more simulated vehicles via the control architecture, and to determine the corresponding command signals based at least in part on a simulated characteristic of the one or more simulated vehicles.
15. A system for operating one or more vehicles, comprising:
a position reference system including a plurality of cameras operatively disposed with respect to a control volume, the position reference system being configured to repetitively measure one or more stability and control characteristics of the one or more vehicles at least in part using the plurality of cameras as the one or more vehicles operate within the control volume;
a health monitoring component configured to receive health monitoring information from the one or more vehicles, and to assess a health condition of the one or more vehicles based on the received health monitoring information;
a communication component configured to receive the repetitively measured one or more stability and control characteristics from the position reference system and the health monitoring information from the health monitoring component; and
a processing component operatively coupled to the communication component and configured to determine corresponding command signals based on the repetitively measured one or more stability and control characteristics and the health monitoring information, the communication component being further configured to transmit the corresponding command signals to the one or more vehicles to control the one or more stability and control characteristics in a closed-loop system.
2. The system of claim 1, wherein the control architecture includes a health monitoring component configured to receive health monitoring information from the one or more vehicles, and to assess a health condition of the one or more vehicles based on the received health monitoring information, the control architecture being further configured to determine the corresponding command signals based at least partially on the health monitoring information of the one or more vehicles.
3. The system of claim 2, wherein the health monitoring information includes at least one of a characteristic of an onboard vehicle propulsion system, a characteristic of an onboard power system, and a functional degradation of an onboard sensor system.
4. The system of claim 3, wherein at least one of the vehicles includes a flight vehicle, and wherein the onboard vehicle propulsion system comprises one or more rotor assemblies configured to provide lift for the flight vehicle.
5. The system of claim 1, wherein the control architecture includes:
at least one communication network operatively coupled to the position reference system;
a main processing component operatively coupled to the at least one communication network and configured to compute the command signals; and
one or more control modules operatively coupled to the at least one communication network and configured to receive the command signals from the main processing component, to recondition the command signals into a format suitable for use by the one or more vehicles, and to transmit the reconditioned command signals to the one or more vehicles.
6. The system of claim 1, wherein the control architecture includes:
at least one communication network operatively coupled to the position reference system and configured to transmit the one or more stability and control characteristics repetitively measured by the position reference system; and
an onboard processing component positioned on each of the vehicles and configured to receive the one or more stability and control characteristics from the at least one communication network, the onboard processing component being further configured to compute the command signals and to communicate the command signals to an onboard control system of each of the vehicles.
7. The system of claim 6, wherein the onboard processing component further includes a health monitoring component configured to receive health monitoring information regarding an associated one of the vehicles, the onboard processing component being further configured to determine the corresponding command signals based at least partially on the health monitoring information of the associated one of the vehicles.
8. The system of claim 1, wherein the simulation component is configured to determine interoperability of the one or more simulated vehicles with respect to the one or more vehicles operating within the control volume to enable testing of new prototypes before deployment of a physical vehicle.
9. The system of claim 1, wherein at least one of the vehicles and simulated vehicles includes at least one of a flight vehicle and a ground-based vehicle.
10. The system of claim 1, wherein the simulated characteristic of the one or more simulated vehicles includes at least one of a simulated position characteristic, a simulated attitude characteristic, a simulated stability characteristic, a simulated dynamic characteristic, and a simulated health condition characteristic.
11. The system of claim 1, wherein the plurality of cameras are configured to determine six-degree-of-freedom information for each of the vehicles operating within the control volume.
12. The system of claim 1, wherein each of the one or more vehicles has a plurality of retro-reflective markers positioned thereon, and wherein the plurality of cameras are configured to detect at least some of the retro-reflective markers positioned on the one or more test vehicles.
13. The system of claim 12, wherein the plurality of cameras are configured to track the retro-reflective markers using a visible wavelength portion of the spectrum.
14. The system of claim 12, wherein the plurality of cameras are configured to track the retro-reflective markers using an infrared wavelength portion of the spectrum.
16. The system of claim 15, wherein the health monitoring information includes at least one of a characteristic of an onboard vehicle propulsion system, a characteristic of an onboard power system, and a functional degradation of an onboard sensor system.
17. The system of claim 15, wherein the communication component includes:
at least one communication network operatively coupled to the position reference system; and
one or more control modules operatively coupled to the at least one communication network and configured to receive the command signals from the processing component, to recondition the command signals into a format suitable for use by the one or more vehicles, and to transmit the reconditioned command signals to the one or more vehicles.
18. The system of claim 15, wherein:
the communication component includes at least one communication network operatively coupled to the position reference system and configured to transmit the one or more stability and control characteristics repetitively measured by the position reference system; and
the processing component includes one or more onboard processing components, each processing component being positioned on an associated one of the vehicles and configured to receive the one or more stability and control characteristics from the at least one communication network, the onboard processing component being further configured to compute the command signals and to communicate the command signals to an onboard control system of the associated one of the vehicles.
19. The system of claim 15, further comprising a simulation component operatively coupled to the processing component and configured to simulate one or more simulated vehicles, the processing component being further configured to determine the corresponding command signals based at least in part on a simulated characteristic of the one or more simulated vehicles.
20. The system of claim 19, wherein the simulated characteristic of the one or more simulated vehicles includes at least one of a simulated position characteristic, a simulated attitude characteristic, a simulated stability characteristic, a simulated dynamic characteristic, and a simulated health condition characteristic.
21. The system of claim 15, wherein the plurality of cameras are operatively disposed along at least a portion of a perimeter of the control volume.
22. The system of claim 15, wherein each of the one or more vehicles has a plurality of retro-reflective markers positioned thereon, and wherein the plurality of cameras are configured to detect at least some of the retro-reflective markers positioned on the one or more test vehicles.
23. The system of claim 22, wherein the plurality of cameras are configured to track the retro-reflective markers using at least one of a visible wavelength portion of the spectrum and an infrared wavelength portion of the spectrum.
25. The method of claim 24, further comprising receiving health monitoring information from the one or more vehicles, and wherein determining a command signal includes determining a command signal based at least partially on the health monitoring information from the one or more vehicles.
26. The method of claim 25, wherein receiving health monitoring information includes receiving at least one of a characteristic of an onboard vehicle propulsion system, a characteristic of an onboard power system, and a functional degradation of an onboard sensor system.
27. The method of claim 26, wherein at least one of the vehicles includes a flight vehicle, and wherein the onboard vehicle propulsion system comprises one or more rotor assemblies configured to provide lift for the flight vehicle.
28. The method of claim 24, wherein:
receiving the measured one or more stability and control characteristics includes receiving the measured one or more stability and control characteristics using at least one communication network;
determining a command signal includes determining a command signal using a main processing component operatively coupled to the at least one communication network and reconditioning the command signal into a format suitable for use by the one or more vehicles; and
transmitting the command signal to the one or more vehicles includes transmitting the reconditioned command signals to the one or more vehicles.
29. The method of claim 24, wherein:
receiving the measured one or more stability and control characteristics includes receiving the measured one or more stability and control characteristics using at least one communication network;
determining a command signal includes:
receiving the measured one or more stability and control characteristics from the communication network into an onboard processing component positioned on an associated one of the vehicles; and
computing the command signal for the associated one of the vehicles using the onboard processing component.
30. The method of claim 29, wherein the onboard processing component further includes a health monitoring component configured to receive health monitoring information regarding the associated one of the vehicles, the onboard processing component being further configured to determine the command signal based at least partially on the health monitoring information.
31. The method of claim 25, wherein determining the command signal includes determining the command signal based at least in part on a simulated characteristic of the one or more simulated vehicles.
32. The method of claim 31, wherein determining the command signal includes determining the command signal based at least in part on at least one of a simulated position characteristic, a simulated attitude characteristic, a simulated stability characteristic, a simulated dynamic characteristic, and a simulated health condition characteristic.
33. The method of claim 24, wherein measuring one or more stability and control characteristics includes measuring one or more stability and control characteristics using a plurality of detection devices operatively distributed along at least a portion of a perimeter of the control volume, the detection devices being configured to determine six-degree-of-freedom information for each of the vehicles operating within the control volume.
34. The method of claim 33, wherein measuring one or more stability and control characteristics includes measuring one or more stability and control characteristics using a plurality of retro-reflective markers positioned on each of the one or more test vehicles.
35. The method of claim 34, wherein measuring one or more stability and control characteristics includes measuring one or more stability and control characteristics using a plurality of camera devices configured to track the retro-reflective markers using at least one of a visible wavelength portion of the spectrum and an infrared wavelength portion of the spectrum.

This patent application is related to co-pending, commonly-owned U.S. patent application Ser. No. 11/459,631 entitled “Closed-Loop Feedback Control of Vehicles Using Motion Capture Systems” filed concurrently herewith on Jul. 24, 2006, which application is incorporated herein by reference.

This invention relates generally to systems and methods for rapid development and testing of algorithms and configurations of vehicles, including manned and unmanned flight vehicles, as well as water and land-based vehicles.

Existing methods of developing and testing vehicles, including air, water, and land-based vehicles, typically involve both computer simulations and prototype testing. Unfortunately, computer simulations may be relatively time-consuming to perform and may undesirably simplify many of the complexities of the actual system. Similarly, prototype testing may be undesirably expensive. In the case of flight vehicles, conventional systems such as the BAT Unmanned Aerial Vehicle available from MLB Company of Mountain View, Calif., may only yield a relatively limited number of flight hours and conditions due to operating costs, logistical issues, safety regulations, and other factors.

Although prior art methods of developing and testing vehicles have achieved desirable results, there is room for improvement. More specifically, methods and systems that enable development and testing of algorithms and configurations of vehicles to be performed rapidly, accurately, and economically would have significant utility.

The present invention is directed to systems and methods for rapid development and testing of vehicles and vehicle components. Embodiments of the invention may advantageously provide a dramatic increase in test capability, allowing new vehicles (including air, water, and land-based vehicles) and vehicle components (including hardware and software components) to be investigated and developed more rapidly, efficiently, and cost effectively in comparison with prior art systems and methods.

In one embodiment, a system for controllably operating one or more vehicles includes a position reference system and a command and control architecture. The position reference system is configured to repetitively measure one or more position and motion characteristics of the one or more vehicles as the one or more vehicles are operating within a control volume. The command and control architecture is configured to receive the repetitively measured one or more position and motion characteristics from the position reference system, and to determine corresponding control signals (in a centralized control mode of operation) based on the repetitively measured one or more position and motion characteristics from the position reference system. The control signals are then transmitted to the one or more vehicles operating within the control volume to control at least one of position, movement, and stabilization of the one or more vehicles in a closed-loop feedback manner. In an alternate embodiment, the system operates in a distributed control mode of operation in which onboard processing components located on the vehicles receive the information measured by the position reference system and determine the control signals for their associated vehicles.

In a further embodiment, the command and control architecture further includes a health monitoring component configured to receive health monitoring information from the one or more vehicles, and to assess a health condition of the one or more vehicles based on the received health monitoring information. The command and control architecture may be further configured to determine the corresponding control signals based at least in part on the assessed health condition of the one or more vehicles.

In yet another embodiment, a method of operating one or more vehicles includes measuring one or more stability and control characteristics of the one or more vehicles using a position reference system. The measured one or more stability and control characteristics are received from the position reference system, and a command signal is determined based on the one or more stability and control characteristics. The command signal is transmitted to the one or more vehicles operating within the control volume to control the one or more stability and control characteristics of the one or more vehicles in a closed-loop feedback manner.

Embodiments of the present invention are described in detail below with reference to the following drawings.

FIG. 1 is a schematic view of a development system for developing and testing vehicles and vehicle components in accordance with an embodiment of the invention;

FIG. 2 is a schematic view of a command and control architecture of the development system of FIG. 1 in accordance with a particular embodiment of the invention;

FIG. 3 is a flowchart of a control method corresponding to the command and control architecture of FIG. 2 in accordance with another exemplary embodiment of the invention;

FIG. 4 is a schematic view of a development system in accordance with an alternate embodiment of the invention;

FIG. 5 is an enlarged schematic view of a test vehicle of the development system of FIG. 4;

FIG. 6 is a plan view of the test vehicle and a health monitoring control board of the development system of FIG. 4;

FIG. 7 is a partial view of the development system of FIG. 4 in operation with the flight vehicle;

FIG. 8 is a schematic view of a computer system of the development system of FIG. 4;

FIG. 9 is a schematic overview of a health monitoring system of the development system of FIG. 4;

FIGS. 10 through 12 are representative screenshots provided by an analysis and display component of FIG. 9;

FIG. 13 is an isometric view of a test vehicle with an integrated health monitoring control board and processing component in accordance with another embodiment of the invention; and

FIG. 14 is an enlarged elevational view of the health monitoring control board and processing component of FIG. 13.

The present invention relates to systems and methods for rapid development and testing of algorithms and configurations of vehicles, including manned and unmanned flight vehicles, as well as water and land-based vehicles. Many specific details of certain embodiments of the invention are set forth in the following description and in FIGS. 1-14 to provide a thorough understanding of such embodiments. One skilled in the art, however, will understand that the present invention may have additional embodiments, or that the present invention may be practiced without several of the details described in the following description.

In general, systems and methods in accordance with embodiments of the present invention advantageously allow for rapid development and testing of a wide variety of vehicles and vehicle components (hardware and software) in a controlled environment. More specifically, embodiments of the invention may enable new software systems, avionics systems, control algorithms, computer hardware components, sensors, configurations, or other suitable parameters of interest to be quickly and repeatedly tested using one or multiple test vehicles. Because embodiments of the present invention may be scaled to fit within, for example, a suitable laboratory environment, testing and development of new vehicles and vehicle components may be performed rapidly, efficiently, and cost effectively. Thus, embodiments of the invention may provide a dramatic increase in test capability, allowing new vehicles (including air, water, and land-based vehicles) and vehicle components to be more quickly investigated and developed at lower cost.

FIG. 1 is a schematic view of a development system 100 for developing and testing vehicles and vehicle components in accordance with an embodiment of the invention. In this embodiment, the development system 100 includes a command and control computer 102 operatively coupled to a position reference system 120 and to a human interface 150. A plurality of test vehicles 110 are operatively positioned within a control space 122 monitored by the position reference system 120 during operation of the development system 100. As shown in FIG. 1, the test vehicles 110 may include Unmanned Aerial Vehicles (UAVs) 110A, Unmanned Ground Vehicles (UGVs) 110B, or any other suitable type of test vehicles 110C.

The command and control computer 102 operatively communicates with each of the test vehicles 110 via a communications link 105, which may be a wireless link, wire-based link, fiber-optic link, or any other suitable type of communications link. Each communications link 105 carries signals and data between the command and control computer 102 and the test vehicles 110. For example, in the embodiment shown in FIG. 1, the command and control computer 102 is configured to receive video and sensor signals 104 and health monitoring signals 106 from the test vehicles 110, and to transmit appropriate command signals 108 to the test vehicles 110. A command and control software program 112 may be implemented on the command and control computer 102 to perform a variety of functions associated with monitoring and controlling the test vehicles 110 and the various components of the development system 100. Alternately, the command and control computer 102 may include one or more programmable hardware components configured to perform one or more of these functions.

In operation, the command and control computer 102 causes appropriate command signals 108 to be transmitted to one or more of the test vehicles 110, directing the test vehicles 110 to perform desired activities or functions. For example, the one or more UAV test vehicles 110A may be directed to fly in a desired flight path and to collect desired information using an onboard sensor. Similarly, the UGV and other test vehicles 110B, 110C may be directed to traverse a desired ground path, collect information, or perform other desired activities. The test vehicles 110 may be commanded to move independently of one another, or alternately, two or more of the test vehicles 110 may be commanded to move in a coordinated manner, such as in flocking, swarming, or ultraswarming movements, as described more fully, for example, in Beyond Swarm Intelligence: The Ultraswarm, presented at the IEEE Swarm Intelligence Symposium by Holland et al., Jun. 8, 2005, incorporated herein by reference.
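For illustration only, the following Python sketch shows one common way coordinated motion commands of this kind could be generated, using simple boids-style cohesion, separation, and alignment rules. It is not the algorithm of the referenced paper or of any claimed embodiment; the gains and function name are hypothetical assumptions.

```python
# Illustrative flocking-style velocity commands for several test vehicles.
# Gains, distances, and the function name are hypothetical, not from the patent.
import numpy as np

def flocking_velocities(positions, velocities,
                        k_cohesion=0.3, k_separation=1.0, k_alignment=0.5,
                        min_dist=0.5):
    """Return one velocity command per vehicle from simple flocking rules."""
    n = len(positions)
    commands = np.zeros_like(velocities)
    for i in range(n):
        others = [j for j in range(n) if j != i]
        centroid = positions[others].mean(axis=0)
        cohesion = k_cohesion * (centroid - positions[i])        # steer toward the group
        alignment = k_alignment * (velocities[others].mean(axis=0) - velocities[i])
        separation = np.zeros(3)
        for j in others:                                         # push away from close neighbors
            offset = positions[i] - positions[j]
            dist = np.linalg.norm(offset)
            if 1e-6 < dist < min_dist:
                separation += k_separation * offset / dist**2
        commands[i] = velocities[i] + cohesion + alignment + separation
    return commands

# Example: three vehicles in the control space, initially at rest
pos = np.array([[0.0, 0.0, 1.0], [1.0, 0.5, 1.2], [0.5, 1.0, 0.8]])
vel = np.zeros((3, 3))
print(flocking_velocities(pos, vel))
```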

During movement of the test vehicles 110 within the control space 122, the position reference system 120 monitors the positions of the test vehicles 110 and provides position feedback information 122 to the command and control computer 102. The command and control computer 102 compares the position feedback information 122 with the anticipated or desired positions of the test vehicles 110, and causes appropriate command signals 108 to be transmitted to the test vehicles 110 via the communication links 105 to controllably adjust (or maintain) the positions of the test vehicles 110 in their desired positions or along their desired headings. Thus, the position reference system 120 provides the development system 100 with a closed-loop feedback control capability for controllably adjusting the positions and courses of movement of the test vehicles 110. More specifically, the position reference system 120 may advantageously provide closed-loop feedback information 122 that enables the command and control computer 102 to determine and control not only positions and movements, but also attitude and stabilization control commands for proper control and stabilization of the test vehicles 110.
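As a concrete sketch of the closed-loop correction just described, the following Python snippet compares a measured position from the position reference system with a desired position and computes a corrective command. The PD law, gains, and update rate are illustrative assumptions and do not represent any particular embodiment.

```python
# Minimal closed-loop position correction sketch; gains and interfaces are hypothetical.
import numpy as np

class ClosedLoopPositionController:
    def __init__(self, kp=1.2, kd=0.4, dt=0.01):
        self.kp, self.kd, self.dt = kp, kd, dt
        self.prev_error = np.zeros(3)

    def update(self, measured_position, desired_position):
        """Return a velocity-style command computed from the position error."""
        error = desired_position - measured_position
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.kd * derivative

# One feedback cycle: the position reference system reports the measured position,
# and the resulting command would be transmitted to the vehicle over its link.
controller = ClosedLoopPositionController()
command = controller.update(measured_position=np.array([1.0, 2.0, 0.9]),
                            desired_position=np.array([1.0, 2.0, 1.5]))
print(command)  # command pushing the vehicle toward the desired altitude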

Specific embodiments will now be described in greater detail below in order to facilitate a more thorough understanding of various aspects of systems and methods in accordance with the invention. For example, FIG. 2 is a schematic view of a command and control architecture 200 of the development system 100 of FIG. 1 in accordance with a particular embodiment of the invention. In this embodiment, the command and control architecture 200 includes a real environment and dynamics interface 210 configured to perform computations and data management associated with the test vehicles 110 (for simplicity, only the UAV test vehicles 110A and UGV test vehicles 110B are depicted in FIG. 2) within the control space 122. The real environment and dynamics interface 210 may reside on the command and control computer 102 as, for example, part of the command and control software 112. Each test vehicle 110 communicates via the corresponding communication link 105 with the real environment and dynamics interface 210.

As further shown in FIG. 2, the command and control architecture 200 further includes a vehicle control unit 212 for each of the test vehicles 110. Each vehicle control unit 212 may be located on board the respective test vehicle 110, such as for manned test vehicles, or alternately, may be separate (or remote) from the test vehicle 110 (e.g. a ground control unit, air-borne control unit, sea-borne control unit, etc.) as shown in FIG. 2. The vehicle control units 212 operatively communicate with a control data network 240 and with a health monitoring network 242.

The real environment and dynamics functionality 210 also operatively interacts with the position reference system 120. It will be appreciated that the position reference system 120 may be any suitable type of system capable of measuring the positions and movements of the test vehicles 110 within the control space 122. In preferred embodiments, the position reference system 120 is capable of measuring each of the six degrees of freedom that define the positions and movements of each test vehicle 110. For example, embodiments of suitable position reference systems may include laser scanner systems, such as those systems commercially-available from Mensi, Inc. of Alpharetta, Ga., and laser radar systems, such as those generally disclosed, for example, in U.S. Pat. No. 5,202,742 issued to Frank et al., U.S. Pat. No. 5,266,955 issued to Izumi et al., and U.S. Pat. No. 5,724,124 issued to Kai. In other embodiments, position reference systems may include imaging systems such as the Cyrax laser-based imaging system commercially-available from Cyra Technologies, Inc. of San Ramon, Calif. In further embodiments, suitable position reference systems may include radar and laser radar systems, such as, for example, the LR200 laser radar system commercially-available from Leica Geosystems, Inc. of Heerbrugg, Switzerland. Alternately, position reference systems may include global positioning systems (GPS) and infrared global positioning systems (IRGPS), such as those systems generally disclosed, for example, in U.S. Pat. No. 5,589,835 issued to Gildea et al., U.S. Pat. No. 6,452,668 B1 issued to Pratt, and U.S. Pat. Nos. 6,501,543 B2, 6,535,282 B2, 6,618,133 B2, and 6,630,993 B1 issued to Hedges et al., and those systems commercially-available from ARC Second, Inc. of Dulles, Va. Further embodiments of position reference systems may include sonar-based ultrasound systems, such as the type described, for example, in High Resolution Maps from Wide Angle Sonar by Moravec et al. of The Robotics Institute of Carnegie Mellon University, and laser-based point tracking systems of the type commercially-available from Automated Precision, Inc. of Rockville, Md., or any other suitable types of position-measuring systems.

With continued reference to FIG. 2, a reformatting module 214 is operatively coupled between the position reference system 120 and the control data network 240, and is configured to receive the position and motion data measured by the position reference system 120, to reformat these data as necessary, and to broadcast these data to various other components of the command and control architecture 200 via the control data network 240. It will be appreciated that the reformatting module 214 may be separate from the position reference system 120, as shown in FIG. 2, or alternately, may be incorporated within the position reference system 120. As described above, the position, attitude, and movement data provided by the position reference system 120 (and reformatted by the reformatting module 214) are used by the command and control architecture 200 in a closed-loop feedback manner by comparing the actual data with the desired or predicted data (e.g. using the real environment and dynamics functionality 210 or the simulated environment and dynamics module 220 described below) and issuing appropriate control and stabilization commands (e.g. using the vehicle control modules 212) to controllably adjust (or maintain) one or more of the positions, attitudes, and movements of the test vehicles 110 accordingly.

In this embodiment, the command and control architecture 200 further includes a simulated environment and dynamics module 220 configured to perform computations and data management associated with one or more simulated vehicle modules 222. The simulated environment and dynamics module 220 may also reside on the command and control computer 102 as part of the command and control software 112. Each simulated vehicle module 222 operatively communicates with the control data network 240 and the health monitoring network 242. The simulated environment and dynamics module 220 is further configured to provide simulated position, attitude, and movement data associated with the simulated vehicles 222, as well as health management data associated with the simulated vehicles 222, to the reformatting module 214 for broadcast onto the control data network 240. Thus, the command and control architecture 200 may advantageously be used for developing test vehicles 110 operating in an environment having both real and simulated vehicle and environmental conditions.
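As an illustrative sketch of how measured data for real vehicles and computed data for simulated vehicles might be merged into one broadcast stream so that consumers on the control data network treat both alike, consider the following Python snippet. The record layout, the toy simulated dynamics, and the publish call are hypothetical assumptions, not the actual data format of the described networks.

```python
# Hypothetical merge of real (measured) and simulated vehicle states into one frame.
import json
import math
import time

def simulated_state(t):
    """Toy dynamics for one simulated vehicle: a slow circular path."""
    return {"x": math.cos(0.1 * t), "y": math.sin(0.1 * t), "z": 1.0,
            "roll": 0.0, "pitch": 0.0, "yaw": 0.1 * t}

def broadcast_frame(real_states, t, publish):
    """Combine real and simulated states into one frame and hand it to the network."""
    frame = {"timestamp": t, "vehicles": {}}
    for vid, state in real_states.items():      # e.g. from the position reference system
        frame["vehicles"][vid] = dict(state)
    frame["vehicles"]["SIM-1"] = simulated_state(t)   # simulated vehicle, same record shape
    publish(json.dumps(frame))                  # stand-in for the broadcast onto the network

# Example use with a print-based "network"
real = {"UAV-1": {"x": 0.2, "y": 0.1, "z": 1.4, "roll": 0.0, "pitch": 0.01, "yaw": 1.2}}
broadcast_frame(real, time.time(), publish=print)
```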

One or more operators 224 may issue control commands to various components of the development system 100 via a command and control module 226 of the human interface 150, which operatively communicates with the control data network 240 and with the health monitoring network 242. For example, an operator 224A may transmit appropriate commands to a simulated vehicle module 222A to direct the movement, attitude, activity, or any other desired characteristic of the simulated vehicle module 222A via the control data network 240. In turn, the one or more operators 224 may monitor any desired characteristics of the development system 100 that may be of interest on a situational display module 228 of the human interface 150, including the positions, movements, and health characteristics of the simulated vehicle modules 222 and the test vehicles 110. For example, the operator 224A of the simulated vehicle module 222A may monitor desired characteristics (e.g. position, movement, health characteristics, etc.) on a simulated vehicle display portion 228A of the situational display software 228.

In the embodiment shown in FIG. 2, the command and control architecture 200 further includes a recharge station 230 operatively coupled to the control data and health monitoring networks 240, 242 by a recharge station control unit 232. The recharge station 230 is configured to provide replenishment of expendable resources to the test vehicles 110 based on the health monitoring information broadcast to the health monitoring network 242. A network manager module 234 is coupled to the control data and health monitoring networks 240, 242 and is configured to perform various conventional network management activities and functions. A recording module 236 is coupled to the control data and health monitoring networks 240, 242 and is configured to record the data broadcast on the control data and health monitoring networks 240, 242 during development tests for subsequent analyses. Playback of recorded tests for post-test analysis and demonstration may be achieved via the control data and health monitoring networks 240, 242, which interface the recording module 236 with the command and control module 226.

FIG. 3 is a flowchart of a control method 300 corresponding to the command and control architecture 200 of FIG. 2 in accordance with another exemplary embodiment of the invention. In this embodiment, the various components of the development system 100 are initiated at a block 302, and one or more test vehicles 110 (and if applicable, simulated vehicles) within the control space 122 are initiated at a block 304. An operator-defined test plan is input at a block 305, and control signals are communicated to the test vehicles 110 at a block 306. For example, in some embodiments, the command and control computer 102 may cause the vehicle control units 212 to transmit appropriate command signals 108 to the test vehicles 110, directing the test vehicles 110 to perform desired activities or functions. The command and control computer 102 may determine the appropriate command signals based on, for example, one or more control algorithms or software routines installed within the command and control software 112.

At a block 308, the position reference system 120 monitors the positions and movements of the test vehicles 110, and if applicable, the positions and dynamics of the simulated vehicles are also calculated. The position and dynamics data measured by the position reference system 120 (and computed for the simulated vehicles) are communicated to the command and control computer 102 at a block 310. In preferred embodiments, the position reference system 120 is capable of measuring each of the six degrees of freedom that define the position and movement of each test vehicle 110; however, in alternate embodiments, the position reference system 120 may suitably measure fewer than six degrees of freedom. Similarly, at a block 312, health monitoring data collected by sensors located on board each of the test vehicles 110, and if applicable, the simulated vehicle health data, are communicated to the command and control computer 102.

In the embodiment shown in FIG. 3, the control method 300 includes an update of the operator's situational display at a block 330. At a block 332, a determination (or series of determinations) is made to determine whether one or more of the test vehicles 110 and simulated vehicles are approaching or crossing safety limits. If so, the operator may issue appropriate control commands at a block 334 to correct the position or course of the one or more vehicles as needed, and the position and course of the one or more vehicles may be adjusted accordingly at a block 336.

As further shown in FIG. 3, the position and dynamics data measured by the position reference system 120, and health monitoring data transmitted by the onboard sensors, are compared with predicted and desired data values at a block 314. For example, the measured positions, attitudes, and velocities of the test vehicles 110 may be compared with desired values based on a pre-programmed mission profile stored within the command-and-control computer 102. Similarly, vehicle health data, such as battery charge levels, fuel levels, pressure and temperature levels, weapons status, other expendable resource levels, and any other desired health parameters may be compared with anticipated or desired values based on the pre-programmed mission profile.

Based on the comparisons performed at the block 314, a determination (or series of determinations) is made at a block 316 to determine whether a position adjustment of one or more of the test vehicles 110 is needed. If so, the position adjustment of the one or more test vehicles 110 is performed at a block 318. For example, the command and control computer 102 may cause appropriate position control commands to issue from the corresponding vehicle control units 212 to controllably adjust the position of the one or more test vehicles 110.

Similarly, at a block 320, a determination (or series of determinations) is made to determine whether a stabilization adjustment of one or more of the test vehicles 110 is needed. If so, adjustment of the appropriate stabilization parameters of the one or more test vehicles 110 is accomplished at a block 322. Prior art approaches to stabilization adjustment typically require the use of onboard sensors, such as rate gyros and accelerometers, on the test vehicles 110 to provide data for stabilization. In the current embodiment, the position reference system 120 provides data for the stabilization adjustment with sufficiently high accuracy and low latency to significantly reduce or eliminate the requirements for these sensors. A further benefit is a reduction in test vehicle weight, since the test vehicles 110 need not carry these sensors.

Next, at a block 324, a determination (or series of determinations) is made to determine whether health conditions of the one or more test vehicles 110 are unacceptable. If so, a determination is made at a block 325 to determine whether the unacceptable health condition is correctable. If so, then corrective action may be taken to address the deficient health conditions of the one or more test vehicles 110 at a block 326, including using the health condition as a constraint in the control commands of the corresponding vehicle(s). If the health conditions of the vehicles are not unacceptable, or if any unacceptable conditions are not correctable, then at a block 328, a determination is made regarding whether the test or mission is complete. If not, the method 300 returns to block 306 and the above-described actions are repeated. Otherwise, the method 300 is complete.
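The following runnable Python sketch condenses the loop of FIG. 3 (blocks 306 through 328) into code. The vehicle model, thresholds, and helper names are hypothetical stand-ins for the system components; only the loop structure follows the method described above.

```python
# Condensed, illustrative version of the control loop of FIG. 3; all numbers are hypothetical.
import random

class Vehicle:
    def __init__(self, name):
        self.name = name
        self.position = [0.0, 0.0, 0.0]
        self.battery_v = 12.0

def measure(vehicle):                       # blocks 308/310: position reference system
    return [p + random.uniform(-0.05, 0.05) for p in vehicle.position]

def health(vehicle):                        # block 312: onboard health sensors
    vehicle.battery_v -= 0.001
    return {"battery_v": vehicle.battery_v}

def run_test(vehicles, waypoint, cycles=100):
    for _ in range(cycles):                 # one pass per control cycle (block 306 onward)
        for v in vehicles:
            measured = measure(v)
            hm = health(v)
            error = [w - m for w, m in zip(waypoint, measured)]   # block 314: compare
            if max(abs(e) for e in error) > 0.02:                 # blocks 316/318: adjust position
                v.position = [p + 0.2 * e for p, e in zip(v.position, error)]
            if hm["battery_v"] < 10.5:                            # blocks 324-326: health constraint
                print(f"{v.name}: low battery, returning to base")
                return
        if all(max(abs(w - p) for w, p in zip(waypoint, v.position)) < 0.02
               for v in vehicles):                                # block 328: mission complete?
            print("mission complete")
            return

run_test([Vehicle("UAV-1")], waypoint=[1.0, 1.0, 1.5])
```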

It will be appreciated that the various steps shown in FIG. 3, where not expressly stated, apply equally to the simulated vehicles. The reformatting module 214 that broadcasts the position and attitude data may combine the data from the real and simulated environments such that the individual vehicles, whether real or simulated, cannot distinguish the origin of the data and are able to react to the other vehicles and to the provided data in a common manner. The simulated environment and dynamics module 220 provides simulated health data to be used in the evaluation at block 324 and for the stabilization and position adjustments of the vehicles at blocks 316, 320. The simulated data may be based upon models developed to emulate the real vehicles and may include deterministic and random behaviors in an effort to best reflect the uncertain real environment.

FIG. 4 is a schematic view of a development system 400 in accordance with an alternate embodiment of the invention. In this embodiment, the development system 400 includes a main processing computer 402 operatively coupled to a position reference system 420 via a data station 404, and to an application computer 450 via a datalink 452 (e.g. an Ethernet connection). A remotely-controlled test vehicle 410 is positioned within a control (or capture) volume 422 monitored by the position reference system 420.

The position reference system 420 includes a plurality of motion capture devices 424 (e.g. cameras) operatively distributed about the control volume 422 and configured to monitor the positions and movements of a plurality of retro-reflective markers 426 disposed on the test vehicle 410. In the embodiment shown in FIG. 4, the motion capture devices 424 operate in the visible portion of the spectrum; however, in alternate embodiments, devices that operate in other portions of the spectrum (e.g. near infrared, infrared, etc.) may be used. The motion capture devices 424 are configured to monitor the retro-reflective markers 426 and to export the positions of the retro-reflective markers 426 to the main processing computer 402 in real-time. Alternately, using a priori knowledge of the relative positions of the retro-reflective markers 426 on the test vehicle 410, the motion capture devices 424 may internally process the measured marker position data to derive position and orientation data of the grouping of markers representing the test vehicle 410, and may output the position and orientation data of the test vehicle 410 to the main processing computer 402.
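For illustration, the following Python sketch shows one standard way a vehicle's position and orientation could be derived from measured marker positions given a priori knowledge of the marker layout, using a least-squares rigid-body fit (the Kabsch/SVD method). It is not the processing performed by any particular motion capture product; the marker layout and numbers are invented for the example.

```python
# Illustrative rigid-body pose recovery from retro-reflective marker positions.
import numpy as np

def rigid_body_pose(body_markers, measured_markers):
    """Return (R, t) such that measured ≈ R @ body + t, in a least-squares sense."""
    body_centroid = body_markers.mean(axis=0)
    meas_centroid = measured_markers.mean(axis=0)
    H = (body_markers - body_centroid).T @ (measured_markers - meas_centroid)
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))          # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = meas_centroid - R @ body_centroid
    return R, t

# Example: hypothetical marker layout on the vehicle (body frame, meters) and one measured frame
body = np.array([[0.10, 0.00, 0.00], [-0.10, 0.00, 0.00],
                 [0.00, 0.10, 0.00], [0.00, 0.00, 0.05]])
yaw = np.deg2rad(30.0)
R_true = np.array([[np.cos(yaw), -np.sin(yaw), 0.0],
                   [np.sin(yaw),  np.cos(yaw), 0.0],
                   [0.0,          0.0,         1.0]])
measured = (R_true @ body.T).T + np.array([1.0, 2.0, 1.5])
R_est, t_est = rigid_body_pose(body, measured)
print(np.round(t_est, 3))   # recovered vehicle position within the control volume
```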

In one particular embodiment, a total of six motion capture devices 424 are distributed about an approximately room-sized control volume 422 (e.g. 25′×25′×8′) and are configured to provide sub-millimeter accuracy in the positions of the retro-reflective markers 426 at refresh rates of up to 500 Hz with 10 ms processing latency. Thus, the position reference system 420 may provide six-degree-of-freedom motion tracking of the test vehicle 410 in approximately real-time to enable closed-loop feedback control of the position, movement, and stabilization characteristics of the test vehicle 410. In alternate embodiments, any suitable number of motion capture devices 424 (e.g. two or more) may be used, and the control volume 422 may be scaled up or down to any desired size. For example, in another particular embodiment, eight motion capture devices 424 are used. Similarly, in alternate embodiments, the motion capture devices 424 may be configured to provide any suitable or desired resolution and operational frequency. Suitable motion capture devices 424 that may be used in the position reference system 420 include those camera systems commercially-available from Vicon Limited of Oxford, UK, as well as camera systems commercially-available from Motion Analysis Corp. of Santa Rosa, Calif. Additional embodiments and operational aspects of suitable position reference systems are described in the above-referenced, commonly-owned U.S. patent application Ser. No. 11/459,631 entitled “Closed-Loop Feedback Control of Vehicles Using Motion Capture Systems” filed concurrently herewith on Jul. 24, 2006, previously-incorporated herein by reference.

FIGS. 5 and 6 are enlarged schematic and plan views, respectively, of the test vehicle 410 in association with other components of the development system 400. FIG. 7 shows the test vehicle 410 in operation within the control volume 422 of the development system 400. As best shown in FIG. 5, in this embodiment, the test vehicle 410 includes an onboard controller 414 operatively coupled to a plurality of rotor assemblies 412 and to a power source 416 (e.g. a battery). A current sensor 417 monitors an electrical current drawn by each rotor assembly 412, and a thermistor 418 monitors a temperature of each rotor assembly 412. In a particular embodiment, the test vehicle 410 is a modified version of a Draganflyer RC helicopter commercially available from Draganfly Innovations, Inc. of Saskatoon, Saskatchewan.

The onboard controller 414 is operatively coupled to a control module 460. The control module 460 may be located on the test vehicle 410, or alternately, may be positioned remotely from the test vehicle 410 and may communicate with the onboard controller 414 via a wireless communication link. In the embodiment shown in FIG. 5, the control module 460 includes a health management component 462 coupled to a processing component 464 that is, in turn, coupled to a communication component 466. The processing and communication components 464, 466 may have somewhat overlapping functions and capabilities. For example, in a particular embodiment, the processing component 464 may perform data collection and relatively low-level onboard processing activities, while the communication component 466 may perform relatively high-level onboard processing activities as well as communication activities. The health management and/or processing components 462, 464 are adapted to monitor the voltage level of the power source 416, the outputs of the current sensors 417 and temperature sensors 418, and to buffer, filter, and condition the signals received from the test vehicle 410 for communication to the communication component 466. In turn, the communication component 466 is configured to transmit these data to a command and control component of the development system 400 to enable health monitoring of the various systems and parameters of the test vehicle 410. In a particular embodiment, the processing component 464 may be a Robostix microcontroller and the communication component 466 may be a Connex 400xm-bt platform, both of which are commercially available from gumstix inc. of Portola Valley, Calif. In another embodiment, a microcontroller (e.g. a Robostix microcontroller) may be combined with a printed circuit board to provide these functions on-board the test vehicle 410, as described below with reference to FIGS. 13 and 14.
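The following Python sketch illustrates the kind of onboard health monitoring just described: sampling battery voltage, rotor currents, and rotor temperatures, smoothing the raw readings, and packaging them for transmission to the command and control component. The sensor read functions, smoothing scheme, and message format are hypothetical assumptions, not the interfaces of the components named above.

```python
# Hypothetical onboard health telemetry conditioning; sensor reads are stand-ins.
import json
import random

def read_battery_voltage():              # stand-in for reading the power source 416
    return 11.1 + random.uniform(-0.05, 0.05)

def read_rotor_current(rotor_id):        # stand-in for a current sensor 417
    return 2.0 + random.uniform(-0.2, 0.2)

def read_rotor_temperature(rotor_id):    # stand-in for a thermistor 418
    return 45.0 + random.uniform(-1.0, 1.0)

class HealthMonitor:
    """Exponentially smooths raw readings and builds a telemetry message."""
    def __init__(self, alpha=0.2):
        self.alpha = alpha
        self.smoothed = {}

    def _smooth(self, key, value):
        prev = self.smoothed.get(key, value)
        self.smoothed[key] = self.alpha * value + (1.0 - self.alpha) * prev
        return self.smoothed[key]

    def sample(self, num_rotors=4):
        msg = {"battery_v": self._smooth("battery_v", read_battery_voltage())}
        for r in range(num_rotors):
            msg[f"rotor{r}_current_a"] = self._smooth(f"i{r}", read_rotor_current(r))
            msg[f"rotor{r}_temp_c"] = self._smooth(f"t{r}", read_rotor_temperature(r))
        return json.dumps(msg)           # would be handed to the communication component

monitor = HealthMonitor()
print(monitor.sample())
```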

FIG. 8 is a schematic view of a computer device 500 suitable for use with the development system 400 of FIG. 4. More specifically, the computer device 500 may be used as the main processing computer 402, or as the application computer 450, or both. In a very basic configuration, the computing device 500 includes at least one processing unit 502 and system memory 504. Depending on the exact configuration and type of computing device 500, the system memory 504 may be volatile (such as RAM), non-volatile (such as ROM and flash memory) or some combination of the two. The system memory 504 typically includes an operating system 506, one or more program modules 508, and may include program data 510.

For methods and systems in accordance with the present disclosure, the program modules 508 may include the process modules 509 that realize one or more of the processes described herein. Other modules described herein may also be part of the program modules 508. As an alternative, the process modules 509, as well as the other modules, may be implemented as part of the operating system 506, or they may be installed on the computing device and stored in other memory (e.g., non-removable storage 522) separate from the system memory 504.

The computing device 500 may have additional features or functionality. For example, the computing device 500 may also include additional data storage devices (removable and/or non-removable) such as, for example, magnetic disks, optical disks, or tape. Such additional storage is illustrated in FIG. 8 by removable storage 520 and non-removable storage 522. Computer storage media may include volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information, such as computer readable instructions, data structures, program modules, or other data. The system memory 504, removable storage 520 and non-removable storage 522 are all examples of computer storage media. Thus, computer storage media includes, but is not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information and which can be accessed by computing device 500. Any such computer storage media may be part of the device 500. Computing device 500 may also have input device(s) 524 such as keyboard, mouse, pen, voice input device, and touch input devices. Output device(s) 526 such as a display, speakers, and printer, may also be included. These devices are well known in the art and need not be discussed at length.

The computing device 500 may also contain a communication connection 528 that allows the device to communicate with other computing devices 530, such as over a network. Communication connection(s) 528 is one example of communication media. Communication media may typically be embodied by computer readable instructions, data structures, program modules, or other data in a modulated data signal, such as a carrier wave or other transport mechanism, and includes any information delivery media.

Various modules and techniques may be described herein in the general context of computer-executable instructions, such as program modules, executed by one or more computers or other devices. Generally, program modules include routines, programs, objects, components, data structures, and so forth for performing particular tasks or implementing particular abstract data types. These program modules and the like may be executed as native code or may be downloaded and executed, such as in a virtual machine or other just-in-time compilation execution environment. Typically, the functionality of the program modules may be combined or distributed as desired in various embodiments. An implementation of these modules and techniques may be stored on or transmitted across some form of computer readable media.

FIG. 9 is a schematic overview of a health monitoring system 600 of a development system in accordance with an alternate embodiment of the invention. In this embodiment, the test vehicle 610 includes a vehicle controller 414, a power source 416, rotor assemblies 412, and current and temperature sensors 417, 418 as described above (FIG. 5). The test vehicle 610 also includes a processing component 614 and communication component 616. The processing component 614 receives health monitoring data from the various components on board the test vehicle 610, including the sensors 417, 418, an inertial measurement unit (IMU) 618, one or more piezogyros 619, or any other desired components of the test vehicle 610. The processing component 614 may filter, condition, and buffer the health monitoring data before transmitting it to the communication component 616, which then transmits the monitoring data to a command and control component 620. As noted above, the processing and communication components 464, 466 may have somewhat overlapping functions and capabilities, such as using the processing component 464 to perform data collection and relatively low-level onboard processing activities, and the communication component 466 to perform relatively high-level onboard processing activities. In some embodiments, for example, the processing component 614 (or component 616) may use Kalman filtering or other algorithms to monitor for a functional health degradation of an onboard sensor, such as a piezogyro or other stability or navigational device.
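As an illustrative sketch of the sensor-degradation monitoring mentioned above, the following Python snippet compares a rate sensor's output against a Kalman-filtered estimate of the same rate derived from a reference source (here, conceptually, the motion capture data) and flags the sensor when the residual stays large. The noise levels, thresholds, and data source are illustrative assumptions rather than the patent's specific algorithm.

```python
# Hypothetical Kalman-filter residual check for a degraded rate sensor (e.g. a piezogyro).
import random

class ScalarKalman:
    """One-dimensional Kalman filter tracking a single angular rate."""
    def __init__(self, q=0.01, r=0.05):
        self.x, self.p, self.q, self.r = 0.0, 1.0, q, r

    def update(self, z):
        self.p += self.q                      # predict (constant-rate model)
        k = self.p / (self.p + self.r)        # Kalman gain
        self.x += k * (z - self.x)            # correct with the reference measurement
        self.p *= (1.0 - k)
        return self.x

def gyro_degraded(reference_rates, gyro_rates, threshold=0.5, window=20):
    """Flag the gyro if its residual against the filtered reference stays large."""
    kf = ScalarKalman()
    residuals = []
    for ref, gyro in zip(reference_rates, gyro_rates):
        est = kf.update(ref)                  # reference rate, e.g. from motion capture
        residuals.append(abs(gyro - est))
    recent = residuals[-window:]
    return sum(recent) / len(recent) > threshold

# Example: a healthy gyro tracks the reference; a strongly biased one is flagged
ref = [0.2 + random.gauss(0, 0.02) for _ in range(100)]
print(gyro_degraded(ref, [r + random.gauss(0, 0.02) for r in ref]))        # False
print(gyro_degraded(ref, [r + 1.0 + random.gauss(0, 0.02) for r in ref]))  # True
```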

The command and control component 620 may reside in any suitable location within the development system, including, for example, on the main processing computer 402 or on the application computer 450 (FIG. 4). The command and control component 620 may reside on a single component or portion of the development system, or may be distributed across various components. As shown in FIG. 9, in this embodiment, the command and control component 620 includes a stability and control processor 622 that receives the vehicle information transmitted by the onboard communication component 616.

As further shown in FIG. 9, the stability and control processor 622 transmits at least a portion of the received vehicle information to a guidance manager processor 628. Similarly, the position reference system 420 captures motion information of the test vehicle 610 within the control volume 422, processes this information using a motion data processing component 621, and transmits the motion data (e.g. vehicle or marker positions and orientations) to the guidance manager processor 628. The guidance manager processor 628 communicates the motion data and vehicle information to a rack computer processing card 650. In a particular embodiment, the rack computer processing card 650 is part of the main processing computer 402 (FIG. 4).

A surveillance camera 660 may be disposed on the test vehicle 610 and may transmit image data (e.g. composite video images) to a camera receiver 662 of the command and control component 620. The image data are then transmitted to a PCI card 664, which outputs the image data to the rack computer processing card 650.

In the embodiment shown in FIG. 9, the rack computer processing card 650 includes a vehicle task manager processing component 652, a mission planner processing component 654, and a GUI processing component 656. The mission planner processing component 654 is used to assign a high-level mission to the test vehicle(s) 410 based on human user input provided via the GUI processing component 656. Examples of such “missions” include providing reconnaissance, surveillance, or target acquisition (commonly known as RSTA) within the entire or a specific sub-volume of the control volume 422 and/or for another object or test vehicle 410 (or groups of test vehicles). The vehicle task manager processing component 652 is used to assign specific actions to individual or groups of vehicles in order to accomplish the higher-level mission objective. Examples of such tasks include one or a sequence of the following actions or behaviors: move to a specific target position (i.e. waypoint), point directional sensor(s), return to base, emergency land, activate/deactivate surveillance sensor(s), provide telemetry of specified data, manipulate robotic effector(s) (if test vehicle 410 is equipped with such), perform coordinated flight of a group of test vehicles (e.g. flocking). The GUI processing component 656 provides human users access to test vehicle telemetry (e.g. surveillance camera 660 video images), allows the human user to assign missions and monitor the effectiveness of the test vehicle(s) 410 for a defined mission, and enables the human user to remain “in the loop” to issue time- and/or safety-critical commands.
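For illustration, the following Python sketch shows one way a mission planner and vehicle task manager of the kind described above might represent missions and per-vehicle task queues. The task vocabulary mirrors the examples in the text; the classes, lane-splitting scheme, and names are hypothetical.

```python
# Hypothetical mission-to-task decomposition for a simple surveillance sweep.
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class Task:
    action: str                                   # e.g. "move_to", "activate_sensor",
    params: dict = field(default_factory=dict)    #      "return_to_base", "emergency_land"

@dataclass
class VehicleTaskQueue:
    vehicle_id: str
    tasks: List[Task] = field(default_factory=list)

def plan_surveillance_mission(vehicle_ids: List[str],
                              sub_volume: Tuple[float, float, float, float]):
    """Decompose a high-level RSTA-style mission into per-vehicle task queues."""
    x_min, x_max, y_min, y_max = sub_volume
    queues = []
    for i, vid in enumerate(vehicle_ids):
        # Split the sub-volume into lanes, one lane per vehicle.
        lane_x = x_min + (i + 0.5) * (x_max - x_min) / len(vehicle_ids)
        queues.append(VehicleTaskQueue(vid, [
            Task("move_to", {"waypoint": (lane_x, y_min, 1.5)}),
            Task("activate_sensor", {"sensor": "surveillance_camera"}),
            Task("move_to", {"waypoint": (lane_x, y_max, 1.5)}),
            Task("return_to_base"),
        ]))
    return queues

for q in plan_surveillance_mission(["UAV-1", "UAV-2"], (0.0, 6.0, 0.0, 6.0)):
    print(q.vehicle_id, [t.action for t in q.tasks])
```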

In operation, the rack computer processing card 650, the stability and control processor 622, and the guidance manager processor 628 analyze the vehicle information received from the test vehicle 610, as well as the motion data received from the position reference system 420 and the image data received from the surveillance camera 660, to determine the stability and control signals needed to control the test vehicle 610. These stability and control signals are transmitted by the stability and control processor 622 through a PC-to-RC (personal computer to remote controller) converter 624 to a remote control unit 626. The remote control unit 626 transmits corresponding control signals to the onboard vehicle controller 414 which, in turn, communicates appropriate command signals to the various components of the test vehicle 610 (e.g. rotor assemblies 412) to maintain the desired position, velocity, direction, attitude, and stabilization of the test vehicle 610.
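
A minimal sketch of such a closed-loop law is shown below, assuming a simple proportional-derivative scheme rather than the patent's actual control algorithm; the gains and the command format are placeholders.

```python
# Minimal closed-loop sketch, under assumptions not stated in the patent: a
# proportional-derivative law that turns the position error reported by the
# position reference system into commands at each update. Gains are placeholders.
from dataclasses import dataclass
from typing import Tuple

Vec3 = Tuple[float, float, float]


@dataclass
class PDPositionController:
    kp: float = 1.2
    kd: float = 0.8

    def __post_init__(self):
        self._last_error: Vec3 = (0.0, 0.0, 0.0)

    def command(self, desired: Vec3, measured: Vec3, dt: float) -> Vec3:
        error = tuple(d - m for d, m in zip(desired, measured))
        deriv = tuple((e - le) / dt for e, le in zip(error, self._last_error))
        self._last_error = error
        # The resulting vector would be mapped to pitch/roll/thrust commands and
        # sent through the PC-to-RC converter to the remote control unit.
        return tuple(self.kp * e + self.kd * de for e, de in zip(error, deriv))
```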

With continued reference to FIG. 9, the rack computer processing card 650 is operatively coupled to an analysis and display component 670 (e.g. the application computer 450 of FIG. 4, or the human interface 150 of FIG. 1) that includes a data capture and analysis component 672. The data capture and analysis component 672 receives the vehicle information, motion capture data, and image data from the rack computer processing card 650 for real-time analysis and display, or for subsequent post-processing.

The components 670, 672 may be implemented as a real-time hardware-in-the-loop rapid prototyping tool using, for example, the dSPACE real-time hardware-in-the-loop rapid prototyping system available from dSPACE, Inc. of Novi, Mich., the North American subsidiary of dSPACE GmbH of Germany. In this arrangement, vehicle and system information, including the various sensor feedback signals from the vehicle and system as well as the command and control signals, is collected via an interface connector panel. The connector panel routes the signals to a real-time processing board, where they are sampled and digitized for digital signal processing. The board may further interface with the software/PC application in real time. A custom-developed rapid-prototyping tool may provide combined real-time data capture, presentation, in-flight and post-flight data analysis, and rapid vehicle subsystem/system characterization and tuning, enabling rapid prototyping and characterization of subsystem and system components and related algorithms. As an example, the motor/rotor dynamics were rapidly characterized and analyzed using this component; a summary of the rapid characterization results is captured in FIG. 12, discussed below.
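
The capture side of such a workflow can be sketched as follows, assuming a generic sampling interface; the SignalRecorder class and channel callables are illustrative and are not part of the dSPACE product or the patent.

```python
# Sketch of the capture side of a rapid-prototyping workflow: sampled signals
# are buffered with timestamps so they can be displayed live or post-processed.
# The channel names and sample sources are hypothetical.
import time
from collections import defaultdict
from typing import Callable, Dict, List, Tuple


class SignalRecorder:
    def __init__(self):
        self.traces: Dict[str, List[Tuple[float, float]]] = defaultdict(list)
        self._t0 = time.monotonic()

    def sample(self, channels: Dict[str, Callable[[], float]]) -> None:
        # Read every channel once and store (time, value) pairs per trace.
        t = time.monotonic() - self._t0
        for name, read in channels.items():
            self.traces[name].append((t, read()))

    def trace(self, name: str) -> List[Tuple[float, float]]:
        return self.traces[name]
```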

FIGS. 10 through 12 are representative screenshots provided by the analysis and display component 670 of FIG. 9. More specifically, FIG. 10 shows a screen display 700 depicting the position and attitude of the test vehicle 610. A portion 702 of the screen display 700 provides digital values of the positions, attitudes, and headings of the test vehicle 610. A first trace 704 is plotted to show a desired mission profile of the test vehicle 610, and a second trace 706 is plotted to show the actual mission profile followed by the test vehicle 610. Thus, the analysis and display component 670 can provide a real-time visual representation of the test vehicle 610 as well as a qualitative assessment of the performance of a component under test (e.g. a control algorithm).
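
One way to reduce the two traces of FIG. 10 to a single quantitative figure of merit, for example when comparing control algorithms under test, is a root-mean-square tracking error over matched samples; the helper below is a hypothetical sketch, not a feature of the display itself.

```python
# Illustrative only: turning the desired-versus-actual mission profiles into a
# quantitative tracking metric. The function name is an assumption.
import math
from typing import List, Tuple

Point = Tuple[float, float, float]


def rms_tracking_error(desired: List[Point], actual: List[Point]) -> float:
    """Root-mean-square 3-D position error between matched samples."""
    if len(desired) != len(actual) or not desired:
        raise ValueError("traces must be the same, nonzero length")
    sq = [sum((d - a) ** 2 for d, a in zip(dp, ap)) for dp, ap in zip(desired, actual)]
    return math.sqrt(sum(sq) / len(sq))
```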

FIG. 11 shows a screen display 710 depicting graphs of health monitoring information for the test vehicle 610 during flight testing. A first graph 712 shows a plot of power source voltage versus flight time. Second graphs 714 show plots of the temperatures at the rotor assemblies 412 as measured by the temperature sensors 418, and third graphs 716 show plots of the currents at the rotor assemblies 412 as measured by the current sensors 417. In this way, the analysis and display component 670 may provide real-time information regarding important health characteristics of the test vehicle 610 during flight testing, which may advantageously improve test quality and reduce failures and associated downtime.
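
A hypothetical threshold monitor over the quantities plotted in FIG. 11 might look like the sketch below; the voltage, temperature, and current limits are placeholders that would depend on the actual power source and motors.

```python
# Hypothetical threshold monitor over pack voltage, rotor temperatures, and
# motor currents. Limits are placeholders, not values from the patent.
from typing import List, Sequence


def health_alerts(voltage: float,
                  temps_c: Sequence[float],
                  currents_a: Sequence[float],
                  v_min: float = 10.5,
                  t_max_c: float = 70.0,
                  i_max_a: float = 8.0) -> List[str]:
    alerts: List[str] = []
    if voltage < v_min:
        alerts.append(f"low battery voltage: {voltage:.2f} V")
    for i, t in enumerate(temps_c):
        if t > t_max_c:
            alerts.append(f"rotor {i} over-temperature: {t:.1f} C")
    for i, c in enumerate(currents_a):
        if c > i_max_a:
            alerts.append(f"rotor {i} over-current: {c:.1f} A")
    return alerts
```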

FIG. 12 shows numerous graphs of various test parameters that may be monitored, controlled, and displayed using the analysis and display component 670 during vehicle development tests using embodiments of the present invention. More specifically, first and second graphs 720, 722 show plots of scaled motor speed and current time responses to a step change in the motor pulse-width modulation command duty cycle for a rotor assembly having a foam blade, enabling the motor step-response characteristics to be evaluated. A third graph 724 shows similar data to the second graph 722, but for a rotor assembly having a nylon blade. A fourth graph 726 shows a plot of motor speed versus input current for both the foam blade and the nylon blade, enabling an evaluation of the impact of differing blade materials on motor speed performance. Similarly, fifth and eighth graphs 728, 734 show plots of thrust versus motor speed for both the foam blade and the nylon blade, enabling an evaluation of the impact of differing blade materials on thrust performance. A sixth graph 730, when analyzed with a ninth graph 736, shows a plot of scaled, digitally-filtered motor speed and current responses to a duty-cycle step input along with a corresponding plot of temperature versus time, enabling an assessment of the motor thermodynamic characteristics and the temperature behavior of the rotor assembly 412 during a particular test flight or mission profile. A seventh graph 732 shows a plot of scaled motor speed and current responses of a motor using nylon blades with an alternate motor driver circuit, allowing motor driver circuit design changes to be evaluated. Of course, a wide variety of other graphs of test parameters of interest during vehicle development tests may be monitored, controlled, and displayed using the analysis and display component 670.
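
The kind of reduction behind such step-response graphs can be sketched generically: given a logged motor-speed response to a duty-cycle step, a first-order steady-state value and time constant can be estimated with the 63.2%-rise method. This is a standard identification recipe offered for illustration, not the patent's specific analysis.

```python
# Generic first-order step-response reduction: estimate the steady-state value
# and time constant of a logged motor-speed response to a duty-cycle step.
from typing import List, Tuple


def first_order_fit(t: List[float], speed: List[float],
                    step_time: float) -> Tuple[float, float]:
    """Return (steady_state_value, time_constant) for a step applied at step_time."""
    y0 = speed[max(i for i, ti in enumerate(t) if ti <= step_time)]
    tail = speed[-10:]
    y_inf = sum(tail) / len(tail)                   # steady-state estimate from the tail
    target = y0 + 0.632 * (y_inf - y0)              # 63.2% of the total change
    rising = y_inf >= y0
    t_63 = next(ti for ti, yi in zip(t, speed)
                if ti > step_time and (yi >= target if rising else yi <= target))
    return y_inf, t_63 - step_time
```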

FIG. 13 is an isometric view of a test vehicle 810 in accordance with another embodiment of the invention. In this embodiment, the test vehicle 810 includes a health monitoring control board 860 miniaturized on a processing component (or printed circuit board, or PCB) 870. FIG. 14 is an enlarged elevational view of the health monitoring control board 860 and the processing component 870 of the test vehicle 810 of FIG. 13. The processing component 870 includes various sensors (e.g. current sensors, temperature sensors, etc.), the communication component (e.g. the above-referenced Connex component by gumstix, inc.), and associated circuitry for driving the rotor assemblies 412, and provides the above-described capabilities in a highly-integrated module that is both lightweight and highly serviceable. Thus, the processing component 870 and the health monitoring control board 860 may be positioned on each of the vehicles 810 of a development system, may receive over a communication network the one or more stability and control characteristics determined by the position reference system, may compute the corresponding command signals in a distributed (rather than centralized) fashion, and may communicate the command signals to the onboard control system of each of the vehicles in a closed-loop feedback manner.
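
A hypothetical sketch of this distributed arrangement is shown below: each vehicle's onboard processor listens for its own state updates over the network and closes the loop locally rather than relying on a central computer. The UDP transport, message format, and callback names are assumptions, not details from the patent.

```python
# Hypothetical distributed arrangement: each vehicle receives its own
# motion-capture state over the network and computes commands onboard.
import json
import socket


def run_onboard_loop(vehicle_id: int, port: int, controller, send_to_motors) -> None:
    """Listen for UDP state updates addressed to this vehicle and command motors."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("0.0.0.0", port))
    while True:
        payload, _ = sock.recvfrom(1024)
        state = json.loads(payload)                # e.g. {"id": 3, "pos": [...], "ref": [...]}
        if state.get("id") != vehicle_id:
            continue                               # ignore updates for other vehicles
        cmd = controller(tuple(state["ref"]), tuple(state["pos"]))
        send_to_motors(cmd)                        # hand off to the onboard vehicle controller
```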

From the foregoing description, it may be appreciated that embodiments of systems and methods in accordance with the present invention may advantageously enable rapid development and testing of a wide variety of vehicles and vehicle components (hardware and software) in a controlled environment. Vehicle components, such as new software systems, avionics systems, control algorithms, computer hardware components, sensors, flight vehicle components and configurations, and other suitable parameters of interest may be quickly and repeatedly tested using development systems and methods in conjunction with one or multiple test vehicles. Embodiments of the present invention may be scaled to fit within a suitable laboratory environment, enabling testing and development of new vehicles and vehicle components to be performed rapidly, efficiently, and cost effectively.

While preferred and alternate embodiments of the invention have been illustrated and described, as noted above, many changes can be made without departing from the spirit and scope of the invention. Accordingly, the scope of the invention is not limited by the disclosure of these preferred and alternate embodiments. Instead, the invention should be determined entirely by reference to the claims that follow.

Inventors: Troy, James J.; Murray, Paul; Mansouri, Ali R.; Vian, John L.; Erignac, Charles A.; Clark, Gregory J.; Bieniawski, Stefan R.; Abdel-Motagaly, Khaled; Saad, Emad W.; How, Jonathan P.; Pigg, Paul E. R.; Provine, Ronald C.; Valenti, Mario J.; Bethke, Brett M.
