The present invention provides a system and method of utilizing superimposed 3D imagery for remotely operated vehicles (ROVs), namely 3D, reconstructed images of the ROV's environment. In another aspect, it includes generating a virtual video of 3D elements in the operation environment, synchronizing the angle and position of the camera of the virtual video with the angle and position of a real camera, and superimposing the virtual video and the real video from the real camera such that one feed is manipulated to show transparencies in areas of lesser interest, allowing the other video to show through. It furthermore may include superimposing information, whether graphic, textual or both, onto the hybrid virtual-real 3D imagery. The subject invention is also networked, such that the immersive visual interface described above is accessible to a plurality of users operating from a plurality of locations.
20. A computer-readable medium including code for exploration, the code when executed operable to:
generate a virtual video feed of 3D elements representing objects disposed in an operation environment;
obtain a real video feed; and
superimpose said virtual video feed and said real video feed by dividing the virtual and real video feeds into a plurality of layers to permit the flattening and superimposition of the plurality of layers to produce hybrid 3D imagery that simulates spatial perception.
10. A method for exploration, comprising:
generating a virtual video feed of 3D elements representing objects disposed in an operation environment;
obtaining a real video feed; and
superimposing said virtual video feed and said real video feed, said superimposed virtual and real video feeds comprising hybrid 3D imagery, wherein superimposing comprises dividing the virtual and real video feeds into a plurality of layers to permit the flattening and superimposition of the plurality of layers to produce hybrid 3D imagery that simulates spatial perception.
1. A system for exploration comprising:
a visualization engine, wherein said visualization engine further comprises:
a database module of 3D elements representing objects disposed in an operation environment;
a virtual video generating module for generating a virtual video feed incorporating said 3D elements; and
a superimposition module for superimposing said virtual video feed and a real video feed, said superimposed virtual and real video feeds comprising hybrid 3D imagery, wherein the superimposition module is configured to divide the virtual and real video feeds into a plurality of layers to permit the flattening and superimposition of the plurality of layers to produce hybrid 3D imagery that simulates spatial perception; and
a navigation interface configured to display said hybrid 3D imagery.
2. The system according to
4. The system according to
5. The system according to
6. The system according to
7. The system according to
8. The system according to
9. The system according to
11. The method according to
12. The method according to
13. The method according to
14. The method according to
calculating an angle between a heading of a remote operated vehicle and a direction of a field of view of the video camera;
calculating an angle between a vertical orientation of the remote operated vehicle and the direction of the video camera field of view; and
calculating an angle between the vehicle and a geographic horizon.
15. The method according to
16. The method according to
17. The method according to
18. The method according to
19. The method according to
21. The computer readable medium according to
calculate an angle between a heading of a remote operated vehicle and a direction of a field of view;
calculate an angle between a vertical orientation of the vehicle and the direction of the field of view; and
calculate an angle between the vehicle and a geographic horizon.
22. The computer readable medium according to
23. The computer readable medium according to
This application is a continuation of U.S. application Ser. No. 14/935,979 filed Nov. 9, 2015, which is a continuation of U.S. application Ser. No. 14/357,100 filed May 8, 2014, now U.S. Pat. No. 9,195,231, which is a U.S. National Stage Application of International Application No. PCT/IB2012/002281 filed Nov. 8, 2012, which designates the United States and claims the benefit of U.S. Provisional Patent Application Ser. No. 61/681,411 filed on Aug. 9, 2012, and claims the benefit of Portugal Patent Application PPP 105989 filed Nov. 9, 2011, the entire disclosures of which are hereby incorporated by reference.
The disclosures of published patent documents referenced in this application are hereby incorporated in their entireties by reference into this application in order to more fully describe the state of the art to which this invention pertains.
The present invention relates to a three-dimensional (“3D”) navigation and control system for remotely operated vehicles (“ROV”), and methods for its use. In particular, the present invention provides for a navigation and control system that is standardized in order to be compatible with a wide range of ROV options.
Exploration of the sea, the last frontier on Earth, is largely driven by the continuing demand for energy resources. Because humans are not able to endure the pressures induced at the depths at which energy reconnaissance occurs, we have become increasingly reliant upon ROV technology. The future of the exploration of the oceans is only as fast, reliable and safe as the available technology.
Prior art related to augmented reality navigation, such as U.S. Pre-grant Publication 2011/0153189, discloses systems for superimposing 3D objects and a video feed, but does not provide crucial devices for dealing with the vicissitudes of undersea navigation. Meanwhile, powerful graphics systems used for undersea navigation, such as those described in U.S. Pre-grant Publication 2009/0040070 A1 and U.S. Pat. No. 8,015,507, fail to provide a navigation interface that creates an immersive visual experience for the pilot or user.
An important shortcoming of the available ROV navigation technology is its inability to provide complete spatial awareness, i.e., the ability to consistently know the past and current flight path. Current navigation systems rely on conventional telemetry information, including depth, pitch, roll, camera tilt and heading. However, positioning systems, which provide the geographic location of the ROV, have not been seamlessly integrated with the depth and orientation instruments that constitute conventional telemetry systems.
Another important aspect of undersea exploration is the acquisition and application of information relating to the seabed and subsea structures. Modern multibeam sonar devices with modeling software provide detailed, 3D bathymetric data, which is essential to planning and evaluating exploratory missions. This bathymetric data is used extensively by supervisors and client representatives in the energy resources industry. However, conventional systems do not integrate the use of bathymetric modeling with real-time navigation in a way that facilitates the work of the ROV pilots themselves.
Much like jet fighter pilots, ROV pilots must navigate in three dimensions, in real time, and in conditions where visibility may be limited to between 2 and 10 meters. Accordingly, both types of pilots must have fast, reliable and intelligent data presented to them under low visibility conditions. However, the dynamic user interfaces used for complex aviation missions, which overlay quantitative environmental information, textual plans and other important graphics, have not been made available for comparably complex undersea missions.
A key to establishing a successful operating system for ROV missions is providing for effective collaboration and communication between every person involved in the project. The ability to introduce new data and share that data among the system users, and particularly with the pilot, advantageously leads to increased efficiency and safety.
Accordingly, there is a need for an augmented approach to pilot-ROV interactions in which information can be visualized and logged in the three spatial dimensions and in real time.
This disclosure provides tools and features that implement systems and methods relating to the operation of ROVs with superimposed 3D imagery and navigational information. Although embodiments and examples are provided in the context of undersea missions, one skilled in the art should appreciate that the aspects, features, functionalities, etc., discussed in this disclosure can also be extended to virtually any type of complex navigation project.
In an aspect of this disclosure, an operation and navigation system is provided to allow seamless integration between a wide variety of ROV control systems. That is, the invention enables engineers and supervisors to plan one or several missions on one common, standard system that may be used for the operation and navigation of a wide range of ROVs by pilots and operators.
In another aspect, the invention provides a navigational system that visually presents any relevant geographical information relating to the planned flight path, waypoints, checkpoints, work sites and procedures provided by an operating system. It provides for collection and transfer of data, such that a user may log procedures during operation, update the system when the tasks are completed, and produce video- and/or text-based reports of the mission with the collected data. The system thus provides fully updated status information regarding the progress of the mission and the position of the ROV.
In a preferred embodiment, the invention includes computing hardware, one or more display screens, sonar technology, and software for a graphic user interface, all of which may interact with one or more ROV. The present invention also provides databases and software for execution by the computing hardware, including numerous modules for: obtaining, saving and modeling 3D elements in the operation environment of the ROV; synchronizing a virtual camera for viewing the modeled image with the real camera of an ROV; providing hybrid 3D imagery by superimposing real camera images and modeled images such that areas of heightened interest are rendered more visible or opaque than areas of lesser interest; and superimposing graphical or textual information on the hybrid 3D imagery. Additional software modules, which cooperate with the user interface, are provided for planning, supervising, logging, sharing and reporting all aspects of ROV-mediated exploration.
By implementing the various tools and features outlined above and discussed in detail below, the present system can greatly improve the way subsea operations are conducted. For example, the invention will provide ROV operators with immersive operating systems for mission planning, supervision and reporting. Improved visualization of the subsea environment and improved communication with supervisors and other pilots will lead to better overall efficiency, and less stress during long piloting shifts. The efficiencies of data transfer and task completion created by the subject operating system will lead to still further benefits, such as operational cost reduction and consequent revenue increase. Additionally, more efficient use of ROVs due to improved mission logistics will lead to an increase in the utilization period of those ROVs. The benefits to entities initiating and funding undersea research include improved access to detailed reports, and even to data acquired in real-time, without having to visit the research site. These are just some of the benefits that the system brings.
The aforementioned and other aspects, features and advantages can be better understood from the following detailed description with reference to the accompanying drawings wherein:
The invention provides a system for operating a remotely operated vehicle comprising:
In one embodiment of the subject invention, the superimposition module is configured to superimpose graphic information, textual information, or both, onto the hybrid 3D imagery. The superimposing may be based on a luminance threshold, wherein luminance in the Red-Green-Blue hexadecimal format may be set to values between 0-0-0 and 255-255-255, and preferably between 0-0-0 and 40-40-40.
The invention also provides a system for undersea exploration comprising:
In another embodiment of the invention, the operating system is capable of sharing data with a plurality of remote monitors, wherein said plurality of monitors may include up to 12, and preferably between 3 and 9 monitors.
In yet another embodiment of the invention, the system further comprises an external system configured to determine whether the system is working or is in a fail state. The external system may be configured to switch the monitor input from hybrid 3D imagery to a live video feed if the system is in a fail state.
In yet another embodiment of the invention, the navigation interface comprises at least three networked monitors, wherein said monitors are arranged adjacent to one another such that the middle monitor displays video and augmented reality, while both side monitors display an expanded view of a field of operation.
In yet another embodiment of the invention, the navigation interface comprises one or more touch screens, one or more speakers for providing audio warnings and sounds, one or more microphones for receiving voice commands, one or more joysticks, one or more gamepads, and/or one or more computer mice.
In yet another embodiment of the invention, the functions of the operating system are abstracted such that the operating system is compatible with a plurality of hardware options.
The invention also provides a method of operating a remotely operated vehicle comprising
In a further embodiment, a method according to the invention further comprises the step of superimposing graphic information, textual information, or both, onto the hybrid 3D imagery.
In a still further embodiment of the invention, the step of modulating the transparency or opaqueness further comprises establishing a luminance threshold. In an exemplary embodiment, establishing the luminance threshold comprises maintaining a background of the virtual video at a higher transparency than the background of the real video.
In yet another embodiment of the invention, synchronizing the angle and position of the cameras comprises the steps of:
In a further embodiment, a method according to the invention also comprises the step of displaying said hybrid 3D imagery on a navigation interface. In an exemplary embodiment, the step of displaying further comprises modulating the amount of 2D or 3D information superimposed onto the video screen based on the position of the ROV relative to undersea structures. In another exemplary embodiment, the step of displaying further comprises providing a minimap, said minimap defining a computer-generated graphic showing cardinal points, a position of an object in 3D, or both.
In yet another embodiment, a method according to the invention comprises analyzing the displayed information to determine the status of procedures.
In yet another embodiment, a method according to the invention comprises updating the displayed information as tasks are completed.
In yet another embodiment, a method according to the invention comprises determining the positioning of said 3D elements using a global positioning system.
In yet another embodiment, a method according to the invention comprises the step of planning an undersea exploration mission, wherein said planning comprises entering user-determined information for display on a navigation interface, said user-determined information comprising any one or combination of bathymetry information, ROV waypoint information, ROV checkpoint information, procedure information, timed procedure information, flight path information, 3D modeled elements, GPS-determined position information and pilot information. In an exemplary embodiment, the method also comprises the step of configuring the operating system such that entering, leaving, or remaining longer than a designated time at a GPS-determined position triggers any one or combination of an alarm, notification, procedure change or task change.
In yet another embodiment, a method according to the invention comprises the step of logging an undersea exploration mission, wherein said logging comprises recording any one or combination of telemetry data, sonar data, 3D models, bathymetry data, flight path information, ROV waypoint information, ROV checkpoint information, procedure information, positioning information and inertial data. In an exemplary embodiment, the method also comprises reporting information that has been logged. In an exemplary embodiment, the method also comprises saving information that has been logged to a networked database. In an exemplary embodiment, the method also comprises producing a report based on the saved information, wherein said report provides a 3D recreation of the operation.
The invention also provides a computer program product, stored on a computer-readable medium, for implementing any method according to the invention as described herein.
As mentioned supra, various features and functionalities are discussed herein by way of examples and embodiments in a context of ROV navigation for use in undersea exploration. In describing such examples and exemplary embodiments, specific terminology is employed for the sake of clarity. However, this disclosure is not intended to be limited to the examples and exemplary embodiments discussed herein, nor to the specific terminology utilized in such discussions, and it is to be understood that each specific element includes all technical equivalents that operate in a similar manner.
Definitions
The following terms are defined as follows:
Hardware and Devices
Referring now to the drawings, wherein like reference numerals designate identical or corresponding parts throughout the several views,
As seen from
In one embodiment of the invention, the hardware for the operating system 3 includes a high-end rack computer that can be easily integrated with any ROV control system. The several software modules that further define the operating system will be described in further detail infra.
With reference to
Functional Modules
Rather than developing a different operating system 3 for each brand and model of ROV 1, the present invention works by abstraction, such that the disclosed operating system 3 and associated hardware work the same way with all ROVs 1. For example, if one component delivers "SDBS, 14.0, 10.3" as depth and heading coordinates, and another component delivers "SHD,15.3, 16.4" as heading and depth coordinates, these data strings are parsed into their respective variables: Depth1=14.0, Heading1=10.3, Depth2=16.4, Heading2=15.3. This parsing allows both systems to work the same way, regardless of the data format details.
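The parsing abstraction described above can be sketched as a small dispatch table keyed on the sentence identifier. The identifiers and field orders below are illustrative assumptions taken from the example strings, not the actual driver formats of any particular ROV:

```python
# Hypothetical field order per telemetry sentence identifier.
FIELD_ORDER = {
    "SDBS": ("depth", "heading"),  # e.g. "SDBS, 14.0, 10.3"
    "SHD": ("heading", "depth"),   # e.g. "SHD,15.3, 16.4"
}

def parse_telemetry(sentence: str) -> dict:
    """Parse one comma-separated telemetry string into named variables,
    so downstream code sees the same fields regardless of format."""
    parts = [p.strip() for p in sentence.split(",")]
    tag, values = parts[0], parts[1:]
    fields = FIELD_ORDER[tag]
    return {name: float(value) for name, value in zip(fields, values)}

a = parse_telemetry("SDBS, 14.0, 10.3")  # {"depth": 14.0, "heading": 10.3}
b = parse_telemetry("SHD,15.3, 16.4")    # {"heading": 15.3, "depth": 16.4}
```

In this sketch, adding support for a new ROV model amounts to registering one more entry in the dispatch table, which is the practical benefit of the driver-abstraction layer described below.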
By developing a layer of abstraction of drivers for communication between the operating system 3 and the ROV hardware, the user 4 is provided with seamless data communication, and is not restricted to using particular ROV models. This abstraction further allows users 4 and systems 3 to communicate and network information between several systems, and share information among several undersea projects. The use of a single system also allows for cost reduction in training, maintenance and operation of this system.
Visualization Engine
As seen from
A 3D database module 10 includes advanced 3D rendering technology to allow all the stages of ROV operation to be executed with reference to a visually re-created 3D deep-water environment. This environment is composed of the seabed bathymetry and modeled equipment, e.g., structures of ocean energy devices.
As discussed above, the main sources of image data are pre-recorded 3D modeling of sonar data (i.e., computer-generated 3D video) and possibly other video data; live sonar data and video data obtained in real time; user-determined 3D elements; and textual or graphical communications intended to be displayed on the user interface screen. The geographical position and depth of any elements or regions included in the image data are known by GPS positioning, by use of acoustic and/or inertial positioning systems, and/or by reference to maps.
In a preferred embodiment of the invention, a virtual video generation module 11 is provided for using the aforementioned stored 3D elements or real-time detected 3D elements to create a virtual video of such 3D elements. The virtual video generation module 11 may work in concert with a synchronization module 12.
The synchronization module 12 aligns the angle and position of the virtual camera of the virtual video with the angle and position of a real camera on an ROV. According to one embodiment, the virtual camera defines a field of view for the virtual video, which may preferably extend between 45 and 144 degrees from a central point of view. As illustrated in
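One minimal way to picture this synchronization is to compose the ROV's attitude with the camera's pan and tilt offsets to obtain the virtual camera's orientation. The pose fields and the simple additive composition below are assumptions for illustration only, not the module's actual implementation:

```python
from dataclasses import dataclass

@dataclass
class Pose:
    yaw: float    # degrees from north (heading)
    pitch: float  # degrees above/below the horizon
    roll: float   # degrees of bank

def virtual_camera_pose(rov: Pose, cam_pan: float, cam_tilt: float) -> Pose:
    """Compose the ROV attitude with the camera's pan/tilt offsets so the
    virtual camera points the same way as the real camera."""
    return Pose(
        yaw=(rov.yaw + cam_pan) % 360.0,  # keep heading in [0, 360)
        pitch=rov.pitch + cam_tilt,
        roll=rov.roll,
    )

# ROV heading 90 degrees, camera panned 15 degrees right, tilted 10 degrees down:
pose = virtual_camera_pose(Pose(90.0, 0.0, 2.0), cam_pan=15.0, cam_tilt=-10.0)
# pose.yaw == 105.0, pose.pitch == -10.0
```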
A superimposition module 13, whose function is additionally diagrammed in
Yet another feature of the superimposition module 13 is that either one or both of the virtual 20 or real videos 21 may be manipulated, based upon a luminance threshold, to be more transparent in areas of lesser interest, thus allowing the corresponding area of the other video feed to show through. According to one embodiment of the invention, luminance in the Red-Green-Blue hexadecimal format may be between 0-0-0 and 255-255-255, and preferably between 0-0-0 and 40-40-40. Areas of lesser interest may be selected by a system default, or by the user. The color intensity of images in areas of lesser interest is set at the luminance threshold, and the corresponding region of the other video is set at normal luminance. For the example shown in
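A minimal per-pixel sketch of this luminance-threshold compositing follows, assuming the preferred 40-40-40 upper bound from above and treating any pixel at or below the threshold on every channel as an area of lesser interest; a real implementation would of course operate on whole frames, not Python tuples:

```python
LUMINANCE_THRESHOLD = (40, 40, 40)  # preferred upper bound stated above

def is_transparent(pixel: tuple, threshold: tuple = LUMINANCE_THRESHOLD) -> bool:
    """A pixel at or below the threshold on every RGB channel is treated
    as an area of lesser interest and rendered transparent."""
    return all(channel <= limit for channel, limit in zip(pixel, threshold))

def composite(virtual_px: tuple, real_px: tuple) -> tuple:
    """Flatten two layers at one pixel: wherever the virtual layer is
    transparent, the corresponding real-video pixel shows through."""
    return real_px if is_transparent(virtual_px) else virtual_px

# A black virtual background lets the real feed show through:
composite((0, 0, 0), (120, 130, 140))      # -> (120, 130, 140)
# A bright virtual element occludes the real feed:
composite((200, 200, 200), (120, 130, 140))  # -> (200, 200, 200)
```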
Navigation Engine
The on-screen, 2D Navigation Interface for the ROV pilot involves superimposing geopositioned data or technical information on a 2D rendering system. Geopositioning or geotagging of data and elements is executed by reference to maps or to global positioning satellites. The resulting Navigation Interface, as seen in
The planning module enables engineers and/or supervisors to plan one or several ROV missions. Referring again to
In another aspect of the invention, procedures 35, including timed procedures (fixed position observation tasks, for example), may be included on the Navigation Interface as text. Given this procedural information, an ROV pilot is enabled to anticipate and complete tasks more accurately. A user may also use the system to define actionable areas. Actionable areas are geopositioned areas in the undersea environment that trigger a system action when entering, leaving, or staying longer than a designated time. The triggered action could be an alarm, notification, procedure change, task change, etc. Referring to
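The actionable-area behavior described above (triggering on entering, leaving, or overstaying) might be sketched as a small state machine. The circular geometry, callback interface, and timer re-arming below are illustrative assumptions, not the system's actual trigger logic:

```python
class ActionableArea:
    """Geopositioned circular area that fires a callback on enter, leave,
    or when the ROV stays inside longer than a designated time."""

    def __init__(self, x: float, y: float, radius: float,
                 max_stay_s: float, on_event) -> None:
        self.x, self.y, self.radius = x, y, radius
        self.max_stay_s = max_stay_s
        self.on_event = on_event  # e.g. raise alarm, notify, change task
        self._entered_at = None   # timestamp of the last entry, if inside

    def update(self, rov_x: float, rov_y: float, now: float) -> None:
        inside = (rov_x - self.x) ** 2 + (rov_y - self.y) ** 2 <= self.radius ** 2
        if inside and self._entered_at is None:
            self._entered_at = now
            self.on_event("enter")
        elif not inside and self._entered_at is not None:
            self._entered_at = None
            self.on_event("leave")
        elif inside and now - self._entered_at > self.max_stay_s:
            self.on_event("overstay")
            self._entered_at = now  # re-arm the overstay timer

events = []
area = ActionableArea(0.0, 0.0, 10.0, max_stay_s=5.0, on_event=events.append)
area.update(20.0, 0.0, now=0.0)  # outside: no event
area.update(3.0, 4.0, now=1.0)   # -> "enter"
area.update(3.0, 4.0, now=7.0)   # 6 s inside -> "overstay"
area.update(30.0, 0.0, now=8.0)  # -> "leave"
```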
With reference to
Data Engine
The data engine, which mediates the data warehousing and data transfer functions of the invention, therefore incorporates the logging and supervising modules.
The logging module logs or records all information made available by the operating system and saves such data in a central database for future access. The available information may include any or all telemetry, sonar data, 3D models, bathymetry, waypoints, checkpoints, alarms or malfunctions, procedures, operations, and navigation records such as flight path information, positioning and inertial data.
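As a simple illustration, each logged item might be serialized as a timestamped record before being saved to the central database; the JSON encoding and field names here are assumptions for the sketch, not the system's actual storage schema:

```python
import datetime
import json

def log_record(source: str, payload: dict) -> str:
    """Serialize one timestamped record (telemetry, sonar, waypoint, ...)
    for storage in the central mission database."""
    record = {
        "timestamp": datetime.datetime.now(datetime.timezone.utc).isoformat(),
        "source": source,    # e.g. "telemetry", "sonar", "checkpoint"
        "payload": payload,  # the raw data being logged
    }
    return json.dumps(record)

entry = log_record("telemetry", {"depth": 14.0, "heading": 10.3})
```

Keeping every record timestamped in this way is what makes the four-dimensional (space plus time) replay described in the debriefing and reporting stage possible.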
An essential part of any offshore operation is providing critical data to the client after the operation is concluded. After the operation, during the debriefing and reporting stage, the debriefing and reporting module may provide a full 3D scenario or reproduction of the operation. The debriefing and reporting module may provide a report on the planned flight path versus the actual flight path, waypoints, checkpoints, any deviations from the plan, alarms given by the ROV, including details of alarm type, time and location, procedures, etc., ready to be delivered to the client. Accordingly, the operating system is configured to provide four-dimensional (three spatial dimensions plus time) interactive reports for every operation. This enables fast analysis and a comprehensive understanding of operations.
Yet another software element that interacts with the Navigation Interface is the supervisor module. Execution of the supervisor module enables one or more supervisors to view and/or utilize the Navigation Interface, and by extension, any ROV 1 being controlled from the interface. These supervisors need not share the location of the ROV pilot or pilots, but rather may employ the connectivity elements depicted in
Thus, there has been shown and described a system and method relating to navigation and control of ROVs. The method and system are not limited to any particular hardware or software configuration. The many variations, modifications and alternative applications of the invention that would be apparent to those skilled in the art, and that do not depart from the scope of the invention, are deemed to be covered by the invention.
Parente Da Silva, Manuel Alberto