Systems and methods are provided for displaying information on a display device associated with an aircraft. A method comprises rendering a synthetic perspective view of terrain on the display device, wherein the synthetic perspective view of terrain is based on a set of terrain data corresponding to a region proximate the aircraft. The method further comprises obtaining location data for a first object, wherein the location data is based at least in part on a beacon signal associated with the first object, and rendering a graphical representation of the first object on the display device. The graphical representation of the first object overlies the synthetic perspective view of terrain and is positioned in accordance with the location data.
14. A method for displaying information on a display device associated with an aircraft, the method comprising:
rendering a synthetic perspective view of terrain on the display device, the synthetic perspective view of terrain being based on a set of terrain data corresponding to a region proximate the aircraft;
obtaining location data for an object, the location data being based at least in part on a beacon signal emitted by a distress radio beacon associated with the object;
determining an object type for the object based on a beacon identifier obtained from the beacon signal; and
rendering a graphical representation of the object on the display device using a symbology with a shape representative of the object type, wherein the symbology overlies the synthetic perspective view of terrain and is positioned in accordance with the location data.
21. A method for displaying information on a display device associated with an aircraft, the method comprising:
rendering a primary flight display on the display device, wherein the primary flight display comprises a synthetic perspective view of terrain for a region proximate a current location of the aircraft, and wherein the synthetic perspective view of terrain corresponds to a flight deck viewpoint;
receiving a locating signal for a target object, the locating signal being based at least in part on a beacon signal emitted by a ground based beacon associated with the target object, the locating signal comprising identification information for the target object;
determining an object type for the target object based on the identification information; and
in response to the locating signal, rendering, in a portion of the primary flight display, a first symbology with a shape representative of the object type for the target object, wherein the first symbology is rendered in a manner that is influenced by the locating signal.
1. A method for displaying information on a display device associated with an aircraft, the method comprising:
rendering a primary flight display on the display device, wherein the primary flight display comprises a synthetic perspective view of terrain for a region proximate a current location of the aircraft, and wherein the synthetic perspective view of terrain corresponds to a flight deck viewpoint;
receiving a locating signal for a target object, the locating signal being based at least in part on a beacon signal emitted by a distress radio beacon associated with the target object, the locating signal comprising identification information for the target object;
determining an object type for the target object based on the identification information; and
in response to the locating signal, rendering, in a portion of the primary flight display, a first symbology with a shape representative of the object type for the target object, wherein the first symbology is rendered in a manner that is influenced by the locating signal.
9. A display system for an aircraft, the display system comprising:
a display device;
a communications system;
a flight management system coupled to the communications system, the flight management system and the communications system being cooperatively configured to:
obtain a locating signal for a target object, the locating signal being based at least in part on a beacon signal emitted by a distress radio beacon associated with the target object, the locating signal including identification information related to the target object based on a beacon identifier obtained from the beacon signal; and
determine an object type for the target object based on the identification information; and
a graphics system coupled to the flight management system and the display device, wherein the graphics system and the flight management system are cooperatively configured to:
render a primary flight display on the display device, wherein the primary flight display comprises a synthetic perspective view of terrain for a region proximate a current location of the aircraft, and wherein the synthetic perspective view of terrain corresponds to a flight deck viewpoint;
obtain location data for the target object based on the locating signal; and
render, in a portion of the primary flight display, a first symbology with a shape representative of the object type, wherein the first symbology is positioned in accordance with the location data.
2. The method of
3. The method of
calculating navigational parameters for the target object based on the location data and the current location of the aircraft; and
displaying the navigational parameters in the primary flight display.
4. The method of
5. The method of
6. The method of
7. The method of
8. The method of
10. The display system of
11. The display system of
12. The display system of
13. The method of
15. The method of
the synthetic perspective view of terrain comprises a three-dimensional conformal view of terrain based on the set of terrain data; and
rendering the graphical representation of the object comprises rendering the symbology in a conformal manner with respect to the synthetic perspective view of terrain.
16. The method of
determining navigational parameters for navigating to the object based on the location data and a current location of the aircraft; and
displaying the navigational parameters on the display device overlying the synthetic perspective view of terrain proximate the symbology.
17. The method of
18. The method of
19. The method of
20. The method of
rendering a lateral map display on the display device concurrently to rendering the synthetic perspective view of terrain, the lateral map display corresponding to a second region proximate the aircraft and including an aircraft symbol; and
rendering a second graphical representation of the object on the lateral map display using a second symbology representative of the object type, wherein the second symbology is positioned relative to the aircraft symbol in accordance with the location data.
The subject matter described herein relates generally to avionics systems, and more particularly, embodiments of the subject matter relate to cockpit display systems adapted for displaying objects associated with a beacon configured to emit a signal.
Modern flight deck displays (or cockpit displays) for vehicles (such as aircraft or spacecraft) display a considerable amount of information, such as vehicle position, speed, altitude, attitude, navigation, target, and terrain information. In the case of an aircraft, most modern displays additionally display a flight plan from different views, either a lateral view, a vertical view, or a perspective view, which can be displayed individually or simultaneously on the same display.
The lateral view, generally known as a lateral map display, is basically a top-view of the flight plan, and may include, for example, a top-view aircraft symbol, waypoint symbols, line segments that interconnect the waypoint symbols, and range rings. The lateral map may also include various map features including, for example, weather information, terrain information, political boundaries, and navigation aids. The terrain information may include situational awareness (SA) terrain, as well as terrain cautions and warnings which, among other things, may indicate terrain that may obstruct the current flight path of the aircraft. The perspective view provides a three-dimensional view of the vehicle flight plan and may include one or more of the above-mentioned features that are displayed on the lateral map, including the terrain information. In this regard, some modern flight deck display systems incorporate a synthetic terrain display, which generally represents a virtual or computer simulated view of terrain rendered in a conformal manner. The primary perspective view used in existing synthetic vision systems emulates a forward-looking cockpit viewpoint. Such a view is intuitive and provides helpful visual information to the pilot and crew.
Often, such aircraft are utilized when performing search and rescue (SAR) operations in conjunction with a beacon-based SAR system. Most beacon-based SAR systems utilize a beacon (e.g., a distress radio beacon or locator beacon), which is a transmitter associated with a person, vehicle, or vessel. In an emergency situation, the beacon is activated, which causes the beacon to emit a beacon signal that is received by one or more satellites in a satellite system. The satellite system processes the beacon signal(s) and determines the approximate real-world location of the beacon (e.g., via triangulation, trilateration, global positioning system (GPS) techniques, and the like). The location of the beacon is provided to the aircraft, which then utilizes this location when attempting to locate the person, vehicle, or vessel.
In many situations, SAR operations are performed in inaccessible or remote locations, for example, in marine environments (e.g., over open ocean) or alpine environments (e.g., mountainous locations). In these various locations, a SAR aircraft might encounter rough weather conditions, such as high winds, clouds, precipitation, and/or low visibility. These conditions make it more difficult for the pilot and/or crew to safely operate the aircraft while simultaneously attempting to locate the source of the beacon signal(s) (e.g., a person or vehicle). In addition, the pilot and/or crew are often operating the aircraft at a reduced flight level during SAR operations, which further increases the demands on the pilot and/or crew. In these situations, the three-dimensional perspective view used in existing synthetic vision systems aids the pilot in safely navigating and operating the aircraft to avoid terrain and/or obstacles; however, the pilot and/or crew are still left with the task of manually navigating to the identified location and locating the source of the beacon signal(s).
A method is provided for displaying information on a display device associated with an aircraft. The method comprises rendering a synthetic perspective view of terrain on the display device, wherein the synthetic perspective view of terrain is based on a set of terrain data corresponding to a region proximate the aircraft. The method further comprises obtaining location data for a first object, wherein the location data is based at least in part on a beacon signal associated with the first object, and rendering a graphical representation of the first object on the display device. The graphical representation of the first object overlies the synthetic perspective view of terrain and is positioned in accordance with the location data.
In another embodiment, a method is provided for displaying information on a display device associated with an aircraft. The method comprises rendering a primary flight display on the display device, wherein the primary flight display comprises a synthetic perspective view of terrain for a region proximate a current location of the aircraft, and wherein the synthetic perspective view of terrain corresponds to a flight deck viewpoint. The method further comprises receiving a locating signal for a target object and, in response to the locating signal, rendering, in a portion of the primary flight display, a first symbology representative of the target object, wherein the first symbology is rendered in a manner that is influenced by the locating signal.
This summary is provided to introduce a selection of concepts in a simplified form that are further described below in the detailed description. This summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used as an aid in determining the scope of the claimed subject matter.
Embodiments of the subject matter will hereinafter be described in conjunction with the following drawing figures, wherein like numerals denote like elements, and
The following detailed description is merely exemplary in nature and is not intended to limit the subject matter of the application and uses thereof. Furthermore, there is no intention to be bound by any theory presented in the preceding background or the following detailed description.
Techniques and technologies may be described herein in terms of functional and/or logical block components, and with reference to symbolic representations of operations, processing tasks, and functions that may be performed by various computing components or devices. It should be appreciated that the various block components shown in the figures may be realized by any number of hardware, software, and/or firmware components configured to perform the specified functions. For example, an embodiment of a system or a component may employ various integrated circuit components, e.g., memory elements, digital signal processing elements, logic elements, look-up tables, or the like, which may carry out a variety of functions under the control of one or more microprocessors or other control devices.
The following description refers to elements or nodes or features being “coupled” together. As used herein, unless expressly stated otherwise, “coupled” means that one element/node/feature is directly or indirectly joined to (or directly or indirectly communicates with) another element/node/feature, and not necessarily mechanically. Thus, although the drawings may depict one exemplary arrangement of elements, additional intervening elements, devices, features, or components may be present in an embodiment of the depicted subject matter. In addition, certain terminology may also be used in the following description for the purpose of reference only, and thus is not intended to be limiting. For example, the terms “first”, “second” and other such numerical terms referring to structures do not imply a sequence or order unless clearly indicated by the context.
For the sake of brevity, conventional techniques related to graphics and image processing, navigation, flight planning, aircraft controls, distress radio beacons or locator beacons, satellite-based search and rescue (SAR) systems, and other functional aspects of the systems (and the individual operating components of the systems) may not be described in detail herein. Furthermore, the connecting lines shown in the various figures contained herein are intended to represent exemplary functional relationships and/or physical couplings between the various elements. It should be noted that many alternative or additional functional relationships or physical connections may be present in an embodiment of the subject matter.
Technologies and concepts discussed herein relate to display systems adapted for displaying, on a display device associated with an aircraft, a graphical representation of a targeted object overlying a synthetic perspective view of terrain for a region proximate the aircraft in response to receiving a locating signal associated with the targeted object. In an exemplary embodiment, the targeted object is rendered and/or displayed in a conformal manner overlying a three-dimensional perspective view of terrain in a primary flight display. The target object may thereby be presented in a manner that enhances situational awareness and improves SAR operations in situations involving rough weather, poor visibility, and/or rugged terrain. Navigational parameters for the target object may be determined relative to the current location and/or heading of the aircraft and displayed proximate the graphical representation of the target object to provide further assistance in locating the target object. In addition, various rescue related criteria may be identified and utilized to influence rendering of the target object, allowing relevant information to be presented in a quick and intuitive manner.
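By way of illustration only, the determination of navigational parameters relative to the current location of the aircraft might be sketched as follows. The function name, the use of great-circle formulas, and the mean Earth radius are assumptions for this sketch and are not part of the original disclosure:

```python
import math

def navigational_parameters(ac_lat, ac_lon, obj_lat, obj_lon):
    """Return (bearing_deg, range_km) from the aircraft to a target object.

    Illustrative great-circle computation; all inputs are in degrees.
    """
    lat1, lon1, lat2, lon2 = map(math.radians, (ac_lat, ac_lon, obj_lat, obj_lon))
    dlon = lon2 - lon1
    # Initial great-circle bearing from the aircraft toward the object
    x = math.sin(dlon) * math.cos(lat2)
    y = math.cos(lat1) * math.sin(lat2) - math.sin(lat1) * math.cos(lat2) * math.cos(dlon)
    bearing = math.degrees(math.atan2(x, y)) % 360.0
    # Haversine range, using a mean Earth radius of 6371 km
    a = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin(dlon / 2) ** 2)
    rng = 2 * 6371.0 * math.asin(math.sqrt(a))
    return bearing, rng
```

Such a bearing and range could then be displayed proximate the graphical representation of the target object, as described herein.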
In an exemplary embodiment, the display device 102 is coupled to the graphics system 110. The graphics system 110 is coupled to the flight management system 108, and the flight management system 108 and the graphics system 110 are cooperatively configured to display, render, or otherwise convey one or more graphical representations or images associated with operation of the aircraft 114 on the display device 102, as described in greater detail below. The flight management system 108 is coupled to the navigation system 104 for obtaining real-time navigational data and/or information regarding operation of the aircraft 114 to support operation of the flight management system 108. In an exemplary embodiment, the communications system 106 is coupled to the flight management system 108 and configured to support communications between the aircraft 114 and a beacon-based SAR system 116, as described in greater detail below.
In an exemplary embodiment, the display device 102 is realized as an electronic display configured to graphically display flight information or other data associated with operation of the aircraft 114 under control of the graphics system 110. In an exemplary embodiment, the display device 102 is located within a cockpit of the aircraft 114. It will be appreciated that although
In an exemplary embodiment, the navigation system 104 is configured to obtain one or more navigational parameters associated with operation of the aircraft 114. The navigation system 104 may be realized as a global positioning system (GPS), inertial reference system (IRS), or a radio-based navigation system (e.g., VHF omni-directional radio range (VOR) or long range aid to navigation (LORAN)), and may include one or more navigational radios or other sensors suitably configured to support operation of the navigation system 104, as will be appreciated in the art. In an exemplary embodiment, the navigation system 104 and the flight management system 108 are cooperatively configured to obtain and/or determine the current location of the aircraft 114 (e.g., the latitude and longitude) and the heading of the aircraft 114 (i.e., the direction the aircraft is traveling in relative to some reference) during operation of the aircraft 114.
In an exemplary embodiment, the communications system 106 is suitably configured to support communications between the aircraft 114 and a beacon-based SAR system 116. In this regard, a beacon-based SAR system 116 should be understood as referring to a system or network that utilizes a beacon 118 (e.g., a distress radio beacon or locator beacon) to locate an object (e.g., a person, vehicle, vessel, aircraft, or another suitable object) that is indicating a desire to be located, rescued, or receive some other type of attention (e.g., the object is in distress or otherwise experiencing a state of emergency). In this regard, the beacon 118 is realized as a transmitter associated with a particular person (e.g., a personal locator beacon or PLB), aircraft (e.g., an emergency locator transmitter or ELT), vessel (e.g., an emergency position-indicating radio beacon or EPIRB), or another vehicle or object. When activated (either manually or automatically), the beacon 118 emits a beacon signal which is utilized to determine the approximate real-world location of the beacon 118 and thereby locate the object associated with the beacon 118, as described in greater detail below. In an exemplary embodiment, the beacon 118 emits a beacon signal comprising a burst of digital information at 406 MHz, although in alternative embodiments, the beacon 118 may emit an analog beacon signal or emit a beacon signal having a different frequency, such as, for example, 121.5 MHz, 243 MHz, or another suitable frequency. In an exemplary embodiment, the beacon 118 has a unique identifier (referred to herein as the beacon identifier) which is utilized to establish an association between the beacon 118 (or beacon signal) and the object associated with the beacon 118. 
For example, the beacon identifier may comprise a unique hexadecimal identification code, a maritime mobile service identity (MMSI) code, an international civil aviation organization (ICAO) code, or the like that may be used to identify the object associated with the beacon 118. Various aspects of beacon-based SAR systems are well known and so, in the interest of brevity, will only be mentioned briefly herein or will be omitted entirely without providing the well known details.
As shown, in an exemplary embodiment, the communications system 106 communicates with a satellite-based SAR system 116, such as, for example, Cospas-Sarsat, which is configured to receive one or more beacon signals from the beacon 118. In this regard, the satellite-based SAR system 116 may include a plurality of satellites, signal processing stations, mission control centers and/or rescue coordination centers. For example, one or more satellites of the plurality of satellites may receive a beacon signal from the beacon 118, and communicate the beacon signal (or the information embodied by or derived from the beacon signal) to one or more ground based signal processing stations (or ground terminals). The ground stations may calculate and/or determine the approximate location of the beacon 118 (e.g., via triangulation, trilateration, GPS techniques, and the like). In an exemplary embodiment, SAR system 116 also identifies the object associated with the beacon 118 using the beacon identifier. For example, to register the beacon 118 with the SAR system 116, in addition to a unique hexadecimal identification code, the beacon may also be registered with information pertaining to the type of vehicle (or object), a vehicle (or object) identifier (e.g., the name, call sign, MMSI, or an identification number), along with information pertaining to the owner of the beacon 118 (or the associated vehicle or object) and emergency contact information. In this regard, in an exemplary embodiment, the SAR system 116 includes a registration database that maintains the association between the beacon identifier and the type and/or identity of the object or vehicle.
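The registration lookup described above might, purely for illustration, be modeled as a keyed database of beacon records. The identifier value, record fields, and names below are hypothetical placeholders, not actual registration data:

```python
from dataclasses import dataclass

@dataclass
class BeaconRecord:
    object_type: str       # e.g., "maritime", "aircraft", "person"
    object_id: str         # name, call sign, MMSI, or identification number
    owner: str             # owner of the beacon or associated vehicle/object
    emergency_contact: str

# Hypothetical registration database keyed by the unique beacon identifier
REGISTRATION_DB = {
    "ADCE0228C500401": BeaconRecord(
        object_type="maritime",
        object_id="MMSI 366123456",
        owner="Registered Owner",
        emergency_contact="Emergency Contact",
    ),
}

def identify_object(beacon_identifier):
    """Resolve a beacon identifier to its registered object record, or None."""
    return REGISTRATION_DB.get(beacon_identifier)
```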
In an exemplary embodiment, a mission control center and/or rescue coordination center receives the data and/or information from the satellites and/or ground stations, and in response, communicates a locating signal that comprises the information derived from the received beacon signals (e.g., the approximate real-world location of the beacon 118 and/or the identity of the object based on the beacon identifier) to the aircraft 114 for conducting a SAR operation, as described in greater detail below. It should be appreciated in the art that although
Referring now to
In the illustrated embodiment, the primary flight display 202 and the lateral map display 204 represent defined sections or windows rendered on a single display device (e.g., display device 102), with the lateral map display 204 overlying a portion of the primary flight display 202. In other embodiments, the primary flight display 202 and the lateral map display 204 can be rendered on one or more physically distinct display devices. Although not a requirement, the general positioning, size, boundaries, and orientation of primary flight display 202 and lateral map display 204 within flight deck display 200 remain fixed during operation. It should be appreciated that flight deck display 200 as depicted in
In the illustrated embodiment, primary flight display 202 includes several features that are graphically rendered, including, without limitation, a synthetic perspective view of terrain 206 and a graphical representation of one or more objects 208, 210, each object 208, 210 being associated with a locating signal received and/or obtained by the aircraft, as described in greater detail below. For the sake of clarity, simplicity, and brevity, the additional graphical elements of the primary flight display 202 (e.g., pilot guidance elements, altimeters, airspeed indicators, and the like) will not be described herein. As described in greater detail below, the objects 208, 210 comprise symbology representative of the type of object associated with the respective locating signal. In this regard, each object 208, 210 is positioned within the primary flight display 202 overlying the terrain 206 in a manner that accurately reflects the approximate real-world location of the object 208, 210 based on one or more beacon signals emitted by and/or received from a beacon associated with the object 208, 210, as described in greater detail below.
In an exemplary embodiment, the terrain 206 is based on a set of terrain data that corresponds to a region proximate the current location of the aircraft. In this regard, the graphics system 110 includes or otherwise accesses one or more databases 112, and in conjunction with navigational information from flight management system 108 and/or navigation system 104, the graphics system 110 controls the rendering of the terrain 206 on the display device 102 and updates the set of terrain data being used for rendering as needed as the aircraft 114 travels. In this regard, the display system 100 includes one or more databases 112 to support rendering of the flight deck display 200, including, for example, a terrain database, a geopolitical database, a navigational aid (or NAVAID) database, an obstacle database, a marine database, or another suitable commercial or military database. In addition, in some embodiments, the graphics system 110 may be coupled to one or more sensors (or sensor systems) and adapted to utilize real-time sensor data (e.g., infrared imagery) to augment the data used in rendering the terrain 206 or otherwise enhance the situational awareness and/or accuracy of the primary flight display 202 (e.g., an enhanced synthetic-vision system).
As shown, in an exemplary embodiment, the graphics system 110 is configured to render the terrain 206 in a perspective or three-dimensional view that corresponds to a flight deck (cockpit) viewpoint. In other words, terrain 206 is displayed in a graphical manner that simulates a flight deck viewpoint, that is, the vantage point of a person in the cockpit of the aircraft. Thus, features of terrain 206 are displayed in a conformal manner, relative to the earth. For example, the relative elevations and altitudes of features in terrain 206 are displayed in a virtual manner that emulates reality. Moreover, as the aircraft navigates (e.g., turns, ascends, descends, rolls, etc.), the graphical representation of terrain 206 and other features of the perspective display can shift to provide a continuously updated virtual representation for the flight crew. It should be appreciated that the perspective view associated with primary flight display 202 need not always include a perspective view of terrain 206. For example, in the absence of terrain data, the perspective view of the display may appear flat, blank, or otherwise void of conformal terrain graphics.
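The conformal, flight-deck-viewpoint positioning of a symbol might be approximated, purely as an illustrative sketch, with a simple pinhole projection aligned with the aircraft heading. The function name, the level-flight simplification (no pitch or roll), and the field-of-view value are assumptions of this sketch:

```python
import math

def project_to_display(rel_east, rel_north, rel_up, heading_deg,
                       fov_deg=60.0, width=1024, height=768):
    """Project an object's position (meters east/north/up of the aircraft)
    onto a forward-looking flight-deck view.

    Returns (x, y) in pixels, or None when the object lies behind the
    viewpoint. Illustrative pinhole model assuming level flight.
    """
    h = math.radians(heading_deg)
    # Rotate world axes so the forward axis points along the aircraft heading
    forward = rel_north * math.cos(h) + rel_east * math.sin(h)
    right = rel_east * math.cos(h) - rel_north * math.sin(h)
    if forward <= 0:
        return None  # behind the flight-deck viewpoint; not rendered
    # Horizontal focal length in pixels for the assumed field of view
    f = (width / 2) / math.tan(math.radians(fov_deg) / 2)
    x = width / 2 + f * right / forward
    y = height / 2 - f * rel_up / forward
    return x, y
```

An object directly ahead of the aircraft would thus map to the center of the display, shifting conformally as the aircraft turns.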
In an exemplary embodiment, the lateral map display 204 is concurrently rendered with primary flight display 202, preferably in a real-time and synchronized manner. The illustrated lateral map display 204 includes a top-view aircraft symbol 212 and top-view graphical representations 214, 216 of the one or more objects 208, 210. For the sake of clarity, simplicity, and brevity, the additional graphical elements of the lateral map display 204 will not be described herein. Lateral map display 204 also preferably includes various map features including, but not limited to, a lateral two-dimensional view of terrain 218 corresponding to a region proximate the current location of the aircraft. The depicted state of the lateral map display 204 corresponds to the depicted state of the primary flight display 202. In other words, the terrain 218 and objects 214, 216 are positioned relative to the aircraft symbol 212 in a manner corresponding to the depiction of the terrain 206 and objects 208, 210 relative to the flight-deck viewpoint for the aircraft 114. In a similar manner as described above, each object 214, 216 is positioned within the lateral map display 204 overlying the terrain 218 in a manner that reflects the approximate real-world location of the object 214, 216 based on one or more beacon signals emitted by and/or received from the respective beacon associated with the object 214, 216, as described in greater detail below.
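A heading-up, top-view placement of a target symbol relative to the aircraft symbol might be sketched as follows; the function name, parameters, and pixel scale are illustrative assumptions only:

```python
import math

def lateral_map_position(bearing_deg, range_m, aircraft_px, px_per_m,
                         heading_deg=0.0):
    """Place a target symbol on a heading-up lateral map.

    bearing_deg and range_m describe the target relative to the aircraft;
    aircraft_px is the (x, y) pixel location of the aircraft symbol.
    Returns (x, y) pixel coordinates; y decreases toward the top of the map.
    """
    # On a heading-up map, rotate the bearing by the aircraft heading
    rel = math.radians(bearing_deg - heading_deg)
    ax, ay = aircraft_px
    x = ax + range_m * px_per_m * math.sin(rel)
    y = ay - range_m * px_per_m * math.cos(rel)
    return x, y
```

A target dead ahead of the aircraft thus appears directly above the aircraft symbol, consistent with the synchronized depiction in the primary flight display.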
Referring now to
Referring again to
In an exemplary embodiment, the SAR display process 300 continues by determining or otherwise obtaining location data for the target object (task 304). In this regard, location data comprises data and/or information (e.g., latitude and longitude, GPS coordinates, and the like) used to identify the approximate real-world location or position of the source of the beacon signal(s) (e.g., the beacon associated with the target object) embodied by the locating signal. In accordance with one embodiment, the location data may be calculated by the SAR system 116 and encapsulated or otherwise contained within the locating signal. For example, the SAR system 116 may calculate the location of the target object by performing triangulation on a plurality of beacon signals received by a plurality of satellites. In this manner, the aircraft 114 may determine and/or identify location data for the target object based on the contents of the locating signal obtained from the SAR system 116. In another embodiment, the locating signal may contain information derived from one or more beacon signals, wherein the aircraft 114 calculates and/or determines the location data for the target object based on the locating signal.
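As one hypothetical illustration of the triangulation/trilateration mentioned above, the following planar sketch solves for a beacon position from three receiver positions and range estimates. A real SAR system would solve in three dimensions on the Earth ellipsoid; this simplified form is for explanation only:

```python
def trilaterate(p1, p2, p3):
    """Estimate a 2D position from three (x, y, range) measurements,
    e.g., ranges derived from beacon signals received at three stations.

    Solves the linearized circle-intersection equations; assumes the
    three receivers are not collinear.
    """
    (x1, y1, r1), (x2, y2, r2), (x3, y3, r3) = p1, p2, p3
    # Subtracting pairs of circle equations yields two linear equations
    a = 2 * (x2 - x1); b = 2 * (y2 - y1)
    c = r1**2 - r2**2 - x1**2 + x2**2 - y1**2 + y2**2
    d = 2 * (x3 - x2); e = 2 * (y3 - y2)
    f = r2**2 - r3**2 - x2**2 + x3**2 - y2**2 + y3**2
    x = (c * e - f * b) / (e * a - b * d)
    y = (c * d - a * f) / (b * d - a * e)
    return x, y
```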
Referring again to
In an exemplary embodiment, the SAR display process 300 continues by determining and/or identifying rescue criteria for the target object (task 308). In this regard, the rescue criteria represent information pertaining to one or more categories deemed relevant to conducting the SAR operation, such as, for example, the object type of the target object, the number of persons associated with the target object, the type and/or amount of cargo associated with the target object, and/or geographic data (e.g., altitude or sea level depth) at or near the location of the target object. As described in greater detail below, in an exemplary embodiment, the SAR display process 300 may be adapted to display the identified rescue criteria on the display device and/or render a graphical representation of the target object in a manner that is influenced by the identified rescue criteria.
In accordance with one embodiment, the SAR display process 300 may identify an object type for the target object based on the locating signal (e.g., based on the associated beacon identifier). For example, as described above, the SAR system 116 and/or aircraft 114 includes a registration database that maintains the association between a beacon identifier and the type and/or identity of the object or vehicle. In this regard, the object associated with the beacon may be classified based on the type of vehicle and/or vessel, the type of cargo, the size of the vehicle and/or vessel, and/or the number of persons associated with a particular object.
In an exemplary embodiment, in response to identifying the object type for the target object, the SAR display process 300 renders a graphical representation of the target object using a symbology representative (or indicative) of the identified object type, as described in greater detail below. For example, a maritime object (M1, M2, or M3) may be rendered in the shape of a ship or boat, with a different size depending on the classification, an aircraft (A) may be rendered with the shape of a plane, a person (P) may be rendered with the shape of a person, a ground station (G) with the shape of a building, and so on. It should be appreciated that the symbology provided herein is for explanatory purposes, and is not intended to limit the subject matter in any way.
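The object-type-to-symbology mapping described above might be sketched as a simple lookup table. The classification labels mirror those used for explanation above (M1/M2/M3, A, P, G); the shape and size values, and the generic fallback, are assumptions of this sketch:

```python
# Illustrative mapping from identified object type to display symbology
SYMBOLOGY = {
    "M1": {"shape": "ship", "size": "small"},
    "M2": {"shape": "ship", "size": "medium"},
    "M3": {"shape": "ship", "size": "large"},
    "A":  {"shape": "plane", "size": "medium"},
    "P":  {"shape": "person", "size": "small"},
    "G":  {"shape": "building", "size": "medium"},
}

def symbology_for(object_type):
    """Return the symbol definition for an object type, with a generic
    fallback when the type is unrecognized."""
    return SYMBOLOGY.get(object_type, {"shape": "diamond", "size": "medium"})
```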
In accordance with one embodiment, the SAR display process 300 may identify and/or determine the number of persons associated with the target object. In response, the SAR display process 300 may render the symbology representing the target object using a first visually distinguishable characteristic based on the number of persons associated with the object, as described in greater detail below. In a similar manner, the SAR display process 300 may identify and/or determine the type of cargo associated with the target object, and in response, render the symbology representing the target object using a second visually distinguishable characteristic based on the type of cargo associated with the target object. In accordance with another embodiment, the SAR display process 300 may identify geographic data for the location of the target object, such as, for example, the altitude or depth of sea level at the identified location. In response, the SAR display process 300 may render the symbology representing the target object using a third visually distinguishable characteristic based on the geographic data for the location of the target object.
Referring now to
In addition, in an exemplary embodiment, the SAR display process 300 renders the symbology for a target object using one or more visually distinguishable characteristics based on one or more additional rescue criteria identified for the target object (e.g., task 308). In this regard, each visually distinguishable characteristic is chosen to convey a particular level of urgency for a particular target object. Depending on the embodiment, the visually distinguishable characteristic may be realized using one or more of the following: color, hue, tint, brightness, graphically depicted texture or pattern, contrast, transparency, opacity, animation (e.g., strobing, flickering or flashing), and/or other graphical effects. For example, if the target object is associated with a maritime vessel carrying a high number of passengers or potentially hazardous cargo, the symbology may be rendered using a color, such as red, that indicates a greater urgency than another object (e.g., an unmanned vessel), which may be rendered using a different color, such as amber, to indicate lesser urgency. Similarly, another visually distinguishable characteristic may be used to indicate other rescue criteria. For example, if the depth of the sea at the location of a targeted maritime vessel is greater than a particular threshold value, the symbology for the targeted vessel may be rendered using a visually distinguishable characteristic (e.g., flashing or strobing) to indicate a greater urgency than a targeted vessel in shallower water. In this regard, as shown in
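The urgency-driven styling described above can be sketched as a rule that maps rescue criteria to a color and an animation. The threshold values, criteria names, and cargo categories below are assumptions for illustration only; the passage specifies only the red/amber and flashing examples:

```python
# Hypothetical urgency styling. Thresholds and cargo categories are
# illustrative assumptions, not values from the described system.
HAZARDOUS_CARGO = {"fuel", "chemicals", "explosives"}

def urgency_style(persons, cargo, sea_depth_m,
                  many_persons=10, deep_water_m=200):
    """Choose a color and animation for the target symbology.

    Red conveys greater urgency (many persons aboard or hazardous cargo);
    amber conveys lesser urgency (e.g., an unmanned vessel). Flashing is
    added when the sea depth at the target exceeds a threshold."""
    color = "amber"
    animation = None
    if (persons or 0) >= many_persons or cargo in HAZARDOUS_CARGO:
        color = "red"
    if sea_depth_m is not None and sea_depth_m > deep_water_m:
        animation = "flashing"
    return {"color": color, "animation": animation}
```

A renderer could then apply the returned style to whatever shape was selected for the object type, combining several visually distinguishable characteristics on one symbol.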
In an exemplary embodiment, the SAR display process 300 continues by displaying the navigational parameters (e.g., from task 306) for the target object on the display device (task 312). In an exemplary embodiment, the navigational parameters are displayed proximate the graphical representation of the target object; however, in other embodiments, the positioning of the navigational parameters may vary depending on human factors and other concerns. For example, as shown in
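One plausible way to compute the navigational parameters (distance and bearing) displayed next to the symbology is the standard haversine great-circle formula; the function name and units are assumptions for illustration:

```python
import math

EARTH_RADIUS_NM = 3440.065  # mean Earth radius in nautical miles

def nav_parameters(ac_lat, ac_lon, tgt_lat, tgt_lon):
    """Return (distance_nm, bearing_deg) from the aircraft to the target,
    both positions given in decimal degrees."""
    phi1, phi2 = math.radians(ac_lat), math.radians(tgt_lat)
    dphi = math.radians(tgt_lat - ac_lat)
    dlam = math.radians(tgt_lon - ac_lon)
    # Haversine great-circle distance.
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    distance = 2 * EARTH_RADIUS_NM * math.asin(math.sqrt(a))
    # Initial bearing, normalized to [0, 360) degrees.
    y = math.sin(dlam) * math.cos(phi2)
    x = (math.cos(phi1) * math.sin(phi2)
         - math.sin(phi1) * math.cos(phi2) * math.cos(dlam))
    bearing = (math.degrees(math.atan2(y, x)) + 360.0) % 360.0
    return distance, bearing
```

For example, a target one degree of longitude due east of an aircraft on the equator lies at a bearing of 090 at roughly 60 nautical miles.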
In an exemplary embodiment, the SAR display process 300 also displays identified rescue criteria (e.g., from task 308) for the target object on the display device (task 314). Similarly, in the illustrated embodiment, the rescue criteria are displayed proximate the graphical representation of the target object; however, in other embodiments, the positioning of the rescue criteria may vary depending on human factors and other concerns. For example, as shown in
Referring now to
In the illustrated embodiment, the primary flight display 502 includes a synthetic perspective view of terrain 506 and graphical representations of two target objects 508, 510. As shown, the first target object 508 corresponds to a personal locator beacon or person (e.g., object type P) and the second target object 510 corresponds to a ground-based beacon (e.g., object type G). The symbology of each respective object 508, 510 is positioned within the primary flight display 502 and rendered overlying the terrain 506 in a manner that reflects the approximate real-world location of the target object, in a similar manner as described above. The terrain 506 for a region proximate the aircraft is rendered in a perspective or three-dimensional view that corresponds to a flight deck (cockpit) viewpoint. The terrain 506 is rendered in a conformal manner relative to the earth, with the symbology representative of the objects 508, 510 being rendered in a conformal manner relative to the terrain 506. As shown, the symbology for the first object 508 (e.g., a person) is rendered using a visually distinguishable characteristic (e.g., the color red) that indicates a greater urgency than for the second object 510. The respective navigational parameters (e.g., distance and bearing) for the objects 508, 510, along with the respective object types, are displayed proximate each respective object 508, 510 in a similar manner as described above. It should be appreciated that the lateral map display 504 may also be modified to display graphical representations 514, 516 of the objects 508, 510 in a similar manner as described above in the context of
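Positioning symbology conformally over the perspective terrain amounts to projecting the target's world position into the flight-deck viewpoint. A minimal pinhole-camera sketch, with the target offset expressed in local east/north/up meters relative to the aircraft, can illustrate the idea; the field of view, screen size, and function name are assumptions for illustration only:

```python
import math

def project_to_screen(dx_east, dy_north, dz_up, heading_deg,
                      fov_deg=60.0, width=1024, height=768):
    """Project a target offset (meters, relative to the aircraft) into
    pixel coordinates for a flight-deck viewpoint with the given heading.
    Returns None when the target is behind the viewpoint."""
    h = math.radians(heading_deg)
    # Rotate world offsets into camera axes: x right, y up, z forward.
    cam_x = dx_east * math.cos(h) - dy_north * math.sin(h)
    cam_z = dx_east * math.sin(h) + dy_north * math.cos(h)
    cam_y = dz_up
    if cam_z <= 0:
        return None  # target is behind the flight-deck viewpoint
    # Focal length in pixels from the horizontal field of view.
    f = (width / 2) / math.tan(math.radians(fov_deg) / 2)
    px = width / 2 + f * cam_x / cam_z
    py = height / 2 - f * cam_y / cam_z
    return px, py
```

With heading 000, a target directly north of the aircraft at the same altitude projects to the screen center; a real synthetic-vision renderer would additionally account for pitch, roll, and terrain elevation.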
To briefly summarize, the methods and systems described above allow an aircraft operator (e.g., a pilot and/or crew member) to navigate to a target object and locate the target object using the flight deck display. By rendering a graphical representation of the target object overlying the terrain in a manner that accurately reflects the real-world location of the target object, the situational awareness of the aircraft operator is improved, and the aircraft operator can more easily navigate the aircraft to the appropriate location and conduct the search in rough weather conditions (e.g., low visibility). The target object may also be rendered using symbology that provides relevant rescue information (e.g., rescue criteria) to the aircraft operator in a quick and intuitive fashion. In addition, the relevant navigational parameters and/or rescue criteria may be displayed proximate the symbology. The workload on the aircraft operator and others involved in the SAR operation is reduced, thereby improving the effectiveness of SAR operations.
While at least one exemplary embodiment has been presented in the foregoing detailed description, it should be appreciated that a vast number of variations exist. It should also be appreciated that the exemplary embodiment or exemplary embodiments are only examples, and are not intended to limit the scope, applicability, or configuration of the subject matter in any way. Rather, the foregoing detailed description will provide those skilled in the art with a convenient road map for implementing an exemplary embodiment of the subject matter. It should be understood that various changes may be made in the function and arrangement of elements described in an exemplary embodiment without departing from the scope of the subject matter as set forth in the appended claims.
Inventors: Subhadeep Pal; Sharad Rathour
Assignee: Honeywell International Inc. Assignment of assignors' interest by Subhadeep Pal and Sharad Rathour executed and recorded Feb 16, 2009 (Reel 022262, Frame 0440).
Maintenance fees paid (large entity): 4th year, Jun 27, 2016; 8th year, Jul 1, 2020; 12th year, Jun 25, 2024.