Methods and systems for assisting a pilot during flight of an aircraft are disclosed. The method includes receiving data indicative of a current location of the aircraft during flight and data indicative of a flight boundary, and determining that the flight boundary is relevant to the aircraft. The method also includes causing a display device to display a surface representing the flight boundary relative to the current location of the aircraft.
1. A system for assisting a pilot during flight of an aircraft, the system comprising:
a display device;
one or more processors operatively connected to the display device; and
a non-transitory machine-readable memory operatively connected to the one or more processors, storing instructions executable by the one or more processors and configured to cause the one or more processors to:
using data indicative of a current location of the aircraft during flight and data indicative of a flight boundary to be crossed by the aircraft, determine that a distance between the current location of the aircraft and the flight boundary is within a predetermined distance threshold; and
generate an output configured to cause the display device to display, relative to the current location of the aircraft, a surface graphically representing the flight boundary.
2. The system as defined in
8. The system as defined in
9. The system as defined in
using data indicative of a trajectory of the aircraft, identify one of the cells through which the aircraft is expected to penetrate the flight boundary; and
cause the identified cell to be visually distinguished from one or more other cells in the mesh of cells.
10. The system as defined in
11. The system as defined in
12. The system as defined in
14. The system as defined in
15. The system as defined in
16. The system as defined in
17. The system as defined in
the flight boundary is a virtual terrain protection floor disposed a predetermined distance above a ground; and
the instructions are configured to cause the one or more processors to, using data indicative of terrain, cause the surface to depict a change in elevation indicative of a corresponding change in elevation in the terrain.
This application relies for priority on U.S. Provisional Patent Application Ser. No. 62/827,455 filed on Apr. 1, 2019, the entire content of which is incorporated herein by reference.
The disclosure relates generally to aircraft, and more particularly to systems and methods for assisting a pilot during flight by providing enhanced situational awareness.
During an aircraft's approach to landing, some aircraft can be operated according to visual flight rules (VFR) or instrument flight rules (IFR). Under VFR, a pilot operates the aircraft in weather conditions (i.e., visual meteorological conditions (VMC)) generally clear enough to allow the pilot to operate the aircraft using visual references external to the aircraft. If VMC are not met, the meteorological conditions are considered instrument meteorological conditions (IMC) and the aircraft may only be operated under IFR, with operation of the aircraft conducted primarily by reference to the instruments rather than to external visual references. IFR operation can require specific training and certification of the pilot and can also mandate higher equipment requirements for the aircraft.
An instrument approach conducted under IFR terminates either by the pilot acquiring sufficient visual reference to proceed to the runway, or with a missed approach because the required visual reference is not acquired in time. An instrument approach procedure is intended to transfer the aircraft, in an orderly manner, under instrument flight conditions from the beginning of the initial approach to a landing or to a point from which the landing can be made visually. In a precision approach, a decision height (DH) or decision altitude (DA) is a specified lowest height or altitude during the approach descent at which, if the required visual reference to continue the approach (such as the runway markings or runway environment) is not visible to the pilot, the pilot must initiate a missed approach (i.e., go-around). The specific values for DH and/or DA at a given airport are established with the intention to allow a pilot sufficient time to safely reconfigure the aircraft to climb and execute the missed approach procedure while avoiding terrain or other obstacles. A DH is a value measured in reference to the ground level while a DA is a value measured in reference to the mean sea level.
In a non-precision approach when no electronic glideslope is provided, the minimum descent altitude (MDA) is the lowest altitude, expressed in reference to the mean sea level, to which descent is authorized on final approach or during circle-to-land maneuvering. Unlike a DH or DA, a missed approach does not need to be initiated immediately upon reaching the MDA. A pilot flying a non-precision approach may descend to the MDA and maintain it until reaching the missed approach point (MAP), then initiate a missed approach if the required visual reference was not obtained. An aircraft must not descend below the MDA until visual reference is obtained and the aircraft can land while performing normal maneuvers.
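The different decision logic described above for precision and non-precision approaches can be summarized in a minimal Python sketch. This is purely illustrative and not part of the disclosure; the function names, and the idea of modeling "past the MAP" as a boolean, are assumptions made for the example:

```python
def must_go_around_precision(alt_msl, decision_altitude, runway_visible):
    """Precision approach: a missed approach must begin the moment
    the aircraft reaches the DA without the required visual reference."""
    return alt_msl <= decision_altitude and not runway_visible

def must_go_around_non_precision(alt_msl, mda, past_map, runway_visible):
    """Non-precision approach: the aircraft may level off at the MDA
    and continue to the missed approach point (MAP) before deciding;
    descent below the MDA without visual reference is never allowed."""
    if alt_msl < mda and not runway_visible:
        return True
    return past_map and not runway_visible
```

The key difference the sketch captures is temporal: reaching the DA forces an immediate decision, whereas the MDA can be maintained until the MAP is reached.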
Existing aircraft systems provide situational awareness relating to DH, DA or MDA that can require a significant amount of the pilot's focus and attention during flight phases of high workload. Improvement is desirable.
In one aspect, the disclosure describes a system for assisting a pilot during flight of an aircraft. The system comprises:
The output may be configured to cause the display device to simultaneously display an attitude indicator and the surface representing the flight boundary.
The surface may be planar.
The surface may be non-planar.
The surface may be at least partially transparent.
The flight boundary may be a lateral flight boundary.
The flight boundary may be a vertical flight boundary.
The output may be configured to cause the display device to display a mesh of cells on the surface.
The instructions may be configured to cause the one or more processors to:
The instructions may be configured to cause the one or more processors to cause a size of the cells in the mesh of cells to be defined as a function of a distance between the aircraft and the flight boundary.
The instructions may be configured to cause the one or more processors to cause the size of the cells to dynamically vary as the distance between the aircraft and the flight boundary varies.
The instructions may be configured to cause the one or more processors to cause an aspect ratio of the cells to be defined as a function of the current location of the aircraft.
Some or all of the cells may be hexagons.
The instructions may be configured to cause the one or more processors to cause an appearance of the surface to vary as a distance between the aircraft and the flight boundary varies.
The flight boundary may be a decision height, a decision altitude or a minimum descent altitude associated with an approach conducted under instrument flight rules.
The flight boundary may be a virtual terrain protection floor disposed a predetermined distance above a ground. The instructions may be configured to cause the one or more processors to, using data indicative of terrain, cause the surface to depict a change in elevation indicative of a corresponding change in elevation in the terrain.
The flight boundary may demarcate a restricted airspace.
The display device may be a head-down display device.
The display device may be a head-up display device.
The display device may be a primary flight display device.
Embodiments can include combinations of the above features.
In another aspect, the disclosure describes an aircraft comprising a system as described herein.
In another aspect, the disclosure describes a computer-implemented method for assisting a pilot during flight of an aircraft. The method comprises:
The method may comprise causing the display device to simultaneously display an attitude indicator and the surface representing the flight boundary.
The surface may be planar.
The surface may be non-planar.
The surface may be at least partially transparent.
The flight boundary may be a lateral flight boundary.
The flight boundary may be a vertical flight boundary.
The method may comprise causing the display device to display a mesh of cells on the surface.
The method may comprise:
The method may comprise causing a size of the cells in the mesh of cells to be defined as a function of a distance between the aircraft and the flight boundary.
The method may comprise dynamically varying the size of the cells in the mesh of cells as the distance between the aircraft and the flight boundary varies.
The method may comprise defining an aspect ratio of the cells in the mesh of cells as a function of the current location of the aircraft.
Some or all of the cells may be hexagons.
The method may comprise causing an appearance of the surface to vary as a distance between the aircraft and the flight boundary varies.
The flight boundary may be a decision height, a decision altitude or a minimum descent altitude associated with an approach conducted under instrument flight rules.
The flight boundary may be a virtual terrain protection floor disposed a predetermined distance above a ground.
The flight boundary may be a virtual terrain protection floor disposed a predetermined distance above a ground and the method may comprise using data indicative of terrain to cause the surface to depict a change in elevation indicative of a corresponding change in elevation in the terrain.
The flight boundary may demarcate a restricted airspace.
The display device may be a head-down display device.
The display device may be a head-up display device.
The display device may be a primary flight display device.
Embodiments can include combinations of the above features.
In another aspect, the disclosure describes a computer program product for assisting a pilot during flight of an aircraft. The computer program product may comprise a non-transitory computer readable storage medium containing program code. The program code may be readable/executable by a computer, processor or logic circuit to perform a method as described herein.
Further details of these and other aspects of the subject matter of this application will be apparent from the detailed description included below and the drawings.
Reference is now made to the accompanying drawings, in which:
The following description describes systems and methods for assisting a pilot during flight of an aircraft. In various embodiments, the systems and methods disclosed herein can improve the operation of an aircraft flight deck by providing a contextual synthetic visual (e.g., graphical) representation of one or more flight boundaries to the pilot in an intuitive and non-obstructive manner that can enhance the situational awareness of the pilot during phases of flight of relatively high workload. Such enhancement in situational awareness can improve flight safety by helping reduce the potential for pilot error such as the unintended crossing of such flight boundaries. The systems and methods described herein can be used during approaches to landing and/or during other phases of flight.
As explained below, such flight boundaries can include high or low vertical flight boundaries (e.g., decision height (DH), decision altitude (DA), minimum descent altitude (MDA), a virtual terrain protection floor and high or low altitude limits) and/or left or right lateral flight boundaries that demarcate restricted airspace, mountains or other obstacles for example. The flight boundaries referenced herein can be virtual such that they do not represent surfaces of real physical objects (e.g., ground, buildings or mountains) in the vicinity of the aircraft. Instead, the flight boundaries can be disposed above the ground and be spaced apart from actual objects in order to, for example, provide a safe clearance distance between the aircraft and such object(s).
Aspects of various embodiments are described through reference to the drawings.
Flight deck 12 can include additional or fewer elements than those shown and described herein. Flight deck 12 can include left portion 12A intended to be used by a pilot (sometimes referred to as "captain") of aircraft 10 and right portion 12B intended to be used by a co-pilot (sometimes referred to as "first officer") of aircraft 10. As referenced herein, the term "pilot" is intended to encompass an individual responsible for the operation of aircraft 10 during flight such as a pilot or a co-pilot. Left portion 12A and right portion 12B can have functionally identical components so that at least some operational redundancy can be provided between left portion 12A and right portion 12B of flight deck 12.
Flight deck 12 can include one or more display devices 14A, 14B (referred generally herein as “display device 14”) providing respective display areas. In the exemplary configuration of flight deck 12 shown in
Flight deck 12 can also include one or more head-up display devices 14B (referred hereinafter in the singular as "HUD 14B") which can be transparent displays that can present data without requiring the pilot to look away from his/her usual viewpoint out the windshield of aircraft 10. HUD 14B can present information to the pilot while the pilot's head is positioned "up" and looking forward, instead of angled down looking at lower instruments or displays such as HDDs 14A. In various embodiments, right and left portions 12A, 12B of flight deck 12 can each have a HUD 14B or, alternatively, flight deck 12 can include only one HUD 14B disposed in left portion 12A of flight deck 12 for example. HUDs 14B can be either fixed-mounted displays or head-mounted displays (including helmet-mounted displays). In various embodiments, HUD 14B can include a CRT configured to generate an image on a phosphor screen, a solid state light source such as an LED that is modulated by an LCD screen to display an image, optical waveguides that produce an image directly in a combiner, or a scanning laser configured to display an image on a transparent medium.
It is understood that aspects of this disclosure, including the display of flight boundaries, are not limited to one or more display devices 14 that are part of flight deck 12 or that are onboard aircraft 10. For example, the display of flight boundaries and associated steps could, alternatively or in addition, be performed off of aircraft 10 using a system and/or display device 14 that are off of aircraft 10 and are used to control aircraft 10 remotely. For example, relevant information could be transmitted between aircraft 10 and a location remote from aircraft 10 (e.g., ground station) in order to be used by an operator (i.e., pilot) controlling/flying aircraft 10 remotely.
Computer 24 can receive input(s) 34 in the form of data or information that can be processed by data processor 26 according to instructions 30 in order to generate suitable output for controlling display device 14. Input 34 can include information (data) indicative of a condition of aircraft 10. Input 34 can be received via manual entry by the pilot using one or more pilot input devices such as cursor control device 18 and/or multi-function keypad 20 for example. Alternatively or in addition, input 34 can be received automatically from one or more data sources (e.g., aircraft systems 36) operatively connected to computer 24 such as a navigation system, a terrain database of a synthetic vision system, a flight management system, an air data system and/or a (e.g., radar) altimeter for example. Input 34 can include operating parameters of aircraft 10 measured via suitable sensors or derived from data acquired via such sensors. Input 34 can include data indicative of a current location of aircraft 10 during flight, data indicative of a trajectory of aircraft 10, data indicative of a speed (e.g., air speed or ground speed) of aircraft 10, data indicative of terrain in proximity of aircraft 10, data indicative of an altitude of aircraft 10 and/or data indicative of one or more flight boundaries, for example. Input 34 can include information (data) indicative of a substantially real-time status of aircraft 10.
Computer 24 can be part of an avionics suite of aircraft 10 and be operatively integrated with avionic components of aircraft 10. For example, in some embodiments, computer 24 can be configured to carry out additional functions than those described herein including the management of one or more graphic user interfaces of flight deck 12 and/or other part(s) of aircraft 10. The methods disclosed herein (or part(s) thereof) could be performed using a plurality of computers 24 or data processors 26, or, alternatively, be performed entirely using a single computer 24 or data processor 26. In some embodiments, computer 24 could be physically integrated with (e.g., embedded in) display device 14.
Data processor 26 can include any suitable device(s) configured to cause a series of steps to be performed by computer 24 so as to implement a computer-implemented process such that instructions 30, when executed by computer 24, can cause the functions/acts specified in the methods described herein to be executed. Data processor 26 can include, for example, any type of general-purpose microprocessor or microcontroller, a digital signal processing (DSP) processor, an integrated circuit, a field programmable gate array (FPGA), a reconfigurable processor, other suitably programmed or programmable logic circuits, or any combination thereof.
Memory 28 can include any suitable known or other machine-readable storage medium. Memory 28 can include non-transitory computer readable storage medium such as, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. Memory 28 can include a suitable combination of any type of computer memory that is located either internally or externally to computer 24. Memory 28 can include any storage means (e.g. devices) suitable for retrievably storing machine-readable instructions 30 executable by data processor 26.
Various aspects of the present disclosure can be embodied as apparatus, devices, methods and/or computer program products. Accordingly, aspects of the present disclosure can take the form of an entirely hardware embodiment, an entirely software embodiment (including firmware, resident software, micro-code, etc.) or an embodiment combining software and hardware aspects. Furthermore, aspects of the present disclosure can take the form of a computer program product embodied in one or more non-transitory computer readable media (e.g., memory 28) having computer readable program code (e.g., instructions 30) embodied thereon. The computer program product can, for example, be executed by computer 24 to cause the execution of one or more methods disclosed herein in entirety or in part.
Instructions 30 (e.g., computer program code) for carrying out operations for aspects of the present disclosure can be written in any combination of one or more programming languages, including an object oriented programming language and conventional procedural programming languages. Such program code can be executed entirely or in part by computer 24 or other data processing device(s). It is understood that, based on the present disclosure, one skilled in the relevant arts could readily write computer program code for implementing the methods disclosed herein.
Surface 32 can be at least partially transparent so as to not significantly obstruct the view of other objects shown on display device 14. For example, surface 32 can be displayed in the form of an additional layer of information that is superimposed on other objects. In some embodiments, surface 32 can have the form of a wireframe representing a mesh of cells 44A, 44B on surface 32. The outlines of cells 44A, 44B can be represented by lines and the areas within cells 44A, 44B can be substantially transparent to limit the visual obstruction of other objects shown within the same display area. Cells 44A, 44B can be hexagons or have another suitable shape (e.g., square, rectangular, octagonal). In some embodiments, hexagonal cells 44A, 44B can be more distinctive and therefore preferred to avoid potential confusion with other objects shown simultaneously by the same display device 14.
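The wireframe of hexagonal cells described above can be generated from simple vertex geometry. The following Python sketch is illustrative only (the function name and flat-top orientation are assumptions, not details from the disclosure); it produces the six outline vertices of one cell, which a renderer would connect with lines while leaving the interior transparent:

```python
import math

def hex_cell_outline(cx, cy, radius):
    """Return the six (x, y) vertices of a flat-top hexagonal cell
    of the wireframe mesh, centred at (cx, cy)."""
    return [(cx + radius * math.cos(math.radians(60 * i)),
             cy + radius * math.sin(math.radians(60 * i)))
            for i in range(6)]
```

A full mesh would tile such outlines across the displayed surface; only the outlines are drawn, so objects behind the surface remain visible through the cell interiors.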
The scenario depicted in
Trajectory 38 can be a projected flight path ahead of aircraft 10 indicating the expected flight path of aircraft 10 for a period of time in the future. Trajectory 38 may or may not be displayed together with surface 32 on display device 14. Nevertheless, information indicative of trajectory 38 can be used to calculate an intersection point between trajectory 38 (e.g., a line) and surface 32 (e.g., a plane). Such intersection point can be used to graphically indicate a location on surface 32 at which aircraft 10 is expected to penetrate the flight boundary represented by surface 32. Such intersection point can be indicated on surface 32 using a symbol, arrow, marker or other graphical object. In embodiments where a mesh of cells 44A, 44B is shown on surface 32, the intersection point of trajectory 38 and surface 32 can be used to identify cell 44A in which the intersection point lies. The identified cell 44A can then be visually distinguished from other cells 44B in the mesh of cells in order to indicate which cell 44A (e.g., penetration zone/window) on surface 32 aircraft 10 is expected to pass through. Visually distinguishing cell 44A from other cells 44B can be done by highlighting, changing the color, changing the line width and/or changing the line type of the lines used to outline the identified cell 44A for example. Alternatively, visually distinguishing cell 44A from other cells 44B can be done by applying a translucent shading in the area of the identified cell 44A while keeping the other cells 44B transparent. In some embodiments, a shadow of the wireframe displayed on surface 32 can be displayed on other objects such as airfield 40 and runway 42 in order to enhance the perception of the spacing between surface 32 and such other objects.
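The intersection computation described above reduces, in the simplest case of a straight-line trajectory and a horizontal boundary plane, to a one-line ray-plane calculation followed by a cell lookup. This Python sketch is a hedged illustration, not the disclosed implementation; it uses square cells for the lookup purely to keep the indexing simple, where the display could use hexagons instead:

```python
import math

def boundary_intersection(position, velocity, boundary_alt):
    """Return the (x, y) point where a straight-line trajectory
    crosses a horizontal boundary plane, or None if it never does."""
    x, y, z = position
    vx, vy, vz = velocity
    if vz >= 0 or z <= boundary_alt:
        return None  # level/climbing, or already at/below the boundary
    t = (boundary_alt - z) / vz  # time until the plane is reached
    return (x + vx * t, y + vy * t)

def cell_index(point, cell_size):
    """Identify which mesh cell contains the intersection point,
    so that cell can be visually distinguished from the others."""
    px, py = point
    return (math.floor(px / cell_size), math.floor(py / cell_size))
```

The returned cell index corresponds to the "penetration zone/window" (cell 44A) that would be highlighted on surface 32.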
In some embodiments, surface(s) 32 can be displayed on display device 14 only when the corresponding flight boundary(ies) are relevant to aircraft 10 and hence important to the pilot. For example, the determination of whether or not surface(s) 32 should be displayed can be made based on the proximity of aircraft 10 to the corresponding flight boundary(ies). Accordingly, instructions 30 can be configured to cause computer 24 to use data indicative of the current location of aircraft 10 and data indicative of the location of the flight boundary to determine whether the distance between aircraft 10 and the flight boundary is within a predetermined distance (i.e., a relevancy distance threshold) that indicates that the flight boundary is relevant to aircraft 10 at the current time. Similarly, instructions 30 can be configured to cause computer 24 to determine when the one or more flight boundaries are no longer relevant (e.g., no longer within the relevancy distance threshold) and should be removed from display device 14. Different types of flight boundaries can have different relevancy distance thresholds associated with them. In some embodiments, locations, types and corresponding relevancy distance thresholds could be stored in a suitable look-up table in memory 28 or otherwise be accessible to computer 24. Computer 24 can be configured via instructions 30 to automatically determine when one or more flight boundaries become relevant or irrelevant and cause surface(s) 32 to be displayed or not displayed on display device 14.
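The relevancy decision described above, with per-type thresholds held in a look-up table, can be sketched as follows in Python. The boundary type names and threshold values are hypothetical placeholders, not values from the disclosure:

```python
# Relevancy distance thresholds per boundary type
# (hypothetical values, in metres).
RELEVANCY_THRESHOLDS = {
    "decision_height": 300.0,
    "restricted_airspace": 5000.0,
    "terrain_floor": 600.0,
}

def boundary_is_relevant(distance_to_boundary, boundary_type):
    """A boundary's surface is displayed only while the aircraft is
    within the relevancy distance threshold for that boundary type;
    once outside it, the surface is removed from the display."""
    threshold = RELEVANCY_THRESHOLDS.get(boundary_type)
    return threshold is not None and distance_to_boundary <= threshold
```

Re-evaluating this predicate as the aircraft's location updates gives the automatic show/hide behavior described in the paragraph above.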
The presence of surface 32 displayed within the same scene as airfield 40, runway 42 and/or other objects can provide a clear and intuitive representation of the flight boundary and enhance situational awareness for the pilot. For example, surface 32 can graphically represent the notion of a floor, ceiling or wall of which the pilot should be aware. In some situations, the display of surface 32 can reduce the risk of a pilot inadvertently descending below the DH, DA or MDA, especially in single-pilot operation of aircraft 10. In some embodiments, surface 32 can make it obvious to the pilot(s) that they are about to cross an altitude/height below which they cannot descend without first having a visual reference of runway 42.
In reference to
The change in appearance of surface 32 as aircraft 10 approaches the DH can provide a sense of anticipation and closure, and thereby contributes toward enhancing the situational awareness of the approaching flight boundary by requiring less of the pilot's attention compared to existing systems. In various embodiments, the presentation of surface 32 can be used instead of or in addition to any other alerting method(s) associated with the applicable flight boundary. In some embodiments, the change of appearance of surface 32 can be a change in the scale of the mesh of cells 44A, 44B. For example, a size of the cells in the mesh of cells could be defined as a function of the distance between aircraft 10 and the DH. In some embodiments, computer 24 can, via instructions 30, be configured to cause the size of at least some cells 44A, 44B to change (e.g., dynamically and in substantially real-time) as the distance between aircraft 10 and the DH changes. For example, the size of cells 44A, 44B could be increased as aircraft 10 gets closer to the applicable flight boundary.
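One simple way to realize the distance-dependent cell scaling described above is a clamped linear interpolation. The sketch below is an assumption-laden illustration (the near/far distances and size limits are invented placeholders, and a real system might use a non-linear mapping):

```python
def cell_size(distance, near=50.0, far=500.0,
              min_size=20.0, max_size=120.0):
    """Grow the mesh cells as the aircraft nears the boundary:
    cells are smallest at or beyond `far` and largest at or
    inside `near`. All parameter values are illustrative."""
    d = max(near, min(far, distance))
    frac = (far - d) / (far - near)  # 0 far away -> 1 close in
    return min_size + frac * (max_size - min_size)
```

Recomputing this each display frame from the current aircraft-to-boundary distance yields the dynamic, substantially real-time variation described in the paragraph above.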
In some embodiments, the appearance of surface 32 can also be varied as a function of the speed of aircraft 10. For example, computer 24 can, via instructions 30, be configured to cause the mesh of cells to be displaced as a function of the displacement of aircraft 10 to provide an indication of speed and displacement of aircraft 10 relative to the flight boundary. In some embodiments, an aspect ratio of W/H (see
The use of surface(s) 32 can facilitate the operation of aircraft 10 during required navigation performance (RNP) approaches which can require specific, sometimes non-straight corridors and confinement zones. Accordingly, such flight boundaries could be graphically represented at the pilot's discretion to improve overall safety of RNP approaches and also potentially improve RNP approach performance when an auto-pilot system of aircraft 10 is not engaged.
Method 100 can include causing display device 14 to simultaneously display an attitude indicator and surface 32 representing the flight boundary. Surface 32 can be planar or non-planar (e.g., curved). Surface 32 can be at least partially transparent.
Flight boundary can be a lateral or vertical flight boundary.
Surface 32 can have a mesh of cells 44A, 44B displayed thereon. Using data indicative of trajectory 38 of aircraft 10, cell 44A through which aircraft 10 is expected to penetrate the flight boundary can be identified. The identified cell 44A can be visually distinguished from one or more other cells 44B in the mesh of cells.
An appearance of surface 32 can be varied as the distance between aircraft 10 and the flight boundary varies to provide height perception. For example, data indicative of the current location of aircraft 10 relative to the flight boundary can be used to cause a size of cells 44A, 44B to be defined as a function of a distance between aircraft 10 and the flight boundary. Method 100 can include dynamically varying the size of cells 44A, 44B as the distance between aircraft 10 and the flight boundary varies.
An appearance of surface 32 can be dynamically varied as the speed of aircraft 10 varies to provide some speed perception. Method 100 can include defining an aspect ratio of cells 44A, 44B as a function of the current location of aircraft 10. At least some cells 44A, 44B can be hexagons.
The flight boundary represented by surface 32 can be a DH, a DA or a MDA associated with an approach conducted under instrument flight rules. The flight boundary represented by surface 32 can be a virtual terrain protection floor disposed above a ground by a predetermined distance D. Data indicative of terrain can be used to cause surface 32 to depict a change in elevation indicative of a corresponding change in elevation in the terrain. The flight boundary can demarcate restricted airspace.
The above description is meant to be exemplary only, and one skilled in the relevant arts will recognize that changes may be made to the embodiments described without departing from the scope of the invention disclosed. The present disclosure may be embodied in other specific forms without departing from the subject matter of the claims. The present disclosure is intended to cover and embrace all suitable changes in technology. Modifications which fall within the scope of the present invention will be apparent to those skilled in the art, in light of a review of this disclosure, and such modifications are intended to fall within the appended claims. Also, the scope of the claims should not be limited by the preferred embodiments set forth in the examples, but should be given the broadest interpretation consistent with the description as a whole.
Lebegue, Olivier, Ouellette, Benoit
| Executed on | Assignor | Assignee | Conveyance | Reel/Frame |
| Apr 02 2019 | LEBEGUE, OLIVIER | Bombardier Inc. | Assignment of assignors interest (see document for details) | 052260/0836 |
| Apr 29 2019 | OUELLETTE, BENOIT | Bombardier Inc. | Assignment of assignors interest (see document for details) | 052260/0836 |
| Mar 30 2020 | Bombardier Inc. (assignment on the face of the patent) | | | |