In one aspect, a system for monitoring sensor performance on an agricultural machine may include a controller configured to receive a plurality of images from a vision-based sensor mounted on the agricultural machine. The controller may be configured to determine an image parameter value associated with each of a plurality of pixels contained within each of the plurality of images. For each respective pixel of the plurality of pixels, the controller may be configured to determine a variance associated with the image parameter values for the respective pixel across the plurality of images. Furthermore, when the variance associated with the image parameter values for a given pixel of the plurality of pixels falls outside of a predetermined range, the controller may be configured to identify the given pixel as being at least one of obscured or inoperative.
1. A system for monitoring sensor performance on an agricultural machine, the system comprising:
an agricultural machine;
a vision-based sensor mounted on the agricultural machine, the vision-based sensor being configured to capture first and second images; and
a controller communicatively coupled to the vision-based sensor, the controller being configured to:
receive the first and second images from the vision-based sensor;
determine an image parameter value associated with each of a plurality of pixels contained within each of the first and second images;
for each respective pixel of the plurality of pixels, determine a variance between the image parameter value of the respective pixel in the first image and the image parameter value of the respective pixel in the second image; and
when the variance associated with the image parameter values for a given pixel of the plurality of pixels falls below a threshold value, identify the given pixel as being at least one of obscured or inoperative.
2. The system of
3. The system of
4. The system of
5. The system of
6. The system of
7. The system of
aggregate the individual pixels of the plurality of pixels that have been identified as being obscured or inoperative into one or more pixel groups; and
identify the vision-based sensor as being obscured or inoperative when a density of the individual pixels within the one or more pixel groups exceeds a predetermined threshold density value.
8. The system of
9. The system of
10. The system of
11. The system of
12. The system of
13. A method for monitoring sensor performance on an agricultural machine, the method comprising:
receiving, with a computing device, first and second images from a vision-based sensor mounted on the agricultural machine;
determining, with the computing device, an image parameter value associated with each of a plurality of pixels contained within each of the first and second images;
for each respective pixel of the plurality of pixels, determining, with the computing device, a variance between the image parameter value of the respective pixel in the first image and the image parameter value of the respective pixel in the second image;
when the variance associated with the image parameter values for a given pixel of the plurality of pixels falls below a threshold value, identifying, with the computing device, the given pixel as being at least one of obscured or inoperative;
determining, with the computing device, when the vision-based sensor is obscured or inoperative based on a number or a density of individual pixels of the plurality of pixels that have been identified as being obscured or inoperative; and
initiating, with the computing device, a control action when it is determined that the vision-based sensor is obscured or inoperative.
14. The method of
15. The method of
16. The method of
17. The method of
identifying, with the computing device, the given pixel as being obscured or inoperative when an intensity of the light detected by the given pixel falls below a predetermined intensity threshold.
18. The method of
19. The method of
aggregating, with the computing device, the individual pixels of the plurality of pixels that have been identified as being obscured or inoperative into one or more pixel groups; and
identifying, with the computing device, the vision-based sensor as being obscured or inoperative when a density of the individual pixels within the one or more pixel groups exceeds a predetermined threshold density value.
20. The system of
The present disclosure generally relates to agricultural machines and, more particularly, to systems and methods for monitoring sensor performance, such as vision-based sensor performance, on an agricultural machine.
Agricultural implements, such as cultivators, disc harrows, seeders, and/or the like, perform one or more agricultural operations while being towed across a field by a suitable work vehicle, such as an agricultural tractor. In this regard, agricultural implements often include one or more sensors mounted thereon to monitor various parameters associated with the performance of such agricultural operations. For example, some agricultural implements include one or more cameras or other vision-based sensors that capture images of the soil and/or plants within the field. Thereafter, such images may be processed or analyzed to determine one or more parameters associated with the condition of the soil and/or plants, such as parameters related to soil roughness, plant health, weed growth, and/or the like.
During the performance of many agricultural operations, the implement typically generates large amounts of dust and other airborne particulate matter. In this regard, dust may adhere to the lens(es) of the camera(s) mounted on the implement in such a manner that one or more pixels of the camera(s) are obscured or otherwise blocked from receiving light. Furthermore, large amounts of dust present within the field(s) of view of the camera(s) may also obscure various pixels of the camera(s). Image data captured by cameras having obscured pixels may be of low quality, thereby resulting in poor camera performance.
Accordingly, an improved system and method for monitoring sensor performance on an agricultural machine would be welcomed in the technology.
Aspects and advantages of the technology will be set forth in part in the following description, or may be obvious from the description, or may be learned through practice of the technology.
In one aspect, the present subject matter is directed to a system for monitoring sensor performance on an agricultural machine. The system may include an agricultural machine and a vision-based sensor mounted on the agricultural machine, with the vision-based sensor being configured to capture a plurality of images. The system may also include a controller communicatively coupled to the vision-based sensor. The controller may be configured to receive the plurality of images from the vision-based sensor and determine an image parameter value associated with each of a plurality of pixels contained within each of the plurality of images. For each respective pixel of the plurality of pixels, the controller may also be configured to determine a variance associated with the image parameter values for the respective pixel across the plurality of images. Furthermore, when the variance associated with the image parameter values for a given pixel of the plurality of pixels falls outside of a predetermined range, the controller may also be configured to identify the given pixel as being at least one of obscured or inoperative.
In another aspect, the present subject matter is directed to a method for monitoring sensor performance on an agricultural machine. The method may include receiving, with a computing device, a plurality of images from a vision-based sensor mounted on the agricultural machine. The method may also include determining, with the computing device, an image parameter value associated with each of a plurality of pixels contained within each of the plurality of images. Moreover, the method may include, for each respective pixel of the plurality of pixels, determining, with the computing device, a variance associated with the image parameter values for the respective pixel across the plurality of images. Furthermore, the method may include, when the variance associated with the image parameter values for a given pixel of the plurality of pixels falls outside of a predetermined range, identifying, with the computing device, the given pixel as being at least one of obscured or inoperative. The method may further include determining, with the computing device, when the vision-based sensor is obscured or inoperative based on a number or a density of individual pixels of the plurality of pixels that have been identified as being obscured or inoperative. Additionally, the method may include initiating, with the computing device, a control action when it is determined that the vision-based sensor is obscured or inoperative.
These and other features, aspects and advantages of the present technology will become better understood with reference to the following description and appended claims. The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments of the technology and, together with the description, serve to explain the principles of the technology.
A full and enabling disclosure of the present technology, including the best mode thereof, directed to one of ordinary skill in the art, is set forth in the specification, which makes reference to the appended figures.
Repeat use of reference characters in the present specification and drawings is intended to represent the same or analogous features or elements of the present technology.
Reference now will be made in detail to embodiments of the invention, one or more examples of which are illustrated in the drawings. Each example is provided by way of explanation of the invention, not limitation of the invention. In fact, it will be apparent to those skilled in the art that various modifications and variations can be made in the present invention without departing from the scope or spirit of the invention. For instance, features illustrated or described as part of one embodiment can be used with another embodiment to yield a still further embodiment. Thus, it is intended that the present invention covers such modifications and variations as come within the scope of the appended claims and their equivalents.
In general, the present subject matter is directed to systems and methods for monitoring sensor performance on an agricultural machine. More particularly, a controller of the disclosed system may be configured to determine when one or more pixels of a vision-based sensor (e.g., a camera) mounted on the agricultural machine are obscured or otherwise effectively inoperative. Specifically, in several embodiments, the controller may be configured to receive a plurality of images from the vision-based sensor. The controller may then be configured to determine an image parameter value associated with each of a plurality of pixels contained within each of the received images. For example, in one embodiment, the controller may be configured to determine an intensity value associated with each pixel contained within each of the received images. Thereafter, the controller may be configured to determine a variance associated with the image parameter values for each respective pixel across the received images. For instance, in one embodiment, the controller may be configured to determine a differential or range between the determined intensity values for each respective pixel across the received images. When the variance associated with the image parameter values for a given pixel falls outside of a predetermined range, the controller may be configured to identify the given pixel as obscured or inoperative.
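By way of illustration, the core pixel-monitoring logic described above may be sketched in a few lines of Python. This is a minimal sketch only, assuming grayscale frames and NumPy; the function name, frame data, and threshold values are illustrative rather than drawn from the disclosed system.

```python
import numpy as np

def find_suspect_pixels(frames, var_min, var_max):
    """Flag pixels whose intensity variance across a stack of frames
    falls outside the predetermined range [var_min, var_max]."""
    stack = np.stack([np.asarray(f, dtype=np.float64) for f in frames])
    variance = stack.var(axis=0)  # per-pixel variance across the frames
    return (variance < var_min) | (variance > var_max)

# As the machine moves, a healthy pixel's intensity should vary between
# frames; near-zero variance suggests an obscured or dead pixel.
frames = [np.random.randint(0, 256, (480, 640)) for _ in range(10)]
mask = find_suspect_pixels(frames, var_min=5.0, var_max=1e4)
print(mask.sum(), "suspect pixels")
```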
In general, the agricultural machine 10 may include a work vehicle 12, such as an agricultural tractor having an engine 23 and a transmission 25, and an associated implement 14 towed by the work vehicle 12 across a field.
Furthermore, in accordance with aspects of the present subject matter, the agricultural machine 10 may include one or more vision-based sensors 104 coupled thereto and/or supported thereon. As will be described below, each vision-based sensor 104 may be configured to capture image data and/or other vision-based data from the field (e.g., of the soil present within the field) across which the implement 14 is moved. Specifically, in several embodiments, the vision-based sensor(s) 104 may be provided in operative association with the work vehicle 12 and/or the implement 14 such that the vision-based sensor(s) 104 has a field of view or sensor detection range directed towards a portion(s) of the field adjacent to the work vehicle 12 and/or the implement 14.
Moreover, it should be appreciated that the vision-based sensor(s) 104 may correspond to any suitable sensing device(s) configured to detect or capture image data or other vision-based data (e.g., point cloud data) associated with the soil present within an associated field of view. For example, in several embodiments, the vision-based sensor(s) 104 may correspond to a suitable camera(s) configured to capture images of the field, such as three-dimensional images of the soil surface or the plants present within the associated field of view. For instance, in a particular embodiment, the vision-based sensor(s) 104 may correspond to a stereographic camera(s) having two or more lenses with a separate image sensor for each lens to allow the camera(s) to capture stereographic or three-dimensional images. However, in alternative embodiments, the vision-based sensor(s) 104 may correspond to a Light Detection and Ranging (LIDAR) sensor(s) or any other suitable vision-based sensing device(s).
Additionally, it should be further appreciated that the configurations of the agricultural machine 10 described above and shown in the accompanying figures are provided only to place the present subject matter in an exemplary field of use.
In several embodiments, the system 100 may include one or more components of the agricultural machine 10 described above, such as the vision-based sensor(s) 104. Each vision-based sensor 104 may include a lens(es) 106 and an associated image sensor 108. Moreover, in one embodiment, each vision-based sensor 104 may be provided in operative association with a cleaning system 110 having an actuator 118 configured to remove particulates from the associated lens(es) 106.
Moreover, the system 100 may further include a controller 122 configured to electronically control the operation of one or more components of the agricultural machine 10, such as one or more components of the work vehicle 12 and/or the implement 14. In general, the controller 122 may comprise any suitable processor-based device known in the art, such as a computing device or any suitable combination of computing devices. Thus, in several embodiments, the controller 122 may include one or more processor(s) 124 and associated memory device(s) 126 configured to perform a variety of computer-implemented functions. As used herein, the term “processor” refers not only to integrated circuits referred to in the art as being included in a computer, but also refers to a controller, a microcontroller, a microcomputer, a programmable logic controller (PLC), an application specific integrated circuit, and other programmable circuits. Additionally, the memory device(s) 126 of the controller 122 may generally comprise memory element(s) including, but not limited to, a computer readable medium (e.g., random access memory (RAM)), a computer readable non-volatile medium (e.g., a flash memory), a floppy disk, a compact disc-read only memory (CD-ROM), a magneto-optical disk (MOD), a digital versatile disc (DVD) and/or other suitable memory elements. Such memory device(s) 126 may generally be configured to store suitable computer-readable instructions that, when implemented by the processor(s) 124, configure the controller 122 to perform various computer-implemented functions, such as one or more aspects of the method 200 described below.
It should be appreciated that the controller 122 may correspond to an existing controller of the work vehicle 12 or the implement 14, or the controller 122 may correspond to a separate processing device. For instance, in one embodiment, the controller 122 may form all or part of a separate plug-in module that may be installed within the work vehicle 12 or the implement 14 to allow for the disclosed system and method to be implemented without requiring additional software to be uploaded onto existing control devices of the work vehicle 12 or the implement 14.
Furthermore, in one embodiment, the system 100 may also include a user interface 102. Specifically, the user interface 102 may be communicatively coupled to the controller 122 via a wired or wireless connection to allow feedback signals (e.g., as indicated by dashed line 144) to be transmitted from the controller 122 to the user interface 102.
In several embodiments, the controller 122 may be configured to determine an image parameter value for a plurality of images received from the vision-based sensor(s) 104. Specifically, the controller 122 may be communicatively coupled to the vision-based sensor(s) 104 via a wired or wireless connection to allow image data (e.g., as indicated by dashed line 128) to be transmitted from the vision-based sensor(s) 104 to the controller 122.
Upon receipt of the image data, the controller 122 may be configured to determine an image parameter value associated with each of a plurality of pixels contained within each of the received images. For example, in one embodiment, the image parameter value may correspond to an intensity value associated with the light detected by each pixel of the image sensor 108. Alternatively, the image parameter value may correspond to a color value or any other suitable image-related parameter.
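Where the vision-based sensor(s) 104 outputs color images, a per-pixel intensity value may be derived from the color channels. The sketch below assumes channel-last RGB data and uses the common Rec. 601 luma weighting; any other weighting, or a raw channel value, could equally serve as the image parameter.

```python
import numpy as np

def intensity_values(rgb_image):
    """Per-pixel intensity (luminance) map for an RGB frame, using the
    Rec. 601 weighting of the red, green, and blue channels."""
    rgb = np.asarray(rgb_image, dtype=np.float64)
    return 0.299 * rgb[..., 0] + 0.587 * rgb[..., 1] + 0.114 * rgb[..., 2]
```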
Thereafter, for each respective pixel, the controller 122 may be configured to determine a variance associated with the determined image parameter values (e.g., light intensity) for such respective pixel across the received images. Specifically, in several embodiments, the controller 122 may be configured to calculate the variance in the determined image parameter values for each analyzed pixel across the received images. Such variance may correspond to a differential defined between the image parameter values, the standard deviation of the image parameter values, the range of the image parameter values, and/or any other suitable statistical parameter associated with the variance of the image parameter values. Furthermore, in one embodiment, when determining the variance, the controller 122 may be configured to assign weights to the determined image parameter values, such as based on the time when the images were captured by the vision-based sensor(s) 104. In such an embodiment, the assigned weights may impact the effect each image parameter value has on the variance calculation. For example, the controller 122 may be configured to assign greater weights to the image parameter values associated with more recently captured images and lower weights to the image parameter values associated with older images. In this regard, the controller 122 may include a suitable mathematical formula(s) stored within its memory 126 for calculating or otherwise determining the variance based on the determined image parameter values.
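A recency-weighted variance of the kind described above might be computed as follows. The sketch assumes frames ordered oldest first and a geometric decay of the weights; the decay factor is an arbitrary placeholder.

```python
import numpy as np

def weighted_pixel_variance(stack, decay=0.8):
    """Per-pixel variance across frames of shape (num_frames, H, W),
    weighting recently captured frames more heavily than older ones."""
    n = stack.shape[0]
    weights = decay ** np.arange(n - 1, -1, -1)  # newest frame gets weight 1
    mean = np.average(stack, axis=0, weights=weights)
    return np.average((stack - mean) ** 2, axis=0, weights=weights)
```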
Furthermore, the controller 122 may be configured to identify when a given pixel is obscured or inoperative. Specifically, as the agricultural machine 10 is moved across the field, various objects (e.g., plants, residue, soil, and/or the like) within the field move into and subsequently out of the field(s) of view of the vision-based sensor(s) 104. That is, the images captured by the vision-based sensor(s) 104 may generally change as the agricultural machine 10 is moved across the field. As such, the determined image parameter values (e.g., intensity, color, and/or the like) may vary for each pixel across the received images. In this regard, little or no variance in the determined image parameter values for a given pixel may generally be indicative of the given pixel being obscured or inoperative. For example, dust or another particulate may be present on the lens(es) 106 of the vision-based sensor(s) 104 that obscures one or more pixels of the associated image sensor 108. In general, a pixel may be obscured when a translucent or opaque particulate (e.g., dust, dirt, plant matter, and/or the like) reduces the intensity of the light sensed by that pixel of the image sensor 108. As such, a pixel may be obscured when the particulate partially or entirely blocks the light. Furthermore, an inoperative or “dead” pixel may generally be a pixel on the image sensor 108 that is unable to detect light, such as due to a failure of that pixel. Accordingly, in several embodiments, the controller 122 may be configured to compare the determined variance associated with each analyzed pixel to a predetermined variance range. In the event that the determined variance for a given pixel falls outside of the predetermined variance range, the controller 122 may be configured to identify the given pixel as being obscured or inoperative. It should be appreciated that, in alternative embodiments, the controller 122 may be configured to compare the determined variance associated with each analyzed pixel to only one of a predetermined maximum variance threshold or a predetermined minimum variance threshold.
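The comparison step may support either the predetermined-range check or the single-threshold alternative noted above. In the sketch below, either bound may be omitted; the function name is illustrative.

```python
import numpy as np

def flag_pixels(variance, var_min=None, var_max=None):
    """Flag pixels whose variance violates whichever bounds are given:
    both bounds reproduce the range check, one bound the threshold check."""
    mask = np.zeros(variance.shape, dtype=bool)
    if var_min is not None:
        mask |= variance < var_min
    if var_max is not None:
        mask |= variance > var_max
    return mask
```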
Additionally, in one embodiment, the controller 122 may be configured to identify a given pixel as being obscured or inoperative based on the intensity value of the given pixel. In certain instances (e.g., during daytime operations), when the image sensor(s) 108 of the vision-based sensor(s) 104 are properly functioning, the pixels contained within the captured images generally exhibit at least a minimum level of intensity. As such, a given pixel may generally be obscured or inoperative when the intensity of the given pixel falls below a certain intensity threshold. In this regard, the controller 122 may be configured to compare the determined intensity values for each analyzed pixel within the received images to a predetermined intensity threshold. In the event that the determined intensity for a given pixel falls below the predetermined intensity threshold, the controller 122 may be configured to identify the given pixel as being obscured or inoperative.
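This secondary intensity check might look like the following, again assuming a stack of grayscale frames; the minimum intensity value is a placeholder.

```python
import numpy as np

def flag_dark_pixels(stack, intensity_min=10.0):
    """Flag pixels whose mean intensity across the frames stays below a
    minimum threshold, a further sign of an obscured or dead pixel."""
    return stack.mean(axis=0) < intensity_min
```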
In several embodiments, the controller 122 may be configured to determine when the entire vision-based sensor(s) 104 is effectively obscured or inoperative. Specifically, when a sufficient number of pixels of the associated image sensor 108 of a vision-based sensor 104 are determined to be obscured or inoperative, the images captured by such vision-based sensor 104 may be of low quality. In such instances, the vision-based sensor 104 as a whole may be considered obscured or inoperative. For example, in one embodiment, the controller 122 may be configured to compare the number of individual pixels that have been identified as being obscured or inoperative to a predetermined threshold amount. In the event that the total number of obscured or inoperative pixels exceeds the predetermined threshold amount, the controller 122 may be configured to identify the associated vision-based sensor 104 as being obscured or inoperative.
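The count-based sensor-level decision then reduces to a single comparison; the threshold shown is illustrative.

```python
def sensor_obscured_by_count(suspect_mask, max_bad_pixels=500):
    """Declare the sensor as a whole obscured or inoperative when the
    total number of flagged pixels exceeds a predetermined amount."""
    return int(suspect_mask.sum()) > max_bad_pixels
```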
In another embodiment, the controller 122 may be configured to determine when the vision-based sensor(s) 104 is effectively obscured or inoperative based on the density of the identified obscured or inoperative pixels. In certain instances, although the total number of obscured or inoperative pixels may be low, such obscured or inoperative pixels may be clustered or grouped together in a manner that obscures a large enough portion of the captured image such that the overall image quality is low. In such an embodiment, the controller 122 may be configured to aggregate or group the individual pixels that have been identified as being obscured or inoperative into one or more pixel groups. For example, such pixel groups may correspond to clusters of obscured or inoperative pixels located within particular regions of the captured images. Thereafter, the controller 122 may be configured to determine the density of the pixels (i.e., the number of obscured or inoperative pixels per unit of area) within each pixel group and compare the determined density to a predetermined threshold density value. In the event that the determined density of one or more of the pixel groups exceeds the predetermined threshold density value, the controller 122 may be configured to identify the associated vision-based sensor 104 as being effectively obscured or inoperative.
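One simple way to approximate the density check is to partition the pixel mask into fixed-size tiles and compare each tile's flagged-pixel fraction against a limit. This grid-based sketch merely stands in for whatever clustering approach the controller 122 employs; the tile size and density limit are placeholders.

```python
import numpy as np

def sensor_obscured_by_density(suspect_mask, tile=32, max_density=0.5):
    """Declare the sensor obscured when any tile-sized region of the image
    contains too high a fraction of flagged pixels, so that even a small
    number of bad pixels matters if concentrated in one region."""
    h, w = suspect_mask.shape
    th, tw = h // tile, w // tile
    trimmed = suspect_mask[:th * tile, :tw * tile]   # drop ragged edges
    tiles = trimmed.reshape(th, tile, tw, tile)
    density = tiles.mean(axis=(1, 3))                # fraction flagged per tile
    return bool((density > max_density).any())
```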
When one or more pixels and/or the vision-based sensor(s) 104 as a whole has been identified as being obscured or inoperative, the controller 122 may be configured to initiate one or more control actions.
Moreover, in one embodiment, the controller 122 may be configured to automatically adjust the speed at which the work vehicle 12 is towing the implement 14 across the field when one or more of the vision-based sensors 104 are identified as being obscured or inoperative. Specifically, the controller 122 may be communicatively coupled to the engine 23 and/or the transmission 25 of the work vehicle 12 via a wired or wireless connection to allow control signals (e.g., as indicated by dashed lines 146, 148) to be transmitted from the controller 122 to the engine 23 and/or the transmission 25, thereby allowing the ground speed of the work vehicle 12 to be adjusted.
Additionally, in another embodiment, the controller 122 may be configured to automatically initiate a cleaning operation of the vision-based sensor(s) 104 that has been identified as obscured or inoperative. Specifically, the controller 122 may be communicatively coupled to the cleaning system(s) 110 (e.g., the associated actuator 118) of the vision-based sensor(s) 104 via a wired or wireless connection to allow control signals (e.g., as indicated by dashed line 150) to be transmitted from the controller 122 to the cleaning system(s) 110. Such control signals may be configured to activate the associated actuator 118 so as to remove particulates from the lens(es) 106 of the obscured or inoperative vision-based sensor(s) 104.
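Tying the control actions together, a dispatch routine might resemble the sketch below. Every interface used here (the speed-reduction request and the cleaning-system activation) is a hypothetical stand-in for the machine-specific control signals described above.

```python
def on_sensor_obscured(sensor_id, vehicle, cleaning_systems):
    """Example control actions once a vision-based sensor is deemed
    obscured or inoperative: slow the vehicle and clean the lens.
    `vehicle` and `cleaning_systems` are hypothetical interfaces."""
    vehicle.request_speed_reduction(factor=0.5)  # hypothetical engine/transmission command
    cleaning_systems[sensor_id].activate()       # hypothetical actuator trigger
```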
The present subject matter is also directed to a method 200 for monitoring sensor performance on an agricultural machine. In general, the method 200 will be described herein with reference to the agricultural machine 10 and the system 100 described above. However, it should be appreciated that the disclosed method 200 may generally be implemented with any agricultural machine having any suitable machine configuration and/or within any system having any suitable system configuration.
At (202), the method 200 may include receiving, with a computing device, a plurality of images from a vision-based sensor mounted on an agricultural machine. For instance, as described above, the controller 122 may be configured to receive images captured by the vision-based sensor(s) 104 mounted on the work vehicle 12 and/or the implement 14.
Additionally, at (204), the method 200 may include determining, with the computing device, an image parameter value associated with each of a plurality of pixels contained within each of the images received from the vision-based sensor. For instance, as described above, the controller 122 may be configured to determine an image parameter value (e.g., a value associated with light intensity, color, etc.) associated with at least a portion of the pixels contained within each of the received images.
Moreover, at (206), the method 200 may include, for each respective pixel of the plurality of pixels, determining, with the computing device, a variance associated with the image parameter values for the respective pixel across the received images. For instance, as described above, the controller 122 may be configured to calculate a differential, a standard deviation, a range, and/or another suitable statistical parameter associated with the variance of the determined image parameter values for each analyzed pixel.
Furthermore, at (208), the method 200 may include, when the variance associated with the image parameter values for a given pixel of the plurality of pixels is outside of a predetermined range, identifying, with the computing device, the given pixel as being at least one of obscured or inoperative. For instance, as described above, the controller 122 may be configured to identify a given pixel contained within the received images as being obscured or inoperative when the determined variance of the image parameter values for the given pixel falls outside of the predetermined variance range.
At (210), the method 200 may include determining, with the computing device, when the vision-based sensor is obscured or inoperative based on a number or a density of individual pixels of the plurality of pixels that have been identified as being obscured or inoperative. For instance, as described above, the controller 122 may be configured to compare the total number of obscured or inoperative pixels to a predetermined threshold amount and/or compare the density of such pixels within one or more pixel groups to a predetermined threshold density value.
Additionally, at (212), the method 200 may include initiating, with the computing device, a control action when it is determined that the vision-based sensor is obscured or inoperative. As described above, such control actions may include controlling one or more components of the implement 14 and/or the work vehicle 12. For instance, as indicated above, the controller 122 may be configured to automatically initiate a control action that results in the ground speed of the implement 14 and/or the work vehicle 12 being adjusted, such as by automatically controlling the operation of the vehicle's engine 23 and/or transmission 25. Moreover, the controller 122 may also be configured to automatically control the operation of a cleaning system(s) 110 of the vision-based sensor(s) 104 to remove particulates from the associated lens(es) 106.
This written description uses examples to disclose the technology, including the best mode, and also to enable any person skilled in the art to practice the technology, including making and using any devices or systems and performing any incorporated methods. The patentable scope of the technology is defined by the claims, and may include other examples that occur to those skilled in the art. Such other examples are intended to be within the scope of the claims if they include structural elements that do not differ from the literal language of the claims, or if they include equivalent structural elements with insubstantial differences from the literal language of the claims.
Henry, James W., Posselius, John H., Ferrari, Luca, Turpin, Bret T., Bybee, Taylor C.