In a method and a computer system for the screening of medical cases, known proven positive and/or known proven normal cases are inserted into the flow of cases during screening. At the user's option, the known cases can be infiltrated into the case stream according to a fixed pattern or on a random basis. When the radiologist misdiagnoses a known case, the system recognizes this and provides a corresponding output message to the radiologist. Further, the system has a database for tracing all user actions, including the diagnostic findings, for the purposes of generating a user action report, quality control and assurance, and support in litigation.
1. A computer system for in-service monitoring of a user screening medical cases comprising:
a case stack of undiagnosed real cases to be reviewed by the user;
a library of known cases;
a user interface component for requesting a consecutive case, for display of the consecutive case, and for entering a diagnosis of the consecutive case;
a program component for receiving a request for the consecutive case from the user interface, the program component selecting the consecutive case from the case stack of real cases and the library of known cases for the display and the diagnosis; and
a feedback component for outputting a message to the user if the user's diagnosis of the selected known case is incorrect.
8. A computer system for in-service monitoring of a user screening medical cases, comprising:
a case stack of undiagnosed real cases to be reviewed by the user;
a library of known cases having verified diagnoses;
a user interface component for requesting a consecutive case, for displaying the consecutive case, and for entering a diagnosis of the consecutive case;
a program component for receiving a request for the consecutive case from the user interface to be displayed and diagnosed, the program component selecting the consecutive case from the case stack of real cases and the library of known cases; and
a feedback component for outputting a message to the user when a threshold of the known cases have been misdiagnosed per a given number of the real cases preceding the misdiagnosis.
2. A computer system as set forth in
3. A computer system as set forth in
4. A computer system as set forth in
5. A computer system as set forth in
6. A computer system as set forth in
7. A computer system as set forth in
10. The computer system according to
11. The computer system according to
This application is related to the following patent applications, filed on the same day as this application and assigned to the same assignee, MeVis Technology GmbH & Co. KG.
The invention relates to a method and apparatus for use in the field of the screening of medical cases, and more particularly to training, quality control and quality assurance for the screening of medical cases. Further, the invention relates to a method and system for in-service monitoring and training for a radiologic workstation.
In a radiologic screening procedure, such as screening mammography, true abnormalities such as cancers are believed to occur at a typical rate of about three to four cases per one thousand patient examinations. Evidently, any misdiagnosis by the radiologist can have drastic consequences for the patient. However, when a large number of cases is screened, it is unavoidable that the radiologist's attention decreases over time.
U.S. Pat. No. 5,917,929 shows a user interface for a computer aided diagnosis (CAD) system. The CAD system serves as an electronic reminder or second reader to assist radiologists in obtaining higher detection rates or higher sensitivity for abnormalities. In addition, the CAD system assists radiologists in reducing the misdiagnosis rate, that is, in lowering the false negative rate. However, the use of a CAD system and a user-friendly interface does not address the problem of lapses in the radiologist's attention, which decreases due to fatigue or other reasons.
U.S. Pat. No. 5,987,345 discloses a method and system for displaying medical images. The system displays an image along with corresponding copies of the same image with computer-aided diagnostic information added. The CAD computer output is used as a second opinion prior to the final decision of the radiologist. Again, the problem of the radiologist's decreasing attention is not addressed here.
It is an object of the present invention to provide a novel method, apparatus and system for screening of medical images.
It is another object of the present invention to provide for improved training, quality assurance and quality control for the screening of medical images.
It is a further object of the present invention to provide a quality report for the screening procedure performed by a radiologist, which can be relied upon by regulatory authorities, health insurance providers, and/or legal authorities in case of litigations.
These and other objects of the invention are achieved by a method, apparatus and system which infiltrates known proven positive and/or known proven normal cases into the sequence of cases to be reviewed, in a fixed or random manner. In brief, the infiltration of known positive cases serves to monitor and control the radiologist's attention. This mode of operation is also referred to as “in-service monitoring” or “radiologist's performance monitoring”.
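The fixed-or-random infiltration decision described above can be sketched as follows. This is a minimal illustration only, not the disclosed implementation; the function name, the `interval` and `rate` parameters, and their defaults are our assumptions.

```python
import random

def should_infiltrate(case_index, mode, interval=10, rate=0.1, rng=random):
    """Decide whether a known case is shown instead of the next real case.

    mode 'fixed'  -> infiltrate at every `interval`-th position (fixed pattern)
    mode 'random' -> infiltrate with probability `rate` (random basis)
    """
    if mode == "fixed":
        return case_index % interval == 0 and case_index > 0
    if mode == "random":
        return rng.random() < rate
    raise ValueError("mode must be 'fixed' or 'random'")
```

A super user could tune `interval` or `rate` to control how strongly the incidence rate is artificially raised.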
Another application of the present invention is for the purposes of training. For this application, the radiologist does not review actual cases; rather, the radiologist reviews known cases having a certain user-definable absolute number, statistical frequency and/or distribution of a variety of medical case categories.
The very low incidence rate of 3 to 4 per 1000 makes screening mammography particularly demanding on the radiologist's concentration. The infiltration of known positive cases is a way to artificially increase the incidence rate of cancers in screening.
In a preferred embodiment, the absolute number or percentage of known cases to be infiltrated is specified by the super user. Further, the categories of known cases to be infiltrated can also be selected, as well as the statistical distribution of the various categories of the known cases to be infiltrated.
The invention is advantageous in that it provides immediate feedback to the radiologist during the screening procedure, so that the radiologist can recognize a decreasing level of concentration and/or fatigue. For example, when one or more misdiagnoses of known cases occur, the system can ask the user to take a break. Alternatively, the system can be disabled for the respective radiologist for a certain predetermined time period to allow appropriate recovery of the radiologist.
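A simple feedback policy of this kind can be sketched as below. The class name and the threshold values are illustrative assumptions, not values from the disclosure.

```python
class AttentionMonitor:
    """Counts misdiagnoses of known cases and escalates the feedback."""

    def __init__(self, break_after=1, lockout_after=3):
        self.misses = 0
        self.break_after = break_after      # misses before a break is suggested
        self.lockout_after = lockout_after  # misses before the system is disabled

    def record(self, diagnosis_correct):
        """Record one diagnosis of a known case and return the action to take."""
        if not diagnosis_correct:
            self.misses += 1
        if self.misses >= self.lockout_after:
            return "lockout"       # disable the workstation for a recovery period
        if self.misses >= self.break_after:
            return "take a break"  # ask the radiologist to pause
        return "ok"
```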
All user actions (e.g. keyboard input, mouse clicks and system tools used) concerning the real cases to be reviewed, as well as the known cases, the diagnoses and the feedback provided to the user, can be traced. Based on this tracing, a user action report can be generated for the purposes of quality monitoring and control. Such a report can even be relied upon for the defence against claims for damages due to a misdiagnosis.
A more complete appreciation of the invention and many of the attendant advantages thereof will be readily understood from the following detailed description of preferred embodiments when considered in connection with the accompanying drawings, wherein:
Referring now to the drawings, in particular
The user interface 2 is coupled to workflow memory 6, which stores a workflow for the screening procedure. The workflow memory 6 contains a case stack 7 for storage of the case numbers to be reviewed in a particular screening procedure to be performed by a particular user. In the example considered here, the case stack 7 contains the cases with ID numbers ranging from i to n. The case stack 7 has a pointer 8 for pointing to the current case that is being reviewed by the user of the computer 1. At the time considered here, the pointer 8 points to the case i, which is the first case of the case stack 7.
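The case stack 7 with its pointer 8 can be modelled roughly as follows. This is a sketch under our own naming; the deferral method reflects the later description of shifting an undiagnosed case to the bottom of the stack.

```python
class CaseStack:
    """Ordered list of real-case IDs with a pointer to the current case."""

    def __init__(self, case_ids):
        self.cases = list(case_ids)
        self.pointer = 0  # points at the case currently under review

    def current(self):
        return self.cases[self.pointer]

    def advance(self):
        """Shift the pointer to the consecutive case; False when exhausted."""
        self.pointer += 1
        return self.pointer < len(self.cases)

    def defer_current(self):
        """Move the current case to the bottom of the stack for later review."""
        self.cases.append(self.cases.pop(self.pointer))
```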
The workflow memory 6 is coupled to the image selection module 9, which contains a program 10 and a pseudo-random number generator 11. Further, the image selection module 9 contains a retrieval program module 12 for coupling to a database 13 and for retrieving image data from the database 13. The database 13 can be implemented on a mass storage of the computer 1 or on an external server computer, such as an external image archive, as is the case in the example of
The database 13 contains a table 14 of cases and the corresponding image data for each of the cases. In the example considered here, the table 14 contains the cases from case 1 to case m, each of the cases having respective images a, b, c, d.
For screening mammography, typically four (or eight) images are taken per case: a left and a right craniocaudal image and a left and a right mediolateral oblique image (and the priors from the previous screening round). Each of the cases contained in the table 14 of the database 13 can be accessed by means of the respective case identifier (ID) that serves as a key for performing the database query.
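The keyed lookup described above, with the case ID serving as the database key and four standard views per case, can be sketched with a hypothetical in-memory stand-in for table 14; the dictionary contents and view labels are illustrative.

```python
# Illustrative stand-in for table 14: each case ID keys a record holding
# the four standard screening views (craniocaudal and mediolateral oblique,
# left and right).
cases = {
    "case-1": {"images": {"a": "L-CC", "b": "R-CC", "c": "L-MLO", "d": "R-MLO"}},
}

def retrieve_case(case_id, table=cases):
    """Query a case by its ID, the key used for the database lookup."""
    record = table.get(case_id)
    if record is None:
        raise KeyError(f"unknown case ID: {case_id}")
    return record
```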
The image selection module 9 further comprises an infiltration program module 15 that is coupled to a database 16. Again, the database 16 can be stored on a mass storage of the computer 1, such as a CD-ROM, or it can be stored on a separate server computer.
The database 16 has a table 17 for storage of known proven positive and/or known proven normal cases. Each of the known cases of the table 17 has a unique case ID that serves to retrieve a particular case from the table 17. Each of the cases has a data record containing the image data for the case as well as data indicating the category of the case. Again, in the case of screening mammography, each case can have four images a, b, c, d, corresponding to a left and a right craniocaudal and a left and a right mediolateral oblique view, or eight images if the priors from the previous round are used also.
The case category can contain four different data fields corresponding to the category of the tumor, the breast density, the mammographic findings (lesion type) and a subtlety rating. The table below gives an example of a classification scheme for the case category. Each sub-category of a case is identified by a specific value as indicated in the table below.
TABLE I
Case category | Name of case category | Value for case category
1 | Category of tumor | 0 = normal (no tumor); 1 = benign tumor; 2 = cancer, screen detected; 3 = interval cancer (overlooked, false negative); 4 = interval cancer (true)
2 | Breast density | 1 = <5%; 2 = 5-25%; 3 = 25-50%; 4 = 50-75%; 5 = >75%
3 | Mammographic findings (lesion type) | mass; calcification; architectural distortion; asymmetry; other (specify. . .); (l) left, (r) right, link to annotations
4 | Subtlety rating | 1 = extremely subtle; 2 = very subtle; 3 = subtle; 4 = relatively obvious; 5 = obvious
The infiltration program module 15 can query the database 16 in order to identify the known cases in the table 17 having a required category or corresponding to a certain profile of sub-categories in accordance with the above table I. In the latter case the required case category or sub-category serves as a key to identify suitable cases for the infiltration.
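A query against the known-case library by category or partial sub-category profile, as described above, can be sketched like this. The field names and the example library are our own illustration of the Table I scheme, not data from the patent.

```python
def find_known_cases(library, profile):
    """Return IDs of known cases whose category record matches every
    sub-category value given in `profile` (partial profiles are allowed)."""
    return [cid for cid, cat in library.items()
            if all(cat.get(k) == v for k, v in profile.items())]

# Illustrative library keyed by case ID, sub-categories following Table I.
library = {
    "k1": {"tumor": 2, "density": 3, "finding": "mass", "subtlety": 1},
    "k2": {"tumor": 0, "density": 2, "finding": None, "subtlety": None},
    "k3": {"tumor": 2, "density": 3, "finding": "calcification", "subtlety": 4},
}
```

The profile thus acts as the key identifying suitable cases for infiltration, e.g. all screen-detected cancers of density class 3.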
The image selection module 9 is coupled to the session preparation module 18. The session preparation module 18 allows specifying the cases to be screened in order to initialize the case stack 7. Further, the session preparation module 18 allows inputting an absolute number or percentage of known cases to be inserted into the flow of cases being screened by the user. Further, the session preparation module 18 enables specifying the absolute number or percentage of known cases of different categories and/or different category profiles in accordance with Table I above. The session preparation module 18 can serve to initialize both a real screening session and a training session with training cases.
The image selection module has an output connected to the display system 19, which serves to display a current image on the monitor 20. After having reviewed the current image, which is displayed on the monitor 20, the radiologist can input a diagnosis and/or an annotation via the user interface 2. The diagnosis and/or annotation is stored in the diagnosis database 21 of the database module 22.
The database module 22 further contains a user action database 23 for the tracing of user actions that are input via the user interface 2. The user action database 23 also serves for the purposes of generating the user action report in accordance with rules 24.
It is important to note that the computer system of
For the purposes of in-service radiologist's performance monitoring of a real screening session, the session preparation by means of the session preparation module 18 is typically done by a super user and not by the radiologist who actually performs the screening operation. Typically, the super user can initialize the computer system for a variety of different users, who are identified by respective user IDs.
For a particular user the case stack 7 is initialized to contain the cases to be reviewed by that user as well as a certain profile of known cases to be inserted into the flow of images. When the user considered here logs on, the corresponding case stack 7 and the other entries made by means of the session preparation module 18 by the super user are retrieved by means of the user profile 3.
When the user starts the screening procedure, the program 10 obtains a pseudo-random number from the pseudo-random number generator 11 in order to decide whether the real case i to which the pointer 8 points is to be displayed or whether a known case is to be displayed. If a real case of the case stack 7 is to be displayed, the database 13 is queried in order to retrieve and display the case.
If a known case is to be inserted into the flow of cases, the program 10 performs an access operation on the database 16 via the infiltration program module 15 in order to retrieve an appropriate known case for display. It is not apparent to the radiologist whether a case that is currently displayed on the monitor 20 is a real case or a known case.
In both instances, a diagnosis of the radiologist is entered via the user interface 2 and stored in the diagnosis database 21. In the case of a known case, the diagnosis is compared to the ground truth and/or pathology. If a mismatch between the diagnosis and the ground truth and/or pathology occurs, this is recognized by the program 10, and a corresponding message is displayed on the monitor 20. For example, the message can be "you missed a cancer" or similar.
The fact that a misdiagnosis occurred is stored in the user action database 23, as are all other user actions. After the user has entered his or her diagnosis for the current case, he or she can go to the next case by pressing the next-step button 5, such that the pointer 8 is shifted to the consecutive case, but only if the last case has not been a known case. Again, the program 10 makes a determination whether to display the consecutive case as identified by the pointer 8 or to display a known case from the database 16.
This procedure continues until all cases of the case stack 7 have been processed. If the number of misdiagnoses recognized by the program 10 rises above a certain predefined threshold level, the program 10 can display a corresponding message to the radiologist and/or it can disable the operation of the computer 1 for a certain predefined period of time in order to allow an appropriate recovery of the radiologist. At the end of the screening or training session, a report can be generated based on the contents of the user action database 23 in accordance with the rules 24. For example, the user action report generation can utilize the following in-service monitoring indicators of Tables II and III:
TABLE II
subjectives (diagnosis) | objectives (data): malignant, finding (lesion visible) | objectives (data): benign, finding (lesion visible) | objectives (data): no finding (no lesion visible)
Positive | A | B1 | B2
Benign | C1 | D1 | D2
Negative | C2 | D3 | D4
TABLE III
Variable | Result | Finding | Malignant | Comment
A | True positive | YES-ok | YES-ok | Lesion found and correctly interpreted as malignant
B1 | False positive | YES-ok | YES-not ok | Classification error: a benign lesion is interpreted as malignant
B2 | False positive | YES-not ok | YES-not ok | Something seen that is not there: something normal is interpreted as malignant
C1 | False negative | YES-ok | NO-not ok | Classification error: a malignant lesion is interpreted as benign
C2 | False negative | NO-not ok | NO-not ok | Overlooked: a malignant lesion is overlooked
D1 | True negative | YES-ok | NO-ok | Lesion found and correctly interpreted as benign
D2 | True negative | YES-not ok | NO-ok | Something seen that is not there, but classified as benign
D3 | True negative | NO-not ok | NO-ok | Overlooked: a benign lesion is not found, but the result is correct (benign)
D4 | True negative | NO-ok | NO-ok | No finding at all; the result is correct
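Under our reading of Tables II and III, each pairing of the radiologist's diagnosis (Positive, Benign, Negative) with the ground truth of the case (malignant lesion, benign lesion, no lesion) maps to exactly one indicator variable, which a report generator could look up directly. The string labels below are our own encoding, not from the disclosure.

```python
# Rows: radiologist's subjective diagnosis; columns: objective ground truth.
OUTCOME = {
    ("positive", "malignant"): "A",   # true positive
    ("positive", "benign"):    "B1",  # false positive (classification error)
    ("positive", "none"):      "B2",  # false positive (nothing is there)
    ("benign",   "malignant"): "C1",  # false negative (classification error)
    ("benign",   "benign"):    "D1",  # true negative
    ("benign",   "none"):      "D2",  # true negative (benign call on nothing)
    ("negative", "malignant"): "C2",  # false negative (lesion overlooked)
    ("negative", "benign"):    "D3",  # true negative (benign lesion overlooked)
    ("negative", "none"):      "D4",  # true negative (no finding, correct)
}

def classify(diagnosis, ground_truth):
    """Return the Table II/III variable for one (diagnosis, truth) pair."""
    return OUTCOME[(diagnosis, ground_truth)]
```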
The field "user actions" contains a trace of all user actions performed with respect to the case with the given case ID. Examples of such user actions are the sequence of images requested by the radiologist for the review of the case; the format of the images requested, such as the tiling of the monitor and the magnification of the images; and the image transformations and computer-aided diagnosis functions which the radiologist uses for the review of the case. The following data field, "time spent on case", contains the amount of time the radiologist has spent on the review of the particular case.
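A per-case trace record of this shape can be sketched as follows; the class, the injectable clock, and the field names are illustrative assumptions mirroring the "user actions" and "time spent on case" fields described above.

```python
import time

class ActionTracer:
    """Collects the user actions for one case and the time spent on it."""

    def __init__(self, case_id, clock=time.monotonic):
        self.case_id = case_id
        self.actions = []
        self._clock = clock          # injectable for testing
        self._start = clock()        # review of the case begins now

    def log(self, action):
        """Record one user action, e.g. an image request or a CAD call."""
        self.actions.append(action)

    def record(self):
        """Return the database record for this case's review."""
        return {"case_id": self.case_id,
                "user_actions": list(self.actions),
                "time_spent_on_case": self._clock() - self._start}
```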
In step 31, an absolute number or percentage of known cases to be infiltrated into the stream of cases is entered. Again, this can be done by the super user or by the radiologist himself, depending on the application.
Further, in step 32, a distribution of the categories of the known cases can be entered, such as by specifying a percentage value for each of the categories of Table I above. Further, in step 33, the mode can be specified. If the random mode is selected, the known cases are infiltrated randomly into the stream of cases. If the fixed mode is selected, the known cases are infiltrated based on a fixed predefined pattern.
In
Based on the pseudo-random number, a decision is made in step 42. If the pseudo-random number is equal to 0, a real case from the case stack is to be displayed to the radiologist. A corresponding case from the case stack is identified in step 43 and displayed to the radiologist in step 44. In step 45, the radiologist can enter his or her diagnosis. Alternatively, the radiologist can decide not to enter a diagnosis; in this instance the current case is shifted to the bottom of the case stack (cf. FIG. 1, case stack 7) for later review and diagnosis.
In step 46 the user action database 23 is updated for the storage of data corresponding to the data fields of the table of
In step 41, a pseudo-random number is again requested, and it is again decided in step 42 whether the pseudo-random number equals 0. If the pseudo-random number is not equal to 0, the control goes to step 47 in order to select a known case.
The known case selected in step 47 is displayed to the radiologist in step 48. After reviewing the displayed case, the radiologist enters the diagnosis in step 49. In step 50, it is decided whether the diagnosis entered in step 49 is correct.
If the radiologist classifies the known, e.g. positive, case as a normal case (a verified case with no malignant abnormality) or as a negative case (a verified case with no abnormality, whether benign or malignant), the diagnosis of the known positive case is incorrect, and the control goes to step 51 in order to output a corresponding message to the radiologist. For example, a message can be displayed and/or an aural output, such as a beep, can be produced.
From step 51, the control goes to step 52, which corresponds to step 46 in the case of a real medical case. If the diagnosis is correct, the control goes directly from step 50 to step 52. From step 52, the control returns to step 41, and the procedure is repeated. This sequence of steps continues until all cases from the case stack have been processed in the screening procedure.
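The loop of steps 41 to 52 can be sketched end to end as below. This is a condensed illustration under our own names; note in particular that the pointer into the real-case stack only advances after a real case, as the description states.

```python
import random

def screening_session(case_stack, known_cases, get_diagnosis, notify,
                      infiltration_rate=0.2, rng=None):
    """One screening session: per iteration a pseudo-random draw decides
    between the next real case and a known case; a wrong diagnosis of a
    known case triggers an immediate message via `notify`."""
    rng = rng or random.Random()
    trace = []
    i = 0  # pointer into the stack of real cases
    while i < len(case_stack):
        if known_cases and rng.random() < infiltration_rate:
            case_id, truth = rng.choice(list(known_cases.items()))
            diagnosis = get_diagnosis(case_id)
            if diagnosis != truth:
                notify(f"misdiagnosis of known case {case_id}")
            trace.append(("known", case_id, diagnosis))
            # the pointer is NOT advanced after a known case
        else:
            case_id = case_stack[i]
            trace.append(("real", case_id, get_diagnosis(case_id)))
            i += 1  # pointer shifts to the consecutive real case
    return trace
```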
Further, the user can specify the number of suspicious cases to be infiltrated into the case stream. The user can input a minimum number of suspicious cases in the input field 26 and a maximum number of suspicious cases in the input field 27. If only one of the input fields 26 and 27 is used, or if the same number is input in both fields, the user can choose between the random and the fixed mode by clicking on the word "random" or "fixed". By clicking on "start training" the user can start the screening training procedure.
Further, the average time spent on each individual case is indicated. On request, a more detailed report can be generated based on the user action database.
By clicking on “show solutions” the system displays the correct diagnosis of the known positive cases. By clicking on “new training set” the control goes back to the screens of
The in-service monitoring statistic is indicative of the radiologist's performance. For example, it contains the statistic of the success rate of the radiologist's diagnoses of the known cases. The statistic can be specific to the type of cases and can thus serve to identify case categories in which the radiologist's diagnoses are frequently incorrect.
The in-service monitoring statistic is a basis for the mammogram preparation (step 81). In step 81, a set of mammograms is selected depending on the in-service monitoring statistic provided in step 80. For example, if the radiologist's performance is particularly weak for a certain case category, known cases of this category can be selected and prepared for display.
Alternatively individual settings can be inputted in step 82. By means of the individual settings a specific case profile can be specified for selection and preparation in step 81. In step 83 a screening training is performed based on the mammograms prepared in step 81. The corresponding cases are displayed in steps 84 and 85 with and without computer aided diagnosis (CAD) software support, respectively. The time required for the radiologist for inputting his or her respective diagnosis is traced in step 86. As an option the radiologist gets immediate feedback in step 87, in particular in case of a wrong diagnosis.
The overall result of the screening is evaluated in step 88 and the evaluation is displayed to the user.
As an alternative to the steps 83 to 88 for the screening training a lesion training can be performed in step 89. In contrast to the screening training the lesion training cases only consist of known positive cases with a variety of different lesion categories to be correctly diagnosed by the user. Again the cases are displayed with and without CAD support in steps 90 and 91, respectively.
The time required for the diagnosis can be traced in step 92 at the user's option. Alternatively the inputting of the diagnosis can be performed in step 93 without time control. In either case the radiologist has the option to get immediate feedback in step 94, in particular in case of a misdiagnosis. In step 95 an overall evaluation of the performance of the radiologist for the lesion training is generated and displayed.
The screen allows inputting the total number of cases, in this case 42 cases. A slide 96 can be moved along a line 97. The position of the slide 96 determines the distribution of normal and positive cases. When the slide 96 is in the middle position, the same number of normal and positive cases is selected. Moving the slide 96 to the left proportionally increases the number of normal cases and decreases the number of positive cases such that the total number of cases remains the same. Moving the slide 96 to the right correspondingly increases the number of positive cases and decreases the number of normal cases.
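The arithmetic behind the slide 96 can be illustrated as follows; the function name and the 0-100 slide scale are our assumptions, chosen so that the middle position yields an even split while the total stays constant.

```python
def case_mix(total, slide_pos, slide_max=100):
    """Split `total` cases into (normal, positive) counts from a slide
    position in [0, slide_max]. The middle position gives an even split;
    moving left (smaller values) raises the share of normal cases."""
    positive = round(total * slide_pos / slide_max)
    return total - positive, positive
```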
Clicking on one of the fields 98, 99 or 100 specifies the age distribution of the patients from whom the mammograms are taken, in this case 50 to 59 years, 60 to 70 years, and greater than 70 years.
In the entry field 104 the age distribution can be specified. The entry field 105 serves to specify the type of lesion (mass, microcalcifications, architectural distortion (DIST.), asymmetry and other).
The entry field 106 serves to specify cancer features, such as pathology, tumor category and tumor size. The field 107 serves to input the Wolfe class. By clicking on the search button 108 a number of cases is selected from a case database corresponding to the search profile entered by means of the fields 103 to 107. The search result is displayed in the window 109.
Although the present invention has been shown and described with respect to preferred embodiments, nevertheless, changes and modifications will be evident to those skilled in the art from the teachings of the invention. Such changes and modifications that embody the spirit, scope and teachings of the invention are deemed to fall within the purview of the invention as set forth in the appended claims.
List of reference numerals:
computer 1
user interface 2
user profile 3
keypad 4
next-step button 5
workflow memory 6
case stack 7
pointer 8
image selection module 9
program 10
pseudo-random number generator 11
retrieval program module 12
database 13
table 14
infiltration program module 15
database 16
table 17
session preparation module 18
display system 19
monitor 20
diagnosis database 21
database module 22
user action database 23
rules 24
input field 25
input field 26
input field 27
input field 28
slide 96
line 97
field 98
field 99
field 100
field 101
field 102
entry field 103
entry field 104
entry field 105
entry field 106
field 107
search button 108
window 109
Patent | Priority | Assignee | Title |
4411628, | Jun 01 1979 | Texas Instruments Incorporated | Electronic learning aid with picture book |
4807110, | Apr 06 1984 | International Business Machines Corporation | Prefetching system for a cache having a second directory for sequentially accessed blocks |
5061187, | Apr 12 1990 | Ultrasound training apparatus | |
5306154, | Mar 07 1991 | Hitachi, Ltd. | Intelligent education and simulation system and method |
5560360, | Mar 09 1992 | FILLER, AARON G; NEUROGRAFIX; Washington Research Foundation | Image neurography and diffusion anisotropy imaging |
5819288, | Oct 16 1996 | Microsoft Technology Licensing, LLC | Statistically based image group descriptor particularly suited for use in an image classification and retrieval system |
5890911, | Mar 22 1995 | BANCROFT, WILLIAM M | Method and system for computerized learning, response, and evaluation |
5917929, | Jul 23 1996 | Hologic, Inc | User interface for computer aided diagnosis system |
5987345, | Nov 29 1996 | Arch Development Corporation | Method and system for displaying medical images |
6011862, | Apr 25 1995 | Arch Development Corporation | Computer-aided method for automated image feature analysis and diagnosis of digitized medical images |
6021404, | Aug 18 1997 | CONNECTANCE, INC | Universal computer assisted diagnosis |
6041135, | May 06 1996 | AGFA HEALTHCARE N V | Fast interactive off-line processing method for radiographic images |
6058322, | Jul 25 1997 | Arch Development Corporation | Methods for improving the accuracy in differential diagnosis on radiologic examinations |
6084594, | Jun 24 1997 | Fujitsu Limited | Image presentation apparatus |
6098064, | May 22 1998 | Cisco Technology, Inc | Prefetching and caching documents according to probability ranked need S list |
6127669, | Jan 29 1997 | University of Maryland | Computer-aided determination of window and level settings for filmless radiology |
6128002, | Jul 08 1996 | STAIR SYSTEMS, INC NORTH CAROLINA CORP | System for manipulation and display of medical images |
6151662, | Dec 02 1997 | GLOBALFOUNDRIES Inc | Data transaction typing for improved caching and prefetching characteristics |
6154767, | Jan 15 1998 | Microsoft Technology Licensing, LLC | Methods and apparatus for using attribute transition probability models for pre-fetching resources |
6154826, | Nov 16 1994 | VIRGINIA PATENT FOUNDATION, UNIVERSITY OF | Method and device for maximizing memory system bandwidth by accessing data in a dynamically determined order |
6260021, | Jun 12 1998 | Philips Electronics North America Corporation | Computer-based medical image distribution system and method |
6283761, | Sep 08 1992 | GTJ VENTURES, LLC | Apparatus and method for processing and/or for providing healthcare information and/or healthcare-related information |
6535714, | Jul 02 2001 | FLORIDA, UNIVERSITY OF | Method, system, and apparatus for medical device training |
6540679, | Dec 28 2000 | Guided Therapy Systems, LLC | Visual imaging system for ultrasonic probe |
6551107, | Nov 03 2000 | SCITENT, INC | Systems and methods for web-based learning |
6669482, | Jun 30 1999 | Method for teaching interpretative skills in radiology with standardized terminology | |
20020076091, |
Executed on | Assignor | Assignee | Conveyance | Frame | Reel | Doc |
May 29 2001 | MeVis Breastcare GmbH & Co. KG | (assignment on the face of the patent) | / | |||
Aug 06 2001 | EVERTSZ, CARL J G | MEVIS-CENTRUM FUR MEDIZINISCHE DIAGNOSESYSTEME UND VISUALISIERUNG GMBH | ASSIGNMENT OF ASSIGNORS INTEREST SEE DOCUMENT FOR DETAILS | 012148 | /0882 | |
Oct 25 2001 | MEVIS-CENTRUM FUR MEDIZINISCHE DIAGNOSESYSTEME UND VISUALIZIERUNG GMBH | MEVIS BREASTCARE GMBH CO KG | ASSIGNMENT OF ASSIGNORS INTEREST SEE DOCUMENT FOR DETAILS | 015707 | /0525 | |
Oct 25 2001 | MEVIS-CENTRUM FUR MEDIZINISCHE DIAGNOSESYSTEME UND VISUALISIERUNG GMBH | MEVIS BREASCARE GMGH & CO KG | ASSIGNMENT OF ASSIGNORS INTEREST SEE DOCUMENT FOR DETAILS | 012398 | /0476 |