A methodology in which a learner-constructed response is provided in answer to a question presented by the system, the response being evaluated by comparison with pre-defined expected responses and, based upon the evaluation, the system determining whether to proceed to another question or to offer remedial feedback. Such a learner-constructed response based evaluation methodology greatly reduces the potential for “guess-work” based correct responses and improves the training process through remedial feedback and advancement upon demonstration of knowledge.
0. 119. A method for implementing an automated learning system, the method performed by a computer system having a processor, a memory, and a display, the method comprising:
presenting on the display, using a graphical user interface, at least one knowledge topic to the learner;
prompting the learner to enter a learner-constructed response thereto;
presenting the learner-constructed response on the display using the graphical user interface;
comparing keyword data that corresponds to the knowledge topic with the learner-constructed response; and
determining success or failure of the learner to know the knowledge topic, the success or failure being determined by whether or not expected keyword data appears in the learner-constructed response, wherein upon a determination of failure of the learner, remedial information is provided to the learner before the learner is prompted to enter a last learner-constructed response.
3. A method for implementing an automated learning system, the method performed by a computer system having a processor, a memory, and a display, the method comprising:
presenting at least one knowledge topic on the display, using a graphical user interface, to the learner;
prompting the learner to enter a learner-constructed response thereto;
presenting on the display, using the graphical user interface, the learner-constructed response;
comparing keyword data that corresponds to the knowledge topic with the learner-constructed response; and
determining success or failure of the learner to know the knowledge topic, the success or failure being determined by whether or not expected keyword data appears in the learner-constructed response, wherein upon a determination of failure of the learner, remedial information is provided to the learner before the learner is prompted to enter another learner-constructed response.
0. 133. A method for implementing an automated learning system, the method performed by a computer system having a processor, a memory, and a display, the method comprising:
presenting on the display, using a graphical user interface, at least one knowledge topic to the learner;
prompting the learner to enter a learner-constructed response to the at least one knowledge topic;
presenting the learner-constructed response on the display using the graphical user interface;
comparing keyword data that corresponds to the knowledge topic with the learner-constructed response; and
determining success or failure of the learner to know the knowledge topic, the success or failure being determined by whether or not expected keyword data appears in the learner-constructed response, wherein upon a determination of failure of the learner, remedial information is provided to the learner after the learner is prompted to enter the learner-constructed response.
0. 161. A method for implementing an automated learning system, the method performed by a computer system having a processor, a memory, and a display, the method comprising:
presenting on the display, using a graphical user interface, a series of knowledge topics to the learner;
prompting the learner to enter a learner-constructed response to each topic;
presenting the learner-constructed response on the display using the graphical user interface;
comparing keyword data that corresponds to the knowledge topics with the learner-constructed responses;
determining success or failure of the learner knowing each one of the knowledge topics, the success or failure being determined by whether or not expected keyword data appears in each learner-constructed response; and
upon a determination of failure of the learner, providing remedial information to the learner at a later time and prompting the learner to enter another learner-constructed response.
0. 105. A method for implementing an automated learning system, the method performed by a computer system having a processor, a memory, and a display, the method comprising:
presenting on the display, using a graphical user interface, at least one knowledge topic to the learner;
prompting the learner to enter a learner-constructed response to one of the at least one knowledge topic;
presenting the learner-constructed response on the display using the graphical user interface;
comparing keyword data that corresponds to the one knowledge topic with the learner-constructed response; and
determining success or failure of the learner to know the knowledge topic, the success or failure being determined by whether or not expected keyword data appears in the learner-constructed response, wherein upon a determination of failure of the learner, remedial information is provided to the learner before the learner is prompted to enter a last learner-constructed response.
0. 77. A computer readable storage medium storing a computer program, the computer program for execution by a computer system having a processor, a memory, and a display, the computer program for implementing a learning system with a learner-constructed response based methodology, comprising:
a presentation process for presenting on the display, using a graphical user interface, at least one knowledge topic to the learner and for prompting the learner to enter a learner-constructed response thereto;
a displaying process for presenting on the display, using the graphical user interface, the learner-constructed response;
an evaluation information process for providing keyword data that corresponds to the knowledge topic; and
an evaluation process for determining, based upon entry of a learner-constructed response to the knowledge topic, success or failure of the learner to know the knowledge topic, the success or failure being determined by comparison of the learner-constructed response with the keyword data, wherein upon a determination of failure of the learner, remedial information is provided to the learner before the learner is prompted to enter a last learner-constructed response.
1. A computer readable storage medium storing a computer program, the computer program for execution by a computer system having a processor, a memory, and a display, the computer program for implementing a learning system with a learner-constructed response based methodology, comprising:
a presentation process for presenting on the display, using a graphical user interface, at least one knowledge topic to the learner and for prompting the learner to enter a learner-constructed response thereto;
a displaying process for presenting on the display, using the graphical user interface, the learner-constructed response;
an evaluation information process for providing keyword data that corresponds to the knowledge topic; and
an evaluation process for determining, based upon entry of a learner-constructed response to the knowledge topic, success or failure of the learner to know the knowledge topic, the success or failure being determined by comparison of the learner-constructed response with the keyword data, wherein upon a determination of failure of the learner, remedial information is provided to the learner before the learner is prompted to enter another learner-constructed response.
0. 91. A computer readable storage medium storing a computer program, the computer program for execution by a computer system having a processor, a memory, and a display, the computer program for implementing a learning system with a learner-constructed response based methodology, comprising:
a presentation process for presenting on the display, using a graphical user interface, at least one knowledge topic to the learner and for prompting the learner to enter a learner-constructed response thereto;
a displaying process for presenting on the display, using the graphical user interface, the learner-constructed response;
an evaluation information process for providing keyword data that corresponds to the knowledge topic; and
an evaluation process for determining, based upon entry of a learner-constructed response to the knowledge topic, success or failure of the learner to know the knowledge topic, the success or failure being determined by comparison of the learner-constructed response with the keyword data, wherein upon a determination of failure of the learner, remedial information is provided to the learner before the learner is prompted to enter a plurality of learner-constructed responses.
0. 49. A computer readable storage medium storing a computer program, the computer program for execution by a computer system having a processor, a memory, and a display, the computer program for implementing a learning system with a learner-constructed response based methodology, comprising:
a presentation process for presenting on the display, using a graphical user interface, at least one knowledge topic to the learner and for prompting the learner to enter a learner-constructed response thereto;
a displaying process for presenting on the display, using the graphical user interface, the learner-constructed response;
an evaluation information process for providing keyword data that corresponds to the knowledge topic; and
an evaluation process for determining, based upon entry of the learner-constructed response to the knowledge topic, success or failure of the learner to know the knowledge topic, the success or failure being determined by comparison of the learner-constructed response with the keyword data, wherein after a determination of failure of the learner, remedial information is provided to the learner, after which the learner is prompted to enter another learner-constructed response.
4. A method for implementing an automated learning system, the method performed by a computer system having a processor, a memory, and a display, the method comprising:
presenting a series of knowledge topics on the display, using a graphical user interface, to the learner;
prompting the learner to enter a learner-constructed response to each topic;
presenting on the display, using the graphical user interface, the learner-constructed responses;
comparing keyword data that corresponds to the knowledge topics with the learner-constructed responses;
determining success or failure of the learner to know each of the knowledge topics, the success or failure being determined by whether or not expected keyword data appears in the learner-constructed response;
upon a determination of failure of the learner, providing remedial information to the learner and again prompting the learner to enter a learner-constructed response;
upon a determination of success of the learner, discontinuing presentation and prompting of the learner regarding the particular knowledge topic;
whereupon automated presentation of the series is completed when success is determined for each knowledge topic.
0. 147. A method for implementing an automated learning system, the method performed by a computer system having a processor, a memory, and a display, the method comprising:
presenting on the display, using a graphical user interface, a series of knowledge topics to the learner;
prompting the learner to enter a learner-constructed response to each knowledge topic;
presenting the learner-constructed response on the display using the graphical user interface;
comparing keyword data that corresponds to each knowledge topic with each learner-constructed response;
determining a success or a failure of the learner to know each knowledge topic, the success or failure being determined by whether expected keyword data appears in each learner-constructed response;
after a determination of failure of the learner for a particular knowledge topic, providing remedial information to the learner for the particular knowledge topic and prompting the learner to enter a new learner-constructed response to the particular knowledge topic;
upon a determination of success of the learner for a particular one of the knowledge topics, discontinuing presentation and prompting of the learner regarding the particular one of the knowledge topics.
0. 63. A computer readable storage medium storing a computer program, the computer program for execution by a computer system having a processor, a memory, and a display, the computer program for implementing a learning system with a learner-constructed response based methodology, comprising:
a presentation process for presenting on the display, using a graphical user interface, at least one knowledge topic to the learner and for prompting the learner to enter a learner-constructed response to one of the at least one knowledge topic;
a displaying process for presenting on the display, using the graphical user interface, the learner-constructed response;
an evaluation information process for providing keyword data that corresponds to the one of the at least one knowledge topic; and
an evaluation process for determining, based upon entry of the learner-constructed response to the knowledge topic, success or failure of the learner to know the knowledge topic, the success or failure being determined by comparison of the learner-constructed response with the keyword data, wherein upon a determination of failure of the learner, remedial information is provided to the learner after the learner is prompted to enter the learner-constructed response.
2. The program of
5. The method of
6. The method of
7. The method of
8. The method of
9. The method of
collecting information regarding a performance of at least one learner during the presentation process, the evaluation information process and the evaluation process;
analyzing the collected information; and
generating a report based on the analyzed information for at least one predetermined party.
0. 10. The computer readable storage medium of
0. 11. The computer readable storage medium of
0. 12. The computer readable storage medium of
0. 13. The computer readable storage medium of
0. 14. The computer readable storage medium of
0. 15. The computer readable storage medium of
0. 16. The computer readable storage medium of
0. 17. The computer readable storage medium of
0. 18. The computer readable storage medium of
0. 19. The computer readable storage medium of
0. 20. The computer readable storage medium of
0. 21. The computer readable storage medium of
0. 22. The computer readable storage medium of
0. 23. The method of
0. 24. The method of
0. 25. The method of
0. 26. The method of
0. 27. The method of
0. 28. The method of
0. 29. The method of
0. 30. The method of
0. 31. The method of
0. 32. The method of
0. 33. The method of
0. 34. The method of
0. 35. The method of
0. 36. The method of
0. 37. The method of
0. 38. The method of
0. 39. The method of
0. 40. The method of
0. 41. The method of
0. 42. The method of
0. 43. The method of
0. 44. The method of
0. 45. The method of
0. 46. The method of
0. 47. The method of
0. 48. The method of
0. 50. The computer readable storage medium of
0. 51. The computer readable storage medium of
0. 52. The computer readable storage medium of
0. 53. The computer readable storage medium of
0. 54. The computer readable storage medium of
0. 55. The computer readable storage medium of
0. 56. The computer readable storage medium of
0. 57. The computer readable storage medium of
0. 58. The computer readable storage medium of
0. 59. The computer readable storage medium of
0. 60. The computer readable storage medium of
0. 61. The computer readable storage medium of
0. 62. The computer readable storage medium of
0. 64. The computer readable storage medium of
0. 65. The computer readable storage medium of
0. 66. The computer readable storage medium of
0. 67. The computer readable storage medium of
0. 68. The computer readable storage medium of
0. 69. The computer readable storage medium of
0. 70. The computer readable storage medium of
0. 71. The computer readable storage medium of
0. 72. The computer readable storage medium of
0. 73. The computer readable storage medium of
0. 74. The computer readable storage medium of
0. 75. The computer readable storage medium of
0. 76. The computer readable storage medium of
0. 78. The computer readable storage medium of
0. 79. The computer readable storage medium of
0. 80. The computer readable storage medium of
0. 81. The computer readable storage medium of
0. 82. The computer readable storage medium of
0. 83. The computer readable storage medium of
0. 84. The computer readable storage medium of
0. 85. The computer readable storage medium of
0. 86. The computer readable storage medium of
0. 87. The computer readable storage medium of
0. 88. The computer readable storage medium of
0. 89. The computer readable storage medium of
0. 90. The computer readable storage medium of
0. 92. The computer readable storage medium of
0. 93. The computer readable storage medium of
0. 94. The computer readable storage medium of
0. 95. The computer readable storage medium of
0. 96. The computer readable storage medium of
0. 97. The computer readable storage medium of
0. 98. The computer readable storage medium of
0. 99. The computer readable storage medium of
0. 100. The computer readable storage medium of
0. 101. The computer readable storage medium of
0. 102. The computer readable storage medium of
0. 103. The computer readable storage medium of
0. 104. The computer readable storage medium of
0. 106. The method of
0. 107. The method of
0. 108. The method of
0. 109. The method of
0. 110. The method of
0. 111. The method of
0. 112. The method of
0. 113. The method of
0. 114. The method of
0. 115. The method of
0. 116. The method of
0. 117. The method of
0. 118. The method of
0. 120. The method of
0. 121. The method of
0. 122. The method of
0. 123. The method of
0. 124. The method of
0. 125. The method of
0. 126. The method of
0. 127. The method of
0. 128. The method of
0. 129. The method of
0. 130. The method of
0. 131. The method of
0. 132. The method of
0. 134. The method of
0. 135. The method of
0. 136. The method of
0. 137. The method of
0. 138. The method of
0. 139. The method of
0. 140. The method of
0. 141. The method of
0. 142. The method of
0. 143. The method of
0. 144. The method of
0. 145. The method of
0. 146. The method of
0. 148. The method of
0. 149. The method of
0. 150. The method of
0. 151. The method of
0. 152. The method of
0. 153. The method of
0. 154. The method of
0. 155. The method of
0. 156. The method of
0. 157. The method of
0. 158. The method of
0. 159. The method of
0. 160. The method of
0. 162. The method of
0. 163. The method of
0. 164. The method of
0. 165. The method of
0. 166. The method of
0. 167. The method of
0. 168. The method of
0. 169. The method of
0. 170. The method of
0. 171. The method of
0. 172. The method of
0. 173. The method of
0. 174. The method of
This invention relates to systems and methods for personnel training and, more particularly, to supervised or self-administered computer-based training systems that incorporate a learner-constructed response based testing methodology for improved evaluation of knowledge acquisition.
A variety of systems are available for automated learning and training using computers or other personal electronic devices. In current computer-mediated learning and training systems, assessment of the “knowledge” gained by the user is carried out by, for example, true/false questions, matching (paired-associate) type questions, multiple choice questions, and marking questions. A multiple choice question differs from a marking question in that a multiple choice question has one correct answer, while a marking question has multiple correct answers. The foregoing question formats are not fully effective as learning aids, nor are they reliable in assessing actual knowledge, for various reasons. For example, in a true/false question, a learner has a fifty-fifty chance of answering correctly by guessing; in a four-way multiple choice question, the probability of a correct answer through guessing is twenty-five percent. Test results thus are not necessarily indicative of actual knowledge.
What is needed, therefore, is a methodology for use in computer based training that provides for improved learning, improved efficiency, and improved reliability in the assessment of a user's actual knowledge of subject matter.
This invention provides a methodology in which a learner-constructed response is provided in answer to a question presented by the system, the response being evaluated by comparison with pre-defined expected responses and, based upon the evaluation, the system determining whether to proceed to another question or to offer remedial feedback. Such a learner-constructed response based evaluation methodology greatly reduces the potential for “guess-work” based correct responses and improves the training process through remedial feedback and advancement upon demonstration of knowledge.
Evaluation of responses involves identification of pre-defined keyword data pertaining to the subject matter being tested. Examples include passages of text with important keywords (keywords being defined herein to include one or more words, or phrases, or related words and phrases, or synonyms). Multiple choice questions may also include keywords, such that after the learner completes a sequence of reading material or any kind of conventional multiple-choice, matching, or true/false questions, the learner is prompted to enter answers to “fill-in-the-blank” or “verbal narrative” questions (a learner-constructed response). The learner-entered responses are compared to standard solutions recorded on the system and remedial actions are provided.
The methodology may be used in a specially designed training system or in cooperation with existing computer-based training systems. For every “choice” based question (e.g., multiple choice), the methodology may prompt for a “user-constructed response” based upon a question that has associated with it all acceptable correct user-constructed responses to this question, the presentation to the learner being designed to include an area or mechanism for capturing a learner response either in the form of text or spoken words. The correct response is recognized if the response matches the keyword(s), e.g., primary/related keyword(s) or phrase(s) and/or synonym(s).
In one implementation, a computer program is provided for implementing a learning system with a learner-constructed response based methodology, the program including a presentation process for presenting at least one knowledge topic to the learner and for prompting the learner to enter a learner-constructed response thereto; an evaluation information process for providing keyword data that corresponds to the knowledge topic; and an evaluation process for determining, based upon entry of a learner-constructed response to the knowledge topic, success or failure of the learner to know the knowledge topic, the success or failure being determined by comparison of the learner-constructed response with the keyword data.
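As a rough illustration only (the patent discloses no source code), the three processes just described might be sketched in Python as follows. The names KnowledgeTopic, present_topic, and evaluate_response, the field layout, and the containment-based matching rule are hypothetical simplifications of the disclosure.

```python
from dataclasses import dataclass

@dataclass
class KnowledgeTopic:
    """One knowledge topic plus its keyword data (hypothetical structure)."""
    prompt: str          # question text shown by the presentation process
    keywords: list[str]  # expected keyword data (evaluation information process)
    remediation: str     # remedial information offered upon failure

def present_topic(topic: KnowledgeTopic) -> str:
    """Presentation process: show the topic and capture a learner-constructed response."""
    return input(topic.prompt + "\n> ")

def evaluate_response(topic: KnowledgeTopic, response: str) -> bool:
    """Evaluation process: success if any expected keyword appears in the response."""
    text = response.lower()
    return any(kw.lower() in text for kw in topic.keywords)
```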
In
Programming, as discussed in detail below for implementing the present learning methodology, is stored on disc input 26 and/or memory 16 and is executed by the system 10. The learning methodology preferably is practiced using the foregoing system components, although it may be practiced with alternative components.
The presentation information component 28 contains information for presenting the question, and may also include additional instructions, help information and an avenue for capturing learner-constructed responses (e.g., a text area or a record button for voice input). The evaluation information component 30 may include a sequence of phrases and, in one embodiment, these may take the form of standard HTML tags for the display of question information and a sequence of proprietary tags for the encoding of expected key-words or phrases under the “META-DATA” tag in HTML.
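The patent does not fix the proprietary tag format, so the following is a minimal sketch under an invented convention: keyword data carried in a meta tag named "expected-keywords". Only the standard-library HTML parser is used.

```python
from html.parser import HTMLParser

# Hypothetical page: the meta name "expected-keywords" and the semicolon
# separator are invented for illustration; the patent only says keywords
# are encoded in proprietary tags under the HTML "META-DATA" tag.
DOCUMENT = """
<html><head>
  <meta name="expected-keywords" content="empathy; active listening">
</head><body><p>What quality helps a caregiver understand a patient?</p></body></html>
"""

class KeywordExtractor(HTMLParser):
    def __init__(self):
        super().__init__()
        self.keywords = []

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and a.get("name") == "expected-keywords":
            self.keywords += [k.strip() for k in a.get("content", "").split(";")]

parser = KeywordExtractor()
parser.feed(DOCUMENT)
print(parser.keywords)  # ['empathy', 'active listening']
```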
Referring to
The program 34 enables creation of the components 42, 44 for a desired training session. During the creation of the training “content,” the authors are prompted to create different key-words and phrases that best describe the “gist” of the content or embody the essence of the knowledge topic under discussion. These key-words and phrases are utilized for the construction of questions. These key-words may also be analyzed to produce additional key-words, phrases or synonyms, and to identify negative constructs (wrong answers).
Referring to
Referring to the process 300, in step 302 the learner is prompted to construct the target knowledge (presented previously, as described above) in his or her own words. One example of the prompt is the fill-in-the-blank format 48, above. In step 304, if the learner's response is verbal, the speech is converted into text data. After the learner's response has been fully entered, a comparison can be triggered in a predetermined manner. For example, the learner can hit a particular key on the keyboard (e.g., an “Enter” key) or activate a particular area on the display screen to start the comparison. In step 306, the learner's response is compared with the pre-defined keyword data contained in the evaluation information component 30 (FIG. 2A). The comparison may involve a variety of analyses (a sketch of these checks in Python follows the list below). For example, the comparison may:
(1) check for and correct spelling mistakes in the learner-constructed responses;
(2) determine whether the correct key word (words, phrases) appear in the learner-constructed response;
(3) determine whether synonyms of missing key word(s) appear in the learner-constructed response;
(4) determine whether related phrases that convey the same meaning as the expected key word(s) or phrases appear in the learner-constructed response;
(5) determine whether there are any incorrect key word(s) or phrases in the learner-constructed response or other negative constructs that might indicate a wrong answer.
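A minimal Python sketch of checks (2) through (5); the sample keyword, synonym, related-phrase, and negative-marker data are invented, and the spelling correction of check (1) is omitted for brevity.

```python
import re

# Hypothetical evaluation data; the patent fixes no concrete format.
EXPECTED = {
    "keywords": ["empathy"],
    "synonyms": ["compassion", "understanding"],
    "related_phrases": ["puts herself in the patient's place"],
    "negative_markers": ["no ", "not ", "don't ", "can't "],
}

def _contains(text: str, needle: str) -> bool:
    """Case-insensitive containment check anchored at a word boundary."""
    return re.search(r"\b" + re.escape(needle.lower()), text.lower()) is not None

def analyze(response: str, expected=EXPECTED) -> dict:
    """Run checks (2)-(5) from the list above over one learner-constructed response."""
    return {
        "has_keyword": any(_contains(response, k) for k in expected["keywords"]),
        "has_related_phrase": any(_contains(response, p) for p in expected["related_phrases"]),
        "has_synonym": any(_contains(response, s) for s in expected["synonyms"]),
        "has_negative_construct": any(m in response.lower() for m in expected["negative_markers"]),
    }
```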
A variety of logic selections for evaluation are contemplated. In one example, for purposes of improved learning and expediting the testing, a decision is made in step 308 as to whether the learner response fails a lexical analysis (described more fully in FIG. 3B), thereby indicating a possible wrong answer or misunderstanding. If yes, then in step 310 the methodology prompts the user for a positive construct. If not, in step 312 a determination is made whether or not expected keyword(s) are found in the response, albeit not necessarily in the exact way or phraseology preferred. If yes, then the methodology proceeds to step 314, provides a success message to the evaluation control program, and execution returns to the program for testing of other target knowledge topics. If not, then in step 316 a determination is made whether expected related phrase(s) are found in the learner's response (thus indicating a correct or partially correct answer). If yes, execution proceeds to step 314. If not, in step 318 a determination is made whether expected synonym(s) appear in the learner response, thereby indicating a correct or partially correct answer. If yes, execution proceeds to step 314. If not, the methodology proceeds to step 320. In step 320, a “failure” message is sent to the evaluation control program 34.
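The step 308-320 cascade can be expressed as a short control-flow sketch that reuses the hypothetical analyze() helper above; the returned strings merely stand in for the messages sent to the evaluation control program 34.

```python
def evaluate(response: str, expected=EXPECTED) -> str:
    """Illustrative control flow of steps 308-320."""
    checks = analyze(response, expected)
    if checks["has_negative_construct"]:        # step 308: fails lexical analysis
        return "prompt-for-positive-construct"  # step 310
    if checks["has_keyword"]:                   # step 312: expected keyword found
        return "success"                        # step 314
    if checks["has_related_phrase"]:            # step 316: related phrase found
        return "success"                        # step 314
    if checks["has_synonym"]:                   # step 318: synonym found
        return "success"                        # step 314
    return "failure"                            # step 320
```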
Upon receiving a “failure” message, the evaluation control program 34 may:
(1) Proceed to other questions and come back to the question upon which failure is indicated, until a satisfactory answer is received;
(2) Offer remedial questions or target information;
(3) Re-evaluate the learner with a focus on the missed part of the current topic.
Upon receiving a “success” message, the evaluation control program 34 may (a session-loop sketch follows these lists):
(1) Discontinue further questioning on the target knowledge subject;
(2) Question the learner on the target knowledge again or in a different way to confirm understanding.
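One way these failure and success scenarios could combine into a session loop, reusing the hypothetical KnowledgeTopic helpers from the earlier sketch: a failed topic gets remedial information and returns to the queue (failure scenarios 1 and 2), and a passed topic is dropped from further questioning (success scenario 1).

```python
from collections import deque

def run_session(topics: list) -> None:
    """Present each topic until a successful learner-constructed response is given."""
    queue = deque(topics)
    while queue:
        topic = queue.popleft()
        response = present_topic(topic)
        if evaluate_response(topic, response):
            continue                   # success: discontinue questioning this topic
        print(topic.remediation)       # failure: offer remedial information
        queue.append(topic)            # and come back to the question later
```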
Referring to
In step 324, if the response contains negative constructs, the learner is prompted in step 326 for alternative responses. For example, if the learner types “no empathy” or “not empathy” or “don't XXX” or “can't YYY” a parsing algorithm that looks for “empathy” or “XXX” or “YYY” will normally flag this as correct even though the negative construct makes the meaning totally different. Accordingly, step 324 determines that the answer with the negative construct is incorrect and proceeds to step 326.
If in step 324 there are no negative constructs, in step 328 a determination is made whether the user-constructed response contains a “conjunctive” construct and, if so, in step 330 the learner is prompted for a single response. As an example, if “and” or “but” or “or” are included in the answer, indicating a possible guess or two possible answers, step 328 determines that the user-constructed response is not correct and prompts the learner in step 330.
If in step 328 there are no conjunctive constructs, a determination is made in step 332 whether there are non-definite constructs and, if so, the learner is prompted for a definite response. Non-definite constructs include, for example, “maybe” or “perhaps.”
If in step 332 there are no non-definite constructs, in step 336 execution proceeds to the next phase of the analysis, as further described in step 312 of
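A compact sketch of the step 324-332 lexical analysis; the marker lists are illustrative placeholders, not taken from the patent.

```python
NEGATIVE = ("no ", "not ", "don't ", "can't ")
CONJUNCTIVE = (" and ", " but ", " or ")
NON_DEFINITE = ("maybe", "perhaps")

def lexical_analysis(response: str):
    """Return the re-prompt required, or None to continue to step 336."""
    text = " " + response.lower() + " "
    if any(m in text for m in NEGATIVE):      # step 324 -> step 326
        return "prompt-for-alternative-response"
    if any(m in text for m in CONJUNCTIVE):   # step 328 -> step 330
        return "prompt-for-single-response"
    if any(m in text for m in NON_DEFINITE):  # step 332
        return "prompt-for-definite-response"
    return None                               # step 336: continue the analysis
```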
It is noted that at any given moment during the execution of the above-mentioned learning methodology, various information pertaining to the training session or the performance of the learner is collected by the system 10 for different purposes. In one specific case, at the end of a training session, the collected information gives an in-depth view of how well the learner has been trained. The collected information can be analyzed to generate various reports to be delivered to a predetermined interested party. For instance, the analyzed information will help to identify comparative difficulties of different materials or subjects covered in the training session, or provide information on how the learner has performed on a per-question basis, etc. A statistical analysis and report can also be generated in a similar fashion based on the performances of a group of learners with regard to the training session. Therefore, the interested party can evaluate the performance of a group of learners to make various decisions, such as to determine whether the training session should be revised, or whether the group of learners can be profiled in a certain manner.
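As one hypothetical illustration of such analysis (the patent specifies no record format), collected per-question events could be tallied to flag comparatively difficult topics:

```python
from collections import Counter

def per_question_report(events: list) -> Counter:
    """Tally (topic, outcome) pairs from collected session events."""
    return Counter((e["topic"], e["outcome"]) for e in events)

# Invented sample data: repeated failures on one topic may suggest that
# the corresponding material deserves revision.
events = [
    {"topic": "empathy", "outcome": "failure"},
    {"topic": "empathy", "outcome": "failure"},
    {"topic": "empathy", "outcome": "success"},
]
print(per_question_report(events))
```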
In summary, the system 10 provides a learning methodology that improves the speed and retention of learning, and furthermore provides improved accuracy in assessment of the learner. By requiring, perhaps in addition to traditional multiple choice or other testing techniques, a learner-constructed response in which the learner must use his or her own words in answering a question, greater assurance is provided that the learner indeed knows the subject matter. Also, the system allows for refinement of the testing as the learner gets closer to accurate responses, as enabled by the construction of a keyword component associated with the target knowledge component and by the evaluation process.
Although illustrative embodiments of the invention have been shown and described, other modifications, changes, and substitutions are intended in the foregoing disclosure. Accordingly, it is appropriate that the appended claims be construed broadly and in a manner consistent with the scope of the invention.