A method is disclosed for generating an application that analyzes image data, such as satellite and microscope pictures. The method uses a graphical user interface to add a new processing object to a processing object network. The processing object network includes a parent processing object and a child processing object. A user can append the new processing object to the child processing object or can add the new processing object as a subprocess of the parent processing object. The user selects a data domain and an algorithm from selection lists on the graphical user interface and adds them to the new processing object. The application uses a semantic cognition network to process data objects that are generated by segmenting the image data. The application then uses the new processing object to identify portions of the image that are to be highlighted on the graphical user interface.
16. A system for generating an application that analyzes digital image data, comprising:
a computer readable storage medium; and
a computer program stored on the storage medium comprising:
a set of algorithms, wherein each algorithm of the set of algorithms represents an operation performable by the application;
a plurality of data domains, wherein each of the plurality of data domains represents a subset of a data object network, and wherein the data object network is generated by segmenting the digital image data; and
means for generating a processing object network, wherein the processing object network includes a parent process and a plurality of child processes, and wherein the parent process comprises one of the plurality of data domains and one algorithm of the set of algorithms.
1. A system for generating an application that analyzes image information, comprising:
a computer readable storage medium; and
a computer program stored on the storage medium comprising:
a set of algorithms, wherein each algorithm of the set of algorithms represents an operation performable by the application;
a plurality of data domains, wherein each of the plurality of data domains represents a subset of a data object network, and wherein the data object network is generated by segmenting the image information; and
a graphical user interface usable to generate a processing object network, wherein the processing object network includes a parent process and a plurality of child processes, and wherein the parent process comprises one of the plurality of data domains and one algorithm of the set of algorithms.
21. A computer implemented method for generating an application that analyzes an image, comprising:
adding a new processing object to a processing object network using a graphical user interface, wherein the processing object network includes a parent processing object and a child processing object, and wherein the image is displayed on the graphical user interface;
designating that the new processing object is appended to the child processing object;
selecting a data domain from a list of data domains displayed in a first dialog element on the graphical user interface and adding the data domain to the new processing object; and
selecting an algorithm from a list of algorithms displayed in a second dialog element on the graphical user interface and adding the algorithm to the new processing object, wherein the application uses the new processing object to identify a portion of the image that is to be highlighted on the graphical user interface.
2. The system of
3. The system of
4. The system of
5. The system of
6. The system of
7. The system of
8. The system of
9. The system of
10. The system of
11. The system of
12. The system of
13. The system of
14. The system of
15. The system of
17. The system of
18. The system of
19. The system of
20. The system of
22. The method of
configuring parameters of the algorithm by selecting from among available parameters displayed in a selection list in a third dialog element.
23. The method of
24. The method of
This application is a continuation of, and claims priority under 35 U.S.C. §120 from, nonprovisional U.S. patent application Ser. No. 10/687,477, now U.S. Pat. No. 7,146,380, entitled “Extracting Information from Input Data Using a Semantic Cognition Network,” filed on Oct. 15, 2003. Application Ser. No. 10/687,477 in turn is a continuation of, and claims the benefit under 35 U.S.C. §119 from, German Application No. 102 48 013.3, filed on Oct. 15, 2002, in Germany. The subject matter of each of the foregoing documents is incorporated herein by reference.
The present invention relates generally to computer-implemented methods for extracting information from input data and, more specifically, to such methods employing semantic cognition networks.
Semantic networks are known formalisms for knowledge representation in the field of artificial intelligence. A semantic network includes semantic units and linking objects. The linking objects link respective semantic units and define the type of the link between them. In such a network, however, it is not possible to expand, delete or amend the knowledge that is present in the semantic units and the linking objects.
WO 01/45033 A1 discloses a computer-implemented method for processing data structures using a semantic network. Processing objects comprising algorithms and execution controls act on the semantic units to which they are linked. Processing objects can be linked to a class object and can thereby perform local adaptive processing. The processing objects can use a plurality of algorithms.
According to that document, a new approach is used for object-oriented data analysis and especially picture analysis. The main difference from pixel-oriented picture analysis is that object primitives are classified and the classification is used in further segmentation steps. These object primitives are generated during segmentation of the picture. For this purpose a so-called multi-resolution segmentation can be performed. Multi-resolution segmentation segments a picture into a network of homogeneous picture regions at each resolution selected by the user. The object primitives represent picture information in an abstract form.
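The way object primitives carry abstract picture information can be illustrated with a small sketch. The following Python toy is not the multi-resolution segmentation of WO 01/45033 A1; the function name segment_grid, the block-based grouping and the recorded features are illustrative assumptions. It merely produces object primitives at two user-selected resolutions from a grayscale image.

```python
# Toy sketch of generating "object primitives" at several resolutions.
# Block-based grouping stands in for the real region-merging procedure;
# the names and features below are assumptions, not the patented method.
import numpy as np

def segment_grid(image, block_size):
    """Group pixels into square blocks and return one primitive per block."""
    primitives = []
    h, w = image.shape
    for top in range(0, h, block_size):
        for left in range(0, w, block_size):
            block = image[top:top + block_size, left:left + block_size]
            primitives.append({
                "bbox": (top, left, block.shape[0], block.shape[1]),
                "mean_intensity": float(block.mean()),  # abstract picture information
                "area": int(block.size),
            })
    return primitives

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    picture = rng.integers(0, 256, size=(64, 64)).astype(float)
    # Two user-selected resolutions: coarse and fine object primitives.
    coarse = segment_grid(picture, block_size=16)
    fine = segment_grid(picture, block_size=4)
    print(len(coarse), "coarse primitives,", len(fine), "fine primitives")
```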
As classified information carriers within a picture object network, such object primitives, as well as other picture objects derived from them, offer several advantages over classified pixels.
In general, the semantic network comprises two essential components: a data object network, such as a picture object network, and a class object network. Besides multi-resolution segmentation, a so-called classification-based segmentation can also be performed.
As mentioned above, the processing objects can be linked to class objects and therefore knowledge which is present in the semantic network can be expanded, deleted or amended by using the processing objects.
However, several problems remain. The processing objects perform local adaptive processing in the semantic network. The important aspects of local adaptive processing are not only analyzing and modifying objects but also navigating through the semantic network along the linking objects. This navigation aspect, however, is not covered by the aforementioned method.
In one embodiment, a method extracts information from input data by mapping the input data into a data object network. The input data is represented by semantic units. The method uses a semantic cognition network comprised of the data object network, a class object network and a processing object network. The semantic cognition network uses a set of algorithms to process the semantic units. The semantic cognition network defines a processing object in the processing object network by selecting a data domain in the data object network, a class domain in the class object network and an algorithm from the set of algorithms. The processing object comprises the data domain, the class domain and the algorithm. The processing object is used in the processing object network to process the semantic units.
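How a processing object bundles a data domain, a class domain and an algorithm can be sketched with simple data structures. The class names (DataObject, ClassObject, ProcessingObject), their fields and the toy brightness classifier below are assumptions made for illustration only, not the structures of the actual implementation.

```python
# Minimal sketch of a semantic cognition network, under assumed names.
from dataclasses import dataclass, field
from typing import Callable, List

@dataclass
class DataObject:                      # node of the data object network
    name: str
    features: dict = field(default_factory=dict)
    sub_objects: List["DataObject"] = field(default_factory=list)

@dataclass
class ClassObject:                     # node of the class object network
    name: str

@dataclass
class ProcessingObject:                # node of the processing object network
    data_domain: List[DataObject]      # selected subset of the data object network
    class_domain: List[ClassObject]    # selected subset of the class object network
    algorithm: Callable[[DataObject, List[ClassObject]], None]

    def execute(self) -> None:
        # Apply the chosen algorithm to every semantic unit in the data domain.
        for data_object in self.data_domain:
            self.algorithm(data_object, self.class_domain)

def classify_by_brightness(obj: DataObject, classes: List[ClassObject]) -> None:
    # Illustrative algorithm: assign the first class if the object is bright.
    obj.features["class"] = classes[0].name if obj.features.get("mean", 0) > 128 else classes[1].name

# Usage: one processing object acting on two data objects.
bright, dark = ClassObject("bright"), ClassObject("dark")
objs = [DataObject("a", {"mean": 200.0}), DataObject("b", {"mean": 40.0})]
ProcessingObject(objs, [bright, dark], classify_by_brightness).execute()
print([o.features["class"] for o in objs])   # ['bright', 'dark']
```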
In another embodiment, a system extracts information from input data using a semantic cognition network.
Other embodiments and advantages are described in the detailed description below. This summary does not purport to define the invention. The invention is defined by the claims.
The accompanying drawings, where like numerals indicate like components, illustrate embodiments of the invention.
Reference will now be made in detail to some embodiments of the invention, examples of which are illustrated in the accompanying drawings.
Both a computer-implemented method and a system are disclosed for extracting information from input data. The system can be implemented both on a single computer and on a distributed network of computers such as a local area network (LAN) or a wide area network (WAN). The constituents of a semantic cognition network, also simply called a semantic network, may be implemented in both a centralized and a decentralized form of a WAN. As the structure of distributed networks of computers upon which the system is implemented is commonly known in the art, a detailed description of such distributed networks of computers is omitted here.
In the exemplary embodiment described below, a system is described that processes image data. In other embodiments, however, other types of data are processed. For example, other embodiments process data structures provided in a topological context. In addition, yet other embodiments can process data structures in the form of audio data, text data or statistically acquired data (data mining).
With regard to the terms “semantic network”, “semantic unit”, “linking object” and “processing object” as used in this application reference is made to WO 01/45033 A1 which is fully incorporated herein by reference.
The processing object network 1 comprises a plurality of processing objects 2. The data object network 4 comprises a plurality of data objects 5. A predetermined data domain 6 is chosen from among the plurality of data objects 5, depending upon the given situation as is described below. The class object network 7 comprises a plurality of class objects 8. A predetermined class domain 9 is chosen from among the plurality of class objects 8, depending upon the given situation as is described below. The set of algorithms 10 comprises a plurality of algorithms 11. A predetermined algorithm 11 is chosen from among the set of algorithms 10, depending upon the given situation as is described below.
A processing object 2 is an object that performs an analysis operation, such as a picture analysis operation, within a given project.
Due to the different domains and algorithms that can be selected depending on the given application, processing objects 2 can selectively act on predetermined data objects 5 and on predetermined class objects 8 by using a predetermined algorithm 11. For example, a predetermined segmentation algorithm is applied only to sub-data objects of a selected data object 5 within the data object network 4.
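A minimal sketch of this kind of domain restriction is shown below; the nested-dict layout and the stand-in segmentation helper split_in_half are assumptions, and the only point illustrated is that the algorithm acts solely on the sub-data objects of the selected data object.

```python
# Sketch of a domain-restricted operation: a (stand-in) segmentation step is
# applied only to the sub-data objects of one selected data object.

def split_in_half(data_object):
    """Toy 'segmentation': replace a leaf object by two half-sized children."""
    area = data_object["area"]
    data_object["children"] = [
        {"name": data_object["name"] + "/0", "area": area // 2, "children": []},
        {"name": data_object["name"] + "/1", "area": area - area // 2, "children": []},
    ]

# A tiny data object network with two top-level scenes.
network = [
    {"name": "scene_1", "area": 100, "children": [
        {"name": "region_a", "area": 60, "children": []},
        {"name": "region_b", "area": 40, "children": []},
    ]},
    {"name": "scene_2", "area": 80, "children": [
        {"name": "region_c", "area": 80, "children": []},
    ]},
]

selected = network[0]                   # the user picks one data object
data_domain = selected["children"]      # domain: its sub-data objects only

for obj in data_domain:                 # the algorithm never touches scene_2
    split_in_half(obj)

print([child["name"] for obj in data_domain for child in obj["children"]])
# ['region_a/0', 'region_a/1', 'region_b/0', 'region_b/1']
```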
Information relating to the processing object network 1 appears in the window with a tab labeled “Process”. Such information includes processing objects 2 and related linking objects representing the execution control. Information relating to class object network 7 appears in the window labeled “Class Hierarchy”. Such information includes the class objects 8 and related linking objects representing a semantic grouping of the class objects 8.
A dialog element labeled “Loops & Cycles” can be used to repeat the respective processing object 2 a certain number of times or until a stable state is reached. The large selection list on the right side shows all class objects 8 of the class object network 7. The user selects class objects 8 to specify the class domain 9.
Furthermore, it is also possible to navigate through the data object network 4 by using different data domains 6 in linked processing objects 2. For example, first the data domain 6 labeled “image object level 2” is selected to address all data objects 5 on a certain hierarchical level. Second, the data domain 6 labeled “sub object (1)” is selected to address all data objects 5 that are sub-data objects of a data object 5 addressed before and processed by a super-ordinate processing object. Third, the data domain 6 labeled “neighbour objects (0)” is selected to address all data objects 5 neighboring the data object 5 addressed before and processed by a super-ordinate processing object.
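These navigation-style data domains could be pictured as small selector functions, as in the following sketch. The object layout (level, sub_objects and neighbours fields) and the function names are assumed for illustration; the sketch only shows how chained domains address a hierarchy level, then sub-objects, then neighbors.

```python
# Sketch of navigating a data object network with chained data domains.
# The object layout (level / sub_objects / neighbours fields) is an assumption.
from dataclasses import dataclass, field
from typing import List

@dataclass
class DataObject:
    name: str
    level: int
    sub_objects: List["DataObject"] = field(default_factory=list)
    neighbours: List["DataObject"] = field(default_factory=list)

def domain_image_object_level(all_objects, level):
    """'image object level 2'-style domain: every object on one hierarchy level."""
    return [o for o in all_objects if o.level == level]

def domain_sub_objects(current):
    """'sub object (1)'-style domain: sub-objects of the object processed above."""
    return list(current.sub_objects)

def domain_neighbour_objects(current):
    """'neighbour objects (0)'-style domain: neighbours of the object processed above."""
    return list(current.neighbours)

# Tiny network: two level-2 objects with sub-objects; a and b are neighbours.
a1, a2 = DataObject("a1", 1), DataObject("a2", 1)
b1 = DataObject("b1", 1)
a = DataObject("a", 2, sub_objects=[a1, a2])
b = DataObject("b", 2, sub_objects=[b1])
a.neighbours, b.neighbours = [b], [a]
all_objects = [a, b, a1, a2, b1]

# Chained processing objects: address level 2, then drill into sub-objects,
# then step sideways to neighbours of each level-2 object.
for parent in domain_image_object_level(all_objects, level=2):
    print(parent.name, "subs:", [s.name for s in domain_sub_objects(parent)],
          "neighbours:", [n.name for n in domain_neighbour_objects(parent)])
```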
The dialog element labeled “Candidates” is used to specify the classification of the two neighboring data objects 5. The dialog element labeled “Fitting function” allows the user to define an optimization criterion composed of a data object feature, here “Elliptic Fit”, which measures a property of the two neighboring data objects 5 and of the data object 5 generated by merging them, in this case the closeness to an ellipse. However, any other property can also be used. The optimization criterion is furthermore composed of a weighted sum of the property values of the two neighboring data objects 5 and of the data object 5 generated by merging them.
Radio buttons “Minimize” and “Maximize” are used to select whether the weighted sum of property values should be minimized or maximized. Finally, the resulting weighted sum is compared with a fitting threshold, which can also be selected by the user, to determine whether the optimization criterion is fulfilled.
If the optimization criterion is not fulfilled, the two neighboring data objects 5 are not merged, and processing is completed. If the optimization criterion is fulfilled, a mode for the fitting, such as “best fitting”, which can also be selected by the user, is checked. If the mode for the fitting is fulfilled, the two neighboring data objects 5 are merged. If the mode for the fitting is not fulfilled, the process proceeds with one of the two neighboring data objects 5 and another data object 5 neighboring that one. The process mentioned above can be repeated until no further merging of data objects 5 occurs or until a certain number of cycles has been executed. This process is an example of an important type of special processing object.
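A simplified sketch of this merge loop is given below. The names (fit_score, weighted_sum, merge_pass), the 1-D intensity segments, and the concrete weights and threshold are assumptions; only the “Minimize” branch is implemented, and a greedy left-to-right scan stands in for the full “best fitting” search.

```python
# Simplified sketch of the fitting-driven merge loop, under assumed names.
# Objects are 1-D intensity segments; "fit" is a stand-in homogeneity score.

def fit_score(values):
    """Stand-in property: spread of intensities (lower = more homogeneous)."""
    return max(values) - min(values)

def weighted_sum(obj_a, obj_b, merged, weights=(0.25, 0.25, 0.5)):
    """Optimization criterion: weighted sum of the property values of the two
    neighbouring objects and of the object obtained by merging them."""
    wa, wb, wm = weights
    return wa * fit_score(obj_a) + wb * fit_score(obj_b) + wm * fit_score(merged)

def merge_pass(segments, fitting_threshold=20.0):
    """One cycle: try to merge each segment with its right-hand neighbour."""
    i, merged_any = 0, False
    while i < len(segments) - 1:
        candidate = segments[i] + segments[i + 1]
        score = weighted_sum(segments[i], segments[i + 1], candidate)
        # "Minimize" mode: the criterion is fulfilled if the score stays
        # below the user-selected fitting threshold.
        if score < fitting_threshold:
            segments[i:i + 2] = [candidate]      # merge the two neighbours
            merged_any = True
        else:
            i += 1                               # proceed with the next neighbour
    return merged_any

segments = [[10, 12], [11, 13], [90, 95], [92], [50]]
cycles = 0
while merge_pass(segments) and cycles < 10:      # repeat until stable or cycle limit
    cycles += 1
print(segments)   # [[10, 12, 11, 13], [90, 95, 92], [50]]
```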
Thereafter, a morphological operation 13 is performed using the algorithm 11 to combine the semantic units in the data domain 6 with the semantic units in the additional data domain 12 to create secondary semantic units 14. One of the secondary semantic units 14 is compared with the best-fitting one of the semantic units in the additional class domain 16 (see arrow 15), and that secondary semantic unit 14 is accepted or rejected (see arrow 17) according to the fitting, thereby forming a tertiary semantic unit 20.
Thereafter, the aforementioned morphological operation 13 and the steps of comparing and accepting or rejecting (see arrow 17) are repeated until every tertiary semantic unit 20 has been removed from further processing. The aforementioned process can be performed such that each step is performed multiple times, as shown by the circle 18 labeled “Iterations during evolution”. Furthermore, multiple special processing objects can be linked together before the morphological operation is performed, so that an execution control is formed based on how the multiple processing objects are linked.
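The grow-compare-accept cycle described above might be sketched as follows. The helper names (class_fit, evolve), the linear class-membership function and the example class descriptions are assumptions used only to illustrate combining units, comparing each candidate with the best-fitting class, and accepting or rejecting it over several iterations.

```python
# Sketch of the grow-compare-accept loop (an assumed, simplified reading of the
# morphological "evolution" step): units from one domain are combined with units
# from a second domain, each candidate is compared against the best-fitting
# class description, and it is accepted or rejected accordingly.

def class_fit(mean_value, class_descriptions):
    """Return (class_name, fit) of the best-fitting class for a mean intensity.
    Fit is 1 at the class centre and falls off linearly (assumed membership)."""
    best_name, best_fit = None, -1.0
    for name, centre, tolerance in class_descriptions:
        fit = max(0.0, 1.0 - abs(mean_value - centre) / tolerance)
        if fit > best_fit:
            best_name, best_fit = name, fit
    return best_name, best_fit

def evolve(seed_values, candidate_values, class_descriptions, accept_threshold=0.6):
    """Iteratively combine the seed unit with candidate units (the 'morphological'
    growth) and keep only combinations that still fit a class well enough."""
    region = list(seed_values)
    remaining = list(candidate_values)
    changed = True
    while changed and remaining:          # iterations during evolution
        changed = False
        for value in list(remaining):
            secondary = region + [value]                      # secondary unit
            _, fit = class_fit(sum(secondary) / len(secondary), class_descriptions)
            if fit >= accept_threshold:                       # accept -> tertiary unit
                region = secondary
                remaining.remove(value)
                changed = True
            # otherwise the candidate is rejected and left for later iterations
    return region

classes = [("nucleus", 200.0, 60.0), ("background", 30.0, 40.0)]
print(evolve(seed_values=[205, 198], candidate_values=[190, 60, 210],
             class_descriptions=classes))   # [205, 198, 190, 210]
```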
There is additionally the possibility to define a super-ordinate special processing object by selecting a special data domain in a special processing object network, the class domain 9 in the class object network 7 and the algorithm 11 in the set of algorithms 10. The super-ordinate special processing object comprises the special data domain, the class domain 9 and the algorithm 11. The special processing object is compared to the semantic units in the class domain 9. Finally, the special processing object is removed from further processing if it fulfills a predetermined criterion.
By means of the aforementioned additional process, it is possible to observe the behavior over time of each processed object and to use this behavior for classification and other processing of the observed objects.
As already mentioned above, the present invention can be applied to data objects in general. However, one important application of the present invention is to picture data, in order to classify pictures. One specific application is to picture data in the field of life science, where the pictures are microscopic pictures taken from tissue samples, from suitably stained tissue samples, or from living or fixed cells, and microscopic pictures taken with fluorescent microscopes or scanners, etc. Another important application of the present invention is to picture data in the field of geographical information extraction using satellite, airborne or other pictures.
Although the present invention has been described in connection with certain specific embodiments for instructional purposes, the present invention is not limited thereto. Accordingly, various modifications, adaptations, and combinations of various features of the described embodiments can be practiced without departing from the scope of the invention as set forth in the claims.
Binnig, Gerd, Athelogou, Maria, Schaepe, Arno, Benz, Ursula, Krug, Christof
Patent | Priority | Assignee | Title |
5537485, | Jul 21 1992 | Arch Development Corporation | Method for computer-aided detection of clustered microcalcifications from digital mammograms |
5671353, | Feb 16 1996 | CARESTREAM HEALTH, INC | Method for validating a digital imaging communication standard message |
5778378, | Apr 30 1996 | MICHIGAN, REGENTS OF THE UNIVERSITY OF, THE | Object oriented information retrieval framework mechanism |
5872859, | Nov 02 1995 | University of Pittsburgh | Training/optimization of computer aided detection schemes based on measures of overall image quality |
6018728, | Feb 09 1996 | SRI International | Method and apparatus for training a neural network to learn hierarchical representations of objects and to detect and classify objects with uncertain training data |
6058322, | Jul 25 1997 | Arch Development Corporation | Methods for improving the accuracy in differential diagnosis on radiologic examinations |
6075878, | Nov 28 1997 | Arch Development Corporation | Method for determining an optimally weighted wavelet transform based on supervised training for detection of microcalcifications in digital mammograms |
6075879, | Sep 29 1993 | Hologic, Inc; Biolucent, LLC; Cytyc Corporation; CYTYC SURGICAL PRODUCTS, LIMITED PARTNERSHIP; SUROS SURGICAL SYSTEMS, INC ; Third Wave Technologies, INC; Gen-Probe Incorporated | Method and system for computer-aided lesion detection using information from multiple images |
6246782, | Jun 06 1997 | Lockheed Martin Corporation | System for automated detection of cancerous masses in mammograms |
6282305, | Jun 05 1998 | Arch Development Corporation | Method and system for the computerized assessment of breast cancer risk |
6320976, | Apr 01 1999 | Siemens Aktiengesellschaft | Computer-assisted diagnosis method and system for automatically determining diagnostic saliency of digital images |
6324532, | Feb 07 1997 | SRI International | Method and apparatus for training a neural network to detect objects in an image |
6389305, | Dec 15 1993 | Lifeline Biotechnologies, Inc. | Method and apparatus for detection of cancerous and precancerous conditions in a breast |
6453058, | Jun 07 1999 | Siemens Aktiengesellschaft | Computer-assisted diagnosis method using correspondence checking and change detection of salient features in digital images |
6574357, | Sep 29 1993 | Computer-aided diagnosis method and system | |
6625303, | Feb 01 1999 | CARESTREAM HEALTH, INC | Method for automatically locating an image pattern in digital images using eigenvector analysis |
6650766, | Aug 28 1997 | ICAD, INC | Method for combining automated detections from medical images with observed detections of a human interpreter |
6763128, | Jun 23 1999 | ICAD, INC | Method for analyzing detections in a set of digital images using case based normalcy classification |
6801645, | Jun 23 1999 | ICAD, INC | Computer aided detection of masses and clustered microcalcifications with single and multiple input image context classification strategies |
6937776, | Jan 31 2003 | University of Chicago | Method, system, and computer program product for computer-aided detection of nodules with three dimensional shape enhancement filters |
6944603, | Jun 24 2000 | IBM Corporation | Fractal semantic network generator |
6970587, | Aug 28 1997 | ICAD, INC | Use of computer-aided detection system outputs in clinical practice |
20020188436, | |||
20030115175, | |||
20040006765, | |||
20040148296, | |||
DE19908204, | |||
WO145033, |