This invention provides a system and method for automating the setup of locators and detectors within an image view of an object on the HMI of a vision detector. After a user-selected operating point is established on the image view, such as by clicking a GUI cursor, the system determines detectable edges and best-fits the locators and detectors to a location on the object image view. In this manner, the initial placement and sizing of the graphical elements for locator and detector ROIs are relatively optimized without excessive adjustment by the user. Locators can be selected for direction, including machine or line-movement direction, cross direction, or an angled direction transverse to the cross direction and movement direction. Detectors can be selected based upon particular analysis tools, including brightness tools, contrast tools and trained templates. The locators and detectors are each associated with a particular set of operating parameters, such as activation threshold, which are displayed in a control box within the GUI (and can be accessed by clicking on the specific locator or detector). A parameter bar can also be provided adjacent to the depiction of the detector on the image view for easy reference. Both locators and detectors may be manually readjusted by drag-and-drop techniques once automatically placed and sized.
46. A method for placing at least one of a locator and a detector on a graphical user interface (GUI) display, the method comprising one or more processors performing the steps of:
displaying an image view of an object derived from a vision sensor on the GUI display;
performing an edge detection process that identifies detectable edges in the image view and stores edge information;
selecting at least one of (a) a locator and (b) a detector for placement on the image view; and
using the edge information to automatically place the selected at least one of the (a) locator and (b) detector at a position on the image view with a size that is determined based upon the edge information.
34. A system for placing at least one of a locator and a detector on a graphical user interface (GUI) display, the system comprising one or more processors performing the steps of:
providing an image view of an object derived from a vision sensor on the GUI display;
performing an edge detection process that identifies detectable object edges in the image view and stores edge information;
enabling selection of at least one of (a) a locator and (b) a detector for placement on the image view; and
automatically using the edge information to place the selected at least one of the (a) locator and (b) detector at a position on the image view with a size that is determined based upon edge information.
17. A method for placing and sizing on a graphical user interface (GUI) display at least one of locators and detectors, comprising one or more processors implementing the steps of:
displaying a GUI screen image view of an object derived from a vision sensor having a field of view in which the object is in relative motion thereto, and a plurality of captured image frames of the object within the field of view, the image view being accessible by a GUI cursor;
determining and analyzing detectable edges in the screen image view and storing edge information;
selecting either (a) a locator or (b) a detector based upon a predetermined analysis tool for placement on the image view; and
placing automatically, using the edge information, the selected (a) locator or (b) detector at a position on the image view upon which the cursor points with a size that is determined based upon a location of adjacent edges of the object image view.
1. A system for placing and sizing on a graphical user interface (GUI) display at least one of locators and detectors, the system comprising one or more processors implementing:
a GUI screen image view of an object derived from a vision sensor having a field of view in which the object is in relative motion thereto, and a plurality of captured image frames of the object within the field of view, the image view being accessible by a GUI cursor;
an edge detection process that determines and analyzes detectable edges in the screen image view and stores edge information;
a selector that allows a user to select either (a) a locator or (b) a detector based upon a predetermined analysis tool for placement on the image view; and
an automatic placement process that uses the edge information to place the selected (a) locator or (b) detector at a position on the image view upon which the cursor points with a size that is determined based upon a location of adjacent edges of the object image view.
2. The system as set forth in
3. The system as set forth in
4. The system as set forth in
5. The system as set forth in
6. The system as set forth in
7. The system as set forth in
8. The system as set forth in
9. The system as set forth in
10. The system as set forth in
11. The system as set forth in
12. The system as set forth in
13. The system as set forth in
14. The system as set forth in
15. The system as set forth in
16. The system as set forth in
18. The method as set forth in
19. The method as set forth in
20. The method as set forth in
21. The method as set forth in
22. The method as set forth in
23. The method as set forth in
24. The method as set forth in
25. The method as set forth in
26. The method as set forth in claim 18 wherein the threshold for activating the locator is automatically determined by computing a threshold value based upon a magnitude value relative to the nearest adjacent edge.
27. The method as set forth in claim 18 further comprising selectively displaying the operating parameters in a control box by operating the cursor upon the locator.
28. The method as set forth in
29. The method as set forth in
30. The method as set forth in
31. The method as set forth in
32. The method as set forth in
33. The system as set forth in
35. The system as set forth in claim 34 wherein the step of using the edge information to place one of a locator and a detector on the image view with a size based upon edge information includes determining the size based upon a location of adjacent edges of the object in the screen image.
36. The system as set forth in claim 34 wherein the step of placing one of a locator and a detector includes placing a locator on the image view relative to a nearest adjacent edge of the image view and adjusting the locator so as to avoid a stronger-magnitude more-distant edge.
37. The system as set forth in claim 36 wherein the step of placing one of a locator and a detector includes sizing a width of the locator so as to avoid the stronger-magnitude more-distant edge.
38. The system as set forth in claim 36 wherein the locator includes a height based upon a line segment fit within a predetermined deviation away from the nearest adjacent edge.
39. The system as set forth in claim 38 wherein the line segment is oriented at a relative angle with respect to a vertical and a horizontal axis within the image view so as to cause the line segment to fit with minimum deviation from the nearest adjacent edge.
40. The system as set forth in claim 36 wherein the locator is selected to be oriented with respect to a direction of relative motion of the object within a field of view of the vision sensor.
41. The system as set forth in claim 36 wherein the locator is selected to be oriented with respect to (a) a direction of relative motion of the object within a field of view of the vision sensor, and at least one of (b) a direction transverse to the direction of relative motion and (c) a direction at an angle between (a) and (b).
42. The system as set forth in claim 36 wherein the operating parameters are selectively displayed in a control box by operating the cursor upon the locator.
43. The system as set forth in claim 34 wherein the detector is adapted to be selected based upon at least one of brightness, contrast and a trained template.
44. The system as set forth in claim 34 wherein the step of placing one of a locator and a detector includes placing the detector on the image view relative to the position at which a cursor points so that a relative center of the detector is at the position at which the cursor points and an outer boundary of the detector extends to a location that is within detected edges of the object in the image view.
45. The system as set forth in claim 44 wherein the operating parameters are selectively displayed in a control box by operating the cursor upon the detector.
47. The method as set forth in claim 46 wherein the step of using the edge information to place includes placing the locator on the image view relative to a nearest adjacent edge of the image view and adjusting the locator so as to avoid a stronger-magnitude more-distant edge.
48. The method as set forth in claim 47 wherein the step of using the edge information to place includes sizing a width of the locator so as to avoid the stronger-magnitude more-distant edge.
49. The method as set forth in claim 47 wherein the locator includes a height based upon a line segment fit within a predetermined deviation away from the nearest adjacent edge.
50. The method as set forth in claim 49 wherein the line segment is oriented at a relative angle with respect to a vertical and a horizontal axis within the image view so as to cause the line segment to fit with minimum deviation from the nearest adjacent edge.
51. The method as set forth in claim 47 wherein the locator is selected to be oriented with respect to the direction of relative motion of the object in a field of view of the vision sensor.
52. The method as set forth in claim 47 wherein the threshold for activating the locator is automatically determined by computing a threshold value based upon a magnitude value relative to the nearest adjacent edge.
53. The method as set forth in claim 47 further comprising selectively displaying the operating parameters in the control box by operating a cursor upon the locator.
54. The method as set forth in claim 46 wherein the step of using the edge information to place includes placing the detector on the image view relative to the position at which a cursor points so that a relative center of the detector is at the position at which the cursor points and an outer boundary of the detector extends to a location that is within detected edges of the object within the image view.
This application is related to copending and commonly assigned U.S. patent application Ser. No. 10/865,155, entitled METHOD AND APPARATUS FOR VISUAL DETECTION AND INSPECTION OF OBJECTS, by William M. Silver, filed Jun. 9, 2004, the teachings of which are expressly incorporated herein by reference.
This invention relates to automated detection and inspection of objects being manufactured on a production line, and more particularly to setup systems and methods for such automated detection and inspection.
Industrial manufacturing relies on automatic inspection of objects being manufactured. One form of automatic inspection that has been in common use for decades is based on optoelectronic technologies that use electromagnetic energy, usually infrared or visible light, photoelectric sensors, and some form of electronic decision making.
One well-known form of optoelectronic automatic inspection uses an arrangement of photodetectors. A typical photodetector has a light source and a single photoelectric sensor that responds to the intensity of light that is reflected by a point on the surface of an object, or transmitted along a path that an object may cross. A user-adjustable sensitivity threshold establishes a light intensity above which (or below which) an output signal of the photodetector will be energized.
One photodetector, often called a gate, is used to detect the presence of an object to be inspected. Other photodetectors are arranged relative to the gate to sense the light reflected by appropriate points on the object. By suitable adjustment of the sensitivity thresholds, these other photodetectors can detect whether certain features of the object, such as a label or hole, are present or absent. A decision as to the status of the object (for example, pass or fail) is made using the output signals of these other photodetectors at the time when an object is detected by the gate. This decision is typically made by a programmable logic controller (PLC), or other suitable electronic equipment.
Automatic inspection using photodetectors has various advantages. Photodetectors are inexpensive, simple to set up, and operate at very high speed (outputs respond within a few hundred microseconds of the object being detected, although a PLC will take longer to make a decision).
Automatic inspection using photodetectors has various disadvantages, however, including:
Another well-known form of optoelectronic automatic inspection uses a device that can capture a digital image of a two-dimensional field of view (FOV) in which an object to be inspected is located, and then analyze the image and make decisions. Such a device is usually called a machine vision system, or simply a vision system. The image is captured by exposing a two-dimensional array of photosensitive elements for a brief period, called the integration or shutter time, to light that has been focused on the array by a lens. The array is called an imager and the individual elements are called pixels. Each pixel measures the intensity of light falling on it during the shutter time. The measured intensity values are then converted to digital numbers and stored in the memory of the vision system to form the image, which is analyzed by a digital processing element such as a computer, using methods well-known in the art to determine the status of the object being inspected.
In some cases the objects are brought to rest in the field of view, and in other cases the objects are in continuous motion through the field of view. An event external to the vision system, such as a signal from a photodetector, or a message from a PLC, computer, or other piece of automation equipment, is used to inform the vision system that an object is located in the field of view, and therefore an image should be captured and analyzed. Such an event is called a trigger.
Machine vision systems avoid the disadvantages associated with using an arrangement of photodetectors. They can analyze patterns of brightness reflected from extended areas, easily handle many distinct features on the object, accommodate line changeovers through software systems and/or processes, and handle uncertain and variable object locations.
Machine vision systems have disadvantages compared to an arrangement of photodetectors, including:
Machine vision systems have limitations that arise because they make decisions based on a single image of each object, located in a single position in the field of view (each object may be located in a different and unpredictable position, but for each object there is only one such position on which a decision is based). This single position provides information from a single viewing perspective, and a single orientation relative to the illumination. The use of only a single perspective often leads to incorrect decisions. It has long been observed, for example, that a change in perspective of as little as a single pixel can in some cases change an incorrect decision to a correct one. By contrast, a human inspecting an object usually moves it around relative to his eyes and the lights to make a more reliable decision.
Also, the limitations of machine vision systems arise in part because they operate too slowly to capture and analyze multiple perspectives of objects in motion, and too slowly to react to events happening in the field of view. Since most vision systems can capture a new image simultaneously with analysis of the current image, the maximum rate at which a vision system can operate is determined by the larger of the capture time and the analysis time. Overall, one of the most significant factors in determining this rate is the number of pixels comprising the imager.
The availability of new low-cost imagers, such as the LM9630 from National Semiconductor of Santa Clara, Calif., that operate at a relatively low resolution (approximately 100×128 pixels), high frame rate (up to 500 frames per second) and high sensitivity allowing short shutter times with inexpensive illumination (e.g., 300 microseconds with LED illumination), has made possible the implementation of a novel vision detector that employs on-board processors to control machine vision detection and analysis functions. A novel vision detector using such an imager, and an overall inspection system employing such a vision detector, is taught in copending and commonly assigned U.S. patent application Ser. No. 10/865,155, entitled METHOD AND APPARATUS FOR VISUAL DETECTION AND INSPECTION OF OBJECTS, by William M. Silver, filed Jun. 9, 2004, the teachings of which are expressly incorporated herein by reference (herein also termed the "above-incorporated-by-reference METHOD AND APPARATUS").
An advantage to the above-incorporated-by-reference detection and inspection METHOD AND APPARATUS is that the vision detector can be implemented within a compact housing that is programmed using a PC or other Human-Machine Interface (HMI) device (via, for example, a Universal Serial Bus (USB)), and is then deployed to a production line location for normal runtime operation. The outputs of the apparatus are (in one implementation) a pair of basic High/Low lines indicating detection of the object and whether that object passes or fails based upon the characteristics being analyzed. These outputs can be used (for example) to reject a failed object using a rejection arm mounted along the line that is signaled by the apparatus' output.
By way of example,
In an alternate example, the vision detector 100 sends signals to a PLC for various purposes, which may include controlling a reject actuator. In another exemplary implementation, suitable in extremely high-speed applications or where the vision detector cannot reliably detect the presence of an object, a photodetector is used to detect the presence of an object and sends a signal to the vision detector for that purpose. In yet another implementation, there are no discrete objects, but rather material flows past the vision detector continuously—for example a web. In this case the material is inspected continuously, and signals are sent by the vision detector to automation equipment, such as a PLC, as appropriate.
Basic to the function of the vision detector 100 in the above-incorporated-by-reference METHOD AND APPARATUS is the ability to exploit the abilities of the imager's quick-frame-rate and low-resolution image capture to allow a large number of image frames of an object passing down the line to be captured and analyzed in real-time. Using these frames, the apparatus' on-board processor can decide when the object is present and use location information to analyze designated areas of interest on the object that must be present in a desired pattern for the object to “pass” inspection.
With brief reference to
Boxes labeled “c”, such as box 220, represent image capture by the vision detector 100. Boxes labeled “a”, such as box 230, represent image analysis. It is desirable that capture “c” of the next image be overlapped with analysis “a” of the current image, so that (for example) analysis step 230 analyzes the image captured in capture step 220. In this timeline, analysis is shown as taking less time than capture, but in general analysis will be shorter or longer than capture depending on the application details. If capture and analysis are overlapped, the rate at which a vision detector can capture and analyze images is determined by the longer of the capture time and the analysis time. This is the “frame rate”. The above-incorporated-by-reference METHOD AND APPARATUS allows objects to be detected reliably without a trigger signal, such as that provided by a photodetector.
Each analysis step “a” first considers the evidence that an object is present. Frames where the evidence is sufficient are called active. Analysis steps for active frames are shown with a thick border, for example analysis step 240. In an exemplary implementation, inspection of an object begins when an active frame is found, and ends when some number of consecutive inactive frames are found. In the example of
At the time that inspection of an object is complete, for example at the end of analysis step 248, decisions are made on the status of the object based on the evidence obtained from the active frames. In an exemplary implementation, if an insufficient number of active frames were found then there is considered to be insufficient evidence that an object was actually present, and so operation continues as if no active frames were found. Otherwise an object is judged to have been detected, and evidence from the active frames is judged in order to determine its status, for example pass or fail. A variety of methods may be used to detect objects and determine status within the scope of this example; some are described below and many others will occur to those skilled in the art. Once an object has been detected and a judgment made, a report may be made to appropriate automation equipment, such as a PLC, using signals well-known in the art. In such a case a report step would appear in the timeline. The example of
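The episode logic described above (begin inspection at the first active frame, end after a run of consecutive inactive frames, and discard episodes with too few active frames) can be sketched as follows. This is a minimal illustration; the function name, thresholds, and evidence representation are assumptions, not taken from the patent.

```python
# Sketch: detect "object present" episodes from per-frame evidence values.
# An episode starts at the first active frame and ends after a run of
# consecutive inactive frames; episodes with too few active frames are dropped.

def detect_objects(frame_evidence, active_threshold=0.5,
                   end_after_inactive=3, min_active_frames=2):
    """Return a list of (start, end) frame-index pairs, one per detected object."""
    objects = []
    start = None          # index of first active frame of current episode
    active_count = 0      # active frames seen in current episode
    inactive_run = 0      # consecutive inactive frames seen
    for i, ev in enumerate(frame_evidence):
        active = ev >= active_threshold
        if start is None:
            if active:
                start, active_count, inactive_run = i, 1, 0
        else:
            if active:
                active_count += 1
                inactive_run = 0
            else:
                inactive_run += 1
                if inactive_run >= end_after_inactive:
                    # Episode over: keep it only if enough active frames were seen.
                    if active_count >= min_active_frames:
                        objects.append((start, i - inactive_run))
                    start, active_count, inactive_run = None, 0, 0
    if start is not None and active_count >= min_active_frames:
        objects.append((start, len(frame_evidence) - 1 - inactive_run))
    return objects
```

A stream with two bursts of active frames separated by inactive frames yields two detected objects, each spanning only its active frames.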
Note in particular that the report 260 may be delayed well beyond the inspection of subsequent objects such as object 110 (
Once inspection of an object is complete, the vision detector 100 may enter an idle step 280. Such a step is optional, but may be desirable for several reasons. If the maximum object rate is known, there is no need to be looking for an object until just before a new one is due. An idle step will eliminate the chance of false object detection at times when an object couldn't arrive, and will extend the lifetime of the illumination system because the lights can be kept off during the idle step.
The processor of the exemplary above-incorporated-by-reference METHOD AND APPARATUS is provided with two types of software elements to use in making its decisions: “Locators” that locate the object and “Detectors” that decide whether an object feature is present or absent. The decisions made by both Locators and Detectors are used to judge whether an object is detected and, if so, whether it passes inspection. In one example, Locators can be simply described as a one-dimensional edge detector in a region of interest. The vision detector is configured for locating objects by placing Locators at certain positions in an image where an edge feature of the object can be seen when the object is in the field of view. The Locator can be oriented with respect to the direction the object is moving, and sized to ensure that the edge feature of the object can be located at multiple positions while in the field of view. During analysis, the location of the edge feature of the object within the Locator can be reported, as well as a logical output state that the location is known.
Detectors are vision tools that operate on a region of interest that produce a logical output state that detects the presence or absence of features in an image of the object. The vision detector is configured for detecting features of an object by placing Detectors at certain positions in an image where object features can be seen when the object is located by the Locators. Various types of Detectors can be used, such as Brightness Detectors, Edge Detectors, and Contrast Detectors.
Detectors can be linked to the location of the feature determined by a Locator to further refine the presence detection and inspection of the object. Accordingly, in each frame where the object may be viewed at a different perspective, the location of the object determined by the Locator will be different, and the position of the Detectors in the image can be moved according to the location determined by the Locator. The operation of the vision detector at high frame rates, therefore permits the vision detector to capture and analyze multiple images of the object while it passes through the field of view.
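The linkage between a Locator and its Detectors amounts to translating each Detector's ROI by the offset between the object position found in the current frame and the position at setup time. A minimal sketch, with hypothetical names and a simple dictionary representation:

```python
# Sketch: move each detector ROI by the offset the Locator reports this frame,
# so detectors track the object as it moves through the field of view.

def reposition_detectors(detectors, located_xy, trained_xy):
    """detectors: {name: (x, y)} ROI centers placed at setup time.
    located_xy: object position found by the Locator in this frame.
    trained_xy: object position when the detectors were placed."""
    dx = located_xy[0] - trained_xy[0]
    dy = located_xy[1] - trained_xy[1]
    return {name: (x + dx, y + dy) for name, (x, y) in detectors.items()}
```

For example, if the object is found 5 pixels right and 2 pixels down from its setup position, every detector shifts by (5, 2) before analysis.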
The above-discussion of Locators and Detectors is further illustrated by way of example in
The Locator 320 is used to detect and locate the top edge of the object, and the Locator 322 is used to detect and locate the right edge. A Brightness Detector 330 is used to help detect the presence of the object. In this example the background is brighter than the object, and the sensitivity threshold is set to distinguish the two brightness levels, with the logic output inverted to detect the darker object and not the brighter background. Together the Locators 320 and 322, and the Brightness Detector 330, provide the evidence needed to judge that an object has been detected, as further described below. A Contrast Detector 340 is used to detect the presence of the hole 312. When the hole 312 is absent the contrast would be very low, and when present the contrast would be much higher. A Spot Detector could also be used. An Edge Detector 360 is used to detect the presence and position of the label 310. If the label 310 is absent, mis-positioned horizontally, or significantly rotated, the analog output of the Edge Detector would be very low. A Brightness Detector 350 is used to verify that the correct label has been applied. In this example, the correct label is white and incorrect labels are darker colors.
As the object (110 in
The choice of Gadgets to wire to ObjectDetect is made by a user based on knowledge of the application. In the example of
The logic output of ObjectDetect Judge 400 is wired to AND Gate 470. The logic output of ObjectPass Judge 402 is inverted (circle 403) and also wired to AND Gate 470. The ObjectDetect Judge is set to "output when done" mode, so a pulse appears on the logic output of ObjectDetect Judge 400 after an object has been detected and inspection is complete. Since the logic output of ObjectPass 402 has been inverted, this pulse will appear on the logic output of AND Gate 470 only if the object has not passed inspection. The logic output of AND Gate 470 is wired to an Output Gadget 480, named "Reject", which controls an output signal from the vision detector that can be connected directly to a reject actuator 170 (
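The reject wiring just described can be sketched with conventional fuzzy-logic operators (min for AND, complement for NOT); this convention is an assumption, since the patent does not spell out the exact gate functions here:

```python
# Sketch: fuzzy gates for the ObjectDetect/ObjectPass reject wiring.
# Convention assumed: fuzzy AND = min, fuzzy NOT = 1 - x.

def fuzzy_and(*values):
    return min(values)

def fuzzy_not(value):
    return 1.0 - value

def reject_signal(object_detect, object_pass):
    """High only when an object was detected AND it failed inspection."""
    return fuzzy_and(object_detect, fuzzy_not(object_pass))
```

A detected object that passes yields a low reject signal; a detected object that fails yields a high one.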
To aid the user's understanding of the operation of the exemplary vision detector 100, Gadgets and/or wires can change their visual appearance to indicate fuzzy logic values. For example, Gadgets and/or wires can be displayed red when the logic value is below 0.5, and green otherwise. In
where wi is the ith weight and zi is the corresponding pixel gray level. In this example, the weights approximate a Gaussian function of distance r from the center of the kernel to the center of each weight,
so that pixels near the center are weighted somewhat higher than those near the edge. One advantage of a center-weighted Brightness Detector is that if a bright feature happens to lie near the edge of the Detector's ROI, then slight variations in its position will not cause large variations in the analog output. In
In one example, b=1.0.
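A center-weighted brightness measure of this kind can be sketched as a Gaussian weight kernel combined with a weighted average of gray levels. The kernel size, the sigma parameter, and the function names below are illustrative assumptions; the patent specifies only that the weights approximate a Gaussian of distance from the kernel center.

```python
import math

# Sketch: Gaussian center-weighted brightness for a square ROI.

def gaussian_weights(size, sigma):
    """Square kernel of weights w_i = exp(-r^2 / (2 sigma^2)),
    where r is the distance from the kernel center."""
    c = (size - 1) / 2.0
    return [[math.exp(-((x - c) ** 2 + (y - c) ** 2) / (2 * sigma ** 2))
             for x in range(size)] for y in range(size)]

def weighted_brightness(roi, weights):
    """Weighted average gray level: sum(w_i * z_i) / sum(w_i)."""
    num = sum(w * z for wrow, zrow in zip(weights, roi)
              for w, z in zip(wrow, zrow))
    den = sum(w for wrow in weights for w in wrow)
    return num / den
```

Because the weights are normalized away in the average, a uniformly bright ROI reports its gray level exactly, while a bright feature near the ROI edge contributes less than one near the center.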
In another exemplary implementation, the analog output is defined by the function C(q), which is the gray level such that:
where q is a percentile chosen by a user. C is the inverse cumulative weighted distribution of gray levels. Various useful values of q are given in the following table:
q      C(q)
0.0    absolute minimum gray level in ROI
0.1    statistically reliable minimum gray level
0.5    weighted median gray level
0.9    statistically reliable maximum gray level
1.0    absolute maximum gray level
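The inverse cumulative weighted distribution C(q) can be sketched as follows: sort the pixels by gray level and return the level at which the accumulated weight fraction first reaches q. The function name and list-based representation are illustrative assumptions.

```python
# Sketch: C(q), the gray level at which the cumulative weight fraction
# of pixels (ordered by gray level) first reaches the percentile q.

def c_of_q(gray_levels, weights, q):
    """Return the gray level z such that the weight fraction of pixels
    with gray level <= z first reaches q (0 <= q <= 1)."""
    pairs = sorted(zip(gray_levels, weights))
    total = sum(weights)
    cumulative = 0.0
    for z, w in pairs:
        cumulative += w
        if cumulative / total >= q:
            return z
    return pairs[-1][0]
```

With equal weights this reduces to an ordinary percentile: q=0.0 gives the minimum gray level, q=0.5 the median, and q=1.0 the maximum, matching the table above.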
In one example of a Contrast Detector, the analog output is the standard deviation of the gray levels within the ROI. In an exemplary implementation, the array of positive weights 500 is used to compute a weighted standard deviation:
In another example, the analog output is given by
C(qhi)−C(qlo) (6)
where the q values may be chosen by the user. Useful values are qhi=0.95, qlo=0.05.
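The weighted standard deviation used for the Contrast Detector's analog output follows the standard formula; a minimal sketch over flattened gray-level and weight lists (names are illustrative):

```python
import math

# Sketch: Contrast Detector analog output as the weighted standard
# deviation of the ROI gray levels.

def weighted_std(gray_levels, weights):
    total = sum(weights)
    mean = sum(w * z for w, z in zip(weights, gray_levels)) / total
    var = sum(w * (z - mean) ** 2 for w, z in zip(weights, gray_levels)) / total
    return math.sqrt(var)
```

A uniform ROI yields zero contrast, and a two-level ROI yields half the level difference when the two levels carry equal weight.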
In the implementation of
The step kernel 600, with values ki, can be considered to be the product of an ideal step edge template ei and a kernel of positive weights wi:
Note that the ideal step edge template values ei are +1 when ki>0, corresponding to the black on white region of step kernel 600, and −1 when ki<0, corresponding to the white on black region of step kernel 600.
Define contrast C and weighted normalized correlation R2 of the step kernel and a like-shaped ROI with pixel values zi as follows:
The contrast C uses the standard formula for weighted standard deviation, and R2 uses the standard formula for weighted normalized correlation, but simplified because for step kernel 600
An orthogonal step kernel 610 with values ki′ is also created that is identical to the step kernel 600 but rotated 90 degrees. The ratio
is a reasonable estimate of the tangent of the angle between the actual and expected direction of an edge, particularly for small angles where D is also a good estimate of the angle itself. Note that an orthogonal step template 610 doesn't need to be created—the values from the step kernel 600 can be used, but corresponding to the pixels values in the ROI in a different order.
A weighted normalized correlation operation 700 using ROI 710 and step kernel 720 computes R2. A contrast operation 730 using ROI 710 and step kernel 720 computes C, which is converted by fuzzy threshold operation 740 into a fuzzy logic value 742 indicating the confidence that the contrast is above the noise level. Weighted correlation operations 750 and 752, using ROI 710, step kernel 720, and orthogonal step kernel 722, and absolute value of arctangent of ratio operation 760, compute D, which is converted by fuzzy threshold operation 770 into a fuzzy logic value 772 indicating the confidence that the angle between the expected and actual edge directions is small.
A fuzzy AND element 780 operates on R2 and fuzzy logic values 742 and 772 to produce the analog output 790 of the Edge Detector. Note that R2, being in the range 0-1, can be used directly as a fuzzy logic value. The analog output 790 is in the range 0-1, but it can be multiplied by some constant, for example 100, if a different range is desired. Note that the logic output of an Edge Detector is derived from the analog output using the sensitivity threshold that all Photos have.
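The weighted normalized correlation R2 between a kernel and a like-shaped ROI follows the standard formula; a minimal sketch over flattened lists, with names matching the text (w_i weights, k_i kernel values, z_i pixel gray levels):

```python
# Sketch: weighted normalized correlation R^2 between kernel values k
# and ROI gray levels z, under weights w (standard formula).

def weighted_r2(w, k, z):
    W = sum(w)
    swk = sum(wi * ki for wi, ki in zip(w, k))
    swz = sum(wi * zi for wi, zi in zip(w, z))
    swkk = sum(wi * ki * ki for wi, ki in zip(w, k))
    swzz = sum(wi * zi * zi for wi, zi in zip(w, z))
    swkz = sum(wi * ki * zi for wi, ki, zi in zip(w, k, z))
    num = (W * swkz - swk * swz) ** 2
    den = (W * swkk - swk ** 2) * (W * swzz - swz ** 2)
    return num / den if den else 0.0  # flat ROI or flat kernel: no correlation
```

An ROI whose gray levels step exactly where the kernel steps produces R2 = 1.0, and because R2 lies in 0-1 it can feed the fuzzy AND directly, as described above.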
In
The use of ridge kernel 800 is similar to that for step kernel 600. The contrast C is computed using the same formula, but R2 uses a different formula because the sum of the kernel values is not 0:
Note that this formula reduces to the one used for step edges when the sum of the kernel values is 0.
A different method is used to determine the angle D between the actual and expected edge directions. A positive rotated ridge kernel 810 with values ki+ is created with an edge direction θ+a, and a negative rotated ridge kernel 810 with values ki− is created with an edge direction θ−a. A parabola is fit to the three points
The x coordinate of the minimum of the parabola is a good estimate of the angle D between the actual and expected edge directions.
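The parabola fit over three samples taken at equally spaced angles (at offsets -a, 0, +a from the expected direction) has a closed-form vertex; a sketch, with illustrative names:

```python
# Sketch: fit a parabola through three equally spaced samples
# (-a, y_minus), (0, y_zero), (+a, y_plus) and return the x coordinate
# of its vertex, used as the estimate of the angle D.

def parabola_vertex_x(a, y_minus, y_zero, y_plus):
    denom = y_minus - 2.0 * y_zero + y_plus
    if denom == 0.0:
        return 0.0  # degenerate: the three samples are collinear
    return a * (y_minus - y_plus) / (2.0 * denom)
```

For samples drawn from y = (x - 0.25)^2 at x = -1, 0, +1, the vertex estimate recovers 0.25 exactly, since a parabola through three of its own points is itself.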
R2 and the fuzzy logic values are used by fuzzy AND element 980 to produce a ridge analog output 992 for an Edge Detector that can detect ridge edges. For an Edge Detector that can detect either step or ridge edges, the ridge analog output 992 and analog output 990 from a step edge detector 988 can be used by fuzzy OR element 982 to produce a combined analog output 991.
Position control 1020 is used to position a Photo in the field of view. Diameter spinner 1022 is used to change the diameter of a Detector. Direction controls 1024 are used to orient an Edge Detector to the expected edge direction. Position, diameter, and orientation can also be set by manipulation of graphics in an image view, for example the image view of
Edge type checkboxes 1030 are used to select the types of edges to be detected and the edge polarity. Dark-to-light step, light-to-dark step, dark ridge, and light ridge can be selected. Any combination of choices is allowed, except for choosing none.
Jiggle spinner 1040 allows the user to specify a parameter j such that the Edge Detector will be run at a set of positions ±j pixels around the specified position, and the position with the highest analog output will be used. Sensitivity threshold controls 1050 allow the user to set the sensitivity fuzzy threshold of a Photo. Zero-point label 1051 shows value t0 that can be set by zero-point slider 1052. One-point label 1053 shows value t1, which can be set by one-point slider 1054. Analog output label 1055 shows the current analog output of a Photo. The analog output is also shown graphically by the filled-in region to the left of analog output label 1055, which shrinks and grows like a mercury thermometer lying on its side. The filled-in region can be displayed in three distinct colors or patterns corresponding to a first zone 1056 below t0, a second zone 1057 between t0 and t1, and a third zone 1058 above t1.
Contrast threshold controls 1060 allow the user to view the contrast C and set the contrast fuzzy thresholds 740 and 940. These controls operate in the same manner as the sensitivity threshold controls 1050.
Direction error controls 1070 allow the user to view the angle D between the actual and expected edge directions and set the direction fuzzy thresholds 770 and 970. These controls operate in the same manner as the sensitivity threshold controls 1050, except that the thermometer display fills from right-to-left instead of left-to-right, because lower values of D correspond to higher fuzzy logic values.
The use of spot kernel 1100 is similar to that for ridge kernel 800. Weighted normalized correlation R2 and contrast C are computed using the same formulas as were used for the ridge kernel.
In one example, a Locator searches a one-dimensional range for an edge, using any of a variety of well-known techniques. The search direction is normal to the edge, and a Locator has a width parameter that is used to specify smoothing along the edge, which is used in well-known ways. The analog output of a Locator depends on the particular method used to search for the edge.
In one example, a Locator searches a one-dimensional range for an edge using the well-known method of computing a projection of the ROI parallel to the edge, producing a one-dimensional profile along the search range. The one-dimensional profile is convolved with a one-dimensional edge kernel, and the location of the peak response corresponds to the location of the edge. An interpolation, such as the well-known parabolic interpolation, can be used if desired to improve the edge location accuracy. In another example, an edge can be located by searching for a peak analog output using the edge detector of
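The projection-and-convolution method just described can be sketched as follows; the [-1, 0, +1] kernel and the averaging projection are illustrative choices, and the function name is ours:

```python
def locate_edge_1d(roi_rows):
    """Locate a step edge along the search direction of a rectangular ROI.

    roi_rows: list of rows (each a list of pixel values), with the search
    direction running along the rows and the edge running across them.
    Returns the sub-pixel edge position along the search range."""
    ncols = len(roi_rows[0])
    # 1. Project parallel to the edge: average each column into a profile.
    profile = [sum(row[c] for row in roi_rows) / len(roi_rows)
               for c in range(ncols)]
    # 2. Convolve with a simple 1-D step-edge kernel [-1, 0, +1].
    resp = [profile[c + 1] - profile[c - 1] for c in range(1, ncols - 1)]
    # 3. Peak response (by magnitude, so either polarity is found).
    p = max(range(len(resp)), key=lambda i: abs(resp[i]))
    x = float(p + 1)  # offset for the convolution border
    # 4. Parabolic interpolation around the peak for sub-pixel accuracy.
    if 0 < p < len(resp) - 1:
        ym, y0, yp = abs(resp[p - 1]), abs(resp[p]), abs(resp[p + 1])
        denom = ym - 2.0 * y0 + yp
        if denom != 0.0:
            x += 0.5 * (ym - yp) / denom
    return x
```

For a dark-to-light step between columns 2 and 3, the interpolated position comes out midway between them, at 2.5.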
In another example, a Locator searches a multi-dimensional range, using well-known methods, which may include translation, rotation, and size degrees of freedom. It will be clear to one skilled in the art how to employ multi-dimensional Locators to position Photos in practicing the example, so the following discussion will be limited to one-dimensional Locators, which are preferred due to their simplicity.
Detector 1310 and Locator 1312 can be moved around in the FOV by clicking anywhere on their border and dragging. Detector 1310 has a resize handle 1320 for changing its diameter, and Locator 1312 has a resize handle 1322 for changing its width and range, and a rotate handle 1324 for changing its direction. All Photos can be moved by dragging the border, and have similar handles as appropriate to their operation.
In the example of
A Locator has a rail 1332, shown in
Every Photo can be linked to zero or more Locators, up to some maximum number determined by a given implementation. The number of links determines the number of degrees of freedom that the Locators can control. Degrees of freedom include rotation, size, and the two degrees of freedom of translation. In one example, the maximum number of links is two and only the translation degrees of freedom are controlled.
A linkage defines how a Photo moves as the Locator's plunger moves, following an edge in the image. The movements are defined to keep the Photo at a constant relative distance to the rail or rails of the Locators to which it is linked. In this example, the linkages are drawn using a mechanical analogy, such that one could actually build a linkage out of structural elements and bearings, and the Photos would move in the same way as forces are applied to the plungers.
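Under the mechanical analogy, a Photo linked to two orthogonal Locators simply translates with the two plungers. A minimal sketch under that assumption (the class names are ours, and the non-orthogonal slider case from the figures is not modeled):

```python
class Locator:
    """One-dimensional Locator: searches along a unit direction vector and
    reports the plunger's signed offset from the center of its range."""
    def __init__(self, direction):
        self.direction = direction      # unit (dx, dy) search direction
        self.plunger_offset = 0.0       # signed offset along the direction

class LinkedPhoto:
    """Photo linked to up to two orthogonal Locators (translation only,
    as in the two-link example): it stays at a constant relative distance
    to each Locator's rail, so it translates with each plunger."""
    def __init__(self, base_pos, locators):
        self.base_pos = base_pos
        self.locators = locators

    def position(self):
        x, y = self.base_pos
        for loc in self.locators:
            dx, dy = loc.direction
            x += dx * loc.plunger_offset
            y += dy * loc.plunger_offset
        return (x, y)
```

With a vertical Locator whose plunger has moved down 3 pixels and a horizontal Locator whose plunger has moved left 2, a linked Photo translates by (-2, +3), mirroring the behavior described for the figures.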
In
Every Photo has an emitter, a diamond-shaped handle drawn somewhere on the border. For example, Detector 1310 has emitter 1350 and Locator 1312 has emitter 1352. A link is created by drag-dropping a Photo's emitter to any point on a Locator. If the link already exists, the drag-drop might delete the link, or another mechanism for deleting might be used. The user may not create more than the maximum number of allowable links from any Photo, nor any circular dependencies. To aid the user during an emitter drag over a Locator, a tool tip can be provided to tell the user whether a link would be created, deleted, or rejected (and why). Dragging a Locator does not change the behavior of its plunger—it stays locked on an edge if it can find one, or reverts to the center if not. Thus dragging a Locator while an edge is detected just changes its search range; the plunger does not move relative to the FOV. More generally, dragging a Locator never changes the position of any Photo to which it is linked. Dragging a Locator will adjust the rod lengths as necessary to ensure that no other Photo moves relative to the FOV.
Any plunger may be dragged manually within the range of its Locator, whether or not it has found an edge, and any linked Photos will move accordingly. This allows users to see the effect of the linkages. As soon as the mouse button is released, the plunger will snap back to its proper position (moving linked Photos back as appropriate).
In
Comparing second image view 1402 with first image view 1400, first plunger 1424 has moved down as it follows a first edge (not shown) in the image, and second plunger 1434 has moved to the left and slightly down as it follows a second edge (not shown). Note that the positions in the FOV of Locator 1420 and the second Locator have not changed, but Detector 1410 has moved down and to the left to follow the plungers, which follow the edges of an object and therefore the motion of the object itself. In a mechanical analogy, Detector 1410 moves because it is rigidly attached to first rail 1426 by first rod 1422, and to second rail 1436 by second rod 1432. Note that first slider 1428 has slid to the left along first rail 1426, and second slider 1438 has slid down along second rail 1436. The sliders slide along the rails when two non-orthogonal Locators are linked to a Photo.
If a Photo is linked to two nearly parallel Locators, its motion would be unstable. It is useful to set an angle limit between the Locators, below which the linked Photo will not be moved. This state can be indicated in some way in the image view, such as by displaying the two rods using a special color such as red. The ability to have Locators either at fixed positions or linked to other Locators provides important flexibility. In
The Locators are configured to follow the top and right edges of a circular feature 1550. Comparing second image view 1502 with first image view 1500, the circular feature 1550 has moved down, causing rail 1522 to move down to follow it. This moves both Detector 1510 and second Locator 1530 down. Note that Detector 1510 is at the same position relative to the object, and so is second Locator 1530. This is desirable in this case, because if second Locator 1530 were fixed in the FOV, it might miss the right edge of circular feature 1550 as it moves up and down. Note that this would not be problematic if the edge of an object in the image was a straight line.
First Locator 1520 has no Locator to move it left and right so as to find the top edge of circular feature 1550. The first Locator 1520 cannot link to second Locator 1530 because that would create a circular chain of links, which is not allowed because one Locator has to run first and it cannot be linked to anything. Instead, the motion of the object through the FOV ensures that first Locator 1520 will find the top edge. In the example of
Accordingly,
Thus, in
Comparing second image view 1702 with first image view 1700, the object (not shown) has moved to the right and rotated counterclockwise, which can be seen by the motion of the Detectors as the Locators follow the object edges. Note that second Locator 1722 and third Locator 1724 are linked to first Locator 1720 so that they stay close to the Detectors.
Having described in detail the setup of Locators and Detectors in accordance with the above-incorporated-by-reference METHOD AND APPARATUS, it should be clear that, while effective, the GUI screen of
Thus, in establishing appropriate Locators and Detectors in an image view of an object during setup, the functionality of the GUI can be highly beneficial. It is desirable that the process for setting up such Locators and Detectors be as easy to use and accurate as possible. By arranging functions of the GUI to facilitate automated setup of locators and detectors, the overall performance and ease of use of the vision detector can be greatly enhanced.
This invention provides a system and method for automating the setup of Locators and Detectors within an image view of an object on the HMI of a vision detector by determining detectable edges and best fitting the Locators and Detectors to a location on the object image view following the establishment of a user-selected operating point on the image view, such as by clicking a GUI cursor. In this manner, the initial placement and sizing of the graphical elements for Locator and Detector ROIs are relatively optimized without excessive adjustment by the user. Locators can be selected for direction, including machine or line-movement direction, cross direction, or angled direction transverse to cross direction and movement direction. Detectors can be selected based upon particular analysis tools, including brightness tools, contrast tools and trained templates. The Locators and Detectors are each associated with a particular set of operating parameters, such as activation threshold, which are displayed in a control box within the GUI and can be accessed by clicking on the specific Locator or Detector. A parameter bar can also be provided adjacent to the depiction of the Detector on the image view for easy reference. Both Locators and Detectors may be manually readjusted, once automatically placed and sized, by drag-and-drop techniques.
In an illustrative embodiment, the system includes a GUI screen image view of an object derived from a vision sensor having a field of view in which the object is in relative motion thereto, and a plurality of image frames of the object within the field of view are captured by the vision detector. The image view is accessible by the GUI cursor. An edge detection process determines and analyzes detectable edges in the screen image view and stores edge information. A selector allows a user to select either (a) a Locator or (b) a Detector, based upon a predetermined analysis tool, for placement on the image view. An automatic placement process then uses the edge information to place the selected (a) Locator or (b) Detector at a position on the image view upon which the cursor points, with a size that is determined based upon a location of adjacent edges of the object image view.
The automatic placement process is constructed and arranged to place the Locator on the image view relative to a nearest adjacent edge of the image view and to adjust the Locator so as to avoid a stronger-magnitude more-distant edge. This allows a Locator having a predetermined width when originally sized to be finally sized with a cutoff on the side near the stronger edge, thus avoiding confusion as the object moves through the field of view between edges, since the Locator's activation threshold is generally set relative to the nearest adjacent edge's magnitude.
In addition, the automatic placement process is constructed and arranged to place the Detector on the image view relative to the position at which the cursor points, so that a relative center of the Detector is at the position at which the cursor points and an outer boundary of the Detector extends to a location that is within detected edges of the object image view. The outer boundary is typically circular, and is built from incrementally larger-radius circles until the average score of pixel values of the image within the boundary indicates a change beneath an applicable threshold (based upon brightness or contrast, for example). At this time, the boundary closest to the radius still within the threshold is chosen for the automatically sized ROI of the Detector.
The invention description below refers to the accompanying drawings, of which:
In this embodiment, the GUI 1800 is provided as part of a programming application running on the HMI and receiving interface information from the vision detector. In the illustrative embodiment, a .NET framework, available from Microsoft Corporation of Redmond, Wash., is employed on the HMI to generate GUI screens. Appropriately formatted data is transferred over the link between the vision detector and HMI to create screen displays and populate screen data boxes, and to transmit back selections made by the user on the GUI. Techniques for creating appropriate screens and transferring data between the
The screen 1800 includes a status pane 1802 in a column along the left side. This pane contains a current status box 1804, the dialogs for controlling general setup 1806, setup of object detection with Locators and Detectors 1808, object inspection tool setup 1810, and runtime/test controls 1812. The screen 1800 also includes a right-side column having a pane 1820 with help buttons.
The lower center of the screen 1800 contains a current selection control box 1830. The title 1832 of the box 1830 relates to the selections in the status pane 1802. In this example, the user has clicked select job 1834 in the general setup box 1806. Note that the general setup box also allows access to an item (1836) for accessing a control box (not shown) that enables setup of the imager (also termed “camera”), which includes entry of production line speed to determine shutter time and gain. In addition, the general setup box allows the user to set up a part trigger (item 1838) via another control box (not shown). This may be an external trigger upon which the imager begins active capture and analysis of a moving object, or it may be an “internal” trigger in which the presence of a part is recognized due to analysis of a certain number of captured image frames (as a plurality of complete object image frames are captured within the imager's field of view).
The illustrated select job control box 1830 allows the user to select from a menu 1840 of job choices. In general, a job is either stored on an appropriate memory (PC or vision detector) or is created as a new job. Once the user has selected either a stored job or a new job, the Next button 1842 accesses a further screen. These further control boxes can, by default, be the camera setup and trigger setup boxes described above.
Central to the screen 1800 is the image view display 1850, which is provided above the control box 1830 and between the columns 1802 and 1820 (being similar to image view window 198 in
As shown in
Before describing further the setup procedure, reference is made briefly to the bottommost window 1870 which includes a line of miniaturized image frames that comprise a so-called “film strip” of the current grouping of stored, captured image frames 1872. These frames 1872 each vary slightly in bottle position with respect to the FOV, as a result of the relative motion. The film strip is controlled by a control box 1874 at the bottom of the left column.
Reference is now made to
As shown, a cursor 1930 is brought toward an edge 1940 of the object 1852. Once the user “clicks” on the cursor placement, the screen presents the control box 2010, which now displays a parameter box 2012. Briefly, this box sets up the applicable threshold indicator 2014 for machine direction. The nature of the parameter box is highly variable herein. In general, the user can decide how high or low to set a threshold for edge detection.
The click of the cursor 1930 also generates a novel Locator graphic 2020 on the image view 1850 of the object 1852. This graphic 2020 is similar in operation to the Locator 320 (
In this example, the Locator is sized with a height HL1 and width WL1 that are optimized to a given segment of edge 1940 of the object 1852. Likewise, the Locator is positioned at an angle A that allows the above-described plunger bar 2022 to approximately define a straight line within the (curving) edge portion closest to the clicked cursor 1930. In general, the height HL1 of the Locator 2020 is chosen by the process so that it remains within a predetermined deviation of the object edge from a straight line. In other words, the plunger, at its opposing ends 2024 and 2026, deviates from the curving object edge 1940 no more than a predetermined distance—a longer plunger would exceed that distance at the selected edge location. The procedure for determining automatic placement and sizing of the Locator 2020 is described in greater detail below.
The position in the FOV at which the cursor 1930 is clicked typically defines the center of the Locator. The Locator itself remains fixed at the clicked position in the FOV. The moving object image passes through the Locator, with the plunger 2022 following the detectable edge transition. In automatic setup, the Locator's width WL1 is determined by the distance from the click point to a detectable edge transition for the object in the setup view. Hence, if the click point of the cursor 1930 were further from the edge 1940, then the Locator graphic would appear longer in the width direction to lie properly upon the object. The extension of the Locator into the body of the object image is sufficient so that the edge transition of the object can be properly detected while the object is placed in the current image view (the illustrated view upon which setup is being made). Again, the height HL1 of the Locator and plunger 2022 is based upon a close fit with the nearest object edge transition. A more detailed procedure for the automated placement of a Locator is described with reference to
The graphical representation of the Locator 2020 is set to a given polarity so that it properly identifies the transition from light background to dark. A polarity selector (not shown) can be provided in the status pane 1802 or control box 2010. In this manner, a Locator can be placed on either edge (see phantom Locator 2030 on opposing edge 2032) and detect the movement of the object through the FOV from either edge. Polarity can be displayed by providing different, unique, opaque shading on each side of the Locator 2020. In this example, shading fill (symbolized by hatch lines) 2040 is used to show a dark-to-light polarity given a prevailing right-to-left machine direction. Likewise, the opposing alternate Locator 2030 would be set for light-to-dark polarity in this example.
It is contemplated that the automated placement of the Locator 2020 may not always yield the best result. Thus, the control box 2010 includes a recreate button 2050 that allows the Locator 2020 to be removed and replaced in another location by a subsequent move and click of the cursor 1930. Alternatively, the clicking of the cursor 1930 on a different position of the object can be adapted to recreate the Locator elsewhere on the image view 1850. Note that a cross direction button 2052 and angle direction button 2054 can still be accessed to generate additional Locators as needed, using the same automated and manual placement and resizing procedures as applicable to the Locator 2020.
In addition, when a Locator's automatic placement is generally desirable, but its angle, width or height will not necessarily obtain the best results, the Locator can be manually resized as shown generally in
Having placed and adjusted a Locator 2120, reference is now made to
When a given type of tool is selected, the user may then move the cursor to an appropriate location on the object 1852 (see cursor 1930 shown in phantom). By clicking on the positioned cursor 1930 (phantom) a Detector region of interest (ROI) circle 2240 (shown in phantom) using brightness as a detection criterion is formed on the object in association with the plunger 2122 of the locator 2120. The diameter of the circle is selected automatically from the center click point based upon placement so that it falls within the desired brightness region of the object. In other words, parts of the ROI that are outside a given brightness range cause the circle to be sized so as to avoid these regions. Similarly to the case of the Locator, the threshold level of a given detector is also estimated and automatically set, subject to subsequent adjustment by the user.
In this example, the automatically sized ROI circle 2240 (phantom) covers a majority of the width of the object body 1854. As described above, when the object is located, its presence is verified by the existence of the bright spot within the ROI. However, the user may desire a longer period of detection. Thus, by clicking the cursor 1930 (shown solid), and dragging on the circle edge, the ROI's diameter can be reduced (arrows 2242) from the larger diameter automatically sized circle (phantom) to a reduced-size circle 2250 that allows verification of presence within a larger range of movement across the FOV. Note that a threshold and brightness bar 2260 is automatically appended to the Detector circle 2250 by the GUI. This allows the user to ascertain the current settings and readings of the particular detector. Such data is helpful particularly where a plurality of detectors are present on the image view, and only one Detector's status is currently shown (typically the last Detector clicked) in the control box 2210. Note that by clicking any Detector or Locator in the image view, the relevant control box and associated parameter box is retrieved and displayed in the GUI.
The user may place as many Detectors as he or she desires in association with a given locator. To further verify object presence, a second Detector may be applied as shown in
In the example of
Note that the automatic sizing of a Detector ROI circle is described in further detail with reference to
Briefly, the user may also select Detectors based upon other tools, such as a template. When selecting a template, a control box (not shown) allows the user to lay a dark circle (automatically, with a manual adjustment option) on an object image location. The user activates a training button to store the pattern in the vision detector's memory. Generalized pattern-matching algorithms are used to determine whether a detected ROI on the object matches the pattern. A threshold setting slider is provided to adjust the pattern matching algorithm.
The status pane 1802 also shows a setup inspection box 1810 with an associated button 1884 for inspection tools. In general, inspection occurs within the vision detector concurrently with detection. In some implementations, simply detecting an object is sufficient. In other applications, the detector can inspect objects for flaws by analyzing ROIs in association with a Locator. Typically, ROIs are placed in areas where flaws will affect the appearance of the object in sufficient degree to be discriminated by the relatively low-resolution capture of the vision detector. Briefly, when the inspection setup button 1884 is clicked, the user is provided with various screens similar to those in
The automatic placement and sizing of a Locator in response to positioning and clicking of a cursor on the image view is now discussed in further detail with reference to
The user desires to place a Locator along the left-side edge portion 2412 and has clicked a cursor at the click point 2414 at a slight spacing from the edge 2412 (step 2512). The procedure locates the closest point on the nearest located edge 2412 and establishes this point as the Locator origin 2416 (step 2514). The origin 2416 is defined in terms of orthogonal x and y axes and a rotation θ relative to the axes, and the closest distance can be determined as the shortest line segment 2419 between the click point 2414 and origin 2416. In one embodiment, the angle of this segment with respect to the X-Y axes defines θ (the segment being oriented at 90 degrees to θ). The procedure 2500 begins to define increments above and below the origin (steps 2516, 2518, 2520 and 2522), generating a line 2420 that fits along the edge 2412 in each direction from the origin 2416. This forms the basis of the plunger when the creation of the Locator is complete. The increments build as far as they are able until the maximum height (according to a predetermined constant) is achieved (for example, the lower point 2430). The increments may build to less than the maximum height if they deviate from the edge by more than a maximum deviation (MAXDEV), at which point (top point 2432) increments are no longer built. In one embodiment, MAXDEV is approximately 2 pixels wide. Once increments are maximized, the maximum height of the Locator is established.
In step 2524, the width of the Locator in both directions from the line 2420 is established (MAXWIDTH1 and MAXWIDTH2). Typically, width is determined by a predetermined ratio of the height and by other factors, such as ensuring that a sufficient portion of the width is located in each of the object side and background side.
The procedure 2500 may attempt to move the Locator line 2420 upwardly or downwardly along the edge to seek a better fit within a predetermined limit (steps 2526 and 2528) that allows a truncated side (due to exceeding MAXDEV) of the Locator to expand in height. Likewise, in an embodiment, the line may be rotated relative to θ, to allow a better fit within certain rotational limits. Once the Locator positioning is established, the procedure 2500 in step 2530 ranks the strength of the transition of all edges within the original width of the Locator's ROI. In this example, a stronger (or equally strong) edge 2440 is identified (step 2532), which may confuse the analysis during runtime. Thus, the procedure 2500 resizes the width boundary 2442 (step 2534 and arrow 2441) to exclude the edge 2440. The amount (ADJWIDTH) of withdrawal of the Locator's width boundary 2442 may be calculated based upon a constant or a ratio relative to the distance between edges 2412 and 2440, or upon another metric. Finally the Locator is completed in step 2536.
Upon completion of the Locator's layout, a threshold value is assigned to the Locator. This value is calculated by deriving a measured magnitude (via the Sobel operator) of the edgelets at the edge line 2420 and multiplying this value by a constant to determine an absolute threshold value for the GUI. In an embodiment, a constant of 0.7 is used to establish a default value for the threshold assigned to the Locator, resulting in allowance of variation of up to 30%.
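The plunger-growing step and the default threshold calculation can be sketched as follows. This is a minimal illustration: the representation of the edge as an ordered list of sample points, the function names, and the deviation test are our assumptions; only the MAXDEV value of 2 pixels and the 0.7 threshold constant come from the text:

```python
MAXDEV = 2.0        # pixels; per the example embodiment in the text
THRESH_RATIO = 0.7  # default threshold constant, allowing 30% variation

def grow_plunger(edge_points, origin_idx, line_dir, max_half_height):
    """Grow the plunger line from the Locator origin in both directions
    along a chain of detected edge points, stopping in each direction when
    the edge deviates more than MAXDEV pixels from the straight line or
    when max_half_height increments are reached.  Returns (lo_idx, hi_idx).
    edge_points are (x, y) samples ordered along the edge; line_dir is the
    unit direction of the fitted straight line through the origin."""
    ox, oy = edge_points[origin_idx]
    dx, dy = line_dir

    def deviation(p):
        # perpendicular distance from the straight line through the origin
        px, py = p[0] - ox, p[1] - oy
        return abs(px * dy - py * dx)

    lo = hi = origin_idx
    while (origin_idx - lo) < max_half_height and lo > 0 \
            and deviation(edge_points[lo - 1]) <= MAXDEV:
        lo -= 1
    while (hi - origin_idx) < max_half_height and hi < len(edge_points) - 1 \
            and deviation(edge_points[hi + 1]) <= MAXDEV:
        hi += 1
    return lo, hi

def locator_threshold(edge_magnitude):
    """Default activation threshold: 70% of the measured edge magnitude."""
    return THRESH_RATIO * edge_magnitude
```

For a straight vertical edge with a sharp kink above the origin, the growth stops short of the kink on that side, which is the truncation behavior the text describes for exceeding MAXDEV.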
The placement and sizing of a detector in accordance with an embodiment of this invention is now described in further detail with reference to the exemplary object 2410 of
Next, in step 2712, the user moves the cursor to a point on the object image view and clicks the location to establish a center point (click point) 2620 for the Detector ROI circle (step 2714). This click point is established as the origin of the circle, with an initial radius equal to zero within the depicted x-axis and y-axis coordinate system. The procedure then (steps 2716 and 2718) begins to build a series of circles about the origin 2620, successively incrementing (typically by one or two pixels in distance per increment) the radius of the circle and deriving an average magnitude score for all points (or a sum of all magnitudes) in the image view along the circle. In this example, the circles build successively outwardly (radial arrows 2622) from the origin 2620 to radii R1<R2<R3<R4. At each increment, step 2718 decides whether the average or summed score of all image pixels within the given circle is (a) greater-than-or-equal-to, or (b) less-than the desired threshold value. Referring to the graph 2800 in
The GUI thus automatically displays the chosen circle with radius R3 and allows the user the option to increase or decrease the diameter as appropriate (step 2722). As described above, a further graphic image of a threshold and setting bar is provided alongside the completed circle.
The determination of magnitude is based, in part, upon the type of tool used in conjunction with the Detector. In the case of brightness, the tool bases decisions upon pixel intensity versus a constant. The constant can be predetermined or calculated from the average image intensity in a variety of ways. In the case of contrast, the magnitude score may be a differential gradient between intensities, and the threshold may be a constant gradient. Where needed, inverse values for these thresholds can be derived through subtraction from a constant. Automatic placement and sizing of a template circle may be based upon contrast or brightness (or both).
Hence, the above description provides useful and highly flexible mechanisms for allowing minimally trained persons to quickly employ a vision detector without the need of intensive human programming or labor in setup. The completed setup may be tested as needed, and by accessing various GUI screens through “Back” buttons and clicks upon the image's Locators and Detectors during test time, adjustments can be made to the Locators and Detectors, or new/replacement Locators and Detectors can be placed on the image view.
The foregoing has been a detailed description of illustrative embodiments of the invention. Various modifications and additions can be made without departing from the spirit and scope thereof. For example, while ROIs for Locators are shown as rectangles and Detectors are shown as circles, their ROIs may each define a different shape or a variety of selectable and/or customized shapes as needed. Likewise, while a particular form of HMI and GUI are shown, a variety of hardware and GUI expressions are expressly contemplated. For example, in alternate embodiments access to operating parameters may be through alternate display screens or boxes. Accordingly, this description is meant to be taken only by way of example, and not to otherwise limit the scope of the invention.
Eames, Andrew, Phillips, Brian S., Tremblay, II, Robert J., Mirtich, Brian V., Keating, John F., Whitman, Steven
Assignment: Dec 22, 2010 — Cognex Technology and Investment Corporation (assignment on the face of the patent).
Maintenance fee events:
- May 16, 2017 — M1552: Payment of maintenance fee, 8th year, large entity.
- Aug 9, 2021 — REM: Maintenance fee reminder mailed.
- Jan 24, 2022 — EXP: Patent expired for failure to pay maintenance fees.