This invention provides a system and method for employing and analyzing images that are illuminated in different colors depending upon the type of illumination being employed. In an illustrative embodiment, a color image sensor is used to acquire images of subjects of interest, and each of the direct bright field, dark field and diffuse illumination types is transmitted to the surface simultaneously (concurrently with the others) in a discrete illumination color that is discretely discernable by the sensor. For example, direct bright field and dark field may be red, while diffuse may be blue. Pixels of a given sensitivity (for example, red or blue) in the image sensor receive only the image generated by that color of illumination. The reader of this invention includes processing components that independently assemble images from red and blue pixel addresses to create, in effect, two simultaneous images (one image generated using a combination of dark field and direct bright field illumination and the other image generated using diffuse illumination) that overlap (are registered with respect to each other) perfectly. The best image is determined using conventional image analysis tools, and meaningful data (the code of the read symbol, for example) is derived from that best image.
1. A system for scanning a surface comprising:
an imager having image sensor pixels that each acquire images in each of at least a first image color and a second image color;
a focuser that varies focus of the imager with respect to the surface;
an illumination assembly constructed and arranged to simultaneously provide a first illumination type in a first illumination color recognized by the imager as the first image color and a second illumination type in a second illumination color recognized by the imager as the second image color, and a focus pattern, projected on the surface, in a focus illumination color recognized by the imager as the focus image color; and
a focus process that reads image pixel data from the imager in the focus illumination color and that controls the focuser based upon the image pixel data in the focus illumination color.
17. A method for scanning a surface comprising the steps of:
acquiring images with image pixels of an imager in each of at least a first image color and a second image color;
varying focus of the imager with respect to the surface;
simultaneously providing a first illumination type in a first illumination color recognized by the imager as the first image color and a second illumination type in a second illumination color recognized by the imager as the second image color, and a focus pattern, projected on the surface, in a focus illumination color recognized by the imager as the focus image color; and
reading image pixel data from the imager in the focus illumination color; and
controlling focus based upon the image pixel data in the focus illumination color.
30. A system for scanning a surface and deriving data therefrom comprising:
an imager that receives image data in at least three discrete image colors, each of which colors is resolvable into a discrete image color data stream;
an illuminator that simultaneously projects on the surface at least three illumination colors corresponding substantially to at least three image colors, each of the three image colors respectively being projected in a first illumination type, a second illumination type and a third illumination type; and
a best image process that derives that image color stream displaying the best image from each discrete image color data stream.
1. Field of the Invention
This invention relates to machine vision systems and symbology readers that employ machine vision and more particularly to illuminators for the same.
2. Background Information
Machine vision systems use image acquisition devices that include camera sensors to deliver information on a viewed subject. The system then interprets this information according to a variety of algorithms to perform a programmed decision-making and/or identification function. For an image to be most effectively acquired by a sensor in the visible and near-visible light range, the subject should be properly illuminated.
In the example of symbology reading (also commonly termed “barcode” scanning) using an image sensor, proper illumination is highly desirable. Symbology reading entails the aiming of an image acquisition sensor (CMOS camera, CCD, etc.) at a location on an object that contains a symbol (a “barcode”), and acquiring an image of that symbol. The symbol contains a set of predetermined patterns that represent an ordered group of characters or shapes from which an attached data processor (for example a microcomputer) can derive useful information about the object (e.g. its serial number, type, model, price, etc.). Symbols/barcodes are available in a variety of shapes and sizes. Two of the most commonly employed symbol types used in marking and identifying objects are the so-called one-dimensional barcode, consisting of a line of vertical stripes of varying width and spacing, and the so-called two-dimensional barcode consisting of a two-dimensional array of dots or rectangles.
By way of background
The scanning application 113 can be adapted to respond to inputs from the scanning appliance 102. For example, when the operator toggles a trigger 122 on the hand held scanning appliance 102, an internal camera image sensor (within the image formation system 151) acquires an image of a region of interest 131 on an object 105. The exemplary region of interest includes a two-dimensional symbol 195 that can be used to identify the object 105. Identification and other processing functions are carried out by the scanning application 113, based upon image data transmitted from the hand held scanning appliance 102 to the processor 109. A visual indicator 141 can be illuminated by signals from the processor 109 to indicate a successful read and decode of the symbol 195.
In reading symbology or other subjects of interest, the type of illumination employed is of concern. Where symbology and/or other viewed subjects are printed on a flat surface with contrasting ink or paint, a diffuse, high-angle “bright field” illumination may best highlight these features for the sensor. By high-angle it is meant, generally, light that strikes the subject nearly perpendicularly (normal) or at an angle that is typically no more than about 45 degrees from perpendicular (normal) to the surface of the item being scanned. Such illumination is subject to substantial reflection back toward the sensor. By way of example, barcodes and other subjects requiring mainly bright field illumination may be present on a printed label adhered to an item or container, or on a printed field in a relatively smooth area of an item or container.
Conversely, where a symbology or other subject is formed on a more-irregular surface, or is created by etching or peening a pattern directly on the surface, the use of highly reflective bright field illumination may be inappropriate. A peened/etched surface has two-dimensional properties that tend to scatter bright field illumination, thereby obscuring the acquired image. Where a viewed subject has such decidedly two-dimensional surface texture, it may be best illuminated with dark field illumination. This is an illumination with a characteristic low angle (approximately 45 degrees or less, for example) with respect to the surface of the subject (i.e. an angle of more than approximately 45 degrees with respect to normal). Using such low-angle, dark field illumination, two-dimensional surface texture is contrasted more effectively (with indents appearing as bright spots and the surroundings as shadow) for better image acquisition.
In other instances of applied symbology a diffuse direct illumination may be preferred. Such illumination is typically produced using a direct-projected illumination source (e.g. light emitting diodes (LEDs)) that passes through a diffuser to generate the desired illumination effect.
To take full advantage of the versatility of a camera image sensor, it is desirable to provide bright field, dark field and diffuse illumination. However, dark field illumination must be presented close to a subject to attain the low incidence angle thereto. Conversely, bright field illumination is better produced at a relative distance to ensure full area illumination.
Commonly assigned U.S. patent application Ser. No. 11/014,478, entitled HAND HELD SYMBOLOGY READER ILLUMINATION DIFFUSER and U.S. patent application Ser. No. 11/019,763, entitled LOW PROFILE ILLUMINATION FOR DIRECT PART MARK READERS, both by Laurens W. Nunnink, the teachings of which are expressly incorporated herein by reference, provide techniques for improving the transmission of bright field (high angle) and dark field (low angle) illumination. These techniques include the provision of particular geometric arrangements of direct, bright field LEDs and conical and/or flat diffusers that are placed between bright field illuminators and the subject to better spread the bright field light. The above-incorporated HAND HELD SYMBOLOGY READER ILLUMINATION DIFFUSER further teaches the use of particular colors for improving the illumination applicable to certain types of surfaces. However, it has been observed that the choice of bright field, dark field, direct or diffuse light is not intuitive to the user for many types of surfaces and/or the particular angles at which the reader is directed toward them. In other words, a surface may appear to be best read using dark field illumination, but in practice, bright field is preferred for picking out needed details, especially at a certain viewing angle. Likewise, with handheld readers, the viewing angle is never quite the same from surface to surface (part-to-part), and some viewing angles may be better served by bright field while others may be better served by dark field.
The reader may be directed to step through various types of illumination when reading each part, but this takes time, both in cycling each set of illuminators on and off and in integrating/analyzing the resulting image. Currently, for a reader to be considered efficient, the reading process should take place within 200 milliseconds or less. Stepping through illumination types, storing results, comparing and deriving the best image may exceed desired time limits. It is, therefore, highly desirable to provide a technique that allows the best form of illumination to be employed at once for all types of surfaces and scan angles, and for acquired images from this illumination to be used immediately to derive meaningful image data.
This invention overcomes the disadvantages of the prior art by providing a system and method for employing and analyzing images that are illuminated in different colors depending upon the type of illumination being employed. In an illustrative embodiment, a color image sensor is used to acquire images of subjects of interest, and each of the direct bright field, dark field and diffuse illumination types is transmitted to the surface simultaneously (concurrently with the others) in a discrete illumination color that is discretely discernable by the sensor. For example, direct bright field and dark field may be red, while diffuse may be blue. Pixels of a given sensitivity (for example, red or blue) in the image sensor receive only the image generated by that color of illumination. An appropriate filter, in line with the particular illumination source (LEDs, for example), may be employed to generate one or more discrete colors and/or attenuate inadvertent migration of non-diffuse-color light into the diffuser portion. The reader of this invention includes processing components that independently assemble images from red and blue pixel addresses to create, in effect, two simultaneous images (one image generated using a combination of dark field and direct bright field illumination and the other image generated using diffuse illumination) that overlap (are registered with respect to each other) perfectly. The best image is determined using conventional image analysis tools, and meaningful data (the code of the read symbol, for example) is derived from that best image. In a further embodiment, the best parts of one or more images can be combined to derive the symbolic data.
In addition, another set of discrete-color light transmitters (green LEDs, for example) can be used to transmit direct bright field, and this direct bright field light can be discriminated by appropriate green-sensitivity pixels in the image sensor, thereby deriving a third discrete image that is registered with respect to the other two discrete-color images.
The invention description below refers to the accompanying drawings, of which:
With brief reference to the illuminator, the illumination board 214 supports a plurality of LEDs 310 that are red in this embodiment (a variety of colors can be used). The LEDs 310 are directed forwardly, toward the opening of the reader. These LEDs are positioned behind a passive light pipe 244 that internally transmits light from the ring of LEDs 310 to a front end 230. In this embodiment, the front end 230 includes a chamfered surface 232. Various examples of a light pipe for use with a reader or similar application are shown and described in U.S. patent application Ser. No. 10/693,626, entitled LIGHT PIPE ILLUMINATION SYSTEM AND METHOD, by William H. Equitz, et al., the teachings of which are expressly incorporated herein by reference.
Briefly explained, light passes through the extended body of the pipe 244 from the inner end, adjacent to the LEDs 310. The body is formed from a transmissive/transparent substance, such as polymethyl methacrylate (PMMA) or polycarbonate. The transmitted light is reflected internally by the angled/chamfered surface 232 of the light pipe 244 to exit at a low angle toward the center optical axis 270. The inner and/or outer wall surfaces of the light pipe 244 can be coated with opaque paint or another compound to prevent leakage of light into or out of the pipe. In this example, a shield 250 is also provided along the inner surface of the light pipe. One function of the shield 250 is to prevent transmission of diffuse light (described below) into the light pipe. Another function is to redirect light transmitted from the reflector (see below) back into the diffuser.
In this example, the ring of LEDs 310 acts to produce a red direct bright field effect along with the dark field effect through refraction of some light from the LEDs through the chamfered surface 232. In general, at short reading distances from a surface (<25 mm between the light pipe distal (forward) end 230 and surface), the bright field illumination from the light pipe tends not to interfere with the dark field illumination. The bright field illumination is available, however, for larger reading distances (>25 mm between the end 230 and the surface). This is useful for easy-to-read codes, such as black-and-white printed labels. In alternate embodiments, a separate bright field illuminator can be provided, as described below. In fact, many available imagers include integral red bright field illuminators. In an alternate embodiment, a separate bright field illuminator can be provided in a discrete color, such as green.
Note that a pair of aiming LEDs 220 (typically emitting green light) are provided. However, these are optional. Such aiming LEDs may be integral with the commercially available imager employed herein.
A tether cord 260 provides electrical power to the reader 200, as well as a communication transmission path for the decoded character string of the encoded information, though it is contemplated that the reader 200 can be configured with battery power and wireless communication for complete portable flexibility.
With reference also to
Reference is now also made to the exploded view of
To further ensure that diffuse light and dark field light do not mix within the light pipe or diffuser, a translucent “conical” filter 292 is provided. The filter 292 is adapted to filter out light with longer wavelengths, thereby allowing shorter-wavelength blue light to pass out of the diffuser and onto the surface, but preventing the retransmission of any reflected red light from the surface, which would otherwise tend to become retransmitted as diffuse red light along with the red dark field illumination. The filter 292 conforms to the shape of the diffuser's outer (exposed) surface, and can be snapped or adhered onto the diffuser surface using a variety of fastening techniques that should be clear to those of ordinary skill. Note that instead of a separate filter 292, a similar effect can be obtained through the use of a colored diffuser (see
Thus, to summarize, at least two discrete sets of illumination transmitters (LEDs, for example) are provided according to the illustrative embodiment: the direct diffuse transmitters 282 and the dark field transmitters 310. In accordance with the illustrative embodiment, each discrete set of transmitters 282 and 310 generates a corresponding discrete illumination color. For example, direct diffuse illumination can be generated by blue LEDs and dark field (and direct bright field) can be generated by red LEDs. Because the image sensor used herein is a commercially available color sensor, each pixel in the sensor's pixel array is assigned a particular color sensitivity. In a common arrangement, certain sensor pixels are red pixels, certain pixels are green pixels and other pixels are blue pixels. The pixels are grouped in proximity so that the sensor receives at least one red, one green and one blue pixel datum from each local region of the array. This has the effect of producing an overall continuum of differing-intensity red, green and blue data that collectively mix in different intensities to define an overall color image across the sensor array. A typical technique for creating a color pixel array is to apply a mosaic color filter over a monochrome sensor. The filter renders particular pixels sensitive to particular colors. As a point of interest and by way of further background, a popular pixel pattern is the so-called Bayer pattern, shown for part of an array below (in which R is a red-sensitive pixel, G is a green-sensitive pixel and B is a blue-sensitive pixel):
R G R G
G B G B
R G R G
G B G B
It therefore follows that each discrete pixel senses only the light from one of the corresponding illumination sources. This allows the single-color image formed by each type of illumination to be discriminated in a manner described in further detail below. Note that while green illumination is not used in this embodiment, in alternate embodiments it can be transmitted (for example, by a separate direct bright field illuminator) and sensed by green pixels. However, it should be noted that blue pixels often have some sensitivity to green, and the spread between red and blue is most effective for resolving different images from different discrete illumination sources.
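The separation of a Bayer-patterned readout into discrete per-color images can be sketched as follows. This is a minimal illustration rather than the reader's actual processing components; the RGGB site layout matches the pattern shown above, and the array-slicing approach is an assumption about how such separation might be implemented:

```python
import numpy as np

def split_bayer(raw):
    """Split an RGGB Bayer mosaic into per-color images.

    `raw` is a 2D array of raw sensor intensities. Each returned plane is
    a half-resolution image containing only the pixels sensitive to one
    illumination color: in the embodiment above, the red plane carries
    the dark field/bright field image and the blue plane the diffuse one.
    """
    red = raw[0::2, 0::2]                    # R sites (even rows, even cols)
    # Two G sites per 2x2 cell; average them into one green plane.
    green = (raw[0::2, 1::2].astype(float) + raw[1::2, 0::2]) / 2.0
    blue = raw[1::2, 1::2]                   # B sites (odd rows, odd cols)
    return red, green, blue
```

Because every plane is drawn from fixed, known positions of the same exposure, the three resulting single-color images are inherently registered with one another, which is the property the discrimination scheme relies on.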
Before further describing the novel discrimination of color images by the reader according to this invention, reference is now made to
In this embodiment (
While not shown in this illustration for simplicity, it can be assumed that a filter (292 above) may be applied over the diffuser to prevent migration of reflected dark field (and bright field) light into the diffuser 280. Such a filter is also omitted from the illustration of—but may be applicable to—the embodiment described in
Having described the general properties and construction of an illumination assembly according to various embodiments, reference is now made to
By way of an example of the process described in
The blue (diffuse) image 820 can be separately resolved from appropriate blue pixels of the sensor. This image has moderately good contrast. Predictably (with a peened surface), the image 830 resolved from the green pixels displays the least contrast, as it is derived mainly from blue light. This green image is unused in this embodiment. The red pixels, however, deliver high contrast between peened and unpeened surface details in the red image 840. This depicted image appears as if dark field illumination was used exclusively, but in this case direct diffuse illumination was also present, though not resolved in the image 840. The process would likely select this resolved dark field image (840) as the best for analysis and deliver it to pattern recognition for decoding.
Clearly, a significant advantage of the reader and process described herein is that the user need not select the best form of illumination, or reorient the scanner to obtain the best image. Rather, the scanner automatically applies all three types of illumination at once and acquires a single image using all types of illumination. Because all three colors are captured in that single image, the three color images are perfectly aligned, so there is no need to apply complex translation functions to realign them. This is desirable in part because, in an alternate embodiment, a plurality of discrete color images are analyzed to decode the pattern. This feature may be desirable when a particular image contains more than one type of surface characteristic. For example, half of the surface of interest may be printed and the other half of the surface may be peened. The part of each image showing the best contrast for a particular area is chosen to transmit to the pattern recognition process. Since these images are all registered (each color pixel datum has an identifiable location on the sensor array), the particular color pixel data (only one pixel per sensor array location) with its intensity value can be streamed to the pattern recognition process in the manner of a grayscale image. The pattern recognition process need not be concerned with the original color of the pixel, only what that color's intensity is. The best image process chooses the pixels displaying the best contrast, and only those pixels are transmitted.
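A per-region composition of the kind described above might be sketched as follows. The choice of standard deviation as the local-contrast measure, and the fixed square tiling, are assumptions standing in for whatever contrast metric and region partitioning the best image process actually employs:

```python
import numpy as np

def best_contrast_image(planes, tile=16):
    """Compose a grayscale-like output by picking, for each tile, the
    registered color plane with the highest local contrast.

    `planes` is a list of same-shape 2D arrays (e.g. the red and blue
    images resolved from one exposure). Contrast is approximated here
    by the tile's intensity standard deviation.
    """
    h, w = planes[0].shape
    out = np.zeros((h, w), dtype=planes[0].dtype)
    for y in range(0, h, tile):
        for x in range(0, w, tile):
            patches = [p[y:y + tile, x:x + tile] for p in planes]
            best = max(patches, key=lambda patch: patch.std())
            out[y:y + tile, x:x + tile] = best  # keep highest-contrast patch
    return out
```

Because the planes are registered, no alignment step is needed before mixing tiles from different planes; downstream pattern recognition sees a single intensity image.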
According to further embodiments, it is contemplated that the techniques for color illumination and image filtration described herein with reference, for example, to
Likewise, in an embodiment, opposing quadrants (320 and 324), (322 and 326) can be provided with different color LEDs (red and blue) or variable colors. In this manner, reflections caused by surface texture of the subject surface can be eliminated. This would entail filtering all or portions of a color image that are not clear and selecting unreflected portions for data recognition.
According to yet another embodiment, the ability to filter discrete colors via discrete color-sensitive sensor pixels can be employed to allow distance measurement and automatic focus of the lens simultaneously with pattern acquisition, so that the reader need not be held stationary after initial focus while an image is acquired.
The imager receives images in each of the discrete colors and transmits the pixel data from each color as image data. While the entire field of view is acquired, COLOR1 pixels will mainly show intensities for the focus pattern (data 930). Likewise, pattern data provide intensities in COLOR2 (and other colors provided by the illumination assembly 910) (data 932). The image data 932 is resolved by pattern recognition processes 940 into one or more readable images that are decoded for their data content or other information. The focus image data 930 is analyzed by a focus process 950 that may loop back to an electronic or electromechanical focus mechanism 960 that manipulates (double arrow 962) the imager to achieve desired focus on the surface. The focus process may perform a plurality of quickly iterative cycles in order to produce the desired focus, at which time the pattern 922 is acquired for a well-focused image.
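The iterative focus loop can be sketched as below. The `read_focus_image` and `move_focus` callables are hypothetical stand-ins for the focus-color pixel readout and the focus mechanism 960, and the mean absolute horizontal difference is only one plausible sharpness score for the projected focus pattern:

```python
import numpy as np

def autofocus(read_focus_image, move_focus, positions):
    """Step through candidate focus positions, score the focus-color
    channel with a simple sharpness metric, and settle on the sharpest.

    `read_focus_image()` returns the focus-pattern color pixels as a 2D
    array; `move_focus(pos)` drives the focus mechanism. Both interfaces
    are assumptions for illustration.
    """
    best_pos, best_score = None, -1.0
    for pos in positions:
        move_focus(pos)
        img = np.asarray(read_focus_image(), dtype=float)
        # Edge energy of the projected pattern: sharp focus yields strong
        # intensity transitions between neighboring pixels.
        score = np.abs(np.diff(img, axis=1)).mean()
        if score > best_score:
            best_pos, best_score = pos, score
    move_focus(best_pos)  # settle on the best-scoring position
    return best_pos
```

Because the focus pattern occupies its own color channel, this loop can run on the same exposures used for pattern acquisition rather than requiring a separate focusing phase.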
To again summarize, the above-described reader and process affords quicker image analysis, greater accuracy and increased versatility. It allows more flexibility in types of surfaces being scanned and the angles at which the reader is held relative to the surface. Moreover, since color image sensors are presently becoming less expensive than grayscale sensors of the same resolution, the overall cost of producing a reader according to this invention may decrease.
The foregoing is a detailed description of illustrative embodiments of the invention. Various modifications and additions can be made without departing from the spirit and scope thereof. For example, the placement and colors of various transmitters are highly variable. Additional colors and/or wavelengths of light can be provided for further illumination types. While the colors red, green and blue are employed for particular types of illumination, one skilled in the art will appreciate that alternative color characteristics, such as red/infrared or cyan, magenta and yellow, can be employed according to any of the embodiments contemplated. Appropriate functions in the reader can be established to recognize, read and process these particularized wavelengths instead of, or in addition to, the illumination colors described above. Further, while the embodiments shown herein relate to a handheld scanner, it is expressly contemplated that the principles described herein can be applied to a fixed scanner, and the terms “reader,” “scanner” and the like should be taken broadly to include fixed units. Also, any of the processes or steps described herein can be executed by elements in the handheld reader, a linked computing device or another device. In addition, while colored LEDs are used to generate the desired dark field illumination, the color can alternatively be generated using a colored filter and/or tinted light pipe combined with white LEDs in the ring source. Finally, it is expressly contemplated that any of the processes or steps described herein can be implemented as hardware, software (including program instructions executing on a computer), or a combination of hardware and software. Accordingly, this description should be taken only by way of example and not to otherwise limit the scope of the invention.
Patent | Priority | Assignee | Title |
10007858, | May 15 2012 | Honeywell International Inc.; HONEYWELL INTERNATIONAL INC D B A HONEYWELL SCANNING AND MOBILITY | Terminals and methods for dimensioning objects |
10025314, | Jan 27 2016 | Hand Held Products, Inc. | Vehicle positioning and object avoidance |
10031018, | Jun 16 2015 | Hand Held Products, Inc. | Calibrating a volume dimensioner |
10060721, | Jul 16 2015 | Hand Held Products, Inc. | Dimensioning and imaging items |
10060729, | Oct 21 2014 | Hand Held Products, Inc. | Handheld dimensioner with data-quality indication |
10066982, | Jun 16 2015 | Hand Held Products, Inc. | Calibrating a volume dimensioner |
10083333, | Oct 10 2014 | Hand Held Products, Inc. | Depth sensor based auto-focus system for an indicia scanner |
10094650, | Jul 16 2015 | Hand Held Products, Inc. | Dimensioning and imaging items |
10096099, | Oct 10 2014 | HAND HELD PRODUCTS, INC | Image-stitching for dimensioning |
10121039, | Oct 10 2014 | Hand Held Products, Inc. | Depth sensor based auto-focus system for an indicia scanner |
10127674, | Jun 15 2016 | Hand Held Products, Inc. | Automatic mode switching in a volume dimensioner |
10134120, | Oct 10 2014 | HAND HELD PRODUCTS, INC | Image-stitching for dimensioning |
10140724, | Jan 12 2009 | Intermec IP Corporation | Semi-automatic dimensioning with imager on a portable device |
10163216, | Jun 15 2016 | Hand Held Products, Inc. | Automatic mode switching in a volume dimensioner |
10203402, | Jun 07 2013 | Hand Held Products, Inc. | Method of error correction for 3D imaging device |
10218964, | Oct 21 2014 | Hand Held Products, Inc. | Dimensioning system with feedback |
10225544, | Nov 19 2015 | Hand Held Products, Inc. | High resolution dot pattern |
10228452, | Jun 07 2013 | Hand Held Products, Inc. | Method of error correction for 3D imaging device |
10240914, | Aug 06 2014 | Hand Held Products, Inc. | Dimensioning system with guided alignment |
10247547, | Jun 23 2015 | Hand Held Products, Inc. | Optical pattern projector |
10249030, | Oct 30 2015 | Hand Held Products, Inc. | Image transformation for indicia reading |
10321127, | Aug 20 2012 | Intermec IP CORP | Volume dimensioning system calibration systems and methods |
10339352, | Jun 03 2016 | Hand Held Products, Inc. | Wearable metrological apparatus |
10359273, | Oct 21 2014 | Hand Held Products, Inc. | Handheld dimensioning system with measurement-conformance feedback |
10360424, | Dec 28 2016 | Hand Held Products, Inc. | Illuminator for DPM scanner |
10377624, | Feb 19 2013 | GOJO Industries, Inc. | Refill container labeling |
10393508, | Oct 21 2014 | Hand Held Products, Inc. | Handheld dimensioning system with measurement-conformance feedback |
10402956, | Oct 10 2014 | Hand Held Products, Inc. | Image-stitching for dimensioning |
10417769, | Jun 15 2016 | Hand Held Products, Inc. | Automatic mode switching in a volume dimensioner |
10455112, | Nov 19 2014 | Digimarc Corporation | Optimizing optical scanners for digital watermark detection |
10467806, | May 04 2012 | Intermec IP Corp. | Volume dimensioning systems and methods |
10584962, | May 01 2018 | HAND HELD PRODUCTS, INC | System and method for validating physical-item security |
10593130, | May 19 2015 | Hand Held Products, Inc. | Evaluating image values |
10612958, | Jul 07 2015 | Hand Held Products, Inc. | Mobile dimensioner apparatus to mitigate unfair charging practices in commerce |
10635922, | May 15 2012 | Hand Held Products, Inc. | Terminals and methods for dimensioning objects |
10650205, | Sep 28 2018 | Hand Held Products, Inc.; HAND HELD PRODUCTS, INC | Methods, systems, and apparatuses for scanning and decoding direct part marking indicia |
10713456, | Jul 16 2009 | Digimarc Corporation | Coordinated illumination and image signal capture for enhanced signal detection |
10733748, | Jul 24 2017 | Hand Held Products, Inc. | Dual-pattern optical 3D dimensioning |
10747227, | Jan 27 2016 | Hand Held Products, Inc. | Vehicle positioning and object avoidance |
10775165, | Oct 10 2014 | HAND HELD PRODUCTS, INC | Methods for improving the accuracy of dimensioning-system measurements |
10805603, | Aug 20 2012 | Intermec IP Corp. | Volume dimensioning system calibration systems and methods |
10810715, | Oct 10 2014 | HAND HELD PRODUCTS, INC | System and method for picking validation |
10845184, | Jan 12 2009 | Intermec IP Corporation | Semi-automatic dimensioning with imager on a portable device |
10859375, | Oct 10 2014 | Hand Held Products, Inc. | Methods for improving the accuracy of dimensioning-system measurements |
10872214, | Jun 03 2016 | Hand Held Products, Inc. | Wearable metrological apparatus |
10908013, | Oct 16 2012 | Hand Held Products, Inc. | Dimensioning system |
10909708, | Dec 09 2016 | Hand Held Products, Inc. | Calibrating a dimensioner using ratios of measurable parameters of optic ally-perceptible geometric elements |
11029762, | Jul 16 2015 | Hand Held Products, Inc. | Adjusting dimensioning results using augmented reality |
11047672, | Mar 28 2017 | HAND HELD PRODUCTS, INC | System for optically dimensioning |
11095868, | Jul 01 2016 | Cognex Corporation | Vision systems and methods of making and using the same |
11403887, | May 19 2015 | Hand Held Products, Inc. | Evaluating image values |
11639846, | Sep 27 2019 | Honeywell International Inc | Dual-pattern optical 3D dimensioning |
11906280, | May 19 2015 | Hand Held Products, Inc. | Evaluating image values |
11957270, | Feb 09 2018 | Societe des Produits Nestle S A | Beverage preparation machine with capsule recognition |
8692663, | Aug 10 2010 | General Motors LLC | Wireless monitoring of battery for lifecycle management |
9120106, | Feb 19 2013 | GOJO Industries, Inc | Refill container labeling |
9507987, | Jul 07 2015 | Symbol Technologies, LLC | Arrangement for and method of illuminating a target to be electro-optically read by image capture with interchangeable illumination modules |
9569653, | Jun 13 2016 | Datalogic IP Tech, S.r.l. | Dark field illumination system obtained in a tilted plane |
9665760, | Apr 05 2016 | The Code Corporation | Barcode-reading system |
9811705, | May 10 2016 | DATALOGIC IP TECH S R L | Illumination system with active element for generating different illumination patterns for a data reader |
9902606, | Feb 19 2013 | GOJO Industries, Inc. | Refill container labeling |
9922128, | Apr 05 2016 | The Code Corporation | Barcode-reading system |
Patent | Priority | Assignee | Title |
5359185, | May 11 1992 | Intermec IP CORP | Chromatic ranging method and apparatus for reading optically readable information over a substantial range of distances |
5420712, | Jun 10 1992 | Nikon Corporation | Scanning device |
5714745, | Feb 26 1996 | Symbol Technologies, Inc | Portable data collection device with color imaging assembly |
6633375, | Jan 29 1999 | Leica Microsystems Semiconductor GmbH | Method and device for optically examining structured surfaces of objects |
6834807, | Jul 13 2001 | Hand Held Products, Inc. | Optical reader having a color imager |
6970608, | Dec 30 2001 | Cognex Technology and Investment LLC | Method for obtaining high-resolution performance from a single-chip color image sensor |
7025271, | Dec 18 2002 | Symbol Technologies, LLC | Imaging optical code reader having selectable depths of field |
7028901, | Jul 17 2003 | Symbol Technologies, LLC | System and method for reading and decoding optical codes using multiple color illumination |
7163149, | Mar 02 2004 | Symbol Technologies, LLC | System and method for illuminating and reading optical codes imprinted or displayed on reflective surfaces |
7617984, | Dec 16 2004 | Cognex Corporation | Hand held symbology reader illumination diffuser |
7823783, | Oct 24 2003 | Cognex Technology and Investment LLC | Light pipe illumination system and method |
7823789, | Dec 21 2004 | Cognex Technology and Investment LLC | Low profile illumination for direct part mark readers |
20030062413, | |||
20050011956, | |||
20050168729, | |||
20060072158, | |||
20070090193, | |||
DE10134974, | |||
DE4108916, | |||
EP516927, | |||
JP200212379, | |||
JP962831, | |||
WO217216, | |||
WO2005043449, | |||
WO2006065619, |
Executed on | Assignor | Assignee | Conveyance | Reel | Frame | Doc
Oct 24 2005 | Cognex Technology and Investment Corporation | (assignment on the face of the patent) | / | |||
Dec 19 2005 | NUNNINK, LAURENS | Cognex Technology and Investment Corporation | ASSIGNMENT OF ASSIGNORS INTEREST SEE DOCUMENT FOR DETAILS | 016989 | /0151 | |
May 19 2014 | NUNNINK, LAURENS | Cognex Technology and Investment LLC | CORRECTIVE ASSIGNMENT TO CORRECT THE NAME AND COMPANY IDENTITY OF ASSIGNEE PREVIOUSLY RECORDED ON REEL 016989 FRAME 0151. ASSIGNOR(S) HEREBY CONFIRMS THE ASSIGNMENT | 033917 | /0227 |
Date | Maintenance Fee Events |
May 19 2015 | M1551: Payment of Maintenance Fee, 4th Year, Large Entity. |
Mar 13 2019 | M1552: Payment of Maintenance Fee, 8th Year, Large Entity. |
Jul 10 2023 | REM: Maintenance Fee Reminder Mailed. |
Dec 25 2023 | EXP: Patent Expired for Failure to Pay Maintenance Fees. |
Date | Maintenance Schedule |
Nov 22 2014 | 4 years fee payment window open |
May 22 2015 | 6 months grace period start (w surcharge) |
Nov 22 2015 | patent expiry (for year 4) |
Nov 22 2017 | 2 years to revive unintentionally abandoned end. (for year 4) |
Nov 22 2018 | 8 years fee payment window open |
May 22 2019 | 6 months grace period start (w surcharge) |
Nov 22 2019 | patent expiry (for year 8) |
Nov 22 2021 | 2 years to revive unintentionally abandoned end. (for year 8) |
Nov 22 2022 | 12 years fee payment window open |
May 22 2023 | 6 months grace period start (w surcharge) |
Nov 22 2023 | patent expiry (for year 12) |
Nov 22 2025 | 2 years to revive unintentionally abandoned end. (for year 12) |