An optical code symbol reading system including a hand-supportable housing having a light transmission aperture. A manually-actuated trigger switch is integrated within the housing. An optical code symbol reading subsystem is disposed in the housing for optically reading a code symbol in the field external to the light transmission aperture, and generating symbol character data representative of the read code symbol. One or more light emitting diodes (LEDs) are disposed in the housing for producing visible illumination. Also, an optical-waveguide structure is disposed in the housing for coupling visible illumination produced by the one or more LEDs into an optically-translucent region about the manually-actuated trigger switch, thereby causing the optically-translucent region to glow and visually indicate where the manually-actuated trigger switch is located on the hand-supportable housing.
16. An optical code symbol reading system, comprising:
a housing having a light transmission aperture;
a manually-actuated trigger switch integrated within said housing;
an optical code symbol reading subsystem for optically reading a code symbol in the field external to said light transmission aperture, and generating symbol character data representative of the read code symbol;
one or more light emitting diodes (LEDs) for producing a visible illumination; and
an optical-waveguide structure, disposed in said housing, and having an optically-translucent region about said manually-actuated trigger switch, and adapted for coupling said visible illumination produced from said one or more LEDs, so as to illuminate said optically-translucent region about said manually-actuated trigger switch, thereby causing said optically-translucent region to glow and visually indicate where said manually-actuated trigger switch is located on said housing;
wherein said optical-waveguide structure comprises:
one or more light coupling elements arranged about and in optical communication with said optically-translucent region about said manually-actuated trigger switch;
wherein said optically-translucent region about said manually-actuated trigger switch surrounds an aperture through which said manually-actuated trigger switch is installed at the top portion of said housing; and
wherein visible illumination produced from said one or more LEDs is optically conducted by said one or more light coupling elements into said optically-translucent region about said manually-actuated trigger switch, causing said optically-translucent region to glow and visually indicate where said manually-actuated trigger switch is located on said housing.
1. An optical code symbol reading system comprising:
a hand-supportable housing having a light transmission aperture;
a manually-actuated trigger switch integrated within said housing;
an optical code symbol reading subsystem for optically reading a code symbol in the field external to said light transmission aperture, and generating symbol character data representative of the read code symbol;
one or more light emitting diodes (LEDs) for producing a visible illumination; and
an optical-waveguide structure, disposed in said hand-supportable housing, and having an optically-translucent region about said manually-actuated trigger switch, and adapted for coupling said visible illumination produced from said one or more LEDs, so as to illuminate said optically-translucent region about said manually-actuated trigger switch, thereby causing said optically-translucent region to glow and visually indicate where said manually-actuated trigger switch is located on said hand-supportable housing;
wherein said optical code symbol reading subsystem comprises a digital image capture and processing subsystem comprising:
an area-type image formation and detection subsystem having image formation optics for producing a field of view (FOV) upon an object to be imaged and forming an image of the object on an area-type image detection array, during illumination operations in an image capture mode, and detecting 2D digital images of the object;
a LED-based illumination subsystem for producing a field of illumination within the FOV of said image formation and detection subsystem during said image capture mode, so that light transmitted from said LED-based illumination subsystem and reflected from said illuminated object and transmitted through said light transmission aperture is detected by said area-type image detection array;
an image capturing and buffering subsystem for capturing and buffering 2-D images detected by the image formation and detection subsystem; and
a digital image processing subsystem for processing said 2D images captured and buffered by said image capturing and buffering subsystem so as to read one or more code symbols graphically represented in said 2D digital images and generating symbol character data representative thereof.
9. An optical code symbol reading system comprising:
a housing having a light transmission aperture;
a manually-actuated trigger switch integrated within said housing;
an optical code symbol reading subsystem for optically reading a code symbol in the field external to said light transmission aperture, and generating symbol character data representative of the read code symbol;
one or more light emitting diodes (LEDs) for producing a visible illumination; and
an optical-waveguide structure, disposed in said housing, and having an optically-translucent region about said manually-actuated trigger switch, and adapted for coupling said visible illumination produced from said one or more LEDs, so as to illuminate said optically-translucent region about said manually-actuated trigger switch, thereby causing said optically-translucent region to glow and visually indicate where said manually-actuated trigger switch is located on said housing;
wherein said optical code symbol reading subsystem comprises a digital image capture and processing subsystem comprising:
an area-type image formation and detection subsystem having image formation optics for producing a field of view (FOV) upon an object to be imaged and forming an image of the object on an area-type image detection array, during illumination operations in an image capture mode, and detecting 2D digital images of the object;
a LED-based illumination subsystem for producing a field of illumination within the FOV of said image formation and detection subsystem during said image capture mode, so that light transmitted from said LED-based illumination subsystem and reflected from said illuminated object and transmitted through said light transmission aperture is detected by said area-type image detection array;
an automatic illumination control subsystem for controlling the operation of said LED-based illumination subsystem;
an image capturing and buffering subsystem for capturing and buffering 2-D images detected by the image formation and detection subsystem; and
a digital image processing subsystem for processing said 2D images captured and buffered by said image capturing and buffering subsystem so as to read one or more code symbols graphically represented in said 2D digital images and generating symbol character data representative thereof.
2. The optical code symbol reading system of
3. The optical code symbol reading system of
an automatic illumination control subsystem for controlling the operation of said LED-based illumination subsystem.
4. The optical code symbol reading system of
an object detection subsystem for producing an object detection field within the FOV of said image formation and detection subsystem.
5. The optical code symbol reading system of
one or more light coupling elements arranged about and in optical communication with said optically-translucent region about said manually-actuated trigger switch;
wherein said optically-translucent region about said manually-actuated trigger switch surrounds an aperture through which said manually-actuated trigger switch is installed at the top portion of said hand-supportable housing; and
wherein visible illumination produced from said one or more LEDs is optically conducted by said one or more light coupling elements into said optically-translucent region about said manually-actuated trigger switch, causing said optically-translucent region to glow and visually indicate where said manually-actuated trigger switch is located on said hand-supportable housing.
6. The optical code symbol reading system of
7. The optical code symbol reading system of
8. The optical code symbol reading system of
10. The optical code symbol reading system of
an input/output subsystem for outputting processed image data to an external host system or other information receiving or responding device; and
a system control subsystem for controlling and/or coordinating said subsystem during object illumination and imaging operations.
11. The optical code symbol reading system of
12. The optical code symbol reading system of
one or more light coupling elements arranged about and in optical communication with said optically-translucent region about said manually-actuated trigger switch;
wherein said optically-translucent region about said manually-actuated trigger switch surrounds an aperture through which said manually-actuated trigger switch is installed at the top portion of said hand-supportable housing; and
wherein visible illumination produced from said one or more LEDs is optically conducted by said one or more light coupling elements into said optically-translucent region about said manually-actuated trigger switch, causing said optically-translucent region to glow and visually indicate where said manually-actuated trigger switch is located on said hand-supportable housing.
13. The optical code symbol reading system of
14. The optical code symbol reading system of
15. The optical code symbol reading system of
17. The optical code symbol reading system of
18. The optical code symbol reading system of
19. The optical code symbol reading system of
20. The optical code symbol reading system of
This application is a Continuation of U.S. application Ser. No. 12/012,222 filed Jan. 31, 2008; which is a Continuation of U.S. application Ser. No. 12/005,150 filed Dec. 21, 2007, now U.S. Pat. No. 7,980,471; which is a Continuation of U.S. application Ser. No. 12/001,758 filed Dec. 12, 2007, now U.S. Pat. No. 7,841,533; which is a Continuation-in-Part of the following U.S. Applications: Ser. No. 11/640,814 filed Dec. 18, 2006, now U.S. Pat. No. 7,708,205; Ser. No. 11/880,087 filed Jul. 19, 2007; Ser. No. 11/305,895 filed Dec. 16, 2005, now U.S. Pat. No. 7,607,581; Ser. No. 10/989,220 filed Nov. 15, 2004, now U.S. Pat. No. 7,490,774; Ser. No. 10/712,787 filed Nov. 13, 2008, now U.S. Pat. No. 7,128,266; Ser. No. 10/893,800 filed Jul. 16, 2004, now U.S. Pat. No. 7,273,180; Ser. No. 10/893,797 filed Jul. 16, 2004, now U.S. Pat. No. 7,188,770; Ser. No. 10/893,798 filed Jul. 16, 2004, now U.S. Pat. No. 7,185,817; Ser. No. 10/894,476 filed Jul. 16, 2004, now U.S. Pat. No. 7,178,733; Ser. No. 10/894,478 filed Jul. 19, 2004, now U.S. Pat. No. 7,357,325; Ser. No. 10/894,412 filed Jul. 19, 2004, now U.S. Pat. No. 7,213,762; Ser. No. 10/894,477 filed Jul. 19, 2004, now U.S. Pat. No. 7,360,706; Ser. No. 10/895,271 filed Jul. 20, 2004, now U.S. Pat. No. 7,216,810; Ser. No. 10/895,811 filed Jul. 20, 2004, now U.S. Pat. No. 7,225,988; Ser. No. 10/897,390 filed Jul. 22, 2004, now U.S. Pat. No. 7,237,722; Ser. No. 10/897,389 filed Jul. 22, 2004, now U.S. Pat. No. 7,225,989; Ser. No. 10/901,463 filed Jul. 27, 2004, now U.S. Pat. No. 7,086,595; Ser. No. 10/901,426 filed Jul. 27, 2004, now U.S. Pat. No. 7,278,575; Ser. No. 10/901,446 filed Jul. 27, 2004; Ser. No. 10/901,461 filed Jul. 28, 2004, now U.S. Pat. No. 7,320,431; Ser. No. 10/901,429 filed Jul. 28, 2004, now U.S. Pat. No. 7,243,847; Ser. No. 10/901,427 filed Jul. 28, 2004, now U.S. Pat. No. 7,267,282; Ser. No. 10/901,445 filed Jul. 28, 2004, now U.S. Pat. No. 7,240,844; Ser. No. 10/901,428 filed Jul. 28, 2004, now U.S. Pat. 
No. 7,293,714; Ser. No. 10/902,709 filed Jul. 29, 2004, now U.S. Pat. No. 7,270,272; Ser. No. 10/901,914 filed Jul. 29, 2004, now U.S. Pat. No. 7,325,738; Ser. No. 10/902,710 filed Jul. 29, 2004, now U.S. Pat. No. 7,281,661; Ser. No. 10/909,270 filed Jul. 30, 2004, now U.S. Pat. No. 7,284,705; and Ser. No. 10/909,255 filed Jul. 30, 2004, now U.S. Pat. No. 7,299,986; Ser. No. 10/903,904 filed Jul. 30, 2004, now U.S. Pat. No. 7,255,279. Each said patent application is assigned to and commonly owned by Metrologic Instruments, Inc. of Blackwood, N.J., and is incorporated herein by reference in its entirety.
1. Field of Invention
The present invention relates to area-type digital image capture and processing systems having diverse modes of digital image processing for reading one-dimensional (1D) and two-dimensional (2D) bar code symbols, as well as other forms of graphically-encoded intelligence, employing advanced methods of automatic illumination and imaging to meet demanding end-user application requirements.
2. Brief Description of the State of the Art
The state of the automatic-identification industry can be understood in terms of (i) the different classes of bar code symbologies that have been developed and adopted by the industry, and (ii) the kinds of apparatus developed and used to read such bar code symbologies in various user environments.
In general, there are currently three major classes of bar code symbologies, namely: one-dimensional (1D) bar code symbologies, such as UPC/EAN, Code 39, etc.; 1D stacked bar code symbologies, such as Code 49, PDF417, etc.; and two-dimensional (2D) data matrix symbologies.
One-dimensional (1D) optical bar code readers are well known in the art. Examples of such readers include readers of the Metrologic Voyager® Series Laser Scanner manufactured by Metrologic Instruments, Inc. Such readers include processing circuits that are able to read one dimensional (1D) linear bar code symbologies, such as the UPC/EAN code, Code 39, etc., that are widely used in supermarkets. Such 1D linear symbologies are characterized by data that is encoded along a single axis, in the widths of bars and spaces, so that such symbols can be read from a single scan along that axis, provided that the symbol is imaged with a sufficiently high resolution along that axis.
In order to allow the encoding of larger amounts of data in a single bar code symbol, a number of 1D stacked bar code symbologies have been developed, including Code 49, as described in U.S. Pat. No. 4,794,239 (Allais), and PDF417, as described in U.S. Pat. No. 5,340,786 (Pavlidis, et al.). Stacked symbols partition the encoded data into multiple rows, each including a respective 1D bar code pattern, all or most of all of which must be scanned and decoded, then linked together to form a complete message. Scanning still requires relatively high resolution in one dimension only, but multiple linear scans are needed to read the whole symbol.
The third class of bar code symbologies, known as 2D matrix symbologies offer orientation-free scanning and greater data densities and capacities than their 1D counterparts. In 2D matrix codes, data is encoded as dark or light data elements within a regular polygonal matrix, accompanied by graphical finder, orientation and reference structures. When scanning 2D matrix codes, the horizontal and vertical relationships of the data elements are recorded with about equal resolution.
In order to avoid having to use different types of optical readers to read these different types of bar code symbols, it is desirable to have an optical reader that is able to read symbols of any of these types, including their various subtypes, interchangeably and automatically. More particularly, it is desirable to have an optical reader that is able to read all three of the above-mentioned types of bar code symbols, without human intervention, i.e., automatically. This, in turn, requires that the reader have the ability to automatically discriminate between and decode bar code symbols, based only on information read from the symbol itself. Readers that have this ability are referred to as “auto-discriminating” or having an “auto-discrimination” capability.
If an auto-discriminating reader is able to read only 1D bar code symbols (including their various subtypes), it may be said to have a 1D auto-discrimination capability. Similarly, if it is able to read only 2D bar code symbols, it may be said to have a 2D auto-discrimination capability. If it is able to read both 1D and 2D bar code symbols interchangeably, it may be said to have a 1D/2D auto-discrimination capability. Often, however, a reader is said to have a 1D/2D auto-discrimination capability even if it is unable to discriminate between and decode 1D stacked bar code symbols.
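By way of illustration only (this sketch does not appear in the specification, and the decoder names and image representation are hypothetical), 1D/2D auto-discrimination can be thought of as trying each class of decoder against the same captured image and accepting the first successful decode:

```python
# Hypothetical sketch of 1D/2D auto-discrimination: each symbology-class
# decoder is tried in sequence against the same captured image, and the
# first successful decode determines the symbology. Here an "image" is
# modeled as a dict mapping symbology keys to decoded data, purely for
# illustration; real decoders would analyze pixel data.

def decode_1d(image):
    # Placeholder linear-symbology decoder (UPC/EAN, Code 39, ...).
    return image.get("1d")

def decode_stacked(image):
    # Placeholder stacked-symbology decoder (Code 49, PDF417, ...).
    return image.get("stacked")

def decode_2d_matrix(image):
    # Placeholder 2D matrix decoder (Data Matrix, ...).
    return image.get("matrix")

def auto_discriminate(image):
    """Return (symbology, data) from the first decoder that succeeds,
    or (None, None) if no symbol can be read."""
    for name, decoder in [("1D", decode_1d),
                          ("stacked", decode_stacked),
                          ("2D matrix", decode_2d_matrix)]:
        data = decoder(image)
        if data is not None:
            return name, data
    return None, None
```

A reader with only the first decoder would have a 1D auto-discrimination capability; with all three, a 1D/2D auto-discrimination capability in the sense described above.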
Optical readers that are capable of 1D auto-discrimination are well known in the art. An early example of such a reader is Metrologic's VoyagerCG® Laser Scanner, manufactured by Metrologic Instruments, Inc.
Optical readers, particularly hand held optical readers, that are capable of 1D/2D auto-discrimination and based on the use of an asynchronously moving 1D image sensor, are described in U.S. Pat. Nos. 5,288,985 and 5,354,977, which patents are hereby expressly incorporated herein by reference. Other examples of hand held readers of this type, based on the use of a stationary 2D image sensor, are described in U.S. Pat. Nos. 6,250,551; 5,932,862; 5,932,741; 5,942,741; 5,929,418; 5,914,476; 5,831,254; 5,825,006; 5,784,102, which are also hereby expressly incorporated herein by reference.
Optical readers, whether of the stationary or movable type, usually operate at a fixed scanning rate, which means that the readers are designed to complete some fixed number of scans during a given amount of time. This scanning rate generally has a value that is between 30 and 200 scans/sec for 1D readers. In such readers, the results of the successive scans are decoded in the order of their occurrence.
Imaging-based bar code symbol readers have a number of advantages over laser scanning based bar code symbol readers, namely: they are more capable of reading stacked 2D symbologies, such as the PDF 417 symbology; more capable of reading matrix 2D symbologies, such as the Data Matrix symbology; more capable of reading bar codes regardless of their orientation; have lower manufacturing costs; and have the potential for use in other applications, which may or may not be related to bar code scanning, such as OCR, security systems, etc.
Prior art digital image capture and processing systems suffer from a number of additional shortcomings and drawbacks.
Most prior art hand held optical reading devices can be reprogrammed by reading bar codes from a bar code programming menu or with use of a local host processor as taught in U.S. Pat. No. 5,929,418. However, these devices are generally constrained to operate within the modes in which they have been programmed to operate, either in the field or on the bench, before deployment to end-user application environments. Consequently, the statically-configured nature of such prior art imaging-based bar code reading systems has limited their performance.
Prior art digital image capture and processing systems with integrated illumination subsystems also support a relatively short optical depth of field. This limits the ability of such systems to read large or highly dense bar code labels.
Prior art digital image capture and processing systems generally require separate apparatus for producing a visible aiming beam to help the user to aim the camera's field of view at the bar code label on a particular target object.
Prior art digital image capture and processing systems generally require capturing multiple frames of image data of a bar code symbol, and special apparatus for synchronizing the decoding process with the image capture process within such readers, as required in U.S. Pat. Nos. 5,932,862 and 5,942,741 assigned to Welch Allyn, Inc.
Prior art digital image capture and processing systems generally require large arrays of LEDs in order to flood the field of view within which a bar code symbol might reside during image capture operations, oftentimes wasting large amounts of electrical power which can be significant in portable or mobile imaging-based readers.
Prior art digital image capture and processing systems generally require processing the entire pixel data set of captured images to find and decode bar code symbols represented therein. On the other hand, some prior art imaging systems use the inherent programmable (pixel) windowing feature within conventional CMOS image sensors to capture only partial image frames, reducing pixel data set processing and thereby improving image processing speed and imaging system performance.
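The pixel-windowing idea can be sketched as follows (an illustrative model only, not from the specification: a real CMOS sensor performs the windowed readout in hardware via its register interface, whereas here a frame is simply a list of pixel rows):

```python
# Sketch of programmable (pixel) windowing: instead of processing the full
# frame, only a region of interest (ROI) around the candidate symbol is
# extracted, shrinking the pixel data set that downstream decoding must
# process. The frame is modeled as a list of rows of pixel values.

def window_frame(frame, top, left, height, width):
    """Return the ROI sub-frame covering rows top..top+height-1 and
    columns left..left+width-1 of the captured frame."""
    return [row[left:left + width] for row in frame[top:top + height]]
```

For example, windowing a 3x3 frame down to its lower-right 2x2 region reduces the pixels to be processed from nine to four; at sensor scale the savings are proportionally far larger.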
Many prior art digital image capture and processing systems also require the use of decoding algorithms that seek to find the orientation of bar code elements in a captured image by finding and analyzing the code words of 2-D bar code symbologies represented therein.
Some prior art digital image capture and processing systems generally require the use of a manually-actuated trigger to actuate the image capture and processing cycle thereof.
Prior art digital image capture and processing systems generally require separate sources of illumination for producing visible aiming beams and for producing visible illumination beams used to flood the field of view of the bar code reader.
Prior art digital image capture and processing systems generally utilize, during a single image capture and processing cycle, a single decoding methodology for decoding bar code symbols represented in captured images.
Some prior art digital image capture and processing systems require exposure control circuitry integrated with the image detection array for measuring the light exposure levels on selected portions thereof.
Also, many imaging-based readers also require processing portions of captured images to detect the image intensities thereof and determine the reflected light levels at the image detection component of the system, and thereafter to control the LED-based illumination sources to achieve the desired image exposure levels at the image detector.
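The closed-loop illumination control described above can be sketched as a simple proportional feedback step (an illustrative sketch only, not from the specification; the target level, gain, and duty-cycle model are hypothetical):

```python
# Hedged sketch of LED-based illumination control: the mean intensity of a
# sampled image region drives the LED duty cycle toward a target exposure
# level. Target (mid-scale for 8-bit pixels) and gain are illustrative.

def mean_intensity(pixels):
    """Mean intensity of a sampled list of 8-bit pixel values."""
    return sum(pixels) / len(pixels)

def adjust_led_duty(duty, pixels, target=128.0, gain=0.002):
    """Return a new LED duty cycle (clamped to [0.0, 1.0]) after one
    proportional correction step toward the target mean intensity."""
    error = target - mean_intensity(pixels)
    return max(0.0, min(1.0, duty + gain * error))
```

A dim sample (mean below target) nudges the duty cycle upward, a bright sample nudges it downward, and repeated capture-measure-adjust cycles settle the image exposure near the desired level at the image detector.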
Prior art digital image capture and processing systems employing integrated illumination mechanisms control image brightness and contrast by controlling the time that the image sensing device is exposed to the light reflected from the imaged objects. While this method has been proven for CCD-based bar code scanners, it is not suitable for CMOS-based image sensing devices, which require a more sophisticated shuttering mechanism, leading to increased complexity, less reliability and, ultimately, more expensive bar code scanning systems.
Prior art digital image capture and processing systems generally require the use of tables and bar code menus to manage which decoding algorithms are to be used within any particular mode of system operation to be programmed by reading bar code symbols from a bar code menu.
Also, due to the complexity of the hardware platforms of such prior art digital image capture and processing systems, end-users are not permitted to modify the features and functionalities of such system to their customized application requirements, other than changing limited functions within the system by reading system-programming type bar code symbols, as disclosed in U.S. Pat. Nos. 6,321,989; 5,965,863; 5,929,418; and 5,932,862, each being incorporated herein by reference.
Also, dedicated image-processing based bar code symbol reading devices usually have very limited resources, such as the amount of volatile and non-volatile memories. Therefore, they usually do not have a rich set of tools normally available to universal computer systems. Further, if a customer or a third-party needs to enhance or alter the behavior of a conventional image-processing based bar code symbol reading system or device, they need to contact the device manufacturer and negotiate the necessary changes in the “standard” software or the ways to integrate their own software into the device, which usually involves the re-design or re-compilation of the software by the original equipment manufacturer (OEM). This software modification process is both costly and time consuming.
Prior Art Field of View (FOV) Aiming, Targeting, Indicating and Marking Techniques
The need to target, indicate and/or mark the field of view (FOV) of 1D and 2D image sensors within hand-held imagers has also been long recognized in the industry.
In U.S. Pat. No. 4,877,949, Danielson et al disclosed on Aug. 8, 1986 a digital image capture and processing system having a 2D image sensor with a field of view (FOV) and also a pair of LEDs mounted about a 1D (i.e. linear) image sensor to project a pair of light beams through the FOV focusing optics and produce a pair of spots on a target surface supporting a 1D bar code, thereby indicating the location of the FOV on the target and enabling the user to align the bar code therewithin.
In U.S. Pat. No. 5,019,699, Koenck et al disclosed on Aug. 31, 1988 a digital image capture and processing system having a 2D image sensor with a field of view (FOV) and also a set of four LEDs (each with lenses) about the periphery of a 2D (i.e. area) image sensor to project four light beams through the FOV focusing optics and produce four spots on a target surface to mark the corners of the FOV intersecting with the target, to help the user align 1D and 2D bar codes therewithin in an easy manner.
In FIGS. 48-50 of U.S. Pat. Nos. 5,841,121 and 6,681,994, Koenck disclosed on Nov. 21, 1990, a digital image capture and processing system having a 2D image sensor with a field of view (FOV) and also apparatus for marking the perimeter of the FOV, using four light sources and light shaping optics (e.g. cylindrical lens).
In U.S. Pat. No. 5,378,883, Batterman et al disclosed on Jul. 29, 1991, a hand-held digital image capture and processing system having a 2D image sensor with a field of view (FOV) and also a laser light source and fixed lens to produce a spotter beam that helps the operator aim the reader at a candidate bar code symbol. As disclosed, the spotter beam is also used to measure the distance to the bar code symbol during automatic focus control operations supported within the bar code symbol reader.
In U.S. Pat. No. 5,659,167, Wang et al disclosed on Apr. 5, 1994, a digital image capture and processing system comprising a 2D image sensor with a field of view (FOV), a user display for displaying a visual representation of a dataform (e.g. bar code symbol), and visual guide marks on the user display for indicating that the dataform being imaged is in focus when its image is within the guide marks, and out of focus when its image is outside the guide marks.
In U.S. Pat. No. 6,347,163, Roustaei disclosed on May 19, 1995, a system for reading 2D images comprising a 2D image sensor, an array of LED illumination sources, and an image framing device which uses a VLD for producing a laser beam and a light diffractive optical element for transforming the laser beam into a plurality of beamlets having a beam edge and a beamlet spacing at the 2D image, which is at least as large as the width of the 2D image.
In U.S. Pat. No. 5,783,811, Feng et al disclosed on Feb. 26, 1996, a portable imaging assembly comprising a 2D image sensor with a field of view (FOV) and also a set of LEDs and a lens array which produces a cross-hair type illumination pattern in the FOV for aiming the imaging assembly at a target.
In U.S. Pat. No. 5,793,033, Feng et al disclosed on Mar. 29, 1996, a portable imaging assembly comprising a 2D image sensor with a field of view (FOV), and a viewing assembly having a pivoting member which, when positioned a predetermined distance from the operator's eye, provides a view through its opening which corresponds to the target area (FOV) of the imaging assembly.
In U.S. Pat. No. 5,780,834, Havens et al disclosed on May 14, 1996, a portable imaging and illumination optics assembly having a 2D image sensor with a field of view (FOV), an array of LEDs for illumination, and an aiming or spotting light (LED) indicating the location of the FOV.
In U.S. Pat. No. 5,949,057, Feng et al disclosed on Jan. 31, 1997, a portable imaging device comprising a 2D image sensor with a field of view (FOV), and first and second sets of targeting LEDs and first and second targeting optics, which produce first and second illumination targeting patterns that substantially coincide to form a single illumination targeting pattern when the imaging device is arranged at a “best focus” position.
In U.S. Pat. No. 6,060,722, Havens et al disclosed on Sep. 24, 1997, a portable imaging and illumination optics assembly comprising a 2D image sensor with a field of view (FOV), an array of LEDs for illumination, and an aiming pattern generator including at least a point-like aiming light source and a light diffractive element for producing an aiming pattern that remains approximately coincident with the FOV of the imaging device over the range of the reader-to-target distances over which the reader is used.
In U.S. Pat. No. 6,340,114, filed Jun. 12, 1998, Correa et al disclosed an imaging engine comprising a 2D image sensor with a field of view (FOV) and an aiming pattern generator using one or more laser diodes and one or more light diffractive elements to produce multiple aiming frames having different, partially overlapping, solid angle fields or dimensions corresponding to the different fields of view of the lens assembly employed in the imaging engine. The aiming pattern includes a centrally-located marker or cross-hair pattern. Each aiming frame consists of four corner markers, each comprising a plurality of illuminated spots, for example, two multiple spot lines intersecting at an angle of 90 degrees.
As a result of limitations in the field of view (FOV) marking, targeting and pointing subsystems employed within prior art digital image capture and processing systems, such prior art readers generally fail to enable users to precisely identify which portions of a target fall within the FOV, and thus to read high-density 1D bar codes with the ease and simplicity of laser scanning based bar code symbol readers, as well as 2D symbologies, such as PDF 417 and Data Matrix.
Also, as a result of limitations in the mechanical, electrical, optical, and software design of prior art digital image capture and processing systems, such prior art readers generally: (i) fail to enable users to read high-density 1D bar codes with the ease and simplicity of laser scanning based bar code symbol readers and also 2D symbologies, such as PDF 417 and Data Matrix; and (ii) have not enabled end-users to modify the features and functionalities of such prior art systems without detailed knowledge about the hardware platform, communication interfaces and the user interfaces of such systems.
Also, control operations in prior art image-processing bar code symbol reading systems have not been sufficiently flexible or agile to adapt to the demanding lighting conditions presented in challenging retail and industrial work environments where 1D and 2D bar code symbols need to be reliably read.
Prior art digital imaging and laser scanning systems also suffer from a number of other problems.
Some prior art imaging systems have relied on IR-based object detection using the same image sensing array used for detecting images of objects, and therefore require that the decode microprocessor be powered up during the object detection state of operation, consuming power in a manner that is undesirable in portable digital imaging applications.
Thus, there is a great need in the art for an improved method of and apparatus for reading bar code symbols using image capture and processing techniques which avoid the shortcomings and drawbacks of prior art methods and apparatus.
Accordingly, a primary object of the present invention is to provide a novel method of and apparatus for enabling the reading of 1D and 2D bar code symbologies using image capture and processing based systems and devices, which avoid the shortcomings and drawbacks of prior art methods and apparatus.
Another object of the present invention is to provide a novel hand-supportable digital image capture and processing system capable of automatically reading 1D and 2D bar code symbologies using advanced illumination and imaging techniques, providing speeds and reliability associated with conventional laser scanning bar code symbol readers.
Another object of the present invention is to provide a hand-supportable digital image capture and processing system having an integrated LED-based linear targeting illumination subsystem for automatically generating a visible linear targeting illumination beam for aiming at a target object prior to illuminating the same during its area image capture mode of operation.
Another object of the present invention is to provide a hand-supportable digital image capture and processing system having a presentation mode which employs a hybrid video and snap-shot mode of image detector operation.
Another object of the present invention is to provide a hand-supportable digital image capture and processing system employing automatic object presence detection to control the generation of a wide-area illumination beam during bar code symbol imaging operations.
Another object of the present invention is to provide a hand-supportable digital image capture and processing system employing a CMOS-type image detecting array with a band-pass optical filter subsystem integrated within the hand-supportable housing thereof, to allow only narrow-band illumination from the multi-mode illumination subsystem to expose the image detecting array during object illumination and imaging operations.
Another object of the present invention is to provide a hand-supportable digital image capture and processing system employing a multi-mode LED-based illumination subsystem.
Another object of the present invention is to provide a hand-supportable digital image capture and processing system having 1D/2D auto-discrimination capabilities.
Another object of the present invention is to provide such an imaging-based bar code symbol reader having target applications at the point of sale in convenience stores, gas stations, quick markets, and the like.
Another object of the present invention is to provide a digital image-processing based bar code symbol reading system that is highly flexible and agile to adapt to the demanding lighting conditions presented in challenging retail and industrial work environments where 1D and 2D bar code symbols need to be reliably read.
Another object of the present invention is to provide such an automatic imaging-based bar code symbol reading system, wherein an automatic light exposure measurement and illumination control subsystem is adapted to measure the light exposure on a central portion of the CMOS image detecting array and control the operation of the LED-based illumination subsystem in cooperation with the digital image processing subsystem.
Another object of the present invention is to provide a hand-supportable digital image capture and processing system employing automatic object detection, and a linear targeting illumination beam generated from substantially the same plane as the area image detection array.
Another object of the present invention is to provide a hand-supportable digital image capture and processing system employing hybrid illumination and imaging modes of operation employing a controlled complex of snap-shot and video illumination/imaging techniques.
Another object of the present invention is to provide a hand-supportable digital image capture and processing system employing a single PC board with an imaging aperture, with an image formation and detection subsystem and a linear illumination targeting subsystem supported on the rear side of the board using common FOV/beam folding optics, and also a light collection mirror for collecting central rays along the FOV as part of the automatic light measurement and illumination control subsystem.
Another object of the present invention is to provide a hand-supportable digital image capture and processing system, wherein a pair of LEDs, with corresponding aperture stops and cylindrical mirrors, are mounted on opposite sides of the image detection array in the image formation and detection subsystem, and which employs a common FOV/beam folding mirror to project the linear illumination target beam through the central light transmission aperture (formed in the PC board) and out of the imaging window of the system.
Another object of the present invention is to provide a hand-supportable digital image capture and processing system, wherein a single LED array is mounted above its imaging window and beneath a light ray blocking shroud portion of the housing about the imaging window, to reduce illumination rays from striking the eyes of the system operator or nearby consumers during system operation.
Another object of the present invention is to provide a hand-supportable digital image capture and processing system, with improved menu-reading capabilities.
Another object of the present invention is to provide a hand-supportable digital image capture and processing system having an integrated band-pass filter employing wavelength filtering FOV mirror elements.
Another object of the present invention is to provide a hand-supportable digital image capture and processing system having multi-mode image formation and detection systems supporting snap-shot, true-video, and pseudo (high-speed repeated snap-shot) modes of operation.
Another object of the present invention is to provide a hand-supportable digital image capture and processing system having an image formation and detection system supporting a high-repetition snap-shot mode of operation, wherein the time duration of illumination and imaging is substantially equal to the time for image processing, and wherein global-exposure principles of operation are stroboscopically implemented.
Another object of the present invention is to provide a hand-supportable digital image capture and processing system employing automatic object motion detection using IR sensing techniques (e.g. IR LED/photodiode, IR-based imaging, and IR-based pulse-Doppler LADAR).
Another object of the present invention is to provide a hand-supportable digital image capture and processing system employing an automatic linear illumination target beam, projected from the rear side of the PC board, adjacent to the image sensing array, and reflecting off a FOV folding mirror into the FOV.
Another object of the present invention is to provide a hand-supportable digital image capture and processing system having a single PC board with a light transmission aperture and an image detection array mounted thereon, with the optical axis of the image formation optics perpendicular to said PC board, and a double set of FOV folding mirrors for projecting the FOV out through the light transmission aperture and the imaging window of the system.
Another object of the present invention is to provide a hand-supportable digital image capture and processing system having a single PC board with a light transmission aperture, wherein a pair of cylindrical optical elements, provided for forming a linear illumination target beam, are disposed parallel to a FOV folding mirror used to project the linear illumination target beam out through the light transmission aperture and the imaging window of the system.
Another object of the present invention is to provide a hand-supportable digital image capture and processing system having a single PC board with a light transmission aperture, wherein an array of visible LEDs is mounted on the rear side of the PC board for producing a linear illumination target beam, and an array of visible LEDs is mounted on the front side of the PC board for producing a field of visible illumination within the FOV of the system.
Another object of the present invention is to provide a hand-supportable digital image capture and processing system having a single PC board with a light transmission aperture, wherein a first array of visible LEDs is mounted on the rear side of the PC board for producing a linear illumination target beam, whereas a second array of visible LEDs is mounted on the front side of the PC board for producing a field of visible illumination within the FOV of the system, said field of visible illumination being substantially coextensive with said linear illumination target beam.
Another object of the present invention is to provide a hand-supportable digital image capture and processing system having a single PC board with a light transmission aperture, wherein a set of visible LEDs is mounted on opposite sides of an area-type image detection array mounted to the PC board, for producing a linear illumination target beam that is substantially parallel to the optical axis of the image formation optics of the image detection array as it is projected through the light transmission aperture and imaging window of the system.
Another object of the present invention is to provide a hand-supportable digital image capture and processing system having a single PC board with a light transmission aperture, wherein an automatic light measurement and illumination control subsystem is provided, employing a light collecting mirror disposed behind said light transmission aperture for collecting light from a central portion of the FOV of the system provided by image formation optics before an area-type image detection array mounted on the PC board, and focusing the collected light onto a photodetector mounted adjacent to the image detection array but operating independently of it; and wherein, beyond the light transmission aperture, the optical axis of the light collecting mirror is substantially parallel to the optical axis of the image formation and detection subsystem.
Another object of the present invention is to provide a hand-supportable digital image capture and processing system employing a system control system that controls (i) an image formation and detection subsystem employing an area-type image detection array with image formation optics providing a field of view (FOV), and wherein one of several possible image detection array modes of operation is selectable, and (ii) a multi-mode illumination subsystem employing multiple LED illumination arrays for illuminating selected portions of the FOV.
Another object of the present invention is to provide a hand-supportable digital image capture and processing system employing a system control system that controls an image formation and detection subsystem employing an area-type image detection array with image formation optics providing a field of view (FOV) and in which one of several possible image detection array modes of operation is selectable, and a multi-mode illumination subsystem employing multiple LED illumination arrays for illuminating selected portions of said FOV; and wherein the system supports an illumination and imaging control process employing both snap-shot and video modes of operation.
Another object of the present invention is to provide a hand-supportable digital image capture and processing system employing a linear target illumination beam to align programming-type bar code symbols prior to wide-area illumination, image capture and processing, so as to confirm that such a bar code symbol was intentionally read as a programming-type bar code symbol.
Another object of the present invention is to provide a hand-supportable digital image capture and processing system employing a linear target illumination beam to align programming-type bar code symbols, and a narrowly-confined active subregion in the FOV centered about the linear target illumination beam, so as to confirm that a bar code symbol in this subregion was intentionally read as a programming-type bar code symbol.
Another object of the present invention is to provide a hand/countertop-supportable digital image capture and processing system which carries out a first method of hands-free digital imaging employing automatic hands-free configuration detection, automatic object presence and motion/velocity detection in the field of view (FOV) of the system (i.e. automatic-triggering), automatic illumination and imaging of multiple image frames while operating in a snap-shot mode during a first time interval, and automatic illumination and imaging while operating in a video mode during a second time interval.
Another object of the present invention is to provide a hand/countertop-supportable digital image capture and processing system which carries out a second method of hands-free digital imaging employing automatic hands-free configuration detection, automatic object presence detection in the field of view (FOV) of the system (i.e. automatic-triggering), automatic linear target illumination beam generation, and automatic illumination and imaging of multiple image frames while operating in a snap-shot mode within a predetermined time interval.
Another object of the present invention is to provide such a hand/countertop-supportable digital image capture and processing system which can be easily used for menu-reading applications.
Another object of the present invention is to provide a hand/countertop-supportable digital image capture and processing system which carries out a third method of hands-free digital imaging employing automatic hands-free configuration detection, automatic object presence detection in the field of view (FOV) of the system (i.e. automatic-triggering), and automatic illumination and imaging while operating in a video mode within a predetermined time interval.
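The hands-free imaging methods recited above share a common triggered control flow: detect an object in the FOV, then illuminate and image in snap-shot and/or video mode for bounded time intervals. The sketch below illustrates the flow of the first method in Python; all function names, callbacks, time intervals, and the decode interface are assumptions made for illustration only, not part of the disclosed system.

```python
# Hypothetical control-flow sketch of the first hands-free imaging method:
# after the hands-free (countertop) configuration is detected, the system
# waits for an object in the FOV, captures frames in snap-shot mode during
# a first interval, then falls back to video mode during a second interval.
# All names, intervals, and frame-capture callbacks are illustrative
# assumptions, not from the patent text.

import time

def hands_free_cycle(detect_object, capture_snapshot, capture_video_frame,
                     decode, t_snapshot=0.5, t_video=2.0):
    """Run one automatic-trigger read cycle; return decoded data or None."""
    while not detect_object():          # automatic object presence detection
        time.sleep(0.01)
    deadline = time.monotonic() + t_snapshot
    while time.monotonic() < deadline:  # first interval: snap-shot mode
        data = decode(capture_snapshot())
        if data:
            return data
    deadline = time.monotonic() + t_video
    while time.monotonic() < deadline:  # second interval: video mode
        data = decode(capture_video_frame())
        if data:
            return data
    return None                         # no symbol read in this cycle
```

The second and third methods follow the same pattern with only one of the two imaging intervals active.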
Another object of the present invention is to provide a hand-supportable digital image capture and processing system which carries out a first method of hand-held digital imaging employing automatic hand-held configuration detection, automatic object presence detection in the field of view (FOV) of the system (i.e. automatic-triggering), automatic linear target illumination beam generation (i.e. automatic object targeting), and automatic illumination and imaging of multiple digital image frames while operating in a snap-shot mode within a predetermined time interval.
Another object of the present invention is to provide a hand-supportable digital image capture and processing system which carries out a second method of hand-held digital imaging employing automatic hand-held configuration detection, automatic object presence detection in the field of view (FOV) of the system (i.e. automatic-triggering), automatic linear target illumination beam generation (i.e. automatic object targeting), and automatic illumination and imaging of video image frames while operating in a video mode within a predetermined time interval.
Another object of the present invention is to provide a hand-supportable digital image capture and processing system which carries out a third method of hand-held digital imaging employing automatic hand-held configuration detection, manual trigger switching (i.e. manual-triggering), automatic linear target illumination beam generation (i.e. automatic object targeting), and automatic illumination and imaging of multiple image frames while operating in a snap-shot mode within a predetermined time interval.
Another object of the present invention is to provide a hand-supportable digital image capture and processing system which carries out a fourth method of hand-held digital imaging employing automatic hand-held configuration detection, manual trigger switching (i.e. manual-triggering), automatic linear target illumination beam generation (i.e. automatic object targeting), and automatic illumination and imaging of video image frames while operating in a video mode within a predetermined time interval.
Another object of the present invention is to provide a hand-supportable digital image capture and processing system which carries out a fifth method of hand-held digital imaging employing automatic hand-held configuration detection, manual trigger switching (i.e. manual-triggering), automatic linear target illumination beam generation (i.e. automatic object targeting), and illumination and imaging of a single image frame while operating in a snap-shot mode.
Another object of the present invention is to provide a hand-supportable digital image capture and processing system employing a pseudo-video illumination mode, enabling ½ the number of frames to be captured (e.g. 15 frames/second), with a substantially reduced illumination annoyance index (IAI).
Another object of the present invention is to provide a hand-supportable digital image capture and processing system, wherein a single array of LEDs is used to illuminate the field of view of the system so as to minimize illumination entering the fields of view (FOV) of human operators and spectators in the ambient environment.
Another object of the present invention is to provide such a hand-supportable digital image capture and processing system which further comprises a linear targeting illumination beam.
Another object of the present invention is to provide a hand/countertop-supportable digital image capture and processing system, employing a method of illuminating and capturing digital images at the point of sale using a digital image capture and processing system operating in a presentation mode of operation.
Another object of the present invention is to provide such a hand/countertop-supportable digital image capture and processing system, wherein a light ray blocking structure is arranged about the upper portion of the imaging window.
Another object of the present invention is to provide such a hand-supportable digital image capture and processing system, wherein illumination rays are maintained below an illumination ceiling, above which the fields of view of the human operator and spectators are typically positioned.
Another object of the present invention is to provide a hand-supportable digital image capture and processing system which stores multiple files for different sets of system configuration parameters, which are automatically implemented when one or more of the communication interfaces supported by the system is automatically detected and implemented, without scanning programming-type bar code symbols.
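A configuration scheme of the kind recited above can be sketched as a lookup from the detected communication interface to a stored parameter set. Everything below (interface names, parameter keys, and values) is a hypothetical illustration; the patent text does not specify these details.

```python
# Hypothetical sketch of the multi-interface configuration scheme: the system
# stores one file of system configuration parameters (SCPs) per supported
# communication interface and applies the matching set automatically when
# that interface is detected on the host cable, with no programming-type bar
# code symbols scanned. All names and values are illustrative assumptions.

SCP_FILES = {
    "usb-hid": {"symbologies": ["Code 128", "PDF417"], "suffix": "\r"},
    "rs232": {"symbologies": ["Code 128", "Data Matrix"], "baud": 9600},
    "keyboard-wedge": {"symbologies": ["Code 39"], "interchar_delay_ms": 5},
}

def apply_configuration(detected_interface):
    """Return the stored SCP set for an automatically detected interface."""
    try:
        return SCP_FILES[detected_interface]
    except KeyError:
        raise ValueError("no stored configuration for %r" % detected_interface)
```

Because the parameter sets are selected by interface detection rather than by scanning programming codes, reconfiguration requires no operator action at the deployment site.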
Another object of the present invention is to provide a hand-supportable digital image capture and processing system which incorporates image intensification technology within the image formation and detection subsystem and before the image detection array so as to enable the detection of faint (i.e. low intensity) images of objects formed in the FOV using very low illumination levels, as may be required or desired in demanding environments, such as retail POS environments, where high intensity illumination levels are either prohibited or highly undesired from a human safety and/or comfort point of view.
Another object of the present invention is to provide a hand-supportable digital image capture and processing system employing a LED-driven optical-waveguide structure that is used to illuminate a manually-actuated trigger switch integrated within the hand-supportable housing of the system.
Another object of the present invention is to provide a hand-supportable digital image capture and processing system employing an acoustic-waveguide structure coupling sonic energy, produced from its electro-acoustic transducer, to the sound ports formed in the system housing.
Another object of the present invention is to provide a hand-supportable digital image capture and processing system that is provided with an illumination subsystem employing a prismatic illumination-focusing lens structure integrated within its imaging window, for generating a field of visible illumination that is highly confined below the field of view of the system operator and customers who are present at the POS station at which the digital image capture and processing system is deployed.
Another object of the present invention is to provide a hand-supportable digital image capture and processing system which carries out a method of automatically programming multiple system configuration parameters within system memory of the digital image capture and processing system of the present invention, without reading programming-type bar codes.
Another object of the present invention is to provide a hand-supportable digital image capture and processing system which carries out a method of unlocking restricted features embodied within the digital image capture and processing system of the third illustrative embodiment of the present invention, by reading feature/functionality-unlocking programming-type bar code symbols.
Another object of the present invention is to provide a hand-supportable digital image capture and processing system of the present invention employing a single linear LED illumination array for providing full-field illumination within the entire FOV of the system.
Another object of the present invention is to provide a method of reducing glare produced from an LED-based illumination array employed in a digital image capture and processing system.
Another object of the present invention is to provide a hand-supportable digital image capture and processing system employing a prismatic illumination-focusing lens component, integrated within the imaging window of the present invention.
Another object of the present invention is to provide a hand-supportable digital image capture and processing system having a multi-interface I/O subsystem employing a software-controlled automatic communication interface test/detection process that is carried out over a cable connector physically connecting the I/O ports of the digital image capture and processing subsystem and its designated host system.
Another object of the present invention is to provide a hand-supportable digital image capture and processing system supporting a method of programming a set of system configuration parameters (SCPs) within system during the implementation of the communication interface with a host system.
Another object of the present invention is to provide a hand-supportable digital image capture and processing system which, once initially programmed, avoids the need to read individual programming codes at its end-user deployment environment in order to change additional configuration parameters (e.g. symbologies, prefixes, suffixes, data parsing, etc.) for a particular communication interface supported by the host system environment in which it has been deployed.
Another object of the present invention is to provide such a hand-supportable digital image capture and processing system offering significant advantages including, for example, a reduction in the cost of ownership and maintenance, with a significant improvement in convenience and deployment flexibility within an organizational environment employing diverse host computing system environments.
Another object of the present invention is to provide a hand-supportable digital image capture and processing system, which employs or incorporates automatic gyroscopic-based image stabilization technology within the image formation and detection subsystem, so as to enable the formation and detection of crystal clear images in the presence of environments characterized by hand jitter, camera platform vibration, and the like.
Another object of the present invention is to provide such a hand-supportable digital image capture and processing system, wherein the automatic gyroscopic-based image stabilization technology employs FOV imaging optics and FOV folding mirrors which are gyroscopically stabilized, with a real-time image stabilization system employing multiple accelerometers.
These and other objects of the present invention will become apparent hereinafter and in the Claims to Invention appended hereto.
For a more complete understanding of how to practice the Objects of the Present Invention, the following Detailed Description of the Illustrative Embodiments can be read in conjunction with the accompanying Drawings, briefly described below.
FIG. 5E1 is a schematic representation of transmission characteristics (energy versus wavelength) associated with the red-wavelength reflecting high-pass imaging window integrated within the hand-supportable housing of the digital image capture and processing system of the present invention, showing that optical wavelengths above 620 nanometers are transmitted and wavelengths below 620 nm are substantially blocked (e.g. absorbed or reflected);
FIG. 5E2 is a schematic representation of transmission characteristics (energy versus wavelength) associated with the low-pass optical filter element disposed after the high-pass optical filter element within the digital image capture and processing system, but before its CMOS image detection array, showing that optical wavelengths below 700 nanometers are transmitted and wavelengths above 700 nm are substantially blocked (e.g. absorbed or reflected);
FIG. 5E3 is a schematic representation of the transmission characteristics of the narrow-band spectral filter subsystem integrated within the hand-supportable image capture and processing system of the present invention, plotted against the spectral characteristics of the LED-emissions produced from the Multi-Mode LED-Based Illumination Subsystem of the illustrative embodiment of the present invention;
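The two filter elements described in FIGS. 5E1 through 5E3 cooperate as a narrow band-pass filter. The sketch below models the cascade with idealized step responses; the ~620 nm and ~700 nm cut-offs and the step-function shapes are simplifying assumptions made for illustration, not measured transmission data from the disclosure.

```python
# Idealized model of the two-stage spectral filter subsystem: a high-pass
# imaging window cascaded with a low-pass filter element forms a narrow
# band-pass matched to red LED illumination. Cut-off values and the step
# responses are illustrative assumptions.

def high_pass(wavelength_nm, cutoff_nm=620.0):
    """Idealized transmission of the red-wavelength-reflecting imaging window."""
    return 1.0 if wavelength_nm >= cutoff_nm else 0.0

def low_pass(wavelength_nm, cutoff_nm=700.0):
    """Idealized transmission of the low-pass element before the image array."""
    return 1.0 if wavelength_nm <= cutoff_nm else 0.0

def band_pass(wavelength_nm):
    """Cascaded transmission: only the ~620-700 nm band reaches the detector."""
    return high_pass(wavelength_nm) * low_pass(wavelength_nm)
```

In this model, ambient light outside the red band (e.g. green components near 550 nm or near-IR components above 700 nm) is rejected, so the image detection array is exposed substantially only by the system's own narrow-band LED illumination.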
FIG. 14A1 is a perspective view of the hand-supportable digital image capture and processing system of the first illustrative embodiment, shown operated according to a method of hand-held digital imaging for the purpose of reading bar code symbols from a bar code symbol menu, involving the generation of a visible linear target illumination beam from the system, targeting a programming code symbol therewith, and then illuminating the bar code symbol with wide-field illumination during digital imaging operations over a narrowly-confined active region in the FOV centered about the linear targeting beam;
FIG. 14A2 is a perspective cross-sectional view of the hand-supportable digital image capture and processing system of the first illustrative embodiment in FIG. 14A1, shown operated according to the method of hand-held digital imaging used to read bar code symbols from a bar code symbol menu, involving the steps of (i) generating a visible linear target illumination beam from the system, (ii) targeting a programming-type code symbol therewithin, and then (iii) illuminating the bar code symbol within a wide-area field of illumination during digital imaging operations over a narrowly-confined active region in the FOV centered about the linear targeting beam;
FIGS. 15A1 through 15A3, taken together, show a flow chart describing the control process carried out within the countertop-supportable digital image capture and processing system of the first illustrative embodiment during its first hands-free (i.e. presentation/pass-through) method of digital imaging in accordance with the principles of the present invention, involving the use of its automatic object motion detection and analysis subsystem and both of its snap-shot and video (imaging) modes of subsystem operation;
FIGS. 17A1 and 17A2, taken together, show a flow chart describing the control process carried out within the countertop-supportable digital image capture and processing system of the first illustrative embodiment during its third hands-free method of digital imaging in accordance with the principles of the present invention, involving the use of its automatic object motion detection and analysis subsystem and video imaging mode of subsystem operation;
FIGS. 19A1 and 19A2, taken together, show a flow chart describing the control process carried out within the hand-supportable digital image capture and processing system of the first illustrative embodiment during its second hand-held method of digital imaging in accordance with the principles of the present invention, involving the use of its automatic object motion detection and analysis subsystem and video imaging mode of subsystem operation;
FIGS. 21A1 and 21A2, taken together, show a flow chart describing the control process carried out within the hand-supportable digital image capture and processing system of the first illustrative embodiment during its fourth hand-held method of digital imaging in accordance with the principles of the present invention, involving the use of its manually-actuatable trigger switch and video imaging mode of subsystem operation;
FIGS. 32B1 and 32B2 set forth a schematic block diagram representation of an exemplary implementation of the electronic and photonic aspects of the digital image capture and processing system of the third illustrative embodiment of the present invention, whose components are supported on the PC board assembly of the present invention;
FIG. 33C1 is a cross-sectional partially cut-away view of the digital image capture and processing system of the third illustrative embodiment, taken along lines 33C1-33C1 in
FIG. 33C2 is a cross-sectional view of the prismatic lens component integrated within the upper edge portion of the imaging window of the present invention, employed in the digital image capture and processing system of the third illustrative embodiment, and showing the propagation of light rays from an LED in the linear LED array, and through the prismatic lens component, into the FOV of the system;
FIG. 33G1 is a gray scale image of 1280 pixels by 768 pixels showing the spatial intensity profile of the field of illumination produced from the illumination subsystem of the system at 50 mm from the imaging window, over an exposure duration of 0.5 milliseconds, wherein each pixel has an intensity value ranging from 0 to 255, and due to the illumination design scheme of the illustrative embodiment, the center portion of the intensity profile has a larger intensity value than the edge portion;
FIG. 33G2 is a graphical representation of the horizontal cross section of the spatial intensity profile of FIG. 33G1, taken at the center of the FOV, showing a drop-off in spatial intensity when moving from the center of the FOV to its edge, wherein the "noise-like" structure represents the gray scale values of the 1280 pixels in the gray scale image, whereas the solid smooth line is the curve-fitted result of the fluctuating gray scale pixel values, showing the average intensity value dropping off from the center of the image to its edge;
FIG. 33H1 is a gray scale image of 1280 pixels by 768 pixels showing the spatial intensity profile of the field of illumination produced from the illumination subsystem of the system at 75 mm from the imaging window, over an exposure duration of 0.5 milliseconds, wherein each pixel has an intensity value ranging from 0 to 255, and due to the illumination design scheme of the illustrative embodiment, the center portion of the intensity profile has a larger intensity value than the edge portion;
FIG. 33H2 is a graphical representation of the horizontal cross section of the spatial intensity profile of FIG. 33H1, taken at the center of the FOV, showing a drop-off in spatial intensity when moving from the center of the FOV to its edge, wherein the "noise-like" structure represents the gray scale values of the 1280 pixels in the gray scale image, whereas the solid smooth line is the curve-fitted result of the fluctuating gray scale pixel values, showing the average intensity value dropping off from the center of the image to its edge;
FIG. 33I1 is a gray scale image of 1280 pixels by 768 pixels showing the spatial intensity profile of the field of illumination produced from the illumination subsystem of the system at 100 mm from the imaging window, over an exposure duration of 0.5 milliseconds, wherein each pixel has an intensity value ranging from 0 to 255, and due to the illumination design scheme of the illustrative embodiment, the center portion of the intensity profile has a larger intensity value than the edge portion;
FIG. 33I2 is a graphical representation of the horizontal cross section of the spatial intensity profile of FIG. 33I1, taken at the center of the FOV, showing a drop-off in spatial intensity when moving from the center of the FOV to its edge, wherein the "noise-like" structure represents the gray scale values of the 1280 pixels in the gray scale image, whereas the solid smooth line is the curve-fitted result of the fluctuating gray scale pixel values, showing the average intensity value dropping off from the center of the image to its edge;
FIG. 33J1 is a gray scale image of 1280 pixels by 768 pixels showing the spatial intensity profile of the field of illumination produced from the illumination subsystem of the system at 125 mm from the imaging window, over an exposure duration of 0.5 milliseconds, wherein each pixel has an intensity value ranging from 0 to 255, and due to the illumination design scheme of the illustrative embodiment, the center portion of the intensity profile has a larger intensity value than the edge portion;
FIG. 33J2 is a graphical representation of the horizontal cross section of the spatial intensity profile of FIG. 33J1, taken at the center of the FOV, showing a drop-off in spatial intensity when moving from the center of the FOV to its edge, wherein the "noise-like" structure represents the gray scale values of the 1280 pixels in the gray scale image, whereas the solid smooth line is the curve-fitted result of the fluctuating gray scale pixel values, showing the average intensity value dropping off from the center of the image to its edge;
FIG. 33K1 is a gray scale image of 1280 pixels by 768 pixels showing the spatial intensity profile of the field of illumination produced from the illumination system of the system at 150 mm from the imaging window, over an exposure duration of 0.5 milliseconds, wherein each pixel has an intensity value ranging from 0 to 255, and due to the illumination design scheme of the illustrative embodiment, the center portion of the intensity profile has a larger intensity value than the edge portion;
FIG. 33K2 is a graphical representation of the horizontal cross section of the spatial intensity profile of FIG. 33K1, taken at the center of the FOV, and showing a drop off in spatial intensity when moving from the center of the FOV to its edge, and wherein “noise-like” structures are gray scale values for the 1280 pixels in the grey scale image, whereas the solid smooth line is the curve fitted result of the fluctuation in grey scale image pixel values, showing the average intensity value drop off from the center of the image, to its edge;
FIGS. 43B1 and 43B2 set forth a schematic diagram for the interface switching module employed in the multi-interface I/O subsystem of
FIGS. 43C1 and 43C2 set forth a flow chart describing the automatic interface detection process carried out within the multi-interface I/O subsystem of
Referring to the figures in the accompanying Drawings, the various illustrative embodiments of the hand-supportable and countertop-supportable digital image capture and processing systems of the present invention will be described in great detail, wherein like elements will be indicated using like reference numerals.
Hand-Supportable/Countertop-Supportable Digital Image Capture and Processing System of the First Illustrative Embodiment of the Present Invention
Referring to
In alternative embodiments of the present invention, the form factor of the hand-supportable/countertop-supportable housing of the illustrative embodiments might be different. In yet other alternative embodiments, the housing need not be hand-supportable or countertop-supportable, as disclosed herein, but rather might be designed for stationary or permanent installation above a desktop or countertop surface, at a point-of-sale (POS) station, or a commercial or industrial environment requiring digital imaging for one or more particular applications.
Schematic Block Functional Diagram as System Design Model for the Digital Image Capture and Processing System of the Present Invention
As shown in the system design model of
In general, the primary function of the object motion detection and analysis subsystem 20 is to automatically produce an object detection field 32 within the FOV 33 of the image formation and detection subsystem 21, detect the presence of an object within predetermined regions of the object detection field 32, as well as motion and velocity information about the object therewithin, and generate control signals which are supplied to the system control subsystem 30 for indicating when and where an object is detected within the object detection field of the system.
In the first illustrative embodiment, the image formation and detection (i.e. camera) subsystem 21 includes image formation (camera) optics 34 for providing a field of view (FOV) 33 upon an object to be imaged and a CMOS area-type image detection array 35 for detecting imaged light reflected off the object during illumination and image acquisition/capture operations.
In the first illustrative embodiment, the primary function of the multi-mode LED-based illumination subsystem 22 is to produce a near-field wide-area illumination field 36 from the near-field LED array 23A when an object is automatically detected within the near-field portion of the FOV, and a far-field wide-area illumination field 37 from the far-field LED array 23B when an object is detected within the far-field portion of the FOV. Notably, each such field of illumination has a narrow optical bandwidth and is spatially confined within the FOV of the image formation and detection subsystem 21 during the near-field and far-field modes of illumination and imaging, respectively. This arrangement is designed to ensure that only narrow-band illumination transmitted from the illumination subsystem 22, and reflected from the illuminated object, is ultimately transmitted through a narrow-band transmission-type optical filter subsystem 40 within the system and reaches the CMOS area-type image detection array 35 for detection and processing, whereas all other components of ambient light collected by the light collection optics are substantially rejected at the image detection array 35, thereby providing an improved SNR thereat and thus improving the performance of the system. In the illustrative embodiment, the narrow-band transmission-type optical filter subsystem 40 is realized by (1) a high-pass (i.e. red-wavelength reflecting) filter element 40A embodied within the imaging window 3, and (2) a low-pass filter element 40B mounted either before the CMOS area-type image detection array 35 or anywhere beyond the high-pass filter element 40A, including being realized as a dichroic mirror film supported on at least one of the FOV folding mirrors 74 and 75. FIG. 5E3 sets forth the resulting composite transmission characteristics of the narrow-band transmission spectral filter subsystem 40, plotted against the spectral characteristics of the emission from the LED illumination arrays employed in the LED-based illumination subsystem 22.
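The near-field/far-field selection logic described above can be sketched as follows. This is a minimal illustrative sketch, not the patent's implementation: the 100 mm boundary value and the function and return names are assumptions introduced here for illustration only.

```python
# Illustrative sketch of selecting which wide-area LED array to drive,
# based on the range at which an object is detected within the FOV.
# The 100 mm near/far boundary is a hypothetical placeholder value.

NEAR_FAR_BOUNDARY_MM = 100  # assumed boundary between near-field and far-field portions of the FOV

def select_illumination_array(object_range_mm):
    """Return which wide-area LED array to activate for a detected object."""
    if object_range_mm <= NEAR_FAR_BOUNDARY_MM:
        return "near-field array 23A"   # produces near-field wide-area illumination field 36
    return "far-field array 23B"        # produces far-field wide-area illumination field 37
```

In the actual system this decision is made by the system control subsystem 30 in response to range information supplied by the object motion detection and analysis subsystem 20.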
The primary function of the automatic light exposure measurement and illumination control subsystem 24 is twofold: (1) to measure, in real-time, the energy density [joules/cm²] of photonic energy (i.e. light) collected by the optics of the system at about its image detection array 35, and generate auto-exposure control signals indicating the amount of exposure required for good image formation and detection; and (2) in combination with the illumination array selection control signal provided by the system control subsystem 30, to automatically drive and control the output power of the selected LED arrays 23A and 23B in the illumination subsystem 22, so that objects within the FOV of the system are optimally exposed to LED-based illumination and optimal images are formed and detected at the image detection array 35.
The primary function of the image capturing and buffering subsystem 25 is to (1) detect the entire 2-D image focused onto the 2D image detection array 35 by the image formation optics 34 of the system, (2) generate a frame of digital pixel data for either a selected region of interest of the captured image frame, or for the entire detected image, and then (3) buffer each frame of image data as it is captured. Notably, in the illustrative embodiment, a single 2D image frame (31) is captured during each image capture and processing cycle, or during a particular stage of a processing cycle, so as to eliminate the problems associated with image frame overwriting, and synchronization of image capture and decoding processes, as addressed in U.S. Pat. Nos. 5,932,862 and 5,942,741 assigned to Welch Allyn, and incorporated herein by reference.
The primary function of the digital image processing subsystem 26 is to process digital images that have been captured and buffered by the image capturing and buffering subsystem 25, during both far-field and near-field modes of illumination and operation. Such image processing operation includes image-based bar code decoding methods described in detail hereinafter and in U.S. Pat. No. 7,128,266, incorporated herein by reference.
The primary function of the input/output subsystem 27 is to support universal, standard and/or proprietary data communication interfaces with external host systems and devices, and output processed image data and the like to such external host systems or devices by way of such interfaces. Examples of such interfaces, and technology for implementing the same, are given in U.S. Pat. No. 6,619,549, incorporated herein by reference in its entirety.
The primary function of the System Control Subsystem is to provide some predetermined degree of control, coordination and/or management signaling services to each subsystem component integrated within the system, as shown. While this subsystem can be implemented by a programmed microprocessor, in the preferred embodiments of the present invention, this subsystem is implemented by the three-tier software architecture supported on the microcomputing platform shown in
The primary function of the manually-activatable trigger switch 5 integrated with the hand-supportable/countertop-supportable housing is to enable the user to generate a control activation signal (i.e. trigger event signal) upon manually depressing the same (i.e. causing a trigger event), and to provide this control activation signal to the system control subsystem for use in carrying out its complex system and subsystem control operations, described in detail herein.
The primary function of the system configuration parameter table 29 in system memory is to store (in non-volatile/persistent memory) a set of system configuration and control parameters (i.e. SCPs) for each of the available features and functionalities, and programmable modes of system operation supported in any particular embodiment of the present invention, and which can be automatically read and used by the system control subsystem 30 as required during its complex operations. Notably, such SCPs can be dynamically managed as taught in great detail in copending U.S. patent application Ser. No. 11/640,814 filed Dec. 18, 2006, incorporated herein by reference.
The detailed structure and function of each subsystem will now be described in detail below.
Specification of the System Implementation Model for the Digital Image Capture and Processing System of the Present Invention
As shown in
During image acquisition operations, the image pixels are sequentially read out of the image detection array 35. Although one may choose to read out column-wise or row-wise with some CMOS image sensors, without loss of generality, row-by-row read out of the data is preferred. The pixel image data set is arranged in the SDRAM 48 sequentially, starting at address 0xA0EC0000. Randomly accessing any pixel in the SDRAM is a straightforward matter: the pixel at row y, column x is located at address (0xA0EC0000+y×1280+x). As each image frame always has a frame start signal out of the image detection array 35, that signal can be used to start the DMA process at address 0xA0EC0000, with the address continuously incremented for the rest of the frame. The reading of each image frame is started at address 0xA0EC0000 to avoid any misalignment of data. Notably, however, if the microprocessor 46 has programmed the CMOS image detection array 35 to have an ROI window, then the starting address will be modified to (0xA0EC0000+1280×R1), where R1 is the row number of the top left corner of the ROI. Further details regarding memory access are described in Applicant's prior U.S. Pat. No. 7,128,266, incorporated herein by reference.
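The pixel-addressing arithmetic described above can be expressed directly. The sketch below restates the address formulas from the text; the constant and function names are illustrative, not identifiers from the specification.

```python
# Sketch of the SDRAM pixel-addressing scheme for a 1280-pixel-wide
# frame buffered sequentially starting at address 0xA0EC0000.

FRAME_BASE = 0xA0EC0000   # SDRAM start address of each buffered image frame
ROW_WIDTH  = 1280         # pixels per row of the image detection array

def pixel_address(x, y):
    """Address of the pixel at column x, row y of a full frame."""
    return FRAME_BASE + y * ROW_WIDTH + x

def roi_start_address(r1):
    """Modified DMA start address when an ROI window begins at row R1."""
    return FRAME_BASE + ROW_WIDTH * r1
```

For example, the first pixel of the third row (y = 2, x = 0) lands 2×1280 bytes past the frame base, and an ROI whose top left corner is at row 10 shifts the starting address by 1280×10.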
Specification of the Multi-Mode LED-Based Illumination Subsystem Employed in the Hand-Supportable Digital Image Capture and Processing System of the Present Invention
In the illustrative embodiment shown in
As shown in
As shown in
As shown in
During system operation, the far-field illumination mode of the multi-mode illumination subsystem 22 is automatically activated in response to detecting that an object resides within the far-field portion of the FOV by the IR object motion detection and analysis subsystem. In response thereto, the multi-mode illumination subsystem 22 drives the far-field illumination array 23B to illuminate the far-field portion of the FOV, as shown in
In general, the multi-mode illumination subsystem 22 is designed to cover the entire optical field of view (FOV) of the digital image capture and processing system with sufficient illumination to generate high-contrast images of bar codes located at both short and long distances from the imaging window.
As shown in
Notably, in the illustrative embodiment, the red-wavelength reflecting high-pass optical filter element 40A is embodied within the imaging window panel 3, whereas the low-pass optical filter element 40B is disposed before the image detection array 35, either among the focusing lens elements of the image formation optics 34, or realized as a dichroic surface on one of the FOV folding mirrors 74 and 75. This forms the integrated narrow-band optical filter subsystem 40 which ensures that the object within the FOV is imaged at the image detection array 35 using only spectral components within the narrow-band of illumination produced from the illumination subsystem 22, while all other components of ambient light outside this narrow range (e.g. 15 nm) are substantially rejected.
Specification of the Digital Image Formation and Detection (i.e. IFD or Camera) Subsystem During its Wide-Area Mode of Digital Image Formation and Detection, Supported by Near and Far Fields of Narrow-Band Wide-Area Illumination
As shown in FIGS. 5A through 5G2 and 6A through 6C, the digital image formation and detection subsystem 21 of the illustrative embodiment has a wide-area 2D image capture mode of operation in which either substantially all or a selected region of pixels in its CMOS image detection array 35 are enabled. However, the image formation and detection subsystem 21 can also be easily programmed to support other modes of image capture, namely: (i) a narrow-area image capture mode in which only a few central rows of pixels about the center of the image detection array are enabled, as disclosed in U.S. Pat. No. 7,128,266 and U.S. application Ser. No. 10/989,220, both incorporated herein by reference, and (ii) a wide-area image capture mode in which a predetermined region of interest (ROI) on the CMOS image sensing array is visibly marked as being a region in which its pixel data will be cropped and processed for reading information graphically encoded within the ROI region of captured images, as disclosed in U.S. application Ser. No. 10/989,220, supra.
As shown in
In
Specification of the Narrow-Band Optical Filter Subsystem Integrated within the Housing of the Digital Image Capture and Processing System of the Present Invention
As shown in FIGS. 5D through 5E3, the digital image capture and processing system of the present invention has, integrated within its housing, a narrow-band optical filter subsystem 40 for transmitting substantially only the very narrow band of wavelengths (e.g. 620-700 nanometers) of visible illumination produced from the narrow-band multi-mode illumination subsystem 22, and rejecting all other optical wavelengths outside this narrow optical band however generated (i.e. ambient light sources). As shown, narrow-band optical filter subsystem 40 comprises: (i) high-pass (i.e. red-wavelength reflecting) optical filter element 40A embodied within the plastic imaging window; and (ii) low-pass optical filter element 40B disposed before the CMOS image detection array 35, as described above. Alternatively, the high-pass (i.e. red-wavelength reflecting) optical filter element 40A can be embodied as a dichroic film applied to the surface of one of the FOV folding mirrors 74 or 75 employed in the image formation and detection subsystem. Preferably, the red-color window filter 40A will have substantially planar surface characteristics over its central planar region 3A to avoid focusing or defocusing of light transmitted therethrough during imaging operations. During system operation, these optical filter elements 40A and 40B optically cooperate to form a narrow-band optical filter subsystem 40 transmitting substantially only the very narrow band of wavelengths (e.g. 620-700 nanometers) of visible illumination produced from the LED-based illumination subsystem 22 and reflected/scattered off the illuminated object, while rejecting all other optical wavelengths outside this narrow optical band however generated (i.e. ambient light sources).
Alternatively, the band-pass optical filter subsystem 40 may also be realized as an integrated multi-layer filter structure disposed anywhere before its CMOS image detection array 35, or even within the imaging window 3 itself.
As shown in FIG. 5E1, the light transmission characteristics (energy versus wavelength) associated with the low-pass optical filter element 40B indicate that optical wavelengths below 700 nanometers are transmitted therethrough, whereas optical wavelengths above 700 nm are substantially blocked (e.g. absorbed or reflected).
As illustrated in FIG. 5E2, optical wavelengths greater than 620 nanometers are transmitted through the high-pass optical filter element 40A, while optical wavelengths less than 620 nm are substantially blocked (e.g. absorbed or reflected).
FIG. 5E3 shows the transmission characteristics of the narrow-band spectral filter subsystem 40, plotted against the spectral characteristics of the LED-emissions produced from the LED-arrays in the Multi-Mode LED-Based Illumination Subsystem of the illustrative embodiment of the present invention. Notably, the pass-bandwidth of the optical filtering subsystem 40 is slightly greater than the bandwidth of the LED illumination generated by the multi-mode illumination subsystem.
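The way the two filter elements combine into the composite 620-700 nm band-pass characteristic can be sketched with idealized step responses. This is a simplification introduced here for illustration: the real filter curves in FIGS. 5E1-5E3 roll off gradually rather than cutting off sharply, and the function names are assumptions.

```python
# Sketch of how the high-pass and low-pass filter elements cooperate
# to form a 620-700 nm band-pass. Ideal step responses are used as a
# simplification of the measured transmission curves.

def high_pass_transmission(wavelength_nm):
    """Idealized high-pass element 40A: transmits at and above 620 nm."""
    return 1.0 if wavelength_nm >= 620 else 0.0

def low_pass_transmission(wavelength_nm):
    """Idealized low-pass element 40B: transmits at and below 700 nm."""
    return 1.0 if wavelength_nm <= 700 else 0.0

def composite_transmission(wavelength_nm):
    """Filters in series multiply, leaving a nonzero pass-band only at 620-700 nm."""
    return high_pass_transmission(wavelength_nm) * low_pass_transmission(wavelength_nm)
```

Because the two elements act in series along the optical path, their transmission characteristics multiply, which is why the composite response of FIG. 5E3 is nonzero only where both individual responses overlap.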
During system operation, spectral band-pass filter subsystem 40 greatly reduces the influence of the ambient light, which falls upon the CMOS image detection array 35 during the image capturing operations.
By virtue of the optical filter of the present invention, an optical shutter mechanism is eliminated in the system. In practice, the optical filter can reject more than 85% of incident ambient light, and in typical environments, the intensity of LED illumination is significantly greater than that of the ambient light on the CMOS image detection array 35. Thus, while an optical shutter is required in most conventional CMOS imaging systems, the digital image capture and processing system of the present invention effectively manages the time that the CMOS image detection array 35 is exposed to narrow-band illumination by controlling the time duration that LED-based illumination arrays 23A and 23B generate and project illumination into the FOV in which the object is detected. This method of illumination control is achieved using control signals generated by (i) the CMOS image detection array 35 and (ii) the automatic light exposure measurement and illumination control subsystem 24 in response to real-time measurements of light exposure within the central portion of the FOV, while narrow-band illumination is controllably delivered to the object in the FOV by operation of the band-pass optical filter subsystem 40 described above. The result is a simple system design, without moving parts, and having a reduced manufacturing cost.
In
Details regarding a preferred method of designing the image formation (i.e. camera) optics within the image-based bar code reader of the present invention using the modulation transfer function (MTF) are described in Applicants' U.S. Pat. No. 7,270,272, incorporated herein by reference.
Specification of the Automatic Zoom/Focus Mechanism Integrated within the Image Formation and Detection Subsystem of the Digital Image Capture and Processing System of the Present Invention
As shown in
Specification of Modes of Operation of the Area-Type Image Sensing Array Employed in the Digital Image Formation and Detection Subsystem of the Present Invention
In the digital image capture and processing system 1 of the present invention, the CMOS area-type image detection array 35 supports several different modes of sub-operation, namely: a Single Frame Shutter Mode (i.e. Snap-Shot Mode) of operation illustrated in
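The three sub-modes of image detection array operation named above can be contrasted in a minimal sketch. The enumeration and the per-cycle frame-count rule below are illustrative assumptions introduced here, not behavior stated in the specification.

```python
# Illustrative sketch contrasting the three sub-modes of operation of
# the CMOS area-type image detection array 35. The enum values and the
# hypothetical frame-count rule are for illustration only.

from enum import Enum

class SensorMode(Enum):
    SNAP_SHOT = "single frame shutter"     # one exposure per trigger event
    REAL_VIDEO = "continuous frames"       # frames read out continuously
    PSEUDO_VIDEO = "periodic snap shot"    # snap-shots repeated at intervals

def frames_captured(mode, cycles):
    """Hypothetical frame count produced over a number of acquisition cycles."""
    if mode is SensorMode.SNAP_SHOT:
        return 1          # a single frame is captured per capture cycle
    return cycles         # video-style modes yield one frame per cycle
```

The single-frame behavior of the snap-shot mode is what eliminates the frame-overwriting and capture/decode synchronization problems discussed earlier in connection with the image capturing and buffering subsystem 25.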
The Single Frame Shutter Mode (i.e. Snap-Shot Mode) of Operation
Referring to
The Real Video Mode of Operation
Referring to
The Periodic Snap Shot (“Pseudo-Video”) Mode of Operation
Referring to
Specification of the Automatic Object Motion Detection and Analysis Subsystem of the Present Invention: Various Ways to Realize Said Subsystem in Practice
As shown in
In general, automatic object motion detection and analysis subsystem 20 operates as follows. In system modes of operation requiring automatic object presence and/or range detection, automatic object motion detection and analysis subsystem 20 will be activated at system start-up and operational at all times of system operation, typically continuously providing the system control subsystem 30 with information about the state of objects within the object detection field 32 of the imaging-based system of the first illustrative embodiment. During such operation, the system control subsystem responds to such state information and generates control activation signals during particular stages of the system control process, such as, for example, control activation signals which are provided to system control subsystem 30 for (i) activating either the near-field and/or far-field LED illumination arrays, and (ii) controlling how strongly these LED illumination arrays 23A, 23B should be driven to ensure quality image exposure at the CMOS image detection array 35.
It is appropriate at this juncture to describe these different kinds of object motion detection and analysis subsystems below.
Automatic Object Motion Detection and Analysis Subsystem Realized Using a Pair of Infra-Red (IR) Transmitting and Receiving Laser Diodes
As shown in
Automatic Object Motion Detection and Analysis Subsystem Realized Using an IR-Based Image Sensing and Processing Device
As shown in
Automatic Object Motion Detection and Analysis Subsystem Realized Using an IR-Based LADAR Pulse-Doppler Based Object Motion and Velocity Detection Device
As shown in
While several techniques have been detailed above for automatically detecting the motion and velocity of objects within the FOV of the digital image capture and processing system of the present invention, it is understood that other methods may be employed, as disclosed, for example, in great detail in Applicants' copending application Ser. Nos. 11/489,259 filed Jul. 19, 2006 and 11/880,087 filed Jul. 19, 2007, both incorporated herein by reference in their entirety.
Specification of the Automatic Linear Targeting Illumination Subsystem of the Present Invention
Referring to
As shown in
Specification of the Automatic Light Exposure Measurement and Illumination Control Subsystem of the Present Invention
Referring to
As shown in
During object illumination and imaging operations, narrow-band light from the LED arrays 23A and/or 23B is reflected from the target object (at which the digital imager is aimed) and is accumulated by the CMOS image detection array 35. The object illumination process must be carried out for an optimal duration so that each acquired digital image frame has good contrast and is not saturated. Such conditions are required for consistent and reliable bar code decoding operation and performance.
In order to automatically control the brightness and contrast of acquired images, the automatic light exposure measurement and illumination control subsystem 24 carries out the following operations: (i) it automatically measures the amount of light reflected from the target object (i.e. the measured light exposure at the image plane of the CMOS image sensing array); (ii) it automatically calculates the maximum time that the CMOS image detection array 35 should be kept exposed to the actively-driven LED-based illumination array 23A (23B) associated with the multi-mode illumination subsystem 22; (iii) it automatically controls the time duration that the illumination subsystem 22 illuminates the target object with narrow-band illumination generated from the activated LED illumination array; and then (iv) it automatically deactivates the illumination array when the calculated time to do so expires (i.e. lapses).
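The four-step exposure control sequence (i)-(iv) above can be sketched as a single computation. The target exposure value, the inverse relationship between measured light and required illumination time, and the maximum-duration cap are all hypothetical placeholders for the subsystem's real-time control logic.

```python
# Sketch of the exposure control sequence (i)-(iv): measure reflected
# light, compute a required illumination duration, and cap it at a
# maximum. All constants here are illustrative assumptions.

TARGET_EXPOSURE = 1.0   # assumed normalized exposure for good contrast without saturation

def exposure_cycle(measured_light_power, max_illumination_ms=10.0):
    """Return how long (in ms) to drive the active LED array before deactivating it."""
    # (i) measure reflected light; (ii) compute the required exposure time,
    # which is inversely proportional to the measured light power
    required_ms = TARGET_EXPOSURE / measured_light_power
    # (iii) illuminate for that duration; (iv) deactivate when the
    # calculated time lapses, never exceeding the maximum allowed
    return min(required_ms, max_illumination_ms)
```

A dimly reflecting object (low measured light power) thus receives a longer illumination period, up to the cap, while a brightly reflecting object receives a shorter one, keeping acquired frames out of saturation.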
By virtue of its operation, the automatic light exposure measurement and illumination control subsystem 24 eliminates the need for a complex shuttering mechanism for CMOS-based image detection array 35. This novel mechanism ensures that the digital image capture and processing system of the present invention generates non-saturated images with enough brightness and contrast to guarantee fast and reliable image-based bar code decoding in demanding end-user applications.
Specification of the System Control Subsystem of the Present Invention
Referring to
As shown in
Also, as illustrated, system control subsystem 30 controls the image detection array 35, the illumination subsystem 22, and the automatic light exposure measurement and illumination control subsystem 24 in each of the submodes of operation of the imaging detection array, namely: (i) the snap-shot mode (i.e. single frame shutter mode) of operation; (ii) the real-video mode of operation; and (iii) the pseudo-video mode of operation. Each of these modes of image detection array operation will be described in greater detail below.
Single Frame Shutter Mode (i.e. Snap-Shot Mode) of Sub-Operation Supported by the CMOS Image Detection Array
When the single frame shutter mode (i.e. snap-shot mode) of the sub-operation is selected, as shown in
Notably, during this single frame shutter mode (i.e. snap-shot mode) of the sub-operation, a novel exposure control method is used to ensure that all rows of pixels in the CMOS image detection array have a common integration time, thereby capturing high quality images even when the object is in a state of high speed motion, relative to the image sensing array. This novel exposure control technique shall be referred to as “the global exposure control method” of the present invention, which is described in great detail in the flow chart of
Real-Video Mode of the Sub-Operation Supported by CMOS Image Detection Array
When the real-video mode of sub-operation is selected, as shown in
Periodic Snap Shot (“Pseudo-Video”) Mode of Sub-Operation Supported by the CMOS Image Detection Array
When the periodic snap shot (“pseudo-video”) mode of sub-operation is selected, as shown in
Specification of the Three-Tier Software Architecture of the Digital Image Capture and Processing System of the First Illustrative Embodiment of the Present Invention
As shown in
While the operating system layer of the digital image capture and processing system is based upon the Linux operating system, it is understood that other operating systems can be used (e.g. Microsoft Windows, Apple Mac OSX, Unix, etc), and that the design preferably provides for independence between the main Application Software Layer and the Operating System Layer, and therefore enables the Application Software Layer to be potentially ported to other platforms. Moreover, the system design principles of the present invention provide extensibility of the system to future products through extensive reuse of common software components, decreasing development time and ensuring robustness.
In the illustrative embodiment, the above features are achieved through the implementation of an event-driven, multi-tasking, potentially multi-user Application layer running on top of the System Core software layer, called SCORE. The SCORE layer is statically linked with the product Application software, and therefore runs in the Application level or layer of the system. The SCORE layer provides a set of services to the Application in such a way that the Application does not need to know the details of the underlying operating system, although all operating system APIs are, of course, available to the Application as well. The SCORE software layer provides a real-time, event-driven, OS-independent framework for the product Application to operate. The event-driven architecture is achieved by creating a means for detecting events (usually, but not necessarily, when hardware interrupts occur) and posting the events to the Application for processing in a real-time manner. The event detection and posting is provided by the SCORE software layer. The SCORE layer also provides the product Application with a means for starting and canceling software tasks, which can run concurrently, hence the multi-tasking nature of the software system of the present invention.
Specification of Software Modules within the Score Layer of the System Software Architecture Employed in the Digital Image Capture and Processing System of the Present Invention
The SCORE layer provides a number of services to the Application layer.
The Tasks Manager provides a means for executing and canceling specific application tasks (threads) at any time during the product Application run.
The Events Dispatcher provides a means for signaling and delivering all kinds of internal and external synchronous and asynchronous events.
When events occur, synchronously or asynchronously to the Application, the Events Dispatcher dispatches them to the Application Events Manager, which acts on the events accordingly as required by the Application based on its current state. For example, based on the particular event and current state of the application, the Application Events Manager can decide to start a new task, or stop currently running task, or do something else, or do nothing and completely ignore the event.
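The SCORE-style event flow just described can be sketched as follows: the Events Dispatcher delivers each event to the Application Events Manager, which reacts according to the Application's current state, or ignores the event entirely. The class names mirror the module names in the text, but the states, event names, and handler logic are hypothetical illustrations.

```python
# Sketch of the SCORE event flow: the Events Dispatcher delivers events
# to the Application Events Manager, which acts on them based on the
# application's current state. Event and state names are hypothetical.

class ApplicationEventsManager:
    def __init__(self):
        self.state = "idle"
        self.log = []

    def on_event(self, event):
        # React based on the particular event and the current state;
        # unrecognized events in a given state are simply ignored.
        if event == "trigger_pressed" and self.state == "idle":
            self.state = "capturing"
            self.log.append("started capture task")
        elif event == "capture_done" and self.state == "capturing":
            self.state = "idle"
            self.log.append("stopped capture task")

class EventsDispatcher:
    def __init__(self, manager):
        self.manager = manager

    def dispatch(self, event):
        # Deliver every event, synchronous or asynchronous, to the manager.
        self.manager.on_event(event)
```

Note how the same event can start a new task, stop a running one, or be ignored, depending solely on the Application's current state, which is the behavior the text attributes to the Application Events Manager.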
The Input/Output Manager provides a means for monitoring activities of input/output devices and signaling appropriate events to the Application when such activities are detected.
The Input/Output Manager software module runs in the background, monitors activities of external devices and user connections, and signals appropriate events to the Application Layer when such activities are detected. The Input/Output Manager is a high-priority thread that runs in parallel with the Application and reacts to the input/output signals coming asynchronously from the hardware devices, such as the serial port, user trigger switch 2C, bar code reader, network connections, etc. Based on these signals and optional input/output requests (or lack thereof) from the Application, it generates appropriate system events, which are delivered through the Events Dispatcher to the Application Events Manager as quickly as possible, as described above.
The User Commands Manager provides a means for managing user commands; it utilizes the User Commands Table provided by the Application and executes the appropriate User Command Handler based on the data entered by the user.
The Input/Output Subsystem software module provides a means for creating and deleting input/output connections and communicating with external systems and devices.
The Timer Subsystem provides a means of creating, deleting, and utilizing all kinds of logical timers.
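A logical timer of the kind used throughout the flowcharts below (e.g. the condition 0&lt;t1&lt;T1) might be sketched as follows; the class and method names are hypothetical:

```python
import time

class LogicalTimer:
    """Sketch of a logical timer used to bound an operation,
    e.g. the Timer condition 0 < t1 < T1 in the methods below."""
    def __init__(self, limit_seconds):
        self.limit = limit_seconds
        self.start = time.monotonic()

    def elapsed(self):
        # Current value of t1, in seconds.
        return time.monotonic() - self.start

    def expired(self):
        # True once t1 > T1, i.e. the timer has run out of time.
        return self.elapsed() > self.limit

t1 = LogicalTimer(limit_seconds=5.0)  # e.g. T1 = 5000 milliseconds
```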
The Memory Control Subsystem provides an interface for managing the multi-level dynamic memory within the device, as well as a means for buffering collected data. It provides thread-level management of dynamic memory, and its interfaces are fully compatible with standard C memory management functions. The system software architecture is designed to provide connectivity of the device to potentially multiple users, which may have different levels of authority to operate the device.
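Thread-level management of dynamic memory behind a malloc/free-style interface can be sketched as follows; the class name, accounting model, and methods are illustrative assumptions, not the actual Memory Control Subsystem interface:

```python
import threading
from collections import defaultdict

class MemoryControl:
    """Sketch of thread-level dynamic-memory accounting behind a
    malloc/free-style interface."""
    def __init__(self):
        self._lock = threading.Lock()
        self._per_thread = defaultdict(int)  # bytes currently allocated per thread
        self._blocks = {}                    # live block -> (owner thread, size)

    def malloc(self, size):
        # Allocate a block and charge it to the calling thread.
        with self._lock:
            handle = bytearray(size)
            self._blocks[id(handle)] = (threading.get_ident(), size)
            self._per_thread[threading.get_ident()] += size
            return handle

    def free(self, handle):
        # Release a block and credit it back to its owning thread.
        with self._lock:
            owner, size = self._blocks.pop(id(handle))
            self._per_thread[owner] -= size

    def allocated(self):
        # Bytes currently held by the calling thread.
        with self._lock:
            return self._per_thread[threading.get_ident()]

mem = MemoryControl()
buf = mem.malloc(1024)
```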
The User Commands Manager provides a standard way of entering user commands and executing the application modules responsible for handling the same. Each user command described in the User Commands Table is a task that can be launched by the User Commands Manager per user input, but only if the particular user's authority matches the command's level of security.
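The authority-checked command dispatch described above can be sketched with a hypothetical User Commands Table; the command names, handlers, and numeric security levels are all illustrative assumptions:

```python
# Hypothetical handlers for two illustrative commands.
def cmd_reboot():
    return "rebooting"

def cmd_status():
    return "status ok"

# Hypothetical User Commands Table: command name -> (handler, security level).
USER_COMMANDS = {
    "reboot": (cmd_reboot, 2),  # requires authority level >= 2
    "status": (cmd_status, 0),
}

def execute_command(name, user_authority):
    """Launch the command's handler only if the user's authority
    matches (meets) the command's level of security."""
    handler, required = USER_COMMANDS[name]
    if user_authority < required:
        return None  # rejected: insufficient authority
    return handler()
```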
The Events Dispatcher software module provides a means of signaling and delivering events to the Application Events Manager, which may respond by starting a new task, stopping a currently running task, or doing nothing and simply ignoring the event.
Specification of Software Modules within the Application Layer of the System Software Architecture Employed in the Digital Image Capture and Processing System of the Present Invention
The image processing software employed within the system hereof performs its bar code reading function by locating and recognizing the bar codes within the frame of a captured digital image comprising pixel data. The modular design of the image processing software provides a rich set of image processing functions, which can be utilized in future applications, related or not related to bar code symbol reading, such as: optical character recognition (OCR) and verification (OCV); reading and verifying directly marked symbols on various surfaces; facial recognition and other biometrics identification; etc.
The Area Image Capture Task, in an infinite loop, performs the following operations. It illuminates the entire field-of-view (FOV) and acquires a wide-area (e.g. 2D) digital image of any objects in the FOV. It then attempts to read bar code symbols represented in the captured frame of image data using the image processing software facilities supported by the digital image processing subsystem 26, to be described in greater detail hereinafter. If a bar code symbol is successfully read, then subsystem 26 saves the decoded data in the special decode data buffer. Otherwise, it clears the decode data buffer. Then, it continues the loop. The Area Image Capture Task routine never exits on its own. It can be canceled by other modules in the system when reacting to other events. For example, when a user pulls the trigger switch 5, the event TRIGGER_ON is posted to the Application. The Application software responsible for processing this event checks whether the Area Image Capture Task is running, and if so, cancels it and then starts the Main Task. The Area Image Capture Task can also be canceled upon the OBJECT_DETECT_OFF event, posted when the user moves the digital imager away from the object, or moves the object away from the digital imager. The Area Image Capture Task routine is enabled (with the Main Task) when "semi-automatic-triggered" system modes of programmed operation are to be implemented on the digital image capture and processing platform of the present invention.
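The Area Image Capture Task loop described above can be sketched as follows; the function name and the callable parameters are illustrative stand-ins (cancellation by another module, e.g. on TRIGGER_ON, is modeled by the cancelled callable):

```python
def area_image_capture_task(acquire_frame, try_decode, decode_buffer, cancelled):
    """Sketch of the Area Image Capture Task loop: illuminate the FOV,
    acquire a wide-area frame, attempt a decode, then either save the
    decoded data or clear the buffer. The loop never exits on its own;
    `cancelled` models cancellation by another module reacting to events."""
    while not cancelled():
        frame = acquire_frame()               # illuminate FOV, capture 2D image
        data = try_decode(frame)              # image-processing decode attempt
        if data is not None:
            decode_buffer["data"] = data      # success: save decoded data
        else:
            decode_buffer.pop("data", None)   # failure: clear the decode buffer
```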
The Linear Targeting Illumination Task is a simple routine which is enabled (with the Main Task) when manually or automatically triggered system modes of programmed operation are to be implemented on the illumination and imaging platform of the present invention.
Various bar code symbologies are supported by the digital image capture and processing system of the present invention. Supported bar code symbologies include: Code 128; Code 39; Interleaved 2 of 5; Code 93; Codabar; UPC/EAN; Telepen; UK-Plessey; Trioptic; Matrix 2 of 5; Airline 2 of 5; Straight 2 of 5; MSI-Plessey; Code 11; and PDF417.
Specification of Method of Reading a Programming-Type Bar Code Symbol Using the Hand-Supportable Digital Image Capture and Processing System of the Present Invention
Referring to FIGS. 14A1 and 14A2, a novel method of reading a “programmable bar code symbol” using the digital image capture and processing system of the present invention will now be described.
As shown in FIG. 14A1, when configured in the programming-type bar code reading mode of the present invention, the image capture and processing system of the present invention automatically generates a visible linear target illumination beam upon detection of the target menu, enabling the user/operator to target a programming-type code symbol with the visible targeting illumination beam. As shown in FIG. 14A2, with the programming bar code symbol aligned with the targeting illumination beam, the operator then manually actuates the trigger switch 5 and in response thereto, the system automatically generates a field of illumination within the FOV which illuminates the targeted programming-type bar code symbol, while (i) only an imaged subregion of the FOV, centered about the linear targeting illumination beam, is made decode-processing activated during illumination and imaging operations, and (ii) the linear targeting illumination beam is deactivated (i.e. turned off). This technique enables only a narrow-area image, centered about the reference location of the linear illumination targeting beam, to be captured and decode processed, for the purpose of decoding the targeted programming-type bar code symbol, which is typically a 1D symbology. By virtue of the present invention here, it is possible to avoid the inadvertent reading of multiple programming-type bar code symbols (i) printed on a bar code menu page or sheet, or (ii) displayed on a LCD display screen, as the case may be.
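The narrow-area decode-processing technique described above (restricting decoding to an imaged subregion centered about the linear targeting beam) can be sketched as follows; the function name, row-based image model, and half_height parameter are hypothetical, not taken from the specification:

```python
def narrow_area_subregion(image_rows, beam_row, half_height=10):
    """Sketch: keep only the imaged subregion of the FOV centered about
    the row where the linear targeting illumination beam falls, so that
    only the targeted (typically 1D) symbol is decode-processed and
    neighboring symbols on a bar code menu page are ignored."""
    lo = max(0, beam_row - half_height)
    hi = min(len(image_rows), beam_row + half_height + 1)
    return image_rows[lo:hi]
```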
Specification of the Various Modes of Operation in the Digital Image Capture and Processing System of The Present Invention
The digital image capture and processing system of the illustrative embodiment supports many different methods and modes of digital image capture and processing. Referring to FIGS. 15A1 through 22D, a number of these methods will now be described in detail below.
First Illustrative Method of Hands-Free Digital Imaging Using the Digital Image Capture and Processing System of the Present Invention
Referring to FIGS. 15A1 through 15D, a first illustrative method of hands-free (i.e. presentation/pass-through) digital imaging will be described using the digital image capture and processing system of the first illustrative embodiment, wherein its image formation and detection subsystem is operated in either the (i) snap-shot or (ii) real video modes of sub-operation of the CMOS image sensing array 35, illustrated in
The flow chart shown in FIGS. 15A1 through 15A3 describes the primary steps involved in carrying out the first illustrative method of hands-free (i.e. presentation/pass-through) digital imaging according to the present invention.
As shown at Block A in FIG. 15A1, the system is configured by enabling the automatic object presence detector, and initializing the IFD subsystem (i.e. CMOS image sensing array) in the snap-shot mode of sub-operation. At this stage, the system is ready to be used as shown in
Then at Block B, the system control subsystem determines whether or not the object is detected in the FOV. If the object is not present in the FOV, the system continues to check this condition at Block B. If the object is detected at Block B, then the system control subsystem proceeds to Block C and sets the operation of the Timer (0&lt;t1&lt;T1), configures the IFD subsystem in a video mode (e.g. real or pseudo video mode) as shown
If at Block E, the system control subsystem determines that image processing has not produced a successful decoded output within T2, then the system proceeds to Block H and determines whether or not a PDF code symbol has been detected. If a PDF code symbol has been detected, then at Block I more time is allowed for the image processor to decode the PDF code symbol.
Then at Block J, the system control subsystem determines whether or not a PDF code symbol is in fact decoded, and if so, then at Block K, the system generates symbol character data for the decoded PDF symbol. If a PDF code symbol has not been decoded within the extra time allowed, then the system proceeds to Block L and determines whether or not the object is still in the FOV of the system. If the object has moved out of the FOV, then the system returns to Block G, where the IFD subsystem is reset to its snap-shot mode (e.g. for approximately 40 milliseconds).
If, at Block L in FIG. 15A2, the system control subsystem determines that the object is still present within the FOV, then the system control subsystem proceeds to Block M and determines whether or not the time allowed for the video mode (e.g. 300 milliseconds) has elapsed. If the time allowed for video mode operation has not elapsed, then the system proceeds to Block D, where the next frame of digital image data is detected, and the next frame of image data is processed in an attempt to decode a code symbol within the allowed time for decoding (e.g. less than 30 ms).
If at Block M the system control subsystem determines that the time for Video Mode operation has lapsed, then the system control subsystem proceeds to Block N and reconfigures the IFD subsystem to the snap-shot mode (shown in
At Block O in FIG. 15A3, the system control subsystem determines whether or not image processing has produced decoded output, and if so, then at Block P, symbol character data (representative of the read code symbol) is generated and transmitted to the host computer.
If at Block O in FIG. 15A3 the system control subsystem determines that image processing has not produced successful decoded output, then at Block Q the system control subsystem determines whether or not the object is still present within the FOV. If it is determined at Block Q that the object is no longer present in the FOV, then the system control subsystem returns to Block G, where the IFD subsystem is reset to its snap-shot mode. However, if at Block Q the system control subsystem determines that the object is still present in the FOV, then at Block R the system control subsystem determines whether the Timer set at Block C has run out of time. If the Timer has run out of time (t1&gt;T1), then the system control subsystem proceeds to Block G, where the IFD subsystem is reset to its snap-shot mode and returns to Block B to determine whether an object is present within the FOV. However, if the system control subsystem determines at Block R that the Timer has not yet run out of time (t1&lt;T1), then the system control subsystem proceeds to Block N, and reconfigures the IFD Subsystem to its snap-shot mode, and then acquires and processes a single digital image of the object in the FOV, allowing up to approximately 500 milliseconds to do so.
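The overall control flow of this first hands-free method (video-mode decode attempts under Timer T1, with a snap-shot fallback) can be sketched compactly; all callables, names, and the timing model are illustrative simplifications of the flowchart, not the actual implementation:

```python
import time

def hands_free_read(object_in_fov, next_video_frame, decode, snapshot_decode,
                    t1_limit=5.0):
    """Compact sketch of the first hands-free method: on object detection,
    run video-mode decode attempts under Timer T1; if the video-mode time
    lapses, fall back to a single snap-shot acquisition (Block N)."""
    if not object_in_fov():
        return None                                # Block B: nothing detected
    start = time.monotonic()                       # Block C: start Timer (0 < t1 < T1)
    while time.monotonic() - start < t1_limit:     # video-mode window
        data = decode(next_video_frame())          # Blocks D/E: frame + decode attempt
        if data is not None:
            return data                            # success: symbol character data
        if not object_in_fov():
            return None                            # object has left the FOV
    return snapshot_decode()                       # Block N: snap-shot fallback
```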
Notably, during the video mode of sub-operation, the IFD subsystem can be running either the real or pseudo video modes illustrated in
Second Illustrative Method of Hands-Free Digital Imaging Using the Digital Image Capture and Processing System of the Present Invention
Referring to
The flow chart shown in
As shown at Block A in
Then at Block B in
If at Block E, the system control subsystem determines that image processing has not produced a successful decoded output within T2, then the system proceeds to Block H and determines whether or not an object is still present within the FOV. If the object has moved out of the FOV, then the system returns to Block B, where automatic object detection operations resume. If, however, at Block H in
Third Illustrative Method of Hands-Free Digital Imaging Using the Digital Image Capture and Processing System of the Present Invention
Referring to FIGS. 17A1 through 17C, a third illustrative method of hands-free (i.e. presentation/pass-through) digital imaging will be described using the digital image capture and processing system of the first illustrative embodiment, wherein its image formation and detection subsystem is operated in its video mode of operation for a first predetermined time period (e.g. approximately 5000 milliseconds), to repeatedly attempt to read a bar code symbol within one or more digital images captured during system operation.
The flow chart shown in FIG. 17A1 describes the primary steps involved in carrying out the third illustrative method of hands-free (i.e. presentation/pass-through) digital imaging according to the present invention.
As shown at Block A in FIG. 17A1, the system is configured by enabling the automatic object presence detection subsystem, and initializing (i.e. configuring) the IFD subsystem (i.e. CMOS image sensing array) in the (real or pseudo) video mode of suboperation (illustrated in
Then at Block B in FIG. 17A1, the system control subsystem determines whether or not the object is detected in the FOV. If the object is not present in the FOV, then the system continues to check this condition at Block B. If the object is detected at Block B, then the system control subsystem proceeds to Block C and sets the operation of the Timer (0&lt;t1&lt;T1) and starts continuous image acquisition (i.e. object illumination and imaging operations), as shown in
Then, as indicated at Block D in FIG. 17A1, the IFD subsystem detects the next image frame of the detected object in the FOV, and the image processing subsystem processes the digital image frame in an attempt to produce a successful decoded output (e.g. decode a bar code symbol), allowing no more time for decoding than the image frame acquisition time (i.e. about T2&lt;30 milliseconds).
At Block E in FIG. 17A2, the system control subsystem determines whether or not image processing has produced a successful decoded output (e.g. read bar code symbol) within T2 (e.g. T2=30 ms). If image processing has produced a successful output within T2, then at Block F, the system control subsystem generates symbol character data and transmits the data to the host system, and then proceeds to Block B, where the object presence detection subsystem resumes its automatic object detection operations.
If at Block E, the system control subsystem determines that image processing has not produced a successful decoded output within T2, then the system proceeds to Block H and determines whether or not an object is still present within the FOV. If the object has moved out of the FOV, then the system returns to Block B, where automatic object detection operations resume.
If, however, at Block H in FIG. 17A2, the system control subsystem determines that the object is still present within the FOV, then the system control subsystem proceeds to Block I and determines whether or not the earlier-set timer T1 has elapsed. If timer T1 has not elapsed, then the system returns to Block D, where the next frame of digital image data is detected and processed in an attempt to decode a code symbol within the allowed time T2 for decoding. If at Block I, the system control subsystem determines that timer T1 has elapsed, then the system control subsystem proceeds to Block B, where automatic object detection resumes.
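This video-mode read loop, which the hand-held methods below share in essentially the same form, can be sketched as follows; the function name and callables are illustrative stand-ins, and decode_within(frame, budget) is a hypothetical stand-in for a decoder whose per-frame processing is bounded by T2:

```python
import time

def video_mode_read(object_in_fov, next_frame, decode_within,
                    t1_limit=5.0, t2_limit=0.03):
    """Sketch of the shared video-mode read loop: each frame gets at most
    T2 (about one frame time, e.g. 30 ms) of decode processing, and the
    whole attempt is bounded by timer T1 (e.g. 5000 ms)."""
    start = time.monotonic()
    while time.monotonic() - start < t1_limit:         # Block I: timer T1
        data = decode_within(next_frame(), t2_limit)   # Blocks D/E: bounded decode
        if data is not None:
            return data                                # Block F: success
        if not object_in_fov():
            return None                                # Block H: object gone
    return None                                        # T1 elapsed: resume detection
```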
First Illustrative Method of Hand-Held Digital Imaging Using the Digital Image Capture and Processing System of the Present Invention
Referring to
The flow chart shown in
As shown at Block A in
Then at Block B in
If at Block F in
Second Illustrative Method of Hand-Held Digital Imaging Using the Digital Image Capture and Processing System of the Present Invention
Referring to FIGS. 19A1 through 19C, a second illustrative method of hand-held digital imaging will be described using the digital image capture and processing system of the first illustrative embodiment, wherein its image formation and detection subsystem is operated in its video mode of operation for a first predetermined time period (e.g. approximately 5000 milliseconds), to repeatedly attempt to read a bar code symbol within one or more digital images captured during system operation.
The flow chart shown in FIG. 19A1 describes the primary steps involved in carrying out the second illustrative method of hand-held digital imaging according to the present invention.
As shown at Block A in FIG. 19A1, the system is configured by enabling the automatic object presence detection subsystem, and initializing (i.e. configuring) the IFD subsystem (i.e. CMOS image sensing array) in the (real or pseudo) video mode of suboperation (illustrated in
Then at Block B in FIG. 19A1, the system control subsystem determines whether or not the object is detected in the FOV. If the object is not detected in the FOV, then the system control subsystem continues to check this condition at Block B. If the object is detected at Block B, then the system control subsystem proceeds to Block C and sets the operation of the Timer (0&lt;t1&lt;T1) and starts continuous image acquisition (i.e. object illumination and imaging operations), as shown in
Then, as indicated at Block D in FIG. 19A1, the IFD subsystem detects the next image frame of the detected object in the FOV, and the image processing subsystem processes the digital image frame in an attempt to produce a successful decoded output (e.g. decode a bar code symbol), allowing no more time for decoding than the image frame acquisition time (i.e. about T2&lt;30 milliseconds).
At Block E in FIG. 19A2, the system control subsystem determines whether or not image processing has produced a successful decoded output (e.g. read bar code symbol) within T2 (e.g. T2=30 ms). If image processing has produced a successful output within T2, then at Block F, the system control subsystem generates symbol character data and transmits the data to the host system, and then proceeds to Block B, where the object presence detection subsystem resumes its automatic object detection operations.
If at Block E, the system control subsystem determines that image processing has not produced a successful decoded output within T2, then the system proceeds to Block H and determines whether or not a PDF code symbol has been detected within the FOV. If so, then at Block I the system control subsystem allows more time for the image processor to decode the detected PDF code symbol. If the system control subsystem determines at Block J that a PDF code symbol has been decoded, then at Block K, the image processor generates symbol character data for the decoded PDF symbol. If, at Block J, a PDF code symbol has not been decoded within the extra time allowed, then the system control subsystem proceeds to Block L and determines whether or not the object is still in the FOV of the system. If the object has moved out of the FOV, then the system returns to Block B, where the object detection subsystem resumes its automatic object detection operations.
If, at Block L in FIG. 19A2, the system control subsystem determines that the object is still present within the FOV, then the system control subsystem proceeds to Block M, where it determines whether or not the allowed time for the video mode (e.g. T1=5000 milliseconds) has elapsed. If timer T1 has elapsed, then the system control subsystem returns to Block B, where the object detection subsystem resumes its automatic object detection operations. If timer T1 has not elapsed at Block M, then the system control subsystem returns to Block D, where the IFD subsystem detects the next image frame, and the image processor attempts to decode-process a code symbol graphically represented in the captured image frame, allowing not more than the frame acquisition time (e.g. less than 30 milliseconds) to decode-process the image.
Third Illustrative Method of Hand-Held Digital Imaging Using the Digital Image Capture and Processing System of the Present Invention
Referring to
The flow chart shown in
As shown at Block A in
Then at Block B in
If at Block F in
Fourth Illustrative Method of Hand-Held Digital Imaging Using the Digital Image Capture and Processing System of the Present Invention
Referring to FIGS. 21A1 through 21C, a fourth illustrative method of hand-held digital imaging will be described using the hand-supportable digital image capture and processing system of the first illustrative embodiment, wherein its image formation and detection subsystem is operated in its video mode of operation for a first predetermined time period (e.g. approximately 5000 milliseconds), to repeatedly attempt to read a bar code symbol within one or more digital images captured during system operation.
The flow chart shown in FIG. 21A1 describes the primary steps involved in carrying out the fourth illustrative method of hand-held digital imaging according to the present invention, involving the use of its manually-actuatable trigger switch 5 and video imaging mode of subsystem operation.
As shown at Block A in FIG. 21A1, the system is configured by enabling the automatic object presence detection subsystem, and initializing (i.e. configuring) the IFD subsystem (i.e. CMOS image sensing array) in the (real or pseudo) video mode of suboperation (illustrated in
Then at Block B in FIG. 21A1, the system control subsystem determines whether or not the trigger switch 5 is manually actuated. If the trigger switch is not manually actuated at Block B, then the system control subsystem continues to check this condition at Block B. If the trigger switch is manually actuated at Block B, then the system control subsystem proceeds to Block C and sets the operation of the Timer (0&lt;t1&lt;T1) and starts continuous image acquisition (i.e. object illumination and imaging operations), as shown in
Then, as indicated at Block D in FIG. 21A1, the IFD subsystem detects the next image frame of the object in the FOV, and the image processing subsystem processes the digital image frame in an attempt to produce a successful decoded output (e.g. decode a bar code symbol), allowing no more time for decoding than the image frame acquisition time (i.e. about T2&lt;30 milliseconds).
At Block E in FIG. 21A2, the system control subsystem determines whether or not image processing has produced a successful decoded output (e.g. read bar code symbol) within T2 (e.g. T2=30 ms). If image processing has produced a successful output within T2, then at Block F, the system control subsystem generates symbol character data and transmits the data to the host system, and then proceeds to Block B, where the system control subsystem resumes its trigger switch actuation detection operations.
If at Block E, the system control subsystem determines that image processing has not produced a successful decoded output within T2, then the system proceeds to Block H and determines whether or not a PDF code symbol has been detected within the FOV. If so, then at Block I the system control subsystem allows more time for the image processor to decode the detected PDF code symbol. If the system control subsystem determines at Block J that a PDF code symbol has been decoded, then at Block K, the image processor generates symbol character data for the decoded PDF symbol. If, at Block J, a PDF code symbol has not been decoded within the extra time allowed, then the system control subsystem proceeds to Block L and determines whether or not the object is still within the FOV. If the object is no longer in the FOV, then the system returns to Block B, where the system control subsystem resumes trigger switch actuation detection operations.
If, at Block L in FIG. 21A2, the system control subsystem determines that the object is still present within the FOV, then the system control subsystem proceeds to Block M, where it determines whether or not the allowed time for the video mode (e.g. T1=5000 milliseconds) has elapsed. If timer T1 has elapsed, then the system control subsystem returns to Block B, where the system control subsystem resumes its detection of trigger switch actuation. If timer T1 has not elapsed at Block M, then the system control subsystem returns to Block D, where the IFD subsystem detects the next image frame, and the image processor attempts to decode-process a code symbol graphically represented in the captured image frame, allowing not more than the frame acquisition time (e.g. less than 30 milliseconds) to decode-process the image.
Fifth Illustrative Method of Hand-Held Digital Imaging Using the Digital Image Capture and Processing System of the Present Invention
Referring to
The flow chart shown in
As shown at Block A in
When the manual trigger switch 5 is actuated at Block B, then the system control subsystem proceeds to Block C and sets the operation of the Timer (0<t1<T1). For illustrative purposes, consider T1=5000 milliseconds. Then at Block D in
If at Block F in
Specification of the Second Illustrative Embodiment of the Digital Image Capture and Processing System of Present Invention Employing Single Linear LED Illumination Array to Illuminate the Field of View (FOV) of the System
Referring to
In
As shown in
As shown in
Specification of the Three-Tier Software Architecture of the Digital Image Capture and Processing System of the Second Illustrative Embodiment of the Present Invention
As shown in
While the operating system layer of the digital image capture and processing system is based upon the Linux operating system, it is understood that other operating systems can be used (e.g. Microsoft Windows, Apple Mac OSX, Unix, etc.), and that the design preferably provides for independence between the main Application Software Layer and the Operating System Layer, and therefore enables the Application Software Layer to be potentially ported to other platforms. Moreover, the system design principles of the present invention provide extensibility of the system to other future products, with extensive usage of the common software components decreasing development time and ensuring robustness.
In the illustrative embodiment, the above features are achieved through the implementation of an event-driven, multi-tasking, potentially multi-user Application layer running on top of the System Core software layer, called SCORE. The SCORE layer is statically linked with the product Application software, and therefore runs in the Application Level or layer of the system. The SCORE layer provides a set of services to the Application in such a way that the Application does not need to know the details of the underlying operating system, although all operating system APIs are, of course, available to the Application as well. The SCORE software layer provides a real-time, event-driven, OS-independent framework for the product Application to operate. The event-driven architecture is achieved by creating a means for detecting events (usually, but not necessarily, when hardware interrupts occur) and posting the events to the Application for processing in a real-time manner. The event detection and posting is provided by the SCORE software layer. The SCORE layer also provides the product Application with a means for starting and canceling software tasks, which can run concurrently; hence the multi-tasking nature of the software system of the present invention.
Specification of the Third Illustrative Embodiment of the Digital Image Capture and Processing System of the Present Invention, Employing Single Linear LED Illumination Array for Full Field Illumination
Referring now to
In some important respects, the third illustrative embodiment of the digital image capture and processing system 1″ is similar to the second illustrative system embodiment 1′, namely: both systems employ a single linear array of LEDs to illuminate its field of view (FOV) over the working range of the system, in a way to illuminate objects located within the working distance of the system during imaging operations, while minimizing annoyance to the operator, as well as others in the vicinity thereof during object illumination and imaging operations.
However, the third illustrative embodiment has many significant advancements over the second illustrative embodiment, relating particularly to its: (i) prismatic illumination-focusing lens structure 130 illustrated in
As shown in
As shown in
As shown in
As shown in
When the front and rear housing panels 2B″ and 2A″ are joined together, with the PC board 8 disposed therebetween, the prismatic illumination-focusing lens panel 3″ will sit within the slanted cut-aways 133E and 133F formed in the top surface of the side panels, and illumination rays produced from the linear array of LEDs will be either directed through the rear surface of the prismatic illumination-focusing lens panel 3″ or absorbed by the black colored interior surface of the optically-opaque light ray containing structure 133. In alternative embodiments, the interior surface of the optically-opaque light ray containing structure may be coated with a light reflecting coating so as to increase the amount of light energy transmitted through the prismatic illumination-focusing lens panel, and thus increase the light transmission efficiency of the LED-based illumination subsystem employed in the digital image capture and processing system of the present invention.
As shown in
As shown in
The System Architecture of the Third Illustrative Embodiment of the Digital Image Capture and Processing System
In
Implementing the System Architecture of the Third Illustrative Embodiment of the Digital Image Capture and Processing System
The subsystems employed within the digital image capture and processing system of the third illustrative embodiment are implemented with components mounted on the PC board assembly shown in
Specification of the Three-Tier Software Architecture of the Digital Image Capture and Processing System of the Third Illustrative Embodiment of the Present Invention
As shown in
While the operating system layer of the digital image capture and processing system is based upon the Linux operating system, it is understood that other operating systems can be used (e.g. Microsoft Windows, Apple Mac OSX, Unix, etc.), and that the design preferably provides for independence between the main Application Software Layer and the Operating System Layer, and therefore enables the Application Software Layer to be potentially ported to other platforms. Moreover, the system design principles of the present invention provide extensibility of the system to other future products, with extensive usage of the common software components decreasing development time and ensuring robustness.
In the illustrative embodiment, the above features are achieved through the implementation of an event-driven, multi-tasking, potentially multi-user Application layer running on top of the System Core software layer, called SCORE. The SCORE layer is statically linked with the product Application software, and therefore runs in the Application Level or layer of the system. The SCORE layer provides a set of services to the Application in such a way that the Application does not need to know the details of the underlying operating system, although all operating system APIs are, of course, available to the Application as well. The SCORE software layer provides a real-time, event-driven, OS-independent framework for the product Application to operate. The event-driven architecture is achieved by creating a means for detecting events (usually, but not necessarily, when hardware interrupts occur) and posting the events to the Application for processing in a real-time manner. The event detection and posting is provided by the SCORE software layer. The SCORE layer also provides the product Application with a means for starting and canceling software tasks, which can run concurrently; hence the multi-tasking nature of the software system of the present invention.
Specification of the Illumination Subsystem of the Present Invention Employing Prismatic Illumination Focusing Lens Structure Integrated within the Imaging Window
Referring to FIGS. 33A through 33K2, the prismatic illumination-focusing lens structure 130 of the illustrative embodiment will now be described in greater detail.
FIG. 33C1 shows several LEDs 62N, 62M (from the linear LED array) transmitting illumination through the rear surface 130A of the prismatic illumination lens component 130 of the imaging window, in a controlled manner, so that a focused field of illumination emerges from the front recessed surface 130D and illuminates the FOV of the system in a substantially uniform manner, without objectionably projecting light rays into the eyes of consumers and/or operators who happen to be present at the point of sale (POS). Most light rays which emerge from the recessed surface section 130D project into the FOV, while a small percentage of the transmitted light rays strike the top wall surface 3A1 formed in the rectangular opening formed about the imaging window, and reflect/scatter off the mirrored surface 160 and into the FOV according to the optical design of the present invention. Light rays that illuminate objects within the FOV of the system scatter off the surfaces of those illuminated objects, and are transmitted back through the imaging window panel 3″, collected by the FOV optics 34, and focused onto the area-type image sensing array 35 in the image formation and detection subsystem 21. The light transmission characteristics of the planar panel portion of the imaging window panel 3″ can be selected so that they cooperate with another optical filtering element 40, located near or proximate the image detection array 35, to form an optical band-pass filter system 40 that passes only a narrow band of optical wavelengths (e.g. a narrow-band optical spectrum) centered about the characteristic wavelength of the illumination beam, thereby rejecting ambient noise to a significant degree and improving image contrast and quality.
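The band-pass filtering behavior described above can be modeled as a simple pass/reject test on wavelength. The 630 nm center wavelength and 15 nm half-width used below are illustrative assumptions only; the specification does not state these values.

```python
def bandpass_transmission(wavelength_nm, center_nm=630.0, half_width_nm=15.0):
    """Idealized model of the band-pass filter system: the window panel and
    the filtering element near the image detector together pass only the
    narrow band centered about the illumination beam's characteristic
    wavelength, rejecting ambient light outside that band.
    center_nm and half_width_nm are illustrative, not disclosed values."""
    return abs(wavelength_nm - center_nm) <= half_width_nm
```

Ambient wavelengths outside the pass band are rejected, which is why the filter improves image contrast under strong ambient lighting.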
By virtue of the imaging window design of the present invention, particularly its integrated prismatic illumination lens, it is now possible to uniformly illuminate the FOV of a 2D digital imaging system using a single linear array of LEDs that generates and projects a field of visible illumination into the FOV of the system, without projecting light rays into the eyes of cashiers, sales clerks, customers and other humans present at the POS station where the digital imaging system of the illustrative embodiment can be installed.
Description of Operation of the Prismatic Illumination-Focusing Lens Component, Integrated within the Imaging Window of the Present Invention
Referring to FIGS. 33C2 through 33K2, operation of the prismatic illumination-focusing lens component, integrated within the imaging window of the present invention, will now be described in greater detail below.
FIG. 33C2 illustrates the propagation of a central light ray which is generated from an LED in the linear LED array 23, and passes through the central portion of the prismatic illumination-focusing lens component of the imaging window panel, and ultimately into the FOV of the system.
FIGS. 33G1 through 33K2 describe the spatial intensity profile characteristics achieved over the working range of the digital imaging system (e.g. from 50 mm to 150 mm from the imaging window) using the optical design employed in a particular illustrative embodiment of the present invention. In this illustrative embodiment shown in FIGS. 33G1 through 33K2, there is an average spatial intensity drop-off, measured from the center of the image to its edge, at each of the five different illumination regions. Notably, this optical design works very well in POS-based digital imaging applications; however, in other illustrative embodiments of the system, different spatial intensity profile characteristics may be desired or required to satisfy the needs of different classes of digital imaging applications.
Specification of the Optical Function of the Prismatic Illumination-Focusing Lens Structure within the Illumination Subsystem of the Digital Image Capture and Processing System of the Third Illustrative Embodiment
Referring to
Specification of the Linear Visible Illumination Targeting Subsystem Employed in the Hand-Supportable Digital Image Capture and Processing System of the Third Illustrative Embodiment of the Present Invention
As shown in
Specification of the Image Formation and Detection Subsystem Employed in the Hand-Supportable Digital Image Capture and Processing System of the Third Illustrative Embodiment of the Present Invention
Specification of the LED-Driven Optical-Waveguide Structure Used to Illuminate the Manually-Actuated Trigger Switch Integrated in the Housing of the Digital Image Capture and Processing System of the Third Illustrative Embodiment of the Present Invention
Referring to
As shown in
Specification of the Acoustic-Waveguide Structure Used to Couple Sonic Energy, Produced from an Electro-Transducer, to the Sound Output Ports Formed in the Housing of the Digital Image Capture and Processing System of the Third Illustrative Embodiment of the Present Invention
Referring to
In cutaway views of
The acoustic-waveguide structure 172 of the present invention is shown in greater detail in
By way of the acoustic-waveguide structure, sound signals generated from the electro-acoustic transducer 171 are efficiently conducted through the waveguide channel and exit out through sound ports 170 formed in the optical-waveguide structure 165, and corresponding sound ports 170′ formed in the front housing portion 2B, as shown in
As shown in
Specification of the Multi-Interface I/O Subsystem Employed in the Digital Image Capture and Processing System of Present Invention of the Third Illustrative Embodiment
Referring now to
As shown in
As shown in
In
As shown in
The USB microcontroller (from Sci-Labs) supports software which carries out a square-wave signal (i.e. "wiggle") test, using the driver circuits and the interface (I/F) switching circuit 150 described above. This software-controlled automatic interface test/detection process can be summarized as follows. First, the CTS (Clear To Send) line (i.e. Pin 2) is set HIGH and the RS-232 pull-down resistor is allowed to go LOW. Each line is then checked to see whether it follows the CTS during the wiggle test signal; if no lines follow the CTS, then the RS-232 interface is indicated. The line that follows the CTS pin is tested multiple times. After passing the test, the detected interface is used for operation.
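The wiggle test logic summarized above can be sketched as follows. The function names `set_cts` and `read_lines` are hypothetical stand-ins for the microcontroller's hardware-access routines, and the cycle count is illustrative.

```python
def detect_interface(set_cts, read_lines, candidates, cycles=8):
    """Sketch of the square-wave ("wiggle") interface test.

    set_cts(level): drives the CTS test line HIGH (True) or LOW (False).
    read_lines(): returns {interface_name: line_level} for candidate ports.
    An interface whose line tracks CTS on every cycle is the connected one;
    if no line follows the wiggled CTS, RS-232 is indicated by default."""
    followers = set(candidates)
    for i in range(cycles):
        level = i % 2 == 0              # alternate HIGH/LOW: a square wave
        set_cts(level)
        lines = read_lines()
        # keep only the lines that tracked the test signal this cycle
        followers = {c for c in followers if lines.get(c) == level}
        if not followers:
            return "RS-232"             # nothing followed CTS
    # a line that followed CTS on every cycle identifies the interface
    return followers.pop() if len(followers) == 1 else None
```

Testing the line multiple times, rather than once, guards against a port pin that merely happened to match the CTS level on a single sample.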
The software-based automatic interface test/detection process employed by the multi-interface I/O subsystem 27 will now be described in greater detail with reference to the flow chart of
As shown at Block A in
As indicated at Block B in
As indicated at Block C in
As indicated at Block D in
As indicated at Block E in
As indicated at Block F in
As indicated at Block H in
As indicated at Block I in
As indicated at Block J in
If no tested port levels have gone LOW at Block J, then at Block Q the USB microcontroller releases the Decoder Reset Line, sets the interface switches for the RS-232 interface type, and then loads stored RS-232 configuration parameters into memory, so as to implement the RS-232 communication interface with the host system. At Block R, the scanner/imager is ready to run or operate.
If at Block J, any of the tested ports have gone LOW, then at Block K the USB microcontroller stores as possible interfaces, the remaining ports which have gone LOW.
As indicated at Block L in
If at Block L there is only one interface (I/F) candidate on the list of stored possible communication interfaces, the USB microcontroller toggles the EEPROM WP (wiggle) test line multiple (N) times to verify that the port pin for the sole interface candidate tracks the wiggle test signal.
If at Block N, the port pin for the sole interface candidate does not track the wiggle test signal, then the USB microcontroller returns to Block D, as shown in
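The sole-candidate verification step at Blocks L through N can be sketched as follows. Here `toggle_wp` and `read_pin` are hypothetical stand-ins for the routines that drive the EEPROM WP (wiggle) test line and sample the candidate's port pin; the toggle count N is illustrative.

```python
def verify_candidate(toggle_wp, read_pin, n=16):
    """Sketch of sole-interface-candidate verification: toggle the EEPROM WP
    (wiggle) test line N times and confirm the candidate's port pin tracks
    it on every toggle. A False return corresponds to restarting detection
    at Block D; a True return means the interface is verified and its stored
    configuration parameters can be loaded."""
    for i in range(n):
        level = i % 2 == 0              # alternate the test line HIGH/LOW
        toggle_wp(level)
        if read_pin() != level:
            return False                # pin failed to track the test signal
    return True                         # candidate verified on all N toggles
```

Requiring the pin to track all N toggles makes a false detection from a momentarily coincident level highly unlikely.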
The multi-interface I/O subsystem design described above has a number of other features which make it very useful in POS applications, namely: it does not require electronic circuitry to be embodied in the connector cables; it supports the option of 12 Volt to 5 Volt power conversion, and 12 Volt to 3.3 Volt power conversion; and its Keyboard Wedge (KW) interface allows signals to pass therethrough without use of a power adapter.
In the illustrative embodiment, the power requirements for the multi-interface I/O subsystem are as follows: satisfy specification requirements for the USB Mode; consume less than 500 μA during its Sleep Mode; consume less than 100 mA before re-enumeration; disable the decode section before USB I/F detection; consume less than 500 mA during operation; verify there is adapter power before switching to the higher-power Imaging Mode; keep the Keyboard Wedge pass-through mode operational without an a/c adapter; and maintain the keyboard power fuse limit at about 250 mA for the PC.
Specification of Method of Programming a Set of System Configuration Parameters (SCPs) within the Digital Image Capture and Processing System of the Present Invention, During Implementation of the Communication Interface Detected with a Host System
Oftentimes, end-user customers (e.g. retailers) employing multiple digital imaging systems of the present invention will support different types of host systems within their operating environment. This implies that digital imaging systems of the present invention must be interfaced to at least one host system within such diverse operating environments. Also, typically, these different types of host systems will require different communication methods (e.g. RS232, USB, KBW, etc.). Also, depending on the interface connection, oftentimes the system configuration parameters (SCPs) for these different host system environments (e.g. supporting particular types of decode symbologies, prefixes, suffixes, data parsing, etc.) will be different within each digital imaging system. In general, the terms SCP and SCPs as used herein, and in the claims, are intended to cover a broad range of parameters that control features and functions supported within any digital imaging system according to the present invention, and such features and functions include the parameters disclosed herein as well as those that are clearly defined and detailed in Applicants' copending U.S. application Ser. No. 11/640,814 filed Dec. 18, 2006, which is incorporated herein by reference in its entirety.
In order to eliminate the need to scan or read individual programming codes to change system configuration parameters required to interface with an assigned host system, it is an objective of the present invention to provide each digital imaging system of the present invention with the capacity to programmably store, in its system memory (e.g. EPROM), a different set of system configuration parameters (SCPs) for each supported communication interface (e.g. RS232, USB, Keyboard Wedge (KBW), and IBM 46xx RS485), as illustrated in
In the flow chart of
As indicated at Block A in
One SCP/CI programming method would be to electronically load a SCP/CI data file into the system memory of each digital imaging system to be deployed within an organization's enterprise typically having diverse types of host systems, to which the digital imaging systems must establish a communication interface. This programming method might take place at the factory where the digital imaging systems are manufactured, or by a technician working at the user's enterprise before the digital imaging systems are deployed for their end use applications.
Another SCP/CI programming method might be to first cause the digital imaging system to enter a SCP/CI programming mode, whereupon a technician reads programming-type bar codes from a programming manual, following a predetermined code reading sequence, e.g. before the digital imaging system is ultimately programmed and deployable for end use.
When programming SCP/CI parameter settings in the system memory of the digital imaging system using a PC-based software application running on a host or client system, the PC-based software application can be designed to provide system configuration specialists with the option of selecting the communication interface (CI) for the set of system configuration parameters that are to be associated therewith in system memory. Also, upon changing system configuration parameters associated with a particular communication interface (i.e. changing SCP/CI parameter settings within system memory), such users can also be provided with the option of selecting whether updated changes to a full set of system configuration parameters (SCPs) should be applied to (i) a single communication interface (e.g. RS-232 or USB), or (ii) all available communication interfaces (CIs) supported by the digital imaging system, and thereafter programmed into the memory banks of the system memory of the digital imaging system. Notably, selection of option (ii) above would serve as a global programming change within the digital imaging systems.
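The per-interface parameter banks and the single-interface versus global update option described above can be sketched as follows. The class and method names are illustrative assumptions, not part of the disclosed system; the interface names follow the text.

```python
class ScpStore:
    """Sketch of per-communication-interface SCP memory banks.

    One set of system configuration parameters (SCPs) is stored for each
    supported communication interface, so that once an interface is
    detected, its associated parameter set can be loaded without scanning
    individual programming codes."""

    INTERFACES = ("RS-232", "USB", "KBW", "IBM 46xx RS485")

    def __init__(self):
        # one memory bank (a dict of SCP name -> value) per interface
        self.banks = {ifc: {} for ifc in self.INTERFACES}

    def update(self, params, interface=None):
        """Apply `params` to a single interface's bank, or, when `interface`
        is None, apply them globally to all supported interfaces."""
        targets = [interface] if interface else self.INTERFACES
        for ifc in targets:
            self.banks[ifc].update(params)

    def load_for(self, interface):
        """Return the SCP set to load once this interface is detected."""
        return dict(self.banks[interface])
```

The `interface=None` case corresponds to option (ii) above, the global programming change applied across all memory banks.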
As indicated at Block B in
As indicated at Block C in
At indicated at Block D in
As indicated at Block E in
By virtue of the present invention, a digital image capture and processing system, once initially programmed, avoids the need to read individual programming-type codes at its end-user deployment environment in order to change additional configuration parameters (e.g. symbologies, prefix, suffix, data parsing, etc.) for a particular communication interface supported by the host system environment in which it has been deployed. This feature of the present invention offers significant advantages including, for example, a reduction in the cost of ownership and maintenance, with a significant improvement in convenience and deployment flexibility within an organizational environment employing diverse host computing systems.
Specification of Method of Unlocking Restricted Features Embodied within the Digital Image Capture and Processing System of Present Invention of the Third Illustrative Embodiment, by Reading Feature-Unlocking Programming Bar Code Symbols
Oftentimes, end-users of digital imaging systems do not want to pay extra for digital image capture and processing capabilities that far exceed any code capture and decode processing challenge that might be foreseeably encountered within a given end-user deployment environment. Also, manufacturers and value-added retailers (VARs) of digital imaging systems do not want to procure the necessary license fees, or incur the necessary software and/or hardware development costs associated with the provision of particular kinds of digital image capture and processing capabilities, unless the end-user sees value in purchasing such digital imaging systems based on a real-world need. Examples of such kinds of digital image capture and processing capabilities, which customers may not require in many end-user applications, might include, for example: (i) the capacity for decoding particular types of symbologies (e.g. PDF417, Datamatrix, QR code, etc.); (ii) the capacity for performing optical character recognition (OCR) on particular types of fonts; (iii) the capacity for performing digital image transfer to external systems and devices; (iv) the capacity for reading documents bearing machine readable code as well as handwriting (e.g. signatures); etc.
In order to more efficiently deliver value to end-user customers, it is an object of the present invention to provide manufacturers with a way of and means for providing their customers with digital imaging products having features and functions that truly serve their needs at the time of purchase or procurement, and at less cost to the customer. This objective is achieved by providing a digital imaging system as shown in
Examples of predetermined classes of features and functions in the “baseline” model of the digital imaging system of
Also, an example of a first "extended" class of features and functions might include, for example: (i) the capacity for decoding particular types of symbologies (e.g. PDF417, Datamatrix, and QR code); and (ii) the capacity for performing optical character recognition (OCR) on particular types of fonts. A second extended class of features and functions might include, for example: (iii) the capacity for performing digital image transfer to external systems and devices. Also, a third extended class of features and functions might include, for example: (iv) the capacity for reading documents bearing machine readable code as well as handwriting (e.g. signatures). Typically, each of these extended classes of features and functionalities is locked and inaccessible to end-users unless they are authorized after purchasing a license to access the extended class of features and functionalities.
Therefore, in accordance with the principle of the present invention, a unique “license key” is assigned to each extended class of features and functionalities, and it is stored in system memory along with the SCPs that implement the extended class of features and functionalities. This license key is required to unlock or activate the extended class of features and functionalities. This license key must be properly loaded into the system memory in order for the SCPs associated with the corresponding extended class of features and functionalities to operate properly, after the license has been procured by the customer or end-user, as the case may be.
As will be explained below, the license key can be loaded into the digital imaging system by way of reading a uniquely encrypted "extended feature class" activating bar code symbol which is based on the license key itself, as well as the serial number of the digital imaging system/unit. In the case of desiring to activate a number of digital imaging systems by reading the same uniquely encrypted "extended feature class" activating bar code symbol, the single uniquely encrypted "extended feature class" activating bar code symbol can be generated using the license key and the range of serial numbers associated with the number of digital imaging systems/units which are to be functionally upgraded in accordance with the principles of the present invention.
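One way the activating symbol's payload could be derived from a license key and a serial-number range is sketched below. The patent does not disclose the actual encryption scheme, so the HMAC-SHA-256 construction, the tag truncation, and all function names here are assumptions for illustration only.

```python
import hashlib
import hmac

def activation_payload(license_key, serial_range):
    """Illustrative derivation of the data encoded in an "extended feature
    class" activating bar code symbol, keyed on the license key and bound
    to a range of unit serial numbers (lo, hi)."""
    lo, hi = serial_range
    message = f"{lo}-{hi}".encode()
    digest = hmac.new(license_key.encode(), message, hashlib.sha256)
    return digest.hexdigest()[:16]   # truncated tag encoded in the symbol

def unit_accepts(payload, license_key, serial, known_ranges):
    """A unit regenerates the tag for each known serial range that contains
    its own serial number, and unlocks the extended feature class only on
    an exact match with the scanned payload."""
    for lo, hi in known_ranges:
        if lo <= serial <= hi:
            if payload == activation_payload(license_key, (lo, hi)):
                return True
    return False
```

Binding the payload to a serial-number range is what allows one printed symbol to upgrade a whole batch of units while remaining useless on units outside that range.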
The method of unlocking restricted “extended” classes of features and functionalities embodied within the digital image capture and processing system of present invention is illustrated in the flow chart of
As indicated at Block A thereof, the first step involves (i) providing the system architecture of the digital imaging system with all necessary hardware resources, SCPs programmably stored in system memory, and software resources for implementing the predefined baseline classes of features and functions for the digital imaging system, and (ii) assigning a unique license key that can be used to generate a uniquely encrypted "baseline feature class" activating bar code symbol which, when read by the digital imaging system while it is operating in its "feature class extension programming" mode of operation, automatically unlocks the baseline class of features, and programs the digital imaging system to operate in its baseline feature and functionality configuration.
As indicated at Block B, the second step involves (i) providing the system architecture of the digital imaging system with all necessary hardware resources, SCPs programmably stored in system memory, and software resources for implementing the predefined "extended" classes of features and functions for the digital imaging system, and (ii) assigning a unique license key that can be used to generate a uniquely encrypted "extended feature class" activating bar code symbol which, when read by the digital imaging system while it is operating in its "feature class extension programming" mode of operation, automatically unlocks the corresponding extended class of features, and programs the digital imaging system to operate with the corresponding extended class of features and functionalities, in addition to its baseline class of features and functionalities.
Notably, Steps A and B above can be performed either at the time of manufacture of the digital imaging system, or during a service upgrade at the factory or an authorized service center.
As indicated at Block C, the third step involves activating such extended features and functionalities latent within the system by doing the following: (a) contacting the manufacturer, or its agent or service representative, and procuring a license(s) for the desired extended class or classes of features and functionalities supported on the purchased digital imaging system; (b) using the assigned license keys stored in the system memory of the digital imaging systems to be feature-upgraded (and their manufacturer-assigned serial numbers) to generate uniquely encrypted "extended feature class" activating bar code symbols corresponding to the purchased extended class licenses or license keys; (c) using the manufacturer-assigned serial numbers on the digital imaging systems to be feature-upgraded to access and display the corresponding uniquely encrypted "extended feature class" activating bar code symbols (either on the display screen of a computer running a Web-browser connected to a Web-based site supporting the procurement of extended class licenses for the digital imaging system of the customer, or by printing such programming bar code symbols by some way and/or means); (d) inducing the system to enter its "feature class extension programming" mode of operation, by scanning a predetermined programming bar code symbol, and/or generating a hardware-originated signal (e.g. depressing a switch on the unit); and (e) reading the uniquely encrypted "extended feature class" activating bar code symbols, either displayed on the display screen of the Web-enabled computer system, or printed on paper or plastic substrate material, so as to automatically unlock restricted "extended" classes of features and functionalities embodied within the digital imaging system and to activate such latent extended features and functionalities therewithin.
By virtue of the present invention, it is now possible to economically purchase digital imaging systems as disclosed in
Specification of the Fourth Illustrative Embodiment of the Digital Image Capture and Processing System of the Present Invention, Employing an Electro-Mechanical Optical Image Stabilization Subsystem that is Integrated with the Image Formation and Detection Subsystem
Referring now to
The system shown in
As shown in the system diagram of
Also, an image intensification panel can be incorporated into the image formation and detection subsystem immediately before the image detection array 35 to enable the detection of faint (i.e. low intensity) images of objects in the FOV when using the low intensity illumination levels required in demanding environments where high intensity illumination levels are prohibited or undesired from a human safety or comfort point of view.
Specification of Method of Reducing Stray Light Rays Produced from LED-Based Illumination Array Employed in the Digital Image Capture and Processing System of the Present Invention
Referring to
In
Some Modifications which Readily Come to Mind
In alternative embodiments of the present invention, the linear illumination array 23 employed within the illumination subsystem 22″ may be realized using solid-state light sources other than LEDs, such as, for example, visible laser diodes (VLDs) taught in great detail in WIPO Publication No. WO 02/43195 A2, published on May 30, 2002, and copending U.S. application Ser. No. 11/880,087 filed Jul. 19, 2007, assigned to Metrologic Instruments, Inc., and incorporated herein by reference in its entirety. However, when using VLD-based illumination techniques in the digital image capture and processing system of the present invention, great care must be taken to eliminate or otherwise substantially reduce speckle-noise generated at the image detection array 35 when using a coherent illumination source during object illumination and imaging operations. WIPO Publication No. WO 02/43195 A2, and U.S. patent application Ser. No. 11/880,087 filed Jul. 19, 2007, supra, disclose diverse methods of and apparatus for eliminating or substantially reducing speckle-noise during image formation and detection when using VLD-based illumination arrays.
Also, the linear illumination array can be realized using a combination of both visible and invisible illumination sources, as taught in great detail in Applicants' copending U.S. application Ser. No. 11/880,087 filed Jul. 19, 2007, incorporated herein by reference in its entirety. The use of such spectral mixing techniques will enable the capture of images of bar code labels having high contrast, while using minimal levels of visible illumination.
While CMOS image detection array technology was described as being used in the preferred embodiments of the present invention, it is understood that in alternative embodiments, CCD-type image detection array technology, as well as other kinds of image detection technology, can be used.
The digital image capture and processing system design described in great detail hereinabove can be readily adapted for use as an industrial or commercial fixed-position bar code reader/imager, having the interfaces commonly used in the industrial world, such as Ethernet TCP/IP for instance. By providing such digital imaging systems with an Ethernet TCP/IP port, a number of useful features will be enabled, such as, for example: multi-user access to such bar code reading systems over the Internet; management control over multiple systems on a LAN or WAN from a single user application; web-servicing of such digital imaging systems; upgrading of software, including extended classes of features and benefits, as disclosed hereinabove; and the like.
While the illustrative embodiments of the present invention have been described in connection with various types of bar code symbol reading applications involving 1-D and 2-D bar code structures, it is understood that the present invention can be used to read (i.e. recognize) any machine-readable indicia, dataform, or graphically-encoded form of intelligence, including, but not limited to, bar code symbol structures, alphanumeric character recognition strings, handwriting, and diverse dataforms currently known in the art or to be developed in the future. Hereinafter, the term "code symbol" shall be deemed to include all such information-carrying structures and other forms of graphically-encoded intelligence.
Also, digital image capture and processing systems of the present invention can also be used to capture and process various kinds of graphical images including photos and marks printed on driver licenses, permits, credit cards, debit cards, or the like, in diverse user applications.
It is understood that the digital image capture and processing technology employed in bar code symbol reading systems of the illustrative embodiments may be modified in a variety of ways which will become readily apparent to those skilled in the art having the benefit of the novel teachings disclosed herein. All such modifications and variations of the illustrative embodiments thereof shall be deemed to be within the scope and spirit of the present invention as defined by the Claims to Invention appended hereto.
Knowles, C. Harry, Zhu, Xiaoxun, Wilz, Sr., David M., Au, Ka Man, Giordano, Patrick, Kotlarsky, Anatoly, Veksland, Michael, Ren, Jie, Allen, Christopher, Miraglia, Michael V., Smith, Taylor, Yan, Weizhen, Mandal, Sudhin, De Foney, Shawn
10158612, | Feb 07 2017 | Hand Held Products, Inc. | Imaging-based automatic data extraction with security scheme |
10158834, | Aug 30 2016 | Hand Held Products, Inc. | Corrected projection perspective distortion |
10163044, | Dec 15 2016 | HAND HELD PRODUCTS, INC | Auto-adjusted print location on center-tracked printers |
10163216, | Jun 15 2016 | Hand Held Products, Inc. | Automatic mode switching in a volume dimensioner |
10176521, | Dec 15 2014 | Hand Held Products, Inc. | Augmented reality virtual product for display |
10181321, | Sep 27 2016 | VOCOLLECT, Inc. | Utilization of location and environment to improve recognition |
10181896, | Nov 01 2017 | Hand Held Products, Inc. | Systems and methods for reducing power consumption in a satellite communication device |
10183500, | Jun 01 2016 | HAND HELD PRODUCTS, INC | Thermal printhead temperature control |
10183506, | Aug 02 2016 | HAND HELD PRODUCTS, INC | Thermal printer having real-time force feedback on printhead pressure and method of using same |
10185860, | Sep 23 2015 | Intermec Technologies Corporation | Evaluating images |
10185906, | Apr 26 2016 | HAND HELD PRODUCTS, INC | Indicia reading device and methods for decoding decodable indicia employing stereoscopic imaging |
10185945, | Apr 04 2014 | Hand Held Products, Inc. | Multifunction point of sale system |
10189285, | Apr 20 2017 | HAND HELD PRODUCTS, INC | Self-strip media module |
10191514, | Dec 23 2014 | Hand Held Products, Inc. | Tablet computer with interface channels |
10192194, | Nov 18 2015 | Hand Held Products, Inc. | In-vehicle package location identification at load and delivery times |
10195880, | Mar 02 2017 | HAND HELD PRODUCTS, INC | Automatic width detection |
10197446, | Sep 10 2015 | Hand Held Products, Inc. | System and method of determining if a surface is printed or a device screen |
10203402, | Jun 07 2013 | Hand Held Products, Inc. | Method of error correction for 3D imaging device |
10210364, | Oct 31 2017 | Hand Held Products, Inc.; HAND HELD PRODUCTS, INC | Direct part marking scanners including dome diffusers with edge illumination assemblies |
10210366, | Jul 15 2016 | Hand Held Products, Inc. | Imaging scanner with positioning and display |
10216969, | Jul 10 2017 | Hand Held Products, Inc. | Illuminator for directly providing dark field and bright field illumination |
10217089, | Jan 05 2016 | Intermec Technologies Corporation | System and method for guided printer servicing |
10220643, | Aug 04 2016 | HAND HELD PRODUCTS, INC | System and method for active printing consistency control and damage protection |
10222514, | Apr 29 2014 | Hand Held Products, Inc. | Autofocus lens system |
10223626, | Apr 19 2017 | Hand Held Products, Inc. | High ambient light electronic screen communication method |
10225544, | Nov 19 2015 | Hand Held Products, Inc. | High resolution dot pattern |
10228452, | Jun 07 2013 | Hand Held Products, Inc. | Method of error correction for 3D imaging device |
10232628, | Dec 08 2017 | HAND HELD PRODUCTS, INC | Removably retaining a print head assembly on a printer |
10235547, | Jan 26 2016 | Hand Held Products, Inc. | Enhanced matrix symbol error correction method |
10237421, | Dec 22 2016 | HAND HELD PRODUCTS, INC | Printers and methods for identifying a source of a problem therein |
10240914, | Aug 06 2014 | Hand Held Products, Inc. | Dimensioning system with guided alignment |
10245861, | Oct 04 2017 | HAND HELD PRODUCTS, INC | Printers, printer spindle assemblies, and methods for determining media width for controlling media tension |
10247547, | Jun 23 2015 | Hand Held Products, Inc. | Optical pattern projector |
10248822, | Oct 29 2015 | Hand Held Products, Inc. | Scanner assembly with removable shock mount |
10249030, | Oct 30 2015 | Hand Held Products, Inc. | Image transformation for indicia reading |
10252874, | Feb 20 2017 | HAND HELD PRODUCTS, INC | Clutch bearing to keep media tension for better sensing accuracy |
10255469, | Jul 28 2017 | Hand Held Products, Inc. | Illumination apparatus for a barcode reader |
10259694, | Dec 31 2014 | Hand Held Products, Inc. | System and method for monitoring an industrial vehicle |
10262660, | Jan 08 2015 | Hand Held Products, Inc. | Voice mode asset retrieval |
10263443, | Jan 13 2017 | HAND HELD PRODUCTS, INC | Power capacity indicator |
10264165, | Jul 11 2017 | Hand Held Products, Inc. | Optical bar assemblies for optical systems and isolation damping systems including the same |
10268858, | Jun 16 2016 | Hand Held Products, Inc. | Eye gaze detection controlled indicia scanning system and method |
10268859, | Sep 23 2016 | Hand Held Products, Inc. | Three dimensional aimer for barcode scanning |
10269342, | Oct 29 2014 | Hand Held Products, Inc. | Method and system for recognizing speech using wildcards in an expected response |
10272784, | May 24 2013 | Hand Held Products, Inc. | System and method for display of information using a vehicle-mount computer |
10275088, | Dec 18 2014 | Hand Held Products, Inc. | Systems and methods for identifying faulty touch panel having intermittent field failures |
10275624, | Oct 29 2013 | HAND HELD PRODUCTS, INC | Hybrid system and method for reading indicia |
10276009, | Jan 26 2017 | Hand Held Products, Inc. | Method of reading a barcode and deactivating an electronic article surveillance tag |
10282526, | Dec 09 2015 | Hand Held Products, Inc. | Generation of randomized passwords for one-time usage |
10286681, | Jul 14 2016 | HAND HELD PRODUCTS, INC | Wireless thermal printhead system and method |
10286694, | Sep 02 2016 | Datamax-O'Neil Corporation | Ultra compact printer |
10293624, | Oct 23 2017 | HAND HELD PRODUCTS, INC | Smart media hanger with media width detection |
10296259, | Dec 22 2014 | Hand Held Products, Inc. | Delayed trim of managed NAND flash memory in computing devices |
10303258, | Jun 10 2015 | Hand Held Products, Inc. | Indicia-reading systems having an interface with a user's nervous system |
10303909, | Nov 24 2015 | Hand Held Products, Inc. | Add-on device with configurable optics for an image scanner for scanning barcodes |
10304174, | Dec 19 2016 | HAND HELD PRODUCTS, INC | Printer-verifiers and systems and methods for verifying printed indicia |
10306051, | Jun 14 2016 | Hand Held Products, Inc. | Managing energy usage in mobile devices |
10308009, | Oct 13 2015 | Intermec IP Corp. | Magnetic media holder for printer |
10311274, | Nov 16 2016 | Hand Held Products, Inc. | Reader for optical indicia presented under two or more imaging conditions within a single frame time |
10312483, | Sep 30 2015 | Hand Held Products, Inc. | Double locking mechanism on a battery latch |
10313340, | Dec 16 2015 | Hand Held Products, Inc. | Method and system for tracking an electronic device at an electronic device docking station |
10313811, | Jul 13 2016 | Hand Held Products, Inc. | Systems and methods for determining microphone position |
10317474, | Dec 18 2014 | Hand Held Products, Inc. | Systems and methods for identifying faulty battery in an electronic device |
10321127, | Aug 20 2012 | Intermec IP CORP | Volume dimensioning system calibration systems and methods |
10323929, | Dec 19 2017 | HAND HELD PRODUCTS, INC | Width detecting media hanger |
10325436, | Dec 31 2015 | HAND HELD PRODUCTS, INC | Devices, systems, and methods for optical validation |
10331609, | Apr 15 2015 | Hand Held Products, Inc. | System for exchanging information between wireless peripherals and back-end systems via a peripheral hub |
10331930, | Sep 19 2016 | Hand Held Products, Inc. | Dot peen mark image acquisition |
10332099, | Jun 09 2017 | Hand Held Products, Inc. | Secure paper-free bills in workflow applications |
10333955, | May 06 2015 | Hand Held Products, Inc. | Method and system to protect software-based network-connected devices from advanced persistent threat |
10336112, | Feb 27 2017 | HAND HELD PRODUCTS, INC | Segmented enclosure |
10339352, | Jun 03 2016 | Hand Held Products, Inc. | Wearable metrological apparatus |
10345383, | Jul 07 2015 | Hand Held Products, Inc. | Useful battery capacity / state of health gauge |
10350905, | Jan 26 2017 | HAND HELD PRODUCTS, INC | Detecting printing ribbon orientation |
10354449, | Jun 12 2015 | HAND HELD PRODUCTS, INC | Augmented reality lighting effects |
10359273, | Oct 21 2014 | Hand Held Products, Inc. | Handheld dimensioning system with measurement-conformance feedback |
10360424, | Dec 28 2016 | Hand Held Products, Inc. | Illuminator for DPM scanner |
10360728, | May 19 2015 | Hand Held Products, Inc. | Augmented reality device, system, and method for safety |
10366380, | Apr 04 2014 | Hand Held Products, Inc. | Multifunction point of sale system |
10369804, | Nov 10 2017 | HAND HELD PRODUCTS, INC | Secure thermal print head |
10369823, | Nov 06 2017 | HAND HELD PRODUCTS, INC | Print head pressure detection and adjustment |
10372389, | Sep 22 2017 | HAND HELD PRODUCTS, INC | Systems and methods for printer maintenance operations |
10372952, | Sep 06 2013 | Hand Held Products, Inc. | Device having light source to reduce surface pathogens |
10372954, | Aug 16 2016 | Hand Held Products, Inc. | Method for reading indicia off a display of a mobile device |
10373032, | Aug 01 2017 | HAND HELD PRODUCTS, INC | Cryptographic printhead |
10373143, | Sep 24 2015 | Hand Held Products, Inc.; HAND HELD PRODUCTS, INC | Product identification using electroencephalography |
10375473, | Sep 20 2016 | VOCOLLECT, INC | Distributed environmental microphones to minimize noise during speech recognition |
10384462, | Aug 17 2016 | HAND HELD PRODUCTS, INC | Easy replacement of thermal print head and simple adjustment on print pressure |
10387699, | Jan 12 2017 | Hand Held Products, Inc. | Waking system in barcode scanner |
10393506, | Jul 15 2015 | Hand Held Products, Inc. | Method for a mobile dimensioning device to use a dynamic accuracy compatible with NIST standard |
10393508, | Oct 21 2014 | Hand Held Products, Inc. | Handheld dimensioning system with measurement-conformance feedback |
10394316, | Apr 07 2016 | Hand Held Products, Inc. | Multiple display modes on a mobile device |
10395081, | Dec 09 2016 | Hand Held Products, Inc. | Encoding document capture bounds with barcodes |
10395116, | Oct 29 2015 | HAND HELD PRODUCTS, INC | Dynamically created and updated indoor positioning map |
10397388, | Nov 02 2015 | Hand Held Products, Inc. | Extended features for network communication |
10399359, | Sep 06 2017 | DATAMAX-O'NEIL CORPORATION | Autocorrection for uneven print pressure on print media |
10399361, | Nov 21 2017 | HAND HELD PRODUCTS, INC | Printer, system and method for programming RFID tags on media labels |
10399369, | Oct 23 2017 | HAND HELD PRODUCTS, INC | Smart media hanger with media width detection |
10401436, | May 04 2015 | Hand Held Products, Inc. | Tracking battery conditions |
10402038, | Jan 08 2015 | Hand Held Products, Inc. | Stack handling using multiple primary user interfaces |
10402956, | Oct 10 2014 | Hand Held Products, Inc. | Image-stitching for dimensioning |
10410629, | Aug 19 2015 | HAND HELD PRODUCTS, INC | Auto-complete methods for spoken complete value entries |
10417769, | Jun 15 2016 | Hand Held Products, Inc. | Automatic mode switching in a volume dimensioner |
10424842, | Sep 02 2015 | Hand Held Products, Inc. | Patch antenna |
10427424, | Nov 01 2017 | HAND HELD PRODUCTS, INC | Estimating a remaining amount of a consumable resource based on a center of mass calculation |
10434800, | May 17 2018 | HAND HELD PRODUCTS, INC | Printer roll feed mechanism |
10438098, | May 19 2017 | Hand Held Products, Inc. | High-speed OCR decode using depleted centerlines |
10438409, | Dec 15 2014 | Hand Held Products, Inc. | Augmented reality asset locator |
10463140, | Apr 28 2017 | HAND HELD PRODUCTS, INC | Attachment apparatus for electronic device |
10464349, | Sep 20 2016 | HAND HELD PRODUCTS, INC | Method and system to calculate line feed error in labels on a printer |
10467513, | Aug 12 2015 | Datamax-O'Neil Corporation | Verification of a printed image on media |
10467806, | May 04 2012 | Intermec IP Corp. | Volume dimensioning systems and methods |
10468015, | Jan 12 2017 | VOCOLLECT, Inc. | Automated TTS self correction system |
10484847, | Sep 13 2016 | Hand Held Products, Inc. | Methods for provisioning a wireless beacon |
10506516, | Aug 26 2015 | Hand Held Products, Inc. | Fleet power management through information storage sharing |
10509619, | Dec 15 2014 | Hand Held Products, Inc. | Augmented reality quick-start and user guide |
10523038, | May 23 2017 | Hand Held Products, Inc. | System and method for wireless charging of a beacon and/or sensor device |
10529335, | Aug 19 2015 | HAND HELD PRODUCTS, INC | Auto-complete methods for spoken complete value entries |
10546160, | Jan 05 2018 | HAND HELD PRODUCTS, INC | Methods, apparatuses, and systems for providing print quality feedback and controlling print quality of machine-readable indicia |
10549561, | May 04 2017 | DATAMAX-O'NEIL CORPORATION | Apparatus for sealing an enclosure |
10552786, | Dec 26 2014 | Hand Held Products, Inc. | Product and location management via voice recognition |
10559075, | Dec 19 2016 | HAND HELD PRODUCTS, INC | Printer-verifiers and systems and methods for verifying printed indicia |
10584962, | May 01 2018 | HAND HELD PRODUCTS, INC | System and method for validating physical-item security |
10592536, | May 30 2017 | Hand Held Products, Inc. | Systems and methods for determining a location of a user when using an imaging device in an indoor facility |
10593130, | May 19 2015 | Hand Held Products, Inc. | Evaluating image values |
10612958, | Jul 07 2015 | Hand Held Products, Inc. | Mobile dimensioner apparatus to mitigate unfair charging practices in commerce |
10621470, | Sep 29 2017 | HAND HELD PRODUCTS, INC | Methods for optical character recognition (OCR) |
10621538, | Dec 28 2014 | HAND HELD PRODUCTS, INC | Dynamic check digit utilization via electronic tag |
10621634, | May 08 2015 | Hand Held Products, Inc. | Application independent DEX/UCS interface |
10625525, | Mar 02 2017 | HAND HELD PRODUCTS, INC | Automatic width detection |
10635871, | Aug 04 2017 | HAND HELD PRODUCTS, INC | Indicia reader acoustic for multiple mounting positions |
10635876, | Dec 23 2014 | HAND HELD PRODUCTS, INC | Method of barcode templating for enhanced decoding performance |
10635922, | May 15 2012 | Hand Held Products, Inc. | Terminals and methods for dimensioning objects |
10640325, | Aug 05 2016 | HAND HELD PRODUCTS, INC | Rigid yet flexible spindle for rolled material |
10644944, | Jun 30 2017 | HAND HELD PRODUCTS, INC | Managing a fleet of devices |
10650631, | Jul 28 2017 | Hand Held Products, Inc. | Systems and methods for processing a distorted image |
10652403, | Jan 10 2017 | HAND HELD PRODUCTS, INC | Printer script autocorrect |
10654287, | Oct 19 2017 | HAND HELD PRODUCTS, INC | Print quality setup using banks in parallel |
10654697, | Dec 01 2017 | Hand Held Products, Inc. | Gyroscopically stabilized vehicle system |
10679101, | Oct 25 2017 | Hand Held Products, Inc. | Optical character recognition systems and methods |
10685665, | Aug 17 2016 | VOCOLLECT, Inc. | Method and apparatus to improve speech recognition in a high audio noise environment |
10694277, | Oct 03 2016 | VOCOLLECT, Inc. | Communication headsets and systems for mobile application control and power savings |
10698470, | Dec 09 2016 | Hand Held Products, Inc. | Smart battery balance system and method |
10703112, | Dec 13 2017 | HAND HELD PRODUCTS, INC | Image to script converter |
10710375, | Mar 03 2017 | HAND HELD PRODUCTS, INC | Region-of-interest based print quality optimization |
10710386, | Jun 21 2017 | HAND HELD PRODUCTS, INC | Removable printhead |
10714121, | Jul 27 2016 | VOCOLLECT, Inc. | Distinguishing user speech from background speech in speech-dense environments |
10728445, | Oct 05 2017 | HAND HELD PRODUCTS, INC. | Methods for constructing a color composite image |
10731963, | Jan 09 2018 | HAND HELD PRODUCTS, INC | Apparatus and method of measuring media thickness |
10732226, | May 26 2017 | HAND HELD PRODUCTS, INC | Methods for estimating a number of workflow cycles able to be completed from a remaining battery capacity |
10733401, | Jul 15 2016 | Hand Held Products, Inc. | Barcode reader with viewing frame |
10733406, | Jun 16 2016 | Hand Held Products, Inc. | Eye gaze detection controlled indicia scanning system and method |
10733748, | Jul 24 2017 | Hand Held Products, Inc. | Dual-pattern optical 3D dimensioning |
10737911, | Mar 02 2017 | Hand Held Products, Inc. | Electromagnetic pallet and method for adjusting pallet position |
10740855, | Dec 14 2016 | Hand Held Products, Inc. | Supply chain tracking of farm produce and crops |
10741347, | Jun 16 2015 | Hand Held Products, Inc. | Tactile switch for a mobile electronic device |
10747227, | Jan 27 2016 | Hand Held Products, Inc. | Vehicle positioning and object avoidance |
10747975, | Jul 06 2017 | Hand Held Products, Inc. | Methods for changing a configuration of a device for reading machine-readable code |
10749300, | Aug 11 2017 | HAND HELD PRODUCTS, INC | POGO connector based soft power start solution |
10753802, | Sep 10 2015 | Hand Held Products, Inc. | System and method of determining if a surface is printed or a device screen |
10754593, | Jan 05 2018 | DATAMAX-O'NEIL CORPORATION | Methods, apparatuses, and systems for verifying printed image and improving print quality |
10755154, | Apr 26 2016 | Hand Held Products, Inc. | Indicia reading device and methods for decoding decodable indicia employing stereoscopic imaging |
10756563, | Dec 15 2017 | HAND HELD PRODUCTS, INC | Powering devices using low-current power sources |
10756900, | Sep 28 2017 | HAND HELD PRODUCTS, INC | Non-repudiation protocol using time-based one-time password (TOTP) |
10769393, | Oct 24 2012 | Honeywell International Inc. | Chip on board based highly integrated imager |
10773537, | Dec 27 2017 | HAND HELD PRODUCTS, INC | Method and apparatus for printing |
10775165, | Oct 10 2014 | HAND HELD PRODUCTS, INC | Methods for improving the accuracy of dimensioning-system measurements |
10778690, | Jun 30 2017 | HAND HELD PRODUCTS, INC | Managing a fleet of workflow devices and standby devices in a device network |
10780721, | Mar 30 2017 | HAND HELD PRODUCTS, INC | Detecting label stops |
10789435, | Mar 07 2014 | Hand Held Products, Inc. | Indicia reader for size-limited applications |
10791213, | Jun 14 2016 | Hand Held Products, Inc. | Managing energy usage in mobile devices |
10795618, | Jan 05 2018 | HAND HELD PRODUCTS, INC | Methods, apparatuses, and systems for verifying printed image and improving print quality |
10796119, | Jul 28 2017 | Hand Held Products, Inc. | Decoding color barcodes |
10797498, | Jan 13 2017 | HAND HELD PRODUCTS, INC | Power capacity indicator |
10798316, | Apr 04 2017 | Hand Held Products, Inc. | Multi-spectral imaging using longitudinal chromatic aberrations |
10803264, | Jan 05 2018 | HAND HELD PRODUCTS, INC | Method, apparatus, and system for characterizing an optical system |
10803267, | Aug 18 2017 | Hand Held Products, Inc. | Illuminator for a barcode scanner |
10804718, | Jan 08 2015 | Hand Held Products, Inc. | System and method for charging a barcode scanner |
10805603, | Aug 20 2012 | Intermec IP Corp. | Volume dimensioning system calibration systems and methods |
10809949, | Jan 26 2018 | HAND HELD PRODUCTS, INC | Removably couplable printer and verifier assembly |
10810529, | Nov 03 2014 | Hand Held Products, Inc. | Directing an inspector through an inspection |
10810530, | Sep 26 2014 | HAND HELD PRODUCTS, INC | System and method for workflow management |
10810541, | May 03 2017 | Hand Held Products, Inc. | Methods for pick and put location verification |
10810715, | Oct 10 2014 | HAND HELD PRODUCTS, INC | System and method for picking validation |
10834283, | Jan 05 2018 | HAND HELD PRODUCTS, INC | Methods, apparatuses, and systems for detecting printing defects and contaminated components of a printer |
10845184, | Jan 12 2009 | Intermec IP Corporation | Semi-automatic dimensioning with imager on a portable device |
10846498, | Jan 26 2016 | Hand Held Products, Inc. | Enhanced matrix symbol error correction method |
10859375, | Oct 10 2014 | Hand Held Products, Inc. | Methods for improving the accuracy of dimensioning-system measurements |
10859667, | Jan 12 2016 | Hand Held Products, Inc. | Programmable reference beacons |
10860706, | Apr 24 2015 | Hand Held Products, Inc. | Secure unattended network authentication |
10863002, | May 24 2013 | Hand Held Products, Inc. | System for providing a continuous communication link with a symbol reading device |
10866780, | Dec 15 2014 | Hand Held Products, Inc. | Augmented reality quick-start and user guide |
10867141, | Jul 12 2017 | Hand Held Products, Inc. | System and method for augmented reality configuration of indicia readers |
10867145, | Mar 06 2017 | HAND HELD PRODUCTS, INC | Systems and methods for barcode verification |
10867450, | Jun 12 2015 | Hand Held Products, Inc. | Augmented reality lighting effects |
10868958, | Oct 05 2017 | Hand Held Products, Inc. | Methods for constructing a color composite image |
10872214, | Jun 03 2016 | Hand Held Products, Inc. | Wearable metrological apparatus |
10884059, | Oct 18 2017 | HAND HELD PRODUCTS, INC | Determining the integrity of a computing device |
10894431, | Oct 07 2015 | Intermec Technologies Corporation | Print position correction |
10896304, | Aug 17 2015 | Hand Held Products, Inc. | Indicia reader having a filtered multifunction image sensor |
10896361, | Apr 19 2017 | Hand Held Products, Inc. | High ambient light electronic screen communication method |
10896403, | Jul 18 2016 | VOCOLLECT, INC | Systems and methods for managing dated products |
10897150, | Jan 12 2018 | HAND HELD PRODUCTS, INC | Indicating charge status |
10897940, | Aug 27 2015 | Hand Held Products, Inc. | Gloves having measuring, scanning, and displaying capabilities |
10904453, | Dec 28 2016 | Hand Held Products, Inc. | Method and system for synchronizing illumination timing in a multi-sensor imager |
10908013, | Oct 16 2012 | Hand Held Products, Inc. | Dimensioning system |
10909490, | Oct 15 2014 | VOCOLLECT, Inc.; VOCOLLECT, INC | Systems and methods for worker resource management |
10909708, | Dec 09 2016 | Hand Held Products, Inc. | Calibrating a dimensioner using ratios of measurable parameters of optically-perceptible geometric elements |
10953672, | Mar 30 2017 | HAND HELD PRODUCTS, INC | Detecting label stops |
10956033, | Jul 13 2017 | Hand Held Products, Inc. | System and method for generating a virtual keyboard with a highlighted area of interest |
10960681, | Sep 06 2017 | DATAMAX-O'NEIL CORPORATION | Autocorrection for uneven print pressure on print media |
10967660, | May 12 2017 | HAND HELD PRODUCTS, INC | Media replacement process for thermal printers |
10976797, | Dec 09 2016 | Hand Held Products, Inc. | Smart battery balance system and method |
10977594, | Jun 30 2017 | HAND HELD PRODUCTS, INC | Managing a fleet of devices |
10984374, | Feb 10 2017 | VOCOLLECT, INC | Method and system for inputting products into an inventory system |
10999460, | Jan 05 2018 | HAND HELD PRODUCTS, INC | Methods, apparatuses, and systems for detecting printing defects and contaminated components of a printer |
11010139, | Jan 08 2015 | Hand Held Products, Inc. | Application development using multiple primary user interfaces |
11014374, | Mar 03 2017 | HAND HELD PRODUCTS, INC | Region-of-interest based print quality optimization |
11042834, | Jan 12 2017 | VOCOLLECT, INC | Voice-enabled substitutions with customer notification |
11047672, | Mar 28 2017 | HAND HELD PRODUCTS, INC | System for optically dimensioning |
11081087, | Jan 08 2015 | HAND HELD PRODUCTS, INC | Multiple primary user interfaces |
11084698, | Dec 31 2014 | Hand Held Products, Inc. | System and method for monitoring an industrial vehicle |
11117407, | Dec 27 2017 | HAND HELD PRODUCTS, INC | Method and apparatus for printing |
11120238, | Jul 28 2017 | Hand Held Products, Inc. | Decoding color barcodes |
11125885, | Mar 15 2016 | Hand Held Products, Inc. | Monitoring user biometric parameters with nanotechnology in personal locator beacon |
11126384, | Jan 26 2018 | HAND HELD PRODUCTS, INC | Removably couplable printer and verifier assembly |
11139665, | Jan 13 2017 | Hand Held Products, Inc. | Power capacity indicator |
11152812, | Dec 15 2017 | HAND HELD PRODUCTS, INC | Powering devices using low-current power sources |
11155102, | Dec 13 2017 | HAND HELD PRODUCTS, INC | Image to script converter |
11157217, | Jan 05 2018 | HAND HELD PRODUCTS, INC | Methods, apparatuses, and systems for verifying printed image and improving print quality |
11157869, | Aug 05 2016 | VOCOLLECT, Inc. | Monitoring worker movement in a warehouse setting |
11178008, | Jun 30 2017 | HAND HELD PRODUCTS, INC | Managing a fleet of devices |
11210483, | Jan 05 2018 | HAND HELD PRODUCTS, INC | Method, apparatus, and system for characterizing an optical system |
11244264, | Dec 29 2014 | Hand Held Products, Inc. | Interleaving surprise activities in workflow |
11257143, | Dec 30 2014 | HAND HELD PRODUCTS, INC | Method and device for simulating a virtual out-of-box experience of a packaged product |
11282323, | Dec 31 2015 | Hand Held Products, Inc. | Devices, systems, and methods for optical validation |
11282515, | Aug 31 2015 | Hand Held Products, Inc. | Multiple inspector voice inspection |
11295182, | May 19 2017 | Hand Held Products, Inc. | High-speed OCR decode using depleted centerlines |
11301646, | Jan 05 2018 | HAND HELD PRODUCTS, INC | Methods, apparatuses, and systems for providing print quality feedback and controlling print quality of machine readable indicia |
11321044, | Dec 15 2014 | Hand Held Products, Inc. | Augmented reality quick-start and user guide |
11328335, | Dec 29 2014 | HAND HELD PRODUCTS, INC | Visual graphic aided location identification |
11353319, | Jul 15 2015 | Hand Held Products, Inc. | Method for a mobile dimensioning device to use a dynamic accuracy compatible with NIST standard |
11372053, | May 26 2017 | Hand Held Products, Inc. | Methods for estimating a number of workflow cycles able to be completed from a remaining battery capacity |
11373051, | Aug 04 2017 | Hand Held Products, Inc. | Indicia reader acoustic for multiple mounting positions |
11423348, | Jan 11 2016 | Hand Held Products, Inc. | System and method for assessing worker performance |
11428744, | May 26 2017 | Hand Held Products, Inc. | Methods for estimating a number of workflow cycles able to be completed from a remaining battery capacity |
11430100, | Dec 19 2016 | HAND HELD PRODUCTS, INC | Printer-verifiers and systems and methods for verifying printed indicia |
11443363, | Dec 29 2014 | Hand Held Products, Inc. | Confirming product location using a subset of a product identifier |
11449700, | Jan 26 2016 | Hand Held Products, Inc. | Enhanced matrix symbol error correction method |
11449816, | Sep 26 2014 | Hand Held Products, Inc. | System and method for workflow management |
11475655, | Sep 29 2017 | HAND HELD PRODUCTS, INC | Methods for optical character recognition (OCR) |
11488366, | Jun 12 2015 | Hand Held Products, Inc. | Augmented reality lighting effects |
11489352, | Jan 08 2015 | Hand Held Products, Inc. | System and method for charging a barcode scanner |
11531825, | Mar 07 2014 | Hand Held Products, Inc. | Indicia reader for size-limited applications |
11546428, | Aug 19 2014 | HAND HELD PRODUCTS, INC | Mobile computing device with data cognition software |
11570321, | Jan 05 2018 | HAND HELD PRODUCTS, INC | Methods, apparatuses, and systems for detecting printing defects and contaminated components of a printer |
11593591, | Oct 25 2017 | Hand Held Products, Inc. | Optical character recognition systems and methods |
11625203, | Jan 05 2018 | HAND HELD PRODUCTS, INC | Methods, apparatuses, and systems for scanning pre-printed print media to verify printed image and improving print quality |
11639846, | Sep 27 2019 | Honeywell International Inc | Dual-pattern optical 3D dimensioning |
11646028, | Aug 31 2015 | Hand Held Products, Inc. | Multiple inspector voice inspection |
11660895, | Dec 27 2017 | HAND HELD PRODUCTS, INC | Method and apparatus for printing |
11669703, | Jan 05 2018 | Datamax-O'Neil Corporation | Method, apparatus, and system for characterizing an optical system |
11694045, | Jan 05 2018 | Datamax-O'Neil Corporation | Method, apparatus, and system for characterizing an optical system |
11704085, | Dec 15 2014 | Hand Held Products, Inc. | Augmented reality quick-start and user guide |
11710980, | Dec 15 2017 | HAND HELD PRODUCTS, INC | Powering devices using low-current power sources |
11727232, | Jan 26 2016 | Hand Held Products, Inc. | Enhanced matrix symbol error correction method |
11745516, | Mar 03 2017 | HAND HELD PRODUCTS, INC | Region-of-interest based print quality optimization |
11763112, | Oct 29 2013 | Hand Held Products, Inc. | Hybrid system and method for reading indicia |
11790196, | Aug 04 2017 | Hand Held Products, Inc. | Indicia reader acoustic for multiple mounting positions |
11810545, | May 20 2011 | VOCOLLECT, Inc. | Systems and methods for dynamically improving user intelligibility of synthesized speech in a work environment |
11817078, | May 20 2011 | VOCOLLECT, Inc. | Systems and methods for dynamically improving user intelligibility of synthesized speech in a work environment |
11837253, | Jul 27 2016 | VOCOLLECT, Inc. | Distinguishing user speech from background speech in speech-dense environments |
11854333, | Dec 31 2015 | Hand Held Products, Inc. | Devices, systems, and methods for optical validation |
11868918, | Jun 30 2017 | HAND HELD PRODUCTS, INC | Managing a fleet of devices |
11893449, | Jan 05 2018 | HAND HELD PRODUCTS, INC | Method, apparatus, and system for characterizing an optical system |
11894705, | Jan 12 2018 | HAND HELD PRODUCTS, INC | Indicating charge status |
11900201, | Jan 05 2018 | HAND HELD PRODUCTS, INC | Methods, apparatuses, and systems for providing print quality feedback and controlling print quality of machine readable indicia |
11906280, | May 19 2015 | Hand Held Products, Inc. | Evaluating image values |
11941307, | Jan 05 2018 | Hand Held Products, Inc. | Methods, apparatuses, and systems captures image of pre-printed print media information for generating validation image by comparing post-printed image with pre-printed image and improving print quality |
11943406, | Jan 05 2018 | HAND HELD PRODUCTS, INC | Methods, apparatuses, and systems for detecting printing defects and contaminated components of a printer |
11962464, | Jun 30 2017 | HAND HELD PRODUCTS, INC | Managing a fleet of devices |
12057139, | Jul 27 2016 | VOCOLLECT, Inc. | Distinguishing user speech from background speech in speech-dense environments |
12073282, | Jan 05 2018 | Datamax-O'Neil Corporation | Method, apparatus, and system for characterizing an optical system |
8985461, | Jun 28 2013 | HAND HELD PRODUCTS, INC | Mobile device having an improved user interface for reading code symbols |
9004364, | Oct 14 2013 | Hand Held Products, Inc. | Indicia reader |
9007368, | May 07 2012 | Intermec IP CORP | Dimensioning system calibration systems and methods |
9037344, | May 24 2013 | Hand Held Products, Inc.; HAND HELD PRODUCTS, INC | System and method for display of information using a vehicle-mount computer |
9053378, | Dec 12 2013 | HAND HELD PRODUCTS, INC | Laser barcode scanner |
9070032, | Apr 10 2013 | HAND HELD PRODUCTS, INC | Method of programming a symbol reading system |
9080856, | Mar 13 2013 | Intermec IP Corp.; Intermec IP CORP | Systems and methods for enhancing dimensioning, for example volume dimensioning |
9082023, | Sep 05 2013 | Hand Held Products, Inc. | Method for operating a laser scanner |
9104929, | Jun 26 2013 | Hand Held Products, Inc. | Code symbol reading system having adaptive autofocus |
9141839, | Jun 07 2013 | Hand Held Products, Inc.; HAND HELD PRODUCTS, INC | System and method for reading code symbols at long range using source power control |
9165174, | Oct 14 2013 | Hand Held Products, Inc. | Indicia reader |
9183426, | Sep 11 2013 | Hand Held Products, Inc. | Handheld indicia reader having locking endcap |
9224022, | Apr 29 2014 | Hand Held Products, Inc. | Autofocus lens system for indicia readers |
9224027, | Apr 01 2014 | Hand Held Products, Inc. | Hand-mounted indicia-reading device with finger motion triggering |
9235737, | Jun 28 2013 | Hand Held Products, Inc. | System having an improved user interface for reading code symbols |
9239950, | Jul 01 2013 | HAND HELD PRODUCTS, INC | Dimensioning system |
9250652, | Jul 02 2013 | HAND HELD PRODUCTS, INC | Electronic device case |
9251411, | Sep 24 2013 | Hand Held Products, Inc. | Augmented-reality signature capture |
9258033, | Apr 21 2014 | Hand Held Products, Inc. | Docking system and method using near field communication |
9277668, | May 13 2014 | HAND HELD PRODUCTS, INC | Indicia-reading module with an integrated flexible circuit |
9280693, | May 13 2014 | HAND HELD PRODUCTS, INC | Indicia-reader housing with an integrated optical structure |
9292969, | May 07 2012 | Intermec IP Corp. | Dimensioning system calibration systems and methods |
9297900, | Jul 25 2013 | Hand Held Products, Inc.; HAND HELD PRODUCTS, INC | Code symbol reading system having adjustable object detection |
9301427, | May 13 2014 | Hand Held Products, Inc. | Heat-dissipation structure for an indicia reading module |
9310609, | Jul 25 2014 | Hand Held Products, Inc. | Axially reinforced flexible scan element |
9373018, | Jan 08 2014 | HAND HELD PRODUCTS, INC. D/B/A HONEYWELL SCANNING & MOBILITY | Indicia-reader having unitary-construction |
9390596, | Feb 23 2015 | Hand Held Products, Inc. | Device, system, and method for determining the status of checkout lanes |
9412242, | Apr 04 2014 | HAND HELD PRODUCTS, INC | Multifunction point of sale system |
9424454, | Oct 24 2012 | Honeywell International, Inc. | Chip on board based highly integrated imager |
9443123, | Jul 18 2014 | Hand Held Products, Inc. | System and method for indicia verification |
9443222, | Oct 14 2014 | Hand Held Products, Inc.; HAND HELD PRODUCTS, INC | Identifying inventory items in a storage facility |
9464885, | Aug 30 2013 | Hand Held Products, Inc. | System and method for package dimensioning |
9478113, | Jun 27 2014 | Hand Held Products, Inc. | Cordless indicia reader with a multifunction coil for wireless charging and EAS deactivation |
9488986, | Jul 31 2015 | Hand Held Products, Inc. | System and method for tracking an item on a pallet in a warehouse |
9490540, | Sep 02 2015 | Hand Held Products, Inc. | Patch antenna |
9507974, | Jun 10 2015 | Hand Held Products, Inc. | Indicia-reading systems having an interface with a user's nervous system |
9510140, | Apr 21 2014 | Hand Held Products, Inc. | Docking system and method using near field communication |
9521331, | Apr 21 2015 | Hand Held Products, Inc. | Capturing a graphic information presentation |
9530038, | Nov 25 2013 | Hand Held Products, Inc. | Indicia-reading system |
9557166, | Oct 21 2014 | Hand Held Products, Inc. | Dimensioning system with multipath interference mitigation |
9564035, | Dec 22 2014 | Hand Held Products, Inc. | Safety system and method |
9572901, | Sep 06 2013 | Hand Held Products, Inc. | Device having light source to reduce surface pathogens |
9581809, | Apr 29 2014 | Hand Held Products, Inc. | Autofocus lens system |
9582698, | Jun 26 2013 | Hand Held Products, Inc. | Code symbol reading system having adaptive autofocus |
9616749, | May 24 2013 | Hand Held Products, Inc. | System and method for display of information using a vehicle-mount computer |
9638512, | Oct 21 2014 | Hand Held Products, Inc. | Handheld dimensioning system with feedback |
9639726, | Jul 25 2013 | Hand Held Products, Inc. | Code symbol reading system having adjustable object detection |
9646189, | Oct 31 2014 | Honeywell International, Inc. | Scanner with illumination system |
9646191, | Sep 23 2015 | Intermec Technologies Corporation | Evaluating images |
9651362, | Aug 06 2014 | Hand Held Products, Inc. | Dimensioning system with guided alignment |
9652648, | Sep 11 2015 | Hand Held Products, Inc. | Positioning an object with respect to a target location |
9652653, | Dec 27 2014 | Hand Held Products, Inc. | Acceleration-based motion tolerance and predictive coding |
9656487, | Oct 13 2015 | Intermec Technologies Corporation | Magnetic media holder for printer |
9659198, | Sep 10 2015 | Hand Held Products, Inc. | System and method of determining if a surface is printed or a mobile device screen |
9662900, | Jul 14 2016 | HAND HELD PRODUCTS, INC | Wireless thermal printhead system and method |
9665757, | Mar 07 2014 | Hand Held Products, Inc. | Indicia reader for size-limited applications |
9672398, | Aug 26 2013 | Intermec IP Corporation | Aiming imagers |
9672507, | Apr 04 2014 | Hand Held Products, Inc. | Multifunction point of sale system |
9674430, | Mar 09 2016 | HAND HELD PRODUCTS, INC | Imaging device for producing high resolution images using subpixel shifts and method of using same |
9677872, | Oct 21 2014 | Hand Held Products, Inc. | Handheld dimensioning system with feedback |
9677877, | Jun 23 2015 | Hand Held Products, Inc. | Dual-projector three-dimensional scanner |
9678536, | Dec 18 2014 | Hand Held Products, Inc. | Flip-open wearable computer |
9679178, | Dec 26 2014 | Hand Held Products, Inc. | Scanning improvements for saturated signals using automatic and fixed gain control methods |
9680282, | Nov 17 2015 | Hand Held Products, Inc. | Laser aiming for mobile devices |
9682625, | May 24 2013 | Hand Held Products, Inc. | System and method for display of information using a vehicle-mount computer |
9684809, | Oct 29 2015 | Hand Held Products, Inc. | Scanner assembly with removable shock mount |
9685049, | Dec 30 2014 | Hand Held Products, Inc. | Method and system for improving barcode scanner performance |
9689664, | Aug 06 2014 | Hand Held Products, Inc. | Dimensioning system with guided alignment |
9693038, | Apr 21 2015 | Hand Held Products, Inc.; HAND HELD PRODUCTS, INC | Systems and methods for imaging |
9697401, | Nov 24 2015 | Hand Held Products, Inc. | Add-on device with configurable optics for an image scanner for scanning barcodes |
9697403, | Jan 08 2014 | Hand Held Products, Inc. | Indicia-reader having unitary-construction |
9701140, | Sep 20 2016 | HAND HELD PRODUCTS, INC | Method and system to calculate line feed error in labels on a printer |
9719775, | Oct 21 2014 | Hand Held Products, Inc. | Handheld dimensioning system with feedback |
9721132, | Dec 31 2014 | Hand Held Products, Inc. | Reconfigurable sled for a mobile device |
9721135, | Oct 10 2014 | Hand Held Products, Inc. | Depth sensor based auto-focus system for an indicia scanner |
9726475, | Mar 13 2013 | Intermec IP Corp. | Systems and methods for enhancing dimensioning |
9727083, | Oct 19 2015 | Hand Held Products, Inc. | Quick release dock system and method |
9727769, | Dec 22 2014 | Hand Held Products, Inc. | Conformable hand mount for a mobile scanner |
9727840, | Jan 04 2016 | Hand Held Products, Inc. | Package physical characteristic identification system and method in supply chain management |
9727841, | May 20 2016 | VOCOLLECT, Inc. | Systems and methods for reducing picking operation errors |
9729744, | Dec 21 2015 | Hand Held Products, Inc. | System and method of border detection on a document and for producing an image of the document |
9734639, | Dec 31 2014 | Hand Held Products, Inc. | System and method for monitoring an industrial vehicle |
9741165, | May 04 2012 | Intermec IP Corp. | Volume dimensioning systems and methods |
9741181, | May 19 2015 | Hand Held Products, Inc. | Evaluating image values |
9743731, | Dec 18 2014 | HAND HELD PRODUCTS, INC | Wearable sled system for a mobile computer device |
9752864, | Oct 21 2014 | Hand Held Products, Inc. | Handheld dimensioning system with feedback |
9761096, | Dec 18 2014 | Hand Held Products, Inc. | Active emergency exit systems for buildings |
9767337, | Sep 30 2015 | HAND HELD PRODUCTS, INC | Indicia reader safety |
9767581, | Dec 12 2014 | Hand Held Products, Inc. | Auto-contrast viewfinder for an indicia reader |
9773142, | Jul 22 2013 | Hand Held Products, Inc.; HAND HELD PRODUCTS, INC | System and method for selectively reading code symbols |
9774940, | Dec 27 2014 | Hand Held Products, Inc. | Power configurable headband system and method |
9779276, | Oct 10 2014 | HAND HELD PRODUCTS, INC | Depth sensor based auto-focus system for an indicia scanner |
9779546, | May 04 2012 | Intermec IP CORP | Volume dimensioning systems and methods |
9781502, | Sep 09 2015 | Hand Held Products, Inc. | Process and system for sending headset control information from a mobile device to a wireless headset |
9781681, | Aug 26 2015 | Hand Held Products, Inc. | Fleet power management through information storage sharing |
9784566, | Mar 13 2013 | Intermec IP Corp. | Systems and methods for enhancing dimensioning |
9785814, | Sep 23 2016 | Hand Held Products, Inc. | Three dimensional aimer for barcode scanning |
9786101, | May 19 2015 | Hand Held Products, Inc. | Evaluating image values |
9792582, | Oct 14 2014 | Hand Held Products, Inc. | Identifying inventory items in a storage facility |
9794392, | Jul 10 2014 | Hand Held Products, Inc. | Mobile-phone adapter for electronic transactions |
9798413, | Aug 27 2015 | Hand Held Products, Inc. | Interactive display |
9800293, | Nov 08 2013 | HAND HELD PRODUCTS, INC | System for configuring indicia readers using NFC technology |
9800860, | Oct 21 2014 | Hand Held Products, Inc. | Dimensioning system with feedback |
9802427, | Jan 18 2017 | HAND HELD PRODUCTS, INC | Printers and methods for detecting print media thickness therein |
9804013, | Jul 07 2015 | Hand Held Products, Inc. | Mobile dimensioner apparatus for use in commerce |
9805237, | Sep 18 2015 | Hand Held Products, Inc. | Cancelling noise caused by the flicker of ambient lights |
9805257, | Sep 07 2016 | HAND HELD PRODUCTS, INC | Printer method and apparatus |
9805343, | Jan 05 2016 | Intermec Technologies Corporation | System and method for guided printer servicing |
9811650, | Dec 31 2014 | Hand Held Products, Inc. | User authentication system and method |
9823059, | Aug 06 2014 | Hand Held Products, Inc. | Dimensioning system with guided alignment |
9826106, | Dec 30 2014 | Hand Held Products, Inc. | System and method for detecting barcode printing errors |
9826220, | Oct 21 2014 | Hand Held Products, Inc. | Dimensioning system with feedback |
9827796, | Jan 03 2017 | HAND HELD PRODUCTS, INC | Automatic thermal printhead cleaning system |
9830488, | Dec 30 2014 | Hand Held Products, Inc. | Real-time adjustable window feature for barcode scanning and process of scanning barcode with adjustable window feature |
9835486, | Jul 07 2015 | Hand Held Products, Inc. | Mobile dimensioner apparatus for use in commerce |
9841311, | Oct 16 2012 | HAND HELD PRODUCTS, INC | Dimensioning system |
9843660, | Dec 29 2014 | Hand Held Products, Inc. | Tag mounted distributed headset with electronics module |
9844158, | Dec 18 2015 | Honeywell International, Inc | Battery cover locking mechanism of a mobile terminal and method of manufacturing the same |
9844956, | Oct 07 2015 | Intermec Technologies Corporation | Print position correction |
9849691, | Jan 26 2017 | HAND HELD PRODUCTS, INC | Detecting printing ribbon orientation |
9852102, | Apr 15 2015 | Hand Held Products, Inc.; HAND HELD PRODUCTS, INC | System for exchanging information between wireless peripherals and back-end systems via a peripheral hub |
9853575, | Aug 12 2015 | Hand Held Products, Inc. | Angular motor shaft with rotational attenuation |
9857167, | Jun 23 2015 | Hand Held Products, Inc. | Dual-projector three-dimensional scanner |
9861182, | Feb 05 2015 | Hand Held Products, Inc. | Device for supporting an electronic tool on a user's hand |
9864887, | Jul 07 2016 | Hand Held Products, Inc. | Energizing scanners |
9864891, | Nov 24 2015 | Intermec Technologies Corporation | Automatic print speed control for indicia printer |
9876923, | Oct 27 2015 | Intermec Technologies Corporation | Media width sensing |
9876957, | Jun 21 2016 | Hand Held Products, Inc. | Dual mode image sensor and method of using same |
9879823, | Dec 31 2014 | Hand Held Products, Inc. | Reclosable strap assembly |
9880268, | Jun 07 2013 | Hand Held Products, Inc. | Method of error correction for 3D imaging device |
9881194, | Sep 19 2016 | Hand Held Products, Inc. | Dot peen mark image acquisition |
9883063, | Oct 27 2015 | Intermec Technologies Corporation | Media width sensing |
9891612, | May 05 2015 | Hand Held Products, Inc. | Intermediate linear positioning |
9892356, | Oct 27 2016 | Hand Held Products, Inc. | Backlit display detection and radio signature recognition |
9892876, | Jun 16 2015 | Hand Held Products, Inc. | Tactile switch for a mobile electronic device |
9897434, | Oct 21 2014 | Hand Held Products, Inc. | Handheld dimensioning system with measurement-conformance feedback |
9897441, | Oct 04 2012 | HAND HELD PRODUCTS, INC | Measuring object dimensions using mobile computer |
9898635, | Dec 30 2014 | Hand Held Products, Inc. | Point-of-sale (POS) code sensing apparatus |
9902175, | Aug 02 2016 | HAND HELD PRODUCTS, INC | Thermal printer having real-time force feedback on printhead pressure and method of using same |
9908351, | Feb 27 2017 | HAND HELD PRODUCTS, INC | Segmented enclosure |
9909858, | Oct 21 2014 | Hand Held Products, Inc. | Handheld dimensioner with data-quality indication |
9911023, | Aug 17 2015 | Hand Held Products, Inc. | Indicia reader having a filtered multifunction image sensor |
9911192, | Jun 10 2016 | Hand Held Products, Inc. | Scene change detection in a dimensioner |
9911295, | Jun 27 2014 | Hand Held Products, Inc. | Cordless indicia reader with a multifunction coil for wireless charging and EAS deactivation |
9916488, | Sep 23 2015 | Intermec Technologies Corporation | Evaluating images |
9919547, | Aug 04 2016 | HAND HELD PRODUCTS, INC | System and method for active printing consistency control and damage protection |
9924006, | Oct 31 2014 | Hand Held Products, Inc.; HAND HELD PRODUCTS, INC | Adaptable interface for a mobile computing device |
9930050, | Apr 01 2015 | Hand Held Products, Inc. | Device management proxy for secure devices |
9930142, | May 24 2013 | Hand Held Products, Inc. | System for providing a continuous communication link with a symbol reading device |
9931867, | Sep 23 2016 | HAND HELD PRODUCTS, INC | Method and system of determining a width of a printer ribbon |
9935946, | Dec 16 2015 | Hand Held Products, Inc. | Method and system for tracking an electronic device at an electronic device docking station |
9936278, | Oct 03 2016 | VOCOLLECT, Inc.; VOCOLLECT, INC | Communication headsets and systems for mobile application control and power savings |
9937735, | Apr 20 2017 | HAND HELD PRODUCTS, INC | Self-strip media module |
9939259, | Oct 04 2012 | HAND HELD PRODUCTS, INC | Measuring object dimensions using mobile computer |
9940497, | Aug 16 2016 | Hand Held Products, Inc. | Minimizing laser persistence on two-dimensional image sensors |
9940721, | Jun 10 2016 | Hand Held Products, Inc. | Scene change detection in a dimensioner |
9945777, | Jan 14 2016 | Hand Held Products, Inc. | Multi-spectral imaging using longitudinal chromatic aberrations |
9946962, | Sep 13 2016 | HAND HELD PRODUCTS, INC | Print precision improvement over long print jobs |
9949005, | Jun 18 2015 | Hand Held Products, Inc. | Customizable headset |
9952356, | Apr 29 2014 | Hand Held Products, Inc. | Autofocus lens system |
9953296, | Jan 11 2013 | HAND HELD PRODUCTS, INC | System, method, and computer-readable medium for managing edge devices |
9954871, | May 06 2015 | Hand Held Products, Inc. | Method and system to protect software-based network-connected devices from advanced persistent threat |
9955072, | Mar 09 2016 | Hand Held Products, Inc. | Imaging device for producing high resolution images using subpixel shifts and method of using same |
9955099, | Jun 21 2016 | Hand Held Products, Inc. | Minimum height CMOS image sensor |
9955522, | Jul 07 2015 | Hand Held Products, Inc. | WiFi enable based on cell signals |
9965694, | May 15 2012 | Honeywell International Inc. | Terminals and methods for dimensioning objects |
9975324, | Oct 13 2015 | Intermec Technologies Corporation | Magnetic media holder for printer |
9976848, | Aug 06 2014 | Hand Held Products, Inc. | Dimensioning system with guided alignment |
9978088, | May 08 2015 | Hand Held Products, Inc. | Application independent DEX/UCS interface |
9983588, | Jan 27 2016 | Hand Held Products, Inc. | Vehicle positioning and object avoidance |
9984267, | Jan 08 2014 | Hand Held Products, Inc. | Indicia reader having unitary-construction |
9984366, | Jun 09 2017 | Hand Held Products, Inc. | Secure paper-free bills in workflow applications |
9984685, | Nov 07 2014 | Hand Held Products, Inc. | Concatenated expected responses for speech recognition using expected response boundaries to determine corresponding hypothesis boundaries |
9990524, | Jun 16 2016 | Hand Held Products, Inc. | Eye gaze detection controlled indicia scanning system and method |
9990784, | Feb 05 2016 | HAND HELD PRODUCTS, INC | Dynamic identification badge |
9997935, | Jan 08 2015 | HAND HELD PRODUCTS, INC | System and method for charging a barcode scanner |
D792407, | Jun 02 2015 | Hand Held Products, Inc. | Mobile computer housing |
Patent | Priority | Assignee | Title |
5576531, | Jan 31 1994 | PSC Inc | Hand held bar code scanning device having a manually operated optical trigger switch |
5811784, | Jun 26 1995 | Symbol Technologies, LLC | Extended working range dataform reader |
6428179, | Nov 09 1999 | Illuminable writing instrument | |
6742913, | Jan 09 2001 | THEORY3, INC | Motion activated decorative light |
6853293, | May 28 1993 | Symbol Technologies, Inc. | Wearable communication system |
7222985, | May 03 2000 | Illuminated article-locator | |
7494063, | Nov 13 2003 | Metrologic Instruments, Inc. | Automatic imaging-based code symbol reading system supporting a multi-tier modular software architecture, automatic illumination control, and video image capture and processing techniques |
7753269, | Jan 11 2002 | Metrologic Instruments, Inc. | POS-based code driven retail transaction system configured to enable the reading of code symbols on cashier and customer sides thereof, during a retail transaction being carried out at a point-of-sale (POS) station, and driven by a retail transaction application program |
Executed on | Assignor | Assignee | Conveyance | Reel | Frame | Doc |
Jan 11 2008 | KNOWLES, C HARRY | Metrologic Instruments, Inc | ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS) | 026863 | /0525 |
Jan 28 2008 | ZHU, XIAOXUN | Metrologic Instruments, Inc | ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS) | 026863 | /0525 |
Jan 29 2008 | DE FONEY, SHAWN | Metrologic Instruments, Inc | ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS) | 026863 | /0525 |
Jan 29 2008 | MANDAL, SUDHIN | Metrologic Instruments, Inc | ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS) | 026863 | /0525 |
Jan 29 2008 | MIRAGLIA, MICHAEL | Metrologic Instruments, Inc | ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS) | 026863 | /0525 |
Jan 29 2008 | REN, JIE | Metrologic Instruments, Inc | ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS) | 026863 | /0525 |
Jan 29 2008 | YAN, WEIZHEN | Metrologic Instruments, Inc | ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS) | 026863 | /0525 |
Jan 29 2008 | GIORDANO, PATRICK | Metrologic Instruments, Inc | ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS) | 026863 | /0525 |
Jan 29 2008 | VEKSLAND, MICHAEL | Metrologic Instruments, Inc | ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS) | 026863 | /0525 |
Jan 29 2008 | KOTLARSKY, ANATOLY | Metrologic Instruments, Inc | ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS) | 026863 | /0525 |
Feb 01 2008 | ALLEN, CHRISTOPHER | Metrologic Instruments, Inc | ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS) | 026863 | /0525 |
Feb 01 2008 | WILZ, DAVID M , SR | Metrologic Instruments, Inc | ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS) | 026863 | /0525 |
Feb 04 2008 | SMITH, TAYLOR | Metrologic Instruments, Inc | ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS) | 026863 | /0525 |
Sep 07 2011 | Metrologic Instruments, Inc. | (assignment on the face of the patent) |
Date | Maintenance Fee Events |
Jan 26 2017 | M1551: Payment of Maintenance Fee, 4th Year, Large Entity. |
Feb 16 2021 | M1552: Payment of Maintenance Fee, 8th Year, Large Entity. |
Date | Maintenance Schedule |
Aug 27 2016 | 4 years fee payment window open |
Feb 27 2017 | 6 months grace period start (w/ surcharge) |
Aug 27 2017 | patent expiry (for year 4) |
Aug 27 2019 | 2 years to revive unintentionally abandoned end. (for year 4) |
Aug 27 2020 | 8 years fee payment window open |
Feb 27 2021 | 6 months grace period start (w/ surcharge) |
Aug 27 2021 | patent expiry (for year 8) |
Aug 27 2023 | 2 years to revive unintentionally abandoned end. (for year 8) |
Aug 27 2024 | 12 years fee payment window open |
Feb 27 2025 | 6 months grace period start (w/ surcharge) |
Aug 27 2025 | patent expiry (for year 12) |
Aug 27 2027 | 2 years to revive unintentionally abandoned end. (for year 12) |