A method of detecting an edge of a support surface by an imaging controller includes: obtaining a plurality of depth measurements captured by a depth sensor and corresponding to an area containing the support surface; selecting, by the imaging controller, a candidate set of the depth measurements based on at least one of (i) an expected proximity of the edge of the support surface to the depth sensor, and (ii) an expected orientation of the edge of the support surface relative to the depth sensor; fitting, by the imaging controller, a guide element to the candidate set of depth measurements; and detecting, by the imaging controller, an output set of the depth measurements corresponding to the edge from the candidate set of depth measurements according to a proximity between each candidate depth measurement and the guide element.
1. A method of detecting an edge of a shelf by an imaging controller, the edge of the shelf facing into an aisle, the method comprising:
obtaining a plurality of depth measurements captured by a depth sensor positioned in the aisle, the depth measurements defining distances from the depth sensor to respective points on the shelf;
selecting, by the imaging controller, a candidate set of the depth measurements based on at least one of (i) an expected proximity of the edge of the shelf to the depth sensor, and (ii) an expected orientation of the edge of the shelf relative to the depth sensor;
fitting, by the imaging controller, a guide element to the candidate set of depth measurements; and
detecting, by the imaging controller, an output set of the depth measurements that lie on the edge of the shelf from the candidate set of depth measurements according to a proximity between each candidate depth measurement and the guide element.
2. The method of
3. The method of
4. The method of
wherein selecting the candidate set further comprises, for each sweep angle, selecting a single minimum depth measurement from the plurality of groups corresponding to the sweep angle.
5. The method of
6. The method of
7. The method of
determining, for each candidate depth measurement, a distance between the candidate depth measurement and the guide element;
identifying local minima among the distances; and
selecting the candidate depth measurements corresponding to the local minima.
8. The method of
9. The method of
subdividing the depth image into a plurality of patches;
generating normal vectors for each of the patches; and
selecting the depth measurements contained in patches having normal vectors with a predetermined orientation.
10. The method of
at each of a predetermined sequence of depth ranges, fitting a plane to the candidate depth measurements within the depth range; and
selecting one of the planes intersecting the greatest number of the candidate depth measurements.
11. The method of
12. A computing device for detecting an edge of a shelf, the edge facing into an aisle, the computing device comprising:
a memory; and
an imaging controller including:
a preprocessor configured to obtain a plurality of depth measurements captured by a depth sensor positioned in the aisle, the depth measurements defining distances from the depth sensor to respective points on the shelf;
a selector configured to select a candidate set of the depth measurements based on at least one of (i) an expected proximity of the edge of the shelf to the depth sensor, and (ii) an expected orientation of the edge of the shelf relative to the depth sensor;
a guide generator configured to fit a guide element to the candidate set of depth measurements; and
an output detector configured to detect an output set of the depth measurements that lie on the edge of the shelf from the candidate set of depth measurements according to a proximity between each candidate depth measurement and the guide element.
13. The computing device of
14. The computing device of
15. The computing device of
wherein the selector is further configured to select the candidate set by, for each sweep angle, selecting a single minimum depth measurement from the plurality of groups corresponding to the sweep angle.
16. The computing device of
17. The computing device of
minimizing a depth of the curve; and
maximizing a population of the candidate depth measurements intersected by the curve.
18. The computing device of
determining, for each candidate depth measurement, a distance between the candidate depth measurement and the guide element;
identifying local minima among the distances; and
selecting the candidate depth measurements corresponding to the local minima.
19. The computing device of
20. The computing device of
subdividing the depth image into a plurality of patches;
generating normal vectors for each of the patches; and
selecting the depth measurements contained in patches having normal vectors with a predetermined orientation.
21. The computing device of
at each of a predetermined sequence of depth ranges, fitting a plane to the candidate depth measurements within the depth range; and
selecting one of the planes intersecting the greatest number of the candidate depth measurements.
22. The computing device of
Environments in which inventories of objects are managed, such as products for purchase in a retail environment, may be complex and fluid. For example, a given environment may contain a wide variety of objects with different sizes, shapes, and other attributes. Such objects may be supported on shelves in a variety of positions and orientations. The variable position and orientation of the objects, as well as variations in lighting and the placement of labels and other indicia on the objects and the shelves, can render detection of structural features such as the edges of the shelves difficult.
The accompanying figures, where like reference numerals refer to identical or functionally similar elements throughout the separate views, together with the detailed description below, are incorporated in and form part of the specification, and serve to further illustrate embodiments of concepts that include the claimed invention, and explain various principles and advantages of those embodiments.
Skilled artisans will appreciate that elements in the figures are illustrated for simplicity and clarity and have not necessarily been drawn to scale. For example, the dimensions of some of the elements in the figures may be exaggerated relative to other elements to help to improve understanding of embodiments of the present invention.
The apparatus and method components have been represented where appropriate by conventional symbols in the drawings, showing only those specific details that are pertinent to understanding the embodiments of the present invention so as not to obscure the disclosure with details that will be readily apparent to those of ordinary skill in the art having the benefit of the description herein.
In retail environments in which a plurality of products is supported on shelves, systems may be configured to capture images of the shelves and determine, from the images, various information concerning the products. For example, price labels may be located and decoded within the image, for use in ensuring that products are labelled with the correct prices. Further, gaps between products on the shelves may be identified as potentially indicating that one or more products are out of stock and require replenishment. The above determinations may require the identification of distances between the capture device and the shelf edges to describe the three-dimensional structure of the shelf edges, for use as reference structures for the identification of labels, products, gaps, and the like.
The identification of shelf edges from depth measurements is complicated by a variety of factors, including the proximity to the shelf edges of products having a wide variety of shapes and orientations. Such factors also include lighting variations, reflections, obstructions from products or other objects, and the like.
Examples disclosed herein are directed to a method of detecting an edge of a support surface by an imaging controller. The method includes: obtaining a plurality of depth measurements captured by a depth sensor and corresponding to an area containing the support surface; selecting, by the imaging controller, a candidate set of the depth measurements based on at least one of (i) an expected proximity of the edge of the support surface to the depth sensor, and (ii) an expected orientation of the edge of the support surface relative to the depth sensor; fitting, by the imaging controller, a guide element to the candidate set of depth measurements; and detecting, by the imaging controller, an output set of the depth measurements corresponding to the edge from the candidate set of depth measurements according to a proximity between each candidate depth measurement and the guide element.
Further examples disclosed herein are directed to a computing device for detecting an edge of a support surface, comprising: a memory; and an imaging controller including: a preprocessor configured to obtain a plurality of depth measurements captured by a depth sensor and corresponding to an area containing the support surface; a selector configured to select a candidate set of the depth measurements based on at least one of (i) an expected proximity of the edge of the support surface to the depth sensor, and (ii) an expected orientation of the edge of the support surface relative to the depth sensor; a guide generator configured to fit a guide element to the candidate set of depth measurements; and an output detector configured to detect an output set of the depth measurements corresponding to the edge from the candidate set of depth measurements according to a proximity between each candidate depth measurement and the guide element.
The client computing device 105 is illustrated as a mobile computing device, such as a tablet or a smart phone.
The system 100 is deployed, in the illustrated example, in a retail environment including a plurality of shelf modules 110-1, 110-2, 110-3 and so on (collectively referred to as shelves 110, and generically referred to as a shelf 110—this nomenclature is also employed for other elements discussed herein). Each shelf module 110 supports a plurality of products 112. Each shelf module 110 includes a shelf back 116-1, 116-2, 116-3 and a support surface (e.g. the support surface 117-3) extending from the shelf back 116 to a shelf edge 118-1, 118-2, 118-3 facing into the aisle.
More specifically, the apparatus 103 is deployed within the retail environment, and communicates with the server 101 (via the link 107) to navigate, autonomously or partially autonomously, the length 119 of at least a portion of the shelves 110. The apparatus 103 is equipped with a plurality of navigation and data capture sensors 104, such as image sensors (e.g. one or more digital cameras) and depth sensors (e.g. one or more Light Detection and Ranging (LIDAR) sensors, one or more depth cameras employing structured light patterns, such as infrared light), and is further configured to employ the sensors to capture shelf data. In the present example, the apparatus 103 is configured to capture a plurality of depth measurements corresponding to the shelves 110, each measurement defining a distance from the depth sensor to a point on the shelf 110 (e.g., a product 112 disposed on the shelf 110 or a structural component of the shelf 110, such as a shelf edge 118 or a shelf back 116).
The server 101 includes a special purpose imaging controller, such as a processor 120, specifically designed to control the mobile automation apparatus 103 to capture data (e.g. the above-mentioned depth measurements), obtain the captured data via a communications interface 124 and store the captured data in a repository 132 in a memory 122. The server 101 is further configured to perform various post-processing operations on the captured data and to detect certain structural features—such as the shelf edges 118—within the captured data. The post-processing of captured data by the server 101 will be discussed below in greater detail. The server 101 may also be configured to determine product status data based in part on the above-mentioned shelf edge detections, and to transmit status notifications (e.g. notifications indicating that products are out-of-stock, low stock or misplaced) to the mobile device 105 responsive to the determination of product status data.
The processor 120 is interconnected with a non-transitory computer readable storage medium, such as the above-mentioned memory 122, having stored thereon computer readable instructions for executing control of the apparatus 103 to capture data, as well as the above-mentioned post-processing functionality, as discussed in further detail below. The memory 122 includes a combination of volatile (e.g. Random Access Memory or RAM) and non-volatile memory (e.g. read only memory or ROM, Electrically Erasable Programmable Read Only Memory or EEPROM, flash memory). The processor 120 and the memory 122 each comprise one or more integrated circuits. In an embodiment, the processor 120 further includes one or more central processing units (CPUs) and/or graphics processing units (GPUs). In an embodiment, a specially designed integrated circuit, such as a Field Programmable Gate Array (FPGA), is designed to perform the shelf edge detection discussed herein, either as an alternative or in addition to the imaging controller/processor 120 and memory 122. As those of skill in the art will realize, the mobile automation apparatus 103 also includes one or more controllers or processors and/or FPGAs, in communication with the controller 120, specifically configured to control navigational and/or data capture aspects of the apparatus 103. The client device 105 also includes one or more controllers or processors and/or FPGAs, in communication with the controller 120, specifically configured to process (e.g. to display) notifications received from the server 101.
The server 101 also includes the above-mentioned communications interface 124 interconnected with the processor 120. The communications interface 124 includes suitable hardware (e.g. transmitters, receivers, network interface controllers and the like) allowing the server 101 to communicate with other computing devices—particularly the apparatus 103, the client device 105 and the dock 108—via the links 107 and 109. The links 107 and 109 may be direct links, or links that traverse one or more networks, including both local and wide-area networks. The specific components of the communications interface 124 are selected based on the type of network or other links that the server 101 is required to communicate over. In the present example, as noted earlier, a wireless local-area network is implemented within the retail environment via the deployment of one or more wireless access points. The links 107 therefore include either or both wireless links between the apparatus 103 and the mobile device 105 and the above-mentioned access points, and a wired link (e.g. an Ethernet-based link) between the server 101 and the access point.
The memory 122 stores a plurality of applications, each including a plurality of computer readable instructions executable by the processor 120. The execution of the above-mentioned instructions by the processor 120 configures the server 101 to perform various actions discussed herein. The applications stored in the memory 122 include a control application 128, which may also be implemented as a suite of logically distinct applications. In general, via execution of the control application 128 or subcomponents thereof, the processor 120 is configured to implement various functionality. The processor 120, as configured via the execution of the control application 128, is also referred to herein as the controller 120. As will now be apparent, some or all of the functionality implemented by the controller 120 described below may also be performed by preconfigured hardware elements (e.g. one or more Application-Specific Integrated Circuits (ASICs)) rather than by execution of the control application 128 by the processor 120.
Turning now to the mobile automation apparatus 103 in greater detail, the apparatus 103 includes a chassis 201 supporting a sensor mast 205.
In the present example, the mast 205 supports seven digital cameras 207-1 through 207-7, and two LIDAR sensors 211-1 and 211-2. The mast 205 also supports a plurality of illumination assemblies 213, configured to illuminate the fields of view of the respective cameras 207. That is, the illumination assembly 213-1 illuminates the field of view of the camera 207-1, and so on. The sensors 207 and 211 are oriented on the mast 205 such that the field of view of each sensor faces a shelf 110 along the length 119 of which the apparatus 103 is travelling. The apparatus 103 is configured to track a location of the apparatus 103 (e.g. a location of the center of the chassis 201) in a common frame of reference previously established in the retail facility, permitting data captured by the mobile automation apparatus to be registered to the common frame of reference.
To that end, the mobile automation apparatus 103 includes a special-purpose controller, such as a processor 220, interconnected with a non-transitory computer readable storage medium, such as a memory 222. The memory 222 stores an application 228 comprising computer readable instructions executable by the processor 220 to control the navigational and data capture functions of the apparatus 103.
The processor 220, when so configured by the execution of the application 228, may also be referred to as a controller 220 or, in the context of shelf edge detection from captured data, as an imaging controller 220. Those skilled in the art will appreciate that the functionality implemented by the processor 220 via the execution of the application 228 may also be implemented by one or more specially designed hardware and firmware components, such as FPGAs, ASICs and the like in other embodiments.
The memory 222 may also store a repository 232 containing, for example, a map of the environment in which the apparatus 103 operates, for use during the execution of the application 228. The apparatus 103 may communicate with the server 101, for example to receive instructions to initiate data capture operations, via a communications interface 224 over the link 107 shown in
In the present example, as discussed below, one or both of the server 101 (as configured via the execution of the control application 128 by the processor 120) and the mobile automation apparatus 103 (as configured via the execution of the application 228 by the processor 220), are configured to process depth measurements captured by the apparatus 103 to identify portions of the captured data depicting the shelf edges 118. In further examples, the data processing discussed below may be performed on a computing device other than the server 101 and the mobile automation apparatus 103, such as the client device 105. The data processing mentioned above will be described in greater detail in connection with its performance at the server 101, via execution of the application 128.
The control application 128 includes a preprocessor 200 configured to obtain depth measurements corresponding to the shelves 110 and the products 112 supported thereon, and to preprocess the depth measurements, for example by filtering the depth measurements prior to downstream processing operations. The control application 128 also includes a selector 204 configured to select a candidate set of depth measurements from the preprocessed depth measurements (i.e., the output of the preprocessor 200). As will be discussed below, the candidate set of depth measurements are depth measurements considered likely to correspond to shelf edges 118. The control application 128 also includes a guide generator 208 configured to generate a guide element (such as a curve or a plane) against which the above-mentioned candidate set of depth measurements is evaluated by an output detector 212 to detect an output set among the candidate set of depth measurements. The output set of depth measurements contains the depth measurements considered to have the greatest likelihood of corresponding to shelf edges 118.
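Before the individual blocks are walked through in detail, the division of labor among these components can be summarized in code. The sketch below is a deliberately simplified, hypothetical Python rendering of the four components; every name, signature, and data shape is an assumption made for illustration (the guide element, for instance, is reduced to a single representative depth), not the actual structure of the application 128.

```python
# Hypothetical sketch of the stages of the control application 128.
# All names, signatures and data shapes are illustrative assumptions.
from typing import List, Tuple

Point = Tuple[float, float, float]  # (x, y, depth) in the sensor frame


def preprocess(raw: List[Point]) -> List[Point]:
    """Preprocessor 200: e.g. drop invalid (zero-depth) measurements."""
    return [p for p in raw if p[2] > 0.0]


def select_candidates(points: List[Point]) -> List[Point]:
    """Selector 204: keep measurements likely to lie on a shelf edge 118
    (by expected proximity and/or orientation; concrete sketches follow)."""
    return points  # placeholder for the lidar and depth-image strategies


def fit_guide_element(candidates: List[Point]) -> float:
    """Guide generator 208: fit a guide element to the candidate set,
    reduced here to a single representative depth for brevity."""
    return min(p[2] for p in candidates)


def detect_output_set(candidates: List[Point], guide: float,
                      tolerance: float = 0.02) -> List[Point]:
    """Output detector 212: retain candidates close to the guide element."""
    return [p for p in candidates if abs(p[2] - guide) <= tolerance]


raw = [(0.0, 0.0, 1.00), (0.0, 1.0, 0.55), (1.0, 1.0, 0.56)]  # invented data
candidates = select_candidates(preprocess(raw))
edges = detect_output_set(candidates, fit_guide_element(candidates))
```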
The functionality of the control application 128 will now be described in greater detail, with reference to a shelf edge detection method 300 performed by the server 101 via the execution of the application 128.
At block 305, the controller 120, and in particular the preprocessor 200, is configured to obtain a plurality of depth measurements captured by a depth sensor and corresponding to an area containing the above-mentioned support surface. In other words, in the present example the depth measurements correspond to an area containing at least one shelf support surface 117 and shelf edge 118. The depth measurements obtained at block 305 are, for example, captured by the apparatus 103 and stored in the repository 132. The preprocessor 200 is therefore configured, in the above example, to obtain the depth measurements by retrieving the measurements from the repository 132.
The depth measurements can take a variety of forms, according to the depth sensor employed (e.g. by the apparatus 103) to capture the measurements. For example, the apparatus 103 can include a lidar sensor, and the depth measurements therefore include one or more lidar scans captured as the apparatus 103 travels the length of an aisle (i.e., a set of adjacent shelf modules 110). The lidar sensor of the apparatus 103 captures depth measurements by sweeping a line of laser light across the shelves 110 through a predetermined set of sweep angles and determining, for each of the sweep angles, a group of depth measurements along the line.
Thus, in the illustrated example, the lidar sensor 404 of the apparatus 103 captures a sequence of scans 416-1 through 416-4, each scan containing a group of depth measurements for each of the sweep angles (e.g., the measurements d-60-1 through d-60-20 for a sweep angle of −60 degrees in the scan 416-1).
In other examples, the apparatus 103 captures the depth measurements with the use of a depth camera 504, such as a stereoscopic camera including a structured light projector (e.g. which projects a pattern of infrared light onto the shelves 110). In such examples, the depth measurements take the form of one or more depth images, such as the image 516-1 referred to below, in which each pixel defines a depth measurement.
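For concreteness, the two capture forms might be held in memory as follows. This is a sketch under assumed layouts (an angle-keyed mapping per lidar scan, and a 2-D array per depth image); the disclosure does not prescribe any particular format, and the numeric values below are invented.

```python
# Assumed in-memory layouts for the two capture forms; values are invented.
import numpy as np

# Lidar: each scan maps a sweep angle (degrees) to the group of depth
# measurements (metres) captured along the projected line at that angle.
lidar_scans = [
    {-60.0: [1.02, 1.01, 0.55, 0.56, 1.00],
     -55.0: [1.03, 0.54, 0.55, 1.02, 1.01]},
    {-60.0: [1.01, 1.02, 0.55, 0.57, 1.00],
     -55.0: [1.04, 0.55, 0.54, 1.03, 1.00]},
]

# Depth camera: an image in which each pixel holds the depth (e.g. along
# the Z axis) of the imaged point.
depth_image = np.full((480, 640), 1.0, dtype=np.float32)
depth_image[200:280, :] = 0.55  # a nearer horizontal band, loosely standing
                                # in for a shelf edge facing the camera
```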
Returning to the method 300, the preprocessor 200 may also be configured, at block 305, to preprocess the obtained depth measurements, for example by filtering them prior to downstream processing operations.
The control application 128 is then configured to proceed to block 310. At block 310, the control application 128, and more specifically the selector 204, is configured to select a candidate set of the depth measurements obtained at block 305. The candidate set of depth measurements is selected based on at least one of an expected proximity of the shelf edge 118 to the depth sensor, and an expected orientation of the shelf edge 118 relative to the depth sensor. As will be discussed in greater detail below, the apparatus 103 is assumed by the selector 204 to travel in a direction substantially parallel to the shelf edge 118. As a result, the distance between the sensor (e.g. the lidar sensor 404 or the depth camera 504) and the shelf edge 118 is expected to remain substantially constant throughout the captured data. Further, because the support surfaces 117 extend from the shelf backs 116 toward the aisle in which the apparatus 103 travels, the shelf edges 118 are expected to be closer to the apparatus 103 than other structures (e.g. products 112) depicted in the captured data. Still further, each shelf edge 118 is assumed to have a known orientation. For example, each shelf edge 118 may be expected to be a substantially vertical surface. As will be seen below, when the data obtained at block 305 is captured with the lidar sensor 404, the candidate set of measurements is selected based on an expected proximity to the depth sensor, and when the data obtained at block 305 is captured with the depth camera 504, the candidate set of measurements is selected based on an expected orientation relative to the depth sensor.
Turning first to the selection of the candidate set from lidar data, the selector 204 is configured to perform a method 600, beginning at block 605.
At block 605, the selector 204 is configured to select a sweep angle, such as the sweep angle of −60 degrees corresponding to the values d-60-1 through d-60-20 of the scan 416-1.
At block 610, for the selected sweep angle, the selector 204 is configured to select the minimum depth measurement corresponding to that sweep angle. Thus, referring again to the scan 416-1, the selector 204 is configured to select the minimum depth measurement among the values d-60-1, d-60-2, d-60-3, d-60-19, and d-60-20. In the present example, when the depth measurements obtained at block 305 include a plurality of lidar scans (e.g., the scans 416-1 through 416-4), the above selection is performed separately for each scan.
In some examples, rather than selecting the minimum depth at block 610, the selector 204 is configured to select a representative sample for each sweep angle other than the minimum depth measurement. For example, the selector 204 can be configured to select the median of the depth measurements for each sweep angle. Such an approach may be employed by the selector 204 in some embodiments when the depth measurements captured by the apparatus 103 contain a level of noise above a predefined threshold.
Having selected the minimum depth measurement for the current sweep angle at block 610, the selector 204 is configured to add the selected depth measurement to the candidate set, along with an indication of the sweep angle corresponding to the minimum depth measurement (i.e., the angle selected at block 605). At block 615, the selector 204 is then configured to determine whether any sweep angles remain to be processed. When the determination is affirmative, the performance of the method 600 returns to block 605, and block 610 is repeated for the next sweep angle (e.g., −55 degrees).
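A compact sketch of blocks 605 through 615 is set out below, reusing the assumed angle-keyed scan layout from the earlier snippet and including the median alternative described above; the triple output format is likewise an assumption for illustration.

```python
# Sketch of blocks 605-615: one representative depth per sweep angle per
# scan, using the minimum by default or the median for noisy captures.
from statistics import median
from typing import Dict, List, Tuple


def select_lidar_candidates(
    scans: List[Dict[float, List[float]]],
    use_median: bool = False,
) -> List[Tuple[int, float, float]]:
    """Return (scan_index, sweep_angle, selected_depth) triples."""
    candidates = []
    for scan_index, scan in enumerate(scans):
        for angle in sorted(scan):                               # block 605
            group = scan[angle]
            depth = median(group) if use_median else min(group)  # block 610
            candidates.append((scan_index, angle, depth))
        # block 615: the loop ends once no sweep angles remain
    return candidates


lidar_candidates = select_lidar_candidates(lidar_scans)  # earlier sketch data
```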
When the depth measurements take the form of a depth image (e.g. captured with the depth camera 504), the selector 204 is instead configured to perform a method 650 of selecting the candidate set based on the expected orientation of the shelf edge 118. At block 655, the selector 204 is configured to subdivide the image containing depth measurements into a plurality of overlapping patches. For example, each patch may have dimensions of 3×3 pixels, and overlap adjacent patches by 2 pixels in the vertical and horizontal directions. That is, the patches are selected such that every pixel in the depth image is the center of one patch. In other examples, larger patch dimensions may also be employed (e.g. 5×5 pixels), with a greater degree of overlap to provide one patch centered on each pixel. In further examples, the overlap between patches may be reduced to reduce the computational burden imposed by the performance of the method 650, at the expense of reduced resolution of the candidate set, as will be evident below.
Still at block 655, having selected a patch (e.g. the upper-left patch of 3×3 pixels of the image 516-1), the selector 204 is configured to generate a normal vector corresponding to the patch.
At block 660, the selector 204 is configured to determine whether the normal vector generated at block 655 has a predefined orientation. As noted earlier, when the data obtained at block 305 is captured with the depth camera 504, the candidate set of measurements is selected based on an expected orientation relative to the depth sensor. The expected orientation of the shelf edge 118 relative to the depth sensor (e.g. the camera 504) is, in the present example, that of a substantially vertical surface facing the camera 504, such that the normal vector of the shelf edge 118 points back toward the sensor, substantially parallel to the Z axis.
The selector 204 is therefore configured to perform the determination at block 660 by comparing the normal vector generated at block 655 to the predefined expected orientation.
Returning to the method 650, when the determination at block 660 is affirmative, the selector 204 is configured to add the depth measurements contained in the patch to the candidate set; when the determination is negative, the patch is discarded. The selector 204 repeats the above process until no patches remain to be assessed, such that the candidate set contains the depth measurements of the patches having normal vectors with the predetermined orientation.
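One possible realization of blocks 655 and 660 is sketched below. The normal estimator (a least-squares plane fit over each 3×3 patch) and the angular tolerance are assumptions, as the disclosure fixes neither; pixel indices stand in for real-world X and Y coordinates, and the loop is left unvectorized for clarity.

```python
# Sketch of blocks 655-660: per-patch normals via least-squares plane
# fits (an assumed estimator), followed by the orientation test.
import numpy as np


def select_image_candidates(depth: np.ndarray,
                            max_angle_deg: float = 10.0) -> np.ndarray:
    """Return a mask of pixels whose 3x3 patch normal lies within
    max_angle_deg of the expected (camera-facing) orientation."""
    h, w = depth.shape
    expected = np.array([0.0, 0.0, 1.0])   # normal pointing back at the sensor
    cos_tol = np.cos(np.radians(max_angle_deg))
    mask = np.zeros((h, w), dtype=bool)

    ys, xs = np.mgrid[-1:2, -1:2]          # local patch coordinates
    A = np.column_stack([xs.ravel(), ys.ravel(), np.ones(9)])

    for r in range(1, h - 1):              # one patch centered on each pixel
        for c in range(1, w - 1):
            z = depth[r - 1:r + 2, c - 1:c + 2].ravel()
            coeffs, *_ = np.linalg.lstsq(A, z, rcond=None)  # fit z = ax+by+c
            normal = np.array([-coeffs[0], -coeffs[1], 1.0])
            normal /= np.linalg.norm(normal)
            if normal @ expected >= cos_tol:   # block 660: orientation test
                mask[r, c] = True              # patch center joins candidates
    return mask


# A small crop of the earlier sketch image keeps this unvectorized loop quick.
candidate_mask = select_image_candidates(depth_image[180:300, :40])
```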
Returning to the method 300, at block 315 the guide generator 208 is configured to fit a guide element to the candidate set of depth measurements selected at block 310. For lidar data, the guide element is a curve fitted to the candidate depth measurements, for example by minimizing a depth of the curve while maximizing the population of candidate depth measurements intersected by the curve; for depth image data, the guide element is a plane, generated as described below.
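For lidar data, the curve criteria just described could be approximated with a small randomized search, sketched below over (sweep angle, depth) candidates: sampled triples seed quadratic fits, and the curve intersecting the most candidates wins, with ties broken in favor of shallower curves. The quadratic model, the tolerance, and the scoring rule are all assumptions layered on the stated criteria rather than the patented procedure itself.

```python
# Sketch of a curve-type guide element: maximize intersected candidates,
# then minimize curve depth, over randomly seeded quadratic fits.
import random
import numpy as np


def fit_guide_curve(angles, depths, tol=0.01, trials=200, seed=0):
    """Return quadratic coefficients of an assumed depth(angle) guide curve."""
    angles = np.asarray(angles, dtype=float)
    depths = np.asarray(depths, dtype=float)
    rng = random.Random(seed)
    best_score, best_coeffs = None, None
    for _ in range(trials):
        idx = rng.sample(range(len(angles)), 3)  # three points fix a quadratic
        coeffs = np.polyfit(angles[idx], depths[idx], deg=2)
        fitted = np.polyval(coeffs, angles)
        intersected = int((np.abs(fitted - depths) <= tol).sum())
        score = (intersected, -float(fitted.mean()))  # more inliers, less depth
        if best_score is None or score > best_score:
            best_score, best_coeffs = score, coeffs
    return best_coeffs


test_angles = np.linspace(-60.0, 60.0, 25)   # invented example data
test_depths = 0.9 + 0.0001 * test_angles ** 2
guide_coeffs = fit_guide_curve(test_angles, test_depths)
```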
Referring now to the fitting of a plane-type guide element to the candidate pixels of a depth image, the guide generator 208 is configured to assess a sequence of depth ranges, beginning at block 855.
At block 855, the guide generator 208 is configured to select a depth range. In the present example, the guide generator 208 is configured to assess depth ranges in a sequence beginning with a minimum depth (e.g. a depth of zero, indicating a search volume immediately adjacent to the depth sensor at the time of data capture), with each subsequent depth range increasing by a predefined distance. Thus, for example, the depth ranges assessed may include a depth range of 0 to 0.2 m, 0.2 m to 0.4 m, 0.4 m to 0.6 m, and so on, until a predefined maximum depth (e.g., 2.0 m). Each depth range may contain a subset of the candidate set of pixels selected through the performance of the method 650. The guide generator 208 may also be configured to determine whether the selected depth range contains any candidate pixels, and when it does not, to immediately advance to the next depth range.
At block 860, the guide generator 208 is configured to fit a plane to the subset of candidate pixels contained within the current depth range. The subset includes any of the candidate pixels having depth measurements (e.g., along the Z axis) within the depth range, regardless of the position of such pixels in the image (e.g., the position on the X and Y axes). Turning to an example performance, a subset 1004 of the candidate pixels has depth measurements within the first depth range.
The guide generator 208 is configured to fit a plane 1008 to the subset 1004 according to a suitable plane fitting operation. For example, a plane fitting operation may be selected for the performance of block 860 that maximizes the number of points in the subset 1004 that are intersected by the plane (i.e., that are inliers of the plane).
Returning to the plane fitting process, at block 865 the guide generator 208 is configured to determine whether the plane fitted at block 860 intersects a greater number of the candidate depth measurements than the current best plane (if any).
When the determination at block 865 is negative, the plane generated at block 860 is discarded, and the guide generator 208 determines at block 870 whether any depth ranges remain to be assessed. When the determination at block 865 is affirmative, however, as in the case of the plane 1008, the plane is retained as the current best plane before the guide generator 208 proceeds to block 870.
The guide generator 208 is then configured to determine, at block 870, whether any depth ranges remain to be assessed. In the present example performance, a second depth range remains to be assessed, and the process therefore returns to block 855 for the next depth range.
When all depth ranges have been assessed, a negative determination at block 870 leads the guide generator 208 to pass the current best plane (e.g. as a normal vector and a depth) to the output detector 212 for further processing at block 320 of the method 300.
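In outline, the depth-range sweep of blocks 855 through 870 might look as follows. The total-least-squares plane fit used here (via singular value decomposition) is an assumption standing in for whatever inlier-maximizing fitter an implementation actually uses; a RANSAC-style fitter would satisfy the description equally well.

```python
# Sketch of blocks 855-870: walk fixed depth ranges, fit a plane to the
# candidate pixels in each range, and keep the plane with most inliers.
import numpy as np


def fit_guide_plane(points: np.ndarray, step: float = 0.2,
                    max_depth: float = 2.0, tol: float = 0.01):
    """points: (N, 3) array of candidate (x, y, depth) entries.
    Returns (normal, d) for the plane normal . p = d, or None."""
    best = (0, None)
    for low in np.arange(0.0, max_depth, step):     # block 855: next range
        subset = points[(points[:, 2] >= low) & (points[:, 2] < low + step)]
        if len(subset) < 3:
            continue                                # empty range: skip ahead
        centroid = subset.mean(axis=0)              # block 860: fit a plane
        _, _, vt = np.linalg.svd(subset - centroid)
        normal = vt[-1]                             # least-variance direction
        d = float(normal @ centroid)
        count = int((np.abs(subset @ normal - d) <= tol).sum())
        if count > best[0]:                         # block 865: compare counts
            best = (count, (normal, d))             # retain current best plane
    return best[1]                                  # block 870: ranges exhausted
```

Handed the (x, y, depth) triples of the pixels flagged by the earlier patch-selection sketch, the function returns the plane intersecting the greatest number of candidates, mirroring the selection recited above.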
Returning to the method 300, at block 320 the output detector 212 is configured to detect an output set of the depth measurements corresponding to the shelf edge 118 from the candidate set, according to the proximity between each candidate depth measurement and the guide element.
Turning to the detection of the output set from lidar data, at block 1155 the output detector 212 is configured to determine, for each candidate depth measurement of a given scan, a distance between the candidate depth measurement and the guide element.
At block 1160, the output detector 212 is configured to select local minima among the distances determined at block 1155. The local minima may be selected from a preconfigured range of sweep angles (e.g. one minimum distance may be selected among five consecutive distances).
Returning to the detection process, at block 1165 the output detector 212 is configured to retain only those local minima selected at block 1160 that lie within a predefined proximity of the guide element, discarding the remainder.
At block 1170, the output detector 212 is configured to determine whether any scans remain to be processed. When scans remain to be processed, the performance of blocks 1155-1165 is repeated for each remaining scan. When the determination at block 1170 is negative, the output detector 212 proceeds to block 1175. At block 1175, the output detector is configured to discard any local minimum distances that fail to meet a detection threshold. The detection threshold is a preconfigured number of scans in which a local minimum must be detected at the same sweep angle in order to be retained in the output set of depth measurements. For example, if the detection threshold is three, and local minima are selected for the sweep angle of −55 degrees for only two scans (i.e., the remaining scans do not exhibit local minima at −55 degrees), those local minima are discarded. Discarding local minima that do not meet the detection threshold may prevent the selection of depth measurements for the output set that correspond to measurement artifacts or structural anomalies in the shelf edges 118. In other examples, the performance of block 1175 may be omitted.
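Blocks 1155 through 1175 can be pulled together as in the sketch below, continuing with the assumed structures from the lidar snippets. The retention step of block 1165 is folded into the windowed minimum for brevity, and the final vote assumes that every scan samples the same grid of sweep angles.

```python
# Sketch of blocks 1155-1175: per-scan local minima of candidate-to-guide
# distances, kept only when enough scans agree on the same sweep angle.
from collections import Counter
from typing import List, Tuple

import numpy as np


def detect_lidar_edges(
    scans: List[List[Tuple[float, float]]],   # per scan: (sweep_angle, depth)
    guide_coeffs: np.ndarray,                 # quadratic guide-curve fit
    window: int = 5,                          # neighbourhood for local minima
    detection_threshold: int = 3,             # scans that must agree
) -> List[Tuple[int, float]]:
    """Return (scan_index, sweep_angle) pairs forming the output set."""
    minima: List[Tuple[int, float]] = []
    for scan_index, scan in enumerate(scans):
        scan = sorted(scan)                   # order candidates by sweep angle
        angles = np.array([a for a, _ in scan])
        depths = np.array([d for _, d in scan])
        dist = np.abs(depths - np.polyval(guide_coeffs, angles))  # block 1155
        half = window // 2
        for i, d in enumerate(dist):          # block 1160: windowed minima
            low, high = max(0, i - half), min(len(dist), i + half + 1)
            if d == dist[low:high].min():
                minima.append((scan_index, float(angles[i])))
    votes = Counter(angle for _, angle in minima)   # block 1175: cross-scan
    return [m for m in minima if votes[m[1]] >= detection_threshold]  # vote
```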
Following the performance of block 1175, or following the negative determination at block 1170 if block 1175 is omitted, the output detector 212 is configured to proceed to block 325.
At block 325, the output detector 212 is configured to store the output set of depth measurements. The output set of depth measurements is stored, for example, in the repository 132 in association with the captured data (e.g. the captured lidar scans 416 or depth images 516), and includes at least identifications of the output set of depth measurements. Thus, for lidar data, the output set is stored as a set of sweep angle and line index coordinates corresponding to the local minima selected at block 1160 and retained through blocks 1165 and 1175. For depth image data, the output set is stored as pixel coordinates (e.g. X and Y coordinates) of the inlier pixels identified at block 1100. The output set, as stored in the memory 122, can be passed to further downstream functions of the server 101 or retrieved by such functions.
In the foregoing specification, specific embodiments have been described. However, one of ordinary skill in the art appreciates that various modifications and changes can be made without departing from the scope of the invention as set forth in the claims below. Accordingly, the specification and figures are to be regarded in an illustrative rather than a restrictive sense, and all such modifications are intended to be included within the scope of present teachings.
The benefits, advantages, solutions to problems, and any element(s) that may cause any benefit, advantage, or solution to occur or become more pronounced are not to be construed as a critical, required, or essential features or elements of any or all the claims. The invention is defined solely by the appended claims including any amendments made during the pendency of this application and all equivalents of those claims as issued.
Moreover in this document, relational terms such as first and second, top and bottom, and the like may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. The terms “comprises,” “comprising,” “has”, “having,” “includes”, “including,” “contains”, “containing” or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises, has, includes, contains a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. An element preceded by “comprises . . . a”, “has . . . a”, “includes . . . a”, “contains . . . a” does not, without more constraints, preclude the existence of additional identical elements in the process, method, article, or apparatus that comprises, has, includes, contains the element. The terms “a” and “an” are defined as one or more unless explicitly stated otherwise herein. The terms “substantially”, “essentially”, “approximately”, “about” or any other version thereof, are defined as being close to as understood by one of ordinary skill in the art, and in one non-limiting embodiment the term is defined to be within 10%, in another embodiment within 5%, in another embodiment within 1% and in another embodiment within 0.5%. The term “coupled” as used herein is defined as connected, although not necessarily directly and not necessarily mechanically. A device or structure that is “configured” in a certain way is configured in at least that way, but may also be configured in ways that are not listed.
It will be appreciated that some embodiments may be comprised of one or more generic or specialized processors (or “processing devices”) such as microprocessors, digital signal processors, customized processors and field programmable gate arrays (FPGAs) and unique stored program instructions (including both software and firmware) that control the one or more processors to implement, in conjunction with certain non-processor circuits, some, most, or all of the functions of the method and/or apparatus described herein. Alternatively, some or all functions could be implemented by a state machine that has no stored program instructions, or in one or more application specific integrated circuits (ASICs), in which each function or some combinations of certain of the functions are implemented as custom logic. Of course, a combination of the two approaches could be used.
Moreover, an embodiment can be implemented as a computer-readable storage medium having computer readable code stored thereon for programming a computer (e.g., comprising a processor) to perform a method as described and claimed herein. Examples of such computer-readable storage mediums include, but are not limited to, a hard disk, a CD-ROM, an optical storage device, a magnetic storage device, a ROM (Read Only Memory), a PROM (Programmable Read Only Memory), an EPROM (Erasable Programmable Read Only Memory), an EEPROM (Electrically Erasable Programmable Read Only Memory) and a Flash memory. Further, it is expected that one of ordinary skill, notwithstanding possibly significant effort and many design choices motivated by, for example, available time, current technology, and economic considerations, when guided by the concepts and principles disclosed herein will be readily capable of generating such software instructions and programs and ICs with minimal experimentation.
The Abstract of the Disclosure is provided to allow the reader to quickly ascertain the nature of the technical disclosure. It is submitted with the understanding that it will not be used to interpret or limit the scope or meaning of the claims. In addition, in the foregoing Detailed Description, it can be seen that various features are grouped together in various embodiments for the purpose of streamlining the disclosure. This method of disclosure is not to be interpreted as reflecting an intention that the claimed embodiments require more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive subject matter lies in less than all features of a single disclosed embodiment. Thus the following claims are hereby incorporated into the Detailed Description, with each claim standing on its own as a separately claimed subject matter.