Provided are a method and system for determining an optimal exposure of a structured light based 3D camera. The system includes a projecting means for illuminating a predetermined pattern on a target object, an image capturing means for capturing an image of the target object with the pattern projected thereon, and a processing means for reconstructing 3D data of the target object by identifying the pattern in the images captured by the image capturing means. The system automatically determines an optimal exposure of the structured light based 3D camera system by analyzing the captured image of the target object.
22. A method comprising:
projecting, via a projecting means of a structured light based 3D camera system, a predetermined pattern on a target object;
obtaining, via an image capturing means of the structured light based three dimensional (3D) camera system, a first image of the target object without the predetermined pattern projected thereon, the first image including a first pixel and a second pixel;
obtaining, via the image capturing means, a second image of the target object with the predetermined pattern projected thereon, the second image including a third pixel corresponding to the first pixel of the first image and a fourth pixel corresponding to the second pixel of the first image, the structured light based 3D camera system including a processing means for reconstructing 3D data based on the first and second images;
determining automatically an optimal exposure level of the structured light based 3D camera system using the first image and the second image; and
controlling an exposure level of the structured light based 3D camera system based on the determined optimal exposure level,
wherein the determining automatically the optimal exposure level of the structured light based 3D camera system using the first image and the second image includes
determining, from the first image and the second image, estimated values of pixel intensities for a plurality of exposure levels, and
determining one of the plurality of exposure levels as the optimal exposure level based on the estimated values of pixel intensities.
1. A method for determining an optimal exposure of a structured light based three dimensional (3D) camera system including a projecting means for illuminating a predetermined pattern on a target object, an image capturing means for capturing an image of the target object with the pattern projected thereon, and a processing means for reconstructing 3D data based on the captured image, wherein the method comprises the steps of:
obtaining an image of a target object with a predetermined pattern projected thereon, which is illuminated from a projecting means, and an image of the target object without a predetermined pattern projected thereon; and
determining automatically an optimal exposure level of the structured light based 3D camera system using said two kinds of images, said image of the target object with the predetermined pattern projected thereon and said image of the target object without the predetermined pattern projected thereon; and
controlling an exposure level of the structured light based 3D camera system based on the determined optimal exposure level,
wherein the determining automatically the optimal exposure level of the structured light based 3D camera system using said image of the target object with the predetermined pattern projected thereon and said image of the target object without the predetermined pattern projected thereon includes processing data from at least one characteristic curve associated with the image of the target object with the predetermined pattern projected thereon and processing data from at least one characteristic curve associated with the image of the target object without the predetermined pattern projected thereon to determine the optimal exposure level.
17. A method comprising:
projecting, via a projecting means of a structured light based 3D camera system, a predetermined pattern on a target object;
obtaining, via an image capturing means of the structured light based three dimensional (3D) camera system, a first image of the target object without the predetermined pattern projected thereon, the first image including a first pixel and a second pixel;
obtaining, via the image capturing means, a second image of the target object with the predetermined pattern projected thereon, the second image including a third pixel corresponding to the first pixel of the first image and a fourth pixel corresponding to the second pixel of the first image, the structured light based 3D camera system including a processing means for reconstructing 3D data based on the first and second images;
determining automatically an optimal exposure level of the structured light based 3D camera system using the first image and the second image; and
controlling an exposure level of the structured light based 3D camera system based on the determined optimal exposure level,
wherein the determining automatically the optimal exposure level of the structured light based 3D camera system using the first image and the second image includes
determining, for each of a plurality of exposure levels, a difference between an intensity of the first pixel and an intensity of the second pixel,
determining, for each of the plurality of exposure levels, a difference between an intensity of the third pixel and an intensity of the fourth pixel, and
determining one of the plurality of exposure levels as the optimal exposure level based on the determined differences.
2. The method of
a) dividing an intensity of a corresponding pixel into red intensity, green intensity, and blue intensity of three channels (R, G, B) for one same pixel at each of the captured images, and generating a characteristic curve for each of the three channels showing the intensity of the pixel for each channel according to variation of exposure levels;
b) estimating an intensity I of the corresponding pixel according to variation of exposure levels using the generated characteristic curves, and calculating an intensity difference ΔI or a brightness ratio SNR between the image of the target object with the predetermined pattern projected thereon and the image of the target object without the predetermined pattern for the corresponding pixel using the estimated intensity;
c) collecting brightness data about intensity differences ΔI or brightness ratios SNR according to variation of exposure levels for all pixels by repeating the steps a) and b) for all pixels of each image;
d) discriminating, from the collected brightness data, pixels in which spread phenomenon (saturation) occurs or pixels onto which the predetermined pattern is not projected when the projecting means projects the predetermined pattern, and excluding the brightness data of the discriminated pixels; and
e) calculating the number of pixels having an intensity difference ΔI or a brightness ratio SNR exceeding a predetermined threshold ΔIth or SNRth at every exposure level using the brightness data filtered at the step d), and deciding an exposure level having the largest calculated number of pixels as the optimal exposure level.
3. The method of
4. The method of
5. The method of
6. The method of
an enable term is set as a factor for checking the collected brightness data, the enable term is checked as Disable for a pixel in which spread phenomenon (saturation) occurs when the projecting means illuminates the predetermined pattern, and an intensity difference ΔI or a brightness ratio SNR of the pixel is classified as unreliable data.
7. The method of
8. The method of claim 1, further comprising adjusting an exposure level of the image capturing means by adjusting at least one of an aperture of an iris and a shutter speed.
9. The method of claim 1, wherein the data from the at least one characteristic curve associated with the image of the target object with the predetermined pattern projected thereon and the data from the at least one characteristic curve associated with the image of the target object without the predetermined pattern projected thereon includes at least one of pixel intensity data and exposure level data.
10. The method of claim 1, wherein the processing data from the at least one characteristic curve associated with the image of the target object with the predetermined pattern projected thereon includes identifying pixels of the image having an intensity difference exceeding a threshold difference.
11. The method of claim 1, wherein the processing data from the at least one characteristic curve associated with the image of the target object without the predetermined pattern projected thereon includes identifying pixels of the image having an intensity difference exceeding a threshold difference.
12. The method of claim 1, wherein the processing data from the at least one characteristic curve associated with the image of the target object with the predetermined pattern projected thereon includes identifying pixels of the image having a brightness ratio exceeding a threshold brightness ratio.
13. The method of claim 1, wherein the processing data from the at least one characteristic curve associated with the image of the target object without the predetermined pattern projected thereon includes identifying pixels of the image having a brightness ratio exceeding a threshold brightness ratio.
14. The method of claim 1, wherein the projecting means comprises a beam projector.
15. The method of claim 1, wherein the image capturing means comprises a camera.
16. The method of claim 1, wherein the processing means comprises a computer.
18. The method of claim 17, wherein the processing means comprises a computer.
19. The method of claim 17, further comprising adjusting an exposure level of the image capturing means by adjusting at least one of an aperture of an iris and a shutter speed.
20. The method of claim 17, wherein the projecting means comprises a beam projector.
21. The method of claim 17, wherein the image capturing means comprises a camera.
23. The method of claim 22, wherein the image capturing means comprises a camera.
24. The method of claim 22, wherein the processing means comprises a computer.
25. The method of claim 22, further comprising adjusting an exposure level of the image capturing means by adjusting at least one of an aperture of an iris and a shutter speed.
26. The method of claim 22, wherein the projecting means comprises a beam projector.
The present application claims priority under 35 U.S.C. 119 to Korean Patent Application No. 10-2008-0007228 (filed on Jan. 23, 2008), which is hereby incorporated by reference in its entirety.
Embodiments relate to a method and system for determining an optimal exposure of a structured light based 3D camera and, more particularly, to a method and system for determining an optimal exposure level of a structured light based 3D camera that improve the reconstructable range and reliability of 3D data by controlling camera exposure, based on an automatically determined optimal exposure level, according to diverse environmental variations.
3D cameras using structured light are a modified version of a stereo camera which uses two or more identical cameras to obtain 3D information. Unlike the stereo camera, the 3D camera includes a camera and a projecting unit such as a beam projector instead of having two identical cameras. Such a structured light based camera system illuminates a predetermined pattern on an object using the projecting unit, captures an image of the object with the pattern illuminated thereon using an image capturing unit such as a camera, and obtains 3D information by analyzing the obtained pattern.
Whereas a stereo camera system passively uses features of an image, the structured light based camera system actively uses the pattern illuminated from the projecting unit as features. Therefore, the structured light based camera system has the advantages of a fast processing speed and a high spatial resolution. Due to these advantages, the structured light based camera system has been widely used for object modeling/recognition, three dimensional (3D) ranging, industrial inspection, and reverse engineering. Particularly, in the intelligent robotics field, a home service robot needs a structured light based camera system to obtain the large amount of 3D data required for workspace modeling, because an ordinary stereo camera system cannot obtain 3D information from a plain and simple environment that lacks sufficient characteristic information or background color variation, that is, an environment with no feature points.
In the structured light camera system, the precision of 3D data depends on how well the patterns illuminated from the projector can be discriminated in an image. However, it is difficult to discriminate patterns in a real environment that varies dynamically in time or under various object conditions.
For example, it is difficult to identify patterns illuminated on an object having low reflectivity, such as a black cloth, because a camera cannot accurately capture the patterns illuminated on the black cloth. Conversely, it is also difficult to identify patterns illuminated on an object having high reflectivity, such as an opalescent white object, because the patterns show spread phenomenon (saturation) in the captured image due to the opalescent characteristic.
In general, the structured light based camera controls the camera iris to receive more light for an object having low reflectivity. Conversely, it controls the iris to receive limited light in order to prevent spread phenomenon (saturation) for an object having high reflectivity.
Difficulties of pattern discrimination in a real environment fall into two categories. First, it is difficult to control an exposure level according to an object because a real environment includes various objects, each having a different reflectivity due to its colors and textures. Second, different exposure levels are required according to the illumination of the surrounding environment. For example, the exposure level of a structured light based camera must be controlled differently when the camera operates in a bright environment than when it operates in a dark environment.
Although the exposure level must be adjusted properly under dynamically varying environmental factors in order to accurately identify patterns, most research on structured light based camera systems has been conducted under the assumption of fixed environmental factors with constant surrounding light. Therefore, there is a demand for a technology for dynamically controlling the exposure level of a camera according to changes in various environmental factors, for example, as time, position, and distance change during a service robot's mission.
Embodiments have been proposed in order to provide a method and system for determining an optimal exposure of a structured light based 3D camera for improving reconstructable range and reliability of 3D data by controlling camera exposure according to diverse environmental variations based on an automatically determined optimal exposure level.
Embodiments have been proposed in order to provide a method for generating a characteristic curve that estimates the brightness variation of a pixel according to exposure level, which is used to determine an optimal exposure under various environmental factors.
In embodiments, a method for determining an optimal exposure level of a structured light based three dimensional (3D) camera system, which includes a projecting means for illuminating a predetermined pattern on a target object, an image capturing means for capturing an image of the target object with the pattern projected thereon, and a processing means for reconstructing 3D data based on the captured image, automatically determines an optimal exposure level of the structured light based 3D camera system by analyzing the captured image of the target object. The method includes the steps of: a) obtaining an image of a target object with a predetermined pattern projected thereon, which is illuminated from a projecting means, and an image of the target object without a predetermined pattern projected thereon; b) dividing an intensity of a corresponding pixel into red intensity, green intensity, and blue intensity of three channels (R, G, B) for one same pixel at each of the captured images, and generating a characteristic curve for each of the three channels showing the intensity of the pixel for each channel according to variation of exposure levels; c) estimating an intensity I of the corresponding pixel according to variation of exposure levels using the generated characteristic curves, and calculating an intensity difference ΔI or a brightness ratio SNR between the image of the target object with the predetermined pattern projected thereon and the image of the target object without the predetermined pattern for the corresponding pixel using the estimated intensity; d) collecting brightness data about intensity differences ΔI or brightness ratios SNR according to variation of exposure levels for all pixels by repeating the steps b) and c) for all pixels of each image; e) discriminating, from the collected brightness data, pixels in which spread phenomenon (saturation) occurs or pixels onto which the predetermined pattern is not projected when the projecting means projects the predetermined pattern, and excluding the brightness data of the discriminated pixels; and f) calculating the number of pixels having an intensity difference ΔI or a brightness ratio SNR exceeding a predetermined threshold ΔIth or SNRth at every exposure level using the brightness data filtered at the step e), and deciding an exposure level having the largest calculated number of pixels as the optimal exposure level.
In other embodiments, a system for determining an optimal exposure of a structured light based three dimensional (3D) camera system includes: a projecting means for illuminating a predetermined pattern on a target object; an image capturing means for capturing an image of the target object with the pattern projected thereon; and a processing means for calculating 3D data for the target object by identifying the pattern in the images captured by the image capturing means. The processing means sequentially performs operations including the steps of: a) obtaining an image of a target object with a predetermined pattern projected thereon, which is illuminated from a projecting means, and an image of the target object without a predetermined pattern projected thereon; b) dividing an intensity of a corresponding pixel into red intensity, green intensity, and blue intensity of three channels (R, G, B) for one same pixel at each of the captured images, and generating a characteristic curve for each of the three channels showing the intensity of the pixel for each channel according to variation of exposure levels; c) estimating an intensity I of the corresponding pixel according to variation of exposure levels using the generated characteristic curves, and calculating an intensity difference ΔI or a brightness ratio SNR between the image of the target object with the predetermined pattern projected thereon and the image of the target object without the predetermined pattern for the corresponding pixel using the estimated intensity; d) collecting brightness data about intensity differences ΔI or brightness ratios SNR according to variation of exposure levels for all pixels by repeating the steps b) and c) for all pixels of each image; e) discriminating, from the collected brightness data, pixels in which spread phenomenon (saturation) occurs or pixels onto which the predetermined pattern is not projected when the projecting means projects the predetermined pattern, and excluding the brightness data of the discriminated pixels; and f) calculating the number of pixels having an intensity difference ΔI or a brightness ratio SNR exceeding a predetermined threshold ΔIth or SNRth at every exposure level using the brightness data filtered at the step e), and deciding an exposure level having the largest calculated number of pixels as the optimal exposure level.
In another embodiment, a method for generating a characteristic curve of a structured light based three dimensional (3D) camera system, which includes a projecting means for illuminating a predetermined pattern on a target object, an image capturing means for capturing an image of the target object with the pattern projected thereon, and a processing means for reconstructing 3D data using the captured image, includes the steps of: a) capturing an image of a target object with a predetermined pattern projected thereon from the projecting means and another image of the target object without the predetermined pattern projected thereon; b) calculating proportional factors kR, kG, and kB by applying the exposure used for capturing the images in the step a) and the red intensity, green intensity, and blue intensity of the red, green, and blue channels measured for one same pixel in each of the captured images to an equation: IR=kR*E, IG=kG*E, IB=kB*E, where IR, IG, and IB denote the intensities in the R, G, and B channels, kR, kG, and kB denote proportional factors, and E denotes an exposure; c) calculating the intensity of a corresponding pixel for an input exposure by applying various exposures to the equation; and d) generating a graph by plotting the intensity against the varying exposure using the data obtained in the step c).
The patent or application file contains at least one drawing executed in color. Copies of this patent or patent application publication with color drawings will be provided by the Office upon request and payment of the necessary fee.
Hereinafter, embodiments of the present invention will be described in detail with reference to the accompanying drawings.
At first, a concept of camera exposure will be briefly described before describing embodiments.
Exposure
Exposure means an amount of light used when a camera captures an image, and an exposure level is controlled by an iris and a shutter speed.
An iris is a set of metal plates disposed in a camera lens. The amount of incoming light is controlled by opening and closing the iris, and the size of the iris opening is expressed as an F-number: the smaller the F-number, the wider the aperture, so more light is received in a short time. The shutter speed is the time during which the lens is open; it is controlled by turning the charge coupled device (CCD) sensor on and off for a designated time. An exposure level related to the amount of light is expressed as a STOP, and a difference of one STOP doubles or halves the amount of light.
As shown in
In
E=i×t Eq. 1
In Eq. 1, E denotes exposure, i denotes intensity of light, and t denotes an exposure time.
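For illustration only, the following Python sketch evaluates Eq. 1 and the one-STOP doubling rule described above; the numeric values are arbitrary examples and are not taken from the embodiments.

```python
# Illustrative sketch of Eq. 1 (E = i * t) and the one-STOP doubling rule.
# The numeric values are arbitrary examples, not taken from the embodiments.

def exposure(i: float, t: float) -> float:
    """Exposure E as light intensity i multiplied by exposure time t (Eq. 1)."""
    return i * t

def relative_light(stop_difference: int) -> float:
    """Relative amount of light after changing the setting by a number of STOPs."""
    return 2.0 ** stop_difference

if __name__ == "__main__":
    e_base = exposure(i=100.0, t=1 / 60)   # arbitrary base setting
    e_slow = exposure(i=100.0, t=1 / 30)   # doubling the shutter time
    print(e_slow / e_base)                 # 2.0 -> one STOP more light
    print(relative_light(-2))              # 0.25 -> two STOPs less light
```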
A structured light based 3D camera system includes a projecting means for illuminating a predetermined pattern on a target object, an image capturing means for capturing an image with the pattern illuminated thereon, and a processor for identifying the pattern from the captured image and calculating 3D data for the target object. For example, the projecting means may be a typical projector using a code pattern, the image capturing means may be a camera, and the processor may be a computer.
Meanwhile, the precision of the calculated 3D data depends on how accurately the pattern is discriminated from the captured image. In order to obtain an optimal exposure for pattern discrimination in the present embodiment, an image captured with the pattern projected and without spread phenomenon (saturation) is compared with an image captured from a non-projected reference scene, and the pattern may be optimally discriminated when the intensity difference therebetween becomes the maximum. As scales for the intensity difference, SNR and ΔI are used in the present embodiment. SNR is a ratio between the brightness of an image obtained from a non-projected reference scene and the brightness of an image obtained with a projected pattern. ΔI denotes the difference between the brightness of an image captured from a non-projected reference scene and the brightness of an image captured with a projected pattern. In order to determine an optimal exposure level of the camera according to dynamically changing environmental factors, the structured light based camera system employs a method for automatically determining an optimal exposure level by analyzing an image captured at a predetermined exposure level.
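As a minimal per-pixel sketch of these two measures, the Python code below assumes 8-bit grayscale images held in NumPy arrays; the exact direction and normalization of the SNR ratio are not fixed by the description, so the form used here is an assumption.

```python
# Per-pixel comparison of a pattern-projected image with a non-projected
# reference image, assuming 8-bit grayscale NumPy arrays of equal shape.
import numpy as np

def intensity_difference(img_pattern: np.ndarray, img_reference: np.ndarray) -> np.ndarray:
    """Delta-I: per-pixel brightness difference, pattern image minus reference image."""
    return img_pattern.astype(np.int16) - img_reference.astype(np.int16)

def brightness_ratio(img_pattern: np.ndarray, img_reference: np.ndarray,
                     eps: float = 1.0) -> np.ndarray:
    """SNR: per-pixel ratio of the two brightnesses (direction and normalization
    are assumptions; eps avoids division by zero)."""
    return img_pattern.astype(np.float32) / (img_reference.astype(np.float32) + eps)
```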
In order to determine an optimal exposure level, the processor according to the present embodiment performs operations as follows. At step S10, the processor obtains a pattern illuminated image and a non pattern illuminated image for a target object, one of each. At step S20, the processor generates a characteristic curve for the same predetermined pixel in each of the captured images. At step S30, the processor estimates an intensity of the pixel according to the exposure level using the generated characteristic curve and calculates an intensity difference ΔI or a brightness ratio SNR for the corresponding pixel. Hereinafter, the intensity denotes the brightness of a pixel. At step S40, the processor collects ΔI and SNR data according to exposure level variation for all pixels of each image by repeating the steps S20 and S30. At step S50, the processor excludes ΔI or SNR data for an over-exposed pixel or a non-pattern illuminated pixel when the projecting means projects the predetermined pattern. At step S60, the processor calculates the number of pixels having the intensity difference or the brightness ratio exceeding a predetermined threshold ΔIth or SNRth at every exposure level and determines the exposure level providing the largest number of pixels as the optimal exposure level.
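The overall procedure of steps S10 through S60 can be summarized in code. The sketch below is a simplified, self-contained reading of those steps, assuming 8-bit grayscale captures, a linear characteristic curve I(E) = k·E fitted from the single capture exposure, and saturation at 255; the function and parameter names are illustrative and do not appear in the embodiments.

```python
# Simplified sketch of steps S10-S60: estimate per-pixel intensities over a
# set of candidate exposures from a single pattern-off/pattern-on capture
# pair, discard unreliable pixels, and pick the exposure that maximizes the
# number of pixels whose intensity difference exceeds a threshold.
import numpy as np

def determine_optimal_exposure(img_off, img_on, capture_exposure,
                               candidate_exposures, delta_i_threshold=30.0):
    off = img_off.astype(np.float64)   # S10: 0 state (pattern off)
    on = img_on.astype(np.float64)     # S10: 1 state (pattern on)

    # S20: per-pixel proportional factors k = I / E from the single capture.
    k_off = off / capture_exposure
    k_on = on / capture_exposure

    best_exposure, best_count = None, -1
    for e in candidate_exposures:
        # S30: estimated intensities at exposure e, saturated at 255.
        est_off = np.clip(k_off * e, 0, 255)
        est_on = np.clip(k_on * e, 0, 255)
        delta = est_on - est_off

        # S50: exclude saturated pixels and pixels the pattern does not reach.
        reliable = (est_on < 255) & (delta > 0)

        # S60: count pixels whose intensity difference exceeds the threshold.
        count = int(np.count_nonzero(reliable & (delta > delta_i_threshold)))
        if count > best_count:
            best_exposure, best_count = e, count
    return best_exposure               # exposure with the most reliable pixels
```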
In the step S10, the processor obtains a non-pattern illuminated image and a pattern illuminated image, one of each. The non-pattern illuminated image is a color image obtained with the projecting means turned off, which will be referred to as the 0 state hereinafter. The pattern illuminated image is a color image obtained with the projecting means turned on, which will be referred to as the 1 state hereinafter. Here, the image of the 0 state and the image of the 1 state are captured at an initial exposure level.
In the step S20, the processor selects one pixel at the same position from two obtained images and generates characteristic curves showing intensity variation according to exposure levels.
The characteristic curve is a graph showing the intensity of a pixel varying according to exposure, using the principle from Eq. 1 that intensity changes in proportion to the amount of incoming light. The characteristic curve is used to determine an optimal exposure level in the present embodiment. The characteristic curve can be expressed as Eq. 2 if the color intensity of a corresponding pixel is divided into three channels: red intensity (R), green intensity (G), and blue intensity (B).
IR=kR*E, IG=kG*E, IB=kB*E Eq. 2
Eq. 2 holds while IR, IG, and IB are less than 2^b−1; otherwise IR, IG, IB = 2^b−1 (saturation).
In Eq. 2, IR, IG, and IB denote the intensities in the R, G, and B channels, kR, kG, and kB denote proportional factors, E denotes exposure, and b denotes the number of bits expressing the brightness intensity of the camera.
Since the characteristic curve enables the processor to estimate the intensity of a pixel at various exposure levels even though only one image is captured under a predetermined condition, it is not necessary to obtain a plurality of images at various exposures in order to find an optimal exposure setting.
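A minimal sketch of this estimation for a single RGB pixel follows, assuming Eq. 2 and an 8-bit camera (b = 8) so that intensities saturate at 2^b−1 = 255; the function names and numeric values are illustrative.

```python
# Characteristic curve for one RGB pixel: fit the proportional factors
# kR, kG, kB from a single measurement (Eq. 2), then estimate the channel
# intensities at other exposures, clamped at 2**bits - 1 (saturation).
from typing import Tuple

def fit_factors(pixel_rgb: Tuple[float, float, float],
                exposure: float) -> Tuple[float, float, float]:
    """Proportional factors (kR, kG, kB) from one measurement: k = I / E."""
    r, g, b = pixel_rgb
    return (r / exposure, g / exposure, b / exposure)

def estimate_rgb(factors: Tuple[float, float, float], exposure: float,
                 bits: int = 8) -> Tuple[float, float, float]:
    """Estimated (IR, IG, IB) at a new exposure, saturated at 2**bits - 1."""
    max_val = float(2 ** bits - 1)
    kr, kg, kb = factors
    return (min(kr * exposure, max_val),
            min(kg * exposure, max_val),
            min(kb * exposure, max_val))

# Example: a pixel measured as (60, 90, 120) at exposure 1.0, estimated at
# exposure 2.5; the blue channel would exceed 255 and therefore saturates.
factors = fit_factors((60.0, 90.0, 120.0), exposure=1.0)
print(estimate_rgb(factors, exposure=2.5))   # (150.0, 225.0, 255.0)
```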
Hereinafter, simulations performed for deducing the characteristic curve and confirming the reliability thereof will be described.
Characteristic Curve
A scene image has intensity values that vary according to exposure levels. If the pixels of the scene image are grouped by the same radiometric properties, the pixel intensity increases or decreases in the same proportion as the exposure.
In
Proportional factors kR, kG, and kB are calculated by selecting one image from a plurality of obtained images for measuring intensity of a pixel according to an exposure level in the simulation of
Therefore, the graphs clearly show that reliable estimates of pixel intensity according to exposure variation can be obtained from only one image captured at a predetermined exposure, using the method for generating the characteristic curve according to the present embodiment that was used in the simulations.
Based on the above described simulation, a method for generating a characteristic curve according to an embodiment will be described. In the step S20 shown in
At the step S30, the processor estimates an intensity of a corresponding pixel according to exposure level variation for the 0 state and the 1 state using the created characteristic curve and calculates the intensity difference ΔI or the brightness ratio SNR for a corresponding pixel at the 0 state and the 1 state using the estimated intensity. In the step S30, the characteristic curves of R, G, and B channels for the 0 state and the 1 state can be converted into a characteristic curve of a gray level for each state by adding them together. In this case, the estimated intensity of the pixel is expressed as a gray level.
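For a single pixel, this step might look like the short sketch below, which sums the three channel estimates into a gray-level value as described and then compares the 0 and 1 states; the ratio form is an assumption, and the numeric values are illustrative.

```python
# Step S30 for one pixel: combine the R, G, B channel estimates into a
# gray-level value for the 0 state and the 1 state, then compute the
# intensity difference (and an assumed brightness ratio) between them.

def gray_level(rgb_estimate) -> float:
    """Gray-level value obtained by adding the R, G, B channel estimates."""
    return float(sum(rgb_estimate))

def pixel_measures(rgb_state0, rgb_state1):
    """Intensity difference and an assumed brightness ratio (1 state vs 0 state)."""
    i0, i1 = gray_level(rgb_state0), gray_level(rgb_state1)
    delta_i = i1 - i0
    ratio = i1 / i0 if i0 > 0 else float("inf")
    return delta_i, ratio

# Illustrative values: a darker 0-state estimate and a brighter 1-state estimate.
print(pixel_measures((40.0, 60.0, 80.0), (150.0, 225.0, 255.0)))   # (450.0, 3.5)
```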
At the step S40, the processor collects data of the intensity difference ΔI or the brightness ratio SNR in the 0 state and the 1 state according to the exposure level variation for all pixels by repeating the steps S20 and S30 for all pixels in each image.
At the step S50, the processor discriminates, from the data collected at the step S40, pixels having spread phenomenon (saturation) and non-pattern projected pixels in the image of the 1 state, and excludes the intensity difference ΔI or brightness ratio SNR data of the discriminated pixels from the collected data. Here, the processor may set an enable term as a factor for checking the collected data, check the enable term as Disable for a pixel in which spread phenomenon (saturation) occurs in the 1 state, and classify the intensity difference ΔI or the brightness ratio SNR thereof as unreliable data.
At the step S60, the processor calculates the number of pixels having the intensity difference ΔI or the brightness ratio SNR exceeding an intensity difference threshold ΔIth or a brightness ratio threshold SNRth, using the data of the intensity difference ΔI or the brightness ratio SNR according to exposure level variation for each pixel, and decides the exposure level having the largest number of pixels exceeding the threshold values as the optimal exposure level. Here, it is preferable to use, as the intensity difference threshold ΔIth or the brightness ratio threshold SNRth, the minimum intensity difference ΔI and the minimum brightness ratio SNR that can discriminate a pattern-projected area from a non-pattern-projected area in the obtained image. If the optimal exposure level is decided using an SNR value, the threshold SNRth may be about 0.1 to 0.2 when the SNR value is standardized to 256 gray levels.
In
As described above, the intensity difference or the brightness ratio is calculated for all pixels in the images I and II, the number of pixels whose calculated intensity difference or brightness ratio exceeds a predetermined threshold is counted at every exposure level using the data of the calculated intensity difference and brightness ratio, and an optimal exposure level is determined by selecting the exposure level having the largest number of pixels exceeding the threshold value, thereby increasing the 3D data reconstruction range and improving its reliability.
Images in the first row I show images obtained by photographing target objects. The best exposure level is found by photographing the target objects at all exposure levels. Images in the second row II are images obtained using the best exposure level found. The number of 3D data points obtained at each best exposure level is set to 100% so that it can be used as a reference for comparing the number of 3D data points obtained at different exposure levels. As shown, the best exposure level is STOP 5-2-3-4. Images in the third row III are images obtained using the method for determining an optimal exposure according to an embodiment of the present invention. Images in the fourth row IV are images captured with a fixed STOP 5, and images in the fifth row V are images captured with a fixed STOP 4. Images in the sixth row VI are images captured with a fixed STOP 3.
As shown in
As described above, the method and system for determining an optimal exposure level of a structured light based 3D camera according to an embodiment can improve a reconstructable range and reliability of 3D data in various applications which need to measure objects with different surface properties and different illumination conditions. Therefore, the method and system for determining an optimal exposure of a structured light based 3D camera according to an embodiment can be applied to various industrial fields such as inspection equipment using a 3D measuring method, 3D image modeling/recognition, and robot vision.
It will be apparent to those skilled in the art that various modifications and variations can be made to embodiments without departing from the spirit or scope of the disclosed embodiments. Thus, it is intended that the present invention covers modifications and variations of this invention provided they come within the scope of the appended claims and their equivalents.
The method and system for determining an optimal exposure of a structured light based 3D camera according to an embodiment can improve a reconstructable range and reliability of 3D data by determining an optimal exposure level that can optimally discriminate patterns according to environmental condition variation.
Also, because a method for creating a characteristic curve to estimate the variation of pixel intensity according to exposure level is provided, it is not necessary to obtain a plurality of images in order to determine an optimal exposure under various environmental factors.
Kim, Dae Sik, Lee, Sukhan, Ryu, Moon Wook