A stereoscopic image sensor may be formed of a single image sensor having a pair of fields formed therein. The fields may be closer to one another than are a pair of left and right image collectors. The close spacing may be achieved by using an image redirector to redirect image information from the spaced apart collectors to the less spaced apart fields on the image sensor. In some embodiments of the present invention, by using a single imaging sensor, a more compact structure may be achieved which may be of lower cost and may enjoy reduced processing complexity. In addition, because a single sensor is utilized, in some embodiments of the present invention, the left and right images may be captured essentially identically.

Patent: 6108130
Priority: Sep 10 1999
Filed: Sep 10 1999
Issued: Aug 22 2000
Expiry: Sep 10 2019
Entity: Large
1. A stereoscopic image sensor comprising:
an imaging array die including a left and a right imaging array field, said fields being spaced substantially apart; and
a stereoscopic lens system including left and right image collectors, said image collectors being spaced further apart than said left and right imaging array fields.
2. The sensor of claim 1 wherein said collectors include a pair of spherical lenses displaced apart by a distance greater than the distance between said left and right imaging array fields.
3. The sensor of claim 2 including an image redirector which redirects image information from said collectors to said imaging array fields.
4. The sensor of claim 3 wherein said redirector includes a pair of fiber bundles.
5. The sensor of claim 3 wherein said redirector includes a pair of light pipes.
6. The sensor of claim 3 wherein said redirector includes a gradient index lens.
7. The sensor of claim 3 wherein said redirector includes a holographic plate.
8. The sensor of claim 4 wherein said fiber bundles are rectangular in cross-section.
9. The sensor of claim 4 wherein the cross-sectional shape of the fiber bundles matches the shape of the imaging array fields.
10. A method of capturing left and right stereoscopic image information comprising:
collecting image information from a pair of spaced apart image collectors;
redirecting the collected image information to a single digital sensor including a left and a right array field on the same die, said fields being spaced substantially apart; and
capturing separate left and right image information in the single digital sensor.
11. The method of claim 10 wherein redirecting includes transferring light through a fiber bundle.
12. The method of claim 10 further including positioning said collectors spaced apart from one another by a distance greater than that by which the left and right array fields are separated.
13. The method of claim 10 including using a light pipe to redirect the collected image information.
14. The method of claim 10 including using a gradient index lens to redirect the collected image information.
15. The method of claim 10 including using fiber bundles to redirect said image information and arranging the ends of said bundles in alignment with said array fields.
16. A stereoscopic imaging system comprising:
a single imaging sensor die including a left and a right imaging array field defined on said die, said fields being spaced substantially apart;
a stereoscopic lens system including left and right image collectors; and
a redirector to redirect the information collected by said image collectors and transmit it to said imaging array fields.
17. The system of claim 16 wherein said collectors are spaced further apart than said left and right imaging array fields.
18. The system of claim 16 wherein said redirector includes a fiber bundle.
19. The system of claim 16 wherein said redirector is a light pipe.
20. The system of claim 16 wherein said collectors are spherical lenses.
21. The system of claim 16 wherein said collectors include a gradient index lens.
22. The system of claim 16 wherein said collectors include a holographic plate.

This invention relates generally to capturing stereoscopic images for three dimensional display.

The human eyes are essentially a pair of stereoscopic image collectors. Because the left and right eyes are spaced apart, each captures the scene from a slightly different position, and from the two views the depth of an object being viewed can be determined. With this depth information, humans perceive objects in three dimensions, judging how objects relate to one another spatially when viewed from different positions.

Manmade systems may use similar principles to record stereoscopic image pairs. For example, a pair of image sensors may be spaced apart from one another sufficiently that each sensor records an image from a different position. The recorded images may be digitally captured, for example in a complementary metal oxide semiconductor (CMOS) imaging array, and the captured images may then be arranged for stereoscopic viewing. Techniques for stereoscopic viewing may involve using left and right filters to reconstruct the depth dimension recorded in the stereoscopic image pairs. For example, left and right color filters may decode the left and right color coding from the composite stereoscopic image. Alternatively, left and right polarizers may accomplish the same result by decoding encoded polarization information to separate the left and right stereoscopic pairs from a composite image.

Conventional systems use two sensors, one for capturing the left image and another for capturing the right image, to create a three dimensional composite image. Using two sensors may result in increased cost and increased electronic processing of the captured images. The information from the two sensors is generally separately processed for subsequent recombination and stereoscopic viewing. Moreover, because of the separation between the two image sensors, the compactness of the resulting sensing system suffers.

Thus, there is a continuing need for better devices and techniques for stereoscopic image sensing.

In accordance with one aspect, a stereoscopic image sensor includes an imaging array including a left and right imaging array field. A stereoscopic lens system includes left and right image collectors. The image collectors are spaced further apart than the left and right imaging array fields.

Other aspects are set forth in the accompanying detailed description and claims.

FIG. 1 is a block diagram of one embodiment of the present invention;

FIG. 2 is a cross-sectional view taken generally along the line 2--2 in FIG. 1;

FIG. 3 is a side elevational view of a stereoscopic image sensor in accordance with another embodiment of the present invention; and

FIGS. 4A and 4B show the formation of a holographic plate.

A stereoscopic image sensing system 10, shown in FIG. 1, includes a collecting system 12 made up of a pair of left and right image collectors 12a and 12b. The scene reflectance, indicated by the arrow A, is picked up by the spaced apart image collectors 12a and 12b. Because of their spaced apart orientation, the collectors 12a and 12b have different points of view which can be used to recreate depth information for stereoscopic or three dimensional viewing of the resulting image information.

The image information collected by the collectors 12a and 12b may be redirected to a pair of imaging array fields 16a and 16b of an image sensor 16. The image sensor 16 may conventionally be a complementary metal oxide semiconductor (CMOS) image sensor having a plurality of pixels formed as integrated circuit elements. Alternatively, a charge coupled device (CCD) sensor may be utilized. Advantageously the sensor is formed in a single integrated circuit die.

The image information collected by the collectors 12a and 12b may be forwarded to the imaging array fields 16a and 16b by the image redirecting system 14 including a left element 14a and a right element 14b. The imaging array fields 16a and 16b are more closely situated with respect to one another than the collectors 12a and 12b. The image information is redirected by the system 14 in order for that image information to be captured by the sensor 16.

In this way, the desired spatially separated point of view may be achieved through the collectors 12a and 12b without requiring that the array fields 16a, 16b be similarly spatially displaced. As a result, a single sensor 16 may be used to record both the left and right image information. This may result in a sensing system which is more compact and which processes image information in a more simplified way than is possible in a system using separate left and right image sensors.
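
As a rough sketch of the readout side of this idea, the following splits one frame read from a single sensor into its two field regions; the frame shape, the gap width, and the function name are hypothetical placeholders, not values from the patent:

```python
import numpy as np

def split_stereo_frame(frame, gap=0):
    """Split one sensor frame into left and right imaging array fields.

    frame: 2D array read out of the single sensor die.
    gap: width in pixels of any unused region between the two fields.
    """
    h, w = frame.shape
    half = (w - gap) // 2
    left_field = frame[:, :half]        # pixels under the left redirector
    right_field = frame[:, w - half:]   # pixels under the right redirector
    return left_field, right_field

# Example: a hypothetical 480x1280 sensor with a 16-pixel gap between fields.
frame = np.zeros((480, 1280), dtype=np.uint16)
left, right = split_stereo_frame(frame, gap=16)
```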

The image information captured for the left and right collected fields may be transferred by sensor drive electronics 18 to a host processing system 20. A three dimensional image rendering and display device 22 may be responsible for forming the composite image, using known techniques.

The image redirecting system 14 may be implemented in a variety of different fashions. As illustrated in FIG. 1, the redirectors 14a and 14b may be substantially rectangular fiber optic bundles. As depicted in FIG. 2, each redirector 14a and 14b may be rectangular in cross-section so as to take the image information received by the collectors 12a and 12b and deliver it in a format consistent with the rectangular arrays commonly utilized in image sensors. Thus, each redirector 14a or 14b may have a cross-sectional shape which matches the shape of the underlying imaging array field 16a or 16b, in one embodiment of the present invention.

Each redirector 14a and 14b may be made up of a plurality of optical fibers, each having a core covered by a cladding. Such assemblies are commonly called fiber bundles. A large number of fiber optic strands, on the order of thousands of strands for example, may be collected together, and each strand may convey a portion of the overall image information intended to be captured. The information received on the input end 30 of each redirector 14a or 14b may be transferred with relatively low losses to the output end 32, which may be situated either directly over, or in contact with, an underlying imaging array field 16a or 16b of the sensor 16.

A variety of schemes may be utilized to correlate the number of fibers in the bundle to the density of the pixels in the sensor 16. Commonly, a number of fibers may be provided for each pixel; other relationships may be utilized as well. The information from neighboring fibers in the bundle may be fused together with information from neighboring pixels to record the information collected by the collecting system 12. In this way, the left and right image information may be captured simultaneously in the same sensor 16.
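
A hedged sketch of this kind of fiber-to-pixel fusion follows; the several-fibers-per-pixel ratio, the uniform random fiber positions, and all names are assumptions for illustration, since the patent does not specify a particular mapping:

```python
import numpy as np

def fuse_fibers_to_pixels(fiber_xy, fiber_intensity, pixels_w, pixels_h):
    """Average the samples of all fibers whose output ends land on each pixel.

    fiber_xy: (N, 2) array of fiber end positions, normalized to [0, 1).
    fiber_intensity: (N,) array of light intensities carried by each fiber.
    Returns a (pixels_h, pixels_w) image.
    """
    # Map each fiber end to the pixel directly beneath it.
    cols = np.minimum((fiber_xy[:, 0] * pixels_w).astype(int), pixels_w - 1)
    rows = np.minimum((fiber_xy[:, 1] * pixels_h).astype(int), pixels_h - 1)

    total = np.zeros((pixels_h, pixels_w))
    count = np.zeros((pixels_h, pixels_w))
    np.add.at(total, (rows, cols), fiber_intensity)
    np.add.at(count, (rows, cols), 1)

    # Average where at least one fiber landed; leave uncovered pixels at zero.
    return np.divide(total, count, out=np.zeros_like(total), where=count > 0)

# Example: roughly 4 fibers per pixel over a 64x64 imaging array field.
rng = np.random.default_rng(0)
xy = rng.random((4 * 64 * 64, 2))
intensity = rng.random(4 * 64 * 64)
field = fuse_fibers_to_pixels(xy, intensity, 64, 64)
```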

In addition to system compactness, there are other advantages to using a single sensor to capture both the left and the right image information. Since sensors may vary from device to device, a sensor made under the exact same process conditions may be utilized to sense both the left and the right image information. Thus, variations due to semiconductor processing are less likely to be a significant factor when sensor regions made by the same process, under the same conditions, are utilized to capture both left and right image information.

Alternatively, the redirectors 14a and 14b may be implemented using light pipes. Suitable light pipes may be made of fiber optic materials, gels or liquids which are effectively contained so as to transfer the light collected by the collecting system 12 separately to each imaging array field 16a and 16b. Because a light pipe does not have the definition between individual strands that a fiber bundle has, the granularity of the resulting image may be finer, in some embodiments, when a light pipe is used.

In still another embodiment, shown in FIG. 3, a gradient index (GRIN) lens 24 may be utilized to create the separated images. A gradient index lens has an index of refraction that decreases as a function of the distance from the optical axis. The gradient index lens 24 may have two separate regions which separately collect the image information and transfer that information, as indicated by the arrows, to the fields 16a and 16b of the sensor 16. A wide variety of gradient index lenses are known to those skilled in the art.
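
Although the patent gives no formula, the radial profile usually assumed for such a lens is the parabolic one sketched below; n_0 and the gradient constant g are illustrative parameters, not values from the disclosure:

```latex
% Typical parabolic GRIN profile (illustrative, not from the patent):
% n_0 is the on-axis refractive index, g the gradient constant,
% r the distance from the optical axis.
n(r) = n_0 \left( 1 - \frac{g^2}{2}\, r^2 \right)
```

Rays inside such a medium follow approximately sinusoidal paths with spatial period 2π/g, which is what allows a short GRIN element to relay the two collected images onto the closely spaced fields 16a and 16b.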

Alternatively, instead of using a gradient index lens, essentially the same effect may be encoded into a suitable holographic recording medium. A holographic phase plate may be formed in a fashion that creates the desired lens characteristics. For example, a plane wave and a spherical wave may be used to form the hologram to create the effect of each spherical lens 12a and 12b. An object wave O(x,y) and a reference wave R(x,y) may be combined to record a hologram. The interference pattern H that is recorded from the object wave and the reference wave is given by the following equation (where "*" denotes the complex conjugate):

H = |O + R|^2 = |O|^2 + |R|^2 + OR^* + O^*R

The diffraction information is contained in the last two terms of the above equation. Referring to FIG. 4A, the reference wave is indicated as A and the object wave, which may be a spherical wavefront, is indicated as B; the two intersect at the master 26.

During reconstruction, when the hologram is illuminated with the reference wave again, the original object wave is recreated as indicated in the following equation:

H·R = |O + R|^2·R = R|O|^2 + R|R|^2 + O|R|^2 + O^*R^2

Referring to FIG. 4B, the reference wave is indicated as C and the reconstructed object wave is indicated as D, extending from the master 26.

If the reference wave has unit amplitude, so that |R|^2 = 1, the original object wave O is recovered from the third term in the equation set forth above. For this application, the hologram may be a volume phase hologram. This effectively suppresses the other terms, which would otherwise result in wasted light output. As a result, substantially all the light output may be directed into the third term of the equation above, which is the term of interest here.
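
To make the algebra above concrete, here is a minimal one-dimensional numerical sketch of recording and reconstruction; the plane reference wave, the paraxial point-source object wave, and all dimensions are invented for the example, and the volume phase behavior of a real hologram is not modeled:

```python
import numpy as np

wavelength = 0.5e-6                       # 500 nm illumination
k = 2 * np.pi / wavelength
x = np.linspace(-1e-3, 1e-3, 4096)        # 2 mm aperture across the master

# Unit-amplitude plane reference wave arriving at a small angle.
theta = np.deg2rad(2.0)
R = np.exp(1j * k * np.sin(theta) * x)

# Object wave from a point source 50 mm away (paraxial approximation).
z = 50e-3
O = np.exp(1j * k * x**2 / (2 * z))

# Recording: H = |O + R|^2 = |O|^2 + |R|^2 + O R* + O* R
H = np.abs(O + R) ** 2

# Reconstruction: illuminate the hologram with R again.
# H * R = R|O|^2 + R|R|^2 + O|R|^2 + O* R^2; since |R|^2 = 1 here,
# the third term is exactly the original object wave O.
recon = H * R
third_term = O * np.abs(R) ** 2
assert np.allclose(third_term, O)          # |R|^2 = 1 recovers O
```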

Recording the hologram may be done using coherent light. However, the reconstruction may be accomplished with broadband light sources. The wavelength selectivity of the volume hologram may be no different from that of the corresponding physical lens structure, so the light efficiency may not be altered substantially. The optical master produced in this way may then be used to mass produce holographic lenses using conventional techniques. If desired, relatively low cost holographic lenses may be produced in volume and easily secured as the two lenses over a substrate.

The holographic plate may record the desired optical effect, which is comparable to that achieved by the system shown in FIG. 1. That is, the holographic plate may separately collect two spatially displaced versions of the object image and redirect them so that they expose the more closely spaced imaging array fields 16a and 16b.

With embodiments of the present invention, a single sensor may record a pair of stereoscopic images for subsequent three dimensional display. In addition to advantages in terms of compactness and lower processing complexity, the system may be advantageous since each image is captured by imaging fields which, to the greatest possible extent, are identical to one another. The resulting images may then be displayed using known techniques for three dimensional displays.

For example, left and right image pairs may be electronically recombined in the three dimensional image rendering and display system 22. The three dimensional image information from the stereoscopic pairs may be combined in a coded composite. The image information from each pair, together with the depth information arising from the distance between the collectors 12a and 12b, may be encoded in an extractable form such as a color or polarization coding. When viewed through a suitably polarized viewer or a suitably colored filter, the user experiences three dimensions.
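
As a concrete instance of such color coding, a minimal red-cyan anaglyph sketch follows; the channel assignment is the common anaglyph convention rather than a scheme specified in the patent, and the frames are assumed to be same-sized 8-bit RGB arrays:

```python
import numpy as np

def encode_anaglyph(left_rgb, right_rgb):
    """Color-code a stereoscopic pair into one composite image.

    The red channel carries the left view; green and blue carry the right
    view. A red-cyan viewer then routes each view to the matching eye.
    """
    composite = np.empty_like(left_rgb)
    composite[..., 0] = left_rgb[..., 0]    # left image -> red
    composite[..., 1] = right_rgb[..., 1]   # right image -> green
    composite[..., 2] = right_rgb[..., 2]   # right image -> blue
    return composite

# Example with dummy 480x640 RGB frames from the two sensor fields.
left = np.zeros((480, 640, 3), dtype=np.uint8)
right = np.zeros((480, 640, 3), dtype=np.uint8)
composite = encode_anaglyph(left, right)
```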

While the present invention has been described with respect to a limited number of embodiments, those skilled in the art will appreciate numerous modifications and variations therefrom. It is intended that the appended claims cover all such modifications and variations as fall within the true spirit and scope of this present invention.

Inventor: Raj, Kannan

Assignments:
Sep 06 1999 — Raj, Kannan to Intel Corporation; assignment of assignor's interest (Reel/Frame 010239/0786).
Sep 10 1999 — Intel Corporation (assignment on the face of the patent).
Mar 25 2008 — Intel Corporation to Numonyx B.V.; assignment of assignor's interest (028055/0625).
Sep 30 2011 — Numonyx B.V. to Micron Technology, Inc.; assignment of assignor's interest (027075/0682).
Apr 26 2016 — Micron Technology, Inc. to U.S. Bank National Association, as collateral agent; security interest (038669/0001), corrective assignment (043079/0001).
Apr 26 2016 — Micron Technology, Inc. to Morgan Stanley Senior Funding, Inc., as collateral agent; patent security agreement (038954/0001).
Jun 29 2018 — U.S. Bank National Association, as collateral agent, to Micron Technology, Inc.; release by secured party (047243/0001).
Jul 03 2018 — Micron Technology, Inc. and Micron Semiconductor Products, Inc. to JPMorgan Chase Bank, N.A., as collateral agent; security interest (047540/0001).
Jul 31 2019 — JPMorgan Chase Bank, N.A., as collateral agent, to Micron Technology, Inc. and Micron Semiconductor Products, Inc.; release by secured party (051028/0001).
Jul 31 2019 — Morgan Stanley Senior Funding, Inc., as collateral agent, to Micron Technology, Inc.; release by secured party (050937/0001).