A device for omnidirectional image viewing providing pan-and-tilt orientation, rotation, and magnification within a hemispherical field-of-view that utilizes no moving parts. The imaging device is based on the principle that the image from a fisheye lens, which produces a circular image of an entire hemispherical field-of-view, can be mathematically corrected using high-speed electronic circuitry. More specifically, an incoming fisheye image from any image acquisition source is captured in the memory of the device, a transformation is performed for the viewing region of interest and viewing direction, and a corrected image is output as a video image signal for viewing, recording, or analysis. As a result, this device can accomplish the functions of pan, tilt, rotation, and zoom throughout a hemispherical field-of-view without the need for any mechanical mechanisms. The preferred embodiment of the image transformation device can provide corrected images at real-time rates, compatible with standard video equipment. The device can be used for any application where a conventional pan-and-tilt or orientation mechanism might be considered, including inspection, monitoring, surveillance, and target acquisition.

Patent
   RE36207
Priority
Jul 12 1996
Filed
Jul 12 1996
Issued
May 04 1999
Expiry
Jul 12 2016
1. A device for providing perspective corrected views of a selected portion of a hemispherical view in a desired format that utilizes no moving parts, which comprises:
a camera imaging system for receiving optical images and for producing output signals corresponding to said optical images;
fisheye lens means attached to said camera imaging system for producing said optical images, throughout said hemispherical field-of-view, for optical conveyance to said camera imaging system;
image capture means for receiving said output signals from said camera imaging system and for digitizing said output signals from said camera imaging system;
input image memory means for receiving said digitized signals;
image transform processor means for processing said digitized signals in said input image memory means according to selected viewing angles and magnification, and for producing output transform calculation signals according to a combination of said digitized signals, said selected viewing angles and said selected magnification;
output image memory means for receiving said output signals from said image transform processor means;
input means for selecting said viewing angles and magnification;
microprocessor means for receiving said selected viewing angles and magnification from said input means and for converting said selected viewing angles and magnification for input to said image transform processor means to control said processing of said transform processor means; and
output means connected to said output image memory means for recording said perspective corrected view according to said selected viewing angles and magnification.
11. A device for providing perspective corrected views of a selected portion of a hemispherical view in a desired format that utilizes no moving parts, which comprises:
a camera imaging system for receiving optical images and for producing output signals corresponding to said optical images;
fisheye lens means attached to said camera imaging system for producing said optical images, throughout said hemispherical field-of-view, for optical conveyance to said camera imaging system;
image capture means for receiving said output signals from said camera imaging system and for digitizing said output signals from said camera imaging system;
input image memory means for receiving said digitized signals;
image transform processor means for processing said digitized signals in said input image memory means according to selected viewing angles and magnification, and for producing output transform calculation signals in real-time at video rates according to a combination of said digitized signals, said viewing angles and said selected magnification;
user operated input means for selecting said viewing angles and magnification;
microprocessor means for receiving said selected viewing angles and magnification from said user operated input means and for converting said selected viewing angles and magnification for input to said image transform processor means to control said processing of said transform processor means;
output image memory means for receiving said output transform calculation signals in real-time and at video rates from said image transform processor means; and
output means connected to said output image memory means for recording said perspective corrected views according to said selected viewing angles and magnification.
10. A device for providing perspective corrected views of a selected portion of a hemispherical view in a desired format that utilizes no moving parts, which comprises:
a camera imaging system for receiving optical images and for producing output signals corresponding to said optical images;
fisheye lens means attached to said camera imaging system for producing said optical images, throughout said hemispherical field-of-view, for optical conveyance to said camera imaging system;
image capture means for receiving said output signals from said camera imaging system and for digitizing said output signals from said camera imaging system;
input image memory means for receiving said digitized signals;
image transform processor means for processing said digitized signals in said input image memory means according to selected viewing angles and magnification, and for producing output transform calculation signals according to a combination of said digitized signals, said selected viewing angles and said selected magnification, according to the equations:
x=R(uA-vB+mR sin β sin ∂)/sqrt(u²+v²+m²R²)
y=R(uC-vD+mR sin β cos ∂)/sqrt(u²+v²+m²R²)
where:
A=(cos Ø cos ∂-sin Ø sin ∂ cos β)
B=(sin Ø cos ∂+cos Ø sin ∂ cos β)
C=(cos Ø sin ∂+sin Ø cos ∂ cos β)
D=(sin Ø sin ∂-cos Ø cos ∂ cos β)
and where:
R=radius of the image circle
β=zenith angle
∂=Azimuth angle in image plane
Ø=Object plane rotation angle
m=Magnification
u,v=object plane coordinates
x,y=image plane coordinates
output image memory means for receiving said output signals from said image transform processor means;
input means for selecting said viewing angles and magnification;
microprocessor means for receiving said selected viewing angles and magnification from said input means and for converting said selected viewing angles and magnification for input to said image transform processor means to control said processing of said transform processor means; and
output means connected to said output image memory means for recording said perspective corrected views according to said selected viewing angles and magnification.
2. The device of claim 1 wherein said output means includes image display means for providing a perspective corrected image display according to said selected viewing angle and said magnification.
3. The device of claim 1 wherein said input means further provides for input of a selected portion of said hemispherical view to said transform processor means.
4. The device of claim 1 wherein said input means further provides for input of a selected tilting of said viewing angle through 180 degrees.
5. The device of claim 1 wherein said input means further provides for input of a selected rotation of said viewing angle through 360 degrees to achieve said perspective corrected view.
6. The device of claim 1 wherein said input means further provides for input of a selected pan of said viewing angle through 180 degrees.
7. The device of claim 1 wherein said output transform calculation signals of said image transform processor means are produced in real-time at video rates.
8. The device of claim 1 wherein said input means is a user-operated manipulator switch means.
9. The device of claim 1 wherein said image transform processor means is programmed to implement the following two equations:
x=R(uA-vB+mR sin β sin ∂)/sqrt(u²+v²+m²R²)
y=R(uC-vD+mR sin β cos ∂)/sqrt(u²+v²+m²R²)
where:
A=(cos Ø cos ∂-sin Ø sin ∂ cos β)
B=(sin Ø cos ∂+cos Ø sin ∂ cos β)
C=(cos Ø sin ∂+sin Ø cos ∂ cos β)
D=(sin Ø sin ∂-cos Ø cos ∂ cos β)
and where:
R=radius of the image circle
β=zenith angle
∂=Azimuth angle in image plane
Ø=Object plane rotation angle
m=Magnification
u,v=object plane coordinates
x,y=image plane coordinates

This invention was made with Government support under contract NAS1-18855 awarded by NASA. The Government has certain rights in this invention.

The invention relates to an apparatus, algorithm, and method for transforming a hemispherical field-of-view image into a non-distorted, normal perspective image at any orientation, rotation, and magnification within the field-of-view. The viewing direction, orientation, and magnification are controlled by either computer or remote control means. More particularly, this apparatus is the electronic equivalent of a mechanical pan, tilt, zoom, and rotation camera viewing system with no moving mechanisms.

Camera viewing systems are utilized in abundance for surveillance, inspection, security, and remote sensing. Remote viewing is critical for robotic manipulation tasks. Close viewing is necessary for detailed manipulation tasks, while wide-angle viewing aids positioning of the robotic system to avoid collisions with the work space. The majority of these systems use either a fixed-mount camera with a limited viewing field or mechanical pan-and-tilt platforms and mechanized zoom lenses to orient the camera and magnify its image. In applications where orientation of the camera and magnification of its image are required, the mechanical solution is large and can occupy a significant volume, making the viewing system difficult to conceal or use in close quarters. Several cameras are usually necessary to provide wide-angle viewing of the work space.

In order to provide a maximum amount of viewing coverage or subtended angle, mechanical pan/tilt mechanisms usually use motorized drives and gear mechanisms to manipulate the vertical and horizontal orientation. An example of such a device is shown in U.S. Pat. No. 4,728,839 issued to J. B. Coughlan, et al., on Mar. 1, 1988. Collisions with the working environment caused by these mechanical pan/tilt orientation mechanisms can damage both the camera and the work space and impede the remote handling operation. At the same time, viewing in such remote environments is extremely important to the performance of inspection and manipulation activities.

Camera viewing systems that use internal optics to provide wide viewing angles have also been developed in order to minimize the size and volume of the camera and the intrusion into the viewing area. These systems rely on the movement of either a mirror or prism to change the tilt angle of orientation and on mechanical rotation of the entire camera to change the pan angle of orientation. Using this means, the size of the camera orientation system can be minimized, but "blind spots" in the center of the view result. Also, these systems typically have no means of magnifying the image and/or producing multiple images from a single camera.

Accordingly, it is an object of the present invention to provide an apparatus that can provide an image of any portion of the viewing space within a hemispherical field-of-view without moving the apparatus.

It is another object of the present invention to provide horizontal orientation (pan) of the viewing direction with no moving mechanisms.

It is another object of the present invention to provide vertical orientation (tilt) of the viewing direction with no moving mechanisms.

It is another object of the present invention to provide rotational orientation (rotation) of the viewing direction with no moving mechanisms.

It is another object of the present invention to provide the ability to magnify or scale the image (zoom in and out) electronically.

It is another object of the present invention to provide electronic control of the image intensity (iris level).

It is another object of the present invention to be able to change the image intensity (iris level) without any mechanisms.

It is another object of the present invention to be able to accomplish said pan, tilt, zoom, rotation, and iris with simple inputs made by a lay person from a joystick, keyboard controller, or computer controlled means.

It is also an object of the present invention to provide accurate control of the absolute viewing direction and orientations using said input devices.

A further object of the present invention is to provide the ability to produce multiple images with different orientations and magnifications simultaneously.

Another object of the present invention is to be able to provide these images at real-time video rates, that is, 30 transformed images per second, and to support various display format standards such as the National Television System Committee (NTSC) RS-170 display format.

These and other objects of the present invention will become apparent upon consideration of the drawings hereinafter in combination with a complete description thereof.

In accordance with the present invention, there is provided an omnidirectional viewing system that produces the equivalent of pan, tilt, zoom, and rotation within a hemispherical field-of-view with no moving parts. This device includes a means for digitizing an incoming video image signal, transforming a portion of said video image based upon operator commands, and producing one or more output images that are in correct perspective for human viewing. In one preferred embodiment, the incoming image is produced by a fisheye lens which has a hemispherical field-of-view. This hemispherical field-of-view image is captured into an electronic memory buffer. A portion of the captured image containing a region-of-interest is transformed into a perspective correct image by image processing computer means. The image processing computer provides direct mapping of the hemispherical image region-of-interest into a corrected image using an orthogonal set of transformation algorithms. The viewing orientation is designated by a command signal generated by either a human operator or computerized input. The transformed image is deposited in a second electronic memory buffer where it is then manipulated to produce the output image as requested by the command signal.

FIG. 1 shows a schematic block diagram of the present invention illustrating the major components thereof.

FIG. 2 is an example sketch of a typical fisheye image used as input by the present invention.

FIG. 3 is an example sketch of the output image after correction for a desired image orientation and magnification within the original image.

FIG. 4 is a schematic diagram of the fundamental geometry that the present invention embodies to accomplish the image transformation.

FIG. 5 is a schematic diagram demonstrating the projection of the object plane and position vector into image plane coordinates.

In order to minimize the size of the camera orientation system while maintaining the ability to zoom, a camera orientation system that utilizes electronic image transformations rather than mechanisms was developed. While numerous patents on mechanical pan-and-tilt systems have been filed, no approach using strictly electronic transforms and fisheye optics had been successfully implemented prior to this effort. In addition, the electrooptical approach utilized in the present invention allows multiple images to be extracted from the output of a single camera. Motivation for this device came from viewing system requirements in remote handling applications where the operating envelope of the equipment is a significant constraint to task accomplishment.

The principles of the present invention can be understood by reference to FIG. 1. Shown schematically at 1 is the fisheye lens that provides an image of the environment with a 180 degree field-of-view. The fisheye lens is attached to a camera 2 which converts the optical image into an electrical signal. These signals are then digitized electronically 3 and stored in an image buffer 4 within the present invention. An image processing system consisting of an X-MAP and a Y-MAP processor shown as 6 and 7, respectively, performs the two-dimensional transform mapping. The image transform processors are controlled by the microcomputer and control interface 5. The microcomputer control interface provides initialization and transform parameter calculation for the system. The control interface also determines the desired transformation coefficients based on orientation angle, magnification, rotation, and light sensitivity input from an input means such as a joystick controller 12 or computer input means 13. The transformed image is filtered by a 2-dimensional convolution filter 8 and the output of the filtered image is stored in an output image buffer 9. The output image buffer 9 is scanned out by display electronics 10 to a video display device 11 for viewing.
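
The data flow of FIG. 1 can be sketched in software. The following Python/NumPy fragment is an illustrative stand-in, not the patent's hardware implementation: a pair of precomputed coordinate maps plays the role of the X-MAP and Y-MAP processors 6 and 7, the digitized fisheye frame corresponds to the input image buffer 4, and the remapped result corresponds to the output image buffer 9. Function and variable names are hypothetical.

    import numpy as np

    def remap_frame(input_frame, x_map, y_map):
        """Software stand-in for the X-MAP/Y-MAP transform processors (6, 7).

        input_frame : 2-D array holding the digitized fisheye image (buffer 4).
        x_map, y_map: one fisheye-image coordinate per output pixel; nearest-
                      neighbour sampling is used here, whereas the device
                      filters the result with a 2-D convolution (8).
        """
        xi = np.clip(np.rint(x_map).astype(int), 0, input_frame.shape[1] - 1)
        yi = np.clip(np.rint(y_map).astype(int), 0, input_frame.shape[0] - 1)
        return input_frame[yi, xi]   # contents destined for the output buffer (9)

The coordinate maps themselves are filled in from the transform equations derived below, which is where the selected viewing angles and magnification enter.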

A range of lens types can be accommodated to support various fields of view. The lens optics 1 correspond directly with the mathematical coefficients used with the X-MAP and Y-MAP processors 6 and 7 to transform the image. The capability to pan and tilt the output image remains even though a different maximum field of view is provided with a different lens element.

The invention can be realized by proper combination of a number of optical and electronic devices. The fisheye lens 1 is exemplified by any of a series of wide angle lenses from, for example, Nikon, particularly the 8 mm F2.8. Any video source 2 and image capturing device 3 that converts the optical image into electronic memory can serve as the input for the invention, such as a Videk Digital Camera interfaced with Texas Instruments' TMS34061 integrated circuits. Input and output image buffers 4 and 9 can be constructed using Texas Instruments TMS44C251 video random access memory chips or their equivalents. The control interface can be accomplished with any of a number of microcontrollers including the Intel 80C196. The X-MAP and Y-MAP transform processors 6 and 7 and image filtering 8 can be accomplished with application specific integrated circuits or other means as will be known to persons skilled in the art. The display driver can also be accomplished with integrated circuits such as the Texas Instruments TMS34061. The output video signal can be NTSC RS-170, for example, which is compatible with most commercial television displays in the United States. Remote control 12 and computer control 13 are accomplished via readily available switches and/or computer systems that also will be well known. These components function as a system to select a portion of the input image (fisheye or wide angle) and then mathematically transform the image to provide the proper perspective for output. The keys to the success of the invention include:

(1) the entire input image need not be transformed, only the portion of interest; and

(2) the required mathematical transform is predictable based on the lens characteristics.

The transformation that occurs between the input memory buffer 4 and the output memory buffer 9, as controlled by the two coordinated transformation circuits 6 and 7, is better understood by looking at FIG. 2 and FIG. 3. The image shown in FIG. 2 is a pen and ink rendering of the image of a grid pattern produced by a fisheye lens. This image has a field-of-view of 180 degrees and shows the contents of the environment throughout an entire hemisphere. Notice that the resulting image in FIG. 2 is significantly distorted relative to human perception. Vertical grid lines in the environment appear in the image plane as 14a, 14b, and 14c. Horizontal grid lines in the environment appear in the image plane as 15a, 15b, and 15c. The image of an object is exemplified by 16. A portion of the image in FIG. 2 has been corrected, magnified, and rotated to produce the image shown in FIG. 3. Item 17 shows the corrected representation of the object in the output display. The results shown in the image in FIG. 3 can be produced from any portion of the image of FIG. 2 using the present invention. Note the corrected perspective as demonstrated by the straightening of the grid pattern displayed in FIG. 3. In the present invention, these transformations can be performed at real-time video rates (30 times per second), compatible with commercial video standards.

The invention as described has the capability to pan and tilt the output image through the entire field of view of the lens element by changing the inputs to the controller from the input means, e.g., the joystick or computer. This allows a large area to be scanned for information as can be useful in security and surveillance applications. The image can also be rotated through 360 degrees on its axis, changing the perceived vertical of the displayed image. This capability provides the ability to align the vertical image with the gravity vector to maintain a proper perspective in the image display regardless of the pan or tilt angle of the image. The invention also supports modifications in the magnification used to display the output image. This is commensurate with a zoom function that allows a change in the field of view of the output image. This function is extremely useful for inspection operations. The magnitude of zoom provided is a function of the resolution of the input camera, the resolution of the output display, the clarity of the output display, and the amount of picture element (pixel) averaging that is used in a given display. The invention supports all of these functions to provide capabilities associated with traditional mechanical pan (through 180 degrees), tilt (through 180 degrees), rotation (through 360 degrees), and zoom devices. The digital system also supports image intensity scaling that emulates the functionality of a mechanical iris by shifting the intensity of the displayed image based on commands from the user or an external computer.

The postulates and equations that follow are based on the present invention utilizing a fisheye lens as the optical element. There are two basic properties and two basic postulates that describe the perfect fisheye lens system. The first property of a fisheye lens is that the lens has a 2π steradian field-of-view and the image it produces is a circle. The second property is that all objects in the field-of-view are in focus, i.e. the perfect fisheye lens has an infinite depth-of-field. The two important postulates of the fisheye lens system (refer to FIGS. 4 and 5) are stated as follows:

Postulate 1: Azimuth angle invariability--For object points that lie in a content plane that is perpendicular to the image plane and passes through the image plane origin, all such points are mapped as image points onto the line of intersection between the image plane and the content plane, i.e. along a radial line. The azimuth angle of the image points is therefore invariant to elevation and object distance changes within the content plane.

Postulate 2: Equidistant Projection Rule--The radial distance, r, from the image plane origin along the azimuth angle containing the projection of the object point is linearly proportional to the zenith angle β, where β is defined as the angle between a perpendicular line through the image plane origin and the line from the image plane origin to the object point. Thus the relationship:

r=kβ (1)
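
As a worked illustration (assumed numbers, not taken from the patent text): if the lens images the full hemisphere, r reaches the image circle radius R when β = π/2, so k = 2R/π. A minimal Python sketch:

    import math

    R = 512.0                  # assumed image circle radius in pixels
    k = 2.0 * R / math.pi      # r = R at beta = 90 degrees (Equation 1)

    def radial_distance(beta):
        """Equidistant projection: radial image distance for zenith angle beta."""
        return k * beta

    print(radial_distance(math.radians(45)))   # 256.0 pixels, half the image circle radius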

Using these properties and postulates as the foundation of the fisheye lens system, the mathematical transformation for obtaining a perspective corrected image can be determined. FIG. 4 shows the coordinate reference frames for the object plane and the image plane. The coordinates u,v describe object points within the object plane. The coordinates x,y,z describe points within the image coordinate frame of reference.

The object plane shown in FIG. 4 is a typical region of interest for which the mapping relationship onto the image plane must be determined in order to properly correct the object. The direction of view vector, DOV[x,y,z], determines the zenith and azimuth angles for mapping the object plane, UV, onto the image plane, XY. The object plane is defined to be perpendicular to the vector DOV[x,y,z].

The location of the origin of the object plane in terms of the image plane coordinates [x,y,z] in spherical coordinates is given by:

x=D sin β cos ∂

y=D sin β sin ∂

z=D cos β (2)

where D is the scalar length from the image plane origin to the object plane origin, β is the zenith angle, and ∂ is the azimuth angle in image plane spherical coordinates. The origin of the object plane is represented as a vector using the components given in Equation 2 as:

DOV[x,y,z]=[D sin β cos ∂, D sin β sin ∂, D cos β] (3)

DOV[x,y,z] is perpendicular to the object plane and its scalar magnitude D provides the distance to the object plane. By aligning the YZ plane with the direction of action of DOV[x,y,z], the azimuth angle ∂ becomes either 90 or 270 degrees and therefore the x component becomes zero, resulting in the DOV[x,y,z] coordinates:

DOV[x,y,z]=[0, -D sin β, D cos β] (4)

Referring now to FIG. 5, the position of the object point relative to the UV plane origin, expressed in coordinates relative to the origin of the image plane, is given by the following:

x=u

y=v cos β

z=v sin β (5)

therefore, the coordinates of a point P(u,v) that lies in the object plane can be represented as a vector P[x,y,z] in image plane coordinates:

P[x,y,z]=[u, v cos β, v sin β] (6)

where P[x,y,z] describes the position of the object point in image coordinates relative to the origin of the UV plane. The object vector O[x,y,z] that describes the object point in image coordinates is then given by:

O[x,y,z]=DOV[x,y,z]+P[x,y,z] (7)

O[x,y,z]=[u, v cos β-D sin β, v sin β+D cos β] (8)

Projection onto a hemisphere of radius R attached to the image plane is determined by scaling the object vector O[x,y,z] to produce a surface vector S[x,y,z]:

S[x,y,z]=R O[x,y,z]/|O[x,y,z]| (9)

By substituting for the components of O[x,y,z] from Equation 8, the vector S[x,y,z] describing the image point mapping onto the hemisphere becomes:

S[x,y,z]=[uR, (v cos β-D sin β)R, (v sin β+D cos β)R]/sqrt(u²+(v cos β-D sin β)²+(v sin β+D cos β)²) (10)

The denominator in Equation 10 represents the length or absolute value of the vector O[x,y,z] and can be simplified through algebraic and trigonometric manipulation to give:

|O[x,y,z]|=sqrt(u²+v²+D²) (11)

From Equation 11, the mapping onto the two-dimensional image plane can be obtained for both x and y as:

x=Ru/sqrt(u²+v²+D²) (12)

y=R(v cos β-D sin β)/sqrt(u²+v²+D²) (13)
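
The algebra above can be spot-checked numerically. The short Python sketch below (assumed values, for illustration only) builds O[x,y,z] from Equation 8, scales it onto the hemisphere per Equations 9 and 10, and confirms that the resulting x and y components agree with Equations 12 and 13.

    import math

    R, D, beta = 1.0, 2.0, math.radians(30)   # assumed values for the check
    u, v = 0.3, -0.4                          # an arbitrary object plane point

    # Object vector O[x,y,z] in image coordinates (Equation 8)
    O_vec = (u,
             v * math.cos(beta) - D * math.sin(beta),
             v * math.sin(beta) + D * math.cos(beta))
    norm = math.sqrt(sum(c * c for c in O_vec))   # equals sqrt(u^2 + v^2 + D^2), Equation 11

    # Surface vector S[x,y,z] on the hemisphere of radius R (Equations 9 and 10)
    S = tuple(R * c / norm for c in O_vec)

    # Image plane mapping from Equations 12 and 13
    x = R * u / math.sqrt(u**2 + v**2 + D**2)
    y = R * (v * math.cos(beta) - D * math.sin(beta)) / math.sqrt(u**2 + v**2 + D**2)

    assert abs(S[0] - x) < 1e-12 and abs(S[1] - y) < 1e-12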

Additionally, the image plane center to object plane distance D can be represented in terms of the fisheye image circular radius R by the relation:

D=mR (14)

where m represents the scale factor in radial units R from the image plane origin to the object plane origin. Substituting Equation 14 into Equations 12 and 13 provides a means for obtaining an effective scaling operation or magnification which can be used to provide a zoom operation:

x=Ru/sqrt(u²+v²+m²R²) (15)

y=R(v cos β-mR sin β)/sqrt(u²+v²+m²R²) (16)
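
As a rough worked example (illustrative, not from the patent): with m = 1 the object plane origin lies one image circle radius from the image plane origin; doubling m to 2 doubles D, so a fixed-size object plane subtends a smaller angle and the output window samples a narrower patch of the fisheye image centered on the view direction, which is the zoom-in effect.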

Using the equations for two-dimensional rotation of axes for both the UV object plane and the XY image plane, the last two equations can be further manipulated to provide a more general set of equations that provides for rotation within the image plane and rotation within the object plane:

x=R(uA-vB+mR sin β sin ∂)/sqrt(u²+v²+m²R²) (17)

y=R(uC-vD+mR sin β cos ∂)/sqrt(u²+v²+m²R²) (18)

where:

A=(cos Ø cos ∂-sin Ø sin ∂ cos β)

B=(sin Ø cos ∂+cos Ø sin ∂ cos β)

C=(cos Ø sin ∂+sin Ø cos ∂ cos β)

D=(sin Ø sin ∂-cos Ø cos ∂ cos β) (19)

and where:

R=radius of the image circle

β=zenith angle

∂=Azimuth angle in image plane

Ø=Object plane rotation angle

m=Magnification

u,v=object plane coordinates

x,y=image plane coordinates

Equations 17 and 18 provide a direct mapping from the UV space to the XY image space and are the fundamental mathematical result that supports the functioning of the present omnidirectional viewing system with no moving parts. By knowing the desired zenith, azimuth, and object plane rotation angles and the magnification, the locations of x and y in the imaging array can be determined. This approach provides a means to transform an image from the input video buffer to the output video buffer exactly. Also, the fisheye image system is completely symmetrical about the zenith; therefore, the vector assignments and resulting signs of various components can be chosen differently depending on the desired orientation of the object plane with respect to the image plane. In addition, these postulates and mathematical equations can be modified for various lens elements as necessary for the desired field-of-view coverage in a given application.

The input means defines the zenith angle, β, the azimuth angle, ∂, the object rotation, Ø, and the magnification, m. These values are substituted into Equation 19 to determine values for substitution into Equations 17 and 18. The image circle radius, R, is a fixed value that is determined by the relationship between the camera lens and the imaging element. The variables u and v vary throughout the object plane, determining the values for x and y in the image plane coordinates.
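
For illustration, the mapping of Equations 17 through 19 can be written directly in software. The Python sketch below is a hypothetical rendering of those equations (the device itself realizes them in the X-MAP and Y-MAP hardware); angles are in radians, and the example viewing parameters at the end are arbitrary stand-ins for values that the controller 5 would receive from the joystick 12 or computer 13.

    import math

    def fisheye_coords(u, v, beta, delta, phi, m, R):
        """Equations 17-19: object plane point (u, v) -> fisheye image point (x, y).

        beta  : zenith angle of the view direction
        delta : azimuth angle in the image plane
        phi   : object plane rotation angle
        m     : magnification (object plane distance D = m * R)
        R     : radius of the fisheye image circle
        """
        A = math.cos(phi) * math.cos(delta) - math.sin(phi) * math.sin(delta) * math.cos(beta)
        B = math.sin(phi) * math.cos(delta) + math.cos(phi) * math.sin(delta) * math.cos(beta)
        C = math.cos(phi) * math.sin(delta) + math.sin(phi) * math.cos(delta) * math.cos(beta)
        D = math.sin(phi) * math.sin(delta) - math.cos(phi) * math.cos(delta) * math.cos(beta)

        denom = math.sqrt(u * u + v * v + m * m * R * R)
        x = R * (u * A - v * B + m * R * math.sin(beta) * math.sin(delta)) / denom
        y = R * (u * C - v * D + m * R * math.sin(beta) * math.cos(delta)) / denom
        return x, y

    # Example: sampling coordinates for one output scan line (hypothetical values).
    R, m = 512.0, 1.0
    beta, delta, phi = math.radians(40), math.radians(270), 0.0
    scan_line = [fisheye_coords(u, 0.0, beta, delta, phi, m, R) for u in range(-320, 320)]

Iterating the same computation over every (u, v) of the output raster yields the coordinate maps used in the remapping sketch shown earlier after FIG. 1.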

From the foregoing, it can be seen that a fisheye lens provides a hemispherical view that is captured by a camera. The image is then transformed into a corrected image at a desired pan, tilt, magnification, rotation, and focus based on the desired view as described by a control input. The image is then output to a television display with the perspective corrected. Accordingly, no mechanical devices are required to attain this extensive analysis and presentation of the view of an environment through 180 degrees of pan, 180 degrees of tilt, 360 degrees of rotation, and various degrees of zoom magnification.

Martin, H. Lee, Zimmermann, Steven D.

Patent Priority Assignee Title
10140433, Aug 03 2001 Comcast IP Holdings I, LLC Video and digital multimedia aggregator
10225511, Dec 30 2015 GOOGLE LLC Low power framework for controlling image sensor mode in a mobile image capture device
10349096, Aug 03 2001 Comcast IP Holdings I, LLC Video and digital multimedia aggregator content coding and formatting
10681268, May 15 2014 Ricoh Company, Ltd. Imaging system, imaging apparatus, and system
10728489, Dec 30 2015 GOOGLE LLC Low power framework for controlling image sensor mode in a mobile image capture device
10732809, Dec 30 2015 GOOGLE LLC Systems and methods for selective retention and editing of images captured by mobile image capture device
11159763, Dec 30 2015 GOOGLE LLC Low power framework for controlling image sensor mode in a mobile image capture device
6147709, Apr 07 1997 Sony Semiconductor Solutions Corporation Method and apparatus for inserting a high resolution image into a low resolution interactive image to produce a realistic immersive experience
6181335, Dec 09 1992 Comcast IP Holdings I, LLC Card for a set top terminal
6219089, May 08 1997 CEDAR LANE TECHNOLOGIES INC Method and apparatus for electronically distributing images from a panoptic camera system
6222683, Jan 13 1999 HANGER SOLUTIONS, LLC Panoramic imaging arrangement
6331869, Aug 07 1998 CEDAR LANE TECHNOLOGIES INC Method and apparatus for electronically distributing motion panoramic images
6337708, Jun 24 1996 CEDAR LANE TECHNOLOGIES INC Method and apparatus for electronically distributing motion panoramic images
6341044, Jun 24 1996 CEDAR LANE TECHNOLOGIES INC Panoramic imaging arrangement
6369818, Nov 25 1998 CHARTOLEAUX KG LIMITED LIABILITY COMPANY Method, apparatus and computer program product for generating perspective corrected data from warped information
6373642, Jun 24 1996 CEDAR LANE TECHNOLOGIES INC Panoramic imaging arrangement
6392687, May 08 1997 HANGER SOLUTIONS, LLC Method and apparatus for implementing a panoptic camera system
6426774, Jun 24 1996 CEDAR LANE TECHNOLOGIES INC Panoramic camera
6466254, May 08 1997 CEDAR LANE TECHNOLOGIES INC Method and apparatus for electronically distributing motion panoramic images
6480229, Jun 24 1996 CEDAR LANE TECHNOLOGIES INC Panoramic camera
6493032, Jun 24 1996 CEDAR LANE TECHNOLOGIES INC Imaging arrangement which allows for capturing an image of a view at different resolutions
6515680, Dec 09 1992 Comcast IP Holdings I, LLC Set top terminal for television delivery system
6515696, Jun 24 1996 CEDAR LANE TECHNOLOGIES INC Method and apparatus for presenting images from a remote location
6542184, Jun 24 1996 CEDAR LANE TECHNOLOGIES INC Methods, apparatus, and program products for presenting panoramic images of a remote location
6583815, Jun 24 1996 CEDAR LANE TECHNOLOGIES INC Method and apparatus for presenting images from a remote location
6593969, Jun 24 1996 CEDAR LANE TECHNOLOGIES INC Preparing a panoramic image for presentation
6625812, Oct 22 1999 TRUESENTRY, INC Method and system for preserving and communicating live views of a remote physical location over a computer network
6675386, Sep 04 1996 Comcast IP Holdings I, LLC Apparatus for video access and control over computer network, including image correction
6704434, Jan 27 1999 Suzuki Motor Corporation Vehicle driving information storage apparatus and vehicle driving information storage method
6833843, Dec 03 2001 BIOTRONIX INC Panoramic imaging and display system with canonical magnifier
6924832, Aug 07 1998 CEDAR LANE TECHNOLOGIES INC Method, apparatus & computer program product for tracking objects in a warped video image
7058239, Oct 29 2001 360AI SOLUTIONS LLC System and method for panoramic imaging
7123777, Sep 27 2001 360AI SOLUTIONS LLC System and method for panoramic imaging
7242425, Jun 24 1996 CEDAR LANE TECHNOLOGIES INC Panoramic camera
7274381, Dec 03 2001 BIOTRONIX INC Panoramic imaging and display system with canonical magnifier
7304680, Oct 03 2002 Continental Automotive GmbH Method and device for correcting an image, particularly for occupant protection
7336788, Dec 09 1992 Discovery Communicatoins Inc. Electronic book secure communication with home subsystem
7382399, May 13 1991 Sony Corporation Omniview motionless camera orientation system
7401286, Dec 02 1993 Adrea, LLC Electronic book electronic links
7486324, Jun 11 1997 CEDAR LANE TECHNOLOGIES INC Presenting panoramic images with geometric transformation
7509270, Dec 09 1992 DISCOVERY COMMUNICATIONS, LLC Electronic Book having electronic commerce features
7629995, Aug 06 2004 Sony Semiconductor Solutions Corporation System and method for correlating camera views
7707137, Sep 29 2005 Oracle America, Inc Method and apparatus for browsing media content based on user affinity
7714936, May 13 1991 Sony Corporation Omniview motionless camera orientation system
7750936, Aug 06 2004 Sony Semiconductor Solutions Corporation Immersive surveillance system interface
7834907, Mar 03 2004 Canon Kabushiki Kaisha Image-taking apparatus and image processing method
7865405, Dec 09 1992 Discovery Patent Holdings, LLC Electronic book having electronic commerce features
8060905, Dec 09 1992 Comcast IP Holdings I, LLC Television delivery system having interactive electronic program guide
8134608, Nov 19 2007 ALPS ALPINE CO , LTD Imaging apparatus
8284258, Sep 18 2008 GRANDEYE LTD Unusual event detection in wide-angle video (based on moving object trajectories)
8547423, Sep 24 2009 Imaging system and device
8578410, Aug 03 2001 Comcast IP Holdings, I, LLC Video and digital multimedia aggregator content coding and formatting
8621521, Aug 03 2001 Comcast IP Holdings I, LLC Video and digital multimedia aggregator
8670001, Nov 30 2006 MATHWORKS, INC System and method for converting a fish-eye image into a rectilinear image
8692881, Aug 06 2004 Sony Semiconductor Solutions Corporation System and method for correlating camera views
8723951, Nov 23 2005 GRANDEYE, LTD Interactive wide-angle video server
9153014, Nov 09 2010 AVISONIC TECHNOLOGY CORPORATION Image correction method and related image correction system thereof
9286294, Aug 03 2001 TIVO CORPORATION Video and digital multimedia aggregator content suggestion engine
9529824, Jun 05 2013 MAXAR INTELLIGENCE INC System and method for multi resolution and multi temporal image search
9749525, Aug 06 2004 Sony Semiconductor Solutions Corporation System and method for correlating camera views
9813641, Jun 19 2000 Comcast IP Holdings I, LLC Method and apparatus for targeting of interactive virtual objects
9930225, Feb 10 2011 VILLMER LLC Omni-directional camera and related viewing software
9955073, Sep 12 2003 JOHNSON CONTROLS, INC ; Johnson Controls Tyco IP Holdings LLP; JOHNSON CONTROLS US HOLDINGS LLC Video user interface system and method
RE44087, Jun 24 1996 CEDAR LANE TECHNOLOGIES INC Presenting panoramic images with geometric transformation
Patent Priority Assignee Title
4772942, Jan 11 1986 Pilkington P.E. Limited Display system having wide field of view
5023725, Oct 23 1989 IMMERSIVE LICENSING, INC Method and apparatus for dodecahedral imaging system
5067019, Mar 31 1989 UNITED STATES OF AMERICA, THE, AS REPRESENTED BY THE ADMINISTRATOR OF THE NATIONAL AERONAUTICS AND SPACE ADMINISTRATION Programmable remapper for image processing
5068735, Aug 22 1989 Fuji Photo Optical Co., Ltd. System for controlling the aiming direction, focus, zooming, and/or position of a television camera
EP11909,
JP2127877,
WO8203712,
Executed on | Assignor | Assignee | Conveyance | Frame/Reel/Doc
Jul 12 1996 | Omniview, Inc. | (assignment on the face of the patent)
Dec 08 1997 | OMNIVIEW, INC | Interactive Pictures Corporation | CHANGE OF NAME (SEE DOCUMENT FOR DETAILS) | 0094010428 pdf
Apr 03 1998 | MARTIN, H LEE | IPIX | ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS) | 0090870230 pdf
May 14 2001 | PW TECHNOLOGY, INC | IMAGE INVESTOR PORTFOLIO, A SEPARATE SERIES OF MEMPHIS ANGELS, LLC, A DELAWARE LIMITED LIABILITY COMPANY | INTELLECTUAL PROPERTY SECURITY AGREEMENT | 0118280088 pdf
May 14 2001 | Interactive Pictures Corporation | IMAGE INVESTOR PORFOLIO, A SEPARATE SERIES OF MEMPHIS ANGELS, LLC, A DELAWARE LIMITED LIABILITY COMPANY | INTELLECTUAL PROPERTY SECURITY AGREEMENT | 0118370431 pdf
May 14 2001 | Internet Pictures Corporation | IMAGE INVESTOR PORTFOLIO, A SEPARATE SERIES OF MEMPHIS ANGELS, LLC, A DELAWARE LIMITED LIABILITY COMPANY | INTELLECTUAL PROPERTY SECURITY AGREEMENT | 0118280054 pdf
Sep 26 2001 | IMAGE INVESTOR PORTFOLIO, A SEPARATE SERIES OF MEMPHIS ANGELS, LLC | PW TECHNOLOGY, INC | RELEASE | 0122950978 pdf
Sep 26 2001 | IMAGE INVESTOR PORTFOLIO, A SEPARATE SERIES OF MEMPHIS ANGELS, LLC | Interactive Pictures Corporation | RELEASE | 0122950982 pdf
Sep 26 2001 | IMAGE INVESTOR PORTFOLIO, A SEPARATE SERIES OF MEMPHIS ANGELS, LLC | INTERMET PICTURES CORPORATION | RELEASE | 0122950986 pdf
Feb 22 2007 | IPIX Corporation | Sony Corporation | ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS) | 0190840034 pdf
Date Maintenance Fee Events
Aug 24 2000 | M184: Payment of Maintenance Fee, 8th Year, Large Entity.
Aug 24 2000 | M186: Surcharge for Late Payment, Large Entity.
Aug 25 2000 | ASPN: Payor Number Assigned.
Aug 29 2000 | LSM2: Pat Hldr no Longer Claims Small Ent Stat as Small Business.
Aug 09 2004 | M1553: Payment of Maintenance Fee, 12th Year, Large Entity.


Date Maintenance Schedule
May 04 2002 | 4 years fee payment window open
Nov 04 2002 | 6 months grace period start (w surcharge)
May 04 2003 | patent expiry (for year 4)
May 04 2005 | 2 years to revive unintentionally abandoned end. (for year 4)
May 04 2006 | 8 years fee payment window open
Nov 04 2006 | 6 months grace period start (w surcharge)
May 04 2007 | patent expiry (for year 8)
May 04 2009 | 2 years to revive unintentionally abandoned end. (for year 8)
May 04 2010 | 12 years fee payment window open
Nov 04 2010 | 6 months grace period start (w surcharge)
May 04 2011 | patent expiry (for year 12)
May 04 2013 | 2 years to revive unintentionally abandoned end. (for year 12)