An image processing apparatus or camera system comprises an image sensor 1, a geometrical position calculation device 6 for performing predetermined correction of a distortion, a first address table 10 for storing information correlating an input side address based on the calculation results of the geometrical position calculation device 6 to an output side address as a reference, a sort unit 11 for sorting the output side addresses according to the input side addresses, a second address table 12 for storing information correlating the output side address to the sorted input side address as a reference, and an address matching device 13 for matching the input side address of input side image data DI with the input side address stored in the second address table 12 and outputting output side image data DO.
12. An image processing method, comprising:
extracting a region for display from an image captured by a lens based on one or more parameters, the one or more parameters including a parameter associated with one of pan, tilt, zoom or rotation;
transforming a geometrical position of an object in the region based on the parameter, including transforming a geometrical position of each pixel in input image data obtained from an imaging device, wherein the each pixel corresponds to a pixel on an output display, and wherein the geometrical position of the each pixel in the input image data is calculated based on an output signal of the imaging device;
storing table information in an address table, wherein the table information comprises a combination of information obtained by correlating an input side address with an output side address, wherein the input side address comprises an address of the each pixel of input side image data based on a calculated geometrical position and the output side address comprises a reference address corresponding to an address of the each pixel on the output display;
determining whether a real time input side address of the each pixel of the input side image data is coincident with the input side address stored in the address table;
combining the input side image data at the input side address with the corresponding output side address to form output side image data when the real time input side address is coincident with the input side address; and
transmitting the output side image data.
13. A method, comprising:
providing a digital signal representing an image captured by a lens;
extracting a region to be displayed in the image by setting a parameter related to at least one of pan, tilt, zoom and rotation;
calculating a geometrical position of each pixel in input side image data, the each pixel corresponding to each pixel on an output screen, wherein a predetermined transformation of a geometrical position of an object in the region is performed based on the parameter;
storing table information in an address table, wherein the table information comprises information obtained by correlating an input side address to an output side address, wherein the input side address comprises an address of the each pixel of the input side image data based on the calculated geometrical position, and wherein the output side address comprises an address of the each pixel on the output screen;
rearranging the output side addresses stored in the address table according to the input side addresses;
storing, in an address table, information obtained by correlating the output side addresses to the input side addresses after the output side addresses have been rearranged; and
determining whether a real time version of the input side address of the each pixel of the input side image data is coincident with the input side address stored in the address table;
combining the input side image data at the input side address with the corresponding output side address to form output side image data; and
transmitting the output side image data.
1. A method, comprising:
providing a digital signal representative of an image captured through a lens;
extracting a region to be displayed from the image by setting at least one parameter, wherein the at least one parameter includes one or more of a pan parameter, a tilt parameter, a zoom parameter or a rotation parameter;
transforming a geometrical position of an object in the region based on the parameter, wherein the geometrical position of the object is transformed by calculating a geometrical position of each of a plurality of pixels of input side image data, the each pixel of the input side image data corresponding to one of a plurality of pixels on an output screen;
storing information in an address table, wherein the information is obtained by correlating an input side address to an output side address, wherein the input side address corresponds to an address of the each pixel of the input side image data and is based on the geometrical position calculated for the each pixel of the input side image data, and the output side address corresponds to an address of the each pixel on the output screen; and
determining whether a real-time input side address of each pixel of the input side image data is coincident with a stored input side address that is stored in the address table; and
when the real-time input side address is coincident with the stored input side address:
combining the input side image data at the stored input side address with corresponding image data at the output side address to form output side image data; and
transmitting the output side image data in an output signal.
2. The method of
rearranging output side addresses calculated in relation to at least one specific region of the output screen in accordance with an arrangement of corresponding input side addresses.
3. The method of
storing portions of table information in the address table, wherein output side addresses in portions of the table information are rearranged according to input side addresses in relation to the at least one specific region.
4. The method of
providing the input side address as a decimal value.
5. The method of
selecting, as a reference, a specific pixel corresponding to a whole number part of the input side address in the input side image data;
determining a brightness of the specific pixel by weighted interpolation using a value of a fractional part of the input side address based on brightness of a plurality of pixels adjacent to the specific pixel; and
establishing the brightness of the specific pixel as brightness of a pixel in the output side image data corresponding to the specific pixel.
6. The method of
determining the brightness of the specific pixel belonging to a next line using brightness information associated with each pixel stored in the buffer memory.
7. The method of
storing the output side image data in a buffer memory into which the output side image data is written randomly using the output side address as a reference,
wherein the output side image data is stored whenever the real-time input side address of the each pixel is determined to be coincident with the stored input side address, and
wherein the output side image data is read from the buffer memory sequentially.
8. The method of
storing the output side image data in a buffer memory into which the output side image data is written sequentially whenever the real-time input side address of the each pixel is determined to be coincident with the stored input side address;
rearranging memory addresses of the buffer memory in accordance with the output side addresses; and
storing table information in a read-out address table,
wherein the table information stored in the read-out address table is obtained by correlating the memory addresses of the buffer memory to the output side addresses after rearranging the memory addresses of the buffer memory, and
wherein transmitting the output side image data is performed randomly based on the table information of the read-out address table.
9. The method of
10. The method of
correcting distortion of the image in the region.
11. The method of
distorting the image in the region.
14. The method of
15. The method of
This application is a continuation of co-pending U.S. application Ser. No. 12/669,750 filed Jan. 19, 2010, which is a national phase filing of PCT/JP2008/062984 filed Jul. 18, 2008, which claims priority to Japanese Application No. 2007-189807 filed Jul. 20, 2007, all of which are incorporated by reference herein in their entireties.
This invention relates to an image processing apparatus and a camera system. More specifically, the invention relates to those useful when applied to an image processing apparatus or an omnidirectional camera system which perform correction of a distortion when doing omnidirectional monitoring through a fish-eye lens, or an image processing apparatus or a camera system which perform processing for distorting a captured image.
A fish-eye lens is known as a lens having a wide-angle field of view. A proposal has been made for an omnidirectional camera system for monitoring a predetermined region with the use of this fish-eye lens. Such an omnidirectional camera system generally has functions such as pan, tilt, zoom and rotation (roll). In recent years, a proposal has also been made for an omnidirectional camera system which, as compared with the conventional mechanical pan, tilt or zoom mode, electrically processes input side image data captured through the fish-eye lens, thereby fulfilling various functions, such as pan, tilt, zoom and rotation, and eliminating the distortion of the image, without moving the apparatus.
A parameter setting device 5, on the other hand, has settings of parameters concerned with pan, tilt, zoom and rotation for cutting out a region to be displayed, in the image captured by the fish-eye lens 2, and parameters related to the center and radius of the fish-eye lens 2. That is, the parameter setting device 5 functions as an input/output device for these parameters. The parameters about the pan, tilt and rotation are set as angular information, the parameter about the zoom is set as magnification information, and the parameters about the center and radius of the fish-eye lens 2 are set as positional information and numerical information. A geometrical position calculation device 6 calculates the geometrical position of each pixel in the input side image data DI, which corresponds to each pixel on a display screen (output screen) 9a of a display device 9, in order to correct the distortion, by the fish-eye lens 2, of the image in the region to be cut out as output based on the parameters set in the parameter setting device 5.
An output side image data generation unit 7 forms output side image data DO corrected for the distortion based on the geometrical position of each pixel in the input side image data DI calculated by the geometrical position calculation device 6, and the input side image data DI stored in the frame memory 4. This output side image data DO is data obtained by sequentially combining brightness information, etc. based on the input side image data DI for each pixel on the output screen corresponding to the geometrical position. The output side image data DO is sequentially stored into a buffer memory 8, pixel by pixel, and is reproduced on the display screen (output screen) 9a of the display device 9 frame by frame. In this manner, the image in the predetermined cut-out region and corrected for distortion is reproduced on the display screen 9a.
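For contrast with the embodiments described later, the conventional frame-memory pipeline above can be sketched as follows. This is a minimal illustration, not the patent's implementation; the function name, the `mapping` dictionary and the data shapes are assumptions:

```python
# Hedged sketch of the conventional pipeline: a full frame must be buffered
# before any output pixel can be produced, which causes the display delay.
def conventional_pipeline(frame, mapping):
    """frame: 2D list of input pixels; mapping: dict output_addr -> input_addr
    (standing in for the geometrical position calculation device 6)."""
    # Step 1: write the whole frame into a frame memory (one-frame delay).
    frame_memory = [row[:] for row in frame]
    # Step 2: only then form the output image by random reads from frame memory.
    return {out: frame_memory[iy][ix] for out, (iy, ix) in mapping.items()}
```

The one-frame write in step 1 is exactly the time lag the invention removes.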
The following patent document 1 discloses a technology of the same type as that of the above-described conventional technology:
Patent Document 1: JP-A-2000-83242
With the above-described omnidirectional camera system, the input side image data DI, which is the output signal of the image sensor 1 as the imaging means, is written into the frame memory 4. After the writing of the input side image data DI corresponding to one frame is completed, the output side image data DO is formed by reference to the contents of storage in the frame memory 4. Thus, during such a series of processing steps, a time lag occurs. Such a time lag manifests itself as a display delay on the display screen 9a.
The present invention has been accomplished in the light of the above-described conventional technologies. It is an object of the invention to provide a camera system capable of achieving a series of processings, without using a frame memory, in the conversion of input side image data into output side image data, which involves predetermined processing of a geometrical position, such as correction of distortion of the input side image data.
A first aspect of the present invention for attaining the above object is an image processing apparatus, comprising:
parameter setting means which has a setting of a parameter concerned with at least one of pan, tilt, zoom and rotation for cutting out a region to be displayed in an image taken in by a lens;
geometrical position calculation means for calculating a geometrical position of each pixel in input side image data based on an output signal of imaging means, the each pixel corresponding to each pixel on an output screen, in order to perform predetermined transformation of a geometrical position of an image in the region based on the parameter;
an address table for storing table information which is combined information obtained by correlating an input side address, as an address of the each pixel of the input side image data based on calculation results of the geometrical position calculation means, to an output side address as a reference which is an address of the each pixel on the output screen; and
address matching means which checks the input side address of the each pixel of the input side image data, loaded in real time, against the input side address stored in the address table, and when both input side addresses are coincident, combines the input side image data at the input side address with the corresponding output side address to form output side image data, and also sends out the output side image data.
A second aspect of the present invention is a camera system, comprising:
imaging means for forming a digital signal representing an image taken in by a lens;
parameter setting means which has a setting of a parameter concerned with at least one of pan, tilt, zoom and rotation for cutting out a region to be displayed in the image;
geometrical position calculation means for calculating a geometrical position of each pixel in input side image data, the each pixel corresponding to each pixel on an output screen, in order to perform predetermined transformation of a geometrical position of an image in the region based on the parameter;
an address table for storing table information which is combined information obtained by correlating an input side address, as an address of the each pixel of the input side image data based on calculation results of the geometrical position calculation means, to an output side address as a reference which is an address of the each pixel on the output screen; and
address matching means which checks the input side address of the each pixel of the input side image data, loaded in real time, against the input side address stored in the address table, and when both input side addresses are coincident, combines the input side image data at the input side address with the corresponding output side address to form output side image data, and also sends out the output side image data.
A third aspect of the present invention is the camera system according to the second aspect, wherein
the address table stores table information in which the output side addresses in the table information based on the calculation results of the geometrical position calculation means in connection with a specific region are rearranged according to the input side addresses.
A fourth aspect of the present invention is the camera system according to the third aspect, wherein
the address table stores a plurality of pieces of table information in which the output side addresses are rearranged according to the input side addresses in connection with a plurality of the specific regions.
A fifth aspect of the present invention is a camera system, comprising:
imaging means for forming a digital signal representing an image taken in by a lens;
parameter setting means which has a setting of a parameter concerned with at least one of pan, tilt, zoom and rotation for cutting out a region to be displayed in the image;
geometrical position calculation means for calculating a geometrical position of each pixel in input side image data, the each pixel corresponding to each pixel on an output screen, in order to perform predetermined transformation of a geometrical position of an image in the region based on the parameter;
an address table for storing table information which is combined information obtained by correlating an input side address, as an address of the each pixel of the input side image data based on calculation results of the geometrical position calculation means, to an output side address as a reference which is an address of the each pixel on the output screen;
matching sort means for rearranging the output side addresses stored in the address table according to the input side addresses;
a matching address table for storing table information which is combined information obtained by correlating the output side addresses to the input side addresses upon rearrangement by the matching sort means; and
address matching means which checks the input side address of the each pixel of the input side image data, loaded in real time, against the input side address stored in the matching address table, and when both input side addresses are coincident, combines the input side image data at the input side address with the corresponding output side address to form output side image data, and also sends out the output side image data.
A sixth aspect of the present invention is the camera system according to any one of the second to fifth aspects, wherein
the geometrical position calculation means finds the input side address to decimal places, and outputs the input side address as a decimal value, and
the address matching means uses, as a reference, a specific pixel corresponding to a whole number part of the input side address in the input side image data, finds brightness of the specific pixel by interpolation for weighting with a value of a fractional part of the input side address based on brightness of a plurality of pixels adjacent to the specific pixel, and takes the brightness of the specific pixel as brightness of a pixel in the output side image data corresponding to the specific pixel.
A seventh aspect of the present invention is the camera system according to the sixth aspect, wherein
the address matching means has a buffer memory for storing at least one line equivalent of data, and is adapted to find the brightness of the specific pixel belonging to a next line by use of brightness information on each pixel stored in the buffer memory.
An eighth aspect of the present invention is the camera system according to any one of the second to seventh aspects,
further comprising a buffer memory for storing the output side image data,
the buffer memory having the output side image data written randomly thereinto, with the output side address as a reference, each time results of the checking are coincident, and
wherein readout of the output side image data is performed sequentially.
A ninth aspect of the present invention is the camera system according to any one of the second to seventh aspects,
further comprising a buffer memory for storing the output side image data,
the buffer memory having the output side image data written sequentially thereinto each time results of the checking are coincident,
further comprising read-out sort means for rearranging memory addresses of the buffer memory according to the output side addresses, and a read-out address table for storing table information which is combined information obtained by correlating the memory addresses to the output side addresses as a reference upon rearrangement by the read-out sort means, and
wherein readout of the output side image data is performed randomly based on the table information of the read-out address table.
A tenth aspect of the present invention is the camera system according to any one of the second to ninth aspects, wherein
the lens is a fish-eye lens having a wide-angle visual field, and
the transformation in the geometrical position calculation means is processing for correcting distortion of the image in the region.
An eleventh aspect of the present invention is the camera system according to any one of the second to ninth aspects, wherein
the transformation in the geometrical position calculation means is processing for distorting the image in the region.
According to the present invention, the input side address of each pixel of the input side image data, which has been loaded in real time, is checked against the input side address stored in the address table. When both input side addresses are coincident, the input side image data at the input side address is combined with the corresponding output side address to form output side image data. Thus, there is no need for a frame memory as in the conventional technology, which stores the one frame equivalent of input side image data as output signals from the imaging means.
Consequently, a delay due to the time for writing the one frame equivalent of image data into the frame memory can be eliminated, and the image taken in can be displayed promptly as a reproduced image. Such effects are remarkable, particularly, with a monitoring camera system or the like which targets a moving body as an object of imaging.
Embodiments of the present invention will now be described in detail based on the accompanying drawings.
A parameter setting device 5 has settings of parameters concerned with pan, tilt, zoom and rotation for cutting out a region to be displayed, in the image captured by the fish-eye lens 2, and parameters related to the center and radius of the fish-eye lens 2. The parameters about the pan, tilt and rotation are set as angular information, the parameter about the zoom is set as magnification information, and the parameters about the center and radius of the fish-eye lens 2 are set as positional information and numerical information.
A mode selector button 5g selects a mode based on a difference in the method of installing the fish-eye lens 2. In more detail, as shown in
A geometrical position calculation device 6 shown in
The calculation for correction of distortion in the geometrical position calculation device 6 can be performed suitably, for example, by making use of the following principle:
A circular image, which is formed on the surface of the image sensor 1 by the fish-eye lens 2, is equivalent to an image projected on a spherical screen of a hemispherical body with a radius R centered on the fish-eye lens 2. Thus, desired correction of distortion can be made, if the spherical image is converted into a flat image. That is, as shown in
Here, moving the eyepoint is equal to moving the tangential plane P, as the projection plane, on the spherical surface. The method of moving the tangential plane P comes in the following three types:
1) Rotation: Amount of rotation (α) about the eyepoint vector OO′ of the tangential plane P
2) Tilt: Amount of angular movement (β) in the vertical direction of the eyepoint vector OO′
3) Pan: Amount of angular movement (θ) in the horizontal direction of the eyepoint vector OO′
A computation concerning the correction of distortion (coordinate transformation) is performed by the following procedure:
An XY coordinate system on the tangential plane P is transformed into an xyz coordinate system of the omnidirectional camera system, whereby the circular image of the fish-eye lens 2 is converted into a flat image.
Concretely, as shown in
1) The barycenter O of the tangential plane P is placed at coordinates (R,0,0). At this time, the space coordinates of a point P0 (X,Y) in the tangential plane P are (R,X,Y) (see
2) The tangential plane P is processed in accordance with a set zoom magnification to scale up or down the size of the tangential plane P to (1/zoom magnification) (see
3) The tangential plane P is rotated about the x-axis by α rad (rotation angle) from the y-axis toward the z-axis (see
4) The tangential plane P is rotated about the y-axis by (90°−β) rad (tilt angle) from the x-axis toward the z-axis (see
5) The tangential plane P is rotated about the z-axis by θ rad (pan angle) from the x-axis toward the y-axis (see
If the destination of movement of the given point P0 (X,Y) on the tangential plane P (see
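The five-step coordinate transformation above can be sketched as follows. The sign and angle conventions (and the default argument values) are illustrative assumptions, not taken from the patent's figures:

```python
import math

def transform_tangent_point(X, Y, R=1.0, zoom=1.0, alpha=0.0, beta=math.pi / 2, theta=0.0):
    """Map a point (X, Y) on the tangential plane P to space coordinates,
    following steps 1)-5) above."""
    # 1) barycenter at (R, 0, 0): point P0 (X, Y) has space coordinates (R, X, Y)
    # 2) scale the plane to (1 / zoom magnification)
    x, y, z = R, X / zoom, Y / zoom
    # 3) rotate about the x-axis by alpha (rotation), y-axis toward z-axis
    y, z = y * math.cos(alpha) - z * math.sin(alpha), y * math.sin(alpha) + z * math.cos(alpha)
    # 4) rotate about the y-axis by (90 deg - beta) (tilt), x-axis toward z-axis
    t = math.pi / 2 - beta
    x, z = x * math.cos(t) - z * math.sin(t), x * math.sin(t) + z * math.cos(t)
    # 5) rotate about the z-axis by theta (pan), x-axis toward y-axis
    x, y = x * math.cos(theta) - y * math.sin(theta), x * math.sin(theta) + y * math.cos(theta)
    return x, y, z
```

With the neutral parameters (zoom 1, no rotation, tilt of 90°, no pan), the point (X, Y) stays at its initial space coordinates (R, X, Y), which is a quick sanity check on the conventions chosen.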
The fish-eye lens 2 takes a surrounding 360-degree scene into its fish-eye circle to form a fish-eye image. Generally, the fish-eye lens 2 has image height characteristics, which are inherent distortion characteristics, with respect to an incident angle θ. That is, as shown in
Here, as shown in
OP1=√(X1²+Y1²+Z1²)  [Equation 2]
Thus, the incident angle θ from the point P1 (X1,Y1,Z1) is given by the following equation:
As a result, the image height h can be found based on the incident angle θ obtained by the above equation and the image height characteristics shown in
Then, a point P1′ shown in
In this manner, the coordinates (X1, Y1, Z1) of the given point P1 in the tangential plane P are transformed into the coordinates (x,y) of the focused point Q, with the distortion involved corrected. That is, it becomes possible to calculate the geometrical position of each pixel in the input side image data (the pixel corresponding to the point Q on the circular image) that corresponds to each pixel on the output screen (i.e., the one corresponding to the given point on the tangential plane P). Here, the geometrical position in the present embodiment is found to decimal places.
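The distance of Equation 2, the incident angle and the image height lookup can be sketched as follows. The assumption that the optical axis coincides with the x-axis follows the tangential plane placement above; the `image_height` callable stands in for the lens's image height characteristics (an equidistant model h = f·θ is one common assumption, not the patent's measured curve):

```python
import math

def project_to_fisheye(x1, y1, z1, image_height):
    """Map a space point P1 = (x1, y1, z1) to the point Q on the fisheye circle."""
    op1 = math.sqrt(x1 ** 2 + y1 ** 2 + z1 ** 2)   # distance |OP1| (Equation 2)
    theta = math.acos(x1 / op1)                    # incident angle from the optical (x) axis
    h = image_height(theta)                        # image height for this incident angle
    r = math.hypot(y1, z1)                         # distance of the foot P1' from the center
    if r == 0.0:
        return (0.0, 0.0)                          # on-axis point maps to the center
    return (h * y1 / r, h * z1 / r)                # Q lies along OP1' at radius h
```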
An address table 10 shown in
A matching sort unit 11 rearranges or sorts the output side addresses, stored in the address table 10, according to the respective input side addresses (a concrete method for this work will be described in detail later).
A matching address table 12 stores table information which is combined information correlating the output side address to the input side address upon rearrangement in the matching sort unit 11.
The matching sort unit 11 rearranges the output side addresses in the sequence of the input side addresses. As a result, the output side addresses correlated to the input side addresses as a reference after sorting are arranged in the matching address table 12.
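The table construction and the matching sort can be sketched as follows. The address representation (line, pixel tuples) and the `calc_input_address` placeholder for the geometrical position calculation device 6 are assumptions for illustration:

```python
def build_matching_table(output_addresses, calc_input_address):
    """Build the matching address table 12 from the output side addresses.
    calc_input_address: callable standing in for the geometrical position
    calculation device 6 (output side address -> input side address)."""
    # Address table 10: input side address correlated to each output side address
    table = [(calc_input_address(out), out) for out in output_addresses]
    # Matching sort unit 11: rearrange the entries into input-side-address order,
    # yielding the contents of the matching address table 12
    table.sort(key=lambda entry: entry[0])
    return table
```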
The address matching device 13 shown in
The output side image data DO formed in the output side image data generation unit 15 is one obtained by eliminating the distortion of the input side image data DI based on the information on the geometrical position. The output side image data DO is sequentially stored in a buffer memory 8 as data which is a sequential combination of brightness information and color information based on the input side image data DI for each pixel. This combined data is reproduced, frame by frame, on the display screen (output screen) 9a of the display device 9 via a read-out circuit 16.
In the present embodiment, each time the results of the above checking (matching) are coincident, the output side image data DO is written randomly into the buffer memory 8, with the output side address as a reference. The readout of the output side image data DO is performed sequentially via the read-out circuit 16. A detailed description will be offered later in connection with these points.
In this manner, the image in the predetermined cut-out region (fan-shaped region 22), the image corrected for distortion, is reproduced on the display screen 9a.
According to the present embodiment, the address matching device 13 checks the input side address of each pixel of the input side image data DI, which has been loaded in real time, against the input side address stored in the matching address table 12. When both input side addresses are coincident, the input side image data DI at the coincident input side address is combined with the corresponding output side address in the output side image data generation unit 15 to form the output side image data DO.
In the present embodiment, the matching sort unit 11 is provided to rearrange the output side addresses in the sequence of the input side addresses, so that the predetermined address matching in the address matching unit 14 can be carried out rationally. This is because the input side addresses in the matching address table 12 are arranged in the sequence of the input side image data DI inputted to the address matching unit 14, and single retrieval is enough to bring all the pixels of the output side image data DO into correspondence with one pixel of the input side image data DI.
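Why the sort makes a single retrieval suffice can be sketched as follows: because the table is in input-address order, one pointer advances monotonically alongside the raster-ordered input stream. Integer addresses are an assumption for brevity (the embodiment uses fractional addresses):

```python
def match_stream(pixel_stream, matching_table):
    """Sketch of the address matching unit 14.
    pixel_stream: iterable of (input_addr, pixel_value) in raster order.
    matching_table: list of (input_addr, output_addr) sorted by input_addr.
    Yields (output_addr, pixel_value) whenever the addresses coincide."""
    i = 0
    for in_addr, value in pixel_stream:
        # several output pixels may map to the same input pixel
        while i < len(matching_table) and matching_table[i][0] == in_addr:
            yield (matching_table[i][1], value)
            i += 1
```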
As shown in
The output side image data generation unit 15 selects a total of 4 pixels including the pixel at the coincident address and pixels in the vicinity of this pixel, namely, respective pixel data on the pixel at the input side address (y,x), the input side address (y, x−1) adjacent on the same line to the pixel at the input side address (y,x), and the input side addresses (y−1, x−1) and (y−1, x) adjacent one line ahead to them, and forms each pixel data DO-1 for the output side image data DO from the total 4 pixels designated as 43. Thus, the output side image data generation unit 15 has a line buffer memory 44 for storing the one line equivalent of data. In forming the each pixel data DO-1 as stated above, brightness with respect to the adjacent pixel is found by linear interpolation based on the value of the fractional part of the input side address (y,x) in the matching address table 12. Here, the interpolation need not be limited to linear interpolation, if it is an interpolation weighted with the value of the fractional part. The each pixel data DO-1 also contains color information generated based on the four pixels 43, although this is not shown.
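The weighted interpolation over the four pixels can be sketched as follows. Bilinear weighting is used here as one instance of interpolation weighted with the fractional part; the exact weighting and the handling of color information in the embodiment may differ:

```python
def interpolate(addr_y, addr_x, current_line, previous_line):
    """Sketch of forming one pixel datum DO-1 from a fractional input side
    address (addr_y, addr_x).  previous_line models the one-line buffer
    memory 44 holding the line one line ahead."""
    ix = int(addr_x)                 # whole number part selects the reference pixel
    fx = addr_x - ix                 # fractional parts supply the weights
    fy = addr_y - int(addr_y)
    # bilinear weighting over (y-1, x-1), (y-1, x), (y, x-1), (y, x)
    top = previous_line[ix - 1] * (1 - fx) + previous_line[ix] * fx
    bottom = current_line[ix - 1] * (1 - fx) + current_line[ix] * fx
    return top * (1 - fy) + bottom * fy
```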
The each pixel data DO-1 formed in the output side image data generation unit 15 is written into the buffer memory 8. In this case, whenever the each pixel data DO-1 is formed, it is written randomly into the buffer memory 8, with the output side address as a reference. Thus, the buffer memory 8 has the each pixel data DO-1 written thereinto in a state in which these data are arranged sequentially in the sequence of the output addresses. As a result, the readout of the each pixel data DO-1 is performed sequentially, beginning at the start of the buffer memory 8.
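The random-write, sequential-read behavior of this first embodiment can be sketched as follows (the buffer shape and address tuples are illustrative assumptions):

```python
def random_write_sequential_read(pixel_data, width, height):
    """pixel_data: iterable of ((row, col), value) arriving in match order.
    Models buffer memory 8 of the first embodiment."""
    buffer = [[None] * width for _ in range(height)]
    for (row, col), value in pixel_data:
        buffer[row][col] = value     # random write, output side address as reference
    # sequential readout, beginning at the start of the buffer
    return [buffer[r][c] for r in range(height) for c in range(width)]
```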
In the first embodiment shown in
In writing the each pixel data DO-1 formed in the output side image data generation unit 15 into the buffer memory 8, the writing is carried out randomly, with the output side addresses as a reference. In writing the each pixel data DO-1 corresponding to the peripheral portion of the fan-shaped region 22 (see
The present embodiment is designed to avoid such a writing disabling state, and differs from the embodiment of
The read-out address table 18 stores table information which is combined information obtained by correlating a memory address of the buffer memory 8 to an output side address as a reference upon sorting by a read-out sort unit 17. The read-out sort unit 17 rearranges the memory addresses of the buffer memory 8 based on the output side addresses stored in a matching address table 12.
According to the present embodiment, writing of the output side image data DO into the buffer memory 8 is performed sequentially, so that an overflow of written information as in random writing does not occur. Instead, the buffer memory 8 has the output side image data DO randomly written thereinto, so that the data need to be read out randomly in the sequence of the output side addresses. Information for such readout is in storage at the read-out address table 18. Thus, by reference to the contents of storage in the read-out address table 18, random readout can be carried out in the sequence of the output side addresses, as determined beforehand. This point will be described in further detail based on
As shown in
According to the present embodiment, as described above, loading into the buffer memory 8 can be performed sequentially, so that an overflow of data at write time can be avoided. For readout, by contrast, the data need to be read out randomly. In this case, however, the interval between readouts is constant, so that an overflow of information does not occur.
In the above first and second embodiments, the matching sort unit 11 is provided to rearrange the output side addresses in the sequence of the input side addresses, but sorting need not necessarily be performed. Although the number of retrievals required for matching increases, the present invention includes the case where no sorting is performed. That is, the mere provision of the address table 10 and the address matching device 13 makes it possible to construct a camera system without the frame memory 4 (
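The trade-off between matching with and without the sort unit can be sketched as follows. This is an illustrative model, not the patented matching circuit: without sorting, each real-time input side address must be retrieved by scanning the whole address table, whereas with the entries pre-sorted by input side address a binary search suffices. The table contents are made-up values.

```python
import bisect

# Sketch: address matching without sorting (linear scan, more retrievals)
# versus matching against a table pre-sorted by input side address.

def match_unsorted(table, input_addr):
    """table: list of (input_side_address, output_side_address), unsorted."""
    for in_addr, out_addr in table:  # linear retrieval over the whole table
        if in_addr == input_addr:
            return out_addr
    return None

def match_sorted(sorted_table, input_addr):
    """sorted_table: the same pairs, pre-sorted by input side address."""
    keys = [in_addr for in_addr, _ in sorted_table]
    i = bisect.bisect_left(keys, input_addr)
    if i < len(keys) and keys[i] == input_addr:
        return sorted_table[i][1]
    return None  # real-time input address not in the table
```

Both return the same output side address; the sorted variant simply reaches it in fewer retrievals, which is the benefit the matching sort unit 11 provides.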
Also, table information on a specific region is stored in an address table having the same functions as those of the address table 10. Moreover, the respective output side addresses are rearranged beforehand in correspondence with the respective input side addresses. By these measures, the same table information as the sorted table information stored in the matching address table 12 can be stored in connection with the above specific region. In this case, therefore, rational matching comparable to address matching involving the matching sort unit 11 can be performed in connection with the above-mentioned specific region.
If a plurality of the above specific regions are set, and sorted table information as mentioned above is stored for each of the regions, rational matching can be performed in regard to each of the regions. In this case, control may be exercised such that the respective regions are automatically switched as appropriate.
In the aforementioned first and second embodiments, moreover, the each pixel data DO-1 for the output side image data DO is formed with the use of the pixels located one line ahead. Thus, the line buffer memory 44 covering one line is provided. If only the input side addresses adjacent on the same line are utilized, however, the line buffer memory 44 naturally becomes unnecessary. Providing line buffer memories for two or more lines, on the other hand, makes it possible to form high-accuracy output side image data DO by utilizing information on a correspondingly larger number of input side addresses. Hence, the number of line buffer memories may be selected in consideration of the accuracy of the reproduced image.
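The relation between the number of line buffers and the available neighborhood can be sketched as follows. This is an illustrative stand-in, not the patent's pixel-forming arithmetic: with n line buffers, an output pixel can draw on the current line plus up to n buffered lines, so more buffers admit a larger (and potentially more accurate) interpolation neighborhood. A plain average stands in for the actual interpolation.

```python
# Sketch: the interpolation neighborhood grows with the number of line buffers.

def form_output_pixel(image, row, col, num_line_buffers):
    """image: 2-D list of pixel values; num_line_buffers: how many previous
    lines are held in buffer memory and thus available for interpolation."""
    rows = range(max(0, row - num_line_buffers), row + 1)  # current + buffered lines
    cols = range(max(0, col - 1), min(len(image[0]), col + 2))  # same-line neighbors
    samples = [image[r][c] for r in rows for c in cols]
    return sum(samples) / len(samples)
```

With `num_line_buffers=0` only same-line neighbors contribute; each added buffer widens the vertical support, mirroring the accuracy-versus-hardware trade-off described above.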
The aforementioned first and second embodiments have the fish-eye lens 2 as their lens. Thus, the calculation in the geometrical position calculation device 6 is designed to correct the distortion imparted to the input side image data DI by the fish-eye lens 2, but the processing is not limited to this. Processing for imparting a desired distortion to distortion-free input side image data DI taken in by an ordinary lens is also included in the present invention. That is, the geometrical position calculation means encompasses not only correction of the distortion of an image taken in by the lens, but also processing such as deliberately distorting an undistorted image. In essence, no particular limitation is imposed on the processing, provided that it transforms the geometrical position of an image taken in by a lens.
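The general point above, that removing and imparting distortion are the same kind of geometrical position transform, can be sketched as follows. The radial model (r mapped by a supplied function) is a toy assumption for illustration, not the patent's correction formula.

```python
import math

# Sketch: any geometrical position transform can be expressed as a mapping on
# coordinates; a correcting model and a distorting model use the same machinery.

def transform_position(x, y, cx, cy, radial):
    """Map a coordinate (x, y) via a radial model about center (cx, cy)."""
    dx, dy = x - cx, y - cy
    r = math.hypot(dx, dy)
    if r == 0:
        return (cx, cy)
    r2 = radial(r)  # correction of distortion, or deliberate distortion
    return (cx + dx * r2 / r, cy + dy * r2 / r)

# Illustrative radial models (assumed, not from the patent):
correct = lambda r: r * 1.5  # e.g. undo barrel-like compression
distort = lambda r: r * 0.5  # e.g. impart barrel-like compression
```

In either direction, the geometrical position calculation yields, for each output pixel, the input-side position from which it is drawn; only the chosen radial model differs.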
Assignment: TECHWELL JAPAN KABUSHIKI KAISHA assigned its interest to INTERSIL AMERICAS LLC (assignment on the face of the patent; executed Dec 03 2012, recorded Mar 07 2013, Reel/Frame 030038/0652).
Maintenance fee events: Feb 25 2019, M1551 (payment of maintenance fee, 4th year, large entity); Feb 14 2023, M1552 (payment of maintenance fee, 8th year, large entity).