An image capturing device includes a lens unit, an image sensor, a face recognition unit, a selecting unit, an auto-focusing (AF) unit and a driving unit. The lens unit focuses images onto the image sensor. The image sensor converts the images into electronic signals. The face recognition unit locates any faces in a current image. If faces are detected, the selecting unit indicates to a user where the faces are located in the image, and selects one of the locations as a main focus area in response to a user input. The driving unit is configured for moving the lens unit. The AF unit controls the driving unit to focus the lens unit according to the selected location.
1. An image capturing device comprising:
an image sensor;
a lens unit configured for focusing images onto the image sensor;
a face recognition unit configured for detecting a number of locations where faces of persons are located in a current image;
a selecting unit configured for displaying where the faces are located in the image and selecting one of the locations as a main focus area in response to a user input;
a driving unit configured for moving the lens unit; and
an AF unit configured for calculating an in-focus position and controlling the driving unit to focus the lens unit according to the in-focus position.
6. An image capturing device comprising:
an image sensor;
a lens unit configured for focusing images onto the image sensor;
a face recognition unit configured for detecting a number of locations where faces of persons are located in a current image;
a selecting unit configured for displaying where the faces are located in the image and selecting one of the locations as a main focus area in response to a user input;
a memory configured for storing the main focus area selected by the selecting unit, a focus power corresponding to the main focus area, and focus powers corresponding to other areas of the current image besides the main focus area;
a driving unit configured for moving the lens unit; and
an AF unit configured for calculating an in-focus position and controlling the driving unit to focus the lens unit according to the in-focus position.
2. The image capturing device as claimed in
3. The image capturing device as claimed in
4. The image capturing device as claimed in
5. The image capturing device as claimed in
7. The image capturing device as claimed in
8. The image capturing device as claimed in
9. The image capturing device as claimed in
10. The image capturing device as claimed in
11. The image capturing device as claimed in
1. Technical Field
The present invention relates to image capturing devices, and particularly, to an image capturing device and an auto-focus method thereof.
2. Description of the Related Art
Image capturing devices, such as those used in digital cameras and mobile phones, have an auto-focus (AF) feature. An AF method using face recognition technology is a relatively new technology. This AF method enables image capturing devices to automatically detect where a face of a person is located in a captured image, and adjust the lens to focus on that face accordingly. However, if more than one face is in view, the AF method may not be smart enough to judge which face in the image should be used for focusing.
What is needed, therefore, is an image capturing device that can overcome the above-described problem.
In a present embodiment, an image capturing device includes a lens unit, an image sensor, a face recognition unit, a selecting unit, an auto-focusing (AF) unit and a driving unit. The lens unit forms images onto the image sensor. The image sensor converts the images into electronic signals. The face recognition unit locates any faces in a current image. If faces are detected, the selecting unit indicates to a user where the faces are located in the image, and selects one of the locations as a main focus area in response to a user input. The driving unit is configured for moving the lens unit. The AF unit controls the driving unit to focus the lens unit according to the selected location.
Many aspects of the present image capturing device and auto-focus method can be better understood with reference to the accompanying drawings. The components in the drawings are not necessarily drawn to scale, the emphasis instead being placed upon clearly illustrating the principles of the present image capturing device and the auto-focus method. Moreover, in the drawings, like reference numerals designate corresponding parts throughout the several views.
Embodiments of the present image capturing device and auto-focus method will now be described in detail with reference to the drawings. In the following described embodiments, the image capturing device can be, but is not limited to, a digital still camera, a digital video camera, or any other electronic device equipped with a camera module.
Referring to
The lens unit 11 may include one or more lenses. The lens unit 11 can be, for example, a zoom lens, a macro lens, etc.
The image sensor 12 can be a charge coupled device (CCD) image sensor or a complementary metal oxide semiconductor (CMOS) image sensor. A ceramic leaded chip carrier (CLCC) package, a plastic leaded chip carrier (PLCC) package, or a chip scale package (CSP) may be used for packaging the image sensor 12.
In this embodiment, the face recognition unit 13 is configured for locating any faces in the initial image and then causing those faces to be framed in the image when it is displayed. The face recognition unit 13 may be realized by any existing face recognition method.
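By way of illustration, the behavior of the face recognition unit 13 could be sketched in software as follows. This is a minimal sketch only: the embodiment leaves the face recognition method open, so the use of OpenCV's Haar-cascade detector and the function names detect_face_locations and frame_faces are assumptions made for this example.

# Minimal sketch of the face recognition unit 13: locate faces and frame them
# for display. The Haar-cascade detector is used only as a stand-in; any
# existing face recognition method could be substituted.
import cv2

def detect_face_locations(image_bgr):
    """Return a list of (x, y, w, h) rectangles, one per detected face."""
    cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    return [tuple(rect) for rect in faces]

def frame_faces(image_bgr, face_locations):
    """Draw a rectangle around each detected face for display on the viewfinder."""
    for (x, y, w, h) in face_locations:
        cv2.rectangle(image_bgr, (x, y), (x + w, y + h), (0, 255, 0), 2)
    return image_bgr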
The selecting unit 14 is configured for displaying the locations on a display of the image capturing device 100, and for selecting one of the locations as a main focus area in response to a user input. The selecting unit 14 shows the image with the framed face locations thereon and, in response to a user input, sets focus powers that determine which location(s) are to be given greater priority. In this embodiment, the selecting unit 14 is an LCD touch screen. For example, if a framed location is touched, the focus power of the touched location is set higher than those of the other portions of the image. In addition, the selecting unit 14 can also be used as a viewfinder of the image capturing device 100.
Users can also set the focus powers of the framed locations so as to designate a primary focus area and subordinate focus areas. For example, a user can touch several framed locations, and the degree of emphasis given to each location can be determined by the duration of the touch on the corresponding framed location. Referring to
The image capturing device 100 further comprises a memory 15 configured for storing the area of the image containing the highlighted face (the main focus area) selected by the selecting unit 14 according to the user's input, the focus power corresponding to the highlighted face, and the focus powers corresponding to the other areas of the current image besides the highlighted face.
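A minimal sketch of how the selecting unit 14 and the memory 15 described above might cooperate is given below. The FocusArea structure, the assign_focus_powers function, and the mapping from touch duration to focus power are assumptions made for illustration; the embodiment only states that a longer touch gives a framed location greater emphasis and that the selected areas and their focus powers are stored.

# Illustrative sketch of assigning focus powers to framed face locations.
# The duration-to-power mapping below is an assumption; only the general
# idea (longer touch = greater emphasis) comes from the embodiment.
from dataclasses import dataclass

@dataclass
class FocusArea:
    rect: tuple          # (x, y, w, h) of a framed face location
    focus_power: float   # weight later used by the AF unit 16

def assign_focus_powers(face_rects, touch_durations, default_power=0.1):
    """Map each framed location to a focus power.

    face_rects      -- rectangles reported by the face recognition unit 13
    touch_durations -- seconds the user touched each rectangle (0 if untouched)
    """
    areas = []
    for rect, duration in zip(face_rects, touch_durations):
        # A touched location receives a power that grows with touch duration;
        # untouched locations keep a small default weight.
        areas.append(FocusArea(rect=rect, focus_power=default_power + duration))
    # Normalize so the powers sum to 1, making them usable as weights.
    total = sum(a.focus_power for a in areas)
    for a in areas:
        a.focus_power /= total
    return areas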
The AF unit 16 can focus the lens unit 11 with emphasis on any portion of the image. In this embodiment, the AF unit 16 receives the focus powers and thereby focuses on the corresponding locations accordingly. In detail, the AF unit 16 may employ various methods, such as a contrast measurement method.
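One common contrast measure that such a contrast measurement method might use is the variance of pixel intensities within an area, which rises as that area comes into focus. The following sketch assumes this particular measure and the function name area_contrast; the embodiment does not fix a specific contrast formula.

# One possible contrast measure for an area of the image: intensity variance.
# This specific formula is an assumption; the embodiment only names
# "a contrast measurement method" without fixing one.
import numpy as np

def area_contrast(gray_image, rect):
    """Return a contrast score for the (x, y, w, h) area of a grayscale image."""
    x, y, w, h = rect
    patch = gray_image[y:y + h, x:x + w].astype(np.float64)
    return float(patch.var())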
The AF unit 16 can receive, from the memory 15, information regarding the selected main focus area and the non-emphasized focus areas, together with the focus powers corresponding to those areas. The AF unit 16 calculates an evaluation value of the current image by summing, over the main focus area and the non-emphasized areas, the base evaluation value of each area multiplied by the focus power corresponding to that area. The base evaluation value of each area can be the contrast, grayscale, or intensity of the area. The AF unit 16 then determines the maximal evaluation value among the images photographed at different positions of the lens unit 11, and the position of the lens unit 11 corresponding to the maximal evaluation value is named the in-focus position (main focus position). The AF unit 16 can control the auto-focus operation of the image capturing device 100 to detect the in-focus position of the lens unit 11 based upon the evaluation value. The method of detecting the in-focus position of the lens unit 11 can be any of various known methods, such as a hill-climbing method. The AF unit 16 controls the driving unit 17 to move the lens unit 11 to the in-focus position corresponding to the maximal evaluation value.
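Building on the sketches above (FocusArea and area_contrast), the weighted evaluation and the search for the in-focus position could be realized as follows. The hardware-facing callbacks move_lens_to and capture_gray_frame are hypothetical stand-ins for the driving unit 17 and the image sensor 12, and the simple hill-climbing loop is only one of the known methods the embodiment allows.

# Sketch of the weighted evaluation and a simple hill-climbing search.
# Uses FocusArea and area_contrast from the sketches above.
def weighted_evaluation(gray_image, focus_areas):
    """Sum of (base evaluation value x focus power) over all stored areas."""
    return sum(area_contrast(gray_image, a.rect) * a.focus_power
               for a in focus_areas)

def find_in_focus_position(focus_areas, positions, move_lens_to, capture_gray_frame):
    """Hill-climbing search over candidate positions of the lens unit 11.

    positions          -- ordered candidate lens positions
    move_lens_to       -- callback driving the lens unit to a position
    capture_gray_frame -- callback returning the current grayscale frame
    """
    best_position, best_value = None, float("-inf")
    for position in positions:
        move_lens_to(position)
        value = weighted_evaluation(capture_gray_frame(), focus_areas)
        if value > best_value:
            best_position, best_value = position, value
        elif best_position is not None:
            # The evaluation value has started to fall: the peak has been
            # passed, so stop climbing (simple hill-climbing termination).
            break
    return best_position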
The driving unit 17 is configured for moving the lens unit 11 back and forth under control of the AF unit 16. The driving unit 17 moves the lens unit 11 during the process of detecting the in-focus position of the lens unit 11, and moves the lens unit 11 to the in-focus position once the in-focus position of the lens unit 11 has been detected.
Referring to
It will be understood that the above particular embodiments and methods are shown and described by way of illustration only. The principles and features of the present invention may be employed in various and numerous embodiments thereof without departing from the scope of the invention as claimed. The above-described embodiments illustrate, but do not restrict, the scope of the invention.
Wang, Wei-Jen, Cheng, Shih-Pao