An optical module includes a first optics group, a second optics group, and an image sensor, wherein the first optics group and second optics group are configured to provide an image having a focus and a magnification to the image sensor. In some embodiments of the present invention, a first optics assembly includes a first optics group coupled to a threaded portion of a first lead screw so that translation of the first lead screw results in translation of the first optics group along an axis of the first lead screw, a first actuator for rotating the first lead screw, and a first sensing target configured to permit detection of rotation of the first lead screw. In some embodiments of the present invention, a second optics assembly includes a second optics group coupled to a threaded portion of a second lead screw so that translation of the second lead screw results in translation of the second optics group along an axis of the second lead screw, a second actuator for rotating the second lead screw, and second means for sensing configured to detect rotation of the second lead screw.

Patent: 7,573,011
Priority: Sep 08 2005
Filed: Aug 03 2007
Issued: Aug 11 2009
Expiry: Sep 01 2026
Entity: Large
Status: EXPIRED
22. An actuator module, comprising:
a) a vibrational actuator of the type that oscillates in a standing wave pattern to drive a threaded shaft placed therein to rotate, thus translating the threaded shaft, having a preferred standing wave pattern;
b) a lead screw comprising a threaded portion, a first end, and a second end, the threaded portion coupled to the vibrational actuator;
c) an actuator housing, with an actuator retention region therein; and
d) a flexible coupling structure, coupled to the vibrational actuator at a node point of the preferred standing wave pattern, and also coupled to the actuator housing.
1. An optical module, comprising:
a) a first optics group coupled to a first lead screw so that translation of the first lead screw results in translation of the first optics group along an axis parallel to the first lead screw;
b) a first actuator for translating the first lead screw;
c) a first sensing target configured to permit detection of translation of the first lead screw;
d) a second optics group coupled to a second lead screw so that translation of the second lead screw results in translation of the second optics group along an axis parallel to the second lead screw;
e) a second actuator for translating the second lead screw;
f) a second sensing target configured to permit detection of translation of the second lead screw; and
g) an image sensor.
21. An auto focus and zoom module, comprising:
a) a housing;
b) an optics assembly, comprising:
i) a lead screw including a threaded portion, a first end, and a second end;
ii) an optics group coupled to the lead screw so that translation of the lead screw results in translation of the optics group along an axis parallel to the lead screw;
iii) a vibrational actuator of the type that oscillates in a standing wave pattern to drive a threaded shaft placed therein to rotate, thus translating the threaded shaft, coupled to the threaded portion of the lead screw and constrained at a node point of its preferred standing wave pattern by a flexible coupling with the housing; and
iv) means for sensing configured to detect rotation of the lead screw; and
c) an image sensor, wherein the optics group is configured to provide an image having a focus and a magnification to the image sensor.
10. An auto-focus and zoom module, comprising:
a) a housing;
b) a first optics assembly, comprising:
i) a first lead screw including a threaded portion, a first end, and a second end;
ii) a first optics group coupled to the first end of the first lead screw so that translation of the first lead screw results in translation of the first optics group along an axis of the first lead screw;
iii) a first actuator module including a first actuator for translating the first lead screw threaded portion; and
iv) first means for sensing configured to detect rotation of the first lead screw;
c) a second optics assembly, comprising:
i) a second lead screw including a threaded portion, a first end, and a second end;
ii) a second optics group coupled to the first end of the second lead screw so that rotation of the second lead screw results in translation of the second optics group along an axis of the second lead screw;
iii) a second actuator module including a second actuator for translating the second lead screw; and
iv) second means for sensing configured to detect translation of the second optics group; and
d) an image sensor.
18. An auto-focus and zoom module, comprising:
a) a first guide pin;
b) a second guide pin;
c) a first optics assembly, comprising:
i) a first lead screw including a threaded portion, a first end, and a second end;
ii) a first optics group coupled to the first guide pin and the second guide pin and including a drive target and a spring interface feature;
iii) a first spring coupled to the spring interface feature and configured to urge the drive target of the first optics group against the first end of the first lead screw;
iv) a first actuator module including a first actuator for translating the first lead screw; and
v) first means for sensing configured to detect rotation of the first lead screw;
d) a second optics assembly, comprising:
i) a second lead screw including a threaded portion, a first end, and a second end;
ii) a second optics group coupled to the first guide pin and the second guide pin and including a drive target and a spring interface feature;
iii) a second spring coupled to the spring interface feature and configured to urge the drive target of the second optics group against the first end of the second lead screw;
iv) a second actuator module including a second actuator for translating the second lead screw; and
v) second means for sensing configured to detect translation of the second optics group; and
e) an image sensor.
2. The module of claim 1, wherein the first optics group and second optics group are configured to provide an image having a focus and a magnification to the image sensor.
3. The module of claim 1, wherein the image sensor is selected from the group consisting of: complementary metal oxide semiconductor (CMOS), and a charge-coupled device (CCD).
4. The module of claim 1 further comprising a prism element coupled to the first optics group, wherein the prism directs to the first optics group an image that is at an angle with respect to a plane of the module.
5. The module of claim 1, wherein the first sensing target is configured to permit measurement of translation of the first optics group along the first lead screw.
6. The module of claim 5, wherein the first sensing target permits measurement of translation of the first optics group over a range of at least 10 millimeters with a resolution of 70 microns or less.
7. The module of claim 1, wherein the second sensing target is configured to permit measurement of translation of the second optics group.
8. The module of claim 7, wherein the second sensing target permits measurement of translation of the second optics group over a range of at least 2 millimeters with a resolution of 10 microns or less.
9. The module of claim 1, wherein both the axis substantially parallel to the first lead screw and the axis substantially parallel to the second lead screw are parallel to a first guide pin.
11. The auto-focus and zoom module of claim 10, wherein the first optics group is coupled to the first end of the first lead screw by a first spring configured to bias the first optics group against the first end of the first lead screw.
12. The auto-focus and zoom module of claim 10, wherein the second optics group is coupled to the first end of the second lead screw by a second spring configured to bias the second optics group against the first end of the second lead screw.
13. The auto-focus and zoom module of claim 10, wherein at least one of the first actuator and the second actuator is a cylindrical vibrational actuator.
14. The auto-focus and zoom module of claim 10, wherein at least one of the first actuator and the second actuator is constrained at a node point by a flexible coupling.
15. The auto-focus and zoom module of claim 10, wherein the first optics group and second optics group are configured to provide an image having a focus and a magnification to the image sensor.
16. The auto-focus and zoom module of claim 10, wherein the image sensor is selected from the group consisting of: complementary metal oxide semiconductor (CMOS) and a charge-coupled device (CCD).
17. The auto-focus and zoom module of claim 10, wherein both the axis substantially parallel to that of the first lead screw and the axis substantially parallel to that of the second lead screw are parallel to a first guide pin.
19. The auto-focus and zoom module of claim 18, wherein the first optics group and second optics group are configured to provide an image having a focus and a magnification to the image sensor.
20. The auto-focus and zoom module of claim 18, wherein the image sensor is selected from the group consisting of: complementary metal oxide semiconductor (CMOS) and a charge-coupled device (CCD).
23. The actuator module of claim 22, wherein the actuator retention region is a five-sided chamber, with an opening on one side, sized to fit a parallelepiped containing the vibrational actuator so that a surface of the vibrational actuator is parallel with the opening.

This patent application claims priority under 35 U.S.C. 119(e) of the co-pending U.S. Provisional Pat. App. No. 60/836,616, filed Aug. 8, 2006, entitled “Miniaturized zoom module with rotational piezo actuator with anti-lock feature, even force distribution, shock damage prevention and a novel position sensing methods”, which is hereby incorporated by reference.

In addition, this patent application is a continuation-in-part of co-pending U.S. patent application Ser. No. 11/514,811, filed on Sep. 1, 2006 and entitled “Auto-focus and zoom module”, which claims priority under 35 U.S.C. 119(e) of the co-pending U.S. Provisional Pat. App. No. 60/715,533, filed Sep. 8, 2005, entitled “3× zoom module”, both of which are also hereby incorporated by reference.

The invention relates to camera optics, including video optics. More particularly, this invention is directed toward an auto-focus and zoom module.

Recently, there have been numerous developments in digital camera technology. One such development is the further miniaturization of optical and mechanical parts to millimeter and sub-millimeter dimensions. The reduction in size of the moving parts of cameras has allowed the implementation of modern digital camera and optical technology into a broader range of devices. These devices are also being designed and constructed into smaller and smaller form factor embodiments. For example, typical personal electronic devices such as cellular phones, personal digital assistants (PDAs), and wrist and/or pocket watches are commercially available that include a miniature digital camera. Moreover, larger form factor devices are also packed with additional features. For example, a typical video camcorder often has an entire digital camera for “still” photography built into the camcorder device along with the mechanisms and circuitry for motion video recording.

Typically, however, modern digital camera implementations suffer from a variety of constraints. Some of these constraints include cost, size, features, and complexity. For instance, with a reduction in size typically comes an increase in cost, a reduction in features and/or an increase in complexity.

An optical module comprises a first optics group, a second optics group, and an image sensor. The first optics group and second optics group are configured to provide an image having a focus and a magnification to the image sensor.

In some embodiments of the present invention, an optics module comprises a first optics group that is coupled to a threaded portion of a first lead screw. Translation of the first lead screw results in translation of the first optics group along an axis of the first lead screw. A first actuator rotates the first lead screw. A first sensing target is configured to permit detection of rotation of the first lead screw. The optical module further comprises a second optics group coupled to a threaded portion of a second lead screw. Translation of the second lead screw results in translation of the second optics group along an axis of the second lead screw. A second actuator rotates the second lead screw. A second means for sensing is configured to detect translation of the second lead screw.

A housing is included in some embodiments to hold the first optics assembly, the second optics assembly, and the image sensor. The first optics group and the second optics group are configured to provide an image having a focus and a magnification to the image sensor. In some embodiments, the first actuator and/or the second actuator are configured within an actuator module. Preferably, the axes substantially parallel to the first lead screw and the second lead screw are both parallel to a first guide pin.

Preferably, an actuator module includes a vibrational actuator of the type that oscillates in a standing wave pattern to drive a threaded shaft placed therein to rotate, thus translating the threaded shaft. The actuator has a preferred standing wave pattern. The module further includes an actuator housing, with an actuator retention region therein, and a flexible coupling structure. The flexible coupling structure is coupled to the vibrational actuator at a node point of the preferred standing wave pattern, and also coupled to the actuator housing. The module is coupled with a lead screw comprising a threaded portion, a first end, and a second end. Specifically, the threaded portion is coupled to the vibrational actuator.

In some embodiments, the actuator retention region is a five-sided chamber, with an opening on one side. The opening is sized to fit a parallelepiped containing the vibrational actuator so that a surface of the vibrational actuator is parallel with the opening.

Preferably, the first sensing target is configured to permit measurement of translation of the first optics group along the first lead screw and the second sensing target is configured to permit measurement of translation of the second optics group along the second lead screw. Most preferably, the first sensing target permits measurement over a range of at least 10 mm with a resolution of 70 microns or less, while the second sensing target permits measurement over a range of at least 2 mm with a resolution of less than 10 microns.
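As a rough point of reference (an interpretation added here, not part of the disclosure), those range and resolution figures set the number of positions each sensing channel must be able to distinguish. A minimal Python sketch of that arithmetic, using the preferred values stated above:

    # Counts implied by the stated ranges and resolutions (illustrative only).
    first_range_um, first_res_um = 10_000, 70   # first target: >= 10 mm span, <= 70 micron steps
    second_range_um, second_res_um = 2_000, 10  # second target: >= 2 mm span, < 10 micron steps

    first_positions = first_range_um / first_res_um     # roughly 143 resolvable positions
    second_positions = second_range_um / second_res_um  # 200 resolvable positions
    print(round(first_positions), round(second_positions))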

The first optics assembly can include a first lead screw which has a threaded portion having a first outer thread diameter, a first end, and a second end. The first optics group is coupled to the first end of the first lead screw so that translation of the first lead screw results in translation of the first optics group along the axis of the first lead screw. A first vibrational actuator, constrained at a node point by a flexible coupling to the housing, translates the first lead screw. The first means for sensing is configured to detect rotation of the first lead screw.

The second optics assembly can include a second lead screw, which has a threaded portion, a first end, and a second end. The second optics group is coupled to the second lead screw so that translation of the second lead screw results in translation of the second optics group along an axis of the second lead screw. A second vibrational actuator translates the second lead screw and is constrained at a node point by a flexible coupling to the housing. The second means for sensing is configured to detect rotation of the second lead screw.

Some embodiments of the present invention relate to an auto focus and zoom module that includes a vibrational actuator of the type that oscillates in a standing wave pattern to drive a threaded shaft placed therein to rotate, thus translating the threaded shaft. The vibrational actuator is coupled to a threaded portion of a lead screw and is constrained at a node point of its preferred standing wave pattern by a flexible coupling with the housing. The vibrational actuator and the lead screw are configured as part of an optics assembly that also includes an optics group coupled to the lead screw. The coupling between the lead screw and the optics group means that translation of the lead screw results in translation of the optics group. The assembly further includes a means for sensing configured to detect rotation of the lead screw. In addition, the module includes an image sensor. The optics group is configured to provide an image having a focus and a magnification to the image sensor. The lead screw of the assembly has a first end and a second end in addition to the threaded portion.

In some embodiments of the present invention an auto-focus and zoom module comprises a first guide pin, a second guide pin, a first optics assembly, a second optics assembly, and an image sensor, wherein the first optics group and second optics group are configured to provide an image having a focus and a magnification to the image sensor.

In another aspect, a method implements a hard stop on a body driven along an axis by threads disposed orthogonal to that axis. The method comprises coupling a feature to the body to form an assembly having a non-symmetric region relative to the axis. The body is driven by the threads using an actuator. A movable element is coupled to a point fixed relative to the actuator. The movable element has a latch feature configured to mate with the non-symmetric region. The movable element is disposed at a position such that the latch feature mates with the non-symmetric region and prevents rotation of the body.

In some embodiments, the step of disposing the movable element is performed in part by moving the movable element by driving the body against it. In some embodiments, the feature coupled to the body is a cam.

The novel features of the invention are set forth in the appended claims. However, for purpose of explanation, several embodiments of the invention are set forth in the following figures.

FIG. 1 is an isometric view of an auto-focus and zoom module in accordance with some embodiments of the invention.

FIG. 2 is an isometric view of an auto-focus and zoom module in accordance with some embodiments of the invention.

FIG. 3 is an isometric view of internal parts of an auto-focus and zoom module in an end stop position in accordance with some embodiments of the invention.

FIG. 4 is an alternative isometric view of internal parts of an auto-focus and zoom module in an end stop position in accordance with some embodiments of the invention.

FIG. 5A is an isometric view of internal parts of an auto-focus and zoom module in a mid position in accordance with some embodiments of the invention.

FIG. 5B is a plan view of internal spring elements of an auto-focus and zoom module in a mid position in accordance with some embodiments of the invention.

FIG. 6A is an alternative plan view of internal spring elements of an auto-focus and zoom module in a mid position in accordance with some embodiments of the invention.

FIG. 6B is an isometric view of internal spring elements of an auto-focus and zoom module in a mid position in accordance with some embodiments of the invention.

FIG. 7A is a plan view of internal spring elements of an auto-focus and zoom module in tele position in accordance with some embodiments of the invention.

FIG. 7B is an isometric view of internal spring elements of an auto-focus and zoom module in tele position in accordance with some embodiments of the invention.

FIG. 8A illustrates an auto-focus and zoom module in an end stop position in accordance with some embodiments of the invention.

FIG. 8B illustrates an auto-focus and zoom module in a mid position in accordance with some embodiments of the invention.

FIG. 8C illustrates an auto-focus and zoom module in a tele position in accordance with some embodiments of the invention.

FIG. 9 is a plan view along the optical axis of an auto-focus and zoom module in accordance with some embodiments of the invention.

FIG. 10A is a plan view of an actuator assembly in accordance with some embodiments of the invention.

FIG. 10B is an isometric view of an actuator assembly in accordance with some embodiments of the invention.

FIG. 10C is an isometric view of an actuator assembly in accordance with some embodiments of the invention.

FIG. 11A is a schematic representation of a rotation sensor in accordance with some embodiments of the invention.

FIG. 11B is a schematic representation of beam spreading that occurs during sensing in accordance with some embodiments of the invention.

FIG. 11C is a schematic representation of beam spreading that occurs during sensing in accordance with some embodiments of the invention.

FIG. 12A is a schematic illustration of a direct imaging approach for reflecting radiation from a radiation emitter to a detector in accordance with some embodiments of the invention.

FIG. 12B is a schematic illustration of a lens-based imaging solution for collimating radiation from a detector in accordance with some embodiments of the invention.

FIG. 12C is a schematic illustration of a pinhole-based imaging solution for preventing ‘bleed over’ in accordance with some embodiments of the invention.

FIG. 13A is an exploded isometric view of an assembly for position sensing in accordance with some embodiments of the invention.

FIG. 13B is an exploded isometric view of an assembly for position sensing in accordance with some embodiments of the invention.

FIG. 14A is a detailed view of a position-sensing portion of an optical module in accordance with some embodiments of the invention.

FIG. 14B is a detailed view of a position-sensing portion of an optical module in accordance with some embodiments of the invention.

FIG. 15 is a detailed schematic of an active area of interface between a sensing target and a sensor consistent with some embodiments of the invention.

FIG. 16 is a schematic representation of a signal produced from a sensing target consistent with some embodiments of the invention.

FIG. 17 is a flowchart illustrating a method of sensing a position consistent with some embodiments of the present invention.

In the following description, numerous details and alternatives are set forth for purpose of explanation. However, one of ordinary skill in the art will realize that the invention can be practiced without the use of these specific details. In other instances, well-known structures and devices are shown in block diagram form in order not to obscure the description of the invention with unnecessary detail.

FIGS. 1 and 2 illustrate an auto-focus and zoom module 1000 in accordance with some embodiments of the invention. The module 1000 is shown with the exterior electromagnetic interference (EMI) shield removed.

As shown, the module is built over an image sensor board 10. The module 1000 comprises a stiffener 1 disposed on a first side of the image sensor board 10, and a main structure 20 disposed opposite the stiffener 1. Preferably, the stiffener 1 and the main body 20 are coupled with one another and with the image sensor board 10.

The main structure 20 comprises a base guide portion 22. The base guide portion 22 includes features configured to retain the guide pins 601 and 602. The end guide plate 2 is disposed opposite the base guide portion 22. The holes 2a and 2b interface with and retain the guide pins 601 and 602, respectively. The base guide portion 22 further includes a void region (not shown) configured to permit passage of radiation, e.g. visible light, through the lens structure of the module (discussed below) to the image sensor (discussed below) of the image sensor board 10. In addition, the base guide portion 22 includes a pass-thru 25 configured to permit the image sensor board extension 11 to pass through.

Disposed between the base guide portion 22 and the end guide plate 2 are the remainder of the main body 20 and other components of the module 1000. The main body 20 further includes the upper structure 24 and the lower structure 26. Both the lower 26 and upper 24 structures include specialized features configured to mate with or allow pass-through of working components of the module 1000. Thus, the main body 20 provides both a structural framework and functional support to the workings of the module 1000.

For example, the lower structure 26 includes the pivot boss 32, configured to act as a fulcrum for the low-variation preload lever assembly (discussed below). In addition, the lower structure 26 includes the pass through 27, configured to permit movement of the lever assembly through a desired range. Similarly, the upper portion 24 includes a pass through configured to permit coupling between a main PCB and the sensor 901 discussed below.

A variety of components of the module 1000 are coupled to the main body 20. Some of these components are immobilized relative to the main body 20. In addition to the guide pins 601 and 602, the actuator housings 1020 and 1030 are coupled to the main body 20 in an immobile position. Thus, the actuator housings 1020 and 1030 are in a fixed position relative to the guide pins 601 and 602.

FIGS. 3 and 4 show the internal components of the module 1000. The main body 20 (shown in FIGS. 1 and 2) is not shown. As shown in these figures, the module 1000 includes a front optics group 400, a rear optics group 500, and an image sensor 14. The front optics group 400 and rear optics group 500 typically comprise one or more optical elements such as a lens group. One of ordinary skill will recognize both complex and simple lens arrangements for the optics groups 400 and 500.

Optionally attached to the main body 20 and the end guide plate 2 is a prism 40 (shown in FIG. 2). The module 1000 preferably further includes a casing and a cover mechanism, as well as an EMI shield as mentioned above. The cover mechanism preferably prevents light leakage and dust contamination from affecting the internal components of the module, particularly the lens groups 400 and 500 and the image sensor 14. In some embodiments, a single external housing functions as both an EMI shield and a cover mechanism. An infrared (IR) filter and/or a low pass filter is optionally attached to the image sensor board 10.

FIGS. 3 and 4 illustrate further details of the module 1000. As mentioned above, the actuator housings 1020 and 1030 are coupled to the main body 20. This coupling, along with the coupling between the main body 20, the end guide plate 2, and the guide pins 601 and 602, positions and secures the components relative to one another, and to the target region 12 of the image sensor 14, providing a chassis for an auto-focus and zoom module capable of providing an image with a focus and a magnification to the target region 12.

Image Sensor

As shown in the figures, the image sensor 14 is substantially planar. The plane of the image sensor is perpendicular to the axes of the guide pins 601 and 602. Typically, the module 1000 is configured to provide an image to the image sensor 14 along an image vector parallel to these axes.

Guide Pins

FIGS. 3 and 4 illustrate a guide pin arrangement for an auto-focus and zoom module in accordance with the present invention. Some embodiments include a pair of guide pins, while other embodiments employ a different number of guide pins. Regardless of their number, the guide pins 601 and 602 are typically mounted along a linear axis of the module 1000 to permit the rear barrel 530 and the front barrel 430 to move relative to the image sensor 14. In the module 1000, the primary guide pin 601 and the secondary guide pin 602 are aligned so that their axes are substantially parallel to each other. Further, the lead screw assemblies 200 and 300 are also aligned so that their axes are substantially parallel to each other and to the guide pins 601 and 602.

Typically, the guide pins 601 and 602 are coupled to the main body 20 and the end guide plate 2 as outlined above. Preferably, the guide pins are coupled on opposite sides of the image vector of the image sensor 14. However, one skilled in the art will recognize that other configurations are possible. The lead screws 200 and 300 are typically disposed parallel to one another, along an edge of the image sensor 14 and parallel to its optical axis.

In some embodiments, the range of motion provided to the rear barrel 530 by the guide pins 601 and 602 is approximately 7 millimeters. In some embodiments, the range of motion provided to the front barrel 430 by the guide pins 601 and 602 is approximately 2 millimeters. Due to this range of motion, however, the guide pins 601 and 602 of some embodiments often affect the form factor of the module 1000. Hence, some embodiments further include means for modifying and/or concealing the form factor of the module 1000.

Prism Feature

For instance, some embodiments additionally include a prism feature, e.g. 40 of FIG. 2. This feature allows the auto-focus and zoom module to be disposed and/or mounted in a variety of orientations. For instance, the dimension available to a particular implementation along the initial direction of an image vector is often limited such that the module is preferably disposed lengthwise in the vertical plane of an enclosure. This orientation allows the range of motion of the front and rear barrels along the guide pins, as described above, to be implemented in a device having a small width and/or depth form factor. For example, in a mobile phone implementation where a user will want to aim a camera at a desired image using the display as a viewfinder, the image vector is advantageously perpendicular to the display for usability purposes. However, the dimension of the device perpendicular to the display is often the thinnest dimension of a mobile phone.

Referring to FIGS. 2 and 3, the prism feature 40 of some embodiments is mounted adjacent to the front barrel 430. The prism 40 redirects the light from an image at an angle with respect to the front barrel 430. As described above, the front barrel 430 typically houses a front lens group. The front lens group contains one or more front optical elements. Hence, the prism 40 allows the module 1000 to be disposed in a variety of orientations within a device that is typically held at an angle with respect to the subject being viewed and/or photographed.

Lens System

As shown in FIGS. 3, 4 and 5A, the rear optics group 500 and front optics group 400 have specialized constructions. The rear optics group 500 further includes the rear barrel 530, the rear guide sleeve 510, and the rear guide slot 520. The rear barrel typically houses one or more lenses or other optical elements. As illustrated, the rear barrel 530 houses the rear lens 540. The rear barrel 530 is a substantially cylindrical body with a central axis. The rear lens 540 is configured to direct light along the central axis of the rear barrel 530. The rear guide sleeve 510 is an elongated, substantially cylindrical body coupled to the rear barrel 530 so that the central axis of the rear barrel 530 and an axis of the rear guide sleeve 510 are substantially parallel. The rear guide slot 520 is a slotted feature configured to interface with a cylinder.

The front optics group 400 further includes the front barrel 430, the front guide sleeve 410, and the front guide slot 420. The front barrel typically houses the front lens group 440. The front barrel 430 is a substantially cylindrical body with a central axis. The front lens group 440 is configured to direct light along the central axis of the front barrel 430. The front guide sleeve 410 (FIG. 6B) is an elongated, substantially cylindrical body coupled to the front barrel 430 so that the central axis of the front barrel 430 and an axis of the front guide sleeve 410 are substantially parallel. The front guide slot 420 is a slotted feature configured to interface with a cylinder.

Lens-Guide Pin Interface

Referring now to FIG. 6B, the front optics group 400 includes the front guide sleeve 410, which couples with the primary guide pin 601. As illustrated, the front guide sleeve 410 is substantially elongated relative to the front barrel 430. Further, the front guide sleeve 410 is rigidly connected to the front barrel 430. This configuration prevents the front optics group 400 from rotating around an axis perpendicular to the axis of the primary guide pin 601, but permits rotation around the axis of the primary guide pin 601. The rear optics group 500 includes the rear guide sleeve 510, which also couples with the primary guide pin 601. As illustrated, the rear guide sleeve 510 is substantially elongated relative to the rear barrel 530. Further, the rear guide sleeve 510 is preferably rigidly connected to the rear barrel 530. This configuration prevents the rear optics group 500 from rotating around an axis perpendicular to the primary guide pin 601, but permits rotation around the axis of the guide pin.

Referring now to FIG. 4, the front optics group 400 also includes the front guide slot 420, configured to couple with the secondary guide pin 602. The coupling between the guide slot 420 and the secondary guide pin 602 prevents the front optics group 400 from rotating around the axis of the primary guide pin 601. The coupling between the front optics group 400 and guide pins 601 and 602 permits the front optics group 400 to translate along an axis substantially parallel to the two guide pins.

The rear optics group 500 also includes the rear guide slot 520, configured to couple with the secondary guide pin 602. The coupling between the guide slot 520 and the secondary guide pin 602 prevents the rear optics group 500 from rotating around the axis of the primary guide pin 601. The coupling between the rear optics group 500 and guide pins 601 and 602 permits the rear optics group 500 to translate along an axis substantially parallel to the two guide pins.

Actuator Modules

Preferably, the actuators used within embodiments of the present invention are vibrational actuators. Most preferably, these vibrational actuators are of the type that oscillates in a standing wave pattern to drive a threaded shaft placed therein to rotate, thus translating the threaded shaft. Embodiments of the present invention include certain preferred standing wave patterns for driving the vibrational actuators. However, a variety of standing wave patterns are contemplated.

The present invention contemplates a variety of actuator constructions. These include vibrational actuators as disclosed in U.S. Pat. No. 5,966,248 issued Oct. 12, 1999 and U.S. Pat. No. 6,940,209 issued Sep. 6, 2005. These also include actuators as shown for example in FIGS. 10A to 10C. The actuator 700′ comprises a flexible body surrounded by a plurality of piezoelectric strips, 701, 702, and 704. A fourth strip, not shown, is disposed opposite the strip 701. The strips are arranged symmetrically around a flexible body that has a plurality of thread-interface features disposed therein. The thread interface features are configured to mate with the threads of the lead screw 360′. During operation, the piezoelectric strips drive an oscillating motion within the flexible body. Actuators of this type typically require an operating preload. Preferably, this preload is applied to the lead screw via techniques disclosed elsewhere in this document.

In order to effectively drive a threaded shaft by using the preferred vibrational actuators, some embodiments of the present invention include specialized actuator housings, designed to constrain the actuator to only the degree necessary and also to provide shock protection for the actuator. In addition, the actuator housings permit close positioning of the actuator relative to the guide pin and the optics group. Typically, each actuator within the embodiment is combined with an actuator housing to form an actuator module.

Some embodiments of the present invention include actuator modules such as those illustrated in FIGS. 10A to 10C. A typical actuator module, as illustrated, includes an actuator 700′, an actuator housing 1030′, and a flexible coupling 710.

The flexible coupling 710 constrains a portion of the actuator 700′ to a substantially fixed position relative to the actuator housing 1030′. This permits the actuator to drive a lead screw to translate relative to the actuator housing 1030′. For example, the contact pads 710 prevent the actuator 700′ from rotating relative to the housing.

However, by constraining only a portion of the actuator 700′, the embodiment permits relatively free vibration of the actuator 700′ to impart movement to a lead screw, e.g. 360′. Further, because the flexible coupling 710 preferably constrains the actuator 700′ at a node point of the preferred standing wave pattern of the actuator 700′, the effect of the constraint on the efficiency of the actuator is reduced. Preferably, the fixed location is chosen to be a node point of a variety of standing wave patterns, thus permitting efficient operation of the actuator under a variety of conditions.

As illustrated, the actuator housing 1030′ includes openings 1034 and 1036 to admit the lead screw 360′. In addition, the housing includes openings 1032 and 1038 configured to admit electrical connections to a main PCB board (not shown). In addition, the actuator housing 1030′ is specialized to prevent shock damage to the actuator 700′. The actuator housing 1030′ is preferably a five-sided chamber that forms a parallelepiped therein. This parallelepiped, called the actuator retention region, is larger in volume than the actuator 700′. Further, the actuator retention region is larger along every dimension than the corresponding dimension of the actuator 700′. In addition, when the actuator 700′ is constrained within the actuator retention region by the flexible coupling 710, preferably a surface of the actuator 700′ is parallel with the surface of the parallelepiped that does not include a portion of the actuator housing. Further, the ends of the actuator 700′ are preferably approximately equidistant from the openings 1034 and 1036, respectively. Thus, the actuator 700′ is suspended within the retention region with a buffer distance between it and each adjacent surface of the actuator housing 1030′.

Further, the size of the parallelepiped actuator retention region and the actuator 700′ are matched to one another, and to the type of flexible coupling 710 used to retain the actuator. Preferably, the buffer distances between the actuator 700′ and the inner surfaces of the housing 1030′ adjacent to the openings 1034 and 1036 are chosen relative to the maximum displacement permitted prior to failure by the flexible coupling 710. Thus, during a mechanical shock, the actuator 700′ will encounter an inner surface of the housing 1030′ prior to stretching the flexible coupling 710 to failure. In addition, similar stretching along axes perpendicular to the lead screw 360′ is prevented by the coupling between the lead screw 360′ and the actuator 700′.
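The relationship described above amounts to a simple design rule: the clearance between the actuator and the housing walls must be smaller than the displacement at which the flexible coupling fails. A minimal Python sketch of that check, with purely hypothetical values (the disclosure gives the rule, not numbers):

    # Shock-protection rule of thumb; both values below are assumed for illustration.
    coupling_failure_displacement_um = 300.0  # max stretch the flexible coupling survives
    housing_clearance_um = 200.0              # actuator-to-wall gap along the lead screw axis

    # Under shock, the actuator should hit the housing wall before the coupling
    # is stretched to its failure displacement.
    assert housing_clearance_um < coupling_failure_displacement_um, \
        "clearance must be smaller than the coupling's failure displacement"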

The actuator housings 1030 and 1020 permit close positioning of the actuators 700 and 500 relative to the primary guide pin 601. As shown in FIG. 9, this close positioning is permitted because the open ends of the actuator housings 1020 and 1030 allow the actuators 500 and 700 to be disposed at a surface of the actuator module. Thus, the actuators 500 and 700 are placed proximate to the primary guide pin 601, leaving clearance for the guide sleeves 410 and 510.

Close positioning increases precision by minimizing torque effects as the actuators 200 and 300 drive the optics groups 400 and 500, respectively. The center of mass of the optics groups 400 and 500 lies between the guide pins 601 and 602. The lead screw coupling surfaces 480 and 570 lie off center. Thus, driving the optics groups 400 and 500 by the coupling surfaces 480 and 570 tends to introduce a torque. The guide pins, including the primary guide pin 601, counteract the torque effect. However, configuring the modules so that the actuators 500 and 700, and the coupling surfaces, are nearly aligned with the guide pin 601 reduces the amount of torque on the guide pins.

Lead Screw Assemblies

Referring now to FIGS. 10A to 10C, the exemplary lead screw assembly 300′ is shown coupled with the actuator housing 1030′. The lead screw assembly 300′ is structured around the lead screw 360′. The assembly includes the cam 320′ and the referencing cap 340. The lead screw 360′ comprises a threaded region 5, a first end, and a second end. The first end of the lead screw 360′ and the referencing cap 340 are integrally formed.

Lead Screw-Optics Group Interface

Referring now to FIG. 8A, the front optics group 400 and rear optics group 500 couple with the lead screws through the lead screw coupling surfaces 480 and 570, respectively. Both primary guide sleeves 410 and 510 couple with the primary guide pin 601.

In the preferred configuration, movement of a lead screw transmits force through its counterpart lead screw coupling surface. Since the coupling surfaces are each a rigidly coupled component of an optics group, translation of a coupling surface results in translation of its counterpart optics group. A simple rigid connection between a coupling surface and a lead screw could accomplish this function. However, the illustrated configuration provides additional benefits by isolating the optics group from non-translational movements of the lead screw. Preferably, a reference cap coupled to the first end of a lead screw contacts the coupling surface; see, for example, the reference cap 340 of FIG. 10C.

The small contact area between the reference cap and the coupling surface serves to minimize friction, permitting movement of the coupling surface relative to the reference cap and the lead screw in the axes orthogonal to the axis of the lead screw. This configuration isolates most mechanical vibration or disturbance of the lead screw from the optics group. Further, the isolation means that only the translational degree of freedom of the lead screw need be controlled to achieve a required precision for positioning of the optics group. Though non-translational movement of the lead screw is not present in the preferred embodiment, these features permit embodiments of the present invention to deal with this type of wobble when present.

To maintain coupling between a coupling surface and lead screw, some embodiments of the present invention rely on preload springs otherwise required for accurate operation of the actuators.

Preload Springs

In addition to the features mentioned above, the actuators used within embodiments of the present invention typically require a low-variation preload force. This preload is provided by a spring with a low force constant. In small displacement implementations this method works well.

Some embodiments of the present invention rely on spring forces acting on the optics groups to provide preload to the lead screws used to drive the groups. Thus, to an extent, the required displacement of the optics group determines the type of spring force transmission mechanism required.

For example, in some embodiments of the present invention the front optics group 400 is used for focusing and zoom operations and need only be displaced a millimeter or two. Because the preferred range of motion of the front optics group 400 is less than two millimeters, choosing a spring with a low force constant and coupling it to exert spring force directly on the optics group results in a relatively low-variation preload.

As illustrated in FIG. 8B, the front lead screw coupling surface 480 is adjacent to the first end of the lead screw 260. To couple the surface with the lead screw and provide preload, the preload spring must urge the surface against the lead screw. Because of the small movements involved in focusing, directly providing the spring force is permissible in this case. Thus, the front preload spring 180 is coupled to the front optics group 400 via the preload interface feature 470 (FIG. 6B) and configured to directly exert force on the optics group 400.

In another example, the rear optics group 500 is used for zoom operations and need be displaced several millimeters or more. Because the preferred range of motion of the rear optics group 500 is more than four millimeters, choosing a spring with a low force constant and coupling it to exert spring force directly on the optics group results in too high a variation in preload.

As illustrated in FIGS. 6A and 6B, the rear lead screw coupling surface 570 is adjacent to the lead screw 360. To couple the surface with the lead screw and provide preload, the preload spring must urge the surface against the lead screw. However, direct provision of the preload is undesirable in this case.

Thus, the preload spring 110 is configured on the opposite end of a preload lever 100. The zoom preload lever 100 includes a pivot hole 140 configured to mate with the pivot boss 32 of the main body 20. In addition, the preload lever 100 includes a preload spring hook 130 and a preload force transfer point 120.

The pivot hole 140 is skewed toward the preload spring hook 130 so that movement at the hook end of the preload lever 100 is amplified at the force transfer point end. By the same mechanism, large movements at the force transfer point 120 end of the zoom lever 100 translate into relatively smaller movements at the spring hook 130 end. Preferably, the location of the pivot hole 140 is chosen to decrease the travel from the force transfer end to the spring hook end, in this example by a factor of five. Other embodiments employ a different factor.

The spring hook 130 is coupled with the preload spring 110, and the force transfer point 120 is coupled with one face of the rear lead screw coupling surface 570. The coupling surface 570 is also adjacent to the lead screw 360. To couple the surface with the lead screw and provide preload, the preload spring must urge the surface against the lead screw. Indirectly providing the spring force from the rear preload spring 110 through the lever 100 means that travel of the rear optics group 500 translates indirectly into extension of the preload spring 110. The specific proportionality of group travel to spring extension depends on the positioning of the lever pivot relative to the force transfer point and spring hook. As described above, the preferred ratio is one-fifth.
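To make the benefit of the lever concrete, the sketch below compares the change in spring force over the full zoom travel for a directly coupled spring and for one driven through the lever. Only the one-fifth ratio comes from the description above; the spring constant is assumed, and the travel figure is taken from the guide-pin discussion earlier:

    # Why the lever reduces preload variation (spring constant assumed; ratio is 1:5 as described).
    k_spring_n_per_mm = 0.05   # low-force-constant preload spring (assumed value)
    group_travel_mm = 7.0      # approximate zoom travel of the rear group (from the guide-pin section)
    lever_ratio = 1.0 / 5.0    # spring-hook travel per unit of force-transfer-point travel

    direct_force_swing_n = k_spring_n_per_mm * group_travel_mm                 # spring stretched by the full travel
    levered_force_swing_n = k_spring_n_per_mm * group_travel_mm * lever_ratio  # spring stretched by one-fifth of the travel
    print(direct_force_swing_n, levered_force_swing_n)  # 0.35 N vs 0.07 N of spring-force variation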

In either case, indirect or direct preload spring force application, the opposite end of the preload spring is preferably coupled to the main body 20.

Sensing Target

Some embodiments of the present invention include sensing targets to provide feedback on positioning. In some embodiments, a sensing target is disposed on a lead screw. In some embodiments, a sensing target is disposed on an optics group. Both linear and rotational targets can be used with the present invention.

A lead screw assembly in accordance with some embodiments of the present invention includes a sensing target. Some lead screw assemblies, such as the assembly 300′ of FIGS. 10A to 10C, do not include a sensing target. However, the lead screw assembly 200, shown for example in FIG. 8C, includes the sensing target 290 positioned adjacent to the cam 220. In the illustrated embodiment, the target 290 is a rotational target. The use of a rotational target is preferred in contexts that require very fine positioning.

Typically, a sensing target adapted for coupling to a lead screw includes a feature that interfaces with a registering feature of the lead screw. In some embodiments the sensing target interfaces with the threads of a lead screw. The position sensing target 290 is configured to engage with the position sensor 902.

In some embodiments, a sensing target is included as part of an optics group. For example, in FIGS. 8A to 8C, the sensing target 590 is configured as part of the rear optics group 500. Here, the target 590 is constructed as an integral part of the optics group 500. However, in some embodiments, a sensing target is modular, or merely coupled with an optics group.

In addition, the sensing target 590 is a linear sensing target. Linear targets are acceptable in relatively low precision positioning applications. Further, linear targets are preferred in applications where the target need move over a relatively large range. Here, the linear target is employed in the rear optics group 500 because the group is used for zoom purposes.

In FIG. 8A the module is in an end stop position. In some embodiments, the position sensors 901 and 902 are disengaged from the sensing targets 590 and 290, respectively, during end stop. In this position, also illustrated in FIGS. 3 and 4, the lead screw positions are registered at a mechanical hard stop via means discussed elsewhere in this document. Thus, because in these embodiments the lead screw positions correlate with the optical group positions, the registering of the lead screws defines the position of the optical groups as well.
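The hard-stop referencing described above is essentially a homing step: drive the lead screw until the latch and cam hard stop is reached, then treat that position as zero for subsequent moves. The following is a toy Python sketch of that logic, not firmware from the disclosure; the simulated axis and its step count are invented for illustration:

    class SimulatedAxis:
        """Toy stand-in for a lead screw that starts 40 steps away from its hard stop."""
        def __init__(self):
            self.steps_to_stop = 40

        def step_toward_stop(self):
            if self.steps_to_stop > 0:
                self.steps_to_stop -= 1

        def at_hard_stop(self):
            return self.steps_to_stop == 0

    def home(axis):
        # Drive until the mechanical hard stop is reached, then use it as the zero reference.
        while not axis.at_hard_stop():
            axis.step_toward_stop()
        return 0

    print(home(SimulatedAxis()))  # 0 -- the registered position from which the optics group is tracked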

In FIGS. 8B and 8C, the modules are in mid position and tele position, respectively. Preferably, the sensing targets 590 and 290 are engaged with the position sensors 901 and 902, respectively, while in mid and tele position. Preferably, the position sensors and sensing targets are engaged throughout all zoom positioning.

Mechanical Hard Stop Latch

Preferably, embodiments of the present invention include features configured to permit referencing of the optics group via a mechanical hard stop.

Referring now to FIGS. 3 and 4, these embodiments include the hard stop latch spring 310 and the hard stop latch spring 210. The hard stop latch spring 310 is mounted to the main body 20 on the spring boss 21. As shown in FIGS. 3 and 4, the hard stop latch spring 310 comprises a substantially rigid body and an active spring 312. The rigid body includes the lens group interface surface 314, the pivot hole 318, and the latch 316. The lens group interface surface 314 and the latch 316 are each arranged on separate arms positioned approximately 90 degrees apart around the pivot hole 318, and extending outward therefrom. The latch 316 arm is substantially longer than the group interface surface 314 arm. At rest the active spring 312 is aligned with the latch 316 arm.

The pivot hole 318 is mated with the spring boss 21 and configured to pivot around the boss 21. The group interface surface 314 is configured to mate with the spring driver 580 of the rear lens group 500. At rest, the latch 316 is disposed out of line with the actuator housing 1030, e.g. FIG. 5A. The hard stop latch spring 310 pivots around the hole 318 when the spring driver 580 urges the group interface surface 314 toward the image sensor, flexing the active spring 312. When pivoted, the latch 316 moves into place to interface with the cam feature 322 of the cam 320. This provides a mechanical hard stop for the lead screw 360.

The hard stop latch spring 210 is mounted to the actuator housing 1020 on the spring boss 1028, as shown in FIG. 5B. The hard stop latch spring 210 comprises a substantially rigid body and an active spring 212. The rigid body includes the lens group interface surface 214, the pivot hole 218, and the latch 216, e.g. FIG. 4. The lens group interface surface 214 and the latch 216 are each arranged on separate arms positioned approximately 90 degrees apart around the pivot hole 218, and extending outward therefrom. The latch 216 arm is substantially longer than the group interface surface 214 arm. At rest the active spring 212 is aligned with the latch 216 arm.

The pivot hole 218 is mated with the spring boss 1028 and configured to pivot around the boss 1028. The group interface surface 214 is configured to mate with the spring driver 480 of the front lens group 400. At rest, the latch 216 is disposed out of line with the actuator housing 1020, e.g. FIG. 5A. The hard stop latch spring 210 pivots around the boss 1028 when the spring driver 480 urges the group interface surface 214 toward the image sensor, flexing the active spring 212. When pivoted, the latch 216 moves into place to interface with the cam feature 222 of the cam 220. This provides a mechanical hard stop for the lead screw 260, e.g. as shown in FIG. 8A.

Position Sensing

Embodiments of the present invention include position-sensing elements configured to provide feedback to an actuator control system. These elements permit the module to accurately position functional groups, e.g. optics, by using non-linear actuator motors.

Preferred embodiments of the present invention employ a sensing target that moves in concert with a functional group of the module, and a sensor configured to detect and encode data representing movement of the sensing target. For example, some embodiments use reflection encoding of a mobile sensing target that comprises regions of differing reflectance. An exemplary position sensing system comprises the position sensors 1030 and the position sensing targets 250 and 350 of the module 1000 of FIG. 1.
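As an illustration of how such feedback can be used (a generic sketch, not the control scheme of the disclosure), the loop below pulses an actuator until the encoded count from the sensing target reaches a commanded value. All names and the per-pulse travel are hypothetical:

    # Generic closed-loop positioning sketch for a non-linear actuator.
    def move_to(target_count, read_count, pulse_actuator, max_pulses=10_000):
        """Pulse the actuator until the sensed count reaches target_count (or give up)."""
        pulses = 0
        while read_count() < target_count and pulses < max_pulses:
            pulse_actuator()  # one burst of the vibrational actuator
            pulses += 1
        return read_count()

    # Toy usage: a simulated sensor/actuator pair advancing 3 counts per pulse.
    state = {"count": 0}
    final = move_to(150,
                    read_count=lambda: state["count"],
                    pulse_actuator=lambda: state.update(count=state["count"] + 3))
    print(final)  # stops at or just past 150 counts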

Reflection Encoding

In the exemplary reflection encoding system, a sensor includes an element that emits radiation and an element that detects radiation. A target includes dark and light bands, for example. The dark bands tend to absorb a greater proportion of the emitted radiation than do the light bands. The radiation reflected by the bands is detected by the sensor. As the target moves relative to the sensor, the absorption and reflectance of the sensing target portion aligned with the sensor varies. The sensor encodes this variation. A variety of encoding algorithms and processes are consistent with the present invention. For example, a sensor could simply detect each transition between a dark and light band.
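A minimal Python sketch of the transition-counting approach just mentioned; the threshold and the sample trace are invented for illustration:

    # Count dark/light transitions in a sampled reflectance trace (illustrative only).
    def count_transitions(samples, threshold=0.5):
        """Return the number of dark<->light transitions in a reflectance trace."""
        states = [s > threshold for s in samples]
        return sum(1 for a, b in zip(states, states[1:]) if a != b)

    # Toy trace sweeping across part of a striped target.
    trace = [0.1, 0.1, 0.9, 0.9, 0.1, 0.2, 0.8, 0.9]
    print(count_transitions(trace))  # 3 transitions; with a known stripe width this maps to distance moved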

System Resolution

The resolution of a reflection encoding system is determined by several factors. The distance between the emitter/detector and the target, the beam spread of the radiation used, and the native target resolution all play important roles in determining a system's resolution. These three factors do not act separately; rather, they interact, and each must be tuned relative to the others.

Native target resolution is essentially a function of feature size. The smaller the critical dimension—the dimension parallel to sensor movement—of a target's features, the greater its native target resolution. For example, the target 250 of FIG. 1 uses stripe pairs as features. The sensing system is configured to move stripes along their narrow dimension across a sensor's field of view. Thus, a critical dimension of a stripe pair in the illustrated configuration is its width along the narrow dimension.

However, a position sensing system does not guarantee high resolution simply by using a high native target resolution. A suitable combination of low beam spread radiation and tight emitter-target tolerances is required to achieve a maximal resolution permitted by a given feature size. The beam spread and tolerance specifications are complementary: a decrease in beam spread combined with an appropriate increase in tolerance can maintain a given resolution, and vice versa.

For a given feature size, there is a maximum radiation beam spread above which the features are not resolvable via reflection encoding. FIG. 11B illustrates the maximum beam spread for a series of light sources (white squares on left hand side) emitting light towards a series of absorptive and reflective bands (right hand side). The detail shown in FIG. 11C illustrates a 20-micron wide light source paired with a target having similarly-sized features. In this case, the maximum tolerable spread is 10 microns.

Under set diffusion conditions, the maximum tolerable spread and desired resolution determine a maximum spacing between a radiation source and the target. This spacing, distance d in FIG. 11C, is proportional to the required resolution, and inversely proportional to the tangent of an angle representing the diffusion of the radiation. For example, given a typical LED diffusion angle of 30 degrees, to achieve 10 micron resolution the distance d should be less than 56.7 microns. Thus, to achieve the native target resolution, a suitable combination of beam spread and spacing should be employed.
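For completeness, the sketch below evaluates the spacing relation just described. The specific geometric factor (spot growth equal to spacing times the tangent of an effective diffusion angle) is an assumption about how the quoted LED spread maps onto the reflected spot, so the numbers are illustrative rather than definitive:

    import math

    def max_spacing_um(tolerable_spread_um, effective_diffusion_angle_deg):
        # Spacing at which the beam spread just reaches the tolerable value,
        # assuming the spot grows as spacing * tan(effective angle).
        return tolerable_spread_um / math.tan(math.radians(effective_diffusion_angle_deg))

    # A 10 micron tolerable spread with a 10-degree effective angle reproduces roughly
    # the ~57 micron spacing cited above; a larger effective angle shrinks it further.
    print(round(max_spacing_um(10, 10), 1))  # ~56.7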

Native Target Resolution

Some embodiments of the present invention employ position sensing systems with beam spread and tolerance optimized to operate at native target resolution. In reflection encoding, a variety of methods, strategies, and devices are available to achieve this goal.

FIG. 12A illustrates a direct imaging approach where a radiation emitter (white rectangle), e.g. an LED, produces radiation which is supplied to the target without additional processing. A portion of the radiation reflecting from the target is detected by a detector (hatched rectangle). In this type of approach, the emitter must produce radiation with a sufficiently low beam spread to resolve the target features.

Tolerances

One method of achieving native target resolution is to closely space the emitter/detector and the sensing target. However, tightening tolerances increases the precision required in manufacturing both the target and the device as a whole. For example, the cross-sectional roundness of a cylindrical target becomes increasingly important as the spacing decreases. For these and other reasons, embodiments of the present invention preferably space the emitter/detector and sensing target at distances achievable within tolerances typical of mass manufacturing.

Active Area—Emitter/Detector Modification

Several combinations of features and methods can be employed to relax the spacing tolerances or to reduce problems caused by diffusion of the radiation. In reflection encoding, a portion of the sensing target is excited by radiation and a detector receives a signal from the sensing target. The signal received represents the characteristics of an active area of the sensing target. Preferably, the active area is sized and located to match the critical feature dimensions of the sensing target. For example, FIG. 15 illustrates the active area of a sensing target.

The size and location of the active area are determined by characteristics of both the emitter and the detector. In some cases, the radiation is conditioned to limit the portion of the sensing target excited by radiation. In some cases, the field of view of the detector is cropped.

Some techniques involve radiation processing measures that permit the use of higher resolution targets at manufacturable spacings than would be possible using more diffusive radiation. FIG. 12B illustrates a system in which a lens is used to collimate radiation from an emitter. Collimating the radiation permits the target-sensor spacing to increase relative to direct imaging while maintaining the ability to resolve a set feature size. The maximum spacing and resolvable feature sizes are determined by the spreading of the radiation following collimation.

Some techniques involve elements configured to limit the field of view of a sensor to a portion of its native field of view. FIG. 12C illustrates a system in which a pinhole is used to keep ‘bleed over’ from an adjacent region from obscuring detection of a transition. In this case, reflected radiation must pass through the centered pinhole, placed near the target surface, before reaching the detector. This system can require higher intensity emitters, as relatively little radiation is available through the pinhole.

Though certain embodiments of the present invention do employ active area cropping strategies, such as radiation conditioning, the additional devices or features needed to carry out these strategies increase the cost and complexity of the manufactured module. Preferably, embodiments of the present invention employ other means to achieve desired resolutions.

Beyond Native Target Resolution

At certain thresholds, achieving high system resolution through use of high native target resolution begins to necessitate radiation conditioning or tight spacing. As outlined above, these elements increase the complexity of a module and the precision required in manufacturing. Therefore, for resolutions above these thresholds, embodiments of the present invention preferably employ a lower native target resolution combined with at least one of a variety of strategies for achieving system resolution greater than native target resolution.

Active Area—Target Modification

The methods of defining an active area referred to above relate to conditioning radiation from an emitter, selecting a detector with an appropriate field of view, or modifying the field of view using an external device. However, alternative methods relate to configuring the sensing target to limit the portion thereof excited by radiation at any one time, and thus cropping the active area.

For example, the cross-sectional view of FIG. 11A illustrates a configuration in which the feature size is paired with the arc of a cylindrical sensing target to limit the field of view of a detector. The field of view of the emitter/detector 3030 subtends a region of the target 3350 that includes a maximum of two transitions.

Preferably, the sensing target and detector are configured such that a single feature dominates the field of view. For example, as illustrated in FIG. 15, an active area is sized to match the width of a stripe pair. Typically, the feature size of the target is chosen based on the field of view. However, the required resolution can also be a factor in determining feature size.

Data Processing

Preferably, embodiments of the present invention process data from a sensor to achieve resolutions higher than native target resolution. A variety of processing techniques, methods and elements are employed within various embodiments of the invention, including threshold-based signal conversion and interpolation.

Preferably, embodiments of the present invention encode the portion of the sensing target within the active area into a voltage. The voltage varies depending on the character of the portion of the sensing target within the active area at the time of encoding.

Embodiments of the present invention preferably match the dimensions of the active area to the critical dimensions of the sensing target features in order to produce a smoothly varying signal. FIG. 15 illustrates a preferred relationship between the active area and sensing target feature dimensions. The active area is sufficiently large along the direction of the critical dimensions so that it will not sequentially encounter regions with the same light/dark characteristics. In the illustrated embodiment, along the critical dimension the active area is larger than one feature's width and smaller than twice that width. This type of configuration substantially prevents ‘flat’ spots from occurring within the analog signal produced.

Over time, as the sensing target moves through the active area, the system forms a signal representing the portions of the sensing region that have passed through the area. As shown in FIG. 16, a sensing target, part 1, and a varying signal, part 2, are correlated along a time axis t. The strength of the signal in part 2 at a given point in time is determined by the characteristics, e.g. the proportion of light and dark stripes, within the active region at that time. As illustrated, the minima of the signal in part 2 correspond in time to the central axes of the dark stripes. Similarly, the maxima of the signal in part 2 correspond in time to the central axes of the light stripes.

In some embodiments the signal is a continuous encoding of the voltage; in other embodiments it is a series of discrete samples taken at a particular frequency. In either case, the signal preferably contains multiple samples related to each feature of the sensing target as it moves across the sensor's field of view.

The encoding process produces a variable signal representing the movement of the sensing target. The minima and maxima of the signal represent movement of the sensing target at its native target resolution. Preferably, this variable signal is an analog voltage. In some embodiments, interpolation is used to construct higher resolution data between the minima and maxima of the variable signal. Preferably the interpolation error occurs only within a given period of the native target resolution and is reset with each minimum or maximum of the signal. This limits the error introduced by interpolation to a substantially fixed percentage of the native resolution.
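A minimal Python sketch of the interpolation strategy just described: between two detected extrema, intermediate positions are estimated from the instantaneous signal value, and the estimate is re-anchored at each new minimum or maximum so the interpolation error cannot accumulate. The linear mapping and all identifiers are illustrative assumptions, not the controller's actual firmware.

```python
def interpolate_position(sample, last_min, last_max, feature_pitch, base_position):
    """Estimate position between two signal extrema.

    base_position is the position (in microns) assigned to the most recent
    extremum; feature_pitch is the native target resolution, i.e. the travel
    between adjacent extrema. The fraction of the signal swing traversed is
    mapped linearly onto that travel. The estimate is reset whenever a new
    extremum is detected, bounding the interpolation error to one period.
    """
    swing = last_max - last_min
    if swing == 0:
        return base_position  # degenerate signal; fall back to native resolution
    fraction = (sample - last_min) / swing
    fraction = min(max(fraction, 0.0), 1.0)
    return base_position + fraction * feature_pitch
```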

A processing system receives a variable signal from the sensor and produces corrected movement data at a resolution higher than native target resolution. For example, in some embodiments, the analog voltage signal is supplied to an analog to digital converter (ADC). The signal, sampled at a rate that yields multiple samples per feature, contains sufficient information to support ADC production of a digital signal with a resolution greater than native target resolution. In some embodiments, an ADC process using multiple thresholds is used to encode the analog signal into a higher-resolution digital signal.
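The multi-threshold conversion can be pictured as quantizing the calibrated analog swing into several codes per feature; the even spacing of thresholds and the number of levels in the sketch below are assumptions for illustration.

```python
def quantize(sample, v_min, v_max, levels=8):
    """Map an analog voltage into one of `levels` codes between the
    calibrated minimum and maximum of the signal swing. With several
    codes per stripe pair, the digital output changes more often than
    once per feature, yielding a resolution finer than the native
    target resolution."""
    span = v_max - v_min
    if span <= 0:
        return 0
    code = int((sample - v_min) / span * levels)
    return min(max(code, 0), levels - 1)
```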

The corrected movement data is then translated into position data which represents the position of a functional group coupled to the sensing target. For example, in some embodiments digital data from the ADC is supplied to a controller where it is analyzed and translated into position data.

One method in accordance with the present invention is illustrated in FIG. 17. The method seeks to detect a position of a functional group coupled to a sensing target configured to represent movement of the functional group at a first resolution. It comprises a step 5010 of using the sensing target to detect movement of the functional group at the first resolution. The method further comprises a step 5020 of encoding raw movement data representing the detected movement. In another step 5030, the method comprises processing the raw movement data into corrected movement data having a second resolution, wherein the second resolution is greater than the first resolution. Further, the method includes a step 5040 of translating the corrected movement data into position data representing the position of the functional group.
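The four steps of FIG. 17 can be strung together as a short processing pipeline. The following Python sketch is a loose illustration under assumed inputs (normalized voltage samples and a known feature pitch); it is not a description of the actual controller firmware.

```python
def estimate_travel(raw_samples, v_min, v_max, levels_per_feature, microns_per_feature):
    """Loose sketch of steps 5010-5040: raw movement data (already detected
    and encoded as voltage samples) is refined to a finer resolution by
    quantizing the calibrated swing into several codes per feature, and the
    accumulated code changes are translated into microns of travel.
    Direction sensing (via out-of-phase channels) is handled separately."""
    span = v_max - v_min
    codes = []
    for sample in raw_samples:
        code = int((sample - v_min) / span * levels_per_feature)
        codes.append(min(max(code, 0), levels_per_feature - 1))
    microns_per_code = microns_per_feature / levels_per_feature
    return sum(abs(b - a) for a, b in zip(codes, codes[1:])) * microns_per_code
```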

Preferably, embodiments include additional calibration of processing circuitry. In the preferred embodiment, an initial calibration is accomplished automatically during power on. For example, in an ADC-based system, self-calibration during power-on preferably determines the input range needed for data. Embodiments that use self-calibration do not require initial calibration during manufacturing or storage of fixed calibration parameters over their lifetime. In addition, the calibration preferably defines the initial position for each functional group. In some embodiments, these initial positions are determined by a hard reference stop discussed elsewhere in greater detail. In some embodiments, the positions are determined via information embedded into the sensing target. In some embodiments, position is referenced by the absence of interaction between the sensor and sensing target.

Specifically, referring to FIG. 14A, as the rear barrel 530, sensing target 590 and guide sleeve 510 move along the guide pin 601, the sensing target 590 and the sensor 901 eventually become disengaged. At this point the signal read by the sensor 901 changes, permitting the system to reference the location of the sensing target 590, the rear barrel 530, or the rear lens (not shown). In addition, during the same movement, the spring driver 580 contacts the lens group interface surface 314 of the hard stop latch spring 310, eventually reaching a mechanical hard stop, which can also be used as a reference as described above.

Referring now to FIG. 14B, a similar process can be employed for the front lens position sensor. The front sensing target 290 and the sensor 902 eventually become disengaged during movement of the front lead screw 260. At this point the signal read by the sensor 902 changes, permitting the system to reference the location of the front sensing target 290, or the front lens (not shown). In addition, during the same movement, the cam feature 222 of the cam 220 contacts the lens group interface surface 216 of the hard stop latch spring 210, eventually reaching a mechanical hard stop, which can also be used as a reference as described above.

However, some embodiments also include continuous calibration during sensing to handle signals that vary noisily over time. A variety of configurations produce signals with slight instabilities over time. For example, FIG. 16, part 3 illustrates a signal whose average magnitude ‘wobbles’. A variety of design and manufacturing decisions may result in such signals, for example off-center mounting of a cylindrical sensing target. In some embodiments, a calibration constant correlated to these instabilities is used to counteract them and dynamically correct the processing output; for example, the constant may track the average signal magnitude over a trailing time or frequency window.
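One way to picture this dynamic correction is a running normalization against a trailing average of the signal; the window length and the exact correction below are illustrative assumptions.

```python
from collections import deque

class TrailingNormalizer:
    """Dynamically correct a slowly 'wobbling' signal by dividing each
    sample by the average magnitude over a trailing window, so that slow
    variations (e.g. from an off-center cylindrical target) do not shift
    the thresholds used downstream."""

    def __init__(self, window=64):
        self.history = deque(maxlen=window)

    def correct(self, sample):
        self.history.append(abs(sample))
        average = sum(self.history) / len(self.history)
        return sample / average if average else sample
```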

In some embodiments, non-volatile memory elements are included in the control or processing circuitry and used to provide additional manufacturing and calibration data. Preferably, this additional data is used to adjust for component variation and manufacturing tolerances.

Some embodiments that employ interpolation use additional hardware and/or firmware (e.g. a clock for timing and for analysis). If the actuator is very non-linear, interpolation can introduce positioning error. Preferably, embodiments of the present invention use ADC techniques.

Configurations

Embodiments of the present invention include position sensing systems that employ a variety of different configurations of sensors and sensing targets. Some embodiments include cylindrical sensing targets, closed surfaces configured to rotate along with a lead screw or other rotational drive mechanism. Since the functional group is coupled with the lead screw, which has known thread pitch, lead screw rotation is proportional to translation of the functional group along the lead screw axis. In addition, some embodiments include linear sensing targets coupled to a functional group and configured to move therewith. The sensing systems discussed in the examples below are illustrated with cylindrical sensing targets; however, the methods, strategies and equipment described are also contemplated for use with linear targets within some embodiments of the present invention.
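Because lead screw rotation maps onto axial travel through the thread pitch, rotational sensing data converts directly into group translation. The Python sketch below shows the conversion; the pitch and counts-per-revolution values are assumed purely for illustration.

```python
def rotation_to_translation(encoder_counts, counts_per_revolution, thread_pitch_mm):
    """Convert counted rotation of the lead screw into axial travel of the
    coupled functional group. One full revolution advances the group by one
    thread pitch, so travel scales linearly with the counted rotation."""
    revolutions = encoder_counts / counts_per_revolution
    return revolutions * thread_pitch_mm


# Example with assumed values: 12 transitions per revolution, 0.3 mm pitch.
print(rotation_to_translation(encoder_counts=6, counts_per_revolution=12,
                              thread_pitch_mm=0.3))  # -> 0.15 mm
```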

For example, a system employing a rotational sensing target is illustrated in FIG. 11A. As shown by the cross sectional view, a position sensing system includes the cylindrical target 3350 positioned a distance d from the emitter/detector 3030. The field of view of the emitter/detector 3030 subtends a region of the target 3350 that includes a maximum of two transitions. In some embodiments the emitter/detector is a photoreflector.

In another example, illustrated in FIG. 13B, an emitter/detector 4030 comprises a sensor 4034 and an emitter 4032. The emitter/detector further includes a mask structure 4030′ that includes the emitter window 4032′ and the two sensor windows 4034′ and 4034″. In some embodiments the emitter is an LED.

The dark bands of the sensing target 4350 absorb radiation emitted from the emitter, while the light bands of the sensing target reflect radiation emitted from the emitter. The sensors detect transitions in absorption and reflectance as the bands move relative to the sensor windows. Preferably, the sensor 4034 separately detects transitions in both sensor windows 4034′ and 4034″. In some embodiments the emitter/detector 4030 is a photoreflector.

In yet another example, illustrated in FIG. 13A, a two-detector module is employed. The emitter/detector 3030 comprises a first sensor 3034A, a second sensor 3034B, and an emitter 3032. The mask structure 3030′ includes the emitter window 3032′ and the four sensor windows 3034A′, 3034A″, 3034B′, and 3034B″. In some embodiments, the emitter/detector 3030 is a photoreflector. In some embodiments the emitter is an LED.

Radiation from the emitter 3032 is substantially absorbed by dark bands and substantially reflected by the light bands of the sensing target 3350. The sensors 3034A and 3034B detect transitions in absorption and reflectance as the bands move relative to the sensor windows. Both the first sensor 3034A and the second sensor 3034B detect transitions.

In some embodiments, a detector encodes a given transition at different points in time. In addition, in some embodiments, a detector includes means for encoding a transition in two data forms that differ by a constant, such as a phase. In some embodiments, e.g. FIG. 13A, two separate sensors encode transitions out of phase with one another. In other embodiments, a single sensor views transitions at two different points in space, e.g. the two windows 4034′ and 4034″ of FIG. 13B. Preferably, in these embodiments a control system combines the out-of-phase data, permitting it to detect the direction of movement as well as its magnitude.
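Combining the two out-of-phase data streams yields both the direction and the magnitude of movement, in the manner of a conventional quadrature decoder. The table-driven Python sketch below is a generic illustration of that technique, not the patent's specific control circuitry.

```python
# Quadrature decoding: each (previous_state, new_state) pair of the two
# out-of-phase channels maps to a movement of +1, -1, or 0 counts.
STEP = {
    (0, 0): {(0, 1): +1, (1, 0): -1},
    (0, 1): {(1, 1): +1, (0, 0): -1},
    (1, 1): {(1, 0): +1, (0, 1): -1},
    (1, 0): {(0, 0): +1, (1, 1): -1},
}

def decode(states):
    """Accumulate signed counts from a sequence of (channel_a, channel_b)
    states; the sign indicates the direction of sensing target movement."""
    position = 0
    for previous, current in zip(states, states[1:]):
        position += STEP.get(previous, {}).get(current, 0)
    return position


# Example: the forward sequence 00 -> 01 -> 11 -> 10 -> 00 yields +4 counts.
print(decode([(0, 0), (0, 1), (1, 1), (1, 0), (0, 0)]))  # -> 4
```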

In a cylindrical sensing target within the above configurations, each feature preferably covers 60 degrees of the circumference. Thus, in one embodiment, a cylindrical target having a 12 mm circumference includes six 2 mm stripes in an alternating reflective/absorptive pattern. In addition, processing steps as outlined above are preferably employed to increase resolution above that offered natively by this type of target.

A position sensing system provides position data for a lens group over its range of motion. In some embodiments of the present invention, a position sensing system tracks the relative position of an optics group to within 70 microns over a range of 10 mm.

Operation

Preferred systems employ the position sensor data to control an actuator. In some embodiments, the data is used to predict the movement per cycle of the actuator. In some embodiments, the data is used to predict the movement per unit time that the actuator is engaged and powered on. In some embodiments, the data are used on a real-time basis with a correction cycle for increased accuracy. Preferably, the particular implementation used is determined in accordance with the particular actuator used.

Some embodiments of the present invention use the position data during zoom and auto-focus operation to accurately position and track optics groups. Preferably, during zoom operation, multiple lens groups are moved and tracked. The actuator control circuitry preferably accurately interprets position data to accomplish tracking and movement. In some embodiments, the control circuitry uses tracking interpretation data that is stored in a table. In some embodiments, the control circuitry uses tracking interpretation data that is stored as a mathematical function. Sometimes, this data is defined in a calibration cycle. Preferably, this calibration cycle takes place during manufacturing.

In addition, the actuator control circuitry preferably accomplishes zoom operations within a specified time frame. Preferably, in embodiments that relate to video optics, the zoom operations are accomplished in a manner that does not disturb video recording. In some embodiments, the zoom range and frame rate are used to determine an optimal step size. For example, the total zoom range is divided by the number of frames within the desired seek time to yield the step size. Thus, each step can occur within a frame. Preferably, when zoom operations occur, the steps are synchronized with the frame rate. In addition, the movement of multiple groups during zoom operations is preferably interleaved. Thus, as each group is moved, the remaining groups are stationary. Interleaving reduces driver and instantaneous power requirements.
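The step-size rule described above (total zoom travel divided by the number of frames in the desired seek time) can be written out directly; the frame rate, seek time, and travel used below are illustrative values only.

```python
def zoom_step_size(total_travel_mm, seek_time_s, frame_rate_hz):
    """Divide the total zoom travel by the number of video frames available
    within the desired seek time, so that each movement step can complete
    inside a single frame and be synchronized with the frame rate."""
    frames = int(seek_time_s * frame_rate_hz)
    return total_travel_mm / frames


# Example with assumed values: 4 mm of travel, 2 s seek time at 30 fps.
print(zoom_step_size(4.0, 2.0, 30))  # -> 4 / 60 ≈ 0.0667 mm per frame
```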

In addition, during auto focus operation, typically a single group is moved. Preferably, a group is moved through a focus range in small increments. Preferably, an accurate position sensor and actuator control circuit are employed to permit positioning in increments below 20 micrometers. In addition, though a variety of circuitry and hardware can be used to implement the auto-focus algorithm, preferred implementations permit reliable return of the group to the position that shows best focus.
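As a rough illustration of this scan-and-return behavior, the sketch below sweeps a single group in sub-20-micrometer increments and returns it to the best-scoring position; move_to and focus_metric are hypothetical callables standing in for the actuator control and focus evaluation, and the step size is an assumed value.

```python
def auto_focus(move_to, focus_metric, start_um, end_um, step_um=15):
    """Sweep a single optics group through the focus range in small
    increments, score each position with a focus metric, then return the
    group to the best-scoring position (relying on accurate position
    sensing for a reliable return)."""
    best_position, best_score = start_um, float("-inf")
    position = start_um
    while position <= end_um:
        move_to(position)
        score = focus_metric()
        if score > best_score:
            best_position, best_score = position, score
        position += step_um
    move_to(best_position)
    return best_position
```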

As described above, the optical elements of some embodiments are divided into two groups, one group housed in a front barrel, the other group housed in a rear barrel. Typically, the precise motion of these optics groups within confined spaces is achieved by using the mechanism(s) described above.

The form factor of the auto-focus and zoom module of some embodiments is approximately 9×14×22 mm without a prism or approximately 9×14×30 mm including a prism.

While the invention has been described with reference to numerous specific details, one of ordinary skill in the art will recognize that the invention can be embodied in other specific forms without departing from the spirit of the invention. Thus, one of ordinary skill in the art will understand that the invention is not to be limited by the foregoing illustrative details, but rather is to be defined by the appended claims.

Westerweck, Lothar, Raschke, Klaus, Toor, John, Grziwa, Wolfram, Raschke, Alexander
