A method and apparatus are disclosed for capturing a large number of quasi-continuous effective frames of 2-D data from an event at very short time scales (from less than 10⁻¹² to more than 10⁻⁸ seconds), enabling short recording windows with a high effective number of frames. Active illumination from a chirped laser pulse directed at the event creates a reflection in which wavelength depends on time, encoding temporal phenomena onto wavelength. A hyperchromatic lens system receives the reflection and maps wavelength onto axial position. An image capture device, such as a holographic or plenoptic imaging device, captures the resultant focal stack from the hyperchromatic lens system in both the spatial (imaging) and longitudinal (temporal) axes. The hyperchromatic lens system incorporates a combination of diffractive and refractive components to maximally separate focal position as a function of wavelength.
1. A system for recording high speed events comprising:
a chirped pulse source configured to generate a chirped pulse, the chirped pulse directed to an event to create a reflected chirped pulse;
a hyperchromatic lens system configured to receive the reflected chirped pulse and output two or more images focused at different distances from the hyperchromatic lens system;
one or more image capture systems configured to record the two or more images which are focused at different distances from the hyperchromatic lens system to create and store image data.
9. A hyperchromatic lens system for recording time-resolved phenomena comprising:
a pulse generator configured to generate a pulse;
a pulse stretcher configured to stretch the pulse to create a chirped pulse, the chirped pulse being directed to a time-resolved phenomenon to interact with the time-resolved phenomenon, which creates a modified chirped pulse;
a hyperchromatic lens system configured to:
receive the modified chirped pulse representing information about the time-resolved phenomenon;
process the modified chirped pulse representing information about the time-resolved phenomenon to create two or more independent images which are established at different distances from the hyperchromatic lens system; and
an image capture system configured to capture and store the two or more independent images as image data.
15. A method for capturing high speed image information regarding an event comprising:
providing a pulse to an event, the pulse interacting with the event to create a modulated pulse, the modulated pulse representing the pulse's interaction with the event, such that the event at a first time is associated with a first wavelength of the modulated pulse and the event at a second time is associated with a second wavelength of the modulated pulse;
receiving the modulated pulse at a hyperchromatic system;
outputting the modulated pulse from the hyperchromatic system such that the first wavelength is presented as a first image at a first distance from the hyperchromatic system and the second wavelength is presented as a second image at a second distance from the hyperchromatic system;
recording the first image and the second image with an image capture device to create image capture device data.
2. The system of
3. The system of
4. The system of
7. The system of
8. The system of
12. The system of
13. The system of
14. The system of
17. The method of
19. The method of
20. The method of
processing the image capture device data to isolate first image data from the image capture device data;
processing the image capture device data to isolate second image data from the image capture device data;
displaying only the first image data on a display screen; and
displaying only the second image data on a display screen.
21. The method of
This application claims priority to and the benefit of U.S. Provisional Patent Application No. 62/088,475 filed on Dec. 5, 2014, the contents of which are incorporated by reference in their entirety herein.
This invention was made with government support under Contract No. DE-AC52-06NA25946 awarded by the U.S. Department of Energy, National Nuclear Security Administration. The government has certain rights in the invention.
The invention relates to high speed image capture systems and in particular to a method and apparatus for a high speed image capture system using a hyperchromatic lens.
When doing scientific research or material interaction studies, it is often beneficial to record an event which occurs during an ultra-short time scale, such as less than one nanosecond (ns). By maximizing the amount of captured or recorded information, the event may be better understood during post event analysis. However, numerous challenges exist when recording events that occur on an ultra-short time scale.
A number of methodologies are currently available for capturing optical phenomena at ultra-short time scales, but they suffer from a number of limitations. Framing cameras which utilize electro-optic tubes are one common prior art system for recording images or event data. However, the temporal resolution of framing cameras employing electro-optic tubes is limited by tube physics, in the case of single-tube imaging, and by radiometry, in the case of multiple-path configurations utilizing either tubes or micro-channel plates.
Another type of camera is the streak camera. The temporal resolution of electro-optic streak cameras can be substantially higher than that of framing cameras, but at the loss of an entire dimension of data. As such, streak cameras are able to record continuous data, but do so with a view of only a single line of data and thus record data only along one particular line (streak). This drawback limits the amount of useful data that can be obtained.
In addition, some prior art systems for recording these events attempt to capture more data by adding optical recording devices, such as multiple cameras. While this proposed solution does capture more data, it presents several other drawbacks. One such drawback is that synchronization of multiple cameras is difficult, particularly for ultra-high speed operation. In addition, each additional camera system increases cost, and often the dimensional limitations of the space around the event limit the number of cameras which may be used and restrict the acceptable angles of image capture.
Therefore, a need exists for an improved camera system to capture high speed and ultra-high speed events.
To overcome the drawbacks of the prior art, the innovation disclosed below allows capture and recordation of various phenomena at substantially shorter time scales than previously possible. An all-optical recording methodology is desired, i.e., one that does not require conversion of signal photons to electrons and thus would not be subject to the restrictions imposed by electro-optic tube physics. Such a system could demonstrate improved performance by several metrics: system efficiency, total recording time, and effective number of frames. However, it was discovered that effects that allow for fast manipulation of optical signals, such as the electro-optic Kerr and Pockels effects, while approaching nanosecond speeds, are insufficiently sensitive and cannot be utilized to produce a high number of frames.
To overcome the drawbacks of the prior art, research was performed into a novel all-optical methodology for the capture of a high number of quasi-continuous effective frames of 2-D data at very short time scales (from less than 10⁻¹² to more than 10⁻⁸ seconds), with potential improvement over existing technologies in terms of short recording windows and number of frames, for specific application in light-matter interaction studies and potential application in focused studies of dynamic materials. This methodology combines (1) a chirped laser pulse to encode temporal phenomena onto wavelength; (2) a strong hyperchromatic lens to create a focal stack mapping wavelength onto axial position; and (3) a three-dimensional (3-D) recording technology to capture the resultant focal stack with axial position information in total. In one embodiment, the system is used with a plenoptic camera or a hologram (digital or film). The lens and recording mechanism are, over the length of the recorded phenomena, agnostic or indifferent to time, as the temporal characteristics are dependent only on the illumination source.
One example embodiment is presented as a system for recording high speed events that includes a light source configured to generate a chirped pulse. The chirped pulse is directed to an event to create a modified chirped pulse. A hyperchromatic lens system is positioned and configured to receive the modified chirped pulse and output two or more images focused at different distances from the hyperchromatic lens system. An image capture system is positioned and configured to record the two or more images, which are focused at different distances from the hyperchromatic lens system, to create and store image data.
In one embodiment, the chirped pulse source comprises a laser and an optical pulse stretcher or chirper. The hyperchromatic lens system may include a lens system with one diffractive element, although in other embodiments more than one diffractive element may be used. Acrylic material was selected in one embodiment, but other materials may be used.
The image capture system may be a plenoptic camera. This system may further comprise a processor configured to execute machine readable code stored on a memory, the machine readable code configured to process the image data to isolate and display a first image and a second image from the two or more images. In one embodiment, the hyperchromatic lens system creates a focal stack which maps the wavelength of the reflected chirped pulse onto an axial position relative to the hyperchromatic lens system.
Also disclosed is a hyperchromatic lens system for recording time-resolved phenomena that comprises multiple components that operate together to record multiple images. In one configuration this includes a pulse generator configured to generate a pulse and an optical chirper or pulse stretcher configured to expand the pulse to create a chirped pulse. The chirped pulse is directed to a time-resolved phenomenon to interact with the time-resolved phenomenon, creating a modified chirped pulse. A hyperchromatic lens system is configured to receive the modified chirped pulse representing information about the time-resolved phenomenon. The hyperchromatic lens system also processes the modified chirped pulse representing information about the time-resolved phenomenon to create two or more independent images. The processing establishes the two or more independent images at different distances from the hyperchromatic lens system. An image capture system is also provided and is configured to capture and store the two or more independent images.
In one embodiment, the pulse is an optical pulse in the visible light spectrum. In one embodiment, the pulse stretcher uses diffraction gratings to stretch and chirp the optical pulse to create a chirped pulse. In one configuration the hyperchromatic lens system comprises a multiple lens system configured with at least two diffractive elements. It is also contemplated that the image capture system may be a plenoptic camera or a holographic image capture system. The system may further comprise a processor configured to receive data representing two or more independent images and process the data to isolate and individually display the two or more independent images.
Also disclosed is a method for capturing high speed image information regarding a dynamic event. This method provides a chirped pulse to a phenomenon such that the chirped pulse interacts with the phenomenon to create a modified chirped pulse, and the modified chirped pulse contains information on the initial chirped pulse's interaction with the event. The phenomenon at a first time is associated with a first wavelength in the modified chirped pulse and the event at a second time is associated with a second wavelength of the modified chirped pulse. The method of operation receives the modified chirped pulse at a hyperchromatic system, such as a lens system with diffractive elements. Next, this method of operation outputs the modified chirped pulse from the hyperchromatic system such that the first wavelength is presented as a first image at a first distance from the hyperchromatic system and the second wavelength is presented as a second image at a second distance from the hyperchromatic system. An image capture device records the first image and the second image to create image capture device data.
The initial chirped pulse may be from a laser such that the multiple wavelengths simultaneously present in the pulse are separated in time, so that the phenomenon with which the pulse interacts is illuminated by a different wavelength at each instant. There are different ways to produce a chirped pulse; the most common uses an ultrashort-pulse laser and stretches and chirps the pulse, but any number of approaches could be used. The method developed here uses a laser. For long timescales, multiple lasers or other sources could be employed.
It is also contemplated that the image capture device may comprise a plenoptic camera. The hyperchromatic system may comprise a lens stack including at least one diffractive element. It is contemplated that this method may further comprise processing the image capture device data to isolate first image data from the image capture device data and processing the image capture device data to isolate second image data from the image capture device data. It is contemplated that this method may enable displaying the first image data on a display screen and displaying the second image data on the display screen separately.
The distance between the images, in a linear or axial arrangement, is greater than in prior art designs. The first distance and the second distance may be considered as defining a maximum distance between images (a first image and a last image), and the maximum distance is greater than 1 meter. As an improvement over the prior art and as a novel aspect of the innovation, a hyperchromat lens system with 3-D imaging capability establishes images at different planes to translate time to spatial linear distance, with each image detectable by an imaging system that records the image at two or more distances (planes) from the imaging system lens(es).
Other systems, methods, features and advantages of the invention will be or will become apparent to one with skill in the art upon examination of the following figures and detailed description. It is intended that all such additional systems, methods, features and advantages be included within this description, be within the scope of the invention, and be protected by the accompanying claims.
The components in the figures are not necessarily to scale, emphasis instead being placed upon illustrating the principles of the invention. In the figures, like reference numerals designate corresponding parts throughout the different views.
Disclosed are a methodology and a system for capturing time-resolved 2-D data of non-repeating dynamic fast or ultra-fast (high speed and ultra-high speed) events. This methodology and system incorporate an optical illumination source in which wavelength is dependent on time, a hyperchromatic lens that images information at different wavelengths at longitudinally separated focal planes, and a 3-D optical recording device. As an exemplary implementation, an optical design innovation results in a disclosed hyperchromatic lens with longitudinal chromatic aberration capable of displacing image planes by 9 mm/nm of incident illumination wavelength with near-diffraction-limited imaging performance. Fabrication and assembly produced a first-article hyperchromat.
In this example embodiment the light source 108 comprises a laser, such as a femtosecond-class short-pulse doubled Nd:YAG laser. In other embodiments the light source 108 may be, but is not limited to, a visible or invisible short-pulse laser, a system producing discrete pulses of varying wavelength, or any other type of light source which emits a pulse or stream of light.
The pulse 120 is directed to an optical pulse stretcher 124. For this document, the optical pulse stretcher 124 is a device that lengthens the illumination pulse to chirp the pulse, i.e. to create a longer (chirped) pulse in which wavelength is a function of time. In one embodiment, pulse chirping (stretching) may be performed with a fiber optic cable of a selected length. In one embodiment, stretching may be performed with diffractive optical components. In one embodiment, the expansion stretches a 10 femtosecond pulse to a chirped pulse of 100 picoseconds, although in other embodiments other numeric values may be achieved. This is a variable value and provided only for discussion. Other applications will utilize different pre-chirp and post-chirp pulse durations. Pulse stretching is generally known by one of ordinary skill in the art and is thus not described in great detail herein. Companies that offer pulse stretchers include OptiGrate Corp, located in Oviedo, Fla.; TeraXion located in Quebec, Canada; and Ondax, Inc, located in Monrovia, Calif.
The output of the pulse stretcher 124 is a chirped pulse. As compared to the pulse 120, the chirped pulse 128 includes optical signals at wavelengths W1 130A, W2 130B, W3 130C, and W4 130D. These different wavelengths may also be referred to as colors, although the human eye may not be able to distinguish the color differences if the wavelengths are close in value. After the pulse is chirped to create the chirped pulse, the chirped pulse is directed to an event to illuminate or irradiate the event. The benefit of using the chirped pulse 128 instead of the pulse 120 is that the chirped pulse provides illumination in which, at each instant in time, only one wavelength is effectively present.
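The time-to-wavelength encoding of the chirped pulse can be sketched numerically. The following is a minimal illustration, not the disclosed apparatus: it assumes a perfectly linear chirp, and the 532 nm center, 5 nm bandwidth, and 100 ps duration are the exemplary figures discussed in this disclosure.

```python
# Sketch: a linearly chirped pulse maps each instant within the pulse
# to a single illuminating wavelength. Assumes a linear chirp; the
# numeric values are illustrative figures from the exemplary embodiment.

CENTER_NM = 532.0      # center wavelength (nm)
BANDWIDTH_NM = 5.0     # illumination bandwidth (nm)
DURATION_PS = 100.0    # chirped pulse length (ps)

def wavelength_at(t_ps):
    """Wavelength illuminating the event at time t_ps into the pulse."""
    if not 0.0 <= t_ps <= DURATION_PS:
        raise ValueError("time outside the chirped pulse")
    frac = t_ps / DURATION_PS
    return CENTER_NM - BANDWIDTH_NM / 2 + frac * BANDWIDTH_NM

print(wavelength_at(0.0))     # start of pulse -> 529.5 nm
print(wavelength_at(50.0))    # midpoint       -> 532.0 nm
print(wavelength_at(100.0))   # end of pulse   -> 534.5 nm
```

Because each instant carries one wavelength, any later recovery of a wavelength amounts to recovery of the instant at which it illuminated the event.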
As shown in
Turning now to
As shown in
The range of images 150A-150D as shown do not appear at the same time. The time over which the hyperchromatic system 142 focuses or presents the range of images 150A-150D is equal to the length of the chirped pulse. This duration is expected to be generally shorter than the integration time of any employed image capture device, thus allowing the image capture device to capture all of the images simultaneously.
Positioned to view the images 150A-150D are one or more image capture devices 160. In this embodiment, the image capture device is shown with an associated lens system 164, and the device and lens are collectively referred to as a camera 166. In one embodiment, the only lens in the ‘lens system 164’ is the hyperchromat lens. The camera captures the images 150 which are presented at the locations D1-D4. In this exemplary embodiment the camera 166 comprises a plenoptic camera. A plenoptic camera (also known as a light field camera) is a device that captures information about the intensity of light in a scene, and also captures information about the direction that the light rays are traveling in space. One type of light field camera uses an array of micro-lenses placed in front of an otherwise conventional image sensor to sense intensity and directional information. Holographic recording systems may also be used, and may be digital or film-based. Two sources for plenoptic cameras are Raytrix Inc., located in Germany, and Lytro Inc., located in Mountain View, Calif. Although described as a plenoptic camera, the camera 166 can be any device that is sensitive to and able to record both spatial and depth information from the hyperchromatic system 142.
In operation, the camera 166 is active during the presentation of images 150 and is able to capture in focus data at each distance D1-D4, thereby capturing image data for each of the images 150A-150D.
In general, plenoptic cameras are sensitive to depth and can record information at different distances or depths from the camera's lens. When viewing an image with information of interest at different distances from the lens, data processing occurs to retain or display only the image data for a particular distance of interest. Other data, unrelated to that distance of interest, is discarded or not displayed. It is contemplated that image data for different distances from the system 142, such as, for example, D1, D2, D3, and D4, may be selectively displayed to the user. The resulting image data may be considered four-dimensional data since it records intensity data at two-dimensional planes located at different depths corresponding to different times, and the entire recording system may be considered a four-dimensional recording system.
Image information is captured by the image capture device 160 and is presented to an image processor 168. The image processor 168 may be part of the image capture device or a separate element, such as a computer. The camera 166 may be separable from the image processor, such as connected by a cable or other electrical interface. The image processor 168 (or the camera 166 if so configured) generates data regarding the images at each distance D1-D4. In some embodiments, this data may be overlaid to create a single composite image with each image 150 combined such that the composite image is in-focus for each image distance 150A-150D.
In another embodiment, the resulting image data is processed by the image processor 168 to generate individual images representing the light energy at each distance D1-D4. As a result, a user or viewer of the data may select which image 150A-150D to view. For example, the image processor would extract from the data set captured by the image capture device 160 only the intensity information from an in-focus image located at distance D1, and present only that image data, as an image or in some other format, to the user. Likewise, the image processor is configured to extract from the data set captured by the image capture device 160 only the information (light or energy) from an in-focus image located at distance D3, and present only that image data, either as data or as an image, to the user. This allows the user to collectively or individually view each set of light energy from different distances D1-D4.
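The per-plane extraction described above can be illustrated conceptually. Real plenoptic refocusing relies on the camera vendor's reconstruction algorithms; the sketch below substitutes a synthetic focal stack and a simple sharpness metric, and all names and values are hypothetical stand-ins for that process.

```python
# Conceptual sketch of isolating the in-focus plane from a focal stack,
# as an image processor might. A synthetic stack and a mean-squared-
# gradient sharpness metric stand in for vendor reconstruction code.
import numpy as np

rng = np.random.default_rng(0)

def sharpness(img):
    """Mean squared gradient: higher for in-focus (high-frequency) content."""
    gy, gx = np.gradient(img)
    return float(np.mean(gx**2 + gy**2))

# Synthetic 4-plane focal stack: plane 2 holds high-frequency (in-focus)
# content; the other planes are smooth (defocused) stand-ins.
stack = [np.ones((32, 32)) * 0.5 for _ in range(4)]
stack[2] = rng.random((32, 32))  # stand-in for the sharp plane

best = max(range(4), key=lambda i: sharpness(stack[i]))
print("in-focus plane index:", best)   # -> 2
```

In the disclosed system each such isolated plane corresponds to one distance D1-D4 and thus to one instant of the event.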
Because a user may individually view the intensity data at D1, D2, D3, or D4, the user is also able to view the event at a time T1, T2, T3 or T4. Stated another way, the system records the event at different times by radiating the event with different wavelengths of light at different times within a chirped pulse. The different wavelengths of light illuminate the event sequentially at different times, and so each discrete wavelength translates to the time at which that wavelength illuminates the event. The reflected or transmitted pulse thus contains information from the event, each discrete instant captured using light at one discrete wavelength. The pulse containing event information passes through the hyperchromatic system and the hyperchromatic system focuses the images at each wavelength at different distances from the hyperchromatic system. Thus, time during the event is translated to wavelength by the chirped pulse, and wavelength is translated to focal position by the hyperchromatic lens, the system thereby translating time during the event to focal position. The plenoptic camera records the data at each distance, for viewing and analysis by a user or computer system on an image by image (distance by distance) basis.
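The two translations just described (time to wavelength, then wavelength to focal position) compose into a single time-to-position mapping, which the following sketch works through numerically. The 9 mm/nm dispersion figure is the exemplary value stated for the disclosed lens; the chirp parameters are the illustrative 532 nm / 5 nm / 100 ps figures from this disclosure, and a linear chirp is assumed.

```python
# Sketch: composing time -> wavelength (chirped pulse) with
# wavelength -> focal position (hyperchromatic lens). Values are the
# exemplary/illustrative figures from the disclosure; linear chirp assumed.

CENTER_NM = 532.0
BANDWIDTH_NM = 5.0
DURATION_PS = 100.0
DISPERSION_MM_PER_NM = 9.0   # image-plane displacement per nm of wavelength

def wavelength_at(t_ps):
    return CENTER_NM - BANDWIDTH_NM / 2 + (t_ps / DURATION_PS) * BANDWIDTH_NM

def focal_offset_mm(t_ps):
    """Axial image-plane offset, relative to the center wavelength's plane,
    for the instant t_ps within the chirped pulse."""
    return (wavelength_at(t_ps) - CENTER_NM) * DISPERSION_MM_PER_NM

print(focal_offset_mm(0.0))    # -22.5 mm (start of pulse)
print(focal_offset_mm(100.0))  # +22.5 mm (end of pulse)
```

Note that the resulting 45 mm total spread matches the focal stack depth given for the exemplary design.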
Processing by the image processor 168 occurs on the data from the camera to identify the data at each distance D1, D2, D3, and D4. In other embodiments, any number of images 150 may reside at fewer than four or more than four distances. Image processing to perform such extraction may be developed or requested from a supplier of plenoptic cameras, holographic recording systems, or another party capable of such data processing.
For this example embodiment, initial design specifications define a field of view of less than 5 mm, under the assumption that magnifying achromatic optics could be used to relay an image of an actual phenomenon of interest to the object plane of the hyperchromatic system. In one embodiment, a magnification of between 5× and 10× was selected, to enable higher longitudinal chromatic dispersion while maintaining a reasonable final image size.
In the exemplary embodiment a design approach incorporating diffractive dispersion is implemented.
The hybrid hyperchromatic optical system as shown in
As diffractive elements 216, 220 have significantly higher characteristic dispersion than even the highest dispersion glasses, this exemplary hybrid design offers greater optical performance and dispersion than achievable in purely refractive designs. In this example embodiment, the two diffractive components are identical, but in other embodiments, these elements may differ. In this example embodiment, designs were established to have no more than 20 nm of bandwidth, with a desired bandwidth limit of less than 10 nm. Other embodiments may be configured with different bandwidths.
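The dispersion advantage of diffractive elements can be illustrated numerically. A diffractive lens's focal length scales inversely with wavelength, which corresponds to an effective Abbe number of roughly -3.45 over the visible band, far stronger (and opposite in sign) than refractive glasses. The focal length in the sketch below is a hypothetical value, not a parameter of the disclosed lens; the bandwidth is the exemplary 5 nm figure.

```python
# Sketch: focal-length shift of a diffractive lens across a narrow band.
# f(lambda) = f0 * lambda0 / lambda is the standard diffractive-lens
# scaling; F0_MM is an assumed, illustrative focal length.

F0_MM = 100.0       # assumed focal length at the design wavelength (mm)
LAMBDA0_NM = 532.0  # design wavelength (nm)

def diffractive_focal_mm(lambda_nm):
    """Focal length of an ideal diffractive lens at the given wavelength."""
    return F0_MM * LAMBDA0_NM / lambda_nm

# Even across the exemplary 5 nm bandwidth the shift is appreciable:
print(round(diffractive_focal_mm(529.5), 3))  # ~100.472 mm
print(round(diffractive_focal_mm(534.5), 3))  # ~99.532 mm
```

This strong, sign-reversed dispersion is what lets a hybrid diffractive-refractive design reach far larger longitudinal chromatic aberration than any all-glass design.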
This exemplary design was intended for use with visible wavelengths so as to include light field recording technology as a viable method for recording the produced image information. It is contemplated that other embodiments may be designed for use with non-visible wavelengths.
Another benefit and characteristic of this innovation is a high degree of linearity in the dispersion curve about the center design wavelength and constant magnification throughout the focal stack of produced images. Exemplary initial and developed characteristics are summarized in Table 1.
TABLE 1
Optical design requirements.

Metric | Note | Example possible value range | As designed
Illumination center wavelength | Based on expectations of recording requirements | 500 to 800 nm | 532 nm
Illumination bandwidth | Based on availability of illumination sources | <20 nm (required), <10 nm (desired) | 5 nm
Field of view | Under the assumption that achromatic relay optics could magnify an actual object | 5 mm | 5 mm
Focal stack depth | To increase resolvability of image slices | >25 mm (required), >50 mm (desired) | 45 mm
In a hybrid hyperchromatic optical system, the magnitude of the chromatic aberration is primarily driven by the power of the diffractive element. In the example embodiment of
The performance of the diffractive lens pair 216, 220 was optimized by minimizing the angle of rays incident on both surfaces, by placing the diffractive lens pair symmetrically about the aperture stop 250 with equal angles on entering and exiting rays.
The embodiment of
In this embodiment, the lens system was designed for a 532 nm center wavelength, as this wavelength may be readily obtained in short-pulse lasers. However, it is contemplated that other potential laser sources and other types of potential experiments may utilize a different wavelength. In one configuration, the change in wavelength was obtained by changing lens spacing(s).
Returning to
In this example embodiment, the lens system 204 provides near-diffraction-limited imaging at 532 nm, with a maximum optical resolution of greater than 52 line-pairs per mm (lp/mm) at 20% contrast across the 5 nm bandwidth, at the image. Given the magnification in this embodiment, this corresponds to greater than 395 lp/mm at the object.
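The image-side and object-side resolution figures above are related through the system magnification, which the following arithmetic sketch checks against the 5x-10x magnification range stated earlier for this embodiment.

```python
# Sketch: the magnification implied by the stated resolution figures.
# 52 lp/mm at the image and 395 lp/mm at the object are the exemplary
# values from the disclosure.

image_res_lp_mm = 52.0    # resolution at the image (lp/mm, 20% contrast)
object_res_lp_mm = 395.0  # resolution referred to the object (lp/mm)

magnification = object_res_lp_mm / image_res_lp_mm
print(round(magnification, 1))          # ~7.6
print(5.0 <= magnification <= 10.0)     # True: within the selected range
```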
The optomechanical aspects of the exemplary implementation of the lens 204 shown in
In one embodiment, one or more surfaces had light-absorbing threads machined into the metal. In this exemplary embodiment, the thread type was a standard 20-pitch thread; however, the system may be configured with a ‘sharp V thread form (no flat), minimal deburr’ configuration to minimize potential reflection and scattering from the truncated crests of standard thread forms. In other embodiments, other thread pitches may be used.
The diffractive components 216, 220 are orderable from Apollo Optical Systems, located in West Henrietta, N.Y., and the refractive components orderable from Optimax Systems, Inc., of Ontario, N.Y. The optomechanical housing for the components are orderable from Zero Hour Parts of Ann Arbor, Mich.
The hyperchromatic lens system and the illumination scheme described herein may be used on, and at time scales relevant to, experiments in shock studies and dynamic materials, such as, but not limited to, 10 ps-1 μs. As the hyperchromatic lens and recording medium are relatively insensitive to the dimension of time, recording length is determined by the temporal characteristics of the illumination source. With a holographic medium, no additional timing method is necessary for recording, but with a digital medium, the camera needs to be actuated prior to recording and continuously integrating over the duration of the illumination pulse.
For an application involving phenomena generated by sub-picosecond pulses, a light signal that is chirped to multiple picoseconds can be used to exhibit a clean chirp, i.e., a pulse in which only a single wavelength is present at any given instant, providing data that is temporally non-convolved.
In other embodiments, digital holography may be used as a 3-D recording method. However, it is possible that holography would add additional, undesirable constraints on the experiment. As an alternative, and as discussed above, plenoptic cameras employ image-plane lenslet arrays and oversampling to measure light as intensities and direction vectors, rather than merely intensities. Therefore, plenoptic cameras, with reconstruction algorithms, are able to digitally refocus discrete image planes from within the focal stack, perform digital determination of the longitudinal position of discrete planes, and create movies or images of the recorded phenomena.
It is contemplated that a viable system (laser, lens, and plenoptic camera) may be configured to act as a partial edge filter such that the high f-number (f/26), low-frequency data in any image plane will still be imaged at nearby planes, whereas high-frequency data in a given plane will be unique to that plane. In one embodiment, the system of
At a step 512, the pulse stretcher receives the pulse and transforms the pulse into a chirped pulse. The pulse stretcher may comprise any device capable of expanding the pulse to create a chirped pulse. In one embodiment, the generation of the pulse and the chirping of the pulse is combined into a single step, such that at step 508 a chirped pulse is directly created.
At a step 516, the chirped pulse is directed to an event of interest. The event may be any event, but it is expected that the event, or aspects of the event of interest, will be of short duration. The chirped pulse may be projected directly on the event, or directed to the event using one or more mirrors or one or more lenses.
At the event, the pulse reflects from the event and at a step 520 the hyperchromatic lens system collects or receives the reflected chirped pulse from the event. At a step 524, the hyperchromatic lens system processes the reflected chirped pulse (modified chirped pulse) from the event. In one embodiment, the processing comprises passing the reflected chirped pulse through one or more lenses, one of which includes a high-dispersion element. In this embodiment, processing of the reflected chirped pulse includes outputting the reflected chirped pulse as two or more images of the event, with different wavelengths of the reflected chirped pulse being focused or located at different distances from the hyperchromatic lens system. This occurs at step 528.
At a step 532, one or more image capture systems capture the two or more images from the hyperchromatic lens system, each image being located at a different distance from the hyperchromatic lens system. Any type or number of image capture devices may be used to capture the image, but in one embodiment a plenoptic camera is used to simultaneously or sequentially capture the image data at different depths or distances from the hyperchromatic lens system.
Next, at a step 536, one or more image processing devices, for example a general purpose computer executing machine readable code, such as software stored in a non-transitory format on a memory, or a specialized processing device, process the image data from the image capture device. It is contemplated that such image data may be generated in a digital camera system, but the image data may also be recorded on film and processed in a traditional film-based workflow. In a digital camera environment, the processing isolates the image data located at one or more distances from the hyperchromatic lens system. Once isolated, the image data may be viewed independently of the other image data or combined with the other image data in still or movie format. This allows viewing of image data based on the wavelength of the light in the reflected chirped pulse, which can in turn be correlated to time in relation to the progression of the event, which is considered to be dynamic.
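The final correlation from isolated focal plane back to event time follows directly from the chirp: if the planes sample the chirped pulse uniformly, the plane index maps linearly to a time within the recording window. The uniform-sampling assumption and the numeric values below are illustrative, not stated in this disclosure.

```python
# Sketch: converting a recovered focal-plane index back to event time,
# assuming the planes sample the chirped pulse uniformly in time.

def plane_to_time(plane_index, n_planes, duration):
    """Map a focal-plane index (0 = first plane) to a time offset
    within the recording window of length `duration` seconds."""
    return (plane_index / (n_planes - 1)) * duration

# Example: 10 focal planes over a 1 ns chirped pulse give an effective
# inter-frame spacing of roughly 111 ps.
times = [plane_to_time(i, 10, 1e-9) for i in range(10)]
```

Each isolated plane thus becomes one effective frame of the reconstructed ultrafast movie, with frame spacing set by the chirp duration divided by the number of resolvable planes.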
At a step 540, the image data is stored, displayed, or both. Displaying of the image data may occur graphically as either a static image or a moving, sequenced video. This allows viewing the ultrafast event to gain a better understanding of the event.
The following references are incorporated by reference in their entirety.
Citing Patents:

Patent | Priority | Assignee | Title
10948715 | Aug 31 2018 | Hellman Optics, LLC | Chromatic lens and methods and systems using same
11899354 | Nov 17 2022 | East China Normal University | Ultrafast photographing apparatus based on polarization-time mapping

Cited Patents:

Patent | Priority | Assignee | Title
5610734 | Sep 11 1992 | Board of Trustees, Leland Stanford Jr. University | Chromatic focal pencil beam-generating apparatus
5682262 | Dec 13 1995 | Massachusetts Institute of Technology | Method and device for generating spatially and temporally shaped optical waveforms
5790242 | Jul 31 1995 | Rudolph Technologies, Inc. | Chromatic optical ranging sensor
6618125 | Sep 05 2000 | The United States of America as represented by the Secretary of the Army | Code-multiplexed read-out for ladar systems
7161671 | Oct 29 2001 | Hitachi, Ltd. | Method and apparatus for inspecting defects
7224540 | Jan 31 2005 | PSC Scanning, Inc. | Extended depth of field imaging system using chromatic aberration
7933010 | Oct 22 2007 | Visiongate, Inc. | Depth of field extension for optical tomography
8005314 | Dec 09 2005 | Cytek Biosciences, Inc. | Extended depth of field imaging for high speed object analysis
8587772 | Dec 21 2011 | Mitutoyo Corporation | Chromatic point sensor configuration including real time spectrum compensation
20030076485
20030133109
20050078296
20060012797
20060197937
20130336345
20140078352
20140103018
20160350573
DE 102005006723
JP 2011085432
WO 2009153067
WO 9641123
Assignment: filed Sep 10 2015 by National Security Technologies, LLC; assignment of assignors interest executed Jun 02 2017 by Daniel K. Frayer to National Security Technologies, LLC (Reel/Frame 042594/0535).