A sensor device may determine a first optical sensor value associated with a first displacement and a second optical sensor value associated with a second displacement, wherein the first displacement is between an emitter associated with the first optical sensor value and a sensing location used to determine the first optical sensor value, wherein the second displacement is between an emitter associated with the second optical sensor value and a sensing location used to determine the second optical sensor value, and wherein the first displacement is different from the second displacement. The sensor device may determine one or more measurements using the first optical sensor value and the second optical sensor value, wherein the one or more measurements relate to a first penetration depth associated with the first optical sensor value, and a second penetration depth associated with the second optical sensor value.
1. A method performed by a sensor device, comprising:
determining a first optical sensor value associated with a first displacement and a second optical sensor value associated with a second displacement,
wherein the first displacement is based on a first optical waveguide receiving first light generated by an emitter, associated with the first optical sensor value, and guiding the first light to a first collimator associated with a sensing location corresponding to the first optical sensor value,
wherein the second displacement is based on a second optical waveguide receiving second light generated by an emitter, associated with the second optical sensor value, and guiding the second light to a second collimator associated with a sensing location corresponding to the second optical sensor value, and
wherein the first displacement is different from the second displacement;
determining one or more measurements using the first optical sensor value and the second optical sensor value, wherein the one or more measurements relate to:
a first penetration depth associated with the first optical sensor value, and
a second penetration depth associated with the second optical sensor value; and
providing information identifying the one or more measurements.
16. A device, comprising:
one or more collimators;
a first optical waveguide to:
receive first light from an emitter associated with a first optical sensor value, and
guide the first light to the one or more collimators;
a second optical waveguide to:
receive second light from an emitter associated with a second optical sensor value, and
guide the second light to the one or more collimators;
one or more optical sensors; and
one or more processors, communicatively coupled to the one or more optical sensors, configured to:
determine the first optical sensor value and the second optical sensor value using the one or more optical sensors,
wherein the first optical sensor value is associated with a first displacement that is based on the first light being guided,
wherein the second optical sensor value is associated with a second displacement that is based on the second light being guided, and
wherein the first displacement is different from the second displacement;
determine one or more measurements using the first optical sensor value and the second optical sensor value, wherein the one or more measurements relate to:
a first penetration depth associated with the first optical sensor value, and
a second penetration depth associated with the second optical sensor value; and
provide information identifying the one or more measurements.
10. A sensor device, comprising:
one or more optical sensors; and
one or more processors configured to:
determine a first optical sensor value and a second optical sensor value using the one or more optical sensors,
wherein the first optical sensor value is associated with a first displacement that is based on a first optical waveguide receiving first light generated by an emitter, corresponding to the first optical sensor value, and guiding the first light to a first collimator associated with a sensing location corresponding to the first optical sensor value,
wherein the second optical sensor value is associated with a second displacement that is based on a second optical waveguide receiving second light generated by an emitter, corresponding to the second optical sensor value, and guiding the second light to a second collimator associated with a sensing location corresponding to the second optical sensor value, and
wherein the first displacement is different from the second displacement;
determine one or more measurements using the first optical sensor value and the second optical sensor value, wherein the one or more measurements relate to:
a first penetration depth associated with the first optical sensor value, and
a second penetration depth associated with the second optical sensor value; and
provide information identifying the one or more measurements.
2. The method of
3. The method of
4. The method of
5. The method of
6. The method of
7. The method of
8. The method of
9. The method of
11. The sensor device of
the first optical waveguide, wherein the first optical waveguide connects the sensing location corresponding to the first optical sensor value and an optical sensor of the one or more optical sensors, and
the second optical waveguide, wherein the second optical waveguide connects the sensing location corresponding to the second optical sensor value and the optical sensor.
12. The sensor device of
a first optical filter associated with an optical sensor of the one or more optical sensors, and
a second optical filter associated with the optical sensor,
wherein the first optical filter corresponds to the sensing location corresponding to the first optical sensor value and the second optical filter corresponds to the sensing location corresponding to the second optical sensor value.
13. The sensor device of
14. The sensor device of
15. The sensor device of
17. The device of
18. The device of
19. The device of
20. The device of
This application claims priority to U.S. Provisional Patent Application No. 62/881,819, filed on Aug. 1, 2019, entitled “OPTICAL SENSOR,” the content of which is incorporated by reference herein in its entirety.
A sensor device may be utilized to capture information for spectrometry analysis. For example, the sensor device may capture information relating to a set of electromagnetic frequencies. The sensor device may include a set of sensor elements (e.g., optical sensors, spectral sensors, and/or image sensors) that capture the information. For example, an array of sensor elements may be utilized to capture information relating to one or more frequencies. An analysis may be performed on the information relating to the one or more frequencies to determine spectrometry information.
According to some implementations, a method may include determining a first optical sensor value associated with a first displacement and a second optical sensor value associated with a second displacement, wherein the first displacement is different from the second displacement; determining one or more measurements using the first optical sensor value and the second optical sensor value, wherein the one or more measurements relate to a first penetration depth associated with the first optical sensor value, and a second penetration depth associated with the second optical sensor value; and providing information identifying the one or more measurements.
According to some implementations, a sensor device may include one or more optical sensors, and one or more processors configured to determine a first optical sensor value and a second optical sensor value using the one or more optical sensors, wherein the first optical sensor value is associated with a first displacement, wherein the second optical sensor value is associated with a second displacement, and wherein the first displacement is different from the second displacement; determine one or more measurements using the first optical sensor value and the second optical sensor value, wherein the one or more measurements relate to a first penetration depth associated with the first optical sensor value, and a second penetration depth associated with the second optical sensor value; and provide information identifying the one or more measurements.
According to some implementations, a device may include one or more optical sensors, and one or more processors, communicatively coupled to the one or more optical sensors, configured to determine a first optical sensor value and a second optical sensor value using the one or more optical sensors, wherein the first optical sensor value is associated with a first displacement, wherein the second optical sensor value is associated with a second displacement, and wherein the first displacement is different from the second displacement; determine one or more measurements using the first optical sensor value and the second optical sensor value, wherein the one or more measurements relate to a first penetration depth associated with the first optical sensor value, and a second penetration depth associated with the second optical sensor value; and provide information identifying the one or more measurements.
The following detailed description of example implementations refers to the accompanying drawings. The same reference numbers in different drawings may identify the same or similar elements. The following description uses a spectrometer as an example. However, the calibration principles, procedures, and methods described herein may be used with any sensor, including but not limited to other optical sensors and spectral sensors.
A structure of a measurement target may be composed of layers of different thicknesses. Examples of measurement targets include animal tissue, human tissue, a food item, an organic material, and/or the like. In the case of tissue, different tissue layers contain different types of blood vessels, such as arteries, arterioles, capillaries, and veins. When performing non-intrusive optical measurements of vital signs, blood flow through these different vessels indicates different information. For example, optically monitoring blood pressure is more accurate when measuring blood flow from the deep arteries that exist in the subcutaneous tissue. Measurement complications arise when acquiring spectra of blood flow through deeper layers of tissue (e.g., subcutaneous tissue) because the signal becomes convoluted with signals from the shallower layers (e.g., the dermis and/or epidermis).
A multi-distant approach, in which both deep and shallow layers are assessed, can produce more accurate measurements of blood flow in the deeper vessels, resulting in significantly more accurate readings relative to a single-distant approach. More generally, it may be beneficial to perform measurements at multiple different depths in a measurement sample to determine different types of measurement values or a particular measurement value at two or more different depths. This may be useful, for example, for spectral measurement of a measurement target based on interactance spectrometry. When performing spectral measurements of tissue in interactance mode, the penetration depth of the measured light may be based on a separation between an emitter and a sensor. For example, as the separation increases, the sensor may measure light that has penetrated to a greater penetration depth in the measurement target. However, achieving multiple, different penetration depths using multiple different sensor devices associated with respective sensor/emitter displacements may be costly and size-prohibitive.
Implementations described herein provide interactance-based measurements at multiple different penetration depths in a measurement target. For example, a sensor device may include one or more sensors and one or more emitters. In some implementations, the sensor device may include a single sensor and two or more emitters at different displacements from the single sensor. In some implementations, the sensor device may include a single emitter and two or more sensors at different displacements from the single emitter. In some implementations, the sensor device may include two or more sensors and two or more emitters, which enables measurements at four or more possible displacements between sensor/emitter pairs. In some implementations, the sensor device may include one or more waveguides that guide light from different measurement locations on a measurement target to one or more sensors, thereby achieving different displacements from a single emitter for a single sensor. Thus, the spectra of multiple measurement depths can be measured by including multiple distances between one or more emitters and one or more sensors of a single sensor device, which may provide a lower cost, lower resource consumption, and smaller size than using multiple sensor devices that are each associated with a respective sensor/emitter displacement.
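The multi-distant idea above can be illustrated with a short sketch. Assuming a simple two-layer model in which the small-displacement reading approximates the shallow-layer contribution to the large-displacement reading (the `shallow_weight` factor is hypothetical and would be calibrated in practice), a deep-layer signal might be estimated as:

```python
def estimate_deep_signal(near_value, far_value, shallow_weight=0.5):
    """Estimate the deep-layer contribution from a multi-distant reading.

    near_value: sensor value at the smaller emitter/sensor displacement
                (dominated by shallow layers).
    far_value:  sensor value at the larger displacement (shallow plus
                deep layers).
    shallow_weight: hypothetical fraction of the far reading contributed
                    by shallow layers; in practice this would be
                    calibrated per measurement target.
    """
    return far_value - shallow_weight * near_value
```

This is only a sketch of the subtraction idea; a practical device would fit the shallow contribution from the measured spectra rather than use a fixed weight.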
Emitter 110 includes an optical transmitter (e.g., a light source) that transmits an optical signal toward a measurement target (not shown). The optical signal may comprise multiple wavelength ranges of visible light, near-infrared light, mid-infrared light, and/or the like. In some implementations, emitter 110 may include a light emitting diode (LED) or an array of LEDs.
Sensor 120 includes a device capable of performing a measurement of light directed toward sensor 120 (e.g., via filter 130), such as an optical sensor, a spectral sensor, an image sensor, and/or the like. Sensor 120 may utilize one or more sensor technologies, such as a complementary metal-oxide-semiconductor (CMOS) technology, a charge-coupled device (CCD) technology, and/or the like. In some implementations, sensor 120 may include multiple sensor elements (e.g., an array of sensor elements—referred to as a sensor array) each configured to obtain information. For example, a sensor element may provide an indication of intensity of light that is incident on the sensor element (e.g., active/inactive or a more granular indication of intensity), such as an electrical signal.
Filter 130 includes a spectral filter, a multispectral filter, a bandpass filter, a blocking filter, a long-wave pass filter, a short-wave pass filter, a dichroic filter, a linear variable filter (LVF), a circular variable filter (CVF), a Fabry-Perot filter, a Bayer filter, and/or the like. Filter 130 may pass one or more wavelengths of light for sensing by sensor 120. In some implementations, filter 130 may include multiple, different filters that are configured to pass respective spectral ranges to sensor 120. For example, filter 130 may include a binary filter, such as a binary multispectral filter.
Processor 140 is described in more detail in connection with
Emitters 110 of sensor device 100 may be spatially separated from each other so as to achieve different displacements 150, 160, and 170 relative to sensor 120. For example, Emitter 1 is associated with a displacement D1, Emitter 2 is associated with a displacement D2, and so on, where each of these displacements is different from the other displacements. The different displacements D1 through DN may enable measurement at different penetration depths by sensor 120. In some aspects, the displacements D1 through DN may be selected based on a desired penetration depth and/or a material composition of the material being measured. For example, a displacement of approximately 1 mm may be used for a shallow skin layer measurement, whereas a displacement of approximately 200 mm may be used for a deep muscle measurement.
In some implementations, an emitter 110 associated with a larger displacement from sensor 120 may emit more optical power than an emitter 110 associated with a smaller displacement from sensor 120. For example, Emitter 2 may use a higher transmission power than Emitter 1. This may improve measurement signal-to-noise ratio and enable the determination of measurements at deeper penetration depths than if each emitter 110 used the same transmission power.
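As a rough illustration of displacement-dependent transmission power, the following sketch scales power with the square of displacement to offset geometric attenuation. The quadratic relationship, the parameter names, and the units are assumptions for illustration, not part of the described device:

```python
def emitter_power(displacement_mm, base_power_mw=1.0, base_displacement_mm=1.0):
    """Assign a higher transmission power to an emitter at a larger
    displacement from the sensor.

    Assumption: power scales with the square of displacement to roughly
    offset geometric attenuation; the real scaling depends on the optics
    of the measurement target.
    """
    return base_power_mw * (displacement_mm / base_displacement_mm) ** 2
```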
Sensor device 100 may employ a variety of techniques, or a combination of techniques, to differentiate the light transmitted by the emitters 110 and/or the measurements determined using the light. For example, sensor device 100 may activate emitters 110 at different times or in a particular pattern, and may determine measurements for emitters 110 based on the times or the particular pattern. As another example, emitters 110 may emit light at different wavelengths, and filter 130 may filter the different wavelengths so that wavelengths corresponding to different emitters 110 are detected by different regions of sensor 120 or at different times.
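The time-based differentiation technique can be sketched as follows, with a hypothetical `Emitter` stub standing in for an emitter driver. Activating one emitter at a time lets each sensor reading be attributed to a single emitter, and hence to a single displacement:

```python
class Emitter:
    """Hypothetical stand-in for an emitter driver (names are illustrative)."""

    def __init__(self, name):
        self.name = name
        self.active = False

    def on(self):
        self.active = True

    def off(self):
        self.active = False


def time_multiplexed_readings(emitters, read_sensor):
    """Activate one emitter at a time and tag each reading with the
    emitter that produced it, so measurements at different displacements
    can be separated."""
    readings = {}
    for emitter in emitters:
        emitter.on()
        readings[emitter.name] = read_sensor(emitter)
        emitter.off()
    return readings
```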
As further shown, the plurality of sensors 220 are spaced from emitter 210 at different displacements D1 through DN (shown by reference numbers 240, 250, and 260). For example, D1 through DN may be different from each other, thereby enabling multi-distant spectral measurement of a measurement target. In this case, a sensor 220 farther from emitter 210 (e.g., with a larger displacement) may use a longer integration time for a measurement than a sensor 220 closer to emitter 210 (e.g., with a smaller displacement), which may improve a signal-to-noise ratio for the sensor 220 farther from emitter 210 due to the dimmer signal associated with the larger displacement.
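A minimal sketch of displacement-dependent integration time follows. The linear scaling against the smallest displacement is an assumption for illustration; the actual relationship between displacement and integration time would be determined empirically:

```python
def assign_integration_times(displacements_mm, base_time_ms=10.0):
    """Assign a longer integration time to sensors at larger displacements,
    compensating for the dimmer signal at a larger emitter/sensor
    separation. Scaling linearly against the nearest sensor's displacement
    is an illustrative assumption."""
    nearest = min(displacements_mm)
    return {d: base_time_ms * (d / nearest) for d in displacements_mm}
```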
In some aspects, sensor 320 may include a single sensor. For example, sensor 320 may be monolithic and/or may be associated with a single processor, a single backend, a single chip, and/or the like. This may be less costly than implementing a plurality of sensors. In some aspects, sensor 320 may be a composite of multiple sensors, such as a plurality of sensors that are combined to form a sensor, which may be less complex than implementing a single, larger sensor. Each filter 330 may be associated with a respective sensing location. A sensing location may correspond to a range of pixels, an optical detector or set of optical detectors, and/or the like.
Optical waveguide 510/520 includes a device capable of guiding light from one location to another location. For example, optical waveguide 510/520 may include an optical pickup, a light pipe, an optical fiber, and/or the like. Collimator 560 is a device capable of collimating light received via optical waveguide 510/520. For example, collimator 560 may include collimating optics and/or the like. An optical waveguide 510/520 may receive light generated by emitter 530 and may guide the light to collimator 560. For example, and as shown, optical waveguide 510 may guide light via a longer horizontal displacement than optical waveguide 520, meaning that displacement D1, shown by reference number 570, is larger than displacement D2, shown by reference number 580. Thus, sensor device 500 may achieve a variety of displacements using optical waveguides, which may allow for sensors and emitters to be provided in a smaller form factor. Furthermore, a plurality of displacements can be achieved using a single sensor that may be smaller than the sensor 320 described in connection with
In some aspects, a characteristic of a filter or a sensor may be based on a displacement of the filter or sensor from an emitter. For example, a sensor associated with a larger displacement may be associated with a larger pixel size than a pixel size of a sensor associated with a smaller displacement, which may improve light collection to accommodate the dimmer signal, and enable larger displacements. As another example, a filter associated with a larger displacement may be associated with a wider filter channel to allow increased light collection, thereby accommodating dimmer signals and enabling larger displacements.
A sensor device (e.g., sensor device 100/200/300/400/500) may use a variety of techniques, or a combination of techniques, to differentiate the light transmitted by the emitters and/or the measurements determined using the light. For example, the sensor device may activate emitters at different times or in a particular pattern, and may determine measurements for emitters based on the times or the particular pattern. As another example, emitters may emit light at different wavelengths, and a filter may filter the different wavelengths so that wavelengths corresponding to different emitters are detected by different regions of the sensor or at different times.
By providing multiple different displacements between sensors and emitters, different penetration depths for measurements on a measurement target may be achieved. For example, as shown by example 600 of
As indicated above,
Control device 710 includes one or more devices capable of storing, processing, and/or routing information associated with spectroscopic measurement. For example, control device 710 may include a server, a computer, a wearable device, a cloud computing device, and/or the like. In some implementations, control device 710 may be associated with a particular sensor device 720. In some implementations, control device 710 may be associated with multiple sensor devices 720. In some implementations, control device 710 may receive information from and/or transmit information to another device in environment 700, such as sensor device 720.
Sensor device 720 includes one or more devices capable of performing a spectroscopic measurement on a sample. For example, sensor device 720 may include a spectrometer device that performs spectroscopy (e.g., vibrational spectroscopy, such as a near infrared (NIR) spectrometer, a mid-infrared spectroscopy (mid-IR), Raman spectroscopy, and/or the like). In some implementations, sensor device 720 may be incorporated into a wearable device, such as a wearable spectrometer and/or the like. In some implementations, sensor device 720 may receive information from and/or transmit information to another device in environment 700, such as control device 710. In some implementations, sensor device 720 includes one or more of the components described in connection with sensor device 100/200/300/400/500.
Network 730 may include one or more wired and/or wireless networks. For example, network 730 may include a cellular network (e.g., a long-term evolution (LTE) network, a 3G network, a code division multiple access (CDMA) network, etc.), a public land mobile network (PLMN), a local area network (LAN), a wide area network (WAN), a metropolitan area network (MAN), a telephone network (e.g., the Public Switched Telephone Network (PSTN)), a private network, an ad hoc network, an intranet, the Internet, a fiber optic-based network, a cloud computing network, and/or the like, and/or a combination of these or other types of networks.
The number and arrangement of devices and networks shown in
Bus 810 includes a component that permits communication among multiple components of device 800. Processor 820 is implemented in hardware, firmware, and/or a combination of hardware and software. Processor 820 is a central processing unit (CPU), a graphics processing unit (GPU), an accelerated processing unit (APU), a microprocessor, a microcontroller, a digital signal processor (DSP), a field-programmable gate array (FPGA), an application-specific integrated circuit (ASIC), or another type of processing component. In some implementations, processor 820 includes one or more processors capable of being programmed to perform a function. Memory 830 includes a random access memory (RAM), a read only memory (ROM), and/or another type of dynamic or static storage device (e.g., a flash memory, a magnetic memory, and/or an optical memory) that stores information and/or instructions for use by processor 820.
Storage component 840 stores information and/or software related to the operation and use of device 800. For example, storage component 840 may include a hard disk (e.g., a magnetic disk, an optical disk, and/or a magneto-optic disk), a solid state drive (SSD), a compact disc (CD), a digital versatile disc (DVD), a floppy disk, a cartridge, a magnetic tape, and/or another type of non-transitory computer-readable medium, along with a corresponding drive.
Input component 850 includes a component that permits device 800 to receive information, such as via user input (e.g., a touch screen display, a keyboard, a keypad, a mouse, a button, a switch, and/or a microphone). Additionally, or alternatively, input component 850 may include a component for determining location (e.g., a global positioning system (GPS) component) and/or a sensor (e.g., an accelerometer, a gyroscope, an actuator, another type of positional or environmental sensor, and/or the like). Output component 860 includes a component that provides output information from device 800 (via, e.g., a display, a speaker, a haptic feedback component, an audio or visual indicator, and/or the like).
Communication interface 870 includes a transceiver-like component (e.g., a transceiver, a separate receiver, a separate transmitter, and/or the like) that enables device 800 to communicate with other devices, such as via a wired connection, a wireless connection, or a combination of wired and wireless connections. Communication interface 870 may permit device 800 to receive information from another device and/or provide information to another device. For example, communication interface 870 may include an Ethernet interface, an optical interface, a coaxial interface, an infrared interface, a radio frequency (RF) interface, a universal serial bus (USB) interface, a wireless local area network interface, a cellular network interface, and/or the like.
Device 800 may perform one or more processes described herein. Device 800 may perform these processes based on processor 820 executing software instructions stored by a non-transitory computer-readable medium, such as memory 830 and/or storage component 840. As used herein, the term “computer-readable medium” refers to a non-transitory memory device. A memory device includes memory space within a single physical storage device or memory space spread across multiple physical storage devices.
Software instructions may be read into memory 830 and/or storage component 840 from another computer-readable medium or from another device via communication interface 870. When executed, software instructions stored in memory 830 and/or storage component 840 may cause processor 820 to perform one or more processes described herein. Additionally, or alternatively, hardware circuitry may be used in place of or in combination with software instructions to perform one or more processes described herein. Thus, implementations described herein are not limited to any specific combination of hardware circuitry and software.
The number and arrangement of components shown in
As shown in
As further shown in
As further shown in
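The overall flow of process 900 (determine two optical sensor values at different displacements, derive measurements relating to two penetration depths, and provide them) can be sketched as follows. Here `read_value`, `depth_model`, and `report` are hypothetical callables supplied by the caller, not part of the described method:

```python
def process_900(read_value, depth_model, report):
    # Determine a first and a second optical sensor value, each
    # associated with a different emitter/sensor displacement.
    first_value = read_value("d1")   # smaller displacement, shallower depth
    second_value = read_value("d2")  # larger displacement, deeper depth

    # Determine measurements relating to the two penetration depths.
    measurements = {
        "first_depth": depth_model(first_value),
        "second_depth": depth_model(second_value),
    }

    # Provide information identifying the one or more measurements.
    report(measurements)
    return measurements
```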
Process 900 may include additional implementations, such as any single implementation or any combination of implementations described below and/or in connection with one or more other processes described elsewhere herein.
In a first implementation, the emitter associated with the first optical sensor value and the emitter associated with the second optical sensor value are a same emitter.
In a second implementation, alone or in combination with the first implementation, the emitter associated with the first optical sensor value and the emitter associated with the second optical sensor value are different emitters.
In a third implementation, alone or in combination with one or more of the first and second implementations, the sensing location used to determine the first optical sensor value and the sensing location used to determine the second optical sensor value are a same sensing location.
In a fourth implementation, alone or in combination with one or more of the first through third implementations, the sensing location used to determine the first optical sensor value and the sensing location used to determine the second optical sensor value are different sensing locations.
In a fifth implementation, alone or in combination with one or more of the first through fourth implementations, the emitter associated with the first optical sensor value and the emitter associated with the second optical sensor value are a same emitter, and the first optical sensor value and the second optical sensor value are determined using a single optical sensor.
In a sixth implementation, alone or in combination with one or more of the first through fifth implementations, the first displacement is greater than the second displacement, and the first optical sensor value is determined using an integration time that is longer than an integration time used to determine the second optical sensor value.
In a seventh implementation, alone or in combination with one or more of the first through sixth implementations, the first displacement is greater than the second displacement, and the emitter associated with the first optical sensor value uses a greater transmission power than a transmission power used by the emitter associated with the second optical sensor value.
In an eighth implementation, alone or in combination with one or more of the first through seventh implementations, the one or more measurements comprise one or more spectroscopy measurements.
In a ninth implementation, alone or in combination with one or more of the first through eighth implementations, the sensor device includes a first optical waveguide connecting the sensing location used to determine the first optical sensor value and an optical sensor of the one or more optical sensors, and a second optical waveguide connecting the sensing location used to determine the second optical sensor value and the optical sensor.
In a tenth implementation, alone or in combination with one or more of the first through ninth implementations, the sensor device includes a first optical filter associated with an optical sensor of the one or more optical sensors, and a second optical filter associated with the optical sensor, wherein the first optical filter corresponds to the sensing location associated with the first optical sensor value and the second optical filter corresponds to the sensing location associated with the second optical sensor value.
In an eleventh implementation, alone or in combination with one or more of the first through tenth implementations, the first optical filter and the second optical filter cover different regions of the optical sensor.
In a twelfth implementation, alone or in combination with one or more of the first through eleventh implementations, the first optical filter is configured to filter light from the first optical waveguide and the second optical filter is configured to filter light from the second optical waveguide.
In a thirteenth implementation, alone or in combination with one or more of the first through twelfth implementations, the one or more optical sensors are a single optical sensor.
In a fourteenth implementation, alone or in combination with one or more of the first through thirteenth implementations, the one or more optical sensors comprise a single optical sensor, and the emitter used to determine the first optical sensor value and the emitter used to determine the second optical sensor value are different emitters.
In a fifteenth implementation, alone or in combination with one or more of the first through fourteenth implementations, the one or more optical sensors comprise a plurality of optical sensors, and the emitter used to determine the first optical sensor value and the emitter used to determine the second optical sensor value comprise a same emitter.
In a sixteenth implementation, alone or in combination with one or more of the first through fifteenth implementations, the one or more optical sensors comprise a plurality of optical sensors, and the emitter used to determine the first optical sensor value and the emitter used to determine the second optical sensor value are different emitters.
In a seventeenth implementation, alone or in combination with one or more of the first through sixteenth implementations, each optical sensor of the plurality of optical sensors is associated with a respective filter.
In an eighteenth implementation, alone or in combination with one or more of the first through seventeenth implementations, the sensor device includes one or more collimators used to collimate light for determining the first optical sensor value and the second optical sensor value.
The foregoing disclosure provides illustration and description, but is not intended to be exhaustive or to limit the implementations to the precise forms disclosed. Modifications and variations may be made in light of the above disclosure or may be acquired from practice of the implementations.
As used herein, the term “component” is intended to be broadly construed as hardware, firmware, and/or a combination of hardware and software.
As used herein, satisfying a threshold may, depending on the context, refer to a value being greater than the threshold, more than the threshold, higher than the threshold, greater than or equal to the threshold, less than the threshold, fewer than the threshold, lower than the threshold, less than or equal to the threshold, equal to the threshold, or the like.
It will be apparent that systems and/or methods described herein may be implemented in different forms of hardware, firmware, or a combination of hardware and software. The actual specialized control hardware or software code used to implement these systems and/or methods is not limiting of the implementations. Thus, the operation and behavior of the systems and/or methods are described herein without reference to specific software code—it being understood that software and hardware can be designed to implement the systems and/or methods based on the description herein.
Even though particular combinations of features are recited in the claims and/or disclosed in the specification, these combinations are not intended to limit the disclosure of various implementations. In fact, many of these features may be combined in ways not specifically recited in the claims and/or disclosed in the specification. Although each dependent claim listed below may directly depend on only one claim, the disclosure of various implementations includes each dependent claim in combination with every other claim in the claim set.
No element, act, or instruction used herein should be construed as critical or essential unless explicitly described as such. Also, as used herein, the articles “a” and “an” are intended to include one or more items, and may be used interchangeably with “one or more.” Further, as used herein, the article “the” is intended to include one or more items referenced in connection with the article “the” and may be used interchangeably with “the one or more.” Furthermore, as used herein, the term “set” is intended to include one or more items (e.g., related items, unrelated items, a combination of related and unrelated items, etc.), and may be used interchangeably with “one or more.” Where only one item is intended, the phrase “only one” or similar language is used. Also, as used herein, the terms “has,” “have,” “having,” or the like are intended to be open-ended terms. Further, the phrase “based on” is intended to mean “based, at least in part, on” unless explicitly stated otherwise. Also, as used herein, the term “or” is intended to be inclusive when used in a series and may be used interchangeably with “and/or,” unless explicitly stated otherwise (e.g., if used in combination with “either” or “only one of”).
Inventors: Houck, William D.; Von Gunten, Marc K.; Hruska, Curtis R.; Smith, Valton
Executed on | Assignor | Assignee | Conveyance | Reel/Frame
Mar 25 2020 | SMITH, VALTON | Viavi Solutions Inc | Assignment of assignors' interest (see document for details) | 052238/0332
Mar 25 2020 | HOUCK, WILLIAM D | Viavi Solutions Inc | Assignment of assignors' interest (see document for details) | 052238/0332
Mar 25 2020 | HRUSKA, CURTIS R | Viavi Solutions Inc | Assignment of assignors' interest (see document for details) | 052238/0332
Mar 26 2020 | VON GUNTEN, MARC K | Viavi Solutions Inc | Assignment of assignors' interest (see document for details) | 052238/0332
Mar 26 2020 | VIAVI SOLUTIONS INC. | (assignment on the face of the patent)