A method for determining absolute orientation of a platform is disclosed. In one embodiment, a first sky polarization data set for a first time Ti is measured using a sky polarization sensor disposed on a platform. A second sky polarization data set is obtained at a second time Tj. A difference in orientation between the first sky polarization data set and the second sky polarization data set is determined using an orientation determiner. The difference in orientation is provided as at least one orientation parameter for the platform at time Tj. The at least one orientation parameter is used to provide a direction relative to a reference point on the platform.
25. A method for obtaining position data for a target with a position determining device, comprising:
acquiring a first position of a total station, the total station comprising a sky polarization sensor disposed in a known spatial relationship to said total station;
determining a first absolute orientation of said total station based on a sky polarization data set acquired using the sky polarization sensor;
translating said first absolute orientation into orientation information in a coordinate system at said total station;
capturing a distance to a target using a visual sighting device of said total station comprising an internal distance measuring system; and
calculating a position of said target based on said first position, said distance to said target, and said orientation information.
26. A method for obtaining position data for a target with a position determining system, comprising:
determining a position of a first location using a global navigation satellite system (GNSS) receiver of the position determining system;
determining a distance to a target from a visual sighting device of the position determining system using an internal distance measuring system of the visual sighting device;
determining an absolute orientation of the visual sighting device based on a sky polarization data measurement using a sky polarization sensor of said position determining system; and
calculating a position of said target based on said position of said first location, said distance from said visual sighting device to said target, and said absolute orientation of said visual sighting device.
28. A surveying system comprising:
a visual sighting device comprising a distance measuring component configured to measure a distance from the visual sighting device to a target;
a polarization sensor disposed in a known spatial relationship with said visual sighting device;
an orientation determiner configured to determine an absolute orientation of said visual sighting device based upon at least one sky polarization data set captured by said polarization sensor;
a global navigation satellite system (GNSS) receiver configured to determine a position at which said surveying system is located; and
a position determiner configured to determine a position of a target based upon said absolute orientation, said distance from the visual sighting device to said target, and the position at which said surveying system is located.
1. A method for obtaining position data for a target, comprising:
orienting a visual sighting device and a sky polarization sensor toward a target, wherein said visual sighting device comprises a range measurement system and is disposed in a known spatial relationship with said sky polarization sensor;
determining an absolute orientation of said visual sighting device based on a sky polarization data measurement using the sky polarization sensor;
determining a distance from said visual sighting device to said target using the range measurement system;
determining a position of a first location having a known spatial relationship to said visual sighting device; and
calculating a position of said target based on said position of said first location, said distance from said visual sighting device to said target, and said absolute orientation of said visual sighting device.
32. A method for determining a position of a target comprising:
orienting a visual sighting device and a sky polarization sensor toward a known location, wherein said visual sighting device comprises a range measurement system and is disposed in a known first spatial relationship with said sky polarization sensor;
determining an absolute orientation of said visual sighting device based on a sky polarization data measurement using the sky polarization sensor;
determining a distance from said visual sighting device to said known location;
accessing a set of coordinates of said known location;
determining a second spatial relationship between said visual sighting device and said target when said distance to said known location is determined; and
calculating a position of said target based on said set of coordinates, said distance to said known location, said known first spatial relationship, said second spatial relationship, and said absolute orientation of said visual sighting device.
27. A method for obtaining position data for a target, comprising:
determining a position of a first location of a visual sighting device and a sky polarization sensing device having a known spatial relationship with respect to the visual sighting device;
determining a first absolute orientation of the visual sighting device when the visual sighting device is at the first location and oriented toward a target, wherein the first absolute orientation is determined based on a first sky polarization data measurement using the sky polarization sensing device;
determining a position of a second location of said visual sighting device;
determining a second absolute orientation of the visual sighting device when the visual sighting device is at the second location and oriented toward the target, wherein the second absolute orientation is determined based on a second sky polarization data measurement using the sky polarization sensing device; and
calculating a position of said target based upon the position of said first location, said first absolute orientation of the visual sighting device, the position of said second location, and said second absolute orientation of the visual sighting device.
2. The method of
3. The method of
4. The method of
establishing said visual sighting device approximately above said first location without leveling said visual sighting device and wherein there is a known distance and orientation between said first location and said optical survey instrument.
5. The method of
establishing said visual sighting device vertically above said first location and leveled, and wherein there is a known distance between said first location and said optical survey instrument.
6. The method of
7. The method of
8. The method of
9. The method of
10. The method of
11. The method of
12. The method of
using a processing system to translate said sky polarization data measurement into an orientation in a coordinate system for said visual sighting device.
13. The method of
translating said orientation into the World Geodetic System (WGS) 84 coordinate system.
14. The method of
15. The method of
receiving a sky polarization data set at a known time T(i);
accessing a sun almanac to determine azimuth and elevation angles of the sun at said time T(i) and for said first location;
calculating a plurality of predicted polarization values for a plurality of respective sample points across the sky at said time T(i) and for said first location;
correlating a sky polarization measurement captured using said sky polarization sensor with at least one of said plurality of predicted polarization values, and calculating translation and rotation parameters of a transform between said sky polarization measurement and said at least one of said plurality of predicted polarization values;
converting said translation and rotation parameters of said transform from image-space to real-world coordinate frame; and
adding said rotation parameters in the real-world coordinate frame to the azimuth and elevation angles determined from said sun almanac, to determine the yaw, pitch and roll orientation parameters for said visual sighting device.
16. The method of
determining at least one orientation angle to a remote target wherein said at least one orientation angle is matched to a coordinate system by a setup procedure comprising:
establishing a platform vertically over a known position;
measuring a vertical distance from said known position to said platform;
determining a level orientation for said sky polarization sensor, as determined by a bubble-level; and
establishing said orientation of said platform in terms of said coordinate system by aiming said visual sighting device at a backsight of either a known azimuth or a known position while measuring said sky polarization data set.
17. The method of
using a polarization reference station, comprising a second sky polarization sensor having measurement axes which are aligned in a known orientation with respect to True North and a local gravity vector, to generate a sky polarization reference data set; and
storing said sky polarization reference data set at a sequence of known times including said known time T(i) using a first data storage system.
18. The method of
transmitting said sky polarization measurement from said polarization reference station to an orientation determiner coupled with said visual sighting device and said sky polarization sensor using a communications link.
19. The method of
transmitting said sky polarization measurement from a mobile device comprising said visual sighting device and said orientation determiner to a storage and processing system via a communications link;
sending said sky polarization reference data set from said first storage system to said storage and processing system; and
determining an orientation for said visual sighting device at said storage and processing system.
20. The method of
transmitting said sky polarization measurement from said orientation determiner to said polarization reference station;
processing said sky polarization measurement with reference data taken proximate to said known time T(i) to bring said sky polarization measurement and said sky polarization reference data set into congruence by calculating a series of axis rotations with resultant defined angles;
using said resultant defined angles as at least one orientation parameter for said visual sighting device at time T(i); and
transmitting said resultant defined angles to said orientation determiner as the orientation parameters for the time T(i) of said visual sighting device.
21. The method of
22. The method of
23. The method of
24. The method of
using a differential polarization process to determine said absolute orientation of said visual sighting device.
29. The surveying system
30. The surveying system of
31. The surveying system of
33. The method of
comparing a sky polarization measurement, captured using said sky polarization sensor, with a sky polarization reference data set to determine said absolute orientation.
34. The method of
receiving an indication of the time said sky polarization measurement was taken and an indication of the location at which said sky polarization measurement was taken.
35. The method of
receiving said sky polarization reference data set at said visual sighting device from a polarization reference source.
36. The method of
comparing a sky polarization measurement, captured using said sky polarization sensor, with a sky polarization model based upon a time when said sky polarization measurement was taken.
37. The method of
comparing a sky polarization measurement, captured using said sky polarization sensor, with a second sky polarization measurement captured by a second sky polarization sensor having a known absolute orientation.
38. The method of
storing said sky polarization model at said visual sighting device.
This application claims priority to U.S. Provisional Application Ser. No. 61/674,563, entitled “USE OF A SKY POLARIZATION SENSOR FOR ABSOLUTE ORIENTATION DETERMINATION IN POSITION DETERMINING SYSTEMS,” with filing date Jul. 23, 2012, by Peter Glenn France, assigned to the assignee of the present application, and hereby incorporated by reference in its entirety.
Position determination is required for the execution of a variety of activities. Some survey and mapping activities include determining the position of objects in the world, either near the position determination device or some distance away. Other activities include navigation or stakeout, where the coordinates of a destination are known, and the operator is guided to that destination by iteratively determining the position of a device and comparing that position with the destination coordinates. The position determination process in optical surveying typically starts from a known location, called a Point of Beginning (POB). The POB may have 2 or 3 coordinates (e.g., X, Y, and Z coordinates) in a specified coordinate system. Such a coordinate system may be a local one, where the location of a given starting point is given in "northing and easting" and height relative to a predetermined datum, or one in which locations are given in latitude, longitude, and height or altitude. A typical optical survey system is set up and leveled directly over the POB, and the vertical distance between the POB and the instrument is measured, often with a tape measure. The system uses a visual sighting instrument in the form of a telescope with a built-in electronic distance measurement system to sight to a target located away from the POB. The telescope is coupled to a set of angle measurement devices that give an elevation angle (also known as pitch) and a horizontal angle. The raw horizontal angle is relative, not absolute. To convert these relative horizontal angles into absolute azimuths (that is, angles with respect to true north), the first operation is to measure the horizontal angle in a known direction. This is usually accomplished by sighting to another known reference point, known as the backsight. These measurements are made on a tripod platform which enables the survey instrument to be manually leveled so that the vertical axis of rotation for horizontal angle determination is aligned with the local gravity vector. Leveling also ensures that the elevation angle is absolute (i.e., with respect to local gravity). Alignment of a local vertical axis on the measurement device with the gravity vector has been the standard way to create a reference plane from which azimuth and elevation angles can be measured. After measuring the angles and distance to a target, well-known geometric calculations determine the location of the first target relative to the POB. The survey process may continue by moving the survey instrument to directly over the first target location, repeating the leveling process for the tripod mounting system, and continuing to sight to the next target, again obtaining a range distance and angles to the new target.
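For illustration, the "well-known geometric calculations" mentioned above reduce to simple trigonometry. The following is a minimal Python sketch, not code from this disclosure, assuming a local northing/easting/height frame, an azimuth measured clockwise from true north, an elevation angle measured up from the horizontal plane, and a tape-measured instrument height; all names are illustrative.

```python
import math

def target_from_pob(pob_n, pob_e, pob_h, instrument_height,
                    azimuth_deg, elevation_deg, slope_distance):
    """Target coordinates from a Point of Beginning (POB), azimuth (clockwise
    from true north), elevation angle, and slope distance to the target."""
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    horizontal = slope_distance * math.cos(el)   # projection onto horizontal plane
    return (pob_n + horizontal * math.cos(az),                 # northing
            pob_e + horizontal * math.sin(az),                 # easting
            pob_h + instrument_height + slope_distance * math.sin(el))  # height

# Example: 100 m slope distance at azimuth 45 degrees, elevation 5 degrees
print(target_from_pob(0.0, 0.0, 10.0, 1.6, 45.0, 5.0, 100.0))
```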
The underlying assumption in the above data collection process is that the local vertical axis is well-behaved and well known, and serves as the common orientation reference by establishing a locally horizontal plane and vertical axis to which all range and horizontal angle measurements are referenced. The process always requires a re-leveling step for the survey instrument at each new target location, and the vertical distance between the new POB and the instrument is re-measured. There is quite an art to setting the instrument up so that it is both level and vertically above the POB. This setup procedure is time-consuming, and is a source of error if not accomplished accurately.
It should be noted that handheld laser rangefinders commonly measure absolute orientation and elevation angle without leveling and backsighting. However, this is achieved through electronic tilt sensors and magnetometers, which deliver only low-accuracy measurements of roughly one degree, and so are not suitable for high-accuracy positioning applications.
Total stations for surveying and construction measure relative angles to an accuracy between 0.5 and 10 arc-seconds; additional errors can be introduced by mis-leveling and imprecise target sighting as described above.
The introduction of Global Positioning System (GPS) and Global Navigation Satellite Systems (GNSS) receivers significantly reduced the need for optical surveying, as a 3D position can be determined from a single measurement of the GNSS receiver at a desired target location. But when high accuracy positioning is desired, as in most survey applications, the GNSS antenna still needs to be precisely positioned over the desired spot on the ground, or over some other physical target. Since GNSS operational practice dictates keeping the antenna at a known distance vertically above a ground target, often on a pole or rod about 2 meters in length, a different kind of leveling activity must be performed as well. Again, leveling a pole requires a bubble-level indicator as in the case of a survey tripod, and takes some time to bring the pole into a vertical alignment with sufficient accuracy to meet survey requirements. Further, there are many circumstances where the GNSS receiver cannot be located exactly at or over the target point.
GNSS receivers often do not give high-accuracy positions in environments where the satellites are not clearly in view, such as near buildings or trees. In such cases an optical instrument such as a total station can be used. Alternatively, a GNSS receiver can be combined with a total station, where the GNSS position is used to avoid the need to find an existing local POB. However, the operator of a combined GNSS total station still needs to level it and take a backsight reading.
Reference will now be made in detail to various embodiments, examples of which are illustrated in the accompanying drawings. While the subject matter will be described in conjunction with these embodiments, it will be understood that they are not intended to limit the subject matter to these embodiments. On the contrary, the subject matter described herein is intended to cover alternatives, modifications and equivalents, which may be included within the spirit and scope as defined by the appended claims. In some embodiments, all or portions of the electronic computing devices, units, and components described herein are implemented in hardware, a combination of hardware and firmware, a combination of hardware and computer-executable instructions, or the like. Furthermore, in the following description, numerous specific details are set forth in order to provide a thorough understanding of the subject matter. However, some embodiments may be practiced without these specific details. In other instances, well-known methods, procedures, objects, and circuits have not been described in detail so as not to unnecessarily obscure aspects of the subject matter.
Unless specifically stated otherwise as apparent from the following discussions, it is appreciated that throughout the present Description of Embodiments, discussions utilizing terms such as "measuring," "obtaining," "determining," "providing," "using," "calculating," "accessing," "adding," "orienting," "establishing," "translating," "receiving," "correlating," "converting," "storing," "transmitting," "sending," "processing," "acquiring," "capturing," "generating," "comparing" or the like, often (but not always) refer to the actions and processes of a computer system or similar electronic computing device. The electronic computing device manipulates and transforms data represented as physical (electronic) quantities within the electronic computing device's processors, registers, and/or memories into other data similarly represented as physical quantities within the electronic computing device's memories, registers and/or other such information storage, processing, transmission, and/or display components of the electronic computing device or other electronic computing device(s).
As will be discussed in greater detail below, in one or more embodiments, polarization sensor 101 comprises an image capture device which is configured to capture polarization data. The image capture device may comprise a photographic camera, equipped with appropriate polarization filters, which enable the image capture element, typically a charge-coupled device (CCD) or a complementary metal-oxide semiconductor (CMOS) detector, to capture images of the filtered light. In one or more embodiments, polarization sensor 101 also comprises algorithms that take those images as input and calculate intensity of polarization, degree of polarization, and/or other values. Linear polarization is of primary interest in one or more embodiments. Polarization data may be captured even if there are clouds aloft, although clouds do modify the polarization pattern. An example polarization detector is further described with reference to
In another embodiment, the polarization sensor 101 comprises a photodiode sensor with polarization filters. The system described in a reference by Lambrinos (2000) uses three pairs of photodiodes. Each pair has its two polarization filters oriented at right angles to one another. The pairs are arranged at 0, 60, and 120 degrees relative to each other. The photodiodes all view the same portion of the sky, with a field of view of about 60 degrees. This method is able to calculate orientation by using the fact that the sky polarization pattern has symmetry about the solar meridian.
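The e-vector angle recovered by such a three-direction arrangement can be expressed through the linear Stokes parameters. The following Python sketch is illustrative only and assumes ideal polarizers obeying Malus's law, I(a) = (S0 + S1·cos 2a + S2·sin 2a)/2; it is not code from the Lambrinos reference.

```python
import math

def evector_angle(i0, i60, i120):
    """Angle of polarization from intensities measured behind linear
    polarizers at 0, 60 and 120 degrees (ideal Malus-law response assumed)."""
    # Invert I(a) = (S0 + S1*cos(2a) + S2*sin(2a)) / 2 for a = 0, 60, 120 deg
    s1 = (2.0 / 3.0) * (2.0 * i0 - i60 - i120)
    s2 = (2.0 / math.sqrt(3.0)) * (i60 - i120)
    return 0.5 * math.degrees(math.atan2(s2, s1))   # e-vector angle in degrees

# Fully polarized light at 30 degrees gives intensities 0.75, 0.75, 0.0:
print(evector_angle(0.75, 0.75, 0.0))   # -> 30.0
```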
As an example, polarization sensor 101 is configured to capture data indicating the polarization of the sky at the location where device 100 is located. It is noted that embodiments are not limited to still images alone, and polarization sensor 101 can comprise a motion picture camera in at least one embodiment. In various embodiments, visual sighting device 103 comprises a telescopic sighting device with or without a lens, binoculars, a camera with an electronic visual display system, or the like configured to permit users to view a real-time representation of their surroundings. In other embodiments, visual sighting device 103 comprises a robotic target tracker configured to track a target such that as the target moves, the visual sighting device 103 can continually measure the distance and orientation to the target. In some embodiments, the robotic tracker comprises servo motors rotating visual sighting device 103, guided by an optical target identification and tracking system that recognizes the location and movement of an optical target such as a prism. Orientation determiner 104 is configured to analyze the sky polarization data received from polarization sensor 101. Orientation determiner 104 is configured to determine the orientation of device 100, or a component thereof, in three axes (e.g., a relative orientation and/or an absolute orientation of visual sighting device 103). Components and operation of orientation determiner 104 are discussed in greater detail below with reference to
In accordance with various embodiments, wireless link 107 may operate on any suitable wireless communication protocol including, but not limited to: WiFi, WiMAX, WWAN, implementations of the IEEE 802.11 specification, cellular, two-way radio, satellite-based cellular (e.g., via the Inmarsat or Iridium communication networks), mesh networking, implementations of the IEEE 802.15.4 specification for personal area networks, and implementations of the Bluetooth® standard.
In accordance with one or more embodiments, polarization reference source 120 comprises a source of polarization data which is located apart from device 100. In one or more embodiments, polarization reference source 120 is configured to utilize measurements of sky polarization data at its location along with data indicating the absolute orientation of polarization reference source 120. In at least one embodiment, polarization reference source 120 provides this information to device 100, which facilitates determining the absolute orientation of visual sighting device 103. In an embodiment, the orientation of polarization sensor 101 is determined by performing coordinate axes rotations to bring the pattern of the polarization sensor 101 into congruence with the pattern from the polarization reference source. When the polarization reference source 120 is aligned with the celestial zenith, and its True North heading reference point is aligned with True North, then the rotation angles determined when congruence is obtained become the orientation angles for the polarization sensor 101. Alternatively, in an embodiment, the vertical zenith alignment of the reference source 120 may be aligned with any other desired reference system, such as a local gravity vector. It is noted that the time at which the polarization data is captured, both at device 100 and at polarization reference source 120, is factored into the information which is sent to device 100. In another embodiment, polarization reference source 120 is configured to utilize a model of an expected polarization pattern based upon the time of day and the location at which device 100 is located and to send this data to device 100 to facilitate determining the absolute orientation of visual sighting device 103. Alternatively, this model of the expected polarization pattern based upon the time of day and the location at which device 100 is located may be derived by device 100 itself. In another embodiment, device 100 is configured to send data (e.g., raw sky polarization data, or processed sky polarization data) to polarization reference source 120, which will then determine the absolute orientation of visual sighting device 103 using either the model of the expected polarization pattern or the sky polarization data captured at polarization reference source 120. The absolute orientation of visual sighting device 103 can then be stored (e.g., at polarization reference source 120), or sent back to device 100. In various embodiments, polarization reference source 120 may be placed at a fixed location, or may be a mobile device for which the absolute orientation is determined as part of a setup procedure. It is noted that there is no necessity to determine the absolute orientation of visual sighting device 103 at the time data is captured. Instead, the sky polarization data captured by polarization sensor 101 can be stored and processed at a later time.
In an embodiment, position determination of an object remote from device 100 can be obtained with the system shown in
In
In the CAHV format, C provides a distance from a feature in a field of view to a perspective center (or entrance pupil) of an imaging device such as visual sighting device 103. The perspective center is generally on an axis passing through a lens of visual sighting device 103.
A coordinate transform may be used to convert the data from the CAHV format to an intermediate X′, Y′, and Z′ camera reference frame. In this intermediate reference frame, X′ and Y′ lie along an image plane that may be near a rear of visual sighting device 103, and Z′ extends outward along a lens axis.
In various embodiments, a second coordinate transform can be performed to convert the relative orientation of a component of device 100 (e.g., polarization sensor 101 or visual sighting device 103) to a real-world (e.g., GPS/GNSS) coordinate frame. In the real-world coordinate frame, the Z axis extends in a vertical direction parallel to a gravity vector, and the X and Y axes extend along a horizontal plane. This is shown in
The data may be converted from the real-world coordinate frame to spherical coordinates using known conversions. The radial distance r may be determined using the equation:
r = (X² + Y² + Z²)^(1/2)    Equation (1)
As shown in
Using real-world coordinates X, Y, Z, the tilt angle and tilt direction of device 100 can be determined using the equations:
Tilt Angle (theta) = arccos(Z/r)    Equation (2)
Tilt Direction (phi) = arctan(Y/X)    Equation (3)
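For illustration, Equations 1 through 3 can be applied directly; the short Python sketch below assumes the real-world frame described above (Z parallel to gravity, X and Y in the horizontal plane) and uses atan2 in place of arctan simply to preserve the quadrant of the tilt direction. The names are illustrative.

```python
import math

def tilt_from_xyz(x, y, z):
    """Tilt angle and tilt direction from real-world X, Y, Z components."""
    r = math.sqrt(x * x + y * y + z * z)             # Equation (1)
    tilt_angle = math.degrees(math.acos(z / r))      # Equation (2)
    tilt_direction = math.degrees(math.atan2(y, x))  # Equation (3), quadrant-safe
    return r, tilt_angle, tilt_direction

# A 2 m pole tilted slightly off vertical:
print(tilt_from_xyz(0.05, 0.02, 1.999))
```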
In one or more embodiments, a location of a point of interest can be determined using a location of a survey device and a tilt angle and tilt direction of the survey device. For example, referring to
The following equations can be used to determine the X and Y components of the ground offset and the Z component of offset error in at least one embodiment:
X1 = r*sin(theta)*cos(phi)    Equation (4)
Y1 = r*sin(theta)*sin(phi)    Equation (5)
Z1 = r*cos(theta)    Equation (6)
where r is the distance from the antenna phase center of GNSS receiver 111 to a tip of pole 110 of device 100.
In the embodiments of
It should be noted that leveling is typically not required in the embodiment of
In
In
In the example of
In the example of
Orientation of an object is defined by angles of rotation about three perpendicular axes associated with the object, commonly referred to as X, Y, and Z, or Easting, Northing, and altitude. In the following discussion, the term "absolute orientation" means that the Y axis of a device is aligned with true north and the Z axis is aligned with the gravity vector. Rotations about these axes are also commonly referred to as yaw, pitch, and roll.
Positioning is usually defined in terms of a coordinate system, using axes defined as latitude/longitude or northing/easting or X/Y. Altitude, height or Z, is measured relative to various datums but is usually parallel to the local gravity vector. Thus in nearly all cases, positioning coordinate systems are aligned with true north and gravity, so absolute orientations can be used directly for coordinate geometry. Another term of art is “pose.” Pose refers to the orientation of the object relative to some previously selected and defined frame of reference.
The angle sensors in traditional optical survey instruments inherently measure relative angle differences. For example, the horizontal angle is zeroed when the device is switched on; the horizontal axis simply measures angle differences between the backsight and the target, and the vertical angle is relative to the base of the instrument. This is why the instrument traditionally must be truly level before it can deliver elevation angles. Similarly, the azimuth to the backsight must be known in order to convert horizontal angle difference values into the azimuth to the target.
Alternatively, gyroscopes can be used to measure angle differences. Once the gyroscope is spinning, it tends to maintain its orientation from its initial axial alignment, and relative rotations can be measured. Unfortunately, all gyros exhibit drift over time. A variety of underlying technologies can be used, from ring laser gyros down to micro-electro-mechanical system (MEMS) gyros, each having its own characteristics such as drift rate, size, weight, power, and cost. Because of these compromises, gyros have not typically been used for angle difference measurement in land survey applications, although they are used in other applications such as airborne surveys, often combined with other sensors in an Inertial Measurement Unit (IMU).
In optical survey applications, positioning ideally requires absolute orientation (e.g., horizontal angle with respect to true north, termed azimuth, and vertical angle with respect to gravity, termed elevation angle or vertical angle). To achieve this with traditional optical instruments, each time the instrument is set up the operator uses a bubble-level to find a local vertical reference line, parallel to the gravity vector. Then the operator takes a backsight reading, which enables subsequent raw horizontal values to be converted to absolute azimuth values. Other gravity-sensing systems have been employed, such as a weighted pendulum.
A gyrocompass is another method of determining absolute azimuth. A gyrocompass (or gyrotheodolite) contains a gyroscope plus a mechanism that results in an application of torque whenever the compass's axis is not pointing north. This uses the earth's rotation in an effect known as precession. Such devices can deliver high accuracy, but doing so takes a long time (e.g., 20 minutes for 10 arc-seconds), and so they are not useful for general-purpose surveying.
GNSS-based devices using multiple antennae can also be used to determine absolute orientation. However, these devices are costly and the antennae must have large separation to get even moderate accuracy. Thus, they are not practical for many applications such as surveying in rugged terrain.
In accordance with at least one embodiment, computer vision techniques are used to measure orientation changes using a polarized camera (e.g., polarization sensor 101) pointing at the sky. Typically, unpolarized photos of the sky do not actually provide much textured information about the sky itself, and they can include a lot of detail that causes problems for correlation, such as fast-moving local clouds, buildings, trees, and overhead wires that may not be visible at both a reference and a mobile location. Polarized light measurements greatly reduce the effect of clouds, and make many earth-bound objects practically invisible because they do not emit polarized light. Images of the ground or local surroundings tend to include objects moving independently of the world, such as cars, people and shadows, which can cause severe difficulties for computer vision algorithms. Linear motion of the mobile device causes apparent rotation in some axes when nearby objects are in view, whereas sky polarization avoids that effect because the sky is effectively at infinity. Additionally, sky polarization provides a world-wide texture that is present even when the sky is clear and texture-less to the human eye. As shown in
It is well known that as the sun's rays enter Earth's atmosphere, they are scattered by molecules of air, water, dust and aerosols, forming a defined polarization pattern. The Rayleigh sky model is a well-known mathematical model of this effect. The pattern depends on the time of day and the geographical location of the observer. Sky polarization is mostly linear, rather than circular. Both the intensity (or degree) and the angle of polarization can be measured. However, the intensity varies substantially day to day, as well as being reduced by cloud cover, whereas the angle of polarization is largely unaffected by atmospheric disturbances and can be predicted with an acceptable level of precision by the Rayleigh sky model. The angle of polarization is only moderately affected by clouds, which is why insects can still navigate successfully under cloud cover.
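For illustration, the single-scattering Rayleigh sky model predicts the degree of polarization of a sky point from its angular distance to the sun. The Python sketch below is a minimal rendering of that textbook formula, not an implementation from this disclosure; the assumed maximum degree of polarization of 0.8 varies with atmospheric conditions, and the predicted e-vector lies perpendicular to the great circle through the sun and the observed point.

```python
import math

def rayleigh_dop(sun_az, sun_el, pt_az, pt_el, dop_max=0.8):
    """Predicted degree of linear polarization for one sky point under the
    single-scattering Rayleigh sky model. All angles in degrees."""
    sa, se = math.radians(sun_az), math.radians(sun_el)
    pa, pe = math.radians(pt_az), math.radians(pt_el)
    # gamma = angular distance between the sky point and the sun
    cos_g = (math.sin(se) * math.sin(pe)
             + math.cos(se) * math.cos(pe) * math.cos(sa - pa))
    return dop_max * (1.0 - cos_g ** 2) / (1.0 + cos_g ** 2)

# Polarization is zero toward the sun and strongest 90 degrees away from it:
print(rayleigh_dop(180.0, 40.0, 180.0, 40.0))   # -> 0.0 (at the sun)
print(rayleigh_dop(180.0, 40.0, 0.0, 50.0))     # -> 0.8 (90 degrees away)
```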
Digital imaging can produce high-resolution polarization maps of the sky. In one embodiment, polarization sensor 101 is configured to capture two images, each with a linearly polarized filter at 90 degrees to the other linearly polarized filter. In another embodiment, polarization sensor 101 is configured to capture three images with polarization filter angles of 0, 60 and 120 degrees respectively, or four images with polarization filter angles of 0, 45, 90 and 135 degrees respectively. In another embodiment, polarization sensor 101 is configured to capture a plurality of successive images with a polarizing filter at various respective angles. For example, polarization sensor 101 could capture three or more successive images with respective polarization filter angles of 0, 60, and 120 degrees. In one embodiment, the polarization filter can be rotated between the capture of respective images to achieve the respective polarization filter angles. It is recognized that polarization sensor 101 can capture more or fewer than three successive images in various embodiments. In another embodiment, polarization sensor 101 is configured with one image sensor and two or more polarizing filters. In this case, each of the two or more polarizing filters receives the same view as the others. In accordance with one embodiment, a single image sensor is used in conjunction with one or more splitters, each directing light to a respective filter and separate sensor. In another embodiment, polarization sensor 101 comprises an image sensor (e.g., a charge coupled device (CCD) or a complementary metal-oxide semiconductor (CMOS) device). Alternatively, polarization sensor 101 may utilize one or more cameras, each with a differently angled filter, CCD, or CMOS sensor.
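As an illustration of how such filter sets are reduced to polarization values, the sketch below computes per-pixel angle and degree of linear polarization from the four-filter (0, 45, 90, 135 degree) arrangement using the standard linear Stokes relations. It assumes the four images are co-registered NumPy arrays and is not code from this disclosure.

```python
import numpy as np

def linear_stokes(i0, i45, i90, i135):
    """Per-pixel angle and degree of linear polarization from four images
    captured behind polarizing filters at 0, 45, 90 and 135 degrees."""
    s0 = 0.5 * (i0 + i45 + i90 + i135)   # total intensity
    s1 = i0 - i90                        # horizontal/vertical preference
    s2 = i45 - i135                      # diagonal preference
    aop = 0.5 * np.arctan2(s2, s1)       # angle of polarization (radians)
    dolp = np.sqrt(s1 ** 2 + s2 ** 2) / np.maximum(s0, 1e-9)
    return aop, dolp
```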
In at least one embodiment, each pixel element of the image sensor has a respective polarizing filter at a different angle to those of neighboring pixels. This is shown in greater detail in
Sky polarization occurs at most wavelengths, but typically filters are used to restrict measurements to the blue or ultraviolet spectral range, as used by insects. Experiments have shown that the extension of the e-vector pattern of the blue sky into the areas of the sky covered by clouds is more useful to an e-vector compass when the observer responds to shorter wavelengths. Thus, in one or more embodiments, polarization sensor 101 is configured to selectively pass wavelengths in the blue to ultraviolet spectral range.
When the sun is in full view, its light intensity can cause saturation, as many sensors have limited dynamic range. Thus, in at least one embodiment, the dynamic range of the most useful parts of the sky image is maximized. In one embodiment, multiple sensor elements, each with a restricted field of view, are implemented by polarization sensor 101 so that if one sensor element is saturated by the direct image of the sun, one or more other sensor elements can independently set their exposure to capture other parts of the sky with high fidelity. In one embodiment, sensor exposure can be controlled to intentionally over-expose the parts of the image around the sun in order to optimize the dynamic range of the rest of the captured image.
Orientation determiner 104 is configured to receive the data captured by polarization sensor 101 and to derive information regarding the orientation of device 100. Typically, cloud cover does not preclude orientation from sky polarization, unlike traditional celestial navigation. Furthermore, the polarization angle pattern is unique in subsets of the sky. Therefore the whole sky need not be visible; orientation can be calculated when parts of the sky are occluded by buildings, trees, or other obstructions. Additionally, the sun itself can be occluded, yet orientation can be calculated from the sky polarization of other parts of the sky, unlike traditional sun-shots.
In one or more embodiments, using orientation determiner 104, sky polarization measurements alone can be used to calculate all three axes of rotation of device 100: yaw, pitch and roll. In contrast, celestial navigation such as a sun-shot with a sextant only determines azimuth. Elevation angle and roll are also detectable, but they are measured from the ocean horizon, which strictly speaking is a separate measurement of the gravity vector unrelated to the sun, and which is unavailable to most land survey applications. To understand the difference, consider that the sun's location is a single point, so it is necessary to also measure gravity (by sighting the ocean horizon) in order to measure elevation angle, look up the almanac with an approximate location, and determine the azimuth to the sun. In contrast, the capture of sky polarization is not based upon a single point; it is a 3D hemisphere, which can be imaged as a two-dimensional surface. Thus it follows that rotations of the measuring device in all three axes can be determined by, for example, correlating the captured sky polarization image from device 100 with a reference sky polarization model. Furthermore, the gravity vector does not need to be measured when correlating the captured sky polarization image with the reference sky polarization model.
In one or more embodiments, orientation determiner 104 measures the angle of polarization received at polarization sensor 101 in addition to, or instead of, measuring the intensity of polarization. Then, instead of simply looking for meridian symmetry, orientation determiner 104 correlates the measured values with the predicted values from a model such as the Rayleigh sky model. Alternatively, orientation determiner 104 correlates the measured values with a second set of measured values captured at a separate location. In one or more embodiments, orientation determiner 104 receives the predicted values (e.g., from a Rayleigh model) or the second set of measured values from polarization reference source 120 via wireless link 107.
To use this information to determine orientation, orientation determiner 104 performs a correlation process to compute the relative rotations of device 100 around all three axes that result in the best match between the measurements captured by polarization sensor 101 and the model values, or second set of measurements. Those relative rotations comprise the orientation of device 100 relative to the sun.
In the method described above, orientation determiner 104 calculates the orientation of device 100 relative to the sun. Of course, the sun is in constant motion across the sky, rotating by about 15 arc-seconds per second of time. Absolute orientation of device 100 (e.g., with respect to true north and gravity) can be determined by orientation determiner 104 if the azimuth and elevation angle of the sun are known. In one or more embodiments, these values can be accurately calculated by orientation determiner 104 using a solar almanac, if the time and approximate location of device 100 on earth are known. Furthermore, based upon the known spatial relationship between polarization sensor 101 and visual sighting device 103, the absolute orientation of visual sighting device 103 can also be determined. In one or more embodiments, GNSS receivers (e.g., 111 of
It is recognized that observations of sky polarization (e.g., captured by polarization sensor 101) do not always exactly match the Rayleigh model; this can be due to various atmospheric factors such as double- or multiple-scattering effects. Pollution also causes variations in the number and size of aerosol particles, which also causes differences from simple models of sky polarization. Cloud cover and differences in ground albedo also have an effect on sky polarization measurements. These effects can constrain the orientation accuracy achievable from measuring the polarization pattern and comparing it to theoretical models. In one or more embodiments, rather than attempting to improve the theoretical models, one sky polarization measurement (e.g., captured by polarization sensor 101) is correlated by orientation determiner 104 against another sky polarization measurement (e.g., captured by polarization reference source 120), rather than against a theoretical model.
One advantage in using orientation determiner 104 to correlate a plurality of sky polarization measurements is that two contemporary real-sky measurements will typically be much more alike under most conditions, compared to the technique of comparing a real-sky measurement (e.g., captured by polarization sensor 101) with a model such as the Rayleigh model. This is because the factors that cause local variance from the ideal Rayleigh model are likely to be similar in both measurements. In other words, the factors that cause a variance from the Rayleigh model at device 100 are also likely to cause a similar variance from the Rayleigh model at the site where the polarization reference source 120 measurement is captured. For example, heavy cloud, pollution and city lights may cause the sky polarization to diverge significantly from a model, but two measurements (from devices nearby in time and space) will both see sky polarization affected by the same environment, and so will be relatively similar and more highly correlated. For this reason, differential correlation algorithms utilized by orientation determiner 104 will be much more robust, and will deliver higher-accuracy results. Another advantage of the differential technique is that polarization model values do not have to be calculated. In contrast, the model-comparison method should ideally re-calculate the model for each orientation update, which is computationally expensive at high update rates.
There are two important methods of employing the differential approach in accordance with various embodiments: absolute referenced differential orientation and autonomous relative orientation.
Instead of a single polarization measurement device, two devices are used in one or more embodiments. One device is a stationary reference (e.g., polarization reference source 120 of
The static reference device (e.g., polarization reference source 120) may be permanently emplaced at a known location, in much the same way as GNSS reference stations are installed. This would make it easy to co-locate with GNSS reference stations and share their power, computing, and communications resources to provide sky polarization data to device 100. However, other configurations are possible, especially if it is desired to locate polarization reference source 120 close to device 100. For example, a polarization reference source 120 could be established near a construction site. In one embodiment, polarization reference source 120 can be vehicle-mounted so that it is nearby when a field worker exits the vehicle to perform a survey. The only requirement is to establish the absolute orientation of polarization reference source 120, which can be done in the traditional way by leveling over a control point and backsighting, or by using a twin-GNSS system, where orientation is calculated by using two GNSS antennae separated by a distance (e.g., the Trimble SPS555H receiver). This setup of polarization reference source 120 only needs to be done once, regardless of where device 100 is moved. Thus it is still much more productive than traditional optical surveying, which has to be leveled and backsighted every time the instrument is moved.
Differential measurement correlation can also be performed by orientation determiner 104 on measurements from a single device (e.g., polarization sensor 101) over time. In one embodiment, polarization sensor 101 measures the sky polarization at times t1 and t2, then orientation determiner 104 computes the change in orientation of device 100 over that duration. This is advantageous in that no reference device (e.g., polarization reference source 120) and no data communications are required. It also enables a relatively high update rate, without being throttled by the data transfer speed from a remote resource. The downside is that it delivers only relative orientation changes. However, the relative orientation values are typically sufficiently accurate, and atmospheric changes at device 100 will cause only slow drift.
If high accuracy is required or there is a significant time between the measurements, then the solar movement from frame to frame can be calculated by orientation determiner 104 and applied to the relative orientation. The solar movement is simply the difference between the azimuth and elevation angles calculated from the solar almanac for the location and times.
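For illustration, the solar movement correction described above only needs the sun's azimuth and elevation at the two measurement times. The sketch below uses a deliberately low-precision solar position approximation (it ignores the equation of time and atmospheric refraction, so it is good to roughly a degree); survey-grade work of the kind discussed here would use a published solar ephemeris instead. All function and variable names are illustrative.

```python
import math
from datetime import datetime, timezone

def sun_position(lat_deg, lon_deg, when_utc):
    """Approximate solar azimuth/elevation in degrees (low-precision sketch)."""
    day = when_utc.timetuple().tm_yday
    hours = when_utc.hour + when_utc.minute / 60.0 + when_utc.second / 3600.0
    # Approximate solar declination for the day of year
    decl = math.radians(-23.44) * math.cos(math.radians(360.0 / 365.0 * (day + 10)))
    # Hour angle: zero at local solar noon, +15 degrees per hour afterwards
    ha = math.radians(15.0 * (hours + lon_deg / 15.0 - 12.0))
    lat = math.radians(lat_deg)
    sin_el = (math.sin(lat) * math.sin(decl)
              + math.cos(lat) * math.cos(decl) * math.cos(ha))
    az = math.atan2(-math.cos(decl) * math.sin(ha) * math.cos(lat),
                    math.sin(decl) - math.sin(lat) * sin_el)
    return math.degrees(az) % 360.0, math.degrees(math.asin(sin_el))

# Solar movement between two frames is the difference of two almanac look-ups:
t1 = datetime(2023, 6, 21, 12, 0, tzinfo=timezone.utc)
t2 = datetime(2023, 6, 21, 12, 10, tzinfo=timezone.utc)
(az1, el1), (az2, el2) = sun_position(51.48, 0.0, t1), sun_position(51.48, 0.0, t2)
print(az2 - az1, el2 - el1)   # correction to apply to the relative orientation
```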
Each measurement of sky polarization is actually a set of data values over a spatial region (the region's shape is determined by the lens of polarization sensor 101). The data values calculated are angle or intensity of polarization, or potentially other Stokes parameters. When using camera-based measurement, these values are typically calculated per pixel, so it is convenient to think of the measurement data as an image, and use image processing techniques when correlating sky polarization measurements.
The mobile measurement (e.g., taken by device 100) is taken with unknown orientation. In theory, the mobile image can be rotated in all three axes until it matches the static reference measurement provided, for example, by polarization reference source 120. As an example, assume that the measurements captured by device 100 are taken roughly upwards (towards zenith) with a fish-eye lens. If polarization reference source 120 and device 100 have the same orientation (for example, level and with true north at the top), then there will be negligible difference between the images. But if device 100 is rotated 20 degrees horizontally (i.e., around the vertical axis), then the image correlation performed by orientation determiner 104 will detect a rotation of 20 degrees in the horizontal plane. Similarly, pitch and roll rotations of device 100 cause translations along the image sensor pixel axes.
In practice, the correlation matching performed by orientation determiner 104 will be inexact because of different atmospheric effects and different skyline obstructions. Different lenses used by polarization sensor 101 and polarization reference source 120, as well as different image resolutions, may cause scale changes or other aberrations. Thus, in one or more embodiments, an algorithm is used to find a best fit between the sky polarization measurements captured by polarization sensor 101 and the sky polarization measurements captured by polarization reference source 120 and/or a sky polarization model. For the purpose of the present Description of Embodiments, the term "best fit" is understood to mean the solution in which there is the least difference between the sky polarization measurement captured by polarization sensor 101 and the sky polarization reference data set with which it is correlated.
There are several types of algorithms available for correlating images and determining rotation, translation and scale changes. Examples which can be used in various embodiments are the wavelet transformation method, the Fast Fourier transformation method, the morphological pyramid approach, and genetic algorithms. In one embodiment, Fast Fourier Transforms are used by orientation determiner 104 to find global transforms with small scale changes between sky polarization measurements captured by polarization sensor 101 and sky polarization measurements captured by polarization reference source 120 and/or a sky polarization model. Because the scale changes will be constant for a given pair of reference and mobile polarization sensors, at least one embodiment accounts for scale changes between polarization sensor 101 and the polarization sensor used by polarization reference source 120 before running the correlation algorithm used by orientation determiner 104.
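To make the FFT-based approach concrete, the sketch below shows one common construction: for zenith-pointing images, yaw appears as image rotation, and resampling onto a polar (angle, radius) grid turns that rotation into a translation that FFT phase correlation can recover. This is a minimal sketch assuming co-scaled, zenith-centered images and integer-degree resolution, not an implementation from this disclosure; survey-grade use would need sub-pixel interpolation and handling of pitch and roll.

```python
import numpy as np

def phase_correlation_shift(a, b):
    """Integer (row, col) translation aligning image b to image a, found as
    the peak of the cross-power spectrum (FFT phase correlation)."""
    f = np.fft.fft2(a) * np.conj(np.fft.fft2(b))
    r = np.fft.ifft2(f / np.maximum(np.abs(f), 1e-12))
    peak = np.unravel_index(np.argmax(np.abs(r)), r.shape)
    # Shifts past the halfway point wrap around to negative offsets
    return tuple(int(p) - s if p > s // 2 else int(p)
                 for p, s in zip(peak, a.shape))

def to_polar(img, n_theta=360, n_r=128):
    """Nearest-neighbour resampling of a zenith-centered image onto a
    (theta, radius) grid, so yaw rotation becomes a shift along axis 0."""
    cy, cx = (img.shape[0] - 1) / 2.0, (img.shape[1] - 1) / 2.0
    theta = np.linspace(0.0, 2.0 * np.pi, n_theta, endpoint=False)[:, None]
    radius = np.linspace(0.0, min(cy, cx), n_r)[None, :]
    ys = np.clip(np.round(cy + radius * np.sin(theta)).astype(int), 0, img.shape[0] - 1)
    xs = np.clip(np.round(cx + radius * np.cos(theta)).astype(int), 0, img.shape[1] - 1)
    return img[ys, xs]

def yaw_between(reference_img, mobile_img):
    """Approximate yaw difference in degrees between two zenith images."""
    shift = phase_correlation_shift(to_polar(reference_img), to_polar(mobile_img))
    return float(shift[0])   # one polar row equals one degree when n_theta == 360
```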
Lens distortion can introduce correlation variances that are undesirable because they are not directly related to orientation changes of device 100. Therefore, in one embodiment, a measurement of the lens distortion of polarization sensor 101 is performed, perhaps as a factory calibration procedure. This lens distortion can be compensated for, either by creating a corrected image at device 100, or by accounting for the known distortion of polarization sensor 101 during the pre-processing or correlation by orientation determiner 104.
In one or more embodiments, it is possible to use other correlation methods that do not use image-processing algorithms. As a simplistic example, a number of samples can be taken by orientation determiner 104 from the first measurement dataset captured by polarization sensor 101, at spatial intervals across the sky (e.g., in a regular rectangular grid pattern). Each sample value can be subtracted from the value of a second measurement dataset captured by polarization sensor 101 at the same spatial location, and the total difference of the sample values can be summed. Then the first dataset can be re-sampled with the sampling grid rotated or translated and the process is repeated. A smaller total difference between the first measurement dataset and the second measurement dataset indicates a better fit. This process iterates, trying other orientations of the sky polarization measurements captured by polarization sensor 101 until the total difference is minimized. In one embodiment, a least squares method can also be used by orientation determiner 104.
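A minimal sketch of this grid-sampling search follows, restricted to the yaw axis for brevity. It assumes both measurements have already been sampled onto the same regular (azimuth, elevation) grid as 2-D arrays, so that rotating the sampling grid about the vertical axis reduces to a circular shift along the azimuth axis; the names are illustrative.

```python
import numpy as np

def best_yaw(reference, mobile):
    """Brute-force yaw search: try every azimuth shift and keep the one with
    the smallest summed squared difference (the 'best fit' described above)."""
    n_az = reference.shape[0]   # rows = azimuth samples, columns = elevation
    residuals = [np.sum((np.roll(mobile, k, axis=0) - reference) ** 2)
                 for k in range(n_az)]
    k_best = int(np.argmin(residuals))   # smallest total difference
    return k_best * 360.0 / n_az         # convert shift index to degrees

# Example: the mobile grid is the reference rotated by 20 degrees in azimuth
ref = np.random.rand(72, 18)      # 5-degree azimuth spacing
mob = np.roll(ref, 4, axis=0)     # 4 * 5 = 20 degrees
print(best_yaw(ref, mob))         # -> 340.0 (i.e., -20 deg in this convention)
```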
It should be noted that there are many other computer vision algorithms used for finding correspondences between images, such as speeded up robust features (SURF). However, many of these algorithms rely on finding identifiable local features, such as corners or lines, and so require a reasonable level of texture across relatively small areas of the image. Thus they are not optimal for use with polarization patterns, which have relatively gradual changes over large portions of the sky.
Image correlation calculates the orientation of device 100 with respect to the static device (e.g., polarization reference source 120). But most applications need to determine the absolute orientation with respect to true north and true zenith (the direction of the gravity vector). This is achieved by determining the absolute orientation of polarization reference source 120, and applying the calculated difference of the sky polarization measurement from polarization reference source 120 to the sky polarization measurement captured by polarization sensor 101 of device 100. One formula for performing this operation is shown in Equation 7 below:
R_DM = R_DR · R_RM    Equation (7)
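Equation 7 composes two rotations: the known absolute orientation of the reference station and the reference-to-mobile rotation recovered by correlation. The reading of the subscripts here (D for the absolute datum frame, R for the reference station, M for the mobile device) is an assumption made for illustration, as is the yaw-only example below.

```python
import numpy as np

def rot_z(deg):
    """Rotation matrix about the vertical (yaw) axis, angle in degrees."""
    c, s = np.cos(np.radians(deg)), np.sin(np.radians(deg))
    return np.array([[c, -s, 0.0],
                     [s,  c, 0.0],
                     [0.0, 0.0, 1.0]])

R_DR = rot_z(0.0)    # reference station aligned with true north (known setup)
R_RM = rot_z(20.0)   # 20-degree yaw difference found by image correlation
R_DM = R_DR @ R_RM   # Equation (7): absolute orientation of the mobile device
print(np.degrees(np.arctan2(R_DM[1, 0], R_DM[0, 0])))   # -> 20.0
```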
All sky polarization orientation devices need to be calibrated, to determine the spatial relationship between image sensor data and either true north and gravity, a sighting system, or the body of the device. This is true for model-comparison systems, autonomous differential systems, and reference differential systems. For example in a camera-based device, it is unlikely that the axis of polarization sensor 101 is perfectly perpendicular to the device housing, or to the axis of visual sighting device 103. Thus in one or more embodiments, calibration is performed to measure the misalignment between the axis of polarization sensor 101 and the axis of visual sighting device 103, to a high degree of accuracy so the error can be compensated.
Typically, each sky polarization sensor 101 is only calibrated once (assuming no physical damage occurs to device 100). Thereafter, the polarization detected for a given orientation of polarization sensor 101 on an assembled system will be consistent. In other words, once polarization sensor 101 is aligned with the equipment, it will always produce a true north reference direction.
Traditional surveying techniques provide methods for alignment of sensors, such as mounting the device over a known point, precisely leveling it, and sighting another known point at a distance. However, an extra step is required to calibrate to the plane of the actual image sensor(s). There are at least two methods: using a standard nearby target, and imaging the sky.
In one embodiment, a target surface is placed in view of the optics of device 100 (e.g., visual sighting device 103). This target surface is located so that it is in a known spatial relationship (such as perpendicular) to either the mounting for device 100 (e.g., base 401 of
Other calibration methods used in various embodiments do not use a standard target as described above, but use the sky or sun itself as the calibration target. They utilize the fact that we already have a sky camera (e.g., polarization sensor 101), and use it to directly calculate the absolute orientation (yaw, pitch and roll) of the image sensor(s). The advantages are that the sun's elevation and azimuth can be precisely calculated, so no other target needs to be installed and itself calibrated. In fact, no factory calibration is required. They require access to a solar ephemeris which can be stored locally on device 100 such as in data storage unit 1212 of
The disadvantage is that a large part of the sky needs to be in view, which is difficult in a manufacturing environment. However these methods may be useful as an uncomplicated setup procedure for a static reference station, or as a way of independently checking a factory calibration.
One method utilized in various embodiments compares sky polarization measurements with a model, and averages over time. First, sky polarization measurements are correlated with a predictive model to determine 3-axis orientation of device 100, or polarization reference source 120. Then, average orientations of device 100, or polarization reference source 120, are determined over time (e.g., a day or several days). In one or more embodiments, the averaging can be weighted by the closeness of the match, for example to de-weight results when the sky is cloudy.
Another calibration method used in various embodiments utilizes sun-shots over the course of a day. The equipment being calibrated (e.g., device 100, or polarization reference source 120) can be used to directly measure the location of the sun (e.g., using a centroid-of-area algorithm) and using a dark filter to avoid overexposure of the polarization sensor. As an example, multiple sun-shot measurements are taken over the course of a day, as the sun's location (azimuth and elevation angle) moves across an arc of the sky. Each individual sun-shot resolves two unknowns (azimuth and elevation angle), so only two sun-shots (preferably at widely spaced times) are needed to resolve the 3-axis orientation of device 100 or polarization reference source 120. Additionally, extra information from subsequent sun-shots can be averaged to increase precision.
As an example,
The camera image can act as an independent measurement of angles. dX and dY are the differences, in pixels, between the sun's locations at the two times, assuming the camera is stationary. With a well-calibrated camera, dX and dY subtend known angles. The roll angle can be calculated as follows:
let dXangle = dX converted from pixels to an angle
and dYangle = dY converted from pixels to an angle
tan θ1 = (Difference in Vertical Angle)/(Difference in Azimuth)
tan θ2 = dYangle/dXangle    Equation (8)
then Roll = θ2 − θ1    Equation (9)
Then any other difference in X and Y, measured from the image, can be converted to a difference in azimuth 1601 and a difference in vertical angle 1602.
θ2 = arctan(dYangle/dXangle)
θ1 = θ2 − Roll
h = √(dXangle² + dYangle²)
Difference in Vertical Angle = h·sin θ1    Equation (10)
Difference in Azimuth = h·cos θ1    Equation (11)
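The following sketch applies Equations (8) through (11) numerically (the function names and the radian-based angle handling are illustrative assumptions):

import math

def roll_from_sun_shots(d_az, d_vert, dx_angle, dy_angle):
    # Equations (8)-(9): roll is the image-measured angle between the two
    # sun positions minus the ephemeris-derived angle between them.
    theta1 = math.atan2(d_vert, d_az)        # from the solar ephemeris
    theta2 = math.atan2(dy_angle, dx_angle)  # from the camera image
    return theta2 - theta1

def image_delta_to_sky_delta(dx_angle, dy_angle, roll):
    # Equations (10)-(11): rotate an image-space angular offset by the
    # calibrated roll to recover azimuth and vertical-angle differences.
    theta2 = math.atan2(dy_angle, dx_angle)
    theta1 = theta2 - roll
    h = math.hypot(dx_angle, dy_angle)
    return h * math.cos(theta1), h * math.sin(theta1)  # (d_azimuth, d_vertical)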
One advantage is that this method is not affected by possible variations in polarization, or shortcomings of a polarization model. Another advantage is the increased precision from averaging a large number of samples. In addition, there is no need to calculate polarization models.
When calibrating a polarization device as a static differential reference (e.g., polarization reference source 120), it need not be manually leveled; it just needs to be stationary and aimed at the sky. The sun/sky observations are used to calculate the absolute orientation, in that configuration, of the image sensor used by polarization reference source 120.
When calibrating a mobile device such as device 100, it is also necessary to determine the relationship of polarization sensor 101 to the pointing vector of the associated visual sighting device 103, or to the instrument body of device 100; this vector is therefore determined during calibration. This can be achieved by setting up over a known point, leveling, and sighting another point with known 3D coordinates to capture an elevation angle and azimuth to that point, or by using a fixture that holds the device body in a known orientation during the sky/sun calibration process.
Other celestial objects such as the moon, planets or selected stars could be tracked if required, but the sun is much easier to identify automatically because of its unique brightness. And, conveniently, it is available during normal working hours.
Some applications, such as navigation, require only a few degrees of orientation accuracy. However, many other applications, such as remote survey measurement, require much higher accuracy and are the motivation for the differential sky polarization approach. Accuracy is affected by timing, image resolution, baseline length, and correlation performance.
The sun appears to move across the sky at roughly 15 arc-seconds per second of time. This speed varies with the season, the latitude, and the time of day. Clearly, then, precise time-tagging of measurements is important to achieve survey-grade orientation accuracy of 0.5-10 arc-seconds.
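As a worked example of the timing budget implied above (using the nominal mean rate of 360 degrees per 24-hour day):

# The sun's apparent motion is 360 deg / 86400 s = 15 arc-seconds per
# second of time, so a target angular accuracy bounds the allowable
# clock error between the two measurements.
ARCSEC_PER_SEC = 360 * 3600 / 86400          # = 15.0
for target_arcsec in (0.5, 10):
    print(f"{target_arcsec} arcsec -> clock error < "
          f"{target_arcsec / ARCSEC_PER_SEC * 1000:.0f} ms")
# 0.5 arcsec requires ~33 ms timing; 10 arcsec allows ~667 ms.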
This method should work even when only relatively small patches of sky are visible. It is better than matching against a model, because models include large expanses with very little variation, whereas real measurements include small variations that can provide matchable texture. Of course, accuracy will suffer if sky visibility is restricted so much that insufficient texture is available.
Correlation performance clearly affects accuracy. Most pairs of measurements will exhibit various kinds of differences, such as occlusions and noise, and some correlation algorithms will handle these differences better than others; the choice of algorithm therefore affects the achievable accuracy.
An underlying assumption is that the sky polarization pattern will be similar enough at the two locations for matching to achieve the required accuracy. The baseline distance between the two devices affects this assumption. In clear conditions the similarity assumption holds over relatively long distances, which are nonetheless only a small fraction of the earth's radius. Under conditions with significant local variation, such as some types of cloud and pollution, the baseline must be shorter in order to achieve a given orientation accuracy.
High-accuracy GNSS systems also use differential corrections, usually via one or more reference stations that transfer measurements either for real-time differential correction or for post-processing. A very convenient solution would be to establish a reference polarization measurement device at the same location as each GNSS base station, sharing the same communications channel and computer processor, and using the GNSS location and time in the ephemeris calculation.
Some GNSS reference implementations establish a network of base stations, model the errors over the coverage area, and create synthetic ‘local’ corrections for each client mobile GNSS device. This is known as a Virtual Reference Station (VRS) system. The same principle can be applied to polarization: polarization can be measured at various locations across the network area, such as at a plurality of polarization reference sources 120, and synthetic local polarization values can be calculated for the location of each mobile device 100 within the coverage area. The synthetic values can be calculated by interpolation, or by modelling, such as monitoring the differences over time and space between the actual values and the Rayleigh predictions.
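A synthetic local polarization frame of the kind described above might, for example, be interpolated by inverse-distance weighting (a minimal sketch; the station data layout and the co-registration of frames are assumptions, and a production system could instead use the model-difference approach mentioned above):

import numpy as np

def synthetic_local_frame(stations, mobile_xy):
    # stations: list of ((x, y), frame) pairs, where each frame is a
    # co-registered 2-D array of polarization values.
    # Returns an inverse-distance-weighted frame for the mobile position.
    weights, frames = [], []
    for (x, y), frame in stations:
        d = np.hypot(x - mobile_xy[0], y - mobile_xy[1])
        if d < 1e-6:                    # mobile sits on a station
            return frame.copy()
        weights.append(1.0 / d)
        frames.append(frame)
    w = np.array(weights) / sum(weights)
    # Contract the station axis: result has the shape of one frame.
    return np.tensordot(w, np.stack(frames), axes=1)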
A simple reference differential system could transfer the entire reference measurement dataset whenever device 100 needs to calculate its orientation (or store this data for later processing). However, many applications demand a relatively high update rate, which would require a prohibitively high data transfer rate. There are several ways to minimize the amount of data to be transferred.
The data transfer rate is directly related to the resolution of the static sensor (e.g., polarization sensor 1901 of
One simple compression method is to transmit only the pixel values that have changed significantly since the previous frame. Another method is to calculate spatial and temporal differences from a model (Rayleigh, etc.) and transmit only the differences that exceed a threshold. This avoids transmitting the many pixel values that simply follow the model, although it requires device 100 to calculate the model values and then apply the differences.
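A minimal sketch of the threshold-difference scheme follows (the sparse index/value encoding is illustrative only, not a defined wire format):

import numpy as np

def encode_diff(prev, curr, threshold):
    # Transmit only pixels whose value changed by more than `threshold`
    # since the previous frame, as (flat index, new value) pairs.
    changed = np.abs(curr.astype(np.int32) - prev.astype(np.int32)) > threshold
    idx = np.flatnonzero(changed)
    return idx, curr.reshape(-1)[idx]

def apply_diff(prev, idx, values):
    # Reconstruct the current frame at the receiver from the previous
    # frame plus the sparse update.
    out = prev.copy().reshape(-1)
    out[idx] = values
    return out.reshape(prev.shape)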
Because the measurement data values are similar to an image, standard image compression techniques can be used. Each static measurement can be compressed using any of a variety of algorithms, such as JPEG; alternatively, successive static measurements can be treated as video and compressed with an algorithm such as MPEG. Also, since many compression algorithms use variants of Fourier transforms, and the image correlation can also use Fourier techniques, there may be efficiencies to be gained here, such as avoiding to-and-fro conversion of data.
It is also possible to implement custom compression that takes advantage of known properties of the data. One example is to account for the motion of the sun over time, which is predictable from the ephemeris; the polarization pattern moves accordingly, even over short time spans. Therefore the mobile device can create synthetic extrapolated reference data by calculating the motion of the sun since the last actual reference data was received, and rotating and translating the reference data accordingly, as a good prediction of the likely polarization pattern.
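Such extrapolation might be sketched as follows (assuming a zenith-centered all-sky image, so that a change in solar azimuth is approximately a rotation about the image center; solar_azimuth() stands in for a real ephemeris lookup, the elevation change is ignored, and the rotation sign depends on the image orientation convention):

from scipy.ndimage import rotate

def extrapolate_reference(frame, t_ref, t_now, solar_azimuth):
    # Rotate the last received reference frame about the zenith by the
    # change in solar azimuth (degrees) between its capture time and now.
    d_az = solar_azimuth(t_now) - solar_azimuth(t_ref)
    return rotate(frame, angle=-d_az, reshape=False, order=1)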
Another method of extrapolation (for the purpose of minimizing data transfer) is to calculate autonomous relative orientation updates for the interval between reference data frames. The most recent mobile data frame is correlated with the previous mobile data frame (instead of being compared with the reference). This allows a relatively high update rate without a correspondingly high rate of data transmission from the reference device. The arrival of the next reference data frame enables the absolute orientation to be recalculated, resetting any accumulated drift. To achieve the highest accuracy, the autonomous relative correlations should account for the movement of the sun during the interval between mobile measurements; this information is available from the almanac. This method combines the two differential techniques (autonomous relative updates and absolute referenced differential orientation) to achieve a high update rate with a low data transfer rate.
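The combined scheme might be organized as in the following sketch (correlate() is a placeholder for whichever image-correlation routine is chosen; it is assumed to return a small relative rotation in degrees between two consecutive mobile frames):

def run_epoch(mobile_frames, reference_fix, correlate):
    # mobile_frames: consecutive local measurements captured since the
    # last reference frame. reference_fix: absolute orientation (degrees)
    # established from that reference frame. Yields one absolute heading
    # per subsequent mobile frame; the next reference fix starts a new
    # epoch and thereby resets any drift accumulated here.
    heading = reference_fix
    prev = mobile_frames[0]
    for frame in mobile_frames[1:]:
        heading += correlate(prev, frame)   # relative step; may drift
        prev = frame
        yield heading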
Some applications, such as machine control, require orientation to be calculated at a high rate, such as 10 Hz, with minimal latency. An optimal solution could be to add sensors such as MEMS gyroscopes, which make frequent measurements of relative orientation changes. To meet the requirements of the application, these other sensor values can be combined in a Kalman filter with the sky polarization orientation values, which may be less frequent but which drift less over time and can provide absolute orientation. Other sensors can also be combined, such as GNSS, which adds absolute geographical location and precise time.
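A minimal single-axis sketch of such a filter follows (all noise parameters are illustrative assumptions, and heading wrap-around is ignored for brevity):

def kalman_heading(gyro_rate, dt, state, P, q=0.01, pol_fix=None, r=0.05):
    # state: current heading estimate (rad); P: its variance.
    # Predict with the high-rate gyro, then correct whenever an
    # absolute sky-polarization heading arrives.
    state += gyro_rate * dt          # predict: integrate angular rate
    P += q * dt                      # gyro noise grows the uncertainty
    if pol_fix is not None:          # update: absolute, drift-free fix
        k = P / (P + r)              # Kalman gain
        state += k * (pol_fix - state)
        P *= (1 - k)
    return state, P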
Another advantage of the differential approach is that it allows some operation at night. The solar polarization effect continues well after dusk and before dawn. Moonlight produces a similar polarization effect which, although roughly 100 million times less intense, is known to be used successfully by dung beetles for navigation. Of course, success is unlikely on moonless nights.
An important advantage of the differential approach is that it allows orientation differences to be calculated just by observing whatever polarization pattern is present, without requiring a solar or lunar ephemeris or polarization model. More advanced methods may need a solar or lunar ephemeris, in order to predict the motion of the sun or moon over time.
R_UM = R_UR · R_RM · R_ST    Equation (12)
where
With reference now to
Computer system 1200 of
Referring still to
Referring still to
As discussed above, another calibration method used in various embodiments utilizes sun-shots over the course of a day: in accordance with one embodiment, multiple sun-shot measurements are taken as the sun's location (azimuth and elevation angle) moves across an arc of the sky, two widely spaced sun-shots suffice to resolve the 3-axis orientation of device 100 or polarization reference source 120, and the extra information from subsequent sun-shots can be averaged to increase precision.
As an example,
In accordance with various embodiments, wireless link 1907 may operate on any suitable wireless communication protocol for communicating with device 100 including, but not limited to: WiFi, WiMAX, WWAN, implementations of the IEEE 802.11 specification, cellular, two-way radio, satellite-based cellular (e.g., via the Inmarsat or Iridium communication networks), mesh networking, implementations of the IEEE 802.15.4 specification for personal area networks, and implementations of the Bluetooth® standard.
In an alternative embodiment, device 100 sends data such as sky polarization measurements, the time/date and location at which the sky polarization measurements were captured, the relative azimuth of visual sighting device 103 to a target, or other data to storage and processing system 2601 via message 2615. Similarly, polarization reference source 120 can send sky polarization reference data in message 2620, such as a model of expected sky polarization. Alternatively, message 2620 may comprise a sky polarization reference measurement comprising a second sky polarization measurement captured by polarization reference source 120 using a second polarization sensor having a known absolute orientation. In at least one embodiment, storage and processing system 2601 is configured to store this data in a data storage unit (e.g., 1212 of