An attitude angle sensor correcting apparatus for an artificial satellite of the present invention includes a satellite attitude estimator. The satellite attitude estimator reads geographical image data out of an image data memory, produces a GCP (Ground Control Point) position included in the image data by stereo image measurement, and then estimates the instantaneous satellite attitude angle on the basis of a relation between the measured GCP position and a true GCP position. An attitude angle sensor data corrector corrects measured attitude angle data with estimated satellite attitude data output from the satellite attitude estimator and corresponding in time to the measured attitude angle data. The attitude angle sensor data corrector then outputs an estimated satellite attitude angle signal.

Patent No.: 6,336,062
Priority: Dec 10, 1999
Filed: Dec 07, 2000
Issued: Jan 01, 2002
Expiry: Dec 07, 2020
1. An attitude angle correcting apparatus for an artificial satellite, comprising:
an attitude angle sensor data memory for storing a signal output from sensing means responsive to an attitude angle of the artificial satellite;
an image data memory for storing geographical image data representative of a same geographical area, where a GCP (Ground Control Point) is located, shot at a plurality of positions;
a satellite attitude estimator for generating estimated attitude data of the artificial satellite on the basis of a difference between a true GCP value representative of a true position of the GCP and a measured GCP value produced by image measurement using said geographical image data stored in said image data memory; and
an attitude angle sensor data corrector for estimating an attitude angle of the artificial satellite by using said estimated attitude data, and then correcting measured attitude angle data, which is read out of said attitude angle sensor data memory, with said estimated attitude angle to thereby generate an estimated attitude angle signal.
2. An apparatus as claimed in claim 1, wherein a plurality of GCPs (Ground Control Points) are located on the geography represented by said geographical image data.
3. An apparatus as claimed in claim 2, wherein said satellite attitude estimator estimates an attitude angle of the artificial satellite by describing, for each of the GCPs whose positions can be measured on the basis of said geographical image data and whose true values are known, a relation between the measured GCP value and the true GCP value by use of a Moore-Penrose quasi-inverse matrix.
4. An apparatus as claimed in claim 2, wherein when said satellite attitude estimator relates the measured value and the true value of each of the GCPs, positions of which can be measured on the basis of the geographical image data and true values of which are known, by using a constant coefficient matrix:

$E = \begin{bmatrix} e_1 & e_2 & e_3 \\ e_4 & e_5 & e_6 \\ e_7 & e_8 & 1 \end{bmatrix}$
said satellite attitude estimator estimates an attitude angle error of the artificial satellite by using a result of singular value resolution of said constant coefficient matrix.
5. An apparatus as claimed in claim 1, wherein said satellite attitude estimator estimates an attitude angle of the artificial satellite by describing, for each of the GCPs whose positions can be measured on the basis of said geographical image data and whose true values are known, a relation between the measured GCP value and the true GCP value by use of a Moore-Penrose quasi-inverse matrix.
6. An apparatus as claimed in claim 1, wherein when said satellite attitude estimator relates the measured value and the true value of each of the GCPs, positions of which can be measured on the basis of the geographical image data and true values of which are known, by using a constant coefficient matrix:

$E = \begin{bmatrix} e_1 & e_2 & e_3 \\ e_4 & e_5 & e_6 \\ e_7 & e_8 & 1 \end{bmatrix}$
said satellite attitude estimator estimates an attitude angle error of the artificial satellite by using a result of singular value resolution of said constant coefficient matrix.

The present invention relates to an attitude angle sensor correcting apparatus for correcting measured attitude angle data, which is output from an attitude angle sensor mounted on an artificial satellite, with an estimated attitude angle derived from geographical image data.

A conventional attitude angle sensor correcting apparatus for a satellite application includes an attitude angle sensor data memory and an attitude angle sensor noise corrector. The attitude angle sensor noise corrector produces an attitude angle correction signal by using measured attitude angle data read out of the attitude angle sensor data memory. The prerequisite with the attitude angle sensor correcting apparatus is that a positional relation between an attitude angle sensor and the center of gravity of a satellite on which it is mounted is precisely measured and is strictly controlled even in space. When an error (alignment error) occurs in the attitude angle sensor for some reason, sensor correction accuracy is critically lowered. Moreover, because a reference value for correcting alignment errors is not available, the detection of alignment errors itself is not practicable.

Technologies relating to the present invention are disclosed in, e.g., Japanese Patent Laid-Open Publication Nos. 59-229667, 1-237411, 7-329897 and 11-160094 as well as in Japanese Patent Publication No. 61-25600.

It is therefore an object of the present invention to provide an attitude angle sensor correcting apparatus capable of shooting tridimensionally a plurality of GCPs (Ground Control Points) located on the ground with, e.g., a camera, producing estimated satellite attitude data from measured GCP values and true GCP values, and correcting measured attitude angle data with the estimated attitude data to thereby correct attitude angle sensor data.

The above and other objects, features and advantages of the present invention will become more apparent from the following detailed description taken with the accompanying drawings in which:

FIG. 1 is a block diagram schematically showing a conventional attitude angle sensor correcting apparatus for an artificial satellite;

FIG. 2 is a schematic block diagram showing an attitude angle sensor correcting apparatus embodying the present invention;

FIG. 3 is a view for describing the principle of stereo image measurement to be executed by a satellite attitude estimator included in the illustrative embodiment; and

FIG. 4 is a view demonstrating GCP correction to be also executed by the satellite attitude estimator.

To better understand the present invention, brief reference will be made to a conventional attitude angle sensor correcting apparatus mounted on an artificial satellite, shown in FIG. 1. As shown, the apparatus includes an attitude angle sensor data memory 101 and an attitude angle sensor noise corrector 102. Measured attitude angle data 103 is read out of the attitude angle sensor data memory 101. The attitude angle sensor noise corrector 102 outputs an attitude angle correction signal 104.

Specifically, the attitude angle sensor data memory 101 stores the measured attitude angle data 103. The attitude angle sensor noise corrector 102 estimates measurement noise contained in the attitude angle data 103 by using a statistical probability model. The corrector 102 then removes noise components from the attitude angle data 103 and outputs the resulting data in the form of the attitude angle correction signal 104. With this circuitry, the apparatus corrects a measured attitude angle sensor signal.

The correctness of the statistical probability model, or noise model, applied to the attitude angle sensor noise corrector 102 directly affects the accuracy of the attitude angle correction signal 104. As for the sensor noise model, various mathematically sophisticated schemes have heretofore been proposed and can improve the estimation accuracy to a certain degree.
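
The patent does not specify the statistical probability model used by the noise corrector 102. Purely as an illustration of the kind of noise-model-based correction described above, the following Python sketch smooths a measured attitude angle sequence with a scalar Kalman filter; the process variance q, measurement variance r, and all names are hypothetical.

```python
import numpy as np

def kalman_smooth(measured_angles, q=1e-6, r=1e-3):
    """Illustrative scalar Kalman filter: suppresses random measurement noise
    in a sequence of measured attitude angles (process variance q, sensor variance r)."""
    x, p = measured_angles[0], 1.0      # initial state estimate and covariance
    corrected = []
    for z in measured_angles:
        p = p + q                       # predict: covariance grows by the process noise
        k = p / (p + r)                 # Kalman gain
        x = x + k * (z - x)             # update with the new measurement
        p = (1.0 - k) * p
        corrected.append(x)
    return np.asarray(corrected)

# Example: a noisy, nominally constant attitude angle of 0.01 rad
noisy = 0.01 + 0.001 * np.random.randn(200)
smoothed = kalman_smooth(noisy)
```

Note that a scheme of this kind can only suppress noise consistent with its model; it cannot detect or remove an alignment error, which is the limitation addressed by the present invention.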

The prerequisite with the attitude angle sensor correcting apparatus described above is that a positional relation between an attitude angle sensor and the center of gravity of a satellite on which it is mounted is precisely measured and is strictly controlled even in space. Alignment errors critically lower the sensor correction accuracy, as stated earlier. Moreover, because a reference value for correcting alignment errors is not available, the detection of alignment errors itself is not practicable, as also stated previously.

Referring to FIG. 2, an attitude angle sensor correcting apparatus embodying the present invention and mounted on an artificial satellite will be described. As shown, the apparatus includes an attitude angle sensor data corrector 1, a satellite attitude estimator 2, an image data memory 3, and an attitude angle sensor data memory 101. There are also shown in FIG. 2 geographical shot data 4, estimated satellite attitude data 5, an estimated satellite attitude angle signal 6, and measured attitude angle data 103.

The image data memory 3 stores the geographical shot data 4 representative of a plurality of shots of the same geographical area on the ground where a GCP is located. The satellite attitude estimator 2 reads the data 4 out of the image data memory 3 and determines, by stereo image measurement, the measured position of the GCP contained in the data 4. The estimator 2 then estimates the instantaneous satellite attitude angle on the basis of a relation between the measured position of the GCP and the true position of the same. The estimator 2 feeds the resulting estimated satellite attitude data 5 to the attitude angle sensor data corrector 1. In response, the attitude angle sensor data corrector 1 corrects the measured attitude angle data 103 with the above data 5 coincident in time with the data 103 and then outputs the estimated satellite attitude angle signal 6.

The stereo image measuring method, which is a specific scheme for measuring the positions of a plurality of GCPs located on the ground, will be described specifically hereinafter. FIG. 3 shows a specific relation between two cameras 10 and 11 different in position from each other and a single point P14 shot by the cameras 10 and 11. In practice, a single camera implements the two cameras 10 and 11 and shoots the single point P14 at different positions to thereby output two different geographical image data. Vectors shown in FIG. 3 derive an equation:

$P_1 = P_d + P_2$   Eq. (1)

Assume that the cameras 10 and 11 have coordinate systems $\Sigma_{s1}$ and $\Sigma_{s2}$, respectively, and that the component vectors of the individual vectors are expressed as:

${}^{s1}P_1 = [\,{}^{s1}x_1 \;\; {}^{s1}y_1 \;\; {}^{s1}z_1\,]^T$

${}^{s2}P_2 = [\,{}^{s2}x_2 \;\; {}^{s2}y_2 \;\; {}^{s2}z_2\,]^T$

Further, assume that the projection points on the screens 12 and 13 included in the cameras 10 and 11 are $({}^{s1}x'_1, {}^{s1}y'_1)$ and $({}^{s2}x'_2, {}^{s2}y'_2)$, respectively, and that the cameras 10 and 11 both have a focal distance h. Then the following relations hold:

${}^{s1}x'_1 = h\,\dfrac{{}^{s1}x_1}{{}^{s1}z_1}, \quad {}^{s1}y'_1 = h\,\dfrac{{}^{s1}y_1}{{}^{s1}z_1}, \quad {}^{s2}x'_2 = h\,\dfrac{{}^{s2}x_2}{{}^{s2}z_2}, \quad {}^{s2}y'_2 = h\,\dfrac{{}^{s2}y_2}{{}^{s2}z_2}$   Eq. (2)

The projection points $({}^{s1}x'_1, {}^{s1}y'_1)$ and $({}^{s2}x'_2, {}^{s2}y'_2)$ on the screens 12 and 13 may alternatively be expressed as:

${}^{s1}x'_1 = k_x\,\dfrac{i_1}{v_{sx}}, \quad {}^{s1}y'_1 = k_y\,\dfrac{-j_1}{v_{sy}}, \quad {}^{s2}x'_2 = k_x\,\dfrac{i_2}{v_{sx}}, \quad {}^{s2}y'_2 = k_y\,\dfrac{-j_2}{v_{sy}}$   Eq. (3)

where $(i_1, j_1)$ and $(i_2, j_2)$ denote pixel values corresponding to the projection points on the screens 12 and 13, respectively, $v_{sx}$ and $v_{sy}$ denote the screen size, and $k_x$ and $k_y$ denote the image size.

Let a DCM (Direction Cosine Matrix) representative of a relation between the coordinate systems $\Sigma_{s1}$ and $\Sigma_{s2}$ be expressed as:

$\Sigma_{s2} = {}^{s2}C_{s1}\,\Sigma_{s1}$   Eq. (4)

Then, the Eq. (1) may be rewritten as:

${}^{s1}P_1 = {}^{s1}P_d + {}^{s2}C_{s1}^{\,T}\,{}^{s2}P_2$   Eq. (5)

The Eqs. (2) and (5) therefore derive ${}^{s1}z_1$ as follows:

${}^{s1}z_1 = h\,\dfrac{({}^{s2}x'_2\,c_3 - h\,c_1)^T \cdot {}^{s1}P_d}{({}^{s2}x'_2\,c_3 - h\,c_1)^T \cdot s}$   Eq. (6)

where

${}^{s2}C_{s1} = [\,c_1 \;\; c_2 \;\; c_3\,]^T, \qquad s = [\,{}^{s1}x'_1 \;\; {}^{s1}y'_1 \;\; h\,]^T$   Eq. (7)
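
To make the stereo image measurement concrete, the following Python sketch (numpy assumed; the function and variable names are ours) converts pixel indices to screen coordinates per the Eq. (3) and then recovers the GCP position in the camera-1 frame from the Eqs. (2) and (6). The DCM ${}^{s2}C_{s1}$ and the baseline vector ${}^{s1}P_d$ between the two shooting positions are taken as known; note that only the x projection on screen 13 is needed for the depth of the Eq. (6).

```python
import numpy as np

def pixel_to_screen(i, j, kx, ky, vsx, vsy):
    """Eq. (3): convert pixel indices (i, j) to screen coordinates (x', y')."""
    return kx * i / vsx, ky * (-j) / vsy

def triangulate_gcp(x1p, y1p, x2p, h, C21, Pd):
    """Sketch of Eqs. (2), (6) and (7): measured GCP position in camera-1 coordinates.

    x1p, y1p : projection of the GCP on screen 12 (s1x'1, s1y'1)
    x2p      : x projection of the GCP on screen 13 (s2x'2)
    h        : focal distance of both cameras
    C21      : DCM s2Cs1 relating the coordinate systems of the two cameras, Eq. (4)
    Pd       : baseline vector s1Pd between the two camera positions
    """
    c1, c3 = C21[0, :], C21[2, :]        # rows c1 and c3 of s2Cs1, Eq. (7)
    s = np.array([x1p, y1p, h])          # line-of-sight vector s, Eq. (7)
    a = x2p * c3 - h * c1
    z1 = h * (a @ Pd) / (a @ s)          # depth s1z1, Eq. (6)
    return np.array([x1p * z1 / h,       # invert Eq. (2) for s1x1 and s1y1
                     y1p * z1 / h,
                     z1])                # measured GCP vector s1P1
```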

Hereinafter will be described how the satellite attitude estimator 2 generates the estimated satellite attitude data 5 by using a measured GCP position vector ${}^{s}P'_1$, which is derived from the image data by the Eqs. (2), (3) and (6), and the true GCP position vector ${}^{s}P_1$. FIG. 4 shows the principle of satellite attitude estimation using a GCP. As shown, assume that the vectors ${}^{s}P'_1$ and ${}^{s}P_1$ are respectively assigned to a GCP 24 in an observed image 23 and a GCP 22 in an actual image 21. Then, the two vectors ${}^{s}P'_1$ and ${}^{s}P_1$ are related as follows:

${}^{s}P'_1 = R\,{}^{s}P_1 + {}^{s}t$   Eq. (8)

where R denotes a rotational transform matrix, i.e., $RR^T = R^TR = I$ and $\det R = 1$, and ${}^{s}t$ denotes a translational transform vector.

The rotational transform matrix R and translational transform vector ${}^{s}t$ are representative of a difference between the attitudes of the camera 20 with respect to the GCPs 22 and 24. When the camera 20 is affixed to a satellite, the above matrix R and vector ${}^{s}t$ may directly be interpreted as a difference in the attitude of the satellite.

Further, in an ideal condition wherein disturbance is absent, it is generally possible to precisely calculate the attitude of a satellite from the time of observation. Therefore, the true GCP position vector ${}^{s}P_1$ indicative of the GCP in the actual image easily derives the attitude of the satellite in the ideal condition. It follows that if the rotational transform matrix R and translational transform vector ${}^{s}t$ included in the Eq. (8) can be determined on the basis of the two vectors ${}^{s}P_1$ and ${}^{s}P'_1$, there can be generated the instantaneous estimated satellite attitude data 5.

More specifically, the satellite attitude estimator 2 first executes the stereo image measurement with the geographical shot data 4 in order to produce a measured GCP value based on the Eqs. (2), (3) and (6). The estimator 2 then determines a rotational transform matrix R and a translational transform vector st that satisfy the Eq. (8) with respect to the measured GCP value and the true GCP value. In this manner, the estimator 2 can generate the estimated satellite attitude data 5 for correcting errors contained in the measured attitude angle data 103.

The estimated satellite attitude data 5 and measured attitude angle data 103 are input to the attitude angle sensor data corrector 1. The attitude angle sensor data corrector 1 detects, by using time information included in the data 5, measured attitude angle data 103 corresponding to the time information, compares the detected data 103 with the estimated satellite attitude data 5, and then corrects the data 103. In this manner, the corrector 1 corrects the data 103 on the basis of information derived from an information source that is entirely different from the attitude angle sensor responsive to the data 103. The corrector 1 therefore successfully removes the noise components of the attitude angle sensor contained in the data 103 and corrects the alignment of the sensor, so that the estimated satellite attitude angle signal 6 is highly accurate.
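
A minimal sketch of this time-matching and correction step is given below, assuming that a single GCP-based estimate arrives as one (time, attitude angle) pair and that the sensor error (noise plus alignment bias) is treated as constant over the correction interval; in practice a sequence of such estimates would be used. All names are ours.

```python
import numpy as np

def correct_attitude(measured_times, measured_angles, est_time, est_angle):
    """Find the measured sample closest in time to the GCP-based attitude estimate,
    take the difference as the sensor error, and remove it from the measured sequence."""
    idx = int(np.argmin(np.abs(measured_times - est_time)))   # time matching
    sensor_error = measured_angles[idx] - est_angle            # error at that instant
    return measured_angles - sensor_error                      # corrected attitude signal
```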

In the illustrative embodiment, positional errors of a camera mounted on a satellite could critically influence the correction accuracy of the attitude angle sensor. In practice, however, such errors are smoothed during stereo image measurement and have little influence on the correction accuracy. The mounting errors of the camera therefore pose substantially no problem as to the correction accuracy of the measured attitude angle data 103.

As for the satellite attitude estimator 2, the rotational transform matrix R and translational transform vector st that satisfy the Eq. (8) can be generated by a Moore-Penrose quasi-inverse matrix. An alternative embodiment of the present invention using this scheme will be described hereinafter.

Assume that n true GCP vectors ${}^{s}P_1, \ldots, {}^{s}P_n$ (FIG. 4) are present, that a matrix Q having such elements is defined as:

$Q = \begin{bmatrix} \begin{bmatrix} {}^{s}P_1 \\ 1 \end{bmatrix} & \begin{bmatrix} {}^{s}P_2 \\ 1 \end{bmatrix} & \cdots & \begin{bmatrix} {}^{s}P_n \\ 1 \end{bmatrix} \end{bmatrix}$   Eq. (9)

and that a matrix Q' constituted by the measured GCP vectors ${}^{s}P'_1, \ldots, {}^{s}P'_n$ is defined as:

$Q' = \begin{bmatrix} \begin{bmatrix} {}^{s}P'_1 \\ 1 \end{bmatrix} & \begin{bmatrix} {}^{s}P'_2 \\ 1 \end{bmatrix} & \cdots & \begin{bmatrix} {}^{s}P'_n \\ 1 \end{bmatrix} \end{bmatrix}$   Eq. (10)

Then, a simultaneous transform matrix H constituted by the matrix R and vector st is expressed as:

$H = Q'\,Q^{+}$   Eq. (11)

where $Q^{+}$ denotes the Moore-Penrose quasi-inverse of Q and

$H = \begin{bmatrix} R & {}^{s}t \\ 0 & 1 \end{bmatrix}$

The Eq. (11) therefore derives a rotational transform matrix R and a translational transform vector ${}^{s}t$ that indicate a satellite attitude error, which in turn derive the estimated satellite attitude data 5.
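
A minimal numpy sketch of this embodiment is given below, under the assumption that the true and measured GCP vectors are available as 3-by-n arrays (one column per GCP); the names are ours.

```python
import numpy as np

def estimate_pose_pinv(P_true, P_meas):
    """Eqs. (9)-(11): estimate R and st from n true and n measured GCP vectors
    with the Moore-Penrose quasi-inverse (pseudo-inverse).

    P_true, P_meas : (3, n) arrays whose columns are sP_i and sP'_i
    """
    n = P_true.shape[1]
    Q  = np.vstack([P_true, np.ones((1, n))])   # Eq. (9):  homogeneous true GCPs
    Qp = np.vstack([P_meas, np.ones((1, n))])   # Eq. (10): homogeneous measured GCPs
    H  = Qp @ np.linalg.pinv(Q)                 # Eq. (11): H = Q' Q+
    return H[:3, :3], H[:3, 3]                  # R and st from H = [[R, st], [0, 1]]
```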

Further, the satellite attitude estimator 2 may alternatively generate the rotational transform matrix R and translational transform vector st, which satisfy the Eq. (8), in relation to a constant coefficient matrix. Specifically, in another alternative embodiment of the present invention to be described, the estimator 2 generates the above matrix R and vector st on the basis of the following relation.

In FIG. 4, assume that n true GCP vectors ${}^{s}P_1, \ldots, {}^{s}P_n$ are present, and that n measured GCP vectors ${}^{s}P'_1, \ldots, {}^{s}P'_n$ corresponding thereto are present. In the embodiment to be described, the following new vectors are defined:

${}^{s}W_n = {}^{s}P_n / {}^{s}z_n$   Eq. (12)

${}^{s}W'_n = {}^{s}P'_n / {}^{s}z_n$   Eq. (13)

Let a matrix E be defined as:

$E = {}^{s}\tilde{t}\,R = \begin{bmatrix} e_1 & e_2 & e_3 \\ e_4 & e_5 & e_6 \\ e_7 & e_8 & 1 \end{bmatrix} = \begin{bmatrix} \varphi_1^T \\ \varphi_2^T \\ \varphi_3^T \end{bmatrix}$   Eq. (14)

where

${}^{s}\tilde{t} = \begin{bmatrix} 0 & {}^{s}t_3 & -{}^{s}t_2 \\ -{}^{s}t_3 & 0 & {}^{s}t_1 \\ {}^{s}t_2 & -{}^{s}t_1 & 0 \end{bmatrix}$

Then, the matrix E can be determined by the following equation:

$W'\,(\mathrm{diag}\,E)\,W = 0$   Eq. (15)

where

$W = [\,{}^{s}W_1 \;\; {}^{s}W_2 \; \cdots \; {}^{s}W_n\,]$,

$W' = [\,{}^{s}W'_1 \;\; {}^{s}W'_2 \; \cdots \; {}^{s}W'_n\,]$

Let the matrix E produced by the Eq. (15) be expressed, by singular value decomposition, as:

$E = U\,\Lambda\,V^T$   Eq. (16)

Then, the matrix R and vector ${}^{s}t$ can eventually be determined by:

$R = U \begin{bmatrix} 0 & \pm 1 & 0 \\ \pm 1 & 0 & 0 \\ 0 & 0 & s \end{bmatrix} V^T, \qquad s = (\det U)(\det V)$   Eq. (17)

${}^{s}t = \alpha \begin{bmatrix} \varphi_1^T\varphi_2 / \varphi_2^T\varphi_3 \\ \varphi_1^T\varphi_2 / \varphi_1^T\varphi_3 \\ 1 \end{bmatrix}$   Eq. (18)

where α denotes any desired constant.

With the Eqs. (17) and (18), it is possible to determine a rotational transform matrix R and a translational transform vector st indicative of a satellite attitude error in the same manner as in the immediately preceding embodiment. The matrix R and vector st derive estimated satellite attitude data 5, as stated earlier.
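
The following numpy sketch covers only the recovery of R and ${}^{s}t$ from a given matrix E per the Eqs. (16) through (18); the construction of E itself from the Eqs. (14) and (15) is not implemented here. Because the Eq. (17) leaves the signs ambiguous, the sketch tries the sign combinations and keeps a proper rotation ($\det R = +1$); the scale $\alpha$ of the Eq. (18) is left as a parameter. All names are ours.

```python
import numpy as np
from itertools import product

def recover_pose_from_E(E, alpha=1.0):
    """Eqs. (16)-(18): recover R and st from the constant coefficient matrix E."""
    U, _, Vt = np.linalg.svd(E)                     # Eq. (16): E = U Lambda V^T
    s = np.linalg.det(U) * np.linalg.det(Vt)        # Eq. (17): s = (det U)(det V)
    R = None
    for a, b in product((1.0, -1.0), repeat=2):     # resolve the +/- ambiguity of Eq. (17)
        D = np.array([[0.0, a,   0.0],
                      [b,   0.0, 0.0],
                      [0.0, 0.0, s  ]])
        candidate = U @ D @ Vt
        if np.isclose(np.linalg.det(candidate), 1.0):   # keep a proper rotation
            R = candidate
            break
    phi1, phi2, phi3 = E                            # rows of E, Eq. (14)
    st = alpha * np.array([phi1 @ phi2 / (phi2 @ phi3),   # Eq. (18)
                           phi1 @ phi2 / (phi1 @ phi3),
                           1.0])
    return R, st
```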

The difference between this embodiment and the immediately preceding embodiment as to the rotational transform matrix R and translational transform vector ${}^{s}t$ will now be illustrated with specific numerical values. Assume that the following eight GCPs exist in any desired geographical shot data 4:

${}^{s}P_1 = [\,6.2 \;\; 26.8 \;\; 0.5\,]^T, \quad {}^{s}P_2 = [\,-6.5 \;\; 20.1 \;\; 0.08\,]^T$

${}^{s}P_3 = [\,7.6 \;\; -30.8 \;\; 10.6\,]^T, \quad {}^{s}P_4 = [\,-0.8 \;\; -28.2 \;\; 3.1\,]^T$

${}^{s}P_5 = [\,10.7 \;\; 34.3 \;\; 16.1\,]^T, \quad {}^{s}P_6 = [\,9.3 \;\; -18.6 \;\; 0.15\,]^T$

${}^{s}P_7 = [\,-17.2 \;\; 30.1 \;\; 9.5\,]^T, \quad {}^{s}P_8 = [\,16.1 \;\; 24.7 \;\; 2.9\,]^T$   Eq. (19)

A true rotational transform matrix R and a translational transform vector ${}^{s}t$ corresponding to the above GCPs are given by:

$R = \begin{bmatrix} 0.99848 & -0.01562 & -0.05291 \\ 0.01742 & 0.99927 & 0.03398 \\ 0.05234 & -0.03485 & 0.99802 \end{bmatrix}, \qquad {}^{s}t = \begin{bmatrix} -81 \\ 31 \\ -24 \end{bmatrix}$   Eq. (20)

Values produced by applying the transform of the Eq. (20) to the Eq. (19) and containing suitable noise are assumed to be the measured GCP points ${}^{s}P'_1, {}^{s}P'_2, \ldots, {}^{s}P'_8$. Then, the previous embodiment produces an estimated rotational transform matrix $R_1$ and an estimated translational transform vector ${}^{s}t_1$:

$R_1 = \begin{bmatrix} 0.99648 & -0.01611 & -0.05264 \\ 0.01743 & 0.99927 & 0.03398 \\ 0.05276 & -0.03444 & 0.99568 \end{bmatrix}, \qquad \det R_1 = 0.99567, \qquad {}^{s}t_1 = \begin{bmatrix} -80.97565 \\ 31.0 \\ -23.97823 \end{bmatrix}$   Eq. (21)

Likewise, the illustrative embodiment produces an estimated rotational transform matrix $R_2$ and an estimated translational transform vector ${}^{s}t_2$:

$R_2 = \begin{bmatrix} 0.99595 & -0.01225 & -0.08907 \\ 0.00772 & 0.99866 & -0.0511 \\ 0.08957 & 0.05020 & 0.99471 \end{bmatrix}, \qquad \det R_2 = 1, \qquad {}^{s}t_2 = \begin{bmatrix} -29.53341 \\ 38.34762 \\ -26.40867 \end{bmatrix}$   Eq. (22)

As the Eqs. (21) and (22) indicate, the two embodiments are capable of estimating the rotational transform matrix R and translational transform vector st with acceptable accuracy with respect to true values although they include some errors as to numerical values.
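
As a rough cross-check of the pseudo-inverse embodiment against this numerical example, the sketch below applies the transform of the Eq. (20) to the GCPs of the Eq. (19), adds a small amount of hypothetical noise (the patent does not specify the noise actually used), and re-estimates R and ${}^{s}t$ per the Eqs. (9) through (11); for small noise the estimate is close to the true transform of the Eq. (20), in line with the Eq. (21).

```python
import numpy as np

# True GCP vectors of Eq. (19), one per column
P_true = np.array([[  6.2,  26.8,  0.5 ],
                   [ -6.5,  20.1,  0.08],
                   [  7.6, -30.8, 10.6 ],
                   [ -0.8, -28.2,  3.1 ],
                   [ 10.7,  34.3, 16.1 ],
                   [  9.3, -18.6,  0.15],
                   [-17.2,  30.1,  9.5 ],
                   [ 16.1,  24.7,  2.9 ]]).T

# True transform of Eq. (20)
R_true = np.array([[0.99848, -0.01562, -0.05291],
                   [0.01742,  0.99927,  0.03398],
                   [0.05234, -0.03485,  0.99802]])
t_true = np.array([-81.0, 31.0, -24.0])

# Measured GCPs: Eq. (20) applied to Eq. (19), plus hypothetical noise
rng = np.random.default_rng(0)
P_meas = R_true @ P_true + t_true[:, None] + 0.01 * rng.standard_normal(P_true.shape)

# Pseudo-inverse estimate, Eqs. (9)-(11)
n = P_true.shape[1]
Q  = np.vstack([P_true, np.ones((1, n))])
Qp = np.vstack([P_meas, np.ones((1, n))])
H  = Qp @ np.linalg.pinv(Q)
print(np.round(H[:3, :3], 5))   # estimated R, to be compared with Eqs. (20) and (21)
print(np.round(H[:3, 3], 5))    # estimated st
```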

As stated above, in accordance with the present invention, an image data memory stores two different geographical shot data representative of the same geographical area, where a GCP is set, shot at two different points. A satellite attitude estimator reads the shot data out of the image data memory, determines the position of the GCP included in the image data by stereo image measurement, and estimates the instantaneous attitude angle of a satellite by referencing a relation between the determined GCP position and a true GCP position. The estimated satellite attitude angle is input to an attitude angle sensor data corrector as estimated satellite attitude data. In response, the attitude angle sensor data corrector corrects measured attitude angle data output from an attitude angle sensor data memory with the estimated satellite attitude data corresponding in time to the measured data.

The attitude angle sensor may be implemented by, for example, the integrated value of a gyro signal, an STT (star tracker), or an earth sensor. Also, the image data can easily be obtained with a camera mounted on the satellite.

In summary, the present invention provides an attitude angle sensor correcting apparatus for an artificial satellite having the following unprecedented advantages. The correcting apparatus includes an image data memory, a satellite attitude estimator, and an attitude angle sensor data corrector. The correcting apparatus can therefore remove both the sensing errors ascribable to random noise and bias noise, which are particular to an attitude angle sensor, and the alignment errors of the sensor mounted on the satellite, thereby determining an attitude angle with utmost accuracy. The correcting apparatus further promotes the accurate determination of an attitude angle by producing measured GCP values through stereo image measurement executed on the geographical shot data.

Various modifications will become possible for those skilled in the art after receiving the teachings of the present disclosure without departing from the scope thereof.

Inventor: Yamashita, Toshiaki

References Cited
US 5,104,217, Mar 17, 1986, GeoSpectra Corporation, System for determining and controlling the attitude of a moving airborne or spaceborne platform or the like
US 5,467,271, Dec 17, 1993, Northrop Grumman Corporation, Mapping and analysis system for precision farming applications
US 5,596,494, Nov 14, 1994, EOTEK Inc., Method and apparatus for acquiring digital maps
US 6,108,593, Jul 09, 1997, Hughes Electronics Corporation, Method and apparatus for estimating attitude sensor bias in a satellite
US 6,125,329, Jun 17, 1998, MDA Information Systems LLC, Method, system and programmed medium for massive geodetic block triangulation in satellite imaging
US 6,233,105, Mar 29, 1999, Inventec Corporation, Method of disk formatting
US 6,275,677, Mar 03, 1999, Orbcomm LLC, Method and apparatus for managing a constellation of satellites in low earth orbit
JP 59-229667
JP 61-25600
JP 1-237411
JP 7-329897
JP 11-160094
Assignment executed Dec 01, 2000: Yamashita, Toshiaki to NEC Corporation, assignment of assignors interest (see document for details), Reel/Frame 011361/0740.
Filed Dec 07, 2000 by NEC Corporation (assignment on the face of the patent).