2009 IEEE International Geoscience and Remote Sensing Symposium
DOI: 10.1109/igarss.2009.5418142
Using aerial images to calibrate the inertial sensors of a low-cost multispectral autonomous remote sensing platform (AggieAir)

Cited by 7 publications (7 citation statements) · References 5 publications
“…This is done by inverse orthorectifying the images to find the actual position and attitude of the UAV using ground references set up in a square. Actual data from a test flight is used to validate this method (Jensen, Han & Chen, 2009). As detailed above, a point in the image plane (p_i) can be transformed into Earth-Centered Earth-Fixed (ECEF) coordinates (p_w) using equation 13, where u_w is the position of the UAV in ECEF, R_c^b is the rotation matrix from the camera frame to the body frame, R_b^n is the rotation matrix from the body frame to the navigation frame, R_n^w is the rotation matrix from the navigation frame to ECEF, and h is the height of the UAV above the ground.…”
Section: Image Orthorectification
confidence: 99%
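The frame chain described in this statement can be sketched as a short NumPy routine. This is a minimal illustration assuming a pinhole camera, flat ground, and a north-east-down (NED) navigation frame; the height-based scale factor is an assumption, since equation 13 itself is not reproduced in this excerpt.

```python
import numpy as np

def image_point_to_ecef(p_i, u_w, R_c_b, R_b_n, R_n_w, h):
    """Project an image-plane point to ground-level ECEF coordinates.

    p_i   : homogeneous image ray in the camera frame, e.g. [x, y, 1]
    u_w   : UAV position in ECEF
    R_c_b : camera-to-body rotation matrix
    R_b_n : body-to-navigation (NED) rotation matrix
    R_n_w : navigation-to-ECEF rotation matrix
    h     : UAV height above (assumed flat) ground
    """
    ray_nav = R_b_n @ R_c_b @ p_i       # view ray rotated into the NED frame
    s = h / ray_nav[2]                  # scale so the ray reaches the ground
    return u_w + R_n_w @ (s * ray_nav)  # equation-13-style composition
```

With all rotations set to identity and h = 100 m, the image point [0.1, 0.2, 1] maps to a ground point 10 m north and 20 m east of the point directly below the UAV.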
“…The method is also called inverse georeferencing. 13,39 Finding errors and calibrating the camera yield more accurate IMU/GPS data, which benefits both direct and indirect georeferencing methods.…”
Section: Update the Inertial Measurement Unit/Global Positioning Syst...
confidence: 99%
“…Direct georeferencing methods use the respective IMU/GPS data, such as pitch, roll, yaw, altitude, latitude, and longitude, to georeference each aerial image. 12–14 They are easy, fast, straightforward, and automatic. 2 They first perform a series of transformations to project the four corners of each image into Earth coordinates, and then create a three-dimensional surface and overlay the images on the map using a terrestrial digital elevation model (DEM).…”
Section: Introduction
confidence: 99%
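The corner-projection step described above can be sketched as follows, assuming a nadir-mounted camera, ZYX Euler angles, and flat ground in place of a DEM. The function names and field-of-view parameters are illustrative, not taken from the cited papers.

```python
import numpy as np

def euler_to_dcm(roll, pitch, yaw):
    """Body-to-navigation (NED) rotation from ZYX Euler angles (radians)."""
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    Rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
    return Rz @ Ry @ Rx

def georeference_corners(roll, pitch, yaw, alt, half_fov_x, half_fov_y):
    """Project the four image corners onto flat ground, returning
    north/east offsets (metres) relative to the UAV's nadir point.
    The camera is assumed to look straight down the body z-axis."""
    R = euler_to_dcm(roll, pitch, yaw)
    corners = []
    for sx in (-1, 1):
        for sy in (-1, 1):
            # Corner view ray in the body frame for this half-angle FOV.
            ray_b = np.array([sx * np.tan(half_fov_x),
                              sy * np.tan(half_fov_y), 1.0])
            ray_n = R @ ray_b
            s = alt / ray_n[2]             # flat-ground intersection
            corners.append(s * ray_n[:2])  # keep north/east components
    return np.array(corners)
```

For a level flight at 100 m altitude with half-angle fields of view of arctan(0.5), the corners land 50 m from nadir in each direction, forming a 100 m square footprint.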
“…However, sensor error always varies with time, and camera calibration is too complex to perform before every flight. Moreover, the best way to eliminate aircraft sensor error may be field calibration [14]. Therefore, this paper presents a method that obtains a prediction model of attitude errors from field-calibration results in advance, which effectively improves positioning accuracy while preserving real-time performance.…”
Section: Introduction
confidence: 99%
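One way such a "prediction model of attitude errors" could be realized is a simple least-squares drift fit over field-calibration residuals. The linear model, the sample numbers, and the correction step below are all illustrative assumptions; the cited paper's actual model is not given in this excerpt.

```python
import numpy as np

# Hypothetical field-calibration residuals: attitude error (degrees)
# observed at several flight times (seconds). Values are illustrative.
t = np.array([0.0, 60.0, 120.0, 180.0, 240.0])
err = np.array([0.10, 0.22, 0.31, 0.43, 0.52])

# Fit a first-order drift model err(t) ~ a*t + b by least squares.
a, b = np.polyfit(t, err, 1)

def predicted_error(t_query):
    """Predict the attitude error at a later flight time."""
    return a * t_query + b

# Correct a raw attitude reading taken at t = 300 s using the model.
corrected = 1.80 - predicted_error(300.0)
```

The fitted drift rate here is 0.00175 deg/s; applying the model at t = 300 s subtracts a predicted error of about 0.63 deg from the raw reading.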