Accuracy Assessment of the Integration of GNSS and a MEMS IMU in a Terrestrial Platform
Abstract: MEMS Inertial Measurement Units are available at low cost and can replace expensive units in mobile mapping platforms that require direct georeferencing. This is achieved through integration with GNSS measurements, which provides a continuous positioning solution together with the orientation angles. This paper presents the results of an accuracy assessment of a system that integrates GNSS and a MEMS IMU in a terrestrial platform. We describe the methodology used and the tests carried out, in which the accuracy of the position and orientation parameters was assessed using an independent photogrammetric technique employing the cameras that are part of the mobile mapping system developed by the authors. The results for the accuracy of attitude angles and coordinates show that accuracies better than a decimeter in position, and under a degree in the angles, can be achieved even considering that the terrestrial platform operates in less than favorable environments.

1. Introduction
A Direct Georeferencing System (DGS) can be defined as a set of sensors onboard a platform, whose goal is to obtain positions (as three coordinates) and attitudes (as three angles) of the origin of a reference system, defined in the platform, without external control points. One main application of a DGS is as a major component of a Mobile Mapping System (MMS), which contains, besides the DGS, a set of remote sensors to acquire information from the surrounding objects. The linkage of absolute positions and orientation parameters to the remote sensors allows the determination of positional and geometrical information of the objects observed [1]. Detailed descriptions of this technology can be found for example in [2,3].
In terms of applications, terrestrial MMS can be used to acquire data of urban or road infrastructures and can be optimized for specific applications such as traffic sign inventory and railway or road inspection as described in [4–6]. More specific applications of terrestrial MMS are the automatic DTM (Digital Terrain Model) generation in sandy beaches [7], river shorelines change detection [8] or pavement surface analysis [9].
Nowadays, two main lines of development emerge in MMS. The first is associated with moderate- to high-cost navigation or tactical grade Inertial Measurement Units (IMUs) and laser/camera sensors, which, when integrated with satellite positioning receivers (Global Navigation Satellite Systems, GNSS), can deliver high accuracy surveys and modeling. The level of accuracy achieved by the GNSS/IMU systems used in a DGS can vary significantly, depending on the grade of the IMU and on the GNSS data used. For a tactical grade IMU integrated with geodetic grade GNSS receivers (multi-frequency code and phase receivers), accuracies can be of the order of one decimeter for positions and better than 0.03° for attitude [10]. Navigation grade IMUs can provide accuracies one order of magnitude better. Some of these systems are commercially available and are mainly directed at road, urban or railway environment surveys. The second line relies on low cost Micro-Electromechanical System (MEMS) IMUs which, integrated with geodetic grade GNSS receivers, can provide results at the several-decimeter level for positions and 0.2° for attitude, as shown for example in [11,12]. A very robust way of developing a DGS is by integrating GNSS and IMU measurements; an insight into this approach can be found in [13,14], or [15]. Manufacturers of GNSS receivers and IMUs provide data sheets with information on the expected accuracy that these individual systems can achieve, which usually relates to optimal operating conditions. However, in terrestrial platforms the working environment is more aggressive and, consequently, a decrease in accuracy can be expected. In a MMS, the accuracy performance of the DGS is critical to the final positional accuracy of the objects acquired by the data sensors.
The subject of accuracy evaluation and calibration of MEMS units using known orientations and positions was addressed by Artese et al. [16]. El-Sheimy presented a standard technique to obtain the relative orientation between pairs of imaging sensors and a DGS by using external control points in a bundle adjustment with known parameters as constraints [17].
The main goal of the work presented here was to assess the linear and angular accuracy of a DGS mounted on a terrestrial platform previously developed by the authors and described in [7], considering good conditions for GNSS observation. The tested DGS is composed of a dual-frequency GNSS receiver and a MEMS IMU. The methodology used for the accuracy assessment is based on the comparison of the parameters given by the DGS with those obtained independently by photogrammetric techniques, using control points.
In Section 2, we briefly describe the DGS used and address the problem of converting data between the different reference systems (sensors and platform), including the conventions used. In Section 3 the surveying test carried out is described, as well as the methodology implemented to assess the accuracy of the DGS. Finally, in Section 4, a brief discussion of the results obtained is presented.
2. Equipment and Methodology
The implemented DGS relies on the integration of measurements from a GNSS phase receiver with measurements from an IMU through a Kalman filter, a classic configuration that can provide very high accuracy due to the complementarity of the two systems. A description of the GNSS and IMU systems themselves is not in the scope of this work, since there is ample accessible literature on these subjects (for GNSS see for example [18] and for IMUs [19]).
2.1. Equipment Used
The accuracy of a GNSS/IMU system depends on the equipment used and on the built-in processing and integration software. The DGS equipment used in our case was a Novatel dual-frequency GNSS phase receiver (Novatel Inc., Calgary, AB, Canada) and an AHRS440 IMU from Crossbow (Crossbow, San Jose, CA, USA). Tables 1 and 2 present their accuracies, taken from the manufacturers' specifications.
The remote sensors used were two digital CCD cameras whose characteristics are presented in Table 3.
The cameras were calibrated independently with a methodology developed by the authors [20]. The resulting five parameters (principal point coordinates, focal distance and two radial distortion coefficients) allow for a mean linear re-projection error of 0.35 pixels. The calibrated focal distances, always fixed at the maximum value, were approximately 3.6 mm for both cameras. Images of the vehicle and of the equipment mounted on it can be viewed in Figure 1.
2.2. GNSS/INS Integration by Kalman Filter
The Least Squares method, based purely on the measurements, is the most common way to estimate a state vector in geomatics [21]. For navigation applications it is more efficient to use a sequential approach, and here the Kalman Filter, which combines the knowledge of the measurements with that of the state vector dynamics, plays an important role in the integration of GNSS and IMU observations [22–25].
In the Kalman Filter mechanization equations, the position, velocity and attitude computed in the navigation frame, combined with other information, such as sensor errors and the local gravity disturbance, are taken as the states of the filter. Here we have used a 34-state vector for the Extended Kalman Filter (EKF), described as:
The dynamic error equation is:
The noise matrix Qk−1 can be approximated as:
In this equation t is the current observation time, τ is the time difference between consecutive observations and δ is the Dirac delta function with dimension s−1. In our application Qk−1 has the form:
The system observation equations can be written as:
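Since this is the classic GNSS/INS estimation framework, the structure of the filter described above can be summarized in the standard discrete EKF form; the following is a generic textbook sketch [22,25], not the authors' specific 34-state realization:

```latex
\begin{aligned}
% Prediction: propagate state and covariance with the transition matrix \Phi
\hat{\mathbf{x}}_k^- &= \boldsymbol{\Phi}_{k-1}\,\hat{\mathbf{x}}_{k-1}, &
\mathbf{P}_k^- &= \boldsymbol{\Phi}_{k-1}\mathbf{P}_{k-1}\boldsymbol{\Phi}_{k-1}^{\mathsf{T}} + \mathbf{Q}_{k-1},\\
% Update: blend the observations z_k through the gain K_k
\mathbf{K}_k &= \mathbf{P}_k^-\mathbf{H}_k^{\mathsf{T}}
\left(\mathbf{H}_k\mathbf{P}_k^-\mathbf{H}_k^{\mathsf{T}} + \mathbf{R}_k\right)^{-1},\\
\hat{\mathbf{x}}_k &= \hat{\mathbf{x}}_k^- + \mathbf{K}_k\left(\mathbf{z}_k - \mathbf{H}_k\hat{\mathbf{x}}_k^-\right), &
\mathbf{P}_k &= \left(\mathbf{I} - \mathbf{K}_k\mathbf{H}_k\right)\mathbf{P}_k^-.
\end{aligned}
```

Here Φ is the transition matrix obtained from the dynamic error equation, Q the process noise matrix, H the design matrix of the observation equations and R the covariance of the GNSS observations.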
2.3. Coordinate Transformation between Reference Systems
Figure 2 shows a plan of the platform used to carry out the surveys. Complementing the plan, the cameras are leveled over the platform, as can be seen in Figure 1b, and their centers are two centimeters above the center of the DGS. The distances between components were measured rigorously and the values are shown. There are four reference systems (r.s.) present: the Cam left, Cam right and DGS systems (the origin of the latter is considered at the center of the platform) and the cartographic r.s. in which the platform moves. In order to convert coordinates of points between these reference frames, three linear and three angular components are needed.
Considering any two of these r.s., say 1 and 2, and assuming that scale is maintained, the following equation relating the coordinates of a point in space (P) can be established [27]:
Considering the observation of the rotation angles, and the positions of cameras and DGS in the cartographic r.s., the following equations can be established using Equation (15):
Using Equations (16), (19) and (20) the offset parameters relating the DGS with the left camera are:
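The role of Equations (15) and (21)–(24) is a rigid-body relation between reference systems: a relative rotation obtained by composing the two rotation matrices, and a position difference rotated into the first frame. A minimal pure-Python sketch of that composition follows; the rotations and positions used are illustrative values, not the platform's calibration:

```python
import math

def rot_z(angle):
    """Rotation matrix about the vertical axis by `angle` (radians)."""
    c, s = math.cos(angle), math.sin(angle)
    return [[c, -s, 0.0],
            [s,  c, 0.0],
            [0.0, 0.0, 1.0]]

def mat_mul(a, b):
    """3x3 matrix product a @ b."""
    return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def transpose(a):
    return [[a[j][i] for j in range(3)] for i in range(3)]

def mat_vec(a, v):
    return [sum(a[i][k] * v[k] for k in range(3)) for i in range(3)]

def relative_orientation(m1, m2, p1, p2):
    """Relative rotation (frame 1 -> frame 2) and the linear offset of
    origin 2 expressed in frame 1, given the rotation matrices m1, m2
    (cartographic -> frame i) and the cartographic origins p1, p2."""
    m_rel = mat_mul(m2, transpose(m1))       # frame 1 -> frame 2
    d = [p2[i] - p1[i] for i in range(3)]    # origin difference, cartographic
    t_rel = mat_vec(m1, d)                   # same difference seen from frame 1
    return m_rel, t_rel
```

With the DGS and camera matrices and origins in place of the illustrative ones, `relative_orientation` returns the quantities whose constancy is exploited in Section 3.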
These results will be used as needed in the rest of the document.
2.4. Rotation Angles (ω φ κ versus Roll, Pitch, Heading)
Traditionally, in photogrammetry, the orientation angles were associated with aerial photography and used to calculate Earth referenced coordinates from the photographic coordinates. They are the well-known rotations ω ϕ κ, of the photographic camera axes [27].
With the development of navigation technology applied to mobile surveying platforms, orientation angles different from the usual ones in photogrammetry were adopted, namely the roll, pitch and heading angles—see Table 4.
In the case of terrestrial images, the authors used an axis convention different from the usual one in photogrammetry, namely considering the photo coordinates as (x,z) with the y axis pointing forward from the photo, thus achieving a more intuitive relation between the κ and heading angles. Although the axes considered in each case are different and are rotated in a different order, the resulting rotation matrices are the same [27]. The conventions used in this work are presented in Table 4. In each case the rotation angles can be obtained from the rotation matrices as follows:
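As an illustration of how such an extraction works, the sketch below composes a rotation matrix in the common aerospace ZYX (heading-pitch-roll) factorization and recovers the angles from its elements, in the spirit of Formulas (25) and (26); note that the exact axis conventions of Table 4 may differ from this generic choice:

```python
import math

def rot_zyx(heading, pitch, roll):
    """Compose R = Rz(heading) * Ry(pitch) * Rx(roll) (ZYX convention, radians)."""
    ch, sh = math.cos(heading), math.sin(heading)
    cp, sp = math.cos(pitch), math.sin(pitch)
    cr, sr = math.cos(roll), math.sin(roll)
    return [
        [ch * cp, ch * sp * sr - sh * cr, ch * sp * cr + sh * sr],
        [sh * cp, sh * sp * sr + ch * cr, sh * sp * cr - ch * sr],
        [-sp,     cp * sr,                cp * cr],
    ]

def angles_from_matrix(r):
    """Recover (heading, pitch, roll) from a ZYX rotation matrix
    (valid away from the pitch = +/-90 deg singularity)."""
    pitch = -math.asin(r[2][0])
    heading = math.atan2(r[1][0], r[0][0])
    roll = math.atan2(r[2][1], r[2][2])
    return heading, pitch, roll
```

The same pattern (an `asin` on one matrix element and two `atan2` quotients) underlies the extraction of ω, ϕ, κ from the photogrammetric rotation matrix.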
3. Evaluation of the DGS Accuracy
In order to evaluate the quality of the obtained navigation solution, the authors compared the exterior orientation parameters obtained independently from the cameras on the platform with the parameters derived from the DGS itself. The offsets between the DGS and the cameras must be taken into account; however, only the linear offsets are known, by means of rigorous measurements over the platform (Figure 2). In fact, the determination of the offsets using the independent observations from the cameras and the DGS was used in this work to estimate the accuracy of the latter. The observations, the strategy adopted and the results obtained are described below.
3.1. Observations
The data acquisition was conducted in an urban environment with the terrestrial video-based MMS described in [7]. The test zone was selected on a pavement with painted crosswalks because there we have well-defined points with enough variability, both on the ground and in the photos, and also because such observations can easily be repeated at other locations in order to apply this methodology in our future work.
The control points were observed with centimetric accuracy by means of a static relative GNSS survey (referring to Table 1, the acquisition mode was RT-2, i.e., real-time kinematic), with the reference station at a distance of 6 km. Figure 3 shows a general view of the test area (a) and a scheme with the observed control points (b).
Several passes were made through the test zone, from different directions, either following a straight line or a curve, collecting images at a rate of four per second (the cameras collect images at the same instant since they share the same trigger). Of these, nine pairs, in which a sufficient number of control points could be observed, were selected in order to compare the position and orientation parameters with those given by the DGS. This means that the tests were carried out using eighteen images in which the control points could be observed. Of interest is the fact that the 2nd and 3rd pairs were selected because the vehicle was moving in a curve, while all the others were along a straight trajectory.
3.2. Position and Orientation Parameters by Space Resection and Respective Accuracy Estimation
The position and orientation parameters of the cameras in the cartographic space were obtained by space resection from the control points. This is a photogrammetric process by which the six exterior orientation parameters of an image are computed using at least three control points identified on the photo [27]. The image coordinates of the control points were identified manually and independently in each photo. The results of this phase are summarized in Table 5.
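Space resection rests on the collinearity condition, written here in its standard photogrammetric form [27], with f the calibrated focal distance, (x₀, y₀) the principal point, rᵢⱼ the elements of the image rotation matrix and (X₀, Y₀, Z₀) the projection center:

```latex
x = x_0 - f\,\frac{r_{11}(X - X_0) + r_{12}(Y - Y_0) + r_{13}(Z - Z_0)}
                  {r_{31}(X - X_0) + r_{32}(Y - Y_0) + r_{33}(Z - Z_0)}, \qquad
y = y_0 - f\,\frac{r_{21}(X - X_0) + r_{22}(Y - Y_0) + r_{23}(Z - Z_0)}
                  {r_{31}(X - X_0) + r_{32}(Y - Y_0) + r_{33}(Z - Z_0)}
```

Each control point contributes two such equations; linearizing and solving by least squares for three or more points yields the six exterior orientation parameters.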
Table 5 contains the images used to perform the exterior orientation, including the control points used, the exterior orientation parameters obtained and, in the right column, the calculated relative orientation parameters. The relative rotation matrices and linear relative parameters were obtained with Formulas (21) and (22), and the relative orientation angles were extracted from the rotation matrices as in Formula (25).
The relative orientation between the cameras is constant for all pairs because they are fixed on the platform and the images of each pair were obtained at the same instants. Moreover, the exterior orientation of the images, as referred to previously, was obtained independently for each photo. It is therefore expected that the calculated relative orientation parameters, regarded as observations, follow a normal law, so that their standard deviations are a good estimate of the accuracy of this methodology for obtaining the exterior orientation of an image. Table 6 presents the accuracy estimates for the orientation angles, the 3D positions and the length of vector T.
The separation between the two camera centers is known to be 1.044 m (as shown in Figure 2). It is therefore possible to present the root mean square error and the mean error of the length of vector T, which is also done in Table 6. This is a good estimate of the accuracy achieved by space resection for the positional parameters.
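The statistics in Table 6 follow from the per-pair values in Table 5 and the known 1.044 m camera separation; a short check in Python, with the values transcribed from the tables (the 4th ωrel value is read as −0.121°, the reading consistent with the reported 0.037° standard deviation):

```python
import math

# Relative orientation angle omega_rel (deg) and vector T length (m) per pair,
# transcribed from Table 5 (pair 4's omega_rel read as -0.121 deg).
omega_rel = [-0.174, -0.147, -0.224, -0.121, -0.096, -0.140, -0.181, -0.154, -0.175]
t_len = [1.045, 1.037, 1.041, 1.044, 1.023, 1.029, 1.047, 1.036, 1.065]
T_KNOWN = 1.044  # measured separation of the camera centers (Figure 2), m

def sample_std(values):
    """Sample standard deviation (n - 1 in the denominator)."""
    mean = sum(values) / len(values)
    return math.sqrt(sum((v - mean) ** 2 for v in values) / (len(values) - 1))

std_omega = sample_std(omega_rel)                # ~0.037 deg (Table 6, Std)
mean_error = sum(t_len) / len(t_len) - T_KNOWN   # ~-0.003 m  (Table 6, Mean E)
rmse = math.sqrt(sum((v - T_KNOWN) ** 2 for v in t_len) / len(t_len))  # ~0.012 m
```

Reproducing the table in this way also makes the role of the known separation explicit: the mean error measures bias, while the RMSE absorbs both bias and dispersion.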
3.3. Position and Orientation Parameters Given by the GNSS/IMU System and Respective Accuracy Estimation
After processing the IMU and GNSS observations, as described in Section 2.2, a navigation solution was obtained for the surveyed path. Of interest are the instants that correspond to the analyzed photogrammetric pairs. The results obtained are presented in Table 7.
An estimation of the accuracy of the IMU, either in coordinates or in orientations, cannot rely directly on the differences between the IMU and camera observations, as these depend on the direction of travel of the vehicle. The strategy followed was to calculate, for each observation, the linear and angular offsets of the left camera in the IMU r.s., which, in the absence of observation errors, should be constant.
The offset matrices MLcamDGS were calculated using Formula (23), where MLcamCart is the left camera rotation matrix, calculated with the rotation angles in Table 5, and MDGSCart is the rotation matrix of the DGS, calculated with the rotation angles in Table 7. The offset angles were extracted using Formula (26). The linear offsets were calculated using Formula (24), where [X0Lcam − X0DGS]Cart is the difference between the cartographic coordinates of the left camera and of the DGS, given respectively in Tables 5 and 7. The results are presented in Table 8.
In this case, as the DGS r.s. is leveled over the platform and its orientation matches the orientation of the platform, the linear offsets are known rigorously by means of precise measurements (see Figure 2). In this way it is possible to assess the mean error and the root mean square error of the three linear offset components, which are also given in Table 8. The linear accuracy of this GNSS/IMU integration can thus be assessed by means of the root mean square error of the vector T, and the angular accuracy by means of the standard deviation of the obtained offset angles.
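The linear statistics in Table 8 can be reproduced in the same way from its T-vector column and the known offset length of 0.537 m (values transcribed from the table):

```python
import math

# Length of the T vector (m) for each pair, transcribed from Table 8
t = [0.456, 0.601, 0.563, 0.505, 0.541, 0.555, 0.472, 0.464, 0.467]
T_KNOWN = 0.537  # length of the measured linear offset vector (Figure 2), m

mean = sum(t) / len(t)                                           # ~0.514 m
std = math.sqrt(sum((v - mean) ** 2 for v in t) / (len(t) - 1))  # ~0.053 m
mean_error = mean - T_KNOWN                                      # ~-0.023 m
rmse = math.sqrt(sum((v - T_KNOWN) ** 2 for v in t) / len(t))    # ~0.055 m
```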
4. Discussion
The evaluation of the DGS accuracy relied on the comparison between its parameters, computed using the measurements acquired in good GNSS observation conditions, and the ones obtained from the exterior orientation of the left camera at the same instants, by space resection, using rigorously surveyed control points on the pavement. It was necessary first to estimate the accuracy of this exterior orientation process, which is summarized in Table 6, showing an accuracy better than 1.5 cm in the 3D positions and better than 0.1° in the orientation angles. This result agrees with what was expected regarding the geometric characteristics of the camera/lens systems and previous experiments made during the calibration tests [20]. The obtained mean linear re-projection error of 0.35 pixels represents 0.6 cm to 1.5 cm at distances of 4 to 10 m, respectively, the sort of distances at which the control points were observed.
The accuracy estimation of the DGS itself relied on the fact that the offset parameters between the DGS and the cameras must be constant. Using this condition, as shown by the results summarized in Table 8, the obtained accuracy was better than 6 cm in the 3D positions, around 0.15° in roll and pitch, and 0.75° in heading.
A major issue relates to the fact that the accuracy of the DGS is assessed through an independent process of obtaining position and orientation. It can be noted, however, that the accuracy estimated for the space resection process is better than the accuracy estimated for the DGS: about 4 times better in the 3D positions, 2.5 times in attitude (the ω and φ angles) and 20 times in heading.
As a final remark regarding Table 8, if the GNSS/IMU observations made in curves, the 2nd and 3rd pairs in the table, are discarded, the heading accuracy achieved is higher, remaining under 0.2°. As expected, the heading accuracy given by the GNSS/IMU system loses quality when the vehicle is changing direction.
5. Conclusions
In this work the accuracy of the navigation solution provided by a DGS implemented with a particular GNSS/IMU integration was evaluated for a terrestrial platform moving in an urban environment. The sensors used were a high-quality dual-frequency GNSS receiver and a medium-quality MEMS IMU, integrated by means of an Extended Kalman Filter. The results show that, for specific environments, this type of sensor can deliver good results and replace the more expensive tactical grade IMUs.
The methodology used was to compare the DGS-derived parameters with those obtained directly from the images, applying space resection at the same instants. The accuracy obtained for the DGS, presented in Table 8, was 6 cm in the position parameters, 0.2° in attitude and 0.8° in heading.
Acknowledgments
This work was partially funded through FCT (Portuguese Foundation for Science and Technology) in the scope of projects PEst-C/MAR/LA0015/2013, DEOSOM (Detection and Evaluation of Oil Spills by Optical Methods), through the AMPERA ERA-NET, and PTDC/AGR-AAM/104819/2008.
Author Contributions
Madeira S., Gonçalves J. and Bastos L. conceived and designed the experiments; Madeira S. and Yan W. performed the experiments; Madeira S., Yan W. and Gonçalves J. analyzed the data; Madeira S., Gonçalves J., Yan W. and Bastos L. wrote the paper.
Conflicts of Interest
The authors declare no conflict of interest.
References
1. Madeira, S.; Gonçalves, J.; Bastos, L. Sensor Integration in a Low Cost Land Mobile Mapping System. Sensors 2012, 12, 2935–2953.
2. Schwarz, K.P.; El-Sheimy, N. Digital Mobile Mapping Systems—State of the Art and Future Trends. In Advances in Mobile Mapping Technology; Taylor & Francis Group: London, UK, 2007; pp. 3–18.
3. Gruen, A.; Huang, T.S. (Eds.) Calibration and Orientation of Cameras in Computer Vision; Springer Series in Information Sciences; Springer-Verlag: Berlin/Heidelberg, Germany, 2001; p. 235.
4. Madeira, S.; Gonçalves, J.; Bastos, L. Photogrammetric mapping and measuring application using MATLAB. Comput. Geosci. 2010, 36, 699–706.
5. Gikas, V.; Daskalakis, S. Determining Rail Track Axis Geometry Using Satellite and Terrestrial Geodetic Data. Surv. Rev. 2008, 40, 392–405.
6. Wang, C.; Hassan, T.; El-Sheimy, N.; Lavigne, M. Automatic Road Vector Extraction for Mobile Mapping Systems. Int. Arch. Photogramm. Remote Sens. Spatial Inf. Sci. 2008, XXXVII-B3b, 516–522.
7. Madeira, S.; Gonçalves, J.; Bastos, L. Accurate DTM generation in sand beaches using Mobile Mapping. J. Coast. Conserv. 2013.
8. Vaaja, M.; Hyyppä, J.; Kukko, A.; Kaartinen, H.; Hyyppä, H.; Alho, P. Mapping Topography Changes and Elevation Accuracies Using a Mobile Laser Scanner. Sensors 2011, 11, 587–600.
9. Aoki, K.; Yamamoto, K.; Shimamura, H. Evaluation model for pavement surface distress on 3D point clouds from mobile mapping system. Int. Arch. Photogramm. Remote Sens. Spatial Inf. Sci. 2012, XXXIX-B3, 87–90.
10. Deurloo, R.A. Development of a Kalman Filter Integrating System and Measurement Models for a Low-cost Strapdown Airborne Gravimetry System. Ph.D. Thesis, University of Porto, Porto, Portugal, 2011.
11. Bastos, L.; Yan, W.; Magalhães, A.; Ayres-Sampaio, D.; Deurloo, R. Assessment of the performance of low cost IMUs for strapdown airborne gravimetry using UAVs. In Proceedings of the 4th International Galileo Science Colloquium, Scientific and Fundamental Aspects of the Galileo Programme, Prague, Czech Republic, 4–6 December 2013.
12. Ayres-Sampaio, D.; Gonçalves, J.A.; Magalhães, A.; Bastos, L. Evaluating the performance of MEMS-based IMUs for direct georeferencing. In Proceedings of the 8th Portuguese and Spanish Assembly of Geodesy and Geophysics, Évora, Portugal, 29–31 January 2014; pp. 337–340.
13. Grewal, M.; Weill, L.; Andrews, A. Global Positioning Systems, Inertial Navigation and Integration; J. Wiley & Sons: Hoboken, NJ, USA, 2007.
14. Schwarz, K.P.; Chapman, M.A.; Cannon, M.W.; Gong, P. An integrated INS/GPS Approach to the Georeferencing of Remotely Sensed Data. Photogramm. Eng. Remote Sens. 1993, 59, 1667–1674.
15. Borgese, G.; Rizzo, L.; Artese, G.; Pace, C. Compact Wireless GPS/Inertial System. In Proceedings of the 4th Annual CANEUS Fly-By-Wireless Workshop, Montreal, QC, Canada, 14–17 June 2011; pp. 15–18.
16. Artese, G.; Trecroci, A. Calibration of a low cost MEMS INS sensor for an integrated navigation system. Int. Arch. Photogramm. Remote Sens. Spatial Inf. Sci. 2008, XXXVII-B5, 877–882.
17. El-Sheimy, N. The Development of VISAT—A Mobile Survey System for GIS Applications. Ph.D. Thesis, The University of Calgary, Calgary, AB, Canada, 1996; pp. 80–87.
18. Hofmann-Wellenhof, B.; Lichtenegger, H.; Wasle, E. GNSS—Global Navigation Satellite Systems: GPS, GLONASS, Galileo, and more; Springer-Verlag: Wien, Austria, 2008.
19. Britting, K.R. Inertial Navigation System Analysis; John Wiley & Sons, Inc.: New York, NY, USA, 1972.
20. Madeira, S.; Gonçalves, J.; Bastos, L. Fast camera calibration for low cost mobile mapping. In Proceedings of the 6th International Symposium on Mobile Mapping Technology, Presidente Prudente, Brazil, 21–24 July 2009.
21. Jekeli, C. Inertial Navigation Systems with Geodetic Applications; Walter de Gruyter: Berlin, Germany, 2001.
22. Kalman, R.E. A New Approach to Linear Filtering and Prediction Problems. Trans. ASME J. Basic Eng. 1960, 82, 35–45.
23. Moore, J.B.; Qi, H. Direct Kalman Filtering Approach for GPS/INS Integration. IEEE Trans. Aerosp. Electron. Syst. 2002, 38, 687–693.
24. Bar-Shalom, Y.; Li, X.R.; Kirubarajan, T. Estimation with Applications to Tracking and Navigation; John Wiley & Sons: New York, NY, USA, 2001.
25. Gikas, V.; Cross, P.A.; Akuamoa, A. A Rigorous and Integrated Approach to Hydrophone and Source Positioning during Multi-Streamer Offshore Seismic Exploration. Hydrogr. J. 1995, 77, 11–24.
26. Gelb, A. Applied Optimal Estimation; M.I.T. Press: Cambridge, MA, USA, 1974.
27. Wolf, P.R.; Dewitt, B.A.; Wilkinson, B. Elements of Photogrammetry, with Applications in GIS, 3rd ed.; McGraw-Hill: New York, NY, USA, 2000; pp. 237–245.
Table 1. Novatel DL-V3 dual-frequency GNSS phase receiver: horizontal position accuracy (manufacturer's specifications).

Horizontal Position | Accuracy (RMS) |
---|---|
Single Point L1 | 1.5 m |
Single Point L1/L2 | 1.2 m |
SBAS | 0.6 m |
DGPS | 0.4 m |
RT-20 | 0.2 m |
RT-2 | 1 cm + 1 ppm |
Table 2. Crossbow AHRS440 IMU accuracy (manufacturer's specifications).

 | Heading | Attitude (Roll, Pitch) |
---|---|---|
Range | ± 180° | ± 180°, ± 90° |
Accuracy (RMS) | < 1.0° | < 0.2° |
Resolution | < 0.1° | < 0.02° |
Table 3. Characteristics of the digital CCD cameras and lenses.

Camera | Lens |
---|---|
(3.2 × 2.4) mm² CCD array | Focal length: Vario 1.8–3.6 mm |
Cell size of 5.6 μm × 5.6 μm | Iris range: F1.6 – Close |
Array size of 640 × 480 pixels | Minimum object distance: 0.2 m |
 | Angle of view (Hor.): 97° to 53° for a (3.2 × 2.4) mm² CCD |
Table 4. Rotation angle conventions: rotation order for the photogrammetric angles (ω, ϕ, κ) and the navigation angles (heading, pitch, roll).

Measuring Element | Rotation Order |
---|---|
Camera (ω, ϕ, κ) | 1st – ω; 2nd – ϕ; 3rd – κ |
DGS (heading, pitch, roll) | 1st – heading; 2nd – pitch; 3rd – roll |
Table 5. Exterior orientation parameters of each image pair obtained by space resection, and the resulting relative orientation parameters.

Pair | Left Image Space Resection | Right Image Space Resection | Relative Orientation | |
---|---|---|---|---|
1 Straight line | ω = −0.737° | ω = −2.258° | ωrel = −0.174° | ||
φ = 4.037° | φ = 3.858° | φ rel = −1.518° | |||
κ = 283.200° | κ = 284.374° | κ rel = 1.067° | |||
X = −43700.132 m | X = −43699.833 m | Xrel = 1.043 m | |
Y = 153423.686 m | Y = 153422.684 m | Yrel = 0.062 m | |
Z = 8.435 m | Z = 8.448 m | Zrel = 0.021 m | |
Stdxy = 1.3;1.3 pix | Stdxy = 1.7;1.5 pix | T = 1.045 m | |
2 Curve | ω = 1.777° | ω = −0.397° | ω rel = −0.147° | |
φ = 4.199° | φ = 4.706° | φ rel = −1.459° | |||
κ = 255.460° | κ = 256.671° | κ rel = 1.102° | |||
X = −43694.735 m | X = −43694.946 m | Xrel = 1.035 m | |||
Y = 153423.886 m | Y = 153422.871 m | Yrel = 0.048 m | |
Z = 8.423 m | Z = 8.431 m | Zrel = 0.024 m | |
Stdxy = 2.4;1.2 pix | Stdxy = 2.2;0.9 pix | T = 1.037 m | |||
3 Curve | ω = 0.365° | ω = −0.753° | ω rel = −0.224° | ||
φ = 3.092° | φ = 4.259° | φ rel = −1.599° | |||
κ = 231.655° | κ = 232.757° | κ rel = 1.028° | |||
X = −43692.415 m | X = −43692.993 m | Xrel = 1.038 m | |||
Y = 153422.506 m | Y = 153421.642 m | Yrel = 0.081 m | |||
Z = 8.483 m | Z = 8.530 m | Zrel = 0.021 m | |
Stdxy = 1.7;1.3 pix | Stdxy = 1.3;1.2 pix | T = 1.041 m | |||
4 Straight line | ω = −4.790° | ω = −4.626° | ω rel = −0.121° | |
φ = −0.199° | φ = −1.677° | φ rel = −1.483° | |||
κ = 10.994° | κ = 12.053° | κ rel = 1.054° | |||
X = −43687.679 m | X = −43686.669 m | Xrel = 1.042 m | |||
Y = 153411.606 m | Y = 153411.868 m | Yrel = 0.064 m | |||
Z = 8.576 m | Z = 8.570 m | Zrel = 0.013 m | |||
Stdxy = 1.4;0.6 pix | Stdxy = 1.3;0.9 pix | T = 1.044 m | |||
5 Straight line | ω = −4.645° | ω = −4.460° | ω rel = −0.096° | ||
φ = 0.253° | φ = −1.218° | φ rel = −1.479° | |||
κ = 10.846° | κ = 11.866° | κ rel = 1.017° | |||
X = −43689.485 m | X = −43688.495 m | Xrel = 1.020 m | |||
Y = 153418.980 m | Y = 153419.236 m | Yrel = 0.065 m | |||
Z = 8.487 m | Z = 8.480 m | Zrel = 0.018 m | |||
Stdxy = 1.3;1.1 pix | Stdxy = 1.1;1.0 pix | T = 1.023 m | |||
6 Straight line | ω = −3.787° | ω = −3.642° | ω rel = −0.140° | ||
φ = −0.965° | φ = −2.585° | φ rel = −1.621° | |||
κ = 10.027° | κ = 11.028° | κ rel = 0.995° | |||
X = −43691.099 m | X = −43690.102 m | Xrel = 1.026 m | |||
Y = 153425.459 m | Y = 153425.712 m | Yrel = 0.075 m | |||
Z = 8.513 m | Z = 8.526 m | Zrel = 0.013 m | |
Stdxy = 1.1;1.0 pix | Stdxy = 1.5;1.5 pix | T = 1.029 m | |||
7 Straight line | ω = 4.873° | ω = 4.798° | ω rel = −0.181° | ||
φ = 1.225° | φ = 2.613° | φ rel = −1.379° | |||
κ = 190.571° | κ = 191.616° | κ rel = 1.041° | |||
X = −43694.993 m | X = −43696.017 m | Xrel = 1.047 m | |||
Y = 153440.921 m | Y = 153440.706 m | Yrel = 0.019 m | |||
Z = 8.838 m | Z = 8.874 m | Zrel = 0.032 m | |||
Stdxy = 1.6;0.8 pix | Stdxy = 1.9;0.9 pix | T = 1.047 m | |||
8 Straight line | ω = 5.271° | ω = 5.139° | ω rel = −0.154° | ||
φ = 2.060° | φ = 3.609° | φ rel = −1.547° | |||
κ = 190.560° | κ = 191.576° | κ rel = 1.008° | |||
X = −43692.895 m | X = −43693.894 m | Xrel = 1.033 m | |||
Y = 153432.684 m | Y = 153432.413 m | Yrel = 0.079 m | |||
Z = 8.678 m | Z = 8.708 m | Zrel = 0.018 m | |
Stdxy = 1.9;1.0 pix | Stdxy = 2.1;1.1 pix | T = 1.036 m | |||
9 Straight line | ω = 4.823° | ω = −4.776° | ω rel = −0.175° | |
φ = −0.074° | φ = 1.269° | φ rel = −1.333° | |||
κ = 189.460° | κ = 190.526° | κ rel = 1.063° | |||
X = −43691.246 m | X = −43692.288 m | Xrel = 1.064 m | |||
Y = 153425.281 m | Y = 153425.062 m | Yrel = 0.044 m | |||
Z = 8.536 m | Z = 8.540 m | Zrel = 0.023 m | |
Stdxy = 1.5;1.0 pix | Stdxy = 1.3;0.7 pix | T = 1.065 m |
Table 6. Accuracy estimates for the space resection process: standard deviations of the relative orientation parameters and, for the length of vector T, the mean error and root mean square error.

 | Δω | Δφ | Δκ | Δx (m) | Δy (m) | Δz (m) | ΔT (m) |
---|---|---|---|---|---|---|---|
Std | 0.037° | 0.094° | 0.034° | 0.013 | 0.020 | 0.006 | 0.012 |
Mean E | | | | | | | −0.003 |
RMSE | | | | | | | 0.012 |
Table 7. Position and orientation parameters given by the GNSS/IMU system at the instants of the analyzed pairs.

Pair | X (m) | Y (m) | H (m) | Roll (°) | Pitch (°) | Heading (°) |
---|---|---|---|---|---|---|
1 | −43,700.51 | 153,422.84 | 8.38 | 1.848 | 0.457 | 75.039 |
2 | −43,695.31 | 153,422.95 | 8.41 | 0.829 | 0.417 | 101.501 |
3 | −43,693.06 | 153,421.66 | 8.44 | 0.711 | 3.761 | 125.160 |
4 | −43,687.48 | 153,411.23 | 8.57 | −0.039 | −0.849 | 347.533 |
5 | −43,689.25 | 153,418.56 | 8.52 | −0.498 | −0.721 | 347.775 |
6 | −43,690.85 | 153,425.03 | 8.51 | 0.954 | −1.791 | 348.608 |
7 | −43,695.77 | 153,440.53 | 8.84 | −1.404 | 0.863 | 168.090 |
8 | −43,693.66 | 153,432.23 | 8.70 | −2.073 | 0.330 | 168.016 |
9 | −43,692.01 | 153,424.82 | 8.48 | −0.092 | 1.061 | 169.328 |
Table 8. Linear and angular offsets of the left camera in the DGS r.s. for each pair, with summary statistics and comparison against the known linear offsets.

Pair/Offsets | Offset X (m) | Offset Y (m) | Offset H (m) | T vector (m) | Offset Roll (°) | Offset Pitch (°) | Offset Heading (°) |
---|---|---|---|---|---|---|---|
1 | −0.411 | 0.188 | 0.061 | 0.456 | 1.075 | −5.792 | 1.697 |
2 | −0.579 | 0.158 | 0.018 | 0.601 | 0.867 | −5.416 | 2.927 |
3 | −0.558 | 0.024 | 0.073 | 0.563 | 0.885 | −5.540 | 3.010 |
4 | −0.497 | 0.085 | 0.011 | 0.505 | 0.845 | −5.580 | 1.423 |
5 | −0.524 | 0.133 | −0.033 | 0.541 | 0.770 | −5.316 | 1.372 |
6 | −0.537 | 0.138 | 0.016 | 0.555 | 0.967 | −5.493 | 1.256 |
7 | −0.460 | 0.106 | 0.014 | 0.472 | 1.239 | −5.602 | 1.168 |
8 | −0.462 | 0.043 | 0.000 | 0.464 | 1.051 | −5.498 | 1.184 |
9 | −0.462 | 0.027 | 0.063 | 0.467 | 1.135 | −5.777 | 1.155 |
Mean | −0.499 | 0.100 | 0.025 | 0.514 | 0.982 | −5.557 | 1.688 |
Std | 0.055 | 0.059 | 0.034 | 0.053 | 0.154 | 0.155 | 0.746 |
Known Values | −0.522 | 0.125 | 0.020 | 0.537 | |||
Mean Error | 0.023 | −0.025 | 0.005 | −0.023 | |||
RMSE | 0.057 | 0.061 | 0.033 | 0.055 |
© 2014 by the authors; licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution license ( http://creativecommons.org/licenses/by/3.0/).
Madeira, S.; Yan, W.; Bastos, L.; Gonçalves, J.A. Accuracy Assessment of the Integration of GNSS and a MEMS IMU in a Terrestrial Platform. Sensors 2014, 14, 20866-20881. https://doi.org/10.3390/s141120866