Assessing Stability Performance of Non-Metric Camera’s Lens Distortion Model during UAV Flight Missions

Abstract

The process and definition of camera calibration have changed greatly in recent years. Calibration of aerial metric cameras, for which laboratory and field procedures were separate processes, was performed before and independently of any actual mapping data collection, using precise calibration fixtures under the assumption that the camera parameters determined would remain valid for a significant period. In contrast, non-metric cameras are characterized by intrinsic parameters that are unstable over time, and they are vulnerable to engine and other vibrations during in-flight data acquisition. Moreover, no standard calibration procedure exists for these cameras. However, because non-metric camera self-calibration incorporates calibration into the measurement process itself, it can determine the camera's intrinsic parameters at the time of data acquisition, provided that a highly convergent imaging geometry and multiple exposures are employed. This paper therefore investigates variations of the lens distortion components with object distance within the photographic field using the self-calibration method. Redundant flight paths and a tilted camera are also incorporated to ascertain the stability of the principal distance and principal point during the flight mission. In the experiments, a series of flight missions was conducted to photograph test fields ranging from relatively flat to highly mountainous terrain. The results reveal that the radial, decentering, and affinity distortion parameters are more stable under vibration than the principal distance and principal point.
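The radial, decentering, and affinity components named above are conventionally expressed with Brown's physical parameter set, as used in photogrammetric self-calibration. The following is a minimal illustrative sketch of those correction terms, not the authors' implementation; the coefficient names (K1–K3, P1, P2, b1, b2) follow the common photogrammetric convention, and all values supplied to it are hypothetical:

```python
def apply_distortion(x, y, K1, K2, K3, P1, P2, b1, b2):
    """Image-plane corrections of the standard physical camera model:
    Brown's radial and decentering (Conrady-Brown) terms plus
    affinity/shear. x, y are measured relative to the principal point.
    Illustrative sketch only; parameter values are hypothetical."""
    r2 = x * x + y * y
    # Radial distortion: dr = K1*r^3 + K2*r^5 + K3*r^7, applied per axis
    radial = K1 * r2 + K2 * r2**2 + K3 * r2**3
    dx_rad = x * radial
    dy_rad = y * radial
    # Decentering (tangential) distortion
    dx_dec = P1 * (r2 + 2 * x * x) + 2 * P2 * x * y
    dy_dec = P2 * (r2 + 2 * y * y) + 2 * P1 * x * y
    # Affinity and shear act on the x-coordinate in this convention
    dx_aff = b1 * x + b2 * y
    return x + dx_rad + dx_dec + dx_aff, y + dy_rad + dy_dec
```

In a self-calibrating bundle adjustment these coefficients are estimated alongside the principal distance and principal point, which is why their relative stability can be compared across flight missions.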
