Geometric Accuracy Assessments of Orthophoto Production from UAV Aerial Images

Abstract

An orthophoto mosaic is assembled from aerial perspective images through a process called orthorectification, which eliminates photographic tilt and terrain relief effects. These orthorectified images are resampled from the original images using a digital terrain model (DTM) that may not accurately model the actual surface. Meanwhile, some proprietary software, such as Agisoft, utilizes spatially dense 3D point clouds generated by the so-called Structure from Motion (SfM) technique to produce the orthophoto. The software treats these point clouds as a digital surface model (DSM) in a black-box manner and uses this surface model to project pixels from the original images. This paper investigates the geometric accuracy of the resulting orthophoto mosaics according to the American Society for Photogrammetry and Remote Sensing (ASPRS) standards. To minimize scale differences among images, a 35 mm fixed-lens camera is mounted on a fixed-wing UAV platform. Flight missions are carried out at around 250 m flying height, controlled by a navigation-grade sensor on board, to provide a spatial resolution of about 27 mm. A number of orthophoto mosaics are produced by varying the number of ground control points (GCPs), the flight path configuration, and terrain relief differences. The geometric accuracies are assessed using independent check points (ICPs) provided in each test field area. Coordinate deviations between the ICPs and the corresponding orthophoto positions are summarized as RMSE figures. At the 95% confidence level, it is revealed that a suitable orthophoto map scale is up to around 1:500. A cross-flight configuration is recommended to achieve better results.
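The accuracy assessment described above can be sketched as follows. Under the ASPRS (2014) positional accuracy standard, the horizontal RMSE in each axis is computed from the ICP coordinate deviations, combined into a radial RMSE, and scaled by 1.7308 to obtain the accuracy at the 95% confidence level. The deviation values below are hypothetical, for illustration only; they are not the paper's measured results.

```python
import math

def rmse(deviations):
    """Root-mean-square error of a list of coordinate deviations (same units)."""
    return math.sqrt(sum(d * d for d in deviations) / len(deviations))

def horizontal_accuracy_95(dx, dy):
    """ASPRS (2014) horizontal accuracy at the 95% confidence level:
    RMSE_r = sqrt(RMSE_x^2 + RMSE_y^2), scaled by 1.7308."""
    rmse_r = math.sqrt(rmse(dx) ** 2 + rmse(dy) ** 2)
    return 1.7308 * rmse_r

# Hypothetical ICP-vs-orthophoto deviations in metres (easting, northing)
dx = [0.05, -0.03, 0.04, -0.06]
dy = [0.02, 0.05, -0.04, 0.03]
print(round(horizontal_accuracy_95(dx, dy), 3))  # accuracy at 95% confidence, metres
```

The resulting figure is what is compared against the ASPRS accuracy classes to decide the largest map scale (here, around 1:500) that the orthophoto can support.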
