Application of UAS photogrammetry and geospatial AI techniques for palm tree detection and mapping

Date

2023-08

Authors

Regmi, Pratikshya

Abstract

Uncrewed aircraft systems (UAS), commonly known as drones, have undergone significant advancements in recent years, particularly in the development of improved sensors and cameras that enable high-resolution imagery and precise measurements. This study used a UAS to capture aerial imagery of the Texas A&M University-Corpus Christi (TAMUCC) main campus, which was then processed with Structure-from-Motion (SfM) photogrammetric software to generate orthomosaic imagery. The primary purpose of this study was to use the UAS-derived orthomosaic imagery to detect, map, and quantify palm trees. Initially, three deep-learning models were trained on the same set of training samples, and the model with the highest precision, recall, and F1-score was selected as the optimal model. The model obtained by fine-tuning a pre-trained GIS-based model with additional training samples proved optimal, yielding precision = 0.88, recall = 0.95, and F1-score = 0.91. This model detected a total of 1414 sabal palm trees within the study area. The optimal model was then used to examine the impact of ground sampling distance (GSD) on detection performance: the imagery was resampled to GSDs of 5 cm, 10 cm, 20 cm, and 40 cm, and the model's performance deteriorated as the resolution decreased. The optimal model was also tested on multi-temporal datasets with approximately the same GSD (1.5 cm), one acquired a year before the training data and another three months after; the model maintained a comparable level of accuracy across all three testing datasets. The results were verified against ground-truth counts collected in a small portion of the study area. This study concludes that deep-learning models for object detection perform best when fine-tuned with training samples specific to the area of interest, that the optimal model's effectiveness diminishes significantly when imagery resolution is reduced, and that performance remains relatively consistent on datasets acquired at different times as long as the resolution of the testing data remains the same. In summary, deep learning proves effective, user-friendly, and time-saving for object detection; this study shows how UAS imagery and deep learning can be combined to detect palm trees, supporting better ways to monitor and manage them.
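
As context for the reported metrics, the following minimal Python sketch (not code from the thesis) shows how precision, recall, and F1-score are derived from detection counts. The counts used are hypothetical, chosen only to reproduce the reported ratios, since the abstract does not give raw true/false positive numbers.

    def precision_recall_f1(tp, fp, fn):
        # Precision: fraction of detected trees that are actual palms.
        precision = tp / (tp + fp)
        # Recall: fraction of actual palms that were detected.
        recall = tp / (tp + fn)
        # F1-score: harmonic mean of precision and recall.
        f1 = 2 * precision * recall / (precision + recall)
        return precision, recall, f1

    # Hypothetical counts (tp, fp, fn are assumptions, not thesis data):
    # 95 / (95 + 13) ~ 0.88 precision, 95 / (95 + 5) = 0.95 recall.
    p, r, f1 = precision_recall_f1(tp=95, fp=13, fn=5)
    print(f"precision={p:.2f}, recall={r:.2f}, F1={f1:.2f}")  # 0.88, 0.95, 0.91

The F1-score is reported alongside precision and recall because it balances false detections against missed palms in a single number.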

Description

A thesis submitted in partial fulfillment of the requirements for the degree of Master of Science in Geospatial Systems Engineering

Keywords

deep learning, GIS, mapping, object detection, photogrammetry, UAS

Rights

This material is made available for use in research, teaching, and private study, pursuant to U.S. Copyright law. The user assumes full responsibility for any use of the materials, including but not limited to, infringement of copyright and publication rights of reproduced materials. Any materials used should be fully credited with its source. All rights are reserved and retained regardless of current or future development or laws that may apply to fair use standards. Permission for publication of this material, in part or in full, must be secured with the author and/or publisher.
