2020
DOI: 10.5194/isprs-archives-xliv-m-2-2020-99-2020
Identifying Epiphytes in Drones Photos With a Conditional Generative Adversarial Network (C-Gan)

Abstract: Unmanned Aerial Vehicle (UAV) missions often collect large volumes of imagery data. However, not all images contain useful information or are of sufficient quality. Manually sorting these images and selecting useful data is both time-consuming and prone to interpreter bias. Deep neural network algorithms are capable of processing large image datasets and can be trained to identify specific targets. Generative Adversarial Networks (GANs) consist of two competing networks, a Generator and a Discriminator…

Cited by 8 publications (5 citation statements)
References 14 publications
“…Conditional GAN is widely used for many image generation applications. Shashank et al. (2020) used a CGAN to identify epiphytes in drone photos, where the CGAN generates an output label map for an input image. Jiao et al. (2019) evaluated the potential of the CGAN for plant leaf recognition.…”
Section: Conditional GAN
confidence: 99%
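The CGAN setup described in the excerpt above is the pix2pix-style image-to-image translation objective: the discriminator scores (image, label-map) pairs, while the generator is trained against the discriminator plus an L1 term toward the real label map. The following is a minimal numpy sketch of those two loss terms only (not the authors' implementation); the function names and the weight `lam=100.0` are illustrative assumptions.

```python
import numpy as np

def bce(pred, target, eps=1e-7):
    """Binary cross-entropy between discriminator scores in (0, 1) and labels."""
    pred = np.clip(pred, eps, 1 - eps)
    return float(-np.mean(target * np.log(pred) + (1 - target) * np.log(1 - pred)))

def cgan_losses(d_real, d_fake, fake_label, real_label, lam=100.0):
    """pix2pix-style conditional GAN objectives (sketch).

    d_real, d_fake: discriminator scores for (image, real label map) and
                    (image, generated label map) pairs.
    fake_label, real_label: generated and ground-truth label maps.
    D is trained to score real pairs as 1 and generated pairs as 0;
    G is trained to fool D and to match the real label map under L1.
    """
    d_loss = bce(d_real, np.ones_like(d_real)) + bce(d_fake, np.zeros_like(d_fake))
    g_loss = bce(d_fake, np.ones_like(d_fake)) \
        + lam * float(np.mean(np.abs(real_label - fake_label)))
    return d_loss, g_loss
```

With a confident discriminator (e.g. `d_real=0.9`, `d_fake=0.1`) the discriminator loss is small and the generator's adversarial term is large, which is the pressure that drives the generator toward realistic label maps.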
“…This study addressed image-data generation case studies in synthesizing photos from label maps, reconstructing objects from edge maps, and colorizing images. The potential of such an image-to-image translation model for epiphyte segmentation with few training samples was studied by our research team (Shashank et al., 2020). The proposed study focuses on the impact of training-data quality, in terms of image contrast, on training a CGAN network.…”
Section: Conditional GAN
confidence: 99%
“…The input images have a dimension of 512 × 512 × 3. The current study concentrates on a single epiphyte species, Werauhia kupperiana, captured in a Costa Rican forest reserve [4], [5]. Quality factors of the epiphyte images, such as lighting conditions, the distance at which the drone captures the images, and the relative occupancy of target and background, vary across the image samples.…”
Section: The Epiphyte Dataset
confidence: 99%