Towards a training data model for artificial intelligence in earth observation (2022)
DOI: 10.1080/13658816.2022.2087223

Cited by 19 publications (7 citation statements) · References 29 publications
“…Its bounding boxes are represented using either 2D or 3D geometries. Yue et al. (2022) propose a Training Data Markup Language (TDML) for producing machine learning training data, which defines a UML model and encodings consistent with the OGC standards baseline. It supports the exchange and retrieval of geospatial machine learning training data on the Web, consistent with the ubiquitous JSON/XML encodings used there.…”
Section: Training Data Specifications
confidence: 99%
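As a sketch of what such a Web-friendly JSON encoding of a training sample might look like, the snippet below builds and round-trips a minimal record. The field names (`AI_EOTrainingData`, `dataURL`, and so on) are illustrative assumptions, not taken from the OGC TDML specification itself.

```python
import json

# Hypothetical, minimal TDML-style training sample. Field names are
# illustrative assumptions, not drawn from the OGC specification.
sample = {
    "type": "AI_EOTrainingData",
    "id": "sample-0001",
    "dataURL": "https://example.org/scenes/scene_0001.tif",
    "labels": [
        {
            "type": "AI_SceneLabel",
            "class": "cropland",
            # 2D bounding box in image coordinates: [minX, minY, maxX, maxY]
            "geometry": {"bbox": [120, 64, 256, 198]},
        }
    ],
}

encoded = json.dumps(sample, indent=2)  # JSON encoding for Web exchange
decoded = json.loads(encoded)           # retrieval round-trips losslessly
print(decoded["labels"][0]["class"])
```

Because the encoding is plain JSON, any Web client can retrieve and parse such samples without geospatial tooling; richer 3D geometries would replace the `bbox` with a full geometry object.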
“…Annotation files in four formats are provided in FASDD: the JSON format defined by TDML (Yue et al., 2022), the XML format adopted by the PASCAL VOC dataset (Everingham et al., 2015), the JSON format adopted by the Microsoft COCO dataset (Lin et al., 2014), and the text format adopted by the YOLO series of models (Redmon et al., 2016).…”
Section: Data Annotation
confidence: 99%
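Two of the formats named above store the same bounding box quite differently: COCO JSON uses absolute pixel coordinates `[x_min, y_min, width, height]`, while YOLO text files use center coordinates and sizes normalised by image dimensions. A minimal conversion sketch:

```python
def coco_to_yolo(bbox, img_w, img_h):
    """Convert a COCO [x_min, y_min, width, height] box (pixels) to the
    YOLO (x_center, y_center, width, height) form, normalised to [0, 1]."""
    x, y, w, h = bbox
    return ((x + w / 2) / img_w, (y + h / 2) / img_h, w / img_w, h / img_h)

# A 100x50 box with its top-left corner at (200, 150) in a 640x480 image.
yolo_box = coco_to_yolo([200, 150, 100, 50], 640, 480)
print(yolo_box)
```

A YOLO label line then prefixes the class index, e.g. `0 0.390625 0.364583 0.156250 0.104167`; datasets like FASDD that ship all four formats effectively apply such conversions to a single canonical annotation.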
“…Open access to research data and other digital research resources according to the FAIR principles, assuring the findability, accessibility, interoperability and reuse of data, is essential [62]. For dermoscopy, the international imaging initiatives Human Against Machine (HAM) and International Skin Imaging Collaboration (ISIC) are good examples of collective open-source databases, providing many dermoscopy images. Acquiring a large dataset for relatively new imaging modalities such as OCT is still challenging, as OCT is not yet widely used.…”
Section: Challenges And Future Directions
confidence: 99%
“…In order to train a DL model, enormous amounts of correctly labelled data should be fed to the model [61]. Open access to research data and other digital research resources according to the FAIR principles, assuring the findability, accessibility, interoperability and reuse of data, is essential [62]. For dermoscopy, the international imaging initiatives Human Against Machine (HAM) and International Skin Imaging Collaboration (ISIC) are good examples of collective open-source databases, providing many dermoscopy images.…”
Section: Challenges And Future Directions
confidence: 99%
“…In the European Union (EU), the Sentinels for Common Agricultural Policy programme (Sen4CAP) [2] focuses on developing tools and analytics to support the verification of direct payments to farmers with underlying environmental conditionalities, such as the adoption of environmentally friendly [50] and crop diversification [51] practices, based on real-time monitoring by the European Space Agency's (ESA) Sentinel high-resolution satellite constellation [1] to complement on-site verification. Recently, the volume and diversity of space-borne Earth Observation (EO) data [63] and post-processing tools [18,61,70] have increased exponentially. This wealth of resources, in combination with important developments in machine learning for computer vision [20,28,53], provides an important opportunity for developing tools for the automated monitoring of crop development.…”
Section: Introduction
confidence: 99%