2019
DOI: 10.1080/01490419.2019.1671560
Discrimination of different sea ice types from CryoSat-2 satellite data using an Object-based Random Forest (ORF)

Cited by 14 publications (36 citation statements). References 27 publications.
“…As an ML method, the decision tree (DT) method has been widely applied to sea ice monitoring [41,42]. Another powerful ML algorithm employed for classification is random forest (RF), which creates a variety of individual decision trees that operate as an ensemble [43]. Although the DT and RF algorithms have been applied to monitor sea ice using satellite remote sensing data, such as MODIS and CryoSat-2, there is a lack of information about how DT and RF algorithms can be utilized for monitoring sea ice using spaceborne GNSS-R data.…”
Section: Introduction (mentioning, confidence: 99%)
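The excerpt above describes random forest as an ensemble of individual decision trees that operate together. A minimal sketch of that idea in plain Python follows; the 1-D synthetic data, the depth-1 "stump" trees, and every name here are illustrative stand-ins, not the CryoSat-2 waveform features or the actual classifier configuration used in the cited works:

```python
import random
from collections import Counter

random.seed(0)

# Synthetic 1-D stand-in data: class 1 is assumed to have the higher
# feature mean (the cited works use multi-dimensional waveform parameters).
X = [random.gauss(0.0, 1.0) for _ in range(60)] + \
    [random.gauss(3.0, 1.0) for _ in range(60)]
y = [0] * 60 + [1] * 60

def fit_stump(xs, ys):
    """Depth-1 decision tree: choose the threshold t that maximizes training
    accuracy for the fixed rule 'predict class 1 when x > t'."""
    best_t, best_acc = xs[0], 0.0
    for t in xs:
        acc = sum((x > t) == (label == 1) for x, label in zip(xs, ys)) / len(xs)
        if acc > best_acc:
            best_t, best_acc = t, acc
    return best_t

def fit_forest(xs, ys, n_trees=25):
    """Bagging: each tree is fit on a bootstrap resample of the training set."""
    trees = []
    for _ in range(n_trees):
        idx = [random.randrange(len(xs)) for _ in range(len(xs))]
        trees.append(fit_stump([xs[i] for i in idx], [ys[i] for i in idx]))
    return trees

def predict(trees, x):
    """The ensemble operates by majority vote over its individual trees."""
    votes = Counter(int(x > t) for t in trees)
    return votes.most_common(1)[0][0]

trees = fit_forest(X, y)
train_acc = sum(predict(trees, x) == label for x, label in zip(X, y)) / len(X)
```

Real implementations, such as scikit-learn's `RandomForestClassifier`, additionally grow deep trees on random feature subsets; this sketch keeps only the two defining ingredients named in the excerpt: a variety of individual decision trees and an ensemble vote.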
“…For classifying the MYI and FYI, we use four classifiers that have already been applied to Ku-band altimeter measurements [15,17,27,28]. The classifiers are trained separately for each year.…”
Section: Classification (mentioning, confidence: 99%)
“…A k-nearest neighbour (KNN) classifier finds the k objects in a provided training dataset that are closest to the test point; the class is assigned by a majority vote among these k objects, with distance measured by the ordinary Euclidean metric [17,28]. The training data used are the same for all supervised classifiers (and the threshold-based one), and the features used are the waveform parameters (which particular parameters are selected is described in Section 4).…”
Section: K-Nearest Neighbour (KNN) Classification (mentioning, confidence: 99%)
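The KNN procedure quoted above (find the k training objects closest to the test point under the ordinary Euclidean metric, then take a majority vote among them) can be sketched in a few lines. The 2-D synthetic points and all names below are hypothetical stand-ins for the waveform-parameter feature vectors used in the cited works:

```python
import math
import random
from collections import Counter

random.seed(1)

# Synthetic training set: (feature vector, class label) pairs drawn from
# two well-separated 2-D clusters.
train = [((random.gauss(0, 1), random.gauss(0, 1)), 0) for _ in range(40)] + \
        [((random.gauss(3, 1), random.gauss(3, 1)), 1) for _ in range(40)]

def knn_predict(train, point, k=5):
    """Sort training objects by Euclidean distance to the test point and
    return the majority class among the k nearest."""
    nearest = sorted(train, key=lambda item: math.dist(item[0], point))[:k]
    votes = Counter(label for _, label in nearest)
    return votes.most_common(1)[0][0]

label_a = knn_predict(train, (0.2, -0.1))  # query near the class-0 cluster
label_b = knn_predict(train, (2.8, 3.1))   # query near the class-1 cluster
```

Because the voting step only inspects the k nearest objects, the choice of k trades noise robustness against locality; k = 1 reduces to nearest-neighbour classification.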