2020
DOI: 10.1051/jnwpu/20203830471
Feature Selection on Maximum Information Coefficient for Underwater Target Recognition

Abstract: Feature selection is an essential step in recognition tasks because irrelevant and redundant features in an unfiltered feature set reduce both the accuracy and the efficiency of recognition. However, when underwater targets are identified from their radiated noise, the diversity of targets and the complexity of underwater acoustic channels introduce complex relationships among the extracted acoustic features. To address this problem, this paper employs the normalized maximum information coefficient…

Cited by 4 publications (3 citation statements)
References 11 publications
“…For a pair of variables (X, Y), the X and Y directions are divided into m and n intervals respectively, so that the sample points form an m × n grid on the plane. For the sample dataset D of the two variables, the MIC formulas are as follows [19]:…”
Section: Correlation Analysis Methods of Load and Influencing Factors
Mentioning, confidence: 99%
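The MIC formulas themselves are elided in the excerpt above. For reference, the standard definition of the maximum information coefficient (introduced by Reshef et al., which is presumably what reference [19] of the citing article follows) is:

```latex
% Standard definition of the maximum information coefficient (MIC).
% I*(D, m, n) is the largest mutual information achievable by any
% m-by-n grid placed over the sample set D; B(N) (commonly N^0.6)
% bounds the total grid size for a sample of N points.
\[
  \mathrm{MIC}(D) \;=\; \max_{\,m \times n \,<\, B(N)}
  \frac{I^{*}(D, m, n)}{\log_{2}\!\bigl(\min(m, n)\bigr)}
\]
```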
“…To avoid information loss during the calculation, a parameter matrix is introduced as a feature selector, as shown in Formula (20).…”
Section: ( [ :]) Arptopk a I
Mentioning, confidence: 99%
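Formula (20) is not reproduced in the excerpt, so the following is only a minimal illustrative sketch of one common way a learnable parameter matrix can act as a feature selector (re-weighting the features and keeping the top-k); the function name, the top-k step, and the importance score are assumptions for illustration, not the citing paper's actual formula.

```python
import numpy as np

def parameter_matrix_selector(X, W, k):
    """Illustrative sketch (not Formula (20) itself): re-weight the input
    features with a parameter matrix W, then keep the k features with the
    largest average weighted magnitude, so no raw feature values are
    discarded before the weighting step.

    X : (n_samples, n_features) input features
    W : (n_features, n_features) parameter matrix acting as a selector
    k : number of features to keep
    """
    scored = X @ W                            # re-weight features with W
    importance = np.abs(scored).mean(axis=0)  # per-feature importance score
    keep = np.argsort(importance)[::-1][:k]   # indices of the top-k features
    return scored[:, keep], keep

# Hypothetical usage on random data
X = np.random.randn(100, 16)
W = np.random.randn(16, 16)
X_selected, kept_indices = parameter_matrix_selector(X, W, k=8)
```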
“…Zhang M X et al. [20] proposed the NMIC-FS method, which measures the correlation between features and class labels as well as the redundancy among features, and combines these measures with a forward sequential search strategy to achieve fast feature selection. However, when the data density is low or the number of variables is small, the maximum information coefficient may fail.…”
Section: Related Work
Mentioning, confidence: 99%
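Since the excerpt above summarizes the reviewed NMIC-FS method only in words, a minimal sketch of how a forward sequential search driven by a MIC-based relevance/redundancy criterion could look is given below. The `mic` helper assumes the minepy package is available, and the specific scoring rule (relevance minus mean redundancy with already-selected features) is an assumed illustration, not necessarily the exact criterion of the original paper.

```python
import numpy as np
from minepy import MINE  # assumed dependency for computing MIC


def mic(x, y):
    """Maximum information coefficient between two 1-D arrays."""
    m = MINE(alpha=0.6, c=15)
    m.compute_score(x, y)
    return m.mic()


def nmic_forward_selection(X, y, n_select):
    """Illustrative forward sequential search using MIC.

    At each step, the candidate feature with the best trade-off between
    relevance to the class labels and mean redundancy with the features
    already selected is added. The exact trade-off used by NMIC-FS may
    differ from this sketch.
    """
    n_features = X.shape[1]
    # Relevance of each feature to the labels.
    relevance = np.array([mic(X[:, j], y) for j in range(n_features)])
    selected, remaining = [], list(range(n_features))
    while len(selected) < n_select and remaining:
        best_j, best_score = None, -np.inf
        for j in remaining:
            # Mean redundancy with the already-selected features.
            redundancy = (np.mean([mic(X[:, j], X[:, s]) for s in selected])
                          if selected else 0.0)
            score = relevance[j] - redundancy  # assumed selection criterion
            if score > best_score:
                best_j, best_score = j, score
        selected.append(best_j)
        remaining.remove(best_j)
    return selected
```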