2018
DOI: 10.1007/s00521-018-3432-2

Single-label and multi-label conceptor classifiers in pre-trained neural networks

Cited by 6 publications (3 citation statements)
References 25 publications
“…The BPNN models employed for layer categorization in semi-infinite seabed, single sediment layer, and double sediment layer environments are denoted as NET-2-X, where X represents the specific environment (X=1, 2, 3). In total, four BPNN models are trained (Qian et al, 2019).…”
Section: BPNN Inversion Model of Geoacoustic Parameters Inversion
mentioning
confidence: 99%
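The excerpt above describes maintaining a separate back-propagation network per seabed environment (NET-2-X, X = 1, 2, 3). A minimal sketch of that organization is given below; it only illustrates keeping one small network per environment, and all layer sizes, data shapes, and the regression framing are assumptions for illustration, not values from the cited work (Qian et al., 2019).

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# One BP network per seabed environment, mirroring the NET-2-X naming in the
# excerpt (X = 1, 2, 3). Hidden-layer sizes, feature/parameter dimensions,
# and the training data below are placeholders.
nets = {f"NET-2-{x}": MLPRegressor(hidden_layer_sizes=(32, 32),
                                   max_iter=2000, random_state=0)
        for x in (1, 2, 3)}

rng = np.random.default_rng(0)
for name, net in nets.items():
    X_train = rng.standard_normal((500, 10))   # simulated acoustic features
    y_train = rng.standard_normal((500, 4))    # simulated geoacoustic parameters
    net.fit(X_train, y_train)
```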
“…where  denotes the motion feature calculation parameter, and construct the shared feature subspace [16] according to  . To improve the use effect of the classifier designed in this paper, the support vector machine is integrated with some theories of multi-labeling transfer learning, and a multi-labeling classifier is constructed, which regards the labels of motion posture features [17] as…”
Section: Feature Extraction Of Hmpmentioning
confidence: 99%
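The excerpt describes building a multi-label classifier around a support vector machine. As a point of reference only, a common baseline is to wrap a binary SVM in a one-vs-rest scheme over a label indicator matrix; the sketch below shows that standard scikit-learn baseline, not the cited paper's transfer-learning construction, and all data, shapes, and label counts are placeholders.

```python
import numpy as np
from sklearn.svm import LinearSVC
from sklearn.multiclass import OneVsRestClassifier

# Toy multi-label data: 100 samples, 20 features, 3 possible posture labels.
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 20))
Y = rng.integers(0, 2, size=(100, 3))   # binary label-indicator matrix

# One-vs-rest turns a binary SVM into a multi-label classifier:
# one linear SVM per label, each predicting that label's presence/absence.
clf = OneVsRestClassifier(LinearSVC())
clf.fit(X, Y)
pred = clf.predict(X[:5])               # indicator matrix of predicted labels
```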
“…This gives us the chance to investigate the networks from different perspectives. Exemplar applications of this property include the Conceptor-based post-processing of word vectors in [28] and the multi-label classification in [29]. In Conceptor-CAM, we will use it for converting the pseudo-negative evidence to pseudo-positive evidence using the NOT operation.…”
Section: B. Conceptor Learning
mentioning
confidence: 99%
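The excerpt relies on conceptor algebra: a conceptor C is computed from a state correlation matrix R as C = R (R + alpha^-2 I)^-1, and its Boolean NOT is I - C, which is the operation used to flip negative evidence into positive evidence. A minimal NumPy sketch follows; the state matrix, aperture value, and dimensions are illustrative assumptions, not taken from the cited papers.

```python
import numpy as np

def conceptor(states, aperture=10.0):
    """Compute a conceptor from a collection of network states.

    states: array of shape (n_samples, n_units), one state vector per row
            (hypothetical input). aperture: the alpha parameter controlling
            how soft the resulting projection is.
    """
    n = states.shape[0]
    R = states.T @ states / n                                  # correlation matrix
    return R @ np.linalg.inv(R + aperture ** -2 * np.eye(R.shape[0]))

def conceptor_not(C):
    """Boolean NOT of a conceptor: NOT C = I - C."""
    return np.eye(C.shape[0]) - C

# Toy usage with random states (illustrative only).
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 50))
C = conceptor(X, aperture=10.0)
not_C = conceptor_not(C)
```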