2021
DOI: 10.1155/2021/5526479
Adaptive‐Weighted Multiview Deep Basis Matrix Factorization for Multimedia Data Analysis

Abstract: Feature representation learning is a key issue in artificial intelligence research. Multiview multimedia data provide rich information, which has made feature representation one of the current research hotspots in data analysis. Recently, a large number of multiview feature representation methods have been proposed, among which matrix factorization shows excellent performance. Therefore, we propose an adaptive-weighted multiview deep basis matrix factorization (AMDBMF) method that integrates m…
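The abstract describes combining several views of the same samples in one weighted factorization. As a rough illustration only (not the paper's AMDBMF algorithm), a weighted multiview objective can be sketched as below; the synthetic views, the latent size `k`, and the inverse-error weighting rule are all assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

# Two hypothetical views of the same 20 samples, with different
# feature dimensions -- synthetic data, purely for illustration.
views = [np.abs(rng.normal(size=(20, 8))), np.abs(rng.normal(size=(20, 12)))]
k = 4  # shared latent dimension (assumed hyperparameter)

# Shared coefficient matrix H and one basis matrix W_v per view.
H = np.abs(rng.normal(size=(20, k)))
Ws = [np.abs(rng.normal(size=(k, X.shape[1]))) for X in views]

def view_errors(views, H, Ws):
    """Squared Frobenius reconstruction error of each view."""
    return np.array([np.linalg.norm(X - H @ W) ** 2 for X, W in zip(views, Ws)])

# Adaptive weighting (an assumption, not the paper's exact update rule):
# views that reconstruct better receive larger weight.
errs = view_errors(views, H, Ws)
weights = (1.0 / errs) / (1.0 / errs).sum()

# Overall objective: weighted sum of per-view reconstruction errors.
objective = float(weights @ errs)
```

The point of the weighting step is that an unreliable view contributes less to the shared representation than a well-reconstructed one.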

Cited by 4 publications (2 citation statements)
References 47 publications (56 reference statements)
“…Different techniques for multi-view clustering have been explored before, but they have not been analyzed in terms of dimensionality reduction in the representation of multi-labelled data in supervised learning. To identify text, the supervised learning technique defines the labelled information based on its features [20,21]. It is thus crucial to find data sources for unsupervised learning with multiple labelling [22,23].…”
Section: Introduction (mentioning)
confidence: 99%
“…None of these approaches constrains the sign of the matrix elements during matrix decomposition, which means the low-dimensional representation of the data may contain negative elements. Recently, deep learning has shown excellent performance in feature representation tasks [18][19][20]. Consequently, many researchers have introduced deep learning into matrix factorization and proposed a large number of deep feature representation methods [21][22][23][24][25][26][27].…”
Section: Introduction (mentioning)
confidence: 99%
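The statement above concerns the nonnegativity constraint that distinguishes NMF-style factorizations from unconstrained ones. A minimal sketch of the classical Lee-Seung multiplicative updates (a standard NMF baseline, not the AMDBMF method itself) shows how both factors stay nonnegative while the reconstruction error decreases; the matrix sizes and iteration count here are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(1)
X = np.abs(rng.normal(size=(30, 10)))  # nonnegative data matrix
k = 5                                  # latent dimension (illustrative)

# Nonnegative random initialization of the two factors.
W = np.abs(rng.normal(size=(30, k)))
H = np.abs(rng.normal(size=(k, 10)))

err_before = np.linalg.norm(X - W @ H)

eps = 1e-9  # guard against division by zero
for _ in range(200):
    # Lee-Seung multiplicative updates: each factor is multiplied by a
    # nonnegative ratio, so nonnegativity is preserved at every step.
    H *= (W.T @ X) / (W.T @ W @ H + eps)
    W *= (X @ H.T) / (W @ H @ H.T + eps)

err_after = np.linalg.norm(X - W @ H)
```

Because the updates only rescale entries by nonnegative ratios, no explicit projection back onto the nonnegative orthant is needed, which is the property the cited passage contrasts against unconstrained decompositions.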