Deep multi-view learning methods: A review (2021)
DOI: 10.1016/j.neucom.2021.03.090

Cited by 155 publications (53 citation statements)
References 175 publications (301 reference statements)
“…The optimization problem (2) can be extended to the multi-view scenario, where different views of the data are reflected by different types of features. Multi-view learning provides a mechanism for exploiting multiple types of features [40], [41], [42]. Let X_v denote the data matrix from the v-th view, where d_v is the feature space dimension of the v-th view.…”
Section: Methods
Mentioning, confidence: 99%
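The excerpt above describes multi-view data as one data matrix per view, each with its own feature dimension; the original notation did not survive extraction, so X_v and d_v are reconstructed placeholders. Below is a minimal illustrative sketch (not taken from the cited papers) of that layout in NumPy, with made-up sample counts and view dimensions, plus the simplest concatenation-style fusion baseline.

```python
import numpy as np

# Hypothetical multi-view dataset: n samples observed under V = 3 views,
# each view v described by its own feature space of dimension d_v.
n = 100
view_dims = [64, 32, 128]          # d_1, d_2, d_3 (illustrative values only)
rng = np.random.default_rng(0)

# One data matrix X_v of shape (n, d_v) per view, mirroring the excerpt's
# "data matrix from the v-th view".
views = [rng.standard_normal((n, d_v)) for d_v in view_dims]

# Simplest multi-view baseline: early fusion by feature concatenation,
# yielding a single matrix of shape (n, sum(d_v)) that any single-view
# optimization problem can then consume.
X_concat = np.concatenate(views, axis=1)
print(X_concat.shape)              # (100, 224)
```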
“…Apart from that, the authors also listed the YOLO series of detectors, e.g., YOLOv2. Yan et al [11] reported a review on deep multi-view learning from videos focusing on representational deep learning methods such as convolutional neural networks, deep belief networks, and multi-view auto-encoders.…”
Section: A Literature Survey
Mentioning, confidence: 99%
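The survey excerpt names multi-view auto-encoders among the representative deep multi-view learning methods. As a rough illustration only (not the architecture of any specific cited paper), the following PyTorch sketch encodes each view into a shared latent code and reconstructs every view from it; the layer sizes and the averaging-based fusion are assumptions chosen for brevity.

```python
import torch
import torch.nn as nn

class MultiViewAutoencoder(nn.Module):
    """Toy multi-view autoencoder: one encoder/decoder pair per view,
    with all views mapped into (and reconstructed from) a shared latent code."""

    def __init__(self, view_dims, latent_dim=16):
        super().__init__()
        self.encoders = nn.ModuleList(
            [nn.Linear(d, latent_dim) for d in view_dims])
        self.decoders = nn.ModuleList(
            [nn.Linear(latent_dim, d) for d in view_dims])

    def forward(self, views):
        # Fuse per-view codes by averaging into one shared representation.
        z = torch.stack(
            [enc(x) for enc, x in zip(self.encoders, views)]).mean(dim=0)
        # Reconstruct every view from the shared code.
        return z, [dec(z) for dec in self.decoders]

# Illustrative usage with random data and hypothetical dimensions.
view_dims = [64, 32, 128]
model = MultiViewAutoencoder(view_dims)
views = [torch.randn(8, d) for d in view_dims]
z, recons = model(views)
loss = sum(nn.functional.mse_loss(r, x) for r, x in zip(recons, views))
loss.backward()
```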
“…Currently, model learning and prediction fusion are separate. A future direction is to combine deep multi-view learning [58] into the training session, which leads to an end-to-end multi-view visibility prediction framework. Another direction is integrating edge computing [1, 14] to construct a more efficient and robust visibility monitoring framework.…”
Section: Limitation and Future Work
Mentioning, confidence: 99%
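The future-work excerpt contrasts separate model learning and prediction fusion with an end-to-end alternative. A hedged sketch of what such joint training could look like, with hypothetical dimensions and a simple concatenation head rather than any published design: per-view encoders and the fused prediction head live in one module, so the prediction loss back-propagates into every view's representation.

```python
import torch
import torch.nn as nn

class EndToEndMultiViewPredictor(nn.Module):
    """Per-view encoders plus a shared prediction head, trained jointly so
    that representation learning and prediction fusion happen end to end."""

    def __init__(self, view_dims, hidden_dim=32, num_classes=2):
        super().__init__()
        self.encoders = nn.ModuleList(
            [nn.Sequential(nn.Linear(d, hidden_dim), nn.ReLU())
             for d in view_dims])
        self.head = nn.Linear(hidden_dim * len(view_dims), num_classes)

    def forward(self, views):
        # Encode each view, concatenate the codes, and predict in one pass,
        # so gradients from the prediction loss reach every view encoder.
        h = torch.cat([enc(x) for enc, x in zip(self.encoders, views)], dim=1)
        return self.head(h)

# Illustrative usage with random data and hypothetical labels.
view_dims = [64, 32, 128]
model = EndToEndMultiViewPredictor(view_dims)
views = [torch.randn(8, d) for d in view_dims]
labels = torch.randint(0, 2, (8,))
logits = model(views)
nn.functional.cross_entropy(logits, labels).backward()
```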