Proceedings of the 21st ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, 2015
DOI: 10.1145/2783258.2783304

Deep Model Based Transfer and Multi-Task Learning for Biological Image Analysis

Cited by 96 publications (23 citation statements, published 2017–2024) · References 23 publications
“…All of the learning tasks are assumed to be related to each other, and it is found that learning these tasks jointly can lead to performance improvement compared with learning them individually. In general, MTL algorithms can be classified into several categories, including the feature learning approach [34,41], low-rank approach [7,16], task clustering approach [47], task relation learning approach [12], and decomposition approach [6]. For example, the cross-stitch network [41] determines the inputs of hidden layers in different tasks by a knowledge transfer matrix; Zhou et al. [47] aim to cluster tasks by identifying representative tasks, which are a subset of the given m tasks, i.e., if task T_i is selected by task T_j as a representative task, then it is expected that the model parameters for T_j are similar to those of T_i.…”
Section: Multi-Task Learning
mentioning confidence: 99%
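
The knowledge transfer matrix mentioned in that statement is easiest to see in code. Below is a minimal sketch of a cross-stitch unit in PyTorch; the framework choice and the near-identity initialization are our illustrative assumptions, not details prescribed by [41].

```python
# Minimal sketch of a cross-stitch unit: each task's next-layer input
# is a learned linear combination of both tasks' activations, so the
# 2x2 "knowledge transfer matrix" alpha decides how much the tasks share.
import torch
import torch.nn as nn

class CrossStitchUnit(nn.Module):
    def __init__(self):
        super().__init__()
        # Initialize near the identity: each task starts out relying
        # mostly on its own features.
        self.alpha = nn.Parameter(torch.tensor([[0.9, 0.1],
                                                [0.1, 0.9]]))

    def forward(self, x_a, x_b):
        # x_a, x_b: same-shape activations from task A's and task B's networks.
        out_a = self.alpha[0, 0] * x_a + self.alpha[0, 1] * x_b
        out_b = self.alpha[1, 0] * x_a + self.alpha[1, 1] * x_b
        return out_a, out_b
```

Inserting one such unit between corresponding layers of two single-task networks lets gradient descent learn the degree of sharing instead of fixing it by hand: alpha near the identity recovers independent networks, while a uniform alpha approaches full sharing.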
“…In image analysis, previous examples of deep transfer learning applications proved large-scale natural image sets [45] to be useful for pre-training models that serve as generic feature extractors for various types of biological images [14,283,534,535]. More recently, deep learning models predicted protein sub-cellular localization for proteins not originally present in a training set [536].…”
Section: Multimodal, Multi-Task, and Transfer Learning
mentioning confidence: 99%
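
To make the pre-training idea in the statement above concrete, here is a minimal sketch that freezes a network trained on large-scale natural images and reuses it as a generic feature extractor for biological images. The choice of torchvision's ResNet-18 and the `ResNet18_Weights` enum (torchvision ≥ 0.13) are illustrative assumptions, not details from the cited works.

```python
# Minimal sketch: reuse an ImageNet-pretrained CNN as a frozen, generic
# feature extractor (the transfer-learning pattern described above).
import torch
import torchvision.models as models

backbone = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
backbone.fc = torch.nn.Identity()      # drop the ImageNet classifier head
for p in backbone.parameters():
    p.requires_grad = False            # freeze: no fine-tuning here
backbone.eval()

with torch.no_grad():
    batch = torch.randn(4, 3, 224, 224)  # stand-in for microscopy crops
    feats = backbone(batch)              # shape (4, 512): generic features
```

A small task-specific classifier can then be trained on `feats`, which is often sufficient when labeled biological images are scarce.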
“…All three techniques can be used together in the same model. For example, Zhang et al. [534] combined deep model-based transfer and multi-task learning for cross-domain image annotation. One could imagine extending that approach to multimodal inputs as well.…”
Section: Multimodal, Multi-Task, and Transfer Learning
mentioning confidence: 99%
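
One way the combination can look in practice is sketched below under our own assumptions; it illustrates the shared-trunk pattern, not the actual architecture of [534]. A shared (possibly pre-trained) trunk supplies features to one lightweight head per annotation task.

```python
# Minimal sketch combining transfer and multi-task learning: a shared
# trunk (e.g. the frozen backbone above) feeds one linear head per task.
import torch
import torch.nn as nn

class MultiTaskAnnotator(nn.Module):
    def __init__(self, trunk: nn.Module, feat_dim: int, labels_per_task):
        super().__init__()
        self.trunk = trunk  # shared, transferred representation
        self.heads = nn.ModuleList(
            [nn.Linear(feat_dim, n) for n in labels_per_task]  # multi-task
        )

    def forward(self, x):
        z = self.trunk(x)                         # one shared embedding
        return [head(z) for head in self.heads]   # one output per task
```

Training then sums per-task losses over whichever labels each example carries, so tasks with few annotations still benefit from the shared trunk.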
“…Zhang et al introduced such a symmetric approach named Multitask Relationship Learning (MTRL) [45] which regularizes the parallel learning of multiple tasks and models their relationships in a non-parametric manner as a task covariance matrix. Many other symmetric approaches [21,43,17,14,46,42] have been developed in recent years. Permutations of utilizing different regularization strategies [14], multi-level sharing [42], cross-layer parameter combinations [21] or meshes of all options [25] have been extensively tested, however they are vulnerable to noisy and outlier tasks which when introduced dramatically deteriorate performance.…”
Section: Related Work
mentioning confidence: 99%
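
For reference, and up to notation (our rendering, not a quotation from [45]), the MTRL objective regularizes the task parameter matrix \(W\) (one column \(\mathbf{w}_i\) per task) with a learned task covariance matrix \(\Omega\):

\[
\min_{W,\,b,\,\Omega}\;
\sum_{i=1}^{m}\sum_{j=1}^{n_i} \ell\!\left(y_{ij},\, \mathbf{w}_i^{\top}\mathbf{x}_{ij} + b_i\right)
+ \frac{\lambda_1}{2}\operatorname{tr}\!\left(W W^{\top}\right)
+ \frac{\lambda_2}{2}\operatorname{tr}\!\left(W \Omega^{-1} W^{\top}\right)
\quad \text{s.t. } \Omega \succeq 0,\ \operatorname{tr}(\Omega) = 1,
\]

where the \(\operatorname{tr}(W \Omega^{-1} W^{\top})\) term couples tasks according to their learned relationships, and the constraints keep \(\Omega\) a valid, normalized covariance.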