2018
DOI: 10.1109/tsp.2018.2807422

Weakly Supervised Dictionary Learning

Abstract: We present a probabilistic modeling and inference framework for discriminative analysis dictionary learning under a weak supervision setting. Dictionary learning approaches have been widely used for tasks such as low-level signal denoising and restoration as well as high-level classification tasks, which can be applied to audio and image analysis. Synthesis dictionary learning aims at jointly learning a dictionary and corresponding sparse coefficients to provide accurate data representation. This approach is u…
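For readers unfamiliar with the synthesis formulation mentioned in the abstract, the following is a minimal sketch of the underlying objective, min over D and X of 0.5·||Y − DX||_F² + λ·||X||_1, solved by alternating sparse coding (ISTA steps) with a least-squares dictionary update. This is a generic illustration, not the weakly supervised model proposed in the paper; the penalty weight, iteration counts, and update rules are arbitrary choices.

```python
import numpy as np

def soft_threshold(v, t):
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def dictionary_learning(Y, n_atoms=32, lam=0.1, n_outer=30, n_ista=10, seed=0):
    """Alternating minimization for
        min_{D, X} 0.5*||Y - D X||_F^2 + lam*||X||_1
    with unit-norm dictionary atoms. Y has shape (d, n)."""
    rng = np.random.default_rng(seed)
    d, n = Y.shape
    D = rng.standard_normal((d, n_atoms))
    D /= np.linalg.norm(D, axis=0, keepdims=True)          # unit-norm atoms
    X = np.zeros((n_atoms, n))
    for _ in range(n_outer):
        # Sparse coding: a few ISTA steps on X with D fixed.
        L = np.linalg.norm(D, 2) ** 2 + 1e-12               # Lipschitz constant of the gradient
        for _ in range(n_ista):
            X = soft_threshold(X - (D.T @ (D @ X - Y)) / L, lam / L)
        # Dictionary update: ridge-regularized least squares, then renormalize atoms.
        G = X @ X.T + 1e-8 * np.eye(n_atoms)
        D = Y @ X.T @ np.linalg.inv(G)
        D /= np.maximum(np.linalg.norm(D, axis=0, keepdims=True), 1e-12)
    return D, X

# Example: learn 32 atoms from 200 random 64-dimensional signals.
Y = np.random.default_rng(1).standard_normal((64, 200))
D, X = dictionary_learning(Y)
```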


Cited by 9 publications (9 citation statements); references 52 publications.

“…However, a simple reduction of the model for MIML would fail in the cancer classification problem as the bag labels in MIL are binary but not a subset of the class labels. Recently, You et al. [41] introduced cardinality constraints to the MIML setting and demonstrated that tuning the maximum number of instances per bag can significantly improve the performance of the model. Motivated by this result, we propose using cardinality constraints to limit the number of relevant attributes in each bag.…”
Section: Related Work (mentioning)
confidence: 99%
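To make the notion of a cardinality constraint concrete, here is a small hedged sketch: a candidate instance labelling is consistent with a bag label only if every class in the bag label is carried by at least one and at most k_max instances. The rule and the names (satisfies_cardinality, k_max) are illustrative assumptions, not the exact constraint form used in [41].

```python
from collections import Counter

def satisfies_cardinality(instance_labels, bag_label, k_max=3):
    """Check whether a candidate instance labelling is consistent with a bag
    label under a cardinality constraint: every class in the bag label must be
    carried by at least 1 and at most k_max instances, and no instance may
    carry a class outside the bag label. (Illustrative rule only.)"""
    counts = Counter(instance_labels)
    if any(c not in bag_label for c in counts):
        return False
    return all(1 <= counts[c] <= k_max for c in bag_label)

# Example: the second labelling violates the cap of 3 instances for class 'b'.
print(satisfies_cardinality(['a', 'b', 'b', 'c'], {'a', 'b', 'c'}))            # True
print(satisfies_cardinality(['a', 'b', 'b', 'b', 'b', 'c'], {'a', 'b', 'c'}))  # False
```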
“…Given that there are hundreds of instances per bag, cardinality constraints can help control the model complexity. While the probabilistic machinery for implementing cardinality constraints is similar to the approach in You et al. [41], the graphical model and inference methods differ. The modeling difference is that instance-level labels include unknown attribute/cluster labels (see Fig.…”
Section: Related Work (mentioning)
confidence: 99%
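The "probabilistic machinery" for cardinality constraints typically reduces to a distribution over the count of positive instance labels in a bag. The sketch below is a generic construction (not the specific inference of [41] or of this paper): it computes the count distribution for independent Bernoulli instance labels with a simple dynamic program and then conditions it on the allowed range; the bounds k_min and k_max are illustrative.

```python
import numpy as np

def count_distribution(p):
    """P(sum_i z_i = k) for independent z_i ~ Bernoulli(p_i) (Poisson-binomial),
    computed by convolving two-point count distributions one instance at a time.
    Returns an array of length len(p) + 1 indexed by k."""
    dist = np.array([1.0])
    for pi in p:
        dist = np.convolve(dist, [1.0 - pi, pi])
    return dist

def constrained_count_distribution(p, k_min=1, k_max=3):
    """Condition the count distribution on a cardinality constraint
    k_min <= sum_i z_i <= k_max (illustrative bounds)."""
    dist = count_distribution(p)
    mask = np.zeros_like(dist)
    mask[k_min:min(k_max, len(dist) - 1) + 1] = 1.0
    dist = dist * mask
    return dist / dist.sum()

# Example: posterior probabilities that each of five instances is "relevant".
print(constrained_count_distribution([0.9, 0.7, 0.2, 0.1, 0.05]))
```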
“…The following research applied weakly supervised or unsupervised learning to time-series data. You et al. analyzed weakly supervised dictionary learning that relies only on weak supervision describing presence or absence in a set of data points [37]. Zhang et al. proposed a weakly supervised method to detect involuntary body vibrations due to illness from voluntary exercise [38].…”
Section: Related Work (mentioning)
confidence: 99%
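As a rough illustration of learning from presence/absence labels over sets of data points, the sketch below uses a generic weak-supervision pattern: per-instance scores aggregated by a noisy-OR, with a binary likelihood at the bag level. This is an assumption-laden stand-in, not the probabilistic dictionary-learning model analyzed in [37]; the linear scoring function and the noisy-OR aggregation are my own choices.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def presence_absence_loss(w, bags, bag_labels):
    """Negative log-likelihood of bag-level presence/absence labels.
    bags: list of (n_i, d) arrays of instances; bag_labels: 0/1 per bag.
    Instance scores are aggregated with a noisy-OR, so a bag is 'present'
    if any of its instances is."""
    total = 0.0
    for X, y in zip(bags, bag_labels):
        p_inst = sigmoid(X @ w)                    # per-instance presence probabilities
        p_bag = 1.0 - np.prod(1.0 - p_inst)        # noisy-OR aggregation over the bag
        p_bag = np.clip(p_bag, 1e-8, 1.0 - 1e-8)
        total -= y * np.log(p_bag) + (1 - y) * np.log(1.0 - p_bag)
    return total / len(bags)

# Example: two bags of 3 and 2 instances in 4 dimensions, one present and one absent.
rng = np.random.default_rng(0)
bags = [rng.standard_normal((3, 4)), rng.standard_normal((2, 4))]
print(presence_absence_loss(rng.standard_normal(4), bags, [1, 0]))
```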
“…There exists a family of "task-driven" dictionary learning methods that tune the dictionary for a specific task, e.g., classification and clustering [176,177,178,179,180,181,182], image denoising [183,184,185,186], medical imaging [187,188,189], plenoptic imaging [190], or even reducing the block artifacts of JPEG images [191].…”
Section: (mentioning)
confidence: 99%
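One common bilevel formulation of task-driven dictionary learning is shown below as a representative form of this family (Mairal-style; not a specific method from [176]-[191]). The sparse code of a signal x is defined by a lasso problem over the dictionary D, and the dictionary and task model W are then trained against a task loss on those codes.

```latex
% alpha*(x, D): sparse code of x under dictionary D;
% ell: task loss (e.g. logistic or squared loss); W: task model; nu: regularizer weight.
\alpha^{\star}(x, D) = \arg\min_{\alpha}\ \tfrac{1}{2}\lVert x - D\alpha \rVert_2^2
                       + \lambda \lVert \alpha \rVert_1 ,
\qquad
\min_{D, W}\ \mathbb{E}_{(x,y)}\!\left[\, \ell\!\left(y,\, W\,\alpha^{\star}(x, D)\right) \right]
             + \tfrac{\nu}{2}\lVert W \rVert_F^2 .
```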