2021 55th Asilomar Conference on Signals, Systems, and Computers
DOI: 10.1109/ieeeconf53345.2021.9723109

Semi-supervised Nonnegative Matrix Factorization for Document Classification

Abstract: We propose new semi-supervised nonnegative matrix factorization (SSNMF) models for document classification and provide motivation for these models as maximum likelihood estimators. The proposed SSNMF models simultaneously provide both a topic model and a model for classification, thereby offering highly interpretable classification results. We derive training methods using multiplicative updates for each new model, and demonstrate the application of these models to single-label and multi-label document classif…
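Based on the abstract, a minimal sketch of this kind of SSNMF: a shared factor `S` jointly reconstructs the data `X` (topic model) and the labels `Y` (classifier), with a masking matrix `L` marking which labels are known, trained by multiplicative updates. The Frobenius-norm objective and the exact update rules below are the standard ones from the SSNMF literature, not taken from the paper's own code; all variable names are illustrative.

```python
# Sketch of SSNMF: minimize ||X - A S||_F^2 + lam * ||L * (Y - B S)||_F^2
# over nonnegative A (topics), B (label dictionary), S (document codes).
# L is a binary mask that zeroes out the label loss for unlabeled documents.
import numpy as np

def ssnmf(X, Y, L, k, lam=1.0, iters=200, eps=1e-9, seed=0):
    rng = np.random.default_rng(seed)
    m, n = X.shape          # m features, n documents
    c = Y.shape[0]          # c classes
    A = rng.random((m, k))
    B = rng.random((c, k))
    S = rng.random((k, n))
    for _ in range(iters):
        # Multiplicative updates: each factor is scaled by a nonnegative
        # ratio, so nonnegativity is preserved at every step.
        A *= (X @ S.T) / (A @ S @ S.T + eps)
        B *= ((L * Y) @ S.T) / ((L * (B @ S)) @ S.T + eps)
        S *= (A.T @ X + lam * B.T @ (L * Y)) / (
             A.T @ A @ S + lam * B.T @ (L * (B @ S)) + eps)
    return A, B, S
```

The interpretability claim in the abstract corresponds to reading `A` as a topic dictionary and the columns of `B @ S` as soft label predictions; a new document would be classified by solving for its code against the learned `A` and applying `B`.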

Cited by 4 publications (5 citation statements)
References 23 publications
“…While there are other variants of SSNMF according to [14], we developed GSSNMF only based on the standard Frobenius norm. In the future, we plan to make use of other comparable measures like the information divergence, and derive a corresponding multiplicative updates solver.…”
Section: Discussion
confidence: 99%
“…Besides topic modeling, one variant of classical NMF, Semi-Supervised NMF (SSNMF) [10,11,14] is designed to further perform classification. SSNMF introduces a masking matrix…”
Section: Semi-supervised NMF
confidence: 99%
“…This can be done by minimizing a least squares error with a weighted penalty for the ℓ1-norm of the parameters. Prior authors have combined NMF with a linear regression procedure to maximize the predictive power of a classifier [10][11][12]. This is accomplished through a penalty function that combines NMF with another objective function, a (semi-)supervised approach.…”
Section: Relation To Current Work and Contributions
confidence: 99%