2018
DOI: 10.1016/j.neucom.2018.03.028

GAR: An efficient and scalable graph-based activity regularization for semi-supervised learning

Abstract: In this paper, we propose a novel graph-based approach for semi-supervised learning problems, which considers an adaptive adjacency of the examples throughout the unsupervised portion of the training. Adjacency of the examples is inferred using the predictions of a neural network model which is first initialized by a supervised pretraining. These predictions are then updated according to a novel unsupervised objective which regularizes another adjacency, now linking the output nodes. Regularizing the adjacency…
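
The excerpt above cuts off before the objective is defined. As a minimal sketch of the idea the abstract describes, assume a batch of softmax activations Y (one row per example, one column per output node) and treat N = YᵀY as the adjacency linking output nodes; the affinity/balance split, the coefficient names, and `gar_regularizer` below are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def gar_regularizer(Y, c_alpha=1.0, c_beta=1.0, c_f=1e-4):
    """Hypothetical GAR-style penalty on softmax activations Y (n x k).

    N = Y^T Y is an adjacency over the k output nodes: large
    off-diagonal entries mean different nodes co-activate on the same
    examples. The penalty pushes that affinity down while keeping the
    total activation mass balanced across nodes.
    """
    n, k = Y.shape
    N = Y.T @ Y                                  # k x k output-node adjacency
    affinity = (N.sum() - np.trace(N)) / ((k - 1) * np.trace(N))

    v = Y.sum(axis=0, keepdims=True)             # per-node activation mass
    M = v.T @ v
    balance = (M.sum() - np.trace(M)) / ((k - 1) * np.trace(M))

    frobenius = np.square(Y).sum() / n           # keeps activations bounded
    return c_alpha * affinity + c_beta * (1.0 - balance) + c_f * frobenius
```

Affinity reaches 0 when each example activates a single output node and balance reaches 1 when the k nodes receive equal activation mass, so minimizing the penalty favors confident, evenly spread assignments.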


Cited by 27 publications (20 citation statements)
References 18 publications (33 reference statements)
“…More recent approaches include convolutional embedded networks (CENs) [ 30 ], deep convolutional embedded clustering (DCEN) [ 29 ], discriminatively boosted clustering (DBC) [ 47 ], CNN-based joint clustering and RL with feature drift compensation (UMMC) [ 51 ], deep continuous clustering (DCC) [ 49 ], learning latent representations for clustering (IMSAT) [ 50 ] and deep adaptive clustering (DAC) [ 48 ]. We listed reviewed approaches including links to the original papers and their implementations that can be found at https://github.com/rezacsedu/Deep-learning-for-clustering-in-bioinformatics .…”
Section: DL For Clustering (citation type: mentioning)
confidence: 99%
“…sub-classes, under each one of parents [45]. ACOL is a novel output layer modification for deep neural networks to allow simultaneous supervised classification (per provided parent-classes) and unsupervised clustering (within each parent) where clustering is performed with a Graph-based Activity Regularization (GAR) technique recently proposed in [44]. More specifically, as ACOL duplicates the softmax nodes at the output layer for each class, GAR allows for competitive learning between these duplicates on a traditional error-correction learning framework.…”
Section: Supplementary Methods (citation type: mentioning)
confidence: 99%
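
The duplicating mechanism described in this excerpt can be sketched as follows. How the duplicated nodes are wired back to parent-class predictions is an assumption here (sum-pooling over each parent's duplicates), and `acol_parent_probs` is a hypothetical helper, not the authors' code.

```python
import numpy as np

def acol_parent_probs(logits, n_parents, n_duplicates):
    """Hypothetical sketch of an ACOL-style output layer.

    `logits` has one column per (parent class, duplicate) pair, shape
    (n_examples, n_parents * n_duplicates), with columns grouped by
    parent. Softmax runs over all duplicates jointly, so duplicates of
    the same parent compete for its examples; summing them recovers
    parent-class probabilities for the supervised loss, while a
    GAR-style penalty makes the duplicates specialize on sub-classes.
    """
    z = logits - logits.max(axis=1, keepdims=True)   # numerically stable softmax
    p = np.exp(z)
    p /= p.sum(axis=1, keepdims=True)                # (n, parents * duplicates)
    return p.reshape(-1, n_parents, n_duplicates).sum(axis=2)  # (n, parents)
```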
“…To estimate an unbiased distance between our metagenomes based on SAAVs, we used a novel deep neural network modification called the auto-clustering output layer (ACOL). Briefly, ACOL relies on a recently introduced graph-based activity regularization (GAR) technique for competitive learning from hyper-dimensional data to demarcate fine clusters within user-defined 'parent' classes [44]. In this application of ACOL, however, we modified the algorithm so it can reveal latent groups in our SAAVs in a fully unsupervised manner through frequent random sampling of SAAVs to create pseudo-parent class labels instead of user-defined classes [45].…”
Section: SAR11 Cultivar Genomes (citation type: mentioning)
confidence: 99%
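
The pseudo-parent labeling this excerpt describes can be illustrated minimally as below, assuming labels are simply re-drawn at random each training round; `resample_pseudo_parents`, the batch shapes, and the loop are illustrative, not the modified algorithm from [45].

```python
import numpy as np

def resample_pseudo_parents(n_examples, n_parents, rng):
    """Hypothetical sketch: assign pseudo-parent class labels by random
    sampling. The labels carry no real class information; they only give
    the supervised half of ACOL something to fit while the GAR clustering
    term uncovers the latent groups."""
    return rng.integers(0, n_parents, size=n_examples)

rng = np.random.default_rng(0)
X = rng.random((128, 20))                # e.g., a batch of SAAV frequency vectors
for round_ in range(5):
    y_pseudo = resample_pseudo_parents(len(X), n_parents=4, rng=rng)
    # ...train one round of the ACOL network on (X, y_pseudo)...
```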
“…the Cauchy kernel $K(x_i, x_j) = 1 / \left(1 + \|x_i - x_j\|^2 / \sigma_{\text{Cauchy}}^2\right)$ (10), where $\sigma_{\text{Cauchy}}$ is an adjustable parameter. The Cauchy kernel is a long-tailed kernel, deriving from the Cauchy distribution.…”
Section: LSSVM Based On Self-Organizing MK Learning (citation type: mentioning)
confidence: 99%
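
Equation (10) above is reconstructed as the standard Cauchy kernel. A small sketch of the corresponding kernel-matrix computation (the vectorized form and the function name are illustrative):

```python
import numpy as np

def cauchy_kernel(X, Z, sigma=1.0):
    """Cauchy kernel matrix K[i, j] = 1 / (1 + ||x_i - z_j||^2 / sigma^2).

    Long-tailed compared to the Gaussian kernel: similarity decays
    polynomially, so distant points keep non-negligible weight. `sigma`
    plays the role of the adjustable sigma_Cauchy bandwidth in Eq. (10).
    """
    sq_dists = (
        np.sum(X**2, axis=1)[:, None]
        + np.sum(Z**2, axis=1)[None, :]
        - 2.0 * (X @ Z.T)
    )
    return 1.0 / (1.0 + np.maximum(sq_dists, 0.0) / sigma**2)
```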
“…For different types of modeling methods, machine learning can be classified as unsupervised learning, semi-supervised learning, and supervised learning [9][10][11]. The focus of this paper is on supervised learning methods.…”
Section: Introduction (citation type: mentioning)
confidence: 99%