Proceedings of SustaiNLP: Workshop on Simple and Efficient Natural Language Processing 2020
DOI: 10.18653/v1/2020.sustainlp-1.4
End to End Binarized Neural Networks for Text Classification

Abstract: Deep neural networks have demonstrated superior performance in almost every Natural Language Processing task; however, their increasing complexity raises concerns. A particular concern is that these networks pose high requirements for computing hardware and training budgets. The state-of-the-art transformer models are a vivid example. Simplifying the computations performed by a network is one way of addressing the issue of increasing complexity. In this paper, we propose an end-to-end binarized neura…
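The core idea named in the abstract, constraining a layer's weights and activations to {-1, +1}, can be illustrated with a minimal NumPy sketch; the function names and shapes below are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def binarize(x):
    # Deterministic binarization: map each value to {-1, +1} by sign.
    return np.where(x >= 0, 1.0, -1.0)

def binary_dense(x, W, b):
    # Forward pass of a fully binarized dense layer: both inputs and
    # weights are constrained to {-1, +1}, so the matrix product reduces
    # to cheap popcount-style integer arithmetic on real hardware.
    return binarize(x) @ binarize(W) + b

rng = np.random.default_rng(0)
x = rng.normal(size=(2, 8))   # a small batch of "embeddings" (toy shapes)
W = rng.normal(size=(8, 4))   # real-valued latent weights
b = np.zeros(4)

out = binary_dense(x, W, b)
```

Since every term of each dot product is plus or minus one, each output entry is an integer bounded by the input dimension, which is what makes such layers attractive on constrained hardware.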

Cited by 15 publications (12 citation statements) · References 9 publications
“…The works in this domain dealt with different tasks such as text categorization, news identification, and intent classification. Most of the works [Rachkovskij, 2007], [Shridhar et al, 2020], [Alonso et al, 2021] used HVs as a way to represent data for some conventional machine learning classification algorithms.…”
Section: Classification Based On Feature Vectors
confidence: 99%
“…The most illustrative use cases for encoding input data into hypervectors and interfacing conventional machine learning algorithms are [34], [47]–[52]. For example, works [48], [50] proposed encoding n-gram statistics into hypervectors and subsequently solving typical natural language processing tasks with either supervised or unsupervised learning using standard artificial neural network architectures.…”
Section: Related Work
confidence: 99%
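The n-gram-to-hypervector encoding mentioned in the statement above can be sketched as follows. This is a generic HDC/VSA recipe (bipolar item vectors, position encoding by cyclic shift, binding by element-wise product, bundling by addition), not the exact scheme of [48] or [50]; the dimension and seeding are illustrative.

```python
import numpy as np

def char_hv(c, dim=1000):
    # Deterministic bipolar {-1, +1} item vector per character,
    # seeded by the character's code point so every call agrees.
    rng = np.random.default_rng(ord(c))
    return rng.choice([-1, 1], size=dim)

def ngram_hypervector(text, n=3, dim=1000):
    # Bundle (sum) the bound representations of all character n-grams.
    # Each n-gram binds (multiplies) its characters' item vectors, with
    # position encoded by a cyclic shift (np.roll).
    acc = np.zeros(dim)
    for i in range(len(text) - n + 1):
        hv = np.ones(dim, dtype=int)
        for j, c in enumerate(text[i:i + n]):
            hv *= np.roll(char_hv(c, dim), j)
        acc += hv
    return acc

def cosine(u, v):
    return u @ v / (np.linalg.norm(u) * np.linalg.norm(v))

a = ngram_hypervector("hello world")
b = ngram_hypervector("hello there")      # shares several trigrams with a
c = ngram_hypervector("xylophone quartz")  # shares no trigrams with a
```

Texts sharing n-grams end up with correlated hypervectors, while unrelated texts stay quasi-orthogonal, which is what lets a conventional classifier operate on these fixed-size vectors.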
“…The experiments reported in this paper were done with two approaches to forming the network's classifier: regularized least squares (RLS) and centroids, though other alternatives are available [1,5,22,38]. The activations of the hidden layer of the network were formed according to a recently proposed version of the RVFL network [25], which simplifies the conventional architecture [16] using some of the HDC/VSA principles [24,27].…”
Section: Introduction
confidence: 99%
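The two readout strategies named in this statement, a centroid classifier and a regularized least squares (ridge) readout on top of a fixed random hidden layer, can be sketched on toy data. The data, layer sizes, and regularization value below are illustrative assumptions, not the cited paper's setup.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy two-class data (illustrative, not the paper's datasets)
X = np.vstack([rng.normal(-1.0, 1.0, size=(50, 5)),
               rng.normal(1.0, 1.0, size=(50, 5))])
y = np.array([0] * 50 + [1] * 50)

# RVFL-style fixed random hidden layer: random projection + nonlinearity
W_in = rng.normal(size=(5, 64))
H = np.tanh(X @ W_in)

# Centroid classifier: assign each sample to the nearest class-mean activation
centroids = np.stack([H[y == c].mean(axis=0) for c in (0, 1)])
dists = np.linalg.norm(H[:, None, :] - centroids[None, :, :], axis=2)
pred_centroid = np.argmin(dists, axis=1)

# Regularized least squares (ridge) readout on one-hot targets
Y = np.eye(2)[y]
lam = 1e-2
W_out = np.linalg.solve(H.T @ H + lam * np.eye(64), H.T @ Y)
pred_rls = np.argmax(H @ W_out, axis=1)

acc_centroid = (pred_centroid == y).mean()
acc_rls = (pred_rls == y).mean()
```

The centroid readout is training-free beyond computing class means, while the RLS readout solves one small linear system; both leave the random hidden weights untouched, which is the defining trait of RVFL-style networks.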