Proceedings of the Twenty-Seventh International Joint Conference on Artificial Intelligence 2018
DOI: 10.24963/ijcai.2018/310

Time-evolving Text Classification with Deep Neural Networks

Abstract: Traditional text classification algorithms are based on the assumption that data are independent and identically distributed. However, in most non-stationary scenarios, data may change smoothly due to long-term evolution and short-term fluctuation, which raises new challenges to traditional methods. In this paper, we present the first attempt to explore evolutionary neural network models for time-evolving text classification. We first introduce a simple way to extend arbitrary neural networks to evolutionary l…

Cited by 38 publications (31 citation statements). References: 0 publications.
“…Due to the recent remarkable achievements of deep learning, several methods have been proposed to investigate the problem of concept drift using deep learning techniques. For example, [107,152] proposed an RNN model for handling concept drift in time-series anomaly detection, addressing the challenges posed by sudden or regular changes in normal behavior. The model is trained incrementally as new data becomes available and can adapt to changes in the data distribution.…”
Section: Recommendation and Future Research Direction
confidence: 99%
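The incremental adaptation described in this citation statement can be illustrated with a toy sketch. Below, a plain logistic-regression classifier stands in for the RNN of [107,152] (an assumption for brevity); the data stream, learning rate, and update rule are illustrative, not the cited model. The classifier is updated batch-by-batch as new "time slices" arrive, letting it track a drifting decision boundary.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def partial_fit(w, X, y, lr=0.5, epochs=50):
    """One incremental update on a freshly arrived batch of data."""
    for _ in range(epochs):
        grad = X.T @ (sigmoid(X @ w) - y) / len(y)
        w = w - lr * grad
    return w

w = np.zeros(2)  # parameters, carried over and updated as batches arrive

# Time slice 1: the old concept (label depends on feature 0).
X1 = rng.normal(0.0, 1.0, (200, 2))
y1 = (X1[:, 0] > 0).astype(float)
w = partial_fit(w, X1, y1)

# Time slice 2: the concept has drifted (label now depends on feature 1).
X2 = rng.normal(0.0, 1.0, (200, 2))
y2 = (X2[:, 1] > 0).astype(float)
w = partial_fit(w, X2, y2)

# After the incremental update, the model fits the newer distribution.
acc_new = float(np.mean((sigmoid(X2 @ w) > 0.5) == y2.astype(bool)))
print(round(acc_new, 2))
```

The key point matching the quoted description is that the same parameter vector is updated in place as each batch arrives, rather than retraining from scratch on all past data.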
“…We use the model from the epoch that achieves the best result on the development set as the final model. He et al. (2018) propose an evolving framework for training document classifiers. We re-implement two classifiers, RCNN and HAN, with the diachronic propagation learning strategy, which achieved the best performance in their paper.…”
Section: DANN
confidence: 99%
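The model-selection rule in this statement (keep the checkpoint from the epoch with the best development-set score, not the last epoch) can be sketched minimally. The training step and the dev metric below are hypothetical stand-ins, not the cited experiments.

```python
import numpy as np

def dev_score(w):
    # Hypothetical development-set metric; peaks at w = 3.
    return -(w - 3.0) ** 2

best_w, best_score = None, -np.inf
w = 0.0
for epoch in range(10):
    w += 0.7                      # stand-in for one epoch of training
    score = dev_score(w)
    if score > best_score:        # checkpoint whenever dev score improves
        best_w, best_score = w, score

# best_w is the checkpoint from the best dev epoch, not the final w.
print(best_w)
```

Here training keeps "improving" past the best dev epoch, so the final parameters (w = 7.0) are worse on the dev metric than the retained checkpoint near w = 2.8.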
“…The most widely used type of recommendation algorithm is based on collaborative filtering. Such algorithms automatically learn the hidden features of users and items, which helps identify users' potential interests [19] [20]. However, collaborative filtering algorithms suffer from data-sparseness and cold-start problems.…”
Section: Introduction
confidence: 99%
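One standard way collaborative filtering learns "hidden features of users and items" is matrix factorization trained by SGD on the observed ratings. The sketch below is a generic illustration of that idea only: the ratings matrix, rank, and hyperparameters are made up, and this is not the specific algorithm of [19] or [20].

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy user-item ratings; 0 marks an unobserved entry.
R = np.array([[5, 3, 0, 1],
              [4, 0, 0, 1],
              [1, 1, 0, 5],
              [0, 1, 5, 4]], dtype=float)
mask = R > 0

k, lr, reg = 2, 0.01, 0.02          # latent rank, step size, L2 weight
U = rng.normal(scale=0.1, size=(R.shape[0], k))   # hidden user features
V = rng.normal(scale=0.1, size=(R.shape[1], k))   # hidden item features

for _ in range(2000):
    for u, i in zip(*np.nonzero(mask)):
        err = R[u, i] - U[u] @ V[i]               # residual on one rating
        U[u] += lr * (err * V[i] - reg * U[u])
        V[i] += lr * (err * U[u] - reg * V[i])

# Training RMSE on the observed entries only.
rmse = float(np.sqrt(np.mean((R[mask] - (U @ V.T)[mask]) ** 2)))
print(round(rmse, 3))
```

The unobserved entries of U @ V.T then serve as predicted ratings, which is exactly where the quoted sparseness and cold-start problems bite: users or items with no observed ratings get no gradient signal at all.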