2020
DOI: 10.1109/access.2020.3024750
Sentiment Analysis About Investors and Consumers in Energy Market Based on BERT-BiLSTM

Abstract: With the rapid development of social media, the number of online comments has exploded, and more and more people are willing to express their attitudes and feelings on the Internet. Under the influence of a series of major events around the world, production order faces a serious challenge, which has a severe impact on the energy market. In 2020, a large number of investors' and consumers' comments related to social events began to appear on the Chinese Internet. However, the style and quality of online comme…

Cited by 57 publications (20 citation statements)
References 15 publications (11 reference statements)
“…The attention mechanism arises from adopting encoder-decoder architectures in transformers: the encoder distills a "summary" of the input, and the decoder translates it. In this sense, transformers learn to attend to the relationships between the words of a particular sentence [17], [18]. Currently, BERT is one of the most powerful representations of context and words.…”
Section: Proposed Methods
confidence: 99%
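The encoder-decoder relationship the quote describes can be sketched with the core attention computation: each query is matched against all keys, and the values are mixed according to the resulting weights. This is a minimal numpy illustration of scaled dot-product attention, not the cited papers' implementation; all array names and dimensions are invented for the toy example.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def scaled_dot_product_attention(Q, K, V):
    """Weight each value by how strongly its key matches the query."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)      # (n_q, n_k) pairwise match scores
    weights = softmax(scores, axis=-1)   # each row of weights sums to 1
    return weights @ V, weights

# toy self-attention: 3 "words" with 4-dim embeddings, Q = K = V
rng = np.random.default_rng(0)
X = rng.standard_normal((3, 4))
out, w = scaled_dot_product_attention(X, X, X)
print(out.shape)  # (3, 4): each word becomes a mixture of all words
```

Each output row is a context-aware blend of the whole sentence, which is the "relationships between words" the quote refers to.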
“…In this study, we employed the BiLSTM model [9] over BERT embeddings for contextualized word representation. BiLSTM can learn the context information and latent meaning of words by reading the input sentence in two directions [3]. Contextualized representations were created according to the following steps:…”
Section: B. Creating Contextualized Word Representations
confidence: 99%
“…Unlike basic grammar models, which mainly depend on statistical characteristics, BiLSTM is competent at learning context information by encoding the sentence in both directions. Therefore, BiLSTM can capture the real meaning hidden between words from their context [3]. GCNs can effectively learn graph representations and have obtained satisfactory results in various applications and tasks, particularly classification tasks [25].…”
Section: Introduction
confidence: 99%
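The GCN mentioned in the quote can be illustrated by its standard propagation rule, H' = ReLU(D^-1/2 Â D^-1/2 H W), where Â adds self-loops to the adjacency matrix. This is a generic single-layer sketch with an invented toy graph, not the architecture of the citing paper.

```python
import numpy as np

def gcn_layer(A, H, W):
    """One graph-convolution layer: H' = ReLU(D^-1/2 (A+I) D^-1/2 H W)."""
    A_hat = A + np.eye(A.shape[0])          # add self-loops
    d = A_hat.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))  # symmetric degree normalisation
    return np.maximum(D_inv_sqrt @ A_hat @ D_inv_sqrt @ H @ W, 0.0)

# toy graph: 4 nodes with 3 features each, projected to 2 hidden units
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 1],
              [0, 1, 0, 0],
              [0, 1, 0, 0]], dtype=float)
rng = np.random.default_rng(2)
H = rng.standard_normal((4, 3))
W = rng.standard_normal((3, 2))
H_next = gcn_layer(A, H, W)
print(H_next.shape)  # (4, 2): each node now mixes its neighbours' features
```

Stacking such layers lets each node's representation absorb information from progressively larger graph neighbourhoods, which is what makes GCNs effective for classification over graph-structured data.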
“…Because this symbol is different from the other words in the text and is independent of the input, it does not carry obvious semantic information of its own, so its output reflects the text more objectively and impartially. The output can effectively integrate the semantic information carried by each word in the input text and thus better represent the text [27]. The model structure of BERT is shown in Figure 1.…”
Section: BERT
confidence: 99%
“…BERT+BiLSTM+Att [27]: BERT dynamically converts the words into word vectors, so that the converted vectors are closer to their context. The BiLSTM network then extracts sentiment features from these word vectors, and the attention mechanism assigns different weights to the extracted feature vectors.…”
confidence: 99%
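The final stage of the BERT+BiLSTM+Att pipeline described above, attention weighting over the BiLSTM feature vectors, can be sketched as attention pooling: score each time step against a learned query vector, normalise the scores, and take the weighted sum as the sentence representation. The random matrices below stand in for BERT/BiLSTM outputs and learned parameters; they are assumptions of the sketch, not the paper's trained model.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def attention_pool(H, w):
    """Score each time step, normalise to weights, return the weighted sum."""
    scores = H @ w            # (T,) one relevance score per token
    alpha = softmax(scores)   # attention weights, sum to 1
    return alpha @ H, alpha   # pooled sentence vector and its weights

rng = np.random.default_rng(3)
T, d = 5, 12                        # 5 tokens, 12-dim BiLSTM states
H = rng.standard_normal((T, d))     # stand-in for BiLSTM outputs over BERT vectors
w = rng.standard_normal(d)          # learned attention query (assumed)
sent_vec, alpha = attention_pool(H, w)

W_out = rng.standard_normal((d, 2))  # linear head for 2 sentiment classes
logits = sent_vec @ W_out
print(logits.shape)  # (2,): scores for positive vs. negative sentiment
```

Because alpha is a proper distribution over time steps, tokens the model deems sentiment-bearing dominate the pooled vector fed to the classifier.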