2023
DOI: 10.1007/s10958-023-06519-6
Deep Learning for Natural Language Processing: A Survey

E. O. Arkhangelskaya,
S. I. Nikolenko
Cited by 5 publications (1 citation statement)
References 174 publications
“…This technology possesses robust capabilities in information extraction, text understanding, and semantic analysis, enabling it to identify and label crucial concepts and entities from unstructured text, providing a solid foundation for automatic semantic generation. Long short-term memory neural networks (LSTMs), which are a form of recurrent neural networks (RNNs), are widely used in NLP [18][19][20][21]. RNNs face issues such as gradient vanishing and exploding when dealing with long sequences, making it difficult to capture long-term dependencies.…”
Section: Related Work
confidence: 99%
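The quoted passage notes that plain RNNs suffer from vanishing and exploding gradients on long sequences, which is why LSTMs are preferred. A minimal sketch (an assumption for illustration, not code from the survey): backpropagating through a one-unit linear recurrence h_t = w * h_{t-1} multiplies the gradient by the recurrent weight w at every step, so after T steps the gradient scales as w**T.

```python
def recurrent_gradient(w: float, steps: int) -> float:
    """Gradient of the final state w.r.t. the initial state
    for the linear recurrence h_t = w * h_{t-1}.

    Backprop multiplies by w once per time step, so the
    result is w ** steps: it vanishes for |w| < 1 and
    explodes for |w| > 1, exactly the pathology the
    passage attributes to plain RNNs on long sequences.
    """
    grad = 1.0
    for _ in range(steps):
        grad *= w
    return grad


# |w| < 1: the gradient shrinks exponentially with sequence length.
vanishing = recurrent_gradient(0.9, 100)

# |w| > 1: the gradient grows exponentially instead.
exploding = recurrent_gradient(1.1, 100)
```

LSTM gates mitigate this by letting the cell state pass through near-identity updates, keeping the effective multiplier close to 1 across many steps.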