2021
DOI: 10.1007/978-981-15-9712-1_31
Natural Language Processing: History, Evolution, Application, and Future Work

Cited by 34 publications (15 citation statements)
References 12 publications
“…KeyBERT is based on BERT (Bidirectional Encoder Representations from Transformers) (Johri et al, 2021), published by Google in 2018, a model that uses a transformer (Wolf et al, 2019) architecture to map words, phrases and text into numerical vectors that capture their meaning. BERT is a bidirectional model, which makes it able to interpret a word based on the context of the sentence, regardless of whether the relevant information is to its left or right, unlike left-to-right architectures, which look only at the words preceding the one being processed (Devlin et al, 2018).…”
Section: KeyBERT
confidence: 99%
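The left-to-right vs. bidirectional distinction described above can be illustrated with a toy sketch (this is not BERT's actual implementation, only a minimal model of which context positions are visible under each masking scheme):

```python
# Toy illustration of context visibility: a left-to-right model encoding a
# token can only see the prefix up to that token, while a bidirectional
# model (BERT-style) sees the whole sentence at once.

def visible_positions(num_tokens: int, position: int, bidirectional: bool) -> list:
    """Indices of tokens available as context when encoding `position`."""
    if bidirectional:
        return list(range(num_tokens))   # full sentence, left and right
    return list(range(position + 1))     # only the token and its left context

tokens = ["the", "bank", "of", "the", "river"]
i = tokens.index("bank")
print(visible_positions(len(tokens), i, bidirectional=False))  # [0, 1]
print(visible_positions(len(tokens), i, bidirectional=True))   # [0, 1, 2, 3, 4]
```

Here the bidirectional model can use "river" (to the right of "bank") to disambiguate the word, which a left-to-right model cannot.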
“…Historically, NLP techniques were limited to heuristic, rule-based processes, which severely constrained their performance [7]. The introduction of machine learning methods, which shifted language processing from rule-based models to statistical inference, greatly increased the performance of NLP models.…”
Section: Natural Language Processing
confidence: 99%
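The rule-based vs. statistical contrast can be made concrete with a hypothetical toy task (not from the cited paper): deciding whether an English word is a plural noun, once by a brittle heuristic and once from corpus counts:

```python
# Hypothetical contrast: a heuristic rule vs. a simple statistical decision
# for tagging a word as a plural noun. Counts below are made up for
# illustration only.

def rule_based_is_plural(word: str) -> bool:
    # Heuristic rule: English plurals often end in "s" -- brittle ("bus" fails).
    return word.endswith("s")

def statistical_is_plural(word: str, counts: dict) -> bool:
    # Statistical inference from corpus counts: (seen plural, seen singular).
    plural, singular = counts.get(word, (0, 0))
    return plural > singular

corpus_counts = {"bus": (1, 40), "dogs": (35, 0)}
print(rule_based_is_plural("bus"))                   # True (wrong)
print(statistical_is_plural("bus", corpus_counts))   # False (data corrects it)
```

The statistical version improves with more data, which is the shift the quoted passage describes.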
“…Our candidate timelines become Gaussian intelligence curves, each centered on a sensor's collection time, with heights set by the sensor recommendation scores and widths set by historical values. Additionally, an intelligence threshold, V_PS, is selected for the current ISR need based on historically similar needs [7]. We make a simplifying assumption that collections from multiple sensors of the same type (i.e.…”
Section: Persistent Intelligence Satisfaction
confidence: 99%
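A minimal sketch of the scheme the quote describes, under stated assumptions: each candidate collection contributes a Gaussian curve centered on its collection time t_c, with peak height equal to the sensor recommendation score and width sigma drawn from historical values; V_PS is the threshold. The combination rule across sensors (here, taking the maximum) is an assumption, since this excerpt does not specify it:

```python
import math

def intelligence(t: float, t_c: float, score: float, sigma: float) -> float:
    # Gaussian intelligence curve: peak = score at t = t_c, width = sigma.
    return score * math.exp(-((t - t_c) ** 2) / (2.0 * sigma ** 2))

def satisfies(t: float, curves: list, v_ps: float) -> bool:
    # Assumed combination rule: the best single contribution must clear V_PS.
    return max(intelligence(t, t_c, s, w) for t_c, s, w in curves) >= v_ps

# Hypothetical (t_c, score, sigma) triples for two candidate collections.
curves = [(10.0, 0.9, 2.0), (18.0, 0.6, 3.0)]
print(satisfies(10.5, curves, v_ps=0.5))  # near the t=10 peak, threshold is met
```

At the center of a curve the value equals the recommendation score, and it decays symmetrically as time moves away from the collection time.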
“…The first task solved by the earliest computers in the formative stage of NLP was machine translation, i.e., the automatic translation of text from one language to another by computer. This problem was successfully solved and put into practical use in the mid-1950s for the "Russian-English" language pair [2].…”
Section: Informatics, Computer Engineering, and Control (translated from Russian)
confidence: 99%