2021
DOI: 10.3390/sym13091742

MRE: A Military Relation Extraction Model Based on BiGRU and Multi-Head Attention

Abstract: A great deal of operational information exists in the form of text. Therefore, extracting operational information from unstructured military text is of great significance for assisting command decision making and operations. Military relation extraction is one of the main tasks of military information extraction, which aims at identifying the relation between two named entities from unstructured military texts. However, the traditional methods of extracting military relations cannot easily resolve problems suc…

Cited by 12 publications (4 citation statements)
References 24 publications (31 reference statements)
“…The term "Bi-directional" implies that the model processes information in both forward and backward directions. Specifically, BiGRU consists of two independent GRU layers: one processing the sequence in chronological order (forward GRU), and the other in reverse chronological order (backward GRU) [22]. The output at each time step is a concatenation of the outputs from the forward and backward GRUs, thereby incorporating both past and future contexts at each time step [23].…”
Section: BERT-BiGRU
confidence: 99%
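To make the concatenation concrete, here is a minimal PyTorch sketch (the class name and dimensions are illustrative assumptions; `nn.GRU` with `bidirectional=True` implements exactly the forward/backward pair the statement describes):

```python
import torch
import torch.nn as nn

class BiGRUEncoder(nn.Module):
    """Runs a forward GRU and a backward GRU over the same sequence and
    concatenates their hidden states at every time step."""
    def __init__(self, input_dim: int, hidden_dim: int):
        super().__init__()
        self.bigru = nn.GRU(input_dim, hidden_dim,
                            batch_first=True, bidirectional=True)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, input_dim)
        out, _ = self.bigru(x)
        # out: (batch, seq_len, 2 * hidden_dim) -- the forward and backward
        # states are concatenated per time step, so each position carries
        # both past and future context.
        return out

encoder = BiGRUEncoder(input_dim=128, hidden_dim=64)
h = encoder(torch.randn(2, 10, 128))  # 2 sentences, 10 tokens each
print(h.shape)                        # torch.Size([2, 10, 128])
```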
“…Then, a bidirectional gated recurrent unit network is used to capture the contextual information of the input sentences, and the attention mechanism is then used to focus on the most relevant parts of the input sentences for event detection. Lu et al. (2021b) propose a military RE model using a combination of BiGRU and multi-head attention mechanisms. The method adopts a BiGRU-based NN to encode the input sentences and generate contextualized word representations.…”
Section: Military
confidence: 99%
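A hedged sketch of that pipeline, assuming mean-pooling over the attended states and illustrative layer sizes (the excerpt does not show the paper's exact architecture or hyperparameters):

```python
import torch
import torch.nn as nn

class RelationExtractor(nn.Module):
    """BiGRU encoder + multi-head self-attention + relation classifier:
    a plausible reading of the pipeline described above, not the
    authors' exact implementation."""
    def __init__(self, vocab_size: int, emb_dim: int = 128,
                 hidden_dim: int = 64, num_heads: int = 4,
                 num_relations: int = 10):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.bigru = nn.GRU(emb_dim, hidden_dim,
                            batch_first=True, bidirectional=True)
        self.attn = nn.MultiheadAttention(2 * hidden_dim, num_heads,
                                          batch_first=True)
        self.classifier = nn.Linear(2 * hidden_dim, num_relations)

    def forward(self, token_ids: torch.Tensor) -> torch.Tensor:
        x = self.embed(token_ids)       # (batch, seq, emb_dim)
        h, _ = self.bigru(x)            # contextualized word representations
        a, _ = self.attn(h, h, h)       # self-attention over the sequence
        pooled = a.mean(dim=1)          # sentence-level vector (assumption)
        return self.classifier(pooled)  # logits over relation types

model = RelationExtractor(vocab_size=5000)
logits = model(torch.randint(0, 5000, (2, 12)))
print(logits.shape)  # torch.Size([2, 10])
```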
“…The weight is denoted by W, and the element-wise product is denoted by •. The BGRUs offer a mechanism to maximize the information processed [30,31]. It consists of concatenating the forward hidden layer and the backward hidden layer, which gives an output H_t (Formula (7)).…”
Section: Bidirectional GRU
confidence: 99%
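Written out, the statement's Formula (7) is the per-step concatenation of the two directional hidden states; the GRU gate equations below are the standard formulation and are included as an assumption, since the excerpt names only the weight W and the element-wise product:

```latex
% Standard GRU update (assumed form), with \odot the element-wise product
% and W_*, U_*, b_* learned parameters:
\begin{aligned}
z_t &= \sigma(W_z x_t + U_z h_{t-1} + b_z)                    \\ % update gate
r_t &= \sigma(W_r x_t + U_r h_{t-1} + b_r)                    \\ % reset gate
\tilde{h}_t &= \tanh(W_h x_t + U_h (r_t \odot h_{t-1}) + b_h) \\ % candidate
h_t &= (1 - z_t) \odot h_{t-1} + z_t \odot \tilde{h}_t
\end{aligned}
% Formula (7): concatenate the forward and backward hidden states
H_t = \left[\overrightarrow{h_t} \,;\, \overleftarrow{h_t}\right]
```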