2022 IEEE International Conference on Mechatronics and Automation (ICMA)
DOI: 10.1109/icma54519.2022.9856400

Feature Fusion Transformer Network for Natural Language Inference

Cited by 1 publication (1 citation statement); references 5 publications.
“…Concatenate matching solves the multiple-premise entailment (MPE) task by transforming it into a single-premise task, concatenating the multiple premises into one premise. Recently, an inference network model based on the Transformer architecture was proposed in [27]. The model retains the non-local feature-extraction advantages of the self-attention mechanism and combines it with convolution to strengthen attention to local features, effectively fusing local and non-local features.…”
Section: Related Work
confidence: 99%
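To make the fusion idea in the statement above concrete, here is a minimal sketch, assuming a PyTorch implementation: a self-attention branch supplies non-local features, a 1-D convolution branch supplies local features, and the two are fused by summation with a residual connection. The class name LocalGlobalFusionBlock, the sum-based fusion, and all hyperparameters are hypothetical illustrations, not details taken from [27].

# Minimal sketch of combining self-attention (non-local features) with
# convolution (local features). All names and choices here are assumptions
# for illustration, not the architecture from [27].
import torch
import torch.nn as nn

class LocalGlobalFusionBlock(nn.Module):
    def __init__(self, dim: int, num_heads: int = 4, kernel_size: int = 3):
        super().__init__()
        # Non-local branch: standard multi-head self-attention.
        self.attn = nn.MultiheadAttention(dim, num_heads, batch_first=True)
        # Local branch: 1-D convolution over the token sequence.
        self.conv = nn.Conv1d(dim, dim, kernel_size, padding=kernel_size // 2)
        self.norm = nn.LayerNorm(dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, dim)
        global_feats, _ = self.attn(x, x, x)  # non-local features
        # Conv1d expects (batch, dim, seq_len), so transpose in and out.
        local_feats = self.conv(x.transpose(1, 2)).transpose(1, 2)
        # Fuse by summation with a residual connection (one simple choice).
        return self.norm(x + global_feats + local_feats)

if __name__ == "__main__":
    block = LocalGlobalFusionBlock(dim=64)
    tokens = torch.randn(2, 16, 64)  # (batch, seq_len, dim)
    print(block(tokens).shape)       # torch.Size([2, 16, 64])

Concatenating the two branches followed by a linear projection would be an equally plausible fusion choice; summation simply keeps the sketch short.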