2021 IEEE International Conference on Systems, Man, and Cybernetics (SMC)
DOI: 10.1109/smc52423.2021.9658632

Four-way Bidirectional Attention for Multiple-choice Reading Comprehension

Cited by 7 publications (7 citation statements)
References 11 publications
“…Multiway attention (Wang et al., 2018a; Zhu et al., 2018) has been shown to be effective in capturing the interactions between each pair of the input paragraph, question, and candidate answers, leading to better context interpretation, while BERT fine-tuning (Devlin et al., 2018) also shows a prominent ability in commonsense inference. To further enhance the context-understanding ability of BERT fine-tuning, we perform multiway bidirectional attention over the BERT encoding output.…”
Section: BERT with Multiway Attention
confidence: 99%
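To make the quoted idea of "multiway bidirectional attention over the BERT encoding output" concrete, here is a minimal PyTorch sketch of one bidirectional attention step between two encoded segments (e.g., passage and question). The function name, dot-product scoring, and shapes are illustrative assumptions, not the paper's exact formulation; a four-way model would apply this step to each segment pair and fuse the attended views with the original BERT outputs.

```python
# A minimal sketch, assuming dot-product scoring and BERT-sized encodings.
# `bidirectional_attention` and the segment shapes are hypothetical,
# not the paper's actual implementation.
import torch
import torch.nn.functional as F


def bidirectional_attention(h_a: torch.Tensor, h_b: torch.Tensor):
    """Attend segment A over segment B and vice versa.

    h_a: (batch, len_a, dim) encoding of segment A (e.g., passage)
    h_b: (batch, len_b, dim) encoding of segment B (e.g., question)
    """
    # Token-pair similarity matrix: (batch, len_a, len_b)
    scores = torch.bmm(h_a, h_b.transpose(1, 2))
    # A -> B: each A token gets a softmax-weighted summary of B
    a_attends_b = torch.bmm(F.softmax(scores, dim=2), h_b)
    # B -> A: each B token gets a softmax-weighted summary of A
    b_attends_a = torch.bmm(F.softmax(scores, dim=1).transpose(1, 2), h_a)
    return a_attends_b, b_attends_a


# Example with a passage/question pair; a four-way model would repeat this
# for every segment pair (passage-question, passage-option, question-option).
passage = torch.randn(2, 50, 768)
question = torch.randn(2, 12, 768)
p_view, q_view = bidirectional_attention(passage, question)
```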
“…Yet in a task like Social IQA, there is a distinct separation of context, question, and answer, and we would like to emphasize the differences between these segments. Previous work has handled this property with different architectures, usually using complex cross-segment (also called multiway) attention blocks [2,21,22,23].…”
Section: Cross-segment Attention
confidence: 99%
“…and model the semantic relationships among paragraph, question, and candidate options from multiple aspects of matching. Zhu et al. (2018a) propose a hierarchical attention flow model, which leverages candidate options to capture the interactions among paragraph, question, and candidate options. merge various attentions to fully extract the mutual information among the paragraph, question, and options and form the enriched representations.…”
Section: Multi-choice Reading Comprehension
confidence: 99%
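The option-aware "attention flow" idea quoted above can be sketched in the same style: a pooled option representation queries the passage, so each candidate option gets its own passage summary. This is a hedged sketch under assumed shapes and mean pooling, not Zhu et al.'s actual architecture.

```python
# A minimal sketch, assuming mean-pooled option queries; names and shapes
# are illustrative, not the hierarchical attention flow model itself.
import torch
import torch.nn.functional as F


def option_aware_summary(passage: torch.Tensor, option: torch.Tensor) -> torch.Tensor:
    """Summarize the passage conditioned on one candidate option.

    passage: (batch, len_p, dim), option: (batch, len_o, dim)
    returns: (batch, dim) passage summary specific to this option
    """
    query = option.mean(dim=1)                                  # (batch, dim)
    scores = torch.bmm(passage, query.unsqueeze(2)).squeeze(2)  # (batch, len_p)
    weights = F.softmax(scores, dim=1)                          # attention over passage
    return torch.bmm(weights.unsqueeze(1), passage).squeeze(1)  # (batch, dim)
```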