Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing, 2021
DOI: 10.18653/v1/2021.acl-long.358
Recursive Tree-Structured Self-Attention for Answer Sentence Selection

Abstract: Syntactic structure is an important component of natural language text. Recent top-performing models in Answer Sentence Selection (AS2) use self-attention and transfer learning, but not syntactic structure. Tree structures have shown strong performance in tasks with sentence-pair input like semantic relatedness. We investigate whether tree structures can boost performance in AS2. We introduce the Tree Aggregation Transformer: a novel recursive, tree-structured self-attention model for AS2. The recursive nature …
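The abstract only gestures at the mechanism, but the general pattern of recursive, tree-structured self-attention can be sketched: each internal node of a parse tree computes its representation by attending over the recursively computed representations of its children, bottom-up, until the root yields a sentence vector. The PyTorch sketch below illustrates that general pattern only; it is not the paper's Tree Aggregation Transformer. The class names (TreeNode, TreeAggregator), the single shared attention layer, the dimensions, and the mean-pooling readout are all illustrative assumptions.

```python
# Hypothetical sketch of recursive tree-structured self-attention,
# NOT the paper's released model. All names and hyperparameters are
# illustrative assumptions.
import torch
import torch.nn as nn


class TreeNode:
    def __init__(self, embedding=None, children=()):
        self.embedding = embedding      # leaf token embedding, shape (d,)
        self.children = list(children)  # internal nodes aggregate children


class TreeAggregator(nn.Module):
    def __init__(self, d_model=64, n_heads=4):
        super().__init__()
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.norm = nn.LayerNorm(d_model)

    def forward(self, node: TreeNode) -> torch.Tensor:
        # Leaves contribute their token embedding directly.
        if not node.children:
            return node.embedding
        # Recurse: encode each child subtree first (bottom-up traversal).
        child_vecs = torch.stack([self(c) for c in node.children])  # (k, d)
        x = child_vecs.unsqueeze(0)                                 # (1, k, d)
        # Self-attention over the children, then mean-pool to one vector.
        attended, _ = self.attn(x, x, x)
        return self.norm(attended.mean(dim=1)).squeeze(0)           # (d,)


if __name__ == "__main__":
    d = 64
    leaf = lambda: TreeNode(embedding=torch.randn(d))
    # A tiny tree: a root spanning a nested phrase and a single token.
    tree = TreeNode(children=[TreeNode(children=[leaf(), leaf()]), leaf()])
    sentence_vec = TreeAggregator(d_model=d)(tree)
    print(sentence_vec.shape)  # torch.Size([64])
```

In an AS2 setting one would presumably run this kind of aggregation over the parse trees of both the question and the candidate answer and score the resulting pair of vectors; those details belong to the paper, not to this sketch.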

Cited by 5 publications (2 citation statements)
References 34 publications (35 reference statements)
“…There is much prior work that induces, operates over, or otherwise uses a tree structure in neural network models (Socher et al., 2013a; Tai et al., 2015; Le and Zuidema, 2015; Dyer et al., 2016; Bradbury and Socher, 2017; Choi et al., 2017, 2018; Drozdov et al., 2019; Ahmed et al., 2019; Wang et al., 2019; Mrini et al., 2021; Hu et al., 2021; Sartran et al., 2022). Such models are especially of interest due to the prevalence of trees in natural language.…”
Section: Related Work
confidence: 99%
“…Language models have now become ubiquitous in NLP (Devlin et al., 2019; Liu et al., 2019b; Alsentzer et al., 2019), pushing the state of the art in a variety of tasks (Strubell et al., 2018; Liu et al., 2019a; Mrini et al., 2021). While language models capture meaning and various linguistic properties of text (Jawahar et al., 2019; Yenicelik et al., 2020), an individual's written text can include highly sensitive information.…”
Section: Introduction
confidence: 99%