2022
DOI: 10.48550/arxiv.2205.12496
Preprint
Teaching Broad Reasoning Skills via Decomposition-Guided Contexts

Abstract: Question-answering datasets require a broad set of reasoning skills. We show how to use question decompositions to teach language models these broad reasoning skills in a robust fashion. Specifically, we use widely available QDMR representations to programmatically create synthetic contexts for real questions in six multihop reasoning datasets. These contexts are carefully designed to avoid common reasoning shortcuts prevalent in real contexts that prevent models from learning the right skills. This results in…


Cited by 1 publication
References 32 publications (51 reference statements)