2022
DOI: 10.48550/arxiv.2203.10741
Preprint
HIBRIDS: Attention with Hierarchical Biases for Structure-aware Long Document Summarization

Abstract: Document structure is critical for efficient information consumption. However, it is challenging to encode it efficiently into the modern Transformer architecture. In this work, we present HIBRIDS, which injects Hierarchical Biases foR Incorporating Document Structure into the calculation of attention scores. We further present a new task, hierarchical question-summary generation, for summarizing salient content in the source document into a hierarchy of questions and summaries, where each follow-up question in…

Cited by 1 publication (1 citation statement); references 22 publications.
“…Miculicich and Han (2022) propose a two-stage method which detects text segments and incorporates this information in an extractive summarization model. Cao and Wang (2022) collect a new dataset for long and structure-aware document summarization, consisting of 21k documents written in English and extracted from WikiProject Biography.…”
Section: Introduction
Confidence: 99%