2020
DOI: 10.1088/1742-6596/1616/1/012078
Improving Abstractive Summarization via Dilated Convolution

Abstract: In this paper, a sequence-to-sequence based hybrid neural network model is proposed for abstractive summarization. Our method utilizes Bi-directional Long Short-Term Memory (Bi-LSTM) and multi-level dilated convolutions (MDC) to capture global semantic information and semantic-unit-level information, respectively. In the decoding phase, our model generates words according to summary-relevant information captured by an attention mechanism. Experiments show that the proposed model outperforms several strong basel…
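The MDC component described in the abstract builds on 1-D dilated convolutions, which read input taps spaced `dilation` steps apart. Below is a minimal numpy sketch of a causal 1-D dilated convolution; the left zero-padding scheme and kernel values are illustrative assumptions, not the paper's exact configuration.

```python
import numpy as np

def dilated_conv1d(x, kernel, dilation):
    """Causal 1-D dilated convolution over a sequence x.

    Taps are spaced `dilation` steps apart, so stacking layers with
    growing dilation rates widens the receptive field rapidly.
    NOTE: illustrative sketch, not the paper's MDC implementation.
    """
    k = len(kernel)
    pad = (k - 1) * dilation
    # left-pad with zeros so the output has the same length as x
    xp = np.concatenate([np.zeros(pad), np.asarray(x, dtype=float)])
    out = np.zeros(len(xp) - pad)
    for t in range(len(out)):
        # kernel[0] reads the current step, kernel[i] reads i*dilation steps back
        out[t] = sum(kernel[i] * xp[t + pad - i * dilation] for i in range(k))
    return out

# With kernel [1, 1] and dilation 2, each output sums the current
# input and the input two steps earlier.
print(dilated_conv1d([1, 2, 3, 4, 5], [1.0, 1.0], 2))  # [1. 2. 4. 6. 8.]
```

In a multi-level stack, successive layers would typically use increasing dilation rates (e.g. 1, 2, 4, …) so that higher layers aggregate semantic-unit-level spans without pooling.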

Cited by 1 publication (1 citation statement)
References 10 publications
“…Dilated CNNs have the ability to significantly grow receptive fields without affecting resolution or coverage [23]. Dilated CNNs have lately achieved remarkable success in image segmentation [24], text classification, and text-to-speech [25,26]. This study's main goal is to accurately identify web attacks using the full text of HTTP requests while keeping the preprocessing phase as simple as possible.…”
Section: Limits of Prior ATRs (mentioning)
Confidence: 99%
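The receptive-field claim in the citation statement can be made concrete with a small calculation: for a stack of dilated convolutions, the receptive field is 1 + Σ (kernel_size − 1) · dᵢ over the layer dilations dᵢ, so exponentially increasing dilations yield exponential receptive-field growth with only linearly many parameters. A sketch (the kernel size and dilation schedule below are illustrative choices):

```python
def receptive_field(kernel_size, dilations):
    """Receptive field of stacked 1-D dilated convolutions.

    RF = 1 + sum((kernel_size - 1) * d) over each layer's dilation d.
    """
    return 1 + sum((kernel_size - 1) * d for d in dilations)

# Four layers, kernel size 3, dilations doubling each layer:
# RF = 1 + 2*(1 + 2 + 4 + 8) = 31 input positions.
print(receptive_field(3, [1, 2, 4, 8]))  # 31
```

By contrast, four undilated layers (all dilations 1) would cover only 1 + 2·4 = 9 positions, which is why dilation is attractive for long inputs such as full HTTP request texts.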