Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing
DOI: 10.18653/v1/2021.emnlp-main.351
Structural Adapters in Pretrained Language Models for AMR-to-Text Generation

Abstract: Pretrained language models (PLMs) have recently advanced graph-to-text generation, where the input graph is linearized into a sequence and fed into the PLM to obtain its representation. However, efficiently encoding the graph structure in PLMs is challenging because such models were pretrained on natural language, and modeling structured data may lead to catastrophic forgetting of distributional knowledge. In this paper, we propose STRUCTADAPT, an adapter method to encode graph structure into PLMs. Contrary to …
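As a rough illustration of the idea in the abstract, the sketch below adds a small graph-convolutional adapter on top of a frozen transformer layer's output. This is a minimal sketch in PyTorch, assuming a row-normalized adjacency matrix as input; the class and argument names (StructAdapterLayer, adj, bottleneck) are illustrative assumptions, not the paper's actual code.

```python
# Minimal sketch of a structure-aware adapter layer, assuming a
# PyTorch-style transformer encoder. Names are illustrative, not
# the paper's implementation.
import torch
import torch.nn as nn


class StructAdapterLayer(nn.Module):
    """Adapter whose bottleneck is followed by a simple graph
    convolution over the token graph, so only these small modules
    need training while the PLM stays frozen."""

    def __init__(self, hidden_size: int, bottleneck: int):
        super().__init__()
        self.norm = nn.LayerNorm(hidden_size)
        self.down = nn.Linear(hidden_size, bottleneck)
        self.graph_weight = nn.Linear(bottleneck, bottleneck)
        self.up = nn.Linear(bottleneck, hidden_size)

    def forward(self, hidden: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        # hidden: (batch, seq_len, hidden_size) transformer output
        # adj:    (batch, seq_len, seq_len) row-normalized adjacency
        h = self.down(self.norm(hidden))
        h = torch.relu(self.graph_weight(adj @ h))  # aggregate graph neighbors
        return hidden + self.up(h)  # residual keeps the PLM output intact
```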

Cited by 27 publications (13 citation statements)
References 47 publications
“…Nonetheless, adjacent nodes in the graph may be at multiple positions away from one another in the final serialization. To counteract this, Ribeiro et al. (2021b) introduced StructAdapt, a structure-aware (encoder) adapter. It solves the problem of segmented node labels by reconstructing a new graph from the resulting subwords.…”
Section: StructAdapt: A Structural Adapter
Confidence: 99%
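To make the subword-graph reconstruction concrete, here is a hedged sketch, assuming each node label may be tokenized into several subwords; the function name subword_graph and the data layout are illustrative assumptions, not the authors' implementation.

```python
# Sketch of projecting node-level edges onto subword tokens when a
# node label is split into several subwords.
from typing import Dict, List, Tuple


def subword_graph(nodes: Dict[str, List[str]],
                  edges: List[Tuple[str, str]]) -> List[Tuple[int, int]]:
    """Map a node-level graph to a token-level graph: subwords of one
    node form a chain, and an edge (u, v) links the subword spans."""
    spans: Dict[str, List[int]] = {}
    offset = 0
    for node, subwords in nodes.items():
        spans[node] = list(range(offset, offset + len(subwords)))
        offset += len(subwords)

    token_edges = []
    for ids in spans.values():                       # connect subwords of a node
        token_edges += [(a, b) for a, b in zip(ids, ids[1:])]
    for u, v in edges:                               # project original edges
        token_edges += [(spans[u][0], spans[v][0])]  # link first subwords
    return token_edges


# Example: "recognize" splits into two subwords
print(subword_graph({"w": ["recog", "##nize"], "he": ["he"]}, [("w", "he")]))
# [(0, 1), (0, 2)]
```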
“…GAT is akin to GCN but differs in that the aggregation of neighbor embeddings is weighted using an attention mechanism. Unlike GAT and GCN, RGCN further captures the type of the relation between nodes (Ribeiro et al., 2021b). The details of the representation computation for each model can be found in Appendix A.…”
Section: StructAdapt: A Structural Adapter
Confidence: 99%
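The contrast this excerpt draws can be made concrete with a per-relation aggregation layer. The following is a minimal RGCN-style sketch in plain PyTorch; the dense (num_relations, num_nodes, num_nodes) adjacency layout is an assumption chosen for brevity, not how any particular graph library stores edges.

```python
# Illustrative RGCN layer: unlike GCN/GAT, each relation type gets
# its own weight matrix when aggregating neighbor embeddings.
import torch
import torch.nn as nn


class RGCNLayer(nn.Module):
    def __init__(self, dim: int, num_relations: int):
        super().__init__()
        self.rel_weights = nn.ModuleList(
            nn.Linear(dim, dim, bias=False) for _ in range(num_relations))
        self.self_loop = nn.Linear(dim, dim, bias=False)

    def forward(self, h: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        # h:   (num_nodes, dim) node embeddings
        # adj: (num_relations, num_nodes, num_nodes), row-normalized
        out = self.self_loop(h)
        for r, w in enumerate(self.rel_weights):
            out = out + adj[r] @ w(h)  # aggregate neighbors per relation type
        return torch.relu(out)
```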
“…We also experiment with adapters, since they allow for effective transfer learning (Ribeiro et al., 2021). For this model, the intention is that the adapter parameters can learn the linearised graph representation, while the parameters of the model that hold the distributed knowledge from pre-training are not changed.…”
Section: Encoding Linearised AMR with a PLM
Confidence: 99%
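A minimal sketch of the training regime this excerpt describes, assuming adapter modules are registered with "adapter" in their parameter names (a naming convention assumed here, not something the excerpt specifies): freeze everything else and pass only trainable parameters to the optimizer.

```python
# Freeze every pretrained weight; update only adapter parameters.
import torch


def freeze_except_adapters(model: torch.nn.Module) -> None:
    for name, param in model.named_parameters():
        # Assumed convention: adapter modules carry "adapter" in their name.
        param.requires_grad = "adapter" in name


# Only adapter parameters reach the optimizer:
# optimizer = torch.optim.AdamW(
#     (p for p in model.parameters() if p.requires_grad), lr=1e-4)
```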
“…While previous work has integrated graphs into neural models for NLP tasks, adding additional neural architectures to PLMs can be non-trivial, as training a graph network without compromising the original architecture of PLMs can be challenging (Ribeiro et al., 2021). Converting AMR graphs directly into text sequences and appending them can be natural, but leads to excessively long sequences, exceeding the maximum processing length.…”
Section: Introduction
Confidence: 99%
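A toy example of the linearization this excerpt refers to, assuming a nested-tuple encoding of an AMR graph; real systems use PENMAN notation, and on deep graphs the serialized string quickly outgrows a PLM's maximum input length.

```python
# Depth-first, PENMAN-style serialization of a tiny AMR-like graph.
def linearize(node) -> str:
    """Serialize (variable, concept, edges); edges may point to
    nested nodes or to already-introduced variables (re-entrancy)."""
    var, concept, edges = node
    parts = [f"( {var} / {concept}"]
    for role, child in edges:
        target = linearize(child) if isinstance(child, tuple) else child
        parts.append(f":{role} {target}")
    return " ".join(parts) + " )"


# "He wants to eat": already a fairly long token sequence.
amr = ("w", "want-01",
       [("ARG0", ("h", "he", [])),
        ("ARG1", ("e", "eat-01", [("ARG0", "h")]))])
print(linearize(amr))
# ( w / want-01 :ARG0 ( h / he ) :ARG1 ( e / eat-01 :ARG0 h ) )
```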