2022
DOI: 10.48550/arxiv.2201.05966
Preprint

UnifiedSKG: Unifying and Multi-Tasking Structured Knowledge Grounding with Text-to-Text Language Models

Abstract: Structured knowledge grounding (SKG) leverages structured knowledge to complete user requests, such as semantic parsing over databases and question answering over knowledge bases. Since the inputs and outputs of SKG tasks are heterogeneous, they have been studied separately by different communities, which limits systematic and compatible research on SKG. In this paper, we overcome this limitation by proposing the UNIFIEDSKG framework, which unifies 21 SKG tasks into a text-to-text format, aiming to promote sys…

Cited by 33 publications (49 citation statements) · References 24 publications
“…Retrieval-augmented LMs. Several works (Lewis et al., 2020b; Xie et al., 2022) introduce a retrieval module for LMs, where given an anchor text (e.g. question), retrieved text is added to the same LM context to improve model inference (e.g.…”
Section: Related Work (mentioning)
confidence: 99%
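The retrieval-augmented pattern described in this statement (retrieved text prepended to the LM context alongside the anchor question) can be sketched as follows; the toy overlap-based retriever, the two-passage corpus, and the t5-small checkpoint are illustrative assumptions, not the setups used in the cited works.

```python
# Minimal sketch of retrieval-augmented inference: retrieved passages are
# concatenated with the question before being fed to a text-to-text LM.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

# Toy corpus; real systems retrieve from large document collections.
corpus = [
    "UnifiedSKG unifies 21 structured knowledge grounding tasks.",
    "T5 is a text-to-text transformer pre-trained on a web corpus.",
]

def retrieve(question: str, k: int = 1) -> list[str]:
    # Toy lexical-overlap retriever; real systems use dense retrievers.
    def overlap(passage: str) -> int:
        return len(set(question.lower().split()) & set(passage.lower().split()))
    return sorted(corpus, key=overlap, reverse=True)[:k]

tokenizer = AutoTokenizer.from_pretrained("t5-small")
model = AutoModelForSeq2SeqLM.from_pretrained("t5-small")

question = "How many tasks does UnifiedSKG unify?"
context = " ".join(retrieve(question))
prompt = f"question: {question} context: {context}"  # retrieved text shares the LM context

inputs = tokenizer(prompt, return_tensors="pt")
output_ids = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```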
“…TableGPT [46] distinctly adapts a template-based table serialization approach for relatively simple tables. Experiments conducted by UnifiedSKG [97] show that putting external text (like questions) ahead of tables helps T5 to generalise better on tabular tasks. [66] directly encodes markup languages like NL.…”
Section: Tabular Sequence Serialization (mentioning)
confidence: 99%
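A rough sketch of the question-first serialization discussed in this statement: the external text is placed ahead of the linearized table before both are fed to a text-to-text model. The delimiter tokens ("col :", "row i :") are an assumed, TaPEx-style convention rather than the exact format used by UnifiedSKG or TableGPT.

```python
# Sketch of a question-first table linearization for a text-to-text model.
def serialize(question: str, header: list[str], rows: list[list[str]]) -> str:
    # Question comes first so the encoder sees the request before the structure.
    parts = [question, "col : " + " | ".join(header)]
    for i, row in enumerate(rows, start=1):
        parts.append(f"row {i} : " + " | ".join(row))
    return " ".join(parts)

print(serialize(
    "Which city hosted the 2012 Olympics?",
    ["year", "city"],
    [["2008", "Beijing"], ["2012", "London"]],
))
# -> "Which city hosted the 2012 Olympics? col : year | city row 1 : 2008 | Beijing row 2 : 2012 | London"
```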
“…RPT [87] also adopts the encoder-decoder model, similar to a BERT [30] model combined with a GPT-3 [10] model. TaPEx [69] and UnifiedSKG [97] implement the encoder-decoder (text-to-text) model based on BART [64] and T5 [80], respectively, for downstream tasks. [46] directly fine-tunes a pre-trained GPT-2 decoder to take advantage of its contextual knowledge learned from linguistic corpora.…”
Section: Encoder-decoder (mentioning)
confidence: 99%
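As a minimal illustration of the encoder-decoder (text-to-text) setup these works build on, the snippet below runs one supervised step with an off-the-shelf T5 checkpoint; the checkpoint name and toy example are assumptions for illustration, not the fine-tuning recipes of TaPEx, UnifiedSKG, or RPT.

```python
# One text-to-text training step with a T5-style encoder-decoder:
# the serialized input is encoded, the answer string is the decoding target.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("t5-small")
model = AutoModelForSeq2SeqLM.from_pretrained("t5-small")

source = "Which city hosted the 2012 Olympics? col : year | city row 1 : 2012 | London"
target = "London"

batch = tokenizer(source, return_tensors="pt")
labels = tokenizer(target, return_tensors="pt").input_ids

outputs = model(**batch, labels=labels)   # teacher-forced forward pass
outputs.loss.backward()                   # cross-entropy over target tokens
```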
“…There are two series of related works: invasive methods and non-invasive methods. Invasive methods, which are built on a strong assumption that the inner structure (e.g., self-attention and feedforward layers) of the PLM can be modified, include Prefix-Tuning (Li and Liang, 2021), BitFit (Ben Zaken et al., 2021), Child-Tuning, P-Tuning v2 (Liu et al., 2021b), LoRA (Hu et al., 2021), UnifiedSKG (Xie et al., 2022), and Adapter-based models (Rebuffi et al., 2017; Houlsby et al., 2019; Lin et al., 2020; He et al., 2021; Pfeiffer et al., 2021). Non-invasive methods, which only modify input embeddings and regard the inner structure as a black box, are mostly prompting methods (including our Input-Tuning).…”
Section: Related Work (mentioning)
confidence: 99%
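To make the invasive vs. non-invasive distinction in this statement concrete, here is a minimal sketch of a prompt-style (non-invasive) method: trainable embeddings are prepended to the input of a frozen backbone whose inner layers are never modified. The class name, dimensions, and the assumption that the backbone accepts embedding inputs directly are illustrative, not the exact Input-Tuning or Prefix-Tuning implementations.

```python
# Sketch of non-invasive prompt tuning: trainable embeddings are prepended to
# the input embeddings of a frozen LM, without touching its inner structure.
import torch
import torch.nn as nn

class PromptTunedLM(nn.Module):
    def __init__(self, base_lm: nn.Module, embed_dim: int, prompt_len: int = 20):
        super().__init__()
        self.base_lm = base_lm
        for p in self.base_lm.parameters():
            p.requires_grad = False          # the backbone stays frozen
        # Only these prompt embeddings receive gradients.
        self.prompt = nn.Parameter(torch.randn(prompt_len, embed_dim) * 0.02)

    def forward(self, input_embeds: torch.Tensor):
        # input_embeds: (batch, seq_len, embed_dim); base_lm is assumed to
        # accept input embeddings directly.
        batch = input_embeds.size(0)
        prefix = self.prompt.unsqueeze(0).expand(batch, -1, -1)
        return self.base_lm(torch.cat([prefix, input_embeds], dim=1))
```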