Proceedings of the 28th ACM SIGKDD Conference on Knowledge Discovery and Data Mining 2022
DOI: 10.1145/3534678.3539294
Semantic Enhanced Text-to-SQL Parsing via Iteratively Learning Schema Linking Graph

Abstract: The generalizability to new databases is of vital importance to Text-to-SQL systems, which aim to parse human utterances into SQL statements. Existing works achieve this goal by leveraging the exact matching method to identify the lexical matching between question words and schema items. However, these methods fail in more challenging scenarios, such as synonym substitution, in which the surface form differs between corresponding question words and schema items. In this paper, we propose a frame…
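The exact-matching schema linking the abstract refers to can be sketched as below. This is a hedged illustration, not the paper's implementation; the function name `link_schema` and the token/item inputs are assumptions for the example.

```python
# Illustrative sketch of exact-match schema linking: pair question tokens
# with schema items whose surface forms coincide after lower-casing.
def link_schema(question_tokens, schema_items):
    """Return (token, schema_item) pairs whose surface forms match exactly."""
    links = []
    for token in question_tokens:
        for item in schema_items:
            # Pure lexical comparison: this is the step that breaks under
            # synonym substitution (e.g. "salary" vs. a column named "pay").
            if token.lower() == item.lower():
                links.append((token, item))
    return links

# "singer" matches the schema item, but "salary" finds nothing when the
# column is named "pay" -- the failure mode the paper targets.
links = link_schema(["show", "singer", "salary"], ["singer", "pay", "name"])
```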

Cited by 14 publications (7 citation statements)
References 34 publications (48 reference statements)
“…Schema-linking-based methods, such as RAT-SQL (Wang et al. 2019), introduce a relation-aware transformer encoder to improve the joint encoding of a question and schema. Liu et al. (2022) propose a similarity-learning-based question-schema alignment method to obtain a semantic schema-linking graph and observe how the pre-trained language model (PLM) embeddings of the schema items are affected. Zhao and Yang (2022) use the words that appear in both the natural language statement and the table as weakly supervised key points, and design an interaction network to explore the correlation between the representations of natural language statements and tables.…”
Section: Table-Question Alignment
confidence: 99%
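The similarity-learning-based alignment described above can be sketched, in spirit, as thresholded cosine similarity between question-token and schema-item embeddings. The toy two-dimensional vectors and the names `cosine` and `semantic_links` are stand-ins for illustration; the paper uses learned PLM embeddings, not these.

```python
import math

def cosine(u, v):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def semantic_links(q_vecs, s_vecs, threshold=0.8):
    """Link question token i to schema item j when similarity clears threshold."""
    return [(i, j)
            for i, qv in q_vecs.items()
            for j, sv in s_vecs.items()
            if cosine(qv, sv) >= threshold]

# Toy embeddings: "salary" and "pay" are close despite different surface
# forms, so the link survives where exact matching fails.
q = {"salary": [0.9, 0.1]}
s = {"pay": [0.85, 0.15], "name": [0.0, 1.0]}
links = semantic_links(q, s)
```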
“…Lastly, the performance gain of RAPL on cross-domain tasks is lower than that on in-domain tasks. An intriguing avenue for future research is to explore techniques for better performance on cross-domain tasks, e.g., data augmentation (Hu et al, 2023c) and structured knowledge guidance (Liu et al, 2022a;Ma et al, 2023a).…”
Section: Limitations
confidence: 99%
“…This formulation is related to incremental parsing, where a sentence is scanned from left to right and the structure is built incrementally by inserting a node or attaching an edge. Incremental parsers are widely used in semantic parsing (Zhou et al., 2016; Cheng et al., 2017; Guo and Lu, 2018; Naseem et al., 2019; Liu et al., 2022a) and syntactic parsing (Huang and Sagae, 2010; Dyer et al., 2015; Liu and Zhang, 2017), as they are computationally efficient and can use machine learning to predict actions based on partially generated structures. Our feature fusion module can be viewed as the parser state, as it carries the structural information and serves as a writable memory during the expansion step.…”
Section: Related Work
confidence: 99%
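The left-to-right structure building described in the excerpt above can be sketched as a loop over insert-node / attach-edge actions applied to a partial graph. The action vocabulary and function name here are illustrative assumptions, not any cited parser's actual interface.

```python
# Minimal sketch of incremental structure building: each action either
# inserts a node or attaches an edge between existing nodes, so the
# partial structure (the "parser state") grows monotonically.
def incremental_parse(actions):
    """Apply ("insert", node) / ("attach", (head, dep)) actions in order."""
    nodes, edges = [], []
    for op, arg in actions:
        if op == "insert":
            nodes.append(arg)            # grow the node set
        elif op == "attach":
            head, dep = arg
            edges.append((head, dep))    # connect two existing nodes
        else:
            raise ValueError(f"unknown action: {op}")
    return nodes, edges

nodes, edges = incremental_parse([
    ("insert", "SELECT"),
    ("insert", "name"),
    ("attach", ("SELECT", "name")),
])
```

Because each step only extends the partial structure, the loop runs in time linear in the number of actions, which is the efficiency the excerpt attributes to incremental parsers.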