Proceedings of the 16th Conference of the European Chapter of the Association for Computational Linguistics: Main Volume, 2021
DOI: 10.18653/v1/2021.eacl-main.201
Structural Encoding and Pre-training Matter: Adapting BERT for Table-Based Fact Verification

Abstract: Growing concern with online misinformation has encouraged NLP research on fact verification. Since writers often base their assertions on structured data, we focus here on verifying textual statements given evidence in tables. Starting from the Table Parsing (TAPAS) model developed for question answering (Herzig et al., 2020), we find that modeling table structure improves a language model pre-trained on unstructured text. Pre-training language models on English Wikipedia table data further improves performance…
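As a rough illustration of what "modeling table structure" can mean in this setting, the sketch below flattens a small table into a token sequence and attaches a row and column index to every cell token, in the spirit of the TAPAS-style structural embeddings the paper builds on. The function name, the indexing convention, and the toy table are assumptions for illustration, not the authors' implementation.

```python
# Illustrative sketch (not the authors' code): flatten a table into
# [CLS] statement [SEP] cell tokens, attaching row/column indices that a
# TAPAS-style model would turn into additional positional embeddings.

from typing import List, Tuple

def linearize_table(statement: List[str],
                    header: List[str],
                    rows: List[List[str]]) -> List[Tuple[str, int, int]]:
    """Return (token, row_id, col_id) triples.

    Convention assumed here: row_id 0 with col_id 0 marks the statement,
    row_id 0 with col_id >= 1 marks header cells, and data cells start
    at row_id 1.
    """
    sequence = [("[CLS]", 0, 0)]
    sequence += [(tok, 0, 0) for tok in statement]
    sequence.append(("[SEP]", 0, 0))
    # Header row: row index 0, columns numbered from 1.
    for col, cell in enumerate(header, start=1):
        sequence += [(tok, 0, col) for tok in cell.split()]
    # Data rows: row indices from 1, columns from 1.
    for row, cells in enumerate(rows, start=1):
        for col, cell in enumerate(cells, start=1):
            sequence += [(tok, row, col) for tok in cell.split()]
    return sequence

if __name__ == "__main__":
    toy = linearize_table(
        statement="the 2014 film grossed over 100 million".split(),
        header=["title", "year", "gross"],
        rows=[["film a", "2014", "120 million"],
              ["film b", "2016", "80 million"]],
    )
    for token, row_id, col_id in toy[:8]:
        print(f"{token:>10}  row={row_id}  col={col_id}")
```

The row/column indices here would be embedded and summed with the usual token and position embeddings, which is how a structure-aware model can distinguish cells that share a row or column from unrelated ones.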

Cited by 2 publications (1 citation statement). References 22 publications.
"…Although a table-to-text pretrained model (Xing and Wan, 2021) has been proposed, a large and diversified table corpus is often unavailable. In addition, recent work on fact verification taking tables as input (Yin et al., 2020; Dong and Smith, 2021) has suggested the effectiveness of table-structure-aware pretrained models.…"
Section: Table-to-Text Generation
Mentioning confidence: 99%