Proceedings of the 15th International Workshop on Semantic Evaluation (SemEval-2021), 2021
DOI: 10.18653/v1/2021.semeval-1.39
SemEval-2021 Task 9: Fact Verification and Evidence Finding for Tabular Data in Scientific Documents (SEM-TAB-FACTS)

Abstract: Understanding tables is an important and relevant task that involves understanding table structure as well as being able to compare and contrast information within cells. In this paper, we address this challenge by presenting a new dataset and tasks that address this goal in a shared task, SemEval 2021 Task 9: Fact Verification and Evidence Finding for Tabular Data in Scientific Documents (SEM-TAB-FACTS). Our dataset contains 981 manually-generated tables and an auto-generated dataset of 1980 tables providi…
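The two subtasks the abstract alludes to — classifying a statement against a table (subtask A) and marking which cells serve as evidence (subtask B) — can be illustrated with a toy rule-based sketch. This is purely illustrative, not the shared task's baseline; the table, the statement pattern, and the function name are invented for this example, and real systems must parse arbitrary natural-language statements:

```python
# Toy illustration of the two SEM-TAB-FACTS subtasks:
#   A: classify a statement against a table (supported / refuted / unknown)
#   B: mark which table cells serve as evidence for that decision.
# Only one hard-coded statement pattern ("<model> has the highest <metric>")
# is handled here, purely for illustration.

TABLE = {
    "columns": ["Model", "Accuracy"],
    "rows": [["TAPAS", 0.83], ["BERT", 0.71]],
}

def verify_max_claim(table, model, metric="Accuracy"):
    """Check '<model> has the highest <metric>'; return (label, evidence cells)."""
    col = table["columns"].index(metric)
    names = [row[0] for row in table["rows"]]
    if model not in names:
        return "unknown", []  # statement mentions an entity absent from the table
    # Find the row with the maximum value in the metric column.
    best_row = max(range(len(table["rows"])), key=lambda r: table["rows"][r][col])
    # Subtask B: the evidence is the winning row's name cell and metric cell.
    evidence = [(best_row, 0), (best_row, col)]
    label = "supported" if table["rows"][best_row][0] == model else "refuted"
    return label, evidence

label, cells = verify_max_claim(TABLE, "TAPAS")  # ("supported", [(0, 0), (0, 1)])
```

A three-way label set (supported / refuted / unknown) mirrors the task's requirement that statements unanswerable from the table be distinguished from refuted ones.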

Cited by 16 publications (15 citation statements)
References 21 publications
“…More work is needed to make models interpretable, either through explanations or by pointing to the evidence that is used for predictions (e.g., Feng et al., 2018; Serrano and Smith, 2019; Jain and Wallace, 2019; Wiegreffe and Pinter, 2019; DeYoung et al., 2020; Paranjape et al., 2020; Hewitt and Liang, 2019; Niven and Kao, 2019; Ravichander et al., 2021). Many recent shared tasks on reasoning over semi-structured tabular data (such as SemEval 2021 Task 9 [Wang et al., 2021a] and FEVEROUS [Aly et al., 2021]) have highlighted the importance of, and the challenges associated with, evidence extraction for claim verification.…”
Section: Discussion and Related Work
“…Dataset Recently, datasets such as TabFact (Chen et al., 2020b) and INFOTABS, and also shared tasks such as SemEval 2021 Task 9 (Wang et al., 2021a) and FEVEROUS (Aly et al., 2021), have sparked interest in tabular NLI research. In this study, we use the INFOTABS dataset for our investigations.…”
Section: Introduction
“…Even though tables are also widely used to convey information, especially in scientific texts, there has been comparatively less work on verifying if a given table supports a statement. To this end, SemEval 2021 Task 9 (Wang et al., 2021) focuses on statement verification and evidence finding for tables from scientific articles in the English language. The task is divided into two subtasks, A and B.…”
Section: Introduction
“…This year, SemEval-2021 Task 9: Statement Verification and Evidence Finding with Tables (SEM-TAB-FACT) aims to verify statements and find evidence from tables in scientific articles (Wang et al., 2021). It is an important task aimed at promoting proper interpretation of the surrounding article.…”
Section: Introduction
“…The task of verification from structured evidence, such as tables, charts, and databases, is still less explored. This paper describes the sattiy team's system for SemEval-2021 Task 9: Statement Verification and Evidence Finding with Tables (SEM-TAB-FACT) (Wang et al., 2021). This competition aims to verify statements, to find evidence from tables in scientific articles, and to promote the proper interpretation of the surrounding article.…”