DOI: 10.29007/fm8f

Overview of COLIEE 2017

Abstract: We present the evaluation of the Competition on Legal Information Extraction/Entailment (COLIEE) 2017, a legal question answering competition. COLIEE 2017 consists of two subtasks: legal information retrieval (Task 1) and recognizing entailment between articles and queries (Task 2). Participation was open to any group using any approach, and the tasks attracted 10 teams. We received 9 submissions to Task 1 (17 runs in total) and 8 submissions to Task 2 (20 runs in total).

Cited by 11 publications (10 citation statements)
References 8 publications (8 reference statements)
“…For the semantic matching case (1), it is difficult to select appropriate pairs ("adult" and "adult ward") for replacement to make such a semantic mismatch sentence. For both cases (3), it is also difficult to make the data and to use these data for negative examples to identify types of errors to judge the entailment results.…”
Section: Data Augmentation Using Articles
confidence: 99%
“…We constructed 10 different training and validation sets to make different BERT models. In addition to the training data constructed from the original data, all augmented data (3,331 examples for the submission system) were merged with the training data. As a result, we used 3,956 examples for training and 70 for validation.…”
Section: COLIEE 2021 Submission System
confidence: 99%
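The split described in the excerpt above (original data merged with 3,331 augmented examples, yielding 3,956 training and 70 validation examples) can be sketched as follows. This is a minimal illustration, not the cited paper's code: the function name is hypothetical, and the count of 695 original examples is inferred from the arithmetic (3,956 + 70 − 3,331), since validation examples are drawn from the original data only.

```python
import random


def make_train_val_split(original, augmented, val_size, seed=0):
    """Hold out a validation set from the original data only, then
    merge the augmented examples into the remaining training pool."""
    rng = random.Random(seed)
    pool = list(original)
    rng.shuffle(pool)
    val = pool[:val_size]
    train = pool[val_size:] + list(augmented)
    rng.shuffle(train)
    return train, val


# Illustrative sizes matching the excerpt: 695 original + 3,331 augmented
original = [f"orig-{i}" for i in range(695)]
augmented = [f"aug-{i}" for i in range(3331)]
train, val = make_train_val_split(original, augmented, val_size=70)
# len(train) == 3956, len(val) == 70
```

Repeating this with different seeds would produce the "10 different training and validation sets" the excerpt mentions.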
“…Instead, standalone sentences are provided to determine their entailment. It is also the case of existing shared tasks for legal information extraction, such as COLIEE (Kano et al., 2017), where one has to recognize entailment between articles and queries, as considered in the question answering problem. Obviously, the tasks aimed at retrieving documents consisting of multiple sentences, such as the TREC legal track (Baron). The aim of this task is to identify spans in the requested documents (referred to as target documents) representing clauses analogous to the spans selected in other documents (referred to as seed documents).…”
Section: Review of Existing Datasets
confidence: 99%
“…IR in the legal domain is widely connected with the Competition on Legal Information Extraction/Entailment (COLIEE). From 2015 to 2017 (Kim et al., 2015a, 2016a; Kano et al., 2017) the task was to retrieve Japanese Civil Code articles given a question, while in COLIEE 2018 and 2019 (Kano et al., 2018; Rabelo et al., 2019) the task was to retrieve supporting cases given a short description of an unseen case. However, the texts of these competitions are short compared to our datasets.…”
Section: Related Work
confidence: 99%