2022
DOI: 10.1007/s12626-022-00101-3

Applying BERT Embeddings to Predict Legal Textual Entailment

Abstract: Textual entailment classification is one of the hardest tasks for the Natural Language Processing community. Working on entailment with legal statutes is particularly difficult, for example because of differing abstraction levels, specialized terminology, and the domain knowledge required to solve the task. In the course of the COLIEE competition, we develop three approaches to classify entailment. The first approach combines Sentence-BERT embeddings with a graph neural network, while the second approach …
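As a rough illustration of the first approach, the sketch below pairs Sentence-BERT sentence embeddings with a small graph neural network for binary entailment classification. It is a minimal reconstruction, not the authors' implementation: the encoder checkpoint all-MiniLM-L6-v2, the star-shaped query–statute graph, and the two-layer GCN head are all assumptions made for illustration.

```python
# Minimal sketch: Sentence-BERT node features + a small GCN for
# entailment classification. Assumptions (not from the paper):
# encoder checkpoint, graph construction, and classifier head.
import torch
import torch.nn.functional as F
from sentence_transformers import SentenceTransformer
from torch_geometric.data import Data
from torch_geometric.nn import GCNConv, global_mean_pool

encoder = SentenceTransformer("all-MiniLM-L6-v2")  # hypothetical choice

def build_graph(statute_sentences, query):
    """One node per sentence; the query node is connected to every
    statute node (a simple star graph, purely illustrative)."""
    texts = statute_sentences + [query]
    x = torch.tensor(encoder.encode(texts), dtype=torch.float)
    q = len(texts) - 1  # index of the query node
    edges = [[i, q] for i in range(q)] + [[q, i] for i in range(q)]
    edge_index = torch.tensor(edges, dtype=torch.long).t().contiguous()
    return Data(x=x, edge_index=edge_index)

class EntailmentGNN(torch.nn.Module):
    def __init__(self, dim=384, hidden=128):
        super().__init__()
        self.conv1 = GCNConv(dim, hidden)
        self.conv2 = GCNConv(hidden, hidden)
        self.head = torch.nn.Linear(hidden, 2)  # entailed / not entailed

    def forward(self, data):
        h = F.relu(self.conv1(data.x, data.edge_index))
        h = F.relu(self.conv2(h, data.edge_index))
        # Pool all node states into one graph-level representation.
        batch = torch.zeros(h.size(0), dtype=torch.long)
        return self.head(global_mean_pool(h, batch))

graph = build_graph(["A minor may rescind a contract."],
                    "Can a minor cancel a purchase?")
logits = EntailmentGNN()(graph)  # shape: (1, 2)
```

The intuition behind such a design is that message passing lets statute sentences exchange information with the query node before pooling; the paper's actual graph construction may differ.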


Cited by 6 publications (3 citation statements). References 16 publications (23 reference statements).
“…The corpus used in the intelligent translation model plays an important role [22]. The corpus can be used to store bilingual phrase data, accurately label the parts of speech of short words, standardize the function of each phrase, and improve the timeliness and accuracy of automatic phrase recognition in English machine translation.…”
Section: Application of Machine Learning Methods in Cross-… (mentioning)
confidence: 99%
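To make the cited idea concrete, here is a hedged sketch of how a bilingual phrase corpus with part-of-speech labels and a normalized phrase function might be represented; the schema, field names, and example entries are invented for illustration and are not taken from reference [22].

```python
# Illustrative schema for a bilingual phrase corpus with POS labels
# and a normalized phrase function; all names are hypothetical.
from dataclasses import dataclass

@dataclass(frozen=True)
class PhraseEntry:
    source: str                 # English phrase
    target: str                 # aligned translation
    pos_tags: tuple[str, ...]   # one tag per source token
    function: str               # normalized grammatical function, e.g. "NP"

corpus = [
    PhraseEntry("legal entailment", "implication juridique", ("ADJ", "NOUN"), "NP"),
    PhraseEntry("fine-tune", "ajuster", ("VERB",), "VP"),
]

def lookup(source_phrase: str) -> list[PhraseEntry]:
    """Exact-match retrieval; a production system would index the
    corpus and support fuzzy phrase matching."""
    return [e for e in corpus if e.source == source_phrase]

print(lookup("legal entailment"))
```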
“…Wehnert et al. [27] have introduced three distinct methods for the classification of entailment. The first approach combines Sentence-BERT embeddings with a graph neural network, while the second relies on the domain-specific LEGAL-BERT model, which undergoes additional training on the competition's retrieval task and is then fine-tuned for entailment classification.…”
Section: Background and Related Work (mentioning)
confidence: 99%
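The second strategy, fine-tuning LEGAL-BERT for entailment, can be sketched as a standard sequence-pair classification setup with the Hugging Face transformers API. The checkpoint nlpaueb/legal-bert-base-uncased is the publicly released LEGAL-BERT; the learning rate, labels, and training loop below are assumptions, not the competition configuration.

```python
# Hedged sketch: fine-tuning LEGAL-BERT as a sequence-pair classifier
# (statute, query) -> entailed / not entailed. Hyperparameters are
# assumptions, not the values used in the COLIEE submission.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("nlpaueb/legal-bert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained(
    "nlpaueb/legal-bert-base-uncased", num_labels=2
)
optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)  # assumed lr

def train_step(statute, query, label):
    """One gradient step on a single (statute, query) pair."""
    batch = tokenizer(statute, query, truncation=True,
                      padding=True, return_tensors="pt")
    out = model(**batch, labels=torch.tensor([label]))
    out.loss.backward()
    optimizer.step()
    optimizer.zero_grad()
    return out.loss.item()

loss = train_step("A minor may rescind a contract.",
                  "Can a minor cancel a purchase?", 1)
```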
“…The authors of BERT (Devlin et al., 2018) proposed determining the most suitable fine-tuning parameters through a search within a limited range, in which the learning rate and related hyperparameters are drawn from a small set of recommended values (Wehnert et al., 2022; Rogers et al., 2020). Since those parameters do not always produce the best results and their use may still leave a model undertrained, an alternative strategy was adopted for choosing the upper limit of training epochs: validation loss is tracked and training terminates only once it stops improving.…”
Section: Fine-tuning (mentioning)
confidence: 99%
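The alternative strategy described in this statement amounts to early stopping on validation loss. A minimal sketch, assuming a patience threshold and generic train/validate callables (none of which come from the cited work):

```python
# Minimal early-stopping loop: stop once validation loss has not
# improved for `patience` consecutive epochs. `train_one_epoch` and
# `validation_loss` are assumed callables, not from the cited work.
def fit(train_one_epoch, validation_loss, max_epochs=50, patience=3):
    best, stale = float("inf"), 0
    for epoch in range(max_epochs):
        train_one_epoch()
        loss = validation_loss()
        if loss < best:
            best, stale = loss, 0   # improvement: reset the counter
        else:
            stale += 1              # no improvement this epoch
        if stale >= patience:
            print(f"early stop at epoch {epoch}, best val loss {best:.4f}")
            break
    return best
```

Compared with a fixed epoch budget, this lets the upper limit adapt to the data: training runs as long as validation loss keeps improving and no longer.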