2021
DOI: 10.1016/j.aiopen.2021.06.003
Lawformer: A pre-trained language model for Chinese legal long documents

Cited by 107 publications (72 citation statements)
References 24 publications
“…LegalAI can not only provide handy references for people who are unfamiliar with legal knowledge, but also reduce redundant paperwork for legal practitioners. Many efforts have been devoted to a variety of LegalAI tasks, including legal judgment prediction (Chalkidis et al., 2019; Yang et al., 2019), legal question answering (Ravichander et al., 2019; Zhong et al., 2020b; Kien et al., 2020), contract review (Hendrycks et al., 2021; Koreeda and Manning, 2021), legal case retrieval, and legal pre-trained models (Chalkidis et al., 2020; Xiao et al., 2021). Most existing works focus on downstream applications in LegalAI while ignoring the key event information in legal documents…”
Section: Legal Artificial Intelligence
confidence: 99%
“…As pre-trained language models have achieved promising results in many legal tasks (Chalkidis et al., 2020; Xiao et al., 2021), we adopt BERT as our basic encoder. To verify the effectiveness of event detection in LegalAI, we make only minor changes to the embedding layer to integrate the event information…”
Section: Encoder Architecture
confidence: 99%
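The statement above describes integrating event information into BERT by changing only the embedding layer. Below is a minimal sketch of that idea, assuming per-token event-type IDs fused by simple addition; the class name, the event-type count, and the additive fusion scheme are illustrative assumptions, not the cited authors' exact design:

```python
import torch.nn as nn
from transformers import BertModel

class EventAwareBertEncoder(nn.Module):
    """Hypothetical sketch: BERT whose input embeddings are augmented
    with an event-type embedding; the rest of the model is unchanged."""

    def __init__(self, model_name="bert-base-chinese", num_event_types=109):
        super().__init__()
        self.bert = BertModel.from_pretrained(model_name)
        hidden = self.bert.config.hidden_size
        # Index 0 is reserved for tokens that belong to no event trigger;
        # num_event_types is an assumed figure, not taken from the source.
        self.event_embeddings = nn.Embedding(num_event_types, hidden, padding_idx=0)

    def forward(self, input_ids, attention_mask, event_type_ids):
        # Look up the usual word embeddings ...
        inputs_embeds = self.bert.embeddings.word_embeddings(input_ids)
        # ... then make the "minor change": add an event-type embedding per token.
        inputs_embeds = inputs_embeds + self.event_embeddings(event_type_ids)
        # BERT adds position and token-type embeddings internally when
        # inputs_embeds is passed instead of input_ids.
        out = self.bert(inputs_embeds=inputs_embeds, attention_mask=attention_mask)
        return out.last_hidden_state
```

Keeping the change confined to the embedding layer means the pre-trained Transformer weights are reused untouched, which is what lets the cited work attribute any gains to the injected event signal rather than to architectural changes.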
“…[Flattened table excerpt: legal judgment prediction datasets with their annotations (articles, charges, prison terms, fact snippets, court views) and link availability, covering TOPJUDGE-CJO, TOPJUDGE-PKU, and TOPJUDGE-CAIL [29], CAIL-Long [47], FLA [15], RACP [48], MAMD [49], Court-View-Gen [33], and AC-NLG [34].]…”
Section: Articles
confidence: 99%
“…Although BERT-based methods have achieved breakthroughs in many fields in recent years [3][4][5][6][7][8], they still fall short at understanding long, domain-specific text, mainly because models pre-trained on generic corpora often do not work well in the legal field. Several pre-trained models currently exist for the legal domain [9][10][11]. Nevertheless, only one open-source pre-trained model targets Chinese judicial documents, and experiments have verified that it brings no significant performance improvement…”
Section: Introduction
confidence: 99%
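Since this excerpt discusses pre-trained models for long Chinese legal text, a minimal usage sketch of Lawformer (the Longformer-based model these statements cite) may help. It assumes the publicly released thunlp/Lawformer checkpoint on the Hugging Face hub and a BERT-style tokenizer; both are assumptions about the release, not details stated in the excerpt:

```python
import torch
from transformers import AutoModel, BertTokenizer

# Assumed checkpoint identifier for the released Lawformer weights.
tokenizer = BertTokenizer.from_pretrained("thunlp/Lawformer")
model = AutoModel.from_pretrained("thunlp/Lawformer")

# Longformer-style sparse attention lets the encoder handle legal
# documents far beyond BERT's 512-token limit.
text = "任某提起诉讼，请求判令解除双方婚姻关系。"  # "Ren filed suit requesting dissolution of the marriage."
inputs = tokenizer(text, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # (1, sequence_length, hidden_size)
```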