2024
DOI: 10.1021/acs.jpcc.4c01974
MagBERT: Magnetics Knowledge Aware Language Model Coupled with a Question Answering Pipeline for Curie Temperature Extraction Task

Aigerim Zhumabayeva,
Nikhil Ranjan,
Martin Takáč
et al.

Abstract: In this study, we develop and release two Bidirectional Encoder Representations from Transformers (BERT) models, trained primarily on roughly 144 K peer-reviewed publications in the magnetic materials domain, which we refer to as MagBERT and MagMatBERT, respectively. These transformer models are then used for chemical named entity recognition (CNER) and question answering (QA) tasks in a data extraction workflow. We demonstrate this approach by developing a magnetics data set of a well-known magnetic property, i.e.,…

Cited by 0 publications
References 41 publications