MagBERT: Magnetics Knowledge Aware Language Model Coupled with a Question Answering Pipeline for Curie Temperature Extraction Task
Aigerim Zhumabayeva, Nikhil Ranjan, Martin Takáč, et al.
Abstract: In this study, we develop and release two Bidirectional Encoder Representations from Transformers (BERT) models, referred to as MagBERT and MagMatBERT, trained primarily on approximately 144 K peer-reviewed publications in the magnetic materials domain. These transformer models are then used for chemical named entity recognition (CNER) and question answering (QA) tasks in a data extraction workflow. We demonstrate this approach by developing a magnetics data set of a well-known magnetic property, i.e., …
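The abstract outlines a two-stage extraction workflow: domain-adapted BERT encoders support a CNER step that identifies material mentions, and an extractive QA step that pulls the Curie temperature for each mention. The sketch below illustrates only the QA stage using the Hugging Face `pipeline` API, with a generic public SQuAD checkpoint as a stand-in; the paper's MagBERT/MagMatBERT checkpoint names, question templates, and CNER integration are not given here, so everything model-specific is an assumption.

```python
# Minimal sketch of the extractive QA stage described in the abstract.
# The checkpoint below is a generic public SQuAD model used as a stand-in;
# in the paper's workflow, a MagBERT/MagMatBERT-based QA model would be used.
from transformers import pipeline

qa = pipeline("question-answering",
              model="distilbert-base-cased-distilled-squad")

# Context sentence as it might appear in a materials-science publication.
context = ("The Curie temperature of Nd2Fe14B was measured to be 585 K, "
           "in agreement with earlier reports.")

# In the full workflow, the material name would come from the CNER step;
# here it is hard-coded for illustration.
result = qa(question="What is the Curie temperature of Nd2Fe14B?",
            context=context)
print(result["answer"])  # expected answer span: "585 K"
```

The extractive QA formulation has the advantage that the answer is always a span of the source text, so each extracted Curie temperature can be traced back to the sentence it came from.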