Semantic role labeling identifies the main predicate of a sentence, the arguments of that predicate, and the relationship each argument holds with the predicate. These arguments fill semantic roles such as agent, goal, and result. Many approaches to semantic role labeling have been proposed. Recent work has applied neural networks that exploit syntactic features of the text, such as part-of-speech tags and POS-tagged dependency trees; other work has combined BERT with LSTM and MLP layers to build semantic role labeling models. In this work, a BERT model is fine-tuned as a classifier that detects the main predicate, the arguments of the predicate, and the relationship each argument maintains with the predicate. Using BERT to build a simple classifier yields an efficient semantic role labeling model. Model performance is assessed with accuracy, precision, recall, and F1 score, computed by comparing the predicted label and argument for each word against the corresponding gold annotation.
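The abstract does not specify how the token-level metrics are computed; the following is a minimal sketch of one common convention, assuming BIO-style role labels where "O" marks tokens outside any argument and is treated as the negative class for precision and recall. The function name and label scheme are illustrative, not taken from the paper.

```python
def token_level_scores(gold, pred):
    """Compute accuracy, precision, recall and F1 for token-level SRL
    labels. `gold` and `pred` are equal-length lists of label strings;
    the "O" (no-argument) tag is treated as the negative class."""
    assert len(gold) == len(pred)
    tp = fp = fn = correct = 0
    for g, p in zip(gold, pred):
        if g == p:
            correct += 1          # exact label match, counts toward accuracy
        if p != "O":
            if p == g:
                tp += 1           # predicted a role label and it was right
            else:
                fp += 1           # predicted a role label that was wrong
        if g != "O" and p != g:
            fn += 1               # missed (or mislabeled) a gold role token
    accuracy = correct / len(gold)
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    recall = tp / (tp + fn) if (tp + fn) else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if (precision + recall) else 0.0)
    return accuracy, precision, recall, f1

# Hypothetical example: the model finds the predicate and agent
# but misses the patient argument.
gold = ["B-ARG0", "B-V", "B-ARG1", "O"]
pred = ["B-ARG0", "B-V", "O", "O"]
acc, prec, rec, f1 = token_level_scores(gold, pred)
```

In this example, 3 of 4 tokens are labeled correctly (accuracy 0.75), every predicted role token is correct (precision 1.0), but one gold role token is missed (recall 2/3).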