This paper presents a relation-centric algorithm for solving arithmetic word problems (AWPs) that combines a syntax-semantics extractor for acquiring explicit relations with a neural-network miner for acquiring implicit relations. To our knowledge, it is the first algorithm with a dedicated component for acquiring the implicit knowledge items needed to solve AWPs. The paper proposes a three-phase scheme that decomposes the challenging task of designing an AWP-solving algorithm into three smaller tasks: the first phase proposes a state-action paradigm; the second instantiates the paradigm as a relation-centric approach; and the third implements the approach as a relation-centric algorithm. The proposed algorithm has two main steps: problem understanding and symbolic solving. Under the relation-centric approach, problem understanding becomes a task of relation acquisition. For this task, a relaxed syntax-semantics method first extracts a group of explicit relation candidates; in parallel, a neural-network miner acquires implicit relation candidates, computing BERT-encoded vectors to decide which implicit relations should be added. Problem understanding can thus acquire both explicit and implicit relations, addressing the challenge of building a problem-understanding method that captures all the knowledge items needed to find the solution. In the subsequent symbolic-solving step, a fusion procedure distills the candidates into a final set of relations by discarding unnecessary ones. Experiments on nine benchmark datasets show that the proposed algorithm outperforms the state-of-the-art algorithms.
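The two-track relation acquisition and the fusion step described in the abstract can be sketched as follows. This is a minimal illustration under stated assumptions: the function names (`extract_explicit_relations`, `mine_implicit_relations`, `acquire_relations`), the hard-coded candidate relations, and the confidence-based fusion rule are all hypothetical stand-ins, not the paper's actual components or API; in particular, the real miner scores candidates with BERT-encoded vectors rather than fixed numbers.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Relation:
    text: str     # e.g. "x = 3 + 5" or "1 dozen = 12 units"
    score: float  # confidence assigned by its acquisition track

def extract_explicit_relations(problem: str) -> list[Relation]:
    # Stand-in for the relaxed syntax-semantics extractor: real pattern
    # matching over the problem text would go here.
    return [Relation("x = 3 + 5", 1.0)]

def mine_implicit_relations(problem: str, threshold: float = 0.5) -> list[Relation]:
    # Stand-in for the neural miner: in the paper, BERT-encoded vectors
    # determine which implicit relations to add; here scores are hard-coded.
    candidates = [Relation("1 dozen = 12 units", 0.8),
                  Relation("1 week = 7 days", 0.2)]
    return [r for r in candidates if r.score >= threshold]

def acquire_relations(problem: str) -> list[Relation]:
    # Fusion sketch: merge both candidate sets, drop duplicates, and keep
    # the rest ordered by confidence (a crude proxy for discarding
    # unnecessary relations before the symbolic solver runs).
    candidates = extract_explicit_relations(problem) + mine_implicit_relations(problem)
    seen, distilled = set(), []
    for r in sorted(candidates, key=lambda r: -r.score):
        if r.text not in seen:
            seen.add(r.text)
            distilled.append(r)
    return distilled
```

The distilled relation set would then be handed to a symbolic solver; the fusion rule here is deliberately simplistic to keep the sketch self-contained.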
This paper presents TransCrimeNet, a novel transformer-based model for predicting future crimes in criminal networks from textual data. Criminal network analysis has become vital to law enforcement agencies seeking to prevent crimes. However, existing graph-based methods fail to effectively incorporate crucial textual data, such as social media posts and interrogation transcripts, that provides valuable insight into planned criminal activities. To address this limitation, we develop TransCrimeNet, which leverages the representation-learning capabilities of transformer models like BERT to extract features from unstructured text. These text-derived features are fused with graph embeddings of the criminal network for accurate prediction of future crimes. Extensive experiments on real-world criminal network datasets demonstrate that TransCrimeNet outperforms previous state-of-the-art models by 12.7% in F1 score for crime prediction. The results showcase the benefits of combining textual and graph-based features for actionable insights to disrupt criminal enterprises.
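The fusion of text-derived features with graph embeddings can be sketched as a simple late-fusion step. This is an assumption-laden illustration, not TransCrimeNet's actual architecture: the function name `fuse_features`, the per-modality L2 normalization, the concatenation strategy, and the dimensions (a 768-dim BERT-style sentence vector and a 128-dim node embedding) are all hypothetical choices for the sketch.

```python
import numpy as np

def fuse_features(text_vec: np.ndarray, graph_vec: np.ndarray) -> np.ndarray:
    # Late fusion sketch: L2-normalize each modality so neither dominates
    # by scale, then concatenate into one joint feature vector that a
    # downstream classifier could consume.
    t = text_vec / (np.linalg.norm(text_vec) + 1e-8)
    g = graph_vec / (np.linalg.norm(graph_vec) + 1e-8)
    return np.concatenate([t, g])

# Assumed dimensions: 768 for a BERT-style text vector, 128 for a node embedding.
fused = fuse_features(np.ones(768), np.ones(128))
```

In practice the fused vector would feed a prediction head trained on labeled network data; concatenation is only one of several fusion choices (gating or cross-attention being common alternatives).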