“…Mathisen, Bach, and Aamodt [2021] and Amin et al [2020] utilize Siamese neural networks in CBR retrieval to learn similarity measures used in aquaculture and natural language processing tasks, respectively. Leake, Ye, and Crandall [2021], Liao, Liu, and Chao [2018], and Ye, Leake, and Crandall [2022] discuss the application of DL methods in the reuse phase of CBR. Leake, Wilkerson, and Crandall [2022] use neural networks to learn features in CBR tasks.…”
Section: Related Work
“…In a narrower sense, regarding the integration of DL methods into the retrieval phase of POCBR, the work of Klein, and the work of , on which the present work builds, are related.…”
Similarity-based retrieval of semantic graphs is a crucial task in Process-Oriented Case-Based Reasoning (POCBR) that is usually complex and time-consuming, as it requires some form of inexact graph matching. Previous work tackles this problem by using Graph Neural Networks (GNNs) to learn pairwise graph similarities. In this paper, we present a novel approach that improves GNN-based case retrieval with a Transfer Learning (TL) setup composed of two phases: First, the pretraining phase trains a model to assess the similarities between graph nodes and edges and their semantic annotations. Second, the pretrained model is integrated into the GNN model either via fine-tuning, i.e., the parameters of the pretrained model are trained further, or via feature extraction, i.e., the parameters of the pretrained model are treated as constants. The experimental evaluation compares the quality and performance of the TL-based models with the GNN models from previous work across three semantic graph domains with varying properties. The results show the great potential of the proposed approach for reducing both the similarity prediction error and the training time.
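The distinction between the two TL integration modes can be illustrated with a minimal sketch (pure Python, all names hypothetical, not the authors' implementation): a toy scalar "pretrained" parameter is either updated by further gradient steps (fine-tuning) or held constant (feature extraction).

```python
def pretrained_embed(x, w):
    """Toy stand-in for the pretrained node/edge similarity model."""
    return w * x

def train_step(x, target, w, lr=0.1, freeze=False):
    """One gradient step on squared error.

    freeze=False -> fine-tuning: the pretrained parameter keeps training.
    freeze=True  -> feature extraction: the parameter is a constant.
    """
    pred = pretrained_embed(x, w)
    if freeze:
        return w                       # feature extraction: no update
    grad = 2.0 * (pred - target) * x   # d/dw of (w*x - target)^2
    return w - lr * grad               # fine-tuning: parameter moves

w_pretrained = 0.5                                   # from pretraining phase
w_finetuned = train_step(1.0, 1.0, w_pretrained)     # changes: 0.5 -> 0.6
w_frozen = train_step(1.0, 1.0, w_pretrained, freeze=True)  # stays 0.5
```

In a real GNN framework the same effect is typically achieved by excluding the pretrained submodel's parameters from the optimizer (or marking them as non-trainable), rather than by a manual flag as in this sketch.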