Graphs can model complex relationships between objects, enabling a myriad of Web applications such as online page/article classification and social recommendation. While graph neural networks (GNNs) have emerged as a powerful tool for graph representation learning, their performance in an end-to-end supervised setting relies heavily on a large amount of task-specific supervision. To reduce the labeling requirement, the "pre-train, fine-tune" and "pre-train, prompt" paradigms have become increasingly common. In particular, prompting, which is designed to narrow the gap between pre-training and downstream objectives in a task-specific manner, has become a popular alternative to fine-tuning in natural language processing. However, existing studies of prompting on graphs remain limited and lack a universal treatment that appeals to different downstream tasks. In this paper, we propose GraphPrompt, a novel pre-training and prompting framework on graphs. GraphPrompt not only unifies pre-training and downstream tasks into a common task template, but also employs a learnable prompt to assist a downstream task in locating the most relevant knowledge from the pre-trained model in a task-specific manner. Finally, we conduct extensive experiments on five public datasets to evaluate and analyze GraphPrompt.
CCS CONCEPTS
• Computing methodologies → Learning latent representations; • Information systems → Data mining.
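The abstract describes the learnable prompt only at a high level. As a rough illustration, such a prompt can be thought of as a small task-specific vector that reweights frozen node embeddings before they are pooled into a (sub)graph representation. The PyTorch sketch below is written under that assumption; the class name PromptedReadout, the feature-wise prompt, the mean pooling, and all shapes are illustrative choices, not the paper's actual implementation.

```python
import torch
import torch.nn as nn


class PromptedReadout(nn.Module):
    """Illustrative sketch (not the paper's code): a learnable prompt vector
    reweights frozen node embeddings feature-wise before mean pooling them
    into a (sub)graph-level embedding."""

    def __init__(self, hidden_dim: int):
        super().__init__()
        # One prompt per downstream task, initialized near identity (all ones).
        self.prompt = nn.Parameter(torch.ones(hidden_dim))

    def forward(self, node_embeddings: torch.Tensor) -> torch.Tensor:
        # node_embeddings: [num_nodes, hidden_dim], produced by a frozen
        # pre-trained GNN encoder (assumed, not shown here).
        reweighted = node_embeddings * self.prompt   # feature-wise reweighting
        return reweighted.mean(dim=0)                # pooled (sub)graph embedding


if __name__ == "__main__":
    torch.manual_seed(0)
    h = torch.randn(5, 64)                    # 5 nodes with 64-dim frozen embeddings
    readout = PromptedReadout(hidden_dim=64)
    z = readout(h)
    print(z.shape)                            # torch.Size([64])
```

Under this view, only the prompt parameters would be tuned for each downstream task while the pre-trained GNN stays frozen, and predictions would come from comparing the prompted (sub)graph embedding against class prototypes within a single shared task template.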