2022
DOI: 10.1016/j.patter.2022.100521
Federated learning of molecular properties with graph neural networks in a heterogeneous setting

Cited by 17 publications (4 citation statements)
References 48 publications
“…One potential solution is Federated Learning. Federated Learning is an approach that allows for model training across multiple decentralized devices or servers holding local data samples, without data exchange [101,102]. This not only preserves data privacy but also facilitates the collaborative nature of AI without centralizing sensitive information.…”
Section: Discussion
confidence: 99%
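The federated training loop described above — local updates on private data, followed by server-side aggregation without any data exchange — can be sketched with a minimal FedAvg-style example. This is an illustrative sketch, not code from the cited works; the names `local_step` and `fed_avg`, the 1-D linear-regression task, and all hyperparameters are assumptions chosen to keep the example self-contained.

```python
# Minimal FedAvg-style sketch (illustrative, not from the cited paper).
# Each "client" holds a private list of (x, y) samples for 1-D linear
# regression y ~ w*x; only model weights are exchanged, never raw data.

def local_step(w, data, lr=0.1):
    """One gradient step of mean-squared-error loss on a client's private data."""
    grad = sum(2 * (w * x - y) * x for x, y in data) / len(data)
    return w - lr * grad

def fed_avg(client_data, rounds=50):
    """Server broadcasts w, clients update locally, server averages the results."""
    w = 0.0
    for _ in range(rounds):
        local_ws = [local_step(w, data) for data in client_data]
        w = sum(local_ws) / len(local_ws)  # aggregation: only weights cross the wire
    return w

# Two clients whose private data are both consistent with y = 2x.
clients = [[(1.0, 2.0), (2.0, 4.0)], [(3.0, 6.0), (4.0, 8.0)]]
w = fed_avg(clients)  # converges toward w = 2.0
```

In a realistic deployment the scalar weight would be a GNN parameter vector and the averaging would typically be weighted by each client's sample count, but the privacy-preserving structure — local computation, centralized aggregation of parameters only — is the same.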
“…So far, existing FGL studies can be categorized into three types, i.e., inter-graph, intra-graph, and graph-structured FGL (Zhang et al 2021a). In inter-graph FGL, each client owns a set of graphs and participates in FL training to learn a better GNN to model local data (Xie et al 2021), to learn a generalizable model (Zhu, Luo, and White 2022), or to model spatial-temporal graph data (Jiang et al 2022; Lou et al 2021). In intra-graph FGL, instead of complete graphs, each client only owns a subgraph of the entire graph, and the learning scheme must deal with missing links (Chen et al 2021), e.g., by generating missing neighbors (Zhang et al 2021b) or through community discovery (Baek et al 2022).…”
Section: Related Work
confidence: 99%
“…Federated learning (FL) represents a crucial machine learning paradigm in which distributed clients (e.g., several medical institutions) collaboratively train a shared global model while retaining their private data.10,11,12,13 However, inherent biases may arise in the federated model because of spurious correlations and distribution shifts across data subpopulations.14,15,16,17,18 Consequently, the model’s performance may significantly degrade for certain data subpopulations, leading to concerns regarding unfairness, particularly in critical domains such as healthcare.…”
Section: Introduction
confidence: 99%