2022
DOI: 10.1109/tpami.2022.3207500
Dynamic Graph Message Passing Networks for Visual Recognition

Abstract: Modelling long-range dependencies is critical for scene understanding tasks in computer vision. Although convolutional neural networks (CNNs) have excelled in many vision tasks, they are still limited in capturing long-range structured relationships as they typically consist of layers of local kernels. A fully-connected graph, such as the self-attention operation in Transformers, is beneficial for such modelling; however, its computational overhead is prohibitive. In this paper, we propose a dynamic graph messag…

Cited by 4 publications (3 citation statements) | References 63 publications
“…24 Among these approaches, the graph convolutional neural network (GCN) is commonly employed. GCN treats atoms as nodes and bonds as edges, utilizing message-passing mechanisms 25 to continuously aggregate neighborhood features and capture local molecular motifs and molecular topologies. 26,27 Graph-based models consider more molecular topological structures, making them particularly promising for predicting alternating, random, and block copolymers.…”
Section: Introduction
confidence: 99%
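The excerpt above describes the standard molecular-graph encoding used by GCNs: atoms become nodes, bonds become edges, and neighbourhood aggregation mixes each atom's features with those of its bonded partners. A minimal sketch of that encoding, using an illustrative three-atom fragment and made-up one-hot atom features (all names and features here are assumptions, not from the cited work):

```python
import numpy as np

# Hypothetical molecular fragment: atoms as nodes, bonds as edges.
atoms = ["C", "C", "O"]          # illustrative atom list
bonds = [(0, 1), (1, 2)]         # undirected bond list

n = len(atoms)
A = np.zeros((n, n))             # adjacency matrix built from the bonds
for i, j in bonds:
    A[i, j] = A[j, i] = 1.0

# One-hot node features: [is_C, is_O]
X = np.array([[1, 0] if a == "C" else [0, 1] for a in atoms], float)

# One GCN-style aggregation step: each atom averages its neighbours' features,
# capturing the local motif around it.
deg = A.sum(axis=1, keepdims=True)
H = (A @ X) / np.maximum(deg, 1.0)
```

Stacking several such steps lets information propagate beyond immediate bonds, which is what makes the representation sensitive to larger molecular topology.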
“…In GNNs, the message passing mechanism is used to update and aggregate node information, so that the GNN learns details of the entire graph. 55 For the GNN, the graph G is often represented by the node matrix V and the edge matrix E, as shown in eqn (1). A common message passing mechanism consists of two steps: message aggregation and message combination, as shown in eqn (2) and (3) respectively.…”
Section: Graph Neural Network (GNN)
confidence: 99%
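The two-step mechanism described in this excerpt (aggregation over a node's neighbours, then combination with the node's own state) can be sketched as a single message-passing round. This is a minimal illustration under assumed conventions (sum aggregation, a shared weight matrix `W`, tanh combination), not the specific eqns (1)-(3) of the cited paper:

```python
import numpy as np

def message_passing_step(V, E, W):
    """One message-passing round on graph G = (V, E).

    V : (n, d) node-feature matrix
    E : list of directed edges (src, dst)
    W : (d, d) shared weight matrix
    """
    agg = np.zeros_like(V)
    for src, dst in E:            # step 1: message aggregation (sum over in-neighbours)
        agg[dst] += V[src]
    return np.tanh((V + agg) @ W)  # step 2: message combination with the node's own state

# Toy graph: 3 nodes on a directed cycle 0 -> 1 -> 2 -> 0
V = np.eye(3)
E = [(0, 1), (1, 2), (2, 0)]
W = np.eye(3)
out = message_passing_step(V, E, W)
```

Repeating the step k times lets each node's representation depend on its k-hop neighbourhood, which is how the GNN accumulates graph-wide detail.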
“…It is suitable for learning the correlation between different kinds of data. It is also widely used in image processing [22][23][24]. So, to better fuse the cross-modal features, we proposed a GNN-based Language and Image Fusion (GLIF) module.…”
Section: Introduction
confidence: 99%