Proceedings. 2005 IEEE International Joint Conference on Neural Networks, 2005.
DOI: 10.1109/ijcnn.2005.1555942

A new model for learning in graph domains

Cited by 1,393 publications (936 citation statements)
References 7 publications
“…More precisely, they encoded the distribution of each individual feature by an n-bit histogram. From another perspective, several authors [47,48,49] proposed to learn a graph-level embedding by creating a virtual "supernode" that is connected to all other nodes via a particular type of edge. This virtual node is updated like the real nodes, and its final representation is used as the graph representation.…”
Section: Molecular Graph Encoder
confidence: 99%
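The virtual-supernode readout described in the statement above can be sketched as follows. This is a hypothetical toy implementation, not code from any of the cited papers: the function name, the shared weight matrix, and the tanh update rule are illustrative assumptions.

```python
import numpy as np

def supernode_embedding(adj, features, steps=3, seed=0):
    """Toy supernode readout (illustrative, not from the cited papers).

    adj: (n, n) adjacency matrix of the real graph.
    features: (n, d) node feature matrix.
    Returns the final state of a virtual node connected to all real nodes.
    """
    n, d = features.shape
    rng = np.random.default_rng(seed)
    W = rng.standard_normal((d, d)) * 0.1  # shared message weight (toy)

    # Augment the graph with one virtual node linked to every real node,
    # i.e. the "particular type of edge" mentioned in the text.
    adj_aug = np.zeros((n + 1, n + 1))
    adj_aug[:n, :n] = adj
    adj_aug[n, :n] = adj_aug[:n, n] = 1.0

    # The supernode starts from a zero state and is updated with the same
    # message-passing rule as the real nodes.
    h = np.vstack([features, np.zeros((1, d))])
    for _ in range(steps):
        h = np.tanh(adj_aug @ h @ W + h)

    return h[n]  # final supernode state = graph-level embedding

emb = supernode_embedding(np.array([[0.0, 1.0], [1.0, 0.0]]),
                          np.array([[1.0, 0.0], [0.0, 1.0]]))
print(emb.shape)  # (2,)
```

The design point is that the supernode participates in message passing like any other node, so no separate pooling operator is needed to obtain a fixed-size graph representation.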
“…It is worth mentioning that although the gradient could also be computed by applying the standard backpropagation-through-time algorithm (Werbos 1990) to the encoding network, the procedure adopted by GNNs is faster and uses less memory by exploiting the peculiarities of the Almeida-Pineda algorithm. More details can be found in Gori et al. (2005) and Scarselli et al. (2009b).…”
Section: Graph Neural Network
confidence: 99%
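The memory advantage mentioned above can be illustrated on a toy fixed-point system. The sketch below assumes a contractive update h = tanh(Wh + x) and a linear loss L = c·h*; the Almeida-Pineda idea is that the adjoint vector z is itself the fixed point of a second iteration, so the gradient is obtained without storing the intermediate states that backpropagation through time would unroll. All names and dimensions here are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4
W = rng.standard_normal((n, n)) * 0.2  # small norm -> contraction mapping
x = rng.standard_normal(n)
c = rng.standard_normal(n)             # loss L(h) = c . h_star

def fixed_point(x, iters=200):
    """Forward pass: iterate h = tanh(W h + x) to its fixed point."""
    h = np.zeros(n)
    for _ in range(iters):
        h = np.tanh(W @ h + x)
    return h

h = fixed_point(x)
D = np.diag(1.0 - h**2)  # Jacobian of tanh at the converged state

# Almeida-Pineda backward pass: the adjoint z solves z = W^T D z + c,
# found by iteration with constant memory (no stored trajectory).
z = np.zeros(n)
for _ in range(200):
    z = W.T @ (D @ z) + c
grad_x = D @ z  # dL/dx at the fixed point

# Sanity check against central finite differences.
eps = 1e-6
fd = np.array([(c @ fixed_point(x + eps * e) - c @ fixed_point(x - eps * e))
               / (2 * eps) for e in np.eye(n)])
print(np.allclose(grad_x, fd, atol=1e-5))  # True
```

Backpropagation through time would store every intermediate h along the unrolled iteration; here the backward pass reuses only the converged state, which is the speed and memory saving the quoted statement refers to.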
“…They extend support vector machines (Kondor and Lafferty 2002; Gärtner 2003), neural networks (Sperduti and Starita 1997; Frasconi et al. 1998; Gori et al. 2005), and SOMs to structured data. The main idea underlying those methods is to automatically obtain an internal flat representation of the symbolic and subsymbolic information collected in the graphs.…”
Section: Introduction
confidence: 99%
“…Graph Neural Networks (GNNs) have recently been proposed [17], [18] as an evolution of Recursive Neural Networks, and have been conceived to fully implement the framework presented in Sect. III, in particular Eqs.…”
Section: B Graph Neural Network
confidence: 99%