This post collects the development and improvements of graph neural networks.

# Why Use

- Non-Euclidean data is irregular: each graph has a variable number of unordered nodes, and each node has a different number of neighbors, so basic operations such as convolution cannot be applied to graphs directly.
# Basic lines
Contents in this block mainly come from the papers Graph Neural Networks: A Review of Methods and Applications and A Comprehensive Survey on Graph Neural Networks.
## History
### The proposal of GNNs
Early GNNs learn a target node's representation by propagating neighbor information via recurrent neural architectures in an iterative manner until a stable fixed point is reached, which is computationally expensive.

- A new model for learning in graph domains
- Big question: how to process a graph without losing topological information?
- Reason: traditional preprocessing methods for graphs drop topological information, which leads to poor performance and generalization.
- Background: RNNs can only handle graph-level problems, and traditional methods drop topological information.
- The approximation capability of GNNs, Computational Capabilities of Graph Neural Networks: under mild generic conditions, most practically useful functions on graphs can be approximated in probability by GNNs up to any prescribed degree of accuracy.
- Neural network for graphs: A contextual constructive approach
- The graph neural network model

### Go to GNNs

#### Spectral-based Graph Convolution

Spectral methods are difficult to parallelize or scale to large graphs because they need to load the whole graph into memory, and they rely on an eigendecomposition of the graph Laplacian.
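To make the spectral view concrete, below is a minimal numpy sketch, entirely a toy of my own (the graph, the signal, and the low-pass filter `g` are all made up), of filtering a node signal in the eigenbasis of the normalized Laplacian:

```python
import numpy as np

# Toy 4-node ring graph and a scalar signal on its nodes (both made up).
A = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]], dtype=float)
x = np.array([1.0, -1.0, 2.0, 0.5])

# Normalized Laplacian L = I - D^{-1/2} A D^{-1/2}.
d_inv_sqrt = np.diag(1.0 / np.sqrt(A.sum(axis=1)))
L = np.eye(len(A)) - d_inv_sqrt @ A @ d_inv_sqrt

# Graph Fourier basis: eigenvectors U and eigenvalues lam of L.
lam, U = np.linalg.eigh(L)

# A spectral filter is a function of the eigenvalues; this low-pass
# g(lam) = exp(-2 lam) is an arbitrary illustrative choice.
g = np.exp(-2.0 * lam)

# Filter the signal: to the spectral domain, scale, back to the vertex domain.
x_filtered = U @ (g * (U.T @ x))
print(x_filtered)
```

The eigendecomposition costs roughly O(n^3) time and the dense basis U costs O(n^2) memory, which is exactly why this family struggles on large graphs.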
- Spectral GNN, Spectral networks and locally connected networks on graphs
- ChebNet, Convolutional neural networks on graphs with fast localized spectral filtering, codes here.
- 1st ChebNet, Semi-supervised classification with graph convolutional networks, code here. The filter is localized in space, but the computational requirement grows exponentially with the number of layers; to reduce it, sampling methods were proposed, see FASTGCN, reduce variance, and adaptive sampling for details. A sketch of its layer rule follows this list.
- AGCN, calculates pairwise distances between nodes to construct a residual graph; see Adaptive Graph Convolutional Neural Networks for details.
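As referenced in the 1st ChebNet entry above, a minimal numpy sketch of its layer rule H' = ReLU(D^{-1/2} (A + I) D^{-1/2} H W); the toy graph and the random weights are my own:

```python
import numpy as np

def gcn_layer(A, H, W):
    """One GCN layer: renormalized adjacency times features times weights."""
    A_hat = A + np.eye(A.shape[0])                 # add self-loops
    d_inv_sqrt = np.diag(1.0 / np.sqrt(A_hat.sum(axis=1)))
    A_norm = d_inv_sqrt @ A_hat @ d_inv_sqrt       # D^{-1/2} (A+I) D^{-1/2}
    return np.maximum(A_norm @ H @ W, 0.0)         # ReLU

rng = np.random.default_rng(0)
A = np.array([[0, 1, 1], [1, 0, 0], [1, 0, 0]], dtype=float)  # toy graph
H = rng.normal(size=(3, 4))                        # input node features
W = rng.normal(size=(4, 2))                        # layer weights
print(gcn_layer(A, H, W))                          # new node representations
```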
#### Spatial-based Graph Convolution
Spatial-based graph convolution has gained more attention.

- Inductive representation learning on large graphs
- Geometric deep learning on graphs and manifolds using mixture model CNNs
- Learning convolutional neural networks for graphs
- Large-scale learnable graph convolutional networks

[1] and [4] use a sampling strategy; the common way is to stack multiple graph convolution layers together.

##### Recurrent-based Spatial GCNs

These update a node's representation recursively until a stable fixed point is reached.

- GNNs, The graph neural network model
- GGNNs, use a GRU as the recurrent function, Learning phrase representations using rnn encoder-decoder for statistical machine translation, codes here. See the sketch after this list.
- Stochastic Steady-state Embedding (SSE), updates the node latent representations stochastically in an asynchronous fashion, Learning steady-states of iterative algorithms over graphs, codes here.
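As referenced in the GGNN entry above, a minimal numpy sketch of one recurrent propagation step in the GGNN style: neighbor states are summed and fed into a GRU-like update. The weight-matrix names are my own and biases are omitted.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def ggnn_step(A, H, Wz, Uz, Wr, Ur, Wh, Uh):
    """One GGNN-style step: messages M = A @ H, then a GRU update per node.
    All (d, d) weight matrices are hypothetical learned parameters."""
    M = A @ H                                  # sum of neighbor states
    Z = sigmoid(M @ Wz + H @ Uz)               # update gate
    R = sigmoid(M @ Wr + H @ Ur)               # reset gate
    H_cand = np.tanh(M @ Wh + (R * H) @ Uh)    # candidate state
    return (1 - Z) * H + Z * H_cand            # interpolate old/new state

rng = np.random.default_rng(2)
A = np.array([[0, 1], [1, 0]], dtype=float)    # toy 2-node graph
H = rng.normal(size=(2, 3))
Ws = [rng.normal(size=(3, 3)) * 0.1 for _ in range(6)]
print(ggnn_step(A, H, *Ws))
```

Unlike the original GNN, which iterates this kind of update to a fixed point, GGNN unrolls it for a fixed number of steps and trains through the unrolled computation.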
##### Composition-based Spatial GCNs
- Message Passing Neural Networks (MPNNs), Neural Message Passing for Quantum Chemistry, a general framework that unifies many GCN variants. See the sketch after this list.
- GraphSage, Inductive representation learning on large graphs, codes here.
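As referenced in the MPNN entry above, a minimal sketch of one message-passing phase; the linear message function and tanh update function are arbitrary stand-ins for the learned functions of the framework.

```python
import numpy as np

def mpnn_step(edges, H, message_fn, update_fn):
    """One MPNN phase: every node sums incoming messages, then updates."""
    M = np.zeros_like(H)
    for src, dst in edges:                    # directed edge src -> dst
        M[dst] += message_fn(H[dst], H[src])  # message m = M(h_v, h_u)
    return np.array([update_fn(h, m) for h, m in zip(H, M)])

rng = np.random.default_rng(1)
W_msg = rng.normal(size=(4, 4))               # hypothetical message weights
W_upd = rng.normal(size=(8, 4))               # hypothetical update weights
H = rng.normal(size=(3, 4))                   # states of a 3-node toy graph
edges = [(0, 1), (1, 0), (1, 2), (2, 1)]
H_new = mpnn_step(
    edges, H,
    message_fn=lambda h_v, h_u: h_u @ W_msg,
    update_fn=lambda h, m: np.tanh(np.concatenate([h, m]) @ W_upd))
print(H_new.shape)  # (3, 4)
```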
###### Miscellaneous Variants of Spatial GCNs

- Diffusion Convolution Neural Networks (DCNN), the hidden node representations are obtained by independently convolving the inputs with powers of the transition probability matrix, Diffusion-convolutional neural networks. See the sketch after this list.
- Mapping graphs onto a standard grid so that an ordinary CNN can be applied, Learning convolutional neural networks for graphs, but it ignores node information.
- Large-scale Graph Convolution Networks (LGCN), Large-scale learnable graph convolutional networks, still uses a standard grid, but also collects node information and draws subgraphs for mini-batch training. Codes here.
- Mixture Model Network (MoNet), Geometric deep learning on graphs and manifolds using mixture model CNNs, introduces pseudo-coordinates and weight functions so that the weight of a node's neighbor is determined by the relative position (pseudo-coordinates) between the node and its neighbor.
- Geniepath: Graph neural networks with adaptive receptive paths, leverages gating mechanisms to control the depth and breadth of a node's neighborhood.
- Dual graph convolutional networks for graph-based semi-supervised classification, uses two networks: one for the global representation, the other for the local representation.
- On filter size in graph convolutional networks, introduces a hyperparameter to influence the receptive field size of a node.
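As referenced in the DCNN entry above, a minimal numpy sketch of diffusion convolution: features are diffused through powers of the transition matrix, one hop per output channel. The learned channel weights and the nonlinearity of the actual model are omitted.

```python
import numpy as np

def diffusion_features(A, X, K):
    """Diffuse node features X through P^1 ... P^K, where P = D^{-1} A."""
    P = A / A.sum(axis=1, keepdims=True)  # row-normalized transition matrix
    out, Pk = [], np.eye(A.shape[0])
    for _ in range(K):
        Pk = Pk @ P                       # next power of P
        out.append(Pk @ X)                # features seen after k hops
    return np.stack(out, axis=1)          # shape: (nodes, K, features)

A = np.array([[0, 1, 1], [1, 0, 0], [1, 0, 0]], dtype=float)
X = np.eye(3)                             # one-hot toy features
print(diffusion_features(A, X, K=2).shape)  # (3, 2, 3)
```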
#### Pooling module

- Convolutional neural networks on graphs with fast localized spectral filtering
- Pooling by rearranging vertices into a meaningful order, Deep convolutional networks on graph-structured data
- An end-to-end deep learning architecture for graph classification
- DIFFPOOL pools nodes hierarchically by learning a cluster assignment matrix in each layer to obtain cluster embeddings; it can be combined with any standard GCN module, Hierarchical graph representation learning with differentiable pooling. See the sketch after this list.
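As referenced in the DIFFPOOL entry above, a minimal numpy sketch of one pooling step; in the real model, the embeddings Z and the assignment logits each come from a separate GNN module, which is abstracted away here.

```python
import numpy as np

def diffpool(A, Z, S_logits):
    """DIFFPOOL step: soft-assign n nodes to c clusters, coarsen the graph.
    Z: (n, d) node embeddings; S_logits: (n, c) raw cluster scores."""
    e = np.exp(S_logits - S_logits.max(axis=1, keepdims=True))
    S = e / e.sum(axis=1, keepdims=True)  # row-wise softmax assignment
    X_pool = S.T @ Z                      # (c, d) cluster embeddings
    A_pool = S.T @ A @ S                  # (c, c) coarsened adjacency
    return A_pool, X_pool

rng = np.random.default_rng(3)
A = np.ones((4, 4)) - np.eye(4)           # toy complete graph on 4 nodes
Z = rng.normal(size=(4, 5))
A_pool, X_pool = diffpool(A, Z, rng.normal(size=(4, 2)))
print(A_pool.shape, X_pool.shape)         # (2, 2) (2, 5)
```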
### Graph attention networks

Attention is used on graphs in three main ways: assigning attention weights to different neighbors when aggregating feature information, ensembling multiple models according to attention weights, and using attention weights to guide random walks.

- Graph attention networks (GAT), multi-head attention weights. Codes here. See the sketch after this list.
- Gaan: Gated attention networks for learning on large and spatiotemporal graphs, also uses multi-head attention, but employs a self-attention mechanism to compute a different weight for each head.
- Graph classification using structural attention, Graph Attention Model (GAM), adaptively visits a sequence of important nodes.
- Watch your step: Learning node embeddings via graph attention, factorizes the co-occurrence matrix with differentiable attention weights.
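As referenced in the GAT entry above, a minimal single-head numpy sketch (the paper concatenates several such heads); splitting the attention vector into a1 and a2 gives the usual logits e_ij = LeakyReLU(a1·Wh_i + a2·Wh_j). The toy inputs are my own.

```python
import numpy as np

def gat_layer(A, H, W, a1, a2, slope=0.2):
    """Single-head GAT layer sketch. A must include self-loops."""
    Wh = H @ W                                   # projected features (n, d')
    E = (Wh @ a1)[:, None] + (Wh @ a2)[None, :]  # logits e_ij, shape (n, n)
    E = np.where(E > 0, E, slope * E)            # LeakyReLU
    E = np.where(A > 0, E, -1e9)                 # keep only real neighbors
    E -= E.max(axis=1, keepdims=True)
    alpha = np.exp(E) / np.exp(E).sum(axis=1, keepdims=True)  # attention
    return np.tanh(alpha @ Wh)                   # weighted neighbor average

rng = np.random.default_rng(4)
A = np.eye(3) + np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]])  # path + loops
H = rng.normal(size=(3, 4))
print(gat_layer(A, H, rng.normal(size=(4, 2)),
                rng.normal(size=2), rng.normal(size=2)))
```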
### Graph Autoencoders

These have to handle the problems caused by the sparsity of the adjacency matrix.

- Deep neural networks for learning graph representations, uses stacked denoising autoencoders to reconstruct the PPMI matrix. Codes here.
- Structural deep network embedding, jointly preserves nodes' first-order proximity (driving representations of adjacent nodes close to each other) and second-order proximity (a node's neighborhood information). Codes here.
- Variational graph auto-encoders, Graph Auto-Encoder (GAE), the first to combine the autoencoder with GCNs. Codes here. See the sketch after this list.
- Mgae: Marginalized graph autoencoder for graph clustering, reconstructs a node's hidden state.
- Adversarially regularized graph autoencoder for graph embedding, uses GANs to regularize the graph autoencoder and recovers the adjacency matrix. Codes here.
- Learning deep network representations with adversarially regularized autoencoders, recovers node sequences rather than the adjacency matrix.
- Deep recursive network embedding with regular equivalence, codes here.
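As referenced in the GAE entry above, a minimal numpy sketch of the non-variational variant: a two-layer GCN encoder followed by an inner-product decoder that recovers edge probabilities (the variational version additionally makes the latent Z Gaussian). The toy inputs are my own.

```python
import numpy as np

def gae_reconstruct(A, X, W0, W1):
    """GCN encoder -> embeddings Z -> decoder sigmoid(Z Z^T) ~ adjacency."""
    A_hat = A + np.eye(A.shape[0])                 # self-loops
    d_inv_sqrt = np.diag(1.0 / np.sqrt(A_hat.sum(axis=1)))
    A_norm = d_inv_sqrt @ A_hat @ d_inv_sqrt       # renormalized adjacency
    Z = A_norm @ np.maximum(A_norm @ X @ W0, 0) @ W1   # two-layer encoder
    return 1.0 / (1.0 + np.exp(-(Z @ Z.T)))        # reconstructed edges

rng = np.random.default_rng(5)
A = np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]], dtype=float)
X = np.eye(3)                                      # featureless: identity
print(gae_reconstruct(A, X, rng.normal(size=(3, 4)), rng.normal(size=(4, 2))))
```

Training would fit the reconstruction against the sparse adjacency matrix, which is exactly where the sparsity problem mentioned above shows up.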
### Graph Generative Networks

Not scalable to large graphs.

- Graphrnn: A deep generative model for graphs, a graph-level RNN plus a node-level RNN; uses breadth-first search (BFS) to order the nodes and a Bernoulli assumption for edge generation. Codes here.
- Learning deep generative models of graphs, utilizes spatial-based GCNs to obtain a hidden representation of an existing graph.
- Molgan: An implicit generative model for small molecular graphs, RL + GAN + GCN.
- Net-gan: Generating graphs via random walks, combines an LSTM with a Wasserstein GAN to generate graphs via a random-walk-based approach. On random walks, see here.
- Constrained generation of semantically valid graphs via regularizing variational autoencoders

### Graph Spatial-Temporal Networks

- Structured sequence modeling with graph convolutional recurrent networks. See the sketch after this list.
- Diffusion convolutional recurrent neural network: Data-driven traffic forecasting, the diffusion runs both forwards and backwards along edges. Codes here.
- Spatio-temporal graph convolutional networks: A deep learning framework for traffic forecasting, codes here.
- Spatial temporal graph convolutional networks for skeleton-based action recognition, extends the temporal flow into graph edges and then assigns a label to each edge. Codes here.
- Structural-rnn: Deep learning on spatio-temporal graphs, aims at predicting node labels at each time step; it has a nodeRNN and an edgeRNN and splits nodes and edges into semantic groups. Codes here.
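To illustrate the common pattern behind these models, a minimal sketch of a graph-convolutional GRU step in the spirit of the GCRN/DCRNN family: each dense matrix product of an ordinary GRU is replaced by a graph convolution (here a plain A_norm @ X @ W; DCRNN uses diffusion convolution instead). All weights and toy inputs are hypothetical.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def gconv(A_norm, X, W):
    """A plain graph convolution standing in for the diffusion convolution."""
    return A_norm @ X @ W

def gcrnn_step(A_norm, X_t, H, Wz, Wr, Wh):
    """One graph-convolutional GRU step over node features X_t and state H."""
    XH = np.concatenate([X_t, H], axis=1)
    Z = sigmoid(gconv(A_norm, XH, Wz))             # update gate
    R = sigmoid(gconv(A_norm, XH, Wr))             # reset gate
    H_cand = np.tanh(gconv(A_norm, np.concatenate([X_t, R * H], axis=1), Wh))
    return (1 - Z) * H + Z * H_cand                # new node states

rng = np.random.default_rng(6)
n, f, d = 3, 2, 4
A_norm = np.eye(n)                    # stand-in for a normalized adjacency
X_t = rng.normal(size=(n, f))         # node features at time t
H = np.zeros((n, d))                  # previous recurrent state
Ws = [rng.normal(size=(f + d, d)) * 0.1 for _ in range(3)]
print(gcrnn_step(A_norm, X_t, H, *Ws))
```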
## Main Methodologies -- Graph Embedding

### Matrix Factorization

- Discrete network embedding
- Binarized attributed network embedding

### Random Walks
- Deepwalk: Online learning of social representations. See the sketch below.
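As referenced in the DeepWalk entry above, a minimal sketch of the first half of the method: truncated uniform random walks forming a "corpus" of node sequences. In DeepWalk these walks are then fed to a skip-gram model (word2vec) to produce node embeddings; that second half is omitted here, and the toy graph is my own.

```python
import random

def random_walks(adj, walk_length=10, walks_per_node=5, seed=0):
    """Generate truncated uniform random walks from every node.
    `adj` maps each node to a list of its neighbors."""
    rng = random.Random(seed)
    walks = []
    for _ in range(walks_per_node):
        for start in adj:
            walk = [start]
            while len(walk) < walk_length:
                nbrs = adj[walk[-1]]
                if not nbrs:
                    break                  # dead end: stop this walk early
                walk.append(rng.choice(nbrs))
            walks.append(walk)
    return walks

adj = {0: [1, 2], 1: [0, 2], 2: [0, 1, 3], 3: [2]}  # toy adjacency list
print(random_walks(adj, walk_length=5, walks_per_node=1))
```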
# Problems
- Does going deeper always work in GNNs?
- How to select representative receptive field for a node?
- How to work on large graphs?
- How to handle dynamic and heterogeneous graph structures?
# Papers

## Introduction

- M. M. Bronstein, J. Bruna, Y. LeCun, A. Szlam, and P. Vandergheynst, "Geometric deep learning: going beyond Euclidean data," IEEE Signal Processing Magazine, vol. 34, no. 4, pp. 18–42, 2017.

## Others
### Estimating Node Importance in Knowledge Graphs Using Graph Neural Networks

GENI, a GNN-based method designed to deal with the distinctive challenges of predicting node importance in KGs.
### Understanding attention in graph neural networks

We aim to better understand attention over nodes in graph neural networks and identify factors influencing its effectiveness. We find that under typical conditions the effect of attention is negligible or even harmful, but under certain conditions it provides an exceptional gain in performance of more than 40% in some of our classification tasks.
### Are Graph Neural Networks Miscalibrated?

Graph Neural Networks (GNNs) have proven to be successful in many classification tasks, outperforming previous state-of-the-art methods in terms of accuracy.
### Graph Convolutional Networks with EigenPooling

To apply graph neural networks to the graph classification task, approaches that generate a graph-level representation from node representations are needed. Experimental results on \(6\) commonly used graph classification benchmarks demonstrate the effectiveness of the proposed framework.
### PAN: Path Integral Based Convolution for Deep Graph Neural Networks

Experimental results show that the path integral based graph neural networks have great learnability and a fast convergence rate, and achieve state-of-the-art performance on benchmark tasks.
### Attacking Graph-based Classification via Manipulating the Graph Structure

We evaluate our attacks and compare them with a recent attack designed for graph neural networks. Results show that our attacks 1) can effectively evade graph-based classification methods; 2) do not require access to the true parameters, true training dataset, and/or complete graph; and 3) outperform the existing attack for evading collective classification methods and some graph neural network methods.
### Deep learning in bioinformatics: introduction, application, and perspective in big data era

After that, we introduce deep learning in an easy-to-understand fashion, from shallow neural networks to legendary convolutional neural networks, legendary recurrent neural networks, graph neural networks, generative adversarial networks, variational autoencoder, and the most recent state-of-the-art architectures.
### Constant Time Graph Neural Networks

Recent advancements in graph neural networks (GNN) have led to state-of-the-art performance in various applications including chemo-informatics, question answering systems, and recommendation systems, to name a few.
### A Comprehensive Survey on Graph Neural Networks

We propose a new taxonomy to divide the state-of-the-art graph neural networks into different categories.
### Graph Transformation Policy Network for Chemical Reaction Prediction

To this end, we propose Graph Transformation Policy Network (GTPN) -- a novel generic method that combines the strengths of graph neural networks and reinforcement learning to learn the reactions directly from data with minimal chemical knowledge. Evaluation results show that GTPN improves the top-1 accuracy over the current state-of-the-art method by about 3% on the large USPTO dataset.
### Contextualized Non-local Neural Networks for Sequence Learning

Recently, a large number of neural mechanisms and models have been proposed for sequence learning, of which self-attention, as exemplified by the Transformer model, and graph neural networks (GNNs) have attracted much attention. Specifically, we propose contextualized non-local neural networks (CN\(^{\textbf{3}}\)), which can both dynamically construct a task-specific structure of a sentence and leverage rich local dependencies within a particular neighborhood.
### Automated Theorem Proving in Intuitionistic Propositional Logic by Deep Reinforcement Learning

Using the large volume of augmented data, we train highly accurate graph neural networks that approximate the value function for the set of the syntactic structures of formulas. Within the specified time limit, our prover solved 84% of the theorems in a benchmark library, while \(\texttt{tauto}\) was able to solve only 52%.
### Pileup mitigation at the Large Hadron Collider with Graph Neural Networks

We present a classifier based on Graph Neural Networks, trained to retain particles coming from high-transverse-momentum collisions, while rejecting those coming from pileup collisions. This model is designed as a refinement of the PUPPI algorithm, employed in many LHC data analyses since 2015.
### Weisfeiler and Leman Go Neural: Higher-order Graph Neural Networks

In recent years, graph neural networks (GNNs) have emerged as a powerful neural architecture to learn vector representations of nodes and graphs in a supervised, end-to-end fashion. The following work investigates GNNs from a theoretical point of view and relates them to the \(1\)-dimensional Weisfeiler-Leman graph isomorphism heuristic (\(1\)-WL). We show that GNNs have the same expressiveness as the \(1\)-WL in terms of distinguishing non-isomorphic (sub-)graphs.
### Multitask Learning on Graph Neural Networks - Learning Multiple Graph Centrality Measures with a Unified Network

Graph neural networks (GNN), consisting of trained neural modules which can be arranged in different topologies at run time, are sound alternatives to tackle relational problems which lend themselves to graph representations. The proposed model achieves \(89\%\) accuracy on a test dataset of random instances with up to 128 vertices and is shown to generalise to larger problem sizes.
### Meta-GNN: On Few-shot Node Classification in Graph Meta-learning

However, there are very few works applying meta-learning to non-Euclidean domains, and the recently proposed graph neural network (GNN) models do not perform effectively on graph few-shot learning problems. Additionally, Meta-GNN is a general model that can be straightforwardly incorporated into any existing state-of-the-art GNN.
### MR-GNN: Multi-Resolution and Dual Graph Neural Network for Predicting Structured Entity Interactions

In recent years, graph neural networks have become attractive. Experiments conducted on real-world datasets show that MR-GNN improves the prediction of state-of-the-art methods.
### Revisiting Graph Neural Networks: All We Have is Low-Pass Filters

In this paper, we develop a theoretical framework based on graph signal processing for analyzing graph neural networks. Our results indicate that graph neural networks only perform low-pass filtering on feature vectors and do not have the non-linear manifold learning property.
### Multi-hop Reading Comprehension across Multiple Documents by Reasoning over Heterogeneous Graphs

We employ Graph Neural Networks (GNN) based message passing algorithms to accumulate evidence on the proposed HDE graph. Evaluated on the blind test set of the Qangaroo WikiHop data set, our HDE graph based model (single model) achieves state-of-the-art results.
### IPC: A Benchmark Data Set for Learning with Graph-Structured Data

The data set, named IPC, consists of two self-contained versions, grounded and lifted, both including graphs of large and skewedly distributed sizes, posing substantial challenges for the computation of graph models such as graph kernels and graph neural networks.
# Datasets

- Citation Networks: Cora, Citeseer, Pubmed, DBLP
- Social Networks: BlogCatalog, Reddit, Epinions
- Chemical/Biological Graphs: NCI-1, NCI-9, MUTAG, D&D, QM9, Tox21, PPI
- Unstructured Graphs: graphs converted from MNIST, Wikipedia, or News Groups
- Others: METR-LA, MovieLens-1M, NELL
Thanks for the links given here.