Graph Convolutional Networks (GCNs) are a powerful approach for learning graph representations and show promising results in various applications. Despite their success, they are usually limited to shallow architectures due to the vanishing gradient, over-smoothing, and over-squashing problems. Since Convolutional Neural Networks benefit tremendously from stacking very deep layers, techniques such as various types of residual connections and dense connections have recently been proposed to tackle these problems and make GCNs deeper. In this work, we further study the problem of designing deep architectures for GCNs. First, we introduce Higher Order Graph Recurrent Networks (HOGRNs), which unify most existing GCN architectures; we then show that ResGCN and DenseGCN are special cases of HOGRNs. To enjoy the benefits of both residual connections and dense connections while compensating for each other's drawbacks, we propose Dual Path Graph Convolutional Networks (DPGCNs), which internally exploit a new topology of connection paths. In DPGCNs, we maintain both a residual path and a densely connected path while learning graph representations. Extensive experiments on OGB datasets demonstrate the superior performance of the proposed DPGCNs over competitive baselines on large-scale node property prediction and graph property prediction tasks.
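As a rough illustration of the dual-path idea described in the abstract (not the authors' released implementation), the sketch below shows one block that maintains both paths: a slice of each layer's output is added to a summed residual path while the remaining slice is concatenated onto a growing dense path, mirroring how Dual Path Networks combine ResNet- and DenseNet-style information flow. The class name `DualPathGCNBlock`, the dimension split, and the use of PyTorch Geometric's `GCNConv` are all assumptions made for the example.

```python
# Hypothetical sketch of one Dual Path GCN block; names and the exact
# split scheme are illustrative assumptions, not the paper's code.
import torch
import torch.nn.functional as F
from torch_geometric.nn import GCNConv


class DualPathGCNBlock(torch.nn.Module):
    """One block feeding both a residual path and a dense path.

    res_dim:   width of the residual path (summed across blocks)
    dense_dim: per-block growth of the dense path (concatenated across blocks)
    """

    def __init__(self, in_dim, res_dim, dense_dim):
        super().__init__()
        # A single graph convolution produces features for both paths.
        self.conv = GCNConv(in_dim, res_dim + dense_dim)
        self.res_dim = res_dim

    def forward(self, x, edge_index, res_state, dense_state):
        h = F.relu(self.conv(x, edge_index))
        # Split the output: one slice updates the residual path by
        # addition, the other extends the dense path by concatenation.
        h_res, h_dense = h[:, :self.res_dim], h[:, self.res_dim:]
        res_state = res_state + h_res                         # residual path
        dense_state = torch.cat([dense_state, h_dense], dim=1)  # dense path
        # The next block sees both paths side by side.
        x = torch.cat([res_state, dense_state], dim=1)
        return x, res_state, dense_state
```

When stacking such blocks, block k would take in_dim = res_dim + k * dense_dim since the dense path widens by dense_dim per block; the residual state can be initialized from a linear projection of the input features and the dense state as an empty tensor of shape (num_nodes, 0), which torch.cat handles directly.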