Dynamic Graph Message Passing Networks
Abstract
Modelling long-range dependencies is critical for scene understanding tasks in computer vision. Although CNNs have excelled in many vision tasks, they are still limited in capturing long-range structured relationships, as they typically consist of layers of local kernels. A fully-connected graph is beneficial for such modelling; however, its computational overhead is prohibitive. We propose a dynamic graph message passing network that significantly reduces the computational complexity compared to related works that model a fully-connected graph. This is achieved by adaptively sampling nodes in the graph, conditioned on the input, for message passing. Based on the sampled nodes, we dynamically predict node-dependent filter weights and the affinity matrix for propagating information between them. Using this model, we show significant improvements over strong, state-of-the-art baselines on three different tasks and backbone architectures. Our approach also outperforms fully-connected graphs while using substantially fewer floating-point operations and parameters.
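The core idea in the abstract — sampling a small, input-conditioned set of nodes per query node, then aggregating messages with dynamically predicted affinities — can be illustrated with a minimal numpy sketch. This is an assumption-laden simplification, not the paper's method: here the "adaptive sampling" is a top-K similarity lookup and the "dynamic filter" is a scalar predicted from the node's own features, whereas the paper learns sampling offsets, full filter weights, and the affinity matrix end-to-end.

```python
import numpy as np

def dynamic_message_passing(features, num_samples=4):
    """One simplified round of dynamic graph message passing.

    features: (N, C) array of node features. For each node we sample a
    small set of other nodes (instead of using a fully-connected graph),
    predict node-dependent affinities from the features, and aggregate
    weighted messages back into the node.
    """
    N, C = features.shape
    updated = np.empty_like(features)
    for i in range(N):
        # Adaptive-sampling stand-in: pick the num_samples nodes most
        # similar to node i (the paper instead learns sampling positions).
        sims = features @ features[i]
        idx = np.argsort(-sims)[:num_samples]
        # Node-dependent affinity: softmax over the sampled similarities.
        a = np.exp(sims[idx] - sims[idx].max())
        a /= a.sum()
        # Dynamic "filter" stand-in: a scalar conditioned on node i's
        # features (the paper predicts full per-node filter weights).
        w = 1.0 + np.tanh(features[i].mean())
        updated[i] = features[i] + w * (a[:, None] * features[idx]).sum(axis=0)
    return updated
```

Because each node attends to only `num_samples` sampled neighbours rather than all N nodes, the per-layer cost drops from O(N²·C) for a fully-connected graph to O(N·K·C), which is the complexity saving the abstract refers to.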