Multi-Scale Frequency Reconstruction for Guided Depth Map Super-Resolution via Deep Residual Network
Abstract
Depth maps obtained by consumer-level sensors are typically noisy and low-resolution (LR). Existing guided depth super-resolution methods based on pre-defined local and global models (e.g., the joint bilateral filter and Markov random fields) perform well in general cases. However, such model-based methods may fail to capture the underlying relationship between RGB-D image pairs. To address this problem, this paper proposes a data-driven approach based on a deep convolutional neural network with global and local residual learning. The network progressively upsamples the LR depth map at multiple scales, guided by the high-resolution intensity image. Global residual learning is adopted to learn the difference between the ground truth and the coarsely upsampled depth map, while local residual learning is introduced in each scale-dependent reconstruction sub-network. This scheme restores the depth structure from coarse to fine via multi-scale frequency synthesis. In addition, batch normalization layers are used to improve depth map denoising. Our method is evaluated in both noise-free and noisy cases, with a comprehensive comparison against 17 state-of-the-art methods. The experimental results show that the proposed method converges faster and achieves improved qualitative and quantitative performance.
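The data flow the abstract describes can be sketched as follows. This is a minimal, hedged illustration only: the helper names (`upsample2x`, `residual_branch`, `guided_sr`) are hypothetical, the learned sub-networks are replaced by a zero-output placeholder, and nearest-neighbor repetition stands in for the paper's learned upsampling, so the sketch shows only the progressive multi-scale structure with local residuals per scale and a global residual added to the coarse interpolation.

```python
import numpy as np

def upsample2x(x):
    # Nearest-neighbor 2x upsampling; a stand-in for a learned
    # deconvolution/upsampling layer in the actual network.
    return x.repeat(2, axis=0).repeat(2, axis=1)

def residual_branch(depth, guide):
    # Placeholder for one scale-dependent reconstruction sub-network.
    # A real model would run convolutions over depth features fused with
    # the intensity guidance; here it simply predicts a zero residual.
    return np.zeros_like(depth)

def guided_sr(lr_depth, hr_intensity, scale=4):
    """Sketch of progressive guided upsampling (assumed x2 stages):
    local residual learning at each scale, then a global residual
    added to the coarsely upsampled depth map."""
    n_stages = int(np.log2(scale))

    # Coarse path: plain interpolation of the LR depth to the target size.
    coarse = lr_depth.astype(float)
    for _ in range(n_stages):
        coarse = upsample2x(coarse)

    # Residual path: built up from coarse to fine across the scales.
    res = np.zeros_like(lr_depth, dtype=float)
    for _ in range(n_stages):
        res = upsample2x(res)
        h, w = res.shape
        guide_s = hr_intensity[:h, :w]      # guidance at matching resolution (assumed aligned)
        res = res + residual_branch(res, guide_s)   # local residual learning

    # Global residual learning: output = coarse upsampling + learned residual.
    return coarse + res
```

With the zero-residual placeholder the output reduces to the coarse interpolation; in the trained network, each `residual_branch` would inject the high-frequency depth detail recovered at its scale.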