RePaint: Inpainting using Denoising Diffusion Probabilistic Models
Abstract
Free-form inpainting is the task of adding new content to an image in the regions specified by an arbitrary binary mask. Most existing approaches train for a certain distribution of masks, which limits their generalization capabilities to unseen mask types. Furthermore, training with pixel-wise and perceptual losses often leads to simple textural extensions towards the missing areas instead of semantically meaningful generation. In this work, we propose RePaint: a Denoising Diffusion Probabilistic Model (DDPM) based inpainting approach that is applicable to even extreme masks. We employ a pretrained unconditional DDPM as the generative prior. To condition the generation process, we only alter the reverse diffusion iterations by sampling the unmasked regions using the given image information. Since this technique does not modify or condition the original DDPM network itself, the model produces high-quality and diverse output images for any inpainting form. We validate our method for both faces and general-purpose image inpainting using standard and extreme masks. RePaint outperforms state-of-the-art autoregressive and GAN approaches for at least five out of six mask distributions. Github Repository: git.io/RePaint
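The conditioning idea described in the abstract, sampling the known (unmasked) region from the given image at each reverse step and the unknown region from the unconditional DDPM, can be sketched as follows. This is a minimal NumPy illustration, not the authors' implementation: `denoise_fn`, `repaint_step`, and the argument names are placeholders I introduce here, and the resampling schedule used in the full method is omitted.

```python
import numpy as np

def repaint_step(x_t, x_known, mask, t, alphas_cumprod, denoise_fn, rng):
    """One RePaint-style reverse diffusion step (illustrative sketch).

    mask == 1 marks known (unmasked) pixels of the original image x_known;
    denoise_fn stands in for one reverse step of a pretrained DDPM.
    """
    # Sample the known region by forward-diffusing the given image to step t-1,
    # using the cumulative noise schedule alpha-bar.
    a = alphas_cumprod[t - 1]
    noise = rng.standard_normal(x_known.shape)
    x_known_t = np.sqrt(a) * x_known + np.sqrt(1.0 - a) * noise

    # Sample the unknown region with one reverse step of the unconditional model.
    x_unknown_t = denoise_fn(x_t, t)

    # Merge: keep the given pixels where the mask is known, generated ones elsewhere.
    return mask * x_known_t + (1.0 - mask) * x_unknown_t
```

Because the blend happens at every timestep rather than once at the end, the generated region is repeatedly harmonized with the known content, which is why the unmodified pretrained network can be conditioned on arbitrary masks.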