Direct Diffusion Bridge for Inverse Problems with Data Consistency


Abstract

Diffusion model-based inverse problem solvers have shown impressive performance, but are limited in speed, mostly because they require reverse diffusion sampling starting from noise. Several recent works have tried to alleviate this problem by building a diffusion process that directly bridges the clean and the corrupted images for specific inverse problems. To provide a coherent explanation of these seemingly different approaches, we first develop a unified framework under the name Direct Diffusion Bridges (DDB), showing that while motivated by different theories, the resulting algorithms differ only in the choice of parameters. Then, we highlight a critical limitation of the current DDB framework, namely that it does not ensure data consistency. To address this problem, we propose a modified inference procedure that imposes data consistency without the need for fine-tuning. We term the resulting method data-Consistent DDB (CDDB), which outperforms its inconsistent counterpart in terms of both perception and distortion metrics, thereby effectively pushing the Pareto frontier toward the optimum. Our proposed method achieves state-of-the-art results on both evaluation criteria, showcasing its superiority over existing methods.
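
To make the data-consistency idea more concrete, the sketch below shows one plausible form of a consistency-guided bridge step in PyTorch. This is not the authors' implementation: `bridge_model`, `A`, `cddb_like_step`, the interpolation rule, and `guidance_scale` are all illustrative assumptions, and the actual DDB/CDDB parameterization and update rule are given in the paper. The sketch only illustrates the general pattern of correcting the intermediate clean estimate with a gradient step on the measurement residual.

```python
import torch

# Hypothetical components (assumptions, not the paper's released code):
# - bridge_model(x_t, t): stand-in for a trained direct-diffusion-bridge network
#   that predicts a clean estimate x0_hat from the current bridge state x_t.
# - A(x): toy forward operator of the inverse problem (here a 2-pixel average).

def A(x):
    """Toy forward operator: simple 2-pixel averaging (for illustration only)."""
    return 0.5 * (x + torch.roll(x, shifts=1, dims=-1))

def bridge_model(x_t, t):
    """Placeholder for a trained bridge network (identity stand-in)."""
    return x_t

def cddb_like_step(x_t, y, t, t_next, guidance_scale=1.0):
    """One schematic bridge step with an added data-consistency correction.

    The correction nudges the clean estimate toward agreement with the
    measurement y via a gradient step on ||y - A(x0_hat)||^2; the exact
    update rule and step size in CDDB may differ from this sketch.
    """
    x_t = x_t.detach().requires_grad_(True)
    x0_hat = bridge_model(x_t, t)                 # predicted clean image
    residual = (y - A(x0_hat)).pow(2).sum()       # data-fidelity loss
    grad = torch.autograd.grad(residual, x_t)[0]  # consistency gradient
    x0_corrected = x0_hat - guidance_scale * grad # consistency-corrected estimate

    # Generic linear interpolation toward the corrected clean estimate
    # (the actual DDB/CDDB transition kernel is parameterized differently).
    alpha = t_next / t
    return (alpha * x_t + (1 - alpha) * x0_corrected).detach()

if __name__ == "__main__":
    y = A(torch.randn(1, 64))   # synthetic measurement
    x = y.clone()               # a DDB starts from the corrupted input, not pure noise
    for t, t_next in zip([1.0, 0.75, 0.5, 0.25], [0.75, 0.5, 0.25, 0.0]):
        x = cddb_like_step(x, y, t, t_next)
    print(x.shape)
```

Note the key design point the sketch tries to convey: the consistency correction is applied purely at inference time, on top of a pre-trained bridge model, which is why no fine-tuning is required.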

Publication
In Advances in Neural Information Processing Systems (NeurIPS) 2023
Hyungjin Chung
Ph.D. student - Generative Models & Inverse Problems

My research interests include, but are not restricted to, developing efficient, modular deep generative models (diffusion models) and solving real-world inverse problems (MRI, tomography, microscopy, phase retrieval, etc.) with deep generative priors.
