Third-year Ph.D. student in the Bio Imaging, Signal Processing & Learning lab (BISPL) at KAIST. Intern at NVIDIA Research. Prior research intern at Google Research Perception (LUMA) and at the Los Alamos National Laboratory (LANL) Applied Mathematics and Plasma Physics group (T-5). Diffusion models and inverse problems enthusiast. Hyungjin Chung has pioneered and advanced some of the most widely recognized work on diffusion model-based inverse problem solvers. Interested in 1) advancing and widening the applicability of diffusion models in inverse imaging, 2) accelerating diffusion models, and 3) applying them to real-world problems (e.g., medical imaging).
Download my CV.
PhD in Bio & Brain Engineering, Current
KAIST
MS in Bio & Brain Engineering, 2021
KAIST
BS in Biomedical Engineering, 2019
Korea University
We show that seemingly different direct diffusion bridges are equivalent, and that we can push the Pareto frontier of the perception-distortion tradeoff with data consistency gradient guidance.
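A rough sketch of how such data-consistency gradient guidance could be layered on top of a bridge sampler; `bridge_step`, the forward operator `A`, and `step_size` are hypothetical placeholders, not the paper's implementation.

```python
import torch

def guided_bridge_step(x_t, y, t, bridge_step, A, step_size=1.0):
    """One direct-diffusion-bridge update followed by a data-consistency correction.

    bridge_step(x_t, t) is assumed to return (x_next, x0_hat), where x0_hat is the
    current estimate of the clean endpoint and is differentiable w.r.t. x_t.
    Both bridge_step and A are placeholders.
    """
    x_t = x_t.detach().requires_grad_(True)
    x_next, x0_hat = bridge_step(x_t, t)

    # Nudge the trajectory toward measurements y = A(x) via the clean-image estimate.
    loss = torch.sum((y - A(x0_hat)) ** 2)
    grad = torch.autograd.grad(loss, x_t)[0]
    return (x_next - step_size * grad).detach()
```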
TPDM improves 3D voxel generative modeling with 2D diffusion models. We show that a 3D generative prior can be accurately represented as the product of two independent 2D diffusion priors, which scales to both unconditional sampling and solving inverse problems.
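Reading "product of priors" literally, the log-densities, and hence the scores, of the two 2D models add; the sketch below combines slice-wise scores from two perpendicular 2D score models under that assumption (the actual method may combine them differently, e.g., by alternating between models). `model_xy` and `model_yz` are placeholders.

```python
import torch

def tpdm_score(volume, t, model_xy, model_yz, weight=0.5):
    """Combine two perpendicular 2D slice-wise scores into an approximate 3D score.

    volume: (D, H, W) tensor; model_xy / model_yz are 2D score models applied
    slice-wise along two different axes (both are placeholders).
    """
    # Axial slices: (D, 1, H, W) -> one 2D score per slice.
    score_xy = model_xy(volume.unsqueeze(1), t).squeeze(1)

    # Perpendicular slices: move W to the batch dimension -> (W, 1, D, H).
    yz = volume.permute(2, 0, 1).unsqueeze(1)
    score_yz = model_yz(yz, t).squeeze(1).permute(1, 2, 0)

    # Product of priors: log-densities add, hence the scores add.
    return weight * score_xy + (1.0 - weight) * score_yz
```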
We propose a method that can solve 3D inverse problems in the medical imaging domain using only a pre-trained 2D diffusion model augmented with a conventional model-based prior.
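A minimal sketch of the model-based half of such a scheme, assuming the model-based prior is a total-variation penalty along the slice axis and using plain gradient descent in place of a dedicated solver; `A`, `lam`, and the iteration counts are illustrative placeholders. In a full pipeline this refinement would be interleaved with slice-wise 2D diffusion denoising.

```python
import torch

def model_based_refine(volume, y, A, lam=0.05, n_iters=10, lr=1e-2):
    """Refine a stack of independently-denoised 2D slices with a 3D model-based prior.

    Minimizes ||A(x) - y||^2 + lam * TV_z(x); the total-variation term along the
    slice axis (z) is what couples the 2D slices into a coherent 3D volume.
    """
    x = volume.clone().detach().requires_grad_(True)
    opt = torch.optim.SGD([x], lr=lr)
    for _ in range(n_iters):
        opt.zero_grad()
        data_fit = torch.sum((A(x) - y) ** 2)
        tv_z = torch.sum(torch.abs(x[1:] - x[:-1]))   # finite differences along z
        (data_fit + lam * tv_z).backward()
        opt.step()
    return x.detach()
```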
Diffusion posterior sampling enables solving arbitrary noisy (e.g. Gaussian, Poisson) inverse problems, both linear and non-linear.
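A minimal PyTorch sketch of a DPS-style reverse step, assuming a DDPM-type noise schedule; `score_model`, the forward operator `A`, and the step size `zeta` are hypothetical placeholders rather than the released implementation.

```python
import torch

def dps_step(x_t, y, t, score_model, A, alpha, alpha_bar, zeta=0.5):
    """One reverse step: unconditional DDPM update + likelihood gradient through x0_hat."""
    x_t = x_t.detach().requires_grad_(True)
    score = score_model(x_t, t)                      # approximates grad_x log p_t(x_t)

    # Tweedie / posterior-mean estimate of the clean image from the noisy sample.
    x0_hat = (x_t + (1.0 - alpha_bar[t]) * score) / alpha_bar[t].sqrt()

    # Ancestral DDPM update (sigma_t^2 = beta_t = 1 - alpha_t for simplicity).
    mean = (x_t + (1.0 - alpha[t]) * score) / alpha[t].sqrt()
    x_prev = mean + (1.0 - alpha[t]).sqrt() * torch.randn_like(x_t)

    # Measurement-consistency gradient, taken w.r.t. x_t but routed through x0_hat,
    # so the same step applies to linear and non-linear A.
    loss = torch.sum((y - A(x0_hat)) ** 2)
    grad = torch.autograd.grad(loss, x_t)[0]
    return (x_prev - zeta * grad).detach()
```

For non-Gaussian noise (e.g., Poisson), the squared-residual term would be swapped for the corresponding log-likelihood; the gradient-through-`x0_hat` structure stays the same.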
Manifold constraint dramatically improves the performance of unsupervised inverse problem solving using diffusion models.
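The key difference from a plain gradient/projection scheme is that the correction gradient is routed through the denoised estimate x0_hat(x_t), which keeps the update close to the learned data manifold. A rough sketch for a linear forward operator; `A`, `A_pinv`, and the step size are placeholders, and the hard projection shown here omits the noise-level matching of `y` used in the full method.

```python
import torch

def mcg_correction(x_prev, x_t, y, t, score_model, A, A_pinv, alpha_bar, step=1.0):
    """Manifold-constrained gradient plus measurement projection (linear A).

    x_prev is the output of an unconditional reverse step; x_t is the sample it
    was computed from. A_pinv is a (pseudo-)inverse of A; all are placeholders.
    """
    x_t = x_t.detach().requires_grad_(True)
    score = score_model(x_t, t)
    x0_hat = (x_t + (1.0 - alpha_bar[t]) * score) / alpha_bar[t].sqrt()

    # Data-fidelity gradient routed through x0_hat(x_t): the manifold-constrained gradient.
    loss = torch.sum((y - A(x0_hat)) ** 2)
    grad = torch.autograd.grad(loss, x_t)[0]
    x_prev = x_prev - step * grad

    # Hard projection onto the measurement-consistent subspace (simplified).
    x_prev = x_prev - A_pinv(A(x_prev)) + A_pinv(y)
    return x_prev.detach()
```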
Come closer to diffuse faster when solving inverse problems with diffusion models. We establish state-of-the-art results with only 20 diffusion steps across various tasks, including SR, inpainting, and CS-MRI.
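The core trick is to start reverse diffusion from a forward-diffused initial estimate at an intermediate time t0 << T rather than from pure noise at T, so only t0 reverse steps are needed. A minimal sketch, assuming a VP/DDPM-style forward process; `x_init`, `alpha_bar`, and `reverse_step` are placeholders.

```python
import torch

def ccdf_initialize(x_init, t0, alpha_bar):
    """Forward-diffuse an initial estimate to time t0 instead of sampling pure noise.

    x_init can be the degraded input or any cheap reconstruction; reverse
    diffusion then starts at t0 << T, so only t0 steps are required.
    """
    noise = torch.randn_like(x_init)
    return alpha_bar[t0].sqrt() * x_init + (1.0 - alpha_bar[t0]).sqrt() * noise

# Shortened sampling loop (reverse_step is whatever conditional sampler is used):
# x = ccdf_initialize(x_init, t0=20, alpha_bar=alpha_bar)
# for t in range(20, 0, -1):
#     x = reverse_step(x, y, t)
```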