Academic paper

An Educated Warm Start for Deep Image Prior-Based Micro CT Reconstruction
Document Type
Periodical
Source
IEEE Transactions on Computational Imaging, vol. 8, pp. 1210-1222, 2022
Subject
Signal Processing and Analysis
Computing and Processing
General Topics for Engineers
Geoscience
Electronics packaging
Image reconstruction
Task analysis
Imaging
Computed tomography
Training
Neural networks
deep image prior
pretraining
Language
English
ISSN
2573-0436
2333-9403
2334-0118
Abstract
Deep image prior (DIP) was recently introduced as an effective unsupervised approach to image restoration tasks. DIP represents the image to be recovered as the output of a deep convolutional neural network, and learns the network's parameters such that the model output matches the corrupted observation. Despite its impressive reconstruction quality, the approach is slow compared to supervised learning-based or traditional reconstruction techniques. To address this computational challenge, we equip DIP with a two-stage learning paradigm: (i) perform supervised pretraining of the network on a simulated dataset; (ii) fine-tune the network's parameters to adapt to the target reconstruction task. We provide a thorough empirical analysis of the impact of pretraining in the context of image reconstruction. We show that pretraining considerably speeds up and stabilizes the subsequent reconstruction from real-measured 2D and 3D micro computed tomography data of biological specimens.
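The two-stage idea in the abstract can be sketched in miniature. The following is an illustrative toy only, not the paper's method: it uses a tiny fully connected network on a 1D signal with an identity forward operator (plain denoising), whereas the paper uses a convolutional network and a CT forward operator. Stage (i) "pretrains" the parameters on a simulated clean signal; stage (ii) fine-tunes the same parameters on the noisy target observation, so the fine-tuning starts from an already-good fit.

```python
import numpy as np

# Toy DIP with a warm start (assumptions: 1D signal, A = identity,
# a two-layer ReLU MLP in place of the paper's convolutional network).
rng = np.random.default_rng(0)
n, d, h = 16, 8, 32   # signal length, latent input size, hidden width

def init_params(rng):
    return {"W1": rng.normal(0.0, 0.1, (h, d)), "b1": np.zeros(h),
            "W2": rng.normal(0.0, 0.1, (n, h)), "b2": np.zeros(n)}

def forward(p, z):
    pre = p["W1"] @ z + p["b1"]
    hid = np.maximum(pre, 0.0)                 # ReLU
    return p["W2"] @ hid + p["b2"], pre, hid

def dip_fit(p, z, y, steps, lr):
    """Gradient descent on the DIP objective ||f_theta(z) - y||^2."""
    losses = []
    for _ in range(steps):
        x, pre, hid = forward(p, z)
        r = x - y
        losses.append(float(r @ r))
        gx = 2.0 * r                           # d loss / d output
        gpre = (p["W2"].T @ gx) * (pre > 0)    # backprop through ReLU
        p["W2"] -= lr * np.outer(gx, hid)      # update after all grads
        p["b2"] -= lr * gx
        p["W1"] -= lr * np.outer(gpre, z)
        p["b1"] -= lr * gpre
    return losses

z = rng.normal(size=d)                         # fixed network input
x_clean = np.sin(np.linspace(0.0, 3.0, n))    # simulated clean signal

# Stage (i): supervised "pretraining" on the simulated signal.
p = init_params(rng)
pretrain_losses = dip_fit(p, z, x_clean, steps=300, lr=0.05)

# Stage (ii): fine-tune the same parameters on the noisy observation.
# The warm start means the initial fine-tuning loss is already small.
y_noisy = x_clean + 0.1 * rng.normal(size=n)
finetune_losses = dip_fit(p, z, y_noisy, steps=100, lr=0.05)
```

Running DIP from scratch on `y_noisy` would instead start from a near-random output; the warm start is what the paper reports as the main source of speed-up and stabilization.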