Abstract #2830

Multiscale Dictionary Learning for MRI

Saiprasad Ravishankar1, Yoram Bresler1

1Department of Electrical & Computer Engineering & the Coordinated Science Laboratory, University of Illinois, Urbana, IL, United States


Compressed Sensing (CS) MRI with non-adaptive sparsifying transforms such as wavelets and finite differences can perform poorly at high undersampling factors. In this work, we introduce an adaptive framework for MR image reconstruction that employs multiscale sparse representations. The multiscale patch-based sparsifying dictionary is learned directly from the undersampled k-space data and is thus adapted to the image being reconstructed. An alternating reconstruction algorithm learns the sparsifying dictionary at several scales and uses it to remove aliasing and noise in one step, and then restores and fills in the sampled k-space data in the other step. Experimental results demonstrate the superior performance of this reconstruction formulation, which exploits image patch sparsity at several scales: the multiscale framework provides highly accurate reconstructions at high undersampling factors. We also compare against reconstructions from previous CS MRI methods that employ non-adaptive dictionaries and demonstrate significant improvements with our approach.
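The alternating scheme described above can be sketched in code. The sketch below is only illustrative and is not the authors' implementation: the abstract does not specify the dictionary-learning algorithm, patch sizes, or sampling pattern, so this version assumes non-overlapping square patches at two hypothetical scales, a simplified MOD-style dictionary update with hard-thresholding sparse coding in place of whatever learning procedure the actual method uses, and full replacement of the measured k-space samples in the data-consistency step. All function names and parameter values are invented for illustration.

```python
import numpy as np

def extract_patches(img, p):
    # Non-overlapping p-by-p patches, flattened into rows (simplifying assumption).
    H, W = img.shape
    return np.array([img[i:i + p, j:j + p].ravel()
                     for i in range(0, H - p + 1, p)
                     for j in range(0, W - p + 1, p)])

def assemble_patches(patches, shape, p):
    # Inverse of extract_patches: tile the denoised patches back into an image.
    out = np.zeros(shape)
    k = 0
    for i in range(0, shape[0] - p + 1, p):
        for j in range(0, shape[1] - p + 1, p):
            out[i:i + p, j:j + p] = patches[k].reshape(p, p)
            k += 1
    return out

def learn_and_denoise(patches, n_atoms, sparsity, iters=10):
    # Toy adaptive step: alternate hard-thresholding sparse coding with a
    # MOD (least-squares) dictionary update. A stand-in for the learning
    # procedure, which the abstract does not detail.
    X = patches.T                                   # columns = patch vectors
    rng = np.random.default_rng(0)
    D = rng.standard_normal((X.shape[0], n_atoms))
    D /= np.linalg.norm(D, axis=0)
    for _ in range(iters):
        C = D.T @ X                                 # correlation-based coding
        drop = np.argsort(np.abs(C), axis=0)[:-sparsity, :]
        np.put_along_axis(C, drop, 0.0, axis=0)     # keep top-`sparsity` atoms
        D = X @ C.T @ np.linalg.pinv(C @ C.T + 1e-8 * np.eye(n_atoms))
        norms = np.linalg.norm(D, axis=0)
        norms[norms == 0] = 1.0
        D /= norms                                  # renormalize atoms
    return (D @ C).T                                # sparse approximations

def reconstruct(kspace, mask, scales=(4, 8), n_iters=5):
    # Alternating reconstruction: (1) multiscale dictionary learning and
    # patch denoising, (2) restore the sampled k-space data.
    img = np.real(np.fft.ifft2(kspace))             # zero-filled initialization
    for _ in range(n_iters):
        est = np.zeros_like(img)
        for p in scales:                            # one dictionary per scale
            patches = extract_patches(img, p)
            den = learn_and_denoise(patches, n_atoms=2 * p * p,
                                    sparsity=max(1, p // 2))
            est += assemble_patches(den, img.shape, p)
        img = est / len(scales)                     # average across scales
        F = np.fft.fft2(img)
        F[mask] = kspace[mask]                      # data-consistency step
        img = np.real(np.fft.ifft2(F))
    return img
```

The key structural point the sketch captures is the alternation: the dictionary is re-learned from the current image estimate at each scale, so the sparse model adapts to the data, and the measured k-space samples are re-imposed after every denoising pass.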