
One-shot Generative Prior in Hankel-k-space for Parallel Imaging Reconstruction


HKGM

Paper: One-shot Generative Prior in Hankel-k-space for Parallel Imaging Reconstruction https://arxiv.org/abs/2208.07181
IEEE Transactions on Medical Imaging https://ieeexplore.ieee.org/document/10158730

Authors: Hong Peng, Chen Jiang, Yu Guan, Jing Cheng, Minghui Zhang, Dong Liang, Senior Member, IEEE, Qiegen Liu, Senior Member, IEEE

Date: August 21, 2022
Version: 1.0
The code and the algorithm are for non-commercial use only.
Copyright 2022, Department of Electronic Information Engineering, Nanchang University.

Magnetic resonance imaging (MRI) is an essential tool for clinical diagnosis, but it suffers from long acquisition times. Deep learning, especially deep generative models, offers aggressive acceleration and better reconstruction in MRI. Nevertheless, learning the data distribution as prior knowledge and reconstructing the image from limited data remain challenging. In this work, we propose a novel Hankel-k-space generative model (HKGM) that can generate samples from a training set as small as a single k-space dataset. At the prior learning stage, we first construct a large Hankel matrix from the k-space data, then extract multiple structured k-space patches from this Hankel matrix to capture the internal distribution among different patches. Extracting patches from a Hankel matrix enables the generative model to be learned from a redundant, low-rank data space. At the iterative reconstruction stage, the desired solution is observed to obey the learned prior knowledge. The intermediate reconstruction is updated by feeding it to the generative model, and the updated result is then alternately refined by imposing a low-rank penalty on its Hankel matrix and a data-consistency constraint on the measurement data. Experimental results confirm that the internal statistics of patches within a single k-space dataset carry enough information to learn a powerful generative model and provide state-of-the-art reconstruction.
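The Hankel construction mentioned above can be illustrated on a toy 1-D k-space line; this is a minimal sketch of the idea (not the paper's exact multi-coil block-Hankel operator), with the function name and window size chosen here for illustration:

```python
import numpy as np

def hankel_from_kspace(kspace, win):
    """Stack sliding windows of a 1-D k-space line into a Hankel matrix.

    Entries along each anti-diagonal repeat (H[i, j] depends only on i + j),
    which is the redundancy and low-rankness HKGM exploits.
    """
    n = kspace.shape[0]
    rows = n - win + 1
    return np.stack([kspace[i:i + win] for i in range(rows)])

k = np.arange(8, dtype=np.complex64)  # toy k-space line
H = hankel_from_kspace(k, win=4)
# H has shape (5, 4) and H[i, j] == k[i + j]
```

Because every anti-diagonal holds a single repeated sample, the matrix is highly structured, so patches drawn from it share a common internal distribution.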

Requirements and Dependencies

python==3.7.11
pytorch==1.7.0
tensorflow==2.4.0
torchvision==0.8.0
tensorboard==2.7.0
scipy==1.7.3
numpy==1.19.5
ninja==1.10.2
matplotlib==3.5.1
jax==0.2.26

Test Demo

python PCsampling_demo_parallel.py

Checkpoints

We provide pretrained checkpoints. You can download the pretrained models from [Google Drive](https://drive.google.com/file/d/1UMULob7RG70X9ChI1UgwHb6Lt3FB6THC/view?usp=sharing) or [Baidu cloud](https://pan.baidu.com/s/1P1h7FEvz9FuH3ZE6NX2WMA?pwd=jlzu).

Graphical representation

Pipeline of the prior learning process and PI reconstruction procedure in HKGM

The training flow chart of HKGM. The training process consists of three steps. First, we construct a large Hankel matrix from the k-space data. Next, we extract many redundant, low-rank patches from it to generate sufficient data samples. Finally, we feed these training patches to the network to capture the internal distribution across patches.
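The patch-extraction step above can be sketched as a sliding window over the large Hankel matrix; this is a hedged NumPy illustration (patch size, stride, and function name are assumptions, not the repository's API):

```python
import numpy as np

def extract_patches(hankel, patch, stride):
    """Harvest overlapping square patches from a Hankel matrix.

    Overlap (stride < patch) multiplies the number of training samples
    obtainable from a single k-space dataset.
    """
    H, W = hankel.shape
    patches = []
    for i in range(0, H - patch + 1, stride):
        for j in range(0, W - patch + 1, stride):
            patches.append(hankel[i:i + patch, j:j + patch])
    return np.stack(patches)

mat = np.random.randn(64, 32)           # stand-in for a large Hankel matrix
batch = extract_patches(mat, patch=16, stride=8)
# batch collects every 16x16 window taken at stride 8
```

The resulting patch batch is what would be fed to the score network as training data.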

The pipeline of the PI reconstruction procedure in HKGM. The iterative reconstruction process consists of three steps. First, we iteratively reconstruct objects from the trained network using a predictor-corrector (PC) sampler on the input k-space data. Next, we construct a Hankel matrix from the network output and apply a low-rank penalty to it. Finally, we enforce data consistency on the k-space data formed by reversing the Hankel construction.
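The low-rank penalty and data-consistency steps in each iteration can be sketched as follows; this is a toy NumPy illustration under assumed shapes (hard singular-value truncation and a boolean sampling mask), not the repository's exact operators:

```python
import numpy as np

def low_rank_project(mat, rank):
    """Impose a low-rank penalty by hard-thresholding singular values."""
    U, s, Vh = np.linalg.svd(mat, full_matrices=False)
    s[rank:] = 0.0                      # keep only the leading singular values
    return (U * s) @ Vh

def data_consistency(kspace, measured, mask):
    """Overwrite sampled k-space locations with the acquired measurements."""
    return np.where(mask, measured, kspace)

# toy usage: truncate a random matrix, then restore sampled entries
A = np.random.randn(8, 6)
A_lr = low_rank_project(A, rank=2)

k_est = np.zeros(4)
k_meas = np.ones(4)
mask = np.array([True, False, True, False])
k_dc = data_consistency(k_est, k_meas, mask)
```

In the full pipeline these two operations alternate with the PC sampler update until the reconstruction converges.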

Acknowledgement

The implementation is based on this repository: https://github.com/yang-song/score_sde_pytorch.

Other Related Projects
