WKGM: weighted k-space generative model for parallel imaging reconstruction

Zongjiang Tu, Die Liu, Xiaoqing Wang, Chen Jiang, Pengwen Zhu, Minghui Zhang, Shanshan Wang, Dong Liang, Qiegen Liu*

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

Abstract

Deep learning based parallel imaging (PI) has made great progress in accelerating MRI in recent years. Nevertheless, existing methods still suffer from limited robustness and flexibility. In this work, we propose a method that explores k-space domain learning via robust generative modeling for flexible calibrationless PI reconstruction, coined the weighted k-space generative model (WKGM). Specifically, WKGM is a generalized k-space domain model in which a k-space weighting technique and a high-dimensional space augmentation design are efficiently incorporated into score-based generative model training, yielding good and robust reconstructions. In addition, WKGM is flexible and can therefore be combined synergistically with various traditional k-space PI models, making full use of the correlation between multi-coil data and realizing calibrationless PI. Even though our model was trained on only 500 images, experimental results with varying sampling patterns and acceleration factors demonstrate that WKGM can attain state-of-the-art reconstruction results with the well-learned k-space generative prior.
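As a rough illustration of the k-space weighting idea described in the abstract, the sketch below applies a radial weight to multi-coil k-space data, which is the kind of preprocessing a score-based prior could be trained on, and then inverts it. The weighting function, its sigma and floor parameters, and the (ncoils, H, W) data layout are assumptions for illustration only, not the exact formulation used in the paper.

# Minimal sketch of k-space weighting for multi-coil data.
# Assumptions (not from the paper): the radial weight
# w(k) = w_min + (1 - w_min) * (1 - exp(-|k|^2 / sigma^2)),
# the parameter values, and the (ncoils, H, W) data layout.
import numpy as np

def kspace_weight(shape, sigma=0.15, w_min=0.05):
    # Normalised spatial frequencies in [-0.5, 0.5) along each axis.
    ky = np.fft.fftshift(np.fft.fftfreq(shape[0]))
    kx = np.fft.fftshift(np.fft.fftfreq(shape[1]))
    KY, KX = np.meshgrid(ky, kx, indexing="ij")
    r2 = KY ** 2 + KX ** 2
    # Small weight at the k-space centre, close to 1 at the periphery,
    # floored at w_min so the weighting stays invertible.
    return w_min + (1.0 - w_min) * (1.0 - np.exp(-r2 / sigma ** 2))

def to_weighted_kspace(coil_images):
    # coil_images: complex array of shape (ncoils, H, W).
    kspace = np.fft.fftshift(np.fft.fft2(coil_images, axes=(-2, -1)), axes=(-2, -1))
    w = kspace_weight(coil_images.shape[-2:])
    return w * kspace, w

def from_weighted_kspace(weighted_kspace, w):
    # Undo the weighting and return the coil images.
    kspace = weighted_kspace / w
    return np.fft.ifft2(np.fft.ifftshift(kspace, axes=(-2, -1)), axes=(-2, -1))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    coils = rng.standard_normal((8, 256, 256)) + 1j * rng.standard_normal((8, 256, 256))
    weighted, w = to_weighted_kspace(coils)   # weighted k-space a generative prior might see
    recovered = from_weighted_kspace(weighted, w)
    print("round-trip error:", np.max(np.abs(recovered - coils)))

Because the weight is floored at w_min, the transform is exactly invertible, so a reconstruction pipeline can move freely between the weighted domain (used by the prior) and the original k-space (used for data consistency).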

Original language: English
Article number: e5005
Journal: NMR in Biomedicine
Volume: 36
Issue number: 11
DOIs
Publication status: Published - Nov 2023

Keywords

  • generative model
  • parallel imaging
  • score-based network
  • weighted k-space domain

ASJC Scopus subject areas

  • Molecular Medicine
  • Radiology, Nuclear Medicine and Imaging
  • Spectroscopy
