Decreasing MRI Scan Times Using Deep Learning with NVIDIA Clara AGX

Originally published at: https://developer.nvidia.com/blog/decreasing-mri-scan-times-using-deep-learning-with-nvidia-clara-agx/

An intern on the NVIDIA Clara AGX team gives an overview of a deep learning method for removing noise and the Gibbs phenomenon in magnetic resonance imaging (MRI). She discusses how this method could allow for reduced MRI scan times.

Read the full article on decreasing the effects of the Gibbs phenomenon in MRI.

Before I came across an article that showed how to compensate for the data-shift effect in the FFT, I was considering using ML to obtain a signed magnitude spectrum from the complex FFT data, for any arbitrary input vector. Have you attempted anything similar?

We haven’t tried that before, but it sounds like an interesting approach.
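The article the commenter refers to isn't identified here, so the following is only a minimal NumPy sketch of the conventional, non-ML pieces mentioned in the question: compensating for the FFT's data shift with `np.fft.fftshift`, and extracting magnitude and phase from the complex spectrum. The test signal `x` and the sign-of-real-part convention for a "signed magnitude" are illustrative assumptions, not the method from the blog post.

```python
import numpy as np

# Hypothetical 1-D input vector; any real-valued signal works the same way.
n = 256
t = np.linspace(0.0, 1.0, n, endpoint=False)
x = np.sin(2 * np.pi * 12 * t) + 0.5 * np.cos(2 * np.pi * 40 * t)

# Complex FFT of the signal.
X = np.fft.fft(x)

# Compensate for the data shift: fftshift moves the zero-frequency bin to
# the center of the spectrum so negative and positive frequencies sit
# symmetrically around it.
X_shifted = np.fft.fftshift(X)
freqs = np.fft.fftshift(np.fft.fftfreq(n, d=t[1] - t[0]))

# Magnitude and phase of the complex spectrum. One possible "signed
# magnitude" convention (an assumption here) attaches the sign of the
# real part to the magnitude.
magnitude = np.abs(X_shifted)
phase = np.angle(X_shifted)
signed_magnitude = np.sign(X_shifted.real) * magnitude
```

An ML model as described in the question would presumably learn to recover the sign (or phase) information that a plain magnitude spectrum discards; the snippet above only shows the deterministic baseline it would be compared against.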