My model has some FFT and IFFT operations interleaved between the layers of my deep learning model. I would like to convert the model to TensorRT for high-throughput inference. I have looked at the supported layers (https://docs.nvidia.com/deeplearning/sdk/tensorrt-support-matrix/index.html) and the support matrix doesn't mention FFT operations. I am new to TensorRT (and to CUDA as well), so I would like to know: do I need to write FFT and IFFT plugins, or can the problem be solved by simply running CuFFT on the output of one network layer and feeding its result to the next layer?
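To make the second option concrete, here is a minimal sketch of the dataflow I have in mind. NumPy's FFT stands in for CuFFT, and `layer1`/`layer2` are placeholder functions standing in for the two TensorRT engines; the real version would keep everything on the GPU and avoid host round-trips:

```python
import numpy as np

def layer1(x):
    # placeholder for the first TensorRT engine (sub-network before the FFT)
    return x * 2.0

def layer2(x):
    # placeholder for the second TensorRT engine (sub-network after the IFFT)
    return x + 1.0

x = np.random.rand(1, 8).astype(np.complex64)

h = layer1(x)                 # run the first sub-network
spec = np.fft.fft(h)          # FFT between the sub-networks (CuFFT in practice)
y = layer2(np.fft.ifft(spec)) # IFFT, then feed the second sub-network
```

The question is whether splitting the model into separate engines and doing the FFT/IFFT myself like this is a reasonable pattern, or whether a plugin inside a single engine is the intended approach.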
Any help is appreciated. Thank you!