Using MATLAB and TensorRT on NVIDIA GPUs

Originally published at: https://developer.nvidia.com/blog/using-matlab-and-tensorrt-on-nvidia-gpus/

As we design deep learning networks, how can we quickly prototype the complete algorithm—including pre- and postprocessing logic around deep neural networks (DNNs)—to get a sense of timing and performance on standalone GPUs? This question comes up frequently from the scientists and engineers I work with. Traditionally, they would hand-translate the complete algorithm…

As far as I know, INT8 inference with TensorRT requires calibrating the network in MATLAB with a dataset, but this step is not carried out in the work described in this article. Could you explain this, please?
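
For context, the kind of GPU Coder INT8 configuration I have in mind looks roughly like the sketch below; the entry-point name, input size, and calibration folder path are placeholders of my own, not something taken from the article:

```matlab
% Rough sketch of an INT8 TensorRT configuration for GPU Coder.
% 'mynet_predict', the 224x224x3 input, and the folder name are placeholders.
cfg = coder.gpuConfig('mex');
cfg.TargetLang = 'C++';
cfg.DeepLearningConfig = coder.DeepLearningConfig('tensorrt');
cfg.DeepLearningConfig.DataType = 'int8';
cfg.DeepLearningConfig.DataPath = 'calibration_images';   % folder of calibration samples
cfg.DeepLearningConfig.NumCalibrationBatches = 50;

codegen -config cfg mynet_predict -args {ones(224,224,3,'single')} -report
```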

Also, I am training a deep CNN on raw time-series data that I have stored on disk as MAT files. I noticed that TensorRT accepts only images for the calibration process. Is there a trick to calibrate my network using the MAT files?
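
One workaround I have considered, sketched below, is to export each MAT-file sample as an image so that an image-based calibration folder can be built from the time-series data; the folder names and the variable name are placeholders for my own data, and I am not sure whether this is the intended approach:

```matlab
% Hypothetical workaround: write each raw time-series window out as a
% grayscale image so an image-based calibration folder can be built.
% Folder names and the 'signal' field are placeholders for my own data.
matFiles = dir(fullfile('timeseries_data', '*.mat'));
outDir   = 'calibration_images';
if ~exist(outDir, 'dir'), mkdir(outDir); end

for k = 1:numel(matFiles)
    s   = load(fullfile(matFiles(k).folder, matFiles(k).name));
    img = mat2gray(s.signal);                % rescale the window to [0,1]
    imwrite(img, fullfile(outDir, sprintf('sample_%04d.png', k)));
end
```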