I have been trying to integrate a TensorRT-based inference module into my system.
When I run inference, I get the following error: 'cuTensor Error in executeCutensor: 18 (Internal cuTensor reformat failed)'.
Has anyone experienced the same or a similar issue?
Can you provide the following information so we can better help?
Provide details on the platforms you are using:
o Linux distro and version
o GPU type
o Nvidia driver version
o CUDA version
o CUDNN version
o Python version [if using python]
o Tensorflow version
o TensorRT version
o If Jetson, OS, hw versions
Also, if possible please share the script and model file to reproduce the issue.
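As a convenience for collecting most of the details above, here is a small, hedged sketch of a Python script (Python 3.6+ compatible) that queries the common version tools. The exact command names (`lsb_release`, `nvidia-smi`, `nvcc`) are standard on Linux installs, but paths and output formats can vary by distro and driver version, so treat the output as a starting point rather than authoritative.

```python
import subprocess
import sys

def run(cmd):
    """Run a command and return its first line of output, or a note if the tool is missing."""
    try:
        out = subprocess.run(cmd, stdout=subprocess.PIPE, stderr=subprocess.STDOUT,
                             universal_newlines=True)
        text = out.stdout.strip()
        return text.splitlines()[0] if text else "(no output)"
    except (FileNotFoundError, OSError):
        return "{}: not found".format(cmd[0])

print("Python:", sys.version.split()[0])
print("Distro:", run(["lsb_release", "-ds"]))
print("GPU/driver:", run(["nvidia-smi", "--query-gpu=name,driver_version",
                          "--format=csv,noheader"]))
print("CUDA (nvcc):", run(["nvcc", "--version"]))

# TensorFlow / TensorRT Python bindings, if installed in this environment
for mod in ("tensorflow", "tensorrt"):
    try:
        print(mod + ":", __import__(mod).__version__)
    except ImportError:
        print(mod + ": not installed")
```

The cuDNN version is usually visible in `/usr/include/cudnn.h` (or `cudnn_version.h` on newer installs) under the `CUDNN_MAJOR`/`CUDNN_MINOR` defines; that path is an assumption and may differ on your system.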
Hi, below is the information:
o Linux distro and version: Ubuntu 16.04 (Xenial)
o GPU type: GeForce GTX 1080 Ti
o Nvidia driver version: 418.87
o CUDA version: 10.1
o CUDNN version: 7
o Python version [if using python]: 3.6.9
o Tensorflow version: 1.14.0 (tensorflow-gpu)
o TensorRT version : 22.214.171.124
o If Jetson, OS, hw versions: not using Jetson hardware
Could you provide minimal steps to reproduce this error?