Description
I am running inference using the API: virtual bool execute(int batchSize, void** bindings) noexcept = 0;
On a few of our systems we get an exception: [hardwareContext.cpp::configure::92] Error Code 1: Cudnn (CUDNN_STATUS_MAPPING_ERROR).
The TRT model was generated on other, identical HW.
I would like to understand what this exception relates to, so I know where to look for its cause.
Can we assume it is a mismatch between this HW and the HW on which the model was generated?
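For context, here is a minimal sketch of the call pattern in question (simplified, not our actual code: the function name, buffer sizes, and error handling are illustrative placeholders; it assumes the engine has already been deserialized and that all buffers live in the same CUDA context):

```cpp
// Sketch of the deprecated implicit-batch IExecutionContext::execute() call
// path in TensorRT 8.0. Buffer sizes are placeholders; in real code they are
// derived from each binding's dimensions and data type.
#include <NvInfer.h>
#include <cuda_runtime_api.h>
#include <iostream>
#include <vector>

bool runInference(nvinfer1::ICudaEngine& engine,
                  nvinfer1::IExecutionContext& context,
                  int batchSize)
{
    // Every binding pointer must be device memory allocated in the SAME CUDA
    // context the engine was deserialized in; a context/thread mismatch here
    // is one commonly reported trigger of CUDNN_STATUS_MAPPING_ERROR.
    std::vector<void*> bindings(engine.getNbBindings(), nullptr);
    for (int i = 0; i < engine.getNbBindings(); ++i)
    {
        size_t bytes = 1 << 20;  // placeholder size
        if (cudaMalloc(&bindings[i], bytes) != cudaSuccess)
        {
            std::cerr << "cudaMalloc failed for binding " << i << "\n";
            return false;
        }
    }

    bool ok = context.execute(batchSize, bindings.data());

    // Surface any pending asynchronous CUDA error.
    cudaError_t err = cudaGetLastError();
    if (err != cudaSuccess)
        std::cerr << "CUDA error after execute: "
                  << cudaGetErrorString(err) << "\n";

    for (void* p : bindings)
        cudaFree(p);
    return ok && err == cudaSuccess;
}
```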
Environment
JetPack 4.6
XavierJCB
Ubuntu 18.04.5 LTS (GNU/Linux 4.9.253-tegra aarch64)
libcudnn_cnn_infer.so.8.2.1
libnvinfer.so.8.0.1
tegra/libcuda.so.1.1
GPU@59C