I have converted my PyTorch model to ONNX and then to TensorRT (with custom plugins).
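For reference, the PyTorch-to-ONNX step looked roughly like this (a minimal sketch only; the tiny Sequential model and the shapes below are placeholders, not my actual SlowFast code):

```python
import torch

# Placeholder network standing in for the real model; the actual export
# involved the full SlowFast graph, whose custom ops are handled later
# by the TensorRT plugins.
model = torch.nn.Sequential(torch.nn.Conv2d(3, 8, 3)).eval()
dummy = torch.randn(1, 3, 224, 224)  # hypothetical input shape

torch.onnx.export(
    model,
    dummy,
    "model.onnx",
    opset_version=11,
    input_names=["input"],
    output_names=["output"],
)
```

The resulting ONNX file was then built into a serialized TRT engine with the custom plugin .so files loaded.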
I am also using a second TRT model (TRT 2) whose output is fed as input to the TRT model above (TRT 1). I am serving both TRT models with Triton Server on a Jetson Nano and sending requests to it from my laptop, but when the response comes back I get the following error in the Jetson Nano terminal:
E0311 13:59:57.029723 29688 logging.cc:43] …/rtSafe/safeContext.cpp (133) - Cudnn Error in configure: 7 (CUDNN_STATUS_MAPPING_ERROR)
E0311 13:59:57.030261 29688 logging.cc:43] FAILED_EXECUTION: std::exception
I'm not sure what is going wrong here or what is causing the issue. Could you please assist me in resolving this error?
Command I used on the Jetson Nano to start serving the two models:

LD_PRELOAD="Einsum_op.so RoI_Align.so libyolo_layer.so" ./Downloads/bin/tritonserver --model-repository=./models --min-supported-compute-capability=5.3 --log-verbose=1
The two models are:
- A YOLO detector (with one custom plugin, libyolo_layer.so).
- SlowFast (with two custom plugins, Einsum_op.so and RoI_Align.so).

The bounding boxes output by the first model (the detector) go in as the input to the second model (SlowFast).
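Client-side, I'm chaining the two models roughly like this (a minimal sketch; the server address, the model names "detector" and "slowfast", and the tensor names/shapes are all placeholders rather than the actual values from my config.pbtxt files):

```python
import numpy as np
import tritonclient.http as httpclient

# Placeholder address for the Jetson Nano running tritonserver.
client = httpclient.InferenceServerClient(url="JETSON_IP:8000")

# 1) Run the detector on a frame to get bounding boxes.
frames = np.random.rand(1, 3, 416, 416).astype(np.float32)  # dummy frame
det_in = httpclient.InferInput("images", list(frames.shape), "FP32")
det_in.set_data_from_numpy(frames)
boxes = client.infer("detector", inputs=[det_in]).as_numpy("boxes")

# 2) Feed the boxes, together with the video clip, into SlowFast.
clip = np.random.rand(1, 3, 32, 256, 256).astype(np.float32)  # dummy clip
sf_inputs = [
    httpclient.InferInput("clips", list(clip.shape), "FP32"),
    httpclient.InferInput("rois", list(boxes.shape), "FP32"),
]
sf_inputs[0].set_data_from_numpy(clip)
sf_inputs[1].set_data_from_numpy(boxes)
result = client.infer("slowfast", inputs=sf_inputs)
print(result.as_numpy("scores"))
```

Both requests go from my laptop to the Jetson Nano over the network.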
Looking forward to your reply.