TensorRT Cask Error in checkCaskExecError<false>


Nvidia card: Jetson AGX Xavier
TensorRT Version:
CUDA Version: 10.2.300
CUDNN Version:
Operating System + Version: Ubuntu 18.04

Do you have any idea how to solve this problem?

Dear @smaiah.sarah,
Was the engine generated on the same machine with the same TensorRT version?

Yes, the engine was generated on the same machine with the same TensorRT version.

Dear @smaiah.sarah,
Could you please share the reproduction steps and relevant files?

I used the scripts in the following repository to convert my YOLO model to TRT:

and the following repository for object detection:

The error is with the following script
trt_yolo_v4.py (5.9 KB)

The previous script calls the following script for the inference
yolo_with_plugins.py (12.3 KB)

Dear @SivaRamaKrishnaNV
Do you have any idea about this error?
Is it a matter of multithreading, i.e. the TRT engine and the inference running in different threads?

Dear @smaiah.sarah,
The error indicates an issue with the CUDA context.
Could you check loading the engine with trtexec, to confirm there is no issue with the engine itself?
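For reference, the engine-load check can be done like this (the engine filename here is a placeholder; substitute your own serialized engine file):

```shell
# Deserialize and run the engine standalone; --verbose prints layer/plugin details,
# which helps confirm whether the engine itself loads cleanly.
/usr/src/tensorrt/bin/trtexec --loadEngine=yolov4.engine --verbose
```

If the engine uses custom plugins (as YOLO engines typically do), trtexec must also be able to find the plugin library, otherwise deserialization will fail.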

Dear @SivaRamaKrishnaNV
I tried to launch trtexec and I got the following error:

Dear @smaiah.sarah,
The error in trtexec indicates the engine contains unsupported layers that are implemented as plugins.
Are you running it as a ROS node?
I would recommend taking the TRT model preparation/inference code from the repo and testing it on the target, to see if it works correctly without any issues.
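On the multithreading question raised above: if inference runs in a worker thread, the CUDA context created in the main thread must be made current in that thread before TensorRT execution, and popped again before the thread exits. A minimal sketch, assuming pycuda is available (the worker body and names are placeholders, not the thread's actual code):

```python
# Sketch: sharing one CUDA context between the main thread and a worker thread.
# Assumes pycuda; guarded so the file still runs where pycuda is absent.
import threading

try:
    import pycuda.driver as cuda
    cuda.init()
    HAVE_CUDA = True
except ImportError:
    HAVE_CUDA = False


def worker(ctx):
    ctx.push()       # make the shared context current in THIS thread
    try:
        pass         # run TensorRT inference here (placeholder)
    finally:
        ctx.pop()    # detach from this thread before it exits


if HAVE_CUDA:
    ctx = cuda.Device(0).make_context()  # created (and made current) in main thread
    ctx.pop()                            # release it so the worker can push it
    t = threading.Thread(target=worker, args=(ctx,))
    t.start()
    t.join()
    ctx.detach()                         # destroy the context when done
```

Skipping the push/pop in the worker is a common cause of "invalid CUDA context" style failures when a script that worked single-threaded is moved into a threaded (e.g. ROS node) setup.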

This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.