TensorRT Cask Error in checkCaskExecError<false>: 7 (Cask Convolution execution)

I wrote a GStreamer plugin using the cv::cuda::remap function, and the TensorRT model reports the following error during inference:

[TensorRT] ERROR: …/rtSafe/cuda/caskConvolutionRunner.cpp (245) - Cask Error in checkCaskExecError: 7 (Cask Convolution execution)
[TensorRT] ERROR: FAILED_EXECUTION: std::exception

If I use cv::remap instead, the errors are gone and everything works fine.

How can I fix this?
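For context, here is a minimal sketch of the two preprocessing variants being compared. The map names and surrounding plumbing are placeholders, not the actual plugin code:

```cpp
#include <opencv2/opencv.hpp>
#include <opencv2/cudawarping.hpp>

// CPU path: this variant works fine with the TensorRT model.
cv::Mat remapCpu(const cv::Mat& frame, const cv::Mat& map_x, const cv::Mat& map_y)
{
    cv::Mat out;
    cv::remap(frame, out, map_x, map_y, cv::INTER_LINEAR);
    return out;
}

// GPU path: this variant is followed by the Cask Convolution error at inference time.
cv::Mat remapGpu(const cv::Mat& frame, const cv::cuda::GpuMat& d_map_x,
                 const cv::cuda::GpuMat& d_map_y)
{
    cv::cuda::GpuMat d_frame(frame), d_out;       // upload to device
    cv::cuda::remap(d_frame, d_out, d_map_x, d_map_y, cv::INTER_LINEAR);
    cv::Mat out;
    d_out.download(out);                          // back to host
    return out;
}
```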

Note:

  • Board:
    • Jetpack: 4.3 [L4T 32.3.1]
    • Type: NANO/TX1
    • Name: NVIDIA Jetson NANO/TX1
    • GPU-Arch: 5.3
  • Libraries:
    • cuDNN: 7.6.3.28-1+cuda10.0
    • VisionWorks: 1.6.0.500n
    • OpenCV: 4.2.0 compiled CUDA: YES
    • CUDA: 10.0.326
    • TensorRT: 6.0.1.10-1+cuda10.0

Hi,

The main difference is that cv::cuda::remap is a GPU function while cv::remap is the CPU version.

Whether this causes a problem depends on how you implement the inference.
For example, if there is a host-to-device memory copy between OpenCV and TensorRT, please use cv::remap, since that copy expects a CPU buffer.
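To make the buffer ownership explicit, here is a rough sketch of the two consistent hand-off patterns. The names (d_input, feedFromCpu, feedFromGpu) are illustrative placeholders, assuming the engine's input binding is a raw device pointer:

```cpp
#include <opencv2/opencv.hpp>
#include <opencv2/cudawarping.hpp>
#include <cuda_runtime_api.h>

// CPU path: cv::remap leaves the data in host memory,
// so a HostToDevice copy into the TensorRT binding is correct.
void feedFromCpu(const cv::Mat& preprocessed, void* d_input)
{
    size_t bytes = preprocessed.total() * preprocessed.elemSize();
    cudaMemcpy(d_input, preprocessed.ptr(), bytes, cudaMemcpyHostToDevice);
}

// GPU path: cv::cuda::remap already leaves the data on the device,
// so the copy must be DeviceToDevice (and the GpuMat rows may be padded,
// hence cudaMemcpy2D with the GpuMat step as the source pitch).
void feedFromGpu(const cv::cuda::GpuMat& d_preprocessed, void* d_input)
{
    size_t rowBytes = d_preprocessed.cols * d_preprocessed.elemSize();
    cudaMemcpy2D(d_input, rowBytes,
                 d_preprocessed.ptr(), d_preprocessed.step,
                 rowBytes, d_preprocessed.rows,
                 cudaMemcpyDeviceToDevice);
}
```

Mixing the two, e.g. doing a HostToDevice copy from a buffer that is already on the GPU, can lead to exactly this kind of execution failure inside TensorRT.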

Thanks.

I solved this problem by initializing and using the GStreamer pipeline and TensorRT in separate threads.
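For reference, a minimal sketch of that kind of separation, assuming a simple frame queue between the two threads (Frame, initTensorRT and runInference are hypothetical names, not from the original plugin):

```cpp
#include <thread>
#include <mutex>
#include <condition_variable>
#include <queue>

struct Frame { /* preprocessed image data */ };

std::queue<Frame> g_frames;
std::mutex g_mutex;
std::condition_variable g_cv;
bool g_stop = false;

// All TensorRT setup and inference happen in this one worker thread,
// while the GStreamer pipeline runs elsewhere and only pushes frames.
void inferenceWorker()
{
    // Create the TensorRT runtime, engine and execution context here,
    // in the same thread that will later call execute()/enqueue().
    // initTensorRT();
    while (true) {
        std::unique_lock<std::mutex> lock(g_mutex);
        g_cv.wait(lock, [] { return g_stop || !g_frames.empty(); });
        if (g_stop && g_frames.empty()) break;
        Frame f = std::move(g_frames.front());
        g_frames.pop();
        lock.unlock();
        // runInference(f);   // TensorRT is only ever touched from this thread
    }
}

// Called from the GStreamer buffer/pad callback: hand the frame over and return.
void onNewFrame(Frame f)
{
    {
        std::lock_guard<std::mutex> lock(g_mutex);
        g_frames.push(std::move(f));
    }
    g_cv.notify_one();
}
```

The worker is started once with std::thread(inferenceWorker) before the pipeline goes to PLAYING, so the TensorRT context is created and used in a single thread separate from the GStreamer one.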