[TensorRT] ERROR: ../rtExt/cuda/cudaTiledPoolingRunner.cpp (117) - Cuda Error in execute: 719 (unspecified launch failure)

I am facing a very strange error and cannot determine where it originates.

The system:

  • Device: Nvidia Jetson Nano 4GB
  • Image: JetPack 4.5 with JetBot 0.4.3
  • Programming language: Python
  • Python libraries (installed globally):
    torch @ file:///home/jetbot/torch-1.7.0a0-cp36-cp36m-linux_aarch64.whl

I am new to CUDA and am learning how to build a self-driving AI. I started off with this project: https://github.com/gsurma/jetson. Running that example works fine with no errors. I then started my own version of it: I created a new project, copied the files over, and began modifying them. In my new project directory I get the error

[TensorRT] ERROR: ../rtExt/cuda/cudaTiledPoolingRunner.cpp (117) - Cuda Error in execute: 719 (unspecified launch failure)
[TensorRT] ERROR: FAILED_EXECUTION: std::exception
RuntimeError: CUDA error: unspecified launch failure

sometimes I also get

[TensorRT] ERROR: ../rtSafe/runnerUtils.cpp (442) - Cudnn Error in safeCudnnAddTensor: 8 (CUDNN_STATUS_EXECUTION_FAILED)
[TensorRT] ERROR: FAILED_EXECUTION: std::exception

which is caused by the line

output = self.model_trt(preprocessed_frame).detach().clamp(-1.0, 1.0).cpu().numpy().flatten()

in the file



I also wanted to create a clean project structure with sub-directories, which looks like this:

     – myproject
       – programs

Inside “myproject” are some Python scripts which load programs placed in the “programs” folder.
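For context, the loading step described above can be sketched like this; the helper name, folder layout, and module names are my own placeholders, not taken from the actual project:

```python
import importlib.util
from pathlib import Path


def load_program(programs_dir, name):
    """Load a Python module by file name from a 'programs' sub-directory."""
    path = Path(programs_dir) / f"{name}.py"
    spec = importlib.util.spec_from_file_location(name, path)
    module = importlib.util.module_from_spec(spec)
    spec.loader.exec_module(module)  # actually runs the module's code
    return module
```

A script in “myproject” could then call something like `load_program("programs", "follow_road")` to pull in one of the program files.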

If I create this structure in the project https://github.com/gsurma/jetson the same error occurs.

I have read many forum posts and GitHub issues but have not been able to solve the problem.

Can anyone help me or point me in the right direction?

Thanks in advance!

Never mind, I found the problem. I had added torch.device calls in some classes to check whether everything was running on the GPU rather than the CPU, and that broke everything. After deleting them it worked fine.
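For anyone hitting the same thing: the pitfall was sprinkling device handling through several classes, so model and input could end up inconsistent. A minimal sketch of the safer pattern, picking the device once up front with a CPU fallback; the model and tensor here are stand-ins, not the project's actual TensorRT model:

```python
import torch

# Choose the device once, at startup, instead of inside individual classes.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

model = torch.nn.Linear(4, 2).to(device)   # stand-in for model_trt
frame = torch.randn(1, 4, device=device)   # stand-in for preprocessed_frame

with torch.no_grad():
    # Mirrors the failing line: run the model, clamp, move to CPU, flatten.
    output = model(frame).clamp(-1.0, 1.0).cpu().numpy().flatten()

print(output.shape)  # (2,)
```

The point is that both the model and the frame live on the same, single device object, so there is nothing to get out of sync when the code is reorganised into sub-directories.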
