CUDA Error: TensorRT samples in Docker Environment

Dear Team,
Software Version
DRIVE OS 6.0.4 SDK

Target Operating System
QNX

Host Machine Version
native Ubuntu Linux 20.04 Host installed with DRIVE OS Docker Containers

I have set up the Docker image “drive-agx-orin-linux-aarch64-sdk-build-x86:latest” on Ubuntu 20.04.

I built the TensorRT samples successfully.

While running my ONNX model with trtexec, I am facing the errors below:

[10/17/2022-08:51:45] [W] [TRT] Caught cuda error during timing, format rejected.
[10/17/2022-08:51:45] [W] [TRT] Caught cuda error during timing, format rejected.
[10/17/2022-08:51:45] [W] [TRT] Caught cuda error during timing, format rejected.
[10/17/2022-08:51:45] [W] [TRT] GPU error during getBestTactic: 265 copy : no kernel image is available for execution on the device
[10/17/2022-08:51:45] [W] [TRT] GPU error during getBestTactic: 265 copy : no kernel image is available for execution on the device
[10/17/2022-08:51:45] [E] Error[2]: [optimizer.cpp::computeCosts::3884] Error Code 2: Internal Error (Impossible to reformat.)
[10/17/2022-08:51:45] [E] Error[2]: [builder.cpp::buildSerializedNetwork::636] Error Code 2: Internal Error (Assertion engine != nullptr failed. )
[10/17/2022-08:51:45] [E] Engine could not be created from network
[10/17/2022-08:51:45] [E] Building engine failed
[10/17/2022-08:51:45] [E] Failed to create engine from model or file.
[10/17/2022-08:51:45] [E] Engine set up failed
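
For reference, the command was of roughly this form (the trtexec location and flags here are illustrative, not the exact invocation; the model is the attached mobilenetv2-7.onnx):

./trtexec --onnx=mobilenetv2-7.onnx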

Can you please help?

The commands used to pull and run the Docker image are as follows:

sudo docker pull nvcr.io/drive/driveos-sdk/drive-agx-orin-linux-aarch64-sdk-build-x86:latest

sudo docker run -it --gpus=all --privileged --net=host --ulimit memlock=-1 --ipc=host -v /dev/bus/usb:/dev/bus/usb -v $PWD:/DRV3 nvcr.io/drive/driveos-sdk/drive-agx-orin-linux-aarch64-sdk-build-x86:latest
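
As a sanity check inside the container (an illustrative step; the exact output depends on the driver), the GPU can be verified before building anything:

nvidia-smi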

Thanks and Regards,
Vyom Mishra

Dear @vyom.mishra,
Could you share the complete log and model file so we can reproduce the issue on our end?

Dear @SivaRamaKrishnaNV

Please find the attached model and log as requested.

console.txt (8.6 KB)
mobilenetv2-7.onnx (13.3 MB)

Thanks and Regards,
Vyom Mishra

Dear @vyom.mishra,
Just want to confirm: did you run trtexec on the target or on the host? Also, please share the compilation steps used in the Docker container.

Dear @SivaRamaKrishnaNV

I am running it in the host environment.

The commands used to pull and run the Docker image are as follows:

sudo docker pull nvcr.io/drive/driveos-sdk/drive-agx-orin-linux-aarch64-sdk-build-x86:latest

sudo docker run -it --gpus=all --privileged --net=host --ulimit memlock=-1 --ipc=host -v /dev/bus/usb:/dev/bus/usb -v $PWD:/DRV3 nvcr.io/drive/driveos-sdk/drive-agx-orin-linux-aarch64-sdk-build-x86:latest
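
The TensorRT samples were built with the standard Makefile flow, roughly as follows (the /usr/src/tensorrt path is an assumption about the container layout; adjust if your install differs):

cd /usr/src/tensorrt/samples
make -j$(nproc)
# trtexec is then available under /usr/src/tensorrt/bin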

Thanks and Regards,
Vyom Mishra

Does that mean you ran trtexec from the Docker container on the host? Were you able to run any other models?
May I know which GPU you have on the host?
Please share the complete trtexec log.

Dear Sir,

This experiment is in the Docker environment on the host side.
I am not able to run any model.
Please check my reply of Oct 19; the complete log ("console.txt") is attached there.

My laptop (host) has an NVIDIA P2000 GPU.

Kindly do the needful.

Thanks and Regards,
Vyom Mishra

Dear @vyom.mishra,
You need an Ampere-based GPU to use DRIVE OS on the host. The P2000 is a Pascal-generation GPU, which is why trtexec reports "no kernel image is available for execution on the device".
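
A quick way to confirm this (a minimal check, assuming a recent driver whose nvidia-smi supports the compute_cap query field) is:

nvidia-smi --query-gpu=name,compute_cap --format=csv
# Ampere GPUs report compute capability 8.x; the P2000 (Pascal) reports 6.1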

@vyom.mishra please read the private message from me. Thanks.

This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.