Error doing inference with exported YOLO engine

Description

After exporting the trained model and converting it to an FP16 TensorRT engine, I get the error below when I try to run inference. No errors are reported during export or conversion. I'm using the sample Jupyter notebook for yolo_v4.

Environment

We are using this container: nvcr.io/nvidia/tlt-streamanalytics:v3.0-py3
GPU Type: Tesla T4
Nvidia Driver Version: 465.19.01
CUDA Version: 11.3

Command:
!tlt yolo_v4 inference -m $USER_EXPERIMENT_DIR/export_fp16/trt.engine \
                       -e $SPECS_DIR/yolo_v4_train_resnet18_kitti.txt \
                       -i $DATA_DOWNLOAD_DIR/testing/images \
                       -o $USER_EXPERIMENT_DIR/yolo_infer_images \
                       -k $KEY

Output:
2021-07-30 15:52:45,421 [WARNING] tlt.components.docker_handler.docker_handler:
Docker will run the commands as root. If you would like to retain your
local host permissions, please add the "user":"UID:GID" in the
DockerOptions portion of the ~/.tlt_mounts.json file. You can obtain your
users UID and GID by using the "id -u" and "id -g" commands on the
terminal.
Using TensorFlow backend.
Using TensorFlow backend.
WARNING:tensorflow:Deprecation warnings have been disabled. Set TF_ENABLE_DEPRECATION_WARNINGS=1 to re-enable them.
WARNING:tensorflow:From /usr/local/lib/python3.6/dist-packages/keras/backend/tensorflow_backend.py:95: The name tf.reset_default_graph is deprecated. Please use tf.compat.v1.reset_default_graph instead.

2021-07-30 15:52:53,820 [WARNING] tensorflow: From /usr/local/lib/python3.6/dist-packages/keras/backend/tensorflow_backend.py:95: The name tf.reset_default_graph is deprecated. Please use tf.compat.v1.reset_default_graph instead.

WARNING:tensorflow:From /usr/local/lib/python3.6/dist-packages/keras/backend/tensorflow_backend.py:98: The name tf.placeholder_with_default is deprecated. Please use tf.compat.v1.placeholder_with_default instead.

2021-07-30 15:52:53,820 [WARNING] tensorflow: From /usr/local/lib/python3.6/dist-packages/keras/backend/tensorflow_backend.py:98: The name tf.placeholder_with_default is deprecated. Please use tf.compat.v1.placeholder_with_default instead.

WARNING:tensorflow:From /usr/local/lib/python3.6/dist-packages/keras/backend/tensorflow_backend.py:102: The name tf.get_default_graph is deprecated. Please use tf.compat.v1.get_default_graph instead.

2021-07-30 15:52:53,824 [WARNING] tensorflow: From /usr/local/lib/python3.6/dist-packages/keras/backend/tensorflow_backend.py:102: The name tf.get_default_graph is deprecated. Please use tf.compat.v1.get_default_graph instead.

[TensorRT] ERROR: Parameter check failed at: engine.cpp::setBindingDimensions::1136, condition: profileMaxDims.d[i] >= dimensions.d[i]
Traceback (most recent call last):
File "/opt/tlt/.cache/dazel/_dazel_tlt/2b81a5aac84a1d3b7a324f2a7a6f400b/execroot/ai_infra/bazel-out/k8-fastbuild/bin/magnet/packages/iva/build_wheel.runfiles/ai_infra/iva/yolo_v4/scripts/inference.py", line 177, in <module>
File "/opt/tlt/.cache/dazel/_dazel_tlt/2b81a5aac84a1d3b7a324f2a7a6f400b/execroot/ai_infra/bazel-out/k8-fastbuild/bin/magnet/packages/iva/build_wheel.runfiles/ai_infra/iva/common/utils.py", line 494, in return_func
File "/opt/tlt/.cache/dazel/_dazel_tlt/2b81a5aac84a1d3b7a324f2a7a6f400b/execroot/ai_infra/bazel-out/k8-fastbuild/bin/magnet/packages/iva/build_wheel.runfiles/ai_infra/iva/common/utils.py", line 482, in return_func
File "/opt/tlt/.cache/dazel/_dazel_tlt/2b81a5aac84a1d3b7a324f2a7a6f400b/execroot/ai_infra/bazel-out/k8-fastbuild/bin/magnet/packages/iva/build_wheel.runfiles/ai_infra/iva/yolo_v4/scripts/inference.py", line 173, in main
File "/opt/tlt/.cache/dazel/_dazel_tlt/2b81a5aac84a1d3b7a324f2a7a6f400b/execroot/ai_infra/bazel-out/k8-fastbuild/bin/magnet/packages/iva/build_wheel.runfiles/ai_infra/iva/yolo_v4/scripts/inference.py", line 160, in inference
File "/opt/tlt/.cache/dazel/_dazel_tlt/2b81a5aac84a1d3b7a324f2a7a6f400b/execroot/ai_infra/bazel-out/k8-fastbuild/bin/magnet/packages/iva/build_wheel.runfiles/ai_infra/iva/common/inferencer/inferencer.py", line 42, in __init__
File "/opt/tlt/.cache/dazel/_dazel_tlt/2b81a5aac84a1d3b7a324f2a7a6f400b/execroot/ai_infra/bazel-out/k8-fastbuild/bin/magnet/packages/iva/build_wheel.runfiles/ai_infra/iva/common/inferencer/trt_inferencer.py", line 57, in __init__
File "/opt/tlt/.cache/dazel/_dazel_tlt/2b81a5aac84a1d3b7a324f2a7a6f400b/execroot/ai_infra/bazel-out/k8-fastbuild/bin/magnet/packages/iva/build_wheel.runfiles/ai_infra/iva/common/inferencer/engine.py", line 99, in allocate_buffers
pycuda._driver.MemoryError: cuMemHostAlloc failed: out of memory
Exception ignored in: <bound method TRTInferencer.__del__ of <iva.common.inferencer.trt_inferencer.TRTInferencer object at 0x7f7f6475c828>>
Traceback (most recent call last):
File "/opt/tlt/.cache/dazel/_dazel_tlt/2b81a5aac84a1d3b7a324f2a7a6f400b/execroot/ai_infra/bazel-out/k8-fastbuild/bin/magnet/packages/iva/build_wheel.runfiles/ai_infra/iva/common/inferencer/trt_inferencer.py", line 139, in __del__
File "/opt/tlt/.cache/dazel/_dazel_tlt/2b81a5aac84a1d3b7a324f2a7a6f400b/execroot/ai_infra/bazel-out/k8-fastbuild/bin/magnet/packages/iva/build_wheel.runfiles/ai_infra/iva/common/inferencer/trt_inferencer.py", line 102, in clear_trt_session
AttributeError: 'TRTInferencer' object has no attribute 'stream'
2021-07-30 15:53:11,349 [INFO] tlt.components.docker_handler.docker_handler: Stopping container.
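
The `[TensorRT] ERROR` line about `setBindingDimensions` suggests the input dimensions requested at inference time (batch size included) exceed the maximum set in the engine's optimization profile, and the later `cuMemHostAlloc` failure follows from that. A sketch of one way to inspect the engine's bindings and profile, assuming `trtexec` is available inside the container (flag names can vary across TensorRT versions):

```shell
# Deserialize the engine and print its binding/profile information.
# $USER_EXPERIMENT_DIR must point at the same path used for export.
trtexec --loadEngine=$USER_EXPERIMENT_DIR/export_fp16/trt.engine \
        --verbose 2>&1 | grep -i -E "binding|profile|dims"
```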

Hi,
We recommend raising this query in the TLT forum for better assistance.

Thanks!


How did you generate $USER_EXPERIMENT_DIR/export_fp16/trt.engine ?
Could you please share the full command and full log?

There were some problems in the way we converted the model to a .engine file. Problem solved!
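
In case it helps others, a sketch of the conversion step with an explicit optimization profile via `-p`. The model filename, input shape (3x384x1248), and the min/opt/max batch sizes below are assumptions for illustration; match them to your own export and training spec:

```shell
# Sketch: convert the exported .etlt to an FP16 engine with an explicit
# optimization profile so the inference batch size fits within profile max.
# "yolov4_resnet18.etlt" is a hypothetical filename -- use your exported model.
!tlt converter -k $KEY \
               -p Input,1x3x384x1248,8x3x384x1248,16x3x384x1248 \
               -t fp16 \
               -e $USER_EXPERIMENT_DIR/export_fp16/trt.engine \
               $USER_EXPERIMENT_DIR/export_fp16/yolov4_resnet18.etlt
```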

Thanks!

This topic was automatically closed 60 days after the last reply. New replies are no longer allowed.