Error in importing tensorflow in l4t-tensorflow:r35.3.1-tf2.11-py3 container

I’m running the l4t-tensorflow:r35.3.1-tf2.11-py3 container. Inside the container, whenever I try to import TensorFlow, I get this error:
W tensorflow/compiler/xla/stream_executor/platform/default/dso_loader.cc:65] Could not load dynamic library 'libnvinfer.so.8'; dlerror: /usr/lib/aarch64-linux-gnu/tegra/libnvdla_compiler.so: file too short; LD_LIBRARY_PATH: /usr/lib/aarch64-linux-gnu/tegra:/usr/lib/aarch64-linux-gnu:/usr/local/cuda/lib64
2024-10-03 14:52:34.404992: W tensorflow/compiler/xla/stream_executor/platform/default/dso_loader.cc:65] Could not load dynamic library 'libnvinfer_plugin.so.8'; dlerror: /usr/lib/aarch64-linux-gnu/tegra/libnvdla_compiler.so: file too short; LD_LIBRARY_PATH: /usr/lib/aarch64-linux-gnu/tegra:/usr/lib/aarch64-linux-gnu:/usr/local/cuda/lib64
2024-10-03 14:52:34.405050: W tensorflow/compiler/tf2tensorrt/utils/py_utils.cc:38] TF-TRT Warning: Cannot dlopen some TensorRT libraries. If you would like to use Nvidia GPU with TensorRT, please make sure the missing libraries mentioned above are installed properly.

Hi,

Did you set up the device with JetPack 6?
Please note that you will need to use a container with the same BSP version as the device.

Thanks.

Yes, I’ve set it up with JetPack 6. How do I use a container with the same BSP version? I’m using the latest release of the l4t-tensorflow image.

Hi,

The BSP version in JetPack 6 is r36.

From JetPack 6 onward, please use the TensorFlow container instead of l4t-tensorflow.
For example: nvcr.io/nvidia/tensorflow:24.09-tf2-py3-igpu
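To illustrate the version check described above, here is a minimal sketch: it reads the host's BSP release string (the line in /etc/nv_tegra_release, which on JetPack 6 starts with "# R36"), extracts the major version, and shows the matching container command. The fallback string and the exact container tag are assumptions based on this thread, not an official procedure.

```shell
# Read the host BSP (L4T) release; the container's r-tag must match its major version.
# On a JetPack 6 device this file contains a line like: # R36 (release), REVISION: 3.0, ...
# The fallback string below is only so the snippet runs on non-Jetson machines.
BSP_LINE=$(cat /etc/nv_tegra_release 2>/dev/null || echo "# R36 (release), REVISION: 3.0")
BSP_MAJOR=$(echo "$BSP_LINE" | sed -n 's/^# R\([0-9]*\).*/\1/p')
echo "Host BSP major version: r$BSP_MAJOR"

# On r36 (JetPack 6), pull the NGC TensorFlow container instead of l4t-tensorflow,
# e.g. (run on the Jetson itself):
#   docker run -it --rm --runtime nvidia nvcr.io/nvidia/tensorflow:24.09-tf2-py3-igpu
# An r35.x l4t-tensorflow image on an r36 host fails to load the Tegra libraries,
# which is what produces the "file too short" dlerror above.
```

The key point is simply that the number after the `r` in the container tag must equal the `R` number reported by the host.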

Thanks.
