Compatibility of NGC docker images for TX2

Hi Everyone,
We are trying to install a Tensorflow container from NGC on the TX2.
Specifically, the image in question is:

nvcr.io/nvidia/tensorflow:19.09-py2

This fails with the following error:

nvidia-docker run --rm -ti nvcr.io/nvidia/tensorflow:19.10-py2 /bin/bash
docker: Error response from daemon: OCI runtime create failed: container_linux.go:345: starting container process caused "process_linux.go:430: container init caused \"process_linux.go:413: running prestart hook 1 caused \\"error running hook: exit status 1, stdout: , stderr: exec command: [/usr/bin/nvidia-container-cli --load-kmods configure --ldconfig=@/sbin/ldconfig.real --device=all --compute --utility --video --require=cuda>=9.0 --pid=14866 /var/lib/docker/overlay2/992f889aec90bddd79f61e973b416fb49135c80940fb79cc58fb2675cd26a8a3/merged]\\nnvidia-container-cli: mount error: mount operation failed: /usr/src/tensorrt: no such file or directory\\n\\"\"": unknown.

We checked on the Jetson host, which is running JetPack 4.2.2: the CSV files for the container mount directories are there, and the TensorRT directory in question is also present.
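For reference, this is roughly how we verified it (the CSV directory path is the one documented for the Jetson container runtime; exact locations may differ by JetPack version):

```shell
# CSV files that tell the Jetson container runtime which host files to mount
ls /etc/nvidia-container-runtime/host-files-for-container.d/

# The TensorRT directory the mount error complains about
ls -ld /usr/src/tensorrt
```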

What could be causing this error?
Are these containers compatible with Jetson?

Any help will be appreciated,
Thanks

Hi,

Sorry, not all NGC containers are supported by nvidia-docker on Jetson.
Please use nvcr.io/nvidia/l4t-base:r32.2 instead.

You can find more information here:
https://github.com/NVIDIA/nvidia-docker/wiki/NVIDIA-Container-Runtime-on-Jetson
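For reference, a minimal way to try the base image (assuming the NVIDIA container runtime that ships with JetPack is installed, as described in the wiki page above):

```shell
# Pull and start the L4T base container with GPU access on the Jetson
sudo docker run -it --rm --runtime nvidia nvcr.io/nvidia/l4t-base:r32.2
```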

Thanks.

Apologies for maybe asking a stupid question, but I am completely new to the Nvidia Jetson platform.

I am experiencing the same error when trying to run the TensorFlow 20.03 container on my TX2 running JetPack 4.3 (which is how I found this post).

When you write “Please use nvcr.io/nvidia/l4t-base:r32.2 instead”, does that mean:

  1. that you can call one container from within another container?
  2. or do you suggest to use the “l4t-base” container to create a new custom container?
    I tried to run the example in the “NVIDIA-Container-Runtime-on-Jetson” guide, and the “l4t-base” container example runs fine.
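For what it's worth, option 2 is what I had in mind. An untested sketch of what I would try, building a custom image on top of “l4t-base” (the r32.3.1 tag matching JetPack 4.3, and the idea of installing TensorFlow from NVIDIA's Jetson wheels, are my assumptions):

```shell
# Untested sketch: build a custom TensorFlow image based on l4t-base
cat > Dockerfile <<'EOF'
FROM nvcr.io/nvidia/l4t-base:r32.3.1
RUN apt-get update && apt-get install -y python3-pip
# TensorFlow for Jetson would then be installed from NVIDIA's
# Jetson wheel index (see the "TensorFlow for Jetson Platform"
# install guide for the exact pip command for your JetPack version)
EOF
sudo docker build -t my-tf-tx2 .
```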

One final question: there is no information in the “Tensorflow 20.03” container description indicating that it is not compatible with the TX2. Is there any way to find this information, or a specific way to test for compatibility? I have spent quite some time trying to debug this issue… a “warning” would have been great.

Thanks.