Fail to Build Jetson-Inference inside

Jetson Xavier NX 16GB.
Fresh install of Jetpack 5.0.2 (running on USB SSD)
using jetson-containers to run:
./scripts/ -c

Within the ML container I attempt to build jetson-inference, exactly as per the instructions, hence performing the git clone of the latest master today.
This fails with the first error being:
CMake Error at utils/cuda/FindCUDA.cmake:1802 (add_library):
add_library cannot create target "jetson-utils-python-38" because another target with the same name already exists. The existing target is a shared library created in source directory "/jetson-inference/utils/python/bindings". See documentation for policy CMP0002 for more details.
Call Stack (most recent call first):
utils/python/bindings/CMakeLists.txt:57 (cuda_add_library)
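For context, this diagnostic is CMake policy CMP0002 at work: no two targets in one build tree may share a name. A minimal fragment (hypothetical source files, not the real bindings) that triggers the same error:

```cmake
cmake_minimum_required(VERSION 3.10)
project(dup-target-demo LANGUAGES C)

# The first definition claims the target name, as the shared library in
# utils/python/bindings does for jetson-utils-python-38:
add_library(jetson-utils-python-38 SHARED a.c)

# Any later add_library()/cuda_add_library() reusing that name fails with
# "add_library cannot create target ... because another target with the
# same name already exists" (policy CMP0002):
add_library(jetson-utils-python-38 SHARED b.c)
```

So the failing build is somehow processing the bindings' CMakeLists.txt twice, e.g. via a duplicated add_subdirectory() or a stale build directory; wiping the build tree before re-running cmake is worth a try.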


Thanks for reporting this.
We are checking this internally. Will share more information later.

Hi @robotmad, there is already a pre-built jetson-inference container image up on DockerHub for JetPack 5.0.2 / L4T R35.1.0. To run it, you can do the following:

git clone --recursive
cd jetson-inference

This container image is based on the l4t-pytorch image. If you really need it based on l4t-ml, then you can rebuild the jetson-inference container against l4t-ml by running the following:

cd jetson-inference

Thanks @dusty_nv, I’ve used many of your pre-built containers before, but I actually want to include jetson-inference within a more complex container including ROS and other things. This was just the simplest scenario I could construct that reproduced the issue starting with a container. (All had been fine with JetPack 5.0.1)

Unfortunately when executing

cd jetson-inference

On Step 11/21 of building the container I get an error (from

+ln -s /usr/lib/python3.8/dist-packages/cv2 /usr/local/lib/python3.8/dist-packages/cv2
ln: failed to create symbolic link '/usr/local/lib/python3.8/dist-packages/cv2/cv2': File exists
The command '/bin/sh -c cd /tmp && ./ ${OPENCV_URL} ${OPENCV_DEB}' returned a non-zero code: 1
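The failure above happens because the destination directory already contains a cv2 entry, so ln drops the new link *inside* it and collides. A small sketch reproducing the error and one common guard (hypothetical; the actual fix applied in the script may differ), using stand-in paths under a temp dir:

```shell
demo=$(mktemp -d)
src="$demo/usr/lib/python3.8/dist-packages/cv2"
dst="$demo/usr/local/lib/python3.8/dist-packages"
mkdir -p "$src" "$dst/cv2/cv2"      # destination dir already populated

# As in the log: ln lands inside the existing cv2 dir and collides.
ln -s "$src" "$dst/cv2" 2>/dev/null || echo "ln fails: File exists"

# One common guard (hypothetical): clear the stale copy first.
rm -rf "$dst/cv2"
ln -s "$src" "$dst/cv2" && echo "symlink created"
```

Running the first ln against an already-populated destination reproduces the exact "File exists" message from the Docker build log.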

jetson-inference will build locally outside of any container just fine.

Hi @robotmad, this is an issue that was fixed in the script - pull the latest from jetson-inference master and try again. I just confirmed that I was able to build the jetson-inference container against without issue.

I have an array of ROS containers already built for L4T R35.1.0 on DockerHub that have ROS/ROS2 + PyTorch + jetson-inference already installed, you can find them here:

It appears that the version of that you get from a fresh clone of the jetson-inference repo is pinned at the jetson-containers submodule commit @ 348630c, so it does not have the fix that has been applied to the latest. Bringing in the latest version of that file does indeed make this build. Thanks.

For anyone else with issues trying to build with R35.1.0, you can force the jetson-containers submodule within jetson-inference to be updated to the working version with:
git submodule update --remote --merge
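That command moves a submodule from the commit pinned by the superproject to the tip of its tracked branch. A self-contained sketch of the effect, with placeholder repo and commit names (not the real jetson-inference / jetson-containers history):

```shell
set -e
work=$(mktemp -d) && cd "$work"
G="git -c user.email=demo@example.com -c user.name=demo -c protocol.file.allow=always"

# "containers" stands in for the jetson-containers submodule.
$G init -q -b master containers
(cd containers && $G commit -q --allow-empty -m "pinned commit")

# "inference" stands in for jetson-inference, which pins the submodule.
$G init -q -b master inference
cd inference
$G submodule add -b master "$work/containers" containers
$G commit -q -m "pin submodule"
cd "$work"

# Upstream submodule gains the fix only after the pin was recorded.
(cd containers && $G commit -q --allow-empty -m "the fix")

# The superproject's checkout is still at the pinned commit:
(cd inference/containers && git log -1 --format=%s)    # prints "pinned commit"

# Force the submodule up to the tip of its tracked branch:
(cd inference && $G submodule update --remote --merge)
(cd inference/containers && git log -1 --format=%s)    # prints "the fix"
```

The `--remote` flag is what ignores the superproject's recorded SHA-1 and fetches the branch tip instead; without it, `git submodule update` would keep you on the pinned 348630c-style commit.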

Ah right, sorry about that - I’ve just updated jetson-inference master to reflect the latest jetson-containers submodule, so on a fresh clone of the repo it should have the latest. Thanks!


This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.