Docker.io/nvidia/cuda vs nvcr.io/nvidia/l4t-cuda

Does nvcr.io/nvidia/l4t-cuda have performance benefits over docker.io/nvidia/cuda on Jetson, or just extra pre-installed libraries (cuDNN or OpenCV)?
I can see that l4t-cuda does this:

Dockerfile.l4t_r34 · master · nvidia / container-images / l4t-base · GitLab

RUN echo "/usr/lib/aarch64-linux-gnu/tegra" >> /etc/ld.so.conf.d/nvidia-tegra.conf && \
    echo "/usr/lib/aarch64-linux-gnu/tegra-egl" >> /etc/ld.so.conf.d/nvidia-tegra.conf
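Those two entries just register the Tegra driver library directories with the dynamic linker. A minimal sketch of how to confirm inside a running container that they take effect (paths taken from the Dockerfile above; output depends on the libraries actually present on the device):

```shell
# Rebuild the linker cache so the entries in /etc/ld.so.conf.d are applied
ldconfig

# List which Tegra libraries the dynamic linker can now resolve
ldconfig -p | grep -i tegra
```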

Does it bring any performance improvement or enable Tegra-specific features?
I can find /usr/lib/aarch64-linux-gnu/tegra in docker.io/nvidia/cuda as well.
If I add the ldconfig path in docker.io/nvidia/cuda, can I get the same benefits as with nvcr.io/nvidia/l4t-cuda?

Hi,

docker.io/nvidia/cuda doesn’t support the Jetson platform.

Thanks.

What support can I expect from nvcr.io/nvidia/l4t-cuda over docker.io/nvidia/cuda? The performance improvements described in this document?
I want to know whether I can write and test my CUDA program both on docker.io/nvidia/cuda on my desktop computer and on nvcr.io/nvidia/l4t-cuda on Jetson without code changes.
Are there any compatibility issues (I’m not concerned about minor performance issues at this stage)?
The program will be compiled and deployed on nvcr.io/nvidia/l4t-cuda on Jetson at the production stage.

Hi,

This is a compatibility issue between the packages and the underlying GPU driver.
docker.io/nvidia/cuda only supports x86 and Arm CPU servers.

But the CUDA API is the same.
The same user-space app is expected to work on both containers.
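Since the CUDA runtime API is the same in both images, the same source file can be built unchanged in each container. A hedged sketch of that workflow (the image tags, `app.cu`, and the `-arch` values are assumptions; match them to your installed CUDA version and GPU compute capability):

```shell
# On the x86_64 desktop: build and test inside docker.io/nvidia/cuda
# (sm_86 assumes an Ampere desktop GPU)
docker run --rm --gpus all -v "$PWD":/src -w /src \
    nvidia/cuda:11.4.3-devel-ubuntu20.04 \
    sh -c 'nvcc -arch=sm_86 app.cu -o app && ./app'

# On the Jetson: build the identical app.cu inside nvcr.io/nvidia/l4t-cuda
# (Jetson uses the nvidia container runtime; sm_72 assumes Xavier)
docker run --rm --runtime nvidia -v "$PWD":/src -w /src \
    nvcr.io/nvidia/l4t-cuda:11.4.19-devel \
    sh -c 'nvcc -arch=sm_72 app.cu -o app && ./app'
```

Only the `-arch` flag differs between the two builds; the CUDA source itself needs no changes.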

Thanks.

