Hi, we are trying to replace our Jetsons with more scalable AWS G5g nodes, mainly for CI. Unfortunately, the TensorRT library precompiled for aarch64 links against libnvmedia.so, which comes pre-installed on a Jetson as part of nvidia-l4t-multimedia.
After adding the proper repo, nvidia-l4t-multimedia can be installed via apt, but the install then fails due to a missing device tree:
echo "deb https://repo.download.nvidia.com/jetson/t194 r32.4 main" | sudo tee -a /etc/apt/sources.list.d/cuda.list
sudo apt-get update && sudo apt-get install -y nvidia-l4t-multimedia
[2021-12-06T15:57:27.786Z] /var/lib/dpkg/tmp.ci/preinst: line 40: /proc/device-tree/compatible: No such file or directory
[2021-12-06T15:57:27.786Z] dpkg: error processing archive /var/cache/apt/archives/nvidia-l4t-core_32.4.4-20201016123640_arm64.deb (--unpack):
[2021-12-06T15:57:27.786Z] new nvidia-l4t-core package pre-installation script subprocess returned error exit status 1
[2021-12-06T15:57:27.786Z] Errors were encountered while processing:
[2021-12-06T15:57:27.786Z] /var/cache/apt/archives/nvidia-l4t-core_32.4.4-20201016123640_arm64.deb
[2021-12-06T15:57:27.786Z] E: Sub-process /usr/bin/dpkg returned an error code (1)
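For clarity, the failing line 40 of the preinst reads /proc/device-tree/compatible, which only exists when the kernel was booted with a device tree blob (as on a Jetson); on a generic aarch64 cloud instance that path is simply absent. The check can be reproduced in isolation (a minimal sketch, not the actual preinst script):

```shell
# Sketch of the check that trips nvidia-l4t-core's preinst:
# /proc/device-tree/compatible exists on device-tree-booted hardware
# (Jetson), but not on a generic aarch64/x86 cloud instance.
if [ -r /proc/device-tree/compatible ]; then
  status="device tree present: $(tr '\0' ' ' < /proc/device-tree/compatible)"
else
  status="no device tree -> preinst exits 1"
fi
echo "$status"
```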
Any idea how we should set up the environment on a generic aarch64 Ubuntu to be able to use TensorRT? And please don't tell me to use the SDK installer; I have not yet seen a worse tool for a professional use case.
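In case it helps the discussion: one avenue we are considering is unpacking the .deb payload with dpkg-deb -x, which never runs the maintainer scripts, so the failing preinst is bypassed entirely. A sketch using a throwaway package that mimics the check (for the real thing one would first fetch the packages with `apt-get download nvidia-l4t-multimedia` and its dependencies; the package name and library path below are illustrative):

```shell
# Assumption: dpkg-deb is available. dpkg-deb -x extracts only the payload;
# preinst/postinst are NOT executed, unlike `apt-get install` / `dpkg -i`.
set -e
work=$(mktemp -d)
mkdir -p "$work/demo/DEBIAN" "$work/demo/usr/lib/aarch64-linux-gnu"
cat > "$work/demo/DEBIAN/control" <<'EOF'
Package: demo
Version: 1.0
Architecture: all
Maintainer: nobody <nobody@example.com>
Description: throwaway package mimicking the failing preinst
EOF
cat > "$work/demo/DEBIAN/preinst" <<'EOF'
#!/bin/sh
# Mimics the l4t check; always fails off-Jetson.
cat /proc/device-tree/compatible
EOF
chmod 0755 "$work/demo/DEBIAN/preinst"
touch "$work/demo/usr/lib/aarch64-linux-gnu/libnvmedia.so"
dpkg-deb -b "$work/demo" "$work/demo.deb" >/dev/null
dpkg-deb -x "$work/demo.deb" "$work/rootfs"   # preinst is skipped here
ls "$work/rootfs/usr/lib/aarch64-linux-gnu"
```

The open question is whether TensorRT is happy with libnvmedia.so dropped onto the library path this way, without the rest of the L4T userspace.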
Related:
https://github.com/NVIDIA/nvidia-container-runtime/issues/124#issuecomment-828426819