Xavier NX Memory Constrained Installs

Hello all!

I have a Xavier NX with 15 GB of built-in memory, no SD card slot, and I’m trying to use TensorFlow 2, OpenCV 3/4, and CUDA for inferencing. However, I can’t seem to get all three on the board at once due to memory limitations.

When I use the JetPack 4.4 install without DeepStream, I am left with 500 MB of space and no TensorFlow.
When I use the JetPack 4.4 install with only the OS, I have 9 GB, and installing the latest CUDA leaves me with 2 GB.

I have two questions:

  1. Is there a way to install TensorFlow 2, OpenCV, and CUDA all at once in only 9 GB of space?
  2. Can I replace installing CUDA with the L4T CUDA that comes with JetPack 4.4 when I install only the OS? If so, is there anything special I need to do to make this work?

Thank you all for any help you can provide!

Hi,

I assume you are talking about disk space rather than system memory.

1. SDK Manager should automatically install OpenCV/CUDA for you.
The only extra package you need to install is TensorFlow.
It’s recommended to use our prebuilt package built specifically for the Jetson platform (see the install sketch below).

2. The CUDA toolkit for the Jetson platform comes from JetPack.
The package from the website is for desktop users and cannot be used on the Xavier.
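
As a rough sketch of the prebuilt TensorFlow install (the index URL below assumes JetPack 4.4 / L4T r32.4; please check the current Jetson TensorFlow installation guide for the exact package list and version):

# prerequisites for the Jetson TensorFlow wheel (package names assumed from the standard Jetson install guide)
sudo apt-get install -y python3-pip libhdf5-serial-dev hdf5-tools zlib1g-dev zip libjpeg8-dev
sudo pip3 install -U pip
# install TensorFlow from NVIDIA's Jetson package index (URL assumed for JetPack 4.4)
sudo pip3 install --extra-index-url https://developer.download.nvidia.com/compute/redist/jp/v44 tensorflow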

Thanks.

There seems to be very little space left after the JetPack install. Maybe removing some unused programs (LibreOffice, games, etc.) from the default JetPack install would be a good idea, just like there is an Ubuntu Base image. Sadly, for Ubuntu (or Linux in general) it is not trivial to expand system storage via an SD card, as a lot of software depends on the root folders.

You might be able to add some external storage (e.g. an SSD or a USB stick) and mount it, for example as /home or as a set of folders, so that it is used as an extension.
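
A rough sketch of moving /home onto such a drive, assuming it shows up as /dev/sda1 (the device name and filesystem are assumptions; adjust for your setup):

# WARNING: mkfs erases the drive; /dev/sda1 is only an assumed device name
sudo mkfs.ext4 /dev/sda1
sudo mount /dev/sda1 /mnt
sudo rsync -aX /home/ /mnt/           # copy existing home directories onto the drive
sudo umount /mnt
# mount the drive over /home on every boot
echo '/dev/sda1  /home  ext4  defaults  0  2' | sudo tee -a /etc/fstab
sudo mount /home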

On Linux this rarely helps, though, as apt doesn’t install packages to mounts under /mnt by default, and it is almost impossible to make that work.

I keep stumbling over this issue. Often I have 0 bytes free on the internal flash and even tab completion stops working. I use an SD card for my custom files, but I install some dependencies via apt, which makes most AI projects impossible (you cannot realistically install TensorFlow on top of the main image, for example). On my next flash I will use the SD card for the main OS and leave the internal flash unused, because I don’t see how 16 GB of flash can function as the main Linux storage.

Either NVIDIA should expand the module storage to something minimally decent (at least 64 GB), or not include any storage on the module and add an SD card slot instead (just like the devkit). This would actually make it cheaper as well. Making the standalone module boot from external drives is usually extra pain because of the internal memory, and some functions only work on one and not on the other (like disk encryption, though it seems that in the next version it could work on external boot drives as well).

So NVIDIA: for the module I would suggest much larger storage, or just get rid of it and require SD/M.2 instead.

edit: While fighting this I also wanted to know where the space goes. If I list the installed packages by size (largest last), I get:

dpkg-query -Wf '${Installed-Size}\t${Package}\n' | sort -n

170266kb  vpi
197119kb  cuda-cufft-10-2
207835kb  cuda-samples-10-2
213118kb  cuda-cusolver-10-2
221388kb  libnvinfer7
233347kb  chromium-browser
272535kb  libnvinfer-dev
328693kb  linux-firmware
335901kb  cuda-documentation-10-2
393728kb  cuda-cufft-dev-10-2
483180kb  libgl1-mesa-dri
544815kb  libnvinfer-samples
784774kb  libcudnn8
1073596kb nvidia-l4t-kernel
1548358kb libcudnn8-dev

You can see it includes “libnvinfer-samples” and “cuda-samples-10-2”, which it seems I cannot remove without removing CUDA itself.
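
To check what a removal would actually pull out, a dry run can help (standard apt commands; the package names are just taken from the list above):

# show which installed packages depend on the samples package
apt-cache rdepends --installed cuda-samples-10-2
# simulate the removal (nothing is actually removed) to see what apt would take with it
sudo apt-get remove --simulate cuda-samples-10-2 libnvinfer-samples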

My current small cleanup is this (saves roughly 500–600 MB):

sudo apt remove libreoffice-* thunderbird chromium-browser vpi-samples -y
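
On top of that, clearing the apt package cache and removing orphaned dependencies can free a bit more (standard apt commands; savings vary):

sudo apt-get autoremove -y        # remove packages that were only installed as dependencies
sudo apt-get clean                # delete downloaded .deb files from /var/cache/apt/archives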