Use jetson-inference on a Jetson with 16 GB eMMC

Hello everyone,

I want to know if there is any way to use jetson-inference on a Jetson with 16 GB eMMC. This Jetson's kernel has been modified because I am using SPI (CAN bus) on it.
I am trying to install all the dependencies on the Jetson before running jetson-inference (CUDA, OpenCV, ...), but it obviously doesn't have enough space to install everything. Is there any way around it?
If not, instead of installing everything on the eMMC, can I install jetson-inference and its dependencies on a flash drive and run them from there?

Thank you

Any help would be much appreciated.
@dusty_nv

Hi,

A better way is to set up the system on an external drive to have more space.
You can find the instructions below:

Thanks.

Hi, many thanks for your reply. The issue with that is that I am installing it on a custom OS that supports an SPI CAN bus reader.

Hi,

To minimize storage usage, you can try installing the dependencies manually.

For example, installing TensorRT via apt:

$ sudo apt install nvidia-tensorrt-dev

And try to build jetson-inference by referring to this script:

You might also need the PyTorch package for certain use cases.
The prebuilt packages can be found at the link below:

Thanks.