Jetson Inference on Raspberry Pi

Hello all, my first post on here!

I’m working on a robot that integrates a Jetson Xavier NX developer kit. I now need to do a quick side project, and I don’t want to remove the Xavier from its current project while that’s underway.

I was wondering if anyone knows whether it’s possible to run jetson-inference on a Raspberry Pi 4? I’m not sure if anyone has tried, but it would be ideal for my side project, which doesn’t need much compute, and then I could get back to the Jetson.



That would depend on what you mean by “interface”. There are a number of possible ways to copy data back and forth between the RPi and the Jetson. Are the two physically right next to each other? What do you need, specifically, to go on between the Jetson and RPi? A specific example use-case would be of great benefit here.


jetson-inference uses TensorRT to run DNN inference, and TensorRT requires an NVIDIA GPU and CUDA, so it won’t run on a Raspberry Pi. In theory you could use another library like PyTorch, which can be made to run in CPU-only mode on the RPi.