For my current robotics project, we have an autonomous land rover with a 6-DOF arm on top.
We plan on mounting several cameras (an Intel RealSense and a couple of cheap USB ones) on parts of the rover that the Xavier can't be directly connected to. It is easier for us to embed a Raspberry Pi in these locations and send the sensor data back over a shielded Ethernet cable than to run many long USB 3.0 cables.
My question is this: Is it possible to run the sensor nodes of an Isaac SDK application graph on a Raspberry Pi and send that data over the network back to the Xavier for processing (and possibly onward for monitoring via Websight)?
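To make the setup concrete, here is a rough sketch of what I have in mind, assuming Isaac's `TcpPublisher`/`TcpSubscriber` components can bridge the two machines (node and channel names below are placeholders I made up, not from a working app):

```json
{
  "comment": "App graph fragment on the Xavier side (sketch, not tested)",
  "graph": {
    "nodes": [
      {
        "name": "camera_rx",
        "components": [
          { "name": "tcp_subscriber", "type": "isaac::alice::TcpSubscriber" }
        ]
      },
      {
        "name": "perception",
        "components": [
          { "name": "processor", "type": "MyImageProcessor" }
        ]
      }
    ],
    "edges": [
      {
        "source": "camera_rx/tcp_subscriber/color_image",
        "target": "perception/processor/image"
      }
    ]
  },
  "config": {
    "camera_rx": {
      "tcp_subscriber": {
        "host": "192.168.1.42",
        "port": 5005
      }
    }
  }
}
```

The Raspberry Pi side would run a mirror-image app with the camera driver node feeding a `TcpPublisher` on the same port. The part I'm unsure about is whether the Isaac SDK engine itself builds and runs on the Pi's ARM core at all, or whether only the Jetson targets are supported.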