Deploy sensory nodes to Raspberry Pi

For my current robotics project, we have an autonomous land rover with a 6-DOF arm on top.
We plan on putting several cameras (an Intel RealSense and a couple of cheap USB ones) on parts of the rover the Xavier can't be directly connected to. It is easier for us to embed a Raspberry Pi in those locations and send the sensor data back over a shielded Ethernet cable than to run many long USB 3.0 cables.

My question is this: is it possible to run the sensory nodes of an Isaac SDK application graph on a Raspberry Pi and send that data over the network back to the Xavier for processing (and possibly further on for monitoring via Websight)?
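For concreteness, here is a minimal sketch of what the Pi-side sender could look like, assuming the Pi simply streams length-prefixed sensor packets over a plain TCP socket. The address, port, and `read_sensor()` placeholder are hypothetical, not part of any Isaac API:

```cpp
// pi_sensor_sender.cpp -- hypothetical sketch, runs on the Raspberry Pi.
// Streams length-prefixed sensor packets to the Xavier over TCP.
// Replace read_sensor() with real camera/sensor acquisition.
#include <arpa/inet.h>
#include <netinet/in.h>
#include <sys/socket.h>
#include <unistd.h>

#include <cstdint>
#include <cstdio>
#include <vector>

// Hypothetical placeholder: produce one sensor sample as raw bytes.
std::vector<uint8_t> read_sensor() {
  return {0xDE, 0xAD, 0xBE, 0xEF};
}

int main() {
  const char* kXavierIp = "192.168.1.10";  // assumption: Xavier's address
  const uint16_t kPort = 5005;             // assumption: agreed-upon port

  int sock = socket(AF_INET, SOCK_STREAM, 0);
  if (sock < 0) { perror("socket"); return 1; }

  sockaddr_in addr{};
  addr.sin_family = AF_INET;
  addr.sin_port = htons(kPort);
  inet_pton(AF_INET, kXavierIp, &addr.sin_addr);

  if (connect(sock, reinterpret_cast<sockaddr*>(&addr), sizeof(addr)) < 0) {
    perror("connect");
    return 1;
  }

  while (true) {
    std::vector<uint8_t> sample = read_sensor();
    // Length-prefix framing so the receiver can split the byte stream
    // back into discrete packets.
    uint32_t len = htonl(static_cast<uint32_t>(sample.size()));
    if (send(sock, &len, sizeof(len), 0) < 0) break;
    if (send(sock, sample.data(), sample.size(), 0) < 0) break;
    usleep(33000);  // ~30 Hz; tune to the sensor's actual rate
  }

  close(sock);
  return 0;
}
```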

Yes. I'm using an Arduino to control my sensors and send the data to my Jetson Nano. I do that over UART, but the principle is the same with Ethernet: simply create a custom node that handles the connection, then forward the data to whichever Isaac nodes you need (see the sketch below). By the way, I'm also using TCP to send data from Isaac to Qt, but Isaac's default TCP transport is awkward to interoperate with because it uses Cap'n Proto, so my TCP link is only one-way right now… Anyway,
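Here is a hedged sketch of what that custom node could look like on the Xavier, based on the codelet skeleton from Isaac's "Developing Codelets" tutorial (start/tick/stop, ISAAC_PARAM, ISAAC_ALICE_REGISTER_CODELET). The POSIX socket handling, the EthernetReceiver name, and the listen_port parameter are my own assumptions, and it expects the length-prefixed framing from the sender sketch above:

```cpp
// EthernetReceiver.hpp -- hedged sketch of the "custom node" idea,
// following the codelet structure from the Isaac SDK tutorial.
#pragma once
#include <arpa/inet.h>
#include <netinet/in.h>
#include <sys/socket.h>
#include <unistd.h>

#include <cstdint>
#include <vector>

#include "engine/alice/alice_codelet.hpp"

class EthernetReceiver : public isaac::alice::Codelet {
 public:
  void start() override {
    // Open a listening socket once; block until the Pi connects.
    // (Fine for a sketch; a production node would accept asynchronously.)
    server_ = socket(AF_INET, SOCK_STREAM, 0);
    sockaddr_in addr{};
    addr.sin_family = AF_INET;
    addr.sin_addr.s_addr = INADDR_ANY;
    addr.sin_port = htons(static_cast<uint16_t>(get_listen_port()));
    bind(server_, reinterpret_cast<sockaddr*>(&addr), sizeof(addr));
    listen(server_, 1);
    client_ = accept(server_, nullptr, nullptr);
    tickPeriodically();
  }

  void tick() override {
    // Read one length-prefixed packet per tick (framing defined on the Pi).
    uint32_t len = 0;
    if (recv(client_, &len, sizeof(len), MSG_WAITALL) !=
        static_cast<ssize_t>(sizeof(len))) return;
    std::vector<uint8_t> packet(ntohl(len));
    if (recv(client_, packet.data(), packet.size(), MSG_WAITALL) <= 0) return;
    // Here you would copy `packet` into the proto matching your sensor
    // (e.g. an image proto for a camera) and publish it downstream.
  }

  void stop() override {
    close(client_);
    close(server_);
  }

  // Port to listen on; configurable from the application JSON.
  ISAAC_PARAM(int, listen_port, 5005);

 private:
  int server_ = -1;
  int client_ = -1;
};

ISAAC_ALICE_REGISTER_CODELET(EthernetReceiver);
```

Once the bytes are inside a codelet like this, publishing them to the rest of the graph (and on to Websight) works the same as for any other Isaac node.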
Regards