Jetson AGX Orin, Lucid Vision Camera live stream camera feed to webapp

Hi everyone,

I’m new to using Orin and this platform. I’ve built a FastAPI + React web app hosted on AWS, and I’m now working on streaming live video from four Lucid Vision Labs cameras connected to an Orin.

I’d like to compress the images (possibly with GStreamer) and stream the live feed to my web app, along with ROS topic messages from the Orin. The Orin is behind two subnet layers with internet access, but it doesn’t have a public IP. I’m considering running a lightweight server on the Orin and exposing APIs via something like Tailscale Funnel.
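For the compression step described above, a minimal sketch of the Jetson side might look like the following. This is an assumption-laden illustration, not a tested setup: `nvv4l2h264enc` and `nvvidconv` are NVIDIA's hardware-accelerated GStreamer elements on Jetson, and `v4l2src` is used as a stand-in source; Lucid Vision Labs cameras (GigE) may instead require the vendor's Arena SDK as the frame source.

```python
def build_encode_pipeline(device: str = "/dev/video0",
                          width: int = 1280, height: int = 720,
                          fps: int = 30, bitrate_kbps: int = 4000,
                          host: str = "127.0.0.1", port: int = 5000) -> str:
    """Assemble a GStreamer pipeline description that captures from a
    V4L2 device, hardware-encodes to H.264, and sends RTP over UDP.

    All element names and parameters are assumptions for illustration;
    a Lucid camera may need a vendor-specific source element instead
    of v4l2src.
    """
    return (
        f"v4l2src device={device} ! "
        f"video/x-raw,width={width},height={height},framerate={fps}/1 ! "
        "nvvidconv ! "  # convert into NVMM memory for the HW encoder
        f"nvv4l2h264enc bitrate={bitrate_kbps * 1000} ! "
        "h264parse ! rtph264pay config-interval=1 pt=96 ! "
        f"udpsink host={host} port={port}"
    )

# The resulting string can be passed to gst-launch-1.0 on the command
# line, or to Gst.parse_launch() from Python's GObject introspection
# bindings, to actually run the pipeline.
```

Keeping the pipeline as a parameterized string makes it easy to spin up one pipeline per camera with different devices and ports.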

I’m new to ROS, GStreamer, and deployments with Orin. Any guidance on the best approach to tackle this would be greatly appreciated!

Thanks!

Hello @suryaak,

There are different ways of going about this.

In our experience, the way we set up this kind of project is:

  1. Create a GStreamer media server in Python. You could choose a different language, but Python works well and is easy to implement and maintain.
  2. Add ROS support to the Python media server so you can access your ROS topics.
  3. Integrate WebRTC into the media server so you can easily access the streams via React.
  4. Create a WebSocket or REST API for the media server so your app can interact with it.

Sorry if this is a bit vague; it is hard to go into detail in a single message.
If you would like more insight, please let us know and we can set up a free call to go over the details with you.

Regards,
Andrew

This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.