I’m interested in streaming a viewport, or the output of a camera on a stage, into an existing external GStreamer pipeline. Could you advise on an approach for accomplishing this? A 12 fps stream would work, but 30-60 fps would be ideal.
So far I have attempted to use the WebRTC streaming extensions and connect them to GStreamer using the gstwebrtc element, like this: pipeline = Gst.parse_launch(f"playbin uri=gstwebrtcs://{ip}:{port}?peer-id={peer_id}").
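For reference, here is a minimal sketch of how I’m assembling that description string before handing it to parse_launch. The host, port, and peer id below are placeholders for my setup, and the gstwebrtcs://host:port?peer-id=... URI shape is my reading of the webrtcsrc URI handler, not a confirmed format:

```python
# Build the playbin description separately so the URI is easy to inspect.
# ip/port/peer_id are placeholder values for my local setup.
ip, port, peer_id = "192.168.1.10", 8443, "isaac-camera"
uri = f"gstwebrtcs://{ip}:{port}?peer-id={peer_id}"
desc = f"playbin uri={uri}"
print(desc)  # playbin uri=gstwebrtcs://192.168.1.10:8443?peer-id=isaac-camera

# pipeline = Gst.parse_launch(desc)  # requires GStreamer + gi bindings installed
```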
Any general guidance or advice is welcome. Thank you!
Thank you, I’ve reviewed this documentation and the sample repo. But as I understand it, the key difference between those solutions and my problem is that they both produce static image render products, whereas I’m looking to generate a continuous stream.
My use case is to connect to an existing GStreamer image processing pipeline, then affect the scene based on inference.
This looks like the ticket, thank you! After updating graphics drivers and experimenting a bit, I was able to stream out to ffplay successfully; now it’s just a matter of getting the right decoder for GStreamer.
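For anyone following along, the receive side I’m experimenting with looks roughly like this. It’s a sketch that assumes the RTSP stream carries H.264; the stream URL is a placeholder, and the element choices are my assumptions rather than a confirmed recipe:

```python
# Sketch of a GStreamer receive pipeline, assuming an H.264 RTSP stream.
location = "rtsp://127.0.0.1:8554/stream"  # placeholder URL for my setup
elements = [
    f"rtspsrc location={location} latency=0",
    "rtph264depay",   # unpack H.264 from the RTP payload
    "h264parse",
    "avdec_h264",     # software decode; a HW decoder could slot in here
    "videoconvert",
    "autovideosink",
]
desc = " ! ".join(elements)
print(desc)

# pipeline = Gst.parse_launch(desc)  # requires GStreamer installed
```

The same description also works on the command line via gst-launch-1.0, which is handy for checking the decoder chain before wiring it into the Python side.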
This post was helpful in mentioning which extensions are necessary to make this custom RTSPWriter available. I wouldn’t have thought to look under character control.
Usage of RTSPWriter