We are looking to design our software around fairly independent services, each of which can asynchronously access the latest frame from a camera hooked up to the AGX. Our current stack runs on an Intel NUC and shares frames through Python's native multiprocessing library, which works well; however, as we transition to the Jetson platform we'd like to isolate each service further and be able to start/stop individual services at will.
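For context, the "latest frame" pattern on our current stack is roughly the following, sketched with Python's stdlib `multiprocessing.shared_memory` (the block name `cam0_latest` and the frame size are made-up values for illustration):

```python
from multiprocessing import shared_memory

# Hypothetical frame size: 640x480 grayscale, one byte per pixel.
FRAME_BYTES = 640 * 480

# Producer side: create a named block that any service can attach to by name.
shm = shared_memory.SharedMemory(name="cam0_latest", create=True, size=FRAME_BYTES)

def publish(frame: bytes) -> None:
    """Copy the newest frame into the shared block, overwriting the old one."""
    shm.buf[:len(frame)] = frame

def read_latest() -> bytes:
    """Consumer side: attach by name, copy the frame out, detach."""
    view = shared_memory.SharedMemory(name="cam0_latest")
    frame = bytes(view.buf[:FRAME_BYTES])
    view.close()
    return frame

publish(b"\x07" * FRAME_BYTES)
latest = read_latest()
print(len(latest))

shm.close()
shm.unlink()  # the producer owns the lifetime of the block
```

Because consumers attach by name, they can be started and stopped independently of the producer, which is the property we want to preserve on the Jetson.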
Would a GStreamer-based approach (shmsink?) be ideal, or would IP-based sharing over UDP be better?
Thanks in advance.
Hi. I am fairly new to GStreamer, so I may be off-piste here. From what I know of GStreamer, it should work for you. I had not heard of shmsink before, but it looks like it would do what you want.
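For what it's worth, an shmsink/shmsrc setup usually looks something like the pair of pipelines below (shown here as Python strings you could hand to `Gst.parse_launch()` or `gst-launch-1.0`; the device path, caps, and socket path are generic V4L2 assumptions, not something I have tested on an AGX):

```python
# Producer: capture once, then expose raw frames through a shared-memory socket.
producer = (
    "v4l2src device=/dev/video0 "
    "! video/x-raw,format=I420,width=1280,height=720,framerate=30/1 "
    "! shmsink socket-path=/tmp/cam0 shm-size=20000000 wait-for-connection=false"
)

# Consumer: each service attaches independently and can start/stop at will.
# shmsink does not transmit caps, so consumers must restate them on shmsrc.
consumer = (
    "shmsrc socket-path=/tmp/cam0 is-live=true "
    "! video/x-raw,format=I420,width=1280,height=720,framerate=30/1 "
    "! videoconvert ! autovideosink"
)

print(producer)
print(consumer)
```

Note that the raw-video caps must match on both sides, since the shared-memory transport carries bare buffers with no format negotiation.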
A more elaborate, yet efficient, approach might be to implement your independent services as GStreamer plugins and add them to your pipeline, using a tee element to send the camera frame data to the inputs of your multiple plugins. Written this way, you could not only leverage hardware decoding but also keep the decoded frames in video RAM throughout the pipeline.
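The tee idea might look roughly like this on a Jetson (again as a pipeline string; `nvarguscamerasrc` and `nvvidconv` come from NVIDIA's GStreamer plugins, and the exact caps are an assumption on my part — I've used `fakesink` as a stand-in for your service plugins):

```python
# One capture, fanned out to two hypothetical service branches via tee.
# Each branch gets its own queue so a slow consumer does not stall the others.
pipeline = (
    "nvarguscamerasrc "
    "! video/x-raw(memory:NVMM),width=1920,height=1080,framerate=30/1 "
    "! tee name=t "
    "t. ! queue ! nvvidconv ! video/x-raw ! fakesink "   # branch for service A
    "t. ! queue ! nvvidconv ! video/x-raw ! fakesink"    # branch for service B
)
print(pipeline)
```

The `(memory:NVMM)` caps keep buffers in the GPU-accessible memory until a branch actually needs them on the CPU, which is where the efficiency comes from.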
Stopping and starting services at will means that you will need to modify that pipeline during execution, which is something that I believe the GStreamer API is designed to do.
Someone with more experience with gstreamer may want to correct or expand on this.
Another way, as you say, would be to use GStreamer to get the H.264 packets from the video camera and then broadcast them over UDP / RTSP / etc. to any listening clients, either locally or over the network. This might be easier, but less efficient.
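As a sketch of that UDP route (assuming a camera that can deliver H.264 directly; otherwise you would insert a hardware encoder between the source and the payloader):

```python
# Sender: take H.264 from the camera, packetize as RTP, send over UDP.
sender = (
    "v4l2src device=/dev/video0 ! video/x-h264 ! h264parse "
    "! rtph264pay config-interval=1 pt=96 "
    "! udpsink host=127.0.0.1 port=5000"
)

# Receiver: caps on udpsrc are required, because RTP over plain UDP
# carries no format negotiation of its own.
receiver = (
    'udpsrc port=5000 caps="application/x-rtp,media=video,'
    'encoding-name=H264,payload=96" '
    "! rtph264depay ! avdec_h264 ! videoconvert ! autovideosink"
)

print(sender)
print(receiver)
```

For several simultaneous listeners you would likely point udpsink at a multicast address rather than 127.0.0.1, or move up to an RTSP server so clients can connect and disconnect cleanly.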