I’m using interpipesink/interpipesrc to create three output pipelines from a single camera, so I have four pipelines in total:
- Camera pipeline: nvarguscamerasrc → interpipesink
- Liveview pipeline: interpipesrc → nvvidconv → shmsink
- Main video pipeline: interpipesrc → nvv4l2h264enc → rtph264pay
- Secondary video pipeline: interpipesrc → nvvidconv → nvv4l2h264enc → rtph264pay
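For reference, the four pipelines above can be sketched as gst-launch-style description strings. This is only an illustration: the element properties (interpipesink name, `listen-to`, socket path, caps) are assumptions of mine, not my exact configuration.

```python
# Hypothetical description strings mirroring the four pipelines.
# Property values (cam_sink, /tmp/liveview, etc.) are illustrative only.

CAMERA = ("nvarguscamerasrc ! "
          "interpipesink name=cam_sink sync=false")

LIVEVIEW = ("interpipesrc listen-to=cam_sink is-live=true ! "
            "nvvidconv ! shmsink socket-path=/tmp/liveview")

MAIN = ("interpipesrc listen-to=cam_sink is-live=true ! "
        "nvv4l2h264enc ! rtph264pay name=pay0")

SECONDARY = ("interpipesrc listen-to=cam_sink is-live=true ! "
             "nvvidconv ! nvv4l2h264enc ! rtph264pay name=pay0")

def output_pipelines():
    """The three output pipelines that consume from the camera pipeline."""
    return [LIVEVIEW, MAIN, SECONDARY]
```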
Everything works fine if I set the camera pipeline to PLAYING at startup and keep it in that state for the whole life of the process: I can get video from all three outputs at the same time correctly.
Now I want to stop the camera pipeline (#1 above) when none of the three output pipelines is active (to save power, to protect the image sensor from running continuously, …). So I handle the following events:
- On the shmsink of the liveview pipeline (#2 above), I handle the client-connected/client-disconnected signals to know when the liveview pipeline is idle.
- On the two RTSP pipelines (main & secondary), I handle the ‘new-state’ signal of the GstRTSPMedia object. Whenever an RTSP client connects, this object changes state to PLAYING, and when all clients have disconnected, it changes state to NULL.
When none of the output pipelines is active, I set the camera pipeline state to NULL. Whenever a client connects to one of the output pipelines (liveview, main, or secondary), I set the camera pipeline state back to PLAYING to get frames from the image sensor.
This solution works fine for the liveview pipeline, which uses shmsink/shmsrc. But whenever an RTSP client tries to connect to the main or secondary pipeline, I see the camera pipeline change state to PLAYING, and then a segmentation fault occurs. I suspect it is caused by the RTSP server.
Could you please share some ideas for further troubleshooting, or a better way to put the camera pipeline into an idle state when no output is in use?