When using a live CSI camera source, at either the GStreamer level or the Multimedia API level, it appears that the only way to get input frames is through nvarguscamerasrc (at the GStreamer level) or the Argus API (at the Multimedia API level). Both approaches require the nvargus-daemon. I have looked through the Multimedia API documentation and this application/daemon is not documented anywhere. Is this documentation available, or can the source code be made available?
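For reference, the kind of capture path I am describing looks like the following (illustrative only; the sensor-id and caps are placeholders, and this requires a Jetson board with a CSI sensor and nvargus-daemon running):

```shell
# Illustrative nvarguscamerasrc capture pipeline. sensor-id and the caps
# string are placeholders for an actual sensor configuration.
gst-launch-1.0 nvarguscamerasrc sensor-id=0 ! \
  'video/x-raw(memory:NVMM), width=1920, height=1080, framerate=30/1' ! \
  nvvidconv ! fakesink
```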
Specific questions around this daemon are:
- What purpose does this daemon serve in the single-camera use case? Could the same function not be provided by a thread dequeuing frames from a V4L2 device and enqueuing them to downstream NvV4l2Element(s)?
- I have noticed that nvargus-daemon fairly frequently segfaults while a GStreamer pipeline is running and must be restarted. I am not sure what the root cause of these segfaults is. Is it possible to provide a method to debug this issue?
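In case it helps, one way I could imagine gathering more information is to run the daemon in the foreground, or under gdb, and capture a backtrace at the crash. This is only a sketch: the /usr/sbin/nvargus-daemon path and the enableCamPclLogs/enableCamScfLogs logging switches are assumptions on my part, not something I found in the official documentation.

```shell
# Sketch, not verified against any particular L4T release.
# Stop the systemd-managed instance, then run the daemon in the foreground
# with verbose logging so any crash output goes straight to the terminal.
sudo systemctl stop nvargus-daemon
sudo enableCamPclLogs=1 enableCamScfLogs=1 /usr/sbin/nvargus-daemon

# Alternatively, run the daemon under gdb and grab a backtrace when it
# receives SIGSEGV (start the GStreamer pipeline from a second shell):
sudo systemctl stop nvargus-daemon
sudo gdb --args /usr/sbin/nvargus-daemon
# (gdb) run
# (gdb) bt     # after the segfault, print the backtrace
```

Even a symbol-less backtrace from this might help narrow down where in the daemon the fault occurs, but without documentation or source it is hard to go further, hence the question above.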