Multi-process libargus example?

Is there any example code that demonstrates a single process producing frames from a camera and distributing them to other processes? So far the tegra_multimedia_api samples I’ve found only show consuming multiple camera streams and/or feeding multiple consumers from within a single process.

hello Allanm,

may I have more details about your use-case.
for example,
what post-processing would you like to do?
thanks

Hi Jerry-

I’d like to set up a system that produces multiple camera feeds and distributes their frames to multiple consumers, but I need it to be done via IPC. I’m thinking something like the attached image.

Those consumers would be receiving RGB or YUV frames with associated metadata. One of those consumers might be a video encoder/streamer, another might be an OpenCV application, and another might be using CUDA.

Do EGLStreams support something like this multi-process model?
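To be concrete, the hand-off I’m imagining looks roughly like the sketch below, using the EGL_KHR_stream_cross_process_fd extension (the entry-point names come from eglext.h). I don’t know whether this path is actually supported by the Jetson drivers, which is really what I’m asking, and the Unix-domain-socket code that passes the descriptor between processes is omitted:

[code]
// Rough sketch of a cross-process EGLStream hand-off, based on the
// EGL_KHR_stream_cross_process_fd extension. Whether the Jetson drivers
// support this path is exactly my question. The SCM_RIGHTS socket code
// that actually sends the descriptor to the other process is omitted.
#include <EGL/egl.h>
#include <EGL/eglext.h>

// Extension entry points must be resolved at runtime.
static PFNEGLCREATESTREAMKHRPROC pCreateStream =
    (PFNEGLCREATESTREAMKHRPROC)eglGetProcAddress("eglCreateStreamKHR");
static PFNEGLGETSTREAMFILEDESCRIPTORKHRPROC pGetStreamFd =
    (PFNEGLGETSTREAMFILEDESCRIPTORKHRPROC)eglGetProcAddress("eglGetStreamFileDescriptorKHR");
static PFNEGLCREATESTREAMFROMFILEDESCRIPTORKHRPROC pCreateStreamFromFd =
    (PFNEGLCREATESTREAMFROMFILEDESCRIPTORKHRPROC)eglGetProcAddress("eglCreateStreamFromFileDescriptorKHR");

// Producer process: create a stream, attach the camera producer (e.g. the
// libargus OutputStream), then export the stream as a file descriptor that
// can be handed to another process over a Unix domain socket.
EGLNativeFileDescriptorKHR exportStream(EGLDisplay dpy, EGLStreamKHR *stream)
{
    static const EGLint attribs[] = { EGL_NONE };
    *stream = pCreateStream(dpy, attribs);
    return pGetStreamFd(dpy, *stream);
}

// Consumer process: rebuild the stream object from the received descriptor
// and attach a CUDA / GL / encoder consumer to it. (EGLStream semantics
// normally require the consumer to connect before the producer starts.)
EGLStreamKHR importStream(EGLDisplay dpy, EGLNativeFileDescriptorKHR fd)
{
    return pCreateStreamFromFd(dpy, fd);
}
[/code]

My understanding is that each consumer process would need its own stream (one EGLStream per consumer), so the producer would export one descriptor per consumer. Is anything like this supported?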

Thanks!

hello Allanm,

could you please try this with a gstreamer pipeline.
please also refer to the ACCELERATED GSTREAMER USER GUIDE for some examples.
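for example, something like the untested sketch below could hand frames from a capture process to other processes through shmsink/shmsrc. the element names (nvcamerasrc, nvvidconv, shmsink, shmsrc) are standard gstreamer plugins, but the exact caps and properties are assumptions and depend on your sensor and L4T release.

[code]
// Untested sketch: a capture process pushes raw frames into a shared-memory
// socket with shmsink, and other processes read them back with shmsrc.
// shm-size may need to be increased for large raw frames.
#include <gst/gst.h>

int main(int argc, char **argv)
{
    gst_init(&argc, &argv);

    GError *err = NULL;
    // Producer side (run in process A). nvvidconv copies the frames out of
    // NVMM memory because shmsink only handles system memory.
    GstElement *pipeline = gst_parse_launch(
        "nvcamerasrc ! video/x-raw(memory:NVMM),width=1920,height=1080,framerate=30/1 "
        "! nvvidconv ! video/x-raw,format=I420 "
        "! shmsink socket-path=/tmp/cam0 wait-for-connection=false sync=false",
        &err);
    if (!pipeline) {
        g_printerr("Failed to build pipeline: %s\n", err->message);
        g_error_free(err);
        return -1;
    }

    // Each consumer process would run the mirror pipeline, e.g.:
    //   shmsrc socket-path=/tmp/cam0
    //     ! video/x-raw,format=I420,width=1920,height=1080,framerate=30/1
    //     ! <encoder / appsink for OpenCV / further processing>

    gst_element_set_state(pipeline, GST_STATE_PLAYING);
    g_main_loop_run(g_main_loop_new(NULL, FALSE));  // run until killed

    gst_element_set_state(pipeline, GST_STATE_NULL);
    gst_object_unref(pipeline);
    return 0;
}
[/code]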
thanks

Hi Jerry-

I’m specifically trying not to use gstreamer because I’d like to use some of the compositing and other tools available through the multimedia API. I was also told in another thread to use argus/etc. instead of gstreamer because of limitations in the gstreamer plugins.

So, how could this be done with the multimedia API?

Thanks

Hi Allanm,
As of now we only support the single-process case.

One issue is that dmabuf cannot be transferred between processes.
[url]https://devtalk.nvidia.com/default/topic/1025764/jetson-tx1/-mmapi-how-to-transfer-video-yuv-data-between-two-processes/post/5228427/#5228427[/url]