Is there example code provided which demonstrates a single process producing frames from a camera and distributing them to other processes? So far I’ve only found examples for consuming multiple camera streams and/or multiple consumers from within the same process in the tegra_multimedia_api samples.
May I have more details about your use case?
What post-processing would you like to do?
I’d like to set up a system that produces multiple camera feeds and distributes their frames to multiple consumers, but I need it to be done via IPC. I’m thinking something like the attached image.
Those consumers would be receiving RGB or YUV frames with associated metadata. One of those consumers might be a video encoder/streamer, another might be an OpenCV application, and another might be using CUDA.
Do EGLStreams support something like this multi-process model?
Could you please try this with a GStreamer pipeline?
Please also refer to the ACCELERATED GSTREAMER USER GUIDE for some examples.
I’m specifically trying not to use GStreamer because I’d like to use some of the compositing and other tools available through the Multimedia API. I was also told in another thread to use Argus instead of GStreamer because of the GStreamer plugin limitations.
So, how could this be done with the multimedia API?
As of now we only support the single-process case.
One issue is that the dmabuf cannot be transferred between processes.