Hello,
In sample #12 of the Multimedia API, camera_v4l2_cuda, I'm trying to separate the capturing and rendering parts of the code on a Jetson Nano Dev Kit. The rendering part has to run in another process that is not related to the capturing process (meaning it is launched separately and is not forked from it). But I cannot get it to work.
I tried sending the renderer_fd to the other process over an IPC socket, but it gives the following error:
nvbuff_utils: dmabuf_fd 38 mapped entry NOT found.
The IPC part itself works correctly, since I tested it with similar code on my Ubuntu 16.04 laptop.
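(For context: as far as I understand, a raw fd number is only valid inside the process that owns it, so the transfer has to go through SCM_RIGHTS ancillary data on a Unix domain socket. Below is a minimal sketch of that mechanism; send_fd/recv_fd are just illustrative helpers, not code from the sample.)

```cpp
// Minimal sketch: pass a file descriptor between two unrelated processes
// over a connected Unix domain socket using SCM_RIGHTS ancillary data.
// The kernel duplicates the descriptor into the receiver; sending the
// integer value alone is not enough.
#include <sys/socket.h>
#include <string.h>

static int send_fd(int sock, int fd)
{
    struct msghdr msg;
    struct iovec iov;
    char dummy = 'F';                         // at least one byte of real data
    union {
        struct cmsghdr align;                 // ensures proper alignment
        char buf[CMSG_SPACE(sizeof(int))];
    } u;

    memset(&msg, 0, sizeof(msg));
    iov.iov_base = &dummy;
    iov.iov_len = 1;
    msg.msg_iov = &iov;
    msg.msg_iovlen = 1;
    msg.msg_control = u.buf;
    msg.msg_controllen = sizeof(u.buf);

    struct cmsghdr *cmsg = CMSG_FIRSTHDR(&msg);
    cmsg->cmsg_level = SOL_SOCKET;
    cmsg->cmsg_type = SCM_RIGHTS;
    cmsg->cmsg_len = CMSG_LEN(sizeof(int));
    memcpy(CMSG_DATA(cmsg), &fd, sizeof(int));

    return (sendmsg(sock, &msg, 0) < 0) ? -1 : 0;
}

static int recv_fd(int sock)
{
    struct msghdr msg;
    struct iovec iov;
    char dummy;
    union {
        struct cmsghdr align;
        char buf[CMSG_SPACE(sizeof(int))];
    } u;

    memset(&msg, 0, sizeof(msg));
    iov.iov_base = &dummy;
    iov.iov_len = 1;
    msg.msg_iov = &iov;
    msg.msg_iovlen = 1;
    msg.msg_control = u.buf;
    msg.msg_controllen = sizeof(u.buf);

    if (recvmsg(sock, &msg, 0) < 0)
        return -1;

    struct cmsghdr *cmsg = CMSG_FIRSTHDR(&msg);
    if (!cmsg || cmsg->cmsg_level != SOL_SOCKET || cmsg->cmsg_type != SCM_RIGHTS)
        return -1;

    int fd;
    memcpy(&fd, CMSG_DATA(cmsg), sizeof(int));
    return fd;                                // a new fd valid in this process
}
```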
That’s unfortunate. I have two processes that need to share camera buffers with each other. What would you suggest? Is there a way to create an EGLImage in the first process and share it with the second?
The first process captures the camera feed and does some processing on the frames.
The second process renders the camera frames along with other content in the same window.
Hi,
Sorry for the late response. Been busy for quite a while!
After a lot of trial and error, I successfully shared the camera feed with another process.
The EGLStream examples that DaneLLL mentioned were quite helpful.
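In case it helps anyone else, the cross-process part boils down to the EGL_KHR_stream_cross_process_fd extension: create the stream in one process, export it as a file descriptor, pass that descriptor to the other process (again via SCM_RIGHTS), and recreate the stream there. A minimal sketch of that handshake, with my own helper names and no error handling:

```cpp
// Minimal sketch of sharing an EGLStream between two unrelated processes
// via EGL_KHR_stream_cross_process_fd. Extension entry points are loaded
// with eglGetProcAddress; error handling is omitted for brevity.
#include <EGL/egl.h>
#include <EGL/eglext.h>

static PFNEGLCREATESTREAMKHRPROC                   eglCreateStreamKHR;
static PFNEGLGETSTREAMFILEDESCRIPTORKHRPROC        eglGetStreamFileDescriptorKHR;
static PFNEGLCREATESTREAMFROMFILEDESCRIPTORKHRPROC eglCreateStreamFromFileDescriptorKHR;

static void load_stream_extensions(void)
{
    eglCreateStreamKHR = (PFNEGLCREATESTREAMKHRPROC)
        eglGetProcAddress("eglCreateStreamKHR");
    eglGetStreamFileDescriptorKHR = (PFNEGLGETSTREAMFILEDESCRIPTORKHRPROC)
        eglGetProcAddress("eglGetStreamFileDescriptorKHR");
    eglCreateStreamFromFileDescriptorKHR = (PFNEGLCREATESTREAMFROMFILEDESCRIPTORKHRPROC)
        eglGetProcAddress("eglCreateStreamFromFileDescriptorKHR");
}

// Capture process: create the stream and export it as a file descriptor,
// then hand that descriptor to the rendering process over a Unix domain
// socket (SCM_RIGHTS, as in the earlier sketch).
EGLNativeFileDescriptorKHR export_stream(EGLDisplay dpy, EGLStreamKHR *out_stream)
{
    static const EGLint attribs[] = { EGL_NONE };
    *out_stream = eglCreateStreamKHR(dpy, attribs);
    return eglGetStreamFileDescriptorKHR(dpy, *out_stream);
}

// Rendering process: rebuild the stream from the received descriptor and
// attach the consumer endpoint (e.g. a GL external texture via
// eglStreamConsumerGLTextureExternalKHR) before the producer connects.
EGLStreamKHR import_stream(EGLDisplay dpy, EGLNativeFileDescriptorKHR fd)
{
    return eglCreateStreamFromFileDescriptorKHR(dpy, fd);
}
```

Note that an EGLStream requires the consumer to be connected before the producer, so the rendering process has to attach its consumer before the capture side starts pushing frames.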