Confusion regarding EGL Stream Producers using OpenGLES offscreen rendering as source


We are currently developing a high-resolution camera framework that needs fast, configurable output, which is why we chose GStreamer and EGLStreams (specifically the nveglglessink element). To use this GStreamer plugin we created a surface with eglCreateStreamProducerSurfaceKHR. We are able to get an image into the stream, but it is always the content of the default framebuffer (0), never the framebuffer we select with glBindFramebuffer. How can we get the selected framebuffer into the stream (using eglMakeCurrent)?
(The final pipeline (simplified): Camera->NVArgus->EGLStream->OpenGLES->EGLStream->Gstreamer->etc.)

We have samples demonstrating producer/consumer. Please refer to

If the samples do not fit your use case, we would need your help: please provide a patch that we can apply, build, and run on the sample to reproduce the issue.

The second sample covers OpenGL as a producer, but it doesn't show how to produce a stream from a framebuffer other than the default one (the screen), which is what we need.

After a lot of trial and error, we found a solution:
If you are using GLFW (or any other context-management system), create a new (hidden) window, take its EGL context, make the stream producer surface current with eglMakeCurrent, and afterwards render without ever calling glBindFramebuffer. The default framebuffer (0) of the current surface is now the producer surface, so everything drawn to it ends up in the stream.
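A minimal sketch of that workaround, assuming GLFW 3 built with EGL support and a producer surface `stream_surf` created as shown earlier. Note that eglMakeCurrent requires the context's EGLConfig to be compatible with the surface's config, so the hidden window should be created with matching attributes. All sizes and names here are illustrative:

```c
/* Compile with GLFW's native access to the EGL objects it created. */
#define GLFW_EXPOSE_NATIVE_EGL
#include <GLFW/glfw3.h>
#include <GLFW/glfw3native.h>
#include <EGL/egl.h>
#include <GLES2/gl2.h>

static void render_into_stream(EGLDisplay dpy, EGLSurface stream_surf)
{
    /* The hidden window exists only to obtain an EGL context;
     * nothing is ever shown on screen. */
    glfwWindowHint(GLFW_VISIBLE, GLFW_FALSE);
    glfwWindowHint(GLFW_CONTEXT_CREATION_API, GLFW_EGL_CONTEXT_API);
    glfwWindowHint(GLFW_CLIENT_API, GLFW_OPENGL_ES_API);
    GLFWwindow *win = glfwCreateWindow(1920, 1080, "offscreen", NULL, NULL);
    EGLContext ctx = glfwGetEGLContext(win);

    /* Rebind the context to the STREAM surface instead of the window's
     * own surface. From here on, framebuffer 0 IS the producer surface. */
    eglMakeCurrent(dpy, stream_surf, stream_surf, ctx);

    while (!glfwWindowShouldClose(win)) {
        /* Do NOT call glBindFramebuffer here: the default framebuffer
         * already targets the stream. */
        glClearColor(0.f, 0.f, 0.f, 1.f);
        glClear(GL_COLOR_BUFFER_BIT);
        /* ... draw the processed camera frame ... */

        /* Each swap presents one finished frame to the stream consumer. */
        eglSwapBuffers(dpy, stream_surf);
        glfwPollEvents();
    }
}
```

The key insight is that an EGLStream producer surface behaves like any other EGL surface: whatever lands in its default framebuffer at eglSwapBuffers time becomes a stream frame, so binding a user FBO diverts rendering away from the stream.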