How to use a GStreamer stream as an EGLStream producer?

Hi all,

What we are trying to do is:

  • have some EGL client that creates an EGLStream consumer attached to an EGLStream
  • now we want to create an EGLStream producer that is fed by a stream from GStreamer

From a GStreamer pipeline perspective, I’d imagine something like this (pseudo):

somesrc ! eglsink stream=mystream

alternatively:
somesrc ! eglsink texture=myOEStexture

Is this possible?

I stumbled upon this: https://github.com/DaneLLL/simpleEGLStreams_producer
Does anyone have experience with this? Is this a viable way to go?

Add:
We are on a Jetson AGX, and I inspected nveglstreamsrc and nveglglessink there, but neither of them has a stream property, caps, or anything similar…

Thanks + Best,
Bodo

Hi,
The source of nveglglessink is in
https://developer.nvidia.com/embedded/dlc/r32-2-1_Release_v1.0/TX2-AGX/sources/public_sources.tbz2

Maybe you can run ‘somesrc ! nveglglessink’ and add your implementation in nveglglessink.
Sample pipelines using nveglglessink are in
https://developer.nvidia.com/embedded/dlc/l4t-accelerated-gstreamer-guide-32-2

Ok thanks, I’ll have a look into it.

I see that “glEGLImageTargetTexture2DOES” is already used in the code, so there is some OES-aware code in place.

It’s not clear to me what the basic workflow would be.

I have an EGLStream between a consumer and a producer.
On the producer side, I’d like to draw the GStreamer EGL data directly into the EGLStream to avoid any CPU copies or similar overhead.

The route would be like:
SRC -> NV pipeline (H.264 decoding to EGL, keeping the image in VRAM) -> EGLSINK (puts the stream data into EGL) -> ?? (some interface in the code to connect the sink and the stream) -> producer draws with the provided texture -> consumer gets the new frame and renders
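For reference, the consumer end of that route might look roughly like the sketch below, assuming the EGL_KHR_stream_consumer_gltexture extension is available; the helper names consumer_init/consumer_draw and all variable names are my own, not from any NVIDIA sample:

```c
/* Consumer-side sketch (untested on hardware): latch EGLStream frames into a
 * GL_TEXTURE_EXTERNAL_OES texture via EGL_KHR_stream_consumer_gltexture.
 * Extension entry points must be fetched through eglGetProcAddress(). */
#include <EGL/egl.h>
#include <EGL/eglext.h>
#include <GLES2/gl2.h>
#include <GLES2/gl2ext.h>

static PFNEGLSTREAMCONSUMERGLTEXTUREEXTERNALKHRPROC pConnectTexConsumer;
static PFNEGLSTREAMCONSUMERACQUIREKHRPROC           pAcquireFrame;
static PFNEGLSTREAMCONSUMERRELEASEKHRPROC           pReleaseFrame;

/* Done once: connect an external texture as the stream's consumer. */
static void consumer_init(EGLDisplay dpy, EGLStreamKHR stream, GLuint tex)
{
    pConnectTexConsumer = (PFNEGLSTREAMCONSUMERGLTEXTUREEXTERNALKHRPROC)
        eglGetProcAddress("eglStreamConsumerGLTextureExternalKHR");
    pAcquireFrame = (PFNEGLSTREAMCONSUMERACQUIREKHRPROC)
        eglGetProcAddress("eglStreamConsumerAcquireKHR");
    pReleaseFrame = (PFNEGLSTREAMCONSUMERRELEASEKHRPROC)
        eglGetProcAddress("eglStreamConsumerReleaseKHR");

    glBindTexture(GL_TEXTURE_EXTERNAL_OES, tex);
    pConnectTexConsumer(dpy, stream);  /* connects the currently bound texture */
}

/* Per frame: acquire the latest producer frame, draw, release it. */
static void consumer_draw(EGLDisplay dpy, EGLStreamKHR stream)
{
    if (pAcquireFrame(dpy, stream) == EGL_TRUE) {
        glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);  /* shader samples the texture */
        pReleaseFrame(dpy, stream);
    }
}
```

Once the consumer is connected, acquire/release runs without any CPU copy of the frame data.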

I’m not sure if I’m on the right path here or if there is some misunderstanding of mine.

The basic intention is to have a GStreamer pipeline that processes some source ‘into’ EGL and then hands it directly to the stream.

Add: Just discovered nvvideosink, which has a “stream” property. I think this is the way to go.
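A minimal sketch of how that handoff might look from application code, assuming nvvideosink exposes “display” and “stream” properties taking the raw EGL handles (verify the exact names with `gst-inspect-1.0 nvvideosink` on your L4T release); the decoder element and file path here are placeholders:

```c
/* Sketch (untested): hand an application-created EGLStream to nvvideosink so
 * the sink acts as the EGLStream producer. Property names are assumptions
 * based on gst-inspect-1.0 output; check them on your release. */
#include <gst/gst.h>
#include <EGL/egl.h>
#include <EGL/eglext.h>

static GstElement *start_eglstream_pipeline(EGLDisplay egl_display,
                                            EGLStreamKHR egl_stream)
{
    /* Decoder element varies by L4T release (omxh264dec vs. nvv4l2decoder). */
    GstElement *pipeline = gst_parse_launch(
        "filesrc location=test.h264 ! h264parse ! omxh264dec ! "
        "video/x-raw(memory:NVMM) ! nvvideosink name=eglsink", NULL);

    GstElement *sink = gst_bin_get_by_name(GST_BIN(pipeline), "eglsink");
    g_object_set(G_OBJECT(sink),
                 "display", egl_display,  /* EGLDisplay the consumer uses   */
                 "stream",  egl_stream,   /* EGLStreamKHR created by the app */
                 NULL);
    gst_object_unref(sink);

    gst_element_set_state(pipeline, GST_STATE_PLAYING);
    return pipeline;
}
```

The important part is that the EGLStream (with its consumer already connected) exists before the pipeline goes to PLAYING.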

Hi,
For H.264 decoding, you may also consider using tegra_multimedia_api and refer to the sample in

tegra_multimedia_api\samples\00_video_decode

The frames are decoded to NvBuffer, and each NvBuffer is an EGLImage.


You should be able to post the buffer out by putting the following code in thread_start():

/* Wrap the decoded NvBuffer (dmabuf fd) as an EGLImage */
hEglImage = NvEGLImageFromFd(egl_display, dmabuf_fd);
/* Bind it to the external texture and draw it to the EGL surface */
glActiveTexture(GL_TEXTURE0);
glBindTexture(GL_TEXTURE_EXTERNAL_OES, texture_id);
glEGLImageTargetTexture2DOES(GL_TEXTURE_EXTERNAL_OES, hEglImage);
glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);
eglSwapBuffers(egl_display, egl_surface);
/* Release the EGLImage once the frame has been posted */
NvDestroyEGLImage(egl_display, hEglImage);

I had a misunderstanding here. I thought that the GStreamer sink would feed the producer, which would then render into the stream texture.

But nvvideosink basically is a kind of producer already.

I wrote a simplified producer class that handles the pipeline, and this is working now.
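For anyone following along, the stream-creation core of such a producer class might look roughly like this (untested sketch, assuming the EGL_KHR_stream and EGL_KHR_stream_fifo extensions; the ordering is the part that matters):

```c
/* Sketch (untested): create the EGLStream that the sink will produce into.
 * An EGLStream needs a connected consumer before a producer can attach. */
#include <EGL/egl.h>
#include <EGL/eglext.h>

static EGLStreamKHR create_stream(EGLDisplay dpy)
{
    PFNEGLCREATESTREAMKHRPROC pCreateStream =
        (PFNEGLCREATESTREAMKHRPROC) eglGetProcAddress("eglCreateStreamKHR");

    static const EGLint attr[] = {
        EGL_STREAM_FIFO_LENGTH_KHR, 4,  /* FIFO mode; 0 selects mailbox mode */
        EGL_NONE
    };
    return pCreateStream(dpy, attr);
}

/* Order of operations in the producer class:
 * 1. stream = create_stream(dpy);
 * 2. connect the consumer (e.g. eglStreamConsumerGLTextureExternalKHR);
 * 3. hand `stream` to the sink (nvvideosink "stream" property);
 * 4. set the pipeline to PLAYING -- the sink then connects as producer. */
```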

Thank you