Export GL texture as dmabuf

Is there a way to render to a texture, and then export that texture as a dmabuf for use with the TX2 video encoder? I want to stream rendered video over the network, and I’m trying to see if I can avoid glGetTexImage. Thanks!

Hi wdouglass,

To bind your texture to a dmabuf, you need to use glEGLImageTargetTexture2DOES.

The following code snippet binds a texture to an EGLImage (srcImage). The EGLImage can be created with “NvEGLImageFromFd”, where fd is the dmabuf fd.

GL_CHECK(glActiveTexture(GL_TEXTURE0));
GL_CHECK(glBindTexture(srcTarget, srcTex));
GL_CHECK(glEGLImageTargetTexture2DOES(srcTarget, srcImage));
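
For reference, creating the dmabuf and the EGLImage might look roughly like the sketch below, assuming the nvbuf_utils helpers from the Tegra Multimedia API (NvBufferCreate, NvEGLImageFromFd, NvDestroyEGLImage); the resolution, color format and variable names here are placeholders:

#include "nvbuf_utils.h"
#include <EGL/egl.h>
#include <EGL/eglext.h>

/* Allocate a pitch-linear RGBA dmabuf (placeholder size/format) */
int dmabuf_fd = -1;
if (NvBufferCreate(&dmabuf_fd, 1920, 1080,
                   NvBufferLayout_Pitch, NvBufferColorFormat_ABGR32) != 0) {
    /* handle allocation failure */
}

/* Wrap the dmabuf fd in an EGLImage usable as a texture backing store */
EGLDisplay egl_display = eglGetCurrentDisplay();
EGLImageKHR srcImage = NvEGLImageFromFd(egl_display, dmabuf_fd);

/* ... bind srcImage to the texture as above, render, hand dmabuf_fd to the
 * encoder/converter, and eventually NvDestroyEGLImage(egl_display, srcImage) ... */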

You may have misunderstood my question. I’ve rendered to a texture using a GLSL shader. Now I need to pass that texture to the H.264 encoder.

How do I export the texture from the OpenGL context to a dmabuf, so that it can then be compressed?

Hi wdouglass,

Sorry that it was not clear. I meant that you can create an empty EGLImage first, bind it to a texture, and run your GLSL shader on that texture. Then, after rendering, pass the EGLImage to the encoder.
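
Roughly, the order of operations would be something like the sketch below (srcTex, srcImage, fbo, width and height are placeholders; glFinish is used here simply to make sure rendering has completed before the buffer is handed to the encoder):

/* Back the texture with the EGLImage, so rendering into it writes the dmabuf */
glBindTexture(GL_TEXTURE_2D, srcTex);
glEGLImageTargetTexture2DOES(GL_TEXTURE_2D, srcImage);

/* Attach the texture to an FBO and render into it with your GLSL shader */
glBindFramebuffer(GL_FRAMEBUFFER, fbo);
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                       GL_TEXTURE_2D, srcTex, 0);
glViewport(0, 0, width, height);
/* ... draw calls ... */

/* Make sure the GPU has finished before the encoder reads the dmabuf */
glFinish();
/* Now queue the dmabuf fd that backs srcImage to the encoder */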

Thanks! I thought an EGLImage texture bind was an upload; I didn’t realise it was read/write from OpenGL.

Thanks again.

I’m binding my EGLImage (created with NvEGLImageFromFd from a dmabuf exported from the OUTPUT plane of a nvhost-vic converter) like this:

glBindFramebuffer(GL_FRAMEBUFFER, ctx->screen.fb);
glcheckr();
/* Attach the render-target texture as the FBO's color attachment */
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_TEXTURE_2D, ctx->screen.tex, 0);
glActiveTexture(GL_TEXTURE0);
glBindTexture(GL_TEXTURE_2D, ctx->screen.tex);
if (im != EGL_NO_IMAGE) {
    /* Back the texture with the EGLImage so renders land in the dmabuf */
    glEGLImageTargetTexture2DOES(GL_TEXTURE_2D, im);
    glcheckr();
}

After this binding, I do my render (to ctx->screen.tex), then draw ctx->screen.tex to the X11 window, and queue it to the converter. But when I dequeue buffers off of the capture plane of the converter, they look empty! Did I miss something? (glcheckr is a macro that checks the OpenGL error state.)

Correction: I had neglected to set the buffer layouts for my converter. It’s working now.
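In case it helps anyone, setting the plane layouts on the converter looks roughly like this (using the NvVideoConverter class from the multimedia API; the pixel formats, layouts and dimensions here are just placeholders, not my exact code):

/* OUTPUT plane: the RGBA pitch-linear dmabuf that OpenGL renders into */
conv->setOutputPlaneFormat(V4L2_PIX_FMT_ABGR32, width, height,
                           V4L2_NV_BUFFER_LAYOUT_PITCH);
/* CAPTURE plane: YUV 4:2:0 buffers to feed to the H.264 encoder */
conv->setCapturePlaneFormat(V4L2_PIX_FMT_YUV420M, width, height,
                            V4L2_NV_BUFFER_LAYOUT_PITCH);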

Hi wdouglass, can you explain in more detail what you did?

How did you use glEGLImageTargetTexture2DOES with GL_TEXTURE_2D? Is that even allowed?

Not sure if it’s “allowed”, but this code works and I’ve been using it for a long time (on L4T 28.2).

If it’s against the specification, it may not work in later versions.

Could you share your fragment shader and a more detailed code snippet? I’ve read that GL_TEXTURE_2D can be used with an extension, at least.

Thank you so much!