Using EGLImage as OpenMAX H.264 encoder input: the encoder output video is damaged

EGL/OpenGL ES render:

// Create a GL texture to use as the render target
glGenTextures(1, &texture);
glBindTexture(GL_TEXTURE_2D, texture);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, width_, height_, 0, GL_RGB, GL_UNSIGNED_BYTE, nullptr);

// Wrap the texture in an EGLImage so it can be shared with OpenMAX
eglimage = eglCreateImageKHR(display_, context_, EGL_GL_TEXTURE_2D_KHR,
                             (EGLClientBuffer)(uintptr_t)texture, nullptr);

// Attach the texture to an FBO and render into it
glGenFramebuffers(1, &framebuff);
glBindFramebuffer(GL_FRAMEBUFFER, framebuff);
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_TEXTURE_2D, texture, 0);

glClearColor(1, 0, 0, 1);

OpenMAX H.264 encoder:

// Hand the EGLImage to the encoder's input port
OMX_UseEGLImage(handle_, &buff, in_port_, eglimage, nullptr);

The code above is only part of the full program.

It outputs an H.264 stream successfully, but the top-left corner of the video picture is damaged.

The attachment is a frame from the output H.264 stream.

If I use YUV data read from a file as the OpenMAX H.264 encoder input, the output video picture is normal, without any damage.

Can anyone help me? It has puzzled me for a few days.

If I’m not mistaken, attempting to get EGL data into a video stream wouldn’t work. An H.264 encoder would expect input that looks like frame data (i.e. a rectangle filled with pixels), but an EGL image would, I believe, look something like an OpenGL buffer, which wouldn’t have data in that kind of structure.

It’s been a while since I’ve worked directly with OpenGL code, but I don’t see a render step in the stack you mention, and I believe that a frame would need to be rendered out as RGB before the encoder could take it in.

This link might help:
This thread for the Raspberry Pi also mentions something similar:

Hello cstotts,

Thanks for your reply.

  1. openmax decoder -> eglimage -> opengles render:
    I have successfully used an EGLImage to transfer RGB data from the decoder to the renderer on TX1, just as you mention:

So the decoded data and the EGLImage data may be of the same kind; at least they are compatible, since that path works successfully.

2. opengles render -> eglimage -> openmax encoder:
When I pass the EGLImage to the encoder it doesn't report any error, but the output H.264 video is damaged (shown in the attachment).
So I wonder whether I am making a mistake in how I use the EGLImage and encoder APIs, or whether the OpenMAX encoder's EGLImage support is simply not implemented on TX1, just as on the Raspberry Pi.

What is interesting about the image you attached is that it looks like an OpenGL surface composed of two triangles, as expected, but one of the triangles isn't rendering properly (maybe because it's an incomplete buffer?), which is partly why I recommended a render as an intermediate step.

Unfortunately I’m at the limit of my abilities here so I’m not sure what the next step is.

Thanks, cstotts!

Is there anyone else who can give me some help?

Hi ColinaA,

We don't support EGLImage as input to the encoder in the GStreamer plugins. Using the OpenMAX IL APIs directly is not encouraged, because that is not a well-tested path.


Hi kayccc,

Yes, I use the OpenMAX APIs directly, but the encoder output video is damaged (see the attachment). I don't know whether there is something wrong in how I use the API directly, or whether it is a bug in TX1. Also, are there any examples or instructions on how to use OpenMAX on TX1?

Using the OpenMAX IL APIs directly is not encouraged, because that is not a well-tested path and is not in the defined support scope.

Unfortunately, there are also no existing examples or instructions for end users.


When Android OS is running on TX1 and TX2, does it support the OpenMAX APIs? And when using the OpenMAX APIs, are video encode and decode processed by the hardware unit?