OpenGL texture to GStreamer appsrc for encoding

I have a GLFW program that grabs frames from a camera (using GStreamer and appsink), draws an overlay, then hands the texture to the encoder for recording. I was using glGetTexImage() to copy the texture into a GstBuffer, but the copy was too slow.

I have now looked at several samples from the Jetson Multimedia API and other examples from this forum, but I am still having issues passing my textures to GStreamer.

My pipeline for testing:

pipelineString = FormatText("appsrc name=video%d is-live=true do-timestamp=true ! "
    "video/x-raw(memory:NVMM), format=RGBA, width=1920, height=1080, framerate=30/1 ! "
    "queue ! nvvidconv flip-method=6 ! video/x-raw(memory:NVMM), format=NV12 ! "
    "queue ! nvoverlaysink sync=false display-id=1", Cam == 3 ? 4 : Cam);
AppsrcPipe[Cam].pipeline = gst_parse_launch(pipelineString, &error);
if (error != NULL) {
    printf("Something went wrong with overlay %d initialization: %s\n", Cam, error->message);
    return NULL;
}

AppsrcPipe[Cam].src = gst_bin_get_by_name(GST_BIN(AppsrcPipe[Cam].pipeline),
                                          FormatText("video%d", Cam == 3 ? 4 : Cam));
gst_app_src_set_stream_type(GST_APP_SRC(AppsrcPipe[Cam].src), GST_APP_STREAM_TYPE_STREAM);
gst_element_set_state(AppsrcPipe[Cam].pipeline, GST_STATE_PLAYING);
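Before debugging the appsrc path itself, it can help to confirm the downstream NVMM chain in isolation. The following is a sketch, assuming a Jetson where nvvidconv and nvoverlaysink are available; videotestsrc stands in for the appsrc feed:

```shell
# Feed a synthetic RGBA frame through the same conversion/display chain
gst-launch-1.0 videotestsrc ! \
  'video/x-raw, format=RGBA, width=1920, height=1080, framerate=30/1' ! \
  nvvidconv flip-method=6 ! 'video/x-raw(memory:NVMM), format=NV12' ! \
  nvoverlaysink display-id=1 sync=false
```

If this shows clean frames, the corruption is happening before the buffer reaches appsrc.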

And my function that pushes the buffers to appsrc:

void pushBuffer(int cam){
    GstBuffer *buffer;
    GstMapInfo map = {0};
    int dmabuf_fd = 0;
    gpointer data = NULL, user_data = NULL;
    NvBufferParams par;
    GstMemoryFlags flags = (GstMemoryFlags)0;
    GLenum err;

    NvBufferCreate(&dmabuf_fd, 1920, 1080, NvBufferLayout_BlockLinear, NvBufferColorFormat_ABGR32);
    EGLImageKHR temp = NvEGLImageFromFd(MyeglDisplay, dmabuf_fd);

    glBindTexture(GL_TEXTURE_2D, _texture_id);
    glEGLImageTargetTexture2DOES(GL_TEXTURE_2D, temp);
    if ((err = glGetError()) != GL_NO_ERROR)
        printf("Got GL error: %03X\n", err);
    NvDestroyEGLImage(MyeglDisplay, temp);

    user_data = g_malloc(sizeof(int));
    *(int *)user_data = dmabuf_fd;
    NvBufferGetParams(dmabuf_fd, &par);
    data = g_malloc(par.nv_buffer_size);

    buffer = gst_buffer_new_wrapped_full(flags, data, par.nv_buffer_size, 0,
                                         par.nv_buffer_size, user_data, notify_to_destroy);

    gst_buffer_map(buffer, &map, GST_MAP_WRITE);
    memcpy(map.data, par.nv_buffer, par.nv_buffer_size);
    gst_buffer_unmap(buffer, &map);

    GstFlowReturn flow;
    g_signal_emit_by_name(AppsrcPipe[cam].src, "push-buffer", buffer, &flow);
    if (flow != GST_FLOW_OK)
        printf("flow = %d from camera %d\n", flow, cam);
}
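The notify_to_destroy callback passed to gst_buffer_new_wrapped_full() is not shown in the post; a minimal sketch, assuming user_data carries only the dmabuf fd as above, would be:

```c
/* Sketch of the GDestroyNotify handed to gst_buffer_new_wrapped_full().
   Called once GStreamer is done with the wrapped memory. */
static void notify_to_destroy(gpointer user_data)
{
    int dmabuf_fd = *(int *)user_data;
    NvBufferDestroy(dmabuf_fd);  /* release the hardware buffer */
    g_free(user_data);           /* free the int allocated for the fd */
    /* note: the g_malloc'd `data` pointer wrapped into the buffer is not
       visible here; pack it into user_data too (e.g. a small struct),
       or it leaks on every frame */
}
```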

The issue I am having is that the full image often does not make it to GStreamer; most frames are only partial (see attached image).


I took a look at your code but could not pinpoint where it is failing. It looks most likely like a synchronization problem.
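If it is a synchronization problem, the GPU may still be writing the overlay when the dmabuf is handed to appsrc, which would produce exactly these partial frames. A minimal sketch of forcing completion before the push (assuming GLES 3.0+ for fence syncs; glFinish() is the blunt alternative):

```c
/* Sketch: after the draw calls that write into the dmabuf-backed
   texture, block until the GPU has finished, then push to appsrc. */
GLsync fence = glFenceSync(GL_SYNC_GPU_COMMANDS_COMPLETE, 0);
glClientWaitSync(fence, GL_SYNC_FLUSH_COMMANDS_BIT, GL_TIMEOUT_IGNORED);
glDeleteSync(fence);
/* or simply: glFinish(); at the cost of a full pipeline stall */
```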

If it helps you and your project, we have a product that overlays a QML file over GStreamer buffers:

Best Regards,
Roberto Gutierrez
Embedded Software Engineer
