s.n.3
I’m trying the following pipeline to play the video while resizing it.
launch string: filesrc location="4K60P_H.264_100M).mp4" ! decodebin ! nvvidconv ! video/x-raw(memory:NVMM), format=RGBA, width=1920, height=1080 ! appsink name=mysink sync=false
But the appsink doesn’t receive a resized EGL image; it is still the original size.
Why does this pipeline not work properly?
Hi,
Please try the string:
launch string: filesrc location="4K60P_H.264_100M).mp4" ! decodebin ! nvvidconv ! video/x-raw(memory:NVMM), format=RGBA, width=1920, height=1080 ! nvvidconv ! video/x-raw ! videoconvert ! video/x-raw,format=BGR ! appsink name=mysink sync=false
Please convert the frame data to BGR in a CPU buffer and then send it to appsink.
s.n.3
I don’t use a CPU buffer.
I pass the data from GStreamer to the GLSL shader as an EGL image.
What should I do in this case?
The following is one part of my code.
#include <gst/gst.h>
#include "nvbuf_utils.h" /* ExtractFdFromNvBuffer, NvEGLImageFromFd */

EGLImageKHR hEglImage;
GstSample *sample = getGstSample();
GstBuffer *buffer = NULL;
GstCaps *caps = NULL;
GstMapInfo map = {0};
int dmabuf_fd = 0;

caps = gst_sample_get_caps (sample); /* negotiated caps of this sample */
buffer = gst_sample_get_buffer (sample);
gst_buffer_map (buffer, &map, GST_MAP_READ);
/* For NVMM memory, the mapped data holds an NvBuffer; extract its dmabuf fd */
ExtractFdFromNvBuffer ((void *)map.data, &dmabuf_fd);
hEglImage = NvEGLImageFromFd (egl_display, dmabuf_fd);
glActiveTexture (GL_TEXTURE0);
glBindTexture (GL_TEXTURE_EXTERNAL_OES, texture_id);
glEGLImageTargetTexture2DOES (GL_TEXTURE_EXTERNAL_OES, hEglImage);
/* After rendering, release the EGL image and the buffer mapping */
NvDestroyEGLImage (egl_display, hEglImage);
gst_buffer_unmap (buffer, &map);
gst_sample_unref (sample);
Hi,
Do you use JetPack 4 or 5? Please share the version you are using for reference.
Hi,
NvBuffer APIs are supported on JetPack 4, so this should work. Please try applying the pipeline to the sample:
How to run RTP Camera in deepstream on Nano - #29 by DaneLLL
And check whether the width and height in NvBufferParams are 1920x1080.
system
Closed
This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.