How to render data from nvjpegdec with appsink

Hi,
we are implementing camera streaming with custom rendering in an HMI app, and below is the working pipeline:
gst-launch-1.0 videotestsrc ! nvvidconv ! 'video/x-raw(memory:NVMM),format=NV12' ! nvjpegenc ! tee name=t ! queue ! nvjpegdec ! appsink name=myappsink t. ! queue ! rtpjpegpay ! udpsink host=127.0.0.1 port=8084

The problem is the data we get from appsink for rendering:
The caps look correct, I420 format, width/height 320x240:
CAPS = "video/x-raw(memory:NVMM), format=(string)I420, width=(int)320, height=(int)240, interlace-mode=(string)progressive, multiview-mode=(string)mono, multiview-flags=(GstVideoMultiviewFlagsSet)0:ffffffff:/right-view-first/left-flipped/left-flopped/right-flipped/right-flopped/half-aspect/mixed-mono, pixel-aspect-ratio=(fraction)1/1, chroma-site=(string)jpeg, colorimetry=(string)1:4:0:0, framerate=(fraction)30/1"

But the buffer size is very small: 1008 bytes.
GstBuffer *buffer = gst_sample_get_buffer (sample);
GstMapInfo bufferInfo;
gst_buffer_map (buffer, &bufferInfo, GST_MAP_READ);
/* bufferInfo.size is only 1008 here instead of a full frame */
gst_buffer_unmap (buffer, &bufferInfo);

Normally, when we use the software decoder jpegdec, the buffer size is correct.
Thanks.

nvjpegdec may output into NVMM memory by default, so what you get is an NvBuffer rather than raw pixel data (which explains the small mapped size).
You may use nvvidconv after it, before appsink, if the application reads buffers from the CPU:

gst-launch-1.0 videotestsrc ! nvvidconv ! 'video/x-raw(memory:NVMM),format=I420' ! nvjpegenc ! tee name=t ! queue ! nvjpegdec ! 'video/x-raw(memory:NVMM),format=I420' ! nvvidconv ! video/x-raw,format=I420 ! appsink name=myappsink      t. ! queue ! rtpjpegpay ! udpsink host=127.0.0.1 port=8084

# Or have nvjpegdec output to standard (CPU) memory:
gst-launch-1.0 videotestsrc ! nvvidconv ! 'video/x-raw(memory:NVMM),format=I420' ! nvjpegenc ! tee name=t ! queue ! nvjpegdec ! video/x-raw,format=I420 ! appsink name=myappsink      t. ! queue ! rtpjpegpay ! udpsink host=127.0.0.1 port=8084
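With either of the pipelines above delivering frames in standard memory, the appsink side can pull and map whole frames. A hedged sketch of the reading side (assumes an appsink element named myappsink as in the pipelines above; build against gstreamer-1.0 and gstreamer-app-1.0; this requires the running pipeline, so it is an illustration rather than a standalone program):

```c
#include <gst/gst.h>
#include <gst/app/gstappsink.h>

/* Pull one decoded I420 frame from the appsink and map it for reading. */
static void pull_one_frame(GstElement *appsink)
{
    GstSample *sample = gst_app_sink_pull_sample(GST_APP_SINK(appsink));
    if (!sample)
        return;  /* EOS or pipeline stopped */

    GstBuffer *buffer = gst_sample_get_buffer(sample);
    GstMapInfo info;
    if (gst_buffer_map(buffer, &info, GST_MAP_READ)) {
        /* info.data / info.size now cover the whole I420 frame
         * (115200 bytes for 320x240); hand it to the renderer here. */
        gst_buffer_unmap(buffer, &info);
    }
    gst_sample_unref(sample);
}
```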

Thanks for your support, it is working correctly now.
May I ask one more question: is it possible to draw the NvBuffer using OpenGL, so we can improve performance with zero copy?


You may try something like:

gst-launch-1.0 videotestsrc ! nvvidconv ! 'video/x-raw(memory:NVMM),format=I420' ! nvjpegenc ! tee name=t ! queue ! nvjpegdec ! 'video/x-raw(memory:NVMM),format=I420' ! nvegltransform ! nveglglessink      t. ! queue ! rtpjpegpay ! udpsink host=127.0.0.1 port=8084

If you want to perform some additional GPU processing before display, you may use nvivafilter before nvegltransform.
nvivafilter maps the NvBuffer to an EGL image. Be aware that the EGL image format depends on the nvivafilter output caps, and YUV formats may have row strides aligned to, for example, 256 bytes. BGRA may be easier to handle, though not faster. Searching this forum for nvivafilter will turn up various examples.
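A hedged example of where nvivafilter would sit in the pipeline (the library name below is the stock CUDA processing sample shipped with JetPack; in practice you would point customer-lib-name at your own .so implementing the gpu_process hook):

```shell
# Sketch, assuming the JetPack sample library; RGBA output caps avoid YUV stride handling.
gst-launch-1.0 videotestsrc ! nvvidconv ! 'video/x-raw(memory:NVMM),format=I420' ! nvjpegenc ! nvjpegdec ! 'video/x-raw(memory:NVMM),format=I420' ! nvivafilter customer-lib-name=libnvsample_cudaprocess.so cuda-process=true ! 'video/x-raw(memory:NVMM),format=RGBA' ! nvegltransform ! nveglglessink
```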

I mean: is it possible to draw an EGLImage created from the NvBuffer received in appsink? Converting with nvvidconv to CPU-accessible I420 increases the CPU load.

Found the solution here: