Decode, display, and encode without copying memory out of the GPU

Hi all,
I receive network streams on a TX1 or TX2 and want to decode and display them, but I don't want to copy the data out of the GPU to the CPU. Can I send the data to the display through CUDA directly after decoding?
The samples I have seen decode frames to CPU memory or encode from CPU memory. Does the data have to go through the CPU?

thanks.

We have gstreamer and tegra_multimedia_api; please check which works better for your case:
https://developer.nvidia.com/embedded/dlc/l4t-documentation-28-1

About tegra_multimedia_api: I get the NvBuffer (based on the v4l2_buffer structure), but it doesn't look like GPU memory. I want to know whether tegra_multimedia_api can decode into a GPU buffer rather than a CPU buffer.

Hi he-lixiang,
NvBuffer is HW-accessible memory, not CPU memory. You can refer to the sample tegra_multimedia_api/samples/02_video_dec_cuda.

Thanks for your reply.

I read the sample tegra_multimedia_api/samples/02_video_dec_cuda in detail and have three questions.

(1).
In the function read_decoder_input_nalu, the data is read from the file stream and memcpy'd into the NvBuffer (HW-accessible memory). Is memcpy the function that moves the data from CPU memory into the HW-accessible memory?
The code in question:
char *buffer_ptr = (char *) buffer->planes[0].data; // NvBuffer *buffer

memcpy(buffer_ptr, stream_ptr, 4);
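As a minimal sketch of what those two lines do, the following plain-C example uses a hypothetical `Plane` struct standing in for `buffer->planes[0]` (the real pointer comes from the mapped NvBuffer on Jetson); it shows that the `memcpy` is an ordinary CPU-side copy of the 4-byte Annex B start code into the plane's data pointer:

```c
#include <assert.h>
#include <stdint.h>
#include <string.h>

/* Hypothetical stand-in for one NvBuffer plane; on Jetson the data
 * pointer comes from the mapped HW-accessible buffer instead. */
typedef struct {
    uint8_t data[64];
    uint32_t bytesused;
} Plane;

/* Same pattern as read_decoder_input_nalu: copy the 4-byte Annex B
 * start code from the CPU-side stream into the plane buffer.
 * Returns the number of bytes copied. */
static uint32_t copy_start_code(Plane *plane, const uint8_t *stream_ptr)
{
    uint8_t *buffer_ptr = plane->data;  /* buffer->planes[0].data */
    memcpy(buffer_ptr, stream_ptr, 4);  /* plain CPU memcpy */
    plane->bytesused = 4;
    return plane->bytesused;
}
```

In the real sample the rest of the NAL unit is copied after the start code in the same way, and `bytesused` is set to the total size before the buffer is queued to the decoder.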
(2).
When creating the EGLImage from the dmabuf fd, is it just wrapping the NvBuffer fd, with no memory copy?

// Create EGLImage from dmabuf fd
ctx->egl_image = NvEGLImageFromFd(ctx->egl_display, buffer->planes[0].fd);

(3). How can I convert an EGLImage back to an NvBuffer?
I want to use OpenGL to process the decoded NvBuffer and get a new EGLImage, convert that EGLImage back to an NvBuffer, and finally encode and record the NvBuffer with video_enc_cuda.

(1) This is to copy the compressed H.264 stream into the input buffers. The load is light.
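To illustrate why the load is light: the parser only scans the compressed byte stream for the next start code and copies those few compressed bytes, which is tiny compared with a decoded YUV frame. A rough plain-C sketch (`find_next_start_code` is a hypothetical helper, not part of the tegra_multimedia_api):

```c
#include <stddef.h>
#include <stdint.h>

/* Hypothetical helper: return the offset of the next 4-byte Annex B
 * start code (00 00 00 01) at or after position `from`, or -1 if
 * none is found. A per-frame scan-and-copy over compressed H.264
 * data like this is cheap relative to decoded frame sizes. */
static long find_next_start_code(const uint8_t *buf, size_t len, size_t from)
{
    for (size_t i = from; i + 4 <= len; i++) {
        if (buf[i] == 0x00 && buf[i + 1] == 0x00 &&
            buf[i + 2] == 0x00 && buf[i + 3] == 0x01)
            return (long)i;
    }
    return -1;
}
```

The decoded frames themselves stay in HW-accessible NvBuffer memory; only this compressed input ever passes through a CPU copy.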

(2) It is not a memory copy.

(3) This may not be supported. You can do the post-processing in CUDA instead, or create an NvBuffer and call NvEGLImageFromFd() to get an EGLImage from it.

Thanks.