There is a sample patch that runs Argus + NvVideoEncoder + GStreamer. For your case, you may replace Argus with NvBuffer and access the buffer through CUDA via EGLImage interop.
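As a minimal sketch, allocating an NvBuffer that both the encoder and CUDA can share looks roughly like this (the width, height, and color format are placeholder values; `nvbuf_utils.h` ships with tegra_multimedia_api):

```cpp
#include <cstdio>
#include "nvbuf_utils.h"  // tegra_multimedia_api

// Allocate a hardware NvBuffer and return its dmabuf fd, or -1 on failure.
// NV12 pitch-linear is a common input format for the hardware encoder.
int allocate_encoder_buffer(int width, int height)
{
    int dmabuf_fd = -1;
    if (NvBufferCreate(&dmabuf_fd, width, height,
                       NvBufferLayout_Pitch,
                       NvBufferColorFormat_NV12) != 0) {
        fprintf(stderr, "NvBufferCreate failed\n");
        return -1;
    }
    // dmabuf_fd can now be wrapped in an EGLImage for CUDA access,
    // or queued on the encoder's output plane.
    return dmabuf_fd;
}
```

The returned dmabuf fd is the handle that ties the pipeline together: the same buffer can be mapped for CUDA writes and handed to NvVideoEncoder without a CPU copy.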
I started doing the compression with tegra_multimedia_api, using GStreamer for the output. It works great if I feed the encoder input planes with data from the CPU:
CUDA -> frame to CPU -> NvBuffer -> compression -> GStreamer.
Is there any way to create an NvBuffer directly from a GPU buffer, to avoid the second step?
If I try to use cudaMemcpy to fill the input NvBuffer planes, it returns error 70, which I don't even see listed.
If I fill the planes with memcpy instead (from CPU buffers), it works fine.
Hi,
The working flow is to create an NvBuffer, wrap it in an EGLImage, and get a CUDA pointer from the EGLImage. You can then fill in data through that CUDA pointer. Sample code for getting the CUDA pointer is in several samples, such as 12_camera_v4l2_cuda.
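For reference, the EGLImage mapping in those samples looks roughly like this. This is a sketch, assuming `egl_display` is an already-initialized EGLDisplay and `dmabuf_fd` is the fd of an existing NvBuffer:

```cpp
#include <cstdio>
#include "nvbuf_utils.h"  // NvEGLImageFromFd / NvDestroyEGLImage
#include "cudaEGL.h"      // CUDA driver API EGL interop

// Map an NvBuffer into CUDA, write to it on the GPU, then unmap.
void fill_nvbuffer_from_cuda(EGLDisplay egl_display, int dmabuf_fd)
{
    // Wrap the NvBuffer's dmabuf fd in an EGLImage.
    EGLImageKHR egl_image = NvEGLImageFromFd(egl_display, dmabuf_fd);
    if (egl_image == NULL) {
        fprintf(stderr, "NvEGLImageFromFd failed\n");
        return;
    }

    cudaFree(0);  // ensure a CUDA context exists before driver API calls

    CUgraphicsResource resource = NULL;
    CUresult status = cuGraphicsEGLRegisterImage(
        &resource, egl_image, CU_GRAPHICS_MAP_RESOURCE_FLAGS_NONE);
    if (status != CUDA_SUCCESS) {
        fprintf(stderr, "cuGraphicsEGLRegisterImage failed: %d\n", status);
        NvDestroyEGLImage(egl_display, egl_image);
        return;
    }

    CUeglFrame egl_frame;
    status = cuGraphicsResourceGetMappedEglFrame(&egl_frame, resource, 0, 0);
    cuCtxSynchronize();

    // egl_frame.frame.pPitch[0] is now a device pointer to the first plane.
    // Write into it with a CUDA kernel, or with
    // cudaMemcpy(..., cudaMemcpyDeviceToDevice) from your GPU frame.

    cuCtxSynchronize();
    cuGraphicsUnregisterResource(resource);
    NvDestroyEGLImage(egl_display, egl_image);
}
```

After this, the same dmabuf fd can be queued on the encoder's output plane, so the frame never round-trips through the CPU.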