Argus example of a CUDA producer?

I am working on a prototype where I want to blend two video inputs using a CUDA kernel and finally encode the result with H.265.

Inside the jetson_multimedia_api I have found the sample “jetson_multimedia_api/argus/samples/syncSensor”. It opens two cameras and, using EGLStreams, joins them in a shared context where I can access them from CUDA.
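For reference, the consumer side of that sample boils down to roughly the following sequence (paraphrased from memory of the CUDA driver API's EGL interop; check the sample for the authoritative code, error checks omitted):

```cpp
#include <cuda.h>
#include <cudaEGL.h>

// 'eglStream' is an EGLStreamKHR that a producer (e.g. Argus) feeds.
CUeglStreamConnection conn;
cuEGLStreamConsumerConnect(&conn, eglStream);

CUgraphicsResource resource;
// Last argument is the acquire timeout; see the driver API docs for units.
cuEGLStreamConsumerAcquireFrame(&conn, &resource, nullptr, 16000);

CUeglFrame frame;
cuGraphicsResourceGetMappedEglFrame(&frame, resource, 0, 0);
// frame.frame.pPitch[0] (or pArray[0], depending on frame.frameType)
// can now be handed to a CUDA kernel.

cuEGLStreamConsumerReleaseFrame(&conn, resource, nullptr);
```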

I have also found the sample “jetson_multimedia_api/argus/samples/gstVideoEncode”. This example opens a camera and sends images via an EGLStream to a GStreamer pipeline.

The part that I am missing is an example of how to create a stream, and the buffers it needs, that can be accessed by CUDA.

I have found various snippets, but I have not gained the overview that allows me to do this in the context of the Argus examples. Other examples encapsulate the EGL interface differently, so I can't find my way around them.

Any recommendations on what to read or which examples to examine are more than welcome.

Kind regards


We suggest using the NvBuffer APIs. You can call createNvBuffer()/copyToNvBuffer() to put data into an NvBuffer, and call NvBufferComposite() to blend the input frames. For video encoding, you can set the output plane to V4L2_MEMORY_DMABUF and send the NvBuffer directly to the encoder.
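A rough sketch of the composite call is below. The struct field and flag names are from nvbuf_utils.h as I recall them; please verify against the header on your JetPack release:

```cpp
#include <string.h>
#include "nvbuf_utils.h"

// Blend two full-frame sources (given as dma-buf fds) 50/50 into dst_fd.
int composite_two(int src_fd0, int src_fd1, int dst_fd,
                  uint32_t w, uint32_t h)
{
    int src_fds[2] = { src_fd0, src_fd1 };

    NvBufferCompositeParams params;
    memset(&params, 0, sizeof(params));
    params.input_buf_count = 2;
    params.composite_flag  = NVBUFFER_COMPOSITE | NVBUFFER_BLEND;

    for (int i = 0; i < 2; ++i) {
        params.src_comp_rect[i].top    = 0;
        params.src_comp_rect[i].left   = 0;
        params.src_comp_rect[i].width  = w;
        params.src_comp_rect[i].height = h;
        params.dst_comp_rect[i]        = params.src_comp_rect[i];
        params.dst_comp_rect_alpha[i]  = 0.5f;  // equal-weight blend
    }
    return NvBufferComposite(src_fds, dst_fd, &params);
}
```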

Please refer to the samples:


Hi DaneLLL,

Thank you for your response.

I have examined the examples, but I think they point me in the wrong direction, for a couple of reasons.

I need to be able to access the data from a CUDA kernel, as the “blend” is not trivial: I am implementing stitching between the two images that takes distortion into account while fading between them.

NvBufferComposite therefore seems wrong for my use case.
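To illustrate why a fixed-function composite won't do: the per-pixel fade I have in mind looks roughly like the kernel below. This is my own sketch, not from any sample; the `weight` mask, RGBA layout, and pitch handling are assumptions, and a real stitcher would also remap coordinates through the lens-distortion model.

```cpp
// Blend two pitched RGBA frames with a per-pixel fade weight in [0,1].
__global__ void blendKernel(const uchar4 *a, const uchar4 *b, uchar4 *out,
                            const float *weight, int width, int height,
                            int pitchPx /* pitch in pixels, not bytes */)
{
    int x = blockIdx.x * blockDim.x + threadIdx.x;
    int y = blockIdx.y * blockDim.y + threadIdx.y;
    if (x >= width || y >= height) return;

    int i = y * pitchPx + x;
    float w = weight[y * width + x];      // hypothetical fade mask
    uchar4 pa = a[i], pb = b[i];
    out[i] = make_uchar4(
        (unsigned char)(pa.x * (1.f - w) + pb.x * w),
        (unsigned char)(pa.y * (1.f - w) + pb.y * w),
        (unsigned char)(pa.z * (1.f - w) + pb.z * w),
        255);
}

// Launched as e.g.:
//   dim3 block(16, 16);
//   dim3 grid((width + 15) / 16, (height + 15) / 16);
//   blendKernel<<<grid, block>>>(a, b, out, weight, width, height, pitchPx);
```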

My final endpoint is a GStreamer pipeline, so I think the right track is the argus/gstVideoEncode sample.

What I figure I need is an example of how to create an EGLStream and allocate EGL frames for it.

The syncSensor example shows how to access such EGL frames from CUDA.
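From the CUDA driver API's EGL interop functions, I imagine the producer side would look roughly like the following. This is an untested sketch: `display`, `width`, and `height` are assumed to be set up elsewhere, `eglCreateStreamKHR` must be fetched via `eglGetProcAddress`, and error checks are omitted.

```cpp
#include <cuda.h>
#include <cudaEGL.h>

EGLStreamKHR stream = eglCreateStreamKHR(display, nullptr);
// ...connect the consumer (e.g. the encoder side) to 'stream' first...

CUeglStreamConnection conn;
cuEGLStreamProducerConnect(&conn, stream, width, height);

// Describe one pitched RGBA frame backed by plain device memory.
CUeglFrame frame;
memset(&frame, 0, sizeof(frame));
frame.width          = width;
frame.height         = height;
frame.depth          = 1;
frame.planeCount     = 1;
frame.numChannels    = 4;
frame.pitch          = width * 4;
frame.frameType      = CU_EGL_FRAME_TYPE_PITCH;
frame.eglColorFormat = CU_EGL_COLOR_FORMAT_ARGB;
frame.cuFormat       = CU_AD_FORMAT_UNSIGNED_INT8;
cuMemAlloc((CUdeviceptr *)&frame.frame.pPitch[0],
           (size_t)width * height * 4);

// Per frame: fill the buffer with a kernel, then hand it to the consumer.
cuEGLStreamProducerPresentFrame(&conn, frame, nullptr);
// Once the consumer is done, the frame comes back for reuse:
cuEGLStreamProducerReturnFrame(&conn, &frame, nullptr);
```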

Kind regards


There’s a DeepStream element to undistort camera viewpoints. If you’re going to use GStreamer, it might suit your purposes as a pre-built element:

If you don’t need the rest of DeepStream, the source for some plugins is included with DeepStream itself in /opt/nvidia/deepstream/deepstream/sources/gst-plugins. As of DeepStream 5.1 I don’t see the dewarper source there, but it could appear in the future. Some DeepStream plugins require DeepStream metadata to work, but not all; you could try it on its own and see if it works. It expects an RGBA buffer rather than NV12, so you’ll likely need an nvvideoconvert or nvvidconv before it.
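For a quick test, a pipeline might look something like the line below. This is a hypothetical sketch: the dewarper config file is something you would write yourself, and the element and property names should be checked against your DeepStream version with gst-inspect-1.0.

```shell
# RGBA in/out around nvdewarper, NV12 back out for the encoder.
gst-launch-1.0 nvarguscamerasrc ! \
  'video/x-raw(memory:NVMM),width=1920,height=1080' ! \
  nvvideoconvert ! 'video/x-raw(memory:NVMM),format=RGBA' ! \
  nvdewarper config-file=config_dewarper.txt ! \
  nvvideoconvert ! 'video/x-raw(memory:NVMM),format=NV12' ! \
  nvv4l2h265enc ! h265parse ! matroskamux ! filesink location=out.mkv
```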