I am working on a prototype where I want to blend two video inputs using a CUDA kernel and then encode the result with H.265.
Inside jetson_multimedia_api I have found the sample "jetson_multimedia_api/argus/samples/syncSensor". It opens two cameras and, using EGL streams, joins them in a shared context where I can access them from CUDA.
I have also found the sample "jetson_multimedia_api/argus/samples/gstVideoEncode". This example opens a camera and sends images via an EGLStream to a GStreamer pipeline.
The part I am missing is an example of how to create a stream, and the buffers it needs, such that the frames can be accessed from CUDA.
I have found various snippets, but I have not gotten the overview that would allow me to do this in the context of the Argus examples. Other examples wrap the EGL interface in their own encapsulation, so I can't find my way around them.
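For reference, here is my current understanding of the CUDA consumer side of an EGLStream, pieced together from the cudaEGL.h driver API and the syncSensor sample. This is an untested sketch: the producer setup (the Argus OutputStream) and all error handling beyond the return codes are omitted, and `connectAndAcquireOnce` is just a name I made up for the illustration.

```cpp
// Sketch of a CUDA consumer attached to an EGLStream (untested).
// Assumes `stream` is an EGLStreamKHR obtained from the Argus side,
// e.g. via IEGLOutputStream::getEGLStream() as in the syncSensor sample.
#include <cuda.h>
#include <cudaEGL.h>
#include <EGL/egl.h>
#include <EGL/eglext.h>

CUresult connectAndAcquireOnce(EGLStreamKHR stream)
{
    // Attach CUDA as the consumer of the stream.
    CUeglStreamConnection conn;
    CUresult r = cuEGLStreamConsumerConnect(&conn, stream);
    if (r != CUDA_SUCCESS)
        return r;

    CUgraphicsResource resource = nullptr;
    CUstream cudaStream = nullptr;

    // Block until the producer pushes a frame. I believe the timeout is in
    // microseconds and that (unsigned)-1 means "wait forever", but I am not
    // certain -- please correct me if that is wrong.
    r = cuEGLStreamConsumerAcquireFrame(&conn, &resource, &cudaStream,
                                        (unsigned int)-1);
    if (r == CUDA_SUCCESS) {
        // Map the acquired frame so a kernel can see the underlying memory.
        CUeglFrame frame;
        r = cuGraphicsResourceGetMappedEglFrame(&frame, resource, 0, 0);
        if (r == CUDA_SUCCESS) {
            // Depending on frame.frameType, the planes are either pitched
            // pointers (frame.frame.pPitch[i]) or CUarrays
            // (frame.frame.pArray[i]). This is where I would launch the
            // blending kernel on the two acquired frames.
        }
        cuEGLStreamConsumerReleaseFrame(&conn, resource, &cudaStream);
    }

    cuEGLStreamConsumerDisconnect(&conn);
    return r;
}
```

Is this roughly the right shape, or is there a recommended way to set up the buffers on the producer side so that this consumer loop works?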
Any recommendations on what to read or which examples to examine are more than welcome.