Documentation explaining fundamentals of Multimedia API

Hello,

I’ve only recently started working with the TX2 and video processing in general, and am using the Multimedia API to encode/decode video.

I’ve got some code that encodes and decodes correctly. However, I now need to add a conversion step to scale the stream, so I was going through example 07_video_convert, in which the capture plane of conv0 feeds the output plane of conv1. It made me realise I don’t have a good understanding of how the buffers on the planes interact, or what most of the values mean; so far I’ve got the API working by cargo-culting the samples and trial and error.

I’ve been going through the samples here: Jetson Linux API Reference: Main Page | NVIDIA Docs, which are really helpful, but is there any documentation that explains precisely what queuing and dequeuing buffers on the planes actually does? Or at what point the DQ callbacks are called? Or what exactly the buffer and shared_buffer arguments in the callbacks are?

Maybe this is basic video-processing knowledge I don’t have; are there any good tutorials or documentation out there that explain this?

Thanks

Hi,
The queuing and dequeuing are based on the V4L2 interfaces. The implementation is open source at:

tegra_multimedia_api/samples/common/classes/NvV4l2Element.cpp
tegra_multimedia_api/samples/common/classes/NvV4l2ElementPlane.cpp

Besides NvVideoConverter, please also refer to the NvBuffer APIs in nvbuf_utils.h. You can do conversion (scaling, cropping, pixel-format conversion) via NvBufferTransform().

In case anyone else is looking, I found this quite helpful: