~/tegra_multimedia_api/argus/apps/camera/renderer/Composer.h::Composer class

Hi Folks,

I intend to use the Composer class in my pipeline. In one of the NVIDIA example applications, argus_camera (i.e. ~/tegra_multimedia_api/argus/apps/camera/ui/camera/Main.cpp), this class is used for the camera preview. I would like to use this class to put my 'processed' images/video on screen.

I would appreciate any guidance on this.

Thanks

Hi,
The code is at

tegra_multimedia_api/argus/apps/camera/renderer/Composer.cpp
tegra_multimedia_api/argus/apps/camera/renderer/StreamConsumer.cpp

At initialization, it compiles the vertex and fragment shaders and binds the EGLStream to a texture.
Every time it gets a frame, it executes

glBindTexture();
...
glDrawArrays();

and then swaps buffers.

It is the same as the preview consumer at

tegra_multimedia_api/argus/samples/utils/PreviewConsumer.cpp
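The per-frame path described above can be sketched roughly as follows. This is a minimal sketch, not the actual StreamConsumer code: it assumes an already-initialized EGL display/surface/context, a compiled shader program, vertex attributes set up at init time, and an EGLStream already connected as a GL texture consumer (via the EGL_KHR_stream_consumer_gltexture extension). In real code the KHR entry points are fetched with eglGetProcAddress.

```cpp
// Sketch only: per-frame render path, as used by Composer/StreamConsumer.
// Full version: tegra_multimedia_api/argus/apps/camera/renderer/StreamConsumer.cpp

#include <EGL/egl.h>
#include <EGL/eglext.h>
#include <GLES2/gl2.h>
#include <GLES2/gl2ext.h>

void renderOneFrame(EGLDisplay display, EGLSurface surface,
                    EGLStreamKHR stream, GLuint texId, GLuint program)
{
    // Latch the newest frame from the EGLStream into the consumer texture.
    eglStreamConsumerAcquireKHR(display, stream);

    glUseProgram(program);
    // Camera frames arrive as external-OES textures, not GL_TEXTURE_2D.
    glBindTexture(GL_TEXTURE_EXTERNAL_OES, texId);

    // Draw a full-screen quad; attribute pointers were configured at init.
    glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);

    // Hand the frame back to the stream, then present.
    eglStreamConsumerReleaseKHR(display, stream);
    eglSwapBuffers(display, surface);
}
```

If your post-processing stage is itself an EGLStream producer, the same consumer loop works unchanged: the composer only sees the stream, not who fills it.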

Hi DaneLLL

Thanks for your help.

In the examples you have provided, the cameras feed frames directly to the preview/composer. In my situation I want to put 'post-processed' video frames on the composer/screen, and I am looking for a way to do that. So the two aforesaid examples do not match my requirement.

Thanks

We have several samples for processing frames, for your reference.
GL APIs

tegra_multimedia_api/argus/samples/openglBox

CUeglFrame

tegra_multimedia_api/argus/samples/cudaHistogram

NvBuffer

tegra_multimedia_api/samples

tegra_multimedia_api/samples/09_camera_jpeg_capture

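For the CUeglFrame route, the cudaHistogram sample maps each EGLStream frame into CUDA so kernels can read (or write) the pixels. A rough sketch of that per-frame loop, assuming a CUDA context is current and the EGLStream was already connected as a CUDA consumer with cuEGLStreamConsumerConnect() (not a complete program, hardware required):

```cpp
// Sketch only: acquire an EGLStream frame into CUDA via CUeglFrame.
// Full version: tegra_multimedia_api/argus/samples/cudaHistogram

#include <cuda.h>
#include <cudaEGL.h>

void processOneFrame(CUeglStreamConnection conn, CUstream cudaStream)
{
    CUgraphicsResource resource = nullptr;

    // Block until the producer (camera, or your post-processing stage)
    // presents a frame, then acquire it.
    cuEGLStreamConsumerAcquireFrame(&conn, &resource, &cudaStream, -1);

    // Map the acquired resource to a CUeglFrame describing the pixel planes.
    CUeglFrame frame;
    cuGraphicsResourceGetMappedEglFrame(&frame, resource, 0, 0);

    // frame.frame.pPitch[0] now points to plane 0 (e.g. the Y plane of NV12);
    // launch your CUDA kernels on the planes here.

    // Release the frame so the producer can reuse the buffer.
    cuEGLStreamConsumerReleaseFrame(&conn, resource, &cudaStream);
}
```

This is the pattern that lets a post-processing stage sit between the camera and the composer: consume from the camera's stream, process in CUDA, then produce into a second stream that the composer consumes.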