I intend to use the Composer class in my pipeline. In one of the NVIDIA example applications, argus_camera (i.e. ~/tegra_multimedia_api/argus/apps/camera/ui/camera/Main.cpp), this class is used for the camera preview. I would like to use it to display my processed images/video on screen.
In the examples you have provided, the cameras feed frames directly to the preview/composer. In my case, however, I want to display post-processed video frames via the composer on screen, and I am looking for a way to do that. So the situation in the two aforementioned examples is not the same as my requirement.