I need to blend a frame buffer into a portion of a 4K 30 fps frame captured from a RAW-interfaced camera sensor. The frame buffer is a static background for the blend, and on top of it there will be some basic 2D animations of gauge needles and text overlaid onto this frame buffer. The background render needs to run at 30 fps, and the animated objects, whose values are received from an SPI-interfaced sensor, need to update at 10 fps or better. The result needs to be H.265 encoded and stored in an MP4 container.
Can this be done with existing GStreamer plugins, or do I need to get down and dirty with CUDA, OpenGL, etc.?
What would be a typical pipeline for something like this?
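For context, this is the rough shape I'm imagining, just as a sketch: a camera source, a static-image overlay for the background graphics, then an H.265 encoder and muxer. The device path, resolution, and overlay file name are my assumptions, and `gdkpixbufoverlay` only handles the static image case; I assume the animated needles/text would need `cairooverlay` or a `compositor` branch driven from application code, which is part of what I'm asking about.

```shell
# Sketch only (untested, hardware-dependent). Assumes a V4L2 camera at
# /dev/video0 delivering raw 4K frames, a software x265 encoder, and a
# PNG (background.png) blended at a fixed offset into the frame.
gst-launch-1.0 -e \
  v4l2src device=/dev/video0 \
  ! video/x-raw,width=3840,height=2160,framerate=30/1 \
  ! videoconvert \
  ! gdkpixbufoverlay location=background.png offset-x=100 offset-y=100 \
  ! x265enc \
  ! h265parse \
  ! mp4mux \
  ! filesink location=out.mp4
```

On an embedded platform I'd presumably swap `x265enc` for a hardware encoder element, but the overall structure is what I'm unsure about.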