I need to use a Jetson TX2 as an H.264 encoder to stream large video (5120x5120).
I know that the GStreamer H.264 encoder is limited to a maximum resolution of 4096x4096, so I need to split each frame into tiles.
If there is another solution, I would be glad to hear it.
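For example, one tiling approach I am considering is cropping the 5120x5120 frame into four 2560x2560 tiles and giving each tile its own encoder instance, roughly like this (videotestsrc stands in for my real source, the file sinks are placeholders, and I have not verified that four omxh264enc sessions keep up at my frame rate):

gst-launch-1.0 -e videotestsrc ! 'video/x-raw,width=5120,height=5120,format=I420' ! tee name=t \
  t. ! queue ! videocrop right=2560 bottom=2560 ! nvvidconv ! 'video/x-raw(memory:NVMM),format=I420' ! omxh264enc ! h264parse ! matroskamux ! filesink location=tile_tl.mkv \
  t. ! queue ! videocrop left=2560 bottom=2560  ! nvvidconv ! 'video/x-raw(memory:NVMM),format=I420' ! omxh264enc ! h264parse ! matroskamux ! filesink location=tile_tr.mkv \
  t. ! queue ! videocrop right=2560 top=2560    ! nvvidconv ! 'video/x-raw(memory:NVMM),format=I420' ! omxh264enc ! h264parse ! matroskamux ! filesink location=tile_bl.mkv \
  t. ! queue ! videocrop left=2560 top=2560     ! nvvidconv ! 'video/x-raw(memory:NVMM),format=I420' ! omxh264enc ! h264parse ! matroskamux ! filesink location=tile_br.mkv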
I also need to know when hardware encoding actually takes place. I know that I can use a video convert element with NVMM memory
(something like gst-launch-1.0 … videoconvert (memory:NVMM)),
but I can't see any effect when I look at CPU usage. Do I need to do any special configuration to
get the hardware encoder working?
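Is watching tegrastats in a second terminal, together with gst-inspect-1.0 omxh264enc to confirm the plugin is present, the right way to verify this? For example (the tegrastats path depends on the L4T release, and I am not sure the MSENC/NVENC clock entry is the correct indicator):

gst-inspect-1.0 omxh264enc
sudo ~/tegrastats    # or /usr/bin/tegrastats; I would expect an MSENC/NVENC clock entry to show up while the hardware encoder is running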
The incoming stream comes from a camera at 5120x5120, so I use appsrc as the GStreamer source
and I need to do the encoding in the pipeline. What I use today is the following pipeline (the videotestsrc will be replaced by appsrc):
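It is roughly of this shape (the 1080p caps and the file sink are placeholders, not my exact command; the second variant shows the plain video/x-raw caps mentioned in question 2 below):

gst-launch-1.0 -e videotestsrc ! 'video/x-raw,width=1920,height=1080,format=I420' ! nvvidconv ! 'video/x-raw(memory:NVMM),format=I420' ! omxh264enc bitrate=8000000 ! h264parse ! qtmux ! filesink location=out.mp4

gst-launch-1.0 -e videotestsrc ! 'video/x-raw,width=1920,height=1080,format=I420' ! omxh264enc bitrate=8000000 ! h264parse ! qtmux ! filesink location=out.mp4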
1. Is omxh264enc a HW encoder or a SW encoder?
2. If the caps filter before omxh264enc uses (memory:NVMM) instead of plain video/x-raw, does that mean I am using a CPU buffer with HW encoding, or SW encoding?
I can't find any documentation that clarifies these issues.