I am trying to use GStreamer 1.0 and the omxh264enc plugin to encode 1920x1080 video streams on a Tegra TK1. When I run independent GStreamer instances to encode each stream separately, I get ~28 FPS encoding one stream and ~26 FPS encoding two streams. But if I merge the two HD streams into one by stacking the frames vertically into a 1920x2160 frame, encoding slows down to ~15 FPS. So my questions are:
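For reference, the stacking step looks roughly like the sketch below (videotestsrc stands in for my camera sources, and the file sink is just for illustration; I am using the `compositor` element here, which requires GStreamer 1.6+, with `videomixer` being the older equivalent):

```shell
# Sketch: stack two 1080p sources vertically into one 1920x2160 frame
# before handing it to the hardware encoder.
gst-launch-1.0 \
  compositor name=mix \
    sink_0::xpos=0 sink_0::ypos=0 \
    sink_1::xpos=0 sink_1::ypos=1080 \
  ! 'video/x-raw,width=1920,height=2160' \
  ! omxh264enc ! h264parse ! qtmux ! filesink location=stacked.mp4 \
  videotestsrc ! 'video/x-raw,width=1920,height=1080' ! mix.sink_0 \
  videotestsrc ! 'video/x-raw,width=1920,height=1080' ! mix.sink_1
```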
Does the Tegra TK1 H.264 encoder have an upper limit on resolution above which encoding no longer runs at ~30 FPS?
Why can't I reach exactly 30 FPS even with a single video stream? (I have maxed out the CPU and GPU performance settings.) The Tegra documentation claims the chip can encode a 1080p stream at 30 FPS.
What is the best way (on the Tegra TK1) to do the following: capture two HD streams from two USB 3.0 cameras, encode them to H.264 in a synchronized fashion, and stream them out? It is important for me that the frames from the two cameras stay synchronized. If I use several GStreamer instances, it is hard to sync the frames on the receiving side.
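One approach I have been considering is a single pipeline with two capture/encode branches feeding one muxer, so both branches share the same pipeline clock and timestamps. A sketch of what I mean (the device nodes, host address, and port are placeholders, and I am assuming mpegtsmux accepts two parsed H.264 streams):

```shell
# Sketch: one pipeline, two capture/encode branches, muxed into a single
# MPEG-TS so both elementary streams carry timestamps from one clock.
gst-launch-1.0 \
  mpegtsmux name=mux ! udpsink host=192.168.1.100 port=5000 \
  v4l2src device=/dev/video0 \
    ! 'video/x-raw,width=1920,height=1080,framerate=30/1' \
    ! omxh264enc ! h264parse ! mux. \
  v4l2src device=/dev/video1 \
    ! 'video/x-raw,width=1920,height=1080,framerate=30/1' \
    ! omxh264enc ! h264parse ! mux.
```

Would something like this keep the two streams frame-aligned on the receiver, or is the stacked-frame approach still the only reliable way to guarantee synchronization?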