I bought a Jetson TK1 board, and I want to use OpenMAX or GStreamer to encode a raw YUV420 stream to H.264 or VP8 frame by frame.
Really, I just need the hardware-encoded OpenMAX frame data, but some people said that OpenMAX is not recommended on the TK1 platform. I asked Kulve and he gave me some very useful suggestions about how to pull data into a GstBuffer from the pipeline using appsink in GStreamer. But I'm still a little confused about the relationship between "buffer" and "encoded frame". Could someone explain the relationship and difference between a GstBuffer (as used in appsink) and an encoded frame? Or could anyone give me some advice on how to extract correctly encoded frame-by-frame data from a GStreamer stream?
Thanks a lot!
GStreamer on Jetson TK1 is implemented on top of OpenMAX, but only the GStreamer API is supported, and the lower layers may change without notice.
If you have your appsink right after the video encoder plugin, try assuming that one buffer is one encoded frame and see if that works.
In my code I'm piping the encoded data to the rtph264pay plugin, and I believe that plugin knows where the frame boundaries are, so you could check its source to see how it does it.
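If the buffers turn out not to be one-per-frame, you can also find the boundaries yourself by scanning the H.264 Annex-B byte stream for start codes (00 00 01, optionally preceded by an extra zero byte). A minimal sketch in plain C with made-up sample bytes; this is an illustration of the Annex-B framing rule, not code taken from rtph264pay:

```c
#include <stddef.h>
#include <stdio.h>

/* Return the index of the next Annex-B start code at or after `pos`,
 * or `len` if there is none. A 00 00 01 preceded by a zero byte is
 * reported at the leading zero (the 4-byte form 00 00 00 01). */
static size_t next_start_code(const unsigned char *d, size_t len, size_t pos)
{
    for (size_t i = pos; i + 3 <= len; i++) {
        if (d[i] == 0 && d[i + 1] == 0 && d[i + 2] == 1)
            return (i > pos && d[i - 1] == 0) ? i - 1 : i;
    }
    return len;
}

/* Split the byte stream into NAL units, printing each unit's extent,
 * and return how many units were found. */
static size_t count_nals(const unsigned char *d, size_t len)
{
    size_t count = 0;
    size_t pos = next_start_code(d, len, 0);

    while (pos < len) {
        /* The current NAL unit runs from this start code up to the
         * next one (or the end of the data). */
        size_t next = next_start_code(d, len, pos + 3);
        printf("NAL unit: bytes %zu..%zu\n", pos, next);
        count++;
        pos = next;
    }
    return count;
}
```

Note that a "frame" can consist of more than one NAL unit (e.g. SPS and PPS before an IDR slice), so for true frame boundaries you would additionally look at the NAL unit type byte that follows each start code.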