In order to save the raw frame data, I’m trying:
$ gst-launch-1.0 nvcamerasrc num-buffers=10 fpsRange="30.0 30.0" ! 'video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, format=(string)I420, framerate=(fraction)30/1' ! filesink location=video.raw
Unfortunately, the generated file only contains 480 bytes per frame, instead of the expected 1920*1080*1.5 = 3,110,400 bytes for a 1920x1080 I420 frame.
In fact, if I use GStreamer programmatically and set up an appsink, the new_sample callback does get called 30 times per second, but the buffer size is invariably 480 bytes. I'd post the code, but getting the simple pipeline above to work seems like the right first step.
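For what it's worth, running the same pipeline with -v and a fakesink prints the negotiated caps, so I assume it would at least confirm whether the 1920x1080 I420 caps are really being applied (this is just a diagnostic sketch, not my actual code):
$ gst-launch-1.0 -v nvcamerasrc num-buffers=10 fpsRange="30.0 30.0" ! 'video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, format=(string)I420, framerate=(fraction)30/1' ! fakesink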
Interestingly, the following works for h264:
$ gst-launch-1.0 nvcamerasrc num-buffers=100 fpsRange="30.0 30.0" ! 'video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, format=(string)I420, framerate=(fraction)30/1' ! omxh264enc ! h264parse ! qtmux ! filesink location=video.mp4
Breaking up the pipeline:
nvcamerasrc - data source is the CSI camera
omxh264enc - encodes to H.264
h264parse - not sure what it actually does, but it is needed (see the gst-inspect note after this list)
qtmux - puts the video into a QuickTime container
filesink - saves the QuickTime file
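(Aside: the element description can be checked with gst-inspect-1.0; my rough understanding is that h264parse parses and aligns the raw H.264 stream into the form qtmux expects, but I haven't verified that.)
$ gst-inspect-1.0 h264parse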
If I then:
$ vlc video.mp4
It plays just fine.
So somehow omxh264enc can get at the real frame data, and is not stuck with the measly 480 bytes per frame that I get when saving straight to a file.
How can I get the raw frame data?
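In case it helps clarify what I'm after: my guess is that with the memory:NVMM caps, what reaches filesink is only a reference to the hardware buffer rather than the pixel data itself, and that something like nvvidconv is needed to copy each frame into system memory first. That is only a guess, and I haven't confirmed that the following pipeline produces correct output:
$ gst-launch-1.0 nvcamerasrc num-buffers=10 fpsRange="30.0 30.0" ! 'video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, format=(string)I420, framerate=(fraction)30/1' ! nvvidconv ! 'video/x-raw, format=(string)I420' ! filesink location=video.raw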