Convert Image

I have a 12 MP camera that creates images in several formats:
VYUY - V4L2_PIX_FMT_VYUY    /* 16  YUV 4:2:2 */
RG16 - V4L2_PIX_FMT_SRGGB16 /* 16  RGRG… GBGB… */

I would like to record the frames to an SD card at a frame rate of 20 fps.
Each frame is 24 MB (470 MB/s).
I would like to convert each frame to RGB and then encode it to H264 using GStreamer.
I managed to convert a frame to RGB using OpenCV, but it takes ~80 ms per frame.
Is there any hardware solution for my problem?
My platform is a Jetson Xavier AGX.
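For reference, the frame size and data rate above can be sanity-checked with a quick calculation (a sketch; it assumes a 2-byte-per-pixel format such as VYUY at the camera's 4112x3008 resolution):

```python
# Sanity-check the quoted figures: one 4112x3008 frame at 2 bytes
# per pixel (VYUY 4:2:2), streamed at 20 fps.
WIDTH, HEIGHT = 4112, 3008
BYTES_PER_PIXEL = 2
FPS = 20

frame_mb = WIDTH * HEIGHT * BYTES_PER_PIXEL / (1024 * 1024)
rate_mb_s = frame_mb * FPS
print(f"{frame_mb:.1f} MB/frame, {rate_mb_s:.1f} MB/s")  # ~23.6 MB, ~471.8 MB/s
```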

You would try a pipeline such as:

gst-launch-1.0 -e v4l2src device=/dev/video0 ! video/x-raw, format=VYUY, width=640, height=480, framerate=20/1 ! videoconvert ! nvvidconv ! nvv4l2h264enc ! h264parse ! qtmux ! filesink location=output_H264.mp4

You would adjust device to the video node of your camera, and adjust the resolution to one supported at 20 fps in VYUY format. You can check the available modes with:

v4l2-ctl -d /dev/video0 --list-formats-ext

(v4l2-ctl is provided by apt package v4l-utils).

Hey,
Thank you for your answer!
As I understand it, the manufacturer doesn't support GStreamer at a high fps,
so I wrote a C++ program that grabs frames from the camera at the wanted fps.
I would like to use NVIDIA's hardware accelerator; how can I do it from a C++ program (without GStreamer)?

You may either link your program against GStreamer and use a similar pipeline starting with appsrc (fed by your program).

Or you may use MMAPI. It is located in /usr/src/jetson_multimedia_api, if not available you can install with:

sudo apt install nvidia-l4t-jetson-multimedia-api

You would first try the samples, such as /usr/src/jetson_multimedia_api/samples/01_video_encode, but I'm not sure it will provide a container such as mp4.

I tried to use BAYER8 (RGGB)

using this GStreamer pipeline:
appsrc name=app_src blocksize=12368896 ! video/x-bayer, width=4112, height=3008, format=rggb, framerate=1/1 ! bayer2rgb ! nvvidconv ! video/x-raw, width=4112, height=3008 ! videoscale method=0 n-threads=4 ! video/x-raw, width=2000, height=1500 ! omxh264enc bitrate=8000 ! qtmux name=app_mux ! filesink name=app_sink
and received these messages:

nvbuf_utils: Could not get EGL display connection
0:00:00.499332924 30754   0x55559fc6a0 WARN                     omx gstomx.c:2826:plugin_init: Failed to load configuration file: Valid key file could not be found in search dirs (searched in: /home/nvidia/.config:/etc/xdg as per GST_OMX_CONFIG_DIR environment variable, the xdg user config directory (or XDG_CONFIG_HOME) and the system config directory (or XDG_CONFIG_DIRS)
./sd/VIS_BayerH264Encoded_Movies_11/03_02_2021-19_21_20.mp4
0:00:00.779349316 30754   0x55557ac5e0 FIXME                default gstutils.c:3981:gst_pad_create_stream_id_internal:<app_src:src> Creating random stream-id, consider implementing a deterministic way of creating a stream-id
0:00:03.369791665 30754   0x55557ac5e0 FIXME           videoencoder gstvideoencoder.c:661:gst_video_encoder_setcaps:<omxh264enc-omxh264enc0> GstVideoEncoder::reset() is deprecated
Framerate set to : 1 at NvxVideoEncoderSetParameterNvMMLiteOpen : Block : BlockType = 4 
===== NVMEDIA: NVENC =====
NvMMLiteBlockCreate : Block : BlockType = 4 
0:00:03.373431434 30754   0x55557ac5e0 WARN             omxvideoenc gstomxvideoenc.c:1860:gst_omx_video_enc_set_format:<omxh264enc-omxh264enc0> Error setting temporal_tradeoff 0 : Vendor specific error (0x00000001)
H264: Profile = 66, Level = 40 
0:00:03.523632425 30754   0x55557ac5e0 WARN                 basesrc gstbasesrc.c:3055:gst_base_src_loop:<app_src> error: Internal data stream error.
0:00:03.523681547 30754   0x55557ac5e0 WARN                 basesrc gstbasesrc.c:3055:gst_base_src_loop:<app_src> error: streaming stopped, reason error (-5)

(Chimera:30754): GStreamer-CRITICAL **: 19:21:23.548: gst_segment_to_running_time: assertion 'segment->format == format' failed
0:00:03.524101213 30754   0x55557ac5e0 FIXME               basesink gstbasesink.c:3145:gst_base_sink_default_event:<app_sink> stream-start event without group-id. Consider implementing group-id handling in the upstream elements

I am pushing frames in RGGB format and receiving a file of 0 bytes.

I have tried to run it from the console:

gst-launch-1.0 multifilesrc location="/media/sd/data/tmp/almog/frame_4112x3008_RGGB_0.bin"  ! video/x-bayer, width=4112, height=3008, format=rggb, framerate=1/1 ! bayer2rgb ! videoconvert ! video/x-raw, width=4112, height=3008 ! videoscale method=0 n-threads=4 ! video/x-raw, width=2000, height=1500 ! omxh264enc bitrate=120000000 ! queue ! qtmux name=app_mux ! filesink location="a.mp4" name=app_sink

and got:

nvbuf_utils: Could not get EGL display connection
Setting pipeline to PAUSED ...
Pipeline is PREROLLING ...
Framerate set to : 1 at NvxVideoEncoderSetParameterNvMMLiteOpen : Block : BlockType = 4 
===== NVMEDIA: NVENC =====
NvMMLiteBlockCreate : Block : BlockType = 4 
H264: Profile = 66, Level = 40 
_ValidateEncodeParams: level not supported 

(gst-launch-1.0:375): GStreamer-CRITICAL **: 19:25:54.280: gst_segment_to_running_time: assertion 'segment->format == format' failed
Pipeline is PREROLLED ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
ERROR: from element /GstPipeline:pipeline0/GstQTMux:app_mux: Could not multiplex stream.
Additional debug info:
gstqtmux.c(4561): gst_qt_mux_add_buffer (): /GstPipeline:pipeline0/GstQTMux:app_mux:
Buffer has no PTS.
Execution ended after 0:00:00.123680485
Setting pipeline to PAUSED ...
Setting pipeline to READY ...
Setting pipeline to NULL ...
Freeing pipeline ...

When trying videotestsrc instead, all works fine and I got the following (and a.mp4 gets larger and larger):

gst-launch-1.0 videotestsrc ! video/x-bayer, width=4112, height=3008, format=rggb, framerate=1/1 ! bayer2rgb ! videoconvert ! video/x-raw, width=4112, height=3008 ! videoscale method=0 n-threads=4 ! video/x-raw, width=2000, height=1500 ! omxh264enc bitrate=120000000 ! queue ! qtmux name=app_mux ! filesink location="a.mp4" name=app_sink

nvbuf_utils: Could not get EGL display connection
Setting pipeline to PAUSED …
Pipeline is PREROLLING …
Framerate set to : 1 at NvxVideoEncoderSetParameterNvMMLiteOpen : Block : BlockType = 4
===== NVMEDIA: NVENC =====
NvMMLiteBlockCreate : Block : BlockType = 4
H264: Profile = 66, Level = 40
_ValidateEncodeParams: level not supported
Pipeline is PREROLLED …
Setting pipeline to PLAYING …
New clock: GstSystemClock
^Chandling interrupt.
Interrupt: Stopping pipeline …
Execution ended after 0:00:06.234759190
Setting pipeline to PAUSED …
Setting pipeline to READY …
Setting pipeline to NULL …
Freeing pipeline …

What am I doing wrong?

I don't have your camera nor your software so it's hard to advise, but here is what I can suggest:

  • You may tell more about your camera (how it is connected: CSI, USB, or else) and whether other formats are available, for better advice.
  • First try to get correct display from your camera. Since I don’t have your camera, I generate 150 frames (10s @ 15 fps) files from videotestsrc with:
# For bayer rggb
gst-launch-1.0 videotestsrc num-buffers=150 ! video/x-bayer, format=rggb, width=4112, height=3008, framerate=15/1 ! filesink location=test_4112x3008.rggb

# For VYUY
gst-launch-1.0 videotestsrc num-buffers=150 ! video/x-raw, format=VYUY, width=4112, height=3008, framerate=15/1 ! filesink location=test_4112x3008.VYUY
  • Then display these for checking:
# For bayer rggb
gst-launch-1.0 -v filesrc location=test_4112x3008.rggb blocksize=12368896 ! video/x-bayer, width=4112, height=3008, format=rggb, framerate=15/1 ! bayer2rgb ! videoconvert ! xvimagesink

# For VYUY
gst-launch-1.0 -e filesrc location=test_4112x3008.VYUY ! videoparse format=vyuy width=4112 height=3008 framerate=15/1 ! videoconvert ! xvimagesink
  • So now you want to scale to 2000x1500 resolution. videoscale, like videotestsrc or videoconvert, is CPU-only and can quickly fall short on Jetson's CPU for high resolution × framerate, especially if the plugin doesn't support multi-threading. You may save CPU cores by doing that with nvvidconv instead, converting into NV12 format in NVMM memory as expected by the hardware encoder. You may measure fps before the encoder with:
# For bayer rggb
gst-launch-1.0 -v filesrc location=test_4112x3008.rggb blocksize=12368896 ! video/x-bayer, width=4112, height=3008, format=rggb, framerate=15/1 ! bayer2rgb ! video/x-raw, format=BGRx ! nvvidconv ! 'video/x-raw(memory:NVMM), format=NV12, width=2000, height=1500' ! fpsdisplaysink text-overlay=0 video-sink=fakesink

# For VYUY
gst-launch-1.0 -v filesrc location=test_4112x3008.VYUY ! videoparse format=vyuy width=4112 height=3008 framerate=15/1 ! videoconvert ! video/x-raw, format=NV12 ! nvvidconv ! 'video/x-raw(memory:NVMM), width=2000, height=1500' ! fpsdisplaysink text-overlay=0 video-sink=fakesink

You may also run tegrastats and check usage in both cases.

  • If you think it's ok, you would encode into H264 and store into a container. From the Bayer format, although encoding works, extra work would be needed for the container. For VYUY you would try:
gst-launch-1.0 -e filesrc location=test_4112x3008.VYUY ! videoparse format=vyuy width=4112 height=3008 framerate=15/1 ! videoconvert ! video/x-raw, format=NV12 ! nvvidconv ! 'video/x-raw(memory:NVMM), width=2000, height=1500' ! nvv4l2h264enc ! h264parse ! qtmux ! filesink location=test_VYUY_resized_to_2000x1500_h264.mov

# Decode and check with:
gst-launch-1.0 filesrc location=test_VYUY_resized_to_2000x1500_h264.mov ! qtdemux ! h264parse ! omxh264dec ! nvvidconv ! xvimagesink
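As an aside on the blocksize values used with filesrc/appsrc above: for raw video, blocksize must be exactly one frame in bytes, and 12368896 is simply 4112 × 3008 × 1 for 8-bit Bayer. A small sketch (the helper is hypothetical, not from any toolkit) to derive it for the formats discussed here:

```python
# blocksize for filesrc/appsrc must equal one raw frame in bytes;
# otherwise downstream elements receive partial buffers.
def frame_blocksize(width: int, height: int, bytes_per_pixel: int) -> int:
    return width * height * bytes_per_pixel

print(frame_blocksize(4112, 3008, 1))  # 8-bit Bayer RGGB -> 12368896
print(frame_blocksize(4112, 3008, 2))  # VYUY 4:2:2 / RG16 -> 24737792
```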

Hey,
I am using an ALVIUM 1800 C-1236c 2-3c camera connected via CSI.
v4l2-ctl -d /dev/video0 --list-formats-ext

ioctl: VIDIOC_ENUM_FMT
Index : 0
Type : Video Capture
Pixel Format: 'TP31'
Name : 0x31 MIPI DATATYPE
Size: Discrete 4112x3008
Interval: Discrete 0.073s (13.722 fps)

Index : 1
Type : Video Capture
Pixel Format: 'RGGB'
Name : 8-bit Bayer RGRG/GBGB
Size: Discrete 4112x3008
Interval: Discrete 0.073s (13.722 fps)

Index : 2
Type : Video Capture
Pixel Format: 'RG16'
Name : 16-bit Bayer RGRG/GBGB (Exp.)
Size: Discrete 4112x3008
Interval: Discrete 0.073s (13.722 fps)

Index : 3
Type : Video Capture
Pixel Format: 'RG16'
Name : 16-bit Bayer RGRG/GBGB (Exp.)
Size: Discrete 4112x3008
Interval: Discrete 0.073s (13.722 fps)

Index : 4
Type : Video Capture
Pixel Format: 'BX24'
Name : 32-bit XRGB 8-8-8-8
Size: Discrete 4112x3008
Interval: Discrete 0.073s (13.722 fps)

Index : 5
Type : Video Capture
Pixel Format: 'XR24'
Name : 32-bit BGRX 8-8-8-8
Size: Discrete 4112x3008
Interval: Discrete 0.073s (13.722 fps)

Index : 6
Type : Video Capture
Pixel Format: 'VYUY'
Name : VYUY 4:2:2
Size: Discrete 4112x3008
Interval: Discrete 0.073s (13.722 fps)

To see video from the camera I have used this pipeline:
gst-launch-1.0 v4l2src device=/dev/video0 ! videoscale ! video/x-raw,width=800,height=600 ! videoconvert ! queue ! ximagesink sync=false -vvv
and it works well.

While trying to grab frames from the camera using GStreamer's v4l2src, the camera sent frames at a low fps, so I wrote a C++ program using V4L2 directly and received the expected FPS (as described in the camera's datasheet).
The frame rates that I achieved are:
RGGB - V4L2_PIX_FMT_SRGGB8 22 fps
RG16 - V4L2_PIX_FMT_SRGGB16 22 fps
BX24 - V4L2_PIX_FMT_XRGB32 13 fps
XR24 - V4L2_PIX_FMT_XBGR32 13 fps
VYUY - V4L2_PIX_FMT_VYUY 20 fps
So now I have a buffer with a VYUY/RGGB/RG16 frame, and I would like to save it as a video at a high frame rate.
(I can't save it raw without conversion/compression because that takes 4112×3008×22÷1024÷1024 ≈ 260 MB/s for a 1-byte pixel format, or ~520 MB/s for a 2-byte pixel format.)
Because of this limitation I want to encode to H264 and save it as an mp4 file with lossless compression.
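To double-check my parenthetical data-rate figures, here is a quick sketch using the per-format rates measured above:

```python
# Double-check the raw data rates for the measured capture modes
# (4112x3008, MB = 1024*1024 bytes).
W, H = 4112, 3008
modes = [("RGGB, 1 B/px @ 22 fps", 1, 22),
         ("RG16, 2 B/px @ 22 fps", 2, 22),
         ("VYUY, 2 B/px @ 20 fps", 2, 20)]
for name, bpp, fps in modes:
    rate = W * H * bpp * fps / (1024 * 1024)
    print(f"{name}: {rate:.0f} MB/s")
# roughly 260, 519, and 472 MB/s respectively
```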

Running the suggested pipeline fails because of the high resolution, so I scaled it down:

gst-launch-1.0 -v filesrc location=frame_4112x3008_RGGB_0.bin blocksize=12368896 ! video/x-bayer, width=4112, height=3008, format=rggb, framerate=15/1 ! bayer2rgb ! videoconvert ! videoscale method=0 n-threads=4 ! video/x-raw,height=2160,width=2160 ! xvimagesink

And the image was shown correctly.

Now I would like to convert it:

gst-launch-1.0 -v filesrc location=frame_4112x3008_RGGB_0.bin blocksize=12368896 ! video/x-bayer, width=4112, height=3008, format=rggb, framerate=15/1 ! bayer2rgb ! video/x-raw, format=BGRx ! nvvidconv ! 'video/x-raw(memory:NVMM), format=NV12, width=2000, height=1500' ! fpsdisplaysink text-overlay=0 video-sink=fakesink
nvbuf_utils: Could not get EGL display connection
Setting pipeline to PAUSED …
Pipeline is PREROLLING …
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0/GstFakeSink:fakesink0: sync = true
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:src: caps = video/x-bayer, width=(int)4112, height=(int)3008, format=(string)rggb, framerate=(fraction)15/1
/GstPipeline:pipeline0/GstBayer2RGB:bayer2rgb0.GstPad:src: caps = video/x-raw, width=(int)4112, height=(int)3008, framerate=(fraction)15/1, format=(string)BGRx
/GstPipeline:pipeline0/GstCapsFilter:capsfilter1.GstPad:src: caps = video/x-raw, width=(int)4112, height=(int)3008, framerate=(fraction)15/1, format=(string)BGRx
/GstPipeline:pipeline0/Gstnvvconv:nvvconv0.GstPad:src: caps = video/x-raw(memory:NVMM), width=(int)2000, height=(int)1500, framerate=(fraction)15/1, format=(string)NV12, pixel-aspect-ratio=(fraction)771/752
/GstPipeline:pipeline0/GstCapsFilter:capsfilter2.GstPad:src: caps = video/x-raw(memory:NVMM), width=(int)2000, height=(int)1500, framerate=(fraction)15/1, format=(string)NV12, pixel-aspect-ratio=(fraction)771/752
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0.GstGhostPad:sink.GstProxyPad:proxypad0: caps = video/x-raw(memory:NVMM), width=(int)2000, height=(int)1500, framerate=(fraction)15/1, format=(string)NV12, pixel-aspect-ratio=(fraction)771/752
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0/GstFakeSink:fakesink0.GstPad:sink: caps = video/x-raw(memory:NVMM), width=(int)2000, height=(int)1500, framerate=(fraction)15/1, format=(string)NV12, pixel-aspect-ratio=(fraction)771/752
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0.GstGhostPad:sink: caps = video/x-raw(memory:NVMM), width=(int)2000, height=(int)1500, framerate=(fraction)15/1, format=(string)NV12, pixel-aspect-ratio=(fraction)771/752
/GstPipeline:pipeline0/GstCapsFilter:capsfilter2.GstPad:sink: caps = video/x-raw(memory:NVMM), width=(int)2000, height=(int)1500, framerate=(fraction)15/1, format=(string)NV12, pixel-aspect-ratio=(fraction)771/752
/GstPipeline:pipeline0/Gstnvvconv:nvvconv0.GstPad:sink: caps = video/x-raw, width=(int)4112, height=(int)3008, framerate=(fraction)15/1, format=(string)BGRx
/GstPipeline:pipeline0/GstCapsFilter:capsfilter1.GstPad:sink: caps = video/x-raw, width=(int)4112, height=(int)3008, framerate=(fraction)15/1, format=(string)BGRx
/GstPipeline:pipeline0/GstBayer2RGB:bayer2rgb0.GstPad:sink: caps = video/x-bayer, width=(int)4112, height=(int)3008, format=(string)rggb, framerate=(fraction)15/1
Pipeline is PREROLLED …
Setting pipeline to PLAYING …
New clock: GstSystemClock
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0/GstFakeSink:fakesink0: sync = true

** (gst-launch-1.0:25603): WARNING **: 10:41:19.941: bayer2rgb0: size 144384 is not a multiple of unit size 12368896
ERROR: from element /GstPipeline:pipeline0/GstFileSrc:filesrc0: Internal data stream error.
Additional debug info:
gstbasesrc.c(3055): gst_base_src_loop (): /GstPipeline:pipeline0/GstFileSrc:filesrc0:
streaming stopped, reason error (-5)
Execution ended after 0:00:00.000430217
Setting pipeline to PAUSED …
Setting pipeline to READY …
Setting pipeline to NULL …
Freeing pipeline …

BTW, I don't want to scale it down, but let's keep this flow for now.

Converting to NV12 also fails, for a reason that I don't understand:

nvidia@nvidia:~$ gst-launch-1.0 -v filesrc location=frame_4112x3008_RGGB_0.bin blocksize=12368896 ! video/x-bayer, width=4112, height=3008, format=rggb, framerate=15/1 ! bayer2rgb ! video/x-raw, format=BGRx ! nvvidconv ! 'video/x-raw(memory:NVMM), format=NV12, width=2000, height=1500' ! fpsdisplaysink text-overlay=0 video-sink=fakesink
nvbuf_utils: Could not get EGL display connection
Setting pipeline to PAUSED …
Pipeline is PREROLLING …
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0/GstFakeSink:fakesink0: sync = true
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:src: caps = video/x-bayer, width=(int)4112, height=(int)3008, format=(string)rggb, framerate=(fraction)15/1
/GstPipeline:pipeline0/GstBayer2RGB:bayer2rgb0.GstPad:src: caps = video/x-raw, width=(int)4112, height=(int)3008, framerate=(fraction)15/1, format=(string)BGRx
/GstPipeline:pipeline0/GstCapsFilter:capsfilter1.GstPad:src: caps = video/x-raw, width=(int)4112, height=(int)3008, framerate=(fraction)15/1, format=(string)BGRx
/GstPipeline:pipeline0/Gstnvvconv:nvvconv0.GstPad:src: caps = video/x-raw(memory:NVMM), width=(int)2000, height=(int)1500, framerate=(fraction)15/1, format=(string)NV12, pixel-aspect-ratio=(fraction)771/752
/GstPipeline:pipeline0/GstCapsFilter:capsfilter2.GstPad:src: caps = video/x-raw(memory:NVMM), width=(int)2000, height=(int)1500, framerate=(fraction)15/1, format=(string)NV12, pixel-aspect-ratio=(fraction)771/752
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0.GstGhostPad:sink.GstProxyPad:proxypad0: caps = video/x-raw(memory:NVMM), width=(int)2000, height=(int)1500, framerate=(fraction)15/1, format=(string)NV12, pixel-aspect-ratio=(fraction)771/752
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0/GstFakeSink:fakesink0.GstPad:sink: caps = video/x-raw(memory:NVMM), width=(int)2000, height=(int)1500, framerate=(fraction)15/1, format=(string)NV12, pixel-aspect-ratio=(fraction)771/752
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0.GstGhostPad:sink: caps = video/x-raw(memory:NVMM), width=(int)2000, height=(int)1500, framerate=(fraction)15/1, format=(string)NV12, pixel-aspect-ratio=(fraction)771/752
/GstPipeline:pipeline0/GstCapsFilter:capsfilter2.GstPad:sink: caps = video/x-raw(memory:NVMM), width=(int)2000, height=(int)1500, framerate=(fraction)15/1, format=(string)NV12, pixel-aspect-ratio=(fraction)771/752
/GstPipeline:pipeline0/Gstnvvconv:nvvconv0.GstPad:sink: caps = video/x-raw, width=(int)4112, height=(int)3008, framerate=(fraction)15/1, format=(string)BGRx
/GstPipeline:pipeline0/GstCapsFilter:capsfilter1.GstPad:sink: caps = video/x-raw, width=(int)4112, height=(int)3008, framerate=(fraction)15/1, format=(string)BGRx
/GstPipeline:pipeline0/GstBayer2RGB:bayer2rgb0.GstPad:sink: caps = video/x-bayer, width=(int)4112, height=(int)3008, format=(string)rggb, framerate=(fraction)15/1
Pipeline is PREROLLED …
Setting pipeline to PLAYING …
New clock: GstSystemClock
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0/GstFakeSink:fakesink0: sync = true

** (gst-launch-1.0:31485): WARNING **: 10:56:25.500: bayer2rgb0: size 144384 is not a multiple of unit size 12368896
ERROR: from element /GstPipeline:pipeline0/GstFileSrc:filesrc0: Internal data stream error.
Additional debug info:
gstbasesrc.c(3055): gst_base_src_loop (): /GstPipeline:pipeline0/GstFileSrc:filesrc0:
streaming stopped, reason error (-5)
Execution ended after 0:00:00.000439625
Setting pipeline to PAUSED …
Setting pipeline to READY …
Setting pipeline to NULL …
Freeing pipeline …

So I have tried to do it without downscale:

nvidia@nvidia:~$ gst-launch-1.0 -v filesrc location=frame_4112x3008_RGGB_0.bin blocksize=12368896 ! video/x-bayer, width=4112, height=3008, format=rggb, framerate=15/1 ! bayer2rgb ! video/x-raw, format=BGRx ! nvvidconv ! 'video/x-raw(memory:NVMM), format=NV12, width=4112, height=3008' ! fpsdisplaysink text-overlay=0 video-sink=fakesink
nvbuf_utils: Could not get EGL display connection
Setting pipeline to PAUSED …
Pipeline is PREROLLING …
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0/GstFakeSink:fakesink0: sync = true
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:src: caps = video/x-bayer, width=(int)4112, height=(int)3008, format=(string)rggb, framerate=(fraction)15/1
/GstPipeline:pipeline0/GstBayer2RGB:bayer2rgb0.GstPad:src: caps = video/x-raw, width=(int)4112, height=(int)3008, framerate=(fraction)15/1, format=(string)BGRx
/GstPipeline:pipeline0/GstCapsFilter:capsfilter1.GstPad:src: caps = video/x-raw, width=(int)4112, height=(int)3008, framerate=(fraction)15/1, format=(string)BGRx
/GstPipeline:pipeline0/Gstnvvconv:nvvconv0.GstPad:src: caps = video/x-raw(memory:NVMM), width=(int)4112, height=(int)3008, framerate=(fraction)15/1, format=(string)NV12
/GstPipeline:pipeline0/GstCapsFilter:capsfilter2.GstPad:src: caps = video/x-raw(memory:NVMM), width=(int)4112, height=(int)3008, framerate=(fraction)15/1, format=(string)NV12
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0.GstGhostPad:sink.GstProxyPad:proxypad0: caps = video/x-raw(memory:NVMM), width=(int)4112, height=(int)3008, framerate=(fraction)15/1, format=(string)NV12
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0/GstFakeSink:fakesink0.GstPad:sink: caps = video/x-raw(memory:NVMM), width=(int)4112, height=(int)3008, framerate=(fraction)15/1, format=(string)NV12
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0.GstGhostPad:sink: caps = video/x-raw(memory:NVMM), width=(int)4112, height=(int)3008, framerate=(fraction)15/1, format=(string)NV12
/GstPipeline:pipeline0/GstCapsFilter:capsfilter2.GstPad:sink: caps = video/x-raw(memory:NVMM), width=(int)4112, height=(int)3008, framerate=(fraction)15/1, format=(string)NV12
/GstPipeline:pipeline0/Gstnvvconv:nvvconv0.GstPad:sink: caps = video/x-raw, width=(int)4112, height=(int)3008, framerate=(fraction)15/1, format=(string)BGRx
/GstPipeline:pipeline0/GstCapsFilter:capsfilter1.GstPad:sink: caps = video/x-raw, width=(int)4112, height=(int)3008, framerate=(fraction)15/1, format=(string)BGRx
/GstPipeline:pipeline0/GstBayer2RGB:bayer2rgb0.GstPad:sink: caps = video/x-bayer, width=(int)4112, height=(int)3008, format=(string)rggb, framerate=(fraction)15/1
Pipeline is PREROLLED …
Setting pipeline to PLAYING …
New clock: GstSystemClock
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0/GstFakeSink:fakesink0: sync = true

** (gst-launch-1.0:31541): WARNING **: 10:56:39.655: bayer2rgb0: size 144384 is not a multiple of unit size 12368896
ERROR: from element /GstPipeline:pipeline0/GstFileSrc:filesrc0: Internal data stream error.
Additional debug info:
gstbasesrc.c(3055): gst_base_src_loop (): /GstPipeline:pipeline0/GstFileSrc:filesrc0:
streaming stopped, reason error (-5)
Execution ended after 0:00:00.000539051
Setting pipeline to PAUSED …
Setting pipeline to READY …
Setting pipeline to NULL …
Freeing pipeline …

Trying to convert the VYUY works:

gst-launch-1.0 -v filesrc location=frame_4112x3008_VYUY_0.bin ! videoparse format=vyuy width=4112 height=3008 framerate=15/1 ! videoconvert ! video/x-raw, format=NV12 ! nvvidconv ! 'video/x-raw(memory:NVMM), width=2000, height=1500' ! fpsdisplaysink text-overlay=0 video-sink=fakesink
nvbuf_utils: Could not get EGL display connection
Setting pipeline to PAUSED …
Pipeline is PREROLLING …
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0/GstFakeSink:fakesink0: sync = true
/GstPipeline:pipeline0/GstVideoParse:videoparse0/GstRawVideoParse:inner_rawvideoparse.GstPad:src: caps = video/x-raw, format=(string)VYUY, width=(int)4112, height=(int)3008, interlace-mode=(string)progressive, pixel-aspect-ratio=(fraction)1/1, chroma-site=(string)mpeg2, colorimetry=(string)bt2020, framerate=(fraction)15/1
/GstPipeline:pipeline0/GstVideoParse:videoparse0.GstGhostPad:src: caps = video/x-raw, format=(string)VYUY, width=(int)4112, height=(int)3008, interlace-mode=(string)progressive, pixel-aspect-ratio=(fraction)1/1, chroma-site=(string)mpeg2, colorimetry=(string)bt2020, framerate=(fraction)15/1
/GstPipeline:pipeline0/GstVideoConvert:videoconvert0.GstPad:src: caps = video/x-raw, width=(int)4112, height=(int)3008, interlace-mode=(string)progressive, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)15/1, format=(string)NV12
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:src: caps = video/x-raw, width=(int)4112, height=(int)3008, interlace-mode=(string)progressive, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)15/1, format=(string)NV12
/GstPipeline:pipeline0/Gstnvvconv:nvvconv0.GstPad:src: caps = video/x-raw(memory:NVMM), width=(int)2000, height=(int)1500, interlace-mode=(string)progressive, pixel-aspect-ratio=(fraction)771/752, framerate=(fraction)15/1, format=(string)NV12
/GstPipeline:pipeline0/GstCapsFilter:capsfilter1.GstPad:src: caps = video/x-raw(memory:NVMM), width=(int)2000, height=(int)1500, interlace-mode=(string)progressive, pixel-aspect-ratio=(fraction)771/752, framerate=(fraction)15/1, format=(string)NV12
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0.GstGhostPad:sink.GstProxyPad:proxypad2: caps = video/x-raw(memory:NVMM), width=(int)2000, height=(int)1500, interlace-mode=(string)progressive, pixel-aspect-ratio=(fraction)771/752, framerate=(fraction)15/1, format=(string)NV12
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0/GstFakeSink:fakesink0.GstPad:sink: caps = video/x-raw(memory:NVMM), width=(int)2000, height=(int)1500, interlace-mode=(string)progressive, pixel-aspect-ratio=(fraction)771/752, framerate=(fraction)15/1, format=(string)NV12
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0.GstGhostPad:sink: caps = video/x-raw(memory:NVMM), width=(int)2000, height=(int)1500, interlace-mode=(string)progressive, pixel-aspect-ratio=(fraction)771/752, framerate=(fraction)15/1, format=(string)NV12
/GstPipeline:pipeline0/GstCapsFilter:capsfilter1.GstPad:sink: caps = video/x-raw(memory:NVMM), width=(int)2000, height=(int)1500, interlace-mode=(string)progressive, pixel-aspect-ratio=(fraction)771/752, framerate=(fraction)15/1, format=(string)NV12
/GstPipeline:pipeline0/Gstnvvconv:nvvconv0.GstPad:sink: caps = video/x-raw, width=(int)4112, height=(int)3008, interlace-mode=(string)progressive, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)15/1, format=(string)NV12
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:sink: caps = video/x-raw, width=(int)4112, height=(int)3008, interlace-mode=(string)progressive, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)15/1, format=(string)NV12
/GstPipeline:pipeline0/GstVideoConvert:videoconvert0.GstPad:sink: caps = video/x-raw, format=(string)VYUY, width=(int)4112, height=(int)3008, interlace-mode=(string)progressive, pixel-aspect-ratio=(fraction)1/1, chroma-site=(string)mpeg2, colorimetry=(string)bt2020, framerate=(fraction)15/1
/GstPipeline:pipeline0/GstVideoParse:videoparse0.GstGhostPad:src.GstProxyPad:proxypad1: caps = video/x-raw, format=(string)VYUY, width=(int)4112, height=(int)3008, interlace-mode=(string)progressive, pixel-aspect-ratio=(fraction)1/1, chroma-site=(string)mpeg2, colorimetry=(string)bt2020, framerate=(fraction)15/1
Pipeline is PREROLLED …
Setting pipeline to PLAYING …
New clock: GstSystemClock
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0/GstFakeSink:fakesink0: sync = true
Got EOS from element “pipeline0”.
Execution ended after 0:00:00.066547851
Setting pipeline to PAUSED …
Setting pipeline to READY …
Setting pipeline to NULL …
Freeing pipeline …

Trying to work with the VYUY format (without rescale) failed:

gst-launch-1.0 -e filesrc location=frame_4112x3008_VYUY_0.bin ! videoparse format=vyuy width=4112 height=3008 framerate=15/1 ! videoconvert ! video/x-raw, format=NV12 ! nvvidconv ! 'video/x-raw(memory:NVMM), width=4112, height=3008' ! nvv4l2h264enc ! h264parse ! qtmux ! filesink location=test_VYUY_resized_to_2000x1500_h264.mov
nvbuf_utils: Could not get EGL display connection
Setting pipeline to PAUSED …
Opening in BLOCKING MODE
Pipeline is PREROLLING …
Redistribute latency…
NvMMLiteOpen : Block : BlockType = 4
===== NVMEDIA: NVENC =====
NvMMLiteBlockCreate : Block : BlockType = 4
H264: Profile = 66, Level = 0
_ValidateEncodeParams: Invalid encode width

Creating a video almost without downscale (width capped to 4096) works:

gst-launch-1.0 -e filesrc location=frame_4112x3008_VYUY_0.bin ! videoparse format=vyuy width=4112 height=3008 framerate=15/1 ! videoconvert ! video/x-raw, format=NV12 ! nvvidconv ! 'video/x-raw(memory:NVMM), width=4096' ! nvv4l2h264enc ! h264parse ! qtmux ! filesink location=test_VYUY.mov
nvbuf_utils: Could not get EGL display connection
Setting pipeline to PAUSED …
Opening in BLOCKING MODE
Pipeline is PREROLLING …
Redistribute latency…
NvMMLiteOpen : Block : BlockType = 4
===== NVMEDIA: NVENC =====
NvMMLiteBlockCreate : Block : BlockType = 4
H264: Profile = 66, Level = 0
Pipeline is PREROLLED …
Setting pipeline to PLAYING …
New clock: GstSystemClock
Got EOS from element “pipeline0”.
Execution ended after 0:00:00.000583500
Setting pipeline to PAUSED …
Setting pipeline to READY …
Setting pipeline to NULL …
Freeing pipeline …

Thank you for your detailed answer, it’s very helpful!

Sorry, I'm in a hurry and cannot make a detailed answer now.

For the Bayer format, you may have a truncated file with less than one frame. Regenerate it; it should give:

ls -l test_4112x3008.rggb 
-rw-r--r-- 1 nvidia nvidia 1855334400 mars   2 22:18 test_4112x3008.rggb
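One way to catch this before feeding the file to filesrc is to check that the dump is a whole number of frames (a sketch; check_raw_dump is a hypothetical helper, not part of any toolkit):

```python
import os

RGGB_FRAME_BYTES = 4112 * 3008  # one 8-bit Bayer frame = 12368896 bytes

def check_raw_dump(path: str, frame_bytes: int = RGGB_FRAME_BYTES) -> int:
    """Return the number of complete frames in a raw dump; raise if the
    file size is not a whole multiple of the frame size (truncated file
    or wrong format assumption)."""
    size = os.path.getsize(path)
    frames, remainder = divmod(size, frame_bytes)
    if remainder:
        raise ValueError(f"{path}: {remainder} trailing bytes - truncated?")
    return frames
```

For the 150-frame test file above this should report 150 (150 × 12368896 = 1855334400 bytes, matching the ls -l output).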

Second, there is a limitation in the encoder: it doesn't support widths higher than 4096. This explains why it fails without resizing. Try resizing to 4096x3008 or less.
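A small sketch of the downscale implied by that limit, keeping the aspect ratio and even dimensions (the 4096 cap comes from the encoder errors above; the helper itself is hypothetical):

```python
# Clamp a frame to the encoder's 4096-pixel width limit, preserving
# aspect ratio and rounding the height to an even value (NV12 needs
# even dimensions).
def clamp_width(width: int, height: int, max_width: int = 4096):
    if width <= max_width:
        return width, height
    new_h = round(height * max_width / width / 2) * 2
    return max_width, new_h

print(clamp_width(4112, 3008))  # -> (4096, 2996)
```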

Thank you!
Yes, I assumed that there was a problem with the frame size.
When I try to convert this VYUY image I get an image with aliasing. Maybe it's a stride problem? But that doesn't make sense, because GStreamer knows the VYUY format…
3 files are attached:

  1. test.vyuy - VYUY file grabbed using video4linux
    test.vyuy (23.6 MB)
  2. test.jpeg

    created by:

gst-launch-1.0 -e filesrc location=test.vyuy ! videoparse format=vyuy width=4112 height=3008 framerate=15/1 ! videoconvert ! jpegenc ! filesink location="test.jpeg"

  3. screenshot from v4l2src → ximagesink

    created by:
    gst-launch-1.0 v4l2src device=/dev/video0 ! videoscale ! video/x-raw,width=800,height=600 ! videoconvert ! queue ! ximagesink sync=false

Any guess why this is happening?

Not sure, but it seems to me that the v4l2 capture itself has the aliasing; I see it from the file:

gst-launch-1.0 filesrc location=test.vyuy ! videoparse format=vyuy width=4112 height=3008 framerate=0/1 ! videoconvert ! imagefreeze ! xvimagesink

while when using v4l2src, videoscale does some smoothing.
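To rule out a stride problem, you may compare the dump size against a tightly packed VYUY layout (a sketch; 23.6 MB matches the attached test.vyuy, which suggests there is no stride padding and points at the capture itself):

```python
# A tightly packed VYUY frame has exactly width*2 bytes per line.
# If the file size matches this, there is no stride padding to blame.
WIDTH, HEIGHT = 4112, 3008
line_bytes = WIDTH * 2
frame_bytes = line_bytes * HEIGHT
print(line_bytes, frame_bytes, round(frame_bytes / 1024 / 1024, 1))
# 8224 24737792 23.6
```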

You may also use the interpolation-method property if resizing with nvvidconv, but I'd suggest first finding the root cause and the correct configuration.

This is going far from the initial post. You may create another topic for each specific case, so that it may help other users in similar cases in the future.