FFmpeg pixel conversion

Hi, I’m picking up an old project from some former colleagues and trying to get an image from an Allied Vision 1800 C-319 camera. The camera is connected to a Jetson AGX flashed with a Yocto build.

v4l2-ctl --list-formats-ext

produces the following output:

ioctl: VIDIOC_ENUM_FMT
	Type: Video Capture

	[0]: 'TP31' (0x31 MIPI DATATYPE)
		Size: Discrete 1536x1264
			Interval: Discrete 0.033s (30.000 fps)
	[1]: 'GREY' (8-bit Greyscale)
		Size: Discrete 1536x1264
			Interval: Discrete 0.033s (30.000 fps)
	[2]: 'RGGB' (8-bit Bayer RGRG/GBGB)
		Size: Discrete 1536x1264
			Interval: Discrete 0.033s (30.000 fps)
	[3]: 'JXY0' (10-bit/16-bit Greyscale)
		Size: Discrete 1536x1264
			Interval: Discrete 0.033s (30.000 fps)
	[4]: 'JXR0' (10-bit/16-bit Bayer RGRG/GBGB)
		Size: Discrete 1536x1264
			Interval: Discrete 0.033s (30.000 fps)
	[5]: 'Y12 ' (12-bit Greyscale)
		Size: Discrete 1536x1264
			Interval: Discrete 0.033s (30.000 fps)
	[6]: 'Y16 ' (16-bit Greyscale)
		Size: Discrete 1536x1264
			Interval: Discrete 0.033s (30.000 fps)
	[7]: 'JXY2' (12-bit/16-bit Greyscale)
		Size: Discrete 1536x1264
			Interval: Discrete 0.033s (30.000 fps)
	[8]: 'JXR2' (12-bit/16-bit Bayer RGRG/GBGB)
		Size: Discrete 1536x1264
			Interval: Discrete 0.033s (30.000 fps)
	[9]: 'BX24' (32-bit XRGB 8-8-8-8)
		Size: Discrete 1536x1264
			Interval: Discrete 0.033s (30.000 fps)
	[10]: 'XR24' (32-bit BGRX 8-8-8-8)
		Size: Discrete 1536x1264
			Interval: Discrete 0.033s (30.000 fps)
	[11]: 'VYUY' (VYUY 4:2:2)
		Size: Discrete 1536x1264
			Interval: Discrete 0.033s (30.000 fps)

I then use the following to get one frame:

v4l2-ctl -d /dev/video0 --set-fmt-video=width=1536,height=1264,pixelformat=BX24 --stream-mmap --stream-count=1 --stream-to=img.raw

I then copy the image to my host computer with scp, so I now have “img.raw” there with a size of 7.5M. So far so good.
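As a sanity check (my own arithmetic, not from any camera documentation): BX24 is a 32-bit format, so one frame should be width × height × 4 bytes, which lines up with the file size:

```python
# Expected size of one raw BX24 (32-bit XRGB, 4 bytes/pixel) frame.
width, height = 1536, 1264
frame_size = width * height * 4

print(frame_size)                    # 7766016 bytes
print(round(frame_size / 2**20, 1))  # 7.4 MiB, which ls -h rounds to "7.5M"
```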

The problem is, I have no idea how to convert it into something I can view.

I have this so far:

ffmpeg -f rawvideo -pix_fmt yuyv422 -s:v 1536x1264 -r 30 -i img.raw -c:v v210 out.jpeg

but it produces the image with some weird colors.
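Doing the math myself (plain arithmetic, nothing camera-specific assumed): yuyv422 is only 2 bytes per pixel, so my 7.5M file holds twice what one yuyv422 frame needs, which already suggests I’m giving FFmpeg the wrong pixel format:

```python
width, height = 1536, 1264

bx24_frame = width * height * 4  # what v4l2-ctl actually wrote (4 bytes/pixel)
yuyv_frame = width * height * 2  # what -pix_fmt yuyv422 expects (2 bytes/pixel)

print(bx24_frame // yuyv_frame)  # 2: ffmpeg reads the file as two garbage frames
```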

Any help appreciated!

Hi,
We would suggest trying to capture frame data in GREY or VYUY. The other formats are special and would require additional conversion. Please check the steps for launching USB cameras (v4l2 sources) in the Jetson Nano FAQ.

Hi Dane, it looks like I’m able to capture frames using VYUY as well, but I’m still not able to view the image. This is where I assumed I needed FFmpeg to convert it before viewing?

Hi,
VYUY is not supported in the v4l2src plugin. You may try the jetson_multimedia_api sample:
Jetson Linux API Reference: 12_camera_v4l2_cuda (camera capture CUDA processing) | NVIDIA Docs

Please try and see if camera preview is good.

This is not an answer to the FFmpeg question, but an alternative using GStreamer.
For just checking your image, assuming you’ve captured in VYUY format, you may try:

gst-launch-1.0 filesrc location=test_1536x1264.VYUY ! queue ! videoparse format=vyuy width=1536 height=1264 framerate=0/1 ! videoconvert ! imagefreeze ! xvimagesink

If ok, then for camera preview you may try:

  1. Make a capture with v4l2-ctl in VYUY format.
  2. Try launching this pipeline:
gst-launch-1.0 nvv4l2camerasrc ! 'video/x-raw(memory:NVMM),format=UYVY,width=1536,height=1264,framerate=30/1' ! nvvidconv ! video/x-raw,format=UYVY ! queue ! shmsink socket-path=/tmp/testFakeUYVY.sock

I’m not sure this will run; it relies on behavior that may have been fixed in recent releases of nvv4l2camerasrc.

If it runs, you would use this for camera preview from a second terminal:

gst-launch-1.0 shmsrc socket-path=/tmp/testFakeUYVY.sock do-timestamp=1 ! video/x-raw,format=VYUY,width=1536,height=1264,framerate=30/1 ! videoconvert ! xvimagesink

Note that this trick is a bit of a hack: it breaks the caps coherence of GStreamer, but it can be useful for such a case of color inversion. Also note that it may add CPU overhead.

It may be worth checking if there is a gstreamer source plugin available for your camera.

[EDIT: looking at v4l2-ctl listed formats, you may simply try BGRx format:

gst-launch-1.0 v4l2src ! video/x-raw,format=BGRx,width=1536,height=1264,framerate=30/1 ! videoconvert ! xvimagesink 

]

Thanks for your reply! I’m not sure I understand 100%. GStreamer is not installed on my Jetson, and it would be really difficult for me to install it since a former colleague built the image with Yocto. That’s why I use V4L2 to record and then send the frame to another computer. I just haven’t found a way of viewing the image, and I assume that’s because of the pixel format.

Ok… if you’ve recorded one frame with XR24 format, you may try this for BGRx → JPG conversion:

ffmpeg -f rawvideo -pix_fmt bgr0 -s:v 1536x1264 -i img.raw -frames:v 1 out.jpg
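If juggling FFmpeg builds is a pain, the same unpack is a few lines of stdlib Python; this is just a sketch of mine that assumes the file holds exactly one width×height BGRX frame (FFmpeg’s bgr0 layout, i.e. V4L2 XR24) and writes a viewable binary PPM:

```python
# Convert one raw BGRX frame (bytes B, G, R, padding) to a binary PPM image.
# Sketch only: assumes the input is exactly width * height * 4 bytes.
import sys

def bgrx_to_ppm(raw_path, ppm_path, width=1536, height=1264):
    with open(raw_path, "rb") as f:
        data = f.read()
    assert len(data) == width * height * 4, "unexpected file size"
    rgb = bytearray(width * height * 3)
    for i in range(width * height):
        b, g, r = data[4 * i:4 * i + 3]   # 4th byte is padding, dropped
        rgb[3 * i:3 * i + 3] = (r, g, b)  # reorder to R, G, B for PPM
    with open(ppm_path, "wb") as f:
        f.write(b"P6\n%d %d\n255\n" % (width, height))
        f.write(rgb)

if __name__ == "__main__" and len(sys.argv) == 3:
    bgrx_to_ppm(sys.argv[1], sys.argv[2])  # e.g. python3 bgrx2ppm.py img.raw out.ppm
```

Most image viewers (and FFmpeg itself) open PPM directly, so no pixel-format flags are needed afterwards.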

Probably the best output I’ve had so far, but the color is still off; it looks kind of blueish. Here’s the result of the ffmpeg command:

ffmpeg -f rawvideo -pix_fmt bgr0 -s:v 1536x1264 -i bx24.raw -frames:v 1 out.jpg

ffmpeg version 3.4.11-0ubuntu0.1 Copyright (c) 2000-2022 the FFmpeg developers
built with gcc 7 (Ubuntu 7.5.0-3ubuntu1~18.04)
configuration: --prefix=/usr --extra-version=0ubuntu0.1 --toolchain=hardened --libdir=/usr/lib/x86_64-linux-gnu --incdir=/usr/include/x86_64-linux-gnu --enable-gpl --disable-stripping --enable-avresample --enable-avisynth --enable-gnutls --enable-ladspa --enable-libass --enable-libbluray --enable-libbs2b --enable-libcaca --enable-libcdio --enable-libflite --enable-libfontconfig --enable-libfreetype --enable-libfribidi --enable-libgme --enable-libgsm --enable-libmp3lame --enable-libmysofa --enable-libopenjpeg --enable-libopenmpt --enable-libopus --enable-libpulse --enable-librubberband --enable-librsvg --enable-libshine --enable-libsnappy --enable-libsoxr --enable-libspeex --enable-libssh --enable-libtheora --enable-libtwolame --enable-libvorbis --enable-libvpx --enable-libwavpack --enable-libwebp --enable-libx265 --enable-libxml2 --enable-libxvid --enable-libzmq --enable-libzvbi --enable-omx --enable-openal --enable-opengl --enable-sdl2 --enable-libdc1394 --enable-libdrm --enable-libiec61883 --enable-chromaprint --enable-frei0r --enable-libopencv --enable-libx264 --enable-shared
libavutil 55. 78.100 / 55. 78.100
libavcodec 57.107.100 / 57.107.100
libavformat 57. 83.100 / 57. 83.100
libavdevice 57. 10.100 / 57. 10.100
libavfilter 6.107.100 / 6.107.100
libavresample 3. 7. 0 / 3. 7. 0
libswscale 4. 8.100 / 4. 8.100
libswresample 2. 9.100 / 2. 9.100
libpostproc 54. 7.100 / 54. 7.100
[rawvideo @ 0x5556b02668e0] Estimating duration from bitrate, this may be inaccurate
Input #0, rawvideo, from 'bx24.raw':
Duration: 00:00:00.04, start: 0.000000, bitrate: 1553203 kb/s
Stream #0:0: Video: rawvideo (BGR[0] / 0x524742), bgr0, 1536x1264, 1553203 kb/s, 25 tbr, 25 tbn, 25 tbc
Stream mapping:
Stream #0:0 -> #0:0 (rawvideo (native) -> mjpeg (native))
Press [q] to stop, [?] for help
[swscaler @ 0x5556b02813e0] deprecated pixel format used, make sure you did set range correctly
Output #0, image2, to 'out.jpg':
Metadata:
encoder : Lavf57.83.100
Stream #0:0: Video: mjpeg, yuvj444p(pc), 1536x1264, q=2-31, 200 kb/s, 25 fps, 25 tbn, 25 tbc
Metadata:
encoder : Lavc57.107.100 mjpeg
Side data:
cpb: bitrate max/min/avg: 0/0/200000 buffer size: 0 vbv_delay: -1
frame= 1 fps=0.0 q=5.7 Lsize=N/A time=00:00:00.04 bitrate=N/A speed=1.05x
video:151kB audio:0kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: unknown

Try capturing into XR24…

or for BX24, try

-pix_fmt 0rgb
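For what it’s worth, here is why bgr0 looked off on the BX24 capture, as I read the two fourccs (worth double-checking against the V4L2 pixel format docs): BX24 stores each pixel as X, R, G, B in memory, while XR24 (and FFmpeg’s bgr0) stores B, G, R, X, so misreading one as the other scrambles the channels:

```python
# One pure-red pixel as BX24 writes it: byte order X, R, G, B.
bx24 = bytes((0x00, 0xFF, 0x00, 0x00))

# Correct decode (ffmpeg -pix_fmt 0rgb reads X, R, G, B):
_, r, g, b = bx24
assert (r, g, b) == (255, 0, 0)  # red, as intended

# Wrong decode (ffmpeg -pix_fmt bgr0 reads B, G, R, X):
b2, g2, r2, _ = bx24
print((r2, g2, b2))              # (0, 255, 0): red has leaked into green
```

And any non-zero padding byte lands in the blue channel under the wrong decode, which would tint the whole image.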

XR24 did it… thank you so much!
