The only pixel format of the camera recognized by the OS (configured on the dual MIPI connection, though I have the same issues on the 4-lane connection) is RG10 Bayer, at two different resolutions and frame rates. From what I have gathered, very few GStreamer elements support this pixel format. I would like the device tree overlay to support more formats, such as YUYV. I have tried decompiling /boot/tegra234-p3767-camera-p3768-imx477-dual.dtbo and adding a mode2 section for a different pixel format, but that causes the same kinds of GStreamer errors (that is, if it happens to load without dmesg errors at all).
Yields: WARNING: erroneous pipeline: could not link nvv4l2camerasrc0 to filesink0, nvv4l2camerasrc0 can't handle caps video/x-bayer, format=(string)rggb, width=(int)3840, height=(int)2160, framerate=(fraction)30/1
Confirmation of the video format: v4l2-ctl -d /dev/video0 --list-formats-ext
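For completeness, here is the probe wrapped so it is safe to paste anywhere (it only touches the device if the node actually exists; /dev/video0 is an assumption, substitute your camera's node):

```shell
#!/bin/sh
# Probe the capture node for supported pixel formats, resolutions,
# and frame intervals. /dev/video0 is an assumption.
DEV=/dev/video0

# Build the probe command first so it can be inspected before running.
PROBE="v4l2-ctl -d $DEV --list-formats-ext"
echo "$PROBE"

# Only run it if the device node exists (i.e. on the Jetson itself).
if [ -e "$DEV" ]; then
    $PROBE
fi
```

On my board this lists only RG10 at the two sensor modes, which is what led me here.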
RidgeRun has some support for this on earlier JetPack versions, but it doesn't cover the Orin NX.
Any help or direction would be very much appreciated!
Getting YUYV directly from the camera requires first making sure the camera is able to provide output in that format. Then you would need to create a capture mode for it in the device tree and add the corresponding register table for it in your driver.
However, the pipeline I mentioned should help you capture and store into an mp4 file using H.264 encoding.
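As a sketch of that kind of pipeline, assuming the stock JetPack elements (nvarguscamerasrc, nvvidconv, nvv4l2h264enc) and a 3840x2160@30 sensor mode (adjust the caps to one of the modes your v4l2-ctl listing reports):

```shell
#!/bin/sh
# Sketch: capture through the Argus/ISP path (the ISP debayers to NV12),
# encode to H.264, and mux into an mp4. Resolution, framerate, and bitrate
# here are assumptions -- tune them to your sensor mode and needs.
PIPELINE="nvarguscamerasrc sensor-id=0 ! \
 video/x-raw(memory:NVMM),width=3840,height=2160,format=NV12,framerate=30/1 ! \
 nvvidconv ! nvv4l2h264enc bitrate=4000000 ! h264parse ! mp4mux ! \
 filesink location=test.mp4"
echo "gst-launch-1.0 -e $PIPELINE"

# Run only where the Jetson GStreamer plugins are actually installed.
if gst-inspect-1.0 nvarguscamerasrc >/dev/null 2>&1; then
    gst-launch-1.0 -e $PIPELINE
fi
```

The -e flag makes gst-launch send EOS on Ctrl-C so mp4mux can finalize the file.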
If you are trying to capture raw for a specific reason, we could work on a different pipeline for you.
Would you mind sharing a bit of detail on what you want to achieve?
I get the error WARNING: erroneous pipeline: no element "nvarguscamerasrc". I was under the impression this was installed with the full JetPack SDK, which was installed on the Orin NX in question.
You’re right that the IMX477 sensor will only output Bayer directly. At the end of the day I’m just looking for a way to ingest with GStreamer without a secondary application or daemon running (i.e. some sort of cv2 solution).
According to ChatGPT: “On the Jetson Orin NX with JetPack 6.0, you can use the NVIDIA Image Signal Processor (ISP) to handle debayering from the IMX477 and convert it to YUV. The NVIDIA ISP is highly optimized for this task and can efficiently convert raw Bayer formats (like RGGB) into YUV formats such as NV12, which is commonly used for video processing.”
Yeah, I am ssh’d into the board directly, and it was flashed last week with the latest and greatest :/ (L4T JetPack 6 (36.3)). I’m checking around now to see if there is some way to confirm the install/SDK status on the board. After the flash I compiled OpenCV with CUDA and GStreamer support, which might have caused something to be overwritten… not sure yet.
I have started to reflash the board now but will try that command afterwards if needed. Either way, it's strange that it wouldn't know how to find that element while it can find some of the other nv* elements.
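In case it helps anyone else, these are roughly the checks I'm running to confirm what's on the board (the release file and package name are from stock L4T; adjust if your image differs):

```shell
#!/bin/sh
# Check which L4T release is flashed and whether the NVIDIA GStreamer
# plugins are visible to GStreamer.
CHECKS="cat /etc/nv_tegra_release
dpkg -l | grep nvidia-l4t-gstreamer
gst-inspect-1.0 nvarguscamerasrc"
echo "$CHECKS"

# Run them only on an actual Jetson (the release file is L4T-specific).
if [ -f /etc/nv_tegra_release ]; then
    cat /etc/nv_tegra_release
    dpkg -l | grep nvidia-l4t-gstreamer
    gst-inspect-1.0 nvarguscamerasrc | head -n 5
fi
```

If gst-inspect-1.0 can't find nvarguscamerasrc, clearing the GStreamer registry cache (~/.cache/gstreamer-1.0) and re-running is also worth a try before reflashing.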
The following pipeline now works after reinstalling the entire jetpack 6.0 with full CUDA and SDK components. Thanks for your help! gst-launch-1.0 nvarguscamerasrc sensor-id=0 ! 'video/x-raw(memory:NVMM), width=3840, height=2160, format=NV12, framerate=30/1' ! nvvidconv ! nvv4l2h264enc bitrate=4000000 ! h264parse ! mp4mux ! filesink location=output.mp4
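And since the original goal was YUYV: nvvidconv should also be able to convert the ISP's NV12 output to packed YUY2 in system memory, so a variation like this ought to hand YUYV frames to any downstream element. I haven't verified this exact pipeline myself, so treat it as a sketch and confirm YUY2 appears in your nvvidconv src caps first (gst-inspect-1.0 nvvidconv):

```shell
#!/bin/sh
# Sketch: debayer on the ISP via nvarguscamerasrc, then have nvvidconv
# convert NV12 (NVMM) to packed YUY2 in system memory. The resolution and
# output filename are assumptions.
PIPELINE="nvarguscamerasrc sensor-id=0 ! \
 video/x-raw(memory:NVMM),width=3840,height=2160,format=NV12,framerate=30/1 ! \
 nvvidconv ! video/x-raw,format=YUY2 ! \
 filesink location=frames.yuy2"
echo "gst-launch-1.0 -e $PIPELINE"

# Run only where the Jetson GStreamer plugins are actually installed.
if gst-inspect-1.0 nvarguscamerasrc >/dev/null 2>&1; then
    gst-launch-1.0 -e $PIPELINE
fi
```

Swapping filesink for appsink (or a v4l2sink) would be the next step for feeding another consumer without a separate daemon.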