Adding a new sensor driver to the JP6 Orin NX

I am trying to add my sensor driver, keeping tegra234-p3768-camera-rbpcv2-imx219.dtsi as a reference, since I could see that the imx219 driver gets loaded and depends on this dtsi file.

I could see that tegra234-p3768-0000+p3767-0000-dynamic.dts includes the above dtsi file, so I replaced that include with my own driver's dtsi:

//#include "tegra234-p3768-camera-rbpcv2-imx219.dtsi"
#include "tegra234-p3768-camera-rbpcv2-ap1302.dtsi"

/ {
	overlay-name = "Tegra234 p3768-0000+p3767-xxxx Dynamic Overlay";
};
Now I tried building the full SDK and then flashed the device. When flashing, I am getting an error like:

Enter to continue boot.
…EFI stub: Booting Linux Kernel…
EFI stub: ERROR: Invalid header detected on UEFI supplied FDT, ignoring …
EFI stub: Generating empty DTB
EFI stub: Loaded initrd from LINUX_EFI_INITRD_MEDIA_GUID device path
EFI stub: Exiting boot services…

I don't know if this applies to your case (it could be completely wrong, I am speculating), but I think of an overlay as modifying or extending an existing device tree node. Generally speaking, it seems incorrect to add a new (previously non-existent) node as an overlay. Have you tried directly editing the existing device tree and replacing it?

Also, if the driver is not also ported to boot stages, then I would not expect the driver to load.

If I am to make the above change, I am not sure why exactly it does not boot. Are there any other places where I need to make modifications?

Maybe refer to this topic for the device tree.

You would need a serial console boot log to find out why. You might need to follow the docs to enable more boot messages (definitely remove any “quiet” in “/boot/extlinux/extlinux.conf”). Also, if the tree is pulled from a binary partition the reason might differ from if the dtb is pulled from somewhere under “/boot”. How are you installing your modified dtb in the failure case?
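For example, removing "quiet" could look like this (a minimal sketch run on the Jetson itself, assuming the stock extlinux.conf layout; back up the file first):

# Back up the boot config, then drop the "quiet" token from the APPEND line
sudo cp /boot/extlinux/extlinux.conf /boot/extlinux/extlinux.conf.bak
sudo sed -i 's/ quiet//' /boot/extlinux/extlinux.conf
# Verify that "quiet" is gone from the kernel command line
grep -n APPEND /boot/extlinux/extlinux.conf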

I checked and found that ./rootfs/boot/tegra234-p3768-0000+p3767-0000-dynamic.dtbo (the /rootfs/ copy) was not getting updated even after building, so I manually copied the file from ./kernel/dtb/tegra234-p3768-0000+p3767-0000-dynamic.dtbo (or ./bootloader/tegra234-p3768-0000+p3767-0000-dynamic.dtbo) to that location, and now the device boots up with my own dtsi file.
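For reference, the manual copy was essentially the following (a sketch, assuming the command is run from the top of the Linux_for_Tegra tree; the paths are the ones mentioned above):

# Copy the freshly built overlay into the rootfs staging area before flashing
cp ./kernel/dtb/tegra234-p3768-0000+p3767-0000-dynamic.dtbo ./rootfs/boot/tegra234-p3768-0000+p3767-0000-dynamic.dtbo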

Thanks for the support.

After making the changes, I was able to get frames from the sensor. I am using the same driver for both JP5.1 and JP6; however, on JP6 the video shows some lines, while the video from JP5.1 does not have any such issue.

Apply the patch below to fix the problem.

Nice.
These changes fixed it! Thanks a lot.

I will be doing more tests and will keep you informed if I face any issues.

Currently I am using pixel_phase as uyvy for YUV422. I need YUV420.
Is this supported on the Orin NX platform?
pixel_phase = "YU12";

YU12 is not supported.

Thanks

Oh.
Could you please tell me which formats are supported by the Orin NX?

Currently supported: YUV422 8-bit, Bayer RAW 8/10/12/14, and PWL 16/20.

Is there support for RGB888, RGB565, or RGB555?

I have made a driver for the AP1302 at 30 fps, and I tried capturing the YUV frames using v4l2 and also GStreamer.

When configured for 30 fps, I am getting 28.80 fps.
When configured for 20 fps, I am getting 19.20 fps.

Command used:

v4l2-ctl --set-fmt-video=width=1920,height=1080,pixelformat="UYVY" --set-ctrl bypass_mode=0 --device=/dev/video0 --stream-skip=0 --stream-count=1000 --stream-mmap --stream-to=/dev/null
For the 30 fps configuration: (screenshot of the v4l2-ctl output showing ~28.80 fps)

For the 20 fps configuration: (screenshot of the v4l2-ctl output showing ~19.20 fps)
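As a cross-check on the GStreamer side (just a sketch, not the exact command I used; fpsdisplaysink and the 1080p UYVY caps below are assumptions on my part):

gst-launch-1.0 -v v4l2src device=/dev/video0 ! video/x-raw,width=1920,height=1080,format=UYVY,framerate=30/1 ! fpsdisplaysink video-sink=fakesink text-overlay=false sync=false

With -v, fpsdisplaysink periodically prints the current and average fps it measures from the capture.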

Here is the dtsi change made for 20 fps:

active_w = "1640";
active_h = "1232";
mode_type = "yuv";
pixel_phase = "uyvy";
csi_pixel_bit_depth = "16";
readout_orientation = "90";
line_length = "3448";
inherent_gain = "1";
mclk_multiplier = "9.33";
pix_clk_hz = "300000000";
gain_factor = "16";
framerate_factor = "1000000";
exposure_factor = "1000000";
min_gain_val = "16"; /* 1.00x */
max_gain_val = "170"; /* 10.66x */
step_gain_val = "1";
default_gain = "16"; /* 1.00x */
min_hdr_ratio = "1";
max_hdr_ratio = "1";
min_framerate = "2000000"; /* 2.0 fps */
max_framerate = "20000000"; /* 60.0 fps */
step_framerate = "1";
default_framerate = "20000000"; /* 60.0 fps */
min_exp_time = "13"; /* us */
max_exp_time = "683709"; /* us */
step_exp_time = "1";
default_exp_time = "2495"; /* us */

Are there any other changes that need to be made somewhere?

RGB888 is supported, but not RGB565 or RGB555.

Boost the clocks to check the fps.

echo 1 > /sys/kernel/debug/bpmp/debug/clk/vi/mrq_rate_locked
echo 1 > /sys/kernel/debug/bpmp/debug/clk/isp/mrq_rate_locked
echo 1 > /sys/kernel/debug/bpmp/debug/clk/nvcsi/mrq_rate_locked
echo 1 > /sys/kernel/debug/bpmp/debug/clk/emc/mrq_rate_locked
cat /sys/kernel/debug/bpmp/debug/clk/vi/max_rate | tee /sys/kernel/debug/bpmp/debug/clk/vi/rate
cat /sys/kernel/debug/bpmp/debug/clk/isp/max_rate | tee /sys/kernel/debug/bpmp/debug/clk/isp/rate
cat /sys/kernel/debug/bpmp/debug/clk/nvcsi/max_rate | tee /sys/kernel/debug/bpmp/debug/clk/nvcsi/rate
cat /sys/kernel/debug/bpmp/debug/clk/emc/max_rate | tee /sys/kernel/debug/bpmp/debug/clk/emc/rate

I have tried boosting the clocks as mentioned above, but I am still only getting 28.80 fps at most. Is there anything I need to change in the dtsi?
dts.txt (15.1 KB)

root@tegra-ubuntu:/home# echo 1 > /sys/kernel/debug/bpmp/debug/clk/vi/mrq_rate_locked
root@tegra-ubuntu:/home# echo 1 > /sys/kernel/debug/bpmp/debug/clk/isp/mrq_rate_locked
root@tegra-ubuntu:/home# echo 1 > /sys/kernel/debug/bpmp/debug/clk/nvcsi/mrq_rate_locked
root@tegra-ubuntu:/home# echo 1 > /sys/kernel/debug/bpmp/debug/clk/emc/mrq_rate_locked
root@tegra-ubuntu:/home# cat /sys/kernel/debug/bpmp/debug/clk/vi/max_rate |tee /sys/kernel/debug/bpmp/debug/clk/vi/rate
832000000
root@tegra-ubuntu:/home# cat /sys/kernel/debug/bpmp/debug/clk/isp/max_rate | tee /sys/kernel/debug/bpmp/debug/clk/isp/rate
1011200000
root@tegra-ubuntu:/home# cat /sys/kernel/debug/bpmp/debug/clk/nvcsi/max_rate | tee /sys/kernel/debug/bpmp/debug/clk/nvcsi/rate
642900000
root@tegra-ubuntu:/home# cat /sys/kernel/debug/bpmp/debug/clk/emc/max_rate | tee /sys/kernel/debug/bpmp/debug/clk/emc/rate
3199000000

"RGB888 is supported, but not RGB565 or RGB555" ->
Do you mean that only RGB888 is supported and the other two are not?

Attaching the trace log as well (cat /sys/kernel/debug/tracing/trace):
tracelog.txt (18.2 KB)

Are you sure the sensor output is really 30 fps?
Yes, only RGB888 is supported.

So I checked whether the frames were coming from the sensor correctly, and it turned out that the sensor was not outputting 30 fps to begin with. There were some registers I had to change to get 30 fps working. As of now I am getting a solid 30 fps with the v4l2 command.

My next goal was to achieve H.264 encoding of the captured data, and for this I used GStreamer. Here I am facing an issue: I tell GStreamer to capture 300 frames, and since the fps is 30 it should stop in 10 seconds.

Here is the command I am using:

gst-launch-1.0 -v v4l2src device=/dev/video0 num-buffers=300 ! video/x-raw,width=3840,height=2160,framerate=30/1,format=UYVY ! nvvidconv ! nvv4l2h264enc ! h264parse ! filesink location=a.h264

When running the above, I see this print from GStreamer: "Execution ended after 0:00:17.355232199".

Now if I record at 640p or 1080p, the GStreamer prints are "Execution ended after 0:00:10.127889299" and "Execution ended after 0:00:10.161490469" respectively.
That seems correct, as 300 frames / 30 fps = 10 seconds. Somehow the 4K case is taking longer to complete the 300-frame capture.

So my next exercise was checking with videotestsrc at 4K:

gst-launch-1.0 -v videotestsrc num-buffers=300 ! video/x-raw,width=3840,height=2160,framerate=30/1,format=UYVY ! nvvidconv ! nvv4l2h264enc ! h264parse ! filesink location=a.h264

To my surprise, this also reported "Execution ended after 0:00:20.899741052".
Is this expected?

I am looking for something like: capture the YUV frames -> encode to H.264 -> live stream that to a display (or record it).
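Something along these lines is what I have in mind; this is just a sketch, not a tested pipeline (the tee/queue split, the 1080p caps, and the nv3dsink display element are my assumptions):

gst-launch-1.0 -e v4l2src device=/dev/video0 ! video/x-raw,width=1920,height=1080,format=UYVY,framerate=30/1 ! tee name=t t. ! queue ! nvvidconv ! nv3dsink sync=false t. ! queue ! nvvidconv ! nvv4l2h264enc ! h264parse ! matroskamux ! filesink location=a.mkv

One branch of the tee goes straight to the display, the other goes through the encoder into a Matroska file; -e makes GStreamer send EOS on Ctrl+C so the recording is finalized properly.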

I was checking the Orin NX documentation and it mentions that the formats listed below are supported. Could you please cross-confirm whether YUV420 is supported?