I am trying to add my sensor driver, keeping tegra234-p3768-camera-rbpcv2-imx219.dtsi as a reference, since I could see that the imx219 driver gets loaded and it depends on this dtsi file.
I could see that tegra234-p3768-0000+p3767-0000-dynamic.dts includes the above dtsi file, so I replaced that include with one for my own driver:
"
//include “tegra234-p3768-camera-rbpcv2-imx219.dtsi” include “tegra234-p3768-camera-rbpcv2-ap1302.dtsi”
I don't know if this applies to your case (it could be completely wrong, I am speculating), but I think of an overlay as modifying or extending an existing device tree node. Generally speaking, it seems incorrect to add a new (previously non-existent) node as an overlay. Have you tried directly editing the existing device tree and replacing it?
Also, if the driver is not also ported to the boot stages, then I would not expect it to load.
You would need a serial console boot log to find out why. You might need to follow the docs to enable more boot messages (definitely remove any "quiet" in "/boot/extlinux/extlinux.conf"). Also, if the device tree is pulled from a binary partition, the reason might differ from the case where the dtb is pulled from somewhere under "/boot". How are you installing your modified dtb in the failure case?
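As a quick sanity check (just an illustration, using the paths mentioned above), you can confirm whether "quiet" is still being passed to the kernel:

grep -n quiet /boot/extlinux/extlinux.conf
cat /proc/cmdline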
I checked and found that ./rootfs/boot/tegra234-p3768-0000+p3767-0000-dynamic.dtbo (the /rootfs/ copy) was not getting updated even after building, so I manually copied it from ./kernel/dtb/tegra234-p3768-0000+p3767-0000-dynamic.dtbo (or ./bootloader/tegra234-p3768-0000+p3767-0000-dynamic.dtbo) to that location, and now the device boots up with my own dtsi file.
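In case it helps anyone else, the manual copy was along these lines (I am assuming the commands are run from the Linux_for_Tegra directory; which source copy is current depends on your build):

cp ./kernel/dtb/tegra234-p3768-0000+p3767-0000-dynamic.dtbo ./rootfs/boot/
# or, if the bootloader copy is the one your build updated:
cp ./bootloader/tegra234-p3768-0000+p3767-0000-dynamic.dtbo ./rootfs/boot/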
After making the changes I was able to get frames from the sensor. I am using the same driver for both JP5.1 and JP6; however, in JP6 the video shows some lines, while the video from JP5.1 does not have any such issue.
Currently I am using pixel_phase "uyvy" for YUV422. I need YUV420 instead.
Is this supported on the Orin NX platform?
pixel_phase = "YU12";
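To check what the capture node actually advertises at the V4L2 level (the device node here is an assumption), the supported pixel formats can be listed with:

v4l2-ctl -d /dev/video0 --list-formats-ext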
I have tried boosting the clocks as mentioned above, but I am still only getting 28.80 fps at most. Is there anything I need to change in the dtsi? dts.txt (15.1 KB)
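For reference, the usual clock boost sketch for camera debugging looks like this (assuming the standard Jetson bpmp debugfs paths; run as root, and it may not match the exact sequence referenced above):

echo 1 > /sys/kernel/debug/bpmp/debug/clk/vi/mrq_rate_locked
echo 1 > /sys/kernel/debug/bpmp/debug/clk/isp/mrq_rate_locked
echo 1 > /sys/kernel/debug/bpmp/debug/clk/nvcsi/mrq_rate_locked
cat /sys/kernel/debug/bpmp/debug/clk/vi/max_rate > /sys/kernel/debug/bpmp/debug/clk/vi/rate
cat /sys/kernel/debug/bpmp/debug/clk/isp/max_rate > /sys/kernel/debug/bpmp/debug/clk/isp/rate
cat /sys/kernel/debug/bpmp/debug/clk/nvcsi/max_rate > /sys/kernel/debug/bpmp/debug/clk/nvcsi/rate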
So I checked whether the frames were coming from the sensor, and it appears that the sensor was not outputting 30 fps to start with. There were some registers I had to change to get 30 fps working. As of now I am getting a solid 30 fps with the v4l2 command.
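A v4l2-ctl streaming capture of this kind reports the measured frame rate (the resolution, pixel format, device node and the Jetson-specific bypass_mode control below are assumptions, not necessarily my exact command):

v4l2-ctl -d /dev/video0 --set-fmt-video=width=3840,height=2160,pixelformat=UYVY --set-ctrl bypass_mode=0 --stream-mmap --stream-count=300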
My next goal was to achieve H.264 encoding of the received data; for this I used GStreamer. Here I am facing an issue: I am telling GStreamer to capture 300 frames, and since the fps is 30 it should stop in 10 seconds.
When running the above at 4K I can see this print from GStreamer: "Execution ended after 0:00:17.355232199".
Now if I do the recording at 640p or 1080p, then the GStreamer prints are "Execution ended after 0:00:10.127889299" and "Execution ended after 0:00:10.161490469" respectively.
This seems correct, as 300 frames / 30 fps = 10 seconds. Somehow the 4K capture is taking longer to finish the 300-frame capture.
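For context, the capture/encode pipeline is along these lines (device node, caps, encoder element and output path here are illustrative, not my exact command):

gst-launch-1.0 -e v4l2src device=/dev/video0 num-buffers=300 ! 'video/x-raw, width=3840, height=2160, format=UYVY, framerate=30/1' ! nvvidconv ! 'video/x-raw(memory:NVMM)' ! nvv4l2h264enc ! h264parse ! qtmux ! filesink location=test_4k.mp4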
So my next exercise was checking with videotestsrc at 4K.
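Something along these lines, to take the sensor and capture path out of the picture (again just a sketch, not my exact command):

gst-launch-1.0 videotestsrc num-buffers=300 ! 'video/x-raw, width=3840, height=2160, framerate=30/1' ! nvvidconv ! 'video/x-raw(memory:NVMM)' ! nvv4l2h264enc ! fakesink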
I was checking the Orin NX documentation and it mentions that the formats listed below are supported. Could you please cross-confirm whether YUV420 is supported?