The above dts file is the current version that I have applied on my Jetson Nano. It seems that I need to change the port index, and I am wondering whether I should change it to port 5 or 6. I am not sure which port to choose, even though I have read the Tegra Linux Driver documentation as a reference. Below is the error message from dmesg:
Update:
After I changed the default_framerate to 30, the bug was fixed: v4l2-ctl --stream-mmap --stream-skip=100 --stream-count=1 -d /dev/video0 --stream-to=new-6.raw now works, but gst-launch-1.0 still does not. The following log contains the test output: log2.txt (2.3 KB)
It seems that I have not added the camera to the plugin-manager, so the devname is still “imx219 7-0010” and proc-device-tree is “/proc/device-tree/cam_i2cmux/i2c@0/rbpcv2@0/rbpcv2_imx219_a@10”. For the plugin-manager I am trying to use https://github.com/veyeimaging/veye327_jetson_nano/blob/master/sources/dts/kernel-dts/tegra210-porg-p3448-common.dtsi as an example. However, it seems a bit outdated for JetPack 4.6 and L4T 32.6. Could you please suggest an example, e.g. imx477, to follow for building a Loadable Kernel Module (LKM) as described in the Tegra Linux Driver documentation? I am not very sure how to change the dts files in porg-platform.
Does your sensor need any regulator to be enabled? If yes, you need to add it to your device tree; if not, you need to remove the regulator_enable() function call in ov2740_power_on().
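A minimal sketch of what the power-on path could look like with the regulator call made optional rather than mandatory; the function and field names below (ov2740_power_on, camera_common_power_rail, and the rail names) are assumptions modeled on typical Tegra sensor drivers such as imx219.c, not the actual ov2740.c from this thread:

/*
 * Hedged sketch: only enable regulators that were actually acquired in
 * probe(); if the board has no controllable rail for this sensor, skip
 * regulator_enable() instead of failing power-on.
 */
#include <linux/delay.h>
#include <linux/gpio.h>
#include <linux/regulator/consumer.h>
#include <media/camera_common.h>

static int ov2740_power_on(struct camera_common_data *s_data)
{
	struct camera_common_power_rail *pw = s_data->power;
	int err;

	if (pw->avdd) {
		err = regulator_enable(pw->avdd);
		if (err)
			return err;
	}

	/* Release reset only after the rails (if any) are stable. */
	if (gpio_is_valid(pw->reset_gpio))
		gpio_set_value(pw->reset_gpio, 1);

	usleep_range(5000, 6000);	/* sensor power-up settle time */

	pw->state = SWITCH_ON;
	return 0;
}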
Thanks! I solved it by disabling the regulator, and now gst-launch-1.0 can run, but it still reports errors:
~$ gst-launch-1.0 nvarguscamerasrc num-buffers=20 ! fakesink -e -v
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
/GstPipeline:pipeline0/GstNvArgusCameraSrc:nvarguscamerasrc0.GstPad:src: caps = video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, format=(string)NV12, framerate=(fraction)30/1
/GstPipeline:pipeline0/GstFakeSink:fakesink0.GstPad:sink: caps = video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, format=(string)NV12, framerate=(fraction)30/1
GST_ARGUS: Creating output stream
CONSUMER: Waiting until producer is connected...
GST_ARGUS: Available Sensor modes :
GST_ARGUS: 1920 x 1080 FR = 59.999999 fps Duration = 16666667 ; Analog Gain range min 0.000000, max 3.000000; Exposure Range min 1000, max 430000000;
GST_ARGUS: Running with following settings:
Camera index = 0
Camera mode = 0
Output Stream W = 1920 H = 1080
seconds to Run = 0
Frame Rate = 59.999999
GST_ARGUS: Setup Complete, Starting captures for 0 seconds
GST_ARGUS: Starting repeat capture requests.
CONSUMER: Producer has connected; continuing.
nvbuf_utils: dmabuf_fd -1 mapped entry NOT found
nvbuf_utils: Can not get HW buffer from FD... Exiting...
CONSUMER: ERROR OCCURRED
ERROR: from element /GstPipeline:pipeline0/GstNvArgusCameraSrc:nvarguscamerasrc0: CANCELLED
Additional debug info:
Argus Error Status
EOS on shutdown enabled -- waiting for EOS after Error
Waiting for EOS...
(Argus) Error EndOfFile: Unexpected error in reading socket (in src/rpc/socket/client/ClientSocketManager.cpp, function recvThreadCore(), line 266)
(Argus) Error EndOfFile: Receiving thread terminated with error (in src/rpc/socket/client/ClientSocketManager.cpp, function recvThreadWrapper(), line 368)
I have tried setting tegra_sinterface to “serial_c” and “serial_e”. However, neither works and the error is similar. For more information, the following attachments contain the driver source and the error logs from dmesg and gst-launch-1.0: extracted_proc2.txt (305.4 KB) full_log.txt (57.8 KB)
Here is the i2cdetect output as well:
Is it possible that the hardware is broken, or should ov2740.c / ov2740.h / ov2740_mode_tbls.h be edited for the Jetson Nano? Could you suggest which part I should check? Thanks!
Please review the configuration of "line_length" in the sensor driver.
Try increasing the delay between video line data sent over CSI.
Review the sensor datasheet to check the settings for the FPS range (see the sketch below for how line_length relates to the frame rate).
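For reference, the Tegra framework ties the frame rate to line_length roughly as fps = pix_clk_hz / (line_length * frame_length), so a wrong line_length shifts the whole achievable FPS range. Below is a rough, standalone illustration with placeholder numbers (a 182400000 Hz pixel clock and an IMX219-style line_length of 3448), not the actual ov2740 mode settings:

#include <stdint.h>
#include <stdio.h>

/* Frame length needed for a target fps, given the mode's pixel clock and
 * line_length. Placeholder values only; substitute the real mode numbers. */
static uint32_t frame_length_for_fps(uint64_t pix_clk_hz,
				     uint32_t line_length,
				     uint32_t fps)
{
	return (uint32_t)(pix_clk_hz / ((uint64_t)line_length * fps));
}

int main(void)
{
	uint64_t pix_clk_hz = 182400000ULL;	/* placeholder */
	uint32_t line_length = 3448;		/* placeholder */

	printf("frame_length for 30 fps: %u\n",
	       frame_length_for_fps(pix_clk_hz, line_length, 30));
	printf("frame_length for 60 fps: %u\n",
	       frame_length_for_fps(pix_clk_hz, line_length, 60));
	return 0;
}

If line_length does not match what the sensor actually outputs per line, the frame_length derived from it (and therefore the advertised FPS range) will not match the data arriving over CSI.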
I have tried changing line_length to 1920, 1934, and 3840, but it still produces the same error. Is there another value I should try?