I’ll revise the pixel_clk_hz, as we use a slightly different crystal frequency… something like 27MHz instead of 25MHz.
Should the right pixel_clk_hz be used… do I still need to boost the clocks?
Currently the specified MCLK is 37125000 Hz and the specified multiplier is 80.80, which gives 2999700000 Hz, yet they specify pix_clk_hz as 3000000000 Hz.
So there must be some tolerance allowed there.
I have to check the crystal we are actually using, but I'm not sure about the precision of that multiplier…
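The mismatch is easy to check with a quick sketch (values are the ones from the device tree readout in this thread):

```python
# Sanity check: MCLK x multiplier vs. the declared pix_clk_hz from the DT.
mclk_hz = 37_125_000              # mclk_khz = 37125 in the device tree
multiplier = 80.80                # mclk_multiplier in the device tree
declared_pix_clk_hz = 3_000_000_000

computed = mclk_hz * multiplier
mismatch = declared_pix_clk_hz - computed
print(f"computed : {computed:.0f} Hz")                          # 2999700000 Hz
print(f"declared : {declared_pix_clk_hz} Hz")
print(f"mismatch : {mismatch:.0f} Hz ({mismatch / declared_pix_clk_hz:.4%})")
```

So the declared value is only about 0.01% above MCLK times the multiplier, which is well within typical crystal tolerances.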
hello david.fernandez,
as you can see, there are several formulas to obtain the Sensor Pixel Clock.
I usually check clock settings with:
pixel_clk_hz = sensor data rate per lane (Mbps) * number of lanes / bits per pixel
By the way, it's true that pixel_clk_hz is ignored once you run the commands to boost all system clocks.
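That formula can be sketched as follows (the helper function is just an illustration; the 1188 Mbps per-lane rate and 4 lanes are the IMX565 figures quoted later in this thread):

```python
def pixel_clk_hz(data_rate_mbps_per_lane: float, num_lanes: int, bits_per_pixel: int) -> float:
    """Sensor pixel clock:
    pixel_clk_hz = sensor data rate per lane (Mbps) * number of lanes / bits per pixel
    """
    return data_rate_mbps_per_lane * 1e6 * num_lanes / bits_per_pixel

# IMX565 at its fastest per-lane rate, 4 lanes, 8 bits per pixel:
print(pixel_clk_hz(1188, num_lanes=4, bits_per_pixel=8))  # 594000000.0
```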
Would the sensor data rate per lane be the one specified in tegra_camera_platform?
For some reason each sensor mode specifies the same pix_clk_hz regardless of the bits per pixel.
So I checked the crystal frequency, and it seems to be the one specified in the DT for the sensor and modes (37125000 Hz). They specify pix_clk_hz of 3000000000 Hz for all modes (with 80.80 multiplier).
Hi @JerryChang,
So rebooting and trying the commands again, without any boosting, I can see that:
- If I run the v4l2-ctl streaming first, it runs at 1.5-fps, which happens to be the minimum frame rate for mode 8.
- Running the gstreamer command for mode 8 fails as usual.
- Running the v4l2-ctl command again (after restarting nvargus-daemon), I then get 30-fps, which is what the gstreamer command sets.
- Running gstreamer again with a 40-fps framerate (which fails with the nvbuf_utils errors), then v4l2-ctl again, I now get 40-fps, and so on.
- Using modes 5 or 2 with gstreamer works; only mode 8 fails.
So it seems that v4l2-ctl works, but because the -p option is somehow not supported, the only way I know to set the framerate is via the gstreamer video caps for nvarguscamerasrc.
Still, I would like to know how to make nvarguscamerasrc work here.
hello david.fernandez,
-
did you mean the gst pipeline works normally after boosting the clocks?
-
you may try general/common options to toggle sensor operation via v4l2.
for instance,
$ v4l2-ctl -d /dev/video0 --set-fmt-video=width=1920,height=1080,pixelformat=RG10 --set-ctrl bypass_mode=0 --set-ctrl=frame_rate=<num> --stream-mmap --stream-count=100
-
did you mean the failure below when launching mode-8?
for instance, nvbuf_utils: dmabuf_fd -1 mapped entry NOT found
Hi @JerryChang,
- No, I mean v4l2-ctl works the same whether I boost the clocks or not. I tested again without boosting the clocks and got the same results. The only problem is that, since the driver does not appear to support -p/--set-parm, the only way I know to set the framerate is using gstreamer first.
- I checked that frame_rate control to see if I can set the frame rate that way, and yes, that works. With that, I can see that v4l2-ctl works for any frame rate just fine, without boosting the clocks.
- Yes, I need to get gstreamer working properly for all modes, including mode 8. That was the reason I opened this issue in the first place.
hello david.fernandez,
in general, it's a sensor device tree issue, since it works with the v4l utility;
nvarguscamerasrc takes the DT properties, especially the Property-Value Pairs settings, to launch the camera stream with the gst pipeline.
here are several suggestions…
(1) I would like to double-confirm the Jetpack release version you're working with.
you may examine the release tag for confirmation:
$ cat /etc/nv_tegra_release
(2) is the output data rate larger than 1.5 Gbps? please also share the calculation results from checking the Sensor Pixel Clock.
(3) let's give it a try with Infinite Timeout Support;
see also Topic 284939 to apply the r35.5.0 pre-built update that enables the infinite-timeout property.
Hi @JerryChang,
(1)
$ cat /etc/nv_tegra_release
# R32 (release), REVISION: 7.1, GCID: 29818004, BOARD: t186ref, EABI: aarch64, DATE: Sat Feb 19 17:07:00 UTC 2022
(2) For mode8:
$ for prop in mclk_khz mclk_multiplier num_lanes phy_mode dynamic_pixel_bit_depth csi_pixel_bit_depth pix_clk_hz; do printf '%25s : ' $prop; cat /sys/devices/3190000.i2c/i2c-3/3-001a/of_node/mode8/$prop | tr '\0' '\n'; done
mclk_khz : 37125
mclk_multiplier : 80.80
num_lanes : 4
phy_mode : DPHY
dynamic_pixel_bit_depth : 8
csi_pixel_bit_depth : 8
pix_clk_hz : 3000000000
All modes use pix_clk_hz of 3000000000
(3) Let me work that out… I'll come back with the result in a while.
Regards
Hi @JerryChang,
(3) Running sudo enableCamInfiniteTimeout=1 nvargus-daemon
on a window, then the gstreamer command in another, I get from nvargus-daemon:
=== NVIDIA Libargus Camera Service (0.98.3) ===
=== Listening for connections... ===
=== gst-launch-1.0[19722]: Connection established (7FA2CCA1D0)
OFParserListModules: module list: /proc/device-tree/tegra-camera-platform/modules/module0
OFParserListModules: module list: /proc/device-tree/tegra-camera-platform/modules/module2
OFParserListModules: module list: /proc/device-tree/tegra-camera-platform/modules/module4
NvPclHwGetModuleList: WARNING: Could not map module to ISP config string
NvPclHwGetModuleList: No module data found
NvPclHwGetModuleList: WARNING: Could not map module to ISP config string
NvPclHwGetModuleList: No module data found
NvPclHwGetModuleList: WARNING: Could not map module to ISP config string
NvPclHwGetModuleList: No module data found
OFParserGetVirtualDevice: NVIDIA Camera virtual enumerator not found in proc device-tree
---- imager: No override file found. ----
(NvCamV4l2) Error ModuleNotPresent: V4L2Device not available (in /dvs/git/dirty/git-master_linux/camera/utils/nvcamv4l2/v4l2_device.cpp, function findDevice(), line 256)
(NvCamV4l2) Error ModuleNotPresent: (propagating from /dvs/git/dirty/git-master_linux/camera/utils/nvcamv4l2/v4l2_device.cpp, function initialize(), line 60)
(NvOdmDevice) Error ModuleNotPresent: (propagating from dvs/git/dirty/git-master_linux/camera-partner/imager/src/devices/V4L2SensorViCsi.cpp, function initialize(), line 107)
NvPclDriverInitializeData: Unable to initialize driver v4l2_sensor
NvPclInitializeDrivers: error: Failed to init camera sub module v4l2_sensor
NvPclStartPlatformDrivers: Failed to start module drivers
NvPclStateControllerOpen: Failed ImagerGUID 0. (error 0xA000E)
NvPclOpen: PCL Open Failed. Error: 0xf
SCF: Error BadParameter: Sensor could not be opened. (in src/services/capture/CaptureServiceDeviceSensor.cpp, function getSourceFromGuid(), line 593)
SCF: Error BadParameter: (propagating from src/services/capture/CaptureService.cpp, function addSourceByGuid(), line 437)
SCF: Error BadParameter: (propagating from src/api/CameraDriver.cpp, function addSourceByIndex(), line 305)
SCF: Error BadParameter: (propagating from src/api/CameraDriver.cpp, function getSource(), line 471)
Acquiring SCF Camera device source via index 0 has failed.
---- imager: No override file found. ----
LSC: LSC surface is not based on full res!
---- imager: No override file found. ----
(NvCamV4l2) Error ModuleNotPresent: V4L2Device not available (in /dvs/git/dirty/git-master_linux/camera/utils/nvcamv4l2/v4l2_device.cpp, function findDevice(), line 256)
(NvCamV4l2) Error ModuleNotPresent: (propagating from /dvs/git/dirty/git-master_linux/camera/utils/nvcamv4l2/v4l2_device.cpp, function initialize(), line 60)
(NvOdmDevice) Error ModuleNotPresent: (propagating from dvs/git/dirty/git-master_linux/camera-partner/imager/src/devices/V4L2SensorViCsi.cpp, function initialize(), line 107)
NvPclDriverInitializeData: Unable to initialize driver v4l2_sensor
NvPclInitializeDrivers: error: Failed to init camera sub module v4l2_sensor
NvPclStartPlatformDrivers: Failed to start module drivers
NvPclStateControllerOpen: Failed ImagerGUID 4. (error 0xA000E)
NvPclOpen: PCL Open Failed. Error: 0xf
SCF: Error BadParameter: Sensor could not be opened. (in src/services/capture/CaptureServiceDeviceSensor.cpp, function getSourceFromGuid(), line 593)
SCF: Error BadParameter: (propagating from src/services/capture/CaptureService.cpp, function addSourceByGuid(), line 437)
SCF: Error BadParameter: (propagating from src/api/CameraDriver.cpp, function addSourceByIndex(), line 305)
SCF: Error BadParameter: (propagating from src/api/CameraDriver.cpp, function getSource(), line 471)
Acquiring SCF Camera device source via index 2 has failed.
=== gst-launch-1.0[19722]: CameraProvider initialized (0x7f9c97dcb0)
SCF: Error BadValue: NvPHSSendThroughputHints (in src/common/CameraPowerHint.cpp, function sendCameraPowerHint(), line 56)
LSC: LSC surface is not based on full res!
(NvCapture) Error BadParameter: Invalid NvCsi pixelBitDepth: 0x8 (in /dvs/git/dirty/git-master_linux/camera/capture/nvcapture/capture.c, function NvCaptureCsiGetDataType(), line 2212)
(NvCapture) Error BadParameter: Unsupported source color format 1509208a10 (in /dvs/git/dirty/git-master_linux/camera/capture/nvcapture/capture_t19x.c, function NvCaptureConfigSetSourceFormatT19x(), line 181)
(NvCapture) Error BadParameter: (propagating from /dvs/git/dirty/git-master_linux/camera/capture/nvcapture/capture.c, function NvCaptureRequestSetAttribute(), line 1770)
SCF: Error BadParameter: (propagating from src/services/capture/NvCaptureViCsiHw.cpp, function startCaptureInternal(), line 649)
SCF: Error BadParameter: (propagating from src/services/capture/CaptureRecord.cpp, function doCSItoMemCapture(), line 532)
SCF: Error BadParameter: (propagating from src/services/capture/CaptureRecord.cpp, function issueCapture(), line 469)
SCF: Error BadParameter: (propagating from src/services/capture/CaptureServiceDevice.cpp, function issueCaptures(), line 1295)
SCF: Error BadParameter: (propagating from src/services/capture/CaptureServiceDevice.cpp, function issueCaptures(), line 1126)
SCF: Error BadParameter: (propagating from src/common/Utils.cpp, function workerThread(), line 116)
SCF: Error BadParameter: Worker thread CaptureScheduler frameStart failed (in src/common/Utils.cpp, function workerThread(), line 133)
SCF: Error Timeout: (propagating from src/api/Buffer.cpp, function waitForUnlock(), line 643)
SCF: Error Timeout: (propagating from src/components/CaptureContainerImpl.cpp, function returnBuffer(), line 363)
From gstreamer:
$ gst-launch-1.0 -v -e nvarguscamerasrc sensor-id=0 sensor-mode=8 num-buffers=-1 do-timestamp=true silent=true ! 'video/x-raw(memory:NVMM),format=(string)NV12,framerate=(fraction)30/1' ! nvvidconv ! timeoverlay ! nvvidconv ! omxh265enc insert-vui=true insert-aud=true ! h265parse ! 'video/x-h265, stream-format=(string)byte-stream, framerate=30/1' ! queue ! mpegtsmux ! rtpmp2tpay ! udpsink host=192.168.55.100 port=5001
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
/GstPipeline:pipeline0/GstNvArgusCameraSrc:nvarguscamerasrc0.GstPad:src: caps = video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, format=(string)NV12, framerate=(fraction)30/1
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:src: caps = video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, format=(string)NV12, framerate=(fraction)30/1
/GstPipeline:pipeline0/Gstnvvconv:nvvconv0.GstPad:src: caps = video/x-raw, width=(int)1920, height=(int)1080, framerate=(fraction)30/1, format=(string)NV12
/GstPipeline:pipeline0/GstTimeOverlay:timeoverlay0.GstPad:src: caps = video/x-raw, width=(int)1920, height=(int)1080, framerate=(fraction)30/1, format=(string)NV12
/GstPipeline:pipeline0/Gstnvvconv:nvvconv1.GstPad:src: caps = video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, framerate=(fraction)30/1, format=(string)NV12
Framerate set to : 30 at NvxVideoEncoderSetParameter
NvMMLiteOpen : Block : BlockType = 8
===== NVMEDIA: NVENC =====
NvMMLiteBlockCreate : Block : BlockType = 8
NVMEDIA: H265 : Profile : 1
/GstPipeline:pipeline0/GstOMXH265Enc-omxh265enc:omxh265enc-omxh265enc0.GstPad:sink: caps = video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, framerate=(fraction)30/1, format=(string)NV12
/GstPipeline:pipeline0/Gstnvvconv:nvvconv1.GstPad:sink: caps = video/x-raw, width=(int)1920, height=(int)1080, framerate=(fraction)30/1, format=(string)NV12
/GstPipeline:pipeline0/GstTimeOverlay:timeoverlay0.GstPad:video_sink: caps = video/x-raw, width=(int)1920, height=(int)1080, framerate=(fraction)30/1, format=(string)NV12
/GstPipeline:pipeline0/Gstnvvconv:nvvconv0.GstPad:sink: caps = video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, format=(string)NV12, framerate=(fraction)30/1
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:sink: caps = video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, format=(string)NV12, framerate=(fraction)30/1
GST_ARGUS: Creating output stream
CONSUMER: Waiting until producer is connected...
GST_ARGUS: Available Sensor modes :
GST_ARGUS: 4128 x 3008 FR = 28.999999 fps Duration = 34482760 ; Analog Gain range min 1.000000, max 251.188705; Exposure Range min 11000, max 660000000;
GST_ARGUS: 3840 x 2160 FR = 40.000000 fps Duration = 25000000 ; Analog Gain range min 1.000000, max 251.188705; Exposure Range min 11000, max 660000000;
GST_ARGUS: 1920 x 1080 FR = 146.000001 fps Duration = 6849315 ; Analog Gain range min 1.000000, max 251.188705; Exposure Range min 6000, max 660000000;
GST_ARGUS: 4128 x 3008 FR = 34.000001 fps Duration = 29411764 ; Analog Gain range min 1.000000, max 251.188705; Exposure Range min 10000, max 660000000;
GST_ARGUS: 3840 x 2160 FR = 46.999999 fps Duration = 21276596 ; Analog Gain range min 1.000000, max 251.188705; Exposure Range min 10000, max 660000000;
GST_ARGUS: 1920 x 1080 FR = 169.999998 fps Duration = 5882353 ; Analog Gain range min 1.000000, max 251.188705; Exposure Range min 5000, max 660000000;
GST_ARGUS: 4128 x 3008 FR = 42.000000 fps Duration = 23809524 ; Analog Gain range min 1.000000, max 251.188705; Exposure Range min 8000, max 660000000;
GST_ARGUS: 3840 x 2160 FR = 57.999998 fps Duration = 17241380 ; Analog Gain range min 1.000000, max 251.188705; Exposure Range min 8000, max 660000000;
GST_ARGUS: 1920 x 1080 FR = 204.999991 fps Duration = 4878049 ; Analog Gain range min 1.000000, max 251.188705; Exposure Range min 4000, max 660000000;
GST_ARGUS: Running with following settings:
Camera index = 0
Camera mode = 8
Output Stream W = 1920 H = 1080
seconds to Run = 0
Frame Rate = 204.999991
GST_ARGUS: Setup Complete, Starting captures for 0 seconds
GST_ARGUS: Starting repeat capture requests.
CONSUMER: Producer has connected; continuing.
ERROR: from element /GstPipeline:pipeline0/GstNvArgusCameraSrc:nvarguscamerasrc0: CANCELLED
Additional debug info:
Argus Error Status
EOS on shutdown enabled -- waiting for EOS after Error
Waiting for EOS...
nvbuf_utils: dmabuf_fd -1 mapped entry NOT found
nvbuf_utils: Can not get HW buffer from FD... Exiting...
CONSUMER: ERROR OCCURRED
ERROR: from element /GstPipeline:pipeline0/MpegTsMux:mpegtsmux0: Could not create handler for stream
Additional debug info:
mpegtsmux.c(996): mpegtsmux_create_streams (): /GstPipeline:pipeline0/MpegTsMux:mpegtsmux0
hello david.fernandez,
it's quite an old release version.
please check that you have the patch below included, which removes the 120-fps frame-rate upper bound.
for instance,
diff --git a/gst-nvarguscamera/gstnvarguscamerasrc.cpp b/gst-nvarguscamera/gstnvarguscamerasrc.cpp
index 6d7df3e..29e4f06 100644
@@ -47,7 +47,7 @@
"width = (int) [ 1, MAX ], " \
"height = (int) [ 1, MAX ], " \
"format = (string) { NV12 }, " \
- "framerate = (fraction) [ 0/1, 120/1 ];"
+ "framerate = (fraction) [ 0, MAX ];"
as you can see,
please ignore the pre-built update in Topic 284939 for your scenario, as that is a pre-built update for r35.5.0.
Hi @JerryChang,
Yes, it appears to be in:
#define CAPTURE_CAPS \
"video/x-raw(memory:NVMM), " \
"width = (int) [ 1, MAX ], " \
"height = (int) [ 1, MAX ], " \
"format = (string) { NV12, P010_10LE }, " \
"framerate = (fraction) [ 0, MAX ];"
hello david.fernandez,
please try the following.
$ gst-launch-1.0 -e nvarguscamerasrc sensor-id=0 sensor-mode=8 ! 'video/x-raw(memory:NVMM), width=1920, height=1080, framerate=204/1, format=(string)NV12' ! nvvidconv ! 'video/x-raw(memory:NVMM), format=I420' ! fpsdisplaysink text-overlay=0 video-sink=fakesink sync=0 -v
Hi @JerryChang,
Here you are, first with v4l2-ctl then your command:
$ v4l2-ctl -v width=1920,height=1080,pixelformat=RGGB -c bypass_mode=0,sensor_mode=8,frame_rate=204000000 --stream-mmap --stream-skip=1000 --stream-count=1
<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< 205.00 fps
<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< 204.50 fps
<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< 204.33 fps
<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< 204.25 fps
<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<
$
$ gst-launch-1.0 -e nvarguscamerasrc sensor-id=0 sensor-mode=8 ! 'video/x-raw(memory:NVMM), width=1920, height=1080, framerate=204/1, format=(string)NV12' ! nvvidconv ! 'video/x-raw(memory:NVMM), format=I420' ! fpsdisplaysink text-overlay=0 video-sink=fakesink sync=0 -v
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0/GstFakeSink:fakesink0: sync = false
Setting pipeline to PLAYING ...
New clock: GstSystemClock
/GstPipeline:pipeline0/GstNvArgusCameraSrc:nvarguscamerasrc0.GstPad:src: caps = video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, format=(string)NV12, framerate=(fraction)204/1
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:src: caps = video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, format=(string)NV12, framerate=(fraction)204/1
/GstPipeline:pipeline0/Gstnvvconv:nvvconv0.GstPad:src: caps = video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, framerate=(fraction)204/1, format=(string)I420
/GstPipeline:pipeline0/GstCapsFilter:capsfilter1.GstPad:src: caps = video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, framerate=(fraction)204/1, format=(string)I420
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0.GstGhostPad:sink.GstProxyPad:proxypad0: caps = video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, framerate=(fraction)204/1, format=(string)I420
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0/GstFakeSink:fakesink0.GstPad:sink: caps = video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, framerate=(fraction)204/1, format=(string)I420
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0.GstGhostPad:sink: caps = video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, framerate=(fraction)204/1, format=(string)I420
/GstPipeline:pipeline0/GstCapsFilter:capsfilter1.GstPad:sink: caps = video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, framerate=(fraction)204/1, format=(string)I420
/GstPipeline:pipeline0/Gstnvvconv:nvvconv0.GstPad:sink: caps = video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, format=(string)NV12, framerate=(fraction)204/1
GST_ARGUS: Creating output stream
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:sink: caps = video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, format=(string)NV12, framerate=(fraction)204/1
CONSUMER: Waiting until producer is connected...
GST_ARGUS: Available Sensor modes :
GST_ARGUS: 4128 x 3008 FR = 28.999999 fps Duration = 34482760 ; Analog Gain range min 1.000000, max 251.188705; Exposure Range min 11000, max 660000000;
GST_ARGUS: 3840 x 2160 FR = 40.000000 fps Duration = 25000000 ; Analog Gain range min 1.000000, max 251.188705; Exposure Range min 11000, max 660000000;
GST_ARGUS: 1920 x 1080 FR = 146.000001 fps Duration = 6849315 ; Analog Gain range min 1.000000, max 251.188705; Exposure Range min 6000, max 660000000;
GST_ARGUS: 4128 x 3008 FR = 34.000001 fps Duration = 29411764 ; Analog Gain range min 1.000000, max 251.188705; Exposure Range min 10000, max 660000000;
GST_ARGUS: 3840 x 2160 FR = 46.999999 fps Duration = 21276596 ; Analog Gain range min 1.000000, max 251.188705; Exposure Range min 10000, max 660000000;
GST_ARGUS: 1920 x 1080 FR = 169.999998 fps Duration = 5882353 ; Analog Gain range min 1.000000, max 251.188705; Exposure Range min 5000, max 660000000;
GST_ARGUS: 4128 x 3008 FR = 42.000000 fps Duration = 23809524 ; Analog Gain range min 1.000000, max 251.188705; Exposure Range min 8000, max 660000000;
GST_ARGUS: 3840 x 2160 FR = 57.999998 fps Duration = 17241380 ; Analog Gain range min 1.000000, max 251.188705; Exposure Range min 8000, max 660000000;
GST_ARGUS: 1920 x 1080 FR = 204.999991 fps Duration = 4878049 ; Analog Gain range min 1.000000, max 251.188705; Exposure Range min 4000, max 660000000;
GST_ARGUS: Running with following settings:
Camera index = 0
Camera mode = 8
Output Stream W = 1920 H = 1080
seconds to Run = 0
Frame Rate = 204.999991
GST_ARGUS: Setup Complete, Starting captures for 0 seconds
GST_ARGUS: Starting repeat capture requests.
CONSUMER: Producer has connected; continuing.
ERROR: from element /GstPipeline:pipeline0/GstNvArgusCameraSrc:nvarguscamerasrc0: CANCELLED
Additional debug info:
Argus Error Status
EOS on shutdown enabled -- waiting for EOS after Error
Waiting for EOS...
nvbuf_utils: dmabuf_fd -1 mapped entry NOT found
nvbuf_utils: Can not get HW buffer from FD... Exiting...
CONSUMER: ERROR OCCURRED
hello david.fernandez,
does your v4l pipeline work without adding skip frames?
Here you are:
$ v4l2-ctl -v width=1920,height=1080,pixelformat=RGGB -c bypass_mode=0,sensor_mode=8,frame_rate=204000000 --stream-mmap --stream-count=100000
<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< 205.00 fps
<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< 204.50 fps
<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< 204.33 fps
<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< 204.25 fps
<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< 204.20 fps
<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< 204.16 fps
<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<^C
hello david.fernandez,
all right, it looks like this mode (1920x1080 @ 204-fps) works through the v4l2 pipeline.
it might be a device tree issue, since it's Argus that checks the Property-Value Pairs to set up the stream.
let's give it a quick try by increasing pix_clk_hz, to launch with the gst pipeline.
Any particular value you want me to try? AFAIK, this is fixed by the camera hardware.
The IMX565 supports (per lane): 1188-Mbps, 891-Mbps, and 594-Mbps (for the INCK that I have, i.e. 37.125-MHz), operating in MIPI D-PHY.
For all pixels (4096x3000, 8-bpp), the frame rates are 42.6-fps, 32.2-fps, and 21.7-fps.
For 1/2 sub-sampling (2048x1500, 8-bpp): 153.1-fps, 117.3-fps, 80.2-fps.
Calculating framerate as a linear function:
A + B x 4096x3000 = 42.6-fps
A + B x 2048x1500 = 153.1-fps
A = 42.6 - B x 4096x3000
B x (2048x1500 - 4096x3000) = 153.1 - 42.6
-3/4 x B x 4096x3000 = 110.5
B = -442/(4096x3000x3)
A = 42.6 + 442/3 = 569.8/3
A + B x 1920x1080 = 569.8/3 - 442/(4096x3000x3) x 1920x1080 = 165.07-fps
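The two-point linear fit above can be sketched like this (a quick check, assuming the 42.6-fps all-pixel and 153.1-fps sub-sampling data points; `linear_fit` is just an illustrative helper):

```python
def linear_fit(x1, y1, x2, y2):
    """Solve A + B*x for the line through two points."""
    b = (y2 - y1) / (x2 - x1)
    a = y1 - b * x1
    return a, b

# Frame rate as a linear function of total pixel count (8-bpp, 1188 Mbps/lane).
a, b = linear_fit(4096 * 3000, 42.6, 2048 * 1500, 153.1)
fr_1080p = a + b * (1920 * 1080)
print(f"A = {a:.2f}, B = {b:.3e}")
print(f"Estimated frame rate at 1920x1080: {fr_1080p:.2f} fps")  # ~165.07 fps
```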
Calculating as ROI mode (agrees with e.g. all pixel frame rate):
GSDLY = 12
GMTWT = 26
GMRWT = 4
V = 1080
1H = 277.5 INCK
INCK = 37.125-MHz
1H = 7.48-us
FR = 1000000-us / ((1080 + 12 + 26 + 4 + 86) x 7.48-us) = 110.67-fps
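As a sketch of that ROI calculation (same GSDLY/GMTWT/GMRWT values and 86-line overhead as above; 1H is 277.5 INCK at 37.125 MHz, which the post rounds to 7.48 us):

```python
# ROI-mode frame rate: total line count times the 1H line period.
INCK_MHZ = 37.125
H1_INCK = 277.5                          # 1H duration in INCK cycles (all-pixel readout)
GSDLY, GMTWT, GMRWT, V, OVERHEAD = 12, 26, 4, 1080, 86

h1_us = H1_INCK / INCK_MHZ               # ~7.47 us per line
lines = V + GSDLY + GMTWT + GMRWT + OVERHEAD
fr = 1_000_000 / (lines * h1_us)
print(f"1H = {h1_us:.2f} us, {lines} lines -> {fr:.2f} fps")  # ~110.7 fps
```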
Seems to me that the IMX565 can only output 204-fps if the 1H period varies with the pixels per line… but looking at the datasheet, it does not look like a linear function.
A + B x 4096 = 277.5
A + B x 2048 = 148.5
A = 277.5 - B x 4096
B x (4096 - 2048) = 277.5 - 148.5
B x 4096 = 258
A = 19.5
19.5 + 258/4096 x 1920 = 140.44
1Hperiod = 3.783-us
FR = 1000000-us / ((1080 + 12 + 26 + 4 + 86) x 3.783-us) = 218.82-fps
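Sketching that 1H interpolation as well (the 277.5 INCK at 4096 columns and 148.5 INCK at 2048 columns are the values used above):

```python
# Linear fit of the 1H duration (in INCK cycles) vs. columns per line.
INCK_MHZ = 37.125
LINES = 1080 + 12 + 26 + 4 + 86          # V + GSDLY + GMTWT + GMRWT + overhead, as above

b = (277.5 - 148.5) / (4096 - 2048)      # 258/4096 INCK per column
a = 277.5 - b * 4096                     # 19.5 INCK fixed cost
h1_inck = a + b * 1920                   # ~140.44 INCK at 1920 columns
h1_us = h1_inck / INCK_MHZ               # ~3.783 us
fr = 1_000_000 / (LINES * h1_us)
print(f"1H(1920) = {h1_inck:.2f} INCK = {h1_us:.3f} us -> {fr:.2f} fps")
```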
That seems more like it… but I'm not sure that 1H is entirely linear (there might be other overheads that change differently).
Anyway, not sure which value would be good to try.