Failing to get input from CSI camera (IMX219-83)

Can't get input from an IMX219-83 stereo camera module on a Jetson Nano.
Here are the commands/pipelines I’ve tried so far:
Command 1:
gst-launch-1.0 nvarguscamerasrc sensor_id=0 ! \
    'video/x-raw(memory:NVMM), width=1920, height=1080, framerate=30/1' ! \
    nvvidconv flip-method=0 ! 'video/x-raw, width=960, height=540' ! \
    nvvidconv ! nvegltransform ! nveglglessink -e
Output:

nvbuf_utils: Could not get EGL display connection
Setting pipeline to PAUSED …

Using winsys: x11
ERROR: Pipeline doesn’t want to pause.
Setting pipeline to NULL …
Freeing pipeline …

I also tried a higher-resolution pipeline with nvoverlaysink in the same session:

gudboy007@ubuntubox:~$ gst-launch-1.0 nvarguscamerasrc sensor-id=0 ! 'video/x-raw(memory:NVMM), width=3280, height=2464, format=(string)NV12, framerate=(fraction)20/1' ! nvoverlaysink -e
nvbuf_utils: Could not get EGL display connection
Setting pipeline to PAUSED …
Pipeline is live and does not need PREROLL …
Setting pipeline to PLAYING …
New clock: GstSystemClock
GST_ARGUS: Creating output stream
CONSUMER: Waiting until producer is connected…
GST_ARGUS: Available Sensor modes :
GST_ARGUS: 3264 x 2464 FR = 21.000000 fps Duration = 47619048 ; Analog Gain range min 1.000000, max 10.625000; Exposure Range min 13000, max 683709000;

GST_ARGUS: 3264 x 1848 FR = 28.000001 fps Duration = 35714284 ; Analog Gain range min 1.000000, max 10.625000; Exposure Range min 13000, max 683709000;

GST_ARGUS: 1920 x 1080 FR = 29.999999 fps Duration = 33333334 ; Analog Gain range min 1.000000, max 10.625000; Exposure Range min 13000, max 683709000;

GST_ARGUS: 1640 x 1232 FR = 29.999999 fps Duration = 33333334 ; Analog Gain range min 1.000000, max 10.625000; Exposure Range min 13000, max 683709000;

GST_ARGUS: 1280 x 720 FR = 59.999999 fps Duration = 16666667 ; Analog Gain range min 1.000000, max 10.625000; Exposure Range min 13000, max 683709000;

GST_ARGUS: 1280 x 720 FR = 120.000005 fps Duration = 8333333 ; Analog Gain range min 1.000000, max 10.625000; Exposure Range min 13000, max 683709000;

GST_ARGUS: Running with following settings:
Camera index = 0
Camera mode = 0
Output Stream W = 3264 H = 2464
seconds to Run = 0
Frame Rate = 21.000000
GST_ARGUS: Setup Complete, Starting captures for 0 seconds
GST_ARGUS: Starting repeat capture requests.
CONSUMER: Producer has connected; continuing.
^Z
[1]+ Stopped gst-launch-1.0 nvarguscamerasrc sensor-id=0 ! 'video/x-raw(memory:NVMM), width=3280, height=2464, format=(string)NV12, framerate=(fraction)20/1' ! nvoverlaysink -e

Command 2: nvgstcapture-1.0 --orientation=2
Output:

Encoder null, cannot set bitrate!
Encoder Profile = High
Supported resolutions in case of ARGUS Camera
(2) : 640x480
(3) : 1280x720
(4) : 1920x1080
(5) : 2104x1560
(6) : 2592x1944
(7) : 2616x1472
(8) : 3840x2160
(9) : 3896x2192
(10): 4208x3120
(11): 5632x3168
(12): 5632x4224

Runtime ARGUS Camera Commands:

Help : ‘h’
Quit : ‘q’
Set Capture Mode:
mo:
(1): image
(2): video
Get Capture Mode:
gmo
Set sensor orientation:
so:
(0): none
(1): Rotate counter-clockwise 90 degrees
(2): Rotate 180 degrees
(3): Rotate clockwise 90 degrees
Get sensor orientation:
gso
Set sensor mode:
smo: e.g., smo:1
Get sensor mode:
gsmo
Set Whitebalance Mode:
wb:
(0): off
(1): auto
(2): incandescent
(3): fluorescent
(4): warm-fluorescent
(5): daylight
(6): cloudy-daylight
(7): twilight
(8): shade
(9): manual
Get Whitebalance Mode:
gwb
Set Saturation (0 to 2):
st: e.g., st:1.25
Get Saturation:
gst
Set Exposure Compensation (-2 to 2):
ec: e.g., ec:-2
Get Exposure Compensation:
gec
Set Auto Whitebalance Lock:
awbl: e.g., awbl:0
Get Auto Whitebalance Lock:
awbl
Set Auto Exposure Lock:
ael: e.g., ael:0
Get Auto Exposure Lock:
gael
Set TNR Mode:
tnrm: e.g., tnrm:1
(0): OFF
(1): FAST
(2): HIGH QUALITY
Get TNR Mode:
gtnrm
Set TNR Strength (-1 to 1):
tnrs: e.g., tnrs:0.5
Get TNR Strength:
gtnrs
Set EE Mode:
eem: e.g., eem:1
(0): OFF
(1): FAST
(2): HIGH QUALITY
Get EE Mode:
geem
Set EE Strength (-1 to 1):
ees: e.g., ees:0.5
Get EE Strength:
gees
Set Auto Exposure Anti-Banding (0 to 3):
aeab: e.g., aeab:2
(0): OFF
(1): MODE AUTO
(2): MODE 50HZ
(3): MODE 60HZ
Get Auto Exposure Anti-Banding:
gaeab
Set Gain Range:
gr: e.g., gr:1 16
Get Gain Range:
ggr
Set Exposure Time Range:
etr: e.g., etr:34000 35000
Get Exposure Time Range:
getr
Set ISP Digital Gain Range:
dgr: e.g., dgr:2 152
Get ISP Digital Gain Range:
gdgr
Capture: enter ‘j’ OR
followed by a timer (e.g., jx5000, capture after 5 seconds) OR
followed by multishot count (e.g., j:6, capture 6 images)
timer/multihot values are optional, capture defaults to single shot with timer=0s
Start Recording : enter ‘1’
Stop Recording : enter ‘0’
Video snapshot : enter ‘2’ (While recording video)
Get Preview Resolution:
gpcr
Get Image Capture Resolution:
gicr
Get Video Capture Resolution:
gvcr

Runtime encoder configuration options:

Set Encoding Bit-rate(in bytes):
br: e.g., br:4000000
Get Encoding Bit-rate(in bytes):
gbr
Set Encoding Profile(only for H.264):
ep: e.g., ep:1
(0): Baseline
(1): Main
(2): High
Get Encoding Profile(only for H.264):
gep
Force IDR Frame on video Encoder(only for H.264):
Enter ‘f’

nvbuf_utils: Could not get EGL display connection
bitrate = 4000000
Encoder Profile = High
Encoder control-rate = 1
Encoder EnableTwopassCBR = 0

Using winsys: x11

** (nvgstcapture-1.0:6325): CRITICAL **: 22:10:00.670: <create_capture_pipeline:4028> can’t set camera to playing

** (nvgstcapture-1.0:6325): CRITICAL **: 22:10:00.670: main:4651 Capture Pipeline creation failed
** Message: 22:10:00.671: main:4658 Capture completed
** Message: 22:10:00.672: main:4707 Camera application will now exit

I've tried other pipelines as well; most of them give similar if not identical error messages, with "pipeline creation failed" and "Failed to create capture session".
Running ls /dev/video* returns the two cameras (video0 and video1), which gives me hope because the system recognizes them.

Some other thoughts:

  • The manufacturer's website listed only two pins, SDA and SCL, to be connected to pins 3 and 5 for I2C. First, do they have to be configured? Second, there are two other pins, FSYNC and INT, both of which are left floating. (A quick wiring check is sketched below.)
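
One way to sanity-check the wiring over I2C (a sketch; it assumes the i2c-tools package is installed, and the camera bus number varies by carrier board, so read it from the bus list rather than copying the example below):

$ dmesg | grep -i imx219          # did the IMX219 driver probe at boot?
$ sudo i2cdetect -l               # list the I2C buses and their names
$ sudo i2cdetect -y -r 6          # scan one bus; "6" here is only a placeholder
# The IMX219 answers at address 0x10; a "UU" entry there means the kernel
# driver has already claimed the device, which is also a good sign.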

Had you set up the environment variable ($ export DISPLAY=:0) before running the command line?

Please also try the pipeline below, which disables the preview and shows only the frame rate.
$ gst-launch-1.0 nvarguscamerasrc sensor-id=0 ! 'video/x-raw(memory:NVMM),width=1920, height=1080, framerate=30/1, format=NV12' ! nvvidconv ! 'video/x-raw(memory:NVMM),format=I420' ! fpsdisplaysink text-overlay=0 video-sink=fakesink sync=0 -v

You may use the V4L2 IOCTL to verify basic camera functionality. If that's still not working,
please narrow it down further with Applications Using V4L2 IOCTL Directly.

I tried both commands. The export DISPLAY one didn't return anything, but I think the frame-rate one did work. Here's a log of the command line:

gudboy007@ubuntubox:~$ export DISPLAY=:0

gudboy007@ubuntubox:~$ gst-launch-1.0 nvarguscamerasrc sensor-id=0 ! 'video/x-raw(memory:NVMM),width=1920, height=1080, framerate=30/1, format=NV12' ! nvvidconv ! 'video/x-raw(memory:NVMM),format=I420' ! fpsdisplaysink text-overlay=0 video-sink=fakesink sync=0 -v
No protocol specified
nvbuf_utils: Could not get EGL display connection
Setting pipeline to PAUSED …
Pipeline is live and does not need PREROLL …
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0/GstFakeSink:fakesink0: sync = false
Setting pipeline to PLAYING …
New clock: GstSystemClock
/GstPipeline:pipeline0/GstNvArgusCameraSrc:nvarguscamerasrc0.GstPad:src: caps = video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, format=(string)NV12, framerate=(fraction)30/1
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:src: caps = video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, format=(string)NV12, framerate=(fraction)30/1
Error generated. /dvs/git/dirty/git-master_linux/multimedia/nvgstreamer/gst-nvarguscamera/gstnvarguscamerasrc.cpp, execute:751 Failed to create CaptureSession
/GstPipeline:pipeline0/Gstnvvconv:nvvconv0.GstPad:src: caps = video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, framerate=(fraction)30/1, format=(string)I420
/GstPipeline:pipeline0/GstCapsFilter:capsfilter1.GstPad:src: caps = video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, framerate=(fraction)30/1, format=(string)I420
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0.GstGhostPad:sink.GstProxyPad:proxypad0: caps = video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, framerate=(fraction)30/1, format=(string)I420
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0/GstFakeSink:fakesink0.GstPad:sink: caps = video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, framerate=(fraction)30/1, format=(string)I420
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0.GstGhostPad:sink: caps = video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, framerate=(fraction)30/1, format=(string)I420
/GstPipeline:pipeline0/GstCapsFilter:capsfilter1.GstPad:sink: caps = video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, framerate=(fraction)30/1, format=(string)I420
/GstPipeline:pipeline0/Gstnvvconv:nvvconv0.GstPad:sink: caps = video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, format=(string)NV12, framerate=(fraction)30/1
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:sink: caps = video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, format=(string)NV12, framerate=(fraction)30/1
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0/GstFakeSink:fakesink0: sync = false
Got EOS from element “pipeline0”.
Execution ended after 0:00:00.010223946
Setting pipeline to PAUSED …
Setting pipeline to READY …
Setting pipeline to NULL …
Freeing pipeline …

Both problems (the "Could not get EGL display connection" warning and the "Failed to create CaptureSession" error) still persist. I don't understand why this is happening.

The documentation says V4L2 supports CSI, but it doesn't include any code specific to it. What should I do?

hello aranis.das,

No… it still fails with "Failed to create CaptureSession".

Let's narrow down the issue:
please try fetching the stream via the standard V4L2 controls.
Here are sample commands for your reference:
$ v4l2-ctl -d /dev/video0 --list-formats-ext
$ v4l2-ctl -d /dev/video0 --set-fmt-video=width=1920,height=1080,pixelformat=RG10 --set-ctrl bypass_mode=0 --stream-mmap --stream-count=100

Here’s what I got:
gudboy007@ubuntubox:~$ v4l2-ctl -d /dev/video0 --list-formats-ext
ioctl: VIDIOC_ENUM_FMT
Index : 0
Type : Video Capture
Pixel Format: ‘RG10’
Name : 10-bit Bayer RGRG/GBGB
Size: Discrete 3264x2464
Interval: Discrete 0.048s (21.000 fps)
Size: Discrete 3264x1848
Interval: Discrete 0.036s (28.000 fps)
Size: Discrete 1920x1080
Interval: Discrete 0.033s (30.000 fps)
Size: Discrete 1640x1232
Interval: Discrete 0.033s (30.000 fps)
Size: Discrete 1280x720
Interval: Discrete 0.017s (60.000 fps)

gudboy007@ubuntubox:~$ v4l2-ctl -d /dev/video0 --set-fmt-video=width=1920,height=1080,pixelformat=RG10 --set-ctrl bypass_mode=0 --stream-mmap --stream-count=100
<<<<<<<<<<<<<<<<<<<<<<< 21.35 fps
<<<<<<<<<<<<<<<<<<<<< 21.28 fps
<<<<<<<<<<<<<<<<<<<<< 21.19 fps
<<<<<<<<<<<<<<<<<<<<< 21.19 fps
<<<<<<<<<<<<<<

hello aranis.das,

It looks like V4L2 is able to fetch frames; however, the output frame rate cannot reach 30 fps even though the format is set to 1920x1080.

To match the real test results,
let's revise the gst pipeline to reduce the framerate property.
Please try again with the following:
$ gst-launch-1.0 nvarguscamerasrc sensor-id=0 ! 'video/x-raw(memory:NVMM),width=1920, height=1080, framerate=21/1, format=NV12' ! nvvidconv ! 'video/x-raw(memory:NVMM),format=I420' ! fpsdisplaysink text-overlay=0 video-sink=fakesink sync=0 -v

gudboy007@ubuntubox:~$ gst-launch-1.0 nvarguscamerasrc sensor-id=0 ! 'video/x-raw(memory:NVMM),width=1920, height=1080, framerate=21/1, format=NV12' ! nvvidconv ! 'video/x-raw(memory:NVMM),format=I420' ! fpsdisplaysink text-overlay=0 video-sink=fakesink sync=0 -v
nvbuf_utils: Could not get EGL display connection
Setting pipeline to PAUSED …
Pipeline is live and does not need PREROLL …
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0/GstFakeSink:fakesink0: sync = false
Setting pipeline to PLAYING …
New clock: GstSystemClock
/GstPipeline:pipeline0/GstNvArgusCameraSrc:nvarguscamerasrc0.GstPad:src: caps = video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, format=(string)NV12, framerate=(fraction)21/1
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:src: caps = video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, format=(string)NV12, framerate=(fraction)21/1
/GstPipeline:pipeline0/Gstnvvconv:nvvconv0.GstPad:src: caps = video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, framerate=(fraction)21/1, format=(string)I420
/GstPipeline:pipeline0/GstCapsFilter:capsfilter1.GstPad:src: caps = video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, framerate=(fraction)21/1, format=(string)I420
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0.GstGhostPad:sink.GstProxyPad:proxypad0: caps = video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, framerate=(fraction)21/1, format=(string)I420
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0/GstFakeSink:fakesink0.GstPad:sink: caps = video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, framerate=(fraction)21/1, format=(string)I420
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0.GstGhostPad:sink: caps = video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, framerate=(fraction)21/1, format=(string)I420
/GstPipeline:pipeline0/GstCapsFilter:capsfilter1.GstPad:sink: caps = video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, framerate=(fraction)21/1, format=(string)I420
/GstPipeline:pipeline0/Gstnvvconv:nvvconv0.GstPad:sink: caps = video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, format=(string)NV12, framerate=(fraction)21/1
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:sink: caps = video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, format=(string)NV12, framerate=(fraction)21/1
GST_ARGUS: Creating output stream
CONSUMER: Waiting until producer is connected…
GST_ARGUS: Available Sensor modes :
GST_ARGUS: 3264 x 2464 FR = 21.000000 fps Duration = 47619048 ; Analog Gain range min 1.000000, max 10.625000; Exposure Range min 13000, max 683709000;

GST_ARGUS: 3264 x 1848 FR = 28.000001 fps Duration = 35714284 ; Analog Gain range min 1.000000, max 10.625000; Exposure Range min 13000, max 683709000;

GST_ARGUS: 1920 x 1080 FR = 29.999999 fps Duration = 33333334 ; Analog Gain range min 1.000000, max 10.625000; Exposure Range min 13000, max 683709000;

GST_ARGUS: 1640 x 1232 FR = 29.999999 fps Duration = 33333334 ; Analog Gain range min 1.000000, max 10.625000; Exposure Range min 13000, max 683709000;

GST_ARGUS: 1280 x 720 FR = 59.999999 fps Duration = 16666667 ; Analog Gain range min 1.000000, max 10.625000; Exposure Range min 13000, max 683709000;

GST_ARGUS: 1280 x 720 FR = 120.000005 fps Duration = 8333333 ; Analog Gain range min 1.000000, max 10.625000; Exposure Range min 13000, max 683709000;

GST_ARGUS: Running with following settings:
Camera index = 0
Camera mode = 2
Output Stream W = 1920 H = 1080
seconds to Run = 0
Frame Rate = 29.999999
GST_ARGUS: Setup Complete, Starting captures for 0 seconds
GST_ARGUS: Starting repeat capture requests.
CONSUMER: Producer has connected; continuing.
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0/GstFakeSink:fakesink0: sync = false
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0: last-message = rendered: 12, dropped: 0, current: 23.72, average: 23.72
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0: last-message = rendered: 23, dropped: 0, current: 20.99, average: 22.33
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0: last-message = rendered: 34, dropped: 0, current: 20.96, average: 21.87
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0: last-message = rendered: 45, dropped: 0, current: 21.04, average: 21.66
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0: last-message = rendered: 56, dropped: 0, current: 21.02, average: 21.53
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0: last-message = rendered: 67, dropped: 0, current: 21.05, average: 21.45
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0: last-message = rendered: 78, dropped: 0, current: 20.81, average: 21.36
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0: last-message = rendered: 89, dropped: 0, current: 21.01, average: 21.31
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0: last-message = rendered: 100, dropped: 0, current: 21.15, average: 21.30
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0: last-message = rendered: 111, dropped: 0, current: 20.95, average: 21.26
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0: last-message = rendered: 122, dropped: 0, current: 21.07, average: 21.24
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0: last-message = rendered: 133, dropped: 0, current: 20.84, average: 21.21
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0: last-message = rendered: 144, dropped: 0, current: 21.00, average: 21.19
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0: last-message = rendered: 155, dropped: 0, current: 21.14, average: 21.19
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0: last-message = rendered: 166, dropped: 0, current: 20.99, average: 21.18
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0: last-message = rendered: 177, dropped: 0, current: 21.07, average: 21.17
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0: last-message = rendered: 188, dropped: 0, current: 20.82, average: 21.15
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0: last-message = rendered: 199, dropped: 0, current: 20.99, average: 21.14
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0: last-message = rendered: 210, dropped: 0, current: 21.10, average: 21.14
^Z
[1]+ Stopped gst-launch-1.0 nvarguscamerasrc sensor-id=0 ! 'video/x-raw(memory:NVMM),width=1920, height=1080, framerate=21/1, format=NV12' ! nvvidconv ! 'video/x-raw(memory:NVMM),format=I420' ! fpsdisplaysink text-overlay=0 video-sink=fakesink sync=0 -v
gudboy007@ubuntubox:~$

Why is it only at ~21 fps?

hello aranis.das,

All right, it looks like the gst pipeline gives results identical to v4l2, streaming 1920x1080 at 21 fps.
May I know which JetPack release you're using now? Had you applied kernel patches to update the camera driver?
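If you're not sure, the commands below report it (a sketch; the nvidia-jetpack metapackage is only present on apt/SDK Manager based installs):
$ cat /etc/nv_tegra_release                      # L4T release string (R32.x corresponds to JetPack 4.x)
$ apt-cache show nvidia-jetpack | grep Version   # JetPack metapackage version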

I'm running version 4.6.1-b110.
No, I have not applied any kernel patches for the driver yet.

hello aranis.das,

please disassemble the dtb file into a text file to examine the mode settings.
For example: $ dtc -I dtb -O dts -o temp.txt tegra210-xxx.dtb
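Once it is decompiled, you can jump to the relevant entries with grep (a sketch; the node and property names below follow the stock Nano IMX219 device tree, so adjust them if yours differ):
$ grep -n "use_sensor_mode_id" temp.txt    # is per-mode selection enabled?
$ grep -n "imx219" temp.txt                # locate the imx219 sensor nodes
$ grep -n -A 20 "mode2 {" temp.txt         # dump one mode block (mode 2 is the 1080p mode on the stock tree)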

Okay, it's in a text file.
To reiterate: I went to /boot/dtb/ and found a file named kernel_tegra210-p3448-0000-p3449-0000-b00.dtb, ran the command with sudo (it didn't work without), and now I have a big text file. I can't paste it here because it is too large; let me know which section you'd like to see.

hello aranis.das,

You may upload the text file instead of copy-and-pasting the whole content.
I would like to review the sensor mode settings for your IMX219 camera device.

temp.txt (318.9 KB)

hello aranis.das,

I suspect it always enables mode0 (3264x2464 at 21 fps) for outputting the stream,
since there is a device tree property to enable mode selection, i.e. use_sensor_mode_id = "true";
please give the following a try:
$ v4l2-ctl -d /dev/video0 --set-fmt-video=width=1920,height=1080,pixelformat=RG10 --set-ctrl bypass_mode=0 --set-ctrl sensor_mode=2 --stream-mmap --stream-count=100

~$ v4l2-ctl -d /dev/video0 --set-fmt-video=width=1920,height=1080,pixelformat=RG10 --set-ctrl bypass_mode=0 --set-ctrl sensor_mode=2 --stream-mmap --stream-count=100
<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< 30.09 fps
<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< 30.04 fps
<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< 30.03 fps
<<<<<<<<
Looks like it does work. However, it stops after three readings. Is this supposed to happen?

So it stopped after 100 frames (--stream-count=100). Remove this option and it should keep running.
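For example, the same command with the frame-count limit dropped (a sketch; stop it with Ctrl-C, since Ctrl-Z only suspends it):
$ v4l2-ctl -d /dev/video0 --set-fmt-video=width=1920,height=1080,pixelformat=RG10 --set-ctrl bypass_mode=0 --set-ctrl sensor_mode=2 --stream-mmap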

Hi, can I get an update on what to do next?

hello aranis.das,

So… that's not a bug,
since you're able to fetch the 1920x1080 sensor stream at 30 fps after enabling mode selection.

Alright. So what should I do to get input? I want to get a camera feed.

As you can see… you already get the output stream via the standard V4L2 IOCTL and the gst pipeline.
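
Putting the pieces of this thread together, a preview pipeline would look roughly like the sketch below. This is an assumption rather than a command confirmed in the thread: it reuses the nvoverlaysink pipeline tried earlier, adds nvarguscamerasrc's sensor-mode property (the Argus-side counterpart of the V4L2 sensor_mode control), and requests the 1920x1080/30 fps mode:
$ gst-launch-1.0 nvarguscamerasrc sensor-id=0 sensor-mode=2 ! 'video/x-raw(memory:NVMM), width=1920, height=1080, framerate=30/1, format=NV12' ! nvoverlaysink -e
# nvoverlaysink draws directly onto the attached display; stop the pipeline with Ctrl-C
# rather than Ctrl-Z, which only suspends it (as seen in the logs above).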