Orin Nano error with IMX219 Camera

Hello,
I have set up the Orin Nano 8GB with SDK Manager and have two IMX219 cameras connected.
As explained in the how-to (Jetson Orin Nano Developer Kit User Guide - How-to | NVIDIA Developer), when I try to stream with:
gst-launch-1.0 nvarguscamerasrc ! 'video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, format=(string)NV12, framerate=(fraction)30/1' ! nvvidconv ! video/x-raw, format=I420 ! x264enc ! h264parse ! qtmux ! filesink location=output.mp4 -e

I get this error:

Setting pipeline to PAUSED …
Pipeline is live and does not need PREROLL …
Setting pipeline to PLAYING …
New clock: GstSystemClock
Redistribute latency…
GST_ARGUS: Creating output stream
(Argus) Error BadParameter: (propagating from src/eglstream/FrameConsumerImpl.cpp, function initialize(), line 89)
(Argus) Error BadParameter: (propagating from src/eglstream/FrameConsumerImpl.cpp, function create(), line 44)
Error generated. /dvs/git/dirty/git-master_linux/multimedia/nvgstreamer/gst-nvarguscamera/gstnvarguscamerasrc.cpp, threadInitialize:320 Failed to create FrameConsumer
Error generated. /dvs/git/dirty/git-master_linux/multimedia/nvgstreamer/gst-nvarguscamera/gstnvarguscamerasrc.cpp, threadFunction:241 (propagating)
Error generated. /dvs/git/dirty/git-master_linux/multimedia/nvgstreamer/gst-nvarguscamera/gstnvarguscamerasrc.cpp, waitRunning:203 Invalid thread state 3
Error generated. /dvs/git/dirty/git-master_linux/multimedia/nvgstreamer/gst-nvarguscamera/gstnvarguscamerasrc.cpp, execute:806 (propagating)
Got EOS from element “pipeline0”.
Execution ended after 0:00:02.169703918
Setting pipeline to NULL …
Freeing pipeline …

In the syslog, I get these errors:
nvargus-daemon[804]: OFParserListModules: module list: /proc/device-tree/tegra-camera-platform/modules/module1
nvargus-daemon[804]: OFParserGetVirtualDevice: NVIDIA Camera virtual enumerator not found in proc device-tree
nvargus-daemon[804]: ---- imager: No override file found. ----
nvargus-daemon[804]: ---- imager: No override file found. ----
nvargus-daemon: E/ libnvphs:socket: Error[2]: socket connection /var/lib/nvphs/nvphsd.ctl to PHS failed: No such file or directory
nvargus-daemon: D/ libnvphs:socket: Warning: connecting to Power Hinting Service failed. Is PHS running?
nvargus-daemon[804]: === gst-launch-1.0[3620]: CameraProvider initialized (0xffff906bf890)SCF: Error BadValue: NvPHSSendThroughputHints (in src/common/CameraPowerHint.cpp, function sendCameraPowerHint(), line 56)

Any idea what the problem is here?
I have also configured the camera connector with: sudo /opt/nvidia/jetson-io/jetson-io.py
→ IMX219 Dual
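
As a sanity check after applying the overlay and rebooting, both sensor entries can be listed under the device-tree path that the nvargus log also refers to:

ls /proc/device-tree/tegra-camera-platform/modules/

which shows module0 and module1 here.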

Here is some additional info on the system:

## Cameras found and connected
v4l2-ctl --list-devices

vi-output, imx219 9-0010 (platform:tegra-capture-vi:1):
/dev/video0

vi-output, imx219 10-0010 (platform:tegra-capture-vi:2):
/dev/video1

## Camera connection is okay
v4l2-ctl --set-fmt-video=width=1920,height=1080,pixelformat=RG10 --stream-mmap --stream-count=300 -d /dev/video0
<<<<<<<<<<<<<<<<<<<<<<< 21.19 fps
<<<<<<<<<<<<<<<<<<<<< 21.19 fps
<<<<<<<<<<<<<<<<<<<<< 21.19 fps
<<<<<<<<<<<<<<<<<<<<< 21.19 fps
<<<<<<<<<<<<<<<<<<<<< 21.19 fps
<<<<<<<<<<<<<<<<<<<<<< 21.19 fps
<<<<<<<<<<<<<<<<<<<<< 21.19 fps
<<<<<<<<<<<<<<<<<<<<< 21.19 fps
<<<<<<<<<<<<<<<<<<<<< 21.19 fps
<<<<<<<<<<<<<<<<<<<<< 21.19 fps
<<<<<<<<<<<<<<<<<<<<<< 21.19 fps
<<<<<<<<<<<<<<<<<<<<< 21.19 fps
<<<<<<<<<<<<<<<<<<<<< 21.19 fps
<<<<<<<<<<<<<<<<<<<<< 21.19 fps
<<

## Installed Jetpack
apt show nvidia-jetpack
Package: nvidia-jetpack
Version: 5.1.1-b56
Priority: standard
Section: metapackages
Maintainer: NVIDIA Corporation
Installed-Size: 199 kB
Depends: nvidia-jetpack-runtime (= 5.1.1-b56), nvidia-jetpack-dev (= 5.1.1-b56)
Homepage: Jetson - Embedded AI Computing Platform | NVIDIA Developer
Download-Size: 29.3 kB
APT-Sources: https://repo.download.nvidia.com/jetson/common r35.3/main arm64 Packages
Description: NVIDIA Jetpack Meta Package


Please confirm the pipeline below first.

gst-launch-1.0 nvarguscamerasrc ! 'video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080' ! nvvidconv ! fpsdisplaysink video-sink=fakesink

Setting pipeline to PAUSED …
Pipeline is live and does not need PREROLL …
Setting pipeline to PLAYING …
New clock: GstSystemClock
GST_ARGUS: Creating output stream
(Argus) Error BadParameter: (propagating from src/eglstream/FrameConsumerImpl.cpp, function initialize(), line 89)
(Argus) Error BadParameter: (propagating from src/eglstream/FrameConsumerImpl.cpp, function create(), line 44)
Error generated. /dvs/git/dirty/git-master_linux/multimedia/nvgstreamer/gst-nvarguscamera/gstnvarguscamerasrc.cpp, threadInitialize:320 Failed to create FrameConsumer
Error generated. /dvs/git/dirty/git-master_linux/multimedia/nvgstreamer/gst-nvarguscamera/gstnvarguscamerasrc.cpp, threadFunction:241 (propagating)
Error generated. /dvs/git/dirty/git-master_linux/multimedia/nvgstreamer/gst-nvarguscamera/gstnvarguscamerasrc.cpp, waitRunning:203 Invalid thread state 3
Error generated. /dvs/git/dirty/git-master_linux/multimedia/nvgstreamer/gst-nvarguscamera/gstnvarguscamerasrc.cpp, execute:806 (propagating)
WARNING: from element /GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0/GstFakeSink:fakesink0: Pipeline construction is invalid, please add queues.
Additional debug info:
gstbasesink.c(1209): gst_base_sink_query_latency (): /GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0/GstFakeSink:fakesink0:
Not enough buffering available for the processing deadline of 0:00:00.020000000, add enough queues to buffer 0:00:00.020000000 additional data. Shortening processing latency to 0:00:00.000000000.
Got EOS from element “pipeline0”.
Execution ended after 0:00:02.067857088
Setting pipeline to NULL …
Freeing pipeline …

Syslog:
nvargus-daemon[804]: OFParserListModules: module list: /proc/device-tree/tegra-camera-platform/modules/module1
nvargus-daemon[804]: OFParserGetVirtualDevice: NVIDIA Camera virtual enumerator not found in proc device-tree
nvargus-daemon[804]: ---- imager: No override file found. ----
nvargus-daemon[804]: ---- imager: No override file found. ----
nvargus-daemon[804]: === gst-launch-1.0[4121]: CameraProvider initialized (0xffff9073c910)(Argus) Error InvalidState: Unknown stream deleted. (in src/api/CaptureSessionImpl.cpp, function outputStreamDeleted(), line 1094)

Try killing and restarting nvargus-daemon:

sudo pkill nvargus-daemon
sudo nvargus-daemon &
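
If restarting the daemon by hand makes no difference, restarting it through systemd should be equivalent (assuming the stock nvargus-daemon.service shipped with JetPack) and keeps it supervised:

sudo systemctl restart nvargus-daemon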

No success.
What does the error in the syslog mean that says the file nvphsd.ctl was not found?

sudo pkill nvargus-daemon
sudo nvargus-daemon &

gst-launch-1.0 nvarguscamerasrc ! 'video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080' ! nvvidconv ! fpsdisplaysink video-sink=fakesink
=== gst-launch-1.0[4868]: Connection established (FFFF868A3900)OFParserListModules: module list: /proc/device-tree/tegra-camera-platform/modules/module0
OFParserListModules: module list: /proc/device-tree/tegra-camera-platform/modules/module1
OFParserGetVirtualDevice: NVIDIA Camera virtual enumerator not found in proc device-tree
---- imager: No override file found. ----
---- imager: No override file found. ----
=== gst-launch-1.0[4868]: CameraProvider initialized (0xffff807a2c90)Setting pipeline to PAUSED …
Pipeline is live and does not need PREROLL …
Setting pipeline to PLAYING …
New clock: GstSystemClock
SCF: Error BadValue: NvPHSSendThroughputHints (in src/common/CameraPowerHint.cpp, function sendCameraPowerHint(), line 56)
GST_ARGUS: Creating output stream
(Argus) Error NotSupported: Failed to create EGLStream (in src/api/EGLOutputStreamImpl.cpp, function initialize(), line 121)
(Argus) Error NotSupported: (propagating from src/api/CaptureSessionImpl.cpp, function createEGLOutputStream(), line 977)
(Argus) Error InvalidState: Unknown stream deleted. (in src/api/CaptureSessionImpl.cpp, function outputStreamDeleted(), line 1094)
(Argus) Error NotSupported: (propagating from src/api/CaptureSessionImpl.cpp, function createOutputStreamInternal(), line 839)
Error generated. /dvs/git/dirty/git-master_linux/multimedia/nvgstreamer/gst-nvarguscamera/gstnvarguscamerasrc.cpp, execute:799 Failed to create OutputStream
WARNING: from element /GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0/GstFakeSink:fakesink0: Pipeline construction is invalid, please add queues.
Additional debug info:
gstbasesink.c(1209): gst_base_sink_query_latency (): /GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0/GstFakeSink:fakesink0:
Not enough buffering available for the processing deadline of 0:00:00.020000000, add enough queues to buffer 0:00:00.020000000 additional data. Shortening processing latency to 0:00:00.000000000.
Got EOS from element “pipeline0”.
Execution ended after 0:00:08.277136518
Setting pipeline to NULL …
Freeing pipeline …

syslog:
nvargus-daemon: E/ libnvphs:socket: Error[2]: socket connection /var/lib/nvphs/nvphsd.ctl to PHS failed: No such file or directory
nvargus-daemon: D/ libnvphs:socket: Warning: connecting to Power Hinting Service failed. Is PHS running?
nvargus-daemon: E/ libnvphs:socket: Error[2]: socket connection /var/lib/nvphs/nvphsd.ctl to PHS failed: No such file or directory
nvargus-daemon: D/ libnvphs:socket: Warning: connecting to Power Hinting Service failed. Is PHS running?
nvargus-daemon: E/ libnvphs: Error: NvPHSSendThroughputHints[usecase=camera, hint=MinCPU, value=4294967295, timeout_ms=1000]: queue_or_send() failed

Does your system have a display?
Did you export DISPLAY=:0 before launching the gst-launch-1.0 command?
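
For example (assuming the desktop session runs on display :0):

export DISPLAY=:0
gst-launch-1.0 nvarguscamerasrc ! 'video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080' ! nvvidconv ! fpsdisplaysink video-sink=fakesink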


The system has a display, but if I only want to capture the stream into an image file, shouldn't that work without a display? I killed and restarted nvargus-daemon before running the following command:
gst-launch-1.0 nvarguscamerasrc sensor-id=0 ! 'video/x-raw(memory:NVMM),width=3280,height=2464,framerate=25/1' ! nvvidconv flip-method=3 ! nvjpegenc ! multifilesink location=./camera.jpeg

I get this error; I have no idea what the problem is.
=== gst-launch-1.0[6976]: Connection established (FFFF9249D900)OFParserListModules: module list: /proc/device-tree/tegra-camera-platform/modules/module0
OFParserListModules: module list: /proc/device-tree/tegra-camera-platform/modules/module1
OFParserGetVirtualDevice: NVIDIA Camera virtual enumerator not found in proc device-tree
---- imager: No override file found. ----
---- imager: No override file found. ----
=== gst-launch-1.0[6976]: CameraProvider initialized (0xffff8c7a1f70)Setting pipeline to PAUSED …
Pipeline is live and does not need PREROLL …
Setting pipeline to PLAYING …
New clock: GstSystemClock
SCF: Error BadValue: NvPHSSendThroughputHints (in src/common/CameraPowerHint.cpp, function sendCameraPowerHint(), line 56)
GST_ARGUS: Creating output stream
(Argus) Error NotSupported: Failed to create EGLStream (in src/api/EGLOutputStreamImpl.cpp, function initialize(), line 121)
(Argus) Error NotSupported: (propagating from src/api/CaptureSessionImpl.cpp, function createEGLOutputStream(), line 977)
(Argus) Error InvalidState: Unknown stream deleted. (in src/api/CaptureSessionImpl.cpp, function outputStreamDeleted(), line 1094)
(Argus) Error NotSupported: (propagating from src/api/CaptureSessionImpl.cpp, function createOutputStreamInternal(), line 839)
Error generated. /dvs/git/dirty/git-master_linux/multimedia/nvgstreamer/gst-nvarguscamera/gstnvarguscamerasrc.cpp, execute:799 Failed to create OutputStream
Got EOS from element “pipeline0”.
Execution ended after 0:00:08.186094293
Setting pipeline to NULL …
Freeing pipeline …

That message looks like the display driver didn't initialize.

Thanks for your reply.
How is this possible with a fresh SDK install on the device?
Any ideas or proposals on how to fix it?
Thanks so much in advance.

Did you run the gst-launch-1.0 command remotely on the Orin Nano?
Could you verify the gst-launch-1.0 command with a keyboard connected directly to the Orin Nano?

I did another clean SDK install and launched it with a display connected:

gst-launch-1.0 nvarguscamerasrc ! 'video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080' ! nvvidconv ! fpsdisplaysink video-sink=fakesink
Setting pipeline to PAUSED …
Pipeline is live and does not need PREROLL …
Setting pipeline to PLAYING …
New clock: GstSystemClock
GST_ARGUS: Creating output stream
CONSUMER: Waiting until producer is connected…
GST_ARGUS: Available Sensor modes :
GST_ARGUS: 3280 x 2464 FR = 21,000000 fps Duration = 47619048 ; Analog Gain range min 1,000000, max 10,625000; Exposure Range min 13000, max 683709000;

GST_ARGUS: 3280 x 1848 FR = 28,000001 fps Duration = 35714284 ; Analog Gain range min 1,000000, max 10,625000; Exposure Range min 13000, max 683709000;

GST_ARGUS: 1920 x 1080 FR = 29,999999 fps Duration = 33333334 ; Analog Gain range min 1,000000, max 10,625000; Exposure Range min 13000, max 683709000;

GST_ARGUS: 1640 x 1232 FR = 29,999999 fps Duration = 33333334 ; Analog Gain range min 1,000000, max 10,625000; Exposure Range min 13000, max 683709000;

GST_ARGUS: 1280 x 720 FR = 59,999999 fps Duration = 16666667 ; Analog Gain range min 1,000000, max 10,625000; Exposure Range min 13000, max 683709000;

GST_ARGUS: Running with following settings:
Camera index = 1
Camera mode = 2
Output Stream W = 1920 H = 1080
seconds to Run = 0
Frame Rate = 29,999999
GST_ARGUS: Setup Complete, Starting captures for 0 seconds
GST_ARGUS: Starting repeat capture requests.
CONSUMER: Producer has connected; continuing.
WARNING: from element /GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0/GstFakeSink:fakesink0: Pipeline construction is invalid, please add queues.
Additional debug info:
gstbasesink.c(1209): gst_base_sink_query_latency (): /GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0/GstFakeSink:fakesink0:
Not enough buffering available for the processing deadline of 0:00:00.020000000, add enough queues to buffer 0:00:00.020000000 additional data. Shortening processing latency to 0:00:00.000000000.

You may try:

gst-launch-1.0 nvarguscamerasrc ! 'video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080' ! nvvidconv ! queue ! fpsdisplaysink text-overlay=0 video-sink=fakesink -v

gst-launch-1.0 nvarguscamerasrc ! 'video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080' ! nvvidconv ! queue ! fpsdisplaysink text-overlay=0 video-sink=fakesink -v
Setting pipeline to PAUSED …
Pipeline is live and does not need PREROLL …
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0/GstFakeSink:fakesink0: sync = true
Setting pipeline to PLAYING …
New clock: GstSystemClock
/GstPipeline:pipeline0/GstNvArgusCameraSrc:nvarguscamerasrc0.GstPad:src: caps = video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, format=(string)NV12, framerate=(fraction)30/1
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:src: caps = video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, format=(string)NV12, framerate=(fraction)30/1
/GstPipeline:pipeline0/Gstnvvconv:nvvconv0.GstPad:src: caps = video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, format=(string)NV12, framerate=(fraction)30/1
/GstPipeline:pipeline0/GstQueue:queue0.GstPad:sink: caps = video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, format=(string)NV12, framerate=(fraction)30/1
/GstPipeline:pipeline0/Gstnvvconv:nvvconv0.GstPad:sink: caps = video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, format=(string)NV12, framerate=(fraction)30/1
/GstPipeline:pipeline0/GstQueue:queue0.GstPad:src: caps = video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, format=(string)NV12, framerate=(fraction)30/1
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:sink: caps = video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, format=(string)NV12, framerate=(fraction)30/1
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0.GstGhostPad:sink.GstProxyPad:proxypad0: caps = video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, format=(string)NV12, framerate=(fraction)30/1
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0/GstFakeSink:fakesink0.GstPad:sink: caps = video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, format=(string)NV12, framerate=(fraction)30/1
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0.GstGhostPad:sink: caps = video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, format=(string)NV12, framerate=(fraction)30/1
GST_ARGUS: Creating output stream
CONSUMER: Waiting until producer is connected…
GST_ARGUS: Available Sensor modes :
GST_ARGUS: 3280 x 2464 FR = 21,000000 fps Duration = 47619048 ; Analog Gain range min 1,000000, max 10,625000; Exposure Range min 13000, max 683709000;

GST_ARGUS: 3280 x 1848 FR = 28,000001 fps Duration = 35714284 ; Analog Gain range min 1,000000, max 10,625000; Exposure Range min 13000, max 683709000;

GST_ARGUS: 1920 x 1080 FR = 29,999999 fps Duration = 33333334 ; Analog Gain range min 1,000000, max 10,625000; Exposure Range min 13000, max 683709000;

GST_ARGUS: 1640 x 1232 FR = 29,999999 fps Duration = 33333334 ; Analog Gain range min 1,000000, max 10,625000; Exposure Range min 13000, max 683709000;

GST_ARGUS: 1280 x 720 FR = 59,999999 fps Duration = 16666667 ; Analog Gain range min 1,000000, max 10,625000; Exposure Range min 13000, max 683709000;

GST_ARGUS: Running with following settings:
Camera index = 0
Camera mode = 2
Output Stream W = 1920 H = 1080
seconds to Run = 0
Frame Rate = 29,999999
GST_ARGUS: Setup Complete, Starting captures for 0 seconds
GST_ARGUS: Starting repeat capture requests.
CONSUMER: Producer has connected; continuing.
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0/GstFakeSink:fakesink0: sync = true
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0: last-message = rendered: 17, dropped: 0, current: 31,98, average: 31,98
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0: last-message = rendered: 33, dropped: 0, current: 30,08, average: 31,03
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0: last-message = rendered: 48, dropped: 0, current: 29,93, average: 30,68
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0: last-message = rendered: 64, dropped: 0, current: 30,05, average: 30,52
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0: last-message = rendered: 79, dropped: 0, current: 29,98, average: 30,41
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0: last-message = rendered: 94, dropped: 0, current: 29,98, average: 30,34
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0: last-message = rendered: 110, dropped: 0, current: 30,04, average: 30,30
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0: last-message = rendered: 125, dropped: 0, current: 29,98, average: 30,26
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0: last-message = rendered: 141, dropped: 0, current: 30,01, average: 30,23

Looks like adding text-overlay=0 lets the pipeline run without problems.
Try using nvargus_nvraw to capture a raw or JPEG image to confirm the image.
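
For example, something along these lines should save a JPEG from sensor 0 in its 1920x1080 mode (the option names may differ between JetPack releases, so please check the tool's usage/help output first):

nvargus_nvraw --c 0 --mode 2 --file /tmp/imx219_test --format jpg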

So your camera works with GStreamer. Is there another issue? You should be able to view your camera with:

gst-launch-1.0 nvarguscamerasrc ! 'video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080' ! nvvidconv ! queue ! nv3dsink -v

and record as in your original post with:

gst-launch-1.0 nvarguscamerasrc ! 'video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080' ! nvvidconv ! queue ! video/x-raw,format=I420 ! x264enc ! h264parse ! qtmux ! filesink location=output.mp4 -ev
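
The resulting file can then be checked with, for example (if gst-play-1.0 is installed):

gst-play-1.0 output.mp4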

Can you provide me with an example using appsink?

What is the appsink? What format does it expect?
For example, if the appsink is OpenCV expecting BGR, a pipeline would be (nvvidconv cannot output packed BGR directly, so it converts to BGRx first and videoconvert then drops the padding byte to give BGR):

nvarguscamerasrc ! video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080 ! nvvidconv ! video/x-raw,format=BGRx ! videoconvert ! video/x-raw,format=BGR ! queue ! appsink drop=1

That works, thanks!!!

When I run it on the device with a display, it works. But when I run the command over SSH, I get this error:
gst-launch-1.0 nvarguscamerasrc sensor-id=0 ! 'video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080' ! nvvidconv ! video/x-raw,format=BGRx ! videoconvert ! video/x-raw,format=BGR ! queue ! appsink drop=1
Setting pipeline to PAUSED …
Pipeline is live and does not need PREROLL …
Setting pipeline to PLAYING …
New clock: GstSystemClock
GST_ARGUS: Creating output stream
(Argus) Error BadParameter: (propagating from src/eglstream/FrameConsumerImpl.cpp, function initialize(), line 89)
(Argus) Error BadParameter: (propagating from src/eglstream/FrameConsumerImpl.cpp, function create(), line 44)
Error generated. /dvs/git/dirty/git-master_linux/multimedia/nvgstreamer/gst-nvarguscamera/gstnvarguscamerasrc.cpp, threadInitialize:320 Failed to create FrameConsumer
Error generated. /dvs/git/dirty/git-master_linux/multimedia/nvgstreamer/gst-nvarguscamera/gstnvarguscamerasrc.cpp, threadFunction:241 (propagating)
Error generated. /dvs/git/dirty/git-master_linux/multimedia/nvgstreamer/gst-nvarguscamera/gstnvarguscamerasrc.cpp, waitRunning:203 Invalid thread state 3
Error generated. /dvs/git/dirty/git-master_linux/multimedia/nvgstreamer/gst-nvarguscamera/gstnvarguscamerasrc.cpp, execute:806 (propagating)
Got EOS from element “pipeline0”.
Execution ended after 0:00:02.097940929
Setting pipeline to NULL …
Freeing pipeline …