Headless Orin Nano won't export Argus CSI camera stream to remote screen

I need to export a video stream from IMX219-83 Stereo Cameras to a remote display.

  • I have no physical display connected to the Orin Nano, as it is running headless.

  • Configuration: apt show nvidia-jetpack
    Package: nvidia-jetpack
    Version: 5.1.1-b56
    Priority: standard
    Section: metapackages
    Maintainer: NVIDIA Corporation
    Installed-Size: 199 kB
    Depends: nvidia-jetpack-runtime (= 5.1.1-b56), nvidia-jetpack-dev (= 5.1.1-b56)
    Homepage: Jetson - Embedded AI Computing Platform | NVIDIA Developer
    Download-Size: 29.3 kB
    APT-Sources: https://repo.download.nvidia.com/jetson/common r35.3/main arm64 Packages

On the local Ubuntu (22.04) machine, I log into the Orin Nano: ssh -X username@machinename

On the local machine, I also run: export DISPLAY=localhost:10.0

Then, to test the connection, I run cheese (with a USB camera plugged into an Orin Nano USB port). This runs successfully and shows the exported video stream.

Now, I run:
gst-launch-1.0 nvarguscamerasrc ! nvvidconv ! ximagesink
libEGL warning: DRI3: failed to query the version
libEGL warning: DRI2: failed to authenticate
Setting pipeline to PAUSED …
Pipeline is live and does not need PREROLL …
Setting pipeline to PLAYING …
New clock: GstSystemClock
GST_ARGUS: Creating output stream
(Argus) Error BadParameter: (propagating from src/eglstream/FrameConsumerImpl.cpp, function initialize(), line 89)
(Argus) Error BadParameter: (propagating from src/eglstream/FrameConsumerImpl.cpp, function create(), line 44)
Error generated. /dvs/git/dirty/git-master_linux/multimedia/nvgstreamer/gst-nvarguscamera/gstnvarguscamerasrc.cpp, threadInitialize:320 Failed to create FrameConsumer
Error generated. /dvs/git/dirty/git-master_linux/multimedia/nvgstreamer/gst-nvarguscamera/gstnvarguscamerasrc.cpp, threadFunction:241 (propagating)
Error generated. /dvs/git/dirty/git-master_linux/multimedia/nvgstreamer/gst-nvarguscamera/gstnvarguscamerasrc.cpp, waitRunning:203 Invalid thread state 3
Error generated. /dvs/git/dirty/git-master_linux/multimedia/nvgstreamer/gst-nvarguscamera/gstnvarguscamerasrc.cpp, execute:806 (propagating)
WARNING: from element /GstPipeline:pipeline0/GstXImageSink:ximagesink0: Pipeline construction is invalid, please add queues.
Additional debug info:
gstbasesink.c(1209): gst_base_sink_query_latency (): /GstPipeline:pipeline0/GstXImageSink:ximagesink0:
Not enough buffering available for the processing deadline of 0:00:00.015000000, add enough queues to buffer 0:00:00.015000000 additional data. Shortening processing latency to 0:00:00.000000000.
Got EOS from element “pipeline0”.
Execution ended after 0:00:00.288199999
Setting pipeline to NULL …
Freeing pipeline …

/var/log/syslog shows:

May 27 11:49:12 auvai1 nvargus-daemon[949]: === NVIDIA Libargus Camera Service (0.99.33)=== Listening for connections…=== gst-launch-1.0[4072]: Connection established (FFFF9A41D900)OFParserListModules: module list: /proc/device-tree/tegra-camera-platform/modules/module0
May 27 11:49:12 auvai1 nvargus-daemon[949]: OFParserListModules: module list: /proc/device-tree/tegra-camera-platform/modules/module1
May 27 11:49:12 auvai1 nvargus-daemon[949]: (NvCamV4l2) Error ModuleNotPresent: V4L2Device not available (in /dvs/git/dirty/git-master_linux/camera/utils/nvcamv4l2/v4l2_device.cpp, function verifyDeviceCaps(), line 206)
May 27 11:49:12 auvai1 nvargus-daemon[949]: (NvCamV4l2) Error ModuleNotPresent: (propagating from /dvs/git/dirty/git-master_linux/camera/utils/nvcamv4l2/v4l2_device.cpp, function initialize(), line 66)
May 27 11:49:12 auvai1 nvargus-daemon[949]: (NvCamV4l2) Error ModuleNotPresent: V4L2Device not available (in /dvs/git/dirty/git-master_linux/camera/utils/nvcamv4l2/v4l2_device.cpp, function verifyDeviceCaps(), line 206)
May 27 11:49:12 auvai1 nvargus-daemon[949]: (NvCamV4l2) Error ModuleNotPresent: (propagating from /dvs/git/dirty/git-master_linux/camera/utils/nvcamv4l2/v4l2_device.cpp, function initialize(), line 66)
May 27 11:49:12 auvai1 nvargus-daemon[949]: OFParserGetVirtualDevice: NVIDIA Camera virtual enumerator not found in proc device-tree
May 27 11:49:12 auvai1 nvargus-daemon[949]: ---- imager: No override file found. ----
May 27 11:49:12 auvai1 nvargus-daemon[949]: message repeated 2 times: [ ---- imager: No override file found. ----]
May 27 11:49:12 auvai1 nvargus-daemon[949]: initializeDevNode: Failed to open dev node ‘/dev/camera/video2’; No such file or directory, trying alternate default location
May 27 11:49:12 auvai1 nvargus-daemon[949]: ---- imager: No override file found. ----
May 27 11:49:12 auvai1 nvargus-daemon[949]: initializeDevNode: Failed to open dev node ‘/dev/camera/video3’; No such file or directory, trying alternate default location
May 27 11:49:12 auvai1 nvargus-daemon[949]: (NvCamV4l2) Error NotSupported: Must be capture/output devices (in /dvs/git/dirty/git-master_linux/camera/utils/nvcamv4l2/v4l2_device.cpp, function setActiveBufferType(), line 813)
May 27 11:49:12 auvai1 nvargus-daemon[949]: (NvCamV4l2) Error NotSupported: (propagating from /dvs/git/dirty/git-master_linux/camera/utils/nvcamv4l2/v4l2_device.cpp, function initialize(), line 68)
May 27 11:49:12 auvai1 nvargus-daemon[949]: (NvOdmDevice) Error NotSupported: (propagating from dvs/git/dirty/git-master_linux/camera-partner/imager/src/devices/V4L2SensorNonViCsi.cpp, function initialize(), line 146)
May 27 11:49:12 auvai1 nvargus-daemon[949]: NvPclDriverInitializeData: Unable to initialize driver v4l2_sensor_usb
May 27 11:49:12 auvai1 nvargus-daemon[949]: NvPclInitializeDrivers: error: Failed to init camera sub module v4l2_sensor_usb
May 27 11:49:12 auvai1 nvargus-daemon[949]: NvPclStartPlatformDrivers: Failed to start module drivers
May 27 11:49:12 auvai1 nvargus-daemon[949]: NvPclStateControllerOpen: Failed ImagerGUID 3. (error 0x2)
May 27 11:49:12 auvai1 nvargus-daemon[949]: NvPclOpen: PCL Open Failed. Error: 0xf
May 27 11:49:12 auvai1 nvargus-daemon[949]: SCF: Error BadParameter: Sensor could not be opened. (in src/services/capture/CaptureServiceDeviceSensor.cpp, function getSourceFromGuid(), line 689)
May 27 11:49:12 auvai1 nvargus-daemon[949]: SCF: Error BadParameter: (propagating from src/services/capture/CaptureService.cpp, function addSourceByGuid(), line 453)
May 27 11:49:12 auvai1 nvargus-daemon[949]: SCF: Error BadParameter: (propagating from src/api/CameraDriver.cpp, function addSourceByIndex(), line 333)
May 27 11:49:12 auvai1 nvargus-daemon[949]: SCF: Error BadParameter: (propagating from src/api/CameraDriver.cpp, function getSource(), line 505)
May 27 11:49:12 auvai1 nvargus-daemon: E/ libnvphs:socket: Error[2]: socket connection /var/lib/nvphs/nvphsd.ctl to PHS failed: No such file or directory
May 27 11:49:12 auvai1 nvargus-daemon: D/ libnvphs:socket: Warning: connecting to Power Hinting Service failed. Is PHS running?
May 27 11:49:12 auvai1 nvargus-daemon[949]: === gst-launch-1.0[4072]: CameraProvider initialized (0xffff947a1260)SCF: Error BadValue: NvPHSSendThroughputHints (in src/common/CameraPowerHint.cpp, function sendCameraPowerHint(), line 56)
May 27 11:49:12 auvai1 nvargus-daemon: E/ libnvphs:socket: Error[2]: socket connection /var/lib/nvphs/nvphsd.ctl to PHS failed: No such file or directory
May 27 11:49:12 auvai1 nvargus-daemon: D/ libnvphs:socket: Warning: connecting to Power Hinting Service failed. Is PHS running?
May 27 11:49:12 auvai1 nvargus-daemon: E/ libnvphs: Error: NvPHSSendThroughputHints[usecase=camera, hint=MinCPU, value=4294967295, timeout_ms=1000]: queue_or_send() failed
May 27 11:49:13 auvai1 nvargus-daemon[949]: (Argus) Error InvalidState: Unknown stream deleted. (in src/api/CaptureSessionImpl.cpp, function outputStreamDeleted(), line 1094)

Please also note that I have run gst-launch-1.0 nvarguscamerasrc ! nvvidconv ! ximagesink in the past, with a physical display plugged into the Orin Nano DisplayPort connection. This runs as expected, showing the video stream on the local display. I simply cannot get this video stream to show up remotely when there is no display physically connected to the Orin Nano.

I’m trying to understand this error better:
May 27 11:49:13 auvai1 nvargus-daemon[949]: (Argus) Error InvalidState: Unknown stream deleted. (in src/api/CaptureSessionImpl.cpp, function outputStreamDeleted(), line 1094)

I would be deeply appreciative of any assistance you can provide.
Thank you

Just a note, as there may be other problems as well: when you use ssh with -X or -Y, ssh itself takes care of exporting DISPLAY for programs you run from that ssh session. You do not need to run another export, and doing so would probably cause a failure.

Also, from the perspective of the ssh session and some non-ssh session on the remote system, the DISPLAY value is probably not what you expect. I would bet that using “localhost” in the DISPLAY points to the wrong computer. Does it work with a clean “ssh -X” session? If not, once you’ve reached the remote Jetson with “ssh -X”, what do you see from “echo $DISPLAY”?

Hi @linuxdev, thank you for your reply.

After I reboot Orin Nano, I do:

~$ ssh -X username@machinename

Then
~$ echo $DISPLAY
localhost:10.0

I see; $DISPLAY gets set automatically. When I test with gedit, it correctly displays on the local machine.

when I then run

gst-launch-1.0 nvarguscamerasrc ! nvvidconv ! ximagesink

A display does briefly flash up on the screen, and then the errors get reported. Unfortunately, there is no video stream. Help would be appreciated.

I think this is working as expected (but there is a twist on that story, explained below). DISPLAY is indeed set up correctly and automatically.

However, remote display is not what most people think it is. This is quite different compared to a virtual desktop.

A virtual desktop runs everything on the remote system, but displays on the local system. Remote display runs the non-GUI components on the remote system, but uses the graphics (and libraries and so on) of the local system. This might seem like it is the same, but it is not.

With ssh remote display, a program which does not touch GUI or audio abilities will be indistinguishable from one run locally. Let's pretend, though, that you use the GPU. It is no longer the GPU of the remote system performing that work; it is instead the local GPU. The libraries of the local system are also used, not those of the remote system. If the remote program is, for example, OpenGL graphics, and it requires a particular release, and all of that exists on the remote system but not on the local system, then you get a failure.

The same goes for audio: if the remote system has everything needed for the audio a particular program requires, but you display remotely to another system, then it is that other system which must have both the audio hardware and the audio software.

CUDA programs also match this description: if you use a powerful desktop PC and run a remote CUDA program on the Jetson, then one of two things happens. Either the PC has what is needed and the Jetson seems very powerful, or the program will crash and burn (even if you are able to run the program directly on the Jetson when not using remote display).

I’m thinking your program would not run on the computer you are using locally, but would run on the remote system if you were directly at that system’s keyboard/monitor.

Do both computers have the ability to run the program without remote access?

If you want the other end to do everything, and only remote display, then you need a virtual desktop. The ssh remote display relays events to the host, and the host interprets them, whereas a virtual desktop runs everything on that other system, and merely relays the results of those events (not the events themselves).

@linuxdev, thank you for your detailed explanation.

I do not completely understand this yet, but your explanation goes a long way to help me understand the differences. It is easy to use some of these features and take their functionality for granted without understanding the underlying mechanisms!

In my case, since I am using the Orin Nano in a robotics application, almost everything does need to run remotely on the Orin Nano. I do need local access to one of the camera video streams from the IMX219-83 stereo CSI (not USB) cameras. It does not need to be strict real-time access, but close to real time.

Also, please note that when I do plug a display into the Orin Nano DisplayPort and run gst-launch-1.0 nvarguscamerasrc ! nvvidconv ! ximagesink, this does work as expected. In practice, since the Orin Nano is part of a remote autonomous system, no display will be physically connected.

The part I struggle with, (at least in terms of my limited understanding of these underlying mechanisms) is this:

When streaming video from the remote Orin to the local machine using the cheese application, this works perfectly with a USB camera plugged into the Orin Nano USB port. The video stream displays perfectly on the local screen.

However, camera streams originating from the GStreamer nvarguscamerasrc pipeline always fail when I attempt to display them on the local screen (when nothing is physically connected to the Orin display port). My interpretation of your explanation leads me to believe that it is failing because more processing would need to be done on my local machine, and that processing infrastructure is simply not there.

I’m trying to understand the difference in terms of processing requirements between these two use cases.

Regarding your questions:

I’m thinking your program would not run on the computer you are using locally, but would run on the remote system if you were directly at that system’s keyboard/monitor.

Yes - exactly.
gst-launch-1.0 nvarguscamerasrc ! …
will only run on the Orin Nano (remotely), since that is where video is being captured and processed in real time.

Do both computers have the ability to run the program without remote access?

It would not make sense for gst-launch-1.0 nvarguscamerasrc ! … to be directly run on the local computer, as this local computer is only being used for observing video streams originating from the stereo camera on Orin, and for providing some executive instruction back to the Orin Nano.

I would suggest using RTSP to check the camera preview.

More information about X forwarding versus a virtual desktop can be found here:
https://forums.developer.nvidia.com/t/topic/68681/7

Incidentally, the RTSP mentioned by @ShaneCCC is probably more efficient and also more portable.


Thank you @ShaneCCC and @linuxdev for your help so far. I compiled test-launch.c using:

gcc test-launch.c -o test-launch $(pkg-config --cflags --libs gstreamer-1.0 gstreamer-rtsp-server-1.0)

and get warnings:

In file included from /usr/include/gstreamer-1.0/gst/rtsp/gstrtsp.h:24,
from /usr/include/gstreamer-1.0/gst/rtsp/rtsp.h:27,
from /usr/include/gstreamer-1.0/gst/rtsp-server/rtsp-media.h:21,
from /usr/include/gstreamer-1.0/gst/rtsp-server/rtsp-session.h:58,
from /usr/include/gstreamer-1.0/gst/rtsp-server/rtsp-session-pool.h:33,
from /usr/include/gstreamer-1.0/gst/rtsp-server/rtsp-server-object.h:32,
from /usr/include/gstreamer-1.0/gst/rtsp-server/rtsp-server.h:28,
from test-launch.c:22:
/usr/include/gstreamer-1.0/gst/rtsp/gstrtspconnection.h:79:1: warning: ‘GTimeVal’ is deprecated: Use ‘GDateTime’ instead [-Wdeprecated-declarations]
79 | GstRTSPResult gst_rtsp_connection_connect (GstRTSPConnection * conn, GTimeVal * timeout);

This does compile, and I then start the server process:
./test-launch “nvarguscamerasrc ! nvvidconv ! nvv4l2h264enc ! h264parse ! rtph264pay name=pay0 pt=96”

The message then is printed:
stream ready at rtsp://127.0.0.1:8554/test

When I invoke:
sudo ss -ltnp
It shows:

LISTEN 0 5 0.0.0.0:8554 0.0.0.0:* users:((“test-launch”,pid=3874,fd=4))

I notice that the listener shows 0.0.0.0:8554 instead of 127.0.0.1:8554.
Is this ok? Does that mean it is listening on port 8554 on all interfaces, not just loopback?
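For what it's worth, a listener bound to 0.0.0.0 (INADDR_ANY) is simply attached to every interface at once, loopback included, which is what a remote VLC client needs. A small Python sketch with plain sockets (no GStreamer involved; the ephemeral port is just for illustration, while the RTSP server above happens to use 8554) demonstrates the behaviour:

```python
import socket
import threading

# Binding to 0.0.0.0 (INADDR_ANY) means "all interfaces", so the server
# is reachable via 127.0.0.1 and via the LAN address alike. Port 0 lets
# the OS pick a free port for this demonstration.
server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
server.bind(("0.0.0.0", 0))
server.listen(1)
port = server.getsockname()[1]

def accept_one():
    conn, _ = server.accept()
    conn.close()

t = threading.Thread(target=accept_one)
t.start()

# A loopback connection succeeds even though we bound to 0.0.0.0.
with socket.create_connection(("127.0.0.1", port), timeout=2):
    loopback_ok = True

t.join()
server.close()
print("reachable via loopback:", loopback_ok)
```

The same socket also accepts connections arriving on the machine's LAN address, which is why 0.0.0.0:8554 in the ss output is exactly what a remote player wants to see.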

Then, I open VLC Media Player on PC on same local network, and attempt to Open Media via Network with rtsp://192.168.254.170:8554/test

VLC Shows Error:
Your input can’t be opened:
VLC is unable to open the MRL ‘rtsp://192.168.254.170:8554/test’. Check the log for details.

VLC Tools → Messages Show:
live555 error: Failed to connect with rtsp://192.168.254.170:8554/test

satip error: Failed to setup RTSP session

access_realrtsp warning: Cseq mismatch, got 1, assumed 0

access_realrtsp warning: only real/helix rtsp servers supported for now

The information in the /var/log/syslog file seems to indicate that a connection was initially made to 192.168.254.170:8554, but that it closed immediately:

Jun 6 23:26:30 auvai1 nvargus-daemon[954]: === test-launch[3874]: CameraProvider initialized (0xffff847309c0)=== test-launch[3874]: Connection closed (FFFF8BD94900)=== test-launch[3874]: WARNING: CameraProvider was not destroyed before client connection terminated.=== test-launch[3874]: The client may have abnormally terminated. Destroying CameraProvider…=== test-launch[3874]: CameraProvider destroyed (0xffff847309c0)=== test-launch[3874]: Connection cleaned up (FFFF8BD94900)=== test-launch[4072]: Connection established (FFFF8BD94900)OFParserListModules: module list: /proc/device-tree/tegra-camera-platform/modules/module0
Jun 6 23:26:30 auvai1 nvargus-daemon[954]: OFParserListModules: module list: /proc/device-tree/tegra-camera-platform/modules/module1
Jun 6 23:26:30 auvai1 nvargus-daemon[954]: OFParserGetVirtualDevice: NVIDIA Camera virtual enumerator not found in proc device-tree
Jun 6 23:26:30 auvai1 nvargus-daemon[954]: ---- imager: No override file found. ----
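Independently of VLC, a minimal RTSP OPTIONS request can confirm whether test-launch is answering on port 8554 at all. The sketch below includes a stand-in responder purely so it can run without the Jetson; against the real server you would probe 192.168.254.170:8554 and expect an "RTSP/1.0 200 OK" status line:

```python
import socket
import threading

def rtsp_options_probe(host, port, path="/test", timeout=2.0):
    """Send a minimal RTSP OPTIONS request and return the status line."""
    request = (
        f"OPTIONS rtsp://{host}:{port}{path} RTSP/1.0\r\n"
        "CSeq: 1\r\n"
        "\r\n"
    )
    with socket.create_connection((host, port), timeout=timeout) as s:
        s.sendall(request.encode("ascii"))
        reply = s.recv(4096).decode("ascii", "replace")
    return reply.splitlines()[0]

# Stand-in responder so the probe can be demonstrated here; in the real
# setup you would call rtsp_options_probe("192.168.254.170", 8554).
srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
srv.bind(("127.0.0.1", 0))   # port 0: let the OS pick a free port
srv.listen(1)
srv_port = srv.getsockname()[1]

def fake_rtsp_server():
    conn, _ = srv.accept()
    conn.recv(4096)  # read (and ignore) the OPTIONS request
    conn.sendall(b"RTSP/1.0 200 OK\r\nCSeq: 1\r\nPublic: OPTIONS, DESCRIBE\r\n\r\n")
    conn.close()

t = threading.Thread(target=fake_rtsp_server)
t.start()
status = rtsp_options_probe("127.0.0.1", srv_port)
t.join()
srv.close()
print(status)
```

If the probe gets a 200 OK but VLC still fails, the problem is in the media pipeline behind the server (as turned out to be the case here), not in network reachability.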

Please confirm with videotestsrc

OK, thank you @ShaneCCC for your quick response!

I try this:
./test-launch “videotestsrc is-live=1 ! nvvidconv ! nvv4l2h264enc ! h264parse ! rtph264pay name=pay0 pt=96”
stream ready at rtsp://127.0.0.1:8554/test
libEGL warning: DRI2: failed to authenticate

When I try opening the RTSP stream again in VLC, I get similar error messages.

Hi,
There are no hardware encoders in Orin Nano, so please use a software encoder such as x264enc.

videotestsrc is-live=1 ! x264enc ! h264parse ! rtph264pay name=pay0 pt=96

Thank you @DaneLLL!
./test-launch “videotestsrc is-live=1 ! x264enc ! h264parse ! rtph264pay name=pay0 pt=96”
stream ready at rtsp://127.0.0.1:8554/test

successfully generates the test pattern, so the RTSP path is now confirmed.

Now, to use Argus and the IMX219-83 stereo camera:

./test-launch “nvarguscamerasrc ! nvvidconv ! x264enc ! h264parse ! rtph264pay name=pay0 pt=96”
stream ready at rtsp://127.0.0.1:8554/test

When attempting to stream from VLC media player,
Orin throws the following:

libEGL warning: DRI2: failed to authenticate
GST_ARGUS: Creating output stream
(Argus) Error BadParameter: (propagating from src/eglstream/FrameConsumerImpl.cpp, function initialize(), line 89)
(Argus) Error BadParameter: (propagating from src/eglstream/FrameConsumerImpl.cpp, function create(), line 44)
Error generated. /dvs/git/dirty/git-master_linux/multimedia/nvgstreamer/gst-nvarguscamera/gstnvarguscamerasrc.cpp, threadInitialize:320 Failed to create FrameConsumer
Error generated. /dvs/git/dirty/git-master_linux/multimedia/nvgstreamer/gst-nvarguscamera/gstnvarguscamerasrc.cpp, threadFunction:241 (propagating)
Error generated. /dvs/git/dirty/git-master_linux/multimedia/nvgstreamer/gst-nvarguscamera/gstnvarguscamerasrc.cpp, waitRunning:203 Invalid thread state 3
Error generated. /dvs/git/dirty/git-master_linux/multimedia/nvgstreamer/gst-nvarguscamera/gstnvarguscamerasrc.cpp, execute:806 (propagating)

Here is an update:

test-launch “nvarguscamerasrc sensor-id=0 ! nvvidconv ! videoflip method=2 ! x264enc ! h264parse ! rtph264pay name=pay0 pt=96”

will now stream camera video to the VLC application, but only when I have a physical display plugged into the Orin Nano DisplayPort connection.

Thanks for your help with this!

My question is: how can I also get this to work with the Orin Nano running completely headless (i.e., with nothing plugged into the Orin Nano)?

Hi,
We would need to replicate the issue and check. Are you using the Orin Nano Developer Kit when you observe the issue? Please check whether you can run this command on the developer kit without a display device connected:

$ gst-launch-1.0 nvarguscamerasrc ! fakesink

Thank you @DaneLLL

Yes, exactly. I am using the Orin Nano Developer Kit. Here are the command-line output and syslog output when running the command with nothing connected to the Orin Nano:
gst-launch-1.0 nvarguscamerasrc ! fakesink

libEGL warning: DRI2: failed to authenticate
Setting pipeline to PAUSED …
Pipeline is live and does not need PREROLL …
Setting pipeline to PLAYING …
New clock: GstSystemClock
GST_ARGUS: Creating output stream
(Argus) Error BadParameter: (propagating from src/eglstream/FrameConsumerImpl.cpp, function initialize(), line 89)
(Argus) Error BadParameter: (propagating from src/eglstream/FrameConsumerImpl.cpp, function create(), line 44)
Error generated. /dvs/git/dirty/git-master_linux/multimedia/nvgstreamer/gst-nvarguscamera/gstnvarguscamerasrc.cpp, threadInitialize:320 Failed to create FrameConsumer
Error generated. /dvs/git/dirty/git-master_linux/multimedia/nvgstreamer/gst-nvarguscamera/gstnvarguscamerasrc.cpp, threadFunction:241 (propagating)
Error generated. /dvs/git/dirty/git-master_linux/multimedia/nvgstreamer/gst-nvarguscamera/gstnvarguscamerasrc.cpp, waitRunning:203 Invalid thread state 3
Error generated. /dvs/git/dirty/git-master_linux/multimedia/nvgstreamer/gst-nvarguscamera/gstnvarguscamerasrc.cpp, execute:806 (propagating)
Got EOS from element “pipeline0”.
Execution ended after 0:00:00.081717921
Setting pipeline to NULL …
Freeing pipeline …

Here is /var/log/syslog

Jun 7 18:40:50 auvai1 nvargus-daemon[927]: === gst-launch-1.0[4178]: CameraProvider destroyed (0xffffa8d62040)=== gst-launch-1.0[4178]: Connection closed (FFFFAD791900)=== gst-launch-1.0[4178]: Connection cleaned up (FFFFAD791900)=== gst-launch-1.0[4242]: Connection established (FFFFAD791900)OFParserListModules: module list: /proc/device-tree/tegra-camera-platform/modules/module0
Jun 7 18:40:50 auvai1 nvargus-daemon[927]: OFParserListModules: module list: /proc/device-tree/tegra-camera-platform/modules/module1
Jun 7 18:40:50 auvai1 nvargus-daemon[927]: OFParserGetVirtualDevice: NVIDIA Camera virtual enumerator not found in proc device-tree
Jun 7 18:40:50 auvai1 nvargus-daemon[927]: ---- imager: No override file found. ----
Jun 7 18:40:50 auvai1 nvargus-daemon[927]: ---- imager: No override file found. ----
Jun 7 18:40:50 auvai1 nvargus-daemon[927]: === gst-launch-1.0[4242]: CameraProvider initialized (0xffffa8d62040)(Argus) Error InvalidState: Unknown stream deleted. (in src/api/CaptureSessionImpl.cpp, function outputStreamDeleted(), line 1094)

@dspfpga
Does your system boot to console instead of the Ubuntu desktop?
If so, you may need to run xinit & and export DISPLAY=:0 so the EGL driver can load.

libEGL warning: DRI2: failed to authenticate
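Before pointing DISPLAY=:0 at the pipeline, it can help to check whether any X server actually owns :0 (under XRDP you typically get a virtual display such as :10 instead, with no GPU-backed server behind it). A minimal presence check in Python; the conventional /tmp/.X11-unix socket location is an assumption, and the helper name is my own:

```python
import os

def x_server_available(display_num=0):
    """Cheap presence check: an X server for :N conventionally owns the
    Unix socket /tmp/.X11-unix/X<N>. This checks only that the socket
    exists, not that we are authorized to use the display."""
    return os.path.exists(f"/tmp/.X11-unix/X{display_num}")

# On a Jetson booted to the desktop this is normally True for :0;
# an XRDP session display (e.g. :10) is virtual, so exporting
# DISPLAY=:0 from inside it fails authorization ("No protocol
# specified") exactly as in the log above.
print("X server on :0 present:", x_server_available(0))
```

A False result means DISPLAY=:0 has nothing to attach to; a True result with "No protocol specified" errors points at X authorization instead.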

Hello @ShaneCCC,
I currently boot into the desktop, and through XRDP I log in using Remote Desktop Connection from a PC. It brings up (nearly) the same desktop environment that I see when logging in directly to the Orin Nano with a display plugged in.

DISPLAY is currently :10.0

DISPLAY=:0 gst-launch-1.0 nvarguscamerasrc ! fakesink
No protocol specified
No protocol specified
No protocol specified
No protocol specified
nvbufsurftransform: Could not get EGL display connection
Setting pipeline to PAUSED …
Pipeline is live and does not need PREROLL …
Setting pipeline to PLAYING …
New clock: GstSystemClock
GST_ARGUS: Creating output stream
No protocol specified
No protocol specified
No protocol specified
No protocol specified
(Argus) Error NotSupported: Failed to initialize EGLDisplay (in src/eglutils/EGLUtils.cpp, function getDefaultDisplay(), line 77)
(Argus) Error BadParameter: (propagating from src/eglstream/FrameConsumerImpl.cpp, function initialize(), line 89)
(Argus) Error BadParameter: (propagating from src/eglstream/FrameConsumerImpl.cpp, function create(), line 44)
Error generated. /dvs/git/dirty/git-master_linux/multimedia/nvgstreamer/gst-nvarguscamera/gstnvarguscamerasrc.cpp, threadInitialize:320 Failed to create FrameConsumer
Error generated. /dvs/git/dirty/git-master_linux/multimedia/nvgstreamer/gst-nvarguscamera/gstnvarguscamerasrc.cpp, threadFunction:241 (propagating)
Error generated. /dvs/git/dirty/git-master_linux/multimedia/nvgstreamer/gst-nvarguscamera/gstnvarguscamerasrc.cpp, waitRunning:203 Invalid thread state 3
Error generated. /dvs/git/dirty/git-master_linux/multimedia/nvgstreamer/gst-nvarguscamera/gstnvarguscamerasrc.cpp, execute:806 (propagating)
Got EOS from element “pipeline0”.
Execution ended after 0:00:00.092035309
Setting pipeline to NULL …
Freeing pipeline …

Hi,
We tried an Orin Nano devkit + Raspberry Pi camera v2 with JetPack 5.1.1 and do not observe the issue. We don't connect a TV to the DP output when booting the system, and we can ssh in remotely and run the commands successfully:

$ gst-launch-1.0 nvarguscamerasrc ! fakesink
$ gst-launch-1.0 nvarguscamerasrc num-buffers=166 ! nvvidconv ! video/x-raw ! x264enc ! matroskamux ! filesink location=test.mkv

Could you try the commands? We don't hit any issue, and you should be able to run them successfully.

ssh works! I can now stream to the remote machine.

When I ssh in, and run gst-launch-1.0 nvarguscamerasrc ! fakesink
This runs without error.

gst-launch-1.0 nvarguscamerasrc num-buffers=166 ! nvvidconv ! video/x-raw ! x264enc ! matroskamux ! filesink location=test.mkv

This also runs without error.

In these instances, DISPLAY is not set.
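One way to script this headless case so that a forwarded or XRDP DISPLAY never leaks into the pipeline is to strip DISPLAY from the child environment before launching. A sketch; the echo command is a placeholder, and on the Jetson you would substitute the real gst-launch-1.0 ... ! fakesink command:

```python
import os
import subprocess

# Copy the current environment minus DISPLAY, so the child process
# behaves as if the session were truly headless (no X/EGL display to
# attach to). The echo command below only demonstrates that DISPLAY is
# gone from the child's environment; swap in the actual gst-launch-1.0
# pipeline when running on the Jetson.
headless_env = {k: v for k, v in os.environ.items() if k != "DISPLAY"}

result = subprocess.run(
    ["sh", "-c", 'echo "DISPLAY=${DISPLAY-unset}"'],
    env=headless_env, capture_output=True, text=True,
)
print(result.stdout.strip())  # DISPLAY=unset
```

This mirrors what the successful ssh session did implicitly: with DISPLAY absent, nvarguscamerasrc never tries to open an X-backed EGL display.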

Thank you!

I wonder why it does not work through the PC Remote Desktop Connection. When I bring up a terminal window in the PC remote desktop, the EGL connection fails.

Right, Remote Desktop uses a virtual DISPLAY, which causes the EGL driver to fail.
