Export video stream from Orin Nano to a remote display

Hi, I am trying to remotely display a camera stream from the CSI port on an Orin Nano using gstreamer:

gst-launch-1.0 nvarguscamerasrc sensor-id=0 ! 'video/x-raw(memory:NVMM), width=3280, height=2464, framerate=21/1, format=NV12' ! nvvidconv flip-method=2 ! 'video/x-raw, width=800, height=600, format=BGRx' ! videoconvert ! 'video/x-raw, format=BGR' ! queue ! nveglglessink -e

I am running this command on a remote machine using VNC, and the camera video display always shows up on the display directly connected to the Orin Nano. I need to find a method to pipe the video stream to a display on my remote machine.

Thank you for your help!

Log in to the Jetson remotely from a Linux host using ssh with X11 forwarding (ssh -X <Jetson_IP>) and use xvimagesink:

gst-launch-1.0 nvarguscamerasrc sensor-id=0 ! 'video/x-raw(memory:NVMM), width=3280, height=2464, framerate=21/1, format=NV12' ! nvvidconv flip-method=2 ! 'video/x-raw, width=800, height=600' ! videoconvert ! queue ! xvimagesink

If you can't use a host with an X server and want to use VNC, you may try setting DISPLAY to your VNC server's display. For OpenGL it may be more complex; you may have to install something like VirtualGL if you intend to run headless.

Hello, thank you for your reply.

I started a VNC server to run remotely:

vncserver :2

Next, I ran your gstreamer command:

gst-launch-1.0 nvarguscamerasrc sensor-id=0 ! 'video/x-raw(memory:NVMM), width=3280, height=2464, framerate=21/1, format=NV12' ! nvvidconv flip-method=2 ! 'video/x-raw, width=800, height=600' ! videoconvert ! queue ! xvimagesink
Setting pipeline to PAUSED …
ERROR: Pipeline doesn’t want to pause.
ERROR: from element /GstPipeline:pipeline0/GstXvImageSink:xvimagesink0: Could not initialise Xv output
Additional debug info:
xvimagesink.c(1773): gst_xv_image_sink_open (): /GstPipeline:pipeline0/GstXvImageSink:xvimagesink0:
XVideo extension is not available
Setting pipeline to NULL …
Freeing pipeline …

Does this mean the XVideo extension needs to be running on the remote machine?

I should add that the remote machine is Windows with an X11 server, and that I do have a path that uses Xming.

Thanks again for the help.

Not sure what you mean here…

If you just want the gstreamer output in a window of your Windows X server, you would configure X11 forwarding in PuTTY or your other ssh client.

If you want to use VNC instead, then you would set DISPLAY to your VNC screen:

DISPLAY=:2 gst-launch-1.0 nvarguscamerasrc sensor-id=0 ! 'video/x-raw(memory:NVMM), width=3280, height=2464, framerate=21/1, format=NV12' ! nvvidconv flip-method=2 ! 'video/x-raw, width=800, height=600' ! videoconvert ! queue ! xvimagesink
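
A quick way to check whether the X server behind a given DISPLAY provides the Xv extension that xvimagesink needs (a sketch, assuming the xdpyinfo utility is installed; many VNC servers do not implement Xv, while ximagesink works without it):

```shell
# List the extensions advertised by the VNC server's X display;
# xvimagesink requires XVideo, ximagesink does not:
DISPLAY=:2 xdpyinfo | grep -i xvideo

# If nothing is printed, the Xv extension is missing; fall back to ximagesink:
DISPLAY=:2 gst-launch-1.0 nvarguscamerasrc sensor-id=0 \
  ! 'video/x-raw(memory:NVMM), width=3280, height=2464, framerate=21/1, format=NV12' \
  ! nvvidconv flip-method=2 ! 'video/x-raw, width=800, height=600' \
  ! videoconvert ! queue ! ximagesink
```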

I muddied things by mentioning Xming; I only want to run using VNC. When I do this, using:

DISPLAY=:2 gst-launch-1.0 nvarguscamerasrc sensor-id=0 ! 'video/x-raw(memory:NVMM), width=3280, height=2464, framerate=21/1, format=NV12' ! nvvidconv flip-method=2 ! 'video/x-raw, width=800, height=600' ! videoconvert ! queue ! xvimagesink

It comes back with:
Setting pipeline to PAUSED …
ERROR: Pipeline doesn’t want to pause.
ERROR: from element /GstPipeline:pipeline0/GstXvImageSink:xvimagesink0: Could not initialise Xv output
Additional debug info:
xvimagesink.c(1773): gst_xv_image_sink_open (): /GstPipeline:pipeline0/GstXvImageSink:xvimagesink0:
XVideo extension is not available
Setting pipeline to NULL …
Freeing pipeline …

Thank you.

Here is an update on the issue with CSI cameras:

I plugged in a USB LifeCam, and can successfully export to the remote VNC display using:

DISPLAY=:2 gst-launch-1.0 v4l2src device=/dev/video2 ! videoconvert ! ximagesink

There is some latency and a reduced frame rate, but it works. So, the VNC connection to the remote display is correct.

Now, going back to the CSI camera with the issue, I also tried:
gst-launch-1.0 nvarguscamerasrc sensor-id=0 ! 'video/x-raw(memory:NVMM), width=3280, height=2464, framerate=21/1, format=NV12' ! nvvidconv flip-method=2 ! 'video/x-raw, width=800, height=600' ! videoconvert ! queue ! ximagesink

This works on the local display (DISPLAY=:1), but throws an error when I attempt to export to the remote display:
gst-launch-1.0 nvarguscamerasrc sensor-id=0 ! 'video/x-raw(memory:NVMM), width=3280, height=2464, framerate=21/1, format=NV12' ! nvvidconv flip-method=2 ! 'video/x-raw, width=800, height=600' ! videoconvert ! queue ! ximagesink
Setting pipeline to PAUSED …
Pipeline is live and does not need PREROLL …
Setting pipeline to PLAYING …
New clock: GstSystemClock
GST_ARGUS: Creating output stream
(Argus) Error BadParameter: (propagating from src/eglstream/FrameConsumerImpl.cpp, function initialize(), line 89)
(Argus) Error BadParameter: (propagating from src/eglstream/FrameConsumerImpl.cpp, function create(), line 44)
Error generated. /dvs/git/dirty/git-master_linux/multimedia/nvgstreamer/gst-nvarguscamera/gstnvarguscamerasrc.cpp, threadInitialize:320 Failed to create FrameConsumer
Error generated. /dvs/git/dirty/git-master_linux/multimedia/nvgstreamer/gst-nvarguscamera/gstnvarguscamerasrc.cpp, threadFunction:241 (propagating)
Error generated. /dvs/git/dirty/git-master_linux/multimedia/nvgstreamer/gst-nvarguscamera/gstnvarguscamerasrc.cpp, waitRunning:203 Invalid thread state 3
Error generated. /dvs/git/dirty/git-master_linux/multimedia/nvgstreamer/gst-nvarguscamera/gstnvarguscamerasrc.cpp, execute:806 (propagating)
Got EOS from element “pipeline0”.
Execution ended after 0:00:00.089300740
Setting pipeline to NULL …
Freeing pipeline …

It may be an issue with your VNC server. I am not using VNC, but I can confirm that desktop sharing with NoMachine can display nvarguscamerasrc into xvimagesink or nv3dsink, using the following pipeline on a Xavier NX running R35.2:

gst-launch-1.0 nvarguscamerasrc ! nvvidconv ! queue ! xvimagesink

# Or use nv3dsink, but stop the pipeline from the shell with Ctrl-C (only once, then wait) rather than closing the window, otherwise it may trigger errors
# If you face errors you may restart argus with: sudo service nvargus-daemon restart
gst-launch-1.0 nvarguscamerasrc ! nvvidconv ! queue ! nv3dsink

Thank you for your reply. Your suggestion does work for me. I wonder if there is some limitation when using X11 with nvarguscamerasrc-based pipelines. I can use your solution if I can find a way to run the Orin Nano completely headless with NoMachine. Do you know if such a path exists with NoMachine? Thank you.

I haven't played with a headless setup for a few years, but I think it may help to have autologin and a default virtual screen such as this in Xorg.conf:

Section "Screen"
  Identifier "MY-SCREEN"
  DefaultDepth 24
  SubSection "Display"
    Virtual 1280 1024
    Depth 24
  EndSubSection
EndSection

Thank you @Honey_Patouceul,

Here are my latest findings:
As I mentioned earlier, video streaming works fine with a physical display connected. When I disconnect the physical display and use a virtual display:

gst-launch-1.0 nvarguscamerasrc ! nvvidconv ! xvimagesink

tail -f /var/log/syslog shows:
May 24 13:26:06 auvai1 nvargus-daemon: E/ libnvphs:socket: Error[2]: socket connection /var/lib/nvphs/nvphsd.ctl to PHS failed: No such file or directory
May 24 13:26:06 auvai1 nvargus-daemon: D/ libnvphs:socket: Warning: connecting to Power Hinting Service failed. Is PHS running?
May 24 13:26:06 auvai1 nvargus-daemon: E/ libnvphs:socket: Error[2]: socket connection /var/lib/nvphs/nvphsd.ctl to PHS failed: No such file or directory
May 24 13:26:06 auvai1 nvargus-daemon: D/ libnvphs:socket: Warning: connecting to Power Hinting Service failed. Is PHS running?
May 24 13:26:06 auvai1 nvargus-daemon: E/ libnvphs: Error: NvPHSSendThroughputHints[usecase=camera, hint=MinCPU, value=4294967295, timeout_ms=1000]: queue_or_send() failed

Here is the output from the command line:

gst-launch-1.0 nvarguscamerasrc ! nvvidconv ! xvimagesink
libEGL warning: DRI2: failed to authenticate
=== gst-launch-1.0[4373]: Connection established (FFFF8B32D900)OFParserListModules: module list: /proc/device-tree/tegra-camera-platform/modules/module0
OFParserListModules: module list: /proc/device-tree/tegra-camera-platform/modules/module1
OFParserGetVirtualDevice: NVIDIA Camera virtual enumerator not found in proc device-tree
---- imager: No override file found. ----
---- imager: No override file found. ----
=== gst-launch-1.0[4373]: CameraProvider initialized (0xffff84730820)Setting pipeline to PAUSED …
Pipeline is live and does not need PREROLL …
Setting pipeline to PLAYING …
New clock: GstSystemClock
SCF: Error BadValue: NvPHSSendThroughputHints (in src/common/CameraPowerHint.cpp, function sendCameraPowerHint(), line 56)
GST_ARGUS: Creating output stream
No protocol specified
No protocol specified
No protocol specified
No protocol specified
(Argus) Error NotSupported: Failed to initialize EGLDisplay (in src/eglutils/EGLUtils.cpp, function getDefaultDisplay(), line 77)
(Argus) Error NotSupported: Failed to get default display (in src/api/EGLOutputStreamImpl.cpp, function initialize(), line 99)
(Argus) Error NotSupported: (propagating from src/api/CaptureSessionImpl.cpp, function createEGLOutputStream(), line 977)
(Argus) Error InvalidState: Unknown stream deleted. (in src/api/CaptureSessionImpl.cpp, function outputStreamDeleted(), line 1094)
(Argus) Error NotSupported: (propagating from src/api/CaptureSessionImpl.cpp, function createOutputStreamInternal(), line 839)
Error generated. /dvs/git/dirty/git-master_linux/multimedia/nvgstreamer/gst-nvarguscamera/gstnvarguscamerasrc.cpp, execute:799 Failed to create OutputStream
WARNING: from element /GstPipeline:pipeline0/GstXvImageSink:xvimagesink0: Pipeline construction is invalid, please add queues.
Additional debug info:
gstbasesink.c(1209): gst_base_sink_query_latency (): /GstPipeline:pipeline0/GstXvImageSink:xvimagesink0:
Not enough buffering available for the processing deadline of 0:00:00.015000000, add enough queues to buffer 0:00:00.015000000 additional data. Shortening processing latency to 0:00:00.000000000.
Got EOS from element “pipeline0”.
Execution ended after 0:00:00.057438200
Setting pipeline to NULL …

Does anyone know how I can eliminate the error that originates from the Power Hinting Service?

E/ libnvphs:socket: Error[2]: socket connection /var/lib/nvphs/nvphsd.ctl to PHS failed: No such file or directory

This error does not occur when the physical display is connected. I need to be able to run with only a virtual display and without any physical display connected.
Thanks,

After a reboot, I run:

gst-launch-1.0 nvarguscamerasrc ! nvvidconv ! ximagesink
libEGL warning: DRI2: failed to authenticate
Setting pipeline to PAUSED …
Pipeline is live and does not need PREROLL …
Setting pipeline to PLAYING …
New clock: GstSystemClock
GST_ARGUS: Creating output stream
(Argus) Error BadParameter: (propagating from src/eglstream/FrameConsumerImpl.cpp, function initialize(), line 89)
(Argus) Error BadParameter: (propagating from src/eglstream/FrameConsumerImpl.cpp, function create(), line 44)
Error generated. /dvs/git/dirty/git-master_linux/multimedia/nvgstreamer/gst-nvarguscamera/gstnvarguscamerasrc.cpp, threadInitialize:320 Failed to create FrameConsumer
Error generated. /dvs/git/dirty/git-master_linux/multimedia/nvgstreamer/gst-nvarguscamera/gstnvarguscamerasrc.cpp, threadFunction:241 (propagating)
Error generated. /dvs/git/dirty/git-master_linux/multimedia/nvgstreamer/gst-nvarguscamera/gstnvarguscamerasrc.cpp, waitRunning:203 Invalid thread state 3
Error generated. /dvs/git/dirty/git-master_linux/multimedia/nvgstreamer/gst-nvarguscamera/gstnvarguscamerasrc.cpp, execute:806 (propagating)
WARNING: from element /GstPipeline:pipeline0/GstXImageSink:ximagesink0: Pipeline construction is invalid, please add queues.
Additional debug info:
gstbasesink.c(1209): gst_base_sink_query_latency (): /GstPipeline:pipeline0/GstXImageSink:ximagesink0:
Not enough buffering available for the processing deadline of 0:00:00.015000000, add enough queues to buffer 0:00:00.015000000 additional data. Shortening processing latency to 0:00:00.000000000.
Got EOS from element “pipeline0”.
Execution ended after 0:00:00.173594800
Setting pipeline to NULL …
Freeing pipeline …

tail -f /var/log/syslog shows:

May 24 19:31:38 auvai1 nvargus-daemon[938]: === NVIDIA Libargus Camera Service (0.99.33)=== Listening for connections…=== gst-launch-1.0[3392]: Connection established (FFFF8FE12900)OFParserListModules: module list: /proc/device-tree/tegra-camera-platform/modules/module0
May 24 19:31:38 auvai1 nvargus-daemon[938]: OFParserListModules: module list: /proc/device-tree/tegra-camera-platform/modules/module1
May 24 19:31:38 auvai1 nvargus-daemon[938]: OFParserGetVirtualDevice: NVIDIA Camera virtual enumerator not found in proc device-tree
May 24 19:31:38 auvai1 nvargus-daemon[938]: ---- imager: No override file found. ----
May 24 19:31:38 auvai1 nvargus-daemon[938]: ---- imager: No override file found. ----
May 24 19:31:38 auvai1 nvargus-daemon: E/ libnvphs:socket: Error[2]: socket connection /var/lib/nvphs/nvphsd.ctl to PHS failed: No such file or directory
May 24 19:31:38 auvai1 nvargus-daemon: D/ libnvphs:socket: Warning: connecting to Power Hinting Service failed. Is PHS running?
May 24 19:31:38 auvai1 nvargus-daemon[938]: === gst-launch-1.0[3392]: CameraProvider initialized (0xffff886eb420)SCF: Error BadValue: NvPHSSendThroughputHints (in src/common/CameraPowerHint.cpp, function sendCameraPowerHint(), line 56)
May 24 19:31:38 auvai1 nvargus-daemon: E/ libnvphs:socket: Error[2]: socket connection /var/lib/nvphs/nvphsd.ctl to PHS failed: No such file or directory
May 24 19:31:38 auvai1 nvargus-daemon: D/ libnvphs:socket: Warning: connecting to Power Hinting Service failed. Is PHS running?
May 24 19:31:38 auvai1 nvargus-daemon: E/ libnvphs: Error: NvPHSSendThroughputHints[usecase=camera, hint=MinCPU, value=4294967295, timeout_ms=1000]: queue_or_send() failed
May 24 19:31:38 auvai1 nvargus-daemon[938]: (Argus) Error InvalidState: Unknown stream deleted. (in src/api/CaptureSessionImpl.cpp, function outputStreamDeleted(), line 1094)

You can see that it failed to create the FrameConsumer, and at the same time it has the issue with the NVIDIA Power Hinting Service.

Hi,
For this use case, we would suggest using RTSP or UDP: set up the Orin Nano as the server, and run the remote machine as a client that receives and decodes the stream. Please check the examples in
Jetson AGX Orin FAQ

Q: Is there any example of running RTSP streaming?
Q: Is there an example for running UDP streaming?
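
As a rough sketch of the UDP path (the receiver address 192.168.1.100 and port 5000 are placeholders for your remote machine; exact encoder element availability may vary across JetPack releases):

```shell
# On the Orin Nano (sender): capture from the CSI camera, encode to H.264
# with the hardware encoder, packetize as RTP and send over UDP:
gst-launch-1.0 nvarguscamerasrc sensor-id=0 \
  ! 'video/x-raw(memory:NVMM), width=1920, height=1080, framerate=30/1, format=NV12' \
  ! nvv4l2h264enc insert-sps-pps=true \
  ! h264parse ! rtph264pay config-interval=1 \
  ! udpsink host=192.168.1.100 port=5000

# On the remote machine (receiver): depacketize, decode and display:
gst-launch-1.0 udpsrc port=5000 \
  ! 'application/x-rtp, media=video, encoding-name=H264, payload=96' \
  ! rtph264depay ! h264parse ! avdec_h264 ! videoconvert ! autovideosink
```

Note that this approach needs no X forwarding or VNC at all, so it sidesteps the EGL/Xv display issues discussed above.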

