Headless Nano: RDP desktop and camera issues

Hi!

I’m trying to use the Nano as a headless device, connecting to it via SSH. For camera applications, though, I need a remote desktop connection. Since the Nano has no monitor connected, the only option is to connect via RDP and let it create a new GUI session.

It works fine until I want to use an app that streams camera video to the desktop. That doesn’t work, throwing a bunch of errors from CUDA functions like cudaGraphicsGLRegisterBuffer. I assume that’s because there’s no real display connected. When I first create an X session with a display connected, then disconnect it and use a VNC session, everything works fine.

So, is it possible to use Nano in headless mode via RDP?

Thanks!
Alex

I am not familiar with any RDP use case, but I’m curious: what happens if you run our MMAPI sample (from JetPack) and render a video? Would EGL render anything on the virtual screen? The EGL renderer actually uses the X API to get a display.

Doesn’t work:

alex@jetson:~/Downloads/tegra_multimedia_api/samples/00_video_decode$ ./video_decode H264 ../../data/Video/sample_outdoor_car_1080p_10fps.h264
nvbuf_utils: Could not get EGL display connection
Set governor to performance before enabling profiler
Creating decoder in blocking mode 
Failed to query video capabilities: Inappropriate ioctl for device
Opening in BLOCKING MODE 
NvMMLiteOpen : Block : BlockType = 261 
NVMEDIA: Reading sys.display-size : status: 6 
NvMMLiteBlockCreate : Block : BlockType = 261 
Setting frame input mode to 1 
Starting decoder capture loop thread
Input file read complete
Video Resolution: 1920x1080
[INFO] (NvEglRenderer.cpp:110) <renderer0> Setting Screen width 1920 height 1080
[ERROR] (NvEglRenderer.cpp:197) <renderer0> Unable to get egl display
[ERROR] (NvEglRenderer.cpp:153) <renderer0> Got ERROR closing display
Error in setting up renderer. Check if X is running or run with --disable-rendering
NVMEDIA: NVMEDIABufferProcessing: 1096: Consume the extra signalling for EOS 
Error in query_and_set_capture
Exiting decoder capture loop thread
[ERROR] (NvV4l2ElementPlane.cpp:178) <dec0> Output Plane:Error while DQing buffer: Broken pipe
Error DQing buffer at output plane
Decoder is in error
App run failed

Only works with rendering disabled.
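
For reference, this is roughly how I ran it with the flag the error message suggests (option placement may differ between JetPack versions, so check the sample’s usage text):

./video_decode H264 --disable-rendering ../../data/Video/sample_outdoor_car_1080p_10fps.h264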

Not sure, but I think this EGL sink expects a local display.
Could you tell us more about your RDP setup?
Have you tried another (X) sink?
Have you tried setting the DISPLAY environment variable?
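
For example, something like this in the SSH session (just a sketch; :0 assumes the Nano’s local X server runs on display 0, which may not be what xrdp creates):

# See what is currently set, if anything
echo $DISPLAY
# Point at the Nano's local X display
export DISPLAY=:0
# Quick check that the display can actually be opened (xdpyinfo is in x11-utils)
xdpyinfo | head -n 3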

I’m using standard xrdp (version 0.9.5-2) on the Nano; it starts on boot. Then I connect to the Nano over Ethernet with the remmina (1.2.0) RDP client from Linux. All I want is to run some image-detection apps. I know the video refresh rate over RDP won’t be perfect, but that’s OK for me.

When I start an X session with the display connected, then disconnect the display and log in via VNC, everything works fine. It’s just inconvenient to keep plugging the HDMI cable back and forth between the Nano and my workstation, since I only have one display with a single HDMI port at the moment.

DISPLAY is set correctly. I don’t know how to try different sinks with either the gstreamer apps or the MMAPI samples, nor what a suitable sink would be in this case – please advise.

Thank you!

I can’t tell for MMAPI, but for gstreamer you may try these:

# This may expect a local display
# You may have to set DISPLAY=:0
gst-launch-1.0 -ev videotestsrc ! nvvidconv ! nvoverlaysink

# With the same settings:
gst-launch-1.0 -ev videotestsrc ! nvvidconv ! nvegltransform ! nveglglessink

# This might work with X forwarding, i.e. if you log in with ssh -X or -Y...in that case ssh sets DISPLAY for you, so don't set it yourself. Not for this use case, but be aware that in such a configuration some CUDA stuff may be executed on the host...if it has a CUDA-capable GPU.
gst-launch-1.0 -ev videotestsrc ! xvimagesink

Thanks a lot for this tip. I had this very same issue and your advice solved it.

@puppinoo Which method did work for you?

I connect using:
ssh -X <user>@<nano-ip>

then I use this to capture from the RPi camera:

gst-launch-1.0 nvarguscamerasrc ! 'video/x-raw(memory:NVMM),width=1280, height=720, framerate=30/1, format=NV12' ! nvvidconv flip-method=0 ! 'video/x-raw,width=960, height=616' ! nvvidconv ! xvimagesink -e

EDIT: Quality is not exceptional though, maybe because the camera is currently placed in a very bad, dark spot. I need to do further tests.


Fantastic! That works for me as well, thank you very much!

All credit goes to @Honey_Patouceul.

I’m still getting an error in the shell like: nvbuf_utils: Could not get EGL display connection

I’m not expert enough to tell whether it is relevant, though.
For example, you can omit the latter nvvidconv, so the command becomes:

gst-launch-1.0 nvarguscamerasrc ! 'video/x-raw(memory:NVMM),width=1280, height=720, framerate=30/1, format=NV12' ! nvvidconv flip-method=0 ! 'video/x-raw,width=960, height=616' ! xvimagesink -e

The error is still there, but the camera seems to work OK.

I believe this should go into some kind of FAQ, since running the Nano headless should be a typical setup.

I agree. I can’t test right now, but I just hope this method still takes advantage of HW acceleration and doesn’t fall back to software mode. Checking htop, it didn’t seem to use that many resources, but not that few either… I’m still not sure.
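
If it helps, one way to check (assuming tegrastats is present, as it normally is on JetPack images) would be to leave it running in a second SSH session while the pipeline is up; if nvvidconv fell back to a software path you would expect one or more CPU cores to sit near 100%:

# In a second SSH session while the gst-launch pipeline is running
sudo tegrastats
# Watch the per-core CPU load; with the NVMM/hardware path it should stay fairly low.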

ssh -X is my favorite way to connect to things remotely. You don’t even need X running on the remote machine to do it, so it’s safe to do sudo systemctl isolate multi-user.target on the remote machine and things will still work. I’m kind of surprised video works as well as it does here, however, because…

ssh -X works by intercepting calls to the display server and sending them over ssh to the display server running on your machine. You can even launch individual apps without having to launch a full GNOME session.

There is a caveat here, however. It’s like vector drawing: you are receiving not bitmap images but a sequence of instructions to draw things. That can use a huge amount of traffic and will likely only work well wired. Also, OpenGL will not use the GPU over ssh -X; your local CPU will do the drawing. (CUDA will work fine, however, and use the Nano’s GPU as expected.)

If you run glxinfo (from mesa-utils) or similar over ssh -X, you will see that the graphics are (unless there is some voodoo) rendered by a virtual device (OpenGL vendor string: VMware, Inc.). GL works, but slowly. Video will likely have the same issues, but I confess I have never tried to watch a video over ssh -X.
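
For example (glxinfo comes from the mesa-utils package):

# Over an ssh -X session to the Nano
glxinfo | grep -i "opengl vendor"
# Typically prints something like: OpenGL vendor string: VMware, Inc.
# i.e. a software renderer on your side, not the Nano's GPU.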

If you need to stream video from the camera, and don’t care about doing it over remote desktop, you can follow the “Accelerated GStreamer User Guide” to accomplish that task, and it will use the Nano’s hardware acceleration to do it.
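
As a very rough sketch of what that looks like (the encoder element, nvv4l2h264enc vs. omxh264enc, depends on your JetPack release, and <workstation-ip> is a placeholder; the guide has the authoritative pipelines):

# On the Nano: capture, H.264-encode in hardware, send over RTP/UDP
gst-launch-1.0 nvarguscamerasrc ! 'video/x-raw(memory:NVMM),width=1280,height=720,framerate=30/1' ! nvv4l2h264enc insert-sps-pps=true ! h264parse ! rtph264pay ! udpsink host=<workstation-ip> port=5000

# On the workstation: receive, depayload, decode, display
gst-launch-1.0 udpsrc port=5000 caps='application/x-rtp,media=video,clock-rate=90000,encoding-name=H264,payload=96' ! rtph264depay ! avdec_h264 ! videoconvert ! autovideosink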