DeepStream running in SSH terminal

Please provide complete information as applicable to your setup.

• Hardware Platform (Jetson / GPU) Jetson Orin Nano Dev kit
• DeepStream Version 6.4
• JetPack Version (valid for Jetson only) 6.0
• TensorRT Version 8.6.2
• Cuda Version 12.2

• Issue Type( questions, new requirements, bugs)
I’m trying to run DeepStream on a Jetson Orin Nano over SSH, but it fails. I have implemented a custom way to receive UDP video. My pipeline is: receive the UDP video → run DeepStream → extract the DeepStream metadata.
I was extracting the object masks when I hit an error: the application cannot run without a display connected to the Jetson. This is a serious problem for me because I want to run it from an SSH terminal and receive the output on another computer.
I have tried running without EGL by doing unset DISPLAY, but that doesn’t work for me because I still want to visualize the output, such as the masks and IDs. I have ruled out RTSP output since I am doing real-time live image segmentation, and RTSP output introduces a significant delay to the stream. In addition, I have also already set [osd] enable=0.
• How to reproduce the issue ? (This is for bugs. Including which sample app is using, the configuration files content, the command line used and other details for reproducing):
Run DeepStream app through SSH
The printout differs slightly in a few places, but the DeepStream code is the same as “deepstream-app/”.
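For reference, a UDP receive pipeline of the kind described above can be sketched with gst-launch-1.0; the port, payload type, and H.264 codec are assumptions, not values from this thread:

```shell
# Hypothetical UDP/RTP receive sketch for Jetson (port, caps, and codec are assumptions).
# nvv4l2decoder is the Jetson hardware decoder; fakesink discards the frames,
# so this runs headless and only verifies that the stream is received and decoded.
gst-launch-1.0 udpsrc port=5000 \
    caps="application/x-rtp,media=video,encoding-name=H264,payload=96" \
  ! rtph264depay ! h264parse ! nvv4l2decoder \
  ! fakesink
```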

Could you add the GST_DEBUG=3 in front of the command and attach the new log?
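Concretely, that means prefixing the environment variable and redirecting stderr, where the config file name is a placeholder for your own:

```shell
# GST_DEBUG=3 enables GStreamer warnings and errors; they go to stderr,
# so redirect stderr into a log file to attach here.
GST_DEBUG=3 ./deepstream-app -c your_config.txt 2> output_log.txt
```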


export DISPLAY=:0

Try this in SSH shell


output_log.txt (1.2 KB)
There’s the output with GST_DEBUG=3.
Thanks for the reply.

deepstream_app_source1_peoplesegnet.txt (3.3 KB)
I also attached the config file that I am using.

Could you try modifying your config file?

  1. We don’t support type=10 for the source group.
  2. You can use fakesink mode for the sink group.


#Type - 1=CameraV4L2 2=URI 3=MultiURI 4=RTSP
# (0): memtype_device   - Memory type Device
# (1): memtype_pinned   - Memory type Host Pinned
# (2): memtype_unified  - Memory type Unified

#Type - 1=FakeSink 2=EglSink/nv3dsink (Jetson only) 3=File
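Based on the comments above, a minimal headless sink group using fakesink (type=1) could look like the following sketch; the exact values are an assumption, not taken from the attached config:

```ini
[sink0]
enable=1
# 1=FakeSink: discards output, so no display/EGL is needed
type=1
# 0 = do not sync on clock (avoids stalling a live source)
sync=0
```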



For the first note: I understand that the DeepStream app does not support source type 10 by default; however, I created this type myself, and it works fine when I run it in a local terminal. I made changes to make the problem easier to debug.
Deepstream_app_config.txt (2.8 KB)
(upload for the config file)
After the changes, the SSH output is as follows:

./deepstream-app -c Deepstream_app_config.txt
gstnvtracker: Loading low-level lib at /opt/nvidia/deepstream/deepstream/lib/
nvbufsurface: Failed to create EGLImage.
gstnvtracker: Got -1 mapping nvbufsurface
gstnvtracker: Failed to initialize ConvBufManager
gstnvtracker: Failed to initilaize surface transform buffer pool.
^C** ERROR: <_intr_handler:140>: User Interrupted..

Note: I know that if I unset DISPLAY, I can run the executable, but I would like to have some visual output from DeepStream, such as the image with segmented people, or maybe an image rendered with OpenCV.

Thanks for the reply and sorry for the delayed response!

Do you have a monitor attached to the Jetson Orin Nano? If not, you should unset DISPLAY and set the sink type to RTSP mode.
Please refer to our Guide first sink-group.
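Following the sink-group guide, an RTSP streaming sink group (type=4) could be sketched like this; the ports, codec, and bitrate below are example values, not ones from this thread:

```ini
[sink0]
enable=1
# 4=RTSPStreaming: encodes and serves the output over RTSP, no local display needed
type=4
# 1=H264 encoder
codec=1
bitrate=4000000
# Port the RTSP server listens on (stream reachable at rtsp://<jetson-ip>:8554/ds-test)
rtsp-port=8554
# Internal UDP port used between encoder and RTSP server
udp-port=5400
```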

I do have a monitor plugged into the Jetson at the moment, but in the future I would like to run the scripts without it, just through SSH terminals. As you suggested, I will unset the display and try the RTSP output.
If I run into any problems, I will open another topic about the RTSP display.
I appreciate your time, thanks for helping me!

This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.