I am running a deepstream-l4t:6.2-samples container on Jetpack 5.1.1.
I have followed this documentation - Overview of Minimizing Steps - Guide to Minimizing Jetson Disk Usage - and used steps 1, 2, and 3 to remove the GUI, the docs/samples, and the dev packages. I am currently accessing the Jetson device from a terminal opened in setup mode at initial boot. I am interacting with the deepstream-l4t:6.2-samples container using the command -
sudo docker run -it --rm --net=host --runtime nvidia -w /opt/nvidia/deepstream/deepstream-6.2 -v /tmp/.X11-unix/:/tmp/.X11-unix nvcr.io/nvidia/deepstream-l4t:6.2-samples
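For reference, when an X11 display is available, the NGC container page additionally has you allow local X connections and forward DISPLAY into the container; a sketch (the DISPLAY value is environment-dependent, and this is moot once the GUI has been removed):

```shell
# On the host: allow containers to connect to the local X server
xhost +

# Same run command as above, additionally forwarding DISPLAY so X11 sinks work
sudo docker run -it --rm --net=host --runtime nvidia \
  -e DISPLAY=$DISPLAY \
  -w /opt/nvidia/deepstream/deepstream-6.2 \
  -v /tmp/.X11-unix/:/tmp/.X11-unix \
  nvcr.io/nvidia/deepstream-l4t:6.2-samples
```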
I have installed the dependencies of deepstream-app mentioned in the files /opt/nvidia/deepstream/deepstream-6.2/sources/apps/sample_apps/deepstream-app/README and /opt/nvidia/deepstream/deepstream-6.2/install.sh and README. However, when building the sources using make inside the deepstream-app directory, I get the error
fatal error: cuda_runtime_api.h: No such file or directory
   32 | #include <cuda_runtime_api.h>
I am able to compile it without any issues on a full-fledged Jetpack 5.1.1. Can I compile it on a separate machine and run it here? Are there any dependencies I am missing?
Also, the following link - Verification - Guide to Minimizing Jetson Disk Usage - mentions that deepstream-app doesn't require X11 but uses Tegra's overlay display. Are there any additional steps to enable this? Also, is similar display functionality available in the DeepStream Python applications?
What do you mean by
"full-fledged Jetpack 5.1.1" and
"compile it on a separate machine and run it here"?
The sample docker is just for understanding and exploring the DeepStream SDK using the provided samples.
Yes, but you need to rewrite the script to download our Python demo, then compile and run it.
By full-fledged Jetpack 5.1.1, I mean Jetpack downloaded from this link - JetPack SDK 5.1.1 | NVIDIA Developer - flashed onto an SD card, with no changes made to the OS after installing it on a Jetson Xavier NX dev kit. I am running the DeepStream samples docker container in this environment.
By compile it on a separate machine, I mean compile the deepstream-app present in the deepstream-6.2 SDK folder on a Jetson Xavier NX device that has Jetpack 5.1.1 installed without any modifications to the OS. I am running the DeepStream samples docker container in this environment.
By run it here, I mean run the above compiled app on a Jetson Xavier NX device that has Jetpack 5.1.1 installed and the OS modified according to the documentation in this link - Overview of Minimizing Steps - Guide to Minimizing Jetson Disk Usage.
So what's the difference between the "full-fledged Jetpack 5.1.1" and "a separate machine"? As you said, they are both a "Jetson Xavier NX device that has Jetpack 5.1.1 installed without any modifications to the OS", and both used the same docker container.
And can you really compile successfully in one environment, but not in the other?
I am able to successfully compile on a Xavier NX device that has Jetpack 5.1.1 installed without any OS modifications.
I am unable to compile on another Xavier NX device that has Jetpack 5.1.1 installed with the OS modifications described on this Nvidia AI IOT GitHub page - Overview of Minimizing Steps - Guide to Minimizing Jetson Disk Usage.
They are both using the same deepstream docker container downloaded from NGC catalog.
It's strange, because theoretically DeepStream cannot be compiled inside the samples docker at all. Your error is:
fatal error: cuda_runtime_api.h: No such file or directory
You can check whether cuda_runtime_api.h exists under the /usr/local/cuda/include/ path. The CUDA inside the Docker container is mapped from the host.
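A quick way to run that check inside the container (a minimal sketch; the path comes from the post above):

```shell
# Check whether the CUDA headers mapped in from the host are present.
# On a host minimized with step 3 (dev packages removed), this is expected to fail.
if [ -f /usr/local/cuda/include/cuda_runtime_api.h ]; then
    echo "cuda_runtime_api.h found"
else
    echo "cuda_runtime_api.h missing"
fi
```

If the header is missing, the host's CUDA development packages (removed by the minimization steps) would need to be restored, or the binary built on an unmodified host instead.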
Thanks. I was able to copy the compiled files from the host into the container and run the deepstream-app in a GUI environment. However, I am having trouble running the deepstream-app inside the docker container using nvdrmvideosink in a non-GUI environment.
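For anyone replicating the copy step, one way to move a binary built on the unmodified device into a running container is docker cp (a sketch; the container ID and destination path are illustrative assumptions):

```shell
# Find the running container's ID
sudo docker ps

# Copy the deepstream-app binary built on the full Jetpack host into it
# (<container_id> is a placeholder for the ID printed by 'docker ps')
sudo docker cp ./deepstream-app <container_id>:/opt/nvidia/deepstream/deepstream-6.2/bin/
```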
To replicate the issue:
- I modified the Jetpack OS according to Overview of Minimizing Steps - Guide to Minimizing Jetson Disk Usage.
- The config file I used is as follows -
- The sample deepstream app I use is available inside the docker container (DeepStream-l4t | NVIDIA NGC) in the dir /opt/nvidia/deepstream/deepstream-6.2/sources/apps/sample_apps/deepstream-app
- To enable the nvdrmvideosink, I followed the steps available in the link - Accelerated GStreamer — Jetson Linux Developer Guide 34.1 documentation
$ sudo systemctl stop gdm
$ sudo loginctl terminate-seat seat0
To load the DRM driver:
$ sudo modprobe tegra_udrm modeset=1
- After step 4, I am able to use the following command to display video -
$ gst-launch-1.0 filesrc location=<filename.mp4> ! \
qtdemux ! queue ! h264parse ! nvv4l2decoder ! nvdrmvideosink -e
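After loading the driver, it may also help to confirm that a DRM device node is actually exposed before launching the pipeline (a sketch; exact node names can vary):

```shell
# Confirm the tegra_udrm module is loaded
lsmod | grep tegra_udrm

# A DRM card node (e.g. card0) should exist; nvdrmvideosink needs to open it
ls -l /dev/dri
```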
The error message I get when I run the deepstream-app in docker container is as follows:
No EGL Display
nvbufsurftransform: Could not get EGL display connection
Could not open DRM failed
Error main:716: Failed to set pipeline to PAUSED
Error from sink_sub_bin_sink1: GStreamer Error: state change failed and some element failed to post a proper error message with the reason for failure
The Jetpack version I am using currently is 5.1.1 and deepstream-samples docker container is version 6.2 (https://catalog.ngc.nvidia.com/orgs/nvidia/containers/deepstream-l4t).
I know that DeepStream 6.2 supports Jetpack 5.1. Do you recommend I switch to Jetpack 5.1? Or is there something in the docker environment I need to set to support nvdrmvideosink?
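One thing that may be worth checking is whether the DRM device nodes are visible inside the container at all; by default `docker run` does not pass them through. This is an assumption on my part, not a confirmed fix:

```shell
# Assumption: explicitly pass the DRM device nodes into the container so
# nvdrmvideosink can open them; not verified to resolve the EGL error
sudo docker run -it --rm --net=host --runtime nvidia \
  --device /dev/dri \
  -w /opt/nvidia/deepstream/deepstream-6.2 \
  nvcr.io/nvidia/deepstream-l4t:6.2-samples
```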
Just an update, I am able to run this dummy gstreamer pipeline on the host (non-GUI env) but not inside the docker container -
gst-launch-1.0 filesrc location=<filename.mp4> ! \
qtdemux ! queue ! h264parse ! nvv4l2decoder ! nvdrmvideosink \
conn_id=0 plane_id=1 set_mode=0 -e
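In case the IDs matter, the valid values for conn_id / plane_id can be listed before launching; a sketch, assuming the libdrm test tools are available for this image:

```shell
# Show the sink's properties (conn_id, plane_id, set_mode, ...)
gst-inspect-1.0 nvdrmvideosink

# List DRM connectors/CRTCs/planes to pick valid IDs
# (modetest ships in the libdrm-tests package; an assumption for this setup)
sudo apt-get install -y libdrm-tests
modetest
```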
I see similar issue posted on this forum - Nvdrmvideosink in a gstreamer pipeline from a docker DP 6.1.1 container fails to start and responded by @Amycao
From the latest updates I got from Nvidia last month (July 2023) about the current and future Jetpack apparently there is no plan to support this feature unfortunately. So no digital signage with the Jetson in headless mode if Docker is used to run your DS pipelines …
Thanks for the update. Just wanted to make sure it still works for Jetpack 4.6 with Deepstream 6.0 base container as you mentioned in that topic.