Certainly!
Which deepstream 6 container are you using exactly?
I tried with the base one and with devel, and following your instructions I get an error at autogen which prevents make from running:
root@ed782deb08f0:/opt/nvidia/deepstream/deepstream-6.0/deepstream_python_apps/3rdparty/gst-python# ./autogen.sh
+ Setting up common submodule
Submodule 'common' (https://gitlab.freedesktop.org/gstreamer/common.git) registered for path 'common'
Cloning into '/opt/nvidia/deepstream/deepstream-6.0/deepstream_python_apps/3rdparty/gst-python/common'...
Submodule path 'common': checked out 'd7ecca16114e443dab9d6f8cbc47a1554e3d4b30'
ln: failed to create symbolic link '.git/hooks/pre-commit': Not a directory
+ check for build tools
checking for autoconf >= 2.60 ... found 2.69, ok.
checking for automake >= 1.10 ... found 1.15.1, ok.
checking for libtoolize >= 1.5.0 ... found 2.4.6, ok.
checking for pkg-config >= 0.8.0 ... found 0.29.1, ok.
+ checking for autogen.sh options
This autogen script will automatically run ./configure as:
./configure --enable-maintainer-mode
To pass any additional options, please specify them on the ./autogen.sh
command line.
+ running libtoolize --copy --force...
libtoolize: putting auxiliary files in '.'.
libtoolize: copying file './ltmain.sh'
libtoolize: putting macros in 'm4'.
libtoolize: copying file 'm4/libtool.m4'
libtoolize: copying file 'm4/ltoptions.m4'
libtoolize: copying file 'm4/ltsugar.m4'
libtoolize: copying file 'm4/ltversion.m4'
libtoolize: copying file 'm4/lt~obsolete.m4'
libtoolize: Consider adding 'AC_CONFIG_MACRO_DIRS([m4])' to configure.ac,
libtoolize: and rerunning libtoolize and aclocal.
+ running aclocal -I m4 -I common/m4 ...
+ running autoheader ...
+ running autoconf ...
+ running automake -a -c -Wno-portability...
configure.ac:47: installing './compile'
configure.ac:10: installing './config.guess'
configure.ac:10: installing './config.sub'
configure.ac:13: installing './install-sh'
configure.ac:13: installing './missing'
Makefile.am: installing './INSTALL'
gi/overrides/Makefile.am: installing './depcomp'
gi/overrides/Makefile.am:7: installing './py-compile'
plugin/Makefile.am:3: warning: 'INCLUDES' is the old name for 'AM_CPPFLAGS' (or '*_CPPFLAGS')
+ running configure ...
./configure default flags: --enable-maintainer-mode
checking build system type... x86_64-pc-linux-gnu
checking host system type... x86_64-pc-linux-gnu
checking target system type... x86_64-pc-linux-gnu
checking for a BSD-compatible install... /usr/bin/install -c
checking whether build environment is sane... yes
checking for a thread-safe mkdir -p... /bin/mkdir -p
checking for gawk... no
checking for mawk... mawk
checking whether make sets $(MAKE)... yes
checking whether make supports nested variables... yes
checking whether UID '0' is supported by ustar format... yes
checking whether GID '0' is supported by ustar format... yes
checking how to create a ustar tar archive... gnutar
checking nano version... 0 (release)
checking whether to enable maintainer-specific portions of Makefiles... yes
checking whether make supports nested variables... (cached) yes
checking how to print strings... printf
checking for style of include used by make... GNU
checking for gcc... gcc
checking whether the C compiler works... yes
checking for C compiler default output file name... a.out
checking for suffix of executables...
checking whether we are cross compiling... no
checking for suffix of object files... o
checking whether we are using the GNU C compiler... yes
checking whether gcc accepts -g... yes
checking for gcc option to accept ISO C89... none needed
checking whether gcc understands -c and -o together... yes
checking dependency style of gcc... gcc3
checking for a sed that does not truncate output... /bin/sed
checking for grep that handles long lines and -e... /bin/grep
checking for egrep... /bin/grep -E
checking for fgrep... /bin/grep -F
checking for ld used by gcc... /usr/bin/ld
checking if the linker (/usr/bin/ld) is GNU ld... yes
checking for BSD- or MS-compatible name lister (nm)... /usr/bin/nm -B
checking the name lister (/usr/bin/nm -B) interface... BSD nm
checking whether ln -s works... yes
checking the maximum length of command line arguments... 1572864
checking how to convert x86_64-pc-linux-gnu file names to x86_64-pc-linux-gnu format... func_convert_file_noop
checking how to convert x86_64-pc-linux-gnu file names to toolchain format... func_convert_file_noop
checking for /usr/bin/ld option to reload object files... -r
checking for objdump... objdump
checking how to recognize dependent libraries... pass_all
checking for dlltool... no
checking how to associate runtime and link libraries... printf %s\n
checking for ar... ar
checking for archiver @FILE support... @
checking for strip... strip
checking for ranlib... ranlib
checking command to parse /usr/bin/nm -B output from gcc object... ok
checking for sysroot... no
checking for a working dd... /bin/dd
checking how to truncate binary pipes... /bin/dd bs=4096 count=1
checking for mt... no
checking if : is a manifest tool... no
checking how to run the C preprocessor... gcc -E
checking for ANSI C header files... yes
checking for sys/types.h... yes
checking for sys/stat.h... yes
checking for stdlib.h... yes
checking for string.h... yes
checking for memory.h... yes
checking for strings.h... yes
checking for inttypes.h... yes
checking for stdint.h... yes
checking for unistd.h... yes
checking for dlfcn.h... yes
checking for objdir... .libs
checking if gcc supports -fno-rtti -fno-exceptions... no
checking for gcc option to produce PIC... -fPIC -DPIC
checking if gcc PIC flag -fPIC -DPIC works... yes
checking if gcc static flag -static works... yes
checking if gcc supports -c -o file.o... yes
checking if gcc supports -c -o file.o... (cached) yes
checking whether the gcc linker (/usr/bin/ld -m elf_x86_64) supports shared libraries... yes
checking whether -lc should be explicitly linked in... no
checking dynamic linker characteristics... GNU/Linux ld.so
checking how to hardcode library paths into programs... immediate
checking for shl_load... no
checking for shl_load in -ldld... no
checking for dlopen... no
checking for dlopen in -ldl... yes
checking whether a program can dlopen itself... yes
checking whether a statically linked program can dlopen itself... no
checking whether stripping libraries is possible... yes
checking if libtool supports shared libraries... yes
checking whether to build shared libraries... yes
checking whether to build static libraries... no
checking for gcc... (cached) gcc
checking whether we are using the GNU C compiler... (cached) yes
checking whether gcc accepts -g... (cached) yes
checking for gcc option to accept ISO C89... (cached) none needed
checking whether gcc understands -c and -o together... (cached) yes
checking dependency style of gcc... (cached) gcc3
checking for gcc option to accept ISO C99... none needed
checking for gcc option to accept ISO Standard C... (cached) none needed
checking for python... /usr/bin/python
checking for python version... 2.7
checking for python platform... linux2
checking for python script directory... ${prefix}/lib/python2.7/dist-packages
checking for python extension module directory... ${exec_prefix}/lib/python2.7/dist-packages
checking for python >= 2.7... checking for pkg-config... /usr/bin/pkg-config
checking pkg-config is at least version 0.9.0... yes
checking for GST... configure: error: Package requirements (gstreamer-1.0 >= 1.14.5) were not met:
No package 'gstreamer-1.0' found
Consider adjusting the PKG_CONFIG_PATH environment variable if you
installed software in a non-standard prefix.
Alternatively, you may set the environment variables GST_CFLAGS
and GST_LIBS to avoid the need to call pkg-config.
See the pkg-config man page for more details.
configure failed
root@ed782deb08f0:/opt/nvidia/deepstream/deepstream-6.0/deepstream_python_apps/3rdparty/gst-python# make
make: *** No targets specified and no makefile found. Stop.
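configure aborts because pkg-config cannot find gstreamer-1.0.pc, so no Makefile is ever generated and make has nothing to do. A possible fix, assuming a Debian/Ubuntu base image (the pkgconfig directory below is an assumption; adjust to wherever the .pc files land in your container):

```shell
# gstreamer-1.0.pc is shipped by the GStreamer dev packages; install them first
# (assumption: Ubuntu base image with apt available):
#   apt-get update && apt-get install -y libgstreamer1.0-dev libgstreamer-plugins-base1.0-dev
# If the .pc files live under a non-default prefix, point pkg-config at them:
export PKG_CONFIG_PATH=/usr/lib/x86_64-linux-gnu/pkgconfig:${PKG_CONFIG_PATH}
# Check whether configure's requirement (gstreamer-1.0 >= 1.14.5) can now be met:
pkg-config --exists 'gstreamer-1.0 >= 1.14.5' \
  && echo "gstreamer-1.0 found" \
  || echo "gstreamer-1.0 still missing"
```

Once pkg-config resolves gstreamer-1.0, rerunning ./autogen.sh should get past the "checking for GST" step and produce a Makefile.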
EDIT: to be more precise, without running apt-get install --reinstall ca-certificates, even autogen.sh fails:
root@7ed1581d1cf9:/opt/nvidia/deepstream/deepstream-6.0/deepstream_python_apps/3rdparty/gst-python# ./autogen.sh
+ Setting up common submodule
Submodule 'common' (https://gitlab.freedesktop.org/gstreamer/common.git) registered for path 'common'
Cloning into '/opt/nvidia/deepstream/deepstream-6.0/deepstream_python_apps/3rdparty/gst-python/common'...
fatal: unable to access 'https://gitlab.freedesktop.org/gstreamer/common.git/': server certificate verification failed. CAfile: /etc/ssl/certs/ca-certificates.crt CRLfile: none
fatal: clone of 'https://gitlab.freedesktop.org/gstreamer/common.git' into submodule path '/opt/nvidia/deepstream/deepstream-6.0/deepstream_python_apps/3rdparty/gst-python/common' failed
Failed to clone 'common'. Retry scheduled
Cloning into '/opt/nvidia/deepstream/deepstream-6.0/deepstream_python_apps/3rdparty/gst-python/common'...
fatal: unable to access 'https://gitlab.freedesktop.org/gstreamer/common.git/': server certificate verification failed. CAfile: /etc/ssl/certs/ca-certificates.crt CRLfile: none
fatal: clone of 'https://gitlab.freedesktop.org/gstreamer/common.git' into submodule path '/opt/nvidia/deepstream/deepstream-6.0/deepstream_python_apps/3rdparty/gst-python/common' failed
Failed to clone 'common' a second time, aborting
There is something wrong with your source tree.
You are missing common/gst-autogen.sh
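git rejects the gitlab.freedesktop.org certificate because the container's CA bundle is broken, which is why reinstalling ca-certificates unblocks the clone. A sketch of the check and fix, assuming a Debian/Ubuntu base image:

```shell
# Reinstall and regenerate the CA bundle so git can verify the server
# certificate (assumption: run as root inside the container):
#   apt-get update && apt-get install --reinstall -y ca-certificates
#   update-ca-certificates
# The file git complains about in the error message is:
CAFILE=/etc/ssl/certs/ca-certificates.crt
[ -s "$CAFILE" ] && echo "CA bundle present" || echo "CA bundle missing or empty"
```

Setting GIT_SSL_NO_VERIFY=1 (as in the Dockerfile later in this thread) also gets past the error, but it disables certificate verification entirely; reinstalling ca-certificates is the cleaner route.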
EDIT2: I also noticed that the folder /opt/nvidia/deepstream/deepstream/sources/apps/
does not exist in my container, only /opt/nvidia/deepstream/deepstream-6.0/sources/apps/.
The sample docker is just for running the samples; it does not support compiling and then running an application.
If you want to run Triton, please use the triton docker; otherwise I suggest the devel docker.
Let me know which docker you will use.
I would like to use triton. I tried with it too, but I had errors; see my last post here: Can't find librivermax.so.0 and libtritonserver.so. I also tried with devel out of curiosity, but I had the same errors reported in my post above.
For the triton docker, please refer to [DeepStream 6.0] Unable to install python_gst into nvcr.io/nvidia/deepstream:6.0-triton container - #5 by rpaliwal_nvidia
So do you need to install the deepstream sdk inside the container nvcr.io/nvidia/deepstream:6.0-triton? That’s what the link says, but it seems weird.
@mchi I tried to follow the instructions, but they fail at the apt-get install step (both with and without installing the deepstream sdk):
Dockerfile
FROM nvcr.io/nvidia/deepstream:6.0-triton
#ENV GIT_SSL_NO_VERIFY=1
RUN sh docker_python_setup.sh
RUN cd sources && \
git clone https://github.com/NVIDIA-AI-IOT/deepstream_python_apps.git
RUN apt install -y python3-gi python3-gst-1.0 python-gi-dev git python3 python3-pip cmake g++ build-essential \
libglib2.0-dev python3-dev python3.6-dev libglib2.0-dev-bin python-gi-dev libtool m4 autoconf automake
COPY ./deepstream-6.0_6.0.0-1_amd64.deb ./
RUN apt -y install ./deepstream-6.0_6.0.0-1_amd64.deb
RUN cd sources/deepstream_python_apps/bindings && \
mkdir build && \
cd build && \
cmake .. && \
make && \
pip install ./pyds-1.1.0-py3-none-linux_x86_64.whl
Output:
Sending build context to Docker daemon 704MB
Step 1/7 : FROM nvcr.io/nvidia/deepstream:6.0-triton
---> 6e629647efba
Step 2/7 : RUN sh docker_python_setup.sh
---> Running in 06c70f3e982e
Reading package lists... Done
Building dependency tree
Reading state information... Done
software-properties-common is already the newest version (0.98.9.5).
0 upgraded, 0 newly installed, 0 to remove and 63 not upgraded.
Ign:1 https://developer.download.nvidia.com/compute/cuda/repos/ubuntu2004/x86_64 InRelease
Ign:2 https://developer.download.nvidia.com/compute/machine-learning/repos/ubuntu1804/x86_64 InRelease
Get:3 https://developer.download.nvidia.com/compute/cuda/repos/ubuntu2004/x86_64 Release [696 B]
Get:4 http://ppa.launchpad.net/deadsnakes/ppa/ubuntu focal InRelease [18.1 kB]
Get:5 http://security.ubuntu.com/ubuntu focal-security InRelease [114 kB]
Hit:6 https://developer.download.nvidia.com/compute/machine-learning/repos/ubuntu1804/x86_64 Release
Get:7 https://developer.download.nvidia.com/compute/cuda/repos/ubuntu2004/x86_64 Release.gpg [836 B]
Hit:8 http://archive.ubuntu.com/ubuntu focal InRelease
Get:9 http://archive.ubuntu.com/ubuntu focal-updates InRelease [114 kB]
Get:11 https://developer.download.nvidia.com/compute/cuda/repos/ubuntu2004/x86_64 Packages [515 kB]
Get:12 http://archive.ubuntu.com/ubuntu focal-backports InRelease [108 kB]
Get:13 http://ppa.launchpad.net/deadsnakes/ppa/ubuntu focal/main amd64 Packages [26.3 kB]
Get:14 http://security.ubuntu.com/ubuntu focal-security/main amd64 Packages [1326 kB]
Get:15 http://archive.ubuntu.com/ubuntu focal-updates/restricted amd64 Packages [784 kB]
Get:16 http://security.ubuntu.com/ubuntu focal-security/multiverse amd64 Packages [30.1 kB]
Get:17 http://security.ubuntu.com/ubuntu focal-security/restricted amd64 Packages [726 kB]
Get:18 http://security.ubuntu.com/ubuntu focal-security/universe amd64 Packages [821 kB]
Get:19 http://archive.ubuntu.com/ubuntu focal-updates/main amd64 Packages [1741 kB]
Get:20 http://archive.ubuntu.com/ubuntu focal-updates/universe amd64 Packages [1099 kB]
Get:21 http://archive.ubuntu.com/ubuntu focal-updates/multiverse amd64 Packages [33.6 kB]
Get:22 http://archive.ubuntu.com/ubuntu focal-backports/universe amd64 Packages [21.7 kB]
Get:23 http://archive.ubuntu.com/ubuntu focal-backports/main amd64 Packages [50.0 kB]
Fetched 7530 kB in 2s (4011 kB/s)
Reading package lists... Done
Ign:1 https://developer.download.nvidia.com/compute/cuda/repos/ubuntu2004/x86_64 InRelease
Ign:2 https://developer.download.nvidia.com/compute/machine-learning/repos/ubuntu1804/x86_64 InRelease
Hit:3 http://security.ubuntu.com/ubuntu focal-security InRelease
Hit:4 https://developer.download.nvidia.com/compute/cuda/repos/ubuntu2004/x86_64 Release
Hit:5 http://ppa.launchpad.net/deadsnakes/ppa/ubuntu focal InRelease
Hit:6 http://archive.ubuntu.com/ubuntu focal InRelease
Hit:7 https://developer.download.nvidia.com/compute/machine-learning/repos/ubuntu1804/x86_64 Release
Hit:8 http://archive.ubuntu.com/ubuntu focal-updates InRelease
Hit:9 http://archive.ubuntu.com/ubuntu focal-backports InRelease
Reading package lists... Done
Reading package lists... Done
Building dependency tree
Reading state information... Done
The following additional packages will be installed:
libpython3.6-minimal libpython3.6-stdlib python3.6-minimal
Suggested packages:
python3.6-venv python3.6-doc binfmt-support
The following NEW packages will be installed:
libpython3.6-minimal libpython3.6-stdlib python3.6 python3.6-minimal
0 upgraded, 4 newly installed, 0 to remove and 118 not upgraded.
Need to get 4629 kB of archives.
After this operation, 24.1 MB of additional disk space will be used.
Get:1 http://ppa.launchpad.net/deadsnakes/ppa/ubuntu focal/main amd64 libpython3.6-minimal amd64 3.6.15-1+focal1 [577 kB]
Get:2 http://ppa.launchpad.net/deadsnakes/ppa/ubuntu focal/main amd64 python3.6-minimal amd64 3.6.15-1+focal1 [1589 kB]
Get:3 http://ppa.launchpad.net/deadsnakes/ppa/ubuntu focal/main amd64 libpython3.6-stdlib amd64 3.6.15-1+focal1 [2215 kB]
Get:4 http://ppa.launchpad.net/deadsnakes/ppa/ubuntu focal/main amd64 python3.6 amd64 3.6.15-1+focal1 [249 kB]
Fetched 4629 kB in 2s (1952 kB/s)
Selecting previously unselected package libpython3.6-minimal:amd64.
(Reading database ... 69896 files and directories currently installed.)
Preparing to unpack .../libpython3.6-minimal_3.6.15-1+focal1_amd64.deb ...
Unpacking libpython3.6-minimal:amd64 (3.6.15-1+focal1) ...
Selecting previously unselected package python3.6-minimal.
Preparing to unpack .../python3.6-minimal_3.6.15-1+focal1_amd64.deb ...
Unpacking python3.6-minimal (3.6.15-1+focal1) ...
Selecting previously unselected package libpython3.6-stdlib:amd64.
Preparing to unpack .../libpython3.6-stdlib_3.6.15-1+focal1_amd64.deb ...
Unpacking libpython3.6-stdlib:amd64 (3.6.15-1+focal1) ...
Selecting previously unselected package python3.6.
Preparing to unpack .../python3.6_3.6.15-1+focal1_amd64.deb ...
Unpacking python3.6 (3.6.15-1+focal1) ...
Setting up libpython3.6-minimal:amd64 (3.6.15-1+focal1) ...
Setting up python3.6-minimal (3.6.15-1+focal1) ...
Setting up libpython3.6-stdlib:amd64 (3.6.15-1+focal1) ...
Setting up python3.6 (3.6.15-1+focal1) ...
Processing triggers for mime-support (3.64ubuntu1) ...
Reading package lists... Done
Building dependency tree
Reading state information... Done
The following NEW packages will be installed:
libpython3.6
0 upgraded, 1 newly installed, 0 to remove and 118 not upgraded.
Need to get 1570 kB of archives.
After this operation, 4778 kB of additional disk space will be used.
Get:1 http://ppa.launchpad.net/deadsnakes/ppa/ubuntu focal/main amd64 libpython3.6 amd64 3.6.15-1+focal1 [1570 kB]
Fetched 1570 kB in 1s (2386 kB/s)
Selecting previously unselected package libpython3.6:amd64.
(Reading database ... 70730 files and directories currently installed.)
Preparing to unpack .../libpython3.6_3.6.15-1+focal1_amd64.deb ...
Unpacking libpython3.6:amd64 (3.6.15-1+focal1) ...
Setting up libpython3.6:amd64 (3.6.15-1+focal1) ...
Processing triggers for libc-bin (2.31-0ubuntu9.2) ...
Reading package lists... Done
Building dependency tree
Reading state information... Done
The following packages were automatically installed and are no longer required:
bubblewrap distro-info-data fuse gir1.2-packagekitglib-1.0 libappstream4
libfuse2 liblmdb0 libpackagekit-glib2-18 libpolkit-agent-1-0
libpolkit-gobject-1-0 libstemmer0d libyaml-0-2 lsb-release packagekit
policykit-1 python-apt-common python3-apt python3-certifi python3-chardet
python3-dbus python3-idna python3-requests python3-requests-unixsocket
python3-six python3-urllib3 xdg-desktop-portal
Use 'apt autoremove' to remove them.
The following packages will be REMOVED:
libglib2.0-tests python3-dbusmock python3-gi python3-software-properties
software-properties-common
0 upgraded, 0 newly installed, 5 to remove and 116 not upgraded.
After this operation, 11.8 MB disk space will be freed.
(Reading database ... 70737 files and directories currently installed.)
Removing libglib2.0-tests (2.64.6-1~ubuntu20.04.4) ...
Removing python3-dbusmock (0.19-1) ...
Removing software-properties-common (0.98.9.5) ...
Removing python3-software-properties (0.98.9.5) ...
Removing python3-gi (3.36.0-1) ...
Processing triggers for dbus (1.12.16-2ubuntu2.1) ...
--2021-12-03 16:33:44-- http://mirrors.edge.kernel.org/ubuntu/pool/main/p/pygobject/python3-gi_3.26.1-2ubuntu1_amd64.deb
Resolving mirrors.edge.kernel.org (mirrors.edge.kernel.org)... 147.75.197.195, 2604:1380:1:3600::1
Connecting to mirrors.edge.kernel.org (mirrors.edge.kernel.org)|147.75.197.195|:80... connected.
HTTP request sent, awaiting response... 200 OK
Length: 153080 (149K) [application/octet-stream]
Saving to: 'python3-gi_3.26.1-2ubuntu1_amd64.deb'
python3-gi_3.26.1-2 100%[===================>] 149.49K --.-KB/s in 0.03s
2021-12-03 16:33:44 (4.73 MB/s) - 'python3-gi_3.26.1-2ubuntu1_amd64.deb' saved [153080/153080]
Selecting previously unselected package python3-gi.
(Reading database ... 69764 files and directories currently installed.)
Preparing to unpack python3-gi_3.26.1-2ubuntu1_amd64.deb ...
Unpacking python3-gi (3.26.1-2ubuntu1) ...
dpkg: dependency problems prevent configuration of python3-gi:
python3-gi depends on python3 (<< 3.7); however:
Version of python3 on system is 3.8.2-0ubuntu2.
python3-gi depends on libffi6 (>= 3.0.4); however:
Package libffi6 is not installed.
libgirepository-1.0-1:amd64 (1.64.1-1~ubuntu20.04.1) breaks python3-gi (<< 3.34.0-4~) and is installed.
Version of python3-gi to be configured is 3.26.1-2ubuntu1.
dpkg: error processing package python3-gi (--install):
dependency problems - leaving unconfigured
Errors were encountered while processing:
python3-gi
update-alternatives: using /usr/bin/python3.6 to provide /usr/bin/python3 (python3) in auto mode
update-alternatives: using /usr/bin/python3.6m to provide /usr/bin/python3 (python3) in auto mode
update-alternatives: using /usr/bin/python3.8 to provide /usr/bin/python3 (python3) in auto mode
There are 3 choices for the alternative python3 (providing /usr/bin/python3).
Selection Path Priority Status
------------------------------------------------------------
* 0 /usr/bin/python3.8 3 auto mode
1 /usr/bin/python3.6 1 manual mode
2 /usr/bin/python3.6m 2 manual mode
3 /usr/bin/python3.8 3 manual mode
Press <enter> to keep the current choice[*], or type selection number: update-alternatives: using /usr/bin/python3.6m to provide /usr/bin/python3 (python3) in manual mode
Collecting numpy
Downloading numpy-1.19.5-cp36-cp36m-manylinux2010_x86_64.whl (14.8 MB)
|################################| 14.8 MB 26.4 MB/s
Installing collected packages: numpy
Successfully installed numpy-1.19.5
Collecting opencv-python
Downloading opencv_python-4.5.4.60-cp36-cp36m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (60.3 MB)
|################################| 60.3 MB 30.2 MB/s
Requirement already satisfied: numpy>=1.13.3 in /usr/local/lib/python3.6/dist-packages (from opencv-python) (1.19.5)
Installing collected packages: opencv-python
Successfully installed opencv-python-4.5.4.60
Removing intermediate container 06c70f3e982e
---> eabf215b83f1
Step 3/7 : RUN cd sources && git clone https://github.com/NVIDIA-AI-IOT/deepstream_python_apps.git
---> Running in ac390ea288a0
Cloning into 'deepstream_python_apps'...
remote: Enumerating objects: 424, done.
remote: Counting objects: 100% (252/252), done.
remote: Compressing objects: 100% (133/133), done.
remote: Total 424 (delta 133), reused 215 (delta 114), pack-reused 172
Receiving objects: 100% (424/424), 3.77 MiB | 46.47 MiB/s, done.
Resolving deltas: 100% (228/228), done.
Removing intermediate container ac390ea288a0
---> 840be599c402
Step 4/7 : RUN apt install -y python3-gi python3-gst-1.0 python-gi-dev git python3 python3-pip cmake g++ build-essential libglib2.0-dev python3-dev python3.6-dev libglib2.0-dev-bin python-gi-dev libtool m4 autoconf automake
---> Running in e7d2869f1324
Reading package lists... Done
Building dependency tree
Reading state information... Done
autoconf is already the newest version (2.69-11.1).
automake is already the newest version (1:1.16.1-4ubuntu6).
automake set to manually installed.
cmake is already the newest version (3.16.3-1ubuntu1).
g++ is already the newest version (4:9.3.0-1ubuntu2).
g++ set to manually installed.
libtool is already the newest version (2.4.6-14).
m4 is already the newest version (1.4.18-4).
m4 set to manually installed.
python3 is already the newest version (3.8.2-0ubuntu2).
build-essential is already the newest version (12.8ubuntu1.1).
git is already the newest version (1:2.25.1-1ubuntu3.2).
libglib2.0-dev is already the newest version (2.64.6-1~ubuntu20.04.4).
libglib2.0-dev-bin is already the newest version (2.64.6-1~ubuntu20.04.4).
python3-pip is already the newest version (20.0.2-5ubuntu1.6).
You might want to run 'apt --fix-broken install' to correct these.
The following packages have unmet dependencies:
python3-dev : Depends: libpython3-dev (= 3.8.2-0ubuntu2) but it is not going to be installed
Depends: python3.8-dev (>= 3.8.2-1~) but it is not going to be installed
python3.6-dev : Depends: libpython3.6-dev (= 3.6.15-1+focal1) but it is not going to be installed
E: Unmet dependencies. Try 'apt --fix-broken install' with no packages (or specify a solution).
The command '/bin/sh -c apt install -y python3-gi python3-gst-1.0 python-gi-dev git python3 python3-pip cmake g++ build-essential libglib2.0-dev python3-dev python3.6-dev libglib2.0-dev-bin python-gi-dev libtool m4 autoconf automake' returned a non-zero code: 100
Looking closely at the output, this might be caused by a failure in the first command, sh docker_python_setup.sh. In its output you will see:
dpkg: error processing package python3-gi (--install):
dependency problems - leaving unconfigured
Errors were encountered while processing:
python3-gi
Maybe that’s also why the apt install fails. How could I fix this?
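apt itself prints the suggested recovery in the log above ("Try 'apt --fix-broken install'"). A sketch of that recovery sequence, assuming it is acceptable to let apt repair the half-configured python3-gi before rerunning the install:

```shell
# Let dpkg/apt finish or roll back the half-configured packages, then retry the
# install (assumption: run as root inside the build container):
#   dpkg --configure -a
#   apt-get --fix-broken install -y
#   apt-get install -y python3-gi python3-gst-1.0 python-gi-dev python3-dev python3.6-dev
# List any packages still left in a non-"installed" dpkg state (status flags
# like iU/iF/iH mean unpacked or half-configured):
dpkg -l 2>/dev/null | awk '$1 ~ /^i[^i]/ {print $2}' > /tmp/broken_pkgs.txt
echo "packages still half-installed: $(wc -l < /tmp/broken_pkgs.txt)"
```

In a Dockerfile this would go in a RUN step between docker_python_setup.sh and the apt install that currently fails.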
After some research I found out that some of the issues were caused by using a headless machine. I still can’t use the deepstream triton container because of the errors above, but I can at least start the example in the devel container. Let’s focus on this for now.
After processing the first frame, the app crashes:
root@ip-172-31-9-127:/opt/nvidia/deepstream/deepstream-6.0#
root@ip-172-31-9-127:/opt/nvidia/deepstream/deepstream-6.0# cd /opt/nvidia/deepstream/deepstream-6.0/sources/apps/deepstream_python_apps/deepstream-imagedata-multistream
root@ip-172-31-9-127:/opt/nvidia/deepstream/deepstream-6.0/sources/apps/deepstream_python_apps/deepstream-imagedata-multistream# python3 deepstream_imagedata-multistream.py rtsp://wowzaec2demo.streamlock.net/vod/mp4:BigBuckBunny_115k.mov frames
Frames will be saved in frames
(gst-plugin-scanner:28): GStreamer-WARNING **: 19:11:13.266: Failed to load plugin '/usr/lib/x86_64-linux-gnu/gstreamer-1.0/deepstream/libnvdsgst_udp.so': librivermax.so.0: cannot open shared object file: No such file or directory
(gst-plugin-scanner:28): GStreamer-WARNING **: 19:11:13.276: Failed to load plugin '/usr/lib/x86_64-linux-gnu/gstreamer-1.0/deepstream/libnvdsgst_inferserver.so': libtritonserver.so: cannot open shared object file: No such file or directory
Creating Pipeline
Creating streamux
Creating source_bin 0
Creating source bin
source-bin-00
Creating Pgie
Creating nvvidconv1
Creating filter1
Creating tiler
Creating nvvidconv
Creating nvosd
Creating EGLSink
Atleast one of the sources is live
Adding elements to Pipeline
Linking elements in the Pipeline
Now playing...
1 : rtsp://wowzaec2demo.streamlock.net/vod/mp4:BigBuckBunny_115k.mov
Starting pipeline
ERROR: ../nvdsinfer/nvdsinfer_model_builder.cpp:1484 Deserialize engine failed because file path: /opt/nvidia/deepstream/deepstream-6.0/sources/apps/deepstream_python_apps/deepstream-imagedata-multistream/../../../../samples/models/Primary_Detector/resnet10.caffemodel_b1_gpu0_int8.engine open error
0:00:01.266485927 12 0x368b270 WARN nvinfer gstnvinfer.cpp:635:gst_nvinfer_logger:<primary-inference> NvDsInferContext[UID 1]: Warning from NvDsInferContextImpl::deserializeEngineAndBackend() <nvdsinfer_context_impl.cpp:1889> [UID = 1]: deserialize engine from file :/opt/nvidia/deepstream/deepstream-6.0/sources/apps/deepstream_python_apps/deepstream-imagedata-multistream/../../../../samples/models/Primary_Detector/resnet10.caffemodel_b1_gpu0_int8.engine failed
0:00:01.266546708 12 0x368b270 WARN nvinfer gstnvinfer.cpp:635:gst_nvinfer_logger:<primary-inference> NvDsInferContext[UID 1]: Warning from NvDsInferContextImpl::generateBackendContext() <nvdsinfer_context_impl.cpp:1996> [UID = 1]: deserialize backend context from engine from file :/opt/nvidia/deepstream/deepstream-6.0/sources/apps/deepstream_python_apps/deepstream-imagedata-multistream/../../../../samples/models/Primary_Detector/resnet10.caffemodel_b1_gpu0_int8.engine failed, try rebuild
0:00:01.266567941 12 0x368b270 INFO nvinfer gstnvinfer.cpp:638:gst_nvinfer_logger:<primary-inference> NvDsInferContext[UID 1]: Info from NvDsInferContextImpl::buildModel() <nvdsinfer_context_impl.cpp:1914> [UID = 1]: Trying to create engine from model files
WARNING: [TRT]: Detected invalid timing cache, setup a local cache instead
0:00:11.720963425 12 0x368b270 INFO nvinfer gstnvinfer.cpp:638:gst_nvinfer_logger:<primary-inference> NvDsInferContext[UID 1]: Info from NvDsInferContextImpl::buildModel() <nvdsinfer_context_impl.cpp:1947> [UID = 1]: serialize cuda engine to file: /opt/nvidia/deepstream/deepstream-6.0/samples/models/Primary_Detector/resnet10.caffemodel_b1_gpu0_int8.engine successfully
INFO: ../nvdsinfer/nvdsinfer_model_builder.cpp:610 [Implicit Engine Info]: layers num: 3
0 INPUT kFLOAT input_1 3x368x640
1 OUTPUT kFLOAT conv2d_bbox 16x23x40
2 OUTPUT kFLOAT conv2d_cov/Sigmoid 4x23x40
0:00:11.726100202 12 0x368b270 INFO nvinfer gstnvinfer_impl.cpp:313:notifyLoadModelStatus:<primary-inference> [UID 1]: Load new model:dstest_imagedata_config.txt sucessfully
Decodebin child added: source
Decodebin child added: decodebin0
Decodebin child added: rtph264depay0
Decodebin child added: h264parse0
Decodebin child added: capsfilter0
Decodebin child added: nvv4l2decoder0
In cb_newpad
Decodebin child added: decodebin1
Decodebin child added: rtpmp4gdepay0
Decodebin child added: aacparse0
Frame Number= 0 Number of Objects= 0 Vehicle_count= 0 Person_count= 0
0:00:12.142796107 12 0x349f630 WARN nvinfer gstnvinfer.cpp:2288:gst_nvinfer_output_loop:<primary-inference> error: Internal data stream error.
0:00:12.142827283 12 0x349f630 WARN nvinfer gstnvinfer.cpp:2288:gst_nvinfer_output_loop:<primary-inference> error: streaming stopped, reason not-negotiated (-4)
Error: gst-stream-error-quark: Internal data stream error. (1): gstnvinfer.cpp(2288): gst_nvinfer_output_loop (): /GstPipeline:pipeline0/GstNvInfer:primary-inference:
streaming stopped, reason not-negotiated (-4)
Exiting app
Frame Number= 1 Number of Objects= 0 Vehicle_count= 0 Person_count= 0
Decodebin child added: avdec_aac0
In cb_newpad
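The crash right after frame 0 is consistent with the headless setup mentioned above: the sample's EGLSink has no display to negotiate with, which can surface exactly as "reason not-negotiated (-4)". A quick environment check, with xvfb-run as one hypothetical workaround (package availability via apt is an assumption):

```shell
# "not-negotiated (-4)" at startup often means the EGL sink has no display.
# On a headless box, either run the sample under a virtual X server:
#   apt-get install -y xvfb
#   xvfb-run -a python3 deepstream_imagedata-multistream.py <rtsp-uri> frames
# or swap the EGLSink for a fakesink in the sample script. A quick check:
if [ -n "$DISPLAY" ]; then SINK_HINT="nveglglessink"; else SINK_HINT="fakesink"; fi
echo "suggested sink for this environment: $SINK_HINT"
```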
Here’s what I did. I created a Dockerfile:
FROM nvcr.io/nvidia/deepstream:6.0-devel
ENV GIT_SSL_NO_VERIFY=1
#RUN apt install -y python3-pip wget python-gi-dev
# RUN add-apt-repository -y ppa:deadsnakes/ppa && apt update
RUN apt install -y git python-dev python3 python3-pip python3.6-dev python3.8-dev cmake g++ build-essential \
libglib2.0-dev libglib2.0-dev-bin python-gi-dev libtool m4 autoconf automake
#RUN apt install -y python3-pip python-gi-dev python3.6-dev libpython3.6-dev python3.8-dev
#RUN apt install -y libgstreamer1.0-dev libgstreamer-plugins-base1.0-dev wget
#ENV GST_PLUGIN_PATH="/usr/lib/x86_64-linux-gnu/gstreamer-1.0/deepstream:${GST_PLUGIN_PATH}"
#ENV GST_PLUGIN_PATH="/opt/nvidia/deepstream/deepstream-6.0/lib/gst-plugins:${GST_PLUGIN_PATH}"
#ENV LD_LIBRARY="/usr/lib/x86_64-linux-gnu/gstreamer-1.0/deepstream:${LD_LIBRARY}"
#ENV LD_LIBRARY="/opt/nvidia/deepstream/deepstream-6.0/lib/gst-plugins:${LD_LIBRARY}"
#RUN echo '/usr/local/lib/gstreamer-1.0' >> /etc/ld.so.conf
#RUN ldconfig
RUN cd /opt/nvidia/deepstream/deepstream-6.0/sources/apps && \
git clone https://github.com/NVIDIA-AI-IOT/deepstream_python_apps.git
RUN cd /opt/nvidia/deepstream/deepstream-6.0/sources/apps/deepstream_python_apps && \
git submodule update --init
RUN cd /opt/nvidia/deepstream/deepstream-6.0/sources/apps/deepstream_python_apps/3rdparty/gst-python/ && \
./autogen.sh && \
make && \
make install
RUN cd /opt/nvidia/deepstream/deepstream-6.0/sources/apps/deepstream_python_apps/bindings && \
mkdir build && \
cd build && \
cmake -DPYTHON_MAJOR_VERSION=3 -DPYTHON_MINOR_VERSION=6 -DPIP_PLATFORM=linux_x86_64 -DDS_PATH=/opt/nvidia/deepstream/deepstream-6.0 .. && \
make && \
pip3 install pyds-1.1.0-py3-none-linux_x86_64.whl
RUN cd /opt/nvidia/deepstream/deepstream-6.0/sources/apps/deepstream_python_apps && \
mv apps/* ./
RUN pip3 install --upgrade pip
RUN pip3 install numpy opencv-python
Then I built the docker image and ran it:
docker build . -t deepstream-custom
sudo docker run -it --rm --net=host --runtime nvidia -e DISPLAY=$DISPLAY -v /tmp/.X11-unix/:/tmp/.X11-unix --cap-add=SYS_PTRACE --security-opt seccomp=unconfined --device /dev/video0 --privileged deepstream-custom
Once inside the container:
cd /opt/nvidia/deepstream/deepstream-6.0/sources/apps/deepstream_python_apps/deepstream-imagedata-multistream
python3 deepstream_imagedata-multistream.py rtsp://wowzaec2demo.streamlock.net/vod/mp4:BigBuckBunny_115k.mov frames
As you can see, the plugin errors are still there.
I would like to try the sample inside the deepstream triton container but I can’t because of the error in the post above.
I even tried to run deepstream-test1, but I get the same error:
cd /opt/nvidia/deepstream/deepstream-6.0/sources/apps/deepstream_python_apps/deepstream-test1
python3 deepstream_test_1.py ../../../../samples/streams/sample_qHD.h264
Output:
(gst-plugin-scanner:13): GStreamer-WARNING **: 19:13:07.168: Failed to load plugin '/usr/lib/x86_64-linux-gnu/gstreamer-1.0/deepstream/libnvdsgst_udp.so': librivermax.so.0: cannot open shared object file: No such file or directory
(gst-plugin-scanner:13): GStreamer-WARNING **: 19:13:07.177: Failed to load plugin '/usr/lib/x86_64-linux-gnu/gstreamer-1.0/deepstream/libnvdsgst_inferserver.so': libtritonserver.so: cannot open shared object file: No such file or directory
Creating Pipeline
Creating Source
Creating H264Parser
Creating Decoder
Creating EGLSink
Playing file ../../../../samples/streams/sample_qHD.h264
Adding elements to Pipeline
Linking elements in the Pipeline
Starting pipeline
0:00:00.732354152 12 0x1b9e270 WARN nvinfer gstnvinfer.cpp:635:gst_nvinfer_logger:<primary-inference> NvDsInferContext[UID 1]: Warning from NvDsInferContextImpl::initialize() <nvdsinfer_context_impl.cpp:1161> [UID = 1]: Warning, OpenCV has been deprecated. Using NMS for clustering instead of cv::groupRectangles with topK = 20 and NMS Threshold = 0.5
ERROR: ../nvdsinfer/nvdsinfer_model_builder.cpp:1484 Deserialize engine failed because file path: /opt/nvidia/deepstream/deepstream-6.0/sources/apps/deepstream_python_apps/deepstream-test1/../../../../samples/models/Primary_Detector/resnet10.caffemodel_b1_gpu0_int8.engine open error
0:00:01.191660181 12 0x1b9e270 WARN nvinfer gstnvinfer.cpp:635:gst_nvinfer_logger:<primary-inference> NvDsInferContext[UID 1]: Warning from NvDsInferContextImpl::deserializeEngineAndBackend() <nvdsinfer_context_impl.cpp:1889> [UID = 1]: deserialize engine from file :/opt/nvidia/deepstream/deepstream-6.0/sources/apps/deepstream_python_apps/deepstream-test1/../../../../samples/models/Primary_Detector/resnet10.caffemodel_b1_gpu0_int8.engine failed
0:00:01.191707161 12 0x1b9e270 WARN nvinfer gstnvinfer.cpp:635:gst_nvinfer_logger:<primary-inference> NvDsInferContext[UID 1]: Warning from NvDsInferContextImpl::generateBackendContext() <nvdsinfer_context_impl.cpp:1996> [UID = 1]: deserialize backend context from engine from file :/opt/nvidia/deepstream/deepstream-6.0/sources/apps/deepstream_python_apps/deepstream-test1/../../../../samples/models/Primary_Detector/resnet10.caffemodel_b1_gpu0_int8.engine failed, try rebuild
0:00:01.191720952 12 0x1b9e270 INFO nvinfer gstnvinfer.cpp:638:gst_nvinfer_logger:<primary-inference> NvDsInferContext[UID 1]: Info from NvDsInferContextImpl::buildModel() <nvdsinfer_context_impl.cpp:1914> [UID = 1]: Trying to create engine from model files
WARNING: [TRT]: Detected invalid timing cache, setup a local cache instead
0:00:11.625905110 12 0x1b9e270 INFO nvinfer gstnvinfer.cpp:638:gst_nvinfer_logger:<primary-inference> NvDsInferContext[UID 1]: Info from NvDsInferContextImpl::buildModel() <nvdsinfer_context_impl.cpp:1947> [UID = 1]: serialize cuda engine to file: /opt/nvidia/deepstream/deepstream-6.0/samples/models/Primary_Detector/resnet10.caffemodel_b1_gpu0_int8.engine successfully
INFO: ../nvdsinfer/nvdsinfer_model_builder.cpp:610 [Implicit Engine Info]: layers num: 3
0 INPUT kFLOAT input_1 3x368x640
1 OUTPUT kFLOAT conv2d_bbox 16x23x40
2 OUTPUT kFLOAT conv2d_cov/Sigmoid 4x23x40
0:00:11.630780224 12 0x1b9e270 INFO nvinfer gstnvinfer_impl.cpp:313:notifyLoadModelStatus:<primary-inference> [UID 1]: Load new model:dstest1_pgie_config.txt sucessfully
Frame Number=0 Number of Objects=6 Vehicle_count=4 Person_count=1
0:00:11.756864262 12 0x1ab5b70 WARN nvinfer gstnvinfer.cpp:2288:gst_nvinfer_output_loop:<primary-inference> error: Internal data stream error.
0:00:11.756882192 12 0x1ab5b70 WARN nvinfer gstnvinfer.cpp:2288:gst_nvinfer_output_loop:<primary-inference> error: streaming stopped, reason not-negotiated (-4)
Error: gst-stream-error-quark: Internal data stream error. (1): gstnvinfer.cpp(2288): gst_nvinfer_output_loop (): /GstPipeline:pipeline0/GstNvInfer:primary-inference:
streaming stopped, reason not-negotiated (-4)
I am able to run deepstream-test1-rtsp-out, so I guess the error was caused by not having a display. Is this possible?
Now, how can I fix the setup issue in the deepstream-triton container?
Maybe, since you are running on a dGPU platform, you can modify the source code to change the sink from eglsink to fakesink and try again.
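A minimal sketch of that change, assuming the sample creates its sink with the usual `nveglglessink` factory name (verify against your copy of deepstream_test_1.py before applying):

```shell
# Sketch: swap the EGL sink for fakesink so the pipeline negotiates without a display.
# The stub below mimics the sink-creation line in deepstream_test_1.py; run the same
# sed against the real file once you have confirmed the element name there.
printf 'sink = Gst.ElementFactory.make("nveglglessink", "nvvideo-renderer")\n' > /tmp/sink_stub.py
sed -i 's/nveglglessink/fakesink/' /tmp/sink_stub.py
cat /tmp/sink_stub.py
# -> sink = Gst.ElementFactory.make("fakesink", "nvvideo-renderer")
```

fakesink discards buffers instead of rendering them, which sidesteps the not-negotiated error on headless machines while still exercising inference.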
I will try. How can I fix the build setup with the Triton container? This one: Deserialize engine failed because file path: .../resnet10.caffemodel_b1_gpu0_int8.engine open error - #20 by mfoglio
x86 + dGPU + Triton docker?
Yes, I am using an EC2 instance with a Tesla T4 and an Intel i7.
doesn’t this work for you?
Hi @mchi . Unfortunately it does not work. See here: Deserialize engine failed because file path: .../resnet10.caffemodel_b1_gpu0_int8.engine open error - #20 by mfoglio
Hi @mfoglio,
Looks like the Dockerfile you wrote also expects manual input:
There are 3 choices for the alternative python3 (providing /usr/bin/python3).

  Selection    Path                  Priority   Status
  ----------------------------------------------------
  0            /usr/bin/python3.8    3          auto mode
  1            /usr/bin/python3.6    1          manual mode
  2            /usr/bin/python3.6m   2          manual mode
  3            /usr/bin/python3.8    3          manual mode

Press <enter> to keep the current choice[*], or type selection number:
Can you please try the steps suggested manually without a dockerfile?
Please let me know what happens.
Hello @rpaliwal_nvidia, I have the same issue. The manual input seems to be requested by some step within the docker_python_setup.sh script.
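One hedged workaround, assuming the prompt comes from an `update-alternatives --config python3` call inside docker_python_setup.sh: pre-select the alternative non-interactively before the script runs, so the menu never appears. The /usr/bin/python3.6 path is taken from the menu above; verify it exists in your container first.

```dockerfile
# Hypothetical workaround: pick the python3 alternative with --set (non-interactive)
# instead of letting the build block on the interactive --config menu.
RUN update-alternatives --set python3 /usr/bin/python3.6
```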