How to install OpenCV and cuDNN into an NGC container?

I’m new to NGC and Docker.
I’m considering implementing my Azure application with nvidia-docker.
At a high level, the application will:

  1. capture an image from the MIPI camera
  2. upload it to Azure blob storage
  3. repeat steps 1 and 2 at a regular interval (e.g. 10 min)
    (in the future, processed results will be uploaded as well)

I need to install OpenCV (>=3.3) to capture the image,
but the l4t container doesn’t include it.
Could anyone tell me how to implement this, or the concrete steps?
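
Roughly, the loop I have in mind looks like this sketch (capture_image and upload_blob are stand-ins I made up for this outline; the real versions would use OpenCV and the Azure SDK):

```python
import time

uploaded = {}  # stand-in for Azure Blob Storage in this sketch

def capture_image():
    # The real version would use cv2.VideoCapture with a GStreamer
    # pipeline for the MIPI camera; here we return dummy JPEG bytes.
    return b"\xff\xd8fake-jpeg"

def upload_blob(name, data):
    # The real version would call the azure-storage-blob SDK; here we
    # just record the upload so the loop can be exercised.
    uploaded[name] = data

def run(iterations, interval_s):
    # Steps 1-3: capture, upload, repeat at a regular interval.
    for i in range(iterations):
        upload_blob(f"frame_{i:04d}.jpg", capture_image())
        if i + 1 < iterations:
            time.sleep(interval_s)

run(3, 0)  # interval_s would be 600 for a 10-minute cycle
```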

Hi,

You can create a customized image on top of l4t
and apply this script to build OpenCV from source inside your container:
https://github.com/AastaNV/JEP/blob/master/script/install_opencv4.3.0_Jetson.sh

Thanks.

Thanks, AastaLLL

I tried the script, but it failed.
I would also like to confirm two points:

  1. Does this container image include the cuDNN library?
  2. Do I need to set ‘CUDNN_INCLUDE_DIR’ as an environment variable? If so, what is the proper value?
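
(For reference: in OpenCV 4.x, cuDNN locations are normally passed as CMake cache options rather than environment variables. A typical invocation is sketched below; the library path is an assumption for a JetPack rootfs, and the script may already set these itself.)

```shell
cmake \
  -D WITH_CUDA=ON \
  -D WITH_CUDNN=ON \
  -D OPENCV_DNN_CUDA=ON \
  -D CUDNN_INCLUDE_DIR=/usr/include \
  -D CUDNN_LIBRARY=/usr/lib/aarch64-linux-gnu/libcudnn.so \
  ..
```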

The error is as below:
Run Build Command:"/usr/bin/make" "cmTC_7dbab/fast"
/usr/bin/make -f CMakeFiles/cmTC_7dbab.dir/build.make CMakeFiles/cmTC_7dbab.dir/build
make[1]: Entering directory ‘/root/opencv430/opencv-4.3.0/release/CMakeFiles/CMakeTmp’
Building CXX object CMakeFiles/cmTC_7dbab.dir/src.cxx.o
/usr/bin/c++ -fsigned-char -W -Wall -Werror=return-type -Werror=non-virtual-dtor -Werror=address -Werror=sequence-point -Wformat -Werror=format-security -Winit-self -Wpointer-arith -Wuninitialized -Winit-self -Wsuggest-override -Wno-delete-non-virtual-dtor -Wno-comment -Wimplicit-fallthrough=3 -Wno-strict-overflow -fdiagnostics-show-option -pthread -fomit-frame-pointer -ffunction-sections -fdata-sections -fvisibility=hidden -fvisibility-inlines-hidden -Wno-deprecated -Wno-missing-declarations -Wno-shadow -Wno-unused-parameter -Wno-unused-local-typedefs -Wno-sign-compare -Wno-sign-promo -Wno-undef -Wno-ignored-qualifiers -Wno-extra -Wno-unused-function -Wno-unused-const-variable -Wno-invalid-offsetof -O3 -DNDEBUG -DNDEBUG -fPIE -Wno-enum-compare-switch -std=c++11 -o CMakeFiles/cmTC_7dbab.dir/src.cxx.o -c /root/opencv430/opencv-4.3.0/release/CMakeFiles/CMakeTmp/src.cxx
/root/opencv430/opencv-4.3.0/release/CMakeFiles/CMakeTmp/src.cxx:1:0: warning: ignoring #pragma [-Wunknown-pragmas]
#pragma

cc1plus: warning: unrecognized command line option ‘-Wno-enum-compare-switch’
Linking CXX executable cmTC_7dbab
/usr/bin/cmake -E cmake_link_script CMakeFiles/cmTC_7dbab.dir/link.txt --verbose=1
/usr/bin/c++ -fsigned-char -W -Wall -Werror=return-type -Werror=non-virtual-dtor -Werror=address -Werror=sequence-point -Wformat -Werror=format-security -Winit-self -Wpointer-arith -Wuninitialized -Winit-self -Wsuggest-override -Wno-delete-non-virtual-dtor -Wno-comment -Wimplicit-fallthrough=3 -Wno-strict-overflow -fdiagnostics-show-option -pthread -fomit-frame-pointer -ffunction-sections -fdata-sections -fvisibility=hidden
-fvisibility-inlines-hidden -Wno-deprecated -Wno-missing-declarations -Wno-shadow -Wno-unused-parameter -Wno-unused-local-typedefs -Wno-sign-compare -Wno-sign-promo -Wno-undef -Wno-ignored-qualifiers -Wno-extra -Wno-unused-function -Wno-unused-const-variable -Wno-invalid-offsetof -O3 -DNDEBUG -DNDEBUG -Wl,--gc-sections -Wl,--as-needed CMakeFiles/cmTC_7dbab.dir/src.cxx.o -o cmTC_7dbab
make[1]: Leaving directory ‘/root/opencv430/opencv-4.3.0/release/CMakeFiles/CMakeTmp’

===== END =====

Determining if the include file sys/videoio.h exists failed with the following output:
Change Dir: /root/opencv430/opencv-4.3.0/release/CMakeFiles/CMakeTmp

Run Build Command:"/usr/bin/make" "cmTC_1aa4f/fast"
/usr/bin/make -f CMakeFiles/cmTC_1aa4f.dir/build.make CMakeFiles/cmTC_1aa4f.dir/build
make[1]: Entering directory ‘/root/opencv430/opencv-4.3.0/release/CMakeFiles/CMakeTmp’
Building C object CMakeFiles/cmTC_1aa4f.dir/CheckIncludeFile.c.o
/usr/bin/cc -fsigned-char -W -Wall -Werror=return-type -Werror=address -Werror=sequence-point -Wformat -Werror=format-security -Wmissing-declarations -Wmissing-prototypes -Wstrict-prototypes -Wundef -Winit-self -Wpointer-arith -Wshadow -Wuninitialized -Winit-self -Wno-comment -Wimplicit-fallthrough=3 -Wno-strict-overflow -fdiagnostics-show-option -pthread -fomit-frame-pointer -ffunction-sections -fdata-sections -fvisibility=hidden -O3 -DNDEBUG -DNDEBUG -fPIE -o CMakeFiles/cmTC_1aa4f.dir/CheckIncludeFile.c.o -c /root/opencv430/opencv-4.3.0/release/CMakeFiles/CMakeTmp/CheckIncludeFile.c
/root/opencv430/opencv-4.3.0/release/CMakeFiles/CMakeTmp/CheckIncludeFile.c:1:10: fatal error: sys/videoio.h: No such file or directory
#include <sys/videoio.h>
^~~~~~~~~~~~~~~
compilation terminated.
CMakeFiles/cmTC_1aa4f.dir/build.make:65: recipe for target ‘CMakeFiles/cmTC_1aa4f.dir/CheckIncludeFile.c.o’ failed
make[1]: *** [CMakeFiles/cmTC_1aa4f.dir/CheckIncludeFile.c.o] Error 1
make[1]: Leaving directory ‘/root/opencv430/opencv-4.3.0/release/CMakeFiles/CMakeTmp’
Makefile:126: recipe for target ‘cmTC_1aa4f/fast’ failed
make: *** [cmTC_1aa4f/fast] Error 2

Dockerfile:

FROM nvcr.io/nvidia/l4t-base:r32.4.2
RUN mkdir /root/opencv430
COPY install_opencv4.3.0_Jetson.sh /root/opencv430/
RUN ls /root/opencv430/
RUN echo $PWD
RUN cd /root/opencv430 && sh ./install_opencv4.3.0_Jetson.sh /root/opencv430/opencv-4.3.0

Thanks

Hi,

The l4t image mounts the libraries from the host.

It looks like the error is caused by sys/videoio.h rather than cuDNN.
Would you mind checking first whether the header exists in the container?
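For example, a quick existence check like this can be run inside the container (the path assumes the compiler's default include directory):

```shell
# Report whether sys/videoio.h is visible on the default include path.
if [ -e /usr/include/sys/videoio.h ]; then
  echo "sys/videoio.h: present"
else
  echo "sys/videoio.h: missing"
fi
```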

Thanks.

OK, I’ll check it.
Also, is there a container on NGC with OpenCV (>=3.3.1) and cuDNN pre-installed?
I would like to use those libraries from Python 3.

Hi,

Sorry for the late update.
We found that you can install the Python bindings for OpenCV inside Docker directly via this command:

$ apt-get install python-opencv
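
To confirm the installation, a small check like this can be run in Python (a sketch; it just reports the cv2 version if the module is importable):

```python
def opencv_status():
    """Return the installed OpenCV version string, or None if cv2 is missing."""
    try:
        import cv2
        return cv2.__version__
    except ImportError:
        return None

print(opencv_status())
```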

Thanks.

I am also trying to build OpenCV with CUDA and cuDNN in a Docker environment on a Jetson Nano, on top of nvcr.io/nvidia/l4t-ml:r32.4.4-py3, but the build fails with the error below.
Can anyone from NVIDIA help us with this issue?

Moreover, if I use apt-get install python-opencv in Docker, OpenCV is not compiled with GPU support, so I don’t think that is the solution.

How can I use OpenCV with CUDA and cuDNN support in a Docker environment on top of nvcr.io/nvidia/l4t-ml:r32.4.4-py3?

/usr/lib/aarch64-linux-gnu/libcublas.so: file not recognized: File truncated
collect2: error: ld returned 1 exit status
make[2]: *** [lib/libopencv_cudev.so.4.5.0] Error 1
modules/cudev/CMakeFiles/opencv_cudev.dir/build.make:95: recipe for target 'lib/libopencv_cudev.so.4.5.0' failed
CMakeFiles/Makefile2:2962: recipe for target 'modules/cudev/CMakeFiles/opencv_cudev.dir/all' failed
make[1]: *** [modules/cudev/CMakeFiles/opencv_cudev.dir/all] Error 2
Makefile:162: recipe for target 'all' failed
make: *** [all] Error 2
Make did not successfully build

From that error, it looks like you may need to set your Docker default runtime to nvidia so that the CUDA/cuDNN libraries are accessible while building the container: GitHub - dusty-nv/jetson-containers: Machine Learning Containers for NVIDIA Jetson and JetPack-L4T
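
Concretely, setting the default runtime usually means adding "default-runtime": "nvidia" to /etc/docker/daemon.json on the Jetson (the runtime path below assumes a standard nvidia-container-runtime install) and then restarting Docker:

```json
{
    "runtimes": {
        "nvidia": {
            "path": "nvidia-container-runtime",
            "runtimeArgs": []
        }
    },
    "default-runtime": "nvidia"
}
```

After sudo systemctl restart docker, the CUDA/cuDNN libraries that the nvidia runtime mounts from the host should then be visible during docker build as well.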
