Current situation with Jetpack 4.5.1 (after initial installation):
- OpenCV (Python) is outdated (4.1.1), has no CUDA acceleration and no FFMPEG support
- FFMPEG: no hardware accelerated decoding/encoding
Meanwhile there is an option to get FFMPEG with hardware accelerated decoding, as described here.
There is also an easy way to get an up-to-date OpenCV (currently 4.5.3) with FFMPEG support using:
python3 -m pip install opencv-python
But this has no hardware acceleration at all (neither in OpenCV itself nor in its FFMPEG integration).
There is also the option to build OpenCV with CUDA support. You can find several scripts for this; I used an adaptation of this one. But usually, all these builds end up without FFMPEG support, although all required libraries are found. The reason is that the FFMPEG test build during the CMake run fails (probably due to some static libraries).
I managed to get a CUDA accelerated build of OpenCV with FFMPEG after first uninstalling all existing OpenCV and FFMPEG installations:
python3 -m pip uninstall opencv-python
sudo apt-get --purge autoremove python-opencv
sudo apt --purge autoremove ffmpeg
Afterwards, I downloaded the latest FFMPEG sources from https://ffmpeg.org/download.html#releases and built them with the following configuration:
./configure --enable-shared --disable-static --prefix=/usr/local
make -j$(nproc)
sudo make install
Afterwards, you should add the library paths to the ld configuration:
sudo gedit /etc/ld.so.conf.d/ffmpeg.conf
In this file you add the paths to the ffmpeg libraries (assuming you built ffmpeg-4.4 under the folder /home/username/):
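The concrete entries were not listed above; a plausible set, assuming the ffmpeg-4.4 source tree lives under /home/username/ and the libraries were installed with --prefix=/usr/local (adapt both locations to your own setup), would be:

```
/home/username/ffmpeg-4.4/libavcodec
/home/username/ffmpeg-4.4/libavformat
/home/username/ffmpeg-4.4/libavutil
/home/username/ffmpeg-4.4/libswscale
/home/username/ffmpeg-4.4/libswresample
/usr/local/lib
```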
Afterwards, update the ld configuration by:
sudo ldconfig
You should also set LD_LIBRARY_PATH, e.g. (assuming the libraries were installed to /usr/local/lib) by:
export LD_LIBRARY_PATH=/usr/local/lib:$LD_LIBRARY_PATH
With these preparations, it should be possible to build OpenCV with both CUDA support and FFMPEG. After the build and install have finished, you can check this from Python:
>>> import cv2
>>> print(cv2.getBuildInformation())
In my case, the build information now showed FFMPEG support as enabled.
CUDA support was also enabled.
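To check this programmatically rather than by eye, a small sketch that scans the output of cv2.getBuildInformation() for the relevant lines. The sample string below is only illustrative, not my actual build output; on a real system, pass cv2.getBuildInformation() instead:

```python
def find_flags(build_info, keys=("FFMPEG", "CUDA")):
    """Return {key: line} for the first build-information line mentioning each key."""
    result = {}
    for line in build_info.splitlines():
        for key in keys:
            if key in line and key not in result:
                result[key] = line.strip()
    return result

# Illustrative excerpt of what the Video I/O section can look like;
# replace with find_flags(cv2.getBuildInformation()) on a real build.
sample = """
  Video I/O:
    FFMPEG:                      YES
    GStreamer:                   YES
  NVIDIA CUDA:                   YES (ver 10.2)
"""
flags = find_flags(sample)
print(flags["FFMPEG"])  # prints the FFMPEG line of the sample
```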
Previously, I also tried this with the hardware accelerated FFMPEG 4.2.2 build mentioned earlier (adding --enable-shared to its configure command line), but in that case the FFMPEG test build fails during the CMake run, so you end up with OpenCV without FFMPEG integration.
Now, after all this hard work and the confirmation of FFMPEG integration, I was pretty happy!
But the enthusiasm did not last long: this OpenCV build did not work. As soon as you call cv2.VideoCapture(…), the program terminates with a segmentation fault (core dumped).
Building a hardware accelerated OpenCV without FFMPEG support, by contrast, is quite easy. That is what you typically get when you use an FFMPEG build installed in any way other than the one described above for the pure (unaccelerated) version. Such a build worked with hardware acceleration as expected - but without FFMPEG support.
In my case, I want to read from an RTSP stream. Using GStreamer instead of FFMPEG is possible with code similar to this example: Doesn't work nvv4l2decoder for decoding RTSP in gstreamer + opencv - #3 by DaneLLL
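For reference, the approach in that example passes a GStreamer pipeline string to cv2.VideoCapture with the CAP_GSTREAMER backend. A sketch of a helper that assembles such a string follows; the element chain and the placeholder URL are assumptions based on the pattern in the linked post (an H.264 camera), not verified code, so adjust them to your stream:

```python
def rtsp_gst_pipeline(uri, width=1920, height=1080, latency=300):
    """Build a GStreamer pipeline string for hardware-decoded RTSP on Jetson.

    The element chain (rtspsrc -> rtph264depay -> h264parse -> nvv4l2decoder
    -> nvvidconv -> appsink) follows the pattern from the linked forum
    example; change the depayloader/parser for other codecs.
    """
    return (
        f"rtspsrc location={uri} latency={latency} ! "
        "rtph264depay ! h264parse ! nvv4l2decoder ! "
        f"nvvidconv ! video/x-raw,width={width},height={height},format=BGRx ! "
        "videoconvert ! video/x-raw,format=BGR ! appsink drop=1"
    )

# Hypothetical URL, for illustration only:
pipeline = rtsp_gst_pipeline("rtsp://192.168.0.10:554/stream")
# cap = cv2.VideoCapture(pipeline, cv2.CAP_GSTREAMER)
```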
The problem is that the resulting display looks much worse and more distorted than a version using FFMPEG. As soon as there is any further frame processing (even when using threading), the display lags behind with continuously increasing latency. None of these problems occur when using an FFMPEG-enabled OpenCV (even without hardware acceleration), so GStreamer is no option for me.
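For completeness, the threading pattern I am referring to grabs frames on a background thread and keeps only the newest one, so that slow processing drops frames instead of accumulating latency. A minimal sketch with the capture object abstracted away (substitute cv2.VideoCapture for the stub; the stub exists only so the example is self-contained):

```python
import threading

class LatestFrameReader:
    """Grab frames on a background thread, keeping only the newest one."""

    def __init__(self, capture):
        self.capture = capture          # anything with read() -> (ok, frame)
        self.lock = threading.Lock()
        self.frame = None
        self.running = True
        self.thread = threading.Thread(target=self._loop, daemon=True)
        self.thread.start()

    def _loop(self):
        while self.running:
            ok, frame = self.capture.read()
            if not ok:
                break
            with self.lock:
                self.frame = frame      # overwrite: older frames are dropped

    def read(self):
        with self.lock:
            return self.frame

    def stop(self):
        self.running = False
        self.thread.join()

# Stub standing in for cv2.VideoCapture, for illustration only.
class CountingSource:
    def __init__(self, n):
        self.frames = iter(range(n))
    def read(self):
        try:
            return True, next(self.frames)
        except StopIteration:
            return False, None

reader = LatestFrameReader(CountingSource(100))
reader.thread.join()        # the stub is exhausted almost immediately
print(reader.read())        # 99: only the newest frame was kept
```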
If you have followed me up to this point and made similar experiments yourself, you can imagine how much time I spent on this - still without success.
So, please NVIDIA:
Give us a Jetpack with hardware accelerated FFMPEG and a hardware accelerated OpenCV including FFMPEG support on the Jetson product line! I like GStreamer - but not in combination with OpenCV!
Or - if this is still not officially supported - I hope someone else can provide a solid description of how to bring these worlds together.