Hi,
after posting my issue to the Kdenlive forum and then to the Nvidia GeForce forum, I have now been advised to post it here … well … okay. Let's try:
My environment
Operating system: Arch Linux
Video-editing software: Kdenlive 18.0.3, MLT version 6.10.0, FFmpeg libraries
FFmpeg has been replaced with ffmpeg-nvenc-full
Installed Nvidia driver and related packages:
extra/ffnvcodec-headers 8.2.15.6-1 [Installed]
extra/libvdpau 1.1.1+3+ga21bf7a-1 [Installed]
extra/libxnvctrl 415.18-1 [Installed]
extra/nvidia 415.18-4 [Installed]
extra/nvidia-lts 1:415.18-1 [Installed]
extra/nvidia-settings 415.18-1 [Installed]
extra/nvidia-utils 415.18-1 [Installed]
extra/opencl-nvidia 415.18-1 [Installed]
community/cuda 10.0.130-2 [Installed]
The issue
When I start the rendering process in Kdenlive, the GTX 1050 Ti seems not to be used at all!
I am watching the GTX 1050 Ti usage with $ watch nvidia-smi and, in parallel, the CPU usage with $ htop. When I start a rendering process, the GPU usage stays unchanged at approx. 2% while the usage of the CPU cores shoots up. That's not the idea of having a GPU …
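(As far as I understand nvidia-smi, the plain GPU-Util figure does not necessarily include the dedicated NVENC encoder block, so I also want to watch the encoder utilization separately; the exact output depends on the driver version:)

$ nvidia-smi dmon -s u
$ nvidia-smi -q -d UTILIZATION

The first prints a rolling per-GPU utilization table with separate enc/dec columns; the second prints a one-shot utilization report that includes an Encoder figure.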
In Kdenlive I already tried several render profiles (among others, my modified one: properties=x264-medium f=mp4 vcodec=h264_nvenc acodec=aac g=120 crf=%quality ab=%audiobitrate+'k')
I have to admit, one thing confuses me: I thought that when you specify vcodec=h264_nvenc, the Nvidia graphics card is used for sure, or an error message should appear, which isn't happening. I mean, in the end it's ffmpeg that handles the encoding/rendering, isn't it? So there is still the possibility that my AMD Phenom II X3 CPU is the culprit because it isn't delivering data to the GPU fast enough. In the next few days it will be replaced by an AMD Phenom II X6 … if the GPU usage still doesn't change after that, then the GTX 1050 Ti would indeed seem to be useless for Kdenlive. :-(
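To rule Kdenlive/MLT out, my next step would be to test h264_nvenc directly with ffmpeg while watching nvidia-smi. This is only a rough sketch; the file names are placeholders and the preset may need adjusting for my build:

$ ffmpeg -hide_banner -encoders | grep nvenc
$ ffmpeg -i input.mp4 -c:v h264_nvenc -preset fast -c:a aac out_nvenc.mp4

The first command checks whether this ffmpeg build was compiled with NVENC support at all; the second does a plain NVENC encode without Kdenlive/MLT in between. If the GPU stays idle even then, the problem would be in ffmpeg or the driver setup rather than in Kdenlive.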