Question Regarding 10 Series NVENC Encoding

I had been using a GTX 1080 for NVENC H.264 encoding in FFmpeg. I could encode a 3440x1440 @ 100 FPS stream and a 1080p60 stream simultaneously without issue; Task Manager showed I was using about 30% of the encoding capacity.
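The workload is two simultaneous NVENC outputs from one input. A sketch of the kind of FFmpeg command involved (the input name, bitrates, and frame rates here are placeholders, not my exact settings):

```shell
# Sketch: one input split into two simultaneous h264_nvenc outputs.
# "input.mkv" and the bitrates are placeholders; requires an NVIDIA GPU
# and an FFmpeg build with h264_nvenc enabled.
ffmpeg -i input.mkv \
  -filter_complex "[0:v]split=2[full][small];[small]scale=1920:1080[small1080]" \
  -map "[full]"      -c:v h264_nvenc -b:v 20M -r 100 out_3440x1440.mp4 \
  -map "[small1080]" -c:v h264_nvenc -b:v 6M  -r 60  out_1080p60.mp4
```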

I had read in many places that encoding performance was identical across the board for 10 series GPUs. To test this I switched to a GTX 1050, and the two clearly don't have comparable encoding performance: the 1050 can barely encode the 3440x1440 @ 100 FPS signal on its own, sitting at a constant 65% encoder usage with spikes above that.
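For scale: the ultrawide stream alone is a much heavier load than the 1080p60 one, since encoder load scales roughly with pixels per second. This is just back-of-envelope arithmetic, not an NVENC capability figure:

```python
# Rough pixel-throughput comparison of the two streams.
# Encoder load scales roughly (not exactly) with pixels per second.
streams = {
    "3440x1440 @ 100 fps": 3440 * 1440 * 100,  # the ultrawide stream
    "1920x1080 @ 60 fps": 1920 * 1080 * 60,    # the 1080p60 stream
}
for name, pixels_per_sec in streams.items():
    print(f"{name}: {pixels_per_sec / 1e6:.0f} Mpx/s")
print(f"combined: {sum(streams.values()) / 1e6:.0f} Mpx/s")
```

So the ultrawide stream is roughly four times the pixel rate of the 1080p60 stream, which is consistent with the 1050 struggling on it even without the second output.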

I found this page:

This makes it sound like a 1060 6GB would have comparable encoding performance to a GTX 1080…

However, this post seems to imply that the GTX 1080 has twice the encoding power of a 1060, and even of a 1070:
https://devtalk.nvidia.com/default/topic/987460/nvdec-cuda-nvenc-speed-comparison/

So, to sum up what I'm really asking: are all 10 series GPUs (1080 Ti excluded) inferior to the GTX 1080 in terms of encoding?

  1. You must accept that NVIDIA is incapable of describing its own products correctly. I have lost hundreds of dollars to NVIDIA's officially incorrect, or intentionally unpublished, information.
  2. The support matrix at https://developer.nvidia.com/video-encode-decode-gpu-support-matrix is wrong again. The line "GeForce GTX 1060 - 1070 Ti" should read "GeForce GTX some 1060 / 1070 / 1070 Ti / 1080", i.e. GP104, or it should be split into two lines if the GP104-based "some 1060 / 1070 / 1070 Ti" cards have only one hardware encoder (see the next point).
  3. But be warned: some crippled chips (with hardware faults from the fab) have fewer CUDA cores (the faulty SMs are disabled), and sometimes one of the two hardware encoders is faulty too (compare the Quadro P4000/P5000 - see https://devtalk.nvidia.com/default/topic/1036615/). The same problem may affect the GTX 1070/1070 Ti (GP104-200-A1/GP104-300-A1 vs. GP104-400-A1). If https://devtalk.nvidia.com/default/topic/987460/nvdec-cuda-nvenc-speed-comparison/ is correct (i.e. actually tested on a real card), the GTX 1070 has only one hardware encoder enabled.
  4. Check also https://en.wikipedia.org/wiki/List_of_Nvidia_graphics_processing_units for the chip markings and how each chip is crippled. For example, the GTX 1060 uses four different chips: a normal chip (GP106-400-A1), crippled chips (GP106-300-A1 / GP106-350-K3-A1), and a super-crippled chip (GP104-140-A1, with only 9 of 20 SMs and 192 of 256 bits of memory bus width working). Life with NVIDIA is like a box of chocolates: you never know what you're gonna get! :-)
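Rather than trusting the matrix, you can benchmark the encoder on a given card directly with a synthetic source and compare the reported speed across cards. A sketch (the resolution, duration, and bitrate are arbitrary; this needs an NVENC-capable GPU and an FFmpeg build with h264_nvenc):

```shell
# Benchmark NVENC on the installed GPU with a synthetic 3440x1440@100 source;
# no output file is written (-f null). Requires an NVIDIA GPU and an FFmpeg
# build with h264_nvenc enabled.
ffmpeg -f lavfi -i testsrc2=size=3440x1440:rate=100 -t 30 \
  -c:v h264_nvenc -b:v 20M -f null -
# FFmpeg prints the achieved encode speed at the end (e.g. "speed=...x");
# a speed of at least 1.0x means the card keeps up with this stream in real time.
```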

PS: All of the GeForce RTX 20xx cards introduced so far use crippled chips.