nvenc/dec with NVIDIA GeForce


According to the following link:

Video Encode and Decode GPU Support Matrix | NVIDIA Developer
HW accelerated encode and decode are supported on NVIDIA GeForce, Quadro, Tesla, and GRID products with Fermi, Kepler, Maxwell and Pascal generation GPUs.
GeForce cards are not supported for the nvenc/dec SDKs. I am considering using ffmpeg for a project of mine, and I’d like to use hardware acceleration if possible. I am planning to buy a GTX 1080 or 1070 for gaming soon. If these cards also support the nvenc/dec/CUDA libs, then I wouldn’t need another card.

Will I need a separate card to utilize the encoding/decoding capabilities of the NVIDIA SDK, or will a GTX 1070/1080 be fine?

Thank you in advance.

GeForce cards support CUDA and hardware accelerated decoding, limited only by the system resources.

There is a limit of 2 simultaneous encode sessions per system when using NVENC on GeForce cards.
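For what it’s worth, a quick way to try this out with ffmpeg is sketched below. This assumes an ffmpeg build compiled with NVIDIA support (`--enable-nvenc` / `--enable-cuvid`); the input/output file names are placeholders:

```shell
# Check whether your ffmpeg build exposes the NVIDIA codecs
# (NVDEC decoders appear as *_cuvid, NVENC encoders as *_nvenc).
ffmpeg -hide_banner -decoders | grep cuvid
ffmpeg -hide_banner -encoders | grep nvenc

# Decode with NVDEC and re-encode with NVENC in one pass.
# input.mp4 / output.mp4 are placeholder file names.
ffmpeg -hwaccel cuvid -c:v h264_cuvid -i input.mp4 \
       -c:v h264_nvenc -b:v 5M output.mp4
```

If the `grep` commands print nothing, the build lacks NVIDIA support and the hardware codecs won’t be available regardless of the card installed.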

Thank you for your reply. What do you mean by “system resources” — the nvenc/nvdec engine/module on the hardware? How about decoding capabilities like nvdec (cuvid?)? I will need both nvenc and nvdec. Is there a limit on decoding sessions with nvdec as well?

Also, I will be developing a real-time encoding/decoding application, probably with just one or perhaps two parallel enc/dec processes running. For production, I could test a card dedicated to this job, like a Quadro, but for development purposes, do you think a GeForce 1070/1080 will do the job?

Thank you.

Any comments for my last question?