NvEncOpenEncodeSessionEx returns OUT_OF_MEMORY

Hi,

While integrating the NVIDIA encoder into an application, I noticed the following:

If I have 1 encoding session running, then stop and reopen it, everything works fine.
If I have 2 encoding sessions running, then stop and reopen both of them, NvEncOpenEncodeSessionEx returns OUT_OF_MEMORY for the second session.
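
In case it helps, this is roughly what the open/close/reopen flow looks like in my repro. It's a simplified sketch: error checks, buffer setup and the actual encode loop are left out, and the variable names are just for illustration.

```cpp
#include <cuda.h>
#include <nvEncodeAPI.h>

// Open one encode session on the given CUDA context.
static void* OpenSession(NV_ENCODE_API_FUNCTION_LIST& api, CUcontext ctx)
{
    NV_ENC_OPEN_ENCODE_SESSION_EX_PARAMS params = { NV_ENC_OPEN_ENCODE_SESSION_EX_PARAMS_VER };
    params.device     = ctx;
    params.deviceType = NV_ENC_DEVICE_TYPE_CUDA;
    params.apiVersion = NVENCAPI_VERSION;

    void* encoder = nullptr;
    NVENCSTATUS st = api.nvEncOpenEncodeSessionEx(&params, &encoder);
    // st is NV_ENC_ERR_OUT_OF_MEMORY on the failing call described above.
    return (st == NV_ENC_SUCCESS) ? encoder : nullptr;
}

int main()
{
    cuInit(0);
    CUdevice dev;
    cuDeviceGet(&dev, 0);
    CUcontext ctx1, ctx2;
    cuCtxCreate(&ctx1, 0, dev);
    cuCtxCreate(&ctx2, 0, dev);

    NV_ENCODE_API_FUNCTION_LIST api = { NV_ENCODE_API_FUNCTION_LIST_VER };
    NvEncodeAPICreateInstance(&api);

    // First round: both sessions open and encode fine.
    void* enc1 = OpenSession(api, ctx1);
    void* enc2 = OpenSession(api, ctx2);
    // ... encode on both sessions ...
    api.nvEncDestroyEncoder(enc1);
    api.nvEncDestroyEncoder(enc2);

    // Second round: the first reopen succeeds, the second one
    // returns NV_ENC_ERR_OUT_OF_MEMORY.
    enc1 = OpenSession(api, ctx1);
    enc2 = OpenSession(api, ctx2);   // <- fails here

    return 0;
}
```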

I’ve read the docs and based my code on the NvEncoder sample, so I think my (de)initializations are correct. To be sure, I slightly modified the NvEncoder sample so that it also opens 2 simultaneous streams, and I was able to reproduce the behavior described above.

Could it be that something’s missing in the (de)initialization of the NvEncoder sample? As far as I can tell, it follows the steps from the ProgGuide. By the way, I’m using a GeForce GT 710, which I suppose is limited to 2 simultaneous streams. However, since both sessions are closed before being reopened, it would surprise me if that were the reason for the error.
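
For reference, this is roughly the per-session teardown order I follow, as I understand it from the sample and the programming guide. Again a simplified sketch: async/event handling is omitted and the parameter names are just placeholders.

```cpp
#include <cuda.h>
#include <nvEncodeAPI.h>

// Simplified per-session teardown order.
void CloseSession(NV_ENCODE_API_FUNCTION_LIST& api, void* encoder,
                  NV_ENC_INPUT_PTR inputBuffer, NV_ENC_OUTPUT_PTR bitstreamBuffer,
                  CUcontext ctx)
{
    // 1. Flush the encoder by submitting an EOS picture.
    NV_ENC_PIC_PARAMS eos = { NV_ENC_PIC_PARAMS_VER };
    eos.encodePicFlags = NV_ENC_PIC_FLAG_EOS;
    api.nvEncEncodePicture(encoder, &eos);

    // 2. Release the I/O buffers created for this session.
    api.nvEncDestroyInputBuffer(encoder, inputBuffer);
    api.nvEncDestroyBitstreamBuffer(encoder, bitstreamBuffer);

    // 3. Close the encode session itself.
    api.nvEncDestroyEncoder(encoder);

    // 4. Release the CUDA context that was passed as the session's device.
    cuCtxDestroy(ctx);
}
```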

Any help is appreciated. :)
Tomes.

Probably also useful: I’m using Video Codec SDK 8.0.14, and the driver version is 391.01.

Does this work when you run two independent processes in parallel?

Can you copy/paste your code here or, better yet, post the entire project? You can send it to me via PM.

Everything works fine when I run them as two independent processes in parallel.