Hello,
we're experiencing a stability issue with our render farm running Cinema 4D R21 with Redshift, set up with a Team Render server and clients.
The farm consists of 15 PCs (with dynamic IPs), most running RTX 2070s or 2080s and a few running GTX 1080s.
Every computer runs Windows 10 Professional, and the CPUs are all Intel i7s. All GPU drivers and Cinema 4D releases are up to date.
When we start a render job, all clients launch correctly, but after a random amount of time (usually a few hours) they begin to crash. Once crashed, it's impossible to restart them from the web interface.
This happens at random times, on random machines, and with different projects.
We found these two errors in the Windows Event Viewer on the machines where the client crashed:
The description for Event ID 13 from source nvlddmkm cannot be found. Either the component that raises this event is not installed on your local computer or the installation is corrupted. You can install or repair the component on the local computer.
If the event originated on another computer, the display information had to be saved with the event.
The following information was included with the event:
\Device\0000010d
Graphics Exception: ESR 0x50d730=0x503000f 0x50d734=0x24 0x50d728=0x4c1eb72 0x50d72c=0x174
The message resource is present but the message was not found in the message table
The description for Event ID 13 from source nvlddmkm cannot be found. Either the component that raises this event is not installed on your local computer or the installation is corrupted. You can install or repair the component on the local computer.
If the event originated on another computer, the display information had to be saved with the event.
The following information was included with the event:
\Device\0000010d
Graphics SM Global Exception on (GPC 0, TPC 2, SM 0): Multiple Warp Errors
The message resource is present but the message was not found in the message table
How can we solve this? It's causing us a lot of trouble: we can't rely on the render farm, and unfortunately our deadline is approaching.
Thanks a lot in advance for your help!