Second GTX Titan Black has been disabled by Windows (code 43)

Hello there.

First of all, I would like to apologize for my English. Second, I know many people have run into this issue before me, but none of the solutions suggested here have been helpful in my case.

Basically, my motherboard recently failed, so I decided to upgrade my system a little (it was a DDR3 machine).
My H/W is now:
Asus Maximus VIII Hero (Intel Z170)
Intel Core i7-6700K overclocked to 4.5 GHz
Corsair Vengeance 16 GB DDR4 running at 3000 MHz
2x NVIDIA GTX Titan Black in SLI
Running Windows 10 with all updates installed (22/12/2016)

The whole machine is water-cooled, so the GPUs have waterblocks as well.

After replacing the motherboard, everything was fine for the first two boot-ups. Then I got a blue screen ("Your PC ran into a problem...") with the message VIDEO_TDR_FAILURE. After that, the GPU in the second PCIe slot showed an exclamation mark in Device Manager, with the well-known code 43 in its description.

I have already tried all of the driver reinstallation methods mentioned on this forum, but nothing works. I also tried running the GPUs without SLI, but it made no difference; I still get code 43 in Device Manager.

It is probably worth mentioning that the PSU is okay, with correct voltages everywhere.

Is there anything else I can try, or will I have to throw that GPU in the bin?
I appreciate any help.

Thank You.

It is not clear whether your GPUs have been water-cooled before and after the upgrade. If you installed water cooling as part of the upgrade, I think there is a possibility that you could have damaged the GPU. But before coming to that conclusion, you would want to check a few things:

(1) Remove both GPUs from the system for visual inspection of the water-cooling block installation. I have never installed one of those, but I guess it is possible to install them “backwards”, or with gaps between the cooler and processor, or with fasteners not fully closed. Is a thermal interface material (e.g. pad, paste) being used between cooler and processor? Was it applied correctly? If you visually compare the two GPUs with coolers installed, do they look the same in every detail? Any difference you spot may be indicative of a problem.

(2) Visually inspect both PCIe slots. Use a flashlight if need be. Do you see any bent connectors, dirt, oil, or tiny objects (morsels of plastic, metal shavings) in the slots? Try to clean by holding the motherboard upside down and tapping the back. Use canned compressed air to clean if necessary.

Now, run some controlled experiments. Before you get started, disable CPU overclocking. You don’t want to have another risk factor in play while you are focused on troubleshooting the GPU issue.

(3) Try each GPU by itself, in the first PCIe slot. Make sure that the GPU is completely inserted into the PCIe slot, and secure the GPU at the bracket (this may be a screw, latch, or other mechanism). This makes sure that there is no undue mechanical stress on the PCIe connector, and helps ensure signal integrity when vibrations (e.g. from fans, or HDDs) affect the slot connector. Make sure all auxiliary PCIe power cables are plugged in properly. Usually there is a small tab on the connector that snaps into place when the connector is pushed all the way into the socket on the GPU. Do not use Y-splitters or 6-pin to 8-pin adapters on any of these power cables!

If both GPUs work individually, the GPUs themselves are fine. Maybe one of the two PCIe slots is damaged, so try both GPUs individually in the other slot as well.
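As a quick software-side cross-check after each swap (a sketch, assuming the NVIDIA driver is installed; on Windows, nvidia-smi ships with the driver, typically under C:\Program Files\NVIDIA Corporation\NVSMI), you can list which GPUs the driver actually sees from a command prompt. A card disabled with code 43 will typically be missing from this list entirely:

```shell
# List every GPU the NVIDIA driver has enumerated; a GPU disabled
# with code 43 will normally not appear here at all.
nvidia-smi -L
```

With one healthy Titan Black installed you should see a single "GPU 0: ..." line; with both cards working, GPU 0 and GPU 1.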

(4) Many systems let you configure the PCIe slots in a variety of ways, so check in the system BIOS whether they are properly configured. According to Intel ARK, the i7-6700K has only 16 PCIe lanes, so with two GPUs the slots probably need to run in an x8/x8 configuration. I don’t have experience with that kind of configuration; you might need to try multiple BIOS settings to see whether any of them works. For full PCIe throughput in dual-GPU systems, I recommend using CPUs with 40 PCIe lanes.
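If you do get both cards running again, you can confirm the negotiated link width from software instead of digging through BIOS screens (again a sketch assuming nvidia-smi is available; the pcie.link.width fields are standard nvidia-smi query options):

```shell
# Show current vs. maximum PCIe link width per GPU.
# With two GPUs sharing 16 CPU lanes, expect the current
# width to read 8 while the maximum reads 16.
nvidia-smi --query-gpu=index,name,pcie.link.width.current,pcie.link.width.max --format=csv
```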