Nvidia Web Driver causing CUDA issues

After countless attempts at installing CUDA on my system, I am about out of options.
I have installed, uninstalled, and re-installed countless times on High Sierra and am still getting the same error from Adobe's GPU Sniffer:

— GPU Computation Info —
Found 3 devices supporting GPU computation.
CUDA Device 0 -
Name: Quadro 4000
Vendor: NVIDIA
Capability: 2
Driver: 9.2
Total Video Memory: 2047MB

  • Not chosen because of initialization failure.

Because of this, CUDA will not work with my Adobe products for rendering. The issue seems to appear once I install the most current NVIDIA Web Driver and restart my computer.

Running a Mac Pro 5,1 with dual video cards: an ATI Radeon HD 5770 (1024 MB) and a Quadro 4000 for Mac.

Here is the full Terminal output from GPU Sniffer.

Last login: Mon Jul 23 13:35:19 on console
GMIMAC11k:~ gmimac11k$ /Applications/Adobe\ Premiere\ Pro\ CC\ 2018/Adobe\ Premiere\ Pro\ CC\ 2018.app/Contents/GPUSniffer.app/Contents/MacOS/GPUSniffer ; exit;
GPUSniffer testing 254
<140736218272640> <0> Debug Assert failed!
Expression: result == CUDA_SUCCESS
Context creation failure: CUDA_ERROR_INVALID_IMAGE
Process: GPUSniffer
Process ID: 679
Thread ID: 140736218272640
File: /PPro12.1.2/releases/shared/adobe/MediaCore/GPUFoundation/Src/CUDA/CUDADevice.cpp
Line: 223
Function: virtual bool GF::CUDADevice::CreateContext(bool)
Callstack:
GF::CUDADevice::CreateContext(bool)
GF::Device::InitializeContextImpl(bool)
GF::Initialize(bool, bool, bool, bool, bool, void*, void*)
DS::(anonymous namespace)::GPUSnifferInner(std::__1::basic_ostream<char, std::__1::char_traits<char> >&, unsigned int)
DS::GPUSnifferMain(std::__1::basic_ostream<char, std::__1::char_traits<char> >&, unsigned int)
main
start

— OpenGL Info —
Vendor: ATI Technologies Inc.
Renderer: ATI Radeon HD 5770 OpenGL Engine
OpenGL Version: 2.1 ATI-1.68.20
GLSL Version: 1.20
Monitors: 1
Monitor 0 properties -
Size: (0, 0, 2560, 1600)
Max texture size: 16384
Supports non-power of two: 1
Shaders 444: 1
Shaders 422: 1
Shaders 420: 1

— GPU Computation Info —
Found 3 devices supporting GPU computation.
CUDA Device 0 -
Name: Quadro 4000
Vendor: NVIDIA
Capability: 2
Driver: 9.2
Total Video Memory: 2047MB

  • Not chosen because of initialization failure.
OpenCL Device 1 -
Name: Quadro 4000
Vendor: NVIDIA (Apple platform)
Capability: 1.2
Driver: 1.1
Total Video Memory: 2048MB
OpenCL Device 2 -
Name: ATI Radeon HD 5770
Vendor: AMD (Apple platform)
Capability: 1.2
Driver: 1.2
Total Video Memory: 1024MB
logout
Saving session…
…copying shared history…
…saving history…truncating history files…
…completed.

[Process completed]

Please, any assistance would really help me. I feel like I have tried almost everything short of internal coding…

CUDA 9.x (9.0, 9.1, 9.2) doesn't support the Quadro 4000, and neither will any future CUDA toolkit.

The last CUDA toolkit that supported that GPU is CUDA 8.0: the Quadro 4000 is a Fermi-class card (compute capability 2.0), and CUDA 9.0 and later require compute capability 3.0 or higher.
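You can confirm the "Capability: 2" line from the sniffer output directly with the CUDA Driver API. Here is a minimal sketch, assuming a CUDA 8.0 toolkit install (the last one that still targets Fermi); the build line, framework path, and include path are assumptions based on a default Mac CUDA install, so adjust them to your setup:

```c
// cc_check.c - print each CUDA device's compute capability (sketch).
// Assumed build on macOS: clang cc_check.c -o cc_check \
//   -I/Developer/NVIDIA/CUDA-8.0/include -F/Library/Frameworks -framework CUDA
#include <stdio.h>
#include <cuda.h>

int main(void) {
    if (cuInit(0) != CUDA_SUCCESS) { printf("cuInit failed\n"); return 1; }

    int count = 0;
    cuDeviceGetCount(&count);
    for (int i = 0; i < count; ++i) {
        CUdevice dev;
        char name[128];
        int major = 0, minor = 0;
        cuDeviceGet(&dev, i);
        cuDeviceGetName(name, sizeof(name), dev);
        cuDeviceGetAttribute(&major, CU_DEVICE_ATTRIBUTE_COMPUTE_CAPABILITY_MAJOR, dev);
        cuDeviceGetAttribute(&minor, CU_DEVICE_ATTRIBUTE_COMPUTE_CAPABILITY_MINOR, dev);
        // CUDA 9.x requires compute capability 3.0+; a Quadro 4000 reports 2.0.
        printf("Device %d: %s, compute capability %d.%d\n", i, name, major, minor);
    }
    return 0;
}
```

If the card reports 2.0 or 2.1, no CUDA 9.x driver/toolkit combination will accept it.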

Eventually, support for these older Fermi GPUs will probably completely disappear from NVIDIA software, including the driver. I couldn’t tell you exactly when that will happen, or if it has already happened for your GPU on the Mac GPU driver you installed. It may also be an incompatibility with some other aspect of the GPU stack, perhaps not the GPU driver itself but the Mac CUDA driver you have installed.

If there is an older driver/config that works, you might want to use that and acknowledge that your Quadro 4000 has reached end of life.
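If you do roll back to an older Web Driver plus the CUDA 8.0 Mac driver, you can verify the combination outside of Premiere before testing renders. Below is a rough probe that walks the same cuCtxCreate path GPUSniffer asserts on; again, the build flags and paths are assumptions based on a default CUDA 8.0 install:

```c
// ctx_probe.c - try to create a CUDA context on each device (sketch).
// Assumed build: clang ctx_probe.c -o ctx_probe \
//   -I/Developer/NVIDIA/CUDA-8.0/include -F/Library/Frameworks -framework CUDA
#include <stdio.h>
#include <cuda.h>

int main(void) {
    int drv = 0, count = 0;

    if (cuInit(0) != CUDA_SUCCESS) { printf("cuInit failed\n"); return 1; }

    cuDriverGetVersion(&drv);               // e.g. 8000 for a CUDA 8.0 driver
    printf("CUDA driver API version: %d.%d\n", drv / 1000, (drv % 1000) / 10);

    cuDeviceGetCount(&count);
    for (int i = 0; i < count; ++i) {
        CUdevice dev;
        CUcontext ctx;
        char name[128];
        cuDeviceGet(&dev, i);
        cuDeviceGetName(name, sizeof(name), dev);

        // This is the call that fails with CUDA_ERROR_INVALID_IMAGE in the
        // sniffer log above; if it succeeds here, the driver stack is usable.
        CUresult rc = cuCtxCreate(&ctx, 0, dev);
        printf("Device %d (%s): cuCtxCreate %s (code %d)\n",
               i, name, rc == CUDA_SUCCESS ? "OK" : "failed", (int)rc);
        if (rc == CUDA_SUCCESS) cuCtxDestroy(ctx);
    }
    return 0;
}
```

If that probe succeeds but Premiere still refuses the card, the remaining problem is likely Premiere's own GPU requirements rather than the driver stack.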

Do you have a workaround I can use to get my Quadro 4000 to work on High Sierra?