So I have an HP ProLiant DL160 running Windows 10 64-bit, with 64 GB of RAM and 2x quad-core CPUs.
I have fitted two Tesla cards, a C2075 and a C1060, and now need someone who can program CUDA.
The server is running Plex Media Server, which needs to transcode films quickly.
Anyone feel that they are able to help?
I must be honest… I haven’t got a clue about C/C++.
This will be a problem, as support for the C1060 was discontinued in CUDA and NVIDIA graphics drivers years ago. Not sure about the support status for the C2075.
I also do not recall that these old GPUs contained the NVENC hardware used by modern GPU-accelerated video transcoders. Note that NVENC is a processing capability that is orthogonal to the hardware support needed for CUDA.
So does this then mean there is no way of getting even one of the cards working on my system?
Someone please help.
How do you imagine that someone would be able to help remotely without information on the specific issues you are encountering? At this point, have you actually plugged the GPUs into your system and installed drivers? I would say your best bet for getting things to work is finding a knowledgeable person in your immediate vicinity who is willing to make house calls (obviously you may have to pay them).
You may or may not be able to get the C2075 running on your system. I don’t know what the support status of the C2075 (or more generally GPUs with compute capability 2.x) is. I think CUDA 8 may have been the last version that supported it, while the current version of CUDA is 9.2. So you may need to install older CUDA software and older drivers. You did not mention what video transcoding software you plan to use. You might want to check what the minimum hardware requirements for that software are.
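As a concrete first check of whether the system sees the card at all, a minimal CUDA runtime program can enumerate the devices and print each one's compute capability. This is just a sketch, and it assumes you have managed to install a CUDA toolkit old enough to still support the card (e.g. CUDA 8 for compute capability 2.x):

```cuda
// query_devices.cu — sketch: enumerate CUDA devices and report compute capability.
// Build with: nvcc query_devices.cu -o query_devices
// (assumes a CUDA toolkit that still supports the installed GPUs).
#include <cstdio>
#include <cuda_runtime.h>

int main(void)
{
    int count = 0;
    cudaError_t err = cudaGetDeviceCount(&count);
    if (err != cudaSuccess) {
        // Typical failure mode: the installed driver no longer supports the GPU,
        // or the toolkit and driver versions are mismatched.
        printf("cudaGetDeviceCount failed: %s\n", cudaGetErrorString(err));
        return 1;
    }
    for (int i = 0; i < count; i++) {
        cudaDeviceProp prop;
        cudaGetDeviceProperties(&prop, i);
        // prop.major/prop.minor is the compute capability, e.g. 2.0 for a C2075.
        printf("Device %d: %s, compute capability %d.%d\n",
               i, prop.name, prop.major, prop.minor);
    }
    return 0;
}
```

If the C2075 shows up here with compute capability 2.0, the driver and toolkit side is at least working; whether any particular transcoding software will then actually use the card is a separate question.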
In any event, my standing recommendation is not to bother with such completely outdated hardware, especially for people with zero prior experience with GPUs and GPU computing. The number of people who have such old hardware at their disposal and could compare notes with you based on recent hands-on experience is likely very small, and none of them may be on this forum.