CUDA doubts

Is anybody else reconsidering their choice to use CUDA? The recent chip shortages and crypto-mining boom have left my project in a strange dead zone: I'm wondering about code portability and scaling on cheaper CPU hardware rather than expensive GPU hardware with its architecture lock-in.

Khronos has recently published an update to the SYCL specification, and it's beginning to look enticing, especially given where the industry is going. I see supercomputers popping up with CPUs rather than GPUs in them, which suggests it's becoming more practical and cost-effective to scale that way.

There are some attractive features in SYCL that add to its allure!

I'd love the freedom to lift and shift my code to run on a CPU or a GPU (NVIDIA or AMD). Even if that's not always practical from a performance perspective, it gives me more options on cost.
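To make that concrete, here's a minimal sketch of what that lift and shift could look like in SYCL 2020. This isn't my actual code; the vector-add kernel, sizes, and values are invented for illustration, and the header path can differ between implementations:

```cpp
#include <sycl/sycl.hpp>
#include <iostream>
#include <vector>

int main() {
  constexpr size_t N = 1024;
  std::vector<float> a(N, 1.0f), b(N, 2.0f), c(N, 0.0f);

  // default_selector_v picks the best available device; swap in
  // sycl::cpu_selector_v or sycl::gpu_selector_v to retarget the
  // exact same code at a CPU or a GPU.
  sycl::queue q{sycl::default_selector_v};
  std::cout << "Running on: "
            << q.get_device().get_info<sycl::info::device::name>() << "\n";

  {
    sycl::buffer<float> A{a}, B{b}, C{c};
    q.submit([&](sycl::handler& h) {
      sycl::accessor x{A, h, sycl::read_only};
      sycl::accessor y{B, h, sycl::read_only};
      sycl::accessor z{C, h, sycl::write_only, sycl::no_init};
      h.parallel_for(sycl::range<1>{N}, [=](sycl::id<1> i) {
        z[i] = x[i] + y[i];  // same kernel body on CPU or GPU
      });
    });
  }  // buffer destructors wait for the kernel and copy results back into c

  std::cout << "c[0] = " << c[0] << "\n";  // expect 3
  return 0;
}
```

In the happy case, swapping the selector is the whole porting story; in practice I'd expect to still tune things like work sizes per device.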

I've seen the new mining-specific chips being announced without monitor outputs, and I think that's terrific. I'd love cut-down hardware like that to reduce cost, if only I could compete with the miners to get it, even though it's not targeted at machine learning people like me.

I also worry about the new RTX 3060 hash-rate limiting technology, which sounds great for nerfing mining software, but at the same time I wonder how well it will work and whether it's prone to mistakes. Will it nerf my custom AI code accidentally?

It seems like there are gamers, there are miners, and then there's me in my awkward little niche that could do with some NVIDIA TLC.

Are the shortages, the miners, and the inability to buy RTX 30xx cards at reasonable prices stunting anybody else's projects?

Where are you located? Are you actually able to get your hands on desirable high-end CPUs that deliver performance similar to GPUs? And at reasonable prices?

For most of the past year, I have read report after report of semiconductor shortages across the board, including Intel and AMD CPUs. Clearly there is some impact from the pandemic. As for the other contributing factors, I see a constantly shifting set of explanations, down to arcane things like a shortage of particular insulating foils used in PCB production. Most predictions I see expect the global shortage of a plethora of semiconductor parts to last through the middle of 2021.

If you have been around for a while, you will recall that the computer industry occasionally experiences extended shortages of various components, from DRAM to hard disks.

What niche are you in? Having worked in a number of small to giant Silicon Valley companies, including NVIDIA, I would boldly claim that NVIDIA supports software developers on its platforms better than most companies do.

Have you looked at Intel's oneAPI, or potentially the NVIDIA HPC SDK? Both are advertised as scaling across CPUs and GPUs (and Intel's API also extends to FPGAs), all based on a single API and code base.
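As a rough illustration of the single-code-base idea: one path the NVIDIA HPC SDK supports is C++17 parallel algorithms, where the same source can be compiled for a multicore CPU or a GPU purely via compiler flags. The data and lambda below are invented for the example:

```cpp
#include <algorithm>
#include <execution>
#include <iostream>
#include <vector>

int main() {
  std::vector<double> x(1 << 20, 1.5);

  // The same standard C++ runs on CPU threads or offloads to a GPU
  // depending only on how it is built, e.g. with nvc++:
  //   nvc++ -stdpar=gpu       (GPU offload)
  //   nvc++ -stdpar=multicore (CPU threads)
  std::transform(std::execution::par_unseq, x.begin(), x.end(), x.begin(),
                 [](double v) { return v * v; });

  std::cout << "x[0] = " << x[0] << "\n";  // expect 2.25
  return 0;
}
```

No CUDA-specific code in sight, which seems to be exactly the kind of portability you're after.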