- When will the Nvidia Web Drivers be released for macOS Mojave 10.14 -

We now know why Apple pulled this stunt - to prepare their ecosystem for Metal on M1 Macs. But the root problem is much deeper.

Let me explain. In a world where adherence to traditional belief systems is falling rapidly, Steve Jobs created a personality cult around himself. The outcome: what RationalWiki calls “the world’s most expensive religion”.

Attribution: Dicklyon, CC BY-SA 4.0 <https://creativecommons.org/licenses/by-sa/4.0>, via Wikimedia Commons

Nvidia’s story is less well known. In the 1990s, Jensen Huang left AMD to create a new company. Foreshadowing the era where clock speeds would plateau around 3 GHz, the founders centered it on parallel computing. They would flaunt massive computational power, making competitors envy their products. Anyone who doesn’t buy “their” invention - the GPU - is behind the curve.

invidia - Latin for envy, signified by the color green and the evil eye (Nvidia’s logo). One of the seven deadly sins, also known as coveting and materialism. CUDA was deliberately engineered into a closed (source) ecosystem, so desirable that people wish they were part of it. Moore’s law stayed alive, shifting the focus from greater clock speed to greater parallelism. Lacking GPU acceleration means being sent 10 years into the past, back to the era of sequential computing.

Marketing is ingrained into these APIs. PTX (Parallel Thread Execution) is marketed as “the” IR of parallel computing. When OpenCL was in its infancy, Nvidia leveraged its massive market share to pressure small developers into choosing CUDA instead. GPGPU became synonymous with CUDA.

The issue is not Apple yanking CUDA from their ecosystem. It’s that people think there’s no genuine, high-quality alternative to CUDA. Nvidia preached a gospel about the era of parallel computing, with green plastered all over it. Yes, they put a lot of respectable work into perfecting cuDNN, NVLink, etc., but the concept of GPGPU is universal to all vendors.
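To make that last point concrete, here is a minimal sketch (plain Python, no GPU required, names invented for illustration) of the programming model that CUDA, OpenCL, Metal, and Vulkan compute all share: a small “kernel” function launched independently at every index of a grid.

```python
# Vendor-neutral sketch of the GPGPU model: every API boils down to
# launching one small "kernel" function over a grid of indices.

def saxpy_kernel(gid, a, x, y, out):
    # Body of the kernel: what a single GPU thread computes.
    # `gid` plays the role of CUDA's blockIdx/threadIdx arithmetic
    # or OpenCL's get_global_id(0).
    out[gid] = a * x[gid] + y[gid]

def launch(kernel, global_size, *args):
    # On a real GPU these iterations run in parallel; here, serially.
    for gid in range(global_size):
        kernel(gid, *args)

n = 4
x = [1.0, 2.0, 3.0, 4.0]
y = [10.0, 20.0, 30.0, 40.0]
out = [0.0] * n
launch(saxpy_kernel, n, 2.0, x, y, out)
print(out)  # [12.0, 24.0, 36.0, 48.0]
```

Swap the serial loop for a CUDA kernel launch, `clEnqueueNDRangeKernel`, or a Metal compute dispatch, and the kernel body is conceptually unchanged - which is the sense in which GPGPU belongs to no single vendor.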

Why don’t most people try out OpenCL, SYCL, Metal, or Vulkan? What about the myriad applications where people spend most of their time on the CPU, debugging things, and rarely need GPGPU? Hopefully by 2030, our answers to these questions will change. History has proven that open, fair competition wins in the end.

TL;DR - This discussion has gotten too heated. I’m speaking up for Apple users and have put a lot of effort into this, including rewording it to be more considerate. Hopefully my comment wraps up the discussion so the moderators can stop worrying about it.