RTX A6000 ADA - no more NV Link even on Pro GPUs?

Here you go.
Multiple consumer gpus supported by ROCm

I know this is two years old, but tell me, Markus: I am sure you are aware that NVLink was used mostly to add more memory, so how is this far from dead? $6K for 48GB of VRAM doesn’t seem like much of a deal, but $12K for 96GB of VRAM and roughly 190% of the compute power is something a professional would want to buy. I already have a Blender scene for my hobby that consumes more than 27GB of VRAM and is nowhere near finished. Nowadays displacement textures require more memory, and for AI there are dedicated NPUs. So does Nvidia want to keep the pro space alive, or should we go to Apple when memory is a concern?

Hello @juanjoseluisgarcia, welcome.

Kind of a Necro thread, but ok. :-)

NVLink is and always was much more than just “adding more memory”. Even at the risk of repeating myself: NVLink & NVSwitch for Advanced Multi-GPU Communication

Still alive and kicking.
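
As a side note, and purely as a minimal sketch (assuming a machine with at least two CUDA-capable GPUs), this is roughly how an application can use the standard CUDA runtime API to check whether two GPUs can access each other's memory directly, i.e. peer-to-peer over NVLink or PCIe:

```cpp
// p2p_check.cu - list which GPU pairs support direct peer-to-peer access
// (the transport may be NVLink or PCIe, depending on the system topology).
// Build with: nvcc p2p_check.cu -o p2p_check
#include <cstdio>
#include <cuda_runtime.h>

int main() {
    int count = 0;
    cudaGetDeviceCount(&count);
    printf("CUDA devices found: %d\n", count);

    for (int a = 0; a < count; ++a) {
        for (int b = 0; b < count; ++b) {
            if (a == b) continue;
            int canAccess = 0;
            cudaDeviceCanAccessPeer(&canAccess, a, b);
            printf("GPU %d -> GPU %d peer access: %s\n",
                   a, b, canAccess ? "yes" : "no");
            if (canAccess) {
                // Enabling peer access lets kernels running on GPU a
                // dereference pointers that live in GPU b's memory.
                cudaSetDevice(a);
                cudaDeviceEnablePeerAccess(b, 0);
            }
        }
    }
    return 0;
}
```

`nvidia-smi topo -m` shows the same interconnect topology from the command line.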

I hear your concern and frustration regarding pro workstation use cases, and I can understand it. In the end it comes down to business decisions, which I at least cannot influence. Compared to HPC and current AI developments, any NVLink consumer or pro solution would be a niche market.

And I hope you understand that I cannot reply to:

So does Nvidia want to keep the pro space alive, or should we go to Apple when memory is a concern?