Is it possible to use an NVIDIA A10 without virtualization?

Sorry if this is a trivial question, but I’m having a hard time trying to find a precise official answer.

My goal is to install an A10 in a remote bare-metal server running Windows Server and access it via Remote Desktop to run rendering tasks such as V-Ray rendering in 3ds Max and Unreal Engine 4 Pixel Streaming for architectural visualization. I don’t intend to virtualize the server.

Is it possible to use the A10 for that purpose without needing to worry about vGPU, extra licenses, etc.? All the documentation and promotional material on the A10 seems to imply that vGPU software is required to use it.

Hello @rabellogp and welcome back, it has been a while.

This is definitely not a trivial question. These cards are meant as pure workhorses in rack-based, multi-GPU server configurations, where it usually only makes sense to run them in a virtualized environment serving multiple users.

On the other hand, if you manage to set up your Windows Server headless, with one (or more) A10 as the main GPU used by apps installed locally on that machine, then nothing stops you from using the machine in a more classic workstation fashion by connecting through RDP.

What I do not know is whether you can actually get that setup running. I am not aware of it being tested or reported by other developers, so I would be very grateful if you could keep us updated on your progress.

Hope this helps!

Thanks for the tips!

Due to the lack of precise information in that regard, we ended up going with the RTX A5000 instead of the A10. It isn’t as efficient and compact for a server-rack situation, but at least it’s a safe bet for a non-virtualized environment. We couldn’t afford the risk of the A10 not working and needing to be swapped, given the crazy situation with GPU prices and availability.

Anyway, whether the A10 works in a non-virtualized environment remains a mystery.