Home server with an M40 for Parsec or Steam In-Home streaming

Hello,

I’ve spent the past few days trying to set up my home server to play games.
I own a Supermicro X11SPG-TF board that has an onboard ASPEED GPU with a VGA output.
The GPU accelerator is an Nvidia Tesla M40 I bought for about $200 on eBay.
CPU (not that it matters much): Xeon Gold 6138
OS: Windows Server 2016 Datacenter

Day 1:
The drivers I got from https://www.nvidia.com/download/index.aspx
are bare-bones with no options whatsoever, but they did work for a while, to some extent.

I actually had success installing the drivers and making the card appear under Display Adapters. It took me hours, but I switched the GPU from TCC to WDDM mode, which was hard because googling “Tesla” in 2019 leads to unwanted results. Then, for a few seconds, I played a DirectX game. After restarting the server, however, the GPU refused to render Steam and DirectX games (I had nvidia-smi -l running in the background). The only thing that would still run on my M40 was OpenGL, which I tested with Unigine Heaven.
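For reference, the TCC-to-WDDM switch I keep mentioning is done with nvidia-smi from an elevated prompt. A minimal sketch, assuming the Tesla is GPU index 0 (check the index output first) and that a reboot follows:

```powershell
# List GPUs with their current driver model (TCC or WDDM)
nvidia-smi --query-gpu=index,name,driver_model.current --format=csv

# Switch GPU 0 to WDDM (graphics) mode; -dm 1 would set TCC (compute) again
nvidia-smi -i 0 -dm 0

# The change only takes effect after a reboot
Restart-Computer
```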

That’s about the shortest summary I can give you of day 1.

Day 2:
After installing, removing and reinstalling drivers, plus software like VNC, TeamViewer, Zero-something VPN and all that, I lost my nerve.
The X11SPG-TF is a Supermicro server board that takes ages to POST, and since this was leading nowhere, I thought of another approach.

The initial idea to game on my M40 came from googling “Cloud Gaming”, which eventually led me to Parsec, which basically offers a “do it your way” cloud gaming setup.
Everyone was using NV6 Azure servers with the M60, which is essentially the same as my M40. (I think; I’m not actually sure about this one. I’ve come across the term VDI, and apparently that’s what my M40 doesn’t have.)

So I started studying how Azure works, and it’s basically the same as Hyper-V. I went ahead and added the Hyper-V role to my server and set up a Gen 2 VM, then noticed I needed something called RemoteFX to pass a GPU to the VM. So I went to the Server Manager features, added that as well, and restarted, only to immediately hit another issue: the 1024MB VRAM limitation, unbearable lag in even the simplest games, and other delights like that.

I gave up.

Day 3:
Searching the same few forums (e.g. /r/cloudygamer and this forum) that Google would constantly spit out, I noticed there was in fact something I was missing, because the Azure servers don’t use trashy RemoteFX to pass their GPU to the VM. So I googled that and came across many other forums; it was like I was suddenly enlightened.

So skip forward a few hours: I grabbed a few commands, stuck them into PowerShell and voila! A new unidentified device popped up in Device Manager in my VM. “Let’s install drivers!” I thought. So I went to the Nvidia site, downloaded the freshest CUDA 10.1 (no clue what that is) drivers, installed them and rebooted the VM.
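The “few commands” for Hyper-V DDA passthrough look roughly like this. A sketch only: the VM name and the PCI location path below are placeholders; get the real path from the GPU’s properties in Device Manager (Details → Location paths) on the host, and run everything with the VM powered off:

```powershell
# Placeholder location path; replace with your card's actual path from Device Manager
$locationPath = "PCIROOT(0)#PCI(0300)#PCI(0000)"

# DDA requires the VM to hard-stop instead of saving state
Set-VM -VMName "GamingVM" -AutomaticStopAction TurnOff

# Dismount the device from the host so it becomes assignable
Dismount-VMHostAssignableDevice -LocationPath $locationPath -Force

# Attach the dismounted GPU to the VM
Add-VMAssignableDevice -LocationPath $locationPath -VMName "GamingVM"
```

To undo this, `Remove-VMAssignableDevice` and `Mount-VMHostAssignableDevice` give the card back to the host.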

The VM is running on a 2TB M.2 SSD RAID, so the boot times were incredible. I could try so many things with this VM, and instead of losing hours to reboots I started losing minutes (and the will to live).

Once I was back in the OS I ran into another problem. The GPU was detected as an Nvidia M40 but had a yellow exclamation mark next to it, meaning something was wrong.
It was Code 12… “What is Code 12?” you might ask. Well, I had no clue either.
So I googled “Hyper-V VM Tesla M40 Code 12 not enough resources error” and immediately got to enjoy some new Tesla Model S Autopilot footage.

At this point I stopped googling with the “Tesla” keyword. What every single article about the error code told me was that I should either “uninstall the driver and install it again” (the first thing even a non-tech-savvy person would do) or modify Above 4G Decoding and similar settings in the BIOS.

I did everything the first 20 articles told me to, and to my surprise the error was still there. I had suspected all along that the issue was coming from the MMIO allocation or something like that. The article I was following on MMIO allocation said to set the minimum to 2GB and the maximum to the size of my VRAM, which for me is 12GB. I did that, then played around with it for about an hour, testing every value between 1GB and 3GB for the minimum and 8GB and 13GB for the maximum.

Still no luck, same error.

Day 4:

I was at school that day, so I only had about 12 hours of free time after coming home to work on this, and I didn’t achieve much.

“What if this assignment wasn’t meant for Tesla GPUs?” I thought to myself. So I removed Hyper-V and everything related to it, reassigned the M40 back to the host, and continued from there.

What I was trying to do at this point was read between the lines of everything I’d been reading since day 1.

What I ended up trying was to get the Tesla M40 set as the display location of the Generic PnP Monitor. I haven’t achieved it, and people on the internet say it’s not achievable, but somehow I briefly had a fully utilized Tesla M40 running ARK: Survival Evolved at 100 FPS.
For a few seconds… Then the game crashed, and it was back to OpenGL, driver reinstalling, etc. Just to see that amazing framerate for a moment.

I wasn’t sure why it kept crashing. Was it a driver issue? Does the card overheat? What’s wrong with it? (Just for the record, nvidia-smi reported only 70°C.)
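That temperature readout came from something like this (a sketch; the 5-second polling interval is just an example):

```powershell
# Poll GPU temperature and utilization every 5 seconds
nvidia-smi --query-gpu=temperature.gpu,utilization.gpu --format=csv -l 5
```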

So I went to take a look at my machine, and what I found was that the jet propeller had fallen off the Tesla. (It was duct-taped to it.)

I cleared the crashing issue, but now, instead of crashing, a combination of things I did (quitting RDP, exiting the game and launching it again) made it freak out the same way as before: rather than crashing the game or the whole server, it simply refused to render DirectX programs again.

It was like 2AM by now and I had to wake up at 5AM, so I had to call it quits.

Day 5 (Today):
I took a laptop to school just so I could work on this whenever I was bored.

Working on it on the server like that simply wasn’t cutting it, so I searched more about the reassignment and found that Microsoft has an official guide 100x better than the ones I’d been following.

The issue was the upper MMIO: it had to be set to a ridiculously large value for this to work. So I reinstalled Hyper-V, set everything up again, passed the GPU to the VM, gave it drivers, and hooray! No exclamation mark, nothing. Simply an Nvidia Tesla M40 working correctly.
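For anyone following along, the MMIO step boils down to one Set-VM call on the host with the VM powered off. A sketch, with a placeholder VM name; the values below are the commonly suggested starting point for a single GPU, and the high MMIO space has to cover the card's BAR sizes, which is why the number looks ridiculous:

```powershell
# Run on the Hyper-V host while the VM is off
Set-VM -VMName "GamingVM" `
    -GuestControlledCacheTypes $true `
    -LowMemoryMappedIoSpace 3Gb `
    -HighMemoryMappedIoSpace 33280Mb
```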

At this point I basically had the same setup as most people on /r/cloudygamer and was able to follow their guide from there.

That didn’t last long; in fact, not even a second. Same issue with the virtual monitor. Everyone (there was even a video) installed the driver, restarted the VM instance, and on startup had two monitors in Device Manager: one on the M60 and the other on the Microsoft Basic Display Adapter, which you had to disable.

So I started searching for this issue, and there was a guy, in fact two people, following the same guide who ran into the same thing.
https://www.reddit.com/r/cloudygamer/comments/7twmiq/problem_with_azure_nv6_and_nvidia_tesla_m60/

What they did was switch the Tesla from TCC to WDDM, and that fixed it for them. That was the first thing I had learned about these GPUs, so it was something I had already attempted.

That’s about it. This is a really watered-down version of the story. It took me five whole days, including today and the hour spent writing this, to get basically nowhere.

I’ve sent an application for the Nvidia GRID 90-day trial to test those GRID drivers, since someone suggested that as well and it’s the only thing I haven’t tried yet.

Thank you for reading and sorry about my grammar, sentence construction and maybe even spelling.
I’m not a native speaker and it’s like 12AM so it’s hard to keep focus.

Can someone please assist me with this? I’m willing to do anything (besides paying money; if I did that, I could simply buy gaming hours on Parsec, defeating the whole purpose of doing it on my own hardware).

Parsec says:
"NVIDIA Tesla, GRID and Quadro

Professional workstation and server graphics cards will work with Parsec provided that they support hardware video encoding (NVIDIA NVENC), support either a physical display or display emulation via EDID, and are running in WDDM mode."

Can I achieve something like that? Even with a completely different VM, I don’t care at this point. Feel free to suggest paid services too, but I prefer open-source, free ones.

By the way, whoever helps me solve this gets 20 euros from me on PayPal, for what it’s worth.

Hi,

I’m really not sure what you’re going to achieve. You are mixing different things, which shows you didn’t really investigate before buying. The Tesla M40 is not enabled for vGPU, and it is not the same GPU as the Tesla M60, which is enabled for vGPU.
There is a difference between Tesla and vGPU drivers, as you have already experienced.
Tesla drivers are for the compute use case only; vGPU drivers are for the graphics use case. In addition, you would need a license (QvDWS) to run the vGPU driver within a VM.
Then you start to talk about RemoteFX. You should have looked into DDA (passthrough) or another hypervisor like Citrix XenServer or VMware ESX.
Main issue: you bought the wrong board. There is no vGPU driver available, so you can only use the Tesla driver, with limited functionality for your purpose. I have never tested with an M40, so I cannot say whether there is any chance of getting it up and running in passthrough at all.

regards
Simon

Hello,
Thank you for your answer.

You have misunderstood what I wrote in this topic. I’m not blaming you; my writing is horrible, and I’m sorry about that.

I simply gave a summary of what I tried. I’ve got passthrough up and running with no problems whatsoever in my Hyper-V Windows Server 2016 VM.

Right now, as I write this, I’m about to install the GRID drivers (the enterprise ones). I don’t know if they’re going to work, since the M40 isn’t listed as supported, but I guess we’ll see.

By the way, I didn’t buy this board for gaming; I just had it lying around, and since my GPU died recently I wanted to game on this Tesla.

I really believe this can be done. As I pointed out earlier, I even had a game running on the Tesla and the FPS was amazing, but I couldn’t change the resolution, and once I restarted the game it switched to the ASPEED GPU.

Isn’t there something I could use to emulate a monitor that runs on the Tesla? I’ve searched all over but couldn’t find anything, except stuff like Optimus for notebooks and some guy claiming that RealVNC creates a virtual display, which isn’t true, since I’ve tried that as well.

Any insight would be helpful. Inside a VM or outside a VM, I don’t care.
I wouldn’t even mind if the ASPEED drove the monitor and the Tesla did the rendering.

Update:

The GRID drivers wouldn’t install, just as I suspected. Only the GRID M40 is supported; I own the basic 12GB M40.

Is there any way I could render a game on the Tesla and pass the video to the ASPEED onboard graphics? I mean, laptops and old AMD APUs can do this. I suspect that’s what actually happened while I was running ARK at 100 FPS, because I definitely didn’t have a monitor connected to the Tesla M40 in Device Manager.

Also, is this the right forum to ask, or should I ask somewhere else as well? Does anyone know of any subreddits on this topic?

Thanks.

There is no way to output the rendered data through the onboard GPU. You would need a remoting stack in between; you can use RDP, Citrix or VMware, for example. But this doesn’t solve the driver issue with the M40, as this board is simply not enabled for vGPU.

Hello,

Yes, you are indeed right. Citrix did the trick.

Hello @elton,

I dropped you a message. Please check it when you have the time. Thanks.

Hi, excuse me, did you manage to use the M40 to play games?
I have one, and I would like to know how you configured it and what tools you used.

Hello,

It’s been some time, but I’m back. The M40 was, in the end, a no-go.
I didn’t get it to work and most likely never will, but I’m trying again with a Tesla T4.

:) T4 will work for sure…


Yes, it worked. I even use it to mine Ethereum during downtime so it pays for itself a little.

Hello.
I managed to get the M40 12GB working.
I have two Tesla M40s in a Dell T620 server running Windows Server 2019.
I modified the driver files so the card is recognized as a GTX Titan X.
Since my server only has basic video output, I make a Remote Desktop connection at the resolution I want, and then connect with Parsec.
The only problems are some delays when a game is loading; after that it normalizes.
So far I have only played Horizon Zero Dawn at 2560x1440 on medium quality.
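The “Remote Desktop at the resolution I want” step can be done straight from the client with mstsc’s size switches. A sketch, with a placeholder hostname:

```powershell
# Connect to the server with a forced 2560x1440 session resolution
mstsc /v:myserver /w:2560 /h:1440
```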



May I ask how you accomplished that? Which “files” did you modify, and did you get it working with other games? I did manage to run ARK in DX mode off the M40 once by mistake, and it was pretty decent, but I forgot to plug in the fan, so it crashed on me because of the temperatures, and I’ve had no luck since. Don’t get me wrong, the game starts, but in OpenGL mode, and that’s not what I want.

I have been playing Cyberpunk 2077 in FHD with high graphics settings and VSync disabled; it gives me between 24 and 30 FPS. Doom Eternal running on Vulkan gives me about 60 FPS, and Middle-earth: Shadow of War also gives me about 60 FPS.

The file I modified was “nv_dispi.inf”, located in the Display.Driver folder.
Replace the identifier of the Titan X card with the identifier of the Tesla M40 throughout the file.

You have to unzip the driver package to get access to it, and you also have to disable the signed-driver requirement in Windows.
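For anyone attempting the same edit, the swap is in the device-matching lines of nv_dispi.inf. A sketch only: the section name is a placeholder, and the device IDs shown (DEV_17C2 for the Maxwell GTX Titan X, DEV_17FD for the Tesla M40) are the commonly listed ones; verify both against your own card’s hardware ID in Device Manager and against your driver’s actual INF before editing:

```
; Original line matching the GTX Titan X (illustrative; section name varies by driver):
;   %NVIDIA_DEV.17C2% = Section001, PCI\VEN_10DE&DEV_17C2
; Edited so the same install section matches the Tesla M40's hardware ID instead:
%NVIDIA_DEV.17C2% = Section001, PCI\VEN_10DE&DEV_17FD
```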

Since Parsec needs an active monitor, I have to make a remote connection to the virtual machine. I tried using a Quadro card together with the Tesla so I could connect a monitor, but in that configuration Windows does not assign the Titan X card (Tesla M40) for high-performance use, and the games run on the Quadro.

For the fans, I use this tutorial so that they spin up when I start playing.



Rage 2 running on Vulkan


Screenshot taken 2021-03-23 at 23:32:

Cyberpunk 2077

These are the only captures I have at the moment.


Thanks for the reply!

Great results, and thanks for the “hack”. Are you using VNC to emulate a display over a remote connection, or just RDP?
How’s the stability? Have you had any games that refused to run?

I’m using RDP. Some games have an FPS drop when you enter a menu or load another area: for a second or two it goes down to 15 FPS and then normalizes. I had problems with the latest Batman game, Arkham Knight; it tells me it can’t be run over a remote connection. Hellblade: Senua’s Sacrifice doesn’t run at all; it gives no error, it just won’t open. NieR:Automata gives a “No Graphic Memory” error; apparently that can be fixed, but my copy is from PC Game Pass and its files are protected.

The biggest problem I have is with VSync: it costs too much performance in most games, and in Doom Eternal on Vulkan it doesn’t lower the FPS, but I feel more input lag.
With it deactivated, you can feel small pauses in the movement of the environment when you move very fast across the map.

Some of the games I tested don’t run in true fullscreen, only as a borderless window. It looks like fullscreen, but if I switch it to fullscreen mode, it displays as a window.