Indiana Jones and the Great Circle maxes out my RTX 4090 at 100% utilization in MangoHud while drawing only ~170W, when it should reach around 350W during gameplay. The result is very poor performance, averaging around 18 FPS at 7680x2160. On my 7900 XTX rig with identical settings and the same monitor, the amdvlk drivers average around 70 FPS at 7680x2160.
I tested God of War: Ragnarok just to make sure I wasn’t running into thermal issues or something, and I was easily reaching 350W maximums.
Please note that the game currently needs the launch option DXVK_NVAPI_GPU_ARCH=GA100 %command% to spoof an Ampere card, in order to avoid a crash under Proton.
Try setting VKD3D_CONFIG=no_upload_hvv if you haven’t tried that. It should disable Resizable BAR for the game. (It doesn’t always work though, so disabling it in the BIOS is the second option.)
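For reference, here is how both variables would stack in the Steam launch options; combining them like this is just my suggestion based on the notes above, not something the devs have confirmed:

# Steam launch options: skip the host-visible VRAM (ReBAR) upload heap and report the GPU as Ampere
VKD3D_CONFIG=no_upload_hvv DXVK_NVAPI_GPU_ARCH=GA100 %command%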
Oh, sorry, did not know that, the screenshot is a bit blurry so I didn’t see it. My bad.
Since dxvk-nvapi was involved, I kind of just took it for granted that it used DX12.
NVIDIA has a big problem with DirectX 12, and they seem to be taking a long time to fix this urgent issue. DX12 runs 30-40% behind DX11 in the same games that support both, and for the ones that are DX12-only, you’re better off playing them on Windows or on an AMD card, because performance with vkd3d is extremely bad.
Let’s keep this post on-topic please, I meant for this thread to be an issue report not an airing of grievances. Thanks.
Some additional info for NVIDIA on this: the issue persists under the latest Vulkan developer beta driver, and likewise when using the LTS kernel and the proprietary nvidia-drm module.
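For completeness, this is roughly how I verified the driver build and that nvidia-drm modesetting is active (standard procfs/sysfs reads, nothing game-specific, so treat it as an illustration):

# installed driver build and the kernel/compiler it was built against
cat /proc/driver/nvidia/version
# whether nvidia-drm modesetting is enabled (Y/N)
cat /sys/module/nvidia_drm/parameters/modeset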
While the in-game stats (if enabled) show the proper amount of available GPU memory, MangoHud shows that the game isn’t using more than ~4GB, which might explain the abysmal performance.
The card itself (a 4080 in my case) reports 100% usage but only draws ~70W while doing so. nvtop shows high PCIe read/write activity, even when just sitting in the game menu (which has a still of the game graphics in the background).
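In case anyone wants to compare numbers, the overlay behind those readings is configured roughly like this (the parameter names are standard MangoHud options, but adjust to your version):

# Steam launch options: show FPS, VRAM, power draw and clocks in the MangoHud overlay
MANGOHUD_CONFIG=fps,vram,gpu_power,gpu_core_clock,gpu_mem_clock mangohud %command%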
Timestamp : Sat Dec 7 10:51:15 2024
Driver Version : 565.77
CUDA Version : 12.7
Attached GPUs : 1
GPU 00000000:01:00.0
GPU Power Readings
Power Draw : 48.40 W
Current Power Limit : 200.00 W
Requested Power Limit : 200.00 W
Default Power Limit : 200.00 W
Min Power Limit : 100.00 W
Max Power Limit : 200.00 W
Power Samples
Duration : 2.94 sec
Number of Samples : 119
Max : 69.23 W
Min : 44.38 W
Avg : 49.20 W
GPU Memory Power Readings
Power Draw : N/A
Module Power Readings
Power Draw : N/A
Current Power Limit : N/A
Requested Power Limit : N/A
Default Power Limit : N/A
Min Power Limit : N/A
Max Power Limit : N/A
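(For anyone wanting to capture the same dump on their system, it should just be the driver’s power query, along these lines on a recent driver:)

# one-shot power report for the attached GPU
nvidia-smi -q -d POWER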
So yeah, the game uses about 2GB of VRAM and 400MB of BAR memory (roughly 1.2GB of VRAM was already in use when I launched the game, along with 42MB of BAR memory).
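Those VRAM and BAR numbers can be cross-checked with the memory query, which should break out FB and BAR1 usage separately:

# framebuffer (VRAM) and BAR1 usage breakdown
nvidia-smi -q -d MEMORY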
Same issue here, I am getting 15-17 FPS regardless of the in-game settings. VRAM usage is low and power draw is only about 70 watts in my case. However, GPU usage seems to be almost at max.
I contacted NVIDIA support through the support form for consumers describing the same issue, as well as Bethesda support.
I also maintain a Reddit post discussing this issue with current workarounds, which don’t seem to work unfortunately.
I just saw the topic on this forum about the same issue back in 2020 with the release of DOOM Eternal, another game on the idTech engine (though Indiana Jones is based on Motor, which is idTech-related). The problem was exactly the same: low VRAM utilization leading to poor performance. I guess the fix introduced back then doesn’t work now, since this game is based on a new engine.
Well, I don’t think it’s NVIDIA’s fault. It’s more that the fixes for the idTech engine just don’t work for this new Motor engine, so they kind of need to be reapplied according to how Motor works.
Yes; most likely the same DOOM Eternal issue also applies here, but since there’s no app profile for this game as of now, performance is just as borked as it was back then with DOOM Eternal.
It is an NVIDIA thing, but it has been an AMD thing too. Engines have quirks, and since every vendor wants these games running well on their GPUs, they apply workarounds themselves instead of waiting for the devs to fix their work, because that usually doesn’t happen.