BUG: Indiana Jones and the Great Circle only runs at ~40% of max TDP on RTX 4090

Indiana Jones and the Great Circle shows my RTX 4090 at 100% utilization in MangoHud while drawing only ~170 W, when it should reach around 350 W during gameplay. The result is very poor performance: an average of around 18 FPS at 7680x2160. On my 7900 XTX rig, under identical settings with the same monitor, the amdvlk driver averages around 70 FPS at 7680x2160.

I tested God of War: Ragnarok just to make sure I wasn’t running into thermal issues or something, and I was easily reaching 350W maximums.

At least one other user with an RTX 4070 Super also reports poor performance in the game’s Proton issue thread: Indiana Jones and the Great Circle (2677660) · Issue #8292 · ValveSoftware/Proton · GitHub

nvidia-bug-report.log.gz (615.4 KB)

steam-2677660.log (1.4 MB)

Please note that the game currently needs the launch option DXVK_NVAPI_GPU_ARCH=GA100 %command% to masquerade as an Ampere card, in order to avoid a game crash under Proton.
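For reference, that string goes in the game's launch options field in Steam (right-click the game > Properties > General > Launch Options); Steam substitutes %command% with the actual game command line:

DXVK_NVAPI_GPU_ARCH=GA100 %command%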


Try setting VKD3D_CONFIG=no_upload_hvv if you haven't already; it should stop vkd3d-proton from using Resizable BAR memory for upload heaps. (It doesn't always work, though, so disabling ReBAR in the BIOS is the second option.)
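If you want to test it, it goes in the launch options the same way as the other variables (just a sketch, assuming the game actually renders through vkd3d-proton):

VKD3D_CONFIG=no_upload_hvv %command%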

Game uses Vulkan, not DX12.

Oh, sorry, I didn't know that; the screenshot is a bit blurry, so I didn't see it. My bad.
Since dxvk-nvapi was involved, I kind of just took it for granted that it used DX12.


NVIDIA has a big problem with DirectX 12, and they seem to be taking a long time to fix this urgent issue. DX12 is 30-40% behind DX11 in the same games that support both, and for the ones that are DX12-only, you had better play them on Windows or have an AMD card, as the performance is extremely bad with vkd3d.

Sure, but this isn’t a DX12 game as VulkanGuy kindly pointed out.

Yeah, I know, it's just a general issue with DX12 and NVIDIA. Sadly, the devs seem to ignore it, with almost zero communication about it.

Let's keep this post on-topic, please; I meant for this thread to be an issue report, not an airing of grievances. Thanks.

Some additional info for NV on this: the issue persists under the latest Vulkan developer beta driver, and likewise with the LTS kernel and the proprietary nvidia-drm module.


I can confirm. I even tried a lower resolution of 1280x800 on medium settings; it barely touches 45-55 FPS. 4090 here.

While the in-game stats (if enabled) show the proper amount of available GPU memory, MangoHud shows that the game isn't using more than ~4 GB, which might explain the abysmal performance.

The card itself (a 4080 in my case) reports 100% usage but only consumes ~70 W while doing so. nvtop shows high PCIe activity in terms of reads and writes, even when just sitting in the game menu (which has a still of the game graphics in the background).
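In case anyone wants to cross-check these readings in the overlay itself, MangoHud can display VRAM usage and power draw directly; a minimal sketch of a launch line (vram, gpu_stats and gpu_power are standard MangoHud config options, the exact combination here is just an example):

MANGOHUD_CONFIG=fps,gpu_stats,gpu_power,vram mangohud %command%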

Indeed, I get 7 FPS on an RTX 4070.
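For reference, the power readings below come from nvidia-smi's power query, e.g.:

$ nvidia-smi -q -d POWER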

Timestamp                                 : Sat Dec  7 10:51:15 2024
Driver Version                            : 565.77
CUDA Version                              : 12.7

Attached GPUs                             : 1
GPU 00000000:01:00.0
    GPU Power Readings
        Power Draw                        : 48.40 W
        Current Power Limit               : 200.00 W
        Requested Power Limit             : 200.00 W
        Default Power Limit               : 200.00 W
        Min Power Limit                   : 100.00 W
        Max Power Limit                   : 200.00 W
    Power Samples
        Duration                          : 2.94 sec
        Number of Samples                 : 119
        Max                               : 69.23 W
        Min                               : 44.38 W
        Avg                               : 49.20 W
    GPU Memory Power Readings 
        Power Draw                        : N/A
    Module Power Readings
        Power Draw                        : N/A
        Current Power Limit               : N/A
        Requested Power Limit             : N/A
        Default Power Limit               : N/A
        Min Power Limit                   : N/A
        Max Power Limit                   : N/A
$ nvidia-smi dmon -s m
# gpu     fb   bar1   ccpm 
# Idx     MB     MB     MB 
    0   3239    472      0 
    0   3239    472      0 
    0   3239    472      0 
    0   3239    472      0

So yeah, the game uses about 2 GB of VRAM and roughly 430 MB of BAR memory (around 1.2 GB of VRAM and 42 MB of BAR were already in use when I launched the game).
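To watch these numbers live while the game runs, dmon can also poll at a fixed interval; for example (-d sets the sampling delay in seconds, -s m selects the memory counters as above):

$ nvidia-smi dmon -s m -d 1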


Same issue here: I'm getting 15-17 FPS regardless of the in-game settings. VRAM usage is low, and power draw is only about 70 W in my case. However, GPU usage seems to be almost at max.

I contacted NVIDIA support through the consumer support form describing the same issue, as well as Bethesda support.
I also maintain a Reddit post discussing this issue with the current workarounds, which unfortunately don't seem to work.


I just saw a topic on this forum about the same issue back in 2020 with the release of DOOM Eternal, another game on the id Tech engine (though Indiana Jones is based on Motor, which is id Tech derived). The problem was exactly the same: low VRAM utilization led to poor performance. I guess the fix introduced back then doesn't work now, as this game is based on a new engine.

@esullivan @amrits
Is it possible to get a hotfix for this?

Something is obviously different from Windows here.


Yeah, I’m having the same issue… I wanted to play the game this weekend, but obviously, that’s not happening thanks to Nvidia :)

Well, I don't think it's NVIDIA's fault; more that the fixes for the id Tech engine just don't work for this new Motor engine. So they kind of need to be reapplied according to how Motor works.

Yes; most likely the same DOOM Eternal issue also applies here, but since there is no app profile for this game yet, performance is just as borked as it was with DOOM Eternal before.


Nope, the game is running fine on AMD, so it's an NVIDIA thing.

It's my issue too, yes. Same here.

It is an NVIDIA thing, but it was also an AMD thing.

Engines have quirks. Since every vendor wants those games running on their GPUs, they apply driver-side workarounds instead of waiting for the devs to fix their work, because that usually doesn't happen.
