Hi guys,
I am using the function “nvmlDeviceGetTemperature” to read the GPU temperature when the ambient temperature is below -5 °C, but the returned value is extremely large (“12029”), which is not usable. I noticed that the function’s output parameter is an unsigned integer, so it cannot hold a negative value. I guess there must be a formula that translates this extremely large value into the real GPU temperature. Could you share this formula if it exists? Thanks in advance.
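For reference, my call looks roughly like this (a minimal sketch; the device index is hard-coded and error handling is trimmed):

```c
#include <stdio.h>
#include <nvml.h>

int main(void)
{
    if (nvmlInit_v2() != NVML_SUCCESS)
        return 1;

    nvmlDevice_t dev;
    if (nvmlDeviceGetHandleByIndex_v2(0, &dev) == NVML_SUCCESS) {
        unsigned int temp = 0;  /* unsigned: cannot represent a sub-zero reading */
        nvmlReturn_t rc = nvmlDeviceGetTemperature(dev, NVML_TEMPERATURE_GPU, &temp);
        if (rc == NVML_SUCCESS)
            printf("GPU temperature: %u C\n", temp);
        else
            fprintf(stderr, "nvmlDeviceGetTemperature: %s\n", nvmlErrorString(rc));
    }

    nvmlShutdown();
    return 0;
}
```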
Thanks,
James
Hi James,
Could you provide some more details on your test setup? For instance, is this on Linux or Windows?
This is on Windows Server, and the GPU is an A2.
After checking again, this API returns “Unknown Error (999)” when the GPU temperature is below 0 °C, rather than the extremely large value. But I still want to read the temperature when it is below 0 °C. Which API can I use to get it?
Platform doesn’t matter here. The function should take a pointer to a signed integer, but it doesn’t.
Index 0 should hopefully return the temp as a signed integer for you.
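To illustrate, here is a minimal sketch, assuming the API in question is nvmlDeviceGetThermalSettings from a recent nvml.h (its nvmlGpuThermalSettings_t result reports currentTemp as a signed int, so sub-zero readings come back negative):

```c
#include <stdio.h>
#include <nvml.h>

int main(void)
{
    if (nvmlInit_v2() != NVML_SUCCESS)
        return 1;

    nvmlDevice_t dev;
    if (nvmlDeviceGetHandleByIndex_v2(0, &dev) == NVML_SUCCESS) {
        nvmlGpuThermalSettings_t ts;
        /* Query thermal sensor index 0; currentTemp is a signed int,
           unlike the unsigned out-parameter of nvmlDeviceGetTemperature. */
        nvmlReturn_t rc = nvmlDeviceGetThermalSettings(dev, 0, &ts);
        if (rc == NVML_SUCCESS && ts.count > 0)
            printf("GPU temperature: %d C\n", ts.sensor[0].currentTemp);
        else
            fprintf(stderr, "nvmlDeviceGetThermalSettings: %s\n", nvmlErrorString(rc));
    }

    nvmlShutdown();
    return 0;
}
```

Note that older drivers and headers may not declare nvmlDeviceGetThermalSettings, so check your nvml.h first; if it is missing, you may need a newer driver.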
This is what I want. Thanks.