The error message "System throttling due to over-current" appears when running YOLOv4

Hi,
I'm using a Xavier NX flashed with the latest BSP (32.5) and JetPack 4.5. I increased the instantaneous OC limit from 3.6A to 5A (per the suggestion in this NV question: System throttled due to over-current? - #58 by JerryChang), then ran YOLOv4 under the 15W 6-core mode, but the error message "System throttling due to over-current" still pops up.
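For reference, the OC limit change mentioned above is typically done by writing to the on-board INA3221 power monitor's sysfs nodes. A minimal sketch, assuming a hypothetical node path (the I2C bus/address and channel index vary by board and L4T release, so locate the node in your own /sys tree first):

```shell
# Hypothetical sysfs node for the instantaneous (critical) current limit;
# the exact bus address and channel differ across boards and L4T releases.
OC_NODE=/sys/bus/i2c/drivers/ina3221x/7-0040/iio:device0/crit_current_limit_0

if [ -w "$OC_NODE" ]; then
    # Raise the instantaneous OC limit from the default 3.6A to 5A (value in mA)
    echo 5000 > "$OC_NODE"
    cat "$OC_NODE"
else
    echo "OC limit node not found at $OC_NODE; look under /sys/bus/i2c/drivers/ina3221x/"
fi
```

Note that the change made this way does not persist across reboots.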

hello sam.tsai,

are you able to reproduce the same failure by running a sample application repeatedly, e.g. deepstream-test3?

Hi Jerry Chang,

Increasing the instantaneous OC limit from 3.6A to 5A works for the DeepStream sample applications, but not for YOLOv4, which puts a much heavier load on the GPU.

I get the same problem on a Jetson Xavier NX with JetPack 4.5.1.
Is there any method to solve this?

hello XiangjiBU,

note, this reproduces with heavy GPU loads. we’re still investigating internally.

Hi JerryChang,

Is there a solution?

hello sam.tsai,

not yet, it’s still under internal investigation.


@JerryChang Let us know how we can contribute. I can reproduce this issue on two Jetson Nano Dev Kits while running inference on 3 USB cams.

hi all,

if you still observe the OC throttling warning even after increasing the OC limit to 5A,
it’s suggested to reduce the CPU/GPU frequencies via nvpmodel customization to lower power consumption.
here’s a utility to create customized power modes.
please have a try; looking forward to your test results.
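For those customizing power modes directly on the device, nvpmodel modes are defined in /etc/nvpmodel.conf. A hedged sketch of what a reduced-frequency custom mode might look like; the mode ID, parameter names, and clock values below are illustrative assumptions, so copy an existing mode from your own nvpmodel.conf and lower its caps rather than pasting this verbatim:

```
# Illustrative custom mode: like 15W 6-core but with lower CPU/GPU frequency caps.
# IDs, parameter names, and frequency values are board/release specific.
< POWER_MODEL ID=8 NAME=MODE_15W_6CORE_REDUCED >
CPU_ONLINE CORE_0 1
CPU_ONLINE CORE_1 1
CPU_ONLINE CORE_2 1
CPU_ONLINE CORE_3 1
CPU_ONLINE CORE_4 1
CPU_ONLINE CORE_5 1
CPU_CARMEL CLUSTER0 MAX_FREQ 1190400
GPU MAX_FREQ 752000000
```

The new mode would then be selected with `sudo nvpmodel -m 8` (using whatever ID you assigned).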

I’m getting a Server Error (500) after login.

(Can anybody confirm the link actually works?)

Hi JerryChang,

How can I turn off this warning message?

hello sam.tsai,

it’s suggested to create a customized power mode by reducing the CPU/GPU frequencies in nvpmodel to lower power consumption.

hello pev,

could you please try again? you need a forum account to access the tool.
please check Tutorials | NVIDIA Developer; there’s a training video, Getting started with new PowerEstimator tool for Jetson, for your reference.

I do have a forum account which is working OK (I can log in to this forum), but I still get a Server Error when I try to log in to access the PowerEstimator tool… is there any other place I could get the tool?

I tried using the tool as well. I can log in fine, but adjusting the parameters and pressing Estimate Power does nothing. No estimated number shows, and no download option appears for the config. Am I missing something?

You mean there was a problem logging in to the developer site, Autonomous Machines | NVIDIA Developer?
The user name and password should be the same; could you try again?

I have the same condition here: I can log in to the forum, but PowerEstimator still gives a 500 server error. Is there any other way to test this tool?

I used another login and have been able to log in a couple of times… sometimes I get an error, although it’s not the 500 server error…

About the tool, as amr_elgendy said… the Estimate Power button does nothing… no wattage or any values… always just that same red Power rectangle. Has anybody actually got this working?

I suspect it may be a network glitch or a browser-specific issue. Although we have handled almost all browsers, for a better experience please try the Chrome browser and see if the issue is still present.

And please be sure to log out and log in again.


I am still getting this issue.

I’ve tried to measure the power usage as well:

JetPack 4.4: 12V 2.0A
JetPack 4.5.1: 12V 1.5A

It seems that in 4.5.1 the current is being limited, even though I’ve set the OC limit to 5A. I am not sure whether the OC change is being accepted by the system. But I cannot get beyond 1.5A in 4.5.1, which is easily achievable in 4.4.
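One way to check whether the OC change was accepted is to read the configured limit and the live readings back from the INA3221 sysfs nodes. A sketch, assuming hypothetical node names and paths (adjust to whatever your own board exposes):

```shell
# Hypothetical INA3221 sysfs location; the bus address, channel index,
# and attribute names differ across boards and L4T releases.
IIO_DIR=/sys/bus/i2c/drivers/ina3221x/7-0040/iio:device0

if [ -d "$IIO_DIR" ]; then
    echo "Configured critical limit (mA): $(cat "$IIO_DIR"/crit_current_limit_0)"
    echo "Live input current (mA):        $(cat "$IIO_DIR"/in_current0_input)"
    echo "Live input voltage (mV):        $(cat "$IIO_DIR"/in_voltage0_input)"
else
    echo "INA3221 device not found at $IIO_DIR; check your /sys tree"
fi
```

If the limit node still reads 3600 after your write, the change was not accepted.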

Is there any other way to force the current limit to 5000mA?

When I tried to set the OC limit to 5000mA in a startup script and rebooted, after using the system for some time it froze on the NVIDIA logo with a weird checkerboard pattern all over the screen; in short, it was bricked. I had to reflash the whole system.
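For context, a persistent OC-limit change like the one described above is usually applied with a one-shot boot service rather than an ad-hoc startup script. A sketch of a hypothetical systemd unit; the sysfs path is an assumption, and as the post above shows, writing a bad value at boot can leave the system unbootable, so verify the command interactively before enabling anything like this:

```
# /etc/systemd/system/oc-limit.service (hypothetical)
[Unit]
Description=Raise instantaneous OC limit to 5A (node path is board-specific)
After=multi-user.target

[Service]
Type=oneshot
# Verify this path and value on your own board before enabling the unit
ExecStart=/bin/sh -c 'echo 5000 > /sys/bus/i2c/drivers/ina3221x/7-0040/iio:device0/crit_current_limit_0'

[Install]
WantedBy=multi-user.target
```

It would be enabled with `sudo systemctl enable oc-limit.service`, and can be disabled from a recovery console if it misbehaves.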

The main question is: everything works fine on 4.4 but not on 4.5.1, so there must be some fix for this.