Module Power for Jetson TX2

Hi,

What is the Module Power for Jetson TX2?
For the Jetson TX1, module power is specified in the TX1 datasheet v1.1 as 6.5 – 15 W. We require the min-max power consumption of the TX2 module for power budgeting. Could you please provide these numbers?

Regards,
Shareef

I can’t give exact numbers, but roughly speaking the TX2 draws about half the power of the TX1 at a given load. Peripherals on PCIe or USB add directly to this (a self-powered USB hub would of course not draw power…everything else does).

There is some information about power required when combined with carrier in this thread:

https://devtalk.nvidia.com/default/topic/1006369/jetson-tx2/power-requirement-for-jetson-tx2-development-board/

Hi,

This does not answer my query, as I am looking for the TX2 module power consumption range (min-max) without a carrier card, similar to what is given for the TX1 module.
I hope NVIDIA can share the TX2 module power numbers.

Thanks,
Shareef

In the “Jetson TX1-TX2 Developer Kit Carrier Board Specification”, pages 35 and 36 show that the module gets VDD_MOD from VDD_IN, while VDD_MUX feeds VDD_5V0_IO_SYS and VDD_3V3_SYS.

VDD_IN/VDD_MUX, the main power input from the DC adapter, is 5.5-19.6V at ~4000mA. The max would be 19.6 × ~4 = 78.4 watts, which fits the 90W development kit power adapter. Now subtract the maximums of VDD_5V0_IO_SYS and VDD_3V3_SYS: 78.4 - (5 × 7) - (3.3 × 7) = 20.3, so 20.3W would be the most VDD_MOD could pull with the 3.3V and 5V rails fully loaded. The min would be best determined by actual testing, because the minimum load with the module doing absolutely nothing (no code running) doesn’t seem useful.
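That rail arithmetic can be sketched in a few lines (these are design limits taken from the carrier board spec as quoted above, not measured consumption):

```python
# Back-of-the-envelope module power budget from carrier board design limits.
# Values are the spec maximums discussed above, not measured draw.
ADAPTER_V, ADAPTER_A = 19.6, 4.0           # VDD_IN/VDD_MUX input maximum
RAILS = {
    "VDD_5V0_IO_SYS": (5.0, 7.0),          # (volts, max amps)
    "VDD_3V3_SYS":    (3.3, 7.0),
}

adapter_w = ADAPTER_V * ADAPTER_A                   # 78.4 W, fits the 90 W adapter
rails_w = sum(v * a for v, a in RAILS.values())     # 35.0 + 23.1 = 58.1 W
module_budget_w = adapter_w - rails_w               # what is left for VDD_MOD
print(round(module_budget_w, 1))                    # prints 20.3
```

Again, this is a worst-case budget assuming every rail is simultaneously at its rated maximum, which is exactly the assumption questioned later in the thread.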

See the description and Table 2 at the following link as well. Watts used depend on the mode, and “Jetson TX2 was designed for peak processing efficiency at 7.5W of power.”

https://devblogs.nvidia.com/parallelforall/jetson-tx2-delivers-twice-intelligence-edge/

Estimating actual power draw from power regulator design limits doesn’t seem like a great approach.
The documentation also states that the Jetson TX2 runs coolest/best with a 9V power supply.

I looked through the documentation, and the closest I could come was estimates in the 12-15W range, and a recommendation to “run the system with the user application running, and measure actual power consumption and temperature rise.”

Thus, it’s quite possible that NVIDIA doesn’t expect anyone to use ALL the features at the same time, and therefore declines to state what the theoretical max power dissipation would be if you did.

I agree, but it depends on what the number is being used for. The NVIDIA developer carrier board seems to have been designed with 20.3W as the module max, and that value, along with the others, seems to grant a comfortable margin for the module if one were designing a carrier board for the device.

For determining estimated/min/max battery life, actually testing a device rather than using theoretical numbers would be more useful. Similarly, to determine power requirements for a farm of them, it would be better to actually test. There will be differences depending on whether Wi-Fi is used, etc., as you mention. What the board pulls really depends on what it is doing and what mode it has been set to.
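On that note, one way to do the actual test on a stock L4T install is to log `sudo tegrastats`, which on the TX2 reports per-rail power in milliwatts. Here is a minimal parsing sketch; the line format and rail names are assumptions based on typical TX2 tegrastats output, so check them against your BSP version:

```python
import re

# SAMPLE mimics one (abbreviated) tegrastats line from a TX2; the
# "RAIL current/average" pairs are in mW. This format is an assumption.
SAMPLE = ("RAM 1865/7846MB CPU [2%@345,off,off,2%@345,1%@345,2%@345] "
          "VDD_SYS_GPU 152/152 VDD_SYS_SOC 687/687 VDD_4V0_WIFI 0/0 "
          "VDD_IN 3056/3056 VDD_SYS_CPU 305/305 VDD_SYS_DDR 883/883")

def rail_power_mw(line):
    """Extract {rail: (instantaneous_mW, average_mW)} from one tegrastats line."""
    return {name: (int(cur), int(avg))
            for name, cur, avg in re.findall(r"(VDD_\w+) (\d+)/(\d+)", line)}

rails = rail_power_mw(SAMPLE)
print(rails["VDD_IN"][0] / 1000.0)  # total input power in watts -> 3.056
```

Logging VDD_IN this way over a run of the real application would give exactly the kind of measured min/average/max the posts above recommend, rather than a regulator-limit estimate.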

If someone asked me what min/max to design for module power on a custom carrier, I’d say 0W (power off) to 20.3W (maximum, with slop) would be reasonable. If they wanted a tighter range, or an average for some scenario, I’d suggest they test it.

Saying that the board is designed to allow every rail to draw its full rated power at the same time, with whatever remains left for the module, seems like a stretch to me.
For example, I think my kit shipped with a 12V/5A adapter rather than a 19V one, which means 60W total. Subtracting the same fully loaded rails (58.1W) would leave only 1.9W for the module, which clearly can’t be the design intent either.

According to marketing literature, the 7.5W power point is where NVIDIA feels the TX2 provides the same processing power as the TX1 did at 15W. (Who knows, maybe we’ll see a cooler-running Nintendo Switch at some point?)
That would in turn indicate that the max power of the TX2 is similar to the max power of TX1.
However, given that NVIDIA is silent on this issue, we can only guess, and doing engineering based on marketing claims is … not engineering ;-)

Can someone at NVIDIA provide greater fidelity on the power estimates not yet listed in the TX2 module datasheet?

  1. For the TX2 module power numbers, can NVIDIA clarify what scenarios (e.g. number of camera sensors interfaced to the TX2, camera resolution and frame rate, video compression standard, number of video streams, processor loading, memory storage, other active interfaces such as SATA and USB ports, ambient temperature, etc.) are used to capture these power data points?

  2. Further, will NVIDIA provide a more detailed breakdown of their power numbers, including estimates for the various video compression standards (e.g. H.264, H.265, VP9, VP8) with 1 camera sensor, incrementing by one up to 6 camera sensors?

  3. Can NVIDIA advise when these power numbers will be available for the TX2?

Thanks for your help!

For any similar questions, please refer to this topic: https://devtalk.nvidia.com/default/topic/916735/jetson-tx1-power-requirements-and-power-management/#5180997