Is there an optimal input power voltage for the TX1 module? I know it can accept a wide range of voltages (5.5V-19.5V). But from an efficiency standpoint, is there an input voltage that produces the most efficient power conversion?
Good question. I imagine we’ll need someone from Big N to answer this.
My guess would be that it depends on the tasks you will be giving the board. Why spend the money on a 19V unit when your TX1 tasks will never use that capacity? On the other hand, a 9V unit may not provide the juice your TX1 is going to need to complete your tasks. The TX1 supposedly uses whatever input you provide efficiently, based on the tasks presented to it. But I'm not an engineer.
There is no optimal external voltage for the TX1 module, as the external power goes into the TX1 through a DC/DC converter, the TPS54335DRCT, and is converted to the 5V/3V system rails used by the TX1 module. The efficiency is determined by the load current and the voltage gap between input and output, but mainly by the load current. That means in most use cases the efficiency shows no obvious difference once the current exceeds a threshold, such as 0.1A or 1A.
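To illustrate why efficiency is dominated by load current rather than input voltage, here is a toy buck-converter loss model. The constants (`r_on`, `p_switch`) are made-up round numbers for illustration, not values from the TPS54335 datasheet:

```python
# Toy buck-converter loss model (illustrative only; constants are assumed,
# not taken from the TPS54335 datasheet).
def buck_efficiency(v_in, v_out, i_out,
                    r_on=0.1,        # effective on-resistance (ohms), assumed
                    p_switch=0.05):  # fixed switching/quiescent loss (W), assumed
    """Efficiency = P_out / (P_out + losses)."""
    p_out = v_out * i_out
    p_cond = i_out ** 2 * r_on   # conduction loss grows with current squared
    p_loss = p_cond + p_switch
    return p_out / (p_out + p_loss)

# At very light load the fixed losses dominate and efficiency is poor;
# past roughly 0.1-1A the curve flattens out, matching the post above.
for i in (0.01, 0.1, 1.0, 3.0):
    print(f"{i:5.2f} A -> {buck_efficiency(12.0, 5.0, i):.1%}")
```

Even in this crude model, sweeping the input voltage changes the result far less than sweeping the load current does.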
Voltage is not really important; you just need to supply a voltage in the correct range.
What is really important is the CURRENT.
The Jetson TX1 draws a lot of current when it works at its maximum power, so your power source must be able to provide at least 7A, according to the current usage reported in the document “NVIDIA Jetson TX1 Developer Kit Carrier Board Specification”, Table 25 on page 29.
You need to think about power…not voltage.
If you look at the table that myzhar has pointed out and the power tree in “P2597_B02_Concept_schematics.pdf”, page 3, you can notice that:
*The input power goes to a buck (TPS53015), which is the first stage for all the other carrier power supplies. It outputs 5V at 7A max, so the max power needed for the carrier is 35W.
*The second load on the input is the TX1 module itself.
The TPS54335DRCT is the first stage for the module power. It is a 5V/3A buck, so the max expected power consumption for it is 15W (to be confirmed by NVIDIA that it is the only one).
=> My conclusion is that a 50W input is enough and should be your “optimal power input”.
And within the 5.5V/19.6V input range you can imagine configurations like 10V/5A, or other combinations at your convenience.
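The sizing arithmetic above is just P = V × I; a minimal sketch, using the 35W + 15W figures from this post (and ignoring converter losses):

```python
# Power-budget arithmetic from the post above: 35 W carrier + 15 W module.
CARRIER_MAX_W = 5.0 * 7.0   # TPS53015: 5 V output, 7 A max -> 35 W
MODULE_MAX_W  = 5.0 * 3.0   # TPS54335: 5 V output, 3 A max -> 15 W
BUDGET_W = CARRIER_MAX_W + MODULE_MAX_W   # 50 W total

def required_current(v_in, power_w=BUDGET_W):
    """Minimum supply current at a given input voltage, ignoring converter losses."""
    return power_w / v_in

# Any (voltage, current) pair within the 5.5-19.6 V range that covers 50 W works.
for v in (5.5, 10.0, 19.6):
    print(f"{v:4.1f} V -> {required_current(v):.1f} A minimum")
```

For example, at 10V the supply must deliver at least 5A, which matches the 10V/5A configuration suggested above.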
PS: if you’re looking for the combination with the lowest losses for the module only, see the TPS54335DRCT datasheet (http://media.digikey.com/pdf/Data%20Sheets/Texas%20Instruments%20PDFs/TPS54335,6.pdf), page 25. It seems 1A output gives the best efficiency, so 15V/1A at VDD_IN would be the best one.
On the P2180 CVM board (TX1 module), the input power (VDD_IN, up to 19.6V) goes to a 3.3V DC/DC buck, a 5V DC/DC buck, and the CPU/GPU DC/DC regulators (OVR2). The 5V/3A buck is not the only one.
Okay, I really appreciate that you shared this information :).
I wonder if someone has tested, or NVIDIA can confirm, that carrier + module (P2597 + P2180) work fine (all peripherals at highest speeds) with a 5.5V/4A supply on VDD_IN?
Or is the 4A mentioned only for the module?
Although for SATA or PCIe it depends on the devices’ consumption, of course.
CPU, GPU, and EMC run at max frequency, with 100% usage on CPU and GPU using burn cortex and a CUDA sample.
The total TX1 module power consumption is ~14W.
The test did not include any SATA or PCIe devices.
Does NVIDIA have a TX1 test report showing total power consumption under different scenarios?
For example, like these:
- standby (sleep)
- CPU fully loading
- GPU fully loading
- GPU + CPU fully loaded (the ~14W case above)
All power consumption data that can be made public is included in the OEM Design Guide and the module datasheet; please refer to those.