I’ll jump in and provide some additional feedback, since we bought into the hype and purchased one of these units. The specs NVIDIA lists for the Xavier AGX are essentially paper specifications: they look good on a datasheet but don’t translate into real-world performance.

I won’t go into a whole lot of detail here, but anyone who has worked with the Xavier AGX to any degree will agree that NVIDIA has not provided proper software support for the hardware, and none of the mainstream ML or computer-vision libraries work well with it, even after software upgrades. Just follow this thread to see why Jetson performs miserably on multiple benchmark tests, partly due to the inability to CUDA-compile the most prominent libraries available. Even with a CUDA-compiled build of OpenCV, for example, we have seen single-camera frame rates of only about 0.25 fps (no, the decimal point is not a typo).

I do not get the impression that their own team fully understands the nuances of the hardware built into these units; it seems to have been glued to the circuit board to look nice in advertisements. Meanwhile, you will find developers the world over who have not been able to make use of it.

“ECC support of 32 GB memory” should be the least of your concerns. Worry more about why NVIDIA ships CUDA with JetPack, yet bundles libraries alongside it that are not CUDA-compiled. That alone should tell you everything you need to know about the readiness of these products for real-world applications.
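For anyone who wants to verify the two claims above on their own unit, here is a minimal Python sketch (my own, not an official NVIDIA tool) that checks whether the installed OpenCV build actually reports CUDA support, and times an end-to-end frame loop. The `stub` callable is a hypothetical stand-in; on a Jetson you would replace it with your real `cv2.VideoCapture.read()` plus inference step:

```python
import time


def opencv_has_cuda():
    """True/False if cv2 is installed and reports CUDA devices,
    or None if OpenCV is not installed at all."""
    try:
        import cv2
    except ImportError:
        return None
    try:
        # A stock, non-CUDA build reports zero enabled devices.
        return cv2.cuda.getCudaEnabledDeviceCount() > 0
    except AttributeError:
        return False


def measure_fps(process_frame, n_frames=50):
    """Average end-to-end frames per second over n_frames calls."""
    start = time.perf_counter()
    for _ in range(n_frames):
        process_frame()
    elapsed = time.perf_counter() - start
    return n_frames / elapsed


if __name__ == "__main__":
    print("CUDA-enabled OpenCV:", opencv_has_cuda())
    # Hypothetical 10 ms-per-frame stand-in; swap in your real
    # capture + processing call to get a meaningful number.
    stub = lambda: time.sleep(0.01)
    print(f"{measure_fps(stub):.1f} fps on the stub workload")
```

Timing the whole loop (capture plus processing), rather than trusting the advertised sensor rate, is what surfaced the ~0.25 fps figure for us.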