NVIDIA® Jetson™ modules span a wide range of AI compute capability, power efficiency, and form factors, which is one reason the platform continues to grow in popularity. The entire Jetson product family shares a common software stack that simplifies and broadens deployment: every module is supported by the same JetPack Software Development Kit (SDK), which includes a board support package (BSP), a Linux OS, and CUDA.
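Because every Jetson module ships with the same JetPack stack, deployment scripts often identify the installed BSP (L4T) release at runtime. As a minimal sketch, the snippet below parses a line in the format of `/etc/nv_tegra_release` (the file JetPack writes on the device); the sample line here is illustrative, not taken from a specific install:

```python
import re

def parse_l4t_release(line: str):
    """Extract the L4T (BSP) release from an /etc/nv_tegra_release-style
    line, e.g. "# R35 (release), REVISION: 4.1, ..." -> "35.4.1"."""
    m = re.search(r"R(\d+).*?REVISION:\s*([\d.]+)", line)
    if m:
        return f"{m.group(1)}.{m.group(2)}"
    return None

# Sample line for illustration (format assumed from JetPack 5.x installs):
sample = "# R35 (release), REVISION: 4.1, GCID: 33958178, BOARD: t186ref"
print(parse_l4t_release(sample))  # 35.4.1
```

On an actual module you would read the first line of `/etc/nv_tegra_release` instead of a hard-coded sample string.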
Let’s see how NVIDIA® Jetson AGX Orin™ fares when compared to the other members of the Jetson family:
- NVIDIA Jetson Orin NX 8GB and 16GB modules deliver up to 70 and 100 TOPS of AI performance, respectively, in the smallest Jetson form factor. This gives you up to three times the performance of the NVIDIA® Jetson AGX Xavier™ and up to five times the performance of the NVIDIA® Jetson Xavier™ NX.
- Jetson Nano is a compact module designed for powering entry-level edge AI applications and devices.
- Jetson AGX Xavier is the upgraded successor to the NVIDIA® Jetson™ TX2 for deploying end-to-end AI robotic systems, delivering more than 20 times the performance and 10 times the energy efficiency of the TX2.
- Jetson Xavier NX pairs the NVIDIA Xavier SoC with a module the size of the Jetson Nano. It delivers up to 21 TOPS of accelerated computing performance, making this compact AI supercomputer more than 10 times faster than the Jetson TX2.
- Jetson TX2/TX2i was, at its launch, billed as the fastest and most power-efficient embedded AI computing device, built around an NVIDIA Pascal™-family GPU. It is loaded with 8GB of memory and 59.7GB/s of memory bandwidth, along with a range of hardware interfaces for straightforward integration.
In this section, let's take a closer look at how the Jetson AGX Orin series compares with the other Jetson modules.