Questions about Jetson Nano 16GB eMMC Model – JetPack, Python, ONNX, DeepStream, TensorRT, and Booting from SD Card

Hi Everyone!

I have a few questions regarding the Jetson Nano 16GB eMMC model and would appreciate any insights from the community:

  1. Latest JetPack version support: What is the latest JetPack release that supports the Jetson Nano 16GB eMMC model? Does that release include any enhancements or optimizations specific to this board?

  2. Python version compatibility: Can I run any version of Python (e.g., 3.7, 3.8, 3.9) on the Jetson Nano 16GB eMMC model? Are there known compatibility issues or limitations with particular Python versions?

  3. ONNX Runtime support: Is ONNX Runtime compatible with the Jetson Nano 16GB eMMC model? If so, do specific ONNX Runtime versions work better on this board, and are there any issues to be aware of?

  4. DeepStream support: Can I use DeepStream on the Jetson Nano 16GB eMMC model? Which DeepStream versions are supported, and are there performance differences or limitations compared to other Jetson models?

  5. TensorRT support: Does TensorRT run efficiently on the Jetson Nano 16GB eMMC model? Which TensorRT versions work best, and are there known issues on this model?

  6. Booting from a customized SD card: If I prepare a customized SD card to boot the Jetson Nano 16GB eMMC model, could that cause failures during bootup or operation? Are there important guidelines or limitations when booting from an SD card instead of the onboard eMMC storage?
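For questions 2, 3, and 5, the quickest sanity check is to inspect what is actually installed on the device. A minimal sketch (the module names `onnxruntime` and `tensorrt` are the standard import names for those packages; whether they are present depends on your JetPack install):

```python
import sys

def runtime_report():
    """Report the Python version and, if installed, the versions of
    inference libraries commonly used on Jetson boards."""
    report = {"python": "{}.{}.{}".format(*sys.version_info[:3])}
    for mod in ("onnxruntime", "tensorrt"):
        try:
            lib = __import__(mod)
            report[mod] = getattr(lib, "__version__", "unknown")
        except ImportError:
            report[mod] = None  # package not installed in this environment
    return report

print(runtime_report())
```

Running this on the board itself tells you which Python the JetPack image ships and whether ONNX Runtime and TensorRT Python bindings are importable in that environment.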

Thanks in advance for your help!

For the supported JetPack and SDK versions, see JetPack SDK 4.6.5 | NVIDIA Developer. (JetPack 4.6.x is the last JetPack series that supports Jetson Nano; JetPack 5 and later require Xavier- or Orin-class modules.)

For the supported DeepStream versions, see DeepStream SDK | NVIDIA NGC.
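To confirm which L4T/JetPack release is actually installed on a given device, a minimal sketch, assuming the standard JetPack image layout in which `/etc/nv_tegra_release` holds the L4T release string:

```python
from pathlib import Path

def l4t_release(path="/etc/nv_tegra_release"):
    """Return the first line of the L4T release file, or None when the
    file is absent (i.e., not running on a Jetson/L4T system)."""
    p = Path(path)
    if not p.exists():
        return None
    return p.read_text().splitlines()[0].strip()

print(l4t_release() or "not a Jetson/L4T system")
```

The reported L4T release maps to a specific JetPack version, which you can cross-check against the JetPack release notes linked above.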

