Hi @AastaLLL , that’s exactly what I did… But after that I still have issues, please check my first comment. I need to know how I can get object detection working successfully.