Hi,
Yes, it is optimized with TensorRT.
Thanks.
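For reference, one common way to produce a TensorRT-optimized engine from an ONNX model on Jetson is the `trtexec` tool that ships with TensorRT on JetPack. A minimal sketch, assuming default JetPack paths; the model file names are placeholders for your own files:

```shell
# trtexec ships with TensorRT (found at /usr/src/tensorrt/bin/trtexec on JetPack).
# The ONNX and engine file names below are placeholders -- adjust to your model.
/usr/src/tensorrt/bin/trtexec \
  --onnx=ssd_mobilenet_v2.onnx \
  --saveEngine=ssd_mobilenet_v2.engine \
  --fp16
```

The `--fp16` flag enables half-precision, which typically gives a large speedup on Jetson GPUs with little accuracy loss; the serialized engine can then be loaded directly for inference instead of rebuilding it on every run.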
| Topic | Replies | Views | Activity |
|---|---|---|---|
| Jetson-inference TensorRT onnx model | 2 | 733 | May 31, 2023 |
| SSD Mobilenet V2 TensorRT optimization for Jetson TX2 | 6 | 1886 | October 18, 2021 |
| How to deploy SSD Mobilenet V2 for inference in jetson nano | 2 | 1324 | May 23, 2022 |
| How to speed up ssd_mobilenet_v2_fpn with tensorRT in jetson nano? | 2 | 535 | October 15, 2021 |
| Optimize official ssd_mobilenet model but get nothing valuable | 0 | 649 | June 5, 2019 |
| SSD Mobilenet onnx from saved_model trained in tensorflow api | 2 | 1236 | October 15, 2021 |
| Running original tensorflow model for SSD mobilenet v2 with live USB camera capture on jetson nano | 2 | 1285 | October 18, 2021 |
| Steps for TensorRT implementation on mobileNet-ssd with Jetson tx2 | 4 | 4555 | October 24, 2018 |
| run ssd_mobilenet_v2_quantized_300x300_coco | 4 | 1145 | October 18, 2021 |
| I do not get any performance improvement after using TensorRT provider for object detection model | 7 | 1433 | July 12, 2022 |