The reason behind the problem is now understood. It was a power issue caused by the docking station connected to the host machine. Thanks for your reply.