Thanks anyway, although I think the parsers in DeepStream and TensorRT should stay in sync: a change in TensorRT's engine-file format should be reflected in DeepStream's config parser as well. In any case, I have re-implemented the whole thing under DeepStream 6.0, and I will report this problem to the TensorRT team later.