My batch size was 1. I re-exported the model to ONNX with a batch size of 2, and it works now.
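For reference, the batch size baked into the exported ONNX model must be able to accommodate the `batch-size` configured for the pipeline. A minimal sketch of the relevant settings, assuming a standard nvinfer config file and two input streams (the file name `model.onnx` is a placeholder):

```ini
# nvinfer (pgie) config file — assumption: the ONNX model was
# exported with a batch dimension of 2 (or a dynamic batch axis)
[property]
onnx-file=model.onnx
batch-size=2
```

The streammux element's `batch-size` property should match the number of sources (e.g. `streammux.set_property("batch-size", 2)` in the Python apps). Exporting the ONNX model with a dynamic batch axis instead of a fixed one avoids having to re-export when the stream count changes.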