I tried that, but I ran into errors. I made a more specific forum post about it: Deepstream parallel inference failing to produce video output with 'nvmultistreamtiler'.