• Hardware Platform (Jetson / GPU)
Jetson AGX Xavier
• DeepStream Version
5.1
• JetPack Version (valid for Jetson only)
4.5.1
• Issue Type( questions, new requirements, bugs)
Bug:
DeepStream crashes with a segmentation fault, or simply stops processing, when a secondary classifier is added whose model outputs e.g. an identification array of size 128.
The same model works just fine when used as a primary classifier.
I have tested this with nvinfer and also with nvinferserver (to disable post-processing), with the same result.
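For context, the secondary GIE is configured roughly along these lines (a minimal sketch, not the exact file from the linked repo; the ONNX file name is a placeholder and the key names are taken from the Gst-nvinfer documentation):

```
[property]
gpu-id=0
# placeholder model file; the real one is in the linked repo
onnx-file=signature_model.onnx
batch-size=16
# FP16
network-mode=2
# classifier network that outputs the 128-float signature
network-type=1
# run as a secondary GIE on objects from the primary detector
process-mode=2
gie-unique-id=2
operate-on-gie-id=1
# attach the raw output tensor to the metadata instead of relying on
# nvinfer's built-in classifier post-processing
output-tensor-meta=1
```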
• How to reproduce the issue ? (This is for bugs. Including which sample app is using, the configuration files content, the command line used and other details for reproducing)
I have used deepstream_test1.py as a base and added a secondary classifier that runs on the detected objects and outputs a signature array.
All code, the configuration files and instructions for creating a simple test model are located here.
It also describes how to run the test and reproduce the bug.
In short: take a model that outputs a 128-element array and add it as a secondary classifier (roughly as sketched below).
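For reference, the change to deepstream_test1.py is roughly the following (a sketch under my setup: the element and property names are the standard GStreamer / nvinfer ones, the config file name is a placeholder, and the rest of the pipeline is the unmodified sample):

```python
import gi
gi.require_version("Gst", "1.0")
from gi.repository import Gst

# Placeholder config file name for the SGIE that loads the model with
# the 128-float signature output (the real file is in the linked repo).
SGIE_CONFIG = "dstest_signature_sgie_config.txt"


def add_signature_sgie(pipeline, pgie, nvvidconv):
    """Create the secondary nvinfer element and wire it in between the
    primary GIE and the video converter of deepstream_test1.py.

    Call this in main() instead of the sample's pgie.link(nvvidconv).
    """
    sgie = Gst.ElementFactory.make("nvinfer", "secondary-inference")
    if not sgie:
        raise RuntimeError("Unable to create secondary nvinfer element")

    sgie.set_property("config-file-path", SGIE_CONFIG)
    pipeline.add(sgie)

    # Pipeline becomes: ... -> pgie -> sgie -> nvvidconv -> nvosd -> sink
    pgie.link(sgie)
    sgie.link(nvvidconv)
    return sgie
```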