Please provide complete information as applicable to your setup.
• Hardware Platform: Jetson Orin NX
• DeepStream Version: 7.0
• JetPack Version: 6.0
• Issue Type: Question
I want to run two secondary classifiers sequentially. I have a PGIE that runs a traffic net to detect vehicles, and a vehiclemake net that classifies the make of each detected vehicle. I also have nets trained on vehicle models, one net per make. If I set operate-on-gie-id in a model net's config to the vehiclemake net, the model net does not infer anything. If I set operate-on-gie-id to the PGIE instead, I get results. What can I do to run the two classifiers in sequential mode?
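To make the intended chain concrete, this is how the three nvinfer instances are wired (a summary of just the relevant keys; <pgie-id> stands for whatever gie-unique-id the PGIE has):

traffic net (PGIE): gie-unique-id = <pgie-id>, detects vehicles
vehiclemake net (SGIE): gie-unique-id = 2, operate-on-gie-id = <pgie-id>, classifies the make
model net, e.g. VOLKSWAGEN (SGIE): gie-unique-id = 37, operate-on-gie-id = 2, classifies the model (no results with 2, works with <pgie-id>)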
The gie-unique-id of the vehiclemake net is 2. This is the configuration file of one of my model nets:
property:
  gpu-id: 0
  net-scale-factor: 1.0
  offsets: 103.939;116.779;123.68
  onnx-file: ../../models/vehicle_models/VOLKSWAGEN/VOLKSWAGEN_model.onnx
  model-engine-file: ../../models/vehicle_models/VOLKSWAGEN/VOLKSWAGEN_model.onnx_b2_gpu0_fp32.engine
  labelfile-path: ../../models/vehicle_models/VOLKSWAGEN/labels.txt
  #force-implicit-batch-dim: 1
  batch-size: 2
  num-detected-classes: 21
  network-mode: 0
  input-object-min-height: 64
  input-object-min-width: 64
  model-color-format: 1
  gie-unique-id: 37
  process-mode: 2
  operate-on-gie-id: 2
  operate-on-class-ids: 33
  is-classifier: 1
  network-type: 1
  output-blob-names: predictions
  classifier-async-mode: 1
  classifier-threshold: 0.01
  infer-dims: 3;224;224
  maintain-aspect-ratio: 0
  output-tensor-meta: 0
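For reference, the vehiclemake SGIE config looks roughly like this (a trimmed sketch, not the actual file; apart from gie-unique-id: 2 the values and paths are only representative):

property:
  gpu-id: 0
  onnx-file: ../../models/vehiclemakenet/vehiclemakenet.onnx   # illustrative path
  labelfile-path: ../../models/vehiclemakenet/labels.txt       # illustrative path
  batch-size: 2
  network-mode: 0
  network-type: 1        # classifier
  process-mode: 2        # secondary mode, operates on detected objects
  gie-unique-id: 2       # the id the model nets reference via operate-on-gie-id
  operate-on-gie-id: 1   # placeholder for the PGIE's gie-unique-id
  classifier-threshold: 0.2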