ONNX-to-TensorRT .engine conversion succeeds, but the engine generates None embeddings

While converting the ONNX model to a TensorRT engine, I use the following configuration:

[property]
gpu-id=0
net-scale-factor=0.01735207357279195
offsets=123.675;116.28;103.53
model-color-format=0

# Model file path
onnx-file=/root/data/REID_V1.0/REID_DS7_MALAD/REID/MODELS/EMBEDDING/ImageClassifier.onnx

# Engine file path
model-engine-file=/root/data/REID_V1.0/REID_DS7_MALAD/REID/MODELS/EMBEDDING/ImageClassifier.onnx_b1_gpu0_fp32.engine

input-dims=3;256;128;0
input-object-min-width=30
input-object-min-height=50

batch-size=8
process-mode=2
output-blob-names=modelOutput
model-color-format=0
operate-on-gie-id=1
gie-unique-id=2

classifier-threshold=0
output-tensor-meta=1
maintain-aspect-ratio=0

# Extract embeddings on the "people" class only
operate-on-class-ids=0
network-type=100

The engine is generated successfully, but during inference it does not produce embeddings: when I print them, the result is None. The ONNX model used is ImageClassifier.onnx. Please suggest a solution.
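With output-tensor-meta=1, the embeddings arrive as raw tensor metadata that must be read out of the layer buffer in a pad probe; in pyds this is done by casting the raw pointer from pyds.get_ptr(layer.buffer) to a float pointer. Below is a minimal, runnable sketch of just that buffer-to-numpy step, using a stand-in ctypes buffer instead of a live DeepStream pipeline; EMBEDDING_DIM is a hypothetical output size for modelOutput, not taken from the post.

```python
import ctypes
import numpy as np

EMBEDDING_DIM = 128  # hypothetical size of the modelOutput layer

def buffer_to_embedding(ptr, dim):
    """Cast a raw float buffer pointer to a (dim,) numpy embedding vector."""
    float_ptr = ctypes.cast(ptr, ctypes.POINTER(ctypes.c_float))
    vec = np.ctypeslib.as_array(float_ptr, shape=(dim,)).copy()
    # L2-normalize, since ReID embeddings are typically compared by cosine distance
    norm = np.linalg.norm(vec)
    return vec / norm if norm > 0 else vec

# Stand-in for layer.buffer: in a real probe this pointer would come from
# pyds.get_ptr(pyds.get_nvds_LayerInfo(tensor_meta, i).buffer)
raw = (ctypes.c_float * EMBEDDING_DIM)(*range(EMBEDDING_DIM))
emb = buffer_to_embedding(raw, EMBEDDING_DIM)
print(emb.shape)  # (128,)
```

In a real pipeline the tensor meta for an SGIE (process-mode=2) is attached per object on obj_meta.obj_user_meta_list with meta type NVDSINFER_TENSOR_OUTPUT_META; if that list is empty, the probe will print None.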

Hi,

Which DeepStream version are you using?
It looks like you are testing a classifier; have you tried setting network-type=1?

https://docs.nvidia.com/metropolis/deepstream/dev-guide/text/DS_plugin_gst-nvinfer.html
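As a hedged illustration of that suggestion, only these lines in the [property] group would change (everything else in the posted config stays as-is):

```
# Treat the model as a classifier while still attaching raw output tensors
network-type=1
output-tensor-meta=1
```

With output-tensor-meta=1 the raw modelOutput tensor is attached to the object metadata, so the embedding can be read in a probe regardless of the classifier parsing.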

Thanks.

This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.