DeepStream secondary GIE with GoogLeNet trained by NVIDIA DIGITS

Please provide complete information as applicable to your setup.

• Hardware Platform (Jetson / GPU)
My graphics card is an RTX 3080.
• DeepStream Version
The version is 6.0.

• TensorRT Version
The TensorRT version (from the 21.08 container tag) is v8.0.1.6.

• NVIDIA GPU Driver Version (valid for GPU only)
The driver version is 470.129.06.

• Issue Type( questions, new requirements, bugs)
First, I used the nvidia/digits Docker image to train a Caffe model and downloaded it.
The model contains a deploy.prototxt and a snapshot_iter_3362.caffemodel.
The output layer name in deploy.prototxt is softmax.
So I executed the following trtexec command:

./trtexec --deploy=model/deploy.prototxt --model=model/snapshot_iter_3362.caffemodel --output=softmax --batch=16 --saveEngine=deploy.engine
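
As a quick sanity check on the built engine, the bindings can be listed with the TensorRT Python API to confirm the softmax output is present; this is a minimal sketch, assuming TensorRT 8.x and the deploy.engine path above:

import tensorrt as trt

# Deserialize the engine built by trtexec and list its bindings.
logger = trt.Logger(trt.Logger.WARNING)
with open("deploy.engine", "rb") as f, trt.Runtime(logger) as runtime:
    engine = runtime.deserialize_cuda_engine(f.read())

for i in range(engine.num_bindings):
    kind = "input" if engine.binding_is_input(i) else "output"
    # For this model we expect one input binding and one "softmax" output binding.
    print(kind, engine.get_binding_name(i), engine.get_binding_shape(i))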

Finally, in the deepstream-app secondary-gie0 settings, I set the config file as follows.

[property]
gpu-id=0
net-scale-factor=1.5
model-file=snapshot_iter_3362.caffemodel
proto-file=deploy.prototxt
model-engine-file=deploy.engine
mean-file=csc.ppm
labelfile-path=caffe_label.txt
force-implicit-batch-dim=1
batch-size=16
model-color-format=1

network-mode=0
is-classifier=1
process-mode=1
output-blob-names=softmax
classifier-async-mode=1
classifier-threshold=0.001
input-object-min-width=128
input-object-min-height=128
operate-on-gie-id=1
operate-on-class-ids=5;6
classifier-type=vehicletype

The deepstream-app runs fine, but the class id in the SGIE results is always 0.

I used OpenCV DNN to load the caffemodel and prototxt, and got correct class-id inference results.
In fact, whether I use trtexec to produce the engine or let deepstream-app generate snapshot_iter_3362.caffemodel_b16_gpu0_fp32.engine itself, the class id of the inference result is always zero.
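
For reference, a minimal sketch of the OpenCV DNN check mentioned above (the test image name, the 224x224 GoogLeNet input size, and the mean values are assumptions):

import cv2
import numpy as np

# Load the Caffe model exported by DIGITS.
net = cv2.dnn.readNetFromCaffe("deploy.prototxt", "snapshot_iter_3362.caffemodel")

img = cv2.imread("test.jpg")  # hypothetical test image
# DIGITS GoogLeNet is usually trained with 224x224 crops; the mean values here
# are placeholders and should come from the dataset's mean.binaryproto.
blob = cv2.dnn.blobFromImage(img, scalefactor=1.0, size=(224, 224),
                             mean=(104, 117, 123))
net.setInput(blob)
probs = net.forward("softmax")            # shape: (1, num_classes)
print("class id:", int(np.argmax(probs)))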

Thank you for your help and suggestion.

• How to reproduce the issue ? (This is for bugs. Including which sample app is using, the configuration files content, the command line used and other details for reproducing)

• Requirement details( This is for new requirement. Including the module name-for which plugin or for which sample application, the function description)

Can you share more about the model? How do you do the post-processing of the model's output?

*Model preparation

step 1: docker pull nvidia/digits:latest

step 2: docker run -d --gpus all -v $(pwd):/data -p5000:5000 nvidia/digits:latest

step 3: open a browser at 127.0.0.1:5000

step 4: prepare a classifier LMDB dataset

step 5: New Image Classification Model

step 6: choose the default GoogLeNet to train the classification model and click the 'create' button

step 7: the training information looks like this (screenshot omitted)

step 8: download the model (screenshot omitted)

The downloaded model contains snapshot.caffemodel and deploy.prototxt, and also has label.txt, mean.binaryproto, etc.

I use Python to convert mean.binaryproto to a PPM file:

import numpy as np
import caffe
from PIL import Image

# Read the serialized mean blob produced by DIGITS.
mean_filename = 'mean.binaryproto'
with open(mean_filename, 'rb') as f:
    proto_data = f.read()
blob = caffe.io.caffe_pb2.BlobProto.FromString(proto_data)

# blobproto_to_array returns shape (1, C, H, W); take the first blob -> (3, 256, 256).
mean = caffe.io.blobproto_to_array(blob)[0]
# Convert CHW to HWC for PIL; a plain reshape would scramble the channels.
mean = mean.transpose(1, 2, 0)
Image.fromarray(mean.astype(np.uint8)).save('truck.ppm')
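
The transpose is needed because Caffe stores the mean blob in channel-first (CHW) order while PIL expects HWC; the resulting PPM file is what the mean-file property in the nvinfer config points at.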

The deploy.prototxt content is as follows:
deploy.prototxt (36.6 KB)

The last layer:
layer {
  name: "softmax"
  type: "Softmax"
  bottom: "loss3/classifier"
  top: "softmax"
}
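
Since the softmax layer emits one probability per class, the post-processing reduces to an argmax over that vector, with the winning probability compared against classifier-threshold; a hypothetical four-class example:

import numpy as np

# Hypothetical softmax output for the four classes (truck, train, bus, car).
probs = np.array([0.05, 0.02, 0.90, 0.03])
class_id = int(np.argmax(probs))      # -> 2 (bus)
confidence = float(probs[class_id])   # -> 0.90, checked against classifier-threshold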

Post-processing of the output of the model:

I just use the DeepStream sample code to print the SGIE results and check the class id of the SGIE inference.

The code is as follows:

/* Sort the classifier meta by component id, then print each attached label. */
obj->classifier_meta_list =
    g_list_sort(obj->classifier_meta_list, temp_component_id_compare_func);
for (NvDsMetaList *l_class = obj->classifier_meta_list; l_class != NULL;
     l_class = l_class->next)
{
  NvDsClassifierMeta *cmeta = (NvDsClassifierMeta *)l_class->data;
  for (NvDsMetaList *l_label = cmeta->label_info_list; l_label != NULL;
       l_label = l_label->next)
  {
    NvDsLabelInfo *label = (NvDsLabelInfo *)l_label->data;
    if (label->pResult_label)
    {
      g_print("%s\n", label->pResult_label);
    }
    else if (label->result_label[0] != '\0')
    {
      g_print("%s\n", label->result_label);
    }
  }
}

Sorry, I found a mistake: I had set the wrong label.txt format.

Originally, I listed the classes truck, train, bus, and car one per line, like this:

truck
train
bus
car

Now I changed the txt format so all labels are on a single line:
truck:train:bus:car

(For classifiers, gst-nvinfer expects all labels for an attribute on a single line rather than one per line, which appears to be why the original file was misparsed.)

The inference results are all correct now.

Glad to know you fixed your issue.
