How image classification works as a PGIE (primary GIE)

Please provide complete information as applicable to your setup.

**• Hardware Platform (Jetson / GPU):** Jetson
**• DeepStream Version:** 6.1.1
**• JetPack Version (valid for Jetson only):** L4T 35.1
**• TensorRT Version:** 8.4.1.5
**• NVIDIA GPU Driver Version (valid for GPU only):**
**• Issue Type (questions, new requirements, bugs):** questions

My development environment is a Jetson NX (DeepStream 6.1.1), and I want to see how image classification works as a PGIE.
I used the deepstream-test1 app for testing, with image classification as the primary model. The ResNet18 trained with TAO showed no issues on the test set. I put the model in the directory and configured the file as follows.

[property]
gpu-id=0
net-scale-factor=1
tlt-encoded-model=./classifier_restnet18/final_model.etlt
tlt-model-key=
model-engine-file=./classifier_restnet18/final_model.etlt_b16_gpu0_fp16.engine
labelfile-path=./classifier_restnet18/labels.txt
batch-size=16

# 0=FP32, 1=INT8, 2=FP16 mode

network-mode=2
process-mode=1
model-color-format=0
gie-unique-id=1
operate-on-gie-id=1
output-blob-names=predictions/Softmax
classifier-async-mode=1
classifier-threshold=0.1
network-type=1
uff-input-blob-name=input_1
infer-dims=3;224;224

The program runs, but frame_meta->obj_meta_list is empty.
I want to know:
(1) How can I obtain classification results from frame_meta?
(2) Is there an error in the configuration file?

Please refer to DeepStream SDK FAQ - #25 by fanzh

Thanks for your reply.

I just ran the test. When using the configuration from the sample:

model-file=/opt/nvidia/deepstream/deepstream/samples/models/Secondary_CarColor/resnet18.caffemodel
proto-file=/opt/nvidia/deepstream/deepstream/samples/models/Secondary_CarColor/resnet18.prototxt
model-engine-file=/opt/nvidia/deepstream/deepstream/samples/models/Secondary_CarColor/resnet18.caffemodel_b16_gpu0_int8.engine
mean-file=/opt/nvidia/deepstream/deepstream/samples/models/Secondary_CarColor/mean.ppm
labelfile-path=/opt/nvidia/deepstream/deepstream/samples/models/Secondary_CarColor/labels.txt
int8-calib-file=/opt/nvidia/deepstream/deepstream/samples/models/Secondary_CarColor/cal_trt.bin

DeepStream can obtain classification results.

My model was trained with TAO, producing final_model.etlt, which tests fine in the TAO environment. With DeepStream, final_model.etlt_b16_gpu0_fp16.engine was generated, but at this point the detection results in DeepStream are not correct.
I want to know:
(1) Does TAO's environment have to be the same as the Jetson's? Currently TAO runs on Ubuntu 18 and the Jetson on Ubuntu 20.
(2) Is there anything else in the process I may have missed?

No, please refer to the DeepStream TAO classification sample multi_task_tao: GitHub - NVIDIA-AI-IOT/deepstream_tao_apps: Sample apps to demonstrate how to deploy models trained with TAO on DeepStream

The TAO version I used is 3.22. Following the classification.ipynb file, all steps complete normally, but I can't get results in DeepStream 6.1.1 or DeepStream 6.0. Do you know the specific reason? Is it necessary to refer to multi_task_tao?

There has been no update from you for a period, so we are assuming this is no longer an issue and are closing this topic. If you need further support, please open a new one. Thanks.

You can refer to multi_task_tao's configuration; it is a TAO classification model: deepstream_tao_apps/configs/multi_task_tao at master · NVIDIA-AI-IOT/deepstream_tao_apps · GitHub
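For comparison, the essential keys for a TAO-exported classifier in a Gst-nvinfer config look roughly like the fragment below. The values and paths are illustrative placeholders, not taken from the repo; the one point that is not optional is that tlt-model-key must be the exact key used when exporting the .etlt in TAO, since the encoded model cannot be decoded without it.

[property]
net-scale-factor=1.0
# key used during TAO export; an empty or wrong key cannot decode the .etlt
tlt-model-key=<your-export-key>
tlt-encoded-model=<path-to>/final_model.etlt
labelfile-path=<path-to>/labels.txt
uff-input-blob-name=input_1
output-blob-names=predictions/Softmax
infer-dims=3;224;224
# 1 = classifier
network-type=1
# 0=FP32, 1=INT8, 2=FP16
network-mode=2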

This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.