Yolo Secondary Classification Model

Please provide complete information as applicable to your setup.

**• Hardware Platform:** Jetson Xavier
**• DeepStream Version:** 6.1
**• JetPack Version:** 5.0.2

I am currently using the deepstream-test2 sample app.
I want to use my custom YOLOv8 classification model as the SGIE.
I am able to convert the YOLOv8 model by following Export - Ultralytics YOLOv8 Docs, but this model is not getting built in my GStreamer pipeline.
Is it possible to use YOLOv8 for secondary classification?

Could you describe that in detail?

Yes. You need to configure the relevant nvinfer parameters and implement the custom output-parsing function (for a classifier SGIE, the function named by parse-classifier-func-name) yourself.
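For reference, the SGIE side of an nvinfer config file could look something like the sketch below. The file names, class count, and function/library names are placeholders for your own model and parser, not values from this thread:

```ini
[property]
gpu-id=0
# Placeholder model files -- substitute your exported classifier
onnx-file=yolov8n-cls.onnx
labelfile-path=labels_cls.txt
batch-size=1
network-mode=2
# network-type=1 marks this nvinfer instance as a classifier
network-type=1
# process-mode=2 makes it a secondary GIE operating on PGIE objects
process-mode=2
gie-unique-id=2
operate-on-gie-id=1
classifier-threshold=0.5
# Hypothetical parser name and library -- you must implement these
parse-classifier-func-name=NvDsInferClassiferParseCustomYoloV8
custom-lib-path=libnvds_infercustomparser_yolov8_cls.so
```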

I want to use v4l2src -> PGIE -> SGIE (YOLOv8 classification model).
I am unable to convert my YOLOv8 classification .pt model to ONNX. How can I do this?
Also, can you tell me how to use a custom YOLOv8 classification model in my pipeline?

You can refer to https://github.com/ultralytics/ultralytics/issues/1856 or https://deci.ai/blog/how-to-convert-a-pytorch-model-to-onnx/.
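In practice the conversion can be done through the `ultralytics` Python API. A minimal sketch, assuming the `ultralytics` package is installed and `yolov8n-cls.pt` stands in for your own checkpoint:

```python
def export_yolov8_cls_to_onnx(weights: str = "yolov8n-cls.pt",
                              opset: int = 12) -> str:
    """Export a YOLOv8 classification checkpoint to ONNX.

    Returns the path of the generated .onnx file, which is written
    next to the checkpoint. `weights` here is a placeholder name.
    """
    # Imported lazily so the sketch can be loaded without ultralytics
    from ultralytics import YOLO

    model = YOLO(weights)
    # format="onnx" triggers the same export as the docs/CLI route
    return model.export(format="onnx", opset=opset)
```

The returned ONNX file is what you point the SGIE's `onnx-file` property at; TensorRT engine building then happens inside nvinfer on first run.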

You can refer to the deepstream-lpr-app; its pipeline is …->pgie(detection)->sgie(detection)->sgie(classification)->…
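As an illustration only, a v4l2src pipeline with a detector PGIE followed by a classifier SGIE could be sketched as a gst-launch line like the following; the device path, resolutions, and config file names are placeholders:

```
gst-launch-1.0 v4l2src device=/dev/video0 ! videoconvert ! \
  nvvideoconvert ! "video/x-raw(memory:NVMM)" ! mux.sink_0 \
  nvstreammux name=mux batch-size=1 width=1280 height=720 ! \
  nvinfer config-file-path=pgie_config.txt ! \
  nvinfer config-file-path=sgie_yolov8_cls_config.txt ! \
  nvvideoconvert ! nvdsosd ! nv3dsink
```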

Could you tell me how exactly I should implement the custom parse function?

You need to do this yourself based on the output layers of your model. You can refer to our implementation in nvdsinfer_custombboxparser_tao.cpp first.
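The core of such a parser is small. The sketch below shows the parsing logic for a single-output classification head (raw logits over N classes, as YOLOv8-cls produces): softmax, take the arg-max, keep it if it clears a threshold. Note the DeepStream types (NvDsInferAttribute, the exact callback signature from nvdsinfer.h) are replaced here with a minimal local stand-in so the logic is self-contained; your real function must use the DeepStream headers and match the signature expected by parse-classifier-func-name:

```cpp
#include <algorithm>
#include <cmath>
#include <cstddef>
#include <string>
#include <vector>

// Stand-in for NvDsInferAttribute from nvdsinfer.h
struct Attribute {
    unsigned int attributeIndex;
    unsigned int attributeValue;       // winning class id
    float        attributeConfidence;  // softmax probability
    std::string  attributeLabel;
};

// Parse one classifier output vector of `numClasses` raw logits and,
// if the best class clears `threshold`, append it to attrList.
bool ParseYolov8ClsOutput(const float *logits, std::size_t numClasses,
                          float threshold, std::vector<Attribute> &attrList)
{
    if (logits == nullptr || numClasses == 0)
        return false;

    // Numerically stable softmax over the single output vector
    const float maxLogit = *std::max_element(logits, logits + numClasses);
    std::vector<float> probs(numClasses);
    float sum = 0.f;
    for (std::size_t i = 0; i < numClasses; ++i) {
        probs[i] = std::exp(logits[i] - maxLogit);
        sum += probs[i];
    }
    for (float &p : probs)
        p /= sum;

    const auto best = std::max_element(probs.begin(), probs.end());
    if (*best < threshold)
        return true;  // valid parse, but nothing confident enough

    Attribute attr;
    attr.attributeIndex = 0;  // a classifier head is one "attribute"
    attr.attributeValue =
        static_cast<unsigned int>(std::distance(probs.begin(), best));
    attr.attributeConfidence = *best;
    attrList.push_back(attr);
    return true;
}
```

In the real callback you read the logits from the NvDsInferLayerInfo buffer passed in by nvinfer, then compile the file into a shared library and point `custom-lib-path` at it.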