Assistance with Custom Model in DeepStream Composer

Hello,

I am new to DeepStream Composer (version 6.3). I am using a custom single-class model for person detection. The model downloads successfully, but when I run the pipeline a different model (the default one) appears to be used: the output shows multi-class detections (cars, persons, bicycles, etc.).

How can I resolve this and make sure the pipeline uses my custom model?

Thank you in advance for your help.


Deserialize yoloLayer plugin: yolo
0:00:03.284301925 4774 0x7f06dc124640 INFO nvinfer gstnvinfer.cpp:682:gst_nvinfer_logger:<nvinfer_bin_nvinfer> NvDsInferContext[UID 1]: Info from NvDsInferContextImpl::deserializeEngineAndBackend() <nvdsinfer_context_impl.cpp:1988> [UID = 1]: deserialized trt engine from :/home/anavid-server/doc_model_person/model_b1_gpu0_fp16.engine
INFO: …/nvdsinfer/nvdsinfer_model_builder.cpp:610 [Implicit Engine Info]: layers num: 5
0 INPUT kFLOAT data 3x640x640
1 OUTPUT kFLOAT num_detections 1
2 OUTPUT kFLOAT detection_boxes 8400x4
3 OUTPUT kFLOAT detection_scores 8400
4 OUTPUT kFLOAT detection_classes 8400

0:00:03.302024189 4774 0x7f06dc124640 INFO nvinfer gstnvinfer.cpp:682:gst_nvinfer_logger:<nvinfer_bin_nvinfer> NvDsInferContext[UID 1]: Info from NvDsInferContextImpl::generateBackendContext() <nvdsinfer_context_impl.cpp:2091> [UID = 1]: Use deserialized engine model: /home/anavid-server/doc_model_person/model_b1_gpu0_fp16.engine
0:00:03.303338016 4774 0x7f06dc124640 INFO nvinfer gstnvinfer_impl.cpp:328:notifyLoadModelStatus:<nvinfer_bin_nvinfer> [UID 1]: Load new model:/home/anavid-server/doc_model_person/config_person_2rtsp.txt sucessfully
Running…
****** NvDsScheduler Runtime Keyboard controls:
p: Pause pipeline

r: Resume pipeline
q: Quit pipeline
2024-11-28 11:27:03.413 INFO extensions/nvdsbase/nvds_scheduler.cpp@398: NvDsScheduler Pipeline ready

From the log, the graph is deserializing and using /home/anavid-server/doc_model_person/model_b1_gpu0_fp16.engine — is this the engine you expected?
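
If the engine path is correct but the overlay still shows car/bicycle labels, the usual cause is the nvinfer config (or the labels file it points to) still describing the default multi-class model. As a non-authoritative sketch, the [property] section of config_person_2rtsp.txt for a single-class YOLO typically looks something like the following — the labels filename, parser function name, and custom library path are placeholders here; substitute the actual values from your model's parser plugin:

```ini
[property]
gpu-id=0
# FP16 engine built from your custom person model
model-engine-file=/home/anavid-server/doc_model_person/model_b1_gpu0_fp16.engine
# labels file should contain only your class, e.g. a single line "person"
# (hypothetical filename)
labelfile-path=labels_person.txt
# single-class detector: must be 1, not the default 4
num-detected-classes=1
# 2 = FP16, matching the engine
network-mode=2
batch-size=1
gie-unique-id=1
# custom YOLO output parser (names depend on your custom plugin)
parse-bbox-func-name=NvDsInferParseYolo
custom-lib-path=libnvdsinfer_custom_impl_Yolo.so
```

Also verify in Composer that the object-detector (nvinfer) component's config-file-path property actually points at this file rather than the sample model's config, and that num-detected-classes and labelfile-path match your model; stale values there are what typically produce multi-class labels on screen.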