ONNX as inference model (YoloV8)

I’ve just re-implemented the above steps, but with DeepStream 6.2 and Python 3.8.10. My ONNX model now also has 4 output layers. Here is my log file:
log.txt (5.2 KB)
Line 25 says “Detect-postprocessor failed to init resource because dlsym failed to get func NvDsInferParseCustomONNX pointer”. I found a thread online where the same problem was solved: “Yolov4 with Deepstream 5.0 (Python sample apps), dlsym failed to get func NvDsInferParseCustomBatchedNMSTLT pointer”. However, I am not sure what ‘gilles.charlier97’ meant by “I finally solved it by changing the libnvds_infercustomparser.so by the libnvds_infercustomparser-tlt.so that is inside the folder postprocessor of the Github repo deepstream_tlt_apps.”
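For context on what I’ve tried to understand so far: as far as I can tell, this dlsym error means the nvinfer plugin could not find an exported function named NvDsInferParseCustomONNX inside the shared library given by custom-lib-path in the inference config. If that’s right, the fix in the other thread amounts to pointing custom-lib-path at a library that actually exports the function named in parse-bbox-func-name. A sketch of the two relevant keys in my nvinfer config (the path and function name below are placeholders, not necessarily what a working setup should use):

```ini
[property]
# Name of the bounding-box parsing function that nvinfer looks up with
# dlsym(); it must exactly match a symbol exported (extern "C") by the
# library referenced below.
parse-bbox-func-name=NvDsInferParseCustomONNX
# Shared library implementing the function above. The dlsym error
# suggests this library does not export NvDsInferParseCustomONNX --
# either the path points at the wrong .so, or the parser function
# inside it has a different name.
custom-lib-path=/path/to/libnvdsinfer_custom_impl_yolo.so
```

If I’m reading the quoted answer correctly, swapping libnvds_infercustomparser.so for libnvds_infercustomparser-tlt.so worked for them presumably because the -tlt library from the deepstream_tlt_apps repo is the one that actually exports the NvDsInferParseCustomBatchedNMSTLT symbol their config asked for — but I’d appreciate confirmation, and guidance on which library exports a YOLOv8-compatible parser.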