I am trying to run a DeepStream app with a custom YOLOv8-seg model on a Jetson Orin Nano. The app ran successfully with TensorRT 8.5.2.2 and DeepStream 6.3, but I am facing the following error with DeepStream 6.4 and TensorRT 8.6.2:
:~/DeepStream-Yolo-Seg$ deepstream-app -c deepstream_app_config.txt
WARNING: Deserialize engine failed because file path: /home/user/DeepStream-Yolo-Seg/yolov8m-seg.onnx_b1_gpu0_fp32.engine open error
0:00:07.294695365 25044 0xaaaad89a3070 WARN nvinfer gstnvinfer.cpp:679:gst_nvinfer_logger:<primary_gie> NvDsInferContext[UID 1]: Warning from NvDsInferContextImpl::deserializeEngineAndBackend() <nvdsinfer_context_impl.cpp:2080> [UID = 1]: deserialize engine from file :/home/user/DeepStream-Yolo-Seg/yolov8m-seg.onnx_b1_gpu0_fp32.engine failed
0:00:07.684402263 25044 0xaaaad89a3070 WARN nvinfer gstnvinfer.cpp:679:gst_nvinfer_logger:<primary_gie> NvDsInferContext[UID 1]: Warning from NvDsInferContextImpl::generateBackendContext() <nvdsinfer_context_impl.cpp:2185> [UID = 1]: deserialize backend context from engine from file :/home/user/DeepStream-Yolo-Seg/yolov8m-seg.onnx_b1_gpu0_fp32.engine failed, try rebuild
0:00:07.690917314 25044 0xaaaad89a3070 INFO nvinfer gstnvinfer.cpp:682:gst_nvinfer_logger:<primary_gie> NvDsInferContext[UID 1]: Info from NvDsInferContextImpl::buildModel() <nvdsinfer_context_impl.cpp:2106> [UID = 1]: Trying to create engine from model files
WARNING: [TRT]: onnx2trt_utils.cpp:372: Your ONNX model has been generated with INT64 weights, while TensorRT does not natively support INT64. Attempting to cast down to INT32.
WARNING: [TRT]: onnx2trt_utils.cpp:400: One or more weights outside the range of INT32 was clamped
WARNING: [TRT]: Tensor DataType is determined at build time for tensors not marked as input or output.
ERROR: [TRT]: 10: Could not find any implementation for node {ForeignNode[/0/model.22/Constant_output_0.../1/Gather_9]}.
ERROR: [TRT]: 10: [optimizer.cpp::computeCosts::3869] Error Code 10: Internal Error (Could not find any implementation for node {ForeignNode[/0/model.22/Constant_output_0.../1/Gather_9]}.)
ERROR: Build engine failed from config file
ERROR: failed to build trt engine.
0:05:02.354594746 25044 0xaaaad89a3070 ERROR nvinfer gstnvinfer.cpp:676:gst_nvinfer_logger:<primary_gie> NvDsInferContext[UID 1]: Error in NvDsInferContextImpl::buildModel() <nvdsinfer_context_impl.cpp:2126> [UID = 1]: build engine file failed
0:05:02.785767278 25044 0xaaaad89a3070 ERROR nvinfer gstnvinfer.cpp:676:gst_nvinfer_logger:<primary_gie> NvDsInferContext[UID 1]: Error in NvDsInferContextImpl::generateBackendContext() <nvdsinfer_context_impl.cpp:2212> [UID = 1]: build backend context failed
0:05:02.785841679 25044 0xaaaad89a3070 ERROR nvinfer gstnvinfer.cpp:676:gst_nvinfer_logger:<primary_gie> NvDsInferContext[UID 1]: Error in NvDsInferContextImpl::initialize() <nvdsinfer_context_impl.cpp:1351> [UID = 1]: generate backend failed, check config file settings
0:05:02.786554366 25044 0xaaaad89a3070 WARN nvinfer gstnvinfer.cpp:898:gst_nvinfer_start:<primary_gie> error: Failed to create NvDsInferContext instance
0:05:02.786578591 25044 0xaaaad89a3070 WARN nvinfer gstnvinfer.cpp:898:gst_nvinfer_start:<primary_gie> error: Config file path: /home/user/DeepStream-Yolo-Seg/config_infer_primary_yoloV8_seg.txt, NvDsInfer Error: NVDSINFER_CONFIG_FAILED
** ERROR: <main:716>: Failed to set pipeline to PAUSED
Quitting
nvstreammux: Successfully handled EOS for source_id=0
ERROR from primary_gie: Failed to create NvDsInferContext instance
Debug info: /dvs/git/dirty/git-master_linux/deepstream/sdk/src/gst-plugins/gst-nvinfer/gstnvinfer.cpp(898): gst_nvinfer_start (): /GstPipeline:pipeline/GstBin:primary_gie_bin/GstNvInfer:primary_gie:
Config file path: /home/user/DeepStream-Yolo-Seg/config_infer_primary_yoloV8_seg.txt, NvDsInfer Error: NVDSINFER_CONFIG_FAILED
App run failed
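
For reference, the relevant part of my config_infer_primary_yoloV8_seg.txt looks roughly like this (a sketch with only the model-related keys; these are standard Gst-nvinfer properties, but the exact values in my file may differ):

```ini
[property]
gpu-id=0
# ONNX model the engine is (re)built from when deserialization fails
onnx-file=yolov8m-seg.onnx
# Engine file named in the log above
model-engine-file=yolov8m-seg.onnx_b1_gpu0_fp32.engine
batch-size=1
# 0 = FP32, matching the _fp32 suffix in the engine name
network-mode=0
```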