Deepstream Triton Inference Server Error, Segmentation fault (core dumped)

There has been no update from you for a while, so we assume this is no longer an issue and are closing this topic. If you need further support, please open a new one. Thanks.

  1. Here are the basic commands for capturing a backtrace: 1) `gdb ./deepstream-app`, 2) `set args xxx`, 3) run the app and execute `bt` after the crash. Please google gdb for more details; a sample session is sketched after this list.
  2. Please refer to the DeepStream YOLOv5 sample: GitHub - NVIDIA-AI-IOT/deepstream_tao_apps: Sample apps to demonstrate how to deploy models trained with TAO on DeepStream, and
    DeepStream SDK FAQ - #24 by mchi
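
For reference, a minimal sketch of such a gdb session is shown below. The `<your_config.txt>` argument is only a placeholder standing in for whatever arguments you normally pass to deepstream-app; substitute your own.

```
# assumes deepstream-app is in the current directory and
# <your_config.txt> is a placeholder for your real arguments
gdb ./deepstream-app
(gdb) set args -c <your_config.txt>
(gdb) run
# ... reproduce the issue; gdb stops when the segmentation fault occurs ...
(gdb) bt                      # backtrace of the crashing thread
(gdb) thread apply all bt     # optionally, backtraces of all threads
```

Posting the `bt` output here makes it much easier to see which plugin or library the crash originates from.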