Please provide complete information as applicable to your setup.
• Hardware Platform (Jetson / GPU) T4
• DeepStream Version
• JetPack Version (valid for Jetson only)
• TensorRT Version 8.2
• NVIDIA GPU Driver Version (valid for GPU only) 11.6
• Issue Type (questions, new requirements, bugs) bugs
• How to reproduce the issue? (This is for bugs. Include which sample app is used, the configuration file contents, the command line used, and other details for reproducing.)
• Requirement details (This is for new requirements. Include the module name — which plugin or which sample application — and the function description.)
During inference I get the error below:

```
WARNING: Overriding infer-config batch-size 0 with number of sources 1
Creating nvtracker
Adding elements to Pipeline
Linking elements in the Pipeline
yolov7.py:953: PyGIDeprecationWarning: GObject.MainLoop is deprecated; use GLib.MainLoop instead
loop = GObject.MainLoop()
test---------
Now playing...: inputimages/horses.jpg
Starting pipeline
0:00:00.390670756 24187 0x3f42a30 WARN nvinferserver gstnvinferserver_impl.cpp:287:validatePluginConfig:<primary-inference> warning: Configuration file batch-size reset to: 1
INFO: infer_grpc_backend.cpp:164 TritonGrpcBackend id:5 initialized for model: yolov7
python3: infer_cuda_utils.cpp:86: nvdsinferserver::CudaTensorBuf::CudaTensorBuf(const nvdsinferserver::InferDims&, nvdsinferserver::InferDataType, int, const string&, nvdsinferserver::InferMemType, int, bool): Assertion `!hasWildcard(dims)' failed.
Aborted (core dumped)
```
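For context: the assertion `!hasWildcard(dims)` fires when nvinferserver tries to allocate a CUDA tensor buffer whose dimensions still contain a wildcard (-1), which typically means the Triton model configuration declares dynamic input/output shapes that never get resolved to concrete values. A minimal sketch of a `config.pbtxt` with fully fixed dimensions is shown below — the tensor names and sizes here are assumptions for a typical YOLOv7 ONNX export, not taken from the actual model:

```
# Sketch only: replace names/dims with the actual values from your model.
name: "yolov7"
platform: "onnxruntime_onnx"
max_batch_size: 1
input [
  {
    name: "images"            # assumed input tensor name
    data_type: TYPE_FP32
    dims: [ 3, 640, 640 ]     # fixed dims, no -1 wildcards
  }
]
output [
  {
    name: "output"            # assumed output tensor name
    data_type: TYPE_FP32
    dims: [ 25200, 85 ]       # fixed dims for a 640x640 YOLOv7 head
  }
]
```

If the model itself was exported with dynamic axes, re-exporting it with static shapes (or pinning the shapes in the Triton config as above) is usually what makes this assertion go away.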