Cannot convert ONNX ReID model

Description

A clear and concise description of the bug or issue.

Environment

TensorRT Version: 8.6.2.3
GPU Type: tegra234
Nvidia Driver Version:
CUDA Version: 12.2.140
CUDNN Version: 8.9.4.25
Operating System + Version:
Python Version (if applicable):
TensorFlow Version (if applicable):
PyTorch Version (if applicable):
Baremetal or Container (if container which image + tag):

I am trying to use this ReID model

for tracking in my DeepStream pipeline.

But when building the engine, it fails with this:

[NvMultiObjectTracker] Load engine failed. Create engine again.
WARNING: [TRT]: onnx2trt_utils.cpp:372: Your ONNX model has been generated with INT64 weights, while TensorRT does not natively support INT64. Attempting to cast down to INT32.
[NvMultiObjectTracker] Begin building engine for tracker ReID…
WARNING: [TRT]: DLA requests all profiles have same min, max, and opt value. All dla layers are falling back to GPU
WARNING: [TRT]: Calibration Profile is not defined. Calibrating with Profile 0
ERROR: [TRT]: 4: [standardEngineBuilder.cpp::initCalibrationParams::1715] Error Code 4: Internal Error (Calibration failure occurred with no scaling factors detected. This could be due to no int8 calibrator or insufficient custom scales for network layers. Please see int8 sample to setup calibration correctly.)

So I am assuming a calibration file is missing?

The previous version of the model (ReIdentificationNet | NVIDIA NGC) did work without calibration. In any case, the model card for this particular model does not list any usable calibration files.
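For reference, the "Calibration failure occurred with no scaling factors detected" error is what TensorRT reports when the engine is built in INT8 mode without a calibrator or calibration cache. One workaround, if FP16 accuracy is acceptable, is to stop requesting INT8 for the ReID network in the tracker config. The sketch below follows the ReID section layout of the NvDCF tracker YAML (e.g. config_tracker_NvDCF_accuracy.yml); the exact key names and the networkMode value encoding should be verified against the documentation for your DeepStream version, and the file paths here are placeholders:

```yaml
# Hypothetical excerpt of a DeepStream NvDCF tracker config.
# Key names mirror the shipped config_tracker_NvDCF_*.yml samples;
# verify them against your DeepStream version's docs.
ReID:
  reidType: 1                       # enable ReID-based tracking
  onnxFile: "/path/to/reid_model.onnx"   # placeholder path
  # networkMode selects the build precision; in the tracker samples
  # 0 = FP32, 1 = FP16, 2 = INT8 (check your version's docs).
  # Using FP16 avoids the INT8 calibration requirement entirely.
  networkMode: 1
  # If you do have a calibration cache for the model, INT8 would
  # instead need something like:
  # networkMode: 2
  # calibrationTableFile: "/path/to/calibration.cache"
```

You can also sanity-check that the ONNX model converts at all, independent of DeepStream, with trtexec (shipped with TensorRT), e.g. trtexec --onnx=reid_model.onnx --fp16. If that succeeds, the problem is confined to the tracker's INT8 build settings rather than the model itself.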

Relevant Files

Please attach or include links to any models, data, files, or scripts necessary to reproduce your issue. (Github repo, Google Drive, Dropbox, etc.)

Steps To Reproduce

Please include:

  • Exact steps/commands to build your repro
  • Exact steps/commands to run your repro
  • Full traceback of errors encountered

Hi @felix.darvas,
Apologies for the delay.
I would recommend raising this concern on the DeepStream forum.

Thanks, that is very helpful. Do you think I will get an answer there in just 17 days as well?