NvDCF Re-ID ONNX model error

• Hardware Platform (GPU): RTX 3060
• DeepStream Version: 6.3
• TensorRT Version:
• NVIDIA GPU Driver Version (valid for GPU only): CUDA 12.1
• Issue Type: questions

1. I'm using NvDCF for tracking and have configured an ONNX Re-ID model, but I'm encountering the following error. What could be the cause?

```
WARNING: [TRT]: onnx2trt_utils.cpp:377: Your ONNX model has been generated with INT64 weights, while TensorRT does not natively support INT64. Attempting to cast down to INT32.
[NvMultiObjectTracker] Begin building engine for tracker ReID...
ERROR: [TRT]: 4: [network.cpp::validate::3100] Error Code 4: Internal Error (image: for dimension number 1 in profile 0 does not match network definition (got min=128, opt=128, max=128), expected min=opt=max=3).)
```


```yaml
BaseConfig:
  minDetectorConfidence: 0.1894  # If the confidence of a detector bbox is lower than this, then it won't be considered for tracking

TargetManagement:
  enableBboxUnClipping: 1       # In case the bbox is likely to be clipped by image border, unclip bbox
  preserveStreamUpdateOrder: 0  # When assigning new target ids, preserve input streams' order to keep target ids in a deterministic order over multiple runs
  maxTargetsPerStream: 150      # Max number of targets to track per stream. Recommended to set >10. Note: this value should account for the targets being tracked in shadow mode as well. Max value depends on the GPU memory capacity

  minIouDiff4NewTarget: 0.3686  # If the IOU between the newly detected object and any of the existing targets is higher than this threshold, this newly detected object will be discarded
  minTrackerConfidence: 0.1513  # If the confidence of an object tracker is lower than this on the fly, then it will be tracked in shadow mode. Valid Range: [0.0, 1.0]
  probationAge: 2               # If the target's age exceeds this, the target will be considered to be valid
  maxShadowTrackingAge: 42      # Max length of shadow tracking. If the shadowTrackingAge exceeds this limit, the tracker will be terminated
  earlyTerminationAge: 1        # If the shadowTrackingAge reaches this threshold while in TENTATIVE period, the target will be terminated prematurely

TrajectoryManagement:
  useUniqueID: 0    # Use 64-bit long Unique ID when assigning tracker ID. Default is [true]
  enableReAssoc: 1  # Enable Re-Assoc

  minMatchingScore4Overall: 0.6622         # min matching score for overall
  minTrackletMatchingScore: 0.2940         # min tracklet similarity score for re-assoc
  minMatchingScore4ReidSimilarity: 0.0771  # min reid similarity score for re-assoc

  matchingScoreWeight4TrackletSimilarity: 0.7981  # weight for tracklet similarity score
  matchingScoreWeight4ReidSimilarity: 0.3848      # weight for reid similarity score

  minTrajectoryLength4Projection: 34   # min trajectory length required to make projected trajectory
  prepLength4TrajectoryProjection: 58  # the length of the trajectory during which the state estimator is updated to make projections
  trajectoryProjectionLength: 33       # the length of the projected trajectory
  maxAngle4TrackletMatching: 67        # max angle difference for tracklet matching [degree]
  minSpeedSimilarity4TrackletMatching: 0.0574     # min speed similarity for tracklet matching
  minBboxSizeSimilarity4TrackletMatching: 0.1013  # min bbox size similarity for tracklet matching
  maxTrackletMatchingTimeSearchRange: 27          # the search space in time for max tracklet similarity
  trajectoryProjectionProcessNoiseScale: 0.0100   # trajectory projector's process noise scale w.r.t. state estimator
  trajectoryProjectionMeasurementNoiseScale: 100  # trajectory projector's measurement noise scale w.r.t. state estimator
  trackletSpacialSearchRegionScale: 0.0100        # the search region scale for peer tracklet

  reidExtractionInterval: 8  # frame interval to extract reid features per target

DataAssociator:
  dataAssociatorType: 0      # the type of data associator among { DEFAULT=0 }
  associationMatcherType: 1  # the type of matching algorithm among { GREEDY=0, CASCADED=1 }
  checkClassMatch: 1         # If checked, only the same-class objects are associated with each other. Default: true

  minMatchingScore4Overall: 0.0222           # Min total score
  minMatchingScore4SizeSimilarity: 0.3552    # Min bbox size similarity score
  minMatchingScore4Iou: 0.0548               # Min IOU score
  minMatchingScore4VisualSimilarity: 0.5043  # Min visual similarity score

  matchingScoreWeight4VisualSimilarity: 0.3951  # Weight for the visual similarity (in terms of correlation response ratio)
  matchingScoreWeight4SizeSimilarity: 0.6003    # Weight for the Size-similarity score
  matchingScoreWeight4Iou: 0.4033               # Weight for the IOU score

  tentativeDetectorConfidence: 0.1024    # If a detection's confidence is lower than this but higher than minDetectorConfidence, then it's considered as a tentative detection
  minMatchingScore4TentativeIou: 0.2852  # Min iou threshold to match targets and tentative detection

StateEstimator:
  stateEstimatorType: 1  # the type of state estimator among { DUMMY=0, SIMPLE=1, REGULAR=2 }

  processNoiseVar4Loc: 6810.8668          # Process noise variance for bbox center
  processNoiseVar4Size: 1541.8647         # Process noise variance for bbox size
  processNoiseVar4Vel: 1348.4874          # Process noise variance for velocity
  measurementNoiseVar4Detector: 100.0000  # Measurement noise variance for detector's detection
  measurementNoiseVar4Tracker: 293.3238   # Measurement noise variance for tracker's localization

VisualTracker:
  visualTrackerType: 1  # the type of visual tracker among { DUMMY=0, NvDCF=1 }

  useColorNames: 1        # Use ColorNames feature
  useHog: 1               # Use Histogram-of-Oriented-Gradient (HOG) feature
  featureImgSizeLevel: 3  # Size of a feature image. Valid range: {1, 2, 3, 4, 5}, from the smallest to the largest
  featureFocusOffsetFactor_y: -0.1054  # The offset for the center of hanning window relative to the feature height. The center of hanning window would move by (featureFocusOffsetFactor_y*featureMatSize.height) in vertical direction

  filterLr: 0.0767                # learning rate for DCF filter in exponential moving average. Valid Range: [0.0, 1.0]
  filterChannelWeightsLr: 0.0339  # learning rate for the channel weights among feature channels. Valid Range: [0.0, 1.0]
  gaussianSigma: 0.5687           # Standard deviation for Gaussian for desired response when creating DCF filter [pixels]

ReID:  # need customization
  reidType: 2
  batchSize: 100
  workspaceSize: 1000
  reidFeatureSize: 128
  reidHistorySize: 148
  inferDims: [128, 64, 3]
  networkMode: 0
  inputOrder: 1
  colorFormat: 0
  offsets: [0.0, 0.0, 0.0]
  netScaleFactor: 1.0000
  keepAspc: 1
  onnxFile: "/opt/nvidia/deepstream/deepstream/samples/models/Tracker/r50_ibn224.onnx"
```

Re-ID model: r50_ibn224.onnx (90.4 MB)

Here is the guide to using an ONNX Re-ID model: Gst-nvtracker — DeepStream 6.3 Release documentation

Please fill in the correct input/output parameters, such as inferDims, if you want to customize the Re-ID model.
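As an illustration of that advice, here is a hedged sketch of a `ReID` block adjusted for an NCHW ONNX network. The `[3, 224, 224]` input shape is only a guess from the `r50_ibn224` filename; `inferDims`, `inputOrder`, and `reidFeatureSize` must all be read off the actual model (for example with an ONNX graph inspector) before use:

```yaml
ReID:
  reidType: 2               # re-association using Re-ID features
  batchSize: 100
  workspaceSize: 1000
  reidFeatureSize: 128      # must equal the model's output feature length
  reidHistorySize: 148
  inferDims: [3, 224, 224]  # [C, H, W] -- must match the ONNX input exactly; 224 is assumed from the filename
  networkMode: 0            # FP32
  inputOrder: 0             # 0: NCHW, 1: NHWC -- ResNet-style ONNX exports are typically NCHW
  colorFormat: 0            # RGB
  offsets: [0.0, 0.0, 0.0]
  netScaleFactor: 1.0
  keepAspc: 1
  onnxFile: "/opt/nvidia/deepstream/deepstream/samples/models/Tracker/r50_ibn224.onnx"
```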

I modified the parameter settings, but it still fails to run.
Is there an official ONNX Re-id model available for testing?

There has been no update from you for a while, so we assume this is no longer an issue and are closing this topic. If you need further support, please open a new one. Thanks.

Please check the doc I shared above. It contains a step-by-step guide to running the ONNX Re-ID model.

This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.