Unable to fetch Tracker User Meta

Please provide complete information as applicable to your setup.
• DeepStream Version 6.2
• Issue Type (questions, new requirements, bugs): Bug
Hi, I am using deepstream-test2/ from the deepstream_python_apps Git repo.
def osd_sink_pad_buffer_probe(pad,info,u_data):
.
.
.
Inside the probe I was trying to access the tracker user meta:
    l_user = batch_meta.batch_user_meta_list
    # l_user = frame_meta.frame_user_meta_list
    print(l_user)
    while l_user is not None:
        try:
            # Note that l_user.data needs a cast to pyds.NvDsUserMeta
            # The casting is done by pyds.NvDsUserMeta.cast()
            # The casting also keeps ownership of the underlying memory
            # in the C code, so the Python garbage collector will leave
            # it alone
            user_meta = pyds.NvDsUserMeta.cast(l_user.data)
        except StopIteration:
            break
        print("User_meta", dir(user_meta.user_meta_data), user_meta.base_meta.meta_type)
        if user_meta and user_meta.base_meta.meta_type == pyds.NvDsMetaType.NVDS_TRACKER_PAST_FRAME_META:
            try:
                # Note that user_meta.user_meta_data needs a cast to pyds.NvDsTargetMiscDataBatch
                # The casting is done by pyds.NvDsTargetMiscDataBatch.cast()
                # The casting also keeps ownership of the underlying memory
                # in the C code, so the Python garbage collector will leave
                # it alone
                pPastDataBatch = pyds.NvDsTargetMiscDataBatch.cast(user_meta.user_meta_data)
            except StopIteration:
                break
            for miscDataStream in pyds.NvDsTargetMiscDataBatch.list(pPastDataBatch):
                print("streamId=", miscDataStream.streamID)
                print("surfaceStreamID=", miscDataStream.surfaceStreamID)
                for miscDataObj in pyds.NvDsTargetMiscDataStream.list(miscDataStream):
                    print("numobj=", miscDataObj.numObj)
                    print("uniqueId=", miscDataObj.uniqueId)
                    print("classId=", miscDataObj.classId)
                    print("objLabel=", miscDataObj.objLabel)
                    for miscDataFrame in pyds.NvDsTargetMiscDataObject.list(miscDataObj):
                        print('frameNum:', miscDataFrame.frameNum)
                        print('tBbox.left:', miscDataFrame.tBbox.left)
                        print('tBbox.width:', miscDataFrame.tBbox.width)
                        print('tBbox.top:', miscDataFrame.tBbox.top)
        # Move to the next user meta in the list
        try:
            l_user = l_user.next
        except StopIteration:
            break
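
For context, the probe above is attached to the OSD element's sink pad the same way deepstream-test2 does it. A minimal sketch, assuming nvosd is the nvdsosd element created earlier in the pipeline (names follow the sample, adjust to your setup):

import sys
import gi
gi.require_version("Gst", "1.0")
from gi.repository import Gst

# Get the sink pad of the OSD element and hang the buffer probe on it.
# The probe fires for every buffer reaching this pad, so all batch/frame/user
# meta attached upstream (pgie, nvtracker) is visible inside the probe.
osdsinkpad = nvosd.get_static_pad("sink")
if not osdsinkpad:
    sys.stderr.write(" Unable to get sink pad of nvosd \n")
osdsinkpad.add_probe(Gst.PadProbeType.BUFFER, osd_sink_pad_buffer_probe, 0)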
This is my infer config file:
[property]
gpu-id=0
net-scale-factor=0.0039215697906911373
model-file=../../../../samples/models/Primary_Detector/resnet10.caffemodel
proto-file=../../../../samples/models/Primary_Detector/resnet10.prototxt
model-engine-file=../../../../samples/models/Primary_Detector/resnet10.caffemodel_b1_gpu0_int8.engine
labelfile-path=../../../../samples/models/Primary_Detector/labels.txt
int8-calib-file=../../../../samples/models/Primary_Detector/cal_trt.bin
force-implicit-batch-dim=1
batch-size=1
network-mode=1
process-mode=1
model-color-format=0
num-detected-classes=4
interval=0
gie-unique-id=1
output-blob-names=conv2d_bbox;conv2d_cov/Sigmoid
#scaling-filter=0
#scaling-compute-hw=0

[class-attrs-all]
pre-cluster-threshold=0.2
eps=0.2
group-threshold=1
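
This [property] config is handed to the primary nvinfer element through its config-file-path property. A minimal sketch, assuming the file is saved as dstest2_pgie_config.txt as in the sample (the filename is an assumption, use your own path):

# pgie is the primary GIE (nvinfer) element in the pipeline
pgie = Gst.ElementFactory.make("nvinfer", "primary-inference")
pgie.set_property("config-file-path", "dstest2_pgie_config.txt")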

This is my tracker YAML config, followed by the [tracker] section of my app config:

%YAML:1.0
################################################################################
# SPDX-FileCopyrightText: Copyright (c) 2019-2021 NVIDIA CORPORATION & AFFILIATES. All rights reserved.
# SPDX-License-Identifier: Apache-2.0
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
################################################################################

BaseConfig:
  minDetectorConfidence: 0   # If the confidence of a detector bbox is lower than this, then it won't be considered for tracking

TargetManagement:
  enableBboxUnClipping: 1   # In case the bbox is likely to be clipped by image border, unclip bbox
  maxTargetsPerStream: 150  # Max number of targets to track per stream. Recommended to set >10. Note: this value should account for the targets being tracked in shadow mode as well. Max value depends on the GPU memory capacity

  # [Creation & Termination Policy]
  minIouDiff4NewTarget: 0.5   # If the IOU between the newly detected object and any of the existing targets is higher than this threshold, this newly detected object will be discarded.
  minTrackerConfidence: 0.2   # If the confidence of an object tracker is lower than this on the fly, then it will be tracked in shadow mode. Valid Range: [0.0, 1.0]
  probationAge: 3 # If the target's age exceeds this, the target will be considered to be valid.
  maxShadowTrackingAge: 30  # Max length of shadow tracking. If the shadowTrackingAge exceeds this limit, the tracker will be terminated.
  earlyTerminationAge: 1   # If the shadowTrackingAge reaches this threshold while in TENTATIVE period, the target will be terminated prematurely.

TrajectoryManagement:
  useUniqueID: 0   # Use 64-bit long Unique ID when assigning tracker ID. Default is [true]

DataAssociator:
  dataAssociatorType: 0 # the type of data associator among { DEFAULT= 0 }
  associationMatcherType: 0 # the type of matching algorithm among { GREEDY=0, GLOBAL=1 }
  checkClassMatch: 1  # If checked, only the same-class objects are associated with each other. Default: true

  # [Association Metric: Thresholds for valid candidates]
  minMatchingScore4Overall: 0.0   # Min total score
  minMatchingScore4SizeSimilarity: 0.6  # Min bbox size similarity score
  minMatchingScore4Iou: 0.0       # Min IOU score
  minMatchingScore4VisualSimilarity: 0.7  # Min visual similarity score

  # [Association Metric: Weights]
  matchingScoreWeight4VisualSimilarity: 0.6  # Weight for the visual similarity (in terms of correlation response ratio)
  matchingScoreWeight4SizeSimilarity: 0.0    # Weight for the Size-similarity score
  matchingScoreWeight4Iou: 0.4   # Weight for the IOU score

StateEstimator:
  stateEstimatorType: 1  # the type of state estimator among { DUMMY=0, SIMPLE=1, REGULAR=2 }

  # [Dynamics Modeling]
  processNoiseVar4Loc: 2.0    # Process noise variance for bbox center
  processNoiseVar4Size: 1.0   # Process noise variance for bbox size
  processNoiseVar4Vel: 0.1    # Process noise variance for velocity
  measurementNoiseVar4Detector: 4.0    # Measurement noise variance for detector's detection
  measurementNoiseVar4Tracker: 16.0    # Measurement noise variance for tracker's localization

VisualTracker:
  visualTrackerType: 1 # the type of visual tracker among { DUMMY=0, NvDCF=1 }

  # [NvDCF: Feature Extraction]
  useColorNames: 1     # Use ColorNames feature
  useHog: 0            # Use Histogram-of-Oriented-Gradient (HOG) feature
  featureImgSizeLevel: 2  # Size of a feature image. Valid range: {1, 2, 3, 4, 5}, from the smallest to the largest
  featureFocusOffsetFactor_y: -0.2 # The offset for the center of hanning window relative to the feature height. The center of hanning window would move by (featureFocusOffsetFactor_y*featureMatSize.height) in vertical direction

  # [NvDCF: Correlation Filter]
  filterLr: 0.075 # learning rate for DCF filter in exponential moving average. Valid Range: [0.0, 1.0]
  filterChannelWeightsLr: 0.1 # learning rate for the channel weights among feature channels. Valid Range: [0.0, 1.0]
  gaussianSigma: 0.75 # Standard deviation for Gaussian for desired response when creating DCF filter [pixels]

[tracker]
tracker-width=640
tracker-height=384
gpu-id=0
ll-lib-file=/opt/nvidia/deepstream/deepstream/lib/libnvds_nvmultiobjecttracker.so
ll-config-file=tracker_config.yml
enable-past-frame=1
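
The [tracker] section above is read in the app's main code and each key is applied to the nvtracker element. A minimal sketch of that parsing loop, following the configparser pattern used in deepstream_test_2.py (tracker is assumed to be the nvtracker element, and the filename is the sample's):

import configparser

config = configparser.ConfigParser()
config.read("dstest2_tracker_config.txt")

for key in config["tracker"]:
    if key == "tracker-width":
        tracker.set_property("tracker-width", config.getint("tracker", key))
    if key == "tracker-height":
        tracker.set_property("tracker-height", config.getint("tracker", key))
    if key == "gpu-id":
        tracker.set_property("gpu-id", config.getint("tracker", key))
    if key == "ll-lib-file":
        tracker.set_property("ll-lib-file", config.get("tracker", key))
    if key == "ll-config-file":
        tracker.set_property("ll-config-file", config.get("tracker", key))
    if key == "enable-past-frame":
        # If this key is never applied, the property keeps its default and no
        # NVDS_TRACKER_PAST_FRAME_META is attached to the batch meta
        tracker.set_property("enable-past-frame", config.getint("tracker", key))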

Could anyone help me figure out what I could possibly be missing?

Can you share a sample project that reproduces the problem? There is no problem with the configuration file or the code.

While parsing the config in the main code, enable-past-frame was not being set; I fixed it.
Thanks for replying.
