Hello,
I am using a Jetson Nano with a fresh install of JetPack 4.2.1.
I took the pretrained SSD Inception v2 (2018) model from the TensorFlow model zoo repo.
I created an engine file with the TensorRT sample sampleUffSSD by serializing the engine:
nvinfer1::IHostMemory* trtModelStream = engine->serialize();
std::ofstream p("./ssd_inception_v2.engine", std::ios::binary);  // binary mode, or the engine file may be corrupted
p.write(reinterpret_cast<const char*>(trtModelStream->data()), trtModelStream->size());
p.close();
In the DeepStream sample "deepstream-test3", my dstest3_pgie_config.txt is as follows:
[property]
gpu-id=0
net-scale-factor=0.0078431372
offsets=127.5;127.5;127.5
model-color-format=0
model-engine-file=/usr/src/tensorrt/samples/sampleUffSSD/ssd_inception_v2_2018_pretrained.engine
#model-engine-file=/usr/src/tensorrt/samples/sampleUffSSD/ssd_inception_v2_aws_18874.engine
#labelfile-path=../../../../samples/models/Primary_Detector/ssd_coco_labels.txt
labelfile-path=../../../../samples/models/Primary_Detector/coco_labels.txt
batch-size=1
## 0=FP32, 1=INT8, 2=FP16 mode
network-mode=0
num-detected-classes=7
interval=0
gie-unique-id=1
is-classifier=0
output-blob-names=MarkOutput_0
custom-lib-path=/opt/nvidia/deepstream/deepstream-4.0/sources/objectDetector_SSD/nvdsinfer_custom_impl_ssd/libnvdsinfer_custom_impl_ssd.so
parse-bbox-func-name=NvDsInferParseCustomSSD
[class-attrs-all]
threshold=0.5
roi-top-offset=0
roi-bottom-offset=0
detected-min-w=0
detected-min-h=0
detected-max-w=0
detected-max-h=0
The problem is that when I run the sample with an mp4 video, the video plays at normal speed for a few seconds, then pauses, then resumes at normal speed. Some frames are being skipped.
Detections are okay.
To check the inference time on each frame, I took the time difference between tiler_src_pad_buffer_probe and tiler_sink_pad_buffer_probe (added by me).
I found a time difference of 14 to 25 milliseconds per frame.