Testing deepstream-test1 in Python

I am testing the deepstream-test1 app. The program stops at:

0:00:03.495019878 15326     0x2016bf90 INFO                 nvinfer gstnvinfer.cpp:619:gst_nvinfer_logger:<primary-inference> NvDsInferContext[UID 1]: Info from NvDsInferContextImpl::generateBackendContext() <nvdsinfer_context_impl.cpp:1805> [UID = 1]: Use deserialized engine model: /opt/nvidia/deepstream/deepstream-5.0/samples/models/numplate/resnet_18_fp16.etlt_b1_gpu0_fp16.engine
0:00:03.499872786 15326     0x2016bf90 INFO                 nvinfer gstnvinfer_impl.cpp:313:notifyLoadModelStatus:<primary-inference> [UID 1]: Load new model:dstest1_pgie_config.txt sucessfully
NvMMLiteOpen : Block : BlockType = 261 
NVMEDIA: Reading vendor.tegra.display-size : status: 6 
NvMMLiteBlockCreate : Block : BlockType = 261 

My config file is

[property]
gpu-id=0
net-scale-factor=0.0039215697906911373
tlt-model-key=NHRvZzAwbHFncTk0MXJ0YmwwbXB1bGxhbnU6MjYzNzc2MDctYzQ5MC00NjkxLThkODAtODM0NDc3ZTRhNTNh
tlt-encoded-model=../../../../samples/models/numplate/resnet_18_fp16.etlt
labelfile-path=../../../../samples/models/numplate/labels.txt
model-engine-file=../../../../samples/models/numplate/resnet_18_fp16.etlt_b1_gpu0_fp16.engine
input-dims=3;720;720;0
batch-size=1
model-color-format=0
uff-input-blob-name=input_1
process-mode=1
## 0=FP32, 1=INT8, 2=FP16 mode
network-mode=2
num-detected-classes=2
cluster-mode=1
interval=0
gie-unique-id=1
output-blob-names=output_bbox/BiasAdd;output_cov/Sigmoid

[class-attrs-all]
pre-cluster-threshold=0.2
eps=0.2
group-threshold=1
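
Note: DeepStream 5.0 warns that "input-dims" is deprecated in favor of "infer-dims" (the warning shows up in the pipeline log further down in this thread). A minimal sketch of the equivalent keys, assuming the trailing 0 in input-dims is the input order (0 = NCHW):

# equivalent to input-dims=3;720;720;0 (channels;height;width plus input order)
infer-dims=3;720;720
uff-input-order=0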

I tested both a .mov file and an .mp4 file; both have the issue.

Was “resnet_18_fp16.etlt_b1_gpu0_fp16.engine” generated successfully?

And can you provide more of the log? There is no error in the log you provided.
Can you also provide the environment information? Please use the following format:
• Hardware Platform (jetson xavier nx devkit)
• DeepStream Version
• JetPack Version (valid for Jetson only)
• TensorRT Version

Yes, the engine was generated successfully.

AGX Xavier. DeepStream 5.0. JetPack 4.4.
TensorRT 7.1.

The following is the only log I have; the program gets stuck there.

xavier@xavier-desktop:/opt/nvidia/deepstream/deepstream/sources/deepstream_python_apps/apps/deepstream-test1$ python3 deepstream_test_1.py /opt/nvidia/deepstream/deepstream-5.0/samples/streams/sample_720p.mp4
Creating Pipeline 
 
Creating Source 
 
Creating H264Parser 

Creating Decoder 

Creating EGLSink 

Playing file /opt/nvidia/deepstream/deepstream-5.0/samples/streams/sample_720p.mp4 
Warning: 'input-dims' parameter has been deprecated. Use 'infer-dims' instead.
Adding elements to Pipeline 

Linking elements in the Pipeline 

Starting pipeline 


Using winsys: x11 
Opening in BLOCKING MODE 
0:00:05.718812517  9154     0x3ecc5f90 INFO                 nvinfer gstnvinfer.cpp:619:gst_nvinfer_logger:<primary-inference> NvDsInferContext[UID 1]: Info from NvDsInferContextImpl::deserializeEngineAndBackend() <nvdsinfer_context_impl.cpp:1701> [UID = 1]: deserialized trt engine from :/opt/nvidia/deepstream/deepstream-5.0/samples/models/numplate/resnet_18_fp16.etlt_b1_gpu0_fp16.engine
INFO: [Implicit Engine Info]: layers num: 3
0   INPUT  kFLOAT input_1         3x720x720       
1   OUTPUT kFLOAT output_bbox/BiasAdd 8x45x45         
2   OUTPUT kFLOAT output_cov/Sigmoid 2x45x45         

0:00:05.719055633  9154     0x3ecc5f90 INFO                 nvinfer gstnvinfer.cpp:619:gst_nvinfer_logger:<primary-inference> NvDsInferContext[UID 1]: Info from NvDsInferContextImpl::generateBackendContext() <nvdsinfer_context_impl.cpp:1805> [UID = 1]: Use deserialized engine model: /opt/nvidia/deepstream/deepstream-5.0/samples/models/numplate/resnet_18_fp16.etlt_b1_gpu0_fp16.engine
0:00:05.734272487  9154     0x3ecc5f90 INFO                 nvinfer gstnvinfer_impl.cpp:313:notifyLoadModelStatus:<primary-inference> [UID 1]: Load new model:dstest1_pgie_config.txt sucessfully
NvMMLiteOpen : Block : BlockType = 261 
NVMEDIA: Reading vendor.tegra.display-size : status: 6 
NvMMLiteBlockCreate : Block : BlockType = 261


You can enable more GStreamer logging with this command:

export GST_DEBUG=3

Then run this Python program again (or set the threshold from inside the script, as sketched below).
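
A minimal sketch (my own addition, not part of the sample) of raising the log threshold from Python rather than from the shell:

import gi
gi.require_version('Gst', '1.0')
from gi.repository import Gst

Gst.init(None)
# GST_DEBUG=3 corresponds to the FIXME level: errors, warnings and fixmes are printed
Gst.debug_set_active(True)
Gst.debug_set_default_threshold(Gst.DebugLevel.FIXME)

These calls would go near the top of main() in deepstream_test_1.py, before the pipeline is built.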
Either way, the log may look like this:

0:00:03.061754869 17986     0x2cf9a0f0 WARN       codecparsers_h264 gsth264parser.c:2092:gst_h264_parser_parse_slice_hdr: couldn't find associated picture parameter set with id: 0
0:00:03.061816056 17986     0x2cf9a0f0 WARN       codecparsers_h264 gsth264parser.c:2092:gst_h264_parser_parse_slice_hdr: couldn't find associated picture parameter set with id: 0
0:00:03.061841529 17986     0x2cf9a0f0 WARN       codecparsers_h264 gsth264parser.c:2092:gst_h264_parser_parse_slice_hdr: couldn't find associated picture parameter set with id: 227

It seems the pipeline defined by this Python file does not support the .mp4 container; it expects a raw H.264 elementary stream.
Try a .h264 file instead (for example sample_720p.h264 in the same samples/streams directory), or demux the container first, as sketched below.
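
For reference, deepstream_test_1.py links filesrc ! h264parse ! nvv4l2decoder directly, which is why only a raw H.264 elementary stream works. Below is a minimal sketch (not the shipped sample) of how an .mp4 could feed the same chain by putting qtdemux in front of the parser; it assumes the file contains H.264 video, and the rest of the pipeline (nvstreammux, nvinfer, nvosd, sink) stays exactly as in the sample:

import sys
import gi
gi.require_version('Gst', '1.0')
from gi.repository import Gst

Gst.init(None)
pipeline = Gst.Pipeline()

source = Gst.ElementFactory.make("filesrc", "file-source")
demux = Gst.ElementFactory.make("qtdemux", "qt-demux")          # strips the MP4/MOV container
h264parser = Gst.ElementFactory.make("h264parse", "h264-parser")
decoder = Gst.ElementFactory.make("nvv4l2decoder", "nvv4l2-decoder")

source.set_property("location", sys.argv[1])
for elem in (source, demux, h264parser, decoder):
    pipeline.add(elem)

source.link(demux)
h264parser.link(decoder)

# qtdemux creates its video pad dynamically, so link it from a pad-added callback
def on_pad_added(demuxer, pad):
    caps = pad.get_current_caps() or pad.query_caps(None)
    if caps.get_structure(0).get_name() == "video/x-h264":
        pad.link(h264parser.get_static_pad("sink"))

demux.connect("pad-added", on_pad_added)
# the decoder's src pad is then linked to nvstreammux exactly as in deepstream_test_1.py

Alternatively, strip the container offline (for example with ffmpeg's h264_mp4toannexb bitstream filter) to get a .h264 file the unmodified sample can read, or start from deepstream-test3, which uses uridecodebin and already handles container formats.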

I was facing the same issue and your solution works. But then how can I process .mp4 videos with it?
Thanks in advance.

Hi shubham.shan09,

Please open a new topic for your issue. Thanks.

okay