I want to display the result as an image using DeepStream Python Binding

I installed the DeepStream SDK and the Python bindings, but when I run the sample, the inference result image is not displayed:
/opt/nvidia/deepstream/deepstream-4.0/sources/python/apps/deepstream-test1$ python3 deepstream_test_1.py test2.webm
Creating Pipeline

Creating Source

Creating H264Parser

Creating Decoder

Creating EGLSink

Playing file test2.webm
Adding elements to Pipeline

Linking elements in the Pipeline

Starting pipeline

Using winsys: x11
Opening in BLOCKING MODE
Creating LL OSD context new
0:00:03.781270136 6787 0x24fee530 INFO nvinfer gstnvinfer.cpp:519:gst_nvinfer_logger: NvDsInferContext[UID 1]:initialize(): Trying to create engine from model files
0:00:03.781532222 6787 0x24fee530 WARN nvinfer gstnvinfer.cpp:515:gst_nvinfer_logger: NvDsInferContext[UID 1]:generateTRTModel(): INT8 not supported by platform. Trying FP16 mode.

0:02:01.778206192 6787 0x24fee530 INFO nvinfer gstnvinfer.cpp:519:gst_nvinfer_logger: NvDsInferContext[UID 1]:generateTRTModel(): Storing the serialized cuda engine to file at /opt/nvidia/deepstream/deepstream-4.0/samples/models/Primary_Detector/resnet10.caffemodel_b1_fp16.engine
Error: gst-stream-error-quark: No valid frames found before end of stream (5): gstbaseparse.c(3603): gst_base_parse_loop (): /GstPipeline:pipeline0/GstH264Parse:h264-parser

Please note that the test1 sample only accepts an H.264 elementary stream as input. You can use the test3 sample instead, which uses uridecodebin, so any GStreamer-supported container format and any codec can be used as input.
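This is also why the pipeline fails with "No valid frames found before end of stream": test2.webm is a WebM/Matroska container, not a raw H.264 byte stream, so h264parse never finds a start code. A quick way to verify this yourself is to inspect the first bytes of the file — WebM files begin with the EBML magic 1A 45 DF A3, while an Annex-B H.264 elementary stream begins with a 00 00 (00) 01 start code. The helper functions below are a minimal illustrative sketch (they are not part of DeepStream or GStreamer):

```python
def looks_like_webm(header: bytes) -> bool:
    # WebM/Matroska files start with the EBML magic bytes 1A 45 DF A3.
    return header.startswith(b"\x1a\x45\xdf\xa3")

def looks_like_h264_elementary(header: bytes) -> bool:
    # Annex-B H.264 byte streams start with a 00 00 00 01 or 00 00 01 start code.
    return header.startswith(b"\x00\x00\x00\x01") or header.startswith(b"\x00\x00\x01")

def classify(path: str) -> str:
    # Read the first few bytes of the file and report the likely format.
    with open(path, "rb") as f:
        header = f.read(8)
    if looks_like_webm(header):
        return "webm container (use test3 / uridecodebin)"
    if looks_like_h264_elementary(header):
        return "h264 elementary stream (ok for test1)"
    return "unknown"
```

If you want to keep using test1, you could also extract/transcode the video track to a raw H.264 elementary stream first, e.g. with `ffmpeg -i test2.webm -c:v libx264 -an -f h264 test2.h264` (assuming ffmpeg is installed), and pass test2.h264 instead.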