Hello NVIDIA DeepStream Community,
I am currently working on the deepstream_imagedata-multistream test application using the DeepStream 6.4 Docker image (deepstream:6.4-triton-multiarch) on a dGPU setup (not Jetson).
Issue Description:
When running the application with RTSP streams, it works correctly under normal conditions. If a network issue (or any other problem) causes one of the RTSP streams to disconnect momentarily, the application handles the disconnection gracefully and continues processing the remaining streams.
The problem arises when the disconnected RTSP stream comes back online. At this point, the application encounters an error and exits. The specific error messages are as follows:
Warning: gst-resource-error-quark: Could not read from resource. (9): ../gst/rtsp/gstrtspsrc.c(5832): gst_rtspsrc_loop_udp (): /GstPipeline:pipeline0/GstBin:source-bin-01/GstURIDecodeBin:uri-decode-bin/GstRTSPSrc:source:
Unhandled return value -7.
Error: gst-resource-error-quark: Could not read from resource. (9): ../gst/rtsp/gstrtspsrc.c(5900): gst_rtspsrc_loop_udp (): /GstPipeline:pipeline0/GstBin:source-bin-01/GstURIDecodeBin:uri-decode-bin/GstRTSPSrc:source:
Could not receive message. (System error)
Exiting app
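For context on what I would like to achieve instead: a common workaround pattern (similar in spirit to the runtime source add/delete reference app) is to inspect the error message in the bus callback and, when it originates from one specific source bin, reset only that bin rather than quitting the main loop. Below is a minimal, hypothetical sketch of just the dispatch logic (no GStreamer imports); the `failing_source_index` helper and the reset behavior are my own assumptions, not part of the sample app:

```python
import re
from typing import Optional

# Matches the source bin name inside a GStreamer object path such as
# "/GstPipeline:pipeline0/GstBin:source-bin-01/GstURIDecodeBin:uri-decode-bin/..."
_SOURCE_BIN_RE = re.compile(r"GstBin:source-bin-(\d+)")

def failing_source_index(message_src_path: str) -> Optional[int]:
    """Return the index of the source bin an error came from, or None."""
    m = _SOURCE_BIN_RE.search(message_src_path)
    return int(m.group(1)) if m else None

def handle_error(message_src_path, reset_source, quit_loop):
    """Reset only the failing RTSP source; quit only on pipeline-wide errors."""
    idx = failing_source_index(message_src_path)
    if idx is not None:
        reset_source(idx)   # e.g. set that bin to NULL, then back to PLAYING
    else:
        quit_loop()         # error is not tied to a single source bin
```

In the real bus callback, the path would come from `message.src.get_path_string()`, and `reset_source` would transition the failing bin to `NULL` and back to `PLAYING` after a short delay. This is only a sketch of the idea, not a tested fix.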
Additionally, the GDB session is as follows (the app is running with two RTSP streams; one of them is disconnected momentarily, and when it comes back online the app errors out):
pstream_python_apps/apps/deepstream-imagedata-multistream# gdb --args python3 deepstream_imagedata-multistream.py 'rtsp://camera_stream_url1' 'rtsp://camera_stream_url2' frames
GNU gdb (Ubuntu 12.1-0ubuntu1~22.04) 12.1
Reading symbols from python3...
(No debugging symbols found in python3)
(gdb) run
Starting program: /usr/bin/python3 deepstream_imagedata-multistream.py rtsp://camera_stream_url1 rtsp://camera_stream_url2 frames
[Thread debugging using libthread_db enabled]
Using host libthread_db library "/usr/lib/x86_64-linux-gnu/libthread_db.so.1".
[New Thread 0x7ffff29ff640 (LWP 239)]
[... 18 more "[New Thread ...]" lines omitted for brevity ...]
Frames will be saved in frames
Creating Pipeline
Creating streamux
[New Thread 0x7fffaffff640 (LWP 258)]
Creating source_bin 0
uri_name rtsp://<user>:<password>@192.168.10.21:554/cam/realmonitor?channel=1&subtype=0
Creating source bin
source-bin-00
/opt/nvidia/deepstream/deepstream-6.4/sources/deepstream_python_apps/apps/deepstream-imagedata-multistream/deepstream_imagedata-multistream.py:312: DeprecationWarning: Gst.Element.get_request_pad is deprecated
sinkpad = streammux.get_request_pad(padname)
Creating source_bin 1
uri_name rtsp://<user>:<password>@192.168.10.30:554/cam/realmonitor?channel=1&subtype=0
Creating source bin
source-bin-01
Creating Pgie
Creating nvvidconv1
Creating filter1
Creating tiler
Creating nvvidconv
Creating nvosd
[Detaching after vfork from child process 259]
Creating EGLSink
Atleast one of the sources is live
[New Thread 0x7fff6e5ff640 (LWP 260)]
WARNING: Overriding infer-config batch-size 30 with number of sources 2
Adding elements to Pipeline
Linking elements in the Pipeline
Now playing...
1 : rtsp://<user>:<password>@192.168.10.21:554/cam/realmonitor?channel=1&subtype=0
2 : rtsp://<user>:<password>@192.168.10.30:554/cam/realmonitor?channel=1&subtype=0
Starting pipeline
[New Thread 0x7fff4d5ff640 (LWP 261)]
[New Thread 0x7fff4cdfe640 (LWP 262)]
[New Thread 0x7fff46fde640 (LWP 263)]
WARNING: [TRT]: Using an engine plan file across different models of devices is not recommended and is likely to affect performance or even cause errors.
0:00:03.672362819 230 0x555556e95d60 INFO nvinfer gstnvinfer.cpp:682:gst_nvinfer_logger:<primary-inference> NvDsInferContext[UID 1]: Info from NvDsInferContextImpl::deserializeEngineAndBackend() <nvdsinfer_context_impl.cpp:2092> [UID = 1]: deserialized trt engine from :/opt/nvidia/deepstream/deepstream-6.4/samples/models/Primary_Detector/resnet18_trafficcamnet.etlt_b1_gpu0_int8.engine
INFO: ../nvdsinfer/nvdsinfer_model_builder.cpp:610 [Implicit Engine Info]: layers num: 3
0 INPUT kFLOAT input_1 3x544x960
1 OUTPUT kFLOAT output_bbox/BiasAdd 16x34x60
2 OUTPUT kFLOAT output_cov/Sigmoid 4x34x60
0:00:03.767187207 230 0x555556e95d60 WARN nvinfer gstnvinfer.cpp:679:gst_nvinfer_logger:<primary-inference> NvDsInferContext[UID 1]: Warning from NvDsInferContextImpl::checkBackendParams() <nvdsinfer_context_impl.cpp:2024> [UID = 1]: Backend has maxBatchSize 1 whereas 2 has been requested
0:00:03.768700265 230 0x555556e95d60 WARN nvinfer gstnvinfer.cpp:679:gst_nvinfer_logger:<primary-inference> NvDsInferContext[UID 1]: Warning from NvDsInferContextImpl::generateBackendContext() <nvdsinfer_context_impl.cpp:2201> [UID = 1]: deserialized backend context :/opt/nvidia/deepstream/deepstream-6.4/samples/models/Primary_Detector/resnet18_trafficcamnet.etlt_b1_gpu0_int8.engine failed to match config params, trying rebuild
0:00:03.769983957 230 0x555556e95d60 INFO nvinfer gstnvinfer.cpp:682:gst_nvinfer_logger:<primary-inference> NvDsInferContext[UID 1]: Info from NvDsInferContextImpl::buildModel() <nvdsinfer_context_impl.cpp:2106> [UID = 1]: Trying to create engine from model files
WARNING: [TRT]: The implicit batch dimension mode has been deprecated. Please create the network with NetworkDefinitionCreationFlag::kEXPLICIT_BATCH flag whenever possible.
WARNING: [TRT]: Missing scale and zero-point for tensor output_bbox/bias, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[... identical "WARNING: [TRT]: Missing scale and zero-point for tensor ..." lines repeated for the remaining network tensors; omitted for brevity ...]
WARNING: [TRT]: Missing scale and zero-point for tensor block_3a_bn_1/Reshape/shape, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
WARNING: [TRT]: Missing scale and zero-point for tensor block_3a_conv_2/kernel, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
WARNING: [TRT]: Missing scale and zero-point for tensor block_3a_conv_2/bias, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
WARNING: [TRT]: Missing scale and zero-point for tensor block_3a_bn_2/moving_variance, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
WARNING: [TRT]: Missing scale and zero-point for tensor block_3a_bn_2/Reshape_1/shape, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
WARNING: [TRT]: Missing scale and zero-point for tensor block_3a_bn_2/batchnorm/add/y, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
WARNING: [TRT]: Missing scale and zero-point for tensor block_3a_bn_2/gamma, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
WARNING: [TRT]: Missing scale and zero-point for tensor block_3a_bn_2/Reshape_3/shape, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
WARNING: [TRT]: Missing scale and zero-point for tensor block_3a_bn_2/beta, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
WARNING: [TRT]: Missing scale and zero-point for tensor block_3a_bn_2/Reshape_2/shape, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
WARNING: [TRT]: Missing scale and zero-point for tensor block_3a_bn_2/moving_mean, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
WARNING: [TRT]: Missing scale and zero-point for tensor block_3a_bn_2/Reshape/shape, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
WARNING: [TRT]: Missing scale and zero-point for tensor block_3a_conv_shortcut/kernel, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
WARNING: [TRT]: Missing scale and zero-point for tensor block_3a_conv_shortcut/bias, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
WARNING: [TRT]: Missing scale and zero-point for tensor block_3a_bn_shortcut/moving_variance, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
WARNING: [TRT]: Missing scale and zero-point for tensor block_3a_bn_shortcut/Reshape_1/shape, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
WARNING: [TRT]: Missing scale and zero-point for tensor block_3a_bn_shortcut/batchnorm/add/y, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
WARNING: [TRT]: Missing scale and zero-point for tensor block_3a_bn_shortcut/gamma, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
WARNING: [TRT]: Missing scale and zero-point for tensor block_3a_bn_shortcut/Reshape_3/shape, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
WARNING: [TRT]: Missing scale and zero-point for tensor block_3a_bn_shortcut/beta, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
WARNING: [TRT]: Missing scale and zero-point for tensor block_3a_bn_shortcut/Reshape_2/shape, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
WARNING: [TRT]: Missing scale and zero-point for tensor block_3a_bn_shortcut/moving_mean, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
WARNING: [TRT]: Missing scale and zero-point for tensor block_3a_bn_shortcut/Reshape/shape, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
WARNING: [TRT]: Missing scale and zero-point for tensor block_3b_conv_1/kernel, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
WARNING: [TRT]: Missing scale and zero-point for tensor block_3b_conv_1/bias, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
WARNING: [TRT]: Missing scale and zero-point for tensor block_3b_bn_1/moving_variance, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
WARNING: [TRT]: Missing scale and zero-point for tensor block_3b_bn_1/Reshape_1/shape, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
WARNING: [TRT]: Missing scale and zero-point for tensor block_3b_bn_1/batchnorm/add/y, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
WARNING: [TRT]: Missing scale and zero-point for tensor block_3b_bn_1/gamma, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
WARNING: [TRT]: Missing scale and zero-point for tensor block_3b_bn_1/Reshape_3/shape, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
WARNING: [TRT]: Missing scale and zero-point for tensor block_3b_bn_1/beta, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
WARNING: [TRT]: Missing scale and zero-point for tensor block_3b_bn_1/Reshape_2/shape, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
WARNING: [TRT]: Missing scale and zero-point for tensor block_3b_bn_1/moving_mean, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
WARNING: [TRT]: Missing scale and zero-point for tensor block_3b_bn_1/Reshape/shape, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
WARNING: [TRT]: Missing scale and zero-point for tensor block_3b_conv_2/kernel, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
WARNING: [TRT]: Missing scale and zero-point for tensor block_3b_conv_2/bias, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
WARNING: [TRT]: Missing scale and zero-point for tensor block_3b_bn_2/moving_variance, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
WARNING: [TRT]: Missing scale and zero-point for tensor block_3b_bn_2/Reshape_1/shape, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
WARNING: [TRT]: Missing scale and zero-point for tensor block_3b_bn_2/batchnorm/add/y, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
WARNING: [TRT]: Missing scale and zero-point for tensor block_3b_bn_2/gamma, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
WARNING: [TRT]: Missing scale and zero-point for tensor block_3b_bn_2/Reshape_3/shape, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
WARNING: [TRT]: Missing scale and zero-point for tensor block_3b_bn_2/beta, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
WARNING: [TRT]: Missing scale and zero-point for tensor block_3b_bn_2/Reshape_2/shape, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
WARNING: [TRT]: Missing scale and zero-point for tensor block_3b_bn_2/moving_mean, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
WARNING: [TRT]: Missing scale and zero-point for tensor block_3b_bn_2/Reshape/shape, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
WARNING: [TRT]: Missing scale and zero-point for tensor block_3b_conv_shortcut/kernel, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
WARNING: [TRT]: Missing scale and zero-point for tensor block_3b_conv_shortcut/bias, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
WARNING: [TRT]: Missing scale and zero-point for tensor block_3b_bn_shortcut/moving_variance, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
WARNING: [TRT]: Missing scale and zero-point for tensor block_3b_bn_shortcut/Reshape_1/shape, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
WARNING: [TRT]: Missing scale and zero-point for tensor block_3b_bn_shortcut/batchnorm/add/y, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
WARNING: [TRT]: Missing scale and zero-point for tensor block_3b_bn_shortcut/gamma, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
WARNING: [TRT]: Missing scale and zero-point for tensor block_3b_bn_shortcut/Reshape_3/shape, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
WARNING: [TRT]: Missing scale and zero-point for tensor block_3b_bn_shortcut/beta, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
WARNING: [TRT]: Missing scale and zero-point for tensor block_3b_bn_shortcut/Reshape_2/shape, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
WARNING: [TRT]: Missing scale and zero-point for tensor block_3b_bn_shortcut/moving_mean, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
WARNING: [TRT]: Missing scale and zero-point for tensor block_3b_bn_shortcut/Reshape/shape, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
WARNING: [TRT]: Missing scale and zero-point for tensor block_4a_conv_1/kernel, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
WARNING: [TRT]: Missing scale and zero-point for tensor block_4a_conv_1/bias, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
WARNING: [TRT]: Missing scale and zero-point for tensor block_4a_bn_1/moving_variance, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
WARNING: [TRT]: Missing scale and zero-point for tensor block_4a_bn_1/Reshape_1/shape, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
WARNING: [TRT]: Missing scale and zero-point for tensor block_4a_bn_1/batchnorm/add/y, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
WARNING: [TRT]: Missing scale and zero-point for tensor block_4a_bn_1/gamma, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
WARNING: [TRT]: Missing scale and zero-point for tensor block_4a_bn_1/Reshape_3/shape, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
WARNING: [TRT]: Missing scale and zero-point for tensor block_4a_bn_1/beta, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
WARNING: [TRT]: Missing scale and zero-point for tensor block_4a_bn_1/Reshape_2/shape, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
WARNING: [TRT]: Missing scale and zero-point for tensor block_4a_bn_1/moving_mean, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
WARNING: [TRT]: Missing scale and zero-point for tensor block_4a_bn_1/Reshape/shape, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
WARNING: [TRT]: Missing scale and zero-point for tensor block_4a_conv_2/kernel, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
WARNING: [TRT]: Missing scale and zero-point for tensor block_4a_conv_2/bias, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
WARNING: [TRT]: Missing scale and zero-point for tensor block_4a_bn_2/moving_variance, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
WARNING: [TRT]: Missing scale and zero-point for tensor block_4a_bn_2/Reshape_1/shape, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
WARNING: [TRT]: Missing scale and zero-point for tensor block_4a_bn_2/batchnorm/add/y, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
WARNING: [TRT]: Missing scale and zero-point for tensor block_4a_bn_2/gamma, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
WARNING: [TRT]: Missing scale and zero-point for tensor block_4a_bn_2/Reshape_3/shape, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
WARNING: [TRT]: Missing scale and zero-point for tensor block_4a_bn_2/beta, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
WARNING: [TRT]: Missing scale and zero-point for tensor block_4a_bn_2/Reshape_2/shape, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
WARNING: [TRT]: Missing scale and zero-point for tensor block_4a_bn_2/moving_mean, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
WARNING: [TRT]: Missing scale and zero-point for tensor block_4a_bn_2/Reshape/shape, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
WARNING: [TRT]: Missing scale and zero-point for tensor block_4a_conv_shortcut/kernel, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
WARNING: [TRT]: Missing scale and zero-point for tensor block_4a_conv_shortcut/bias, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
WARNING: [TRT]: Missing scale and zero-point for tensor block_4a_bn_shortcut/moving_variance, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
WARNING: [TRT]: Missing scale and zero-point for tensor block_4a_bn_shortcut/Reshape_1/shape, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
WARNING: [TRT]: Missing scale and zero-point for tensor block_4a_bn_shortcut/batchnorm/add/y, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
WARNING: [TRT]: Missing scale and zero-point for tensor block_4a_bn_shortcut/gamma, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
WARNING: [TRT]: Missing scale and zero-point for tensor block_4a_bn_shortcut/Reshape_3/shape, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
WARNING: [TRT]: Missing scale and zero-point for tensor block_4a_bn_shortcut/beta, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
WARNING: [TRT]: Missing scale and zero-point for tensor block_4a_bn_shortcut/Reshape_2/shape, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
WARNING: [TRT]: Missing scale and zero-point for tensor block_4a_bn_shortcut/moving_mean, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
WARNING: [TRT]: Missing scale and zero-point for tensor block_4a_bn_shortcut/Reshape/shape, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
WARNING: [TRT]: Missing scale and zero-point for tensor block_4b_conv_1/kernel, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
WARNING: [TRT]: Missing scale and zero-point for tensor block_4b_conv_1/bias, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
WARNING: [TRT]: Missing scale and zero-point for tensor block_4b_bn_1/moving_variance, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
WARNING: [TRT]: Missing scale and zero-point for tensor block_4b_bn_1/Reshape_1/shape, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
WARNING: [TRT]: Missing scale and zero-point for tensor block_4b_bn_1/batchnorm/add/y, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
WARNING: [TRT]: Missing scale and zero-point for tensor block_4b_bn_1/gamma, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
WARNING: [TRT]: Missing scale and zero-point for tensor block_4b_bn_1/Reshape_3/shape, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
WARNING: [TRT]: Missing scale and zero-point for tensor block_4b_bn_1/beta, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
WARNING: [TRT]: Missing scale and zero-point for tensor block_4b_bn_1/Reshape_2/shape, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
WARNING: [TRT]: Missing scale and zero-point for tensor block_4b_bn_1/moving_mean, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
WARNING: [TRT]: Missing scale and zero-point for tensor block_4b_bn_1/Reshape/shape, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
WARNING: [TRT]: Missing scale and zero-point for tensor block_4b_conv_2/kernel, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
WARNING: [TRT]: Missing scale and zero-point for tensor block_4b_conv_2/bias, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
WARNING: [TRT]: Missing scale and zero-point for tensor block_4b_bn_2/moving_variance, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
WARNING: [TRT]: Missing scale and zero-point for tensor block_4b_bn_2/Reshape_1/shape, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
WARNING: [TRT]: Missing scale and zero-point for tensor block_4b_bn_2/batchnorm/add/y, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
WARNING: [TRT]: Missing scale and zero-point for tensor block_4b_bn_2/gamma, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
WARNING: [TRT]: Missing scale and zero-point for tensor block_4b_bn_2/Reshape_3/shape, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
WARNING: [TRT]: Missing scale and zero-point for tensor block_4b_bn_2/beta, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
WARNING: [TRT]: Missing scale and zero-point for tensor block_4b_bn_2/Reshape_2/shape, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
WARNING: [TRT]: Missing scale and zero-point for tensor block_4b_bn_2/moving_mean, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
WARNING: [TRT]: Missing scale and zero-point for tensor block_4b_bn_2/Reshape/shape, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
WARNING: [TRT]: Missing scale and zero-point for tensor block_4b_conv_shortcut/kernel, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
WARNING: [TRT]: Missing scale and zero-point for tensor block_4b_conv_shortcut/bias, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
WARNING: [TRT]: Missing scale and zero-point for tensor block_4b_bn_shortcut/moving_variance, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
WARNING: [TRT]: Missing scale and zero-point for tensor block_4b_bn_shortcut/Reshape_1/shape, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
WARNING: [TRT]: Missing scale and zero-point for tensor block_4b_bn_shortcut/batchnorm/add/y, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
WARNING: [TRT]: Missing scale and zero-point for tensor block_4b_bn_shortcut/gamma, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
WARNING: [TRT]: Missing scale and zero-point for tensor block_4b_bn_shortcut/Reshape_3/shape, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
WARNING: [TRT]: Missing scale and zero-point for tensor block_4b_bn_shortcut/beta, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
WARNING: [TRT]: Missing scale and zero-point for tensor block_4b_bn_shortcut/Reshape_2/shape, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
WARNING: [TRT]: Missing scale and zero-point for tensor block_4b_bn_shortcut/moving_mean, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
WARNING: [TRT]: Missing scale and zero-point for tensor block_4b_bn_shortcut/Reshape/shape, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
WARNING: [TRT]: Missing scale and zero-point for tensor output_bbox/kernel, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
WARNING: [TRT]: Missing scale and zero-point for tensor output_cov/kernel, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
WARNING: [TRT]: Missing scale and zero-point for tensor output_cov/bias, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
0:00:28.551797863 230 0x555556e95d60 INFO nvinfer gstnvinfer.cpp:682:gst_nvinfer_logger:<primary-inference> NvDsInferContext[UID 1]: Info from NvDsInferContextImpl::buildModel() <nvdsinfer_context_impl.cpp:2138> [UID = 1]: serialize cuda engine to file: /opt/nvidia/deepstream/deepstream-6.4/samples/models/Primary_Detector/resnet18_trafficcamnet.etlt_b2_gpu0_int8.engine successfully
INFO: ../nvdsinfer/nvdsinfer_model_builder.cpp:610 [Implicit Engine Info]: layers num: 3
0 INPUT kFLOAT input_1 3x544x960
1 OUTPUT kFLOAT output_bbox/BiasAdd 16x34x60
2 OUTPUT kFLOAT output_cov/Sigmoid 4x34x60
[New Thread 0x7fff31fff640 (LWP 264)]
[New Thread 0x7fff317fe640 (LWP 265)]
[New Thread 0x7fff30ffd640 (LWP 266)]
0:00:28.653660332 230 0x555556e95d60 INFO nvinfer gstnvinfer_impl.cpp:328:notifyLoadModelStatus:<primary-inference> [UID 1]: Load new model:dstest_imagedata_config.txt sucessfully
Decodebin child added: source
[New Thread 0x7fff25903640 (LWP 267)]
Decodebin child added: source
[New Thread 0x7fff25102640 (LWP 268)]
**PERF: {'stream0': 0.0, 'stream1': 0.0}
[New Thread 0x7fff24901640 (LWP 269)]
[New Thread 0x7fff0ffff640 (LWP 270)]
[... GDB "[New Thread ...]" / "[Thread ... exited]" messages omitted for brevity ...]
[Detaching after fork from child process 271]
[... more GDB "[New Thread ...]" messages omitted for brevity ...]
Decodebin child added: decodebin0
Decodebin child added: rtppcmadepay0
Decodebin child added: alawdec0
In cb_newpad
[... GDB "[New Thread ...]" messages omitted for brevity ...]
Decodebin child added: decodebin1
Decodebin child added: rtppcmadepay1
Decodebin child added: alawdec1
In cb_newpad
[... GDB "[New Thread ...]" / "[Thread ... exited]" messages omitted for brevity ...]
Decodebin child added: decodebin2
Decodebin child added: rtph265depay0
Decodebin child added: h265parse0
Decodebin child added: capsfilter0
Decodebin child added: nvv4l2decoder0
[New Thread 0x7fffca1ee640 (LWP 299)]
Decodebin child added: decodebin3
Decodebin child added: rtph264depay0
Decodebin child added: h264parse0
Decodebin child added: capsfilter1
Decodebin child added: nvv4l2decoder1
[New Thread 0x7ffee4a84640 (LWP 300)]
Warning: Color primaries 5 not present and will be treated BT.601
In cb_newpad
[New Thread 0x7ffe8dfff640 (LWP 301)]
[New Thread 0x7ffe8d7fe640 (LWP 302)]
[New Thread 0x7ffe8cffd640 (LWP 303)]
In cb_newpad
[... GDB "[New Thread ...]" messages omitted for brevity ...]
Frame Number= 0 Number of Objects= 9 Vehicle_count= 9 Person_count= 0
Frame Number= 0 Number of Objects= 1 Vehicle_count= 1 Person_count= 0
[... GDB "[New Thread ...]" / "[Thread ... exited]" messages omitted for brevity ...]
Frame Number= 1 Number of Objects= 8 Vehicle_count= 8 Person_count= 0
Frame Number= 1 Number of Objects= 1 Vehicle_count= 1 Person_count= 0
Frame Number= 2 Number of Objects= 11 Vehicle_count= 11 Person_count= 0
Frame Number= 2 Number of Objects= 2 Vehicle_count= 2 Person_count= 0
Frame Number= 3 Number of Objects= 2 Vehicle_count= 2 Person_count= 0
**PERF: {'stream0': 29.17, 'stream1': 44.64}
Frame Number= 101 Number of Objects= 3 Vehicle_count= 3 Person_count= 0
Frame Number= 102 Number of Objects= 2 Vehicle_count= 2 Person_count= 0
Frame Number= 67 Number of Objects= 10 Vehicle_count= 8 Person_count= 2
Frame Number= 103 Number of Objects= 2 Vehicle_count= 2 Person_count= 0
Frame Number= 104 Number of Objects= 1 Vehicle_count= 1 Person_count= 0
Frame Number= 105 Number of Objects= 1 Vehicle_count= 0 Person_count= 1
Frame Number= 68 Number of Objects= 11 Vehicle_count= 9 Person_count= 2
Frame Number= 106 Number of Objects= 2 Vehicle_count= 1 Person_count= 1
Frame Number= 107 Number of Objects= 4 Vehicle_count= 2 Person_count= 2
Frame Number= 69 Number of Objects= 10 Vehicle_count= 9 Person_count= 1
Frame Number= 108 Number of Objects= 3 Vehicle_count= 2 Person_count= 1
Frame Number= 109 Number of Objects= 5 Vehicle_count= 4 Person_count= 1
Frame Number= 110 Number of Objects= 3 Vehicle_count= 3 Person_count= 0
Frame Number= 70 Number of Objects= 12 Vehicle_count= 9 Person_count= 2
**PERF: {'stream0': 15.0, 'stream1': 26.99}
Frame Number= 236 Number of Objects= 1 Vehicle_count= 1 Person_count= 0
Frame Number= 142 Number of Objects= 10 Vehicle_count= 9 Person_count= 1
Frame Number= 237 Number of Objects= 0 Vehicle_count= 0 Person_count= 0
Frame Number= 238 Number of Objects= 0 Vehicle_count= 0 Person_count= 0
Frame Number= 143 Number of Objects= 10 Vehicle_count= 9 Person_count= 1
Frame Number= 239 Number of Objects= 0 Vehicle_count= 0 Person_count= 0
Frame Number= 240 Number of Objects= 1 Vehicle_count= 1 Person_count= 0
Frame Number= 144 Number of Objects= 9 Vehicle_count= 9 Person_count= 0
Frame Number= 241 Number of Objects= 1 Vehicle_count= 1 Person_count= 0
Frame Number= 242 Number of Objects= 0 Vehicle_count= 0 Person_count= 0
Frame Number= 145 Number of Objects= 8 Vehicle_count= 8 Person_count= 0
Frame Number= 243 Number of Objects= 0 Vehicle_count= 0 Person_count= 0
Frame Number= 244 Number of Objects= 1 Vehicle_count= 1 Person_count= 0
Frame Number= 146 Number of Objects= 9 Vehicle_count= 9 Person_count= 0
**PERF: {'stream0': 14.6, 'stream1': 18.19}
Frame Number= 215 Number of Objects= 18 Vehicle_count= 10 Person_count= 5
Frame Number= 216 Number of Objects= 14 Vehicle_count= 10 Person_count= 3
Frame Number= 217 Number of Objects= 12 Vehicle_count= 8 Person_count= 2
Frame Number= 218 Number of Objects= 13 Vehicle_count= 9 Person_count= 2
Frame Number= 219 Number of Objects= 13 Vehicle_count= 9 Person_count= 2
Frame Number= 220 Number of Objects= 14 Vehicle_count= 9 Person_count= 3
Frame Number= 221 Number of Objects= 15 Vehicle_count= 10 Person_count= 3
Frame Number= 222 Number of Objects= 16 Vehicle_count= 10 Person_count= 4
Frame Number= 223 Number of Objects= 14 Vehicle_count= 8 Person_count= 3
Frame Number= 224 Number of Objects= 16 Vehicle_count= 10 Person_count= 4
Frame Number= 225 Number of Objects= 14 Vehicle_count= 9 Person_count= 3
Frame Number= 226 Number of Objects= 14 Vehicle_count= 9 Person_count= 3
Frame Number= 227 Number of Objects= 14 Vehicle_count= 10 Person_count= 2
Frame Number= 228 Number of Objects= 13 Vehicle_count= 9 Person_count= 2
Frame Number= 229 Number of Objects= 13 Vehicle_count= 9 Person_count= 3
Frame Number= 230 Number of Objects= 13 Vehicle_count= 9 Person_count= 2
Frame Number= 231 Number of Objects= 13 Vehicle_count= 10 Person_count= 2
Frame Number= 232 Number of Objects= 12 Vehicle_count= 9 Person_count= 2
Frame Number= 259 Number of Objects= 10 Vehicle_count= 10 Person_count= 0
Frame Number= 260 Number of Objects= 9 Vehicle_count= 9 Person_count= 0
Frame Number= 261 Number of Objects= 9 Vehicle_count= 9 Person_count= 0
Frame Number= 262 Number of Objects= 9 Vehicle_count= 9 Person_count= 0
Frame Number= 263 Number of Objects= 11 Vehicle_count= 11 Person_count= 0
Frame Number= 264 Number of Objects= 11 Vehicle_count= 10 Person_count= 1
[Thread 0x7ffe1dffb640 (LWP 332) exited]
Frame Number= 265 Number of Objects= 13 Vehicle_count= 13 Person_count= 0
Frame Number= 266 Number of Objects= 11 Vehicle_count= 11 Person_count= 0
Frame Number= 267 Number of Objects= 13 Vehicle_count= 13 Person_count= 0
Frame Number= 268 Number of Objects= 12 Vehicle_count= 12 Person_count= 0
Frame Number= 269 Number of Objects= 10 Vehicle_count= 10 Person_count= 0
Frame Number= 270 Number of Objects= 13 Vehicle_count= 13 Person_count= 0
Frame Number= 271 Number of Objects= 14 Vehicle_count= 14 Person_count= 0
Frame Number= 272 Number of Objects= 16 Vehicle_count= 16 Person_count= 0
Frame Number= 273 Number of Objects= 14 Vehicle_count= 14 Person_count= 0
Frame Number= 274 Number of Objects= 14 Vehicle_count= 14 Person_count= 0
Frame Number= 275 Number of Objects= 13 Vehicle_count= 13 Person_count= 0
Frame Number= 276 Number of Objects= 17 Vehicle_count= 17 Person_count= 0
Frame Number= 277 Number of Objects= 17 Vehicle_count= 17 Person_count= 0
Frame Number= 278 Number of Objects= 15 Vehicle_count= 15 Person_count= 0
Frame Number= 279 Number of Objects= 12 Vehicle_count= 12 Person_count= 0
Frame Number= 280 Number of Objects= 13 Vehicle_count= 13 Person_count= 0
Frame Number= 281 Number of Objects= 14 Vehicle_count= 14 Person_count= 0
Frame Number= 282 Number of Objects= 14 Vehicle_count= 14 Person_count= 0
Frame Number= 283 Number of Objects= 15 Vehicle_count= 15 Person_count= 0
Frame Number= 284 Number of Objects= 14 Vehicle_count= 14 Person_count= 0
Frame Number= 285 Number of Objects= 12 Vehicle_count= 12 Person_count= 0
Frame Number= 286 Number of Objects= 11 Vehicle_count= 11 Person_count= 0
Frame Number= 287 Number of Objects= 12 Vehicle_count= 12 Person_count= 0
Frame Number= 288 Number of Objects= 12 Vehicle_count= 12 Person_count= 0
Frame Number= 289 Number of Objects= 12 Vehicle_count= 12 Person_count= 0
Frame Number= 290 Number of Objects= 11 Vehicle_count= 11 Person_count= 0
**PERF: {'stream0': 15.19, 'stream1': 0.0}
Frame Number= 291 Number of Objects= 14 Vehicle_count= 13 Person_count= 1
Frame Number= 292 Number of Objects= 12 Vehicle_count= 11 Person_count= 1
Frame Number= 293 Number of Objects= 13 Vehicle_count= 13 Person_count= 0
Frame Number= 294 Number of Objects= 11 Vehicle_count= 11 Person_count= 0
Frame Number= 295 Number of Objects= 13 Vehicle_count= 13 Person_count= 0
Frame Number= 296 Number of Objects= 15 Vehicle_count= 15 Person_count= 0
Frame Number= 297 Number of Objects= 8 Vehicle_count= 8 Person_count= 0
Frame Number= 298 Number of Objects= 9 Vehicle_count= 9 Person_count= 0
Frame Number= 299 Number of Objects= 12 Vehicle_count= 12 Person_count= 0
Frame Number= 300 Number of Objects= 10 Vehicle_count= 10 Person_count= 0
Frame Number= 301 Number of Objects= 12 Vehicle_count= 12 Person_count= 0
Frame Number= 302 Number of Objects= 9 Vehicle_count= 9 Person_count= 0
Frame Number= 303 Number of Objects= 7 Vehicle_count= 7 Person_count= 0
Frame Number= 304 Number of Objects= 8 Vehicle_count= 8 Person_count= 0
Frame Number= 305 Number of Objects= 8 Vehicle_count= 8 Person_count= 0
**PERF: {'stream0': 14.99, 'stream1': 0.0}
Frame Number= 366 Number of Objects= 12 Vehicle_count= 12 Person_count= 0
Frame Number= 367 Number of Objects= 10 Vehicle_count= 10 Person_count= 0
Frame Number= 368 Number of Objects= 9 Vehicle_count= 9 Person_count= 0
Frame Number= 369 Number of Objects= 10 Vehicle_count= 10 Person_count= 0
Frame Number= 370 Number of Objects= 12 Vehicle_count= 12 Person_count= 0
Frame Number= 371 Number of Objects= 9 Vehicle_count= 9 Person_count= 0
Frame Number= 372 Number of Objects= 11 Vehicle_count= 11 Person_count= 0
Frame Number= 373 Number of Objects= 10 Vehicle_count= 10 Person_count= 0
Frame Number= 374 Number of Objects= 12 Vehicle_count= 12 Person_count= 0
Frame Number= 375 Number of Objects= 11 Vehicle_count= 11 Person_count= 0
Frame Number= 376 Number of Objects= 13 Vehicle_count= 13 Person_count= 0
Frame Number= 377 Number of Objects= 13 Vehicle_count= 13 Person_count= 0
Frame Number= 378 Number of Objects= 10 Vehicle_count= 10 Person_count= 0
Frame Number= 379 Number of Objects= 10 Vehicle_count= 10 Person_count= 0
Frame Number= 380 Number of Objects= 10 Vehicle_count= 10 Person_count= 0
**PERF: {'stream0': 14.99, 'stream1': 0.0}
Frame Number= 591 Number of Objects= 12 Vehicle_count= 9 Person_count= 2
Frame Number= 604 Number of Objects= 11 Vehicle_count= 7 Person_count= 3
Frame Number= 605 Number of Objects= 14 Vehicle_count= 10 Person_count= 3
Frame Number= 606 Number of Objects= 14 Vehicle_count= 9 Person_count= 3
Frame Number= 607 Number of Objects= 13 Vehicle_count= 10 Person_count= 2
Frame Number= 608 Number of Objects= 12 Vehicle_count= 10 Person_count= 2
Frame Number= 327 Number of Objects= 2 Vehicle_count= 1 Person_count= 1
Frame Number= 609 Number of Objects= 13 Vehicle_count= 10 Person_count= 2
[Thread 0x7ffee4a84640 (LWP 300) exited]
Frame Number= 328 Number of Objects= 2 Vehicle_count= 1 Person_count= 1
nvstreammux: Successfully handled EOS for source_id=1
Frame Number= 329 Number of Objects= 1 Vehicle_count= 0 Person_count= 1
Frame Number= 330 Number of Objects= 0 Vehicle_count= 0 Person_count= 0
Frame Number= 610 Number of Objects= 11 Vehicle_count= 9 Person_count= 1
Frame Number= 331 Number of Objects= 0 Vehicle_count= 0 Person_count= 0
Frame Number= 611 Number of Objects= 14 Vehicle_count= 12 Person_count= 1
Frame Number= 612 Number of Objects= 13 Vehicle_count= 10 Person_count= 2
Frame Number= 613 Number of Objects= 12 Vehicle_count= 10 Person_count=
Frame Number= 839 Number of Objects= 12 Vehicle_count= 12 Person_count= 0
Frame Number= 840 Number of Objects= 13 Vehicle_count= 13 Person_count= 0
Frame Number= 841 Number of Objects= 13 Vehicle_count= 13 Person_count= 0
Frame Number= 842 Number of Objects= 12 Vehicle_count= 12 Person_count= 0
[Thread 0x7ffe7dfff640 (LWP 305) exited]
Frame Number= 843 Number of Objects= 11 Vehicle_count= 11 Person_count= 0
Frame Number= 844 Number of Objects= 13 Vehicle_count= 13 Person_count= 0
Frame Number= 845 Number of Objects= 10 Vehicle_count= 10 Person_count= 0
Frame Number= 852 Number of Objects= 13 Vehicle_count= 13 Person_count= 0
Frame Number= 853 Number of Objects= 12 Vehicle_count= 12 Person_count= 0
**PERF: {'stream0': 14.99, 'stream1': 0.0}
Frame Number= 1941 Number of Objects= 12 Vehicle_count= 12 Person_count= 0
Frame Number= 1942 Number of Objects= 9 Vehicle_count= 9 Person_count= 0
Frame Number= 1943 Number of Objects= 9 Vehicle_count= 9 Person_count= 0
Frame Number= 1944 Number of Objects= 8 Vehicle_count= 8 Person_count= 0
Frame Number= 1945 Number of Objects= 10 Vehicle_count= 10 Person_count= 0
Frame Number= 1946 Number of Objects= 10 Vehicle_count= 10 Person_count= 0
Frame Number= 1947 Number of Objects= 10 Vehicle_count= 10 Person_count= 0
Frame Number= 1948 Number of Objects= 9 Vehicle_count= 9 Person_count= 0
Frame Number= 1949 Number of Objects= 9 Vehicle_count= 9 Person_count= 0
Frame Number= 1950 Number of Objects= 10 Vehicle_count= 10 Person_count= 0
Frame Number= 1951 Number of Objects= 10 Vehicle_count= 10 Person_count= 0
Frame Number= 1952 Number of Objects= 11 Vehicle_count= 11 Person_count= 0
Frame Number= 1953 Number of Objects= 10 Vehicle_count= 10 Person_count= 0
Frame Number= 1954 Number of Objects= 10 Vehicle_count= 10 Person_count= 0
Frame Number= 1955 Number of Objects= 11 Vehicle_count= 11 Person_count= 0
Frame Number= 1956 Number of Objects= 11 Vehicle_count= 11 Person_count= 0
Frame Number= 1957 Number of Objects= 11 Vehicle_count= 11 Person_count= 0
Frame Number= 1958 Number of Objects= 11 Vehicle_count= 11 Person_count= 0
Frame Number= 1959 Number of Objects= 9 Vehicle_count= 9 Person_count= 0
Warning: gst-resource-error-quark: Could not read from resource. (9): ../gst/rtsp/gstrtspsrc.c(5832): gst_rtspsrc_loop_udp (): /GstPipeline:pipeline0/GstBin:source-bin-01/GstURIDecodeBin:uri-decode-bin/GstRTSPSrc:source:
Unhandled return value -7.
Error: gst-resource-error-quark: Could not read from resource. (9): ../gst/rtsp/gstrtspsrc.c(5900): gst_rtspsrc_loop_udp (): /GstPipeline:pipeline0/GstBin:source-bin-01/GstURIDecodeBin:uri-decode-bin/GstRTSPSrc:source:
Could not receive message. (System error)
Exiting app
[New Thread 0x7ffe7dfff640 (LWP 333)]
Frame Number= 1960 Number of Objects= 11 Vehicle_count= 11 Person_count= 0
[Thread 0x7fff30ffd640 (LWP 266) exited]
[Thread 0x7fff317fe640 (LWP 265) exited]
[Thread 0x7fff31fff640 (LWP 264) exited]
[Thread 0x7fffeb1fc640 (LWP 278) exited]
[Thread 0x7fffef9fd640 (LWP 277) exited]
[Thread 0x7fffca1ee640 (LWP 299) exited]
[Thread 0x7ffee77fe640 (LWP 297) exited]
[Thread 0x7fffe09f7640 (LWP 283) exited]
[Thread 0x7ffe8dfff640 (LWP 301) exited]
[Thread 0x7fffd49f3640 (LWP 287) exited]
[Thread 0x7fffd21f2640 (LWP 288) exited]
[Thread 0x7fff0cff9640 (LWP 295) exited]
[Thread 0x7fff0dffb640 (LWP 293) exited]
[Thread 0x7ffe8cffd640 (LWP 303) exited]
[Thread 0x7ffe8d7fe640 (LWP 302) exited]
[Thread 0x7fff46fde640 (LWP 263) exited]
[Thread 0x7ffe1d7fa640 (LWP 331) exited]
[Thread 0x7fffaffff640 (LWP 258) exited]
[Thread 0x7ffe1e7fc640 (LWP 325) exited]
[Thread 0x7ffe7dfff640 (LWP 333) exited]
[Thread 0x7ffe1effd640 (LWP 324) exited]
[Thread 0x7ffe1f7fe640 (LWP 323) exited]
[Thread 0x7ffe177fe640 (LWP 322) exited]
[Thread 0x7ffe1ffff640 (LWP 321) exited]
[Thread 0x7ffe3cff9640 (LWP 320) exited]
[Thread 0x7ffe3d7fa640 (LWP 319) exited]
[Thread 0x7ffe3dffb640 (LWP 318) exited]
[Thread 0x7ffe3e7fc640 (LWP 317) exited]
[Thread 0x7ffe3effd640 (LWP 316) exited]
[Thread 0x7ffe3f7fe640 (LWP 315) exited]
[Thread 0x7ffe3ffff640 (LWP 314) exited]
[Thread 0x7ffe5d7fe640 (LWP 312) exited]
[Thread 0x7ffe5dfff640 (LWP 311) exited]
[Thread 0x7ffe64ffd640 (LWP 310) exited]
[Thread 0x7ffe657fe640 (LWP 309) exited]
[Thread 0x7ffe65fff640 (LWP 308) exited]
[Thread 0x7ffe7cffd640 (LWP 307) exited]
[Thread 0x7ffe7d7fe640 (LWP 306) exited]
[Thread 0x7ffe84fde640 (LWP 304) exited]
[Thread 0x7ffee6ffd640 (LWP 298) exited]
[Thread 0x7ffee7fff640 (LWP 296) exited]
[Thread 0x7fff0d7fa640 (LWP 294) exited]
[Thread 0x7fff0e7fc640 (LWP 292) exited]
[Thread 0x7fff0effd640 (LWP 291) exited]
[Thread 0x7fff0f7fe640 (LWP 290) exited]
[Thread 0x7fffd19f1640 (LWP 289) exited]
[Thread 0x7fffd71f4640 (LWP 286) exited]
[Thread 0x7fffd99f5640 (LWP 285) exited]
[Thread 0x7fffde1f6640 (LWP 284) exited]
[Thread 0x7fffe31f8640 (LWP 282) exited]
[Thread 0x7fffe39f9640 (LWP 281) exited]
[Thread 0x7fffe61fa640 (LWP 280) exited]
[Thread 0x7fffea9fb640 (LWP 279) exited]
[Thread 0x7ffff289e640 (LWP 276) exited]
[Thread 0x7fffcd1f0640 (LWP 275) exited]
[Thread 0x7fffcc9ef640 (LWP 274) exited]
[Thread 0x7fffc79ed640 (LWP 272) exited]
[Thread 0x7fff0ffff640 (LWP 270) exited]
[Thread 0x7fff24901640 (LWP 269) exited]
[Thread 0x7fff25102640 (LWP 268) exited]
[Thread 0x7fff25903640 (LWP 267) exited]
[Thread 0x7fff4cdfe640 (LWP 262) exited]
[Thread 0x7fff4d5ff640 (LWP 261) exited]
[Thread 0x7fff6e5ff640 (LWP 260) exited]
[Thread 0x7ffff7c4d480 (LWP 230) exited]
[Thread 0x7ffe5cffd640 (LWP 313) exited]
[New process 230]
[Inferior 1 (process 230) exited normally]
(gdb) bt
No stack.
(gdb)
Note: I have trimmed the logs above for brevity.
Question:
This issue is straightforward to handle in a plain OpenCV Python capture loop (just reopen the capture on failure), but how can it be solved in DeepStream? Is there a recommended approach for handling RTSP stream reconnections? Is there a DeepStream sample application or piece of documentation that addresses this specific scenario, so the pipeline can be made as robust as the OpenCV approach?
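For context on what an answer might look like: one pattern (used by NVIDIA's runtime_source_add_delete Python sample) is to stop the default bus handler from quitting the main loop on every ERROR, and instead detect errors originating from a particular source bin and reset only that bin. Below is a hedged sketch of that idea, not the official fix; the names `source_bins`, `make_bus_call`, `SOURCE_BIN_PREFIX`, and `RECONNECT_DELAY_SEC` are all illustrative and would need to be adapted to the actual deepstream_imagedata-multistream code.

```python
# Sketch: per-source error handling so one camera dropping (and coming back)
# does not take down the whole pipeline. Assumes source bins are named
# "source-bin-NN" as in the multistream sample.
import re
import sys

try:
    import gi
    gi.require_version("Gst", "1.0")
    from gi.repository import Gst, GLib
except ImportError:
    # gi is required at runtime; guarded here so the path-parsing helper
    # below can still be exercised standalone.
    Gst = GLib = None

SOURCE_BIN_PREFIX = "source-bin-"   # naming scheme used by the sample
RECONNECT_DELAY_SEC = 10            # hypothetical retry interval


def stream_index_from_path(path):
    """Extract the stream index from a bus-message source path such as
    '/GstPipeline:pipeline0/GstBin:source-bin-01/...'.
    Returns None if the message did not originate from a source bin."""
    m = re.search(r"GstBin:%s(\d+)" % SOURCE_BIN_PREFIX, path)
    return int(m.group(1)) if m else None


def make_bus_call(loop, source_bins):
    """source_bins: dict mapping stream index -> source-bin element
    (hypothetical bookkeeping kept by the application)."""

    def _retry(idx):
        # Try to bring the camera's source bin back up.
        source_bins[idx].set_state(Gst.State.PLAYING)
        return False  # one-shot GLib timeout

    def bus_call(bus, message, _data):
        if message.type == Gst.MessageType.ERROR:
            err, debug = message.parse_error()
            idx = stream_index_from_path(message.src.get_path_string())
            if idx is not None and idx in source_bins:
                sys.stderr.write(
                    "Error on stream %d, scheduling retry: %s\n" % (idx, err))
                # Tear down just this source bin and retry later,
                # leaving the other streams running.
                source_bins[idx].set_state(Gst.State.NULL)
                GLib.timeout_add_seconds(RECONNECT_DELAY_SEC, _retry, idx)
                return True
            sys.stderr.write("Fatal error: %s\n" % err)
            loop.quit()
        return True

    return bus_call
```

As an alternative to hand-rolling this, newer DeepStream releases also expose reconnection handling in `nvurisrcbin` / the deepstream-app reference config (the `rtsp-reconnect-interval-sec` setting in the `[source]` group), which may be worth checking before implementing a custom bus handler.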
Your guidance and support on this matter would be greatly appreciated.