I am working on a Jetson Orin Nano with deepstream_python_apps, and a built-in test app such as deepstream_test_1.py works fine with the provided sample videos (sample_720p.h264, which is a raw binary file, application/octet-stream).
But what if I have some captured video of vehicles and humans? How can I use these test apps on that video? Do I need a binary (.h264) file for my video? Do I need to make any changes in the config file?
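From what I understand, sample_720p.h264 is a raw H.264 elementary stream rather than an MP4 container, so I assume my captured .mp4 would first need its elementary stream extracted. Is something like the following the right approach? (This is only my guess; my_video.mp4 is a placeholder for my captured file, and I am assuming ffmpeg is available on the Jetson.)

```shell
# Extract the raw H.264 elementary stream from the MP4 container.
# -c:v copy avoids re-encoding; -bsf:v h264_mp4toannexb converts the
# MP4 (AVCC) framing into the Annex-B byte stream that h264parse expects;
# -an drops the audio track, which the test app does not use.
ffmpeg -i my_video.mp4 -c:v copy -bsf:v h264_mp4toannexb -an my_video.h264

# Then run the test app on the extracted stream:
python3 deepstream_test_1.py my_video.h264
```

If the capture is H.265 or some other codec, I assume a copy won't work since test_1's pipeline hard-codes h264parse, and I would either have to re-encode to H.264 or use deepstream_test_3.py, which takes a file URI via uridecodebin. Is that correct?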
Please provide complete information as applicable to your setup.
• Hardware Platform (Jetson / GPU): Jetson Orin Nano
• DeepStream Version: 6.4
• JetPack Version (valid for Jetson only): 6.0-b52
• TensorRT Version: 8.6.2
• NVIDIA GPU Driver Version (valid for GPU only): NA
0:23:19.718131093 4871 0xaaaadd8ee430 INFO nvinfer gstnvinfer_impl.cpp:328:notifyLoadModelStatus: [UID 1]: Load new model:dstest1_pgie_config.txt sucessfully
Warning: gst-stream-error-quark: Could not decode stream. (7): …/gst/videoparsers/gsth264parse.c(1464): gst_h264_parse_handle_frame (): /GstPipeline:pipeline0/GstH264Parse:h264-parser:
Broken bit stream
…
Error: gst-stream-error-quark: No valid frames decoded before end of stream (7): …/gst-libs/gst/video/gstvideodecoder.c(1416): gst_video_decoder_sink_event_default (): /GstPipeline:pipeline0/nvv4l2decoder:nvv4l2-decoder:
no valid frames found
nvstreammux: Successfully handled EOS for source_id=0
I think there is some problem with the bitstream conversion. Could you guide me on where I am going wrong? Thanks.
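To narrow down whether the file itself is the problem, would a standalone decode check like this be a valid test? (test.h264 is a placeholder for my file; this assumes the JetPack GStreamer plugins are installed.)

```shell
# Try decoding the file outside the Python app; if this also reports
# "Broken bit stream", the file is not a valid Annex-B H.264 elementary stream.
gst-launch-1.0 filesrc location=test.h264 ! h264parse ! nvv4l2decoder ! fakesink

# Inspect the container and codec of the original capture:
ffprobe -hide_banner my_video.mp4
```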