How to run deepstream_python_apps on our own video streams?

I am working on a Jetson Orin Nano with deepstream_python_apps, and the built-in test apps such as deepstream_test_1.py work fine with the provided sample stream videos (e.g. sample_720p.h264, which is in binary format: application/octet-stream).

But what if I have my own captured video of vehicles and humans? How can I run these test apps on that video? Do I need a binary file for my video? Do I need to make any changes to the config file?

Please provide complete information as applicable to your setup.

• Hardware Platform (Jetson / GPU): Jetson Orin Nano
• DeepStream Version: deepstream-6.4
• JetPack Version (valid for Jetson only): 6.0-b52
• TensorRT Version: 8.6.2
• NVIDIA GPU Driver Version (valid for GPU only): NA

Just store your video in H.264/H.265 format; there is no need to change the configuration file.

You can use ffmpeg for the video conversion.
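For example, a minimal sketch of such a conversion (the input name input.mp4 here is only a placeholder; substitute your own file) would be:

ffmpeg -i input.mp4 -c:v libx264 -an output.h264

The -an flag explicitly drops any audio track, since a raw H.264 elementary stream carries video only, and the .h264 extension tells ffmpeg to write a raw bitstream rather than a container.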

@junshengy Thanks for your response.

I am very new to FFmpeg. I used the following command to convert my Test.webm video to output.mp4 encoded as H.264:

ffmpeg -i Downloads/Test.webm -c:v libx264 -c:a aac -strict experimental -b:a 192k output.mp4

Then I simply ran the Python script:

t-tech@ubuntu:/opt/nvidia/deepstream/deepstream-6.4/sources/deepstream_python_apps/apps/deepstream-test1$ python3 deepstream_test_1.py /opt/nvidia/deepstream/deepstream-6.4/samples/streams/output.mp4

and got the following error:

infer_context_impl.cpp:2133> [UID = 1]: failed to serialize cude engine to file: /opt/nvidia/deepstream/deepstream-6.4/samples/models/Primary_Detector/resnet18_trafficcamnet.etlt_b30_gpu0_int8.engine
INFO: [Implicit Engine Info]: layers num: 3
0 INPUT kFLOAT input_1 3x544x960
1 OUTPUT kFLOAT output_bbox/BiasAdd 16x34x60
2 OUTPUT kFLOAT output_cov/Sigmoid 4x34x60

0:23:19.718131093 4871 0xaaaadd8ee430 INFO nvinfer gstnvinfer_impl.cpp:328:notifyLoadModelStatus: [UID 1]: Load new model:dstest1_pgie_config.txt sucessfully
Warning: gst-stream-error-quark: Could not decode stream. (7): …/gst/videoparsers/gsth264parse.c(1464): gst_h264_parse_handle_frame (): /GstPipeline:pipeline0/GstH264Parse:h264-parser:
Broken bit stream
.
.
.
Error: gst-stream-error-quark: No valid frames decoded before end of stream (7): …/gst-libs/gst/video/gstvideodecoder.c(1416): gst_video_decoder_sink_event_default (): /GstPipeline:pipeline0/nvv4l2decoder:nvv4l2-decoder:
no valid frames found
nvstreammux: Successfully handled EOS for source_id=0

I think there is some problem with the bitstream conversion. Could you guide me on where I am going wrong? Thanks.

Please read the README first; test1 only supports raw H.264 elementary-stream input. Your output.mp4 is an MP4 container, which is why h264parse reports a broken bitstream. Convert to a raw .h264 file instead:

ffmpeg -i Downloads/Test.webm -c:v libx264  output.h264
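After the conversion you can sanity-check the result with ffprobe; it should report the raw h264 format with a single video stream and no container metadata:

ffprobe output.h264

Then point the test app at the new file, e.g. python3 deepstream_test_1.py /path/to/output.h264 (the path here is illustrative; use wherever you stored the converted file).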
