Can DeepStream 5.0 support HLS live streams?

Please provide complete information as applicable to your setup.

• Hardware Platform (Jetson / GPU) GPU
• DeepStream Version 5.0
• JetPack Version (valid for Jetson only) None
• TensorRT Version Same as the DeepStream 5.0 devel Docker image
• NVIDIA GPU Driver Version (valid for GPU only) Same as the DeepStream 5.0 devel Docker image

Can DeepStream 5.0 support an HLS live stream as a source input? If this is doable, can anyone give me some hints on how to implement it?

Currently DeepStream 5.0 does not support HLS streams.
HLS is the streaming technology from Apple: https://developer.apple.com/streaming/
GStreamer has some plugins which support HLS, but they live in gst-plugins-bad: https://gstreamer.freedesktop.org/modules/gst-plugins-bad.html. You can try these HLS-related plugins to build your own pipeline that includes an HLS source.

@horacehxw

The reference deepstream_app itself does not support this, but the DeepStream elements are just GStreamer elements and can be used in an ordinary GStreamer pipeline like this:

deepstream pipeline ! {encoder} ! h264parse ! mpegtsmux ! hlssink sync=true max-files=15 target-duration=5 playlist-root={playlist_root} location={video_root}/video_%05d.ts playlist-location={video_root}/playlist.m3u8

Then you serve up video_root with nginx or whatever (e.g. /var/lib/www/video).

playlist_root must be the public web path containing the playlist.m3u8 (e.g. foo.com/video).

encoder can be an h264 or h265 encoder (e.g. nvv4l2h264enc).

You can use Gst.parse_bin_from_description(pipe_string, True) (or the equivalent in your language of choice) to convert a string pipeline into a bin you can use in a dynamic pipeline. I’d recommend making a “sink bin” of the HLS end of the pipeline.
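A minimal sketch of what that sink bin might look like in Python. The helper name and the playlist_root/video_root parameters are my own invention for illustration; the hlssink properties are the ones from the pipeline above.

```python
# Hypothetical helper: builds a gst-launch style description for the
# HLS end of the pipeline shown above. Only the hlssink/mpegtsmux
# properties come from the original post; the function itself is a sketch.
def build_hls_sink_description(playlist_root, video_root,
                               max_files=15, target_duration=5):
    """Return a bin description: h264parse ! mpegtsmux ! hlssink ..."""
    return (
        "h264parse ! mpegtsmux ! hlssink sync=true "
        f"max-files={max_files} target-duration={target_duration} "
        f"playlist-root={playlist_root} "
        f"location={video_root}/video_%05d.ts "
        f"playlist-location={video_root}/playlist.m3u8"
    )

desc = build_hls_sink_description("http://foo.com/video", "/var/lib/www/video")
print(desc)

# With PyGObject installed you would then do something like:
#   from gi.repository import Gst
#   sink_bin = Gst.parse_bin_from_description(desc, True)
# The second argument (True) ghost-pads the bin's unlinked pads so the
# bin can be linked into a larger pipeline like any other element.
```

Note this assumes an H.264 stream; for H.265 you would swap h264parse for h265parse.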

Reminder: you do need a web server running to serve up the files hlssink creates. hlssink is just a kind of multifilesink that creates a sequence of video chunks and a text-file playlist pointing to those chunks.


Thank you! That helps a lot. The pipeline you provided works perfectly in the gst-launch-1.0 command-line tool. I tried to attach the hlssink at the same location as the smart record bin, so it doesn’t need to decode and encode again.

However, after I implemented exactly the same pipeline, tee->queue->mpegtsmux->hlssink, both the decode bin and the HLS sink stay in the READY state and cannot change to PLAYING. Do you know how to solve this?

@mdegans

Just FYI, I forgot an element and updated the pipeline above. You need an extra h264parse element, as shown in your diagram. Otherwise the .ts files don’t get split, and there are possibly other issues.

That could be part of what’s happening to you, but I can’t say for sure why your pipeline isn’t going to PLAYING. Could you post a log with GST_DEBUG set to 4 and/or a PDF rendered from a generated .dot file when it hangs?

What do you mean by an extra h264parse element? Do you mean a pipeline like this:

gst-launch-1.0 rtspsrc ! rtph264depay ! h264parse ! tee name=t t. ! queue ! nvv4l2decoder ! DS_PIPELINE t. ! queue ! h264parse ! mpegtsmux ! hlssink

@horacehxw
It looks like you already have one, so the pipeline seems fine at a glance.

Your best friends in this kind of situation are the GStreamer debug tools. Likely there is a warning or an error somewhere pointing to the problem.

You can also divide and conquer by cutting up your pipeline until you find out what breaks it. Elements like fakesink and various test sources can be very useful for this kind of thing.
