Pipeline Architecture - Sometimes freezes

Please provide complete information as applicable to your setup.

• Hardware Platform - GPU
• DeepStream Version - 6.4
• NVIDIA GPU Driver Version - NVIDIA GeForce GTX 1650 / Driver Version: 525.147.05 / CUDA Version: 12.0
• Issue Type (questions, new requirements, bugs)

It is not happening every time and no error shows up, but my pipeline sometimes freezes after a couple of frames.
In addition, the inference only starts to work after 3 or 4 frames.

Please find attached the PDF of my architecture.
graph.pdf (31.3 KB)

I would assume that a queue is missing somewhere, or that I did not design the pipeline correctly.

For testing purposes, I am deliberately using only one hw_encoder, as later there will be 13 streams running through this pipeline.

Would it be better if I first mux the source bins and then use a tee to split into the two different paths (1. save the original video as HLS and 2. create the AI-annotated video as HLS)? Wouldn’t there be a slight overhead from copying to the GPU first and then back to the CPU again?

Can you give some critical feedback, please?
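
To make the tee question more concrete, below is a rough single-stream sketch of the “tee right after the decoder” variant, with a queue on every tee branch; the “mux first” variant would place the tee after nvstreammux instead. The file names, resolution, nvinfer config path and output directories are only placeholders, not my real setup:

# branch 1 (via nvstreammux/nvinfer): AI video as HLS; branch 2: original video straight to the HW encoder
# (the directories ai/ and orig/ must already exist)
gst-launch-1.0 \
  nvstreammux name=m batch-size=1 width=1280 height=720 ! \
    nvinfer config-file-path=config_infer_primary.txt ! nvvideoconvert ! nvdsosd ! \
    nvvideoconvert ! nvv4l2h264enc ! h264parse ! \
    hlssink2 location=ai/segment%05d.ts playlist-location=ai/playlist.m3u8 \
  filesrc location=sample.h264 ! h264parse ! nvv4l2decoder ! tee name=t \
  t. ! queue ! nvv4l2h264enc ! h264parse ! \
    hlssink2 location=orig/segment%05d.ts playlist-location=orig/playlist.m3u8 \
  t. ! queue ! m.sink_0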

We can start by investigating the cause of the freezing problem. Is there a problem if we use software encoders in all paths?

Thanks for the reply, I will first try the pipeline without the software encoder.

However, one more question: as you can see from my architecture, I have 2 input streams and also want to produce output HLS streams.

In my pipeline, the workflow is as follows:

2 streams -> mux -> pgie -> tracker -> nvdconvert -> osd -> demux
 
for each pad in demux -> nvv4l2h264enc -> h264parse -> hlssink2

However, only one stream gets annotated and the other one does not. From what I understood, either the OSD cannot properly process the batch or the demux cannot properly unbatch it.

So, in order to fix this, should the process be:

2 streams -> mux -> pgie -> tracker -> demux 
 
for each pad in demux -> nvdconvert -> osd -> nvv4l2h264enc -> h264parse -> hlssink2
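
Written out as a gst-launch line for two local test files, the second layout would look roughly like this; the file names, resolution, tracker library path and nvinfer config path below are placeholders on my side, not my actual configuration:

# per-stream branches: each nvstreamdemux pad gets its own convert/OSD/encode/HLS chain
gst-launch-1.0 \
  nvstreammux name=m batch-size=2 width=1280 height=720 ! \
    nvinfer config-file-path=config_infer_primary.txt ! \
    nvtracker ll-lib-file=/opt/nvidia/deepstream/deepstream/lib/libnvds_nvmultiobjecttracker.so ! \
    nvstreamdemux name=d \
  filesrc location=stream0.h264 ! h264parse ! nvv4l2decoder ! m.sink_0 \
  filesrc location=stream1.h264 ! h264parse ! nvv4l2decoder ! m.sink_1 \
  d.src_0 ! queue ! nvvideoconvert ! nvdsosd ! nvvideoconvert ! nvv4l2h264enc ! h264parse ! \
    hlssink2 location=out0/segment%05d.ts playlist-location=out0/playlist.m3u8 \
  d.src_1 ! queue ! nvvideoconvert ! nvdsosd ! nvvideoconvert ! nvv4l2h264enc ! h264parse ! \
    hlssink2 location=out1/segment%05d.ts playlist-location=out1/playlist.m3u8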

I could not find any proper documentation for nvv4l2h264enc. Is this the right module to use?

I also found this one: https://docs.nvidia.com/metropolis/deepstream/dev-guide/text/DS_plugin_gst-nvvideo4linux2.html

Is this the one that should be used instead?

Please advise :)

Best regards

Yes. Use the 2nd layout you posted; nvv4l2h264enc is described in the encoder section of that page.

Hey,

thanks for your reply.

In the link you posted, I cannot find anything regarding “nvv4l2h264enc”… the page is about “nvvideo4linux2”… Should that module be used instead in the pipeline?

Or am I on the wrong path?

There has been no update from you for a while, so we are assuming this is no longer an issue and are closing this topic. If you need further support, please open a new one. Thanks

Nvvideo4linux2 is a general term for the codec plug-ins; it contains decoders and encoders. Nvv4l2h264enc is one of the encoder plugins.

gst-inspect-1.0 nvvideo4linux2
Plugin Details:
  Name                     nvvideo4linux2
  Description              Nvidia elements for Video 4 Linux
  Filename                 /usr/lib/x86_64-linux-gnu/gstreamer-1.0/deepstream/libgstnvvideo4linux2.so
  Version                  1.14.0
  License                  LGPL
  Source module            nvvideo4linux2
  Binary package           nvvideo4linux2
  Origin URL               http://nvidia.com/

  nvv4l2av1enc: V4L2 AV1 Encoder
  nvv4l2decoder: NVIDIA v4l2 video decoder
  nvv4l2h264enc: V4L2 H.264 Encoder
  nvv4l2h265enc: V4L2 H.265 Encoder
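
You can also inspect the encoder element directly to see its properties, and run a quick stand-alone test with a synthetic source to confirm the hardware encoder works on your GPU (the output file name here is just an example):

gst-inspect-1.0 nvv4l2h264enc

# encode 300 test frames to an mp4 with the HW encoder
gst-launch-1.0 videotestsrc num-buffers=300 ! 'video/x-raw,width=1280,height=720' ! \
  nvvideoconvert ! 'video/x-raw(memory:NVMM),format=NV12' ! \
  nvv4l2h264enc ! h264parse ! qtmux ! filesink location=test.mp4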
