How to continuously feed h264 data to HW decoder

Hi,
I’m trying to implement custom RTP/H.264 decoding and rendering without using GStreamer, based on /usr/src/jetson_multimedia_api/samples/00_video_decode.

The problem is that the sample app reads all the data from the video file first, and only then starts the decoder and renders.
How can we feed data to the HW decoder continuously, like:
RTP/H264 → H.264 data → HW decode → render →
RTP/H264 → H.264 data → HW decode → render → …

Thank you!

Hi,
You would need to refer to the default sample and customize it, replacing the file source with an RTSP source. The code would need to do the equivalent of:

rtspsrc ! rtph264depay ! h264parse ! ...
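If you do not want to use gstreamer at all, your code also has to do the RTP depacketization itself. Below is a rough, untested sketch of the rtph264depay stage for reference; it only handles single NAL unit packets and FU-A fragments, and the function name is just an example:

```cpp
// Minimal RFC 6184 depacketizer sketch: turns RTP payloads into Annex-B NAL units.
// Only single NAL unit packets (types 1-23) and FU-A fragments (type 28) are handled.
#include <cstdint>
#include <cstring>
#include <vector>

static const uint8_t kStartCode[4] = {0, 0, 0, 1};

// 'payload' is the RTP payload with the RTP header already stripped by the caller.
// Returns true when 'nalu' holds a complete Annex-B NAL unit ready for the decoder.
bool depacketize_h264(const uint8_t *payload, size_t len, std::vector<uint8_t> &nalu)
{
    if (len < 1)
        return false;

    uint8_t nal_type = payload[0] & 0x1F;

    if (nal_type >= 1 && nal_type <= 23)            // single NAL unit packet
    {
        nalu.assign(kStartCode, kStartCode + 4);
        nalu.insert(nalu.end(), payload, payload + len);
        return true;
    }

    if (nal_type == 28 && len >= 2)                 // FU-A fragment
    {
        uint8_t fu_header = payload[1];
        bool start = fu_header & 0x80;
        bool end   = fu_header & 0x40;

        if (start)
        {
            // Rebuild the original NAL header from the FU indicator + FU header.
            uint8_t nal_hdr = (payload[0] & 0xE0) | (fu_header & 0x1F);
            nalu.assign(kStartCode, kStartCode + 4);
            nalu.push_back(nal_hdr);
        }
        nalu.insert(nalu.end(), payload + 2, payload + len);
        return end;                                 // complete once the E bit is set
    }

    return false;                                   // STAP-A etc. not handled here
}
```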

For feeding the H.264 stream, there are two modes:

        --input-nalu         Input to the decoder will be nal units
        --input-chunks       Input to the decoder will be a chunk of bytes [Default]

Generally it should be NAL units for RTSP, so that you can set a timestamp on each frame.
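In NAL unit mode, each complete NAL unit (with its start code) is copied into one output-plane buffer and queued with a timestamp. A rough sketch of that part, based on 00_video_decode (queue_nalu is only an example helper, not part of the API, and assumes the decoder was set up for NALU input as the sample does for --input-nalu):

```cpp
// Example helper (not from the sample): copy one Annex-B NAL unit into a decoder
// output-plane buffer and queue it with a per-frame timestamp.
#include <cstdint>
#include <cstring>

#include "NvVideoDecoder.h"

static int queue_nalu(NvVideoDecoder *dec, NvBuffer *buffer, uint32_t index,
                      const uint8_t *nalu, size_t size, uint64_t pts_us)
{
    struct v4l2_buffer v4l2_buf;
    struct v4l2_plane planes[MAX_PLANES];

    memset(&v4l2_buf, 0, sizeof(v4l2_buf));
    memset(planes, 0, sizeof(planes));

    // One complete NAL unit (including the start code) per buffer in NALU mode.
    memcpy(buffer->planes[0].data, nalu, size);
    buffer->planes[0].bytesused = size;

    v4l2_buf.index = index;
    v4l2_buf.m.planes = planes;
    v4l2_buf.m.planes[0].bytesused = buffer->planes[0].bytesused;

    // Attach the RTP-derived presentation time so it comes back with the decoded frame.
    v4l2_buf.flags |= V4L2_BUF_FLAG_TIMESTAMP_COPY;
    v4l2_buf.timestamp.tv_sec = pts_us / 1000000;
    v4l2_buf.timestamp.tv_usec = pts_us % 1000000;

    return dec->output_plane.qBuffer(v4l2_buf, NULL);
}
```

Generally the first units you queue should be the SPS and PPS from the stream, so that the decoder can generate the resolution-change event before decoded frames show up on the capture plane.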

Thank you for your quick feedback.
We implemented that feature on NVIDIA DRIVE CX, and performance was very good.
But when we moved to the NVIDIA Jetson Xavier, the media decode API and process changed as well.
The problem is that the example app’s flow is: read and queue all buffers from the file → decode buffers → render.

We don’t want to read all buffers at startup. For media playback and streaming we want one thread for reading buffers, one thread for the decoder, and one for rendering. Could you suggest how to do this with the sample app?
Is it possible to keep pushing new buffers after the decoder has finished decoding (or while it is still decoding the queued buffers), and how do we push new buffers?
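Roughly, we are thinking of a small bounded queue of NAL units between the RTP reader thread and the decoder-feeding thread, something like this sketch (names are placeholders):

```cpp
// Rough idea only: a bounded queue of NAL units between the RTP reader thread
// and the thread that feeds the decoder output plane.
#include <condition_variable>
#include <cstdint>
#include <deque>
#include <mutex>
#include <vector>

class NaluQueue
{
public:
    explicit NaluQueue(size_t max_depth) : max_depth_(max_depth) {}

    // Called from the RTP reader thread; blocks if the decoder falls behind.
    void push(std::vector<uint8_t> nalu)
    {
        std::unique_lock<std::mutex> lock(mutex_);
        not_full_.wait(lock, [this] { return queue_.size() < max_depth_; });
        queue_.push_back(std::move(nalu));
        not_empty_.notify_one();
    }

    // Called from the decoder-feeding thread; blocks until data is available.
    std::vector<uint8_t> pop()
    {
        std::unique_lock<std::mutex> lock(mutex_);
        not_empty_.wait(lock, [this] { return !queue_.empty(); });
        std::vector<uint8_t> nalu = std::move(queue_.front());
        queue_.pop_front();
        not_full_.notify_one();
        return nalu;
    }

private:
    size_t max_depth_;
    std::mutex mutex_;
    std::condition_variable not_full_, not_empty_;
    std::deque<std::vector<uint8_t>> queue_;
};
```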

The decoder process is shown more clearly in this other sample app:
https://docs.nvidia.com/jetson/l4t-multimedia/l4t_mm_decoder_unit_sample.html

Hi,
We have two samples for video decoding. You may use the NvVideoDecoder class, or direct v4l2_ioctl calls. If the input H.264 stream is IDR P P P…, the decoder needs to keep at least one reference frame, so you would need to preload a minimum of two frames. For low latency, please set --disable-dpb.
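The flow in 00_video_decode can be adapted so that only the initial queueing uses empty buffers; after that, every dqBuffer() on the output plane returns a buffer the decoder has consumed, which you refill and qBuffer() again, so new data can be pushed for as long as the stream runs. A rough sketch (assuming the decoder output plane is set up as in the sample, and reusing the example queue_nalu() helper from above; not a complete or verified implementation):

```cpp
// Sketch of a decoder-feeding thread: preload the output plane once, then
// recycle buffers with dqBuffer()/qBuffer() as new NAL units arrive.
// queue_nalu() is the example helper sketched earlier in this thread;
// next_nalu() stands in for whatever hands over depacketized NAL units.
#include <cstdint>
#include <cstring>
#include <functional>
#include <vector>

#include "NvVideoDecoder.h"

void feed_decoder(NvVideoDecoder *dec,
                  std::function<std::vector<uint8_t>()> next_nalu)
{
    // Preload: queue each empty output-plane buffer once. With an IDR P P P...
    // stream at least two frames must be in flight so the reference frame is kept.
    for (uint32_t i = 0; i < dec->output_plane.getNumBuffers(); i++)
    {
        std::vector<uint8_t> nalu = next_nalu();   // blocks until RTP data arrives
        NvBuffer *buffer = dec->output_plane.getNthBuffer(i);
        queue_nalu(dec, buffer, i, nalu.data(), nalu.size(), /*pts_us=*/0);
    }

    // Steady state: dequeue a buffer the decoder has consumed, refill it, requeue it.
    while (true)
    {
        struct v4l2_buffer v4l2_buf;
        struct v4l2_plane planes[MAX_PLANES];
        NvBuffer *buffer = NULL;

        memset(&v4l2_buf, 0, sizeof(v4l2_buf));
        memset(planes, 0, sizeof(planes));
        v4l2_buf.m.planes = planes;

        // Blocks until the decoder releases an output-plane buffer.
        if (dec->output_plane.dqBuffer(v4l2_buf, &buffer, NULL, -1) < 0)
            break;

        std::vector<uint8_t> nalu = next_nalu();
        queue_nalu(dec, buffer, v4l2_buf.index, nalu.data(), nalu.size(), /*pts_us=*/0);
    }
}
```

Please check how --disable-dpb and --input-nalu are handled in video_decode_main.cpp for the exact setup calls.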