Custom plugin to read from a custom file format

Please provide complete information as applicable to your setup.

• Hardware Platform (Jetson / GPU)
T4
• DeepStream Version
5.0
• JetPack Version (valid for Jetson only)
NA
• TensorRT Version
7.0.0.11
• NVIDIA GPU Driver Version (valid for GPU only)
440.64.00
• Issue Type( questions, new requirements, bugs)
I have a custom file format which has data in the format [ (ntp timestamp of 8 bytes, H.264 chunk), (ntp timestamp of 8 bytes, H.264 chunk), … ]. Now I need to write a new plugin which can read this file and send it through nvinfer etc. The data flow should be something like:-

filesrc → custom-parser-plugin → h264parse → nvv4l2decoder → nvstreammux → nvvideoconvert → capsfilter → nvinfer → appsink.

  • Is there any existing plugin which can provide a starting point?
  • I am guessing this need not be a DeepStream-aware plugin; instead it can be a simple Gstreamer plugin which operates on data in main memory.
  • How do I assign the timestamp that I read from the file to the frame_meta.ntp_timestamp so that it is available in my appsink? Also I have a probe function at nvinfer’s source pad and I want to use the ntp_timestamp in that function too.
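As a sketch of the first step, parsing the file format described above can be done without any GStreamer machinery. The snippet below is a pure-Python illustration; note that the original format description does not say how chunk boundaries are framed, so the 4-byte big-endian length prefix used here is an assumption.

```python
import struct

def read_records(f):
    """Yield (ntp_timestamp, h264_chunk) records from the custom file.

    ASSUMPTION: each H.264 chunk is preceded by a 4-byte big-endian
    length field; the real format's framing may differ.
    """
    header = struct.Struct(">QI")  # 8-byte NTP timestamp + assumed 4-byte length
    while True:
        hdr = f.read(header.size)
        if len(hdr) < header.size:
            return  # clean EOF (or truncated trailing record)
        ntp_ts, length = header.unpack(hdr)
        yield ntp_ts, f.read(length)
```

A custom parser plugin would wrap each yielded chunk in a GstBuffer before pushing it to h264parse.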

Based on your description, what you want to develop is a generic GStreamer plugin; it is not specific to DeepStream, and no DeepStream API will be used in it. There are instructions in the GStreamer community's Plugin Writer's Guide, and there are also many third-party resources on the internet. It is essential to study the basics of GStreamer before you develop a new GStreamer plugin. https://gstreamer.freedesktop.org/

Thanks Fiona. I am looking at the different parser plugins of gstreamer.

Regarding the third part of my question – assigning to NVIDIA's frame_meta.ntp_timestamp – I don't see any open-source code (GStreamer or NVIDIA) that assigns to it, and I am not sure if it is preserved end to end. frame_meta seems to be an NVIDIA-specific structure. Can you shed some light on where it gets assigned for the first time, and whether I can override this assignment in a plugin which sits well before all the NVIDIA plugins?

As you know, this is an NVIDIA-defined parameter with its own meaning and calculation logic; if you change its value, it will break our internal logic.
Since it is an NVIDIA-private data structure, it cannot be handled by a generic GStreamer plugin. Simply put, frame_meta does not exist until the nvstreammux element, so what you want to do conflicts with DeepStream's own logic.

If you are not using an RTSP source, there are two ways to attach NTP timestamp data to the frame metadata:

  1. Attach the user's NTP timestamp as GStreamer metadata inside the custom parser plugin. Access this metadata on the source pad of the nvstreammux component by attaching a probe, then overwrite the NTP timestamp filled in by DeepStream with the user's NTP timestamp from the GStreamer metadata. The mechanism to attach metadata is shown in the deepstream-gst-metadata-test sample (C/C++ Sample Apps Source Details — DeepStream 6.1.1 Release documentation); the code can be found in /opt/nvidia/deepstream/deepstream-5.0/sources/apps/sample-apps/deepstream-user-metadata-test
  2. You can implement book-keeping logic inside the application to store and overwrite the NTP timestamp in an nvstreammux source-pad probe
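Option 2's book-keeping can be sketched without any DeepStream API: store the user's NTP timestamp under a per-frame identifier upstream, then recover it in the nvstreammux source-pad probe. The class below is a pure-Python illustration; keying on the buffer PTS is an assumption, and any identifier that survives to the probe would work.

```python
class NtpBookkeeper:
    """Map a per-frame identifier (here: buffer PTS) to the user's
    NTP timestamp so a probe downstream can recover it per frame."""

    def __init__(self):
        self._ntp_by_pts = {}

    def record(self, pts, ntp_ts):
        # Called where the custom stream is parsed, upstream of nvstreammux.
        self._ntp_by_pts[pts] = ntp_ts

    def take(self, pts):
        # Called in the nvstreammux src-pad probe; pop the entry so the
        # table does not grow without bound.
        return self._ntp_by_pts.pop(pts, None)
```

In the probe, the returned value would then be written into frame_meta.ntp_timestamp for the matching frame.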

Hello @Fiona.Chen,
Thanks for the clarifications. I saw the first solution that you described. It is using the UserMeta of DeepStream (which is built on DeepStream's BatchMeta) and not the base GstMeta. So I am guessing I will have to use GstMeta and hope that all the NVIDIA plugins retain it as-is and also preserve a 1:1 mapping of the frames before and after nvstreammux. Is that a good assumption to make?

Regarding solution 2, that is also workable provided there is a 1:1 mapping of frames before and after nvstreammux, and there is some identifier in the frames (say, the buffer pointer) which can be used to look up the timestamp later in the nvstreammux source-pad probe.

DeepStream metadata is also based on the GStreamer base metadata, so your new GstMeta will not be touched by the DeepStream plugins and the information will be retained. But nvstreammux is a little different: the GstBuffer on its src pad contains batched data rather than frame data, so the metadata differs from that on the sink pad. The information in your GstMeta may therefore be lost.

To attach the timestamp when using tcpclientsrc, the property below can be used.
By default this property is set to false; it needs to be set to true.
This attaches the current stream time to the GStreamer buffers flowing downstream.

$ gst-inspect-1.0 tcpclientsrc
…….

Element Properties:
  do-timestamp : Apply current stream time to buffers
                 flags: readable, writable
                 Boolean. Default: false

Thanks @kayccc.

  • Will the same option apply to filesrc also?
  • If I set this to true, does the gstreamer node use the current wall clock time as the buffer’s time? If so, that may not solve my problem. What I need is to be able to do something like below in my node.
gstbuffer.timestamp = 1234

If this becomes possible, then instead of using a constant value of 1234, I can set the timestamp to values from my custom file format or set the timestamps from any other source.

– Yes, the same option is available for the filesrc plugin as well.

– GStreamer attaches the pipeline's running time to the buffer.

– The timestamp will not remain a constant value when the do-timestamp property is set to true. It attaches the current running time to the buffer while pushing it to the downstream plugin, so the timestamp is a monotonically increasing value.
You have the choice of using the pipeline's running clock as the timestamp by setting the do-timestamp property, or, in your custom plugin, you can attach a new timestamp based on your own logic.
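The running-time behaviour described above can be illustrated without GStreamer: do-timestamp effectively stamps each buffer with "current clock time minus base time", so successive stamps never decrease. A minimal sketch (the class name is invented, and `time.monotonic_ns()` stands in for the pipeline clock):

```python
import time

class DoTimestampSketch:
    """Mimic what do-timestamp=true does: stamp each outgoing buffer
    with the pipeline's running time (current clock time minus the
    base time captured when the pipeline started playing).
    Illustration only; real sources use the pipeline clock."""

    def __init__(self):
        self._base = time.monotonic_ns()  # "pipeline went to PLAYING"

    def stamp(self):
        # Running time in nanoseconds, monotonically non-decreasing.
        return time.monotonic_ns() - self._base
```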

@kayccc Thanks for the clarification. Can you give some code pointers which can be used to attach new timestamp to the buffer? Is it something like

GST_BUFFER_PTS (buf) = custom_timestamp;
GST_BUFFER_DTS (buf) = custom_timestamp;

Which one of these survives through the NVIDIA modules? Will either of them get assigned to frame_meta.ntp_timestamp by nvstreammux, or is the assignment to that field completely separate?

– GST_BUFFER_PTS is the one to use. But if the do-timestamp property is used, you do not need to set the timestamp explicitly; the src plugin takes care of it.

– The GST_BUFFER_PTS timestamp will not be touched by the NVIDIA modules. nvstreammux attaches the NTP timestamp for RTSP sources using RTCP SR reports. If you want to make use of this field in a non-RTSP pipeline, add a probe on the src pad of nvstreammux and update the frame_meta.ntp_timestamp value with the desired value based on your custom logic.

@kayccc Thanks for the clarifications. I have tried out a few things and below is the result.

My requirement is that the source plugin should NOT be setting the time based on current time. Instead it should read the time from some other place (a file or network) and set it on the buffer. So I won’t be able to set do-timestamp on the source plugin.

I tried out an appsrc which sets PTS and DTS. My first frame has:-

t = 1611000000000000000
buf.pts = buf.dts = t

Then for every frame I do this:

t += int(3.3e7)
buf.pts = buf.dts = t
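The two snippets above can be collected into a tiny self-contained sketch (same base value, and int(3.3e7) ns, i.e. 33 ms, per frame); deriving each PTS from the frame index is equivalent to the running increment:

```python
BASE_NTP = 1611000000000000000  # first timestamp, as in the snippet above
FRAME_NS = int(3.3e7)           # 33 ms per frame, roughly 30 fps

def pts_for_frame(n):
    # PTS of frame n, derived from the index instead of a running sum.
    return BASE_NTP + n * FRAME_NS
```

Each returned value would be assigned to both buf.pts and buf.dts exactly as in the per-frame loop above.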

Now I pass it through all the modules in this pipeline:-

my-appsrc → h264parse → nvv4l2decoder → nvstreammux → nvvideoconvert → capsfilter → nvinfer → appsink

In the appsink, I find that pts, dts, and frame_meta.ntp_timestamp have all been mangled. A sample of the values below:-

ntp_timestamp=1611303041519445000
pts=5266666614
dts=18446744073709551615

ntp_timestamp has been set to the current wall-clock time (it is not derived from pts or dts). PTS and DTS themselves have been set to values which I don't understand. How do I make sure that the timestamp is really carried through the pipeline?

The difference between successive PTS values is actually 33 ms, so somehow my timestamp values have made some impact on the PTS, but I am not able to correlate these values with what I had set. Interestingly, while ntp_timestamp keeps changing on every run of the program, the PTS/DTS remain the same, indicating, again, that my change has had some impact.

Adding source.set_property("format", Gst.Format.TIME) on the appsrc did not change anything.

In this case the user has to write his own logic in the application to update the PTS with the desired value using probe functions, or write an appsrc plugin to handle the custom requirement.

We think the setting of DTS and PTS is probably problematic, because the user is getting unexpected values such as pts=5266666614 rather than the expected t = 1611000000000000000.

You can debug at your end by adding identity silent=0 to the pipeline you are running and checking whether the PTS values are printed as expected below.

Timestamp handling – use the GST_BUFFER_PTS() macro to set the desired PTS value:

GstClockTime pts = <set_expected_pts_in_nano_seconds>;
GST_BUFFER_PTS (gst_buffer) = pts;

A few debugging tips:

gst-launch-1.0 my-appsrc ! identity silent=0 ! h264parse ! nvv4l2decoder ! nvstreammux ! nvvideoconvert ! capsfilter ! nvinfer ! appsink -e -v

Expected prints after adding the identity element:

/GstPipeline:pipeline0/GstIdentity:identity0: last-message = chain ******* (identity0:sink) (40048 bytes, dts: 0:00:00.033333333, pts: 0:00:00.133333333, duration: 0:00:00.033333333, offset: -1, offset_end: -1, flags: 00002000 delta-unit , meta: none) 0x7fcb100a00d0

/GstPipeline:pipeline0/GstIdentity:identity0: last-message = chain ******* (identity0:sink) (14921 bytes, dts: 0:00:00.066666666, pts: 0:00:00.100000000 , duration: 0:00:00.033333334, offset: -1, offset_end: -1, flags: 00002000 delta-unit , meta: none) 0x7fcb100a0840

/GstPipeline:pipeline0/GstIdentity:identity0: last-message = chain ******* (identity0:sink) (41474 bytes, dts: 0:00:00.100000000, pts: 0:00:00.200000000 , duration: 0:00:00.033333333, offset: -1, offset_end: -1, flags: 00002000 delta-unit , meta: none) 0x7fcb100a0840

/GstPipeline:pipeline0/GstIdentity:identity0: last-message = chain ******* (identity0:sink) (13242 bytes, dts: 0:00:00.133333333, pts: 0:00:00.166666666 , duration: 0:00:00.033333333, offset: -1, offset_end: -1, flags: 00002000 delta-unit , meta: none) 0x7fcb100a0840

If the PTS values are not printed as expected, then the setting of the PTS value in the appsrc could be the problem.

Also note that the PTS is the presentation timestamp of the data in the buffer, in nanoseconds (as a GstClockTime); this needs to be rechecked in the appsrc logic.

This should help you narrow down the issue.
You can also test and analyze a known-working pipeline setup and see how the PTS values get set, for better understanding.

@kayccc

In this case the user has to write his own logic in the application to update the PTS with the desired value using probe functions, or write an appsrc plugin to handle the custom requirement.

Yes, I am writing a new appsrc plugin to set pts/dts.

We think the setting of DTS and PTS is probably problematic, because the user is getting unexpected values such as pts=5266666614 rather than the expected t = 1611000000000000000.
You can debug at your end by adding identity silent=0 to the pipeline you are running and checking whether the PTS values are printed as expected below.

Thanks for this tip. I was able to debug this using the topology below:-

appsrc → identity → h264parse → nvv4l2decoder → sink

In this case I see the following:-

0:00:10.516349583 171 0x190d8a0 DEBUG GST_SCHEDULING gstpad.c:4320:gst_pad_chain_data_unchecked:identity:sink calling chainfunction &gst_base_transform_chain with buffer buffer: 0x1906070, pts 0:00:05.577000000, dts 0:00:06.577000000, dur 0:00:00.033333333, size 4530, offset none, offset_end none, flags 0x0

0:00:10.516514865 171 0x190d8a0 DEBUG GST_SCHEDULING gstpad.c:4320:gst_pad_chain_data_unchecked:h264-parser:sink calling chainfunction &gst_base_parse_chain with buffer buffer: 0x1906070, pts 0:00:05.577000000, dts 0:00:06.577000000, dur 0:00:00.033333333, size 4530, offset none, offset_end none, flags 0x0

0:00:10.519396667 171 0x190d8a0 DEBUG GST_SCHEDULING gstpad.c:4320:gst_pad_chain_data_unchecked:nvv4l2-decoder:sink calling chainfunction &gst_video_decoder_chain with buffer buffer: 0x1906180, pts 0:00:05.577000000, dts 0:00:06.577000000, dur 99:99:99.999999999, size 4811, offset 900794, offset_end none, flags 0x6000

0:00:10.520841746 171 0x7f0d1c004800 DEBUG GST_SCHEDULING gstpad.c:4320:gst_pad_chain_data_unchecked:filesink-element:sink calling chainfunction &gst_base_sink_chain with buffer buffer: 0x7f0d1c05a350, pts 0:00:05.577000000, dts 99:99:99.999999999, dur 99:99:99.999999999, size 64, offset none, offset_end none, flags 0x0

Basically it appears that h264parse is mangling the duration and the decoder is mangling the DTS. The appsrc itself seems to be doing the right thing. Also note that in the appsrc I start the PTS at 0 and the DTS at 1e9 (1 s worth), just to make sure the PTS and DTS values are different.

@kayccc

I could not get the PTS/DTS to work. Instead, I have now created my own GstMeta substructure and I am putting the timestamp there. My appsrc plugin is able to inject the timestamps and my probe function at the end is able to consume the timestamps. So my requirement for the file-based input is solved.

Now to the second part of my problem, which is to read the same byte stream over TCP and then inject it into DeepStream. Here the problem is not with the timestamp but with writing a parser which can split a TCP byte stream into H.264 frames before feeding them to h264parse.

Is there a simple GStreamer parser plugin which I can use as the starting point to write my custom parser plugin? It will be used in this fashion:-
tcpclientsrc → custom-parser-plugin → h264parse → fakesink

Basically it will take the byte stream as input (which I have outlined in the first message of this thread), split it to extract the timestamp, read the H.264 NALU into a GstBuffer, and then feed each GstBuffer to h264parse.

  • Is gstrawvideoparse.c a good starting point?
  • What about gsttransform.c in gst-templates repo?
  • What about vorbisparse or other parser plugin?

Hi

Below are our dev team's suggestions for your reference:

This is a custom requirement, since they are using their own proprietary format; hence they will need a custom parser.

In this scenario, we suggest they refer to rtph264depay, which is an open-source plugin (“Extracts H264 video from RTP packets”); along similar lines they can extract H.264 from their custom packets and feed GstBuffers to the h264parse plugin.
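The depayloading logic such a plugin needs can be sketched independently of GStreamer: accumulate incoming TCP bytes and emit complete (ntp_timestamp, chunk) records, keeping any partial record buffered for the next read. The 4-byte big-endian length prefix framing each chunk is an assumption; the real wire format may frame chunks differently.

```python
import struct

class CustomDepay:
    """Incremental parser for the byte stream described at the top of
    this thread: 8-byte NTP timestamp followed by an H.264 chunk,
    with an ASSUMED 4-byte big-endian length prefix per chunk."""

    HEADER = struct.Struct(">QI")

    def __init__(self):
        self._buf = bytearray()

    def push(self, data):
        """Feed bytes as they arrive from tcpclientsrc; return every
        complete (ntp_ts, chunk) record, buffering any partial tail."""
        self._buf.extend(data)
        out = []
        while len(self._buf) >= self.HEADER.size:
            ntp_ts, length = self.HEADER.unpack_from(self._buf)
            end = self.HEADER.size + length
            if len(self._buf) < end:
                break  # chunk not fully received yet
            out.append((ntp_ts, bytes(self._buf[self.HEADER.size:end])))
            del self._buf[:end]
        return out
```

In the real plugin, each emitted chunk would be wrapped in a GstBuffer (with its PTS set from the extracted timestamp) and pushed downstream to h264parse.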

There were 3 related, but separate, problems that I brought up in this ticket. All three of them are now solved. Thanks for the patience and all the help.

JFYI I was able to build my plugin loosely based on the control flow in gstgdpdepay.c.