Use DeepStream pipeline with appsrc to save frames to file

• Hardware Platform (Jetson / GPU) Orin AGX
• DeepStream Version 6.1.1
• JetPack Version (valid for Jetson only) 5.0.2
• TensorRT Version 8.4.1.5
• How to reproduce the issue?

I have the following pipeline, which uses an appsrc element to feed frames into the pipeline as numpy arrays. It works as expected, i.e. I can see the frames rendered on screen.

 GObject.threads_init()
 Gst.init(None)

 pipeline = Gst.Pipeline()

 appsource = Gst.ElementFactory.make("appsrc", "numpy-source")
 caps_in = Gst.Caps.from_string("video/x-raw,format=RGBA,width=640,height=480,framerate=30/1")
 appsource.set_property('caps', caps_in)

 nvvideoconvert = Gst.ElementFactory.make("nvvideoconvert", "nv-videoconv")
 caps_filter = Gst.ElementFactory.make("capsfilter", "capsfilter1")
 caps = Gst.Caps.from_string("video/x-raw(memory:NVMM),format=NV12,width=640,height=480,framerate=30/1")
 caps_filter.set_property('caps', caps)

 egltransform = Gst.ElementFactory.make("nvegltransform", "nvegl-transform")
 sink = Gst.ElementFactory.make("nveglglessink", "nvvideo-renderer")
            
 # --- Add elements to pipeline                
 pipeline.add(appsource)
 pipeline.add(nvvideoconvert)
 pipeline.add(caps_filter)
 pipeline.add(egltransform)
 pipeline.add(sink)

 # --- Link elements 
 appsource.link(nvvideoconvert)
 nvvideoconvert.link(caps_filter)
 caps_filter.link(egltransform)
 egltransform.link(sink)

 # --- Create an event loop and feed gstreamer bus messages to it
 loop = GObject.MainLoop()
 bus = pipeline.get_bus()
 bus.add_signal_watch()
 bus.connect("message", bus_call, loop)

 # --- Start play back and listen to events
 pipeline.set_state(Gst.State.PLAYING)

 # --- Push numpy array to appsrc
 for _ in range(10):
     arr = np.random.randint(low=0, high=255, size=(480, 640, 3), dtype=np.uint8)
     arr = cv2.cvtColor(arr, cv2.COLOR_BGR2RGBA)
     appsource.emit("push-buffer", self._ndarray_to_gst_buffer(arr))
     time.sleep(0.3)
     appsource.emit("end-of-stream")

 try:
     loop.run()
 except:
     pass

 # --- Cleanup
 pipeline.set_state(Gst.State.NULL)

Now I am trying to modify the pipeline in order to save the output to a file. So I have modified the above pipeline by substituting:

nvegltransform -> nveglglessink

with:

nvv4l2h264enc -> h264parse -> qtmux -> filesink

To create the new elements I use:

 encoder = Gst.ElementFactory.make("nvv4l2h264enc", "encoder")
 encoder.set_property('bitrate', 4000000)
 encoder.set_property('preset-level', 1)
 encoder.set_property('insert-sps-pps', 1)

 parser = Gst.ElementFactory.make("h264parse", "parser")
 qtmux = Gst.ElementFactory.make("qtmux", "muxer")

 filesink = Gst.ElementFactory.make("filesink", "filesink")
 filesink.set_property("location", 'out.mp4')
 filesink.set_property("sync", 0)
 filesink.set_property("async", 0)

This is how I am linking the elements:

 appsource.link(nvvideoconvert)
 nvvideoconvert.link(caps_filter)
 caps_filter.link(encoder)       
 encoder.link(parser)
 parser.link(qtmux)
 qtmux.link(filesink)

and this is how my pipeline terminates:


 try:
     loop.run()
 except:
     pass

 Gst.Element.send_event(pipeline, Gst.Event.new_eos())
 pipeline.set_state(Gst.State.NULL)

The above code generates an mp4 file, but unfortunately the file is not playable (e.g. in VLC). I would expect a video similar to the output displayed on screen when using nveglglessink.

Could you refer to this FAQ:
https://forums.developer.nvidia.com/t/deepstream-sdk-faq/80236/29
You can attach your mp4 file so we can check it.

This is the output file I get

As you can see, it just flashes and terminates immediately. I would expect to see the random frames that are displayed when I set the screen as output.

I have also attached the complete script I am using. You can execute it in a Xavier or Orin device where DeepStream 6+ is installed.

test_ds_videowriter.py (4.7 KB)

If I modify my app according to:

https://forums.developer.nvidia.com/t/deepstream-sdk-faq/80236/30

I get the following output:

This file isn’t playable in VLC.

I also attach the modified version of the script, as per faq#23:
test_ds_videowriter.py (4.8 KB)

You commented out qtdemux in your code, so it’s just an h264 file. You only send one picture in your code, so there is just one frame in the video.

Can you please check the first version of my code, where qtdemux isn’t commented out? I am sending 100 randomly generated images:

 # --- Start play back and listen to events
 print("[DeepStreamVideoWriter] Starting pipeline ...")
 pipeline.set_state(Gst.State.PLAYING)

 # --- Push buffer and check
 for _ in range(100):
     arr = np.random.randint(low=0, high=255, size=(480, 640, 3), dtype=np.uint8)
     arr = cv2.cvtColor(arr, cv2.COLOR_BGR2RGBA)
     appsource.emit("push-buffer", self._ndarray_to_gst_buffer(arr))
     time.sleep(0.3)
     appsource.emit("end-of-stream")

 try:
     loop.run()
 except:
     pass

 print("Send EoS")
 Gst.Element.send_event(pipeline, Gst.Event.new_eos())

 # --- Cleanup
 pipeline.set_state(Gst.State.NULL)

You can try sending the “end-of-stream” signal after the for loop instead of inside it.
Also, if you have questions about GStreamer in Python itself, we suggest asking in the GStreamer forum: https://gstreamer.freedesktop.org/

I am already sending EOS after the for loop.

There is no update from you for a period, assuming this is not an issue anymore.
Hence we are closing this topic. If need further support, please open a new one.
Thanks

I mean the appsource.emit("end-of-stream") line: you put it inside the for loop. Also, you should set some parameters on each GstBuffer, like the timestamp, duration (from the framerate), etc. You can refer to the C code below and port it to Python:
https://github.com/jjungle/VideoReaderWriteOfRB5/blob/master/VideoWriter.cc

This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.