Use DeepStream pipeline with appsrc to save frames to file

• Hardware Platform (Jetson / GPU) Orin AGX
• DeepStream Version 6.1.1
• JetPack Version (valid for Jetson only) 5.0.2
• TensorRT Version
• How to reproduce the issue?

I have the following pipeline, which uses an appsrc element to feed frames in as NumPy arrays. It works as expected, i.e. I can see the frames visualized on screen.


 pipeline = Gst.Pipeline()

 appsource = Gst.ElementFactory.make("appsrc", "numpy-source")
 caps_in = Gst.Caps.from_string("video/x-raw,format=RGBA,width=640,height=480,framerate=30/1")
 appsource.set_property('caps', caps_in)

 nvvideoconvert = Gst.ElementFactory.make("nvvideoconvert", "nv-videoconv")
 caps_filter = Gst.ElementFactory.make("capsfilter", "capsfilter1")
 caps = Gst.Caps.from_string("video/x-raw(memory:NVMM),format=NV12,width=640,height=480,framerate=30/1")
 caps_filter.set_property('caps', caps)
 egltransform = Gst.ElementFactory.make("nvegltransform", "nvegl-transform")
 sink = Gst.ElementFactory.make("nveglglessink", "nvvideo-renderer")
 # --- Add elements to pipeline                

 # --- Link elements

 # --- Create an event loop and feed GStreamer bus messages to it
 loop = GObject.MainLoop()
 bus = pipeline.get_bus()
 bus.add_signal_watch()  # required for the "message" signal to fire
 bus.connect("message", bus_call, loop)

 # --- Start play back and listen to events

 # --- Push numpy arrays to appsrc
 for _ in range(10):
     arr = np.random.randint(low=0, high=255, size=(480, 640, 3), dtype=np.uint8)
     arr = cv2.cvtColor(arr, cv2.COLOR_BGR2RGBA)
     appsource.emit("push-buffer", self._ndarray_to_gst_buffer(arr))


  # --- Cleanup

Now I am trying to modify the pipeline in order to save the output to a file instead. I have replaced the display branch:

nvegltransform -> nveglglessink

with an encode-and-mux branch:

nvv4l2h264enc -> h264parse -> qtmux -> filesink

To create the new elements I use:

 encoder = Gst.ElementFactory.make("nvv4l2h264enc", "encoder")
 encoder.set_property('bitrate', 4000000)
 encoder.set_property('preset-level', 1)
 encoder.set_property('insert-sps-pps', 1)

 parser = Gst.ElementFactory.make("h264parse", "parser")
 qtmux = Gst.ElementFactory.make("qtmux", "muxer")

 filesink = Gst.ElementFactory.make("filesink", "filesink")
 filesink.set_property("location", 'out.mp4')
 filesink.set_property("sync", 0)
 filesink.set_property("async", 0)

This is how I am linking the elements:

and this is how my pipeline terminates:

  Gst.Element.send_event(pipeline, Gst.Event.new_eos())       

The above code generates an mp4 file, but unfortunately the file is not playable, e.g. in VLC. I would expect to see a video similar to the output displayed on screen when using nveglglessink.

Could you refer to this FAQ:
You can also attach your mp4 file so we can check it.

This is the output file I get

As you can see, it just blinks and terminates immediately. I would expect to see the random frames that are displayed when I set the screen as output.

I have also attached the complete script I am using. You can execute it on a Xavier or Orin device where DeepStream 6+ is installed. (4.7 KB)

If I modify my app according to:

I get the following output:

This file isn’t playable in VLC.

I also attach the modified version of the script as per faq#23 (4.8 KB)

You commented out qtmux in your code, so the output is just a raw h264 file, not an mp4. You also only send 1 picture in your code, so there is just 1 frame in the video.

Can you please check the first version of my code, where qtmux isn’t commented out? I am sending 100 randomly generated images:

 # --- Start playback and listen to events
 print("[DeepStreamVideoWriter] Starting pipeline ...")

 # --- Push buffers and check
 for _ in range(100):
     arr = np.random.randint(low=0, high=255, size=(480, 640, 3), dtype=np.uint8)
     arr = cv2.cvtColor(arr, cv2.COLOR_BGR2RGBA)
     appsource.emit("push-buffer", self._ndarray_to_gst_buffer(arr))

 print("Send EoS")
 Gst.Element.send_event(pipeline, Gst.Event.new_eos())
 # --- Cleanup

You can try to send the "end-of-stream" signal after the for loop instead of inside it.
Also, if you have questions about GStreamer Python in general, we suggest asking them in the GStreamer forum.

I am already sending EOS after the for loop.

There is no update from you for a period, assuming this is not an issue anymore.
Hence we are closing this topic. If need further support, please open a new one.

I mean the appsource.emit("end-of-stream") code. You put it inside the for loop. Also, you should set some parameters, like timestamp, framerate, etc., on the GstBuffer. You can refer to the C code below and port it to Python.
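(The timestamping advice can be sketched in pure Python. GStreamer timestamps are in nanoseconds, with `Gst.SECOND == 10**9`, so for a fixed 30 fps feed each buffer gets pts = frame_index * duration; the `buffer_timing` helper name is invented for illustration.)

```python
# GStreamer's nanosecond time base; equals Gst.SECOND in the Python bindings.
GST_SECOND = 1_000_000_000

def buffer_timing(frame_index: int, fps: int = 30):
    """Return (pts, duration) in nanoseconds for a fixed-rate stream."""
    duration = GST_SECOND // fps
    return frame_index * duration, duration

# In the push loop one would then set, before emitting "push-buffer":
#   buf.pts, buf.duration = buffer_timing(i)
```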

This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.