Unable to capture video from CSI MIPI Camera using filesink

Please provide complete information as applicable to your setup.

• Hardware Platform (Jetson / GPU)
Jetson
• DeepStream Version
5.0
• JetPack Version (valid for Jetson only)
4.4
• TensorRT Version
7.0
• NVIDIA GPU Driver Version (valid for GPU only)
• Issue Type( questions, new requirements, bugs)
questions
• How to reproduce the issue ? (This is for bugs. Including which sample app is using, the configuration files content, the command line used and other details for reproducing)
The following is the script I am using for recording the stream: stream-app.py (5.5 KB)
Also, this is the link I was referring to : https://gist.github.com/NBonaparte/89fb1b645c99470bc0f6
• Requirement details( This is for new requirement. Including the module name-for which plugin or for which sample application, the function description)

I am trying to simultaneously record and stream using RTSP server a live video feed from a CSI MIPI Camera.
The start_record() method has been separated because I want to do it on trigger as next steps.
The code starts the RTSP stream and then gives a segmentation fault(core dumped) error.
Any help is appreciated.

First, the tee is not necessary. Second, if you do use tee, you must request a src pad from it; you cannot link the tee to another element directly.

The framerate should not be set after nvvideoconvert, because nvvideoconvert cannot convert the framerate.

The following pipeline works fine on my board:

gst-launch-1.0 --gst-debug=v4l2videoenc:5 nvarguscamerasrc sensor-id=0 bufapi-version=1 ! nvvideoconvert ! 'video/x-raw(memory:NVMM), width=3280, height=2464' ! queue ! nvv4l2h264enc maxperf-enable=1 bitrate=10000000 insert-sps-pps=1 preset-level=1 bufapi-version=1 ! rtph264pay ! udpsink host=224.224.255.255 port=5400 async=0 sync=0

You need to check the coding errors by yourself.

@Fiona.Chen, I followed your advice and changed my start_recording() and stop_recording() methods. Also, I have implemented a threaded script which I am attaching. stream-app-record.py (9.4 KB) .
The code runs without errors, but the saved .avi file is corrupt, so I guess the stop_recording() method has issues.
So my question is: how can I unlink the self.recordpipe element from the tee element?

This is not DeepStream related. If you look at the AVI file format (https://cdn.hackaday.io/files/274271173436768/avi.pdf), you will see that the AVI file header has to be finalized after the video/audio data is written. So you need to send an EOS event to the pipeline when you finish writing the AVI file, so that avimux can do its job correctly. Otherwise the AVI header is wrong and the file is corrupted.

It is not recommended to record an AVI file the way you are doing it. Please refer to open-source GStreamer resources for general multimedia problems; let's keep the DeepStream forum focused on DeepStream topics.