I am using a Jetson Nano with accelerated GStreamer to record video from an IMX219 sensor, using nvoverlaysink to preview the video and nvv4l2h264enc to record H.264 video.
Is there a way to record and display what the camera captures simultaneously, even if the preview runs at a lower FPS?
You can use the tee element in GStreamer to split the stream into multiple branches.
In your case, something like this should work:
FILE_A=filenameA.mp4
gst-launch-1.0 nvarguscamerasrc ! 'video/x-raw(memory:NVMM), width=1920, height=1080, format=NV12, framerate=30/1' ! tee name=streams streams. ! queue ! nvv4l2h264enc bitrate=8000000 ! h264parse ! queue ! qtmux ! filesink location=$FILE_A streams. ! queue ! nvoverlaysink
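The pipeline above displays at the full capture rate. If you want the preview branch to run at a lower FPS than the recording, one option is to drop frames on the display branch only, e.g. with videorate. A sketch along those lines (the 10/1 preview rate and the nvvidconv round-trip through system memory are assumptions; element availability depends on your JetPack build, so treat this as a starting point rather than a tested pipeline):

```shell
# Record at 30 fps while previewing at roughly 10 fps.
# -e sends EOS on Ctrl-C so qtmux can finalize the MP4.
FILE_A=filenameA.mp4
gst-launch-1.0 -e nvarguscamerasrc ! \
  'video/x-raw(memory:NVMM), width=1920, height=1080, format=NV12, framerate=30/1' ! \
  tee name=streams \
  streams. ! queue ! nvv4l2h264enc bitrate=8000000 ! h264parse ! queue ! qtmux ! \
    filesink location=$FILE_A \
  streams. ! queue ! nvvidconv ! videorate drop-only=true ! \
    'video/x-raw, framerate=10/1' ! nvvidconv ! nvoverlaysink
```

The queue on each tee branch matters: without it, one branch running slower than the other can stall the whole pipeline.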
You can find more pipelines like this in our wiki.
Please let me know if this works for you.
Embedded Software Engineer
I followed your syntax, but I am getting the following error:
WARNING: erroneous pipeline: could not link filesink0 to nvoverlaysink-nvoverlaysink0
I am using a Jetson Xavier (arm64, Ubuntu 18.04) with Leopard cameras. I tried to record and display video from one camera simultaneously, but it didn't work:
FILE_A=filenameA.mp4 gst-launch-1.0 nvarguscamerasrc sensor-id=0 ee-mode=0 ! "video/x-raw(memory:NVMM), width=1920, height=1080, format=NV12, framerate=30/1" ! tee name=streams streams. ! queue ! nvv4l2h264enc bitrate=8000000 ! h264parse ! queue ! qtmux ! filesink location=$FILE_A streams. ! nvoverlaysink
Can you please let me know how to solve this issue?
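One thing worth checking before anything GStreamer-specific: with `FILE_A=filenameA.mp4 gst-launch-1.0 ... location=$FILE_A` on a single line, the shell expands `$FILE_A` before the one-shot assignment takes effect, so filesink receives an empty location, which can make gst-launch mis-parse the rest of the branch. A small sketch of the shell behavior (plain POSIX shell, no camera needed):

```shell
# One-shot assignments (VAR=x cmd ...) go into the child's environment,
# but $VAR on the same command line is expanded by the parent shell first,
# where the variable is still unset.
unset FILE_A
FILE_A=filenameA.mp4 sh -c 'echo "child sees: $FILE_A"'    # child sees: filenameA.mp4
FILE_A=filenameA.mp4 echo "same line sees: [$FILE_A]"      # same line sees: []
# Fix: assign first, then launch.
FILE_A=filenameA.mp4
echo "separate line: location=$FILE_A"                     # separate line: location=filenameA.mp4
```

Putting the assignment on its own line, using plain ASCII quotes around the caps string (curly quotes from copy-pasting forum text are not shell quotes), and adding a queue between `streams.` and nvoverlaysink are all worth trying here.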