Trying to dewarp multiple fisheye cameras

Hi, I’m trying to dewarp 2 fisheye cameras at the same time, using the following pipeline:

gst-launch-1.0 \
rtspsrc location=<a rtsp link of fisheye camera> name=stream_0 ! rtpjitterbuffer ! queue ! rtph264depay ! h264parse ! nvv4l2decoder enable-max-performance=1 ! nvvideoconvert ! 'video/x-raw(memory:NVMM), width=(int)2560, height=(int)1920 , format=(string)RGBA' ! videorate ! 'video/x-raw(memory:NVMM), framerate=(fraction)2/1' ! nvdewarper config-file=config_dewarper.txt source-id=6 ! queue ! mux_0.sink_0 \
filesrc location=<path to a fisheye video> name=stream_1 ! queue ! decodebin ! nvvideoconvert ! 'video/x-raw(memory:NVMM), width=(int)2560, height=(int)1920 , format=(string)RGBA' ! videorate ! 'video/x-raw(memory:NVMM), framerate=(fraction)2/1' ! nvdewarper config-file=config_dewarper.txt source-id=6 ! queue ! mux_0.sink_1 \
nvstreammux name=mux_0 live-source=1 width=1902 height=1500 batch-size=8 num-surfaces-per-frame=4 batched-push-timeout=200000 enable-padding=0 ! nvmultistreamtiler rows=1 columns=1 width=1902 height=12000 \
! nvvideoconvert ! videoconvert ! xvimagesink draw-borders=0 window-width=300 window-height=1000

xvimagesink shows video, but it flickers and GStreamer prints an error like this:
NvRmPrivFlush: NvRmChannelSubmit failed (err = 196623, SyncPointIdx = 34, SyncPointValue = 0)
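
For reference, my config_dewarper.txt follows the layout of the DeepStream dewarper sample config. It looks roughly like this (the projection type, angles, focal length, and output size below are illustrative placeholders, not my exact calibration):

```ini
[property]
# Dewarped output resolution per surface
output-width=960
output-height=752
# Number of dewarped surfaces produced per input frame
num-batch-buffers=4

[surface0]
# projection-type: 1=PushBroom, 2=VertRadCyl (per the sample config)
projection-type=1
width=960
height=752
top-angle=30
bottom-angle=-30
pitch=90
yaw=0
roll=0
focal-length=437
```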

I am testing on a Jetson Nano with JetPack 4.3. What did I do wrong? Or can someone suggest a method to dewarp 2 or more input fisheye streams and produce a single output stream for the downstream elements? Eventually I will replace xvimagesink with my own custom plugin. Many thanks~

Hi,
Please try the default sample:

/opt/nvidia/deepstream/deepstream-5.1/sources/apps/sample_apps/deepstream-dewarper-test

See if you can successfully run it with your sources first.
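
If it helps, the sample can be built and run roughly as below. The argument layout (URI and camera-id pairs) is from the DS 5.x sample README; please verify against the README for your DeepStream version, and the file paths here are placeholders:

```shell
# Build and run the dewarper sample (arguments per the DS 5.x sample
# README; verify for your DeepStream version before running).
cd /opt/nvidia/deepstream/deepstream-5.1/sources/apps/sample_apps/deepstream-dewarper-test
sudo make
# <uri> <camera_id> pairs; the camera id selects the calibration
# surfaces in the dewarper config file.
./deepstream-dewarper-app file:///path/to/fisheye1.mp4 6 file:///path/to/fisheye2.mp4 6
```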

I successfully ran it with 2 rtspsrc inputs. I used the pipeline below:

gst-launch-1.0 \
rtspsrc location=<rtsp_link> name=stream_0 ! rtpjitterbuffer ! queue ! rtph264depay ! h264parse ! nvv4l2decoder enable-max-performance=1 ! nvvideoconvert ! 'video/x-raw(memory:NVMM), width=(int)2560, height=(int)1920 , format=(string)RGBA' ! videorate ! 'video/x-raw(memory:NVMM), framerate=(fraction)2/1' ! nvdewarper config-file=config_dewarper.txt source-id=6 ! queue leaky=upstream ! mux_0.sink_0 \
filesrc location=<fisheye_video_path> name=stream_1 ! queue ! decodebin ! nvvideoconvert ! 'video/x-raw(memory:NVMM), width=(int)2560, height=(int)1920 , format=(string)RGBA' ! videorate ! 'video/x-raw(memory:NVMM), framerate=(fraction)2/1' ! nvdewarper config-file=config_dewarper.txt source-id=6 ! queue leaky=upstream ! mux_0.sink_1 \
nvstreammux name=mux_0 live-source=1 width=1902 height=1500 batch-size=8 num-surfaces-per-frame=4 batched-push-timeout=500000 enable-padding=0 compute-hw=2 \
! nvmultistreamtiler rows=1 columns=2 width=3804 height=6000 ! nvvideoconvert ! videoconvert ! jpegenc ! multifilesink location=result/image_%06d.jpg

but when I compare the creation time of my images against the timer shown in the camera frame, the delay is about 6 s. Also, I cannot raise the frame rate above 2 fps for each input stream. Is the Jetson Nano too weak to handle this pipeline, or am I doing something wrong somewhere?
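
To narrow down where the delay accumulates, I plan to try the GStreamer latency tracer, and to cap the rtspsrc jitterbuffer (rtspsrc's latency property defaults to 2000 ms; the 200 ms below is just a guess for my network):

```shell
# Log per-element latency to stderr via the built-in GStreamer
# latency tracer, then rerun the pipeline above unchanged.
export GST_TRACERS=latency
export GST_DEBUG="GST_TRACER:7"
# Additionally, cap the RTSP jitterbuffer in the pipeline, e.g.:
#   rtspsrc location=<rtsp_link> latency=200 ! ...
```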

Hi @DaneLLL, can you give me some advice?

Hi,
Do you observe the issue if you use the demo video file as the two sources:

/opt/nvidia/deepstream/deepstream-6.0/samples/streams/sample_cam6.mp4

We would like to know whether it is specific to the RTSP source.

This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.