Failed to add fence error during pipeline execution

Please provide complete information as applicable to your setup.

• Hardware Platform (Jetson / GPU)
AGX Xavier
• DeepStream Version
• JetPack Version (valid for Jetson only)
• TensorRT Version
• Issue Type (questions, new requirements, bugs)

I have a pipeline that runs a segmentation and object detection neural network in parallel. It works well for the first 20 frames but after that I get the following error for every new frame:

ImgAddFences: Couldn’t find a place to store the fences
NVM_LDC : 765, ERR: sAddFences: Failed to add fence to input 0, status:7
NVM_LDC : 1654, ERR: NvMediaLDCAddFences failed. status: 7

In my output I can see that every frame from that point on is a repeat of frame 20, and I'm no longer receiving new frames from my source.
I couldn't find any information about this error, so I would appreciate advice on the scenarios in which it can be thrown, to give me a starting point for debugging.

Here is my pipeline:

gcamsrc fileread=true devid=1234 num_buffers=100 !
capsfilter caps=video/x-raw,format=NV12,width=4000,height=3000 !
tee name=t ! queue !
remap configyaml=./remap-config-left.yml !
'video/x-raw(memory:NVMM),format=(string)NV12' !
m.sink_0 nvstreammux name=m batch-size=1 width=4000 height=3000 enable-padding=0 !
nvinfer config-file-path="./config_infer_primary_yoloV4.txt" !
nvvideoconvert ! nvdsosd ! nvvideoconvert ! nvv4l2h264enc ! h264parse ! mp4mux !
filesink location="./result_gst_yolo.mp4"
t. ! queue !
remap configyaml=./remap-config-left.yml !
'video/x-raw(memory:NVMM),format=(string)NV12' !
n.sink_0 nvstreammux name=n batch-size=1 width=4000 height=3000 enable-padding=0 !
nvinfer config-file-path="./config_infer_primary_deeplab.txt" !
nvvideoconvert ! nvv4l2h264enc ! h264parse ! mp4mux !
filesink location="./

I resolved the issue myself.
The problem was that my remap plugin called vpiCreateRemap for every frame it processed, which was the wrong behavior; the remap payload should be created once and reused across frames.

Glad to know the issue is resolved, thanks for the update.