Toggling nvinfer inference processing on/off at runtime

Hello,

• Hardware Platform: Jetson Xavier NX
• DeepStream Version: 6.1.1
• JetPack Version: 5.0.2

When using nvinfer in a GStreamer pipeline, it's sometimes necessary to toggle inference on/off at runtime. There are probably a few options:

  1. Dynamically attaching and detaching pipeline pads to remove nvinfer. This doesn't sound like a good idea, since it can lead to frames dropping as they flow through the pipeline, so it won't simply turn inference on/off.
  2. Using a tee element along with a valve and/or input-selector. Probably "safer" than the first option, but it still interferes with the pipeline and the flowing frames.
  3. Somehow changing something in nvinfer at runtime, such as a GST property or metadata, that causes nvinfer to stop processing frames (and act as a minimal pass-through element). This would be ideal, since the pipeline stays intact while all inference processing stops completely. For example, something like:
g_object_set(G_OBJECT(nvinfer_element), "enabled", false, NULL);
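For reference on option 2, GStreamer's valve element exposes a boolean drop property that can be flipped at runtime with g_object_set. A minimal pure-Python model of that drop behavior (illustrative only, not real GStreamer code):

```python
class Valve:
    """Toy model of GStreamer's valve element: when drop is True,
    incoming buffers are discarded instead of pushed downstream."""
    def __init__(self):
        self.drop = False
        self.pushed = []  # buffers that made it downstream

    def chain(self, buf):
        if not self.drop:
            self.pushed.append(buf)

valve = Valve()
valve.chain("frame0")
valve.drop = True      # runtime toggle, like g_object_set(valve, "drop", TRUE, NULL)
valve.chain("frame1")  # dropped
valve.drop = False
valve.chain("frame2")
print(valve.pushed)    # ['frame0', 'frame2']
```

This also illustrates the downside noted above: dropped buffers simply vanish, which downstream elements have to tolerate.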

Any ideas?

Thank you in advance!

Currently, there is no such toggle; please refer to the Gst-nvinfer — DeepStream 6.1.1 Release documentation.
However, the nvinfer plugin is open source, so you can modify the code to add this function.

Hi @superware2

You could use a combination of GstInterpipes and GstD to achieve what you want by switching listeners at runtime. For example:


# Video source and primary AI pipeline 
gstd-client pipeline_create p0 \
uridecodebin3 uri=file:///home/nvidia/EDGESTREAM/toolkit_home/movies/Street-FHD@30p-4MBs-faststart.mp4 ! queue ! \
nvstreammux0.sink_0 nvstreammux name=nvstreammux0 batch-size=1 batched-push-timeout=40000 width=1920 height=1080 live-source=TRUE ! queue ! \
nvvideoconvert ! queue ! \
nvinfer name=nvinfer config-file-path="/opt/nvidia/deepstream/deepstream-5.0/sources/apps/sample_apps/deepstream-test1/dstest1_pgie_config.txt" ! queue ! \
nvtracker tracker-width=240 tracker-height=200 ll-lib-file=/opt/nvidia/deepstream/deepstream-5.0/lib/libnvds_mot_iou.so ll-config-file=/opt/nvidia/deepstream/deepstream-5.0/samples/configs/deepstream-app/iou_config.txt ! queue ! \
interpipesink sync=true async=false name=primary

# Secondary pipeline
gstd-client pipeline_create s1 \
interpipesrc name=src1 is-live=true allow-renegotiation=true stream-sync=2 listen-to=primary ! \
queue leaky=2 max-size-buffers=10 ! \
nvinfer name=nvinfer1 process-mode=secondary infer-on-gie-id=1 infer-on-class-ids="0:" batch-size=16 config-file-path="/opt/nvidia/deepstream/deepstream-5.0/sources/apps/sample_apps/deepstream-test2/dstest2_sgie1_config.txt" ! queue ! \
interpipesink sync=false async=false name=s1

# Display pipeline
gstd-client pipeline_create sink \
interpipesrc name=sinksrc is-live=true allow-renegotiation=true stream-sync=0 listen-to=s1 ! \
queue leaky=2 max-size-buffers=10 ! \
nvvideoconvert ! nvdsosd ! fakesink

# Play all pipelines
gstd-client pipeline_play p0
gstd-client pipeline_play s1
gstd-client pipeline_play sink

sleep 5

# Turn off the secondary
gstd-client pipeline_stop s1
gstd-client element_set sink sinksrc listen-to p0

sleep 5

# Reconnect the secondary
gstd-client pipeline_stop sink
gstd-client element_set sink sinksrc listen-to s1
gstd-client pipeline_play s1
gstd-client pipeline_play sink

sleep 5

# Delete all pipelines
gstd-client pipeline_stop p0
gstd-client pipeline_stop s1
gstd-client pipeline_stop sink
gstd-client pipeline_delete p0
gstd-client pipeline_delete s1
gstd-client pipeline_delete sink

It is probably easier just to add a property to nvinfer as @fanzh suggested.

Thanks @fanzh.

I've successfully disabled nvinfer by setting its "interval" GST property to G_MAXINT.
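For anyone trying the same trick: my understanding is that nvinfer's interval property makes it skip that many batches between inference calls, so frames 0, interval+1, 2*(interval+1), ... get inferred, and with G_MAXINT effectively only the very first frame is ever processed. A small sketch of that assumed skipping rule:

```python
G_MAXINT = 2**31 - 1  # glib's G_MAXINT on typical platforms

def inferred_frames(total_frames, interval):
    """Frame indices on which nvinfer would run the network, assuming
    it infers one frame and then skips `interval` frames (assumed semantics)."""
    return [f for f in range(total_frames) if f % (interval + 1) == 0]

print(inferred_frames(100, 24))        # [0, 25, 50, 75] - every 25th frame
print(inferred_frames(100, G_MAXINT))  # [0] - inference effectively off
```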

Can you please advise regarding the nvtracker element (which follows the above nvinfer)? The problem is that after nvinfer is "disabled", the tracker keeps tracking previously detected objects. Is there a way to "reset" the state of nvtracker so it stops tracking any existing objects until new ones arrive?

Thank you!

There will be no bboxes if nvinfer's interval is G_MAXINT. nvtracker's workflow is based on bboxes, so nvtracker can't work without them.
Why do you want to stop all calculations? For performance?

Hello @fanzh,

My pipeline is something like:

... ! nvinfer ! nvtracker ! osd ! ...

When nvinfer’s interval is 25 for example - everything works well, it is detecting and nvtracker is tracking the provided objects.

When I "turn off" inference by setting nvinfer's interval to G_MAXINT, indeed no more metadata is sent downstream by nvinfer, but nvtracker is still tracking previously provided objects, so I'm looking for a way to force nvtracker to reset its state and stop any active tracking immediately.

In general, I'm after a way to "turn off" inference (nvinfer ! nvtracker) at runtime.

Any ideas? Thanks!

Did you modify the code? Could you share the code diff and configuration file?

I haven't changed anything in the code, and I'm using config_tracker_NvDCF_perf.yaml, which comes with the DeepStream 6.1 samples.

But how is this relevant? nvtracker's mission is to keep tracking objects continuously as long as the age or quality thresholds aren't exceeded. nvinfer's interval can be 50 (two seconds for 25 fps video) and nvtracker will keep tracking objects independently.
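To illustrate why existing tracks persist: as I understand the NvDCF config, a track with no matching detection is kept alive in shadow mode until its age exceeds maxShadowTrackingAge, and only then is it terminated. A toy model of that assumed behavior (plain Python, not the real tracker):

```python
def surviving_tracks(tracks, frames_without_detections, max_shadow_age):
    """Toy model of shadow tracking: with no new detections, every
    active track ages one step per frame and is dropped once its age
    exceeds max_shadow_age (modeled after NvDCF's maxShadowTrackingAge)."""
    survivors = []
    for track_id, age in tracks:
        if age + frames_without_detections <= max_shadow_age:
            survivors.append(track_id)
    return survivors

# Two existing tracks, then inference is turned off for 60 frames.
print(surviving_tracks([("car", 0), ("person", 10)], 60, 30))  # [] - all aged out
print(surviving_tracks([("car", 0)], 20, 30))  # ['car'] - still shadow-tracked
```

So with a long interval, tracks do eventually die off, but only after maxShadowTrackingAge frames, not immediately.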

I'm just looking for a way to "disable" inference at runtime. One possibility is to "stop" nvinfer by setting its interval to a very large number and somehow reset nvtracker's state so all existing tracked objects disappear.

I would be very grateful if you could ask your engineers how they see the requirement to seamlessly toggle inference (enable/disable) at runtime without stopping the pipeline.

Thanks.

There has been no update from you for a while, so we assume this is no longer an issue and are closing this topic. If you need further support, please open a new one. Thanks

nvtracker does not support modifying parameters dynamically. There is a workaround:
remove the object meta after nvtracker; then nvosd will draw nothing on the frame. Please refer to this topic: Cannot remove obj from frame meta
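Structurally, that workaround is a buffer-pad probe placed after nvtracker that walks the frame's object meta list and removes every object (in DeepStream C code this would use nvds_remove_obj_meta_from_frame). The sketch below is a plain-Python stand-in for that logic, not real pyds code:

```python
class FrameMeta:
    """Stand-in for NvDsFrameMeta: just holds a list of object metas."""
    def __init__(self, objects):
        self.obj_meta_list = list(objects)

def strip_objects_probe(frame_meta, inference_enabled):
    """Mimics a pad probe after nvtracker: when inference is toggled
    off, remove every object meta so nvdsosd has nothing to draw.
    Iterate over a copy, since the list is mutated while walking it."""
    if not inference_enabled:
        for obj in list(frame_meta.obj_meta_list):
            frame_meta.obj_meta_list.remove(obj)
    return frame_meta

frame = FrameMeta(["car_1", "person_2"])
strip_objects_probe(frame, inference_enabled=False)
print(frame.obj_meta_list)  # [] - nothing left for nvdsosd to draw
```

A flag like inference_enabled here is hypothetical; in a real pipeline it would be whatever application state tracks the runtime toggle.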

This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.