Sliced inferencing over multiple streams

How do I perform tiled inferencing using DeepStream in Python? I would also want to do this over multiple streams.

What does “tiled inferencing” mean? Is it something like SAHI (GitHub - obss/sahi: framework-agnostic sliced/tiled inference + interactive UI + error analysis plots)?

Yes, like SAHI. I want to slide tiles across an image and run inference on each tile.
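For context, SAHI-style slicing boils down to computing a grid of overlapping tile boxes, running the detector on each crop, and merging the shifted detections. A minimal, framework-independent sketch of the tiling step (the function name and parameters are illustrative, not part of SAHI or DeepStream):

```python
def compute_tiles(img_w, img_h, tile_w, tile_h, overlap=0.2):
    """Compute (x, y, w, h) tile boxes that cover the image with the given
    overlap ratio; tiles at the right/bottom edges are shifted inward so
    they stay fully inside the image."""
    step_x = max(1, int(tile_w * (1 - overlap)))
    step_y = max(1, int(tile_h * (1 - overlap)))
    xs = sorted({min(x, max(0, img_w - tile_w)) for x in range(0, img_w, step_x)})
    ys = sorted({min(y, max(0, img_h - tile_h)) for y in range(0, img_h, step_y)})
    w, h = min(tile_w, img_w), min(tile_h, img_h)
    return [(x, y, w, h) for y in ys for x in xs]

tiles = compute_tiles(1920, 1080, 960, 544, overlap=0.2)
# Each tile would then be cropped and sent through the detector, and the
# per-tile detections shifted back to frame coordinates and merged (e.g. NMS).
```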

Currently it is not supported in DeepStream.

A possible way is to use nvdspreprocess and nvinfer: a custom nvdspreprocess library can treat the ROIs as the sliced tiles.
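As a rough illustration of that approach, below is a trimmed sketch of an nvdspreprocess config that defines four 960×540 ROIs tiling a 1920×1080 source. The library path, tensor name, and network shape are placeholders that depend on your model and DeepStream install; `roi-params-src-N` is a flat list of x;y;w;h quadruples:

```ini
[property]
enable=1
process-on-frame=1
# one batch slot per ROI: batch;channels;height;width (example values)
network-input-shape=4;3;540;960
processing-width=960
processing-height=540
network-color-format=0
tensor-data-type=0
tensor-name=input_1
custom-lib-path=/opt/nvidia/deepstream/deepstream/lib/gst-plugins/libcustom2d_preprocess.so
custom-tensor-preparation-function=CustomTensorPreparation

[group-0]
src-ids=0
process-on-roi=1
# four 960x540 tiles covering a 1920x1080 frame, as x;y;w;h quadruples
roi-params-src-0=0;0;960;540;960;0;960;540;0;540;960;540;960;540;960;540
```

The downstream nvinfer then runs on the preprocessed tensor batch rather than on whole frames.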

Yes, I tried this and it works, but the speed drops significantly with every ROI added: with one ROI it is 17 fps, with two it is 8.

I also have another doubt. If there is only one video source/stream, how does DeepStream process it? Does it process frame by frame, or does it collect a number of frames from the video and batch them?
I know that for multiple videos you are batching.

Videos, audio, and other data can be handled in different ways inside DeepStream. For PGIE, the inferencing module works with batches of frames. For SGIE, it works with batches of objects. For the nvdspreprocess case, it works with customized batches.

My question about PGIE is: assume there is only one video. Is every frame of the video processed individually?

The pipeline handles batches, not individual frames, and this is configurable. For example, if you set the “batch-size” of nvstreammux to 2, set its “batched-push-timeout” to twice the duration of a frame, and also configure the “batch-size” of nvinfer to at least 2, then the pipeline will handle batches of two frames.
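As an untested illustration of those settings, a single-source pipeline batching two frames might look like the sketch below. `batched-push-timeout` is in microseconds, so twice the frame duration at 30 fps is roughly 66666 µs; the file path and nvinfer config name are placeholders:

```shell
gst-launch-1.0 uridecodebin uri=file:///path/to/video.mp4 ! mux.sink_0 \
  nvstreammux name=mux batch-size=2 batched-push-timeout=66666 \
              width=1920 height=1080 ! \
  nvinfer config-file-path=config_infer_primary.txt batch-size=2 ! \
  fakesink
```

With only one source, nvstreammux waits up to the timeout to fill the batch, so a longer timeout trades latency for fuller batches.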

When running on one source video, yolov5l gives 33 fps. But when we try inferring a single image locally, we get 111 fps. Any idea why this is happening?

Does this have anything to do with sliced inferencing? If not, please create a new topic for it.

There has been no update from you for a while, so we assume this is no longer an issue and are closing this topic. If you need further support, please open a new one. Thanks.

This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.