How to integrate custom data preprocessing before inferring with TensorRT in DeepStream

I use Xavier and DS 4.2.1

Hello, experts. I have already converted some models to TensorRT and wish to use them in DS.

The postprocessing is OK, but I am confused about how to specify the preprocessing parts. For example, I need to normalize the input data buffer and apply some special transformations, such as an affine transform.

What's the best strategy to do this? Is there any example?

Hi
Normalization and conversion are done through convertFcn, which is a typedef'd function pointer, in
sources/libs/nvdsinfer/nvdsinfer_context_impl.cpp::queueInputBatch
and the normalization scale factor is set with
net-scale-factor=0.0039215697906911373
Your special transformation can be done there.
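For reference, here is a minimal CPU-side sketch of the arithmetic this preprocessing performs. The real conversion runs as CUDA code dispatched through convertFcn; the function name, parameters, and planar RGB layout below are illustrative assumptions, not the actual DeepStream symbols.

```cpp
// Illustrative sketch of nvinfer-style input normalization:
//   out = net_scale_factor * (in - mean)
// This CPU version only shows the arithmetic; DeepStream does it on the GPU.
// Assumes planar RGB input (an assumption, not necessarily your model's layout).
#include <cstddef>
#include <cstdint>

static void normalize_planar_rgb(const uint8_t *in, float *out,
                                 size_t width, size_t height,
                                 float net_scale_factor,
                                 const float mean[3])
{
    const size_t plane = width * height;
    for (size_t c = 0; c < 3; ++c)
        for (size_t i = 0; i < plane; ++i)
            out[c * plane + i] =
                net_scale_factor * (float(in[c * plane + i]) - mean[c]);
}
```

Plain scaling and per-channel mean subtraction can usually be handled with the net-scale-factor and offsets keys in the [property] group of the nvinfer config file, with no code change; only extra transformations such as an affine warp need to be added in queueInputBatch.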

Hello amycao, thanks a lot for your reply.
I have run into another problem: how can I get the data of inputFrames? I would like to print the data or save the image with OpenCV in nvdsinfer_context_impl.cpp::queueInputBatch, but it prints only zeros, and the compiler reports an error: "'imwrite' is not a member of 'cv'".


PS: I am trying to test this GitHub repo:

Hi
You cannot use a GPU buffer to create a cv::Mat directly; you need to map the GPU buffer for CPU access.
We have an example you can refer to:
sources/gst-plugins/gst-dsexample/gstdsexample.cpp::gst_dsexample_transform_ip
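In case it helps, here is a minimal sketch of that pattern, assuming the surface has already been converted to RGBA as gst-dsexample does. The helper name dump_rgba_frame is hypothetical, but NvBufSurfaceMap / NvBufSurfaceSyncForCpu / NvBufSurfaceUnMap are the calls used in that example. The "'imwrite' is not a member of 'cv'" error often just means the opencv2/imgcodecs.hpp header is missing.

```cpp
// Hypothetical helper that maps one frame of an NvBufSurface for CPU
// access and saves it with OpenCV, following the pattern used in
// gst_dsexample_transform_ip. Assumes the surface format is RGBA.
#include <opencv2/core.hpp>
#include <opencv2/imgproc.hpp>
#include <opencv2/imgcodecs.hpp>  // declares cv::imwrite; omitting this header
                                  // causes "'imwrite' is not a member of 'cv'"
#include "nvbufsurface.h"

static bool dump_rgba_frame(NvBufSurface *surface, unsigned int idx,
                            const char *path)
{
    // Map the GPU buffer so the CPU can read it.
    if (NvBufSurfaceMap(surface, idx, 0, NVBUF_MAP_READ) != 0)
        return false;

    // On Jetson, sync the mapped memory to the CPU before reading.
    NvBufSurfaceSyncForCpu(surface, idx, 0);

    NvBufSurfaceParams &params = surface->surfaceList[idx];
    cv::Mat rgba(params.height, params.width, CV_8UC4,
                 params.mappedAddr.addr[0], params.pitch);

    // imwrite expects BGR, so convert before saving.
    cv::Mat bgr;
    cv::cvtColor(rgba, bgr, cv::COLOR_RGBA2BGR);
    bool ok = cv::imwrite(path, bgr);

    NvBufSurfaceUnMap(surface, idx, 0);
    return ok;
}
```

Also note that inside queueInputBatch the inputFrames pointers are CUDA device memory, which is most likely why printing them directly from the CPU shows only zeros; if you dump there rather than in a gst-dsexample style plugin, you would first need to copy the data back to host memory with cudaMemcpy.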