Can't configure DeepStream classifier to give the same softmax outputs as the TRT engine it builds

The config file I pasted is already customized for my application. If it weren’t customized, the pipeline would not run successfully.

No. The algorithm in our sample nvdspreprocess is not compatible with PyTorch. Please write the algorithm yourself.

Let’s take a step back.

What makes you say that nvdspreprocess and nvinfer both do a conversion that's not compatible with PyTorch? They are simply calculations with a specific output, the documentation claims they do the required calculations, and a previous user has claimed to get exactly the same results as a PyTorch model simply by using the correct normalisation/offsets settings in the nvinfer config. What would "not compatible" even mean? The calculation is wrong? The tensor output format is wrong?
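For reference, the normalisation/offsets mapping that previous user presumably relied on can be sketched as follows. This assumes nvinfer applies y = net-scale-factor * (x - offset) per pixel, with a single scalar scale and per-channel offsets; the ImageNet constants are illustrative, not from this thread:

```python
# Sketch: deriving nvinfer's net-scale-factor / offsets from PyTorch-style
# normalization y = (x/255 - mean) / std, which rearranges to
# y = (1/(255*std)) * (x - 255*mean).
MEAN = [0.485, 0.456, 0.406]   # illustrative torchvision ImageNet means (0-1 range)
STD  = [0.229, 0.224, 0.225]   # illustrative torchvision ImageNet stds

offsets = [255.0 * m for m in MEAN]           # per-channel offsets in 0-255 range
scales  = [1.0 / (255.0 * s) for s in STD]    # per-channel scale factors

# nvinfer takes only ONE net-scale-factor, so an exact match is possible
# only when all stds are equal; otherwise averaging is an approximation.
net_scale_factor = sum(scales) / len(scales)

print("offsets=" + ";".join(f"{o:.4f}" for o in offsets))
print(f"net-scale-factor={net_scale_factor:.6f}")
```

The per-channel std mismatch is one place a small, systematic deviation from PyTorch could creep in even when the config is "correct".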

Even if I did implement a custom preprocessing plugin, currently nvinfer is still doing its own preprocessing regardless and I don’t know how to disable that.
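(Note for anyone landing here with the same question: my understanding is that nvinfer can be told to skip its own preprocessing and consume the tensor attached by nvdspreprocess via the config key below. This is a sketch based on the DeepStream documentation, not something confirmed in this thread:)

```ini
[property]
# Make nvinfer use the preprocessed tensor from nvdspreprocess metadata
# instead of running its own scaling/normalization.
input-tensor-from-meta=1
```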


There has been no update from you for some time, so we assume this is no longer an issue and are closing this topic. If you need further support, please open a new one. Thanks.

We ran a bit-by-bit comparison internally. There are differences. The output is not wrong, but there is some calculation bias. Please write the algorithm yourself if you need it to be bit-for-bit equal to the PyTorch functions.

This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.