Huge performance degradation in classification inference

The .etlt file is not the cause of the degradation.
Officially, end users can only deploy the .etlt model in DeepStream to run inference.
As mentioned in the linked thread, Issue with image classification tutorial and testing with deepstream-app - #21 by Morganh, please note that there are several hints there for improving the inference accuracy.
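Most of those hints come down to making the DeepStream preprocessing match what TAO used during training. As a hedged sketch of the relevant `nvinfer` config keys for a classifier (the values below are placeholders for illustration, not the ones from this thread; derive yours from your own training spec):

```ini
[property]
# Preprocessing must match training: for TAO classification models the
# common pitfall is a wrong net-scale-factor / offsets combination.
net-scale-factor=1.0
offsets=103.939;116.779;123.68
model-color-format=1              # 0=RGB, 1=BGR; must match the training pipeline
# Encrypted TAO model and its key (placeholder names)
tlt-encoded-model=model.etlt
tlt-model-key=YOUR_KEY
infer-dims=3;224;224
network-type=1                    # 1 = classifier
network-mode=0                    # 0 = FP32; debug accuracy in FP32 first
classifier-threshold=0.2
labelfile-path=labels.txt
```

Running in FP32 first isolates preprocessing mistakes from quantization effects; only switch to FP16/INT8 once the FP32 results match the .tlt model.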

No, it is not expected. Previously we found that the .engine can produce results similar to the .tlt model, whether inference is run through DeepStream or in a standalone way.
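A quick way to check that the exported .engine still matches the .tlt model is to compare their outputs on the same images. A minimal sketch of such a parity check, assuming you have already collected both models' softmax outputs as arrays (model loading and inference are omitted; the function name and tolerance are hypothetical):

```python
import numpy as np

def parity_report(probs_tlt, probs_engine, atol=5e-2):
    """Compare softmax outputs of the .tlt model and the exported .engine
    on the same batch. Both arrays have shape (N, num_classes)."""
    probs_tlt = np.asarray(probs_tlt)
    probs_engine = np.asarray(probs_engine)
    top1_tlt = probs_tlt.argmax(axis=1)
    top1_eng = probs_engine.argmax(axis=1)
    # Fraction of images where the two models agree on the top-1 class.
    agreement = float((top1_tlt == top1_eng).mean())
    # Worst-case gap between per-class probabilities.
    max_diff = float(np.abs(probs_tlt - probs_engine).max())
    return {"top1_agreement": agreement,
            "max_abs_diff": max_diff,
            "close": bool(max_diff <= atol)}

# Synthetic outputs standing in for real model predictions:
a = np.array([[0.10, 0.90], [0.80, 0.20]])
b = np.array([[0.12, 0.88], [0.79, 0.21]])
report = parity_report(a, b)
```

If the top-1 agreement is high and the probability gap is small, the degradation is coming from the deployment pipeline (preprocessing, config) rather than from the engine conversion itself.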