Inference using ONNX or .hdf5 TAO model differs

Hello,
I am using classification_tf1 to train a classifier on TAO 5.
I exported the model to ONNX and ran inference with it (using the onnxruntime Python library), but the outputs, i.e. the softmax probabilities, differ from those I get with the .hdf5 model inside the TAO container.
I then downloaded the .hdf5 model and ran it locally (using the h5py library), and the softmax probabilities differ yet again.
So for the same image and the same model (in its .hdf5 or ONNX version, inside the TAO container or outside), the results all differ.
Can you please assist in solving this?
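For reference, this is roughly how I run the ONNX model (the paths, input size, and the plain resize-only preprocessing below are simplified placeholders):

```python
import numpy as np
import onnxruntime as ort
from PIL import Image

# Placeholder paths and size; the training spec defines the real input shape.
MODEL = "model.onnx"
IMAGE = "sample.jpg"

sess = ort.InferenceSession(MODEL, providers=["CPUExecutionProvider"])
input_name = sess.get_inputs()[0].name

# Plain resize, no normalization -- kept simple here on purpose.
img = Image.open(IMAGE).convert("RGB").resize((224, 224))
x = np.array(img, dtype=np.float32).transpose(2, 0, 1)[np.newaxis]  # NCHW batch of 1

probs = sess.run(None, {input_name: x})[0]
print(probs)
```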


There has been no update from you for a while, so we assume this is no longer an issue and are closing this topic. If you need further support, please open a new one. Thanks.

Please check the preprocessing.
For the .hdf5 model, you can refer to https://github.com/NVIDIA/tao_tensorflow1_backend/blob/main/nvidia_tao_tf1/cv/makenet/scripts/inference.py.
For the ONNX model, you can refer to https://github.com/NVIDIA/tao_deploy/blob/main/nvidia_tao_deploy/cv/classification_tf1/scripts/inference.py.
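In short, both runtimes must see exactly the same input tensor. As a rough illustration, here is a minimal sketch of "caffe"-style preprocessing similar to what those scripts apply by default; the input size, interpolation, and channel means are assumptions, so check them against your training spec:

```python
import numpy as np
from PIL import Image

# Per-channel means used by "caffe"-mode preprocessing (BGR order);
# assumed default here -- verify against your training spec.
BGR_MEANS = np.array([103.939, 116.779, 123.68], dtype=np.float32)

def preprocess(image_path, height=224, width=224):
    """Resize, convert RGB->BGR, subtract per-channel means, and return
    a float32 NCHW batch of one. Adjust size/mode to match your spec."""
    img = Image.open(image_path).convert("RGB").resize((width, height), Image.BILINEAR)
    x = np.array(img, dtype=np.float32)
    x = x[:, :, ::-1] - BGR_MEANS        # RGB -> BGR, mean subtraction
    x = x.transpose(2, 0, 1)             # HWC -> CHW
    return x[np.newaxis, ...]            # add batch dimension
```

If you feed the output of one such function to both the ONNX and the .hdf5 model, the softmax probabilities should match up to small numerical differences.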
