TensorRT : UFF model output and network weights validation

Hi,

We are trying to use TensorRT 3.0 for SqueezeDet-based object detection and classification.
We converted the SqueezeDet model to the UFF format, as recommended for TensorRT.
However, after parsing this UFF model and building the TensorRT engine, the output is not correct.

Could you please suggest ways to verify the correctness of the generated UFF model,
and also how we can verify that the TensorRT engine is loading the correct network weights?

Regards,
njs

Hi,

Could you narrow down where the difference comes from? Please try the following:

1. Identify which operation causes the difference by registering it as a network output
2. Create a simple network containing only that operation
3. Feed it a random input with all weights set to 1
4. Compare the outputs of TensorFlow and TensorRT
5. Compare the sums of the outputs in TensorFlow and TensorRT
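Steps 4 and 5 can be sketched as below. This assumes you have already copied both the TensorFlow output tensor and the TensorRT output buffer into NumPy arrays; the function name and tolerances are illustrative, not part of either API:

```python
import numpy as np

def compare_outputs(tf_out, trt_out, rtol=1e-3, atol=1e-5):
    """Compare a TensorFlow output with the corresponding TensorRT
    output buffer, both given as NumPy arrays."""
    tf_out = np.asarray(tf_out, dtype=np.float32).ravel()
    trt_out = np.asarray(trt_out, dtype=np.float32).ravel()
    assert tf_out.size == trt_out.size, "output sizes differ"

    # Step 5: quick sanity check on the summed outputs.
    print("TF sum:  %f" % tf_out.sum())
    print("TRT sum: %f" % trt_out.sum())

    # Step 4: element-wise comparison, with a tolerance for
    # floating-point rounding differences between frameworks.
    print("max abs diff: %f" % np.abs(tf_out - trt_out).max())
    return np.allclose(tf_out, trt_out, rtol=rtol, atol=atol)

# Usage: identical buffers should compare equal.
a = np.random.rand(1, 10).astype(np.float32)
print(compare_outputs(a, a.copy()))  # True
```

If the sums already diverge badly with all-ones weights, the problem is usually in the layer itself (e.g. a layout or padding mismatch) rather than in the weight transfer.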

Here is a similar topic for your reference:
https://devtalk.nvidia.com/default/topic/1027424/jetson-tx2/incorrect-results-during-inference-using-tensorrt3-0-c-uff-parser/

Thanks.