Hi,
Can you share a small script (preferably Python, but C++ is fine) that runs inference on your sample TRT engine and compares the outputs of your TRT6 and TRT7 engines?
Using an internal tool, I’m seeing the same outputs from TF, TRT6, and TRT7 for the same input, for both the TF → UFF → TRT and TF → ONNX → TRT paths.
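For reference, once you dump each engine's output to disk (e.g. with np.save), the comparison itself can be a few lines of NumPy. This is just a minimal sketch; the function name, file names, and tolerances below are placeholders, not part of any TensorRT API:

```python
import numpy as np

def compare_outputs(out_a, out_b, rtol=1e-3, atol=1e-5):
    """Compare two inference outputs elementwise; return (all-close?, max abs diff)."""
    out_a = np.asarray(out_a, dtype=np.float32)
    out_b = np.asarray(out_b, dtype=np.float32)
    assert out_a.shape == out_b.shape, f"shape mismatch: {out_a.shape} vs {out_b.shape}"
    max_abs_diff = float(np.max(np.abs(out_a - out_b)))
    close = bool(np.allclose(out_a, out_b, rtol=rtol, atol=atol))
    return close, max_abs_diff

# Hypothetical usage: outputs previously saved from each engine run
# trt6_out = np.load("trt6_out.npy")
# trt7_out = np.load("trt7_out.npy")
# ok, diff = compare_outputs(trt6_out, trt7_out)
# print(f"match={ok}, max abs diff={diff}")
```

Small float differences between engine versions are normal (kernel/tactic selection changes), so an elementwise tolerance check is more useful than exact equality.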
As a side note, the UFF parser will be deprecated in the future per the TRT7 release notes, so the ONNX parser is getting more support, new features, etc. at the moment. You could convert your sample model above to ONNX with tf2onnx like so:
python3 -m pip install tf2onnx
python3 -m tf2onnx.convert --input model.pb --inputs "input_image:0" --outputs "pred:0" --output "model.onnx"