So I was searching the net, but I still can’t find a clear answer to the question: is TensorRT inference deterministic?
I have seen that the developer guide has a section about determinism of the builder:
My question is about what happens afterwards:
- If I use the same engine to run inference on the same data, will I always get the same (bit-exact) results?
- Are there perhaps some layers that are deterministic and others that are not? If so, is such a list available somewhere?
- If I use the algorithm selector to make the builder deterministic (see the sketch below for what I mean by that), can I also reproduce the same inference results on two different GPUs?
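
For context, this is roughly what I mean by "using the algorithm selector": a minimal Python sketch against the `tensorrt` bindings, where `DeterministicSelector` is just my own placeholder name and picking the choice with the lowest implementation/tactic id is only one possible stable selection rule, not necessarily the fastest one.

```python
import tensorrt as trt

class DeterministicSelector(trt.IAlgorithmSelector):
    """Sketch: pick one stably ordered tactic per layer so that
    repeated builds of the same network choose the same algorithms."""

    def __init__(self):
        trt.IAlgorithmSelector.__init__(self)

    def select_algorithms(self, context, choices):
        # Choose the candidate with the smallest (implementation, tactic) pair,
        # which gives a stable ordering independent of timing noise.
        best = min(
            range(len(choices)),
            key=lambda i: (
                choices[i].algorithm_variant.implementation,
                choices[i].algorithm_variant.tactic,
            ),
        )
        return [best]

    def report_algorithms(self, contexts, choices):
        # Could log the final per-layer choices here for later comparison.
        pass

# Usage sketch: attach the selector to the builder config before building.
# builder = trt.Builder(trt.Logger(trt.Logger.WARNING))
# config = builder.create_builder_config()
# config.algorithm_selector = DeterministicSelector()
```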