Converted decoder's outputs are sometimes corrupted

Description

We have an encoder and a decoder that are converted with torch2trt. The encoder works correctly, but the decoder, while mostly producing correct output, sometimes produces distorted results. The two networks use almost the same set of operations; the main difference is that the decoder uses interpolate. Could the interpolate op be triggered while computing some of the bad outputs? We tried converting only the interpolate function with torch2trt, and its outputs look wrong. Do you think there is a relation between interpolate and our problem?

You can find the interpolation output issue here.

Environment

TensorRT Version: 7.1.3
Torch2TRT Version: 0.2.0
GPU Type: RTX2080
Nvidia Driver Version: 440.33.01
CUDA Version: 10.2
CUDNN Version: 7.6.5
Operating System + Version: Ubuntu 18.04.5 LTS
Python Version (if applicable): 3.6.7
PyTorch Version (if applicable): 1.6.0

Hi @mhmdpkts,

We recommend waiting for a reply from the torch2trt team on the Git issue. It would be better to have them look at this first.

Thank you.