2022-06-01 02:14:16,901 [INFO] root: Registry: ['nvcr.io']
2022-06-01 02:14:17,309 [INFO] tlt.components.instance_handler.local_instance: Running command in container: nvcr.io/nvidia/tao/tao-toolkit-tf:v3.21.11-tf1.15.5-py3
2022-06-01 02:14:17,398 [WARNING] tlt.components.docker_handler.docker_handler: Docker will run the commands as root. If you would like to retain your local host permissions, please add the "user":"UID:GID" in the DockerOptions portion of the "/home/vignesh/.tao_mounts.json" file. You can obtain your users UID and GID by using the "id -u" and "id -g" commands on the terminal.
[INFO] [MemUsageChange] Init CUDA: CPU +332, GPU +0, now: CPU 338, GPU 318 (MiB)
[INFO] [MemUsageSnapshot] Builder begin: CPU 690 MiB, GPU 318 MiB
[INFO] Reading Calibration Cache for calibrator: EntropyCalibration2
[INFO] Generated calibration scales using calibration cache. Make sure that calibration cache has latest scales.
[INFO] To regenerate calibration cache, please delete the existing one. TensorRT will generate a new calibration cache.
[WARNING] Missing scale and zero-point for tensor mask_fcn_logits/bias, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor conv1/kernel, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor bn_conv1/gamma, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor bn_conv1/beta, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor bn_conv1/moving_mean, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor bn_conv1/moving_variance, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_1a_conv_1/kernel, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_1a_bn_1/gamma, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_1a_bn_1/beta, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_1a_bn_1/moving_mean, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_1a_bn_1/moving_variance, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_1a_conv_2/kernel, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_1a_bn_2/gamma, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_1a_bn_2/beta, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_1a_bn_2/moving_mean, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_1a_bn_2/moving_variance, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_1a_conv_3/kernel, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_1a_bn_3/gamma, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_1a_bn_3/beta, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_1a_bn_3/moving_mean, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_1a_bn_3/moving_variance, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_1a_conv_shortcut/kernel, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_1a_bn_shortcut/gamma, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_1a_bn_shortcut/beta, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_1a_bn_shortcut/moving_mean, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_1a_bn_shortcut/moving_variance, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_1b_conv_1/kernel, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_1b_bn_1/gamma, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_1b_bn_1/beta, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_1b_bn_1/moving_mean, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_1b_bn_1/moving_variance, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_1b_conv_2/kernel, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_1b_bn_2/gamma, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_1b_bn_2/beta, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_1b_bn_2/moving_mean, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_1b_bn_2/moving_variance, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_1b_conv_3/kernel, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_1b_bn_3/gamma, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_1b_bn_3/beta, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_1b_bn_3/moving_mean, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_1b_bn_3/moving_variance, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_1c_conv_1/kernel, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_1c_bn_1/gamma, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_1c_bn_1/beta, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_1c_bn_1/moving_mean, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_1c_bn_1/moving_variance, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_1c_conv_2/kernel, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_1c_bn_2/gamma, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_1c_bn_2/beta, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_1c_bn_2/moving_mean, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_1c_bn_2/moving_variance, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_1c_conv_3/kernel, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_1c_bn_3/gamma, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_1c_bn_3/beta, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_1c_bn_3/moving_mean, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_1c_bn_3/moving_variance, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor l2/kernel, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor l2/bias, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_2a_conv_1/kernel, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_2a_bn_1/gamma, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_2a_bn_1/beta, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_2a_bn_1/moving_mean, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_2a_bn_1/moving_variance, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_2a_conv_2/kernel, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_2a_bn_2/gamma, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_2a_bn_2/beta, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_2a_bn_2/moving_mean, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_2a_bn_2/moving_variance, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_2a_conv_3/kernel, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_2a_bn_3/gamma, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_2a_bn_3/beta, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_2a_bn_3/moving_mean, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_2a_bn_3/moving_variance, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_2a_conv_shortcut/kernel, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_2a_bn_shortcut/gamma, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_2a_bn_shortcut/beta, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_2a_bn_shortcut/moving_mean, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_2a_bn_shortcut/moving_variance, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_2b_conv_1/kernel, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_2b_bn_1/gamma, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_2b_bn_1/beta, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_2b_bn_1/moving_mean, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_2b_bn_1/moving_variance, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_2b_conv_2/kernel, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_2b_bn_2/gamma, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_2b_bn_2/beta, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_2b_bn_2/moving_mean, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_2b_bn_2/moving_variance, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_2b_conv_3/kernel, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_2b_bn_3/gamma, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_2b_bn_3/beta, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_2b_bn_3/moving_mean, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_2b_bn_3/moving_variance, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_2c_conv_1/kernel, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_2c_bn_1/gamma, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_2c_bn_1/beta, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_2c_bn_1/moving_mean, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_2c_bn_1/moving_variance, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_2c_conv_2/kernel, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_2c_bn_2/gamma, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_2c_bn_2/beta, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_2c_bn_2/moving_mean, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_2c_bn_2/moving_variance, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_2c_conv_3/kernel, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_2c_bn_3/gamma, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_2c_bn_3/beta, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_2c_bn_3/moving_mean, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_2c_bn_3/moving_variance, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_2d_conv_1/kernel, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_2d_bn_1/gamma, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_2d_bn_1/beta, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_2d_bn_1/moving_mean, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_2d_bn_1/moving_variance, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_2d_conv_2/kernel, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_2d_bn_2/gamma, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_2d_bn_2/beta, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_2d_bn_2/moving_mean, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_2d_bn_2/moving_variance, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_2d_conv_3/kernel, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_2d_bn_3/gamma, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_2d_bn_3/beta, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_2d_bn_3/moving_mean, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_2d_bn_3/moving_variance, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor l3/kernel, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor l3/bias, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_3a_conv_1/kernel, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_3a_bn_1/gamma, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_3a_bn_1/beta, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_3a_bn_1/moving_mean, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_3a_bn_1/moving_variance, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_3a_conv_2/kernel, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_3a_bn_2/gamma, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_3a_bn_2/beta, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_3a_bn_2/moving_mean, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_3a_bn_2/moving_variance, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_3a_conv_3/kernel, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_3a_bn_3/gamma, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_3a_bn_3/beta, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_3a_bn_3/moving_mean, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_3a_bn_3/moving_variance, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_3a_conv_shortcut/kernel, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_3a_bn_shortcut/gamma, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_3a_bn_shortcut/beta, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_3a_bn_shortcut/moving_mean, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_3a_bn_shortcut/moving_variance, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_3b_conv_1/kernel, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_3b_bn_1/gamma, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_3b_bn_1/beta, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_3b_bn_1/moving_mean, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_3b_bn_1/moving_variance, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_3b_conv_2/kernel, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_3b_bn_2/gamma, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_3b_bn_2/beta, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_3b_bn_2/moving_mean, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_3b_bn_2/moving_variance, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_3b_conv_3/kernel, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_3b_bn_3/gamma, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_3b_bn_3/beta, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_3b_bn_3/moving_mean, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_3b_bn_3/moving_variance, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_3c_conv_1/kernel, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_3c_bn_1/gamma, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_3c_bn_1/beta, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_3c_bn_1/moving_mean, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_3c_bn_1/moving_variance, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_3c_conv_2/kernel, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_3c_bn_2/gamma, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_3c_bn_2/beta, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_3c_bn_2/moving_mean, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_3c_bn_2/moving_variance, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_3c_conv_3/kernel, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_3c_bn_3/gamma, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_3c_bn_3/beta, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_3c_bn_3/moving_mean, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_3c_bn_3/moving_variance, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_3d_conv_1/kernel, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_3d_bn_1/gamma, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_3d_bn_1/beta, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_3d_bn_1/moving_mean, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_3d_bn_1/moving_variance, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_3d_conv_2/kernel, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_3d_bn_2/gamma, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_3d_bn_2/beta, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_3d_bn_2/moving_mean, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_3d_bn_2/moving_variance, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_3d_conv_3/kernel, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_3d_bn_3/gamma, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_3d_bn_3/beta, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_3d_bn_3/moving_mean, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_3d_bn_3/moving_variance, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_3e_conv_1/kernel, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_3e_bn_1/gamma, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_3e_bn_1/beta, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_3e_bn_1/moving_mean, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_3e_bn_1/moving_variance, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_3e_conv_2/kernel, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_3e_bn_2/gamma, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_3e_bn_2/beta, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_3e_bn_2/moving_mean, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_3e_bn_2/moving_variance, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_3e_conv_3/kernel, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_3e_bn_3/gamma, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_3e_bn_3/beta, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_3e_bn_3/moving_mean, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_3e_bn_3/moving_variance, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_3f_conv_1/kernel, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_3f_bn_1/gamma, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_3f_bn_1/beta, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_3f_bn_1/moving_mean, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_3f_bn_1/moving_variance, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_3f_conv_2/kernel, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_3f_bn_2/gamma, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_3f_bn_2/beta, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[WARNING] Missing scale and zero-point for tensor block_3f_bn_2/moving_mean, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [WARNING] Missing scale and zero-point for tensor block_3f_bn_2/moving_variance, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [WARNING] Missing scale and zero-point for tensor block_3f_conv_3/kernel, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [WARNING] Missing scale and zero-point for tensor block_3f_bn_3/gamma, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [WARNING] Missing scale and zero-point for tensor block_3f_bn_3/beta, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [WARNING] Missing scale and zero-point for tensor block_3f_bn_3/moving_mean, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [WARNING] Missing scale and zero-point for tensor block_3f_bn_3/moving_variance, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [WARNING] Missing scale and zero-point for tensor l4/kernel, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [WARNING] Missing scale and zero-point for tensor l4/bias, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [WARNING] Missing scale and zero-point for tensor block_4a_conv_1/kernel, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [WARNING] Missing scale and zero-point for tensor block_4a_bn_1/gamma, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [WARNING] Missing scale and zero-point for tensor block_4a_bn_1/beta, expect fall back to non-int8 implementation for any layer consuming or producing given tensor 
[WARNING] Missing scale and zero-point for tensor block_4a_bn_1/moving_mean, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [WARNING] Missing scale and zero-point for tensor block_4a_bn_1/moving_variance, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [WARNING] Missing scale and zero-point for tensor block_4a_conv_2/kernel, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [WARNING] Missing scale and zero-point for tensor block_4a_bn_2/gamma, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [WARNING] Missing scale and zero-point for tensor block_4a_bn_2/beta, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [WARNING] Missing scale and zero-point for tensor block_4a_bn_2/moving_mean, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [WARNING] Missing scale and zero-point for tensor block_4a_bn_2/moving_variance, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [WARNING] Missing scale and zero-point for tensor block_4a_conv_3/kernel, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [WARNING] Missing scale and zero-point for tensor block_4a_bn_3/gamma, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [WARNING] Missing scale and zero-point for tensor block_4a_bn_3/beta, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [WARNING] Missing scale and zero-point for tensor block_4a_bn_3/moving_mean, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [WARNING] Missing scale and zero-point for tensor block_4a_bn_3/moving_variance, expect fall back to non-int8 implementation for any layer 
consuming or producing given tensor [WARNING] Missing scale and zero-point for tensor block_4a_conv_shortcut/kernel, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [WARNING] Missing scale and zero-point for tensor block_4a_bn_shortcut/gamma, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [WARNING] Missing scale and zero-point for tensor block_4a_bn_shortcut/beta, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [WARNING] Missing scale and zero-point for tensor block_4a_bn_shortcut/moving_mean, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [WARNING] Missing scale and zero-point for tensor block_4a_bn_shortcut/moving_variance, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [WARNING] Missing scale and zero-point for tensor block_4b_conv_1/kernel, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [WARNING] Missing scale and zero-point for tensor block_4b_bn_1/gamma, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [WARNING] Missing scale and zero-point for tensor block_4b_bn_1/beta, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [WARNING] Missing scale and zero-point for tensor block_4b_bn_1/moving_mean, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [WARNING] Missing scale and zero-point for tensor block_4b_bn_1/moving_variance, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [WARNING] Missing scale and zero-point for tensor block_4b_conv_2/kernel, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [WARNING] Missing scale and zero-point for tensor block_4b_bn_2/gamma, 
expect fall back to non-int8 implementation for any layer consuming or producing given tensor [WARNING] Missing scale and zero-point for tensor block_4b_bn_2/beta, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [WARNING] Missing scale and zero-point for tensor block_4b_bn_2/moving_mean, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [WARNING] Missing scale and zero-point for tensor block_4b_bn_2/moving_variance, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [WARNING] Missing scale and zero-point for tensor block_4b_conv_3/kernel, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [WARNING] Missing scale and zero-point for tensor block_4b_bn_3/gamma, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [WARNING] Missing scale and zero-point for tensor block_4b_bn_3/beta, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [WARNING] Missing scale and zero-point for tensor block_4b_bn_3/moving_mean, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [WARNING] Missing scale and zero-point for tensor block_4b_bn_3/moving_variance, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [WARNING] Missing scale and zero-point for tensor block_4c_conv_1/kernel, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [WARNING] Missing scale and zero-point for tensor block_4c_bn_1/gamma, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [WARNING] Missing scale and zero-point for tensor block_4c_bn_1/beta, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [WARNING] Missing scale and zero-point for tensor 
block_4c_bn_1/moving_mean, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [WARNING] Missing scale and zero-point for tensor block_4c_bn_1/moving_variance, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [WARNING] Missing scale and zero-point for tensor block_4c_conv_2/kernel, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [WARNING] Missing scale and zero-point for tensor block_4c_bn_2/gamma, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [WARNING] Missing scale and zero-point for tensor block_4c_bn_2/beta, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [WARNING] Missing scale and zero-point for tensor block_4c_bn_2/moving_mean, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [WARNING] Missing scale and zero-point for tensor block_4c_bn_2/moving_variance, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [WARNING] Missing scale and zero-point for tensor block_4c_conv_3/kernel, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [WARNING] Missing scale and zero-point for tensor block_4c_bn_3/gamma, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [WARNING] Missing scale and zero-point for tensor block_4c_bn_3/beta, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [WARNING] Missing scale and zero-point for tensor block_4c_bn_3/moving_mean, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [WARNING] Missing scale and zero-point for tensor block_4c_bn_3/moving_variance, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [WARNING] Missing 
scale and zero-point for tensor l5/kernel, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [WARNING] Missing scale and zero-point for tensor l5/bias, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [WARNING] Missing scale and zero-point for tensor post_hoc_d2/kernel, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [WARNING] Missing scale and zero-point for tensor post_hoc_d2/bias, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [WARNING] Missing scale and zero-point for tensor rpn/kernel, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [WARNING] Missing scale and zero-point for tensor rpn/bias, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [WARNING] Missing scale and zero-point for tensor rpn-box/kernel, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [WARNING] Missing scale and zero-point for tensor rpn-box/bias, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [WARNING] Missing scale and zero-point for tensor permute_1/transpose, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [WARNING] Missing scale and zero-point for tensor MLP/multilevel_propose_rois/level_2/Reshape_1/shape, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [WARNING] Missing scale and zero-point for tensor rpn-class/kernel, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [WARNING] Missing scale and zero-point for tensor rpn-class/bias, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [WARNING] Missing scale and zero-point for tensor permute/transpose, 
expect fall back to non-int8 implementation for any layer consuming or producing given tensor [WARNING] Missing scale and zero-point for tensor MLP/multilevel_propose_rois/level_2/Reshape/shape, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [WARNING] Missing scale and zero-point for tensor post_hoc_d3/kernel, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [WARNING] Missing scale and zero-point for tensor post_hoc_d3/bias, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [WARNING] Missing scale and zero-point for tensor permute_3/transpose, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [WARNING] Missing scale and zero-point for tensor MLP/multilevel_propose_rois/level_3/Reshape_1/shape, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [WARNING] Missing scale and zero-point for tensor permute_2/transpose, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [WARNING] Missing scale and zero-point for tensor MLP/multilevel_propose_rois/level_3/Reshape/shape, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [WARNING] Missing scale and zero-point for tensor post_hoc_d4/kernel, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [WARNING] Missing scale and zero-point for tensor post_hoc_d4/bias, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [WARNING] Missing scale and zero-point for tensor permute_5/transpose, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [WARNING] Missing scale and zero-point for tensor MLP/multilevel_propose_rois/level_4/Reshape_1/shape, expect fall back to non-int8 implementation for any layer consuming 
or producing given tensor [WARNING] Missing scale and zero-point for tensor permute_4/transpose, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [WARNING] Missing scale and zero-point for tensor MLP/multilevel_propose_rois/level_4/Reshape/shape, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [WARNING] Missing scale and zero-point for tensor post_hoc_d5/kernel, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [WARNING] Missing scale and zero-point for tensor post_hoc_d5/bias, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [WARNING] Missing scale and zero-point for tensor permute_7/transpose, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [WARNING] Missing scale and zero-point for tensor MLP/multilevel_propose_rois/level_5/Reshape_1/shape, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [WARNING] Missing scale and zero-point for tensor permute_6/transpose, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [WARNING] Missing scale and zero-point for tensor MLP/multilevel_propose_rois/level_5/Reshape/shape, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [WARNING] Missing scale and zero-point for tensor permute_9/transpose, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [WARNING] Missing scale and zero-point for tensor MLP/multilevel_propose_rois/level_6/Reshape_1/shape, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [WARNING] Missing scale and zero-point for tensor permute_8/transpose, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [WARNING] Missing scale and zero-point 
for tensor MLP/multilevel_propose_rois/level_6/Reshape/shape, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [WARNING] Missing scale and zero-point for tensor fc6/kernel, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [WARNING] Missing scale and zero-point for tensor fc6/bias, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [WARNING] Missing scale and zero-point for tensor fc7/kernel, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [WARNING] Missing scale and zero-point for tensor fc7/bias, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [WARNING] Missing scale and zero-point for tensor box-predict/kernel, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [WARNING] Missing scale and zero-point for tensor box-predict/bias, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [WARNING] Missing scale and zero-point for tensor class-predict/kernel, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [WARNING] Missing scale and zero-point for tensor class-predict/bias, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [WARNING] Missing scale and zero-point for tensor mask-conv-l0/kernel, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [WARNING] Missing scale and zero-point for tensor mask-conv-l0/bias, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [WARNING] Missing scale and zero-point for tensor mask-conv-l1/kernel, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [WARNING] Missing scale and zero-point for tensor mask-conv-l1/bias, 
expect fall back to non-int8 implementation for any layer consuming or producing given tensor [WARNING] Missing scale and zero-point for tensor mask-conv-l2/kernel, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [WARNING] Missing scale and zero-point for tensor mask-conv-l2/bias, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [WARNING] Missing scale and zero-point for tensor mask-conv-l3/kernel, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [WARNING] Missing scale and zero-point for tensor mask-conv-l3/bias, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [WARNING] Missing scale and zero-point for tensor conv5-mask/kernel, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [WARNING] Missing scale and zero-point for tensor conv5-mask/Shape, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [WARNING] Missing scale and zero-point for tensor conv5-mask/strided_slice/stack, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [WARNING] Missing scale and zero-point for tensor conv5-mask/strided_slice/stack_1, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [WARNING] Missing scale and zero-point for tensor conv5-mask/strided_slice/stack_2, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [WARNING] Missing scale and zero-point for tensor conv5-mask/stack/1, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [WARNING] Missing scale and zero-point for tensor conv5-mask/strided_slice_1/stack, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [WARNING] Missing scale and zero-point for tensor 
conv5-mask/strided_slice_1/stack_1, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [WARNING] Missing scale and zero-point for tensor conv5-mask/strided_slice_1/stack_2, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [WARNING] Missing scale and zero-point for tensor conv5-mask/mul/y, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [WARNING] Missing scale and zero-point for tensor conv5-mask/add/y, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [WARNING] Missing scale and zero-point for tensor conv5-mask/strided_slice_2/stack, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [WARNING] Missing scale and zero-point for tensor conv5-mask/strided_slice_2/stack_1, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [WARNING] Missing scale and zero-point for tensor conv5-mask/strided_slice_2/stack_2, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [WARNING] Missing scale and zero-point for tensor conv5-mask/mul_1/y, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [WARNING] Missing scale and zero-point for tensor conv5-mask/add_1/y, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [WARNING] Missing scale and zero-point for tensor conv5-mask/bias, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [WARNING] Missing scale and zero-point for tensor mask_fcn_logits/kernel, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [WARNING] Missing scale and zero-point for tensor rpn-class/bias_0, expect fall back to non-int8 implementation for any layer consuming or producing given tensor 
[WARNING] Missing scale and zero-point for tensor rpn/bias_0, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [WARNING] Missing scale and zero-point for tensor rpn-box/bias_0, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [WARNING] Missing scale and zero-point for tensor rpn-class/bias_1, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [WARNING] Missing scale and zero-point for tensor rpn/bias_1, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [WARNING] Missing scale and zero-point for tensor rpn-box/bias_1, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [WARNING] Missing scale and zero-point for tensor rpn-class/bias_2, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [WARNING] Missing scale and zero-point for tensor rpn/bias_2, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [WARNING] Missing scale and zero-point for tensor rpn-box/bias_2, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [WARNING] Missing scale and zero-point for tensor rpn-class/bias_3, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [WARNING] Missing scale and zero-point for tensor rpn/bias_3, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [WARNING] Missing scale and zero-point for tensor rpn-box/bias_3, expect fall back to non-int8 implementation for any layer consuming or producing given tensor [INFO] [MemUsageChange] Init cuBLAS/cuBLASLt: CPU +483, GPU +206, now: CPU 1268, GPU 524 (MiB) [INFO] [MemUsageChange] Init cuDNN: CPU +394, GPU +172, now: CPU 1662, GPU 696 (MiB) [WARNING] Detected invalid timing cache, setup a local cache instead 
[INFO] Some tactics do not have sufficient workspace memory to run. Increasing workspace size may increase performance, please check verbose output.
[WARNING] No implementation of the following layers obeys the requested constraints in strict mode. No conforming implementation was found, i.e. the requested layer computation precision and output precision types are ignored and the fastest implementation is used:
    nearest_upsampling_2
    nearest_upsampling_1
    nearest_upsampling
    rpn-box/Conv2D || rpn-class/Conv2D
    (Unnamed Layer* 475) [Shuffle] + MLP/multilevel_propose_rois/level_2/Reshape_1
    (Unnamed Layer* 483) [Shuffle] + MLP/multilevel_propose_rois/level_2/Reshape
    PWN(MLP/multilevel_propose_rois/level_2/Sigmoid)
    rpn-box_1/Conv2D || rpn-class_1/Conv2D
    (Unnamed Layer* 497) [Shuffle] + MLP/multilevel_propose_rois/level_3/Reshape_1
    (Unnamed Layer* 503) [Shuffle] + MLP/multilevel_propose_rois/level_3/Reshape
    PWN(MLP/multilevel_propose_rois/level_3/Sigmoid)
    rpn-box_2/Conv2D || rpn-class_2/Conv2D
    (Unnamed Layer* 517) [Shuffle] + MLP/multilevel_propose_rois/level_4/Reshape_1
    (Unnamed Layer* 523) [Shuffle] + MLP/multilevel_propose_rois/level_4/Reshape
    PWN(MLP/multilevel_propose_rois/level_4/Sigmoid)
    rpn-box_3/Conv2D || rpn-class_3/Conv2D
    (Unnamed Layer* 537) [Shuffle] + MLP/multilevel_propose_rois/level_5/Reshape_1
    (Unnamed Layer* 543) [Shuffle] + MLP/multilevel_propose_rois/level_5/Reshape
    PWN(MLP/multilevel_propose_rois/level_5/Sigmoid)
    rpn-box_4/Conv2D || rpn-class_4/Conv2D
    (Unnamed Layer* 554) [Shuffle] + MLP/multilevel_propose_rois/level_6/Reshape_1
    (Unnamed Layer* 560) [Shuffle] + MLP/multilevel_propose_rois/level_6/Reshape
    PWN(MLP/multilevel_propose_rois/level_6/Sigmoid)
    multilevel_propose_rois
    pyramid_crop_and_resize_box
    box-predict/MatMul
    class-predict/MatMul
    box_head_softmax
    generate_detections
    mrcnn_detection_bboxes
    pyramid_crop_and_resize_mask
    mask_fcn_logits/Conv2D
[WARNING] No implementation obeys reformatting-free rules, at least 1 reformatting node is needed, now picking the fastest path instead.
[INFO] Detected 1 inputs and 2 output network tensors.
[INFO] Total Host Persistent Memory: 208064
[INFO] Total Device Persistent Memory: 51850752
[INFO] Total Scratch Memory: 85368832
[INFO] [MemUsageStats] Peak memory usage of TRT CPU/GPU memory allocators: CPU 143 MiB, GPU 4 MiB
[INFO] [MemUsageChange] Init cuBLAS/cuBLASLt: CPU +0, GPU +8, now: CPU 2119, GPU 918 (MiB)
[INFO] [MemUsageChange] Init cuDNN: CPU +1, GPU +8, now: CPU 2120, GPU 926 (MiB)
[INFO] [MemUsageChange] Init cuBLAS/cuBLASLt: CPU +0, GPU +0, now: CPU 2127, GPU 920 (MiB)
[INFO] [MemUsageChange] Init cuBLAS/cuBLASLt: CPU +0, GPU +0, now: CPU 2127, GPU 904 (MiB)
[INFO] [MemUsageSnapshot] Builder end: CPU 2033 MiB, GPU 904 MiB
2022-06-01 02:16:03,779 [INFO] tlt.components.docker_handler.docker_handler: Stopping container.

!tao mask_rcnn inference -i $DATA_DOWNLOAD_DIR/v0_clean/test/images \
                         -o $USER_EXPERIMENT_DIR/experiment0/test_predicted_images_int8 \
                         -e $SPECS_DIR/wisrd-v0-mask-rcnn_train_resnet50.txt \
                         -m $USER_EXPERIMENT_DIR/experiment0/export_int/trt.int8.engine \
                         -l $USER_EXPERIMENT_DIR/experiment0/