&&&& RUNNING TensorRT.trtexec [TensorRT v8203] # trtexec --onnx=resnet50_quant_sparse.onnx --int8 --sparsity=enable --shapes=input:128x3x224x224 --verbose
[03/25/2022-13:24:01] [I] === Model Options ===
[03/25/2022-13:24:01] [I] Format: ONNX
[03/25/2022-13:24:01] [I] Model: resnet50_quant_sparse.onnx
[03/25/2022-13:24:01] [I] Output:
[03/25/2022-13:24:01] [I] === Build Options ===
[03/25/2022-13:24:01] [I] Max batch: explicit batch
[03/25/2022-13:24:01] [I] Workspace: 16 MiB
[03/25/2022-13:24:01] [I] minTiming: 1
[03/25/2022-13:24:01] [I] avgTiming: 8
[03/25/2022-13:24:01] [I] Precision: FP32+INT8
[03/25/2022-13:24:01] [I] Calibration: Dynamic
[03/25/2022-13:24:01] [I] Refit: Disabled
[03/25/2022-13:24:01] [I] Sparsity: Enabled
[03/25/2022-13:24:01] [I] Safe mode: Disabled
[03/25/2022-13:24:01] [I] DirectIO mode: Disabled
[03/25/2022-13:24:01] [I] Restricted mode: Disabled
[03/25/2022-13:24:01] [I] Save engine:
[03/25/2022-13:24:01] [I] Load engine:
[03/25/2022-13:24:01] [I] Profiling verbosity: 0
[03/25/2022-13:24:01] [I] Tactic sources: Using default tactic sources
[03/25/2022-13:24:01] [I] timingCacheMode: local
[03/25/2022-13:24:01] [I] timingCacheFile:
[03/25/2022-13:24:01] [I] Input(s)s format: fp32:CHW
[03/25/2022-13:24:01] [I] Output(s)s format: fp32:CHW
[03/25/2022-13:24:01] [I] Input build shape: input=128x3x224x224+128x3x224x224+128x3x224x224
[03/25/2022-13:24:01] [I] Input calibration shapes: model
[03/25/2022-13:24:01] [I] === System Options ===
[03/25/2022-13:24:01] [I] Device: 0
[03/25/2022-13:24:01] [I] DLACore:
[03/25/2022-13:24:01] [I] Plugins:
[03/25/2022-13:24:01] [I] === Inference Options ===
[03/25/2022-13:24:01] [I] Batch: Explicit
[03/25/2022-13:24:01] [I] Input inference shape: input=128x3x224x224
[03/25/2022-13:24:01] [I] Iterations: 10
[03/25/2022-13:24:01] [I] Duration: 3s (+ 200ms warm up)
[03/25/2022-13:24:01] [I] Sleep time: 0ms
[03/25/2022-13:24:01] [I] Idle time: 0ms
[03/25/2022-13:24:01] [I] Streams: 1
[03/25/2022-13:24:01] [I] ExposeDMA: Disabled
[03/25/2022-13:24:01] [I] Data transfers: Enabled
[03/25/2022-13:24:01] [I] Spin-wait: Disabled
[03/25/2022-13:24:01] [I] Multithreading: Disabled
[03/25/2022-13:24:01] [I] CUDA Graph: Disabled
[03/25/2022-13:24:01] [I] Separate profiling: Disabled
[03/25/2022-13:24:01] [I] Time Deserialize: Disabled
[03/25/2022-13:24:01] [I] Time Refit: Disabled
[03/25/2022-13:24:01] [I] Skip inference: Disabled
[03/25/2022-13:24:01] [I] Inputs:
[03/25/2022-13:24:01] [I] === Reporting Options ===
[03/25/2022-13:24:01] [I] Verbose: Enabled
[03/25/2022-13:24:01] [I] Averages: 10 inferences
[03/25/2022-13:24:01] [I] Percentile: 99
[03/25/2022-13:24:01] [I] Dump refittable layers: Disabled
[03/25/2022-13:24:01] [I] Dump output: Disabled
[03/25/2022-13:24:01] [I] Profile: Disabled
[03/25/2022-13:24:01] [I] Export timing to JSON file:
[03/25/2022-13:24:01] [I] Export output to JSON file:
[03/25/2022-13:24:01] [I] Export profile to JSON file:
[03/25/2022-13:24:01] [I]
[03/25/2022-13:24:01] [I] === Device Information ===
[03/25/2022-13:24:01] [I] Selected Device: A100-SXM4-40GB
[03/25/2022-13:24:01] [I] Compute Capability: 8.0
[03/25/2022-13:24:01] [I] SMs: 108
[03/25/2022-13:24:01] [I] Compute Clock Rate: 1.41 GHz
[03/25/2022-13:24:01] [I] Device Global Memory: 40536 MiB
[03/25/2022-13:24:01] [I] Shared Memory per SM: 164 KiB
[03/25/2022-13:24:01] [I] Memory Bus Width: 5120 bits (ECC enabled)
[03/25/2022-13:24:01] [I] Memory Clock Rate: 1.215 GHz
[03/25/2022-13:24:01] [I]
[03/25/2022-13:24:01] [I] TensorRT version: 8.2.3
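For reference, the build that trtexec performs above can also be driven programmatically; the options in the log map one-to-one onto builder flags. A minimal sketch, assuming the tensorrt Python bindings for TensorRT 8.2 and the same resnet50_quant_sparse.onnx file (since the model carries Q/DQ nodes, the INT8 flag needs no calibrator; SPARSE_WEIGHTS corresponds to --sparsity=enable, and the fixed optimization profile to --shapes=input:128x3x224x224):

import tensorrt as trt

TRT_LOGGER = trt.Logger(trt.Logger.VERBOSE)

builder = trt.Builder(TRT_LOGGER)
network = builder.create_network(
    1 << int(trt.NetworkDefinitionCreationFlag.EXPLICIT_BATCH))
parser = trt.OnnxParser(network, TRT_LOGGER)

with open("resnet50_quant_sparse.onnx", "rb") as f:
    if not parser.parse(f.read()):
        for i in range(parser.num_errors):
            print(parser.get_error(i))
        raise SystemExit("ONNX parse failed")

config = builder.create_builder_config()
config.max_workspace_size = 16 << 20             # 16 MiB, as reported above
config.set_flag(trt.BuilderFlag.INT8)            # FP32+INT8 precision
config.set_flag(trt.BuilderFlag.SPARSE_WEIGHTS)  # --sparsity=enable

# The network input is (-1, 3, 224, 224); pin min/opt/max to batch 128.
profile = builder.create_optimization_profile()
profile.set_shape("input", (128, 3, 224, 224),
                  (128, 3, 224, 224), (128, 3, 224, 224))
config.add_optimization_profile(profile)

engine_bytes = builder.build_serialized_network(network, config)

The serialized engine returned at the end is what trtexec would write out if --saveEngine were passed (the blank "Save engine:" field above shows it was not).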
[03/25/2022-13:24:01] [V] [TRT] Registered plugin creator - ::BatchTilePlugin_TRT version 1
[03/25/2022-13:24:01] [V] [TRT] Registered plugin creator - ::BatchedNMS_TRT version 1
[03/25/2022-13:24:01] [V] [TRT] Registered plugin creator - ::BatchedNMSDynamic_TRT version 1
[03/25/2022-13:24:01] [V] [TRT] Registered plugin creator - ::CoordConvAC version 1
[03/25/2022-13:24:01] [V] [TRT] Registered plugin creator - ::CropAndResize version 1
[03/25/2022-13:24:01] [V] [TRT] Registered plugin creator - ::CropAndResizeDynamic version 1
[03/25/2022-13:24:01] [V] [TRT] Registered plugin creator - ::DecodeBbox3DPlugin version 1
[03/25/2022-13:24:01] [V] [TRT] Registered plugin creator - ::DetectionLayer_TRT version 1
[03/25/2022-13:24:01] [V] [TRT] Registered plugin creator - ::EfficientNMS_TRT version 1
[03/25/2022-13:24:01] [V] [TRT] Registered plugin creator - ::EfficientNMS_ONNX_TRT version 1
[03/25/2022-13:24:01] [V] [TRT] Registered plugin creator - ::EfficientNMS_TFTRT_TRT version 1
[03/25/2022-13:24:01] [V] [TRT] Registered plugin creator - ::FlattenConcat_TRT version 1
[03/25/2022-13:24:01] [V] [TRT] Registered plugin creator - ::GenerateDetection_TRT version 1
[03/25/2022-13:24:01] [V] [TRT] Registered plugin creator - ::GridAnchor_TRT version 1
[03/25/2022-13:24:01] [V] [TRT] Registered plugin creator - ::GridAnchorRect_TRT version 1
[03/25/2022-13:24:01] [V] [TRT] Registered plugin creator - ::InstanceNormalization_TRT version 1
[03/25/2022-13:24:01] [V] [TRT] Registered plugin creator - ::LReLU_TRT version 1
[03/25/2022-13:24:01] [V] [TRT] Registered plugin creator - ::MultilevelCropAndResize_TRT version 1
[03/25/2022-13:24:01] [V] [TRT] Registered plugin creator - ::MultilevelProposeROI_TRT version 1
[03/25/2022-13:24:01] [V] [TRT] Registered plugin creator - ::NMS_TRT version 1
[03/25/2022-13:24:01] [V] [TRT] Registered plugin creator - ::NMSDynamic_TRT version 1
[03/25/2022-13:24:01] [V] [TRT] Registered plugin creator - ::Normalize_TRT version 1
[03/25/2022-13:24:01] [V] [TRT] Registered plugin creator - ::PillarScatterPlugin version 1
[03/25/2022-13:24:01] [V] [TRT] Registered plugin creator - ::PriorBox_TRT version 1
[03/25/2022-13:24:01] [V] [TRT] Registered plugin creator - ::ProposalLayer_TRT version 1
[03/25/2022-13:24:01] [V] [TRT] Registered plugin creator - ::Proposal version 1
[03/25/2022-13:24:01] [V] [TRT] Registered plugin creator - ::ProposalDynamic version 1
[03/25/2022-13:24:01] [V] [TRT] Registered plugin creator - ::PyramidROIAlign_TRT version 1
[03/25/2022-13:24:01] [V] [TRT] Registered plugin creator - ::Region_TRT version 1
[03/25/2022-13:24:01] [V] [TRT] Registered plugin creator - ::Reorg_TRT version 1
[03/25/2022-13:24:01] [V] [TRT] Registered plugin creator - ::ResizeNearest_TRT version 1
[03/25/2022-13:24:01] [V] [TRT] Registered plugin creator - ::RPROI_TRT version 1
[03/25/2022-13:24:01] [V] [TRT] Registered plugin creator - ::ScatterND version 1
[03/25/2022-13:24:01] [V] [TRT] Registered plugin creator - ::SpecialSlice_TRT version 1
[03/25/2022-13:24:01] [V] [TRT] Registered plugin creator - ::Split version 1
[03/25/2022-13:24:01] [V] [TRT] Registered plugin creator - ::VoxelGeneratorPlugin version 1
[03/25/2022-13:24:02] [I] [TRT] [MemUsageChange] Init CUDA: CPU +425, GPU +0, now: CPU 437, GPU 686 (MiB)
[03/25/2022-13:24:02] [I] [TRT] [MemUsageSnapshot] Begin constructing builder kernel library: CPU 438 MiB, GPU 686 MiB
[03/25/2022-13:24:02] [I] [TRT] [MemUsageSnapshot] End constructing builder kernel library: CPU 654 MiB, GPU 758 MiB
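trtexec registers TensorRT's standard plugin creators as soon as it starts, which is what produces the block of "Registered plugin creator" messages. When driving the builder from code instead, the equivalent step is a single call before constructing the parser; a minimal sketch under the same TensorRT 8.2 assumption:

import tensorrt as trt

TRT_LOGGER = trt.Logger(trt.Logger.VERBOSE)
# Registers the built-in creators (BatchedNMS_TRT, ScatterND, Split, ...)
# in the default "" namespace, one log line per creator as shown above.
trt.init_libnvinfer_plugins(TRT_LOGGER, "")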
[03/25/2022-13:24:02] [I] Start parsing network model
[03/25/2022-13:24:02] [I] [TRT] ----------------------------------------------------------------
[03/25/2022-13:24:02] [I] [TRT] Input filename: resnet50_quant_sparse.onnx
[03/25/2022-13:24:02] [I] [TRT] ONNX IR version: 0.0.8
[03/25/2022-13:24:02] [I] [TRT] Opset version: 11
[03/25/2022-13:24:02] [I] [TRT] Producer name:
[03/25/2022-13:24:02] [I] [TRT] Producer version:
[03/25/2022-13:24:02] [I] [TRT] Domain:
[03/25/2022-13:24:02] [I] [TRT] Model version: 0
[03/25/2022-13:24:02] [I] [TRT] Doc string:
[03/25/2022-13:24:02] [I] [TRT] ----------------------------------------------------------------
[03/25/2022-13:24:02] [V] [TRT] Plugin creator already registered - ::BatchTilePlugin_TRT version 1
[03/25/2022-13:24:02] [V] [TRT] Plugin creator already registered - ::BatchedNMS_TRT version 1
[03/25/2022-13:24:02] [V] [TRT] Plugin creator already registered - ::BatchedNMSDynamic_TRT version 1
[03/25/2022-13:24:02] [V] [TRT] Plugin creator already registered - ::CoordConvAC version 1
[03/25/2022-13:24:02] [V] [TRT] Plugin creator already registered - ::CropAndResize version 1
[03/25/2022-13:24:02] [V] [TRT] Plugin creator already registered - ::CropAndResizeDynamic version 1
[03/25/2022-13:24:02] [V] [TRT] Plugin creator already registered - ::DecodeBbox3DPlugin version 1
[03/25/2022-13:24:02] [V] [TRT] Plugin creator already registered - ::DetectionLayer_TRT version 1
[03/25/2022-13:24:02] [V] [TRT] Plugin creator already registered - ::EfficientNMS_TRT version 1
[03/25/2022-13:24:02] [V] [TRT] Plugin creator already registered - ::EfficientNMS_ONNX_TRT version 1
[03/25/2022-13:24:02] [V] [TRT] Plugin creator already registered - ::EfficientNMS_TFTRT_TRT version 1
[03/25/2022-13:24:02] [V] [TRT] Plugin creator already registered - ::FlattenConcat_TRT version 1
[03/25/2022-13:24:02] [V] [TRT] Plugin creator already registered - ::GenerateDetection_TRT version 1
[03/25/2022-13:24:02] [V] [TRT] Plugin creator already registered - ::GridAnchor_TRT version 1
[03/25/2022-13:24:02] [V] [TRT] Plugin creator already registered - ::GridAnchorRect_TRT version 1
[03/25/2022-13:24:02] [V] [TRT] Plugin creator already registered - ::InstanceNormalization_TRT version 1
[03/25/2022-13:24:02] [V] [TRT] Plugin creator already registered - ::LReLU_TRT version 1
[03/25/2022-13:24:02] [V] [TRT] Plugin creator already registered - ::MultilevelCropAndResize_TRT version 1
[03/25/2022-13:24:02] [V] [TRT] Plugin creator already registered - ::MultilevelProposeROI_TRT version 1
[03/25/2022-13:24:02] [V] [TRT] Plugin creator already registered - ::NMS_TRT version 1
[03/25/2022-13:24:02] [V] [TRT] Plugin creator already registered - ::NMSDynamic_TRT version 1
[03/25/2022-13:24:02] [V] [TRT] Plugin creator already registered - ::Normalize_TRT version 1
[03/25/2022-13:24:02] [V] [TRT] Plugin creator already registered - ::PillarScatterPlugin version 1
[03/25/2022-13:24:02] [V] [TRT] Plugin creator already registered - ::PriorBox_TRT version 1
[03/25/2022-13:24:02] [V] [TRT] Plugin creator already registered - ::ProposalLayer_TRT version 1
[03/25/2022-13:24:02] [V] [TRT] Plugin creator already registered - ::Proposal version 1
[03/25/2022-13:24:02] [V] [TRT] Plugin creator already registered - ::ProposalDynamic version 1
[03/25/2022-13:24:02] [V] [TRT] Plugin creator already registered - ::PyramidROIAlign_TRT version 1
[03/25/2022-13:24:02] [V] [TRT] Plugin creator already registered - ::Region_TRT version 1
[03/25/2022-13:24:02] [V] [TRT] Plugin creator already registered - ::Reorg_TRT version 1
[03/25/2022-13:24:02] [V] [TRT] Plugin creator already registered - ::ResizeNearest_TRT version 1
[03/25/2022-13:24:02] [V] [TRT] Plugin creator already registered - ::RPROI_TRT version 1
[03/25/2022-13:24:02] [V] [TRT] Plugin creator already registered - ::ScatterND version 1
[03/25/2022-13:24:02] [V] [TRT] Plugin creator already registered - ::SpecialSlice_TRT version 1
[03/25/2022-13:24:02] [V] [TRT] Plugin creator already registered - ::Split version 1
[03/25/2022-13:24:02] [V] [TRT] Plugin creator already registered - ::VoxelGeneratorPlugin version 1
[03/25/2022-13:24:02] [V] [TRT] Adding network input: input with dtype: float32, dimensions: (-1, 3, 224, 224)
[03/25/2022-13:24:02] [V] [TRT] Registering tensor: input for ONNX tensor: input
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: input.conv.module.weight
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.0.0.conv1.module.weight
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.0.0.conv2.module.weight
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.0.0.conv3.module.weight
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.0.0.identity.conv.module.weight
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.0.1.conv1.module.weight
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.0.1.conv2.module.weight
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.0.1.conv3.module.weight
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.0.2.conv1.module.weight
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.0.2.conv2.module.weight
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.0.2.conv3.module.weight
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.1.0.conv1.module.weight
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.1.0.conv2.module.weight
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.1.0.conv3.module.weight
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.1.0.identity.conv.module.weight
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.1.1.conv1.module.weight
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.1.1.conv2.module.weight
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.1.1.conv3.module.weight
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.1.2.conv1.module.weight
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.1.2.conv2.module.weight
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.1.2.conv3.module.weight
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.1.3.conv1.module.weight
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.1.3.conv2.module.weight
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.1.3.conv3.module.weight
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.2.0.conv1.module.weight
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.2.0.conv2.module.weight
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.2.0.conv3.module.weight
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.2.0.identity.conv.module.weight
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.2.1.conv1.module.weight
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.2.1.conv2.module.weight
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.2.1.conv3.module.weight
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.2.2.conv1.module.weight
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.2.2.conv2.module.weight
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.2.2.conv3.module.weight
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.2.3.conv1.module.weight
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.2.3.conv2.module.weight
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.2.3.conv3.module.weight
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.2.4.conv1.module.weight
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.2.4.conv2.module.weight
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.2.4.conv3.module.weight
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.2.5.conv1.module.weight
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.2.5.conv2.module.weight
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.2.5.conv3.module.weight
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.3.0.conv1.module.weight
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.3.0.conv2.module.weight
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.3.0.conv3.module.weight
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.3.0.identity.conv.module.weight
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.3.1.conv1.module.weight
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.3.1.conv2.module.weight
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.3.1.conv3.module.weight
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.3.2.conv1.module.weight
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.3.2.conv2.module.weight
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.3.2.conv3.module.weight
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: input.bn.bn.weight
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: input.bn.bn.bias
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: input.bn.bn.running_mean
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: input.bn.bn.running_var
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.0.0.bn1.bn.weight
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.0.0.bn1.bn.bias
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.0.0.bn1.bn.running_mean
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.0.0.bn1.bn.running_var
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.0.0.identity.bn.bn.weight
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.0.0.identity.bn.bn.bias
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.0.0.identity.bn.bn.running_mean
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.0.0.identity.bn.bn.running_var
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.0.0.bn2.bn.weight
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.0.0.bn2.bn.bias
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.0.0.bn2.bn.running_mean
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.0.0.bn2.bn.running_var
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.0.0.bn3.bn.weight
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.0.0.bn3.bn.bias
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.0.0.bn3.bn.running_mean
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.0.0.bn3.bn.running_var
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.0.1.bn1.bn.weight
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.0.1.bn1.bn.bias
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.0.1.bn1.bn.running_mean
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.0.1.bn1.bn.running_var
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.0.1.bn2.bn.weight
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.0.1.bn2.bn.bias
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.0.1.bn2.bn.running_mean
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.0.1.bn2.bn.running_var
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.0.1.bn3.bn.weight
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.0.1.bn3.bn.bias
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.0.1.bn3.bn.running_mean
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.0.1.bn3.bn.running_var
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.0.2.bn1.bn.weight
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.0.2.bn1.bn.bias
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.0.2.bn1.bn.running_mean
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.0.2.bn1.bn.running_var
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.0.2.bn2.bn.weight
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.0.2.bn2.bn.bias
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.0.2.bn2.bn.running_mean
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.0.2.bn2.bn.running_var
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.0.2.bn3.bn.weight
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.0.2.bn3.bn.bias
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.0.2.bn3.bn.running_mean
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.0.2.bn3.bn.running_var
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.1.0.bn1.bn.weight
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.1.0.bn1.bn.bias
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.1.0.bn1.bn.running_mean
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.1.0.bn1.bn.running_var
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.1.0.identity.bn.bn.weight
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.1.0.identity.bn.bn.bias
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.1.0.identity.bn.bn.running_mean
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.1.0.identity.bn.bn.running_var
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.1.0.bn2.bn.weight
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.1.0.bn2.bn.bias
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.1.0.bn2.bn.running_mean
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.1.0.bn2.bn.running_var
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.1.0.bn3.bn.weight
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.1.0.bn3.bn.bias
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.1.0.bn3.bn.running_mean
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.1.0.bn3.bn.running_var
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.1.1.bn1.bn.weight
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.1.1.bn1.bn.bias
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.1.1.bn1.bn.running_mean
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.1.1.bn1.bn.running_var
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.1.1.bn2.bn.weight
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.1.1.bn2.bn.bias
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.1.1.bn2.bn.running_mean
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.1.1.bn2.bn.running_var
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.1.1.bn3.bn.weight
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.1.1.bn3.bn.bias
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.1.1.bn3.bn.running_mean
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.1.1.bn3.bn.running_var
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.1.2.bn1.bn.weight
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.1.2.bn1.bn.bias
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.1.2.bn1.bn.running_mean
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.1.2.bn1.bn.running_var
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.1.2.bn2.bn.weight
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.1.2.bn2.bn.bias
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.1.2.bn2.bn.running_mean
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.1.2.bn2.bn.running_var
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.1.2.bn3.bn.weight
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.1.2.bn3.bn.bias
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.1.2.bn3.bn.running_mean
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.1.2.bn3.bn.running_var
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.1.3.bn1.bn.weight
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.1.3.bn1.bn.bias
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.1.3.bn1.bn.running_mean
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.1.3.bn1.bn.running_var
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.1.3.bn2.bn.weight
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.1.3.bn2.bn.bias
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.1.3.bn2.bn.running_mean
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.1.3.bn2.bn.running_var
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.1.3.bn3.bn.weight
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.1.3.bn3.bn.bias
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.1.3.bn3.bn.running_mean
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.1.3.bn3.bn.running_var
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.2.0.bn1.bn.weight
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.2.0.bn1.bn.bias
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.2.0.bn1.bn.running_mean
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.2.0.bn1.bn.running_var
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.2.0.identity.bn.bn.weight
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.2.0.identity.bn.bn.bias
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.2.0.identity.bn.bn.running_mean
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.2.0.identity.bn.bn.running_var
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.2.0.bn2.bn.weight
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.2.0.bn2.bn.bias
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.2.0.bn2.bn.running_mean
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.2.0.bn2.bn.running_var
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.2.0.bn3.bn.weight
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.2.0.bn3.bn.bias
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.2.0.bn3.bn.running_mean
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.2.0.bn3.bn.running_var
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.2.1.bn1.bn.weight
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.2.1.bn1.bn.bias
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.2.1.bn1.bn.running_mean
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.2.1.bn1.bn.running_var
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.2.1.bn2.bn.weight
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.2.1.bn2.bn.bias
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.2.1.bn2.bn.running_mean
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.2.1.bn2.bn.running_var
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.2.1.bn3.bn.weight
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.2.1.bn3.bn.bias
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.2.1.bn3.bn.running_mean
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.2.1.bn3.bn.running_var
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.2.2.bn1.bn.weight
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.2.2.bn1.bn.bias
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.2.2.bn1.bn.running_mean
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.2.2.bn1.bn.running_var
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.2.2.bn2.bn.weight
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.2.2.bn2.bn.bias
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.2.2.bn2.bn.running_mean
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.2.2.bn2.bn.running_var
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.2.2.bn3.bn.weight
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.2.2.bn3.bn.bias
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.2.2.bn3.bn.running_mean
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.2.2.bn3.bn.running_var
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.2.3.bn1.bn.weight
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.2.3.bn1.bn.bias
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.2.3.bn1.bn.running_mean
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.2.3.bn1.bn.running_var
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.2.3.bn2.bn.weight
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.2.3.bn2.bn.bias
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.2.3.bn2.bn.running_mean
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.2.3.bn2.bn.running_var
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.2.3.bn3.bn.weight
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.2.3.bn3.bn.bias
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.2.3.bn3.bn.running_mean
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.2.3.bn3.bn.running_var
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.2.4.bn1.bn.weight
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.2.4.bn1.bn.bias
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.2.4.bn1.bn.running_mean
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.2.4.bn1.bn.running_var
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.2.4.bn2.bn.weight
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.2.4.bn2.bn.bias
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.2.4.bn2.bn.running_mean
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.2.4.bn2.bn.running_var
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.2.4.bn3.bn.weight
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.2.4.bn3.bn.bias
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.2.4.bn3.bn.running_mean
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.2.4.bn3.bn.running_var
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.2.5.bn1.bn.weight
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.2.5.bn1.bn.bias
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.2.5.bn1.bn.running_mean
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.2.5.bn1.bn.running_var
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.2.5.bn2.bn.weight
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.2.5.bn2.bn.bias
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.2.5.bn2.bn.running_mean
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.2.5.bn2.bn.running_var
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.2.5.bn3.bn.weight
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.2.5.bn3.bn.bias
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.2.5.bn3.bn.running_mean
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.2.5.bn3.bn.running_var
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.3.0.bn1.bn.weight
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.3.0.bn1.bn.bias
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.3.0.bn1.bn.running_mean
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.3.0.bn1.bn.running_var
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.3.0.identity.bn.bn.weight
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.3.0.identity.bn.bn.bias
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.3.0.identity.bn.bn.running_mean
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.3.0.identity.bn.bn.running_var
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.3.0.bn2.bn.weight
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.3.0.bn2.bn.bias
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.3.0.bn2.bn.running_mean
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.3.0.bn2.bn.running_var
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.3.0.bn3.bn.weight
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.3.0.bn3.bn.bias
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.3.0.bn3.bn.running_mean
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.3.0.bn3.bn.running_var
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.3.1.bn1.bn.weight
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.3.1.bn1.bn.bias
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.3.1.bn1.bn.running_mean
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.3.1.bn1.bn.running_var
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.3.1.bn2.bn.weight
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.3.1.bn2.bn.bias
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.3.1.bn2.bn.running_mean
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.3.1.bn2.bn.running_var
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.3.1.bn3.bn.weight
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.3.1.bn3.bn.bias
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.3.1.bn3.bn.running_mean
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.3.1.bn3.bn.running_var
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.3.2.bn1.bn.weight
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.3.2.bn1.bn.bias
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.3.2.bn1.bn.running_mean
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.3.2.bn1.bn.running_var
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.3.2.bn2.bn.weight
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.3.2.bn2.bn.bias
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.3.2.bn2.bn.running_mean
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.3.2.bn2.bn.running_var
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.3.2.bn3.bn.weight
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.3.2.bn3.bn.bias
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.3.2.bn3.bn.running_mean
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: sections.3.2.bn3.bn.running_var
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: 2090
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: classifier.fc.weight
[03/25/2022-13:24:02] [V] [TRT] Importing initializer: classifier.fc.bias
[03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_0 [Constant]
[03/25/2022-13:24:02] [V] [TRT] Constant_0 [Constant] inputs:
[03/25/2022-13:24:02] [V] [TRT] Constant_0 [Constant] outputs: [1175 -> ()[FLOAT]],
[03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_1 [Constant]
[03/25/2022-13:24:02] [V] [TRT] Constant_1 [Constant] inputs:
[03/25/2022-13:24:02] [V] [TRT] Constant_1 [Constant] outputs: [1176 -> ()[INT8]],
[03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_3 [Constant]
[03/25/2022-13:24:02] [V] [TRT] Constant_3 [Constant] inputs:
[03/25/2022-13:24:02] [V] [TRT] Constant_3 [Constant] outputs: [1178 -> ()[FLOAT]],
[03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_4 [Constant]
[03/25/2022-13:24:02] [V] [TRT] Constant_4 [Constant] inputs:
[03/25/2022-13:24:02] [V] [TRT] Constant_4 [Constant] outputs: [1179 -> ()[INT8]],
[03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_6 [Constant]
[03/25/2022-13:24:02] [V] [TRT] Constant_6 [Constant] inputs:
[03/25/2022-13:24:02] [V] [TRT] Constant_6 [Constant] outputs: [1181 -> ()[FLOAT]],
[03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_7 [Constant]
[03/25/2022-13:24:02] [V] [TRT] Constant_7 [Constant] inputs:
[03/25/2022-13:24:02] [V] [TRT] Constant_7 [Constant] outputs: [1182 -> ()[INT8]],
[03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_9 [Constant]
[03/25/2022-13:24:02] [V] [TRT] Constant_9 [Constant] inputs:
[03/25/2022-13:24:02] [V] [TRT] Constant_9 [Constant] outputs: [1184 -> ()[FLOAT]],
[03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_10 [Constant]
[03/25/2022-13:24:02] [V] [TRT] Constant_10 [Constant] inputs:
[03/25/2022-13:24:02] [V] [TRT] Constant_10 [Constant] outputs: [1185 -> ()[INT8]],
[03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_16 [Constant]
[03/25/2022-13:24:02] [V] [TRT] Constant_16 [Constant] inputs:
[03/25/2022-13:24:02] [V] [TRT] Constant_16 [Constant] outputs: [1191 -> ()[FLOAT]],
[03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_17 [Constant]
[03/25/2022-13:24:02] [V] [TRT] Constant_17 [Constant] inputs:
[03/25/2022-13:24:02] [V] [TRT] Constant_17 [Constant] outputs: [1192 -> ()[INT8]],
[03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_19 [Constant]
[03/25/2022-13:24:02] [V] [TRT] Constant_19 [Constant] inputs:
[03/25/2022-13:24:02] [V] [TRT] Constant_19 [Constant] outputs: [1194 -> ()[FLOAT]],
[03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_20 [Constant]
[03/25/2022-13:24:02] [V] [TRT] Constant_20 [Constant] inputs:
[03/25/2022-13:24:02] [V] [TRT] Constant_20 [Constant] outputs: [1195 -> ()[INT8]],
[03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_22 [Constant]
[03/25/2022-13:24:02] [V] [TRT] Constant_22 [Constant] inputs:
[03/25/2022-13:24:02] [V] [TRT] Constant_22 [Constant] outputs: [1197 -> ()[FLOAT]],
[03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_23 [Constant]
[03/25/2022-13:24:02] [V] [TRT] Constant_23 [Constant] inputs:
[03/25/2022-13:24:02] [V] [TRT] Constant_23 [Constant] outputs: [1198 -> ()[INT8]],
[03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_25 [Constant]
[03/25/2022-13:24:02] [V] [TRT] Constant_25 [Constant] inputs:
[03/25/2022-13:24:02] [V] [TRT] Constant_25 [Constant] outputs: [1200 -> ()[FLOAT]],
[03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_26 [Constant]
[03/25/2022-13:24:02] [V] [TRT] Constant_26 [Constant] inputs:
[03/25/2022-13:24:02] [V] [TRT] Constant_26 [Constant] outputs: [1201 -> ()[INT8]],
[03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_31 [Constant]
[03/25/2022-13:24:02] [V] [TRT] Constant_31 [Constant] inputs:
[03/25/2022-13:24:02] [V] [TRT] Constant_31 [Constant] outputs: [1206 -> ()[FLOAT]],
[03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_32 [Constant]
[03/25/2022-13:24:02] [V] [TRT] Constant_32 [Constant] inputs:
[03/25/2022-13:24:02] [V] [TRT] Constant_32 [Constant] outputs: [1207 -> ()[INT8]],
[03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_34 [Constant]
[03/25/2022-13:24:02] [V] [TRT] Constant_34 [Constant] inputs:
[03/25/2022-13:24:02] [V] [TRT] Constant_34 [Constant] outputs: [1209 -> ()[FLOAT]],
[03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_35 [Constant]
[03/25/2022-13:24:02] [V] [TRT] Constant_35 [Constant] inputs:
[03/25/2022-13:24:02] [V] [TRT] Constant_35 [Constant] outputs: [1210 -> ()[INT8]],
[03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_37 [Constant]
[03/25/2022-13:24:02] [V] [TRT] Constant_37 [Constant] inputs:
[03/25/2022-13:24:02] [V] [TRT] Constant_37 [Constant] outputs: [1212 -> ()[FLOAT]],
[03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_38 [Constant]
[03/25/2022-13:24:02] [V] [TRT] Constant_38 [Constant] inputs:
[03/25/2022-13:24:02] [V] [TRT] Constant_38 [Constant] outputs: [1213 -> ()[INT8]],
[03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_40 [Constant]
[03/25/2022-13:24:02] [V] [TRT] Constant_40 [Constant] inputs:
[03/25/2022-13:24:02] [V] [TRT] Constant_40 [Constant] outputs: [1215 -> ()[FLOAT]],
[03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_41 [Constant]
[03/25/2022-13:24:02] [V] [TRT] Constant_41 [Constant] inputs:
[03/25/2022-13:24:02] [V] [TRT] Constant_41 [Constant] outputs: [1216 -> ()[INT8]],
[03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_46 [Constant]
[03/25/2022-13:24:02] [V] [TRT] Constant_46 [Constant] inputs:
[03/25/2022-13:24:02] [V] [TRT] Constant_46 [Constant] outputs: [1221 -> ()[FLOAT]],
[03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_47 [Constant]
[03/25/2022-13:24:02] [V] [TRT] Constant_47 [Constant] inputs:
[03/25/2022-13:24:02] [V] [TRT] Constant_47 [Constant] outputs: [1222 -> ()[INT8]],
[03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_49 [Constant]
[03/25/2022-13:24:02] [V] [TRT] Constant_49 [Constant] inputs:
[03/25/2022-13:24:02] [V] [TRT] Constant_49 [Constant] outputs: [1224 -> ()[FLOAT]],
[03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_50 [Constant]
[03/25/2022-13:24:02] [V] [TRT] Constant_50 [Constant] inputs:
[03/25/2022-13:24:02] [V] [TRT] Constant_50 [Constant] outputs: [1225 -> ()[INT8]],
[03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_52 [Constant]
[03/25/2022-13:24:02] [V] [TRT] Constant_52 [Constant] inputs:
[03/25/2022-13:24:02] [V] [TRT] Constant_52 [Constant] outputs: [1227 -> ()[FLOAT]],
[03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_53 [Constant]
[03/25/2022-13:24:02] [V] [TRT] Constant_53 [Constant] inputs:
[03/25/2022-13:24:02] [V] [TRT] Constant_53 [Constant] outputs: [1228 -> ()[INT8]],
[03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_55 [Constant]
[03/25/2022-13:24:02] [V] [TRT] Constant_55 [Constant] inputs:
[03/25/2022-13:24:02] [V] [TRT] Constant_55 [Constant] outputs: [1230 -> ()[FLOAT]],
[03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_56 [Constant]
[03/25/2022-13:24:02] [V] [TRT] Constant_56 [Constant] inputs:
[03/25/2022-13:24:02] [V] [TRT] Constant_56 [Constant] outputs: [1231 -> ()[INT8]],
[03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_60 [Constant]
[03/25/2022-13:24:02] [V] [TRT] Constant_60 [Constant] inputs:
[03/25/2022-13:24:02] [V] [TRT] Constant_60 [Constant] outputs: [1235 -> ()[FLOAT]],
[03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_61 [Constant]
[03/25/2022-13:24:02] [V] [TRT] Constant_61 [Constant] inputs:
[03/25/2022-13:24:02] [V] [TRT] Constant_61 [Constant] outputs: [1236 -> ()[INT8]],
[03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_63 [Constant]
[03/25/2022-13:24:02] [V] [TRT] Constant_63 [Constant] inputs:
[03/25/2022-13:24:02] [V] [TRT] Constant_63 [Constant] outputs: [1238 -> ()[FLOAT]],
[03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_64 [Constant]
[03/25/2022-13:24:02] [V] [TRT] Constant_64 [Constant] inputs:
[03/25/2022-13:24:02] [V] [TRT] Constant_64 [Constant] outputs: [1239 -> ()[INT8]],
[03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_66 [Constant]
[03/25/2022-13:24:02] [V] [TRT] Constant_66 [Constant] inputs:
[03/25/2022-13:24:02] [V] [TRT] Constant_66 [Constant] outputs: [1241 -> ()[FLOAT]],
[03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_67 [Constant]
[03/25/2022-13:24:02] [V] [TRT] Constant_67 [Constant] inputs:
[03/25/2022-13:24:02] [V] [TRT] Constant_67 [Constant] outputs: [1242 -> ()[INT8]],
[03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_69 [Constant]
[03/25/2022-13:24:02] [V] [TRT] Constant_69 [Constant] inputs:
[03/25/2022-13:24:02] [V] [TRT] Constant_69 [Constant] outputs: [1244 -> ()[FLOAT]],
[03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_70 [Constant]
[03/25/2022-13:24:02] [V] [TRT] Constant_70 [Constant] inputs:
[03/25/2022-13:24:02] [V] [TRT] Constant_70 [Constant] outputs: [1245 -> ()[INT8]],
[03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_74 [Constant]
[03/25/2022-13:24:02] [V] [TRT] Constant_74 [Constant] inputs:
[03/25/2022-13:24:02] [V] [TRT] Constant_74 [Constant] outputs: [1249 -> ()[FLOAT]],
[03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_75 [Constant]
[03/25/2022-13:24:02] [V] [TRT] Constant_75 [Constant] inputs:
[03/25/2022-13:24:02] [V] [TRT] Constant_75 [Constant] outputs: [1250 -> ()[INT8]],
[03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_77 [Constant]
[03/25/2022-13:24:02] [V] [TRT] Constant_77 [Constant] inputs:
[03/25/2022-13:24:02] [V] [TRT] Constant_77 [Constant] outputs: [1252 -> ()[FLOAT]],
[03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_78 [Constant]
[03/25/2022-13:24:02] [V] [TRT] Constant_78 [Constant] inputs:
[03/25/2022-13:24:02] [V] [TRT] Constant_78 [Constant] outputs: [1253 -> ()[INT8]],
[03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_82 [Constant]
[03/25/2022-13:24:02] [V] [TRT] Constant_82 [Constant] inputs:
[03/25/2022-13:24:02] [V] [TRT] Constant_82 [Constant] outputs: [1257 -> ()[FLOAT]],
[03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_83 [Constant]
[03/25/2022-13:24:02] [V] [TRT] Constant_83 [Constant] inputs:
[03/25/2022-13:24:02] [V] [TRT] Constant_83 [Constant] outputs: [1258 -> ()[INT8]],
[03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_85 [Constant]
[03/25/2022-13:24:02] [V] [TRT] Constant_85 [Constant] inputs:
[03/25/2022-13:24:02] [V] [TRT] Constant_85 [Constant] outputs: [1260 -> ()[FLOAT]],
[03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_86 [Constant]
[03/25/2022-13:24:02] [V] [TRT] Constant_86 [Constant] inputs:
[03/25/2022-13:24:02] [V] [TRT] Constant_86 [Constant] outputs: [1261 -> ()[INT8]],
[03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_88 [Constant]
[03/25/2022-13:24:02] [V] [TRT] Constant_88 [Constant] inputs:
[03/25/2022-13:24:02] [V] [TRT] Constant_88 [Constant] outputs: [1263 -> ()[FLOAT]],
[03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_89 [Constant]
[03/25/2022-13:24:02] [V] [TRT] Constant_89 [Constant] inputs:
[03/25/2022-13:24:02] [V] [TRT] Constant_89 [Constant] outputs: [1264 -> ()[INT8]],
[03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_91 [Constant]
[03/25/2022-13:24:02] [V] [TRT] Constant_91 [Constant] inputs:
[03/25/2022-13:24:02] [V] [TRT] Constant_91 [Constant] outputs: [1266 -> ()[FLOAT]],
[03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_92 [Constant]
[03/25/2022-13:24:02] [V] [TRT] Constant_92 [Constant] inputs:
[03/25/2022-13:24:02] [V] [TRT] Constant_92 [Constant] outputs: [1267 -> ()[INT8]],
[03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_97 [Constant]
[03/25/2022-13:24:02] [V] [TRT] Constant_97 [Constant] inputs:
[03/25/2022-13:24:02] [V] [TRT] Constant_97 [Constant] outputs: [1272 -> ()[FLOAT]],
[03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_98 [Constant]
[03/25/2022-13:24:02] [V] [TRT] Constant_98 [Constant] inputs:
[03/25/2022-13:24:02] [V] [TRT] Constant_98 [Constant] outputs: [1273 -> ()[INT8]],
[03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_100 [Constant]
[03/25/2022-13:24:02] [V] [TRT] Constant_100 [Constant] inputs:
[03/25/2022-13:24:02] [V] [TRT] Constant_100 [Constant] outputs: [1275 -> ()[FLOAT]],
[03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_101 [Constant]
[03/25/2022-13:24:02] [V] [TRT] Constant_101 [Constant] inputs:
[03/25/2022-13:24:02] [V] [TRT] Constant_101 [Constant] outputs: [1276 -> ()[INT8]],
[03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_103 [Constant]
[03/25/2022-13:24:02] [V] [TRT] Constant_103 [Constant] inputs:
[03/25/2022-13:24:02] [V] [TRT] Constant_103 [Constant] outputs: [1278 -> ()[FLOAT]],
[03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_104 [Constant]
[03/25/2022-13:24:02] [V] [TRT] Constant_104 [Constant] inputs:
[03/25/2022-13:24:02] [V] [TRT] Constant_104 [Constant] outputs: [1279 -> ()[INT8]],
[03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_106 [Constant]
[03/25/2022-13:24:02] [V] [TRT] Constant_106 [Constant] inputs:
[03/25/2022-13:24:02] [V] [TRT] Constant_106 [Constant] outputs: [1281 -> ()[FLOAT]],
[03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_107 [Constant]
[03/25/2022-13:24:02] [V] [TRT] Constant_107 [Constant] inputs:
[03/25/2022-13:24:02] [V] [TRT] Constant_107 [Constant] outputs: [1282 -> ()[INT8]],
[03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_112 [Constant]
[03/25/2022-13:24:02] [V] [TRT] Constant_112 [Constant] inputs:
[03/25/2022-13:24:02] [V] [TRT] Constant_112 [Constant] outputs: [1287 -> ()[FLOAT]],
[03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_113 [Constant]
[03/25/2022-13:24:02] [V] [TRT] Constant_113 [Constant] inputs:
[03/25/2022-13:24:02] [V] [TRT] Constant_113 [Constant] outputs: [1288 -> ()[INT8]],
[03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_115 [Constant]
[03/25/2022-13:24:02] [V] [TRT] Constant_115 [Constant] inputs:
[03/25/2022-13:24:02] [V] [TRT] Constant_115 [Constant] outputs: [1290 -> ()[FLOAT]],
[03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_116 [Constant]
[03/25/2022-13:24:02] [V] [TRT] Constant_116 [Constant] inputs:
[03/25/2022-13:24:02] [V] [TRT] Constant_116 [Constant] outputs: [1291 -> ()[INT8]],
[03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_118 [Constant]
[03/25/2022-13:24:02] [V] [TRT] Constant_118 [Constant] inputs:
[03/25/2022-13:24:02] [V] [TRT] Constant_118 [Constant] outputs: [1293 -> ()[FLOAT]],
[03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_119 [Constant]
[03/25/2022-13:24:02] [V] [TRT] Constant_119 [Constant] inputs:
[03/25/2022-13:24:02] [V] [TRT] Constant_119 [Constant] outputs: [1294 -> ()[INT8]],
[03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_121 [Constant]
[03/25/2022-13:24:02] [V] [TRT] Constant_121 [Constant] inputs:
[03/25/2022-13:24:02] [V] [TRT] Constant_121 [Constant] outputs: [1296 -> ()[FLOAT]],
[03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_122 [Constant]
[03/25/2022-13:24:02] [V] [TRT] Constant_122 [Constant] inputs:
[03/25/2022-13:24:02] [V] [TRT] Constant_122 [Constant] outputs: [1297 -> ()[INT8]],
[03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_126 [Constant]
[03/25/2022-13:24:02] [V] [TRT] Constant_126 [Constant] inputs:
[03/25/2022-13:24:02] [V] [TRT] Constant_126 [Constant] outputs: [1301 -> ()[FLOAT]],
[03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_127 [Constant]
[03/25/2022-13:24:02] [V] [TRT] Constant_127 [Constant] inputs:
[03/25/2022-13:24:02] [V] [TRT] Constant_127 [Constant] outputs: [1302 -> ()[INT8]],
[03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_129 [Constant]
[03/25/2022-13:24:02] [V] [TRT] Constant_129 [Constant] inputs:
[03/25/2022-13:24:02] [V] [TRT] Constant_129 [Constant] outputs: [1304 -> ()[FLOAT]],
[03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_130 [Constant]
[03/25/2022-13:24:02] [V] [TRT] Constant_130 [Constant] inputs:
[03/25/2022-13:24:02] [V] [TRT] Constant_130 [Constant] outputs: [1305 -> ()[INT8]],
[03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_134 [Constant]
[03/25/2022-13:24:02] [V] [TRT] Constant_134 [Constant] inputs:
[03/25/2022-13:24:02] [V] [TRT] Constant_134 [Constant] outputs: [1309 -> ()[FLOAT]],
[03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_135 [Constant]
[03/25/2022-13:24:02] [V] [TRT] Constant_135 [Constant] inputs:
[03/25/2022-13:24:02] [V] [TRT] Constant_135 [Constant] outputs: [1310 -> ()[INT8]],
[03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_137 [Constant]
[03/25/2022-13:24:02] [V] [TRT] Constant_137 [Constant] inputs:
[03/25/2022-13:24:02] [V] [TRT] Constant_137 [Constant] outputs: [1312 -> ()[FLOAT]],
[03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_138 [Constant]
[03/25/2022-13:24:02] [V] [TRT] Constant_138 [Constant] inputs:
[03/25/2022-13:24:02] [V] [TRT] Constant_138 [Constant] outputs: [1313 -> ()[INT8]],
[03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_140 [Constant]
[03/25/2022-13:24:02] [V] [TRT] Constant_140 [Constant] inputs:
[03/25/2022-13:24:02] [V] [TRT] Constant_140 [Constant] outputs: [1315 -> ()[FLOAT]],
[03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_141 [Constant]
[03/25/2022-13:24:02] [V] [TRT] Constant_141 [Constant] inputs:
[03/25/2022-13:24:02] [V] [TRT] Constant_141 [Constant] outputs: [1316 -> ()[INT8]],
[03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_143 [Constant]
[03/25/2022-13:24:02] [V] [TRT] Constant_143 [Constant] inputs:
[03/25/2022-13:24:02] [V] [TRT] Constant_143 [Constant] outputs: [1318 -> ()[FLOAT]],
[03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_144 [Constant]
[03/25/2022-13:24:02] [V] [TRT] Constant_144 [Constant] inputs:
[03/25/2022-13:24:02] [V] [TRT] Constant_144 [Constant] outputs: [1319 -> ()[INT8]],
[03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_149 [Constant]
[03/25/2022-13:24:02] [V] [TRT] Constant_149 [Constant] inputs:
[03/25/2022-13:24:02] [V] [TRT] Constant_149 [Constant] outputs: [1324 -> ()[FLOAT]],
[03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_150 [Constant]
[03/25/2022-13:24:02] [V] [TRT] Constant_150 [Constant] inputs:
[03/25/2022-13:24:02] [V] [TRT] Constant_150 [Constant] outputs: [1325 -> ()[INT8]],
[03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_152 [Constant]
[03/25/2022-13:24:02] [V] [TRT] Constant_152 [Constant] inputs:
[03/25/2022-13:24:02] [V] [TRT] Constant_152 [Constant] outputs: [1327 -> ()[FLOAT]],
[03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_153 [Constant]
[03/25/2022-13:24:02] [V] [TRT] Constant_153 [Constant] inputs:
[03/25/2022-13:24:02] [V] [TRT] Constant_153 [Constant] outputs: [1328 -> ()[INT8]],
[03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_155 [Constant]
[03/25/2022-13:24:02] [V] [TRT] Constant_155 [Constant] inputs:
[03/25/2022-13:24:02] [V] [TRT] Constant_155 [Constant] outputs: [1330 -> ()[FLOAT]],
[03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_156 [Constant]
[03/25/2022-13:24:02] [V] [TRT] Constant_156 [Constant] inputs:
[03/25/2022-13:24:02] [V] [TRT] Constant_156 [Constant] outputs: [1331 -> ()[INT8]],
[03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_158 [Constant]
[03/25/2022-13:24:02] [V] [TRT] Constant_158 [Constant] inputs:
[03/25/2022-13:24:02] [V] [TRT] Constant_158 [Constant] outputs: [1333 -> ()[FLOAT]],
[03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_159 [Constant]
[03/25/2022-13:24:02] [V] [TRT] Constant_159 [Constant] inputs:
[03/25/2022-13:24:02] [V] [TRT] Constant_159 [Constant] outputs: [1334 -> ()[INT8]],
[03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_164 [Constant]
[03/25/2022-13:24:02] [V] [TRT] Constant_164 [Constant] inputs:
[03/25/2022-13:24:02] [V] [TRT] Constant_164 [Constant] outputs: [1339 -> ()[FLOAT]],
[03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_165 [Constant]
[03/25/2022-13:24:02] [V] [TRT] Constant_165 [Constant] inputs:
[03/25/2022-13:24:02] [V] [TRT] Constant_165 [Constant] outputs: [1340 -> ()[INT8]],
[03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_167 [Constant]
[03/25/2022-13:24:02] [V] [TRT] Constant_167 [Constant] inputs:
[03/25/2022-13:24:02] [V] [TRT] Constant_167 [Constant] outputs: [1342 -> ()[FLOAT]],
[03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_168 [Constant]
[03/25/2022-13:24:02] [V] [TRT] Constant_168 [Constant] inputs:
[03/25/2022-13:24:02] [V] [TRT] Constant_168 [Constant] outputs: [1343 -> ()[INT8]],
[03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_170 [Constant]
[03/25/2022-13:24:02] [V] [TRT] Constant_170 [Constant] inputs:
[03/25/2022-13:24:02] [V] [TRT] Constant_170 [Constant] outputs: [1345 -> ()[FLOAT]],
[03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_171 [Constant]
[03/25/2022-13:24:02] [V] [TRT] Constant_171 [Constant] inputs:
[03/25/2022-13:24:02] [V] [TRT] Constant_171 [Constant] outputs: [1346 -> ()[INT8]],
[03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_173 [Constant]
[03/25/2022-13:24:02] [V] [TRT] Constant_173 [Constant] inputs:
[03/25/2022-13:24:02] [V] [TRT] Constant_173 [Constant] outputs: [1348 -> ()[FLOAT]],
[03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_174 [Constant]
[03/25/2022-13:24:02] [V] [TRT] Constant_174 [Constant] inputs:
[03/25/2022-13:24:02] [V] [TRT] Constant_174 [Constant] outputs: [1349 -> ()[INT8]],
[03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_178 [Constant]
[03/25/2022-13:24:02] [V] [TRT] Constant_178 [Constant] inputs:
[03/25/2022-13:24:02] [V] [TRT] Constant_178 [Constant] outputs: [1353 -> ()[FLOAT]],
[03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_179 [Constant]
[03/25/2022-13:24:02] [V] [TRT] Constant_179 [Constant] inputs:
[03/25/2022-13:24:02] [V] [TRT] Constant_179 [Constant] outputs: [1354 -> ()[INT8]],
[03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_181 [Constant]
[03/25/2022-13:24:02] [V] [TRT] Constant_181 [Constant] inputs:
[03/25/2022-13:24:02] [V] [TRT] Constant_181 [Constant] outputs: [1356 -> ()[FLOAT]],
[03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_182 [Constant]
[03/25/2022-13:24:02] [V] [TRT] Constant_182 [Constant] inputs:
[03/25/2022-13:24:02] [V] [TRT] Constant_182 [Constant] outputs: [1357 -> ()[INT8]],
[03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_186 [Constant]
[03/25/2022-13:24:02] [V] [TRT] Constant_186 [Constant] inputs:
[03/25/2022-13:24:02] [V] [TRT] Constant_186 [Constant] outputs: [1361 -> ()[FLOAT]],
[03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_187 [Constant]
[03/25/2022-13:24:02] [V] [TRT] Constant_187 [Constant] inputs:
[03/25/2022-13:24:02] [V] [TRT] Constant_187 [Constant] outputs: [1362 -> ()[INT8]],
[03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_189 [Constant]
[03/25/2022-13:24:02] [V] [TRT] Constant_189 [Constant] inputs:
[03/25/2022-13:24:02] [V] [TRT] Constant_189 [Constant] outputs: [1364 -> ()[FLOAT]],
[03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_190 [Constant]
[03/25/2022-13:24:02] [V] [TRT] Constant_190 [Constant] inputs:
[03/25/2022-13:24:02] [V] [TRT] Constant_190 [Constant] outputs: [1365 -> ()[INT8]],
[03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_192 [Constant]
[03/25/2022-13:24:02] [V] [TRT] Constant_192 [Constant] inputs:
[03/25/2022-13:24:02] [V] [TRT] Constant_192 [Constant] outputs: [1367 -> ()[FLOAT]],
[03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_193 [Constant]
[03/25/2022-13:24:02] [V] [TRT] Constant_193 [Constant] inputs:
[03/25/2022-13:24:02] [V] [TRT] Constant_193 [Constant] outputs: [1368 -> ()[INT8]],
[03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_195 [Constant]
[03/25/2022-13:24:02] [V] [TRT] Constant_195 [Constant] inputs:
[03/25/2022-13:24:02] [V] [TRT] Constant_195 [Constant] outputs: [1370 -> ()[FLOAT]],
[03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_196 [Constant]
[03/25/2022-13:24:02] [V] [TRT] Constant_196 [Constant] inputs:
[03/25/2022-13:24:02] [V] [TRT] Constant_196 [Constant] outputs: [1371 -> ()[INT8]],
[03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_201 [Constant]
[03/25/2022-13:24:02] [V] [TRT] Constant_201 [Constant] inputs:
[03/25/2022-13:24:02] [V] [TRT] Constant_201 [Constant] outputs: [1376 -> ()[FLOAT]],
[03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_202 [Constant]
[03/25/2022-13:24:02] [V] [TRT] Constant_202 [Constant] inputs:
[03/25/2022-13:24:02] [V] [TRT] Constant_202 [Constant] outputs: [1377 -> ()[INT8]],
[03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_204 [Constant]
[03/25/2022-13:24:02] [V] [TRT] Constant_204 [Constant] inputs:
[03/25/2022-13:24:02] [V] [TRT] Constant_204 [Constant] outputs: [1379 -> ()[FLOAT]],
[03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_205 [Constant]
[03/25/2022-13:24:02] [V] [TRT] Constant_205 [Constant] inputs:
[03/25/2022-13:24:02] [V] [TRT] Constant_205 [Constant] outputs: [1380 -> ()[INT8]],
[03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_207 [Constant]
[03/25/2022-13:24:02] [V] [TRT] Constant_207 [Constant] inputs:
[03/25/2022-13:24:02] [V] [TRT] Constant_207 [Constant] outputs: [1382 -> ()[FLOAT]],
[03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_208 [Constant]
[03/25/2022-13:24:02] [V] [TRT] Constant_208 [Constant] inputs:
[03/25/2022-13:24:02] [V] [TRT] Constant_208 [Constant] outputs: [1383 -> ()[INT8]],
[03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_210 [Constant]
[03/25/2022-13:24:02] [V] [TRT] Constant_210 [Constant] inputs:
[03/25/2022-13:24:02] [V] [TRT] Constant_210 [Constant] outputs: [1385 -> ()[FLOAT]],
[03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_211 [Constant]
[03/25/2022-13:24:02] [V] [TRT] Constant_211 [Constant] inputs:
[03/25/2022-13:24:02] [V] [TRT] Constant_211 [Constant] outputs: [1386 -> ()[INT8]],
[03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_216 [Constant]
[03/25/2022-13:24:02] [V] [TRT] Constant_216 [Constant] inputs:
[03/25/2022-13:24:02] [V] [TRT] Constant_216 [Constant] outputs: [1391 -> ()[FLOAT]],
[03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_217 [Constant]
[03/25/2022-13:24:02] [V] [TRT] Constant_217 [Constant] inputs:
[03/25/2022-13:24:02] [V] [TRT] Constant_217 [Constant] outputs: [1392 -> ()[INT8]],
[03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_219 [Constant]
[03/25/2022-13:24:02] [V] [TRT] Constant_219 [Constant] inputs:
[03/25/2022-13:24:02] [V] [TRT] Constant_219 [Constant] outputs: [1394 -> ()[FLOAT]],
[03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_220 [Constant]
[03/25/2022-13:24:02] [V] [TRT] Constant_220 [Constant] inputs:
[03/25/2022-13:24:02] [V] [TRT] Constant_220 [Constant] outputs: [1395 -> ()[INT8]],
[03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_222 [Constant]
[03/25/2022-13:24:02] [V] [TRT] Constant_222 [Constant] inputs:
[03/25/2022-13:24:02] [V] [TRT] Constant_222 [Constant] outputs: [1397 -> ()[FLOAT]],
[03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_223 [Constant]
[03/25/2022-13:24:02] [V] [TRT] Constant_223 [Constant] inputs:
[03/25/2022-13:24:02] [V] [TRT] Constant_223 [Constant] outputs: [1398 -> ()[INT8]],
[03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_225 [Constant]
[03/25/2022-13:24:02] [V] [TRT] Constant_225 [Constant] inputs:
[03/25/2022-13:24:02] [V] [TRT] Constant_225 [Constant] outputs: [1400 -> ()[FLOAT]],
[03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_226 [Constant]
[03/25/2022-13:24:02] [V] [TRT] Constant_226 [Constant] inputs:
[03/25/2022-13:24:02] [V] [TRT] Constant_226 [Constant] outputs: [1401 -> ()[INT8]],
[03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_230 [Constant]
[03/25/2022-13:24:02] [V] [TRT] Constant_230 [Constant] inputs:
[03/25/2022-13:24:02] [V] [TRT] Constant_230 [Constant] outputs: [1405 -> ()[FLOAT]],
[03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_231 [Constant]
[03/25/2022-13:24:02] [V] [TRT] Constant_231 [Constant] inputs:
[03/25/2022-13:24:02] [V] [TRT] Constant_231 [Constant] outputs: [1406 -> ()[INT8]],
[03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_233 [Constant]
[03/25/2022-13:24:02] [V] [TRT] Constant_233 [Constant] inputs:
[03/25/2022-13:24:02] [V] [TRT] Constant_233 [Constant] outputs: [1408 -> ()[FLOAT]],
[03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_234 [Constant]
[03/25/2022-13:24:02] [V] [TRT] Constant_234 [Constant] inputs:
[03/25/2022-13:24:02] [V] [TRT] Constant_234 [Constant] outputs: [1409 -> ()[INT8]],
[03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_236 [Constant]
[03/25/2022-13:24:02] [V] [TRT] Constant_236 [Constant] inputs:
[03/25/2022-13:24:02] [V] [TRT] Constant_236 [Constant] outputs: [1411 -> ()[FLOAT]],
[03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_237 [Constant]
[03/25/2022-13:24:02] [V] [TRT] Constant_237 [Constant] inputs:
[03/25/2022-13:24:02] [V] [TRT] Constant_237 [Constant] outputs: [1412 -> ()[INT8]],
[03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_239 [Constant]
[03/25/2022-13:24:02] [V] [TRT] Constant_239 [Constant] inputs:
[03/25/2022-13:24:02] [V] [TRT] Constant_239 [Constant] outputs: [1414 -> ()[FLOAT]],
[03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_240 [Constant]
[03/25/2022-13:24:02] [V] [TRT] Constant_240 [Constant] inputs:
[03/25/2022-13:24:02] [V] [TRT] Constant_240 [Constant] outputs: [1415 -> ()[INT8]],
[03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_244 [Constant]
[03/25/2022-13:24:02] [V] [TRT] Constant_244 [Constant] inputs:
[03/25/2022-13:24:02] [V] [TRT] Constant_244 [Constant] outputs: [1419 -> ()[FLOAT]],
[03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_245 [Constant]
[03/25/2022-13:24:02] [V] [TRT] Constant_245 [Constant] inputs:
[03/25/2022-13:24:02] [V] [TRT] Constant_245 [Constant] outputs: [1420 -> ()[INT8]],
[03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_247 [Constant]
[03/25/2022-13:24:02] [V] [TRT] Constant_247 [Constant] inputs:
[03/25/2022-13:24:02] [V] [TRT] Constant_247 [Constant] outputs: [1422 -> ()[FLOAT]],
[03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_248 [Constant]
[03/25/2022-13:24:02] [V] [TRT] Constant_248 [Constant] inputs:
[03/25/2022-13:24:02] [V] [TRT] Constant_248 [Constant] outputs: [1423 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_252 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_252 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_252 [Constant] outputs: [1427 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_253 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_253 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_253 [Constant] outputs: [1428 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_255 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_255 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_255 [Constant] outputs: [1430 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_256 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_256 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_256 [Constant] outputs: [1431 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_258 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_258 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_258 [Constant] outputs: [1433 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_259 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_259 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_259 [Constant] outputs: [1434 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_261 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_261 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_261 [Constant] outputs: [1436 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_262 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_262 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_262 [Constant] outputs: [1437 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_267 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_267 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_267 [Constant] outputs: [1442 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_268 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_268 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_268 [Constant] outputs: [1443 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_270 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_270 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_270 [Constant] outputs: [1445 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_271 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_271 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_271 [Constant] outputs: [1446 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_273 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_273 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_273 [Constant] outputs: [1448 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_274 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_274 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_274 [Constant] outputs: [1449 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_276 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_276 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_276 [Constant] outputs: [1451 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_277 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_277 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] 
Constant_277 [Constant] outputs: [1452 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_282 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_282 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_282 [Constant] outputs: [1457 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_283 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_283 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_283 [Constant] outputs: [1458 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_285 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_285 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_285 [Constant] outputs: [1460 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_286 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_286 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_286 [Constant] outputs: [1461 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_288 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_288 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_288 [Constant] outputs: [1463 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_289 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_289 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_289 [Constant] outputs: [1464 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_291 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_291 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_291 [Constant] outputs: [1466 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_292 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_292 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_292 [Constant] outputs: [1467 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_296 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_296 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_296 [Constant] outputs: [1471 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_297 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_297 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_297 [Constant] outputs: [1472 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_299 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_299 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_299 [Constant] outputs: [1474 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_300 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_300 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_300 [Constant] outputs: [1475 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_304 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_304 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_304 [Constant] outputs: [1479 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_305 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_305 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_305 [Constant] outputs: [1480 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_307 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_307 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_307 [Constant] outputs: [1482 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_308 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_308 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_308 [Constant] outputs: [1483 -> 
()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_310 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_310 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_310 [Constant] outputs: [1485 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_311 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_311 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_311 [Constant] outputs: [1486 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_313 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_313 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_313 [Constant] outputs: [1488 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_314 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_314 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_314 [Constant] outputs: [1489 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_319 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_319 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_319 [Constant] outputs: [1494 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_320 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_320 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_320 [Constant] outputs: [1495 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_322 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_322 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_322 [Constant] outputs: [1497 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_323 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_323 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_323 [Constant] outputs: [1498 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_325 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_325 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_325 [Constant] outputs: [1500 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_326 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_326 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_326 [Constant] outputs: [1501 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_328 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_328 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_328 [Constant] outputs: [1503 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_329 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_329 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_329 [Constant] outputs: [1504 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_334 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_334 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_334 [Constant] outputs: [1509 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_335 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_335 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_335 [Constant] outputs: [1510 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_337 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_337 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_337 [Constant] outputs: [1512 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_338 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_338 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_338 [Constant] outputs: [1513 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] 
Parsing node: Constant_340 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_340 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_340 [Constant] outputs: [1515 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_341 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_341 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_341 [Constant] outputs: [1516 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_343 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_343 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_343 [Constant] outputs: [1518 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_344 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_344 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_344 [Constant] outputs: [1519 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_348 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_348 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_348 [Constant] outputs: [1523 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_349 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_349 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_349 [Constant] outputs: [1524 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_351 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_351 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_351 [Constant] outputs: [1526 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_352 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_352 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_352 [Constant] outputs: [1527 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_356 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_356 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_356 [Constant] outputs: [1531 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_357 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_357 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_357 [Constant] outputs: [1532 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_359 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_359 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_359 [Constant] outputs: [1534 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_360 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_360 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_360 [Constant] outputs: [1535 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_362 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_362 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_362 [Constant] outputs: [1537 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_363 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_363 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_363 [Constant] outputs: [1538 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_365 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_365 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_365 [Constant] outputs: [1540 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_366 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_366 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_366 [Constant] outputs: [1541 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_371 [Constant] 
[03/25/2022-13:24:02] [V] [TRT] Constant_371 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_371 [Constant] outputs: [1546 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_372 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_372 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_372 [Constant] outputs: [1547 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_374 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_374 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_374 [Constant] outputs: [1549 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_375 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_375 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_375 [Constant] outputs: [1550 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_377 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_377 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_377 [Constant] outputs: [1552 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_378 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_378 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_378 [Constant] outputs: [1553 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_380 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_380 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_380 [Constant] outputs: [1555 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_381 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_381 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_381 [Constant] outputs: [1556 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_386 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_386 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_386 [Constant] outputs: [1561 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_387 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_387 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_387 [Constant] outputs: [1562 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_389 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_389 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_389 [Constant] outputs: [1564 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_390 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_390 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_390 [Constant] outputs: [1565 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_392 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_392 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_392 [Constant] outputs: [1567 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_393 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_393 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_393 [Constant] outputs: [1568 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_395 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_395 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_395 [Constant] outputs: [1570 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_396 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_396 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_396 [Constant] outputs: [1571 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_400 [Constant] [03/25/2022-13:24:02] [V] [TRT] 
Constant_400 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_400 [Constant] outputs: [1575 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_401 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_401 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_401 [Constant] outputs: [1576 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_403 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_403 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_403 [Constant] outputs: [1578 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_404 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_404 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_404 [Constant] outputs: [1579 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_408 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_408 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_408 [Constant] outputs: [1583 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_409 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_409 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_409 [Constant] outputs: [1584 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_411 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_411 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_411 [Constant] outputs: [1586 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_412 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_412 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_412 [Constant] outputs: [1587 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_414 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_414 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_414 [Constant] outputs: [1589 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_415 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_415 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_415 [Constant] outputs: [1590 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_417 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_417 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_417 [Constant] outputs: [1592 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_418 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_418 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_418 [Constant] outputs: [1593 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_423 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_423 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_423 [Constant] outputs: [1598 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_424 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_424 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_424 [Constant] outputs: [1599 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_426 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_426 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_426 [Constant] outputs: [1601 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_427 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_427 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_427 [Constant] outputs: [1602 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_429 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_429 [Constant] inputs: 
[03/25/2022-13:24:02] [V] [TRT] Constant_429 [Constant] outputs: [1604 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_430 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_430 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_430 [Constant] outputs: [1605 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_432 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_432 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_432 [Constant] outputs: [1607 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_433 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_433 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_433 [Constant] outputs: [1608 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_438 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_438 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_438 [Constant] outputs: [1613 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_439 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_439 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_439 [Constant] outputs: [1614 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_441 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_441 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_441 [Constant] outputs: [1616 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_442 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_442 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_442 [Constant] outputs: [1617 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_444 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_444 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_444 [Constant] outputs: [1619 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_445 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_445 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_445 [Constant] outputs: [1620 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_447 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_447 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_447 [Constant] outputs: [1622 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_448 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_448 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_448 [Constant] outputs: [1623 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_452 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_452 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_452 [Constant] outputs: [1627 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_453 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_453 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_453 [Constant] outputs: [1628 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_455 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_455 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_455 [Constant] outputs: [1630 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_456 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_456 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_456 [Constant] outputs: [1631 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_458 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_458 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] 
Constant_458 [Constant] outputs: [1633 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_459 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_459 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_459 [Constant] outputs: [1634 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_461 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_461 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_461 [Constant] outputs: [1636 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_462 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_462 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_462 [Constant] outputs: [1637 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_466 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_466 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_466 [Constant] outputs: [1641 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_467 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_467 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_467 [Constant] outputs: [1642 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_469 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_469 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_469 [Constant] outputs: [1644 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_470 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_470 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_470 [Constant] outputs: [1645 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_474 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_474 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_474 [Constant] outputs: [1649 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_475 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_475 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_475 [Constant] outputs: [1650 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_477 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_477 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_477 [Constant] outputs: [1652 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_478 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_478 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_478 [Constant] outputs: [1653 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_480 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_480 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_480 [Constant] outputs: [1655 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_481 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_481 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_481 [Constant] outputs: [1656 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_483 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_483 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_483 [Constant] outputs: [1658 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_484 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_484 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_484 [Constant] outputs: [1659 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_489 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_489 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_489 [Constant] outputs: [1664 -> 
()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_490 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_490 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_490 [Constant] outputs: [1665 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_492 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_492 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_492 [Constant] outputs: [1667 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_493 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_493 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_493 [Constant] outputs: [1668 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_495 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_495 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_495 [Constant] outputs: [1670 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_496 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_496 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_496 [Constant] outputs: [1671 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_498 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_498 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_498 [Constant] outputs: [1673 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_499 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_499 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_499 [Constant] outputs: [1674 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_504 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_504 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_504 [Constant] outputs: [1679 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_505 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_505 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_505 [Constant] outputs: [1680 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_507 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_507 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_507 [Constant] outputs: [1682 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_508 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_508 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_508 [Constant] outputs: [1683 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_510 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_510 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_510 [Constant] outputs: [1685 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_511 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_511 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_511 [Constant] outputs: [1686 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_513 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_513 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_513 [Constant] outputs: [1688 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_514 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_514 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_514 [Constant] outputs: [1689 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_518 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_518 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_518 [Constant] outputs: [1693 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] 
[TRT] Parsing node: Constant_519 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_519 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_519 [Constant] outputs: [1694 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_521 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_521 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_521 [Constant] outputs: [1696 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_522 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_522 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_522 [Constant] outputs: [1697 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_526 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_526 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_526 [Constant] outputs: [1701 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_527 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_527 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_527 [Constant] outputs: [1702 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_529 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_529 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_529 [Constant] outputs: [1704 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_530 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_530 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_530 [Constant] outputs: [1705 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_532 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_532 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_532 [Constant] outputs: [1707 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_533 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_533 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_533 [Constant] outputs: [1708 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_535 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_535 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_535 [Constant] outputs: [1710 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_536 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_536 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_536 [Constant] outputs: [1711 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_541 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_541 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_541 [Constant] outputs: [1716 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_542 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_542 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_542 [Constant] outputs: [1717 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_544 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_544 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_544 [Constant] outputs: [1719 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_545 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_545 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_545 [Constant] outputs: [1720 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_547 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_547 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_547 [Constant] outputs: [1722 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_548 
[Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_548 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_548 [Constant] outputs: [1723 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_550 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_550 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_550 [Constant] outputs: [1725 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_551 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_551 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_551 [Constant] outputs: [1726 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_556 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_556 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_556 [Constant] outputs: [1731 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_557 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_557 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_557 [Constant] outputs: [1732 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_559 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_559 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_559 [Constant] outputs: [1734 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_560 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_560 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_560 [Constant] outputs: [1735 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_562 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_562 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_562 [Constant] outputs: [1737 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_563 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_563 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_563 [Constant] outputs: [1738 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_565 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_565 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_565 [Constant] outputs: [1740 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_566 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_566 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_566 [Constant] outputs: [1741 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_570 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_570 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_570 [Constant] outputs: [1745 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_571 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_571 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_571 [Constant] outputs: [1746 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_573 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_573 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_573 [Constant] outputs: [1748 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_574 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_574 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_574 [Constant] outputs: [1749 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_578 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_578 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_578 [Constant] outputs: [1753 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_579 [Constant] [03/25/2022-13:24:02] [V] [TRT] 
Constant_579 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_579 [Constant] outputs: [1754 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_581 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_581 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_581 [Constant] outputs: [1756 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_582 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_582 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_582 [Constant] outputs: [1757 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_584 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_584 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_584 [Constant] outputs: [1759 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_585 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_585 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_585 [Constant] outputs: [1760 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_587 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_587 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_587 [Constant] outputs: [1762 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_588 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_588 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_588 [Constant] outputs: [1763 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_593 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_593 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_593 [Constant] outputs: [1768 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_594 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_594 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_594 [Constant] outputs: [1769 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_596 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_596 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_596 [Constant] outputs: [1771 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_597 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_597 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_597 [Constant] outputs: [1772 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_599 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_599 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_599 [Constant] outputs: [1774 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_600 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_600 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_600 [Constant] outputs: [1775 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_602 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_602 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_602 [Constant] outputs: [1777 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_603 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_603 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_603 [Constant] outputs: [1778 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_608 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_608 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_608 [Constant] outputs: [1783 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_609 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_609 [Constant] inputs: 
[03/25/2022-13:24:02] [V] [TRT] Constant_609 [Constant] outputs: [1784 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_611 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_611 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_611 [Constant] outputs: [1786 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_612 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_612 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_612 [Constant] outputs: [1787 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_614 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_614 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_614 [Constant] outputs: [1789 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_615 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_615 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_615 [Constant] outputs: [1790 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_617 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_617 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_617 [Constant] outputs: [1792 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_618 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_618 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_618 [Constant] outputs: [1793 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_622 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_622 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_622 [Constant] outputs: [1797 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_623 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_623 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_623 [Constant] outputs: [1798 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_625 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_625 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_625 [Constant] outputs: [1800 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_626 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_626 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_626 [Constant] outputs: [1801 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_630 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_630 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_630 [Constant] outputs: [1805 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_631 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_631 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_631 [Constant] outputs: [1806 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_633 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_633 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_633 [Constant] outputs: [1808 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_634 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_634 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_634 [Constant] outputs: [1809 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_636 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_636 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_636 [Constant] outputs: [1811 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_637 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_637 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] 
Constant_637 [Constant] outputs: [1812 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_639 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_639 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_639 [Constant] outputs: [1814 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_640 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_640 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_640 [Constant] outputs: [1815 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_645 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_645 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_645 [Constant] outputs: [1820 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_646 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_646 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_646 [Constant] outputs: [1821 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_648 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_648 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_648 [Constant] outputs: [1823 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_649 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_649 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_649 [Constant] outputs: [1824 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_651 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_651 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_651 [Constant] outputs: [1826 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_652 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_652 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_652 [Constant] outputs: [1827 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_654 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_654 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_654 [Constant] outputs: [1829 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_655 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_655 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_655 [Constant] outputs: [1830 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_660 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_660 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_660 [Constant] outputs: [1835 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_661 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_661 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_661 [Constant] outputs: [1836 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_663 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_663 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_663 [Constant] outputs: [1838 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_664 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_664 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_664 [Constant] outputs: [1839 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_666 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_666 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_666 [Constant] outputs: [1841 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_667 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_667 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_667 [Constant] outputs: [1842 -> 
()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_669 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_669 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_669 [Constant] outputs: [1844 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_670 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_670 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_670 [Constant] outputs: [1845 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_674 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_674 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_674 [Constant] outputs: [1849 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_675 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_675 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_675 [Constant] outputs: [1850 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_677 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_677 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_677 [Constant] outputs: [1852 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_678 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_678 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_678 [Constant] outputs: [1853 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_682 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_682 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_682 [Constant] outputs: [1857 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_683 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_683 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_683 [Constant] outputs: [1858 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_685 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_685 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_685 [Constant] outputs: [1860 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_686 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_686 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_686 [Constant] outputs: [1861 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_688 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_688 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_688 [Constant] outputs: [1863 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_689 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_689 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_689 [Constant] outputs: [1864 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_691 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_691 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_691 [Constant] outputs: [1866 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_692 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_692 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_692 [Constant] outputs: [1867 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_697 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_697 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_697 [Constant] outputs: [1872 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_698 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_698 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_698 [Constant] outputs: [1873 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] 
Parsing node: Constant_700 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_700 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_700 [Constant] outputs: [1875 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_701 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_701 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_701 [Constant] outputs: [1876 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_703 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_703 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_703 [Constant] outputs: [1878 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_704 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_704 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_704 [Constant] outputs: [1879 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_706 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_706 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_706 [Constant] outputs: [1881 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_707 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_707 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_707 [Constant] outputs: [1882 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_712 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_712 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_712 [Constant] outputs: [1887 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_713 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_713 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_713 [Constant] outputs: [1888 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_715 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_715 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_715 [Constant] outputs: [1890 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_716 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_716 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_716 [Constant] outputs: [1891 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_718 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_718 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_718 [Constant] outputs: [1893 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_719 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_719 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_719 [Constant] outputs: [1894 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_721 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_721 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_721 [Constant] outputs: [1896 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_722 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_722 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_722 [Constant] outputs: [1897 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_726 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_726 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_726 [Constant] outputs: [1901 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_727 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_727 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_727 [Constant] outputs: [1902 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_729 [Constant] 
[03/25/2022-13:24:02] [V] [TRT] Constant_729 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_729 [Constant] outputs: [1904 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_730 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_730 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_730 [Constant] outputs: [1905 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_734 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_734 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_734 [Constant] outputs: [1909 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_735 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_735 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_735 [Constant] outputs: [1910 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_737 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_737 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_737 [Constant] outputs: [1912 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_738 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_738 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_738 [Constant] outputs: [1913 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_740 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_740 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_740 [Constant] outputs: [1915 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_741 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_741 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_741 [Constant] outputs: [1916 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_743 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_743 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_743 [Constant] outputs: [1918 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_744 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_744 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_744 [Constant] outputs: [1919 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_749 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_749 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_749 [Constant] outputs: [1924 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_750 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_750 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_750 [Constant] outputs: [1925 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_752 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_752 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_752 [Constant] outputs: [1927 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_753 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_753 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_753 [Constant] outputs: [1928 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_755 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_755 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_755 [Constant] outputs: [1930 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_756 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_756 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_756 [Constant] outputs: [1931 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_758 [Constant] [03/25/2022-13:24:02] [V] [TRT] 
Constant_758 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_758 [Constant] outputs: [1933 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_759 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_759 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_759 [Constant] outputs: [1934 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_764 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_764 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_764 [Constant] outputs: [1939 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_765 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_765 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_765 [Constant] outputs: [1940 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_767 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_767 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_767 [Constant] outputs: [1942 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_768 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_768 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_768 [Constant] outputs: [1943 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_770 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_770 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_770 [Constant] outputs: [1945 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_771 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_771 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_771 [Constant] outputs: [1946 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_773 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_773 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_773 [Constant] outputs: [1948 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_774 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_774 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_774 [Constant] outputs: [1949 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_778 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_778 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_778 [Constant] outputs: [1953 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_779 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_779 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_779 [Constant] outputs: [1954 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_781 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_781 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_781 [Constant] outputs: [1956 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_782 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_782 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_782 [Constant] outputs: [1957 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_784 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_784 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_784 [Constant] outputs: [1959 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_785 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_785 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_785 [Constant] outputs: [1960 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_787 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_787 [Constant] inputs: 
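
Note: the same FLOAT/INT8 Constant pairing repeats once per quantized tensor, which is why this stretch of the log is so long. The pattern can be confirmed outside the parser by walking the ONNX graph directly; below is a minimal sketch using the onnx package, where the model path is a placeholder rather than the actual file. For a Q/DQ graph like this one, the Constant count should be roughly twice the combined QuantizeLinear/DequantizeLinear count, one scale plus one zero point per node.

    import onnx
    from collections import Counter

    # Tally the op types that dominate this part of the log.
    model = onnx.load("model.onnx")  # placeholder path
    counts = Counter(node.op_type for node in model.graph.node)
    for op in ("Constant", "QuantizeLinear", "DequantizeLinear"):
        print(op, counts.get(op, 0))
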
[03/25/2022-13:24:02] [V] [TRT] Constant_787 [Constant] outputs: [1962 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_788 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_788 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_788 [Constant] outputs: [1963 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_792 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_792 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_792 [Constant] outputs: [1967 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_793 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_793 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_793 [Constant] outputs: [1968 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_795 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_795 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_795 [Constant] outputs: [1970 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_796 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_796 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_796 [Constant] outputs: [1971 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_800 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_800 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_800 [Constant] outputs: [1975 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_801 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_801 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_801 [Constant] outputs: [1976 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_803 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_803 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_803 [Constant] outputs: [1978 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_804 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_804 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_804 [Constant] outputs: [1979 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_806 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_806 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_806 [Constant] outputs: [1981 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_807 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_807 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_807 [Constant] outputs: [1982 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_809 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_809 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_809 [Constant] outputs: [1984 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_810 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_810 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_810 [Constant] outputs: [1985 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_815 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_815 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_815 [Constant] outputs: [1990 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_816 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_816 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_816 [Constant] outputs: [1991 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_818 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_818 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] 
Constant_818 [Constant] outputs: [1993 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_819 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_819 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_819 [Constant] outputs: [1994 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_821 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_821 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_821 [Constant] outputs: [1996 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_822 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_822 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_822 [Constant] outputs: [1997 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_824 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_824 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_824 [Constant] outputs: [1999 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_825 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_825 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_825 [Constant] outputs: [2000 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_830 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_830 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_830 [Constant] outputs: [2005 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_831 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_831 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_831 [Constant] outputs: [2006 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_833 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_833 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_833 [Constant] outputs: [2008 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_834 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_834 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_834 [Constant] outputs: [2009 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_836 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_836 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_836 [Constant] outputs: [2011 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_837 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_837 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_837 [Constant] outputs: [2012 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_839 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_839 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_839 [Constant] outputs: [2014 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_840 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_840 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_840 [Constant] outputs: [2015 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_844 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_844 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_844 [Constant] outputs: [2019 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_845 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_845 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_845 [Constant] outputs: [2020 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_847 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_847 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_847 [Constant] outputs: [2022 -> 
()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_848 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_848 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_848 [Constant] outputs: [2023 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_852 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_852 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_852 [Constant] outputs: [2027 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_853 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_853 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_853 [Constant] outputs: [2028 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_855 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_855 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_855 [Constant] outputs: [2030 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_856 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_856 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_856 [Constant] outputs: [2031 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_858 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_858 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_858 [Constant] outputs: [2033 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_859 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_859 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_859 [Constant] outputs: [2034 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_861 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_861 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_861 [Constant] outputs: [2036 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_862 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_862 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_862 [Constant] outputs: [2037 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_867 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_867 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_867 [Constant] outputs: [2042 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_868 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_868 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_868 [Constant] outputs: [2043 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_870 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_870 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_870 [Constant] outputs: [2045 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_871 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_871 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_871 [Constant] outputs: [2046 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_873 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_873 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_873 [Constant] outputs: [2048 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_874 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_874 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_874 [Constant] outputs: [2049 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_876 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_876 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_876 [Constant] outputs: [2051 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] 
[TRT] Parsing node: Constant_877 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_877 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_877 [Constant] outputs: [2052 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_882 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_882 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_882 [Constant] outputs: [2057 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_883 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_883 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_883 [Constant] outputs: [2058 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_885 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_885 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_885 [Constant] outputs: [2060 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_886 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_886 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_886 [Constant] outputs: [2061 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_888 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_888 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_888 [Constant] outputs: [2063 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_889 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_889 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_889 [Constant] outputs: [2064 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_891 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_891 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_891 [Constant] outputs: [2066 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_892 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_892 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_892 [Constant] outputs: [2067 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_896 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_896 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_896 [Constant] outputs: [2071 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_897 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_897 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_897 [Constant] outputs: [2072 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_899 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_899 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_899 [Constant] outputs: [2074 -> ()[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_900 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_900 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_900 [Constant] outputs: [2075 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: Constant_906 [Constant] [03/25/2022-13:24:02] [V] [TRT] Constant_906 [Constant] inputs: [03/25/2022-13:24:02] [V] [TRT] Constant_906 [Constant] outputs: [2081 -> ()[INT32]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: QuantizeLinear_2 [QuantizeLinear] [03/25/2022-13:24:02] [V] [TRT] Searching for input: input [03/25/2022-13:24:02] [V] [TRT] Searching for input: 1175 [03/25/2022-13:24:02] [V] [TRT] Searching for input: 1176 [03/25/2022-13:24:02] [V] [TRT] QuantizeLinear_2 [QuantizeLinear] inputs: [input -> (-1, 3, 224, 224)[FLOAT]], [1175 -> ()[FLOAT]], [1176 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Registering tensor: 1177 for ONNX tensor: 1177 [03/25/2022-13:24:02] 
[V] [TRT] QuantizeLinear_2 [QuantizeLinear] outputs: [1177 -> (-1, 3, 224, 224)[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: QuantizeLinear_8 [QuantizeLinear] [03/25/2022-13:24:02] [V] [TRT] Searching for input: input.conv.module.weight [03/25/2022-13:24:02] [V] [TRT] Searching for input: 1181 [03/25/2022-13:24:02] [V] [TRT] Searching for input: 1182 [03/25/2022-13:24:02] [V] [TRT] QuantizeLinear_8 [QuantizeLinear] inputs: [input.conv.module.weight -> (64, 3, 7, 7)[FLOAT]], [1181 -> ()[FLOAT]], [1182 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Registering layer: input.conv.module.weight for ONNX node: input.conv.module.weight [03/25/2022-13:24:02] [V] [TRT] Registering tensor: 1183 for ONNX tensor: 1183 [03/25/2022-13:24:02] [V] [TRT] QuantizeLinear_8 [QuantizeLinear] outputs: [1183 -> (64, 3, 7, 7)[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: QuantizeLinear_24 [QuantizeLinear] [03/25/2022-13:24:02] [V] [TRT] Searching for input: sections.0.0.conv1.module.weight [03/25/2022-13:24:02] [V] [TRT] Searching for input: 1197 [03/25/2022-13:24:02] [V] [TRT] Searching for input: 1198 [03/25/2022-13:24:02] [V] [TRT] QuantizeLinear_24 [QuantizeLinear] inputs: [sections.0.0.conv1.module.weight -> (64, 64, 1, 1)[FLOAT]], [1197 -> ()[FLOAT]], [1198 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Registering layer: sections.0.0.conv1.module.weight for ONNX node: sections.0.0.conv1.module.weight [03/25/2022-13:24:02] [V] [TRT] Registering tensor: 1199 for ONNX tensor: 1199 [03/25/2022-13:24:02] [V] [TRT] QuantizeLinear_24 [QuantizeLinear] outputs: [1199 -> (64, 64, 1, 1)[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: QuantizeLinear_39 [QuantizeLinear] [03/25/2022-13:24:02] [V] [TRT] Searching for input: sections.0.0.conv2.module.weight [03/25/2022-13:24:02] [V] [TRT] Searching for input: 1212 [03/25/2022-13:24:02] [V] [TRT] Searching for input: 1213 [03/25/2022-13:24:02] [V] [TRT] QuantizeLinear_39 [QuantizeLinear] inputs: [sections.0.0.conv2.module.weight -> (64, 64, 3, 3)[FLOAT]], [1212 -> ()[FLOAT]], [1213 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Registering layer: sections.0.0.conv2.module.weight for ONNX node: sections.0.0.conv2.module.weight [03/25/2022-13:24:02] [V] [TRT] Registering tensor: 1214 for ONNX tensor: 1214 [03/25/2022-13:24:02] [V] [TRT] QuantizeLinear_39 [QuantizeLinear] outputs: [1214 -> (64, 64, 3, 3)[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: QuantizeLinear_54 [QuantizeLinear] [03/25/2022-13:24:02] [V] [TRT] Searching for input: sections.0.0.conv3.module.weight [03/25/2022-13:24:02] [V] [TRT] Searching for input: 1227 [03/25/2022-13:24:02] [V] [TRT] Searching for input: 1228 [03/25/2022-13:24:02] [V] [TRT] QuantizeLinear_54 [QuantizeLinear] inputs: [sections.0.0.conv3.module.weight -> (256, 64, 1, 1)[FLOAT]], [1227 -> ()[FLOAT]], [1228 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Registering layer: sections.0.0.conv3.module.weight for ONNX node: sections.0.0.conv3.module.weight [03/25/2022-13:24:02] [V] [TRT] Registering tensor: 1229 for ONNX tensor: 1229 [03/25/2022-13:24:02] [V] [TRT] QuantizeLinear_54 [QuantizeLinear] outputs: [1229 -> (256, 64, 1, 1)[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: QuantizeLinear_68 [QuantizeLinear] [03/25/2022-13:24:02] [V] [TRT] Searching for input: sections.0.0.identity.conv.module.weight [03/25/2022-13:24:02] [V] [TRT] Searching for input: 1241 [03/25/2022-13:24:02] [V] [TRT] Searching for input: 1242 [03/25/2022-13:24:02] [V] [TRT] QuantizeLinear_68 [QuantizeLinear] inputs: 
[sections.0.0.identity.conv.module.weight -> (256, 64, 1, 1)[FLOAT]], [1241 -> ()[FLOAT]], [1242 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Registering layer: sections.0.0.identity.conv.module.weight for ONNX node: sections.0.0.identity.conv.module.weight [03/25/2022-13:24:02] [V] [TRT] Registering tensor: 1243 for ONNX tensor: 1243 [03/25/2022-13:24:02] [V] [TRT] QuantizeLinear_68 [QuantizeLinear] outputs: [1243 -> (256, 64, 1, 1)[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: QuantizeLinear_90 [QuantizeLinear] [03/25/2022-13:24:02] [V] [TRT] Searching for input: sections.0.1.conv1.module.weight [03/25/2022-13:24:02] [V] [TRT] Searching for input: 1263 [03/25/2022-13:24:02] [V] [TRT] Searching for input: 1264 [03/25/2022-13:24:02] [V] [TRT] QuantizeLinear_90 [QuantizeLinear] inputs: [sections.0.1.conv1.module.weight -> (64, 256, 1, 1)[FLOAT]], [1263 -> ()[FLOAT]], [1264 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Registering layer: sections.0.1.conv1.module.weight for ONNX node: sections.0.1.conv1.module.weight [03/25/2022-13:24:02] [V] [TRT] Registering tensor: 1265 for ONNX tensor: 1265 [03/25/2022-13:24:02] [V] [TRT] QuantizeLinear_90 [QuantizeLinear] outputs: [1265 -> (64, 256, 1, 1)[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: QuantizeLinear_105 [QuantizeLinear] [03/25/2022-13:24:02] [V] [TRT] Searching for input: sections.0.1.conv2.module.weight [03/25/2022-13:24:02] [V] [TRT] Searching for input: 1278 [03/25/2022-13:24:02] [V] [TRT] Searching for input: 1279 [03/25/2022-13:24:02] [V] [TRT] QuantizeLinear_105 [QuantizeLinear] inputs: [sections.0.1.conv2.module.weight -> (64, 64, 3, 3)[FLOAT]], [1278 -> ()[FLOAT]], [1279 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Registering layer: sections.0.1.conv2.module.weight for ONNX node: sections.0.1.conv2.module.weight [03/25/2022-13:24:02] [V] [TRT] Registering tensor: 1280 for ONNX tensor: 1280 [03/25/2022-13:24:02] [V] [TRT] QuantizeLinear_105 [QuantizeLinear] outputs: [1280 -> (64, 64, 3, 3)[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: QuantizeLinear_120 [QuantizeLinear] [03/25/2022-13:24:02] [V] [TRT] Searching for input: sections.0.1.conv3.module.weight [03/25/2022-13:24:02] [V] [TRT] Searching for input: 1293 [03/25/2022-13:24:02] [V] [TRT] Searching for input: 1294 [03/25/2022-13:24:02] [V] [TRT] QuantizeLinear_120 [QuantizeLinear] inputs: [sections.0.1.conv3.module.weight -> (256, 64, 1, 1)[FLOAT]], [1293 -> ()[FLOAT]], [1294 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Registering layer: sections.0.1.conv3.module.weight for ONNX node: sections.0.1.conv3.module.weight [03/25/2022-13:24:02] [V] [TRT] Registering tensor: 1295 for ONNX tensor: 1295 [03/25/2022-13:24:02] [V] [TRT] QuantizeLinear_120 [QuantizeLinear] outputs: [1295 -> (256, 64, 1, 1)[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: QuantizeLinear_142 [QuantizeLinear] [03/25/2022-13:24:02] [V] [TRT] Searching for input: sections.0.2.conv1.module.weight [03/25/2022-13:24:02] [V] [TRT] Searching for input: 1315 [03/25/2022-13:24:02] [V] [TRT] Searching for input: 1316 [03/25/2022-13:24:02] [V] [TRT] QuantizeLinear_142 [QuantizeLinear] inputs: [sections.0.2.conv1.module.weight -> (64, 256, 1, 1)[FLOAT]], [1315 -> ()[FLOAT]], [1316 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Registering layer: sections.0.2.conv1.module.weight for ONNX node: sections.0.2.conv1.module.weight [03/25/2022-13:24:02] [V] [TRT] Registering tensor: 1317 for ONNX tensor: 1317 [03/25/2022-13:24:02] [V] [TRT] QuantizeLinear_142 [QuantizeLinear] outputs: [1317 
-> (64, 256, 1, 1)[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: QuantizeLinear_157 [QuantizeLinear] [03/25/2022-13:24:02] [V] [TRT] Searching for input: sections.0.2.conv2.module.weight [03/25/2022-13:24:02] [V] [TRT] Searching for input: 1330 [03/25/2022-13:24:02] [V] [TRT] Searching for input: 1331 [03/25/2022-13:24:02] [V] [TRT] QuantizeLinear_157 [QuantizeLinear] inputs: [sections.0.2.conv2.module.weight -> (64, 64, 3, 3)[FLOAT]], [1330 -> ()[FLOAT]], [1331 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Registering layer: sections.0.2.conv2.module.weight for ONNX node: sections.0.2.conv2.module.weight [03/25/2022-13:24:02] [V] [TRT] Registering tensor: 1332 for ONNX tensor: 1332 [03/25/2022-13:24:02] [V] [TRT] QuantizeLinear_157 [QuantizeLinear] outputs: [1332 -> (64, 64, 3, 3)[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: QuantizeLinear_172 [QuantizeLinear] [03/25/2022-13:24:02] [V] [TRT] Searching for input: sections.0.2.conv3.module.weight [03/25/2022-13:24:02] [V] [TRT] Searching for input: 1345 [03/25/2022-13:24:02] [V] [TRT] Searching for input: 1346 [03/25/2022-13:24:02] [V] [TRT] QuantizeLinear_172 [QuantizeLinear] inputs: [sections.0.2.conv3.module.weight -> (256, 64, 1, 1)[FLOAT]], [1345 -> ()[FLOAT]], [1346 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Registering layer: sections.0.2.conv3.module.weight for ONNX node: sections.0.2.conv3.module.weight [03/25/2022-13:24:02] [V] [TRT] Registering tensor: 1347 for ONNX tensor: 1347 [03/25/2022-13:24:02] [V] [TRT] QuantizeLinear_172 [QuantizeLinear] outputs: [1347 -> (256, 64, 1, 1)[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: QuantizeLinear_194 [QuantizeLinear] [03/25/2022-13:24:02] [V] [TRT] Searching for input: sections.1.0.conv1.module.weight [03/25/2022-13:24:02] [V] [TRT] Searching for input: 1367 [03/25/2022-13:24:02] [V] [TRT] Searching for input: 1368 [03/25/2022-13:24:02] [V] [TRT] QuantizeLinear_194 [QuantizeLinear] inputs: [sections.1.0.conv1.module.weight -> (128, 256, 1, 1)[FLOAT]], [1367 -> ()[FLOAT]], [1368 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Registering layer: sections.1.0.conv1.module.weight for ONNX node: sections.1.0.conv1.module.weight [03/25/2022-13:24:02] [V] [TRT] Registering tensor: 1369 for ONNX tensor: 1369 [03/25/2022-13:24:02] [V] [TRT] QuantizeLinear_194 [QuantizeLinear] outputs: [1369 -> (128, 256, 1, 1)[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: QuantizeLinear_209 [QuantizeLinear] [03/25/2022-13:24:02] [V] [TRT] Searching for input: sections.1.0.conv2.module.weight [03/25/2022-13:24:02] [V] [TRT] Searching for input: 1382 [03/25/2022-13:24:02] [V] [TRT] Searching for input: 1383 [03/25/2022-13:24:02] [V] [TRT] QuantizeLinear_209 [QuantizeLinear] inputs: [sections.1.0.conv2.module.weight -> (128, 128, 3, 3)[FLOAT]], [1382 -> ()[FLOAT]], [1383 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Registering layer: sections.1.0.conv2.module.weight for ONNX node: sections.1.0.conv2.module.weight [03/25/2022-13:24:02] [V] [TRT] Registering tensor: 1384 for ONNX tensor: 1384 [03/25/2022-13:24:02] [V] [TRT] QuantizeLinear_209 [QuantizeLinear] outputs: [1384 -> (128, 128, 3, 3)[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: QuantizeLinear_224 [QuantizeLinear] [03/25/2022-13:24:02] [V] [TRT] Searching for input: sections.1.0.conv3.module.weight [03/25/2022-13:24:02] [V] [TRT] Searching for input: 1397 [03/25/2022-13:24:02] [V] [TRT] Searching for input: 1398 [03/25/2022-13:24:02] [V] [TRT] QuantizeLinear_224 [QuantizeLinear] inputs: 
[sections.1.0.conv3.module.weight -> (512, 128, 1, 1)[FLOAT]], [1397 -> ()[FLOAT]], [1398 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Registering layer: sections.1.0.conv3.module.weight for ONNX node: sections.1.0.conv3.module.weight [03/25/2022-13:24:02] [V] [TRT] Registering tensor: 1399 for ONNX tensor: 1399 [03/25/2022-13:24:02] [V] [TRT] QuantizeLinear_224 [QuantizeLinear] outputs: [1399 -> (512, 128, 1, 1)[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: QuantizeLinear_238 [QuantizeLinear] [03/25/2022-13:24:02] [V] [TRT] Searching for input: sections.1.0.identity.conv.module.weight [03/25/2022-13:24:02] [V] [TRT] Searching for input: 1411 [03/25/2022-13:24:02] [V] [TRT] Searching for input: 1412 [03/25/2022-13:24:02] [V] [TRT] QuantizeLinear_238 [QuantizeLinear] inputs: [sections.1.0.identity.conv.module.weight -> (512, 256, 1, 1)[FLOAT]], [1411 -> ()[FLOAT]], [1412 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Registering layer: sections.1.0.identity.conv.module.weight for ONNX node: sections.1.0.identity.conv.module.weight [03/25/2022-13:24:02] [V] [TRT] Registering tensor: 1413 for ONNX tensor: 1413 [03/25/2022-13:24:02] [V] [TRT] QuantizeLinear_238 [QuantizeLinear] outputs: [1413 -> (512, 256, 1, 1)[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: QuantizeLinear_260 [QuantizeLinear] [03/25/2022-13:24:02] [V] [TRT] Searching for input: sections.1.1.conv1.module.weight [03/25/2022-13:24:02] [V] [TRT] Searching for input: 1433 [03/25/2022-13:24:02] [V] [TRT] Searching for input: 1434 [03/25/2022-13:24:02] [V] [TRT] QuantizeLinear_260 [QuantizeLinear] inputs: [sections.1.1.conv1.module.weight -> (128, 512, 1, 1)[FLOAT]], [1433 -> ()[FLOAT]], [1434 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Registering layer: sections.1.1.conv1.module.weight for ONNX node: sections.1.1.conv1.module.weight [03/25/2022-13:24:02] [V] [TRT] Registering tensor: 1435 for ONNX tensor: 1435 [03/25/2022-13:24:02] [V] [TRT] QuantizeLinear_260 [QuantizeLinear] outputs: [1435 -> (128, 512, 1, 1)[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: QuantizeLinear_275 [QuantizeLinear] [03/25/2022-13:24:02] [V] [TRT] Searching for input: sections.1.1.conv2.module.weight [03/25/2022-13:24:02] [V] [TRT] Searching for input: 1448 [03/25/2022-13:24:02] [V] [TRT] Searching for input: 1449 [03/25/2022-13:24:02] [V] [TRT] QuantizeLinear_275 [QuantizeLinear] inputs: [sections.1.1.conv2.module.weight -> (128, 128, 3, 3)[FLOAT]], [1448 -> ()[FLOAT]], [1449 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Registering layer: sections.1.1.conv2.module.weight for ONNX node: sections.1.1.conv2.module.weight [03/25/2022-13:24:02] [V] [TRT] Registering tensor: 1450 for ONNX tensor: 1450 [03/25/2022-13:24:02] [V] [TRT] QuantizeLinear_275 [QuantizeLinear] outputs: [1450 -> (128, 128, 3, 3)[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: QuantizeLinear_290 [QuantizeLinear] [03/25/2022-13:24:02] [V] [TRT] Searching for input: sections.1.1.conv3.module.weight [03/25/2022-13:24:02] [V] [TRT] Searching for input: 1463 [03/25/2022-13:24:02] [V] [TRT] Searching for input: 1464 [03/25/2022-13:24:02] [V] [TRT] QuantizeLinear_290 [QuantizeLinear] inputs: [sections.1.1.conv3.module.weight -> (512, 128, 1, 1)[FLOAT]], [1463 -> ()[FLOAT]], [1464 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Registering layer: sections.1.1.conv3.module.weight for ONNX node: sections.1.1.conv3.module.weight [03/25/2022-13:24:02] [V] [TRT] Registering tensor: 1465 for ONNX tensor: 1465 [03/25/2022-13:24:02] [V] [TRT] QuantizeLinear_290 
[QuantizeLinear] outputs: [1465 -> (512, 128, 1, 1)[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: QuantizeLinear_312 [QuantizeLinear] [03/25/2022-13:24:02] [V] [TRT] Searching for input: sections.1.2.conv1.module.weight [03/25/2022-13:24:02] [V] [TRT] Searching for input: 1485 [03/25/2022-13:24:02] [V] [TRT] Searching for input: 1486 [03/25/2022-13:24:02] [V] [TRT] QuantizeLinear_312 [QuantizeLinear] inputs: [sections.1.2.conv1.module.weight -> (128, 512, 1, 1)[FLOAT]], [1485 -> ()[FLOAT]], [1486 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Registering layer: sections.1.2.conv1.module.weight for ONNX node: sections.1.2.conv1.module.weight [03/25/2022-13:24:02] [V] [TRT] Registering tensor: 1487 for ONNX tensor: 1487 [03/25/2022-13:24:02] [V] [TRT] QuantizeLinear_312 [QuantizeLinear] outputs: [1487 -> (128, 512, 1, 1)[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: QuantizeLinear_327 [QuantizeLinear] [03/25/2022-13:24:02] [V] [TRT] Searching for input: sections.1.2.conv2.module.weight [03/25/2022-13:24:02] [V] [TRT] Searching for input: 1500 [03/25/2022-13:24:02] [V] [TRT] Searching for input: 1501 [03/25/2022-13:24:02] [V] [TRT] QuantizeLinear_327 [QuantizeLinear] inputs: [sections.1.2.conv2.module.weight -> (128, 128, 3, 3)[FLOAT]], [1500 -> ()[FLOAT]], [1501 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Registering layer: sections.1.2.conv2.module.weight for ONNX node: sections.1.2.conv2.module.weight [03/25/2022-13:24:02] [V] [TRT] Registering tensor: 1502 for ONNX tensor: 1502 [03/25/2022-13:24:02] [V] [TRT] QuantizeLinear_327 [QuantizeLinear] outputs: [1502 -> (128, 128, 3, 3)[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: QuantizeLinear_342 [QuantizeLinear] [03/25/2022-13:24:02] [V] [TRT] Searching for input: sections.1.2.conv3.module.weight [03/25/2022-13:24:02] [V] [TRT] Searching for input: 1515 [03/25/2022-13:24:02] [V] [TRT] Searching for input: 1516 [03/25/2022-13:24:02] [V] [TRT] QuantizeLinear_342 [QuantizeLinear] inputs: [sections.1.2.conv3.module.weight -> (512, 128, 1, 1)[FLOAT]], [1515 -> ()[FLOAT]], [1516 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Registering layer: sections.1.2.conv3.module.weight for ONNX node: sections.1.2.conv3.module.weight [03/25/2022-13:24:02] [V] [TRT] Registering tensor: 1517 for ONNX tensor: 1517 [03/25/2022-13:24:02] [V] [TRT] QuantizeLinear_342 [QuantizeLinear] outputs: [1517 -> (512, 128, 1, 1)[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: QuantizeLinear_364 [QuantizeLinear] [03/25/2022-13:24:02] [V] [TRT] Searching for input: sections.1.3.conv1.module.weight [03/25/2022-13:24:02] [V] [TRT] Searching for input: 1537 [03/25/2022-13:24:02] [V] [TRT] Searching for input: 1538 [03/25/2022-13:24:02] [V] [TRT] QuantizeLinear_364 [QuantizeLinear] inputs: [sections.1.3.conv1.module.weight -> (128, 512, 1, 1)[FLOAT]], [1537 -> ()[FLOAT]], [1538 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Registering layer: sections.1.3.conv1.module.weight for ONNX node: sections.1.3.conv1.module.weight [03/25/2022-13:24:02] [V] [TRT] Registering tensor: 1539 for ONNX tensor: 1539 [03/25/2022-13:24:02] [V] [TRT] QuantizeLinear_364 [QuantizeLinear] outputs: [1539 -> (128, 512, 1, 1)[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: QuantizeLinear_379 [QuantizeLinear] [03/25/2022-13:24:02] [V] [TRT] Searching for input: sections.1.3.conv2.module.weight [03/25/2022-13:24:02] [V] [TRT] Searching for input: 1552 [03/25/2022-13:24:02] [V] [TRT] Searching for input: 1553 [03/25/2022-13:24:02] [V] [TRT] QuantizeLinear_379 
[QuantizeLinear] inputs: [sections.1.3.conv2.module.weight -> (128, 128, 3, 3)[FLOAT]], [1552 -> ()[FLOAT]], [1553 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Registering layer: sections.1.3.conv2.module.weight for ONNX node: sections.1.3.conv2.module.weight [03/25/2022-13:24:02] [V] [TRT] Registering tensor: 1554 for ONNX tensor: 1554 [03/25/2022-13:24:02] [V] [TRT] QuantizeLinear_379 [QuantizeLinear] outputs: [1554 -> (128, 128, 3, 3)[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: QuantizeLinear_394 [QuantizeLinear] [03/25/2022-13:24:02] [V] [TRT] Searching for input: sections.1.3.conv3.module.weight [03/25/2022-13:24:02] [V] [TRT] Searching for input: 1567 [03/25/2022-13:24:02] [V] [TRT] Searching for input: 1568 [03/25/2022-13:24:02] [V] [TRT] QuantizeLinear_394 [QuantizeLinear] inputs: [sections.1.3.conv3.module.weight -> (512, 128, 1, 1)[FLOAT]], [1567 -> ()[FLOAT]], [1568 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Registering layer: sections.1.3.conv3.module.weight for ONNX node: sections.1.3.conv3.module.weight [03/25/2022-13:24:02] [V] [TRT] Registering tensor: 1569 for ONNX tensor: 1569 [03/25/2022-13:24:02] [V] [TRT] QuantizeLinear_394 [QuantizeLinear] outputs: [1569 -> (512, 128, 1, 1)[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: QuantizeLinear_416 [QuantizeLinear] [03/25/2022-13:24:02] [V] [TRT] Searching for input: sections.2.0.conv1.module.weight [03/25/2022-13:24:02] [V] [TRT] Searching for input: 1589 [03/25/2022-13:24:02] [V] [TRT] Searching for input: 1590 [03/25/2022-13:24:02] [V] [TRT] QuantizeLinear_416 [QuantizeLinear] inputs: [sections.2.0.conv1.module.weight -> (256, 512, 1, 1)[FLOAT]], [1589 -> ()[FLOAT]], [1590 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Registering layer: sections.2.0.conv1.module.weight for ONNX node: sections.2.0.conv1.module.weight [03/25/2022-13:24:02] [V] [TRT] Registering tensor: 1591 for ONNX tensor: 1591 [03/25/2022-13:24:02] [V] [TRT] QuantizeLinear_416 [QuantizeLinear] outputs: [1591 -> (256, 512, 1, 1)[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: QuantizeLinear_431 [QuantizeLinear] [03/25/2022-13:24:02] [V] [TRT] Searching for input: sections.2.0.conv2.module.weight [03/25/2022-13:24:02] [V] [TRT] Searching for input: 1604 [03/25/2022-13:24:02] [V] [TRT] Searching for input: 1605 [03/25/2022-13:24:02] [V] [TRT] QuantizeLinear_431 [QuantizeLinear] inputs: [sections.2.0.conv2.module.weight -> (256, 256, 3, 3)[FLOAT]], [1604 -> ()[FLOAT]], [1605 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Registering layer: sections.2.0.conv2.module.weight for ONNX node: sections.2.0.conv2.module.weight [03/25/2022-13:24:02] [V] [TRT] Registering tensor: 1606 for ONNX tensor: 1606 [03/25/2022-13:24:02] [V] [TRT] QuantizeLinear_431 [QuantizeLinear] outputs: [1606 -> (256, 256, 3, 3)[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: QuantizeLinear_446 [QuantizeLinear] [03/25/2022-13:24:02] [V] [TRT] Searching for input: sections.2.0.conv3.module.weight [03/25/2022-13:24:02] [V] [TRT] Searching for input: 1619 [03/25/2022-13:24:02] [V] [TRT] Searching for input: 1620 [03/25/2022-13:24:02] [V] [TRT] QuantizeLinear_446 [QuantizeLinear] inputs: [sections.2.0.conv3.module.weight -> (1024, 256, 1, 1)[FLOAT]], [1619 -> ()[FLOAT]], [1620 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Registering layer: sections.2.0.conv3.module.weight for ONNX node: sections.2.0.conv3.module.weight [03/25/2022-13:24:02] [V] [TRT] Registering tensor: 1621 for ONNX tensor: 1621 [03/25/2022-13:24:02] [V] [TRT] QuantizeLinear_446 
[QuantizeLinear] outputs: [1621 -> (1024, 256, 1, 1)[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: QuantizeLinear_460 [QuantizeLinear] [03/25/2022-13:24:02] [V] [TRT] Searching for input: sections.2.0.identity.conv.module.weight [03/25/2022-13:24:02] [V] [TRT] Searching for input: 1633 [03/25/2022-13:24:02] [V] [TRT] Searching for input: 1634 [03/25/2022-13:24:02] [V] [TRT] QuantizeLinear_460 [QuantizeLinear] inputs: [sections.2.0.identity.conv.module.weight -> (1024, 512, 1, 1)[FLOAT]], [1633 -> ()[FLOAT]], [1634 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Registering layer: sections.2.0.identity.conv.module.weight for ONNX node: sections.2.0.identity.conv.module.weight [03/25/2022-13:24:02] [V] [TRT] Registering tensor: 1635 for ONNX tensor: 1635 [03/25/2022-13:24:02] [V] [TRT] QuantizeLinear_460 [QuantizeLinear] outputs: [1635 -> (1024, 512, 1, 1)[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: QuantizeLinear_482 [QuantizeLinear] [03/25/2022-13:24:02] [V] [TRT] Searching for input: sections.2.1.conv1.module.weight [03/25/2022-13:24:02] [V] [TRT] Searching for input: 1655 [03/25/2022-13:24:02] [V] [TRT] Searching for input: 1656 [03/25/2022-13:24:02] [V] [TRT] QuantizeLinear_482 [QuantizeLinear] inputs: [sections.2.1.conv1.module.weight -> (256, 1024, 1, 1)[FLOAT]], [1655 -> ()[FLOAT]], [1656 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Registering layer: sections.2.1.conv1.module.weight for ONNX node: sections.2.1.conv1.module.weight [03/25/2022-13:24:02] [V] [TRT] Registering tensor: 1657 for ONNX tensor: 1657 [03/25/2022-13:24:02] [V] [TRT] QuantizeLinear_482 [QuantizeLinear] outputs: [1657 -> (256, 1024, 1, 1)[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: QuantizeLinear_497 [QuantizeLinear] [03/25/2022-13:24:02] [V] [TRT] Searching for input: sections.2.1.conv2.module.weight [03/25/2022-13:24:02] [V] [TRT] Searching for input: 1670 [03/25/2022-13:24:02] [V] [TRT] Searching for input: 1671 [03/25/2022-13:24:02] [V] [TRT] QuantizeLinear_497 [QuantizeLinear] inputs: [sections.2.1.conv2.module.weight -> (256, 256, 3, 3)[FLOAT]], [1670 -> ()[FLOAT]], [1671 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Registering layer: sections.2.1.conv2.module.weight for ONNX node: sections.2.1.conv2.module.weight [03/25/2022-13:24:02] [V] [TRT] Registering tensor: 1672 for ONNX tensor: 1672 [03/25/2022-13:24:02] [V] [TRT] QuantizeLinear_497 [QuantizeLinear] outputs: [1672 -> (256, 256, 3, 3)[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: QuantizeLinear_512 [QuantizeLinear] [03/25/2022-13:24:02] [V] [TRT] Searching for input: sections.2.1.conv3.module.weight [03/25/2022-13:24:02] [V] [TRT] Searching for input: 1685 [03/25/2022-13:24:02] [V] [TRT] Searching for input: 1686 [03/25/2022-13:24:02] [V] [TRT] QuantizeLinear_512 [QuantizeLinear] inputs: [sections.2.1.conv3.module.weight -> (1024, 256, 1, 1)[FLOAT]], [1685 -> ()[FLOAT]], [1686 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Registering layer: sections.2.1.conv3.module.weight for ONNX node: sections.2.1.conv3.module.weight [03/25/2022-13:24:02] [V] [TRT] Registering tensor: 1687 for ONNX tensor: 1687 [03/25/2022-13:24:02] [V] [TRT] QuantizeLinear_512 [QuantizeLinear] outputs: [1687 -> (1024, 256, 1, 1)[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: QuantizeLinear_534 [QuantizeLinear] [03/25/2022-13:24:02] [V] [TRT] Searching for input: sections.2.2.conv1.module.weight [03/25/2022-13:24:02] [V] [TRT] Searching for input: 1707 [03/25/2022-13:24:02] [V] [TRT] Searching for input: 1708 [03/25/2022-13:24:02] 
[V] [TRT] QuantizeLinear_534 [QuantizeLinear] inputs: [sections.2.2.conv1.module.weight -> (256, 1024, 1, 1)[FLOAT]], [1707 -> ()[FLOAT]], [1708 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Registering layer: sections.2.2.conv1.module.weight for ONNX node: sections.2.2.conv1.module.weight [03/25/2022-13:24:02] [V] [TRT] Registering tensor: 1709 for ONNX tensor: 1709 [03/25/2022-13:24:02] [V] [TRT] QuantizeLinear_534 [QuantizeLinear] outputs: [1709 -> (256, 1024, 1, 1)[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: QuantizeLinear_549 [QuantizeLinear] [03/25/2022-13:24:02] [V] [TRT] Searching for input: sections.2.2.conv2.module.weight [03/25/2022-13:24:02] [V] [TRT] Searching for input: 1722 [03/25/2022-13:24:02] [V] [TRT] Searching for input: 1723 [03/25/2022-13:24:02] [V] [TRT] QuantizeLinear_549 [QuantizeLinear] inputs: [sections.2.2.conv2.module.weight -> (256, 256, 3, 3)[FLOAT]], [1722 -> ()[FLOAT]], [1723 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Registering layer: sections.2.2.conv2.module.weight for ONNX node: sections.2.2.conv2.module.weight [03/25/2022-13:24:02] [V] [TRT] Registering tensor: 1724 for ONNX tensor: 1724 [03/25/2022-13:24:02] [V] [TRT] QuantizeLinear_549 [QuantizeLinear] outputs: [1724 -> (256, 256, 3, 3)[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: QuantizeLinear_564 [QuantizeLinear] [03/25/2022-13:24:02] [V] [TRT] Searching for input: sections.2.2.conv3.module.weight [03/25/2022-13:24:02] [V] [TRT] Searching for input: 1737 [03/25/2022-13:24:02] [V] [TRT] Searching for input: 1738 [03/25/2022-13:24:02] [V] [TRT] QuantizeLinear_564 [QuantizeLinear] inputs: [sections.2.2.conv3.module.weight -> (1024, 256, 1, 1)[FLOAT]], [1737 -> ()[FLOAT]], [1738 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Registering layer: sections.2.2.conv3.module.weight for ONNX node: sections.2.2.conv3.module.weight [03/25/2022-13:24:02] [V] [TRT] Registering tensor: 1739 for ONNX tensor: 1739 [03/25/2022-13:24:02] [V] [TRT] QuantizeLinear_564 [QuantizeLinear] outputs: [1739 -> (1024, 256, 1, 1)[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: QuantizeLinear_586 [QuantizeLinear] [03/25/2022-13:24:02] [V] [TRT] Searching for input: sections.2.3.conv1.module.weight [03/25/2022-13:24:02] [V] [TRT] Searching for input: 1759 [03/25/2022-13:24:02] [V] [TRT] Searching for input: 1760 [03/25/2022-13:24:02] [V] [TRT] QuantizeLinear_586 [QuantizeLinear] inputs: [sections.2.3.conv1.module.weight -> (256, 1024, 1, 1)[FLOAT]], [1759 -> ()[FLOAT]], [1760 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Registering layer: sections.2.3.conv1.module.weight for ONNX node: sections.2.3.conv1.module.weight [03/25/2022-13:24:02] [V] [TRT] Registering tensor: 1761 for ONNX tensor: 1761 [03/25/2022-13:24:02] [V] [TRT] QuantizeLinear_586 [QuantizeLinear] outputs: [1761 -> (256, 1024, 1, 1)[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: QuantizeLinear_601 [QuantizeLinear] [03/25/2022-13:24:02] [V] [TRT] Searching for input: sections.2.3.conv2.module.weight [03/25/2022-13:24:02] [V] [TRT] Searching for input: 1774 [03/25/2022-13:24:02] [V] [TRT] Searching for input: 1775 [03/25/2022-13:24:02] [V] [TRT] QuantizeLinear_601 [QuantizeLinear] inputs: [sections.2.3.conv2.module.weight -> (256, 256, 3, 3)[FLOAT]], [1774 -> ()[FLOAT]], [1775 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Registering layer: sections.2.3.conv2.module.weight for ONNX node: sections.2.3.conv2.module.weight [03/25/2022-13:24:02] [V] [TRT] Registering tensor: 1776 for ONNX tensor: 1776 [03/25/2022-13:24:02] [V] [TRT] 
QuantizeLinear_601 [QuantizeLinear] outputs: [1776 -> (256, 256, 3, 3)[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: QuantizeLinear_616 [QuantizeLinear] [03/25/2022-13:24:02] [V] [TRT] Searching for input: sections.2.3.conv3.module.weight [03/25/2022-13:24:02] [V] [TRT] Searching for input: 1789 [03/25/2022-13:24:02] [V] [TRT] Searching for input: 1790 [03/25/2022-13:24:02] [V] [TRT] QuantizeLinear_616 [QuantizeLinear] inputs: [sections.2.3.conv3.module.weight -> (1024, 256, 1, 1)[FLOAT]], [1789 -> ()[FLOAT]], [1790 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Registering layer: sections.2.3.conv3.module.weight for ONNX node: sections.2.3.conv3.module.weight [03/25/2022-13:24:02] [V] [TRT] Registering tensor: 1791 for ONNX tensor: 1791 [03/25/2022-13:24:02] [V] [TRT] QuantizeLinear_616 [QuantizeLinear] outputs: [1791 -> (1024, 256, 1, 1)[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: QuantizeLinear_638 [QuantizeLinear] [03/25/2022-13:24:02] [V] [TRT] Searching for input: sections.2.4.conv1.module.weight [03/25/2022-13:24:02] [V] [TRT] Searching for input: 1811 [03/25/2022-13:24:02] [V] [TRT] Searching for input: 1812 [03/25/2022-13:24:02] [V] [TRT] QuantizeLinear_638 [QuantizeLinear] inputs: [sections.2.4.conv1.module.weight -> (256, 1024, 1, 1)[FLOAT]], [1811 -> ()[FLOAT]], [1812 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Registering layer: sections.2.4.conv1.module.weight for ONNX node: sections.2.4.conv1.module.weight [03/25/2022-13:24:02] [V] [TRT] Registering tensor: 1813 for ONNX tensor: 1813 [03/25/2022-13:24:02] [V] [TRT] QuantizeLinear_638 [QuantizeLinear] outputs: [1813 -> (256, 1024, 1, 1)[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: QuantizeLinear_653 [QuantizeLinear] [03/25/2022-13:24:02] [V] [TRT] Searching for input: sections.2.4.conv2.module.weight [03/25/2022-13:24:02] [V] [TRT] Searching for input: 1826 [03/25/2022-13:24:02] [V] [TRT] Searching for input: 1827 [03/25/2022-13:24:02] [V] [TRT] QuantizeLinear_653 [QuantizeLinear] inputs: [sections.2.4.conv2.module.weight -> (256, 256, 3, 3)[FLOAT]], [1826 -> ()[FLOAT]], [1827 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Registering layer: sections.2.4.conv2.module.weight for ONNX node: sections.2.4.conv2.module.weight [03/25/2022-13:24:02] [V] [TRT] Registering tensor: 1828 for ONNX tensor: 1828 [03/25/2022-13:24:02] [V] [TRT] QuantizeLinear_653 [QuantizeLinear] outputs: [1828 -> (256, 256, 3, 3)[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: QuantizeLinear_668 [QuantizeLinear] [03/25/2022-13:24:02] [V] [TRT] Searching for input: sections.2.4.conv3.module.weight [03/25/2022-13:24:02] [V] [TRT] Searching for input: 1841 [03/25/2022-13:24:02] [V] [TRT] Searching for input: 1842 [03/25/2022-13:24:02] [V] [TRT] QuantizeLinear_668 [QuantizeLinear] inputs: [sections.2.4.conv3.module.weight -> (1024, 256, 1, 1)[FLOAT]], [1841 -> ()[FLOAT]], [1842 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Registering layer: sections.2.4.conv3.module.weight for ONNX node: sections.2.4.conv3.module.weight [03/25/2022-13:24:02] [V] [TRT] Registering tensor: 1843 for ONNX tensor: 1843 [03/25/2022-13:24:02] [V] [TRT] QuantizeLinear_668 [QuantizeLinear] outputs: [1843 -> (1024, 256, 1, 1)[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: QuantizeLinear_690 [QuantizeLinear] [03/25/2022-13:24:02] [V] [TRT] Searching for input: sections.2.5.conv1.module.weight [03/25/2022-13:24:02] [V] [TRT] Searching for input: 1863 [03/25/2022-13:24:02] [V] [TRT] Searching for input: 1864 [03/25/2022-13:24:02] [V] [TRT] 
QuantizeLinear_690 [QuantizeLinear] inputs: [sections.2.5.conv1.module.weight -> (256, 1024, 1, 1)[FLOAT]], [1863 -> ()[FLOAT]], [1864 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Registering layer: sections.2.5.conv1.module.weight for ONNX node: sections.2.5.conv1.module.weight [03/25/2022-13:24:02] [V] [TRT] Registering tensor: 1865 for ONNX tensor: 1865 [03/25/2022-13:24:02] [V] [TRT] QuantizeLinear_690 [QuantizeLinear] outputs: [1865 -> (256, 1024, 1, 1)[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: QuantizeLinear_705 [QuantizeLinear] [03/25/2022-13:24:02] [V] [TRT] Searching for input: sections.2.5.conv2.module.weight [03/25/2022-13:24:02] [V] [TRT] Searching for input: 1878 [03/25/2022-13:24:02] [V] [TRT] Searching for input: 1879 [03/25/2022-13:24:02] [V] [TRT] QuantizeLinear_705 [QuantizeLinear] inputs: [sections.2.5.conv2.module.weight -> (256, 256, 3, 3)[FLOAT]], [1878 -> ()[FLOAT]], [1879 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Registering layer: sections.2.5.conv2.module.weight for ONNX node: sections.2.5.conv2.module.weight [03/25/2022-13:24:02] [V] [TRT] Registering tensor: 1880 for ONNX tensor: 1880 [03/25/2022-13:24:02] [V] [TRT] QuantizeLinear_705 [QuantizeLinear] outputs: [1880 -> (256, 256, 3, 3)[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: QuantizeLinear_720 [QuantizeLinear] [03/25/2022-13:24:02] [V] [TRT] Searching for input: sections.2.5.conv3.module.weight [03/25/2022-13:24:02] [V] [TRT] Searching for input: 1893 [03/25/2022-13:24:02] [V] [TRT] Searching for input: 1894 [03/25/2022-13:24:02] [V] [TRT] QuantizeLinear_720 [QuantizeLinear] inputs: [sections.2.5.conv3.module.weight -> (1024, 256, 1, 1)[FLOAT]], [1893 -> ()[FLOAT]], [1894 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Registering layer: sections.2.5.conv3.module.weight for ONNX node: sections.2.5.conv3.module.weight [03/25/2022-13:24:02] [V] [TRT] Registering tensor: 1895 for ONNX tensor: 1895 [03/25/2022-13:24:02] [V] [TRT] QuantizeLinear_720 [QuantizeLinear] outputs: [1895 -> (1024, 256, 1, 1)[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: QuantizeLinear_742 [QuantizeLinear] [03/25/2022-13:24:02] [V] [TRT] Searching for input: sections.3.0.conv1.module.weight [03/25/2022-13:24:02] [V] [TRT] Searching for input: 1915 [03/25/2022-13:24:02] [V] [TRT] Searching for input: 1916 [03/25/2022-13:24:02] [V] [TRT] QuantizeLinear_742 [QuantizeLinear] inputs: [sections.3.0.conv1.module.weight -> (512, 1024, 1, 1)[FLOAT]], [1915 -> ()[FLOAT]], [1916 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Registering layer: sections.3.0.conv1.module.weight for ONNX node: sections.3.0.conv1.module.weight [03/25/2022-13:24:02] [V] [TRT] Registering tensor: 1917 for ONNX tensor: 1917 [03/25/2022-13:24:02] [V] [TRT] QuantizeLinear_742 [QuantizeLinear] outputs: [1917 -> (512, 1024, 1, 1)[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: QuantizeLinear_757 [QuantizeLinear] [03/25/2022-13:24:02] [V] [TRT] Searching for input: sections.3.0.conv2.module.weight [03/25/2022-13:24:02] [V] [TRT] Searching for input: 1930 [03/25/2022-13:24:02] [V] [TRT] Searching for input: 1931 [03/25/2022-13:24:02] [V] [TRT] QuantizeLinear_757 [QuantizeLinear] inputs: [sections.3.0.conv2.module.weight -> (512, 512, 3, 3)[FLOAT]], [1930 -> ()[FLOAT]], [1931 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Registering layer: sections.3.0.conv2.module.weight for ONNX node: sections.3.0.conv2.module.weight [03/25/2022-13:24:02] [V] [TRT] Registering tensor: 1932 for ONNX tensor: 1932 [03/25/2022-13:24:02] [V] [TRT] 
QuantizeLinear_757 [QuantizeLinear] outputs: [1932 -> (512, 512, 3, 3)[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: QuantizeLinear_772 [QuantizeLinear] [03/25/2022-13:24:02] [V] [TRT] Searching for input: sections.3.0.conv3.module.weight [03/25/2022-13:24:02] [V] [TRT] Searching for input: 1945 [03/25/2022-13:24:02] [V] [TRT] Searching for input: 1946 [03/25/2022-13:24:02] [V] [TRT] QuantizeLinear_772 [QuantizeLinear] inputs: [sections.3.0.conv3.module.weight -> (2048, 512, 1, 1)[FLOAT]], [1945 -> ()[FLOAT]], [1946 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Registering layer: sections.3.0.conv3.module.weight for ONNX node: sections.3.0.conv3.module.weight [03/25/2022-13:24:02] [V] [TRT] Registering tensor: 1947 for ONNX tensor: 1947 [03/25/2022-13:24:02] [V] [TRT] QuantizeLinear_772 [QuantizeLinear] outputs: [1947 -> (2048, 512, 1, 1)[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: QuantizeLinear_786 [QuantizeLinear] [03/25/2022-13:24:02] [V] [TRT] Searching for input: sections.3.0.identity.conv.module.weight [03/25/2022-13:24:02] [V] [TRT] Searching for input: 1959 [03/25/2022-13:24:02] [V] [TRT] Searching for input: 1960 [03/25/2022-13:24:02] [V] [TRT] QuantizeLinear_786 [QuantizeLinear] inputs: [sections.3.0.identity.conv.module.weight -> (2048, 1024, 1, 1)[FLOAT]], [1959 -> ()[FLOAT]], [1960 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Registering layer: sections.3.0.identity.conv.module.weight for ONNX node: sections.3.0.identity.conv.module.weight [03/25/2022-13:24:02] [V] [TRT] Registering tensor: 1961 for ONNX tensor: 1961 [03/25/2022-13:24:02] [V] [TRT] QuantizeLinear_786 [QuantizeLinear] outputs: [1961 -> (2048, 1024, 1, 1)[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: QuantizeLinear_808 [QuantizeLinear] [03/25/2022-13:24:02] [V] [TRT] Searching for input: sections.3.1.conv1.module.weight [03/25/2022-13:24:02] [V] [TRT] Searching for input: 1981 [03/25/2022-13:24:02] [V] [TRT] Searching for input: 1982 [03/25/2022-13:24:02] [V] [TRT] QuantizeLinear_808 [QuantizeLinear] inputs: [sections.3.1.conv1.module.weight -> (512, 2048, 1, 1)[FLOAT]], [1981 -> ()[FLOAT]], [1982 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Registering layer: sections.3.1.conv1.module.weight for ONNX node: sections.3.1.conv1.module.weight [03/25/2022-13:24:02] [V] [TRT] Registering tensor: 1983 for ONNX tensor: 1983 [03/25/2022-13:24:02] [V] [TRT] QuantizeLinear_808 [QuantizeLinear] outputs: [1983 -> (512, 2048, 1, 1)[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: QuantizeLinear_823 [QuantizeLinear] [03/25/2022-13:24:02] [V] [TRT] Searching for input: sections.3.1.conv2.module.weight [03/25/2022-13:24:02] [V] [TRT] Searching for input: 1996 [03/25/2022-13:24:02] [V] [TRT] Searching for input: 1997 [03/25/2022-13:24:02] [V] [TRT] QuantizeLinear_823 [QuantizeLinear] inputs: [sections.3.1.conv2.module.weight -> (512, 512, 3, 3)[FLOAT]], [1996 -> ()[FLOAT]], [1997 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Registering layer: sections.3.1.conv2.module.weight for ONNX node: sections.3.1.conv2.module.weight [03/25/2022-13:24:02] [V] [TRT] Registering tensor: 1998 for ONNX tensor: 1998 [03/25/2022-13:24:02] [V] [TRT] QuantizeLinear_823 [QuantizeLinear] outputs: [1998 -> (512, 512, 3, 3)[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: QuantizeLinear_838 [QuantizeLinear] [03/25/2022-13:24:02] [V] [TRT] Searching for input: sections.3.1.conv3.module.weight [03/25/2022-13:24:02] [V] [TRT] Searching for input: 2011 [03/25/2022-13:24:02] [V] [TRT] Searching for input: 2012 
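
Note: the weight QuantizeLinear nodes finish just below, and the parser then moves to the matching DequantizeLinear nodes (DequantizeLinear_5 on the activation path, DequantizeLinear_11 onward for weights). Every QuantizeLinear output feeds a DequantizeLinear with its own scale and zero-point constants; this explicit Q/DQ ("fake quantization") pattern is what lets TensorRT choose INT8 kernels while the ONNX tensors remain typed FLOAT. A minimal Python sketch of the Q-to-DQ round trip follows, using an assumed max-abs scale rather than this model's calibrated values.

    import numpy as np

    def quantize_linear(x, scale, zero_point):
        return np.clip(np.rint(x / scale) + zero_point, -128, 127).astype(np.int8)

    def dequantize_linear(q, scale, zero_point):
        # ONNX DequantizeLinear: (q - zero_point) * scale, back to float32.
        return ((q.astype(np.int32) - zero_point) * scale).astype(np.float32)

    # Round-trip a tensor shaped like the first conv weight (64, 3, 7, 7).
    w = np.random.randn(64, 3, 7, 7).astype(np.float32)
    scale = np.float32(np.abs(w).max() / 127.0)  # assumed max-abs calibration
    w_qdq = dequantize_linear(quantize_linear(w, scale, np.int8(0)), scale, np.int8(0))
    print("max |w - QDQ(w)| =", float(np.abs(w - w_qdq).max()))
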
[03/25/2022-13:24:02] [V] [TRT] QuantizeLinear_838 [QuantizeLinear] inputs: [sections.3.1.conv3.module.weight -> (2048, 512, 1, 1)[FLOAT]], [2011 -> ()[FLOAT]], [2012 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Registering layer: sections.3.1.conv3.module.weight for ONNX node: sections.3.1.conv3.module.weight [03/25/2022-13:24:02] [V] [TRT] Registering tensor: 2013 for ONNX tensor: 2013 [03/25/2022-13:24:02] [V] [TRT] QuantizeLinear_838 [QuantizeLinear] outputs: [2013 -> (2048, 512, 1, 1)[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: QuantizeLinear_860 [QuantizeLinear] [03/25/2022-13:24:02] [V] [TRT] Searching for input: sections.3.2.conv1.module.weight [03/25/2022-13:24:02] [V] [TRT] Searching for input: 2033 [03/25/2022-13:24:02] [V] [TRT] Searching for input: 2034 [03/25/2022-13:24:02] [V] [TRT] QuantizeLinear_860 [QuantizeLinear] inputs: [sections.3.2.conv1.module.weight -> (512, 2048, 1, 1)[FLOAT]], [2033 -> ()[FLOAT]], [2034 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Registering layer: sections.3.2.conv1.module.weight for ONNX node: sections.3.2.conv1.module.weight [03/25/2022-13:24:02] [V] [TRT] Registering tensor: 2035 for ONNX tensor: 2035 [03/25/2022-13:24:02] [V] [TRT] QuantizeLinear_860 [QuantizeLinear] outputs: [2035 -> (512, 2048, 1, 1)[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: QuantizeLinear_875 [QuantizeLinear] [03/25/2022-13:24:02] [V] [TRT] Searching for input: sections.3.2.conv2.module.weight [03/25/2022-13:24:02] [V] [TRT] Searching for input: 2048 [03/25/2022-13:24:02] [V] [TRT] Searching for input: 2049 [03/25/2022-13:24:02] [V] [TRT] QuantizeLinear_875 [QuantizeLinear] inputs: [sections.3.2.conv2.module.weight -> (512, 512, 3, 3)[FLOAT]], [2048 -> ()[FLOAT]], [2049 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Registering layer: sections.3.2.conv2.module.weight for ONNX node: sections.3.2.conv2.module.weight [03/25/2022-13:24:02] [V] [TRT] Registering tensor: 2050 for ONNX tensor: 2050 [03/25/2022-13:24:02] [V] [TRT] QuantizeLinear_875 [QuantizeLinear] outputs: [2050 -> (512, 512, 3, 3)[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: QuantizeLinear_890 [QuantizeLinear] [03/25/2022-13:24:02] [V] [TRT] Searching for input: sections.3.2.conv3.module.weight [03/25/2022-13:24:02] [V] [TRT] Searching for input: 2063 [03/25/2022-13:24:02] [V] [TRT] Searching for input: 2064 [03/25/2022-13:24:02] [V] [TRT] QuantizeLinear_890 [QuantizeLinear] inputs: [sections.3.2.conv3.module.weight -> (2048, 512, 1, 1)[FLOAT]], [2063 -> ()[FLOAT]], [2064 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Registering layer: sections.3.2.conv3.module.weight for ONNX node: sections.3.2.conv3.module.weight [03/25/2022-13:24:02] [V] [TRT] Registering tensor: 2065 for ONNX tensor: 2065 [03/25/2022-13:24:02] [V] [TRT] QuantizeLinear_890 [QuantizeLinear] outputs: [2065 -> (2048, 512, 1, 1)[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: DequantizeLinear_5 [DequantizeLinear] [03/25/2022-13:24:02] [V] [TRT] Searching for input: 1177 [03/25/2022-13:24:02] [V] [TRT] Searching for input: 1178 [03/25/2022-13:24:02] [V] [TRT] Searching for input: 1179 [03/25/2022-13:24:02] [V] [TRT] DequantizeLinear_5 [DequantizeLinear] inputs: [1177 -> (-1, 3, 224, 224)[FLOAT]], [1178 -> ()[FLOAT]], [1179 -> ()[INT8]], [03/25/2022-13:24:02] [V] [TRT] Registering tensor: 1180 for ONNX tensor: 1180 [03/25/2022-13:24:02] [V] [TRT] DequantizeLinear_5 [DequantizeLinear] outputs: [1180 -> (-1, 3, 224, 224)[FLOAT]], [03/25/2022-13:24:02] [V] [TRT] Parsing node: DequantizeLinear_11 
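The QuantizeLinear nodes above cover the remaining bottleneck weights (sections.2.5 through sections.3.2). Each one takes a weight tensor plus a scalar scale and a scalar INT8 zero point, i.e. per-tensor weight quantization. Per the ONNX operator definition, QuantizeLinear computes y = saturate(round(x / y_scale) + y_zero_point), saturating to [-128, 127] for INT8. A minimal sketch in numpy (the scale value is illustrative; the real per-tensor scales live in initializers such as 1863, 1878, and 1893):

    import numpy as np

    def quantize_linear(x: np.ndarray, scale: float, zero_point: int = 0) -> np.ndarray:
        # ONNX QuantizeLinear, INT8 case: y = saturate(round(x / scale) + zero_point).
        # np.round matches the spec's round-half-to-even behavior.
        q = np.round(x / scale) + zero_point
        return np.clip(q, -128, 127).astype(np.int8)

    # Illustrative only: a weight shaped like sections.2.5.conv1.module.weight.
    w = np.random.randn(256, 1024, 1, 1).astype(np.float32)
    w_q = quantize_linear(w, scale=0.01)  # hypothetical scale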
[03/25/2022-13:24:02] [V] [TRT] Parsing node: DequantizeLinear_5 [DequantizeLinear]
[03/25/2022-13:24:02] [V] [TRT] Searching for input: 1177
[03/25/2022-13:24:02] [V] [TRT] Searching for input: 1178
[03/25/2022-13:24:02] [V] [TRT] Searching for input: 1179
[03/25/2022-13:24:02] [V] [TRT] DequantizeLinear_5 [DequantizeLinear] inputs: [1177 -> (-1, 3, 224, 224)[FLOAT]], [1178 -> ()[FLOAT]], [1179 -> ()[INT8]],
[03/25/2022-13:24:02] [V] [TRT] Registering tensor: 1180 for ONNX tensor: 1180
[03/25/2022-13:24:02] [V] [TRT] DequantizeLinear_5 [DequantizeLinear] outputs: [1180 -> (-1, 3, 224, 224)[FLOAT]],
[03/25/2022-13:24:02] [V] [TRT] Parsing node: DequantizeLinear_11 [DequantizeLinear]
[03/25/2022-13:24:02] [V] [TRT] Searching for input: 1183
[03/25/2022-13:24:02] [V] [TRT] Searching for input: 1184
[03/25/2022-13:24:02] [V] [TRT] Searching for input: 1185
[03/25/2022-13:24:02] [V] [TRT] DequantizeLinear_11 [DequantizeLinear] inputs: [1183 -> (64, 3, 7, 7)[FLOAT]], [1184 -> ()[FLOAT]], [1185 -> ()[INT8]],
[03/25/2022-13:24:02] [V] [TRT] Registering tensor: 1186 for ONNX tensor: 1186
[03/25/2022-13:24:02] [V] [TRT] DequantizeLinear_11 [DequantizeLinear] outputs: [1186 -> (64, 3, 7, 7)[FLOAT]],
[03/25/2022-13:24:02] [V] [TRT] Parsing node: DequantizeLinear_27 [DequantizeLinear]
[03/25/2022-13:24:02] [V] [TRT] Searching for input: 1199
[03/25/2022-13:24:02] [V] [TRT] Searching for input: 1200
[03/25/2022-13:24:02] [V] [TRT] Searching for input: 1201
[03/25/2022-13:24:02] [V] [TRT] DequantizeLinear_27 [DequantizeLinear] inputs: [1199 -> (64, 64, 1, 1)[FLOAT]], [1200 -> ()[FLOAT]], [1201 -> ()[INT8]],
[03/25/2022-13:24:02] [V] [TRT] Registering tensor: 1202 for ONNX tensor: 1202
[03/25/2022-13:24:02] [V] [TRT] DequantizeLinear_27 [DequantizeLinear] outputs: [1202 -> (64, 64, 1, 1)[FLOAT]],
[03/25/2022-13:24:02] [V] [TRT] Parsing node: DequantizeLinear_42 [DequantizeLinear]
[03/25/2022-13:24:02] [V] [TRT] Searching for input: 1214
[03/25/2022-13:24:02] [V] [TRT] Searching for input: 1215
[03/25/2022-13:24:02] [V] [TRT] Searching for input: 1216
[03/25/2022-13:24:02] [V] [TRT] DequantizeLinear_42 [DequantizeLinear] inputs: [1214 -> (64, 64, 3, 3)[FLOAT]], [1215 -> ()[FLOAT]], [1216 -> ()[INT8]],
[03/25/2022-13:24:02] [V] [TRT] Registering tensor: 1217 for ONNX tensor: 1217
[03/25/2022-13:24:02] [V] [TRT] DequantizeLinear_42 [DequantizeLinear] outputs: [1217 -> (64, 64, 3, 3)[FLOAT]],
[03/25/2022-13:24:02] [V] [TRT] Parsing node: DequantizeLinear_57 [DequantizeLinear]
[03/25/2022-13:24:02] [V] [TRT] Searching for input: 1229
[03/25/2022-13:24:02] [V] [TRT] Searching for input: 1230
[03/25/2022-13:24:02] [V] [TRT] Searching for input: 1231
[03/25/2022-13:24:02] [V] [TRT] DequantizeLinear_57 [DequantizeLinear] inputs: [1229 -> (256, 64, 1, 1)[FLOAT]], [1230 -> ()[FLOAT]], [1231 -> ()[INT8]],
[03/25/2022-13:24:02] [V] [TRT] Registering tensor: 1232 for ONNX tensor: 1232
[03/25/2022-13:24:02] [V] [TRT] DequantizeLinear_57 [DequantizeLinear] outputs: [1232 -> (256, 64, 1, 1)[FLOAT]],
[03/25/2022-13:24:02] [V] [TRT] Parsing node: DequantizeLinear_71 [DequantizeLinear]
[03/25/2022-13:24:02] [V] [TRT] Searching for input: 1243
[03/25/2022-13:24:02] [V] [TRT] Searching for input: 1244
[03/25/2022-13:24:02] [V] [TRT] Searching for input: 1245
[03/25/2022-13:24:02] [V] [TRT] DequantizeLinear_71 [DequantizeLinear] inputs: [1243 -> (256, 64, 1, 1)[FLOAT]], [1244 -> ()[FLOAT]], [1245 -> ()[INT8]],
[03/25/2022-13:24:02] [V] [TRT] Registering tensor: 1246 for ONNX tensor: 1246
[03/25/2022-13:24:02] [V] [TRT] DequantizeLinear_71 [DequantizeLinear] outputs: [1246 -> (256, 64, 1, 1)[FLOAT]],
[03/25/2022-13:24:02] [V] [TRT] Parsing node: DequantizeLinear_93 [DequantizeLinear]
[03/25/2022-13:24:02] [V] [TRT] Searching for input: 1265
[03/25/2022-13:24:02] [V] [TRT] Searching for input: 1266
[03/25/2022-13:24:02] [V] [TRT] Searching for input: 1267
[03/25/2022-13:24:02] [V] [TRT] DequantizeLinear_93 [DequantizeLinear] inputs: [1265 -> (64, 256, 1, 1)[FLOAT]], [1266 -> ()[FLOAT]], [1267 -> ()[INT8]],
[03/25/2022-13:24:02] [V] [TRT] Registering tensor: 1268 for ONNX tensor: 1268
[03/25/2022-13:24:02] [V] [TRT] DequantizeLinear_93 [DequantizeLinear] outputs: [1268 -> (64, 256, 1, 1)[FLOAT]],
[03/25/2022-13:24:02] [V] [TRT] Parsing node: DequantizeLinear_108 [DequantizeLinear]
[03/25/2022-13:24:02] [V] [TRT] Searching for input: 1280
[03/25/2022-13:24:02] [V] [TRT] Searching for input: 1281
[03/25/2022-13:24:02] [V] [TRT] Searching for input: 1282
[03/25/2022-13:24:02] [V] [TRT] DequantizeLinear_108 [DequantizeLinear] inputs: [1280 -> (64, 64, 3, 3)[FLOAT]], [1281 -> ()[FLOAT]], [1282 -> ()[INT8]],
[03/25/2022-13:24:02] [V] [TRT] Registering tensor: 1283 for ONNX tensor: 1283
[03/25/2022-13:24:02] [V] [TRT] DequantizeLinear_108 [DequantizeLinear] outputs: [1283 -> (64, 64, 3, 3)[FLOAT]],
[03/25/2022-13:24:02] [V] [TRT] Parsing node: DequantizeLinear_123 [DequantizeLinear]
[03/25/2022-13:24:02] [V] [TRT] Searching for input: 1295
[03/25/2022-13:24:02] [V] [TRT] Searching for input: 1296
[03/25/2022-13:24:02] [V] [TRT] Searching for input: 1297
[03/25/2022-13:24:02] [V] [TRT] DequantizeLinear_123 [DequantizeLinear] inputs: [1295 -> (256, 64, 1, 1)[FLOAT]], [1296 -> ()[FLOAT]], [1297 -> ()[INT8]],
[03/25/2022-13:24:02] [V] [TRT] Registering tensor: 1298 for ONNX tensor: 1298
[03/25/2022-13:24:02] [V] [TRT] DequantizeLinear_123 [DequantizeLinear] outputs: [1298 -> (256, 64, 1, 1)[FLOAT]],
[03/25/2022-13:24:02] [V] [TRT] Parsing node: DequantizeLinear_145 [DequantizeLinear]
[03/25/2022-13:24:02] [V] [TRT] Searching for input: 1317
[03/25/2022-13:24:02] [V] [TRT] Searching for input: 1318
[03/25/2022-13:24:02] [V] [TRT] Searching for input: 1319
[03/25/2022-13:24:02] [V] [TRT] DequantizeLinear_145 [DequantizeLinear] inputs: [1317 -> (64, 256, 1, 1)[FLOAT]], [1318 -> ()[FLOAT]], [1319 -> ()[INT8]],
[03/25/2022-13:24:02] [V] [TRT] Registering tensor: 1320 for ONNX tensor: 1320
[03/25/2022-13:24:02] [V] [TRT] DequantizeLinear_145 [DequantizeLinear] outputs: [1320 -> (64, 256, 1, 1)[FLOAT]],
[03/25/2022-13:24:02] [V] [TRT] Parsing node: DequantizeLinear_160 [DequantizeLinear]
[03/25/2022-13:24:02] [V] [TRT] Searching for input: 1332
[03/25/2022-13:24:02] [V] [TRT] Searching for input: 1333
[03/25/2022-13:24:02] [V] [TRT] Searching for input: 1334
[03/25/2022-13:24:02] [V] [TRT] DequantizeLinear_160 [DequantizeLinear] inputs: [1332 -> (64, 64, 3, 3)[FLOAT]], [1333 -> ()[FLOAT]], [1334 -> ()[INT8]],
[03/25/2022-13:24:02] [V] [TRT] Registering tensor: 1335 for ONNX tensor: 1335
[03/25/2022-13:24:02] [V] [TRT] DequantizeLinear_160 [DequantizeLinear] outputs: [1335 -> (64, 64, 3, 3)[FLOAT]],
[03/25/2022-13:24:02] [V] [TRT] Parsing node: DequantizeLinear_175 [DequantizeLinear]
[03/25/2022-13:24:02] [V] [TRT] Searching for input: 1347
[03/25/2022-13:24:02] [V] [TRT] Searching for input: 1348
[03/25/2022-13:24:02] [V] [TRT] Searching for input: 1349
[03/25/2022-13:24:02] [V] [TRT] DequantizeLinear_175 [DequantizeLinear] inputs: [1347 -> (256, 64, 1, 1)[FLOAT]], [1348 -> ()[FLOAT]], [1349 -> ()[INT8]],
[03/25/2022-13:24:02] [V] [TRT] Registering tensor: 1350 for ONNX tensor: 1350
[03/25/2022-13:24:02] [V] [TRT] DequantizeLinear_175 [DequantizeLinear] outputs: [1350 -> (256, 64, 1, 1)[FLOAT]],
[03/25/2022-13:24:02] [V] [TRT] Parsing node: DequantizeLinear_197 [DequantizeLinear]
[03/25/2022-13:24:02] [V] [TRT] Searching for input: 1369
[03/25/2022-13:24:02] [V] [TRT] Searching for input: 1370
[03/25/2022-13:24:02] [V] [TRT] Searching for input: 1371
[03/25/2022-13:24:02] [V] [TRT] DequantizeLinear_197 [DequantizeLinear] inputs: [1369 -> (128, 256, 1, 1)[FLOAT]], [1370 -> ()[FLOAT]], [1371 -> ()[INT8]],
[03/25/2022-13:24:02] [V] [TRT] Registering tensor: 1372 for ONNX tensor: 1372
[03/25/2022-13:24:02] [V] [TRT] DequantizeLinear_197 [DequantizeLinear] outputs: [1372 -> (128, 256, 1, 1)[FLOAT]],
[03/25/2022-13:24:02] [V] [TRT] Parsing node: DequantizeLinear_212 [DequantizeLinear]
[03/25/2022-13:24:02] [V] [TRT] Searching for input: 1384
[03/25/2022-13:24:02] [V] [TRT] Searching for input: 1385
[03/25/2022-13:24:02] [V] [TRT] Searching for input: 1386
[03/25/2022-13:24:02] [V] [TRT] DequantizeLinear_212 [DequantizeLinear] inputs: [1384 -> (128, 128, 3, 3)[FLOAT]], [1385 -> ()[FLOAT]], [1386 -> ()[INT8]],
[03/25/2022-13:24:02] [V] [TRT] Registering tensor: 1387 for ONNX tensor: 1387
[03/25/2022-13:24:02] [V] [TRT] DequantizeLinear_212 [DequantizeLinear] outputs: [1387 -> (128, 128, 3, 3)[FLOAT]],
[03/25/2022-13:24:02] [V] [TRT] Parsing node: DequantizeLinear_227 [DequantizeLinear]
[03/25/2022-13:24:02] [V] [TRT] Searching for input: 1399
[03/25/2022-13:24:02] [V] [TRT] Searching for input: 1400
[03/25/2022-13:24:02] [V] [TRT] Searching for input: 1401
[03/25/2022-13:24:02] [V] [TRT] DequantizeLinear_227 [DequantizeLinear] inputs: [1399 -> (512, 128, 1, 1)[FLOAT]], [1400 -> ()[FLOAT]], [1401 -> ()[INT8]],
[03/25/2022-13:24:02] [V] [TRT] Registering tensor: 1402 for ONNX tensor: 1402
[03/25/2022-13:24:02] [V] [TRT] DequantizeLinear_227 [DequantizeLinear] outputs: [1402 -> (512, 128, 1, 1)[FLOAT]],
[03/25/2022-13:24:02] [V] [TRT] Parsing node: DequantizeLinear_241 [DequantizeLinear]
[03/25/2022-13:24:02] [V] [TRT] Searching for input: 1413
[03/25/2022-13:24:02] [V] [TRT] Searching for input: 1414
[03/25/2022-13:24:02] [V] [TRT] Searching for input: 1415
[03/25/2022-13:24:02] [V] [TRT] DequantizeLinear_241 [DequantizeLinear] inputs: [1413 -> (512, 256, 1, 1)[FLOAT]], [1414 -> ()[FLOAT]], [1415 -> ()[INT8]],
[03/25/2022-13:24:02] [V] [TRT] Registering tensor: 1416 for ONNX tensor: 1416
[03/25/2022-13:24:02] [V] [TRT] DequantizeLinear_241 [DequantizeLinear] outputs: [1416 -> (512, 256, 1, 1)[FLOAT]],
[03/25/2022-13:24:02] [V] [TRT] Parsing node: DequantizeLinear_263 [DequantizeLinear]
[03/25/2022-13:24:02] [V] [TRT] Searching for input: 1435
[03/25/2022-13:24:02] [V] [TRT] Searching for input: 1436
[03/25/2022-13:24:02] [V] [TRT] Searching for input: 1437
[03/25/2022-13:24:02] [V] [TRT] DequantizeLinear_263 [DequantizeLinear] inputs: [1435 -> (128, 512, 1, 1)[FLOAT]], [1436 -> ()[FLOAT]], [1437 -> ()[INT8]],
[03/25/2022-13:24:02] [V] [TRT] Registering tensor: 1438 for ONNX tensor: 1438
[03/25/2022-13:24:02] [V] [TRT] DequantizeLinear_263 [DequantizeLinear] outputs: [1438 -> (128, 512, 1, 1)[FLOAT]],
[03/25/2022-13:24:02] [V] [TRT] Parsing node: DequantizeLinear_278 [DequantizeLinear]
[03/25/2022-13:24:02] [V] [TRT] Searching for input: 1450
[03/25/2022-13:24:02] [V] [TRT] Searching for input: 1451
[03/25/2022-13:24:02] [V] [TRT] Searching for input: 1452
[03/25/2022-13:24:02] [V] [TRT] DequantizeLinear_278 [DequantizeLinear] inputs: [1450 -> (128, 128, 3, 3)[FLOAT]], [1451 -> ()[FLOAT]], [1452 -> ()[INT8]],
[03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1453 for ONNX tensor: 1453
[03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_278 [DequantizeLinear] outputs: [1453 -> (128, 128, 3, 3)[FLOAT]],
[03/25/2022-13:24:03] [V] [TRT] Parsing node: DequantizeLinear_293 [DequantizeLinear]
[03/25/2022-13:24:03] [V] [TRT] Searching for input: 1465
[03/25/2022-13:24:03] [V] [TRT] Searching for input: 1466
[03/25/2022-13:24:03] [V] [TRT] Searching for input: 1467
[03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_293 [DequantizeLinear] inputs: [1465 -> (512, 128, 1, 1)[FLOAT]], [1466 -> ()[FLOAT]], [1467 -> ()[INT8]],
[03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1468 for ONNX tensor: 1468
[03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_293 [DequantizeLinear] outputs: [1468 -> (512, 128, 1, 1)[FLOAT]],
[03/25/2022-13:24:03] [V] [TRT] Parsing node: DequantizeLinear_315 [DequantizeLinear]
[03/25/2022-13:24:03] [V] [TRT] Searching for input: 1487
[03/25/2022-13:24:03] [V] [TRT] Searching for input: 1488
[03/25/2022-13:24:03] [V] [TRT] Searching for input: 1489
[03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_315 [DequantizeLinear] inputs: [1487 -> (128, 512, 1, 1)[FLOAT]], [1488 -> ()[FLOAT]], [1489 -> ()[INT8]],
[03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1490 for ONNX tensor: 1490
[03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_315 [DequantizeLinear] outputs: [1490 -> (128, 512, 1, 1)[FLOAT]],
[03/25/2022-13:24:03] [V] [TRT] Parsing node: DequantizeLinear_330 [DequantizeLinear]
[03/25/2022-13:24:03] [V] [TRT] Searching for input: 1502
[03/25/2022-13:24:03] [V] [TRT] Searching for input: 1503
[03/25/2022-13:24:03] [V] [TRT] Searching for input: 1504
[03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_330 [DequantizeLinear] inputs: [1502 -> (128, 128, 3, 3)[FLOAT]], [1503 -> ()[FLOAT]], [1504 -> ()[INT8]],
[03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1505 for ONNX tensor: 1505
[03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_330 [DequantizeLinear] outputs: [1505 -> (128, 128, 3, 3)[FLOAT]],
[03/25/2022-13:24:03] [V] [TRT] Parsing node: DequantizeLinear_345 [DequantizeLinear]
[03/25/2022-13:24:03] [V] [TRT] Searching for input: 1517
[03/25/2022-13:24:03] [V] [TRT] Searching for input: 1518
[03/25/2022-13:24:03] [V] [TRT] Searching for input: 1519
[03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_345 [DequantizeLinear] inputs: [1517 -> (512, 128, 1, 1)[FLOAT]], [1518 -> ()[FLOAT]], [1519 -> ()[INT8]],
[03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1520 for ONNX tensor: 1520
[03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_345 [DequantizeLinear] outputs: [1520 -> (512, 128, 1, 1)[FLOAT]],
[03/25/2022-13:24:03] [V] [TRT] Parsing node: DequantizeLinear_367 [DequantizeLinear]
[03/25/2022-13:24:03] [V] [TRT] Searching for input: 1539
[03/25/2022-13:24:03] [V] [TRT] Searching for input: 1540
[03/25/2022-13:24:03] [V] [TRT] Searching for input: 1541
[03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_367 [DequantizeLinear] inputs: [1539 -> (128, 512, 1, 1)[FLOAT]], [1540 -> ()[FLOAT]], [1541 -> ()[INT8]],
[03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1542 for ONNX tensor: 1542
[03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_367 [DequantizeLinear] outputs: [1542 -> (128, 512, 1, 1)[FLOAT]],
[03/25/2022-13:24:03] [V] [TRT] Parsing node: DequantizeLinear_382 [DequantizeLinear]
[03/25/2022-13:24:03] [V] [TRT] Searching for input: 1554
[03/25/2022-13:24:03] [V] [TRT] Searching for input: 1555
[03/25/2022-13:24:03] [V] [TRT] Searching for input: 1556
[03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_382 [DequantizeLinear] inputs: [1554 -> (128, 128, 3, 3)[FLOAT]], [1555 -> ()[FLOAT]], [1556 -> ()[INT8]],
[03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1557 for ONNX tensor: 1557
[03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_382 [DequantizeLinear] outputs: [1557 -> (128, 128, 3, 3)[FLOAT]],
[03/25/2022-13:24:03] [V] [TRT] Parsing node: DequantizeLinear_397 [DequantizeLinear]
[03/25/2022-13:24:03] [V] [TRT] Searching for input: 1569
[03/25/2022-13:24:03] [V] [TRT] Searching for input: 1570
[03/25/2022-13:24:03] [V] [TRT] Searching for input: 1571
[03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_397 [DequantizeLinear] inputs: [1569 -> (512, 128, 1, 1)[FLOAT]], [1570 -> ()[FLOAT]], [1571 -> ()[INT8]],
[03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1572 for ONNX tensor: 1572
[03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_397 [DequantizeLinear] outputs: [1572 -> (512, 128, 1, 1)[FLOAT]],
[03/25/2022-13:24:03] [V] [TRT] Parsing node: DequantizeLinear_419 [DequantizeLinear]
[03/25/2022-13:24:03] [V] [TRT] Searching for input: 1591
[03/25/2022-13:24:03] [V] [TRT] Searching for input: 1592
[03/25/2022-13:24:03] [V] [TRT] Searching for input: 1593
[03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_419 [DequantizeLinear] inputs: [1591 -> (256, 512, 1, 1)[FLOAT]], [1592 -> ()[FLOAT]], [1593 -> ()[INT8]],
[03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1594 for ONNX tensor: 1594
[03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_419 [DequantizeLinear] outputs: [1594 -> (256, 512, 1, 1)[FLOAT]],
[03/25/2022-13:24:03] [V] [TRT] Parsing node: DequantizeLinear_434 [DequantizeLinear]
[03/25/2022-13:24:03] [V] [TRT] Searching for input: 1606
[03/25/2022-13:24:03] [V] [TRT] Searching for input: 1607
[03/25/2022-13:24:03] [V] [TRT] Searching for input: 1608
[03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_434 [DequantizeLinear] inputs: [1606 -> (256, 256, 3, 3)[FLOAT]], [1607 -> ()[FLOAT]], [1608 -> ()[INT8]],
[03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1609 for ONNX tensor: 1609
[03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_434 [DequantizeLinear] outputs: [1609 -> (256, 256, 3, 3)[FLOAT]],
[03/25/2022-13:24:03] [V] [TRT] Parsing node: DequantizeLinear_449 [DequantizeLinear]
[03/25/2022-13:24:03] [V] [TRT] Searching for input: 1621
[03/25/2022-13:24:03] [V] [TRT] Searching for input: 1622
[03/25/2022-13:24:03] [V] [TRT] Searching for input: 1623
[03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_449 [DequantizeLinear] inputs: [1621 -> (1024, 256, 1, 1)[FLOAT]], [1622 -> ()[FLOAT]], [1623 -> ()[INT8]],
[03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1624 for ONNX tensor: 1624
[03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_449 [DequantizeLinear] outputs: [1624 -> (1024, 256, 1, 1)[FLOAT]],
[03/25/2022-13:24:03] [V] [TRT] Parsing node: DequantizeLinear_463 [DequantizeLinear]
[03/25/2022-13:24:03] [V] [TRT] Searching for input: 1635
[03/25/2022-13:24:03] [V] [TRT] Searching for input: 1636
[03/25/2022-13:24:03] [V] [TRT] Searching for input: 1637
[03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_463 [DequantizeLinear] inputs: [1635 -> (1024, 512, 1, 1)[FLOAT]], [1636 -> ()[FLOAT]], [1637 -> ()[INT8]],
[03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1638 for ONNX tensor: 1638
[03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_463 [DequantizeLinear] outputs: [1638 -> (1024, 512, 1, 1)[FLOAT]],
[03/25/2022-13:24:03] [V] [TRT] Parsing node: DequantizeLinear_485 [DequantizeLinear]
[03/25/2022-13:24:03] [V] [TRT] Searching for input: 1657
[03/25/2022-13:24:03] [V] [TRT] Searching for input: 1658
[03/25/2022-13:24:03] [V] [TRT] Searching for input: 1659
[03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_485 [DequantizeLinear] inputs: [1657 -> (256, 1024, 1, 1)[FLOAT]], [1658 -> ()[FLOAT]], [1659 -> ()[INT8]],
[03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1660 for ONNX tensor: 1660
[03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_485 [DequantizeLinear] outputs: [1660 -> (256, 1024, 1, 1)[FLOAT]],
[03/25/2022-13:24:03] [V] [TRT] Parsing node: DequantizeLinear_500 [DequantizeLinear]
[03/25/2022-13:24:03] [V] [TRT] Searching for input: 1672
[03/25/2022-13:24:03] [V] [TRT] Searching for input: 1673
[03/25/2022-13:24:03] [V] [TRT] Searching for input: 1674
[03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_500 [DequantizeLinear] inputs: [1672 -> (256, 256, 3, 3)[FLOAT]], [1673 -> ()[FLOAT]], [1674 -> ()[INT8]],
[03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1675 for ONNX tensor: 1675
[03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_500 [DequantizeLinear] outputs: [1675 -> (256, 256, 3, 3)[FLOAT]],
[03/25/2022-13:24:03] [V] [TRT] Parsing node: DequantizeLinear_515 [DequantizeLinear]
[03/25/2022-13:24:03] [V] [TRT] Searching for input: 1687
[03/25/2022-13:24:03] [V] [TRT] Searching for input: 1688
[03/25/2022-13:24:03] [V] [TRT] Searching for input: 1689
[03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_515 [DequantizeLinear] inputs: [1687 -> (1024, 256, 1, 1)[FLOAT]], [1688 -> ()[FLOAT]], [1689 -> ()[INT8]],
[03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1690 for ONNX tensor: 1690
[03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_515 [DequantizeLinear] outputs: [1690 -> (1024, 256, 1, 1)[FLOAT]],
[03/25/2022-13:24:03] [V] [TRT] Parsing node: DequantizeLinear_537 [DequantizeLinear]
[03/25/2022-13:24:03] [V] [TRT] Searching for input: 1709
[03/25/2022-13:24:03] [V] [TRT] Searching for input: 1710
[03/25/2022-13:24:03] [V] [TRT] Searching for input: 1711
[03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_537 [DequantizeLinear] inputs: [1709 -> (256, 1024, 1, 1)[FLOAT]], [1710 -> ()[FLOAT]], [1711 -> ()[INT8]],
[03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1712 for ONNX tensor: 1712
[03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_537 [DequantizeLinear] outputs: [1712 -> (256, 1024, 1, 1)[FLOAT]],
[03/25/2022-13:24:03] [V] [TRT] Parsing node: DequantizeLinear_552 [DequantizeLinear]
[03/25/2022-13:24:03] [V] [TRT] Searching for input: 1724
[03/25/2022-13:24:03] [V] [TRT] Searching for input: 1725
[03/25/2022-13:24:03] [V] [TRT] Searching for input: 1726
[03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_552 [DequantizeLinear] inputs: [1724 -> (256, 256, 3, 3)[FLOAT]], [1725 -> ()[FLOAT]], [1726 -> ()[INT8]],
[03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1727 for ONNX tensor: 1727
[03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_552 [DequantizeLinear] outputs: [1727 -> (256, 256, 3, 3)[FLOAT]],
[03/25/2022-13:24:03] [V] [TRT] Parsing node: DequantizeLinear_567 [DequantizeLinear]
[03/25/2022-13:24:03] [V] [TRT] Searching for input: 1739
[03/25/2022-13:24:03] [V] [TRT] Searching for input: 1740
[03/25/2022-13:24:03] [V] [TRT] Searching for input: 1741
[03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_567 [DequantizeLinear] inputs: [1739 -> (1024, 256, 1, 1)[FLOAT]], [1740 -> ()[FLOAT]], [1741 -> ()[INT8]],
[03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1742 for ONNX tensor: 1742
[03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_567 [DequantizeLinear] outputs: [1742 -> (1024, 256, 1, 1)[FLOAT]],
[03/25/2022-13:24:03] [V] [TRT] Parsing node: DequantizeLinear_589 [DequantizeLinear]
[03/25/2022-13:24:03] [V] [TRT] Searching for input: 1761
[03/25/2022-13:24:03] [V] [TRT] Searching for input: 1762
[03/25/2022-13:24:03] [V] [TRT] Searching for input: 1763
[03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_589 [DequantizeLinear] inputs: [1761 -> (256, 1024, 1, 1)[FLOAT]], [1762 -> ()[FLOAT]], [1763 -> ()[INT8]],
[03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1764 for ONNX tensor: 1764
[03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_589 [DequantizeLinear] outputs: [1764 -> (256, 1024, 1, 1)[FLOAT]],
[03/25/2022-13:24:03] [V] [TRT] Parsing node: DequantizeLinear_604 [DequantizeLinear]
[03/25/2022-13:24:03] [V] [TRT] Searching for input: 1776
[03/25/2022-13:24:03] [V] [TRT] Searching for input: 1777
[03/25/2022-13:24:03] [V] [TRT] Searching for input: 1778
[03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_604 [DequantizeLinear] inputs: [1776 -> (256, 256, 3, 3)[FLOAT]], [1777 -> ()[FLOAT]], [1778 -> ()[INT8]],
[03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1779 for ONNX tensor: 1779
[03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_604 [DequantizeLinear] outputs: [1779 -> (256, 256, 3, 3)[FLOAT]],
[03/25/2022-13:24:03] [V] [TRT] Parsing node: DequantizeLinear_619 [DequantizeLinear]
[03/25/2022-13:24:03] [V] [TRT] Searching for input: 1791
[03/25/2022-13:24:03] [V] [TRT] Searching for input: 1792
[03/25/2022-13:24:03] [V] [TRT] Searching for input: 1793
[03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_619 [DequantizeLinear] inputs: [1791 -> (1024, 256, 1, 1)[FLOAT]], [1792 -> ()[FLOAT]], [1793 -> ()[INT8]],
[03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1794 for ONNX tensor: 1794
[03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_619 [DequantizeLinear] outputs: [1794 -> (1024, 256, 1, 1)[FLOAT]],
[03/25/2022-13:24:03] [V] [TRT] Parsing node: DequantizeLinear_641 [DequantizeLinear]
[03/25/2022-13:24:03] [V] [TRT] Searching for input: 1813
[03/25/2022-13:24:03] [V] [TRT] Searching for input: 1814
[03/25/2022-13:24:03] [V] [TRT] Searching for input: 1815
[03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_641 [DequantizeLinear] inputs: [1813 -> (256, 1024, 1, 1)[FLOAT]], [1814 -> ()[FLOAT]], [1815 -> ()[INT8]],
[03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1816 for ONNX tensor: 1816
[03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_641 [DequantizeLinear] outputs: [1816 -> (256, 1024, 1, 1)[FLOAT]],
[03/25/2022-13:24:03] [V] [TRT] Parsing node: DequantizeLinear_656 [DequantizeLinear]
[03/25/2022-13:24:03] [V] [TRT] Searching for input: 1828
[03/25/2022-13:24:03] [V] [TRT] Searching for input: 1829
[03/25/2022-13:24:03] [V] [TRT] Searching for input: 1830
[03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_656 [DequantizeLinear] inputs: [1828 -> (256, 256, 3, 3)[FLOAT]], [1829 -> ()[FLOAT]], [1830 -> ()[INT8]],
[03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1831 for ONNX tensor: 1831
[03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_656 [DequantizeLinear] outputs: [1831 -> (256, 256, 3, 3)[FLOAT]],
[03/25/2022-13:24:03] [V] [TRT] Parsing node: DequantizeLinear_671 [DequantizeLinear]
[03/25/2022-13:24:03] [V] [TRT] Searching for input: 1843
[03/25/2022-13:24:03] [V] [TRT] Searching for input: 1844
[03/25/2022-13:24:03] [V] [TRT] Searching for input: 1845
[03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_671 [DequantizeLinear] inputs: [1843 -> (1024, 256, 1, 1)[FLOAT]], [1844 -> ()[FLOAT]], [1845 -> ()[INT8]],
[03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1846 for ONNX tensor: 1846
[03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_671 [DequantizeLinear] outputs: [1846 -> (1024, 256, 1, 1)[FLOAT]],
[03/25/2022-13:24:03] [V] [TRT] Parsing node: DequantizeLinear_693 [DequantizeLinear]
[03/25/2022-13:24:03] [V] [TRT] Searching for input: 1865
[03/25/2022-13:24:03] [V] [TRT] Searching for input: 1866
[03/25/2022-13:24:03] [V] [TRT] Searching for input: 1867
[03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_693 [DequantizeLinear] inputs: [1865 -> (256, 1024, 1, 1)[FLOAT]], [1866 -> ()[FLOAT]], [1867 -> ()[INT8]],
[03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1868 for ONNX tensor: 1868
[03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_693 [DequantizeLinear] outputs: [1868 -> (256, 1024, 1, 1)[FLOAT]],
[03/25/2022-13:24:03] [V] [TRT] Parsing node: DequantizeLinear_708 [DequantizeLinear]
[03/25/2022-13:24:03] [V] [TRT] Searching for input: 1880
[03/25/2022-13:24:03] [V] [TRT] Searching for input: 1881
[03/25/2022-13:24:03] [V] [TRT] Searching for input: 1882
[03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_708 [DequantizeLinear] inputs: [1880 -> (256, 256, 3, 3)[FLOAT]], [1881 -> ()[FLOAT]], [1882 -> ()[INT8]],
[03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1883 for ONNX tensor: 1883
[03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_708 [DequantizeLinear] outputs: [1883 -> (256, 256, 3, 3)[FLOAT]],
[03/25/2022-13:24:03] [V] [TRT] Parsing node: DequantizeLinear_723 [DequantizeLinear]
[03/25/2022-13:24:03] [V] [TRT] Searching for input: 1895
[03/25/2022-13:24:03] [V] [TRT] Searching for input: 1896
[03/25/2022-13:24:03] [V] [TRT] Searching for input: 1897
[03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_723 [DequantizeLinear] inputs: [1895 -> (1024, 256, 1, 1)[FLOAT]], [1896 -> ()[FLOAT]], [1897 -> ()[INT8]],
[03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1898 for ONNX tensor: 1898
[03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_723 [DequantizeLinear] outputs: [1898 -> (1024, 256, 1, 1)[FLOAT]],
[03/25/2022-13:24:03] [V] [TRT] Parsing node: DequantizeLinear_745 [DequantizeLinear]
[03/25/2022-13:24:03] [V] [TRT] Searching for input: 1917
[03/25/2022-13:24:03] [V] [TRT] Searching for input: 1918
[03/25/2022-13:24:03] [V] [TRT] Searching for input: 1919
[03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_745 [DequantizeLinear] inputs: [1917 -> (512, 1024, 1, 1)[FLOAT]], [1918 -> ()[FLOAT]], [1919 -> ()[INT8]],
[03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1920 for ONNX tensor: 1920
[03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_745 [DequantizeLinear] outputs: [1920 -> (512, 1024, 1, 1)[FLOAT]],
[03/25/2022-13:24:03] [V] [TRT] Parsing node: DequantizeLinear_760 [DequantizeLinear]
[03/25/2022-13:24:03] [V] [TRT] Searching for input: 1932
[03/25/2022-13:24:03] [V] [TRT] Searching for input: 1933
[03/25/2022-13:24:03] [V] [TRT] Searching for input: 1934
[03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_760 [DequantizeLinear] inputs: [1932 -> (512, 512, 3, 3)[FLOAT]], [1933 -> ()[FLOAT]], [1934 -> ()[INT8]],
[03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1935 for ONNX tensor: 1935
[03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_760 [DequantizeLinear] outputs: [1935 -> (512, 512, 3, 3)[FLOAT]],
[03/25/2022-13:24:03] [V] [TRT] Parsing node: DequantizeLinear_775 [DequantizeLinear]
[03/25/2022-13:24:03] [V] [TRT] Searching for input: 1947
[03/25/2022-13:24:03] [V] [TRT] Searching for input: 1948
[03/25/2022-13:24:03] [V] [TRT] Searching for input: 1949
[03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_775 [DequantizeLinear] inputs: [1947 -> (2048, 512, 1, 1)[FLOAT]], [1948 -> ()[FLOAT]], [1949 -> ()[INT8]],
[03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1950 for ONNX tensor: 1950
[03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_775 [DequantizeLinear] outputs: [1950 -> (2048, 512, 1, 1)[FLOAT]],
[03/25/2022-13:24:03] [V] [TRT] Parsing node: DequantizeLinear_789 [DequantizeLinear]
[03/25/2022-13:24:03] [V] [TRT] Searching for input: 1961
[03/25/2022-13:24:03] [V] [TRT] Searching for input: 1962
[03/25/2022-13:24:03] [V] [TRT] Searching for input: 1963
[03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_789 [DequantizeLinear] inputs: [1961 -> (2048, 1024, 1, 1)[FLOAT]], [1962 -> ()[FLOAT]], [1963 -> ()[INT8]],
[03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1964 for ONNX tensor: 1964
[03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_789 [DequantizeLinear] outputs: [1964 -> (2048, 1024, 1, 1)[FLOAT]],
[03/25/2022-13:24:03] [V] [TRT] Parsing node: DequantizeLinear_811 [DequantizeLinear]
[03/25/2022-13:24:03] [V] [TRT] Searching for input: 1983
[03/25/2022-13:24:03] [V] [TRT] Searching for input: 1984
[03/25/2022-13:24:03] [V] [TRT] Searching for input: 1985
[03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_811 [DequantizeLinear] inputs: [1983 -> (512, 2048, 1, 1)[FLOAT]], [1984 -> ()[FLOAT]], [1985 -> ()[INT8]],
[03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1986 for ONNX tensor: 1986
[03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_811 [DequantizeLinear] outputs: [1986 -> (512, 2048, 1, 1)[FLOAT]],
[03/25/2022-13:24:03] [V] [TRT] Parsing node: DequantizeLinear_826 [DequantizeLinear]
[03/25/2022-13:24:03] [V] [TRT] Searching for input: 1998
[03/25/2022-13:24:03] [V] [TRT] Searching for input: 1999
[03/25/2022-13:24:03] [V] [TRT] Searching for input: 2000
[03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_826 [DequantizeLinear] inputs: [1998 -> (512, 512, 3, 3)[FLOAT]], [1999 -> ()[FLOAT]], [2000 -> ()[INT8]],
[03/25/2022-13:24:03] [V] [TRT] Registering tensor: 2001 for ONNX tensor: 2001
[03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_826 [DequantizeLinear] outputs: [2001 -> (512, 512, 3, 3)[FLOAT]],
[03/25/2022-13:24:03] [V] [TRT] Parsing node: DequantizeLinear_841 [DequantizeLinear]
[03/25/2022-13:24:03] [V] [TRT] Searching for input: 2013
[03/25/2022-13:24:03] [V] [TRT] Searching for input: 2014
[03/25/2022-13:24:03] [V] [TRT] Searching for input: 2015
[03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_841 [DequantizeLinear] inputs: [2013 -> (2048, 512, 1, 1)[FLOAT]], [2014 -> ()[FLOAT]], [2015 -> ()[INT8]],
[03/25/2022-13:24:03] [V] [TRT] Registering tensor: 2016 for ONNX tensor: 2016
[03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_841 [DequantizeLinear] outputs: [2016 -> (2048, 512, 1, 1)[FLOAT]],
[03/25/2022-13:24:03] [V] [TRT] Parsing node: DequantizeLinear_863 [DequantizeLinear]
[03/25/2022-13:24:03] [V] [TRT] Searching for input: 2035
[03/25/2022-13:24:03] [V] [TRT] Searching for input: 2036
[03/25/2022-13:24:03] [V] [TRT] Searching for input: 2037
[03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_863 [DequantizeLinear] inputs: [2035 -> (512, 2048, 1, 1)[FLOAT]], [2036 -> ()[FLOAT]], [2037 -> ()[INT8]],
[03/25/2022-13:24:03] [V] [TRT] Registering tensor: 2038 for ONNX tensor: 2038
[03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_863 [DequantizeLinear] outputs: [2038 -> (512, 2048, 1, 1)[FLOAT]],
[03/25/2022-13:24:03] [V] [TRT] Parsing node: DequantizeLinear_878 [DequantizeLinear]
[03/25/2022-13:24:03] [V] [TRT] Searching for input: 2050
[03/25/2022-13:24:03] [V] [TRT] Searching for input: 2051
[03/25/2022-13:24:03] [V] [TRT] Searching for input: 2052
[03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_878 [DequantizeLinear] inputs: [2050 -> (512, 512, 3, 3)[FLOAT]], [2051 -> ()[FLOAT]], [2052 -> ()[INT8]],
[03/25/2022-13:24:03] [V] [TRT] Registering tensor: 2053 for ONNX tensor: 2053
[03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_878 [DequantizeLinear] outputs: [2053 -> (512, 512, 3, 3)[FLOAT]],
[03/25/2022-13:24:03] [V] [TRT] Parsing node: DequantizeLinear_893 [DequantizeLinear]
[03/25/2022-13:24:03] [V] [TRT] Searching for input: 2065
[03/25/2022-13:24:03] [V] [TRT] Searching for input: 2066
[03/25/2022-13:24:03] [V] [TRT] Searching for input: 2067
[03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_893 [DequantizeLinear] inputs: [2065 -> (2048, 512, 1, 1)[FLOAT]], [2066 -> ()[FLOAT]], [2067 -> ()[INT8]],
[03/25/2022-13:24:03] [V] [TRT] Registering tensor: 2068 for ONNX tensor: 2068
[03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_893 [DequantizeLinear] outputs: [2068 -> (2048, 512, 1, 1)[FLOAT]],
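The matching DequantizeLinear nodes invert the mapping, x = (q - zero_point) * scale, which is why every Q/DQ output in this trace is still registered as [FLOAT]: an explicitly quantized ONNX graph carries fake-quantized float tensors, and the builder later decides which regions actually run in INT8. A round-trip sketch under the same illustrative-scale assumption as above:

    import numpy as np

    def dequantize_linear(q: np.ndarray, scale: float, zero_point: int = 0) -> np.ndarray:
        # ONNX DequantizeLinear: x = (q - zero_point) * scale.
        return ((q.astype(np.int32) - zero_point) * scale).astype(np.float32)

    def fake_quant(x: np.ndarray, scale: float) -> np.ndarray:
        # QuantizeLinear followed by DequantizeLinear: the float tensor seen
        # between each node pair in the log.
        q = np.clip(np.round(x / scale), -128, 127).astype(np.int8)
        return dequantize_linear(q, scale)

    x = np.random.randn(1, 3, 224, 224).astype(np.float32)
    x_hat = fake_quant(x, scale=0.02)  # hypothetical activation scale (e.g. initializer 1178)
    # Within the clip range, |x - x_hat| is bounded by scale / 2.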
[03/25/2022-13:24:03] [V] [TRT] Parsing node: Conv_12 [Conv]
[03/25/2022-13:24:03] [V] [TRT] Searching for input: 1180
[03/25/2022-13:24:03] [V] [TRT] Searching for input: 1186
[03/25/2022-13:24:03] [V] [TRT] Conv_12 [Conv] inputs: [1180 -> (-1, 3, 224, 224)[FLOAT]], [1186 -> (64, 3, 7, 7)[FLOAT]],
[03/25/2022-13:24:03] [V] [TRT] Convolution input dimensions: (-1, 3, 224, 224)
[03/25/2022-13:24:03] [V] [TRT] Kernel weights are not set yet. Kernel weights must be set using setInput(1, kernel_tensor) API call.
[03/25/2022-13:24:03] [V] [TRT] Registering layer: Conv_12 for ONNX node: Conv_12
[03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1187 for ONNX tensor: 1187
[03/25/2022-13:24:03] [V] [TRT] Conv_12 [Conv] outputs: [1187 -> (-1, 64, 112, 112)[FLOAT]],
[03/25/2022-13:24:03] [V] [TRT] Parsing node: BatchNormalization_13 [BatchNormalization]
[03/25/2022-13:24:03] [V] [TRT] Searching for input: 1187
[03/25/2022-13:24:03] [V] [TRT] Searching for input: input.bn.bn.weight
[03/25/2022-13:24:03] [V] [TRT] Searching for input: input.bn.bn.bias
[03/25/2022-13:24:03] [V] [TRT] Searching for input: input.bn.bn.running_mean
[03/25/2022-13:24:03] [V] [TRT] Searching for input: input.bn.bn.running_var
[03/25/2022-13:24:03] [V] [TRT] BatchNormalization_13 [BatchNormalization] inputs: [1187 -> (-1, 64, 112, 112)[FLOAT]], [input.bn.bn.weight -> (64)[FLOAT]], [input.bn.bn.bias -> (64)[FLOAT]], [input.bn.bn.running_mean -> (64)[FLOAT]], [input.bn.bn.running_var -> (64)[FLOAT]],
[03/25/2022-13:24:03] [V] [TRT] Registering layer: BatchNormalization_13 for ONNX node: BatchNormalization_13
[03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1188 for ONNX tensor: 1188
[03/25/2022-13:24:03] [V] [TRT] BatchNormalization_13 [BatchNormalization] outputs: [1188 -> (-1, 64, 112, 112)[FLOAT]],
[03/25/2022-13:24:03] [V] [TRT] Parsing node: Relu_14 [Relu]
[03/25/2022-13:24:03] [V] [TRT] Searching for input: 1188
[03/25/2022-13:24:03] [V] [TRT] Relu_14 [Relu] inputs: [1188 -> (-1, 64, 112, 112)[FLOAT]],
[03/25/2022-13:24:03] [V] [TRT] Registering layer: Relu_14 for ONNX node: Relu_14
[03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1189 for ONNX tensor: 1189
[03/25/2022-13:24:03] [V] [TRT] Relu_14 [Relu] outputs: [1189 -> (-1, 64, 112, 112)[FLOAT]],
[03/25/2022-13:24:03] [V] [TRT] Parsing node: MaxPool_15 [MaxPool]
[03/25/2022-13:24:03] [V] [TRT] Searching for input: 1189
[03/25/2022-13:24:03] [V] [TRT] MaxPool_15 [MaxPool] inputs: [1189 -> (-1, 64, 112, 112)[FLOAT]],
[03/25/2022-13:24:03] [V] [TRT] Registering layer: MaxPool_15 for ONNX node: MaxPool_15
[03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1190 for ONNX tensor: 1190
[03/25/2022-13:24:03] [V] [TRT] MaxPool_15 [MaxPool] outputs: [1190 -> (-1, 64, 56, 56)[FLOAT]],
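The repeated message "Kernel weights are not set yet. Kernel weights must be set using setInput(1, kernel_tensor) API call." is expected for Q/DQ networks: Conv_12's kernel is not a static initializer but the output of DequantizeLinear_11 (tensor 1186), so the parser creates the convolution with empty weights and wires the dequantized tensor in as the layer's second input. Roughly what that pattern looks like through TensorRT's Python network-definition API (a hedged sketch of the mechanism, not the parser's actual code; shapes follow Conv_12):

    import tensorrt as trt

    logger = trt.Logger(trt.Logger.VERBOSE)
    builder = trt.Builder(logger)
    network = builder.create_network(
        1 << int(trt.NetworkDefinitionCreationFlag.EXPLICIT_BATCH))

    data = network.add_input("input", trt.float32, (-1, 3, 224, 224))
    # ... constant + QuantizeLinear + DequantizeLinear layers producing the
    # dequantized kernel tensor (1186 in the log) would go here ...
    # kernel_tensor = dq_layer.get_output(0)  # a (64, 3, 7, 7) float tensor

    conv = network.add_convolution_nd(
        data, num_output_maps=64, kernel_shape=(7, 7),
        kernel=trt.Weights())  # empty weights: "Kernel weights are not set yet"
    # conv.set_input(1, kernel_tensor)  # the setInput(1, kernel_tensor) call from the log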
[03/25/2022-13:24:03] [V] [TRT] Parsing node: QuantizeLinear_18 [QuantizeLinear]
[03/25/2022-13:24:03] [V] [TRT] Searching for input: 1190
[03/25/2022-13:24:03] [V] [TRT] Searching for input: 1191
[03/25/2022-13:24:03] [V] [TRT] Searching for input: 1192
[03/25/2022-13:24:03] [V] [TRT] QuantizeLinear_18 [QuantizeLinear] inputs: [1190 -> (-1, 64, 56, 56)[FLOAT]], [1191 -> ()[FLOAT]], [1192 -> ()[INT8]],
[03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1193 for ONNX tensor: 1193
[03/25/2022-13:24:03] [V] [TRT] QuantizeLinear_18 [QuantizeLinear] outputs: [1193 -> (-1, 64, 56, 56)[FLOAT]],
[03/25/2022-13:24:03] [V] [TRT] Parsing node: QuantizeLinear_62 [QuantizeLinear]
[03/25/2022-13:24:03] [V] [TRT] Searching for input: 1190
[03/25/2022-13:24:03] [V] [TRT] Searching for input: 1235
[03/25/2022-13:24:03] [V] [TRT] Searching for input: 1236
[03/25/2022-13:24:03] [V] [TRT] QuantizeLinear_62 [QuantizeLinear] inputs: [1190 -> (-1, 64, 56, 56)[FLOAT]], [1235 -> ()[FLOAT]], [1236 -> ()[INT8]],
[03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1237 for ONNX tensor: 1237
[03/25/2022-13:24:03] [V] [TRT] QuantizeLinear_62 [QuantizeLinear] outputs: [1237 -> (-1, 64, 56, 56)[FLOAT]],
[03/25/2022-13:24:03] [V] [TRT] Parsing node: DequantizeLinear_21 [DequantizeLinear]
[03/25/2022-13:24:03] [V] [TRT] Searching for input: 1193
[03/25/2022-13:24:03] [V] [TRT] Searching for input: 1194
[03/25/2022-13:24:03] [V] [TRT] Searching for input: 1195
[03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_21 [DequantizeLinear] inputs: [1193 -> (-1, 64, 56, 56)[FLOAT]], [1194 -> ()[FLOAT]], [1195 -> ()[INT8]],
[03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1196 for ONNX tensor: 1196
[03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_21 [DequantizeLinear] outputs: [1196 -> (-1, 64, 56, 56)[FLOAT]],
[03/25/2022-13:24:03] [V] [TRT] Parsing node: DequantizeLinear_65 [DequantizeLinear]
[03/25/2022-13:24:03] [V] [TRT] Searching for input: 1237
[03/25/2022-13:24:03] [V] [TRT] Searching for input: 1238
[03/25/2022-13:24:03] [V] [TRT] Searching for input: 1239
[03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_65 [DequantizeLinear] inputs: [1237 -> (-1, 64, 56, 56)[FLOAT]], [1238 -> ()[FLOAT]], [1239 -> ()[INT8]],
[03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1240 for ONNX tensor: 1240
[03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_65 [DequantizeLinear] outputs: [1240 -> (-1, 64, 56, 56)[FLOAT]],
[03/25/2022-13:24:03] [V] [TRT] Parsing node: Conv_28 [Conv]
[03/25/2022-13:24:03] [V] [TRT] Searching for input: 1196
[03/25/2022-13:24:03] [V] [TRT] Searching for input: 1202
[03/25/2022-13:24:03] [V] [TRT] Conv_28 [Conv] inputs: [1196 -> (-1, 64, 56, 56)[FLOAT]], [1202 -> (64, 64, 1, 1)[FLOAT]],
[03/25/2022-13:24:03] [V] [TRT] Convolution input dimensions: (-1, 64, 56, 56)
[03/25/2022-13:24:03] [V] [TRT] Kernel weights are not set yet. Kernel weights must be set using setInput(1, kernel_tensor) API call.
[03/25/2022-13:24:03] [V] [TRT] Registering layer: Conv_28 for ONNX node: Conv_28
[03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1203 for ONNX tensor: 1203
[03/25/2022-13:24:03] [V] [TRT] Conv_28 [Conv] outputs: [1203 -> (-1, 64, 56, 56)[FLOAT]],
[03/25/2022-13:24:03] [V] [TRT] Parsing node: Conv_72 [Conv]
[03/25/2022-13:24:03] [V] [TRT] Searching for input: 1240
[03/25/2022-13:24:03] [V] [TRT] Searching for input: 1246
[03/25/2022-13:24:03] [V] [TRT] Conv_72 [Conv] inputs: [1240 -> (-1, 64, 56, 56)[FLOAT]], [1246 -> (256, 64, 1, 1)[FLOAT]],
[03/25/2022-13:24:03] [V] [TRT] Convolution input dimensions: (-1, 64, 56, 56)
[03/25/2022-13:24:03] [V] [TRT] Kernel weights are not set yet. Kernel weights must be set using setInput(1, kernel_tensor) API call.
[03/25/2022-13:24:03] [V] [TRT] Registering layer: Conv_72 for ONNX node: Conv_72
[03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1247 for ONNX tensor: 1247
[03/25/2022-13:24:03] [V] [TRT] Conv_72 [Conv] outputs: [1247 -> (-1, 256, 56, 56)[FLOAT]],
[03/25/2022-13:24:03] [V] [TRT] Parsing node: BatchNormalization_29 [BatchNormalization]
[03/25/2022-13:24:03] [V] [TRT] Searching for input: 1203
[03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.0.0.bn1.bn.weight
[03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.0.0.bn1.bn.bias
[03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.0.0.bn1.bn.running_mean
[03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.0.0.bn1.bn.running_var
[03/25/2022-13:24:03] [V] [TRT] BatchNormalization_29 [BatchNormalization] inputs: [1203 -> (-1, 64, 56, 56)[FLOAT]], [sections.0.0.bn1.bn.weight -> (64)[FLOAT]], [sections.0.0.bn1.bn.bias -> (64)[FLOAT]], [sections.0.0.bn1.bn.running_mean -> (64)[FLOAT]], [sections.0.0.bn1.bn.running_var -> (64)[FLOAT]],
[03/25/2022-13:24:03] [V] [TRT] Registering layer: BatchNormalization_29 for ONNX node: BatchNormalization_29
[03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1204 for ONNX tensor: 1204
[03/25/2022-13:24:03] [V] [TRT] BatchNormalization_29 [BatchNormalization] outputs: [1204 -> (-1, 64, 56, 56)[FLOAT]],
[03/25/2022-13:24:03] [V] [TRT] Parsing node: BatchNormalization_73 [BatchNormalization]
[03/25/2022-13:24:03] [V] [TRT] Searching for input: 1247
[03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.0.0.identity.bn.bn.weight
[03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.0.0.identity.bn.bn.bias
[03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.0.0.identity.bn.bn.running_mean
[03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.0.0.identity.bn.bn.running_var
[03/25/2022-13:24:03] [V] [TRT] BatchNormalization_73 [BatchNormalization] inputs: [1247 -> (-1, 256, 56, 56)[FLOAT]], [sections.0.0.identity.bn.bn.weight -> (256)[FLOAT]], [sections.0.0.identity.bn.bn.bias -> (256)[FLOAT]], [sections.0.0.identity.bn.bn.running_mean -> (256)[FLOAT]], [sections.0.0.identity.bn.bn.running_var -> (256)[FLOAT]],
[03/25/2022-13:24:03] [V] [TRT] Registering layer: BatchNormalization_73 for ONNX node: BatchNormalization_73
[03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1248 for ONNX tensor: 1248
[03/25/2022-13:24:03] [V] [TRT] BatchNormalization_73 [BatchNormalization] outputs: [1248 -> (-1, 256, 56, 56)[FLOAT]],
[03/25/2022-13:24:03] [V] [TRT] Parsing node: Relu_30 [Relu]
[03/25/2022-13:24:03] [V] [TRT] Searching for input: 1204
[03/25/2022-13:24:03] [V] [TRT] Relu_30 [Relu] inputs: [1204 -> (-1, 64, 56, 56)[FLOAT]],
[03/25/2022-13:24:03] [V] [TRT] Registering layer: Relu_30 for ONNX node: Relu_30
[03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1205 for ONNX tensor: 1205
[03/25/2022-13:24:03] [V] [TRT] Relu_30 [Relu] outputs: [1205 -> (-1, 64, 56, 56)[FLOAT]],
[03/25/2022-13:24:03] [V] [TRT] Parsing node: QuantizeLinear_76 [QuantizeLinear]
[03/25/2022-13:24:03] [V] [TRT] Searching for input: 1248
[03/25/2022-13:24:03] [V] [TRT] Searching for input: 1249
[03/25/2022-13:24:03] [V] [TRT] Searching for input: 1250
[03/25/2022-13:24:03] [V] [TRT] QuantizeLinear_76 [QuantizeLinear] inputs: [1248 -> (-1, 256, 56, 56)[FLOAT]], [1249 -> ()[FLOAT]], [1250 -> ()[INT8]],
[03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1251 for ONNX tensor: 1251
[03/25/2022-13:24:03] [V] [TRT] QuantizeLinear_76 [QuantizeLinear] outputs: [1251 -> (-1, 256, 56, 56)[FLOAT]],
[03/25/2022-13:24:03] [V] [TRT] Parsing node: QuantizeLinear_33 [QuantizeLinear]
[03/25/2022-13:24:03] [V] [TRT] Searching for input: 1205
[03/25/2022-13:24:03] [V] [TRT] Searching for input: 1206
[03/25/2022-13:24:03] [V] [TRT] Searching for input: 1207
[03/25/2022-13:24:03] [V] [TRT] QuantizeLinear_33 [QuantizeLinear] inputs: [1205 -> (-1, 64, 56, 56)[FLOAT]], [1206 -> ()[FLOAT]], [1207 -> ()[INT8]],
[03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1208 for ONNX tensor: 1208
[03/25/2022-13:24:03] [V] [TRT] QuantizeLinear_33 [QuantizeLinear] outputs: [1208 -> (-1, 64, 56, 56)[FLOAT]],
[03/25/2022-13:24:03] [V] [TRT] Parsing node: DequantizeLinear_79 [DequantizeLinear]
[03/25/2022-13:24:03] [V] [TRT] Searching for input: 1251
[03/25/2022-13:24:03] [V] [TRT] Searching for input: 1252
[03/25/2022-13:24:03] [V] [TRT] Searching for input: 1253
[03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_79 [DequantizeLinear] inputs: [1251 -> (-1, 256, 56, 56)[FLOAT]], [1252 -> ()[FLOAT]], [1253 -> ()[INT8]],
[03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1254 for ONNX tensor: 1254
[03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_79 [DequantizeLinear] outputs: [1254 -> (-1, 256, 56, 56)[FLOAT]],
[03/25/2022-13:24:03] [V] [TRT] Parsing node: DequantizeLinear_36 [DequantizeLinear]
[03/25/2022-13:24:03] [V] [TRT] Searching for input: 1208
[03/25/2022-13:24:03] [V] [TRT] Searching for input: 1209
[03/25/2022-13:24:03] [V] [TRT] Searching for input: 1210
[03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_36 [DequantizeLinear] inputs: [1208 -> (-1, 64, 56, 56)[FLOAT]], [1209 -> ()[FLOAT]], [1210 -> ()[INT8]],
[03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1211 for ONNX tensor: 1211
[03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_36 [DequantizeLinear] outputs: [1211 -> (-1, 64, 56, 56)[FLOAT]],
[03/25/2022-13:24:03] [V] [TRT] Parsing node: Conv_43 [Conv]
[03/25/2022-13:24:03] [V] [TRT] Searching for input: 1211
[03/25/2022-13:24:03] [V] [TRT] Searching for input: 1217
[03/25/2022-13:24:03] [V] [TRT] Conv_43 [Conv] inputs: [1211 -> (-1, 64, 56, 56)[FLOAT]], [1217 -> (64, 64, 3, 3)[FLOAT]],
[03/25/2022-13:24:03] [V] [TRT] Convolution input dimensions: (-1, 64, 56, 56)
[03/25/2022-13:24:03] [V] [TRT] Kernel weights are not set yet. Kernel weights must be set using setInput(1, kernel_tensor) API call.
[03/25/2022-13:24:03] [V] [TRT] Registering layer: Conv_43 for ONNX node: Conv_43
[03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1218 for ONNX tensor: 1218
[03/25/2022-13:24:03] [V] [TRT] Conv_43 [Conv] outputs: [1218 -> (-1, 64, 56, 56)[FLOAT]],
[03/25/2022-13:24:03] [V] [TRT] Parsing node: BatchNormalization_44 [BatchNormalization]
[03/25/2022-13:24:03] [V] [TRT] Searching for input: 1218
[03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.0.0.bn2.bn.weight
[03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.0.0.bn2.bn.bias
[03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.0.0.bn2.bn.running_mean
[03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.0.0.bn2.bn.running_var
[03/25/2022-13:24:03] [V] [TRT] BatchNormalization_44 [BatchNormalization] inputs: [1218 -> (-1, 64, 56, 56)[FLOAT]], [sections.0.0.bn2.bn.weight -> (64)[FLOAT]], [sections.0.0.bn2.bn.bias -> (64)[FLOAT]], [sections.0.0.bn2.bn.running_mean -> (64)[FLOAT]], [sections.0.0.bn2.bn.running_var -> (64)[FLOAT]],
[03/25/2022-13:24:03] [V] [TRT] Registering layer: BatchNormalization_44 for ONNX node: BatchNormalization_44
[03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1219 for ONNX tensor: 1219
[03/25/2022-13:24:03] [V] [TRT] BatchNormalization_44 [BatchNormalization] outputs: [1219 -> (-1, 64, 56, 56)[FLOAT]],
[03/25/2022-13:24:03] [V] [TRT] Parsing node: Relu_45 [Relu]
[03/25/2022-13:24:03] [V] [TRT] Searching for input: 1219
[03/25/2022-13:24:03] [V] [TRT] Relu_45 [Relu] inputs: [1219 -> (-1, 64, 56, 56)[FLOAT]],
[03/25/2022-13:24:03] [V] [TRT] Registering layer: Relu_45 for ONNX node: Relu_45
[03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1220 for ONNX tensor: 1220
[03/25/2022-13:24:03] [V] [TRT] Relu_45 [Relu] outputs: [1220 -> (-1, 64, 56, 56)[FLOAT]],
[03/25/2022-13:24:03] [V] [TRT] Parsing node: QuantizeLinear_48 [QuantizeLinear]
[03/25/2022-13:24:03] [V] [TRT] Searching for input: 1220
[03/25/2022-13:24:03] [V] [TRT] Searching for input: 1221
[03/25/2022-13:24:03] [V] [TRT] Searching for input: 1222
[03/25/2022-13:24:03] [V] [TRT] QuantizeLinear_48 [QuantizeLinear] inputs: [1220 -> (-1, 64, 56, 56)[FLOAT]], [1221 -> ()[FLOAT]], [1222 -> ()[INT8]],
[03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1223 for ONNX tensor: 1223
[03/25/2022-13:24:03] [V] [TRT] QuantizeLinear_48 [QuantizeLinear] outputs: [1223 -> (-1, 64, 56, 56)[FLOAT]],
[03/25/2022-13:24:03] [V] [TRT] Parsing node: DequantizeLinear_51 [DequantizeLinear]
[03/25/2022-13:24:03] [V] [TRT] Searching for input: 1223
[03/25/2022-13:24:03] [V] [TRT] Searching for input: 1224
[03/25/2022-13:24:03] [V] [TRT] Searching for input: 1225
[03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_51 [DequantizeLinear] inputs: [1223 -> (-1, 64, 56, 56)[FLOAT]], [1224 -> ()[FLOAT]], [1225 -> ()[INT8]],
[03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1226 for ONNX tensor: 1226
[03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_51 [DequantizeLinear] outputs: [1226 -> (-1, 64, 56, 56)[FLOAT]],
[03/25/2022-13:24:03] [V] [TRT] Parsing node: Conv_58 [Conv]
[03/25/2022-13:24:03] [V] [TRT] Searching for input: 1226
[03/25/2022-13:24:03] [V] [TRT] Searching for input: 1232
[03/25/2022-13:24:03] [V] [TRT] Conv_58 [Conv] inputs: [1226 -> (-1, 64, 56, 56)[FLOAT]], [1232 -> (256, 64, 1, 1)[FLOAT]],
[03/25/2022-13:24:03] [V] [TRT] Convolution input dimensions: (-1, 64, 56, 56)
[03/25/2022-13:24:03] [V] [TRT] Kernel weights are not set yet. Kernel weights must be set using setInput(1, kernel_tensor) API call.
[03/25/2022-13:24:03] [V] [TRT] Registering layer: Conv_58 for ONNX node: Conv_58
[03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1233 for ONNX tensor: 1233
[03/25/2022-13:24:03] [V] [TRT] Conv_58 [Conv] outputs: [1233 -> (-1, 256, 56, 56)[FLOAT]],
[03/25/2022-13:24:03] [V] [TRT] Parsing node: BatchNormalization_59 [BatchNormalization]
[03/25/2022-13:24:03] [V] [TRT] Searching for input: 1233
[03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.0.0.bn3.bn.weight
[03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.0.0.bn3.bn.bias
[03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.0.0.bn3.bn.running_mean
[03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.0.0.bn3.bn.running_var
[03/25/2022-13:24:03] [V] [TRT] BatchNormalization_59 [BatchNormalization] inputs: [1233 -> (-1, 256, 56, 56)[FLOAT]], [sections.0.0.bn3.bn.weight -> (256)[FLOAT]], [sections.0.0.bn3.bn.bias -> (256)[FLOAT]], [sections.0.0.bn3.bn.running_mean -> (256)[FLOAT]], [sections.0.0.bn3.bn.running_var -> (256)[FLOAT]],
[03/25/2022-13:24:03] [V] [TRT] Registering layer: BatchNormalization_59 for ONNX node: BatchNormalization_59
[03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1234 for ONNX tensor: 1234
[03/25/2022-13:24:03] [V] [TRT] BatchNormalization_59 [BatchNormalization] outputs: [1234 -> (-1, 256, 56, 56)[FLOAT]],
[03/25/2022-13:24:03] [V] [TRT] Parsing node: Add_80 [Add]
[03/25/2022-13:24:03] [V] [TRT] Searching for input: 1254
[03/25/2022-13:24:03] [V] [TRT] Searching for input: 1234
[03/25/2022-13:24:03] [V] [TRT] Add_80 [Add] inputs: [1254 -> (-1, 256, 56, 56)[FLOAT]], [1234 -> (-1, 256, 56, 56)[FLOAT]],
[03/25/2022-13:24:03] [V] [TRT] Registering layer: Add_80 for ONNX node: Add_80
[03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1255 for ONNX tensor: 1255
[03/25/2022-13:24:03] [V] [TRT] Add_80 [Add] outputs: [1255 -> (-1, 256, 56, 56)[FLOAT]],
[03/25/2022-13:24:03] [V] [TRT] Parsing node: Relu_81 [Relu]
[03/25/2022-13:24:03] [V] [TRT] Searching for input: 1255
[03/25/2022-13:24:03] [V] [TRT] Relu_81 [Relu] inputs: [1255 -> (-1, 256, 56, 56)[FLOAT]],
[03/25/2022-13:24:03] [V] [TRT] Registering layer: Relu_81 for ONNX node: Relu_81
[03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1256 for ONNX tensor: 1256
[03/25/2022-13:24:03] [V] [TRT] Relu_81 [Relu] outputs: [1256 -> (-1, 256, 56, 56)[FLOAT]],
[03/25/2022-13:24:03] [V] [TRT] Parsing node: QuantizeLinear_84 [QuantizeLinear]
[03/25/2022-13:24:03] [V] [TRT] Searching for input: 1256
[03/25/2022-13:24:03] [V] [TRT] Searching for input: 1257
[03/25/2022-13:24:03] [V] [TRT] Searching for input: 1258
[03/25/2022-13:24:03] [V] [TRT] QuantizeLinear_84 [QuantizeLinear] inputs: [1256 -> (-1, 256, 56, 56)[FLOAT]], [1257 -> ()[FLOAT]], [1258 -> ()[INT8]],
[03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1259 for ONNX tensor: 1259
[03/25/2022-13:24:03] [V] [TRT] QuantizeLinear_84 [QuantizeLinear] outputs: [1259 -> (-1, 256, 56, 56)[FLOAT]],
[03/25/2022-13:24:03] [V] [TRT] Parsing node: QuantizeLinear_128 [QuantizeLinear]
[03/25/2022-13:24:03] [V] [TRT] Searching for input: 1256
[03/25/2022-13:24:03] [V] [TRT] Searching for input: 1301
[03/25/2022-13:24:03] [V] [TRT] Searching for input: 1302
[03/25/2022-13:24:03] [V] [TRT] QuantizeLinear_128 [QuantizeLinear] inputs: [1256 -> (-1, 256, 56, 56)[FLOAT]], [1301 -> ()[FLOAT]], [1302 -> ()[INT8]],
[03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1303 for ONNX tensor: 1303
[03/25/2022-13:24:03] [V] [TRT] QuantizeLinear_128 [QuantizeLinear] outputs: [1303 -> (-1, 256, 56, 56)[FLOAT]],
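Note that Relu_81's output (tensor 1256) feeds two separate QuantizeLinear nodes, QuantizeLinear_84 (scale initializer 1257) and QuantizeLinear_128 (scale 1301), just as MaxPool_15's output 1190 fed both QuantizeLinear_18 and QuantizeLinear_62 above. The exporter inserted one Q/DQ pair per consumer, so the bottleneck's main path and its skip/identity path can each carry an independently calibrated scale. A small numpy illustration (both scales are hypothetical stand-ins for the initializer values):

    import numpy as np

    def fake_quant(x: np.ndarray, scale: float) -> np.ndarray:
        return np.clip(np.round(x / scale), -128, 127) * scale

    y = np.random.randn(1, 256, 56, 56).astype(np.float32)  # like tensor 1256
    main_in = fake_quant(y, scale=0.020)  # stand-in for initializer 1257
    skip_in = fake_quant(y, scale=0.035)  # stand-in for initializer 1301
    # Same source tensor, two quantized views: each consumer sees the range its
    # own calibration chose, at the cost of an extra Q/DQ pair in the graph.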
56)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: DequantizeLinear_87 [DequantizeLinear] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1259 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1260 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1261 [03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_87 [DequantizeLinear] inputs: [1259 -> (-1, 256, 56, 56)[FLOAT]], [1260 -> ()[FLOAT]], [1261 -> ()[INT8]], [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1262 for ONNX tensor: 1262 [03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_87 [DequantizeLinear] outputs: [1262 -> (-1, 256, 56, 56)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: DequantizeLinear_131 [DequantizeLinear] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1303 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1304 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1305 [03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_131 [DequantizeLinear] inputs: [1303 -> (-1, 256, 56, 56)[FLOAT]], [1304 -> ()[FLOAT]], [1305 -> ()[INT8]], [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1306 for ONNX tensor: 1306 [03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_131 [DequantizeLinear] outputs: [1306 -> (-1, 256, 56, 56)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: Conv_94 [Conv] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1262 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1268 [03/25/2022-13:24:03] [V] [TRT] Conv_94 [Conv] inputs: [1262 -> (-1, 256, 56, 56)[FLOAT]], [1268 -> (64, 256, 1, 1)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Convolution input dimensions: (-1, 256, 56, 56) [03/25/2022-13:24:03] [V] [TRT] Kernel weights are not set yet. Kernel weights must be set using setInput(1, kernel_tensor) API call. [03/25/2022-13:24:03] [V] [TRT] Registering layer: Conv_94 for ONNX node: Conv_94 [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1269 for ONNX tensor: 1269 [03/25/2022-13:24:03] [V] [TRT] Conv_94 [Conv] outputs: [1269 -> (-1, 64, 56, 56)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: BatchNormalization_95 [BatchNormalization] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1269 [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.0.1.bn1.bn.weight [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.0.1.bn1.bn.bias [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.0.1.bn1.bn.running_mean [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.0.1.bn1.bn.running_var [03/25/2022-13:24:03] [V] [TRT] BatchNormalization_95 [BatchNormalization] inputs: [1269 -> (-1, 64, 56, 56)[FLOAT]], [sections.0.1.bn1.bn.weight -> (64)[FLOAT]], [sections.0.1.bn1.bn.bias -> (64)[FLOAT]], [sections.0.1.bn1.bn.running_mean -> (64)[FLOAT]], [sections.0.1.bn1.bn.running_var -> (64)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Registering layer: BatchNormalization_95 for ONNX node: BatchNormalization_95 [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1270 for ONNX tensor: 1270 [03/25/2022-13:24:03] [V] [TRT] BatchNormalization_95 [BatchNormalization] outputs: [1270 -> (-1, 64, 56, 56)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: Relu_96 [Relu] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1270 [03/25/2022-13:24:03] [V] [TRT] Relu_96 [Relu] inputs: [1270 -> (-1, 64, 56, 56)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Registering layer: Relu_96 for ONNX node: Relu_96 [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1271 for ONNX tensor: 1271 [03/25/2022-13:24:03] [V] [TRT] Relu_96 [Relu] outputs: [1271 -> (-1, 64, 56, 
56)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: QuantizeLinear_99 [QuantizeLinear] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1271 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1272 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1273 [03/25/2022-13:24:03] [V] [TRT] QuantizeLinear_99 [QuantizeLinear] inputs: [1271 -> (-1, 64, 56, 56)[FLOAT]], [1272 -> ()[FLOAT]], [1273 -> ()[INT8]], [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1274 for ONNX tensor: 1274 [03/25/2022-13:24:03] [V] [TRT] QuantizeLinear_99 [QuantizeLinear] outputs: [1274 -> (-1, 64, 56, 56)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: DequantizeLinear_102 [DequantizeLinear] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1274 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1275 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1276 [03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_102 [DequantizeLinear] inputs: [1274 -> (-1, 64, 56, 56)[FLOAT]], [1275 -> ()[FLOAT]], [1276 -> ()[INT8]], [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1277 for ONNX tensor: 1277 [03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_102 [DequantizeLinear] outputs: [1277 -> (-1, 64, 56, 56)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: Conv_109 [Conv] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1277 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1283 [03/25/2022-13:24:03] [V] [TRT] Conv_109 [Conv] inputs: [1277 -> (-1, 64, 56, 56)[FLOAT]], [1283 -> (64, 64, 3, 3)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Convolution input dimensions: (-1, 64, 56, 56) [03/25/2022-13:24:03] [V] [TRT] Kernel weights are not set yet. Kernel weights must be set using setInput(1, kernel_tensor) API call. [03/25/2022-13:24:03] [V] [TRT] Registering layer: Conv_109 for ONNX node: Conv_109 [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1284 for ONNX tensor: 1284 [03/25/2022-13:24:03] [V] [TRT] Conv_109 [Conv] outputs: [1284 -> (-1, 64, 56, 56)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: BatchNormalization_110 [BatchNormalization] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1284 [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.0.1.bn2.bn.weight [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.0.1.bn2.bn.bias [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.0.1.bn2.bn.running_mean [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.0.1.bn2.bn.running_var [03/25/2022-13:24:03] [V] [TRT] BatchNormalization_110 [BatchNormalization] inputs: [1284 -> (-1, 64, 56, 56)[FLOAT]], [sections.0.1.bn2.bn.weight -> (64)[FLOAT]], [sections.0.1.bn2.bn.bias -> (64)[FLOAT]], [sections.0.1.bn2.bn.running_mean -> (64)[FLOAT]], [sections.0.1.bn2.bn.running_var -> (64)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Registering layer: BatchNormalization_110 for ONNX node: BatchNormalization_110 [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1285 for ONNX tensor: 1285 [03/25/2022-13:24:03] [V] [TRT] BatchNormalization_110 [BatchNormalization] outputs: [1285 -> (-1, 64, 56, 56)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: Relu_111 [Relu] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1285 [03/25/2022-13:24:03] [V] [TRT] Relu_111 [Relu] inputs: [1285 -> (-1, 64, 56, 56)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Registering layer: Relu_111 for ONNX node: Relu_111 [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1286 for ONNX tensor: 1286 [03/25/2022-13:24:03] [V] [TRT] Relu_111 [Relu] outputs: [1286 -> (-1, 64, 56, 56)[FLOAT]], 
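The QuantizeLinear/DequantizeLinear pairs the parser keeps walking (QuantizeLinear_99 feeding DequantizeLinear_102 above, and so on) are ONNX explicit quantization: each node takes the activation plus a scalar FP32 scale and a scalar INT8 zero point (the ()-shaped inputs such as 1272 and 1273), and TensorRT uses these pairs to decide where its INT8 regions begin and end. A minimal numpy sketch of the round-trip the two operators define, with an illustrative scale (the real values live in the ONNX initializers):

    import numpy as np

    def quantize_linear(x, scale, zero_point=np.int8(0)):
        # ONNX QuantizeLinear: saturate(round(x / scale) + zero_point) to INT8
        q = np.round(x / scale) + np.int32(zero_point)
        return np.clip(q, -128, 127).astype(np.int8)

    def dequantize_linear(q, scale, zero_point=np.int8(0)):
        # ONNX DequantizeLinear: (q - zero_point) * scale
        return (q.astype(np.int32) - np.int32(zero_point)) * scale

    x = np.array([0.0, 0.05, 1.3, -0.7], dtype=np.float32)
    scale = np.float32(0.01)  # illustrative per-tensor scale
    x_fq = dequantize_linear(quantize_linear(x, scale), scale)

At parse time the Q/DQ nodes are kept as placeholders, which is why their outputs above are still registered as FLOAT tensors; the actual INT8 precision assignments are made later, during engine building.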
[03/25/2022-13:24:03] [V] [TRT] Parsing node: QuantizeLinear_114 [QuantizeLinear] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1286 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1287 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1288 [03/25/2022-13:24:03] [V] [TRT] QuantizeLinear_114 [QuantizeLinear] inputs: [1286 -> (-1, 64, 56, 56)[FLOAT]], [1287 -> ()[FLOAT]], [1288 -> ()[INT8]], [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1289 for ONNX tensor: 1289 [03/25/2022-13:24:03] [V] [TRT] QuantizeLinear_114 [QuantizeLinear] outputs: [1289 -> (-1, 64, 56, 56)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: DequantizeLinear_117 [DequantizeLinear] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1289 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1290 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1291 [03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_117 [DequantizeLinear] inputs: [1289 -> (-1, 64, 56, 56)[FLOAT]], [1290 -> ()[FLOAT]], [1291 -> ()[INT8]], [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1292 for ONNX tensor: 1292 [03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_117 [DequantizeLinear] outputs: [1292 -> (-1, 64, 56, 56)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: Conv_124 [Conv] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1292 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1298 [03/25/2022-13:24:03] [V] [TRT] Conv_124 [Conv] inputs: [1292 -> (-1, 64, 56, 56)[FLOAT]], [1298 -> (256, 64, 1, 1)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Convolution input dimensions: (-1, 64, 56, 56) [03/25/2022-13:24:03] [V] [TRT] Kernel weights are not set yet. Kernel weights must be set using setInput(1, kernel_tensor) API call. [03/25/2022-13:24:03] [V] [TRT] Registering layer: Conv_124 for ONNX node: Conv_124 [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1299 for ONNX tensor: 1299 [03/25/2022-13:24:03] [V] [TRT] Conv_124 [Conv] outputs: [1299 -> (-1, 256, 56, 56)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: BatchNormalization_125 [BatchNormalization] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1299 [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.0.1.bn3.bn.weight [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.0.1.bn3.bn.bias [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.0.1.bn3.bn.running_mean [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.0.1.bn3.bn.running_var [03/25/2022-13:24:03] [V] [TRT] BatchNormalization_125 [BatchNormalization] inputs: [1299 -> (-1, 256, 56, 56)[FLOAT]], [sections.0.1.bn3.bn.weight -> (256)[FLOAT]], [sections.0.1.bn3.bn.bias -> (256)[FLOAT]], [sections.0.1.bn3.bn.running_mean -> (256)[FLOAT]], [sections.0.1.bn3.bn.running_var -> (256)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Registering layer: BatchNormalization_125 for ONNX node: BatchNormalization_125 [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1300 for ONNX tensor: 1300 [03/25/2022-13:24:03] [V] [TRT] BatchNormalization_125 [BatchNormalization] outputs: [1300 -> (-1, 256, 56, 56)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: Add_132 [Add] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1306 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1300 [03/25/2022-13:24:03] [V] [TRT] Add_132 [Add] inputs: [1306 -> (-1, 256, 56, 56)[FLOAT]], [1300 -> (-1, 256, 56, 56)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Registering layer: Add_132 for ONNX node: Add_132 [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1307 for ONNX tensor: 1307 
[03/25/2022-13:24:03] [V] [TRT] Add_132 [Add] outputs: [1307 -> (-1, 256, 56, 56)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: Relu_133 [Relu] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1307 [03/25/2022-13:24:03] [V] [TRT] Relu_133 [Relu] inputs: [1307 -> (-1, 256, 56, 56)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Registering layer: Relu_133 for ONNX node: Relu_133 [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1308 for ONNX tensor: 1308 [03/25/2022-13:24:03] [V] [TRT] Relu_133 [Relu] outputs: [1308 -> (-1, 256, 56, 56)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: QuantizeLinear_136 [QuantizeLinear] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1308 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1309 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1310 [03/25/2022-13:24:03] [V] [TRT] QuantizeLinear_136 [QuantizeLinear] inputs: [1308 -> (-1, 256, 56, 56)[FLOAT]], [1309 -> ()[FLOAT]], [1310 -> ()[INT8]], [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1311 for ONNX tensor: 1311 [03/25/2022-13:24:03] [V] [TRT] QuantizeLinear_136 [QuantizeLinear] outputs: [1311 -> (-1, 256, 56, 56)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: QuantizeLinear_180 [QuantizeLinear] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1308 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1353 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1354 [03/25/2022-13:24:03] [V] [TRT] QuantizeLinear_180 [QuantizeLinear] inputs: [1308 -> (-1, 256, 56, 56)[FLOAT]], [1353 -> ()[FLOAT]], [1354 -> ()[INT8]], [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1355 for ONNX tensor: 1355 [03/25/2022-13:24:03] [V] [TRT] QuantizeLinear_180 [QuantizeLinear] outputs: [1355 -> (-1, 256, 56, 56)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: DequantizeLinear_139 [DequantizeLinear] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1311 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1312 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1313 [03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_139 [DequantizeLinear] inputs: [1311 -> (-1, 256, 56, 56)[FLOAT]], [1312 -> ()[FLOAT]], [1313 -> ()[INT8]], [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1314 for ONNX tensor: 1314 [03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_139 [DequantizeLinear] outputs: [1314 -> (-1, 256, 56, 56)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: DequantizeLinear_183 [DequantizeLinear] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1355 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1356 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1357 [03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_183 [DequantizeLinear] inputs: [1355 -> (-1, 256, 56, 56)[FLOAT]], [1356 -> ()[FLOAT]], [1357 -> ()[INT8]], [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1358 for ONNX tensor: 1358 [03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_183 [DequantizeLinear] outputs: [1358 -> (-1, 256, 56, 56)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: Conv_146 [Conv] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1314 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1320 [03/25/2022-13:24:03] [V] [TRT] Conv_146 [Conv] inputs: [1314 -> (-1, 256, 56, 56)[FLOAT]], [1320 -> (64, 256, 1, 1)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Convolution input dimensions: (-1, 256, 56, 56) [03/25/2022-13:24:03] [V] [TRT] Kernel weights are not set yet. Kernel weights must be set using setInput(1, kernel_tensor) API call. 
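Taken together, Conv_94 (1x1, 256->64), Conv_109 (3x3, 64->64) and Conv_124 (1x1, 64->256), closed by Add_132 and Relu_133 against the block input, are the standard ResNet bottleneck; the sections.0.1.* parameter names mark this as the second block of the first stage. A schematic PyTorch sketch of that structure, with the Q/DQ wrappers omitted and illustrative attribute names:

    import torch.nn as nn

    class Bottleneck(nn.Module):
        # The pattern in the trace: 1x1 reduce, 3x3, 1x1 expand, each with a
        # separate BatchNorm (left unfused in the exported ONNX graph), then
        # a residual add and a final ReLU.
        def __init__(self, channels=256, hidden=64):
            super().__init__()
            self.conv1 = nn.Conv2d(channels, hidden, 1, bias=False)
            self.bn1 = nn.BatchNorm2d(hidden)
            self.conv2 = nn.Conv2d(hidden, hidden, 3, padding=1, bias=False)
            self.bn2 = nn.BatchNorm2d(hidden)
            self.conv3 = nn.Conv2d(hidden, channels, 1, bias=False)
            self.bn3 = nn.BatchNorm2d(channels)
            self.relu = nn.ReLU(inplace=True)

        def forward(self, x):
            out = self.relu(self.bn1(self.conv1(x)))
            out = self.relu(self.bn2(self.conv2(out)))
            out = self.bn3(self.conv3(out))
            return self.relu(out + x)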
[03/25/2022-13:24:03] [V] [TRT] Registering layer: Conv_146 for ONNX node: Conv_146 [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1321 for ONNX tensor: 1321 [03/25/2022-13:24:03] [V] [TRT] Conv_146 [Conv] outputs: [1321 -> (-1, 64, 56, 56)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: BatchNormalization_147 [BatchNormalization] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1321 [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.0.2.bn1.bn.weight [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.0.2.bn1.bn.bias [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.0.2.bn1.bn.running_mean [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.0.2.bn1.bn.running_var [03/25/2022-13:24:03] [V] [TRT] BatchNormalization_147 [BatchNormalization] inputs: [1321 -> (-1, 64, 56, 56)[FLOAT]], [sections.0.2.bn1.bn.weight -> (64)[FLOAT]], [sections.0.2.bn1.bn.bias -> (64)[FLOAT]], [sections.0.2.bn1.bn.running_mean -> (64)[FLOAT]], [sections.0.2.bn1.bn.running_var -> (64)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Registering layer: BatchNormalization_147 for ONNX node: BatchNormalization_147 [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1322 for ONNX tensor: 1322 [03/25/2022-13:24:03] [V] [TRT] BatchNormalization_147 [BatchNormalization] outputs: [1322 -> (-1, 64, 56, 56)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: Relu_148 [Relu] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1322 [03/25/2022-13:24:03] [V] [TRT] Relu_148 [Relu] inputs: [1322 -> (-1, 64, 56, 56)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Registering layer: Relu_148 for ONNX node: Relu_148 [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1323 for ONNX tensor: 1323 [03/25/2022-13:24:03] [V] [TRT] Relu_148 [Relu] outputs: [1323 -> (-1, 64, 56, 56)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: QuantizeLinear_151 [QuantizeLinear] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1323 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1324 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1325 [03/25/2022-13:24:03] [V] [TRT] QuantizeLinear_151 [QuantizeLinear] inputs: [1323 -> (-1, 64, 56, 56)[FLOAT]], [1324 -> ()[FLOAT]], [1325 -> ()[INT8]], [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1326 for ONNX tensor: 1326 [03/25/2022-13:24:03] [V] [TRT] QuantizeLinear_151 [QuantizeLinear] outputs: [1326 -> (-1, 64, 56, 56)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: DequantizeLinear_154 [DequantizeLinear] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1326 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1327 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1328 [03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_154 [DequantizeLinear] inputs: [1326 -> (-1, 64, 56, 56)[FLOAT]], [1327 -> ()[FLOAT]], [1328 -> ()[INT8]], [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1329 for ONNX tensor: 1329 [03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_154 [DequantizeLinear] outputs: [1329 -> (-1, 64, 56, 56)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: Conv_161 [Conv] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1329 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1335 [03/25/2022-13:24:03] [V] [TRT] Conv_161 [Conv] inputs: [1329 -> (-1, 64, 56, 56)[FLOAT]], [1335 -> (64, 64, 3, 3)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Convolution input dimensions: (-1, 64, 56, 56) [03/25/2022-13:24:03] [V] [TRT] Kernel weights are not set yet. Kernel weights must be set using setInput(1, kernel_tensor) API call. 
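The recurring "Kernel weights are not set yet. Kernel weights must be set using setInput(1, kernel_tensor)" message is informational, not an error: in an explicitly quantized graph the convolution kernels arrive through their own QuantizeLinear/DequantizeLinear chain, so the parser attaches them as the layer's second input rather than as baked-in weights. A hedged sketch of that construction with the TensorRT 8.x Python bindings (shapes and the unit scale are illustrative):

    import numpy as np
    import tensorrt as trt  # assumes the TensorRT 8.x Python API

    logger = trt.Logger(trt.Logger.VERBOSE)
    builder = trt.Builder(logger)
    network = builder.create_network(
        1 << int(trt.NetworkDefinitionCreationFlag.EXPLICIT_BATCH))
    inp = network.add_input("input", trt.float32, (-1, 256, 56, 56))

    # FP32 kernel constant routed through a Q/DQ pair, as in the parsed graph.
    kernel = network.add_constant(
        (64, 256, 1, 1), trt.Weights(np.zeros((64, 256, 1, 1), np.float32)))
    scale = network.add_constant((), trt.Weights(np.ones(1, np.float32)))
    q = network.add_quantize(kernel.get_output(0), scale.get_output(0))
    dq = network.add_dequantize(q.get_output(0), scale.get_output(0))

    # Empty placeholder weights; the real kernel is attached as input 1,
    # which is exactly what the parser is announcing above.
    conv = network.add_convolution_nd(inp, 64, (1, 1), trt.Weights())
    conv.set_input(1, dq.get_output(0))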
[03/25/2022-13:24:03] [V] [TRT] Registering layer: Conv_161 for ONNX node: Conv_161 [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1336 for ONNX tensor: 1336 [03/25/2022-13:24:03] [V] [TRT] Conv_161 [Conv] outputs: [1336 -> (-1, 64, 56, 56)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: BatchNormalization_162 [BatchNormalization] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1336 [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.0.2.bn2.bn.weight [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.0.2.bn2.bn.bias [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.0.2.bn2.bn.running_mean [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.0.2.bn2.bn.running_var [03/25/2022-13:24:03] [V] [TRT] BatchNormalization_162 [BatchNormalization] inputs: [1336 -> (-1, 64, 56, 56)[FLOAT]], [sections.0.2.bn2.bn.weight -> (64)[FLOAT]], [sections.0.2.bn2.bn.bias -> (64)[FLOAT]], [sections.0.2.bn2.bn.running_mean -> (64)[FLOAT]], [sections.0.2.bn2.bn.running_var -> (64)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Registering layer: BatchNormalization_162 for ONNX node: BatchNormalization_162 [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1337 for ONNX tensor: 1337 [03/25/2022-13:24:03] [V] [TRT] BatchNormalization_162 [BatchNormalization] outputs: [1337 -> (-1, 64, 56, 56)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: Relu_163 [Relu] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1337 [03/25/2022-13:24:03] [V] [TRT] Relu_163 [Relu] inputs: [1337 -> (-1, 64, 56, 56)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Registering layer: Relu_163 for ONNX node: Relu_163 [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1338 for ONNX tensor: 1338 [03/25/2022-13:24:03] [V] [TRT] Relu_163 [Relu] outputs: [1338 -> (-1, 64, 56, 56)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: QuantizeLinear_166 [QuantizeLinear] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1338 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1339 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1340 [03/25/2022-13:24:03] [V] [TRT] QuantizeLinear_166 [QuantizeLinear] inputs: [1338 -> (-1, 64, 56, 56)[FLOAT]], [1339 -> ()[FLOAT]], [1340 -> ()[INT8]], [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1341 for ONNX tensor: 1341 [03/25/2022-13:24:03] [V] [TRT] QuantizeLinear_166 [QuantizeLinear] outputs: [1341 -> (-1, 64, 56, 56)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: DequantizeLinear_169 [DequantizeLinear] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1341 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1342 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1343 [03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_169 [DequantizeLinear] inputs: [1341 -> (-1, 64, 56, 56)[FLOAT]], [1342 -> ()[FLOAT]], [1343 -> ()[INT8]], [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1344 for ONNX tensor: 1344 [03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_169 [DequantizeLinear] outputs: [1344 -> (-1, 64, 56, 56)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: Conv_176 [Conv] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1344 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1350 [03/25/2022-13:24:03] [V] [TRT] Conv_176 [Conv] inputs: [1344 -> (-1, 64, 56, 56)[FLOAT]], [1350 -> (256, 64, 1, 1)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Convolution input dimensions: (-1, 64, 56, 56) [03/25/2022-13:24:03] [V] [TRT] Kernel weights are not set yet. Kernel weights must be set using setInput(1, kernel_tensor) API call. 
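Each Conv is still followed by a standalone BatchNormalization whose weight, bias, running_mean and running_var initializers the parser resolves by name (sections.0.2.bn2.* just above); the exporter leaves BN unfused so it sits outside the Q/DQ pairs, and TensorRT folds it into the neighboring convolution when it optimizes the engine. The fold is only an affine rescale of the kernel plus a bias; a numpy sketch under the usual epsilon assumption:

    import numpy as np

    def fold_bn_into_conv(w, gamma, beta, mean, var, eps=1e-5):
        # w: (out_ch, in_ch, kh, kw) kernel of a bias-free conv;
        # gamma/beta/mean/var: per-channel BN parameters of shape (out_ch,).
        # Returns the fused kernel and bias equivalent to conv followed by BN.
        s = gamma / np.sqrt(var + eps)
        return w * s[:, None, None, None], beta - mean * s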
[03/25/2022-13:24:03] [V] [TRT] Registering layer: Conv_176 for ONNX node: Conv_176 [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1351 for ONNX tensor: 1351 [03/25/2022-13:24:03] [V] [TRT] Conv_176 [Conv] outputs: [1351 -> (-1, 256, 56, 56)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: BatchNormalization_177 [BatchNormalization] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1351 [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.0.2.bn3.bn.weight [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.0.2.bn3.bn.bias [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.0.2.bn3.bn.running_mean [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.0.2.bn3.bn.running_var [03/25/2022-13:24:03] [V] [TRT] BatchNormalization_177 [BatchNormalization] inputs: [1351 -> (-1, 256, 56, 56)[FLOAT]], [sections.0.2.bn3.bn.weight -> (256)[FLOAT]], [sections.0.2.bn3.bn.bias -> (256)[FLOAT]], [sections.0.2.bn3.bn.running_mean -> (256)[FLOAT]], [sections.0.2.bn3.bn.running_var -> (256)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Registering layer: BatchNormalization_177 for ONNX node: BatchNormalization_177 [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1352 for ONNX tensor: 1352 [03/25/2022-13:24:03] [V] [TRT] BatchNormalization_177 [BatchNormalization] outputs: [1352 -> (-1, 256, 56, 56)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: Add_184 [Add] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1358 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1352 [03/25/2022-13:24:03] [V] [TRT] Add_184 [Add] inputs: [1358 -> (-1, 256, 56, 56)[FLOAT]], [1352 -> (-1, 256, 56, 56)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Registering layer: Add_184 for ONNX node: Add_184 [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1359 for ONNX tensor: 1359 [03/25/2022-13:24:03] [V] [TRT] Add_184 [Add] outputs: [1359 -> (-1, 256, 56, 56)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: Relu_185 [Relu] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1359 [03/25/2022-13:24:03] [V] [TRT] Relu_185 [Relu] inputs: [1359 -> (-1, 256, 56, 56)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Registering layer: Relu_185 for ONNX node: Relu_185 [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1360 for ONNX tensor: 1360 [03/25/2022-13:24:03] [V] [TRT] Relu_185 [Relu] outputs: [1360 -> (-1, 256, 56, 56)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: QuantizeLinear_188 [QuantizeLinear] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1360 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1361 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1362 [03/25/2022-13:24:03] [V] [TRT] QuantizeLinear_188 [QuantizeLinear] inputs: [1360 -> (-1, 256, 56, 56)[FLOAT]], [1361 -> ()[FLOAT]], [1362 -> ()[INT8]], [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1363 for ONNX tensor: 1363 [03/25/2022-13:24:03] [V] [TRT] QuantizeLinear_188 [QuantizeLinear] outputs: [1363 -> (-1, 256, 56, 56)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: QuantizeLinear_232 [QuantizeLinear] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1360 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1405 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1406 [03/25/2022-13:24:03] [V] [TRT] QuantizeLinear_232 [QuantizeLinear] inputs: [1360 -> (-1, 256, 56, 56)[FLOAT]], [1405 -> ()[FLOAT]], [1406 -> ()[INT8]], [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1407 for ONNX tensor: 1407 [03/25/2022-13:24:03] [V] [TRT] QuantizeLinear_232 [QuantizeLinear] outputs: [1407 -> 
(-1, 256, 56, 56)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: DequantizeLinear_191 [DequantizeLinear] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1363 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1364 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1365 [03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_191 [DequantizeLinear] inputs: [1363 -> (-1, 256, 56, 56)[FLOAT]], [1364 -> ()[FLOAT]], [1365 -> ()[INT8]], [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1366 for ONNX tensor: 1366 [03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_191 [DequantizeLinear] outputs: [1366 -> (-1, 256, 56, 56)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: DequantizeLinear_235 [DequantizeLinear] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1407 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1408 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1409 [03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_235 [DequantizeLinear] inputs: [1407 -> (-1, 256, 56, 56)[FLOAT]], [1408 -> ()[FLOAT]], [1409 -> ()[INT8]], [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1410 for ONNX tensor: 1410 [03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_235 [DequantizeLinear] outputs: [1410 -> (-1, 256, 56, 56)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: Conv_198 [Conv] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1366 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1372 [03/25/2022-13:24:03] [V] [TRT] Conv_198 [Conv] inputs: [1366 -> (-1, 256, 56, 56)[FLOAT]], [1372 -> (128, 256, 1, 1)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Convolution input dimensions: (-1, 256, 56, 56) [03/25/2022-13:24:03] [V] [TRT] Kernel weights are not set yet. Kernel weights must be set using setInput(1, kernel_tensor) API call. [03/25/2022-13:24:03] [V] [TRT] Registering layer: Conv_198 for ONNX node: Conv_198 [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1373 for ONNX tensor: 1373 [03/25/2022-13:24:03] [V] [TRT] Conv_198 [Conv] outputs: [1373 -> (-1, 128, 56, 56)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: Conv_242 [Conv] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1410 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1416 [03/25/2022-13:24:03] [V] [TRT] Conv_242 [Conv] inputs: [1410 -> (-1, 256, 56, 56)[FLOAT]], [1416 -> (512, 256, 1, 1)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Convolution input dimensions: (-1, 256, 56, 56) [03/25/2022-13:24:03] [V] [TRT] Kernel weights are not set yet. Kernel weights must be set using setInput(1, kernel_tensor) API call. 
[03/25/2022-13:24:03] [V] [TRT] Registering layer: Conv_242 for ONNX node: Conv_242 [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1417 for ONNX tensor: 1417 [03/25/2022-13:24:03] [V] [TRT] Conv_242 [Conv] outputs: [1417 -> (-1, 512, 28, 28)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: BatchNormalization_199 [BatchNormalization] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1373 [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.1.0.bn1.bn.weight [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.1.0.bn1.bn.bias [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.1.0.bn1.bn.running_mean [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.1.0.bn1.bn.running_var [03/25/2022-13:24:03] [V] [TRT] BatchNormalization_199 [BatchNormalization] inputs: [1373 -> (-1, 128, 56, 56)[FLOAT]], [sections.1.0.bn1.bn.weight -> (128)[FLOAT]], [sections.1.0.bn1.bn.bias -> (128)[FLOAT]], [sections.1.0.bn1.bn.running_mean -> (128)[FLOAT]], [sections.1.0.bn1.bn.running_var -> (128)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Registering layer: BatchNormalization_199 for ONNX node: BatchNormalization_199 [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1374 for ONNX tensor: 1374 [03/25/2022-13:24:03] [V] [TRT] BatchNormalization_199 [BatchNormalization] outputs: [1374 -> (-1, 128, 56, 56)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: BatchNormalization_243 [BatchNormalization] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1417 [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.1.0.identity.bn.bn.weight [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.1.0.identity.bn.bn.bias [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.1.0.identity.bn.bn.running_mean [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.1.0.identity.bn.bn.running_var [03/25/2022-13:24:03] [V] [TRT] BatchNormalization_243 [BatchNormalization] inputs: [1417 -> (-1, 512, 28, 28)[FLOAT]], [sections.1.0.identity.bn.bn.weight -> (512)[FLOAT]], [sections.1.0.identity.bn.bn.bias -> (512)[FLOAT]], [sections.1.0.identity.bn.bn.running_mean -> (512)[FLOAT]], [sections.1.0.identity.bn.bn.running_var -> (512)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Registering layer: BatchNormalization_243 for ONNX node: BatchNormalization_243 [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1418 for ONNX tensor: 1418 [03/25/2022-13:24:03] [V] [TRT] BatchNormalization_243 [BatchNormalization] outputs: [1418 -> (-1, 512, 28, 28)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: Relu_200 [Relu] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1374 [03/25/2022-13:24:03] [V] [TRT] Relu_200 [Relu] inputs: [1374 -> (-1, 128, 56, 56)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Registering layer: Relu_200 for ONNX node: Relu_200 [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1375 for ONNX tensor: 1375 [03/25/2022-13:24:03] [V] [TRT] Relu_200 [Relu] outputs: [1375 -> (-1, 128, 56, 56)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: QuantizeLinear_246 [QuantizeLinear] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1418 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1419 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1420 [03/25/2022-13:24:03] [V] [TRT] QuantizeLinear_246 [QuantizeLinear] inputs: [1418 -> (-1, 512, 28, 28)[FLOAT]], [1419 -> ()[FLOAT]], [1420 -> ()[INT8]], [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1421 for ONNX tensor: 1421 [03/25/2022-13:24:03] [V] [TRT] QuantizeLinear_246 [QuantizeLinear] 
outputs: [1421 -> (-1, 512, 28, 28)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: QuantizeLinear_203 [QuantizeLinear] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1375 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1376 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1377 [03/25/2022-13:24:03] [V] [TRT] QuantizeLinear_203 [QuantizeLinear] inputs: [1375 -> (-1, 128, 56, 56)[FLOAT]], [1376 -> ()[FLOAT]], [1377 -> ()[INT8]], [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1378 for ONNX tensor: 1378 [03/25/2022-13:24:03] [V] [TRT] QuantizeLinear_203 [QuantizeLinear] outputs: [1378 -> (-1, 128, 56, 56)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: DequantizeLinear_249 [DequantizeLinear] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1421 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1422 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1423 [03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_249 [DequantizeLinear] inputs: [1421 -> (-1, 512, 28, 28)[FLOAT]], [1422 -> ()[FLOAT]], [1423 -> ()[INT8]], [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1424 for ONNX tensor: 1424 [03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_249 [DequantizeLinear] outputs: [1424 -> (-1, 512, 28, 28)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: DequantizeLinear_206 [DequantizeLinear] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1378 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1379 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1380 [03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_206 [DequantizeLinear] inputs: [1378 -> (-1, 128, 56, 56)[FLOAT]], [1379 -> ()[FLOAT]], [1380 -> ()[INT8]], [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1381 for ONNX tensor: 1381 [03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_206 [DequantizeLinear] outputs: [1381 -> (-1, 128, 56, 56)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: Conv_213 [Conv] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1381 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1387 [03/25/2022-13:24:03] [V] [TRT] Conv_213 [Conv] inputs: [1381 -> (-1, 128, 56, 56)[FLOAT]], [1387 -> (128, 128, 3, 3)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Convolution input dimensions: (-1, 128, 56, 56) [03/25/2022-13:24:03] [V] [TRT] Kernel weights are not set yet. Kernel weights must be set using setInput(1, kernel_tensor) API call. 
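The stage boundary shows up in the shapes: Conv_198 stays at 56x56, while the projection shortcut Conv_242 (512x256x1x1) and the main-path Conv_213 (128x128x3x3) both run at stride 2, so their outputs drop to 28x28 and the projected skip gets its own sections.1.0.identity.bn normalization. The stride and padding are not printed by the parser, but the standard output-size formula recovers them from the shapes:

    def conv_out(size, kernel, stride, padding):
        # out = floor((in + 2*pad - kernel) / stride) + 1
        return (size + 2 * padding - kernel) // stride + 1

    assert conv_out(56, kernel=1, stride=2, padding=0) == 28  # Conv_242 shortcut
    assert conv_out(56, kernel=3, stride=2, padding=1) == 28  # Conv_213 main path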
[03/25/2022-13:24:03] [V] [TRT] Registering layer: Conv_213 for ONNX node: Conv_213 [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1388 for ONNX tensor: 1388 [03/25/2022-13:24:03] [V] [TRT] Conv_213 [Conv] outputs: [1388 -> (-1, 128, 28, 28)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: BatchNormalization_214 [BatchNormalization] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1388 [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.1.0.bn2.bn.weight [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.1.0.bn2.bn.bias [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.1.0.bn2.bn.running_mean [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.1.0.bn2.bn.running_var [03/25/2022-13:24:03] [V] [TRT] BatchNormalization_214 [BatchNormalization] inputs: [1388 -> (-1, 128, 28, 28)[FLOAT]], [sections.1.0.bn2.bn.weight -> (128)[FLOAT]], [sections.1.0.bn2.bn.bias -> (128)[FLOAT]], [sections.1.0.bn2.bn.running_mean -> (128)[FLOAT]], [sections.1.0.bn2.bn.running_var -> (128)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Registering layer: BatchNormalization_214 for ONNX node: BatchNormalization_214 [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1389 for ONNX tensor: 1389 [03/25/2022-13:24:03] [V] [TRT] BatchNormalization_214 [BatchNormalization] outputs: [1389 -> (-1, 128, 28, 28)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: Relu_215 [Relu] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1389 [03/25/2022-13:24:03] [V] [TRT] Relu_215 [Relu] inputs: [1389 -> (-1, 128, 28, 28)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Registering layer: Relu_215 for ONNX node: Relu_215 [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1390 for ONNX tensor: 1390 [03/25/2022-13:24:03] [V] [TRT] Relu_215 [Relu] outputs: [1390 -> (-1, 128, 28, 28)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: QuantizeLinear_218 [QuantizeLinear] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1390 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1391 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1392 [03/25/2022-13:24:03] [V] [TRT] QuantizeLinear_218 [QuantizeLinear] inputs: [1390 -> (-1, 128, 28, 28)[FLOAT]], [1391 -> ()[FLOAT]], [1392 -> ()[INT8]], [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1393 for ONNX tensor: 1393 [03/25/2022-13:24:03] [V] [TRT] QuantizeLinear_218 [QuantizeLinear] outputs: [1393 -> (-1, 128, 28, 28)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: DequantizeLinear_221 [DequantizeLinear] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1393 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1394 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1395 [03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_221 [DequantizeLinear] inputs: [1393 -> (-1, 128, 28, 28)[FLOAT]], [1394 -> ()[FLOAT]], [1395 -> ()[INT8]], [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1396 for ONNX tensor: 1396 [03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_221 [DequantizeLinear] outputs: [1396 -> (-1, 128, 28, 28)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: Conv_228 [Conv] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1396 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1402 [03/25/2022-13:24:03] [V] [TRT] Conv_228 [Conv] inputs: [1396 -> (-1, 128, 28, 28)[FLOAT]], [1402 -> (512, 128, 1, 1)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Convolution input dimensions: (-1, 128, 28, 28) [03/25/2022-13:24:03] [V] [TRT] Kernel weights are not set yet. 
Kernel weights must be set using setInput(1, kernel_tensor) API call. [03/25/2022-13:24:03] [V] [TRT] Registering layer: Conv_228 for ONNX node: Conv_228 [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1403 for ONNX tensor: 1403 [03/25/2022-13:24:03] [V] [TRT] Conv_228 [Conv] outputs: [1403 -> (-1, 512, 28, 28)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: BatchNormalization_229 [BatchNormalization] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1403 [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.1.0.bn3.bn.weight [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.1.0.bn3.bn.bias [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.1.0.bn3.bn.running_mean [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.1.0.bn3.bn.running_var [03/25/2022-13:24:03] [V] [TRT] BatchNormalization_229 [BatchNormalization] inputs: [1403 -> (-1, 512, 28, 28)[FLOAT]], [sections.1.0.bn3.bn.weight -> (512)[FLOAT]], [sections.1.0.bn3.bn.bias -> (512)[FLOAT]], [sections.1.0.bn3.bn.running_mean -> (512)[FLOAT]], [sections.1.0.bn3.bn.running_var -> (512)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Registering layer: BatchNormalization_229 for ONNX node: BatchNormalization_229 [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1404 for ONNX tensor: 1404 [03/25/2022-13:24:03] [V] [TRT] BatchNormalization_229 [BatchNormalization] outputs: [1404 -> (-1, 512, 28, 28)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: Add_250 [Add] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1424 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1404 [03/25/2022-13:24:03] [V] [TRT] Add_250 [Add] inputs: [1424 -> (-1, 512, 28, 28)[FLOAT]], [1404 -> (-1, 512, 28, 28)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Registering layer: Add_250 for ONNX node: Add_250 [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1425 for ONNX tensor: 1425 [03/25/2022-13:24:03] [V] [TRT] Add_250 [Add] outputs: [1425 -> (-1, 512, 28, 28)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: Relu_251 [Relu] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1425 [03/25/2022-13:24:03] [V] [TRT] Relu_251 [Relu] inputs: [1425 -> (-1, 512, 28, 28)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Registering layer: Relu_251 for ONNX node: Relu_251 [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1426 for ONNX tensor: 1426 [03/25/2022-13:24:03] [V] [TRT] Relu_251 [Relu] outputs: [1426 -> (-1, 512, 28, 28)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: QuantizeLinear_254 [QuantizeLinear] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1426 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1427 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1428 [03/25/2022-13:24:03] [V] [TRT] QuantizeLinear_254 [QuantizeLinear] inputs: [1426 -> (-1, 512, 28, 28)[FLOAT]], [1427 -> ()[FLOAT]], [1428 -> ()[INT8]], [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1429 for ONNX tensor: 1429 [03/25/2022-13:24:03] [V] [TRT] QuantizeLinear_254 [QuantizeLinear] outputs: [1429 -> (-1, 512, 28, 28)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: QuantizeLinear_298 [QuantizeLinear] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1426 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1471 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1472 [03/25/2022-13:24:03] [V] [TRT] QuantizeLinear_298 [QuantizeLinear] inputs: [1426 -> (-1, 512, 28, 28)[FLOAT]], [1471 -> ()[FLOAT]], [1472 -> ()[INT8]], [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1473 for ONNX tensor: 1473 
[03/25/2022-13:24:03] [V] [TRT] QuantizeLinear_298 [QuantizeLinear] outputs: [1473 -> (-1, 512, 28, 28)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: DequantizeLinear_257 [DequantizeLinear] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1429 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1430 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1431 [03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_257 [DequantizeLinear] inputs: [1429 -> (-1, 512, 28, 28)[FLOAT]], [1430 -> ()[FLOAT]], [1431 -> ()[INT8]], [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1432 for ONNX tensor: 1432 [03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_257 [DequantizeLinear] outputs: [1432 -> (-1, 512, 28, 28)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: DequantizeLinear_301 [DequantizeLinear] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1473 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1474 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1475 [03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_301 [DequantizeLinear] inputs: [1473 -> (-1, 512, 28, 28)[FLOAT]], [1474 -> ()[FLOAT]], [1475 -> ()[INT8]], [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1476 for ONNX tensor: 1476 [03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_301 [DequantizeLinear] outputs: [1476 -> (-1, 512, 28, 28)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: Conv_264 [Conv] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1432 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1438 [03/25/2022-13:24:03] [V] [TRT] Conv_264 [Conv] inputs: [1432 -> (-1, 512, 28, 28)[FLOAT]], [1438 -> (128, 512, 1, 1)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Convolution input dimensions: (-1, 512, 28, 28) [03/25/2022-13:24:03] [V] [TRT] Kernel weights are not set yet. Kernel weights must be set using setInput(1, kernel_tensor) API call. 
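Note that Relu_251's output (tensor 1426) is quantized twice, by QuantizeLinear_254 and QuantizeLinear_298 with independent scale inputs: one INT8 encoding feeds the next block's Conv_264, the other travels down the skip path to Add_302. QAT exporters typically give each consumer its own observer, so the same float activation can legitimately carry two different INT8 representations; a small numpy illustration with made-up scales:

    import numpy as np

    x = np.array([0.9, -0.4, 0.07], dtype=np.float32)     # shared ReLU output
    for scale in (np.float32(0.008), np.float32(0.011)):  # one per consumer
        q = np.clip(np.round(x / scale), -128, 127).astype(np.int8)
        print(scale, q)  # two encodings of the same activation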
[03/25/2022-13:24:03] [V] [TRT] Registering layer: Conv_264 for ONNX node: Conv_264 [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1439 for ONNX tensor: 1439 [03/25/2022-13:24:03] [V] [TRT] Conv_264 [Conv] outputs: [1439 -> (-1, 128, 28, 28)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: BatchNormalization_265 [BatchNormalization] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1439 [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.1.1.bn1.bn.weight [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.1.1.bn1.bn.bias [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.1.1.bn1.bn.running_mean [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.1.1.bn1.bn.running_var [03/25/2022-13:24:03] [V] [TRT] BatchNormalization_265 [BatchNormalization] inputs: [1439 -> (-1, 128, 28, 28)[FLOAT]], [sections.1.1.bn1.bn.weight -> (128)[FLOAT]], [sections.1.1.bn1.bn.bias -> (128)[FLOAT]], [sections.1.1.bn1.bn.running_mean -> (128)[FLOAT]], [sections.1.1.bn1.bn.running_var -> (128)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Registering layer: BatchNormalization_265 for ONNX node: BatchNormalization_265 [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1440 for ONNX tensor: 1440 [03/25/2022-13:24:03] [V] [TRT] BatchNormalization_265 [BatchNormalization] outputs: [1440 -> (-1, 128, 28, 28)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: Relu_266 [Relu] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1440 [03/25/2022-13:24:03] [V] [TRT] Relu_266 [Relu] inputs: [1440 -> (-1, 128, 28, 28)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Registering layer: Relu_266 for ONNX node: Relu_266 [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1441 for ONNX tensor: 1441 [03/25/2022-13:24:03] [V] [TRT] Relu_266 [Relu] outputs: [1441 -> (-1, 128, 28, 28)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: QuantizeLinear_269 [QuantizeLinear] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1441 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1442 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1443 [03/25/2022-13:24:03] [V] [TRT] QuantizeLinear_269 [QuantizeLinear] inputs: [1441 -> (-1, 128, 28, 28)[FLOAT]], [1442 -> ()[FLOAT]], [1443 -> ()[INT8]], [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1444 for ONNX tensor: 1444 [03/25/2022-13:24:03] [V] [TRT] QuantizeLinear_269 [QuantizeLinear] outputs: [1444 -> (-1, 128, 28, 28)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: DequantizeLinear_272 [DequantizeLinear] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1444 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1445 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1446 [03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_272 [DequantizeLinear] inputs: [1444 -> (-1, 128, 28, 28)[FLOAT]], [1445 -> ()[FLOAT]], [1446 -> ()[INT8]], [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1447 for ONNX tensor: 1447 [03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_272 [DequantizeLinear] outputs: [1447 -> (-1, 128, 28, 28)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: Conv_279 [Conv] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1447 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1453 [03/25/2022-13:24:03] [V] [TRT] Conv_279 [Conv] inputs: [1447 -> (-1, 128, 28, 28)[FLOAT]], [1453 -> (128, 128, 3, 3)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Convolution input dimensions: (-1, 128, 28, 28) [03/25/2022-13:24:03] [V] [TRT] Kernel weights are not set yet. 
Kernel weights must be set using setInput(1, kernel_tensor) API call. [03/25/2022-13:24:03] [V] [TRT] Registering layer: Conv_279 for ONNX node: Conv_279 [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1454 for ONNX tensor: 1454 [03/25/2022-13:24:03] [V] [TRT] Conv_279 [Conv] outputs: [1454 -> (-1, 128, 28, 28)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: BatchNormalization_280 [BatchNormalization] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1454 [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.1.1.bn2.bn.weight [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.1.1.bn2.bn.bias [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.1.1.bn2.bn.running_mean [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.1.1.bn2.bn.running_var [03/25/2022-13:24:03] [V] [TRT] BatchNormalization_280 [BatchNormalization] inputs: [1454 -> (-1, 128, 28, 28)[FLOAT]], [sections.1.1.bn2.bn.weight -> (128)[FLOAT]], [sections.1.1.bn2.bn.bias -> (128)[FLOAT]], [sections.1.1.bn2.bn.running_mean -> (128)[FLOAT]], [sections.1.1.bn2.bn.running_var -> (128)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Registering layer: BatchNormalization_280 for ONNX node: BatchNormalization_280 [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1455 for ONNX tensor: 1455 [03/25/2022-13:24:03] [V] [TRT] BatchNormalization_280 [BatchNormalization] outputs: [1455 -> (-1, 128, 28, 28)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: Relu_281 [Relu] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1455 [03/25/2022-13:24:03] [V] [TRT] Relu_281 [Relu] inputs: [1455 -> (-1, 128, 28, 28)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Registering layer: Relu_281 for ONNX node: Relu_281 [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1456 for ONNX tensor: 1456 [03/25/2022-13:24:03] [V] [TRT] Relu_281 [Relu] outputs: [1456 -> (-1, 128, 28, 28)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: QuantizeLinear_284 [QuantizeLinear] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1456 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1457 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1458 [03/25/2022-13:24:03] [V] [TRT] QuantizeLinear_284 [QuantizeLinear] inputs: [1456 -> (-1, 128, 28, 28)[FLOAT]], [1457 -> ()[FLOAT]], [1458 -> ()[INT8]], [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1459 for ONNX tensor: 1459 [03/25/2022-13:24:03] [V] [TRT] QuantizeLinear_284 [QuantizeLinear] outputs: [1459 -> (-1, 128, 28, 28)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: DequantizeLinear_287 [DequantizeLinear] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1459 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1460 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1461 [03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_287 [DequantizeLinear] inputs: [1459 -> (-1, 128, 28, 28)[FLOAT]], [1460 -> ()[FLOAT]], [1461 -> ()[INT8]], [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1462 for ONNX tensor: 1462 [03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_287 [DequantizeLinear] outputs: [1462 -> (-1, 128, 28, 28)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: Conv_294 [Conv] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1462 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1468 [03/25/2022-13:24:03] [V] [TRT] Conv_294 [Conv] inputs: [1462 -> (-1, 128, 28, 28)[FLOAT]], [1468 -> (512, 128, 1, 1)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Convolution input dimensions: (-1, 128, 28, 28) [03/25/2022-13:24:03] [V] [TRT] Kernel weights are not set 
yet. Kernel weights must be set using setInput(1, kernel_tensor) API call. [03/25/2022-13:24:03] [V] [TRT] Registering layer: Conv_294 for ONNX node: Conv_294 [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1469 for ONNX tensor: 1469 [03/25/2022-13:24:03] [V] [TRT] Conv_294 [Conv] outputs: [1469 -> (-1, 512, 28, 28)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: BatchNormalization_295 [BatchNormalization] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1469 [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.1.1.bn3.bn.weight [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.1.1.bn3.bn.bias [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.1.1.bn3.bn.running_mean [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.1.1.bn3.bn.running_var [03/25/2022-13:24:03] [V] [TRT] BatchNormalization_295 [BatchNormalization] inputs: [1469 -> (-1, 512, 28, 28)[FLOAT]], [sections.1.1.bn3.bn.weight -> (512)[FLOAT]], [sections.1.1.bn3.bn.bias -> (512)[FLOAT]], [sections.1.1.bn3.bn.running_mean -> (512)[FLOAT]], [sections.1.1.bn3.bn.running_var -> (512)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Registering layer: BatchNormalization_295 for ONNX node: BatchNormalization_295 [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1470 for ONNX tensor: 1470 [03/25/2022-13:24:03] [V] [TRT] BatchNormalization_295 [BatchNormalization] outputs: [1470 -> (-1, 512, 28, 28)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: Add_302 [Add] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1476 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1470 [03/25/2022-13:24:03] [V] [TRT] Add_302 [Add] inputs: [1476 -> (-1, 512, 28, 28)[FLOAT]], [1470 -> (-1, 512, 28, 28)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Registering layer: Add_302 for ONNX node: Add_302 [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1477 for ONNX tensor: 1477 [03/25/2022-13:24:03] [V] [TRT] Add_302 [Add] outputs: [1477 -> (-1, 512, 28, 28)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: Relu_303 [Relu] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1477 [03/25/2022-13:24:03] [V] [TRT] Relu_303 [Relu] inputs: [1477 -> (-1, 512, 28, 28)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Registering layer: Relu_303 for ONNX node: Relu_303 [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1478 for ONNX tensor: 1478 [03/25/2022-13:24:03] [V] [TRT] Relu_303 [Relu] outputs: [1478 -> (-1, 512, 28, 28)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: QuantizeLinear_306 [QuantizeLinear] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1478 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1479 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1480 [03/25/2022-13:24:03] [V] [TRT] QuantizeLinear_306 [QuantizeLinear] inputs: [1478 -> (-1, 512, 28, 28)[FLOAT]], [1479 -> ()[FLOAT]], [1480 -> ()[INT8]], [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1481 for ONNX tensor: 1481 [03/25/2022-13:24:03] [V] [TRT] QuantizeLinear_306 [QuantizeLinear] outputs: [1481 -> (-1, 512, 28, 28)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: QuantizeLinear_350 [QuantizeLinear] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1478 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1523 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1524 [03/25/2022-13:24:03] [V] [TRT] QuantizeLinear_350 [QuantizeLinear] inputs: [1478 -> (-1, 512, 28, 28)[FLOAT]], [1523 -> ()[FLOAT]], [1524 -> ()[INT8]], [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1525 for ONNX tensor: 1525 
[03/25/2022-13:24:03] [V] [TRT] QuantizeLinear_350 [QuantizeLinear] outputs: [1525 -> (-1, 512, 28, 28)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: DequantizeLinear_309 [DequantizeLinear] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1481 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1482 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1483 [03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_309 [DequantizeLinear] inputs: [1481 -> (-1, 512, 28, 28)[FLOAT]], [1482 -> ()[FLOAT]], [1483 -> ()[INT8]], [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1484 for ONNX tensor: 1484 [03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_309 [DequantizeLinear] outputs: [1484 -> (-1, 512, 28, 28)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: DequantizeLinear_353 [DequantizeLinear] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1525 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1526 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1527 [03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_353 [DequantizeLinear] inputs: [1525 -> (-1, 512, 28, 28)[FLOAT]], [1526 -> ()[FLOAT]], [1527 -> ()[INT8]], [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1528 for ONNX tensor: 1528 [03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_353 [DequantizeLinear] outputs: [1528 -> (-1, 512, 28, 28)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: Conv_316 [Conv] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1484 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1490 [03/25/2022-13:24:03] [V] [TRT] Conv_316 [Conv] inputs: [1484 -> (-1, 512, 28, 28)[FLOAT]], [1490 -> (128, 512, 1, 1)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Convolution input dimensions: (-1, 512, 28, 28) [03/25/2022-13:24:03] [V] [TRT] Kernel weights are not set yet. Kernel weights must be set using setInput(1, kernel_tensor) API call. 
[03/25/2022-13:24:03] [V] [TRT] Registering layer: Conv_316 for ONNX node: Conv_316 [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1491 for ONNX tensor: 1491 [03/25/2022-13:24:03] [V] [TRT] Conv_316 [Conv] outputs: [1491 -> (-1, 128, 28, 28)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: BatchNormalization_317 [BatchNormalization] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1491 [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.1.2.bn1.bn.weight [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.1.2.bn1.bn.bias [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.1.2.bn1.bn.running_mean [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.1.2.bn1.bn.running_var [03/25/2022-13:24:03] [V] [TRT] BatchNormalization_317 [BatchNormalization] inputs: [1491 -> (-1, 128, 28, 28)[FLOAT]], [sections.1.2.bn1.bn.weight -> (128)[FLOAT]], [sections.1.2.bn1.bn.bias -> (128)[FLOAT]], [sections.1.2.bn1.bn.running_mean -> (128)[FLOAT]], [sections.1.2.bn1.bn.running_var -> (128)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Registering layer: BatchNormalization_317 for ONNX node: BatchNormalization_317 [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1492 for ONNX tensor: 1492 [03/25/2022-13:24:03] [V] [TRT] BatchNormalization_317 [BatchNormalization] outputs: [1492 -> (-1, 128, 28, 28)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: Relu_318 [Relu] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1492 [03/25/2022-13:24:03] [V] [TRT] Relu_318 [Relu] inputs: [1492 -> (-1, 128, 28, 28)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Registering layer: Relu_318 for ONNX node: Relu_318 [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1493 for ONNX tensor: 1493 [03/25/2022-13:24:03] [V] [TRT] Relu_318 [Relu] outputs: [1493 -> (-1, 128, 28, 28)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: QuantizeLinear_321 [QuantizeLinear] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1493 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1494 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1495 [03/25/2022-13:24:03] [V] [TRT] QuantizeLinear_321 [QuantizeLinear] inputs: [1493 -> (-1, 128, 28, 28)[FLOAT]], [1494 -> ()[FLOAT]], [1495 -> ()[INT8]], [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1496 for ONNX tensor: 1496 [03/25/2022-13:24:03] [V] [TRT] QuantizeLinear_321 [QuantizeLinear] outputs: [1496 -> (-1, 128, 28, 28)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: DequantizeLinear_324 [DequantizeLinear] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1496 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1497 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1498 [03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_324 [DequantizeLinear] inputs: [1496 -> (-1, 128, 28, 28)[FLOAT]], [1497 -> ()[FLOAT]], [1498 -> ()[INT8]], [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1499 for ONNX tensor: 1499 [03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_324 [DequantizeLinear] outputs: [1499 -> (-1, 128, 28, 28)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: Conv_331 [Conv] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1499 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1505 [03/25/2022-13:24:03] [V] [TRT] Conv_331 [Conv] inputs: [1499 -> (-1, 128, 28, 28)[FLOAT]], [1505 -> (128, 128, 3, 3)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Convolution input dimensions: (-1, 128, 28, 28) [03/25/2022-13:24:03] [V] [TRT] Kernel weights are not set yet. 
Kernel weights must be set using setInput(1, kernel_tensor) API call. [03/25/2022-13:24:03] [V] [TRT] Registering layer: Conv_331 for ONNX node: Conv_331 [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1506 for ONNX tensor: 1506 [03/25/2022-13:24:03] [V] [TRT] Conv_331 [Conv] outputs: [1506 -> (-1, 128, 28, 28)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: BatchNormalization_332 [BatchNormalization] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1506 [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.1.2.bn2.bn.weight [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.1.2.bn2.bn.bias [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.1.2.bn2.bn.running_mean [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.1.2.bn2.bn.running_var [03/25/2022-13:24:03] [V] [TRT] BatchNormalization_332 [BatchNormalization] inputs: [1506 -> (-1, 128, 28, 28)[FLOAT]], [sections.1.2.bn2.bn.weight -> (128)[FLOAT]], [sections.1.2.bn2.bn.bias -> (128)[FLOAT]], [sections.1.2.bn2.bn.running_mean -> (128)[FLOAT]], [sections.1.2.bn2.bn.running_var -> (128)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Registering layer: BatchNormalization_332 for ONNX node: BatchNormalization_332 [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1507 for ONNX tensor: 1507 [03/25/2022-13:24:03] [V] [TRT] BatchNormalization_332 [BatchNormalization] outputs: [1507 -> (-1, 128, 28, 28)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: Relu_333 [Relu] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1507 [03/25/2022-13:24:03] [V] [TRT] Relu_333 [Relu] inputs: [1507 -> (-1, 128, 28, 28)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Registering layer: Relu_333 for ONNX node: Relu_333 [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1508 for ONNX tensor: 1508 [03/25/2022-13:24:03] [V] [TRT] Relu_333 [Relu] outputs: [1508 -> (-1, 128, 28, 28)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: QuantizeLinear_336 [QuantizeLinear] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1508 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1509 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1510 [03/25/2022-13:24:03] [V] [TRT] QuantizeLinear_336 [QuantizeLinear] inputs: [1508 -> (-1, 128, 28, 28)[FLOAT]], [1509 -> ()[FLOAT]], [1510 -> ()[INT8]], [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1511 for ONNX tensor: 1511 [03/25/2022-13:24:03] [V] [TRT] QuantizeLinear_336 [QuantizeLinear] outputs: [1511 -> (-1, 128, 28, 28)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: DequantizeLinear_339 [DequantizeLinear] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1511 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1512 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1513 [03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_339 [DequantizeLinear] inputs: [1511 -> (-1, 128, 28, 28)[FLOAT]], [1512 -> ()[FLOAT]], [1513 -> ()[INT8]], [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1514 for ONNX tensor: 1514 [03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_339 [DequantizeLinear] outputs: [1514 -> (-1, 128, 28, 28)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: Conv_346 [Conv] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1514 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1520 [03/25/2022-13:24:03] [V] [TRT] Conv_346 [Conv] inputs: [1514 -> (-1, 128, 28, 28)[FLOAT]], [1520 -> (512, 128, 1, 1)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Convolution input dimensions: (-1, 128, 28, 28) [03/25/2022-13:24:03] [V] [TRT] Kernel weights are not set 
yet. Kernel weights must be set using setInput(1, kernel_tensor) API call. [03/25/2022-13:24:03] [V] [TRT] Registering layer: Conv_346 for ONNX node: Conv_346 [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1521 for ONNX tensor: 1521 [03/25/2022-13:24:03] [V] [TRT] Conv_346 [Conv] outputs: [1521 -> (-1, 512, 28, 28)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: BatchNormalization_347 [BatchNormalization] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1521 [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.1.2.bn3.bn.weight [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.1.2.bn3.bn.bias [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.1.2.bn3.bn.running_mean [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.1.2.bn3.bn.running_var [03/25/2022-13:24:03] [V] [TRT] BatchNormalization_347 [BatchNormalization] inputs: [1521 -> (-1, 512, 28, 28)[FLOAT]], [sections.1.2.bn3.bn.weight -> (512)[FLOAT]], [sections.1.2.bn3.bn.bias -> (512)[FLOAT]], [sections.1.2.bn3.bn.running_mean -> (512)[FLOAT]], [sections.1.2.bn3.bn.running_var -> (512)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Registering layer: BatchNormalization_347 for ONNX node: BatchNormalization_347 [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1522 for ONNX tensor: 1522 [03/25/2022-13:24:03] [V] [TRT] BatchNormalization_347 [BatchNormalization] outputs: [1522 -> (-1, 512, 28, 28)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: Add_354 [Add] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1528 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1522 [03/25/2022-13:24:03] [V] [TRT] Add_354 [Add] inputs: [1528 -> (-1, 512, 28, 28)[FLOAT]], [1522 -> (-1, 512, 28, 28)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Registering layer: Add_354 for ONNX node: Add_354 [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1529 for ONNX tensor: 1529 [03/25/2022-13:24:03] [V] [TRT] Add_354 [Add] outputs: [1529 -> (-1, 512, 28, 28)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: Relu_355 [Relu] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1529 [03/25/2022-13:24:03] [V] [TRT] Relu_355 [Relu] inputs: [1529 -> (-1, 512, 28, 28)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Registering layer: Relu_355 for ONNX node: Relu_355 [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1530 for ONNX tensor: 1530 [03/25/2022-13:24:03] [V] [TRT] Relu_355 [Relu] outputs: [1530 -> (-1, 512, 28, 28)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: QuantizeLinear_358 [QuantizeLinear] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1530 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1531 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1532 [03/25/2022-13:24:03] [V] [TRT] QuantizeLinear_358 [QuantizeLinear] inputs: [1530 -> (-1, 512, 28, 28)[FLOAT]], [1531 -> ()[FLOAT]], [1532 -> ()[INT8]], [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1533 for ONNX tensor: 1533 [03/25/2022-13:24:03] [V] [TRT] QuantizeLinear_358 [QuantizeLinear] outputs: [1533 -> (-1, 512, 28, 28)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: QuantizeLinear_402 [QuantizeLinear] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1530 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1575 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1576 [03/25/2022-13:24:03] [V] [TRT] QuantizeLinear_402 [QuantizeLinear] inputs: [1530 -> (-1, 512, 28, 28)[FLOAT]], [1575 -> ()[FLOAT]], [1576 -> ()[INT8]], [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1577 for ONNX tensor: 1577 
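Note the pattern here: the output of Relu_355 (tensor 1530) is quantized twice, by QuantizeLinear_358 for the convolution that follows and by QuantizeLinear_402 for the residual branch that rejoins at an Add node further down, so each consumer carries its own scale/zero-point pair (1531/1532 versus 1575/1576). The Q/DQ arithmetic itself is the standard ONNX definition; a small NumPy sketch (the values are illustrative only — the real scales are calibrated constants stored in the model):

```python
import numpy as np

def quantize_linear(x, scale, zero_point):
    # ONNX QuantizeLinear: round half to even, then saturate to int8.
    q = np.round(x / scale) + zero_point
    return np.clip(q, -128, 127).astype(np.int8)

def dequantize_linear(q, scale, zero_point):
    # ONNX DequantizeLinear: map back to float32.
    return (q.astype(np.float32) - zero_point) * scale

x = np.random.randn(2, 512, 28, 28).astype(np.float32)  # stand-in for tensor 1530
scale, zero_point = np.float32(0.05), np.int8(0)         # illustrative, not calibrated
x_qdq = dequantize_linear(quantize_linear(x, scale, zero_point), scale, zero_point)
print(x_qdq.dtype)  # float32 -- matches the [FLOAT] type reported by the parser
```

This is also why every QuantizeLinear/DequantizeLinear output in the trace is typed [FLOAT]: in an explicitly quantized graph the values are quantized but the tensors stay float until TensorRT fuses the Q/DQ pairs into INT8 kernels at build time.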
[03/25/2022-13:24:03] [V] [TRT] QuantizeLinear_402 [QuantizeLinear] outputs: [1577 -> (-1, 512, 28, 28)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: DequantizeLinear_361 [DequantizeLinear] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1533 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1534 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1535 [03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_361 [DequantizeLinear] inputs: [1533 -> (-1, 512, 28, 28)[FLOAT]], [1534 -> ()[FLOAT]], [1535 -> ()[INT8]], [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1536 for ONNX tensor: 1536 [03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_361 [DequantizeLinear] outputs: [1536 -> (-1, 512, 28, 28)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: DequantizeLinear_405 [DequantizeLinear] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1577 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1578 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1579 [03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_405 [DequantizeLinear] inputs: [1577 -> (-1, 512, 28, 28)[FLOAT]], [1578 -> ()[FLOAT]], [1579 -> ()[INT8]], [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1580 for ONNX tensor: 1580 [03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_405 [DequantizeLinear] outputs: [1580 -> (-1, 512, 28, 28)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: Conv_368 [Conv] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1536 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1542 [03/25/2022-13:24:03] [V] [TRT] Conv_368 [Conv] inputs: [1536 -> (-1, 512, 28, 28)[FLOAT]], [1542 -> (128, 512, 1, 1)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Convolution input dimensions: (-1, 512, 28, 28) [03/25/2022-13:24:03] [V] [TRT] Kernel weights are not set yet. Kernel weights must be set using setInput(1, kernel_tensor) API call. 
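The verbose message repeated before each convolution — "Kernel weights are not set yet. Kernel weights must be set using setInput(1, kernel_tensor) API call." — is the parser noting that in a Q/DQ graph the kernel reaches the convolution through its own DequantizeLinear node, so the layer is created without constant weights and the dequantized tensor is attached as input index 1 afterwards. A minimal standalone sketch of that wiring with the TensorRT Python API (a hypothetical example built by hand; the ONNX parser does this internally, and the real model uses calibrated per-channel weight scales):

```python
import numpy as np
import tensorrt as trt  # assumes the TensorRT 8.x Python bindings

logger = trt.Logger(trt.Logger.VERBOSE)
builder = trt.Builder(logger)
network = builder.create_network(
    1 << int(trt.NetworkDefinitionCreationFlag.EXPLICIT_BATCH))

inp = network.add_input("input", trt.float32, (-1, 512, 28, 28))

# Weights enter the graph as Constant -> QuantizeLinear -> DequantizeLinear,
# mirroring the ONNX pattern being imported (e.g. kernel tensor 1542).
wts = network.add_constant((128, 512, 1, 1),
                           np.zeros((128, 512, 1, 1), np.float32))
scale = network.add_constant((1,), np.array([0.02], np.float32))
q = network.add_quantize(wts.get_output(0), scale.get_output(0))
dq = network.add_dequantize(q.get_output(0), scale.get_output(0))

# The convolution is created with empty kernel weights...
conv = network.add_convolution_nd(inp, 128, (1, 1), trt.Weights())
# ...and the dequantized kernel tensor is attached afterwards -- the
# "setInput(1, kernel_tensor)" step the verbose message refers to.
conv.set_input(1, dq.get_output(0))
```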
[03/25/2022-13:24:03] [V] [TRT] Registering layer: Conv_368 for ONNX node: Conv_368 [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1543 for ONNX tensor: 1543 [03/25/2022-13:24:03] [V] [TRT] Conv_368 [Conv] outputs: [1543 -> (-1, 128, 28, 28)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: BatchNormalization_369 [BatchNormalization] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1543 [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.1.3.bn1.bn.weight [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.1.3.bn1.bn.bias [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.1.3.bn1.bn.running_mean [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.1.3.bn1.bn.running_var [03/25/2022-13:24:03] [V] [TRT] BatchNormalization_369 [BatchNormalization] inputs: [1543 -> (-1, 128, 28, 28)[FLOAT]], [sections.1.3.bn1.bn.weight -> (128)[FLOAT]], [sections.1.3.bn1.bn.bias -> (128)[FLOAT]], [sections.1.3.bn1.bn.running_mean -> (128)[FLOAT]], [sections.1.3.bn1.bn.running_var -> (128)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Registering layer: BatchNormalization_369 for ONNX node: BatchNormalization_369 [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1544 for ONNX tensor: 1544 [03/25/2022-13:24:03] [V] [TRT] BatchNormalization_369 [BatchNormalization] outputs: [1544 -> (-1, 128, 28, 28)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: Relu_370 [Relu] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1544 [03/25/2022-13:24:03] [V] [TRT] Relu_370 [Relu] inputs: [1544 -> (-1, 128, 28, 28)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Registering layer: Relu_370 for ONNX node: Relu_370 [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1545 for ONNX tensor: 1545 [03/25/2022-13:24:03] [V] [TRT] Relu_370 [Relu] outputs: [1545 -> (-1, 128, 28, 28)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: QuantizeLinear_373 [QuantizeLinear] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1545 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1546 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1547 [03/25/2022-13:24:03] [V] [TRT] QuantizeLinear_373 [QuantizeLinear] inputs: [1545 -> (-1, 128, 28, 28)[FLOAT]], [1546 -> ()[FLOAT]], [1547 -> ()[INT8]], [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1548 for ONNX tensor: 1548 [03/25/2022-13:24:03] [V] [TRT] QuantizeLinear_373 [QuantizeLinear] outputs: [1548 -> (-1, 128, 28, 28)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: DequantizeLinear_376 [DequantizeLinear] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1548 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1549 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1550 [03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_376 [DequantizeLinear] inputs: [1548 -> (-1, 128, 28, 28)[FLOAT]], [1549 -> ()[FLOAT]], [1550 -> ()[INT8]], [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1551 for ONNX tensor: 1551 [03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_376 [DequantizeLinear] outputs: [1551 -> (-1, 128, 28, 28)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: Conv_383 [Conv] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1551 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1557 [03/25/2022-13:24:03] [V] [TRT] Conv_383 [Conv] inputs: [1551 -> (-1, 128, 28, 28)[FLOAT]], [1557 -> (128, 128, 3, 3)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Convolution input dimensions: (-1, 128, 28, 28) [03/25/2022-13:24:03] [V] [TRT] Kernel weights are not set yet. 
Kernel weights must be set using setInput(1, kernel_tensor) API call. [03/25/2022-13:24:03] [V] [TRT] Registering layer: Conv_383 for ONNX node: Conv_383 [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1558 for ONNX tensor: 1558 [03/25/2022-13:24:03] [V] [TRT] Conv_383 [Conv] outputs: [1558 -> (-1, 128, 28, 28)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: BatchNormalization_384 [BatchNormalization] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1558 [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.1.3.bn2.bn.weight [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.1.3.bn2.bn.bias [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.1.3.bn2.bn.running_mean [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.1.3.bn2.bn.running_var [03/25/2022-13:24:03] [V] [TRT] BatchNormalization_384 [BatchNormalization] inputs: [1558 -> (-1, 128, 28, 28)[FLOAT]], [sections.1.3.bn2.bn.weight -> (128)[FLOAT]], [sections.1.3.bn2.bn.bias -> (128)[FLOAT]], [sections.1.3.bn2.bn.running_mean -> (128)[FLOAT]], [sections.1.3.bn2.bn.running_var -> (128)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Registering layer: BatchNormalization_384 for ONNX node: BatchNormalization_384 [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1559 for ONNX tensor: 1559 [03/25/2022-13:24:03] [V] [TRT] BatchNormalization_384 [BatchNormalization] outputs: [1559 -> (-1, 128, 28, 28)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: Relu_385 [Relu] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1559 [03/25/2022-13:24:03] [V] [TRT] Relu_385 [Relu] inputs: [1559 -> (-1, 128, 28, 28)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Registering layer: Relu_385 for ONNX node: Relu_385 [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1560 for ONNX tensor: 1560 [03/25/2022-13:24:03] [V] [TRT] Relu_385 [Relu] outputs: [1560 -> (-1, 128, 28, 28)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: QuantizeLinear_388 [QuantizeLinear] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1560 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1561 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1562 [03/25/2022-13:24:03] [V] [TRT] QuantizeLinear_388 [QuantizeLinear] inputs: [1560 -> (-1, 128, 28, 28)[FLOAT]], [1561 -> ()[FLOAT]], [1562 -> ()[INT8]], [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1563 for ONNX tensor: 1563 [03/25/2022-13:24:03] [V] [TRT] QuantizeLinear_388 [QuantizeLinear] outputs: [1563 -> (-1, 128, 28, 28)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: DequantizeLinear_391 [DequantizeLinear] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1563 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1564 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1565 [03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_391 [DequantizeLinear] inputs: [1563 -> (-1, 128, 28, 28)[FLOAT]], [1564 -> ()[FLOAT]], [1565 -> ()[INT8]], [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1566 for ONNX tensor: 1566 [03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_391 [DequantizeLinear] outputs: [1566 -> (-1, 128, 28, 28)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: Conv_398 [Conv] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1566 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1572 [03/25/2022-13:24:03] [V] [TRT] Conv_398 [Conv] inputs: [1566 -> (-1, 128, 28, 28)[FLOAT]], [1572 -> (512, 128, 1, 1)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Convolution input dimensions: (-1, 128, 28, 28) [03/25/2022-13:24:03] [V] [TRT] Kernel weights are not set 
yet. Kernel weights must be set using setInput(1, kernel_tensor) API call. [03/25/2022-13:24:03] [V] [TRT] Registering layer: Conv_398 for ONNX node: Conv_398 [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1573 for ONNX tensor: 1573 [03/25/2022-13:24:03] [V] [TRT] Conv_398 [Conv] outputs: [1573 -> (-1, 512, 28, 28)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: BatchNormalization_399 [BatchNormalization] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1573 [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.1.3.bn3.bn.weight [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.1.3.bn3.bn.bias [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.1.3.bn3.bn.running_mean [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.1.3.bn3.bn.running_var [03/25/2022-13:24:03] [V] [TRT] BatchNormalization_399 [BatchNormalization] inputs: [1573 -> (-1, 512, 28, 28)[FLOAT]], [sections.1.3.bn3.bn.weight -> (512)[FLOAT]], [sections.1.3.bn3.bn.bias -> (512)[FLOAT]], [sections.1.3.bn3.bn.running_mean -> (512)[FLOAT]], [sections.1.3.bn3.bn.running_var -> (512)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Registering layer: BatchNormalization_399 for ONNX node: BatchNormalization_399 [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1574 for ONNX tensor: 1574 [03/25/2022-13:24:03] [V] [TRT] BatchNormalization_399 [BatchNormalization] outputs: [1574 -> (-1, 512, 28, 28)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: Add_406 [Add] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1580 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1574 [03/25/2022-13:24:03] [V] [TRT] Add_406 [Add] inputs: [1580 -> (-1, 512, 28, 28)[FLOAT]], [1574 -> (-1, 512, 28, 28)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Registering layer: Add_406 for ONNX node: Add_406 [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1581 for ONNX tensor: 1581 [03/25/2022-13:24:03] [V] [TRT] Add_406 [Add] outputs: [1581 -> (-1, 512, 28, 28)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: Relu_407 [Relu] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1581 [03/25/2022-13:24:03] [V] [TRT] Relu_407 [Relu] inputs: [1581 -> (-1, 512, 28, 28)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Registering layer: Relu_407 for ONNX node: Relu_407 [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1582 for ONNX tensor: 1582 [03/25/2022-13:24:03] [V] [TRT] Relu_407 [Relu] outputs: [1582 -> (-1, 512, 28, 28)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: QuantizeLinear_410 [QuantizeLinear] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1582 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1583 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1584 [03/25/2022-13:24:03] [V] [TRT] QuantizeLinear_410 [QuantizeLinear] inputs: [1582 -> (-1, 512, 28, 28)[FLOAT]], [1583 -> ()[FLOAT]], [1584 -> ()[INT8]], [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1585 for ONNX tensor: 1585 [03/25/2022-13:24:03] [V] [TRT] QuantizeLinear_410 [QuantizeLinear] outputs: [1585 -> (-1, 512, 28, 28)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: QuantizeLinear_454 [QuantizeLinear] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1582 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1627 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1628 [03/25/2022-13:24:03] [V] [TRT] QuantizeLinear_454 [QuantizeLinear] inputs: [1582 -> (-1, 512, 28, 28)[FLOAT]], [1627 -> ()[FLOAT]], [1628 -> ()[INT8]], [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1629 for ONNX tensor: 1629 
[03/25/2022-13:24:03] [V] [TRT] QuantizeLinear_454 [QuantizeLinear] outputs: [1629 -> (-1, 512, 28, 28)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: DequantizeLinear_413 [DequantizeLinear] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1585 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1586 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1587 [03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_413 [DequantizeLinear] inputs: [1585 -> (-1, 512, 28, 28)[FLOAT]], [1586 -> ()[FLOAT]], [1587 -> ()[INT8]], [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1588 for ONNX tensor: 1588 [03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_413 [DequantizeLinear] outputs: [1588 -> (-1, 512, 28, 28)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: DequantizeLinear_457 [DequantizeLinear] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1629 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1630 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1631 [03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_457 [DequantizeLinear] inputs: [1629 -> (-1, 512, 28, 28)[FLOAT]], [1630 -> ()[FLOAT]], [1631 -> ()[INT8]], [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1632 for ONNX tensor: 1632 [03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_457 [DequantizeLinear] outputs: [1632 -> (-1, 512, 28, 28)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: Conv_420 [Conv] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1588 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1594 [03/25/2022-13:24:03] [V] [TRT] Conv_420 [Conv] inputs: [1588 -> (-1, 512, 28, 28)[FLOAT]], [1594 -> (256, 512, 1, 1)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Convolution input dimensions: (-1, 512, 28, 28) [03/25/2022-13:24:03] [V] [TRT] Kernel weights are not set yet. Kernel weights must be set using setInput(1, kernel_tensor) API call. [03/25/2022-13:24:03] [V] [TRT] Registering layer: Conv_420 for ONNX node: Conv_420 [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1595 for ONNX tensor: 1595 [03/25/2022-13:24:03] [V] [TRT] Conv_420 [Conv] outputs: [1595 -> (-1, 256, 28, 28)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: Conv_464 [Conv] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1632 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1638 [03/25/2022-13:24:03] [V] [TRT] Conv_464 [Conv] inputs: [1632 -> (-1, 512, 28, 28)[FLOAT]], [1638 -> (1024, 512, 1, 1)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Convolution input dimensions: (-1, 512, 28, 28) [03/25/2022-13:24:03] [V] [TRT] Kernel weights are not set yet. Kernel weights must be set using setInput(1, kernel_tensor) API call. 
[03/25/2022-13:24:03] [V] [TRT] Registering layer: Conv_464 for ONNX node: Conv_464 [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1639 for ONNX tensor: 1639 [03/25/2022-13:24:03] [V] [TRT] Conv_464 [Conv] outputs: [1639 -> (-1, 1024, 14, 14)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: BatchNormalization_421 [BatchNormalization] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1595 [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.2.0.bn1.bn.weight [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.2.0.bn1.bn.bias [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.2.0.bn1.bn.running_mean [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.2.0.bn1.bn.running_var [03/25/2022-13:24:03] [V] [TRT] BatchNormalization_421 [BatchNormalization] inputs: [1595 -> (-1, 256, 28, 28)[FLOAT]], [sections.2.0.bn1.bn.weight -> (256)[FLOAT]], [sections.2.0.bn1.bn.bias -> (256)[FLOAT]], [sections.2.0.bn1.bn.running_mean -> (256)[FLOAT]], [sections.2.0.bn1.bn.running_var -> (256)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Registering layer: BatchNormalization_421 for ONNX node: BatchNormalization_421 [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1596 for ONNX tensor: 1596 [03/25/2022-13:24:03] [V] [TRT] BatchNormalization_421 [BatchNormalization] outputs: [1596 -> (-1, 256, 28, 28)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: BatchNormalization_465 [BatchNormalization] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1639 [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.2.0.identity.bn.bn.weight [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.2.0.identity.bn.bn.bias [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.2.0.identity.bn.bn.running_mean [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.2.0.identity.bn.bn.running_var [03/25/2022-13:24:03] [V] [TRT] BatchNormalization_465 [BatchNormalization] inputs: [1639 -> (-1, 1024, 14, 14)[FLOAT]], [sections.2.0.identity.bn.bn.weight -> (1024)[FLOAT]], [sections.2.0.identity.bn.bn.bias -> (1024)[FLOAT]], [sections.2.0.identity.bn.bn.running_mean -> (1024)[FLOAT]], [sections.2.0.identity.bn.bn.running_var -> (1024)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Registering layer: BatchNormalization_465 for ONNX node: BatchNormalization_465 [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1640 for ONNX tensor: 1640 [03/25/2022-13:24:03] [V] [TRT] BatchNormalization_465 [BatchNormalization] outputs: [1640 -> (-1, 1024, 14, 14)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: Relu_422 [Relu] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1596 [03/25/2022-13:24:03] [V] [TRT] Relu_422 [Relu] inputs: [1596 -> (-1, 256, 28, 28)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Registering layer: Relu_422 for ONNX node: Relu_422 [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1597 for ONNX tensor: 1597 [03/25/2022-13:24:03] [V] [TRT] Relu_422 [Relu] outputs: [1597 -> (-1, 256, 28, 28)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: QuantizeLinear_468 [QuantizeLinear] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1640 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1641 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1642 [03/25/2022-13:24:03] [V] [TRT] QuantizeLinear_468 [QuantizeLinear] inputs: [1640 -> (-1, 1024, 14, 14)[FLOAT]], [1641 -> ()[FLOAT]], [1642 -> ()[INT8]], [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1643 for ONNX tensor: 1643 [03/25/2022-13:24:03] [V] [TRT] QuantizeLinear_468 
[QuantizeLinear] outputs: [1643 -> (-1, 1024, 14, 14)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: QuantizeLinear_425 [QuantizeLinear] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1597 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1598 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1599 [03/25/2022-13:24:03] [V] [TRT] QuantizeLinear_425 [QuantizeLinear] inputs: [1597 -> (-1, 256, 28, 28)[FLOAT]], [1598 -> ()[FLOAT]], [1599 -> ()[INT8]], [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1600 for ONNX tensor: 1600 [03/25/2022-13:24:03] [V] [TRT] QuantizeLinear_425 [QuantizeLinear] outputs: [1600 -> (-1, 256, 28, 28)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: DequantizeLinear_471 [DequantizeLinear] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1643 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1644 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1645 [03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_471 [DequantizeLinear] inputs: [1643 -> (-1, 1024, 14, 14)[FLOAT]], [1644 -> ()[FLOAT]], [1645 -> ()[INT8]], [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1646 for ONNX tensor: 1646 [03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_471 [DequantizeLinear] outputs: [1646 -> (-1, 1024, 14, 14)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: DequantizeLinear_428 [DequantizeLinear] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1600 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1601 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1602 [03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_428 [DequantizeLinear] inputs: [1600 -> (-1, 256, 28, 28)[FLOAT]], [1601 -> ()[FLOAT]], [1602 -> ()[INT8]], [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1603 for ONNX tensor: 1603 [03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_428 [DequantizeLinear] outputs: [1603 -> (-1, 256, 28, 28)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: Conv_435 [Conv] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1603 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1609 [03/25/2022-13:24:03] [V] [TRT] Conv_435 [Conv] inputs: [1603 -> (-1, 256, 28, 28)[FLOAT]], [1609 -> (256, 256, 3, 3)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Convolution input dimensions: (-1, 256, 28, 28) [03/25/2022-13:24:03] [V] [TRT] Kernel weights are not set yet. Kernel weights must be set using setInput(1, kernel_tensor) API call. 
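This stretch is the first bottleneck of section 2, so the spatial resolution drops from 28×28 to 14×14 on both paths: Conv_464 (the 1024×512×1×1 projection on the identity branch, output (-1, 1024, 14, 14) above) and Conv_435 (the 256×256×3×3 conv, output shown just below). The parser does not print strides or padding, but the shapes imply stride 2; a quick check with the standard output-size formula (padding values inferred, not logged):

```python
def conv_out(n, k, s, p):
    # Standard convolution output size: floor((n + 2p - k) / s) + 1
    return (n + 2 * p - k) // s + 1

print(conv_out(28, k=1, s=2, p=0))  # 14 -> Conv_464, 1x1 projection, stride 2
print(conv_out(28, k=3, s=2, p=1))  # 14 -> Conv_435, 3x3 with padding 1, stride 2
```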
[03/25/2022-13:24:03] [V] [TRT] Registering layer: Conv_435 for ONNX node: Conv_435 [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1610 for ONNX tensor: 1610 [03/25/2022-13:24:03] [V] [TRT] Conv_435 [Conv] outputs: [1610 -> (-1, 256, 14, 14)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: BatchNormalization_436 [BatchNormalization] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1610 [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.2.0.bn2.bn.weight [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.2.0.bn2.bn.bias [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.2.0.bn2.bn.running_mean [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.2.0.bn2.bn.running_var [03/25/2022-13:24:03] [V] [TRT] BatchNormalization_436 [BatchNormalization] inputs: [1610 -> (-1, 256, 14, 14)[FLOAT]], [sections.2.0.bn2.bn.weight -> (256)[FLOAT]], [sections.2.0.bn2.bn.bias -> (256)[FLOAT]], [sections.2.0.bn2.bn.running_mean -> (256)[FLOAT]], [sections.2.0.bn2.bn.running_var -> (256)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Registering layer: BatchNormalization_436 for ONNX node: BatchNormalization_436 [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1611 for ONNX tensor: 1611 [03/25/2022-13:24:03] [V] [TRT] BatchNormalization_436 [BatchNormalization] outputs: [1611 -> (-1, 256, 14, 14)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: Relu_437 [Relu] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1611 [03/25/2022-13:24:03] [V] [TRT] Relu_437 [Relu] inputs: [1611 -> (-1, 256, 14, 14)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Registering layer: Relu_437 for ONNX node: Relu_437 [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1612 for ONNX tensor: 1612 [03/25/2022-13:24:03] [V] [TRT] Relu_437 [Relu] outputs: [1612 -> (-1, 256, 14, 14)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: QuantizeLinear_440 [QuantizeLinear] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1612 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1613 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1614 [03/25/2022-13:24:03] [V] [TRT] QuantizeLinear_440 [QuantizeLinear] inputs: [1612 -> (-1, 256, 14, 14)[FLOAT]], [1613 -> ()[FLOAT]], [1614 -> ()[INT8]], [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1615 for ONNX tensor: 1615 [03/25/2022-13:24:03] [V] [TRT] QuantizeLinear_440 [QuantizeLinear] outputs: [1615 -> (-1, 256, 14, 14)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: DequantizeLinear_443 [DequantizeLinear] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1615 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1616 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1617 [03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_443 [DequantizeLinear] inputs: [1615 -> (-1, 256, 14, 14)[FLOAT]], [1616 -> ()[FLOAT]], [1617 -> ()[INT8]], [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1618 for ONNX tensor: 1618 [03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_443 [DequantizeLinear] outputs: [1618 -> (-1, 256, 14, 14)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: Conv_450 [Conv] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1618 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1624 [03/25/2022-13:24:03] [V] [TRT] Conv_450 [Conv] inputs: [1618 -> (-1, 256, 14, 14)[FLOAT]], [1624 -> (1024, 256, 1, 1)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Convolution input dimensions: (-1, 256, 14, 14) [03/25/2022-13:24:03] [V] [TRT] Kernel weights are not set yet. 
Kernel weights must be set using setInput(1, kernel_tensor) API call. [03/25/2022-13:24:03] [V] [TRT] Registering layer: Conv_450 for ONNX node: Conv_450 [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1625 for ONNX tensor: 1625 [03/25/2022-13:24:03] [V] [TRT] Conv_450 [Conv] outputs: [1625 -> (-1, 1024, 14, 14)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: BatchNormalization_451 [BatchNormalization] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1625 [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.2.0.bn3.bn.weight [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.2.0.bn3.bn.bias [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.2.0.bn3.bn.running_mean [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.2.0.bn3.bn.running_var [03/25/2022-13:24:03] [V] [TRT] BatchNormalization_451 [BatchNormalization] inputs: [1625 -> (-1, 1024, 14, 14)[FLOAT]], [sections.2.0.bn3.bn.weight -> (1024)[FLOAT]], [sections.2.0.bn3.bn.bias -> (1024)[FLOAT]], [sections.2.0.bn3.bn.running_mean -> (1024)[FLOAT]], [sections.2.0.bn3.bn.running_var -> (1024)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Registering layer: BatchNormalization_451 for ONNX node: BatchNormalization_451 [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1626 for ONNX tensor: 1626 [03/25/2022-13:24:03] [V] [TRT] BatchNormalization_451 [BatchNormalization] outputs: [1626 -> (-1, 1024, 14, 14)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: Add_472 [Add] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1646 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1626 [03/25/2022-13:24:03] [V] [TRT] Add_472 [Add] inputs: [1646 -> (-1, 1024, 14, 14)[FLOAT]], [1626 -> (-1, 1024, 14, 14)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Registering layer: Add_472 for ONNX node: Add_472 [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1647 for ONNX tensor: 1647 [03/25/2022-13:24:03] [V] [TRT] Add_472 [Add] outputs: [1647 -> (-1, 1024, 14, 14)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: Relu_473 [Relu] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1647 [03/25/2022-13:24:03] [V] [TRT] Relu_473 [Relu] inputs: [1647 -> (-1, 1024, 14, 14)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Registering layer: Relu_473 for ONNX node: Relu_473 [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1648 for ONNX tensor: 1648 [03/25/2022-13:24:03] [V] [TRT] Relu_473 [Relu] outputs: [1648 -> (-1, 1024, 14, 14)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: QuantizeLinear_476 [QuantizeLinear] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1648 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1649 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1650 [03/25/2022-13:24:03] [V] [TRT] QuantizeLinear_476 [QuantizeLinear] inputs: [1648 -> (-1, 1024, 14, 14)[FLOAT]], [1649 -> ()[FLOAT]], [1650 -> ()[INT8]], [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1651 for ONNX tensor: 1651 [03/25/2022-13:24:03] [V] [TRT] QuantizeLinear_476 [QuantizeLinear] outputs: [1651 -> (-1, 1024, 14, 14)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: QuantizeLinear_520 [QuantizeLinear] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1648 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1693 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1694 [03/25/2022-13:24:03] [V] [TRT] QuantizeLinear_520 [QuantizeLinear] inputs: [1648 -> (-1, 1024, 14, 14)[FLOAT]], [1693 -> ()[FLOAT]], [1694 -> ()[INT8]], [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1695 for ONNX tensor: 1695 
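Each convolution above is followed by a BatchNormalization node whose four per-channel parameter tensors (weight, bias, running_mean, running_var — e.g. the 1024-channel sections.2.0.bn3 set) are still separate at parse time. The builder can generally fold such a BatchNormalization into the preceding convolution, since it is just a per-output-channel affine transform; the folding arithmetic, sketched in NumPy with shapes matching Conv_450/BatchNormalization_451 (random stand-in values, not the model's parameters):

```python
import numpy as np

def fold_bn(w, b, gamma, beta, mean, var, eps=1e-5):
    # BN(y) = gamma * (y - mean) / sqrt(var + eps) + beta, applied per channel,
    # folds into conv weights w (OIHW layout) and bias b.
    s = gamma / np.sqrt(var + eps)          # per-output-channel scale
    return w * s[:, None, None, None], (b - mean) * s + beta

w = np.random.randn(1024, 256, 1, 1).astype(np.float32)   # Conv_450 kernel shape
b = np.zeros(1024, np.float32)
gamma, beta, mean, var = (np.random.rand(1024).astype(np.float32) for _ in range(4))
w_folded, b_folded = fold_bn(w, b, gamma, beta, mean, var)
print(w_folded.shape, b_folded.shape)  # (1024, 256, 1, 1) (1024,)
```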
[03/25/2022-13:24:03] [V] [TRT] QuantizeLinear_520 [QuantizeLinear] outputs: [1695 -> (-1, 1024, 14, 14)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: DequantizeLinear_479 [DequantizeLinear] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1651 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1652 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1653 [03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_479 [DequantizeLinear] inputs: [1651 -> (-1, 1024, 14, 14)[FLOAT]], [1652 -> ()[FLOAT]], [1653 -> ()[INT8]], [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1654 for ONNX tensor: 1654 [03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_479 [DequantizeLinear] outputs: [1654 -> (-1, 1024, 14, 14)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: DequantizeLinear_523 [DequantizeLinear] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1695 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1696 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1697 [03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_523 [DequantizeLinear] inputs: [1695 -> (-1, 1024, 14, 14)[FLOAT]], [1696 -> ()[FLOAT]], [1697 -> ()[INT8]], [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1698 for ONNX tensor: 1698 [03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_523 [DequantizeLinear] outputs: [1698 -> (-1, 1024, 14, 14)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: Conv_486 [Conv] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1654 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1660 [03/25/2022-13:24:03] [V] [TRT] Conv_486 [Conv] inputs: [1654 -> (-1, 1024, 14, 14)[FLOAT]], [1660 -> (256, 1024, 1, 1)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Convolution input dimensions: (-1, 1024, 14, 14) [03/25/2022-13:24:03] [V] [TRT] Kernel weights are not set yet. Kernel weights must be set using setInput(1, kernel_tensor) API call. 
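All of the scale and zero-point inputs printed in this trace have shape (), i.e. scalars, so activations are quantized per-tensor (for example 1652/1653 feeding DequantizeLinear_479). The calibrated values themselves live as initializers in the ONNX file being parsed; if you want to inspect them, something along these lines should work with the standard onnx package:

```python
import onnx
from onnx import numpy_helper

model = onnx.load("resnet50_quant_sparse.onnx")
inits = {i.name: numpy_helper.to_array(i) for i in model.graph.initializer}

for node in model.graph.node:
    if node.op_type in ("QuantizeLinear", "DequantizeLinear"):
        scale = inits.get(node.input[1])       # e.g. tensor 1652
        zero_point = inits.get(node.input[2])  # e.g. tensor 1653
        if scale is not None:
            print(node.name, "scale:", scale, "zero_point:", zero_point)
```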
[03/25/2022-13:24:03] [V] [TRT] Registering layer: Conv_486 for ONNX node: Conv_486 [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1661 for ONNX tensor: 1661 [03/25/2022-13:24:03] [V] [TRT] Conv_486 [Conv] outputs: [1661 -> (-1, 256, 14, 14)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: BatchNormalization_487 [BatchNormalization] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1661 [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.2.1.bn1.bn.weight [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.2.1.bn1.bn.bias [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.2.1.bn1.bn.running_mean [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.2.1.bn1.bn.running_var [03/25/2022-13:24:03] [V] [TRT] BatchNormalization_487 [BatchNormalization] inputs: [1661 -> (-1, 256, 14, 14)[FLOAT]], [sections.2.1.bn1.bn.weight -> (256)[FLOAT]], [sections.2.1.bn1.bn.bias -> (256)[FLOAT]], [sections.2.1.bn1.bn.running_mean -> (256)[FLOAT]], [sections.2.1.bn1.bn.running_var -> (256)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Registering layer: BatchNormalization_487 for ONNX node: BatchNormalization_487 [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1662 for ONNX tensor: 1662 [03/25/2022-13:24:03] [V] [TRT] BatchNormalization_487 [BatchNormalization] outputs: [1662 -> (-1, 256, 14, 14)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: Relu_488 [Relu] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1662 [03/25/2022-13:24:03] [V] [TRT] Relu_488 [Relu] inputs: [1662 -> (-1, 256, 14, 14)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Registering layer: Relu_488 for ONNX node: Relu_488 [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1663 for ONNX tensor: 1663 [03/25/2022-13:24:03] [V] [TRT] Relu_488 [Relu] outputs: [1663 -> (-1, 256, 14, 14)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: QuantizeLinear_491 [QuantizeLinear] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1663 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1664 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1665 [03/25/2022-13:24:03] [V] [TRT] QuantizeLinear_491 [QuantizeLinear] inputs: [1663 -> (-1, 256, 14, 14)[FLOAT]], [1664 -> ()[FLOAT]], [1665 -> ()[INT8]], [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1666 for ONNX tensor: 1666 [03/25/2022-13:24:03] [V] [TRT] QuantizeLinear_491 [QuantizeLinear] outputs: [1666 -> (-1, 256, 14, 14)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: DequantizeLinear_494 [DequantizeLinear] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1666 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1667 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1668 [03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_494 [DequantizeLinear] inputs: [1666 -> (-1, 256, 14, 14)[FLOAT]], [1667 -> ()[FLOAT]], [1668 -> ()[INT8]], [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1669 for ONNX tensor: 1669 [03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_494 [DequantizeLinear] outputs: [1669 -> (-1, 256, 14, 14)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: Conv_501 [Conv] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1669 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1675 [03/25/2022-13:24:03] [V] [TRT] Conv_501 [Conv] inputs: [1669 -> (-1, 256, 14, 14)[FLOAT]], [1675 -> (256, 256, 3, 3)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Convolution input dimensions: (-1, 256, 14, 14) [03/25/2022-13:24:03] [V] [TRT] Kernel weights are not set yet. 
Kernel weights must be set using setInput(1, kernel_tensor) API call. [03/25/2022-13:24:03] [V] [TRT] Registering layer: Conv_501 for ONNX node: Conv_501 [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1676 for ONNX tensor: 1676 [03/25/2022-13:24:03] [V] [TRT] Conv_501 [Conv] outputs: [1676 -> (-1, 256, 14, 14)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: BatchNormalization_502 [BatchNormalization] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1676 [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.2.1.bn2.bn.weight [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.2.1.bn2.bn.bias [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.2.1.bn2.bn.running_mean [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.2.1.bn2.bn.running_var [03/25/2022-13:24:03] [V] [TRT] BatchNormalization_502 [BatchNormalization] inputs: [1676 -> (-1, 256, 14, 14)[FLOAT]], [sections.2.1.bn2.bn.weight -> (256)[FLOAT]], [sections.2.1.bn2.bn.bias -> (256)[FLOAT]], [sections.2.1.bn2.bn.running_mean -> (256)[FLOAT]], [sections.2.1.bn2.bn.running_var -> (256)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Registering layer: BatchNormalization_502 for ONNX node: BatchNormalization_502 [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1677 for ONNX tensor: 1677 [03/25/2022-13:24:03] [V] [TRT] BatchNormalization_502 [BatchNormalization] outputs: [1677 -> (-1, 256, 14, 14)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: Relu_503 [Relu] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1677 [03/25/2022-13:24:03] [V] [TRT] Relu_503 [Relu] inputs: [1677 -> (-1, 256, 14, 14)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Registering layer: Relu_503 for ONNX node: Relu_503 [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1678 for ONNX tensor: 1678 [03/25/2022-13:24:03] [V] [TRT] Relu_503 [Relu] outputs: [1678 -> (-1, 256, 14, 14)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: QuantizeLinear_506 [QuantizeLinear] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1678 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1679 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1680 [03/25/2022-13:24:03] [V] [TRT] QuantizeLinear_506 [QuantizeLinear] inputs: [1678 -> (-1, 256, 14, 14)[FLOAT]], [1679 -> ()[FLOAT]], [1680 -> ()[INT8]], [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1681 for ONNX tensor: 1681 [03/25/2022-13:24:03] [V] [TRT] QuantizeLinear_506 [QuantizeLinear] outputs: [1681 -> (-1, 256, 14, 14)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: DequantizeLinear_509 [DequantizeLinear] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1681 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1682 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1683 [03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_509 [DequantizeLinear] inputs: [1681 -> (-1, 256, 14, 14)[FLOAT]], [1682 -> ()[FLOAT]], [1683 -> ()[INT8]], [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1684 for ONNX tensor: 1684 [03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_509 [DequantizeLinear] outputs: [1684 -> (-1, 256, 14, 14)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: Conv_516 [Conv] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1684 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1690 [03/25/2022-13:24:03] [V] [TRT] Conv_516 [Conv] inputs: [1684 -> (-1, 256, 14, 14)[FLOAT]], [1690 -> (1024, 256, 1, 1)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Convolution input dimensions: (-1, 256, 14, 14) [03/25/2022-13:24:03] [V] [TRT] Kernel weights are not set 
yet. Kernel weights must be set using setInput(1, kernel_tensor) API call. [03/25/2022-13:24:03] [V] [TRT] Registering layer: Conv_516 for ONNX node: Conv_516 [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1691 for ONNX tensor: 1691 [03/25/2022-13:24:03] [V] [TRT] Conv_516 [Conv] outputs: [1691 -> (-1, 1024, 14, 14)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: BatchNormalization_517 [BatchNormalization] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1691 [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.2.1.bn3.bn.weight [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.2.1.bn3.bn.bias [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.2.1.bn3.bn.running_mean [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.2.1.bn3.bn.running_var [03/25/2022-13:24:03] [V] [TRT] BatchNormalization_517 [BatchNormalization] inputs: [1691 -> (-1, 1024, 14, 14)[FLOAT]], [sections.2.1.bn3.bn.weight -> (1024)[FLOAT]], [sections.2.1.bn3.bn.bias -> (1024)[FLOAT]], [sections.2.1.bn3.bn.running_mean -> (1024)[FLOAT]], [sections.2.1.bn3.bn.running_var -> (1024)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Registering layer: BatchNormalization_517 for ONNX node: BatchNormalization_517 [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1692 for ONNX tensor: 1692 [03/25/2022-13:24:03] [V] [TRT] BatchNormalization_517 [BatchNormalization] outputs: [1692 -> (-1, 1024, 14, 14)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: Add_524 [Add] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1698 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1692 [03/25/2022-13:24:03] [V] [TRT] Add_524 [Add] inputs: [1698 -> (-1, 1024, 14, 14)[FLOAT]], [1692 -> (-1, 1024, 14, 14)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Registering layer: Add_524 for ONNX node: Add_524 [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1699 for ONNX tensor: 1699 [03/25/2022-13:24:03] [V] [TRT] Add_524 [Add] outputs: [1699 -> (-1, 1024, 14, 14)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: Relu_525 [Relu] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1699 [03/25/2022-13:24:03] [V] [TRT] Relu_525 [Relu] inputs: [1699 -> (-1, 1024, 14, 14)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Registering layer: Relu_525 for ONNX node: Relu_525 [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1700 for ONNX tensor: 1700 [03/25/2022-13:24:03] [V] [TRT] Relu_525 [Relu] outputs: [1700 -> (-1, 1024, 14, 14)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: QuantizeLinear_528 [QuantizeLinear] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1700 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1701 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1702 [03/25/2022-13:24:03] [V] [TRT] QuantizeLinear_528 [QuantizeLinear] inputs: [1700 -> (-1, 1024, 14, 14)[FLOAT]], [1701 -> ()[FLOAT]], [1702 -> ()[INT8]], [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1703 for ONNX tensor: 1703 [03/25/2022-13:24:03] [V] [TRT] QuantizeLinear_528 [QuantizeLinear] outputs: [1703 -> (-1, 1024, 14, 14)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: QuantizeLinear_572 [QuantizeLinear] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1700 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1745 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1746 [03/25/2022-13:24:03] [V] [TRT] QuantizeLinear_572 [QuantizeLinear] inputs: [1700 -> (-1, 1024, 14, 14)[FLOAT]], [1745 -> ()[FLOAT]], [1746 -> ()[INT8]], [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1747 for ONNX tensor: 
1747 [03/25/2022-13:24:03] [V] [TRT] QuantizeLinear_572 [QuantizeLinear] outputs: [1747 -> (-1, 1024, 14, 14)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: DequantizeLinear_531 [DequantizeLinear] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1703 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1704 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1705 [03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_531 [DequantizeLinear] inputs: [1703 -> (-1, 1024, 14, 14)[FLOAT]], [1704 -> ()[FLOAT]], [1705 -> ()[INT8]], [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1706 for ONNX tensor: 1706 [03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_531 [DequantizeLinear] outputs: [1706 -> (-1, 1024, 14, 14)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: DequantizeLinear_575 [DequantizeLinear] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1747 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1748 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1749 [03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_575 [DequantizeLinear] inputs: [1747 -> (-1, 1024, 14, 14)[FLOAT]], [1748 -> ()[FLOAT]], [1749 -> ()[INT8]], [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1750 for ONNX tensor: 1750 [03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_575 [DequantizeLinear] outputs: [1750 -> (-1, 1024, 14, 14)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: Conv_538 [Conv] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1706 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1712 [03/25/2022-13:24:03] [V] [TRT] Conv_538 [Conv] inputs: [1706 -> (-1, 1024, 14, 14)[FLOAT]], [1712 -> (256, 1024, 1, 1)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Convolution input dimensions: (-1, 1024, 14, 14) [03/25/2022-13:24:03] [V] [TRT] Kernel weights are not set yet. Kernel weights must be set using setInput(1, kernel_tensor) API call. 
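Every activation shape in this trace has -1 as its leading dimension: the ONNX graph declares a dynamic batch, and the concrete value is pinned only by the shapes supplied at build time. Outside trtexec that corresponds to an optimization profile; a rough sketch with the TensorRT Python API (the input tensor name "input" and the 128×3×224×224 shape follow the command that produced this log):

```python
import tensorrt as trt

logger = trt.Logger(trt.Logger.INFO)
builder = trt.Builder(logger)
config = builder.create_builder_config()

# Pin the dynamic batch to 128 for min/opt/max alike -- effectively what
# passing a single --shapes value to trtexec amounts to.
profile = builder.create_optimization_profile()
profile.set_shape("input",
                  (128, 3, 224, 224),   # min
                  (128, 3, 224, 224),   # opt
                  (128, 3, 224, 224))   # max
config.add_optimization_profile(profile)
```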
[03/25/2022-13:24:03] [V] [TRT] Registering layer: Conv_538 for ONNX node: Conv_538 [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1713 for ONNX tensor: 1713 [03/25/2022-13:24:03] [V] [TRT] Conv_538 [Conv] outputs: [1713 -> (-1, 256, 14, 14)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: BatchNormalization_539 [BatchNormalization] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1713 [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.2.2.bn1.bn.weight [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.2.2.bn1.bn.bias [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.2.2.bn1.bn.running_mean [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.2.2.bn1.bn.running_var [03/25/2022-13:24:03] [V] [TRT] BatchNormalization_539 [BatchNormalization] inputs: [1713 -> (-1, 256, 14, 14)[FLOAT]], [sections.2.2.bn1.bn.weight -> (256)[FLOAT]], [sections.2.2.bn1.bn.bias -> (256)[FLOAT]], [sections.2.2.bn1.bn.running_mean -> (256)[FLOAT]], [sections.2.2.bn1.bn.running_var -> (256)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Registering layer: BatchNormalization_539 for ONNX node: BatchNormalization_539 [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1714 for ONNX tensor: 1714 [03/25/2022-13:24:03] [V] [TRT] BatchNormalization_539 [BatchNormalization] outputs: [1714 -> (-1, 256, 14, 14)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: Relu_540 [Relu] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1714 [03/25/2022-13:24:03] [V] [TRT] Relu_540 [Relu] inputs: [1714 -> (-1, 256, 14, 14)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Registering layer: Relu_540 for ONNX node: Relu_540 [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1715 for ONNX tensor: 1715 [03/25/2022-13:24:03] [V] [TRT] Relu_540 [Relu] outputs: [1715 -> (-1, 256, 14, 14)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: QuantizeLinear_543 [QuantizeLinear] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1715 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1716 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1717 [03/25/2022-13:24:03] [V] [TRT] QuantizeLinear_543 [QuantizeLinear] inputs: [1715 -> (-1, 256, 14, 14)[FLOAT]], [1716 -> ()[FLOAT]], [1717 -> ()[INT8]], [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1718 for ONNX tensor: 1718 [03/25/2022-13:24:03] [V] [TRT] QuantizeLinear_543 [QuantizeLinear] outputs: [1718 -> (-1, 256, 14, 14)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: DequantizeLinear_546 [DequantizeLinear] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1718 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1719 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1720 [03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_546 [DequantizeLinear] inputs: [1718 -> (-1, 256, 14, 14)[FLOAT]], [1719 -> ()[FLOAT]], [1720 -> ()[INT8]], [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1721 for ONNX tensor: 1721 [03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_546 [DequantizeLinear] outputs: [1721 -> (-1, 256, 14, 14)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: Conv_553 [Conv] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1721 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1727 [03/25/2022-13:24:03] [V] [TRT] Conv_553 [Conv] inputs: [1721 -> (-1, 256, 14, 14)[FLOAT]], [1727 -> (256, 256, 3, 3)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Convolution input dimensions: (-1, 256, 14, 14) [03/25/2022-13:24:03] [V] [TRT] Kernel weights are not set yet. 
Kernel weights must be set using setInput(1, kernel_tensor) API call. [03/25/2022-13:24:03] [V] [TRT] Registering layer: Conv_553 for ONNX node: Conv_553 [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1728 for ONNX tensor: 1728 [03/25/2022-13:24:03] [V] [TRT] Conv_553 [Conv] outputs: [1728 -> (-1, 256, 14, 14)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: BatchNormalization_554 [BatchNormalization] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1728 [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.2.2.bn2.bn.weight [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.2.2.bn2.bn.bias [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.2.2.bn2.bn.running_mean [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.2.2.bn2.bn.running_var [03/25/2022-13:24:03] [V] [TRT] BatchNormalization_554 [BatchNormalization] inputs: [1728 -> (-1, 256, 14, 14)[FLOAT]], [sections.2.2.bn2.bn.weight -> (256)[FLOAT]], [sections.2.2.bn2.bn.bias -> (256)[FLOAT]], [sections.2.2.bn2.bn.running_mean -> (256)[FLOAT]], [sections.2.2.bn2.bn.running_var -> (256)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Registering layer: BatchNormalization_554 for ONNX node: BatchNormalization_554 [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1729 for ONNX tensor: 1729 [03/25/2022-13:24:03] [V] [TRT] BatchNormalization_554 [BatchNormalization] outputs: [1729 -> (-1, 256, 14, 14)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: Relu_555 [Relu] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1729 [03/25/2022-13:24:03] [V] [TRT] Relu_555 [Relu] inputs: [1729 -> (-1, 256, 14, 14)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Registering layer: Relu_555 for ONNX node: Relu_555 [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1730 for ONNX tensor: 1730 [03/25/2022-13:24:03] [V] [TRT] Relu_555 [Relu] outputs: [1730 -> (-1, 256, 14, 14)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: QuantizeLinear_558 [QuantizeLinear] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1730 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1731 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1732 [03/25/2022-13:24:03] [V] [TRT] QuantizeLinear_558 [QuantizeLinear] inputs: [1730 -> (-1, 256, 14, 14)[FLOAT]], [1731 -> ()[FLOAT]], [1732 -> ()[INT8]], [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1733 for ONNX tensor: 1733 [03/25/2022-13:24:03] [V] [TRT] QuantizeLinear_558 [QuantizeLinear] outputs: [1733 -> (-1, 256, 14, 14)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: DequantizeLinear_561 [DequantizeLinear] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1733 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1734 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1735 [03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_561 [DequantizeLinear] inputs: [1733 -> (-1, 256, 14, 14)[FLOAT]], [1734 -> ()[FLOAT]], [1735 -> ()[INT8]], [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1736 for ONNX tensor: 1736 [03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_561 [DequantizeLinear] outputs: [1736 -> (-1, 256, 14, 14)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: Conv_568 [Conv] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1736 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1742 [03/25/2022-13:24:03] [V] [TRT] Conv_568 [Conv] inputs: [1736 -> (-1, 256, 14, 14)[FLOAT]], [1742 -> (1024, 256, 1, 1)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Convolution input dimensions: (-1, 256, 14, 14) [03/25/2022-13:24:03] [V] [TRT] Kernel weights are not set 
yet. Kernel weights must be set using setInput(1, kernel_tensor) API call. [03/25/2022-13:24:03] [V] [TRT] Registering layer: Conv_568 for ONNX node: Conv_568 [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1743 for ONNX tensor: 1743 [03/25/2022-13:24:03] [V] [TRT] Conv_568 [Conv] outputs: [1743 -> (-1, 1024, 14, 14)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: BatchNormalization_569 [BatchNormalization] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1743 [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.2.2.bn3.bn.weight [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.2.2.bn3.bn.bias [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.2.2.bn3.bn.running_mean [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.2.2.bn3.bn.running_var [03/25/2022-13:24:03] [V] [TRT] BatchNormalization_569 [BatchNormalization] inputs: [1743 -> (-1, 1024, 14, 14)[FLOAT]], [sections.2.2.bn3.bn.weight -> (1024)[FLOAT]], [sections.2.2.bn3.bn.bias -> (1024)[FLOAT]], [sections.2.2.bn3.bn.running_mean -> (1024)[FLOAT]], [sections.2.2.bn3.bn.running_var -> (1024)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Registering layer: BatchNormalization_569 for ONNX node: BatchNormalization_569 [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1744 for ONNX tensor: 1744 [03/25/2022-13:24:03] [V] [TRT] BatchNormalization_569 [BatchNormalization] outputs: [1744 -> (-1, 1024, 14, 14)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: Add_576 [Add] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1750 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1744 [03/25/2022-13:24:03] [V] [TRT] Add_576 [Add] inputs: [1750 -> (-1, 1024, 14, 14)[FLOAT]], [1744 -> (-1, 1024, 14, 14)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Registering layer: Add_576 for ONNX node: Add_576 [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1751 for ONNX tensor: 1751 [03/25/2022-13:24:03] [V] [TRT] Add_576 [Add] outputs: [1751 -> (-1, 1024, 14, 14)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: Relu_577 [Relu] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1751 [03/25/2022-13:24:03] [V] [TRT] Relu_577 [Relu] inputs: [1751 -> (-1, 1024, 14, 14)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Registering layer: Relu_577 for ONNX node: Relu_577 [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1752 for ONNX tensor: 1752 [03/25/2022-13:24:03] [V] [TRT] Relu_577 [Relu] outputs: [1752 -> (-1, 1024, 14, 14)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: QuantizeLinear_580 [QuantizeLinear] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1752 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1753 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1754 [03/25/2022-13:24:03] [V] [TRT] QuantizeLinear_580 [QuantizeLinear] inputs: [1752 -> (-1, 1024, 14, 14)[FLOAT]], [1753 -> ()[FLOAT]], [1754 -> ()[INT8]], [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1755 for ONNX tensor: 1755 [03/25/2022-13:24:03] [V] [TRT] QuantizeLinear_580 [QuantizeLinear] outputs: [1755 -> (-1, 1024, 14, 14)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: QuantizeLinear_624 [QuantizeLinear] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1752 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1797 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1798 [03/25/2022-13:24:03] [V] [TRT] QuantizeLinear_624 [QuantizeLinear] inputs: [1752 -> (-1, 1024, 14, 14)[FLOAT]], [1797 -> ()[FLOAT]], [1798 -> ()[INT8]], [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1799 for ONNX tensor: 
1799 [03/25/2022-13:24:03] [V] [TRT] QuantizeLinear_624 [QuantizeLinear] outputs: [1799 -> (-1, 1024, 14, 14)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: DequantizeLinear_583 [DequantizeLinear] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1755 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1756 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1757 [03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_583 [DequantizeLinear] inputs: [1755 -> (-1, 1024, 14, 14)[FLOAT]], [1756 -> ()[FLOAT]], [1757 -> ()[INT8]], [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1758 for ONNX tensor: 1758 [03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_583 [DequantizeLinear] outputs: [1758 -> (-1, 1024, 14, 14)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: DequantizeLinear_627 [DequantizeLinear] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1799 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1800 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1801 [03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_627 [DequantizeLinear] inputs: [1799 -> (-1, 1024, 14, 14)[FLOAT]], [1800 -> ()[FLOAT]], [1801 -> ()[INT8]], [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1802 for ONNX tensor: 1802 [03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_627 [DequantizeLinear] outputs: [1802 -> (-1, 1024, 14, 14)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: Conv_590 [Conv] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1758 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1764 [03/25/2022-13:24:03] [V] [TRT] Conv_590 [Conv] inputs: [1758 -> (-1, 1024, 14, 14)[FLOAT]], [1764 -> (256, 1024, 1, 1)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Convolution input dimensions: (-1, 1024, 14, 14) [03/25/2022-13:24:03] [V] [TRT] Kernel weights are not set yet. Kernel weights must be set using setInput(1, kernel_tensor) API call. 
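The parse continues in this same per-block rhythm for the remaining bottleneck blocks. For completeness, the build this log documents — ONNX import, INT8 explicit quantization, structured sparsity — can also be driven from the Python API rather than trtexec; an untested sketch of the rough equivalent (flag and class names are from the TensorRT 8.x bindings):

```python
import tensorrt as trt

logger = trt.Logger(trt.Logger.VERBOSE)
builder = trt.Builder(logger)
network = builder.create_network(
    1 << int(trt.NetworkDefinitionCreationFlag.EXPLICIT_BATCH))
parser = trt.OnnxParser(network, logger)

with open("resnet50_quant_sparse.onnx", "rb") as f:
    if not parser.parse(f.read()):
        for i in range(parser.num_errors):
            print(parser.get_error(i))
        raise SystemExit("ONNX parse failed")

config = builder.create_builder_config()
config.set_flag(trt.BuilderFlag.INT8)            # --int8
config.set_flag(trt.BuilderFlag.SPARSE_WEIGHTS)  # --sparsity=enable

profile = builder.create_optimization_profile()  # pin the dynamic batch
profile.set_shape("input", (128, 3, 224, 224),
                  (128, 3, 224, 224), (128, 3, 224, 224))
config.add_optimization_profile(profile)

engine_bytes = builder.build_serialized_network(network, config)
```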
[03/25/2022-13:24:03] [V] [TRT] Registering layer: Conv_590 for ONNX node: Conv_590 [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1765 for ONNX tensor: 1765 [03/25/2022-13:24:03] [V] [TRT] Conv_590 [Conv] outputs: [1765 -> (-1, 256, 14, 14)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: BatchNormalization_591 [BatchNormalization] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1765 [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.2.3.bn1.bn.weight [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.2.3.bn1.bn.bias [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.2.3.bn1.bn.running_mean [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.2.3.bn1.bn.running_var [03/25/2022-13:24:03] [V] [TRT] BatchNormalization_591 [BatchNormalization] inputs: [1765 -> (-1, 256, 14, 14)[FLOAT]], [sections.2.3.bn1.bn.weight -> (256)[FLOAT]], [sections.2.3.bn1.bn.bias -> (256)[FLOAT]], [sections.2.3.bn1.bn.running_mean -> (256)[FLOAT]], [sections.2.3.bn1.bn.running_var -> (256)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Registering layer: BatchNormalization_591 for ONNX node: BatchNormalization_591 [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1766 for ONNX tensor: 1766 [03/25/2022-13:24:03] [V] [TRT] BatchNormalization_591 [BatchNormalization] outputs: [1766 -> (-1, 256, 14, 14)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: Relu_592 [Relu] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1766 [03/25/2022-13:24:03] [V] [TRT] Relu_592 [Relu] inputs: [1766 -> (-1, 256, 14, 14)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Registering layer: Relu_592 for ONNX node: Relu_592 [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1767 for ONNX tensor: 1767 [03/25/2022-13:24:03] [V] [TRT] Relu_592 [Relu] outputs: [1767 -> (-1, 256, 14, 14)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: QuantizeLinear_595 [QuantizeLinear] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1767 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1768 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1769 [03/25/2022-13:24:03] [V] [TRT] QuantizeLinear_595 [QuantizeLinear] inputs: [1767 -> (-1, 256, 14, 14)[FLOAT]], [1768 -> ()[FLOAT]], [1769 -> ()[INT8]], [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1770 for ONNX tensor: 1770 [03/25/2022-13:24:03] [V] [TRT] QuantizeLinear_595 [QuantizeLinear] outputs: [1770 -> (-1, 256, 14, 14)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: DequantizeLinear_598 [DequantizeLinear] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1770 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1771 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1772 [03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_598 [DequantizeLinear] inputs: [1770 -> (-1, 256, 14, 14)[FLOAT]], [1771 -> ()[FLOAT]], [1772 -> ()[INT8]], [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1773 for ONNX tensor: 1773 [03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_598 [DequantizeLinear] outputs: [1773 -> (-1, 256, 14, 14)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: Conv_605 [Conv] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1773 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1779 [03/25/2022-13:24:03] [V] [TRT] Conv_605 [Conv] inputs: [1773 -> (-1, 256, 14, 14)[FLOAT]], [1779 -> (256, 256, 3, 3)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Convolution input dimensions: (-1, 256, 14, 14) [03/25/2022-13:24:03] [V] [TRT] Kernel weights are not set yet. 
Kernel weights must be set using setInput(1, kernel_tensor) API call. [03/25/2022-13:24:03] [V] [TRT] Registering layer: Conv_605 for ONNX node: Conv_605 [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1780 for ONNX tensor: 1780 [03/25/2022-13:24:03] [V] [TRT] Conv_605 [Conv] outputs: [1780 -> (-1, 256, 14, 14)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: BatchNormalization_606 [BatchNormalization] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1780 [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.2.3.bn2.bn.weight [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.2.3.bn2.bn.bias [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.2.3.bn2.bn.running_mean [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.2.3.bn2.bn.running_var [03/25/2022-13:24:03] [V] [TRT] BatchNormalization_606 [BatchNormalization] inputs: [1780 -> (-1, 256, 14, 14)[FLOAT]], [sections.2.3.bn2.bn.weight -> (256)[FLOAT]], [sections.2.3.bn2.bn.bias -> (256)[FLOAT]], [sections.2.3.bn2.bn.running_mean -> (256)[FLOAT]], [sections.2.3.bn2.bn.running_var -> (256)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Registering layer: BatchNormalization_606 for ONNX node: BatchNormalization_606 [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1781 for ONNX tensor: 1781 [03/25/2022-13:24:03] [V] [TRT] BatchNormalization_606 [BatchNormalization] outputs: [1781 -> (-1, 256, 14, 14)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: Relu_607 [Relu] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1781 [03/25/2022-13:24:03] [V] [TRT] Relu_607 [Relu] inputs: [1781 -> (-1, 256, 14, 14)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Registering layer: Relu_607 for ONNX node: Relu_607 [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1782 for ONNX tensor: 1782 [03/25/2022-13:24:03] [V] [TRT] Relu_607 [Relu] outputs: [1782 -> (-1, 256, 14, 14)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: QuantizeLinear_610 [QuantizeLinear] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1782 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1783 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1784 [03/25/2022-13:24:03] [V] [TRT] QuantizeLinear_610 [QuantizeLinear] inputs: [1782 -> (-1, 256, 14, 14)[FLOAT]], [1783 -> ()[FLOAT]], [1784 -> ()[INT8]], [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1785 for ONNX tensor: 1785 [03/25/2022-13:24:03] [V] [TRT] QuantizeLinear_610 [QuantizeLinear] outputs: [1785 -> (-1, 256, 14, 14)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: DequantizeLinear_613 [DequantizeLinear] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1785 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1786 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1787 [03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_613 [DequantizeLinear] inputs: [1785 -> (-1, 256, 14, 14)[FLOAT]], [1786 -> ()[FLOAT]], [1787 -> ()[INT8]], [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1788 for ONNX tensor: 1788 [03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_613 [DequantizeLinear] outputs: [1788 -> (-1, 256, 14, 14)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: Conv_620 [Conv] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1788 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1794 [03/25/2022-13:24:03] [V] [TRT] Conv_620 [Conv] inputs: [1788 -> (-1, 256, 14, 14)[FLOAT]], [1794 -> (1024, 256, 1, 1)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Convolution input dimensions: (-1, 256, 14, 14) [03/25/2022-13:24:03] [V] [TRT] Kernel weights are not set 
yet. Kernel weights must be set using setInput(1, kernel_tensor) API call. [03/25/2022-13:24:03] [V] [TRT] Registering layer: Conv_620 for ONNX node: Conv_620 [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1795 for ONNX tensor: 1795 [03/25/2022-13:24:03] [V] [TRT] Conv_620 [Conv] outputs: [1795 -> (-1, 1024, 14, 14)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: BatchNormalization_621 [BatchNormalization] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1795 [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.2.3.bn3.bn.weight [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.2.3.bn3.bn.bias [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.2.3.bn3.bn.running_mean [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.2.3.bn3.bn.running_var [03/25/2022-13:24:03] [V] [TRT] BatchNormalization_621 [BatchNormalization] inputs: [1795 -> (-1, 1024, 14, 14)[FLOAT]], [sections.2.3.bn3.bn.weight -> (1024)[FLOAT]], [sections.2.3.bn3.bn.bias -> (1024)[FLOAT]], [sections.2.3.bn3.bn.running_mean -> (1024)[FLOAT]], [sections.2.3.bn3.bn.running_var -> (1024)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Registering layer: BatchNormalization_621 for ONNX node: BatchNormalization_621 [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1796 for ONNX tensor: 1796 [03/25/2022-13:24:03] [V] [TRT] BatchNormalization_621 [BatchNormalization] outputs: [1796 -> (-1, 1024, 14, 14)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: Add_628 [Add] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1802 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1796 [03/25/2022-13:24:03] [V] [TRT] Add_628 [Add] inputs: [1802 -> (-1, 1024, 14, 14)[FLOAT]], [1796 -> (-1, 1024, 14, 14)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Registering layer: Add_628 for ONNX node: Add_628 [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1803 for ONNX tensor: 1803 [03/25/2022-13:24:03] [V] [TRT] Add_628 [Add] outputs: [1803 -> (-1, 1024, 14, 14)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: Relu_629 [Relu] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1803 [03/25/2022-13:24:03] [V] [TRT] Relu_629 [Relu] inputs: [1803 -> (-1, 1024, 14, 14)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Registering layer: Relu_629 for ONNX node: Relu_629 [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1804 for ONNX tensor: 1804 [03/25/2022-13:24:03] [V] [TRT] Relu_629 [Relu] outputs: [1804 -> (-1, 1024, 14, 14)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: QuantizeLinear_632 [QuantizeLinear] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1804 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1805 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1806 [03/25/2022-13:24:03] [V] [TRT] QuantizeLinear_632 [QuantizeLinear] inputs: [1804 -> (-1, 1024, 14, 14)[FLOAT]], [1805 -> ()[FLOAT]], [1806 -> ()[INT8]], [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1807 for ONNX tensor: 1807 [03/25/2022-13:24:03] [V] [TRT] QuantizeLinear_632 [QuantizeLinear] outputs: [1807 -> (-1, 1024, 14, 14)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: QuantizeLinear_676 [QuantizeLinear] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1804 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1849 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1850 [03/25/2022-13:24:03] [V] [TRT] QuantizeLinear_676 [QuantizeLinear] inputs: [1804 -> (-1, 1024, 14, 14)[FLOAT]], [1849 -> ()[FLOAT]], [1850 -> ()[INT8]], [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1851 for ONNX tensor: 
1851 [03/25/2022-13:24:03] [V] [TRT] QuantizeLinear_676 [QuantizeLinear] outputs: [1851 -> (-1, 1024, 14, 14)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: DequantizeLinear_635 [DequantizeLinear] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1807 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1808 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1809 [03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_635 [DequantizeLinear] inputs: [1807 -> (-1, 1024, 14, 14)[FLOAT]], [1808 -> ()[FLOAT]], [1809 -> ()[INT8]], [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1810 for ONNX tensor: 1810 [03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_635 [DequantizeLinear] outputs: [1810 -> (-1, 1024, 14, 14)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: DequantizeLinear_679 [DequantizeLinear] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1851 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1852 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1853 [03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_679 [DequantizeLinear] inputs: [1851 -> (-1, 1024, 14, 14)[FLOAT]], [1852 -> ()[FLOAT]], [1853 -> ()[INT8]], [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1854 for ONNX tensor: 1854 [03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_679 [DequantizeLinear] outputs: [1854 -> (-1, 1024, 14, 14)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: Conv_642 [Conv] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1810 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1816 [03/25/2022-13:24:03] [V] [TRT] Conv_642 [Conv] inputs: [1810 -> (-1, 1024, 14, 14)[FLOAT]], [1816 -> (256, 1024, 1, 1)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Convolution input dimensions: (-1, 1024, 14, 14) [03/25/2022-13:24:03] [V] [TRT] Kernel weights are not set yet. Kernel weights must be set using setInput(1, kernel_tensor) API call. 
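Each QuantizeLinear above takes three inputs: the activation, a FLOAT per-tensor scale, and an INT8 zero point (zero for the symmetric quantization used here); each DequantizeLinear mirrors them. Numerically the pair behaves as in this small numpy illustration of the ONNX semantics (the scale value is made up):

```python
import numpy as np

def quantize_linear(x, scale, zero_point=0):
    # ONNX QuantizeLinear: divide by scale, round half to even, clamp to int8.
    q = np.rint(x / scale) + zero_point
    return np.clip(q, -128, 127).astype(np.int8)

def dequantize_linear(q, scale, zero_point=0):
    # ONNX DequantizeLinear: map the int8 code back to float.
    return (q.astype(np.float32) - zero_point) * scale

x = np.array([0.0, 0.037, -1.2, 3.5], dtype=np.float32)
q = quantize_linear(x, scale=0.02)
print(q)                                 # [  0   2 -60 127]  (3.5 saturates)
print(dequantize_linear(q, scale=0.02))  # [ 0.    0.04 -1.2   2.54]
```

The round trip loses at most scale/2 per element unless the value saturates, which is why the calibrated scale attached to each tensor matters.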
[03/25/2022-13:24:03] [V] [TRT] Registering layer: Conv_642 for ONNX node: Conv_642 [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1817 for ONNX tensor: 1817 [03/25/2022-13:24:03] [V] [TRT] Conv_642 [Conv] outputs: [1817 -> (-1, 256, 14, 14)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: BatchNormalization_643 [BatchNormalization] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1817 [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.2.4.bn1.bn.weight [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.2.4.bn1.bn.bias [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.2.4.bn1.bn.running_mean [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.2.4.bn1.bn.running_var [03/25/2022-13:24:03] [V] [TRT] BatchNormalization_643 [BatchNormalization] inputs: [1817 -> (-1, 256, 14, 14)[FLOAT]], [sections.2.4.bn1.bn.weight -> (256)[FLOAT]], [sections.2.4.bn1.bn.bias -> (256)[FLOAT]], [sections.2.4.bn1.bn.running_mean -> (256)[FLOAT]], [sections.2.4.bn1.bn.running_var -> (256)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Registering layer: BatchNormalization_643 for ONNX node: BatchNormalization_643 [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1818 for ONNX tensor: 1818 [03/25/2022-13:24:03] [V] [TRT] BatchNormalization_643 [BatchNormalization] outputs: [1818 -> (-1, 256, 14, 14)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: Relu_644 [Relu] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1818 [03/25/2022-13:24:03] [V] [TRT] Relu_644 [Relu] inputs: [1818 -> (-1, 256, 14, 14)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Registering layer: Relu_644 for ONNX node: Relu_644 [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1819 for ONNX tensor: 1819 [03/25/2022-13:24:03] [V] [TRT] Relu_644 [Relu] outputs: [1819 -> (-1, 256, 14, 14)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: QuantizeLinear_647 [QuantizeLinear] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1819 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1820 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1821 [03/25/2022-13:24:03] [V] [TRT] QuantizeLinear_647 [QuantizeLinear] inputs: [1819 -> (-1, 256, 14, 14)[FLOAT]], [1820 -> ()[FLOAT]], [1821 -> ()[INT8]], [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1822 for ONNX tensor: 1822 [03/25/2022-13:24:03] [V] [TRT] QuantizeLinear_647 [QuantizeLinear] outputs: [1822 -> (-1, 256, 14, 14)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: DequantizeLinear_650 [DequantizeLinear] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1822 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1823 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1824 [03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_650 [DequantizeLinear] inputs: [1822 -> (-1, 256, 14, 14)[FLOAT]], [1823 -> ()[FLOAT]], [1824 -> ()[INT8]], [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1825 for ONNX tensor: 1825 [03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_650 [DequantizeLinear] outputs: [1825 -> (-1, 256, 14, 14)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: Conv_657 [Conv] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1825 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1831 [03/25/2022-13:24:03] [V] [TRT] Conv_657 [Conv] inputs: [1825 -> (-1, 256, 14, 14)[FLOAT]], [1831 -> (256, 256, 3, 3)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Convolution input dimensions: (-1, 256, 14, 14) [03/25/2022-13:24:03] [V] [TRT] Kernel weights are not set yet. 
Kernel weights must be set using setInput(1, kernel_tensor) API call. [03/25/2022-13:24:03] [V] [TRT] Registering layer: Conv_657 for ONNX node: Conv_657 [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1832 for ONNX tensor: 1832 [03/25/2022-13:24:03] [V] [TRT] Conv_657 [Conv] outputs: [1832 -> (-1, 256, 14, 14)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: BatchNormalization_658 [BatchNormalization] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1832 [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.2.4.bn2.bn.weight [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.2.4.bn2.bn.bias [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.2.4.bn2.bn.running_mean [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.2.4.bn2.bn.running_var [03/25/2022-13:24:03] [V] [TRT] BatchNormalization_658 [BatchNormalization] inputs: [1832 -> (-1, 256, 14, 14)[FLOAT]], [sections.2.4.bn2.bn.weight -> (256)[FLOAT]], [sections.2.4.bn2.bn.bias -> (256)[FLOAT]], [sections.2.4.bn2.bn.running_mean -> (256)[FLOAT]], [sections.2.4.bn2.bn.running_var -> (256)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Registering layer: BatchNormalization_658 for ONNX node: BatchNormalization_658 [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1833 for ONNX tensor: 1833 [03/25/2022-13:24:03] [V] [TRT] BatchNormalization_658 [BatchNormalization] outputs: [1833 -> (-1, 256, 14, 14)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: Relu_659 [Relu] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1833 [03/25/2022-13:24:03] [V] [TRT] Relu_659 [Relu] inputs: [1833 -> (-1, 256, 14, 14)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Registering layer: Relu_659 for ONNX node: Relu_659 [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1834 for ONNX tensor: 1834 [03/25/2022-13:24:03] [V] [TRT] Relu_659 [Relu] outputs: [1834 -> (-1, 256, 14, 14)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: QuantizeLinear_662 [QuantizeLinear] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1834 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1835 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1836 [03/25/2022-13:24:03] [V] [TRT] QuantizeLinear_662 [QuantizeLinear] inputs: [1834 -> (-1, 256, 14, 14)[FLOAT]], [1835 -> ()[FLOAT]], [1836 -> ()[INT8]], [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1837 for ONNX tensor: 1837 [03/25/2022-13:24:03] [V] [TRT] QuantizeLinear_662 [QuantizeLinear] outputs: [1837 -> (-1, 256, 14, 14)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: DequantizeLinear_665 [DequantizeLinear] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1837 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1838 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1839 [03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_665 [DequantizeLinear] inputs: [1837 -> (-1, 256, 14, 14)[FLOAT]], [1838 -> ()[FLOAT]], [1839 -> ()[INT8]], [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1840 for ONNX tensor: 1840 [03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_665 [DequantizeLinear] outputs: [1840 -> (-1, 256, 14, 14)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: Conv_672 [Conv] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1840 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1846 [03/25/2022-13:24:03] [V] [TRT] Conv_672 [Conv] inputs: [1840 -> (-1, 256, 14, 14)[FLOAT]], [1846 -> (1024, 256, 1, 1)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Convolution input dimensions: (-1, 256, 14, 14) [03/25/2022-13:24:03] [V] [TRT] Kernel weights are not set 
yet. Kernel weights must be set using setInput(1, kernel_tensor) API call. [03/25/2022-13:24:03] [V] [TRT] Registering layer: Conv_672 for ONNX node: Conv_672 [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1847 for ONNX tensor: 1847 [03/25/2022-13:24:03] [V] [TRT] Conv_672 [Conv] outputs: [1847 -> (-1, 1024, 14, 14)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: BatchNormalization_673 [BatchNormalization] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1847 [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.2.4.bn3.bn.weight [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.2.4.bn3.bn.bias [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.2.4.bn3.bn.running_mean [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.2.4.bn3.bn.running_var [03/25/2022-13:24:03] [V] [TRT] BatchNormalization_673 [BatchNormalization] inputs: [1847 -> (-1, 1024, 14, 14)[FLOAT]], [sections.2.4.bn3.bn.weight -> (1024)[FLOAT]], [sections.2.4.bn3.bn.bias -> (1024)[FLOAT]], [sections.2.4.bn3.bn.running_mean -> (1024)[FLOAT]], [sections.2.4.bn3.bn.running_var -> (1024)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Registering layer: BatchNormalization_673 for ONNX node: BatchNormalization_673 [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1848 for ONNX tensor: 1848 [03/25/2022-13:24:03] [V] [TRT] BatchNormalization_673 [BatchNormalization] outputs: [1848 -> (-1, 1024, 14, 14)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: Add_680 [Add] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1854 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1848 [03/25/2022-13:24:03] [V] [TRT] Add_680 [Add] inputs: [1854 -> (-1, 1024, 14, 14)[FLOAT]], [1848 -> (-1, 1024, 14, 14)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Registering layer: Add_680 for ONNX node: Add_680 [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1855 for ONNX tensor: 1855 [03/25/2022-13:24:03] [V] [TRT] Add_680 [Add] outputs: [1855 -> (-1, 1024, 14, 14)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: Relu_681 [Relu] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1855 [03/25/2022-13:24:03] [V] [TRT] Relu_681 [Relu] inputs: [1855 -> (-1, 1024, 14, 14)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Registering layer: Relu_681 for ONNX node: Relu_681 [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1856 for ONNX tensor: 1856 [03/25/2022-13:24:03] [V] [TRT] Relu_681 [Relu] outputs: [1856 -> (-1, 1024, 14, 14)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: QuantizeLinear_684 [QuantizeLinear] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1856 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1857 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1858 [03/25/2022-13:24:03] [V] [TRT] QuantizeLinear_684 [QuantizeLinear] inputs: [1856 -> (-1, 1024, 14, 14)[FLOAT]], [1857 -> ()[FLOAT]], [1858 -> ()[INT8]], [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1859 for ONNX tensor: 1859 [03/25/2022-13:24:03] [V] [TRT] QuantizeLinear_684 [QuantizeLinear] outputs: [1859 -> (-1, 1024, 14, 14)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: QuantizeLinear_728 [QuantizeLinear] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1856 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1901 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1902 [03/25/2022-13:24:03] [V] [TRT] QuantizeLinear_728 [QuantizeLinear] inputs: [1856 -> (-1, 1024, 14, 14)[FLOAT]], [1901 -> ()[FLOAT]], [1902 -> ()[INT8]], [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1903 for ONNX tensor: 
1903 [03/25/2022-13:24:03] [V] [TRT] QuantizeLinear_728 [QuantizeLinear] outputs: [1903 -> (-1, 1024, 14, 14)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: DequantizeLinear_687 [DequantizeLinear] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1859 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1860 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1861 [03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_687 [DequantizeLinear] inputs: [1859 -> (-1, 1024, 14, 14)[FLOAT]], [1860 -> ()[FLOAT]], [1861 -> ()[INT8]], [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1862 for ONNX tensor: 1862 [03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_687 [DequantizeLinear] outputs: [1862 -> (-1, 1024, 14, 14)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: DequantizeLinear_731 [DequantizeLinear] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1903 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1904 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1905 [03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_731 [DequantizeLinear] inputs: [1903 -> (-1, 1024, 14, 14)[FLOAT]], [1904 -> ()[FLOAT]], [1905 -> ()[INT8]], [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1906 for ONNX tensor: 1906 [03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_731 [DequantizeLinear] outputs: [1906 -> (-1, 1024, 14, 14)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: Conv_694 [Conv] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1862 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1868 [03/25/2022-13:24:03] [V] [TRT] Conv_694 [Conv] inputs: [1862 -> (-1, 1024, 14, 14)[FLOAT]], [1868 -> (256, 1024, 1, 1)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Convolution input dimensions: (-1, 1024, 14, 14) [03/25/2022-13:24:03] [V] [TRT] Kernel weights are not set yet. Kernel weights must be set using setInput(1, kernel_tensor) API call. 
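Note that the block output 1908 (from Relu_733) is quantized twice, by QuantizeLinear_736 and QuantizeLinear_780, each with its own scale constant: one copy feeds the next bottleneck's 1x1 convolution via DequantizeLinear_739, the other feeds the stage's skip path via DequantizeLinear_783. This is the usual pattern when calibration assigns each consumer of a shared activation its own range. A sketch of the effect, with hypothetical scale values:

```python
import numpy as np

x = np.array([0.8, -0.3, 1.9], dtype=np.float32)  # one shared activation
for scale in (0.015, 0.021):                      # hypothetical per-consumer scales
    q = np.clip(np.rint(x / scale), -128, 127).astype(np.int8)
    # Different int8 codes per consumer, but similar reconstructed values.
    print(scale, q, q.astype(np.float32) * scale)
```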
[03/25/2022-13:24:03] [V] [TRT] Registering layer: Conv_694 for ONNX node: Conv_694 [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1869 for ONNX tensor: 1869 [03/25/2022-13:24:03] [V] [TRT] Conv_694 [Conv] outputs: [1869 -> (-1, 256, 14, 14)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: BatchNormalization_695 [BatchNormalization] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1869 [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.2.5.bn1.bn.weight [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.2.5.bn1.bn.bias [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.2.5.bn1.bn.running_mean [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.2.5.bn1.bn.running_var [03/25/2022-13:24:03] [V] [TRT] BatchNormalization_695 [BatchNormalization] inputs: [1869 -> (-1, 256, 14, 14)[FLOAT]], [sections.2.5.bn1.bn.weight -> (256)[FLOAT]], [sections.2.5.bn1.bn.bias -> (256)[FLOAT]], [sections.2.5.bn1.bn.running_mean -> (256)[FLOAT]], [sections.2.5.bn1.bn.running_var -> (256)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Registering layer: BatchNormalization_695 for ONNX node: BatchNormalization_695 [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1870 for ONNX tensor: 1870 [03/25/2022-13:24:03] [V] [TRT] BatchNormalization_695 [BatchNormalization] outputs: [1870 -> (-1, 256, 14, 14)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: Relu_696 [Relu] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1870 [03/25/2022-13:24:03] [V] [TRT] Relu_696 [Relu] inputs: [1870 -> (-1, 256, 14, 14)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Registering layer: Relu_696 for ONNX node: Relu_696 [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1871 for ONNX tensor: 1871 [03/25/2022-13:24:03] [V] [TRT] Relu_696 [Relu] outputs: [1871 -> (-1, 256, 14, 14)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: QuantizeLinear_699 [QuantizeLinear] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1871 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1872 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1873 [03/25/2022-13:24:03] [V] [TRT] QuantizeLinear_699 [QuantizeLinear] inputs: [1871 -> (-1, 256, 14, 14)[FLOAT]], [1872 -> ()[FLOAT]], [1873 -> ()[INT8]], [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1874 for ONNX tensor: 1874 [03/25/2022-13:24:03] [V] [TRT] QuantizeLinear_699 [QuantizeLinear] outputs: [1874 -> (-1, 256, 14, 14)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: DequantizeLinear_702 [DequantizeLinear] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1874 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1875 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1876 [03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_702 [DequantizeLinear] inputs: [1874 -> (-1, 256, 14, 14)[FLOAT]], [1875 -> ()[FLOAT]], [1876 -> ()[INT8]], [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1877 for ONNX tensor: 1877 [03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_702 [DequantizeLinear] outputs: [1877 -> (-1, 256, 14, 14)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: Conv_709 [Conv] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1877 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1883 [03/25/2022-13:24:03] [V] [TRT] Conv_709 [Conv] inputs: [1877 -> (-1, 256, 14, 14)[FLOAT]], [1883 -> (256, 256, 3, 3)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Convolution input dimensions: (-1, 256, 14, 14) [03/25/2022-13:24:03] [V] [TRT] Kernel weights are not set yet. 
Kernel weights must be set using setInput(1, kernel_tensor) API call. [03/25/2022-13:24:03] [V] [TRT] Registering layer: Conv_709 for ONNX node: Conv_709 [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1884 for ONNX tensor: 1884 [03/25/2022-13:24:03] [V] [TRT] Conv_709 [Conv] outputs: [1884 -> (-1, 256, 14, 14)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: BatchNormalization_710 [BatchNormalization] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1884 [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.2.5.bn2.bn.weight [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.2.5.bn2.bn.bias [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.2.5.bn2.bn.running_mean [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.2.5.bn2.bn.running_var [03/25/2022-13:24:03] [V] [TRT] BatchNormalization_710 [BatchNormalization] inputs: [1884 -> (-1, 256, 14, 14)[FLOAT]], [sections.2.5.bn2.bn.weight -> (256)[FLOAT]], [sections.2.5.bn2.bn.bias -> (256)[FLOAT]], [sections.2.5.bn2.bn.running_mean -> (256)[FLOAT]], [sections.2.5.bn2.bn.running_var -> (256)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Registering layer: BatchNormalization_710 for ONNX node: BatchNormalization_710 [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1885 for ONNX tensor: 1885 [03/25/2022-13:24:03] [V] [TRT] BatchNormalization_710 [BatchNormalization] outputs: [1885 -> (-1, 256, 14, 14)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: Relu_711 [Relu] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1885 [03/25/2022-13:24:03] [V] [TRT] Relu_711 [Relu] inputs: [1885 -> (-1, 256, 14, 14)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Registering layer: Relu_711 for ONNX node: Relu_711 [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1886 for ONNX tensor: 1886 [03/25/2022-13:24:03] [V] [TRT] Relu_711 [Relu] outputs: [1886 -> (-1, 256, 14, 14)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: QuantizeLinear_714 [QuantizeLinear] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1886 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1887 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1888 [03/25/2022-13:24:03] [V] [TRT] QuantizeLinear_714 [QuantizeLinear] inputs: [1886 -> (-1, 256, 14, 14)[FLOAT]], [1887 -> ()[FLOAT]], [1888 -> ()[INT8]], [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1889 for ONNX tensor: 1889 [03/25/2022-13:24:03] [V] [TRT] QuantizeLinear_714 [QuantizeLinear] outputs: [1889 -> (-1, 256, 14, 14)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: DequantizeLinear_717 [DequantizeLinear] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1889 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1890 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1891 [03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_717 [DequantizeLinear] inputs: [1889 -> (-1, 256, 14, 14)[FLOAT]], [1890 -> ()[FLOAT]], [1891 -> ()[INT8]], [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1892 for ONNX tensor: 1892 [03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_717 [DequantizeLinear] outputs: [1892 -> (-1, 256, 14, 14)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: Conv_724 [Conv] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1892 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1898 [03/25/2022-13:24:03] [V] [TRT] Conv_724 [Conv] inputs: [1892 -> (-1, 256, 14, 14)[FLOAT]], [1898 -> (1024, 256, 1, 1)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Convolution input dimensions: (-1, 256, 14, 14) [03/25/2022-13:24:03] [V] [TRT] Kernel weights are not set 
yet. Kernel weights must be set using setInput(1, kernel_tensor) API call. [03/25/2022-13:24:03] [V] [TRT] Registering layer: Conv_724 for ONNX node: Conv_724 [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1899 for ONNX tensor: 1899 [03/25/2022-13:24:03] [V] [TRT] Conv_724 [Conv] outputs: [1899 -> (-1, 1024, 14, 14)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: BatchNormalization_725 [BatchNormalization] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1899 [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.2.5.bn3.bn.weight [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.2.5.bn3.bn.bias [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.2.5.bn3.bn.running_mean [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.2.5.bn3.bn.running_var [03/25/2022-13:24:03] [V] [TRT] BatchNormalization_725 [BatchNormalization] inputs: [1899 -> (-1, 1024, 14, 14)[FLOAT]], [sections.2.5.bn3.bn.weight -> (1024)[FLOAT]], [sections.2.5.bn3.bn.bias -> (1024)[FLOAT]], [sections.2.5.bn3.bn.running_mean -> (1024)[FLOAT]], [sections.2.5.bn3.bn.running_var -> (1024)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Registering layer: BatchNormalization_725 for ONNX node: BatchNormalization_725 [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1900 for ONNX tensor: 1900 [03/25/2022-13:24:03] [V] [TRT] BatchNormalization_725 [BatchNormalization] outputs: [1900 -> (-1, 1024, 14, 14)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: Add_732 [Add] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1906 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1900 [03/25/2022-13:24:03] [V] [TRT] Add_732 [Add] inputs: [1906 -> (-1, 1024, 14, 14)[FLOAT]], [1900 -> (-1, 1024, 14, 14)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Registering layer: Add_732 for ONNX node: Add_732 [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1907 for ONNX tensor: 1907 [03/25/2022-13:24:03] [V] [TRT] Add_732 [Add] outputs: [1907 -> (-1, 1024, 14, 14)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: Relu_733 [Relu] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1907 [03/25/2022-13:24:03] [V] [TRT] Relu_733 [Relu] inputs: [1907 -> (-1, 1024, 14, 14)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Registering layer: Relu_733 for ONNX node: Relu_733 [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1908 for ONNX tensor: 1908 [03/25/2022-13:24:03] [V] [TRT] Relu_733 [Relu] outputs: [1908 -> (-1, 1024, 14, 14)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: QuantizeLinear_736 [QuantizeLinear] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1908 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1909 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1910 [03/25/2022-13:24:03] [V] [TRT] QuantizeLinear_736 [QuantizeLinear] inputs: [1908 -> (-1, 1024, 14, 14)[FLOAT]], [1909 -> ()[FLOAT]], [1910 -> ()[INT8]], [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1911 for ONNX tensor: 1911 [03/25/2022-13:24:03] [V] [TRT] QuantizeLinear_736 [QuantizeLinear] outputs: [1911 -> (-1, 1024, 14, 14)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: QuantizeLinear_780 [QuantizeLinear] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1908 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1953 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1954 [03/25/2022-13:24:03] [V] [TRT] QuantizeLinear_780 [QuantizeLinear] inputs: [1908 -> (-1, 1024, 14, 14)[FLOAT]], [1953 -> ()[FLOAT]], [1954 -> ()[INT8]], [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1955 for ONNX tensor: 
1955 [03/25/2022-13:24:03] [V] [TRT] QuantizeLinear_780 [QuantizeLinear] outputs: [1955 -> (-1, 1024, 14, 14)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: DequantizeLinear_739 [DequantizeLinear] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1911 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1912 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1913 [03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_739 [DequantizeLinear] inputs: [1911 -> (-1, 1024, 14, 14)[FLOAT]], [1912 -> ()[FLOAT]], [1913 -> ()[INT8]], [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1914 for ONNX tensor: 1914 [03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_739 [DequantizeLinear] outputs: [1914 -> (-1, 1024, 14, 14)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: DequantizeLinear_783 [DequantizeLinear] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1955 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1956 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1957 [03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_783 [DequantizeLinear] inputs: [1955 -> (-1, 1024, 14, 14)[FLOAT]], [1956 -> ()[FLOAT]], [1957 -> ()[INT8]], [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1958 for ONNX tensor: 1958 [03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_783 [DequantizeLinear] outputs: [1958 -> (-1, 1024, 14, 14)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: Conv_746 [Conv] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1914 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1920 [03/25/2022-13:24:03] [V] [TRT] Conv_746 [Conv] inputs: [1914 -> (-1, 1024, 14, 14)[FLOAT]], [1920 -> (512, 1024, 1, 1)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Convolution input dimensions: (-1, 1024, 14, 14) [03/25/2022-13:24:03] [V] [TRT] Kernel weights are not set yet. Kernel weights must be set using setInput(1, kernel_tensor) API call. [03/25/2022-13:24:03] [V] [TRT] Registering layer: Conv_746 for ONNX node: Conv_746 [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1921 for ONNX tensor: 1921 [03/25/2022-13:24:03] [V] [TRT] Conv_746 [Conv] outputs: [1921 -> (-1, 512, 14, 14)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: Conv_790 [Conv] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1958 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1964 [03/25/2022-13:24:03] [V] [TRT] Conv_790 [Conv] inputs: [1958 -> (-1, 1024, 14, 14)[FLOAT]], [1964 -> (2048, 1024, 1, 1)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Convolution input dimensions: (-1, 1024, 14, 14) [03/25/2022-13:24:03] [V] [TRT] Kernel weights are not set yet. Kernel weights must be set using setInput(1, kernel_tensor) API call. 
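Each convolution here is followed by a BatchNormalization node whose four parameter tensors (bn.weight, bn.bias, bn.running_mean, bn.running_var) the parser resolves by name, as with Conv_746 and its BatchNormalization_747 just below. At build time TensorRT folds such BN layers into the preceding convolution. The standard folding, sketched in numpy (the epsilon is the common default of 1e-5, an assumption since the log does not print it):

```python
import numpy as np

def fold_bn_into_conv(w, b, gamma, beta, mean, var, eps=1e-5):
    # y = gamma * (conv(x, w) + b - mean) / sqrt(var + eps) + beta
    # becomes a single conv with rescaled weights and a new bias.
    s = gamma / np.sqrt(var + eps)            # per-output-channel scale
    w_folded = w * s[:, None, None, None]     # w: (out_ch, in_ch, kH, kW)
    b_folded = (b - mean) * s + beta
    return w_folded, b_folded

# Shapes match Conv_746 / BatchNormalization_747; the values are random.
w = np.random.randn(512, 1024, 1, 1).astype(np.float32)
b = np.zeros(512, dtype=np.float32)           # these convs carry no bias input
gamma = np.ones(512, dtype=np.float32)
beta = np.zeros(512, dtype=np.float32)
mean = np.zeros(512, dtype=np.float32)
var = np.ones(512, dtype=np.float32)
w_f, b_f = fold_bn_into_conv(w, b, gamma, beta, mean, var)
```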
[03/25/2022-13:24:03] [V] [TRT] Registering layer: Conv_790 for ONNX node: Conv_790 [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1965 for ONNX tensor: 1965 [03/25/2022-13:24:03] [V] [TRT] Conv_790 [Conv] outputs: [1965 -> (-1, 2048, 7, 7)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: BatchNormalization_747 [BatchNormalization] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1921 [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.3.0.bn1.bn.weight [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.3.0.bn1.bn.bias [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.3.0.bn1.bn.running_mean [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.3.0.bn1.bn.running_var [03/25/2022-13:24:03] [V] [TRT] BatchNormalization_747 [BatchNormalization] inputs: [1921 -> (-1, 512, 14, 14)[FLOAT]], [sections.3.0.bn1.bn.weight -> (512)[FLOAT]], [sections.3.0.bn1.bn.bias -> (512)[FLOAT]], [sections.3.0.bn1.bn.running_mean -> (512)[FLOAT]], [sections.3.0.bn1.bn.running_var -> (512)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Registering layer: BatchNormalization_747 for ONNX node: BatchNormalization_747 [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1922 for ONNX tensor: 1922 [03/25/2022-13:24:03] [V] [TRT] BatchNormalization_747 [BatchNormalization] outputs: [1922 -> (-1, 512, 14, 14)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: BatchNormalization_791 [BatchNormalization] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1965 [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.3.0.identity.bn.bn.weight [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.3.0.identity.bn.bn.bias [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.3.0.identity.bn.bn.running_mean [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.3.0.identity.bn.bn.running_var [03/25/2022-13:24:03] [V] [TRT] BatchNormalization_791 [BatchNormalization] inputs: [1965 -> (-1, 2048, 7, 7)[FLOAT]], [sections.3.0.identity.bn.bn.weight -> (2048)[FLOAT]], [sections.3.0.identity.bn.bn.bias -> (2048)[FLOAT]], [sections.3.0.identity.bn.bn.running_mean -> (2048)[FLOAT]], [sections.3.0.identity.bn.bn.running_var -> (2048)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Registering layer: BatchNormalization_791 for ONNX node: BatchNormalization_791 [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1966 for ONNX tensor: 1966 [03/25/2022-13:24:03] [V] [TRT] BatchNormalization_791 [BatchNormalization] outputs: [1966 -> (-1, 2048, 7, 7)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: Relu_748 [Relu] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1922 [03/25/2022-13:24:03] [V] [TRT] Relu_748 [Relu] inputs: [1922 -> (-1, 512, 14, 14)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Registering layer: Relu_748 for ONNX node: Relu_748 [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1923 for ONNX tensor: 1923 [03/25/2022-13:24:03] [V] [TRT] Relu_748 [Relu] outputs: [1923 -> (-1, 512, 14, 14)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: QuantizeLinear_794 [QuantizeLinear] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1966 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1967 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1968 [03/25/2022-13:24:03] [V] [TRT] QuantizeLinear_794 [QuantizeLinear] inputs: [1966 -> (-1, 2048, 7, 7)[FLOAT]], [1967 -> ()[FLOAT]], [1968 -> ()[INT8]], [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1969 for ONNX tensor: 1969 [03/25/2022-13:24:03] [V] [TRT] QuantizeLinear_794 [QuantizeLinear] 
outputs: [1969 -> (-1, 2048, 7, 7)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: QuantizeLinear_751 [QuantizeLinear] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1923 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1924 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1925 [03/25/2022-13:24:03] [V] [TRT] QuantizeLinear_751 [QuantizeLinear] inputs: [1923 -> (-1, 512, 14, 14)[FLOAT]], [1924 -> ()[FLOAT]], [1925 -> ()[INT8]], [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1926 for ONNX tensor: 1926 [03/25/2022-13:24:03] [V] [TRT] QuantizeLinear_751 [QuantizeLinear] outputs: [1926 -> (-1, 512, 14, 14)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: DequantizeLinear_797 [DequantizeLinear] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1969 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1970 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1971 [03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_797 [DequantizeLinear] inputs: [1969 -> (-1, 2048, 7, 7)[FLOAT]], [1970 -> ()[FLOAT]], [1971 -> ()[INT8]], [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1972 for ONNX tensor: 1972 [03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_797 [DequantizeLinear] outputs: [1972 -> (-1, 2048, 7, 7)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: DequantizeLinear_754 [DequantizeLinear] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1926 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1927 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1928 [03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_754 [DequantizeLinear] inputs: [1926 -> (-1, 512, 14, 14)[FLOAT]], [1927 -> ()[FLOAT]], [1928 -> ()[INT8]], [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1929 for ONNX tensor: 1929 [03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_754 [DequantizeLinear] outputs: [1929 -> (-1, 512, 14, 14)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: Conv_761 [Conv] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1929 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1935 [03/25/2022-13:24:03] [V] [TRT] Conv_761 [Conv] inputs: [1929 -> (-1, 512, 14, 14)[FLOAT]], [1935 -> (512, 512, 3, 3)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Convolution input dimensions: (-1, 512, 14, 14) [03/25/2022-13:24:03] [V] [TRT] Kernel weights are not set yet. Kernel weights must be set using setInput(1, kernel_tensor) API call. 
[03/25/2022-13:24:03] [V] [TRT] Registering layer: Conv_761 for ONNX node: Conv_761 [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1936 for ONNX tensor: 1936 [03/25/2022-13:24:03] [V] [TRT] Conv_761 [Conv] outputs: [1936 -> (-1, 512, 7, 7)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: BatchNormalization_762 [BatchNormalization] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1936 [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.3.0.bn2.bn.weight [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.3.0.bn2.bn.bias [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.3.0.bn2.bn.running_mean [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.3.0.bn2.bn.running_var [03/25/2022-13:24:03] [V] [TRT] BatchNormalization_762 [BatchNormalization] inputs: [1936 -> (-1, 512, 7, 7)[FLOAT]], [sections.3.0.bn2.bn.weight -> (512)[FLOAT]], [sections.3.0.bn2.bn.bias -> (512)[FLOAT]], [sections.3.0.bn2.bn.running_mean -> (512)[FLOAT]], [sections.3.0.bn2.bn.running_var -> (512)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Registering layer: BatchNormalization_762 for ONNX node: BatchNormalization_762 [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1937 for ONNX tensor: 1937 [03/25/2022-13:24:03] [V] [TRT] BatchNormalization_762 [BatchNormalization] outputs: [1937 -> (-1, 512, 7, 7)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: Relu_763 [Relu] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1937 [03/25/2022-13:24:03] [V] [TRT] Relu_763 [Relu] inputs: [1937 -> (-1, 512, 7, 7)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Registering layer: Relu_763 for ONNX node: Relu_763 [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1938 for ONNX tensor: 1938 [03/25/2022-13:24:03] [V] [TRT] Relu_763 [Relu] outputs: [1938 -> (-1, 512, 7, 7)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: QuantizeLinear_766 [QuantizeLinear] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1938 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1939 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1940 [03/25/2022-13:24:03] [V] [TRT] QuantizeLinear_766 [QuantizeLinear] inputs: [1938 -> (-1, 512, 7, 7)[FLOAT]], [1939 -> ()[FLOAT]], [1940 -> ()[INT8]], [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1941 for ONNX tensor: 1941 [03/25/2022-13:24:03] [V] [TRT] QuantizeLinear_766 [QuantizeLinear] outputs: [1941 -> (-1, 512, 7, 7)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: DequantizeLinear_769 [DequantizeLinear] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1941 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1942 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1943 [03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_769 [DequantizeLinear] inputs: [1941 -> (-1, 512, 7, 7)[FLOAT]], [1942 -> ()[FLOAT]], [1943 -> ()[INT8]], [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1944 for ONNX tensor: 1944 [03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_769 [DequantizeLinear] outputs: [1944 -> (-1, 512, 7, 7)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: Conv_776 [Conv] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1944 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1950 [03/25/2022-13:24:03] [V] [TRT] Conv_776 [Conv] inputs: [1944 -> (-1, 512, 7, 7)[FLOAT]], [1950 -> (2048, 512, 1, 1)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Convolution input dimensions: (-1, 512, 7, 7) [03/25/2022-13:24:03] [V] [TRT] Kernel weights are not set yet. Kernel weights must be set using setInput(1, kernel_tensor) API call. 
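This first block of the final stage halves the spatial resolution: Conv_761's 3x3 kernel maps (-1, 512, 14, 14) to (-1, 512, 7, 7), and the 1x1 projection Conv_790 takes the skip path from (-1, 1024, 14, 14) to (-1, 2048, 7, 7). That is consistent with stride 2 on both (the parser does not print strides at this verbosity), per the usual output-size arithmetic:

```python
def conv_out(size, kernel, stride, pad):
    # Standard convolution output-size formula.
    return (size + 2 * pad - kernel) // stride + 1

print(conv_out(14, kernel=3, stride=2, pad=1))  # 7, matches Conv_761 (3x3)
print(conv_out(14, kernel=1, stride=2, pad=0))  # 7, matches Conv_790 (1x1)
```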
[03/25/2022-13:24:03] [V] [TRT] Registering layer: Conv_776 for ONNX node: Conv_776 [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1951 for ONNX tensor: 1951 [03/25/2022-13:24:03] [V] [TRT] Conv_776 [Conv] outputs: [1951 -> (-1, 2048, 7, 7)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: BatchNormalization_777 [BatchNormalization] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1951 [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.3.0.bn3.bn.weight [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.3.0.bn3.bn.bias [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.3.0.bn3.bn.running_mean [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.3.0.bn3.bn.running_var [03/25/2022-13:24:03] [V] [TRT] BatchNormalization_777 [BatchNormalization] inputs: [1951 -> (-1, 2048, 7, 7)[FLOAT]], [sections.3.0.bn3.bn.weight -> (2048)[FLOAT]], [sections.3.0.bn3.bn.bias -> (2048)[FLOAT]], [sections.3.0.bn3.bn.running_mean -> (2048)[FLOAT]], [sections.3.0.bn3.bn.running_var -> (2048)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Registering layer: BatchNormalization_777 for ONNX node: BatchNormalization_777 [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1952 for ONNX tensor: 1952 [03/25/2022-13:24:03] [V] [TRT] BatchNormalization_777 [BatchNormalization] outputs: [1952 -> (-1, 2048, 7, 7)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: Add_798 [Add] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1972 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1952 [03/25/2022-13:24:03] [V] [TRT] Add_798 [Add] inputs: [1972 -> (-1, 2048, 7, 7)[FLOAT]], [1952 -> (-1, 2048, 7, 7)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Registering layer: Add_798 for ONNX node: Add_798 [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1973 for ONNX tensor: 1973 [03/25/2022-13:24:03] [V] [TRT] Add_798 [Add] outputs: [1973 -> (-1, 2048, 7, 7)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: Relu_799 [Relu] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1973 [03/25/2022-13:24:03] [V] [TRT] Relu_799 [Relu] inputs: [1973 -> (-1, 2048, 7, 7)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Registering layer: Relu_799 for ONNX node: Relu_799 [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1974 for ONNX tensor: 1974 [03/25/2022-13:24:03] [V] [TRT] Relu_799 [Relu] outputs: [1974 -> (-1, 2048, 7, 7)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: QuantizeLinear_802 [QuantizeLinear] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1974 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1975 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1976 [03/25/2022-13:24:03] [V] [TRT] QuantizeLinear_802 [QuantizeLinear] inputs: [1974 -> (-1, 2048, 7, 7)[FLOAT]], [1975 -> ()[FLOAT]], [1976 -> ()[INT8]], [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1977 for ONNX tensor: 1977 [03/25/2022-13:24:03] [V] [TRT] QuantizeLinear_802 [QuantizeLinear] outputs: [1977 -> (-1, 2048, 7, 7)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: QuantizeLinear_846 [QuantizeLinear] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1974 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 2019 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 2020 [03/25/2022-13:24:03] [V] [TRT] QuantizeLinear_846 [QuantizeLinear] inputs: [1974 -> (-1, 2048, 7, 7)[FLOAT]], [2019 -> ()[FLOAT]], [2020 -> ()[INT8]], [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 2021 for ONNX tensor: 2021 [03/25/2022-13:24:03] [V] [TRT] QuantizeLinear_846 [QuantizeLinear] outputs: [2021 -> (-1, 
2048, 7, 7)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: DequantizeLinear_805 [DequantizeLinear] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1977 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1978 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1979 [03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_805 [DequantizeLinear] inputs: [1977 -> (-1, 2048, 7, 7)[FLOAT]], [1978 -> ()[FLOAT]], [1979 -> ()[INT8]], [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1980 for ONNX tensor: 1980 [03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_805 [DequantizeLinear] outputs: [1980 -> (-1, 2048, 7, 7)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: DequantizeLinear_849 [DequantizeLinear] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 2021 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 2022 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 2023 [03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_849 [DequantizeLinear] inputs: [2021 -> (-1, 2048, 7, 7)[FLOAT]], [2022 -> ()[FLOAT]], [2023 -> ()[INT8]], [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 2024 for ONNX tensor: 2024 [03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_849 [DequantizeLinear] outputs: [2024 -> (-1, 2048, 7, 7)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: Conv_812 [Conv] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1980 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1986 [03/25/2022-13:24:03] [V] [TRT] Conv_812 [Conv] inputs: [1980 -> (-1, 2048, 7, 7)[FLOAT]], [1986 -> (512, 2048, 1, 1)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Convolution input dimensions: (-1, 2048, 7, 7) [03/25/2022-13:24:03] [V] [TRT] Kernel weights are not set yet. Kernel weights must be set using setInput(1, kernel_tensor) API call. [03/25/2022-13:24:03] [V] [TRT] Registering layer: Conv_812 for ONNX node: Conv_812 [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1987 for ONNX tensor: 1987 [03/25/2022-13:24:03] [V] [TRT] Conv_812 [Conv] outputs: [1987 -> (-1, 512, 7, 7)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: BatchNormalization_813 [BatchNormalization] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1987 [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.3.1.bn1.bn.weight [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.3.1.bn1.bn.bias [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.3.1.bn1.bn.running_mean [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.3.1.bn1.bn.running_var [03/25/2022-13:24:03] [V] [TRT] BatchNormalization_813 [BatchNormalization] inputs: [1987 -> (-1, 512, 7, 7)[FLOAT]], [sections.3.1.bn1.bn.weight -> (512)[FLOAT]], [sections.3.1.bn1.bn.bias -> (512)[FLOAT]], [sections.3.1.bn1.bn.running_mean -> (512)[FLOAT]], [sections.3.1.bn1.bn.running_var -> (512)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Registering layer: BatchNormalization_813 for ONNX node: BatchNormalization_813 [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1988 for ONNX tensor: 1988 [03/25/2022-13:24:03] [V] [TRT] BatchNormalization_813 [BatchNormalization] outputs: [1988 -> (-1, 512, 7, 7)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: Relu_814 [Relu] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1988 [03/25/2022-13:24:03] [V] [TRT] Relu_814 [Relu] inputs: [1988 -> (-1, 512, 7, 7)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Registering layer: Relu_814 for ONNX node: Relu_814 [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1989 for ONNX tensor: 1989 [03/25/2022-13:24:03] [V] [TRT] Relu_814 [Relu] outputs: [1989 -> 
(-1, 512, 7, 7)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: QuantizeLinear_817 [QuantizeLinear] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1989 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1990 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1991 [03/25/2022-13:24:03] [V] [TRT] QuantizeLinear_817 [QuantizeLinear] inputs: [1989 -> (-1, 512, 7, 7)[FLOAT]], [1990 -> ()[FLOAT]], [1991 -> ()[INT8]], [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1992 for ONNX tensor: 1992 [03/25/2022-13:24:03] [V] [TRT] QuantizeLinear_817 [QuantizeLinear] outputs: [1992 -> (-1, 512, 7, 7)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: DequantizeLinear_820 [DequantizeLinear] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1992 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1993 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1994 [03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_820 [DequantizeLinear] inputs: [1992 -> (-1, 512, 7, 7)[FLOAT]], [1993 -> ()[FLOAT]], [1994 -> ()[INT8]], [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 1995 for ONNX tensor: 1995 [03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_820 [DequantizeLinear] outputs: [1995 -> (-1, 512, 7, 7)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: Conv_827 [Conv] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 1995 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 2001 [03/25/2022-13:24:03] [V] [TRT] Conv_827 [Conv] inputs: [1995 -> (-1, 512, 7, 7)[FLOAT]], [2001 -> (512, 512, 3, 3)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Convolution input dimensions: (-1, 512, 7, 7) [03/25/2022-13:24:03] [V] [TRT] Kernel weights are not set yet. Kernel weights must be set using setInput(1, kernel_tensor) API call. [03/25/2022-13:24:03] [V] [TRT] Registering layer: Conv_827 for ONNX node: Conv_827 [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 2002 for ONNX tensor: 2002 [03/25/2022-13:24:03] [V] [TRT] Conv_827 [Conv] outputs: [2002 -> (-1, 512, 7, 7)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: BatchNormalization_828 [BatchNormalization] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 2002 [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.3.1.bn2.bn.weight [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.3.1.bn2.bn.bias [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.3.1.bn2.bn.running_mean [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.3.1.bn2.bn.running_var [03/25/2022-13:24:03] [V] [TRT] BatchNormalization_828 [BatchNormalization] inputs: [2002 -> (-1, 512, 7, 7)[FLOAT]], [sections.3.1.bn2.bn.weight -> (512)[FLOAT]], [sections.3.1.bn2.bn.bias -> (512)[FLOAT]], [sections.3.1.bn2.bn.running_mean -> (512)[FLOAT]], [sections.3.1.bn2.bn.running_var -> (512)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Registering layer: BatchNormalization_828 for ONNX node: BatchNormalization_828 [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 2003 for ONNX tensor: 2003 [03/25/2022-13:24:03] [V] [TRT] BatchNormalization_828 [BatchNormalization] outputs: [2003 -> (-1, 512, 7, 7)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: Relu_829 [Relu] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 2003 [03/25/2022-13:24:03] [V] [TRT] Relu_829 [Relu] inputs: [2003 -> (-1, 512, 7, 7)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Registering layer: Relu_829 for ONNX node: Relu_829 [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 2004 for ONNX tensor: 2004 [03/25/2022-13:24:03] [V] [TRT] Relu_829 [Relu] outputs: [2004 -> (-1, 512, 7, 
7)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: QuantizeLinear_832 [QuantizeLinear] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 2004 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 2005 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 2006 [03/25/2022-13:24:03] [V] [TRT] QuantizeLinear_832 [QuantizeLinear] inputs: [2004 -> (-1, 512, 7, 7)[FLOAT]], [2005 -> ()[FLOAT]], [2006 -> ()[INT8]], [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 2007 for ONNX tensor: 2007 [03/25/2022-13:24:03] [V] [TRT] QuantizeLinear_832 [QuantizeLinear] outputs: [2007 -> (-1, 512, 7, 7)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: DequantizeLinear_835 [DequantizeLinear] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 2007 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 2008 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 2009 [03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_835 [DequantizeLinear] inputs: [2007 -> (-1, 512, 7, 7)[FLOAT]], [2008 -> ()[FLOAT]], [2009 -> ()[INT8]], [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 2010 for ONNX tensor: 2010 [03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_835 [DequantizeLinear] outputs: [2010 -> (-1, 512, 7, 7)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: Conv_842 [Conv] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 2010 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 2016 [03/25/2022-13:24:03] [V] [TRT] Conv_842 [Conv] inputs: [2010 -> (-1, 512, 7, 7)[FLOAT]], [2016 -> (2048, 512, 1, 1)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Convolution input dimensions: (-1, 512, 7, 7) [03/25/2022-13:24:03] [V] [TRT] Kernel weights are not set yet. Kernel weights must be set using setInput(1, kernel_tensor) API call. [03/25/2022-13:24:03] [V] [TRT] Registering layer: Conv_842 for ONNX node: Conv_842 [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 2017 for ONNX tensor: 2017 [03/25/2022-13:24:03] [V] [TRT] Conv_842 [Conv] outputs: [2017 -> (-1, 2048, 7, 7)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: BatchNormalization_843 [BatchNormalization] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 2017 [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.3.1.bn3.bn.weight [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.3.1.bn3.bn.bias [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.3.1.bn3.bn.running_mean [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.3.1.bn3.bn.running_var [03/25/2022-13:24:03] [V] [TRT] BatchNormalization_843 [BatchNormalization] inputs: [2017 -> (-1, 2048, 7, 7)[FLOAT]], [sections.3.1.bn3.bn.weight -> (2048)[FLOAT]], [sections.3.1.bn3.bn.bias -> (2048)[FLOAT]], [sections.3.1.bn3.bn.running_mean -> (2048)[FLOAT]], [sections.3.1.bn3.bn.running_var -> (2048)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Registering layer: BatchNormalization_843 for ONNX node: BatchNormalization_843 [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 2018 for ONNX tensor: 2018 [03/25/2022-13:24:03] [V] [TRT] BatchNormalization_843 [BatchNormalization] outputs: [2018 -> (-1, 2048, 7, 7)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: Add_850 [Add] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 2024 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 2018 [03/25/2022-13:24:03] [V] [TRT] Add_850 [Add] inputs: [2024 -> (-1, 2048, 7, 7)[FLOAT]], [2018 -> (-1, 2048, 7, 7)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Registering layer: Add_850 for ONNX node: Add_850 [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 2025 for ONNX tensor: 2025 
[03/25/2022-13:24:03] [V] [TRT] Add_850 [Add] outputs: [2025 -> (-1, 2048, 7, 7)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: Relu_851 [Relu] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 2025 [03/25/2022-13:24:03] [V] [TRT] Relu_851 [Relu] inputs: [2025 -> (-1, 2048, 7, 7)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Registering layer: Relu_851 for ONNX node: Relu_851 [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 2026 for ONNX tensor: 2026 [03/25/2022-13:24:03] [V] [TRT] Relu_851 [Relu] outputs: [2026 -> (-1, 2048, 7, 7)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: QuantizeLinear_854 [QuantizeLinear] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 2026 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 2027 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 2028 [03/25/2022-13:24:03] [V] [TRT] QuantizeLinear_854 [QuantizeLinear] inputs: [2026 -> (-1, 2048, 7, 7)[FLOAT]], [2027 -> ()[FLOAT]], [2028 -> ()[INT8]], [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 2029 for ONNX tensor: 2029 [03/25/2022-13:24:03] [V] [TRT] QuantizeLinear_854 [QuantizeLinear] outputs: [2029 -> (-1, 2048, 7, 7)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: QuantizeLinear_898 [QuantizeLinear] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 2026 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 2071 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 2072 [03/25/2022-13:24:03] [V] [TRT] QuantizeLinear_898 [QuantizeLinear] inputs: [2026 -> (-1, 2048, 7, 7)[FLOAT]], [2071 -> ()[FLOAT]], [2072 -> ()[INT8]], [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 2073 for ONNX tensor: 2073 [03/25/2022-13:24:03] [V] [TRT] QuantizeLinear_898 [QuantizeLinear] outputs: [2073 -> (-1, 2048, 7, 7)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: DequantizeLinear_857 [DequantizeLinear] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 2029 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 2030 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 2031 [03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_857 [DequantizeLinear] inputs: [2029 -> (-1, 2048, 7, 7)[FLOAT]], [2030 -> ()[FLOAT]], [2031 -> ()[INT8]], [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 2032 for ONNX tensor: 2032 [03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_857 [DequantizeLinear] outputs: [2032 -> (-1, 2048, 7, 7)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: DequantizeLinear_901 [DequantizeLinear] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 2073 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 2074 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 2075 [03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_901 [DequantizeLinear] inputs: [2073 -> (-1, 2048, 7, 7)[FLOAT]], [2074 -> ()[FLOAT]], [2075 -> ()[INT8]], [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 2076 for ONNX tensor: 2076 [03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_901 [DequantizeLinear] outputs: [2076 -> (-1, 2048, 7, 7)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: Conv_864 [Conv] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 2032 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 2038 [03/25/2022-13:24:03] [V] [TRT] Conv_864 [Conv] inputs: [2032 -> (-1, 2048, 7, 7)[FLOAT]], [2038 -> (512, 2048, 1, 1)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Convolution input dimensions: (-1, 2048, 7, 7) [03/25/2022-13:24:03] [V] [TRT] Kernel weights are not set yet. Kernel weights must be set using setInput(1, kernel_tensor) API call. 
[03/25/2022-13:24:03] [V] [TRT] Registering layer: Conv_864 for ONNX node: Conv_864 [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 2039 for ONNX tensor: 2039 [03/25/2022-13:24:03] [V] [TRT] Conv_864 [Conv] outputs: [2039 -> (-1, 512, 7, 7)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: BatchNormalization_865 [BatchNormalization] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 2039 [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.3.2.bn1.bn.weight [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.3.2.bn1.bn.bias [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.3.2.bn1.bn.running_mean [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.3.2.bn1.bn.running_var [03/25/2022-13:24:03] [V] [TRT] BatchNormalization_865 [BatchNormalization] inputs: [2039 -> (-1, 512, 7, 7)[FLOAT]], [sections.3.2.bn1.bn.weight -> (512)[FLOAT]], [sections.3.2.bn1.bn.bias -> (512)[FLOAT]], [sections.3.2.bn1.bn.running_mean -> (512)[FLOAT]], [sections.3.2.bn1.bn.running_var -> (512)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Registering layer: BatchNormalization_865 for ONNX node: BatchNormalization_865 [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 2040 for ONNX tensor: 2040 [03/25/2022-13:24:03] [V] [TRT] BatchNormalization_865 [BatchNormalization] outputs: [2040 -> (-1, 512, 7, 7)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: Relu_866 [Relu] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 2040 [03/25/2022-13:24:03] [V] [TRT] Relu_866 [Relu] inputs: [2040 -> (-1, 512, 7, 7)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Registering layer: Relu_866 for ONNX node: Relu_866 [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 2041 for ONNX tensor: 2041 [03/25/2022-13:24:03] [V] [TRT] Relu_866 [Relu] outputs: [2041 -> (-1, 512, 7, 7)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: QuantizeLinear_869 [QuantizeLinear] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 2041 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 2042 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 2043 [03/25/2022-13:24:03] [V] [TRT] QuantizeLinear_869 [QuantizeLinear] inputs: [2041 -> (-1, 512, 7, 7)[FLOAT]], [2042 -> ()[FLOAT]], [2043 -> ()[INT8]], [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 2044 for ONNX tensor: 2044 [03/25/2022-13:24:03] [V] [TRT] QuantizeLinear_869 [QuantizeLinear] outputs: [2044 -> (-1, 512, 7, 7)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: DequantizeLinear_872 [DequantizeLinear] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 2044 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 2045 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 2046 [03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_872 [DequantizeLinear] inputs: [2044 -> (-1, 512, 7, 7)[FLOAT]], [2045 -> ()[FLOAT]], [2046 -> ()[INT8]], [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 2047 for ONNX tensor: 2047 [03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_872 [DequantizeLinear] outputs: [2047 -> (-1, 512, 7, 7)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: Conv_879 [Conv] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 2047 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 2053 [03/25/2022-13:24:03] [V] [TRT] Conv_879 [Conv] inputs: [2047 -> (-1, 512, 7, 7)[FLOAT]], [2053 -> (512, 512, 3, 3)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Convolution input dimensions: (-1, 512, 7, 7) [03/25/2022-13:24:03] [V] [TRT] Kernel weights are not set yet. Kernel weights must be set using setInput(1, kernel_tensor) API call. 
[03/25/2022-13:24:03] [V] [TRT] Registering layer: Conv_879 for ONNX node: Conv_879 [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 2054 for ONNX tensor: 2054 [03/25/2022-13:24:03] [V] [TRT] Conv_879 [Conv] outputs: [2054 -> (-1, 512, 7, 7)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: BatchNormalization_880 [BatchNormalization] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 2054 [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.3.2.bn2.bn.weight [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.3.2.bn2.bn.bias [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.3.2.bn2.bn.running_mean [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.3.2.bn2.bn.running_var [03/25/2022-13:24:03] [V] [TRT] BatchNormalization_880 [BatchNormalization] inputs: [2054 -> (-1, 512, 7, 7)[FLOAT]], [sections.3.2.bn2.bn.weight -> (512)[FLOAT]], [sections.3.2.bn2.bn.bias -> (512)[FLOAT]], [sections.3.2.bn2.bn.running_mean -> (512)[FLOAT]], [sections.3.2.bn2.bn.running_var -> (512)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Registering layer: BatchNormalization_880 for ONNX node: BatchNormalization_880 [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 2055 for ONNX tensor: 2055 [03/25/2022-13:24:03] [V] [TRT] BatchNormalization_880 [BatchNormalization] outputs: [2055 -> (-1, 512, 7, 7)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: Relu_881 [Relu] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 2055 [03/25/2022-13:24:03] [V] [TRT] Relu_881 [Relu] inputs: [2055 -> (-1, 512, 7, 7)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Registering layer: Relu_881 for ONNX node: Relu_881 [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 2056 for ONNX tensor: 2056 [03/25/2022-13:24:03] [V] [TRT] Relu_881 [Relu] outputs: [2056 -> (-1, 512, 7, 7)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: QuantizeLinear_884 [QuantizeLinear] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 2056 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 2057 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 2058 [03/25/2022-13:24:03] [V] [TRT] QuantizeLinear_884 [QuantizeLinear] inputs: [2056 -> (-1, 512, 7, 7)[FLOAT]], [2057 -> ()[FLOAT]], [2058 -> ()[INT8]], [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 2059 for ONNX tensor: 2059 [03/25/2022-13:24:03] [V] [TRT] QuantizeLinear_884 [QuantizeLinear] outputs: [2059 -> (-1, 512, 7, 7)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: DequantizeLinear_887 [DequantizeLinear] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 2059 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 2060 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 2061 [03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_887 [DequantizeLinear] inputs: [2059 -> (-1, 512, 7, 7)[FLOAT]], [2060 -> ()[FLOAT]], [2061 -> ()[INT8]], [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 2062 for ONNX tensor: 2062 [03/25/2022-13:24:03] [V] [TRT] DequantizeLinear_887 [DequantizeLinear] outputs: [2062 -> (-1, 512, 7, 7)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: Conv_894 [Conv] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 2062 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 2068 [03/25/2022-13:24:03] [V] [TRT] Conv_894 [Conv] inputs: [2062 -> (-1, 512, 7, 7)[FLOAT]], [2068 -> (2048, 512, 1, 1)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Convolution input dimensions: (-1, 512, 7, 7) [03/25/2022-13:24:03] [V] [TRT] Kernel weights are not set yet. Kernel weights must be set using setInput(1, kernel_tensor) API call. 
[03/25/2022-13:24:03] [V] [TRT] Registering layer: Conv_894 for ONNX node: Conv_894 [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 2069 for ONNX tensor: 2069 [03/25/2022-13:24:03] [V] [TRT] Conv_894 [Conv] outputs: [2069 -> (-1, 2048, 7, 7)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: BatchNormalization_895 [BatchNormalization] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 2069 [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.3.2.bn3.bn.weight [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.3.2.bn3.bn.bias [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.3.2.bn3.bn.running_mean [03/25/2022-13:24:03] [V] [TRT] Searching for input: sections.3.2.bn3.bn.running_var [03/25/2022-13:24:03] [V] [TRT] BatchNormalization_895 [BatchNormalization] inputs: [2069 -> (-1, 2048, 7, 7)[FLOAT]], [sections.3.2.bn3.bn.weight -> (2048)[FLOAT]], [sections.3.2.bn3.bn.bias -> (2048)[FLOAT]], [sections.3.2.bn3.bn.running_mean -> (2048)[FLOAT]], [sections.3.2.bn3.bn.running_var -> (2048)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Registering layer: BatchNormalization_895 for ONNX node: BatchNormalization_895 [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 2070 for ONNX tensor: 2070 [03/25/2022-13:24:03] [V] [TRT] BatchNormalization_895 [BatchNormalization] outputs: [2070 -> (-1, 2048, 7, 7)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: Add_902 [Add] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 2076 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 2070 [03/25/2022-13:24:03] [V] [TRT] Add_902 [Add] inputs: [2076 -> (-1, 2048, 7, 7)[FLOAT]], [2070 -> (-1, 2048, 7, 7)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Registering layer: Add_902 for ONNX node: Add_902 [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 2077 for ONNX tensor: 2077 [03/25/2022-13:24:03] [V] [TRT] Add_902 [Add] outputs: [2077 -> (-1, 2048, 7, 7)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: Relu_903 [Relu] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 2077 [03/25/2022-13:24:03] [V] [TRT] Relu_903 [Relu] inputs: [2077 -> (-1, 2048, 7, 7)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Registering layer: Relu_903 for ONNX node: Relu_903 [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 2078 for ONNX tensor: 2078 [03/25/2022-13:24:03] [V] [TRT] Relu_903 [Relu] outputs: [2078 -> (-1, 2048, 7, 7)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: GlobalAveragePool_904 [GlobalAveragePool] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 2078 [03/25/2022-13:24:03] [V] [TRT] GlobalAveragePool_904 [GlobalAveragePool] inputs: [2078 -> (-1, 2048, 7, 7)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Registering layer: GlobalAveragePool_904 for ONNX node: GlobalAveragePool_904 [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 2079 for ONNX tensor: 2079 [03/25/2022-13:24:03] [V] [TRT] GlobalAveragePool_904 [GlobalAveragePool] outputs: [2079 -> (-1, 2048, 1, 1)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: Shape_905 [Shape] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 2079 [03/25/2022-13:24:03] [V] [TRT] Shape_905 [Shape] inputs: [2079 -> (-1, 2048, 1, 1)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Registering layer: Shape_905 for ONNX node: Shape_905 [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 2080 for ONNX tensor: 2080 [03/25/2022-13:24:03] [V] [TRT] Shape_905 [Shape] outputs: [2080 -> (4)[INT32]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: Gather_907 [Gather] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 2080 [03/25/2022-13:24:03] 
[V] [TRT] Searching for input: 2081 [03/25/2022-13:24:03] [V] [TRT] Gather_907 [Gather] inputs: [2080 -> (4)[INT32]], [2081 -> ()[INT32]], [03/25/2022-13:24:03] [V] [TRT] Registering layer: 2081 for ONNX node: 2081 [03/25/2022-13:24:03] [V] [TRT] Using Gather axis: 0 [03/25/2022-13:24:03] [V] [TRT] Registering layer: Gather_907 for ONNX node: Gather_907 [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 2082 for ONNX tensor: 2082 [03/25/2022-13:24:03] [V] [TRT] Gather_907 [Gather] outputs: [2082 -> ()[INT32]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: Unsqueeze_908 [Unsqueeze] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 2082 [03/25/2022-13:24:03] [V] [TRT] Unsqueeze_908 [Unsqueeze] inputs: [2082 -> ()[INT32]], [03/25/2022-13:24:03] [V] [TRT] Original shape: (), unsqueezing to: (1,) [03/25/2022-13:24:03] [V] [TRT] Registering layer: Unsqueeze_908 for ONNX node: Unsqueeze_908 [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 2084 for ONNX tensor: 2084 [03/25/2022-13:24:03] [V] [TRT] Unsqueeze_908 [Unsqueeze] outputs: [2084 -> (1)[INT32]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: Concat_909 [Concat] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 2084 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 2090 [03/25/2022-13:24:03] [V] [TRT] Concat_909 [Concat] inputs: [2084 -> (1)[INT32]], [2090 -> (1)[INT32]], [03/25/2022-13:24:03] [V] [TRT] Registering layer: 2090 for ONNX node: 2090 [03/25/2022-13:24:03] [V] [TRT] Registering layer: Concat_909 for ONNX node: Concat_909 [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 2086 for ONNX tensor: 2086 [03/25/2022-13:24:03] [V] [TRT] Concat_909 [Concat] outputs: [2086 -> (2)[INT32]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: Reshape_910 [Reshape] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 2079 [03/25/2022-13:24:03] [V] [TRT] Searching for input: 2086 [03/25/2022-13:24:03] [V] [TRT] Reshape_910 [Reshape] inputs: [2079 -> (-1, 2048, 1, 1)[FLOAT]], [2086 -> (2)[INT32]], [03/25/2022-13:24:03] [V] [TRT] Registering layer: Reshape_910 for ONNX node: Reshape_910 [03/25/2022-13:24:03] [V] [TRT] Registering tensor: 2087 for ONNX tensor: 2087 [03/25/2022-13:24:03] [V] [TRT] Reshape_910 [Reshape] outputs: [2087 -> (-1, 2048)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: Gemm_911 [Gemm] [03/25/2022-13:24:03] [V] [TRT] Searching for input: 2087 [03/25/2022-13:24:03] [V] [TRT] Searching for input: classifier.fc.weight [03/25/2022-13:24:03] [V] [TRT] Searching for input: classifier.fc.bias [03/25/2022-13:24:03] [V] [TRT] Gemm_911 [Gemm] inputs: [2087 -> (-1, 2048)[FLOAT]], [classifier.fc.weight -> (1000, 2048)[FLOAT]], [classifier.fc.bias -> (1000)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] GEMM: using FC layer instead of MM because all criteria were met. 
[03/25/2022-13:24:03] [V] [TRT] Original shape: (_, 2048), unsqueezing to: (_, _, _, _) [03/25/2022-13:24:03] [V] [TRT] Registering layer: Gemm_911 for ONNX node: Gemm_911 [03/25/2022-13:24:03] [V] [TRT] Original shape: (_, 1000, 1, 1), squeezing to: (_, _) [03/25/2022-13:24:03] [V] [TRT] Registering tensor: output_0_0 for ONNX tensor: output_0 [03/25/2022-13:24:03] [V] [TRT] Gemm_911 [Gemm] outputs: [output_0 -> (-1, 1000)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Parsing node: Softmax_912 [Softmax] [03/25/2022-13:24:03] [V] [TRT] Searching for input: output_0 [03/25/2022-13:24:03] [V] [TRT] Softmax_912 [Softmax] inputs: [output_0 -> (-1, 1000)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Registering layer: Softmax_912 for ONNX node: Softmax_912 [03/25/2022-13:24:03] [V] [TRT] Registering tensor: output_1_1 for ONNX tensor: output_1 [03/25/2022-13:24:03] [V] [TRT] Softmax_912 [Softmax] outputs: [output_1 -> (-1, 1000)[FLOAT]], [03/25/2022-13:24:03] [V] [TRT] Marking output_0_0 as output: output_0 [03/25/2022-13:24:03] [V] [TRT] Marking output_1_1 as output: output_1 [03/25/2022-13:24:03] [I] Finish parsing network model [03/25/2022-13:24:03] [I] FP32 and INT8 precisions have been specified - more performance might be enabled by additionally specifying --fp16 or --best [03/25/2022-13:24:03] [V] [TRT] Applying generic optimizations to the graph for inference. [03/25/2022-13:24:03] [V] [TRT] Original: 965 layers [03/25/2022-13:24:03] [V] [TRT] After dead-layer removal: 965 layers [03/25/2022-13:24:03] [V] [TRT] QDQ graph optimizer - constant folding of Q/DQ initializers [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 1) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 0) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 9) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 8) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 17) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 16) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 25) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 24) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 33) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 32) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 41) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 40) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 49) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 48) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 57) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 56) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 65) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 64) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: 
ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 73) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 72) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 81) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 80) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 89) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 88) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 97) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 96) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 105) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 104) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 113) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 112) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 121) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 120) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 129) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 128) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 137) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 136) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 145) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 144) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 153) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 152) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 161) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 160) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 169) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 168) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 177) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 176) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 185) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 184) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 193) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 192) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 201) [Constant] [03/25/2022-13:24:03] [V] [TRT] 
Removing (Unnamed Layer* 200) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 209) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 208) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 216) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 215) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 222) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 221) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 228) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 227) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 234) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 233) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 240) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 239) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 246) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 245) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 252) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 251) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 258) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 257) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 264) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 263) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 270) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 269) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 276) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 275) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 282) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 281) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 288) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 287) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 294) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 293) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 300) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 299) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] 
[V] [TRT] Removing (Unnamed Layer* 306) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 305) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 312) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 311) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 318) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 317) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 324) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 323) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 330) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 329) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 336) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 335) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 342) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 341) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 348) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 347) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 354) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 353) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 360) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 359) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 366) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 365) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 372) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 371) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 382) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 381) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 388) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 387) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 399) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 398) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 405) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 404) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 414) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 413) [Constant] 
[03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 424) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 423) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 430) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 429) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 439) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 438) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 448) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 447) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 458) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 457) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 464) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 463) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 473) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 472) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 482) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 481) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 492) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 491) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 498) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 497) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 509) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 508) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 515) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 514) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 524) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 523) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 534) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 533) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 540) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 539) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 549) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 548) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 558) 
[Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 557) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 568) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 567) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 574) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 573) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 583) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 582) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 592) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 591) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 602) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 601) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 608) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 607) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 617) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 616) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 626) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 625) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 636) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 635) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 642) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 641) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 653) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 652) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 659) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 658) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 668) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 667) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 678) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 677) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 684) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 683) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 693) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 692) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: 
ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 702) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 701) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 712) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 711) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 718) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 717) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 727) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 726) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 736) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 735) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 746) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 745) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 752) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 751) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 761) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 760) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 770) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 769) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 780) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 779) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 786) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 785) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 795) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 794) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 804) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 803) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 814) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 813) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 820) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 819) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 829) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 828) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 838) [Constant] [03/25/2022-13:24:03] [V] 
[TRT] Removing (Unnamed Layer* 837) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 848) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 847) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 854) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 853) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 865) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 864) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 871) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 870) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 880) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 879) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 890) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 889) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 896) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 895) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 905) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 904) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 914) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 913) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 924) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 923) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 930) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 929) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 939) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 938) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 948) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 947) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 5) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 4) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 13) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 12) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 21) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 20) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] 
[V] [TRT] Removing (Unnamed Layer* 29) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 28) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 37) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 36) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 45) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 44) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 53) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 52) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 61) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 60) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 69) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 68) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 77) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 76) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 85) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 84) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 93) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 92) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 101) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 100) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 109) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 108) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 117) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 116) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 125) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 124) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 133) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 132) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 141) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 140) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 149) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 148) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 157) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 156) [Constant] [03/25/2022-13:24:03] 
[V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 165) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 164) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 173) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 172) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 181) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 180) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 189) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 188) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 197) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 196) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 205) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 204) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 213) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 212) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 219) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 218) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 225) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 224) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 231) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 230) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 237) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 236) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 243) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 242) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 249) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 248) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 255) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 254) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 261) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 260) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 267) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 266) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 273) [Constant] 
[03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 272) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 279) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 278) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 285) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 284) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 291) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 290) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 297) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 296) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 303) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 302) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 309) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 308) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 315) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 314) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 321) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 320) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 327) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 326) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 333) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 332) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 339) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 338) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 345) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 344) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 351) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 350) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 357) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 356) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 363) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 362) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 369) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 368) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: 
ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 375) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 374) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 385) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 384) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 391) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 390) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 402) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 401) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 408) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 407) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 417) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 416) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 427) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 426) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 433) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 432) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 442) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 441) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 451) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 450) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 461) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 460) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 467) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 466) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 476) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 475) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 485) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 484) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 495) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 494) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 501) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 500) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 512) [Constant] [03/25/2022-13:24:03] [V] 
[TRT] Removing (Unnamed Layer* 511) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 518) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 517) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 527) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 526) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 537) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 536) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 543) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 542) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 552) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 551) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 561) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 560) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 571) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 570) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 577) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 576) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 586) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 585) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 595) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 594) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 605) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 604) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 611) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 610) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 620) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 619) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 629) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 628) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 639) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 638) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 645) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 644) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion 
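For reference while reading this long run of constant folds: the arithmetic each Q/DQ pair encodes is a saturating int8 round-trip, quantize(x) = clamp(round(x / scale) + zero_point, -128, 127) and dequantize(q) = (q - zero_point) * scale. A small numpy check of that round-trip (the scale and input values are made up for illustration):

import numpy as np

def quantize_linear(x, scale, zero_point):
    # ONNX QuantizeLinear: round half to even, then saturate to int8.
    q = np.round(x / scale) + zero_point
    return np.clip(q, -128, 127).astype(np.int8)

def dequantize_linear(q, scale, zero_point):
    # ONNX DequantizeLinear: back to float; the quantization error stays.
    return (q.astype(np.float32) - zero_point) * scale

x = np.array([-1.0, -0.05, 0.0, 0.07, 1.0], dtype=np.float32)
xq = quantize_linear(x, scale=0.1, zero_point=0)
xdq = dequantize_linear(xq, scale=0.1, zero_point=0)
print(xq, xdq)  # reconstruction error is at most scale/2 when not saturated

The builder does not keep these as standalone kernels; the passes further down push each Q into its producer and each DQ into its consumer so the convolutions operate directly on the int8 codes.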
[03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 656) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 655) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 662) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 661) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 671) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 670) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 681) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 680) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 687) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 686) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 696) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 695) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 705) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 704) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 715) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 714) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 721) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 720) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 730) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 729) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 739) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 738) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 749) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 748) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 755) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 754) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 764) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 763) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 773) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 772) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 783) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 782) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 789) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 
788) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 798) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 797) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 807) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 806) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 817) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 816) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 823) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 822) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 832) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 831) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 841) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 840) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 851) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 850) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 857) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 856) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 868) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 867) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 874) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 873) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 883) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 882) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 893) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 892) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 899) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 898) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 908) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 907) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 917) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 916) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 927) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 926) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing 
(Unnamed Layer* 933) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 932) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 942) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 941) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ConstQDQInitializersFusion [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 951) [Constant] [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 950) [Constant] [03/25/2022-13:24:03] [V] [TRT] Running: ShuffleShuffleFusion [03/25/2022-13:24:03] [V] [TRT] ShuffleShuffleFusion: Fusing Reshape_910 with (Unnamed Layer* 970) [Shuffle] [03/25/2022-13:24:03] [V] [TRT] Running: ShuffleErasure [03/25/2022-13:24:03] [V] [TRT] Removing Reshape_910 + (Unnamed Layer* 970) [Shuffle] [03/25/2022-13:24:03] [V] [TRT] Running: ShuffleErasure [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 981) [Shuffle] [03/25/2022-13:24:03] [V] [TRT] Running: ShuffleErasure [03/25/2022-13:24:03] [V] [TRT] Removing (Unnamed Layer* 983) [Shuffle] [03/25/2022-13:24:03] [V] [TRT] After Myelin optimization: 473 layers [03/25/2022-13:24:03] [V] [TRT] Running: FCToConvTransform [03/25/2022-13:24:03] [V] [TRT] Convert layer type of Gemm_911 from FULLY_CONNECTED to CONVOLUTION [03/25/2022-13:24:03] [V] [TRT] Running: ShuffleErasure [03/25/2022-13:24:03] [V] [TRT] Removing shuffle_between_2079_and_Gemm_911 [03/25/2022-13:24:03] [V] [TRT] QDQ graph optimizer - constant folding of Q/DQ initializers [03/25/2022-13:24:03] [V] [TRT] QDQ graph optimizer forward pass - DQ motions and fusions [03/25/2022-13:24:03] [V] [TRT] Running: EltReluFusion [03/25/2022-13:24:03] [V] [TRT] EltReluFusion: Fusing Add_80 with Relu_81 [03/25/2022-13:24:03] [V] [TRT] Running: EltReluFusion [03/25/2022-13:24:03] [V] [TRT] EltReluFusion: Fusing Add_132 with Relu_133 [03/25/2022-13:24:03] [V] [TRT] Running: EltReluFusion [03/25/2022-13:24:03] [V] [TRT] EltReluFusion: Fusing Add_184 with Relu_185 [03/25/2022-13:24:03] [V] [TRT] Running: EltReluFusion [03/25/2022-13:24:03] [V] [TRT] EltReluFusion: Fusing Add_250 with Relu_251 [03/25/2022-13:24:03] [V] [TRT] Running: EltReluFusion [03/25/2022-13:24:03] [V] [TRT] EltReluFusion: Fusing Add_302 with Relu_303 [03/25/2022-13:24:03] [V] [TRT] Running: EltReluFusion [03/25/2022-13:24:03] [V] [TRT] EltReluFusion: Fusing Add_354 with Relu_355 [03/25/2022-13:24:03] [V] [TRT] Running: EltReluFusion [03/25/2022-13:24:03] [V] [TRT] EltReluFusion: Fusing Add_406 with Relu_407 [03/25/2022-13:24:03] [V] [TRT] Running: EltReluFusion [03/25/2022-13:24:03] [V] [TRT] EltReluFusion: Fusing Add_472 with Relu_473 [03/25/2022-13:24:03] [V] [TRT] Running: EltReluFusion [03/25/2022-13:24:03] [V] [TRT] EltReluFusion: Fusing Add_524 with Relu_525 [03/25/2022-13:24:03] [V] [TRT] Running: EltReluFusion [03/25/2022-13:24:03] [V] [TRT] EltReluFusion: Fusing Add_576 with Relu_577 [03/25/2022-13:24:03] [V] [TRT] Running: EltReluFusion [03/25/2022-13:24:03] [V] [TRT] EltReluFusion: Fusing Add_628 with Relu_629 [03/25/2022-13:24:03] [V] [TRT] Running: EltReluFusion [03/25/2022-13:24:03] [V] [TRT] EltReluFusion: Fusing Add_680 with Relu_681 [03/25/2022-13:24:03] [V] [TRT] Running: EltReluFusion [03/25/2022-13:24:03] [V] [TRT] EltReluFusion: Fusing Add_732 with Relu_733 [03/25/2022-13:24:03] [V] [TRT] Running: EltReluFusion [03/25/2022-13:24:03] [V] [TRT] EltReluFusion: Fusing Add_798 with Relu_799 [03/25/2022-13:24:03] [V] [TRT] Running: EltReluFusion [03/25/2022-13:24:03] 
[V] [TRT] EltReluFusion: Fusing Add_850 with Relu_851 [03/25/2022-13:24:03] [V] [TRT] Running: EltReluFusion [03/25/2022-13:24:03] [V] [TRT] EltReluFusion: Fusing Add_902 with Relu_903 [03/25/2022-13:24:03] [V] [TRT] Running: ReduceToPoolingFusion [03/25/2022-13:24:03] [V] [TRT] Swap the layer type of GlobalAveragePool_904 from REDUCE to POOLING [03/25/2022-13:24:03] [V] [TRT] Running: ConstWeightsQuantizeFusion [03/25/2022-13:24:03] [V] [TRT] ConstWeightsQuantizeFusion: Fusing input.conv.module.weight with QuantizeLinear_8_quantize_scale_node [03/25/2022-13:24:03] [V] [TRT] Running: ConstWeightsQuantizeFusion [03/25/2022-13:24:03] [V] [TRT] ConstWeightsQuantizeFusion: Fusing sections.0.0.conv1.module.weight with QuantizeLinear_24_quantize_scale_node [03/25/2022-13:24:03] [V] [TRT] Running: ConstWeightsQuantizeFusion [03/25/2022-13:24:03] [V] [TRT] ConstWeightsQuantizeFusion: Fusing sections.0.0.conv2.module.weight with QuantizeLinear_39_quantize_scale_node [03/25/2022-13:24:03] [V] [TRT] Running: ConstWeightsQuantizeFusion [03/25/2022-13:24:03] [V] [TRT] ConstWeightsQuantizeFusion: Fusing sections.0.0.conv3.module.weight with QuantizeLinear_54_quantize_scale_node [03/25/2022-13:24:03] [V] [TRT] Running: ConstWeightsQuantizeFusion [03/25/2022-13:24:03] [V] [TRT] ConstWeightsQuantizeFusion: Fusing sections.0.0.identity.conv.module.weight with QuantizeLinear_68_quantize_scale_node [03/25/2022-13:24:03] [V] [TRT] Running: ConstWeightsQuantizeFusion [03/25/2022-13:24:03] [V] [TRT] ConstWeightsQuantizeFusion: Fusing sections.0.1.conv1.module.weight with QuantizeLinear_90_quantize_scale_node [03/25/2022-13:24:03] [V] [TRT] Running: ConstWeightsQuantizeFusion [03/25/2022-13:24:03] [V] [TRT] ConstWeightsQuantizeFusion: Fusing sections.0.1.conv2.module.weight with QuantizeLinear_105_quantize_scale_node [03/25/2022-13:24:03] [V] [TRT] Running: ConstWeightsQuantizeFusion [03/25/2022-13:24:03] [V] [TRT] ConstWeightsQuantizeFusion: Fusing sections.0.1.conv3.module.weight with QuantizeLinear_120_quantize_scale_node [03/25/2022-13:24:03] [V] [TRT] Running: ConstWeightsQuantizeFusion [03/25/2022-13:24:03] [V] [TRT] ConstWeightsQuantizeFusion: Fusing sections.0.2.conv1.module.weight with QuantizeLinear_142_quantize_scale_node [03/25/2022-13:24:03] [V] [TRT] Running: ConstWeightsQuantizeFusion [03/25/2022-13:24:03] [V] [TRT] ConstWeightsQuantizeFusion: Fusing sections.0.2.conv2.module.weight with QuantizeLinear_157_quantize_scale_node [03/25/2022-13:24:03] [V] [TRT] Running: ConstWeightsQuantizeFusion [03/25/2022-13:24:03] [V] [TRT] ConstWeightsQuantizeFusion: Fusing sections.0.2.conv3.module.weight with QuantizeLinear_172_quantize_scale_node [03/25/2022-13:24:03] [V] [TRT] Running: ConstWeightsQuantizeFusion [03/25/2022-13:24:03] [V] [TRT] ConstWeightsQuantizeFusion: Fusing sections.1.0.conv1.module.weight with QuantizeLinear_194_quantize_scale_node [03/25/2022-13:24:03] [V] [TRT] Running: ConstWeightsQuantizeFusion [03/25/2022-13:24:03] [V] [TRT] ConstWeightsQuantizeFusion: Fusing sections.1.0.conv2.module.weight with QuantizeLinear_209_quantize_scale_node [03/25/2022-13:24:03] [V] [TRT] Running: ConstWeightsQuantizeFusion [03/25/2022-13:24:03] [V] [TRT] ConstWeightsQuantizeFusion: Fusing sections.1.0.conv3.module.weight with QuantizeLinear_224_quantize_scale_node [03/25/2022-13:24:03] [V] [TRT] Running: ConstWeightsQuantizeFusion [03/25/2022-13:24:03] [V] [TRT] ConstWeightsQuantizeFusion: Fusing sections.1.0.identity.conv.module.weight with QuantizeLinear_238_quantize_scale_node [03/25/2022-13:24:03] [V] 
[TRT] Running: ConstWeightsQuantizeFusion [03/25/2022-13:24:03] [V] [TRT] ConstWeightsQuantizeFusion: Fusing sections.1.1.conv1.module.weight with QuantizeLinear_260_quantize_scale_node [03/25/2022-13:24:03] [V] [TRT] Running: ConstWeightsQuantizeFusion [03/25/2022-13:24:03] [V] [TRT] ConstWeightsQuantizeFusion: Fusing sections.1.1.conv2.module.weight with QuantizeLinear_275_quantize_scale_node [03/25/2022-13:24:03] [V] [TRT] Running: ConstWeightsQuantizeFusion [03/25/2022-13:24:03] [V] [TRT] ConstWeightsQuantizeFusion: Fusing sections.1.1.conv3.module.weight with QuantizeLinear_290_quantize_scale_node [03/25/2022-13:24:03] [V] [TRT] Running: ConstWeightsQuantizeFusion [03/25/2022-13:24:03] [V] [TRT] ConstWeightsQuantizeFusion: Fusing sections.1.2.conv1.module.weight with QuantizeLinear_312_quantize_scale_node [03/25/2022-13:24:03] [V] [TRT] Running: ConstWeightsQuantizeFusion [03/25/2022-13:24:03] [V] [TRT] ConstWeightsQuantizeFusion: Fusing sections.1.2.conv2.module.weight with QuantizeLinear_327_quantize_scale_node [03/25/2022-13:24:03] [V] [TRT] Running: ConstWeightsQuantizeFusion [03/25/2022-13:24:03] [V] [TRT] ConstWeightsQuantizeFusion: Fusing sections.1.2.conv3.module.weight with QuantizeLinear_342_quantize_scale_node [03/25/2022-13:24:03] [V] [TRT] Running: ConstWeightsQuantizeFusion [03/25/2022-13:24:03] [V] [TRT] ConstWeightsQuantizeFusion: Fusing sections.1.3.conv1.module.weight with QuantizeLinear_364_quantize_scale_node [03/25/2022-13:24:03] [V] [TRT] Running: ConstWeightsQuantizeFusion [03/25/2022-13:24:03] [V] [TRT] ConstWeightsQuantizeFusion: Fusing sections.1.3.conv2.module.weight with QuantizeLinear_379_quantize_scale_node [03/25/2022-13:24:03] [V] [TRT] Running: ConstWeightsQuantizeFusion [03/25/2022-13:24:03] [V] [TRT] ConstWeightsQuantizeFusion: Fusing sections.1.3.conv3.module.weight with QuantizeLinear_394_quantize_scale_node [03/25/2022-13:24:03] [V] [TRT] Running: ConstWeightsQuantizeFusion [03/25/2022-13:24:03] [V] [TRT] ConstWeightsQuantizeFusion: Fusing sections.2.0.conv1.module.weight with QuantizeLinear_416_quantize_scale_node [03/25/2022-13:24:03] [V] [TRT] Running: ConstWeightsQuantizeFusion [03/25/2022-13:24:03] [V] [TRT] ConstWeightsQuantizeFusion: Fusing sections.2.0.conv2.module.weight with QuantizeLinear_431_quantize_scale_node [03/25/2022-13:24:03] [V] [TRT] Running: ConstWeightsQuantizeFusion [03/25/2022-13:24:03] [V] [TRT] ConstWeightsQuantizeFusion: Fusing sections.2.0.conv3.module.weight with QuantizeLinear_446_quantize_scale_node [03/25/2022-13:24:03] [V] [TRT] Running: ConstWeightsQuantizeFusion [03/25/2022-13:24:03] [V] [TRT] ConstWeightsQuantizeFusion: Fusing sections.2.0.identity.conv.module.weight with QuantizeLinear_460_quantize_scale_node [03/25/2022-13:24:03] [V] [TRT] Running: ConstWeightsQuantizeFusion [03/25/2022-13:24:03] [V] [TRT] ConstWeightsQuantizeFusion: Fusing sections.2.1.conv1.module.weight with QuantizeLinear_482_quantize_scale_node [03/25/2022-13:24:03] [V] [TRT] Running: ConstWeightsQuantizeFusion [03/25/2022-13:24:03] [V] [TRT] ConstWeightsQuantizeFusion: Fusing sections.2.1.conv2.module.weight with QuantizeLinear_497_quantize_scale_node [03/25/2022-13:24:03] [V] [TRT] Running: ConstWeightsQuantizeFusion [03/25/2022-13:24:03] [V] [TRT] ConstWeightsQuantizeFusion: Fusing sections.2.1.conv3.module.weight with QuantizeLinear_512_quantize_scale_node [03/25/2022-13:24:03] [V] [TRT] Running: ConstWeightsQuantizeFusion [03/25/2022-13:24:03] [V] [TRT] ConstWeightsQuantizeFusion: Fusing sections.2.2.conv1.module.weight with 
QuantizeLinear_534_quantize_scale_node [03/25/2022-13:24:03] [V] [TRT] Running: ConstWeightsQuantizeFusion [03/25/2022-13:24:03] [V] [TRT] ConstWeightsQuantizeFusion: Fusing sections.2.2.conv2.module.weight with QuantizeLinear_549_quantize_scale_node [03/25/2022-13:24:03] [V] [TRT] Running: ConstWeightsQuantizeFusion [03/25/2022-13:24:03] [V] [TRT] ConstWeightsQuantizeFusion: Fusing sections.2.2.conv3.module.weight with QuantizeLinear_564_quantize_scale_node [03/25/2022-13:24:03] [V] [TRT] Running: ConstWeightsQuantizeFusion [03/25/2022-13:24:03] [V] [TRT] ConstWeightsQuantizeFusion: Fusing sections.2.3.conv1.module.weight with QuantizeLinear_586_quantize_scale_node [03/25/2022-13:24:03] [V] [TRT] Running: ConstWeightsQuantizeFusion [03/25/2022-13:24:03] [V] [TRT] ConstWeightsQuantizeFusion: Fusing sections.2.3.conv2.module.weight with QuantizeLinear_601_quantize_scale_node [03/25/2022-13:24:03] [V] [TRT] Running: ConstWeightsQuantizeFusion [03/25/2022-13:24:03] [V] [TRT] ConstWeightsQuantizeFusion: Fusing sections.2.3.conv3.module.weight with QuantizeLinear_616_quantize_scale_node [03/25/2022-13:24:03] [V] [TRT] Running: ConstWeightsQuantizeFusion [03/25/2022-13:24:03] [V] [TRT] ConstWeightsQuantizeFusion: Fusing sections.2.4.conv1.module.weight with QuantizeLinear_638_quantize_scale_node [03/25/2022-13:24:03] [V] [TRT] Running: ConstWeightsQuantizeFusion [03/25/2022-13:24:03] [V] [TRT] ConstWeightsQuantizeFusion: Fusing sections.2.4.conv2.module.weight with QuantizeLinear_653_quantize_scale_node [03/25/2022-13:24:03] [V] [TRT] Running: ConstWeightsQuantizeFusion [03/25/2022-13:24:03] [V] [TRT] ConstWeightsQuantizeFusion: Fusing sections.2.4.conv3.module.weight with QuantizeLinear_668_quantize_scale_node [03/25/2022-13:24:03] [V] [TRT] Running: ConstWeightsQuantizeFusion [03/25/2022-13:24:03] [V] [TRT] ConstWeightsQuantizeFusion: Fusing sections.2.5.conv1.module.weight with QuantizeLinear_690_quantize_scale_node [03/25/2022-13:24:03] [V] [TRT] Running: ConstWeightsQuantizeFusion [03/25/2022-13:24:03] [V] [TRT] ConstWeightsQuantizeFusion: Fusing sections.2.5.conv2.module.weight with QuantizeLinear_705_quantize_scale_node [03/25/2022-13:24:03] [V] [TRT] Running: ConstWeightsQuantizeFusion [03/25/2022-13:24:03] [V] [TRT] ConstWeightsQuantizeFusion: Fusing sections.2.5.conv3.module.weight with QuantizeLinear_720_quantize_scale_node [03/25/2022-13:24:03] [V] [TRT] Running: ConstWeightsQuantizeFusion [03/25/2022-13:24:03] [V] [TRT] ConstWeightsQuantizeFusion: Fusing sections.3.0.conv1.module.weight with QuantizeLinear_742_quantize_scale_node [03/25/2022-13:24:03] [V] [TRT] Running: ConstWeightsQuantizeFusion [03/25/2022-13:24:03] [V] [TRT] ConstWeightsQuantizeFusion: Fusing sections.3.0.conv2.module.weight with QuantizeLinear_757_quantize_scale_node [03/25/2022-13:24:03] [V] [TRT] Running: ConstWeightsQuantizeFusion [03/25/2022-13:24:03] [V] [TRT] ConstWeightsQuantizeFusion: Fusing sections.3.0.conv3.module.weight with QuantizeLinear_772_quantize_scale_node [03/25/2022-13:24:03] [V] [TRT] Running: ConstWeightsQuantizeFusion [03/25/2022-13:24:03] [V] [TRT] ConstWeightsQuantizeFusion: Fusing sections.3.0.identity.conv.module.weight with QuantizeLinear_786_quantize_scale_node [03/25/2022-13:24:03] [V] [TRT] Running: ConstWeightsQuantizeFusion [03/25/2022-13:24:03] [V] [TRT] ConstWeightsQuantizeFusion: Fusing sections.3.1.conv1.module.weight with QuantizeLinear_808_quantize_scale_node [03/25/2022-13:24:03] [V] [TRT] Running: ConstWeightsQuantizeFusion [03/25/2022-13:24:03] [V] [TRT] 
ConstWeightsQuantizeFusion: Fusing sections.3.1.conv2.module.weight with QuantizeLinear_823_quantize_scale_node [03/25/2022-13:24:03] [V] [TRT] Running: ConstWeightsQuantizeFusion [03/25/2022-13:24:03] [V] [TRT] ConstWeightsQuantizeFusion: Fusing sections.3.1.conv3.module.weight with QuantizeLinear_838_quantize_scale_node [03/25/2022-13:24:03] [V] [TRT] Running: ConstWeightsQuantizeFusion [03/25/2022-13:24:03] [V] [TRT] ConstWeightsQuantizeFusion: Fusing sections.3.2.conv1.module.weight with QuantizeLinear_860_quantize_scale_node [03/25/2022-13:24:03] [V] [TRT] Running: ConstWeightsQuantizeFusion [03/25/2022-13:24:03] [V] [TRT] ConstWeightsQuantizeFusion: Fusing sections.3.2.conv2.module.weight with QuantizeLinear_875_quantize_scale_node [03/25/2022-13:24:03] [V] [TRT] Running: ConstWeightsQuantizeFusion [03/25/2022-13:24:03] [V] [TRT] ConstWeightsQuantizeFusion: Fusing sections.3.2.conv3.module.weight with QuantizeLinear_890_quantize_scale_node [03/25/2022-13:24:03] [V] [TRT] Running: VanillaSwapWithFollowingQ [03/25/2022-13:24:03] [V] [TRT] Swapping Relu_30 with QuantizeLinear_33_quantize_scale_node [03/25/2022-13:24:03] [V] [TRT] Running: VanillaSwapWithFollowingQ [03/25/2022-13:24:03] [V] [TRT] Swapping Relu_45 with QuantizeLinear_48_quantize_scale_node [03/25/2022-13:24:03] [V] [TRT] Running: VanillaSwapWithFollowingQ [03/25/2022-13:24:03] [V] [TRT] Swapping Relu_96 with QuantizeLinear_99_quantize_scale_node [03/25/2022-13:24:03] [V] [TRT] Running: VanillaSwapWithFollowingQ [03/25/2022-13:24:03] [V] [TRT] Swapping Relu_111 with QuantizeLinear_114_quantize_scale_node [03/25/2022-13:24:03] [V] [TRT] Running: VanillaSwapWithFollowingQ [03/25/2022-13:24:03] [V] [TRT] Swapping Relu_148 with QuantizeLinear_151_quantize_scale_node [03/25/2022-13:24:03] [V] [TRT] Running: VanillaSwapWithFollowingQ [03/25/2022-13:24:03] [V] [TRT] Swapping Relu_163 with QuantizeLinear_166_quantize_scale_node [03/25/2022-13:24:03] [V] [TRT] Running: VanillaSwapWithFollowingQ [03/25/2022-13:24:03] [V] [TRT] Swapping Relu_200 with QuantizeLinear_203_quantize_scale_node [03/25/2022-13:24:03] [V] [TRT] Running: VanillaSwapWithFollowingQ [03/25/2022-13:24:03] [V] [TRT] Swapping Relu_215 with QuantizeLinear_218_quantize_scale_node [03/25/2022-13:24:03] [V] [TRT] Running: VanillaSwapWithFollowingQ [03/25/2022-13:24:03] [V] [TRT] Swapping Relu_266 with QuantizeLinear_269_quantize_scale_node [03/25/2022-13:24:03] [V] [TRT] Running: VanillaSwapWithFollowingQ [03/25/2022-13:24:03] [V] [TRT] Swapping Relu_281 with QuantizeLinear_284_quantize_scale_node [03/25/2022-13:24:03] [V] [TRT] Running: VanillaSwapWithFollowingQ [03/25/2022-13:24:03] [V] [TRT] Swapping Relu_318 with QuantizeLinear_321_quantize_scale_node [03/25/2022-13:24:03] [V] [TRT] Running: VanillaSwapWithFollowingQ [03/25/2022-13:24:03] [V] [TRT] Swapping Relu_333 with QuantizeLinear_336_quantize_scale_node [03/25/2022-13:24:03] [V] [TRT] Running: VanillaSwapWithFollowingQ [03/25/2022-13:24:03] [V] [TRT] Swapping Relu_370 with QuantizeLinear_373_quantize_scale_node [03/25/2022-13:24:03] [V] [TRT] Running: VanillaSwapWithFollowingQ [03/25/2022-13:24:03] [V] [TRT] Swapping Relu_385 with QuantizeLinear_388_quantize_scale_node [03/25/2022-13:24:03] [V] [TRT] Running: VanillaSwapWithFollowingQ [03/25/2022-13:24:03] [V] [TRT] Swapping Relu_422 with QuantizeLinear_425_quantize_scale_node [03/25/2022-13:24:03] [V] [TRT] Running: VanillaSwapWithFollowingQ [03/25/2022-13:24:03] [V] [TRT] Swapping Relu_437 with QuantizeLinear_440_quantize_scale_node [03/25/2022-13:24:03] 
[V] [TRT] Running: VanillaSwapWithFollowingQ [03/25/2022-13:24:03] [V] [TRT] Swapping Relu_488 with QuantizeLinear_491_quantize_scale_node [03/25/2022-13:24:03] [V] [TRT] Running: VanillaSwapWithFollowingQ [03/25/2022-13:24:03] [V] [TRT] Swapping Relu_503 with QuantizeLinear_506_quantize_scale_node [03/25/2022-13:24:03] [V] [TRT] Running: VanillaSwapWithFollowingQ [03/25/2022-13:24:03] [V] [TRT] Swapping Relu_540 with QuantizeLinear_543_quantize_scale_node [03/25/2022-13:24:03] [V] [TRT] Running: VanillaSwapWithFollowingQ [03/25/2022-13:24:03] [V] [TRT] Swapping Relu_555 with QuantizeLinear_558_quantize_scale_node [03/25/2022-13:24:03] [V] [TRT] Running: VanillaSwapWithFollowingQ [03/25/2022-13:24:03] [V] [TRT] Swapping Relu_592 with QuantizeLinear_595_quantize_scale_node [03/25/2022-13:24:03] [V] [TRT] Running: VanillaSwapWithFollowingQ [03/25/2022-13:24:03] [V] [TRT] Swapping Relu_607 with QuantizeLinear_610_quantize_scale_node [03/25/2022-13:24:03] [V] [TRT] Running: VanillaSwapWithFollowingQ [03/25/2022-13:24:03] [V] [TRT] Swapping Relu_644 with QuantizeLinear_647_quantize_scale_node [03/25/2022-13:24:03] [V] [TRT] Running: VanillaSwapWithFollowingQ [03/25/2022-13:24:03] [V] [TRT] Swapping Relu_659 with QuantizeLinear_662_quantize_scale_node [03/25/2022-13:24:03] [V] [TRT] Running: VanillaSwapWithFollowingQ [03/25/2022-13:24:03] [V] [TRT] Swapping Relu_696 with QuantizeLinear_699_quantize_scale_node [03/25/2022-13:24:03] [V] [TRT] Running: VanillaSwapWithFollowingQ [03/25/2022-13:24:03] [V] [TRT] Swapping Relu_711 with QuantizeLinear_714_quantize_scale_node [03/25/2022-13:24:03] [V] [TRT] Running: VanillaSwapWithFollowingQ [03/25/2022-13:24:03] [V] [TRT] Swapping Relu_748 with QuantizeLinear_751_quantize_scale_node [03/25/2022-13:24:03] [V] [TRT] Running: VanillaSwapWithFollowingQ [03/25/2022-13:24:03] [V] [TRT] Swapping Relu_763 with QuantizeLinear_766_quantize_scale_node [03/25/2022-13:24:03] [V] [TRT] Running: VanillaSwapWithFollowingQ [03/25/2022-13:24:03] [V] [TRT] Swapping Relu_814 with QuantizeLinear_817_quantize_scale_node [03/25/2022-13:24:03] [V] [TRT] Running: VanillaSwapWithFollowingQ [03/25/2022-13:24:03] [V] [TRT] Swapping Relu_829 with QuantizeLinear_832_quantize_scale_node [03/25/2022-13:24:03] [V] [TRT] Running: VanillaSwapWithFollowingQ [03/25/2022-13:24:03] [V] [TRT] Swapping Relu_866 with QuantizeLinear_869_quantize_scale_node [03/25/2022-13:24:03] [V] [TRT] Running: VanillaSwapWithFollowingQ [03/25/2022-13:24:03] [V] [TRT] Swapping Relu_881 with QuantizeLinear_884_quantize_scale_node [03/25/2022-13:24:03] [V] [TRT] Running: HorizontalMergeQNodes [03/25/2022-13:24:03] [V] [TRT] Eliminating QuantizeLinear_62_quantize_scale_node which duplicates (Q) QuantizeLinear_18_quantize_scale_node [03/25/2022-13:24:03] [V] [TRT] Removing QuantizeLinear_62_quantize_scale_node [03/25/2022-13:24:03] [V] [TRT] Running: HorizontalMergeQNodes [03/25/2022-13:24:03] [V] [TRT] Eliminating QuantizeLinear_128_quantize_scale_node which duplicates (Q) QuantizeLinear_84_quantize_scale_node [03/25/2022-13:24:03] [V] [TRT] Removing QuantizeLinear_128_quantize_scale_node [03/25/2022-13:24:03] [V] [TRT] Running: HorizontalMergeQNodes [03/25/2022-13:24:03] [V] [TRT] Eliminating QuantizeLinear_180_quantize_scale_node which duplicates (Q) QuantizeLinear_136_quantize_scale_node [03/25/2022-13:24:03] [V] [TRT] Removing QuantizeLinear_180_quantize_scale_node [03/25/2022-13:24:03] [V] [TRT] Running: HorizontalMergeQNodes [03/25/2022-13:24:03] [V] [TRT] Eliminating QuantizeLinear_232_quantize_scale_node 
which duplicates (Q) QuantizeLinear_188_quantize_scale_node [03/25/2022-13:24:03] [V] [TRT] Removing QuantizeLinear_232_quantize_scale_node [03/25/2022-13:24:03] [V] [TRT] Running: HorizontalMergeQNodes [03/25/2022-13:24:03] [V] [TRT] Eliminating QuantizeLinear_298_quantize_scale_node which duplicates (Q) QuantizeLinear_254_quantize_scale_node [03/25/2022-13:24:03] [V] [TRT] Removing QuantizeLinear_298_quantize_scale_node [03/25/2022-13:24:03] [V] [TRT] Running: HorizontalMergeQNodes [03/25/2022-13:24:03] [V] [TRT] Eliminating QuantizeLinear_350_quantize_scale_node which duplicates (Q) QuantizeLinear_306_quantize_scale_node [03/25/2022-13:24:03] [V] [TRT] Removing QuantizeLinear_350_quantize_scale_node [03/25/2022-13:24:03] [V] [TRT] Running: HorizontalMergeQNodes [03/25/2022-13:24:03] [V] [TRT] Eliminating QuantizeLinear_402_quantize_scale_node which duplicates (Q) QuantizeLinear_358_quantize_scale_node [03/25/2022-13:24:03] [V] [TRT] Removing QuantizeLinear_402_quantize_scale_node [03/25/2022-13:24:03] [V] [TRT] Running: HorizontalMergeQNodes [03/25/2022-13:24:03] [V] [TRT] Eliminating QuantizeLinear_454_quantize_scale_node which duplicates (Q) QuantizeLinear_410_quantize_scale_node [03/25/2022-13:24:03] [V] [TRT] Removing QuantizeLinear_454_quantize_scale_node [03/25/2022-13:24:03] [V] [TRT] Running: HorizontalMergeQNodes [03/25/2022-13:24:03] [V] [TRT] Eliminating QuantizeLinear_520_quantize_scale_node which duplicates (Q) QuantizeLinear_476_quantize_scale_node [03/25/2022-13:24:03] [V] [TRT] Removing QuantizeLinear_520_quantize_scale_node [03/25/2022-13:24:03] [V] [TRT] Running: HorizontalMergeQNodes [03/25/2022-13:24:03] [V] [TRT] Eliminating QuantizeLinear_572_quantize_scale_node which duplicates (Q) QuantizeLinear_528_quantize_scale_node [03/25/2022-13:24:03] [V] [TRT] Removing QuantizeLinear_572_quantize_scale_node [03/25/2022-13:24:03] [V] [TRT] Running: HorizontalMergeQNodes [03/25/2022-13:24:03] [V] [TRT] Eliminating QuantizeLinear_624_quantize_scale_node which duplicates (Q) QuantizeLinear_580_quantize_scale_node [03/25/2022-13:24:03] [V] [TRT] Removing QuantizeLinear_624_quantize_scale_node [03/25/2022-13:24:03] [V] [TRT] Running: HorizontalMergeQNodes [03/25/2022-13:24:03] [V] [TRT] Eliminating QuantizeLinear_676_quantize_scale_node which duplicates (Q) QuantizeLinear_632_quantize_scale_node [03/25/2022-13:24:03] [V] [TRT] Removing QuantizeLinear_676_quantize_scale_node [03/25/2022-13:24:03] [V] [TRT] Running: HorizontalMergeQNodes [03/25/2022-13:24:03] [V] [TRT] Eliminating QuantizeLinear_728_quantize_scale_node which duplicates (Q) QuantizeLinear_684_quantize_scale_node [03/25/2022-13:24:03] [V] [TRT] Removing QuantizeLinear_728_quantize_scale_node [03/25/2022-13:24:03] [V] [TRT] Running: HorizontalMergeQNodes [03/25/2022-13:24:03] [V] [TRT] Eliminating QuantizeLinear_780_quantize_scale_node which duplicates (Q) QuantizeLinear_736_quantize_scale_node [03/25/2022-13:24:03] [V] [TRT] Removing QuantizeLinear_780_quantize_scale_node [03/25/2022-13:24:03] [V] [TRT] Running: HorizontalMergeQNodes [03/25/2022-13:24:03] [V] [TRT] Eliminating QuantizeLinear_846_quantize_scale_node which duplicates (Q) QuantizeLinear_802_quantize_scale_node [03/25/2022-13:24:03] [V] [TRT] Removing QuantizeLinear_846_quantize_scale_node [03/25/2022-13:24:03] [V] [TRT] Running: HorizontalMergeQNodes [03/25/2022-13:24:03] [V] [TRT] Eliminating QuantizeLinear_898_quantize_scale_node which duplicates (Q) QuantizeLinear_854_quantize_scale_node [03/25/2022-13:24:03] [V] [TRT] Removing 
QuantizeLinear_898_quantize_scale_node [03/25/2022-13:24:03] [V] [TRT] Running: VanillaSwapWithFollowingQ [03/25/2022-13:24:03] [V] [TRT] Swapping MaxPool_15 with QuantizeLinear_18_quantize_scale_node [03/25/2022-13:24:03] [V] [TRT] Running: VanillaSwapWithFollowingQ [03/25/2022-13:24:03] [V] [TRT] Swapping Relu_14 with QuantizeLinear_18_quantize_scale_node [03/25/2022-13:24:03] [V] [TRT] QDQ graph optimizer quantization pass - Generate quantized ops [03/25/2022-13:24:04] [V] [TRT] Running: QConvScaleFusion [03/25/2022-13:24:04] [V] [TRT] Removing BatchNormalization_13 [03/25/2022-13:24:04] [V] [TRT] Running: QConvScaleFusion [03/25/2022-13:24:04] [V] [TRT] Removing BatchNormalization_29 [03/25/2022-13:24:04] [V] [TRT] Running: QConvScaleFusion [03/25/2022-13:24:04] [V] [TRT] Removing BatchNormalization_73 [03/25/2022-13:24:04] [V] [TRT] Running: QConvScaleFusion [03/25/2022-13:24:04] [V] [TRT] Removing BatchNormalization_44 [03/25/2022-13:24:04] [V] [TRT] Running: QConvScaleFusion [03/25/2022-13:24:04] [V] [TRT] Removing BatchNormalization_59 [03/25/2022-13:24:04] [V] [TRT] Running: QConvScaleFusion [03/25/2022-13:24:04] [V] [TRT] Removing BatchNormalization_95 [03/25/2022-13:24:04] [V] [TRT] Running: QConvScaleFusion [03/25/2022-13:24:04] [V] [TRT] Removing BatchNormalization_110 [03/25/2022-13:24:04] [V] [TRT] Running: QConvScaleFusion [03/25/2022-13:24:04] [V] [TRT] Removing BatchNormalization_125 [03/25/2022-13:24:04] [V] [TRT] Running: QConvScaleFusion [03/25/2022-13:24:04] [V] [TRT] Removing BatchNormalization_147 [03/25/2022-13:24:04] [V] [TRT] Running: QConvScaleFusion [03/25/2022-13:24:04] [V] [TRT] Removing BatchNormalization_162 [03/25/2022-13:24:04] [V] [TRT] Running: QConvScaleFusion [03/25/2022-13:24:04] [V] [TRT] Removing BatchNormalization_177 [03/25/2022-13:24:04] [V] [TRT] Running: QConvScaleFusion [03/25/2022-13:24:04] [V] [TRT] Removing BatchNormalization_199 [03/25/2022-13:24:04] [V] [TRT] Running: QConvScaleFusion [03/25/2022-13:24:04] [V] [TRT] Removing BatchNormalization_243 [03/25/2022-13:24:04] [V] [TRT] Running: QConvScaleFusion [03/25/2022-13:24:04] [V] [TRT] Removing BatchNormalization_214 [03/25/2022-13:24:04] [V] [TRT] Running: QConvScaleFusion [03/25/2022-13:24:04] [V] [TRT] Removing BatchNormalization_229 [03/25/2022-13:24:04] [V] [TRT] Running: QConvScaleFusion [03/25/2022-13:24:04] [V] [TRT] Removing BatchNormalization_265 [03/25/2022-13:24:04] [V] [TRT] Running: QConvScaleFusion [03/25/2022-13:24:04] [V] [TRT] Removing BatchNormalization_280 [03/25/2022-13:24:04] [V] [TRT] Running: QConvScaleFusion [03/25/2022-13:24:04] [V] [TRT] Removing BatchNormalization_295 [03/25/2022-13:24:04] [V] [TRT] Running: QConvScaleFusion [03/25/2022-13:24:04] [V] [TRT] Removing BatchNormalization_317 [03/25/2022-13:24:04] [V] [TRT] Running: QConvScaleFusion [03/25/2022-13:24:04] [V] [TRT] Removing BatchNormalization_332 [03/25/2022-13:24:04] [V] [TRT] Running: QConvScaleFusion [03/25/2022-13:24:04] [V] [TRT] Removing BatchNormalization_347 [03/25/2022-13:24:04] [V] [TRT] Running: QConvScaleFusion [03/25/2022-13:24:04] [V] [TRT] Removing BatchNormalization_369 [03/25/2022-13:24:04] [V] [TRT] Running: QConvScaleFusion [03/25/2022-13:24:04] [V] [TRT] Removing BatchNormalization_384 [03/25/2022-13:24:04] [V] [TRT] Running: QConvScaleFusion [03/25/2022-13:24:04] [V] [TRT] Removing BatchNormalization_399 [03/25/2022-13:24:04] [V] [TRT] Running: QConvScaleFusion [03/25/2022-13:24:04] [V] [TRT] Removing BatchNormalization_421 [03/25/2022-13:24:04] [V] [TRT] Running: 
QConvScaleFusion [03/25/2022-13:24:04] [V] [TRT] Removing BatchNormalization_465 [03/25/2022-13:24:04] [V] [TRT] Running: QConvScaleFusion [03/25/2022-13:24:04] [V] [TRT] Removing BatchNormalization_436 [03/25/2022-13:24:04] [V] [TRT] Running: QConvScaleFusion [03/25/2022-13:24:04] [V] [TRT] Removing BatchNormalization_451 [03/25/2022-13:24:04] [V] [TRT] Running: QConvScaleFusion [03/25/2022-13:24:04] [V] [TRT] Removing BatchNormalization_487 [03/25/2022-13:24:04] [V] [TRT] Running: QConvScaleFusion [03/25/2022-13:24:04] [V] [TRT] Removing BatchNormalization_502 [03/25/2022-13:24:04] [V] [TRT] Running: QConvScaleFusion [03/25/2022-13:24:04] [V] [TRT] Removing BatchNormalization_517 [03/25/2022-13:24:04] [V] [TRT] Running: QConvScaleFusion [03/25/2022-13:24:04] [V] [TRT] Removing BatchNormalization_539 [03/25/2022-13:24:04] [V] [TRT] Running: QConvScaleFusion [03/25/2022-13:24:04] [V] [TRT] Removing BatchNormalization_554 [03/25/2022-13:24:04] [V] [TRT] Running: QConvScaleFusion [03/25/2022-13:24:04] [V] [TRT] Removing BatchNormalization_569 [03/25/2022-13:24:04] [V] [TRT] Running: QConvScaleFusion [03/25/2022-13:24:04] [V] [TRT] Removing BatchNormalization_591 [03/25/2022-13:24:04] [V] [TRT] Running: QConvScaleFusion [03/25/2022-13:24:04] [V] [TRT] Removing BatchNormalization_606 [03/25/2022-13:24:04] [V] [TRT] Running: QConvScaleFusion [03/25/2022-13:24:04] [V] [TRT] Removing BatchNormalization_621 [03/25/2022-13:24:04] [V] [TRT] Running: QConvScaleFusion [03/25/2022-13:24:04] [V] [TRT] Removing BatchNormalization_643 [03/25/2022-13:24:04] [V] [TRT] Running: QConvScaleFusion [03/25/2022-13:24:04] [V] [TRT] Removing BatchNormalization_658 [03/25/2022-13:24:04] [V] [TRT] Running: QConvScaleFusion [03/25/2022-13:24:04] [V] [TRT] Removing BatchNormalization_673 [03/25/2022-13:24:04] [V] [TRT] Running: QConvScaleFusion [03/25/2022-13:24:04] [V] [TRT] Removing BatchNormalization_695 [03/25/2022-13:24:04] [V] [TRT] Running: QConvScaleFusion [03/25/2022-13:24:04] [V] [TRT] Removing BatchNormalization_710 [03/25/2022-13:24:04] [V] [TRT] Running: QConvScaleFusion [03/25/2022-13:24:04] [V] [TRT] Removing BatchNormalization_725 [03/25/2022-13:24:04] [V] [TRT] Running: QConvScaleFusion [03/25/2022-13:24:04] [V] [TRT] Removing BatchNormalization_747 [03/25/2022-13:24:04] [V] [TRT] Running: QConvScaleFusion [03/25/2022-13:24:04] [V] [TRT] Removing BatchNormalization_791 [03/25/2022-13:24:04] [V] [TRT] Running: QConvScaleFusion [03/25/2022-13:24:04] [V] [TRT] Removing BatchNormalization_762 [03/25/2022-13:24:04] [V] [TRT] Running: QConvScaleFusion [03/25/2022-13:24:04] [V] [TRT] Removing BatchNormalization_777 [03/25/2022-13:24:04] [V] [TRT] Running: QConvScaleFusion [03/25/2022-13:24:04] [V] [TRT] Removing BatchNormalization_813 [03/25/2022-13:24:04] [V] [TRT] Running: QConvScaleFusion [03/25/2022-13:24:04] [V] [TRT] Removing BatchNormalization_828 [03/25/2022-13:24:04] [V] [TRT] Running: QConvScaleFusion [03/25/2022-13:24:04] [V] [TRT] Removing BatchNormalization_843 [03/25/2022-13:24:04] [V] [TRT] Running: QConvScaleFusion [03/25/2022-13:24:04] [V] [TRT] Removing BatchNormalization_865 [03/25/2022-13:24:04] [V] [TRT] Running: QConvScaleFusion [03/25/2022-13:24:04] [V] [TRT] Removing BatchNormalization_880 [03/25/2022-13:24:04] [V] [TRT] Running: QConvScaleFusion [03/25/2022-13:24:04] [V] [TRT] Removing BatchNormalization_895 [03/25/2022-13:24:04] [V] [TRT] Running: QuantizeConvWithResidualAdd [03/25/2022-13:24:04] [V] [TRT] Swapping Add_80 + Relu_81 with QuantizeLinear_84_quantize_scale_node 
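The QConvScaleFusion entries above strip every BatchNormalization in the network by folding it into the adjacent convolution: at inference time BN is a per-channel affine map, so it collapses into rescaled weights and a shifted bias. A numpy sketch of that algebra (shapes and epsilon are illustrative):

import numpy as np

def fold_batchnorm(w, b, gamma, beta, mean, var, eps=1e-5):
    # BN(conv(x)) == conv'(x) with:
    #   s  = gamma / sqrt(var + eps)            (per output channel)
    #   w' = w * s, broadcast over (cin, kh, kw)
    #   b' = (b - mean) * s + beta
    s = gamma / np.sqrt(var + eps)
    return w * s[:, None, None, None], (b - mean) * s + beta

# Example: fold BN statistics into a small conv's parameters.
rng = np.random.default_rng(0)
cout, cin, kh, kw = 4, 3, 3, 3
w = rng.standard_normal((cout, cin, kh, kw))
b = rng.standard_normal(cout)
gamma, beta = rng.standard_normal(cout), rng.standard_normal(cout)
mean, var = rng.standard_normal(cout), rng.random(cout) + 0.1
w_folded, b_folded = fold_batchnorm(w, b, gamma, beta, mean, var)

After this pass the Conv layers that remain carry the BN effect inside their weights, so no separate per-channel scale kernel survives into the engine.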
[03/25/2022-13:24:04] [V] [TRT] QuantizeDoubleInputNodes: fusing QuantizeLinear_84_quantize_scale_node into Conv_58 [03/25/2022-13:24:04] [V] [TRT] QuantizeDoubleInputNodes: fusing (DequantizeLinear_51_dequantize_scale_node and DequantizeLinear_57_dequantize_scale_node) into Conv_58 [03/25/2022-13:24:04] [V] [TRT] Removing QuantizeLinear_84_quantize_scale_node [03/25/2022-13:24:04] [V] [TRT] Removing DequantizeLinear_51_dequantize_scale_node [03/25/2022-13:24:04] [V] [TRT] Removing DequantizeLinear_57_dequantize_scale_node [03/25/2022-13:24:04] [V] [TRT] ConstWeightsFusion: Fusing sections.0.0.conv3.module.weight + QuantizeLinear_54_quantize_scale_node with Conv_58 [03/25/2022-13:24:04] [V] [TRT] ConvEltwiseSumFusion: Fusing sections.0.0.conv3.module.weight + QuantizeLinear_54_quantize_scale_node + Conv_58 with Add_80 + Relu_81 [03/25/2022-13:24:04] [V] [TRT] Removing DequantizeLinear_79_dequantize_scale_node [03/25/2022-13:24:04] [V] [TRT] Running: QuantizeConvWithResidualAdd [03/25/2022-13:24:04] [V] [TRT] Swapping Add_132 + Relu_133 with QuantizeLinear_136_quantize_scale_node [03/25/2022-13:24:04] [V] [TRT] QuantizeDoubleInputNodes: fusing QuantizeLinear_136_quantize_scale_node into Conv_124 [03/25/2022-13:24:04] [V] [TRT] QuantizeDoubleInputNodes: fusing (DequantizeLinear_117_dequantize_scale_node and DequantizeLinear_123_dequantize_scale_node) into Conv_124 [03/25/2022-13:24:04] [V] [TRT] Removing QuantizeLinear_136_quantize_scale_node [03/25/2022-13:24:04] [V] [TRT] Removing DequantizeLinear_117_dequantize_scale_node [03/25/2022-13:24:04] [V] [TRT] Removing DequantizeLinear_123_dequantize_scale_node [03/25/2022-13:24:04] [V] [TRT] ConstWeightsFusion: Fusing sections.0.1.conv3.module.weight + QuantizeLinear_120_quantize_scale_node with Conv_124 [03/25/2022-13:24:04] [V] [TRT] ConvEltwiseSumFusion: Fusing sections.0.1.conv3.module.weight + QuantizeLinear_120_quantize_scale_node + Conv_124 with Add_132 + Relu_133 [03/25/2022-13:24:04] [V] [TRT] Removing DequantizeLinear_131_dequantize_scale_node [03/25/2022-13:24:04] [V] [TRT] Running: QuantizeConvWithResidualAdd [03/25/2022-13:24:04] [V] [TRT] Swapping Add_184 + Relu_185 with QuantizeLinear_188_quantize_scale_node [03/25/2022-13:24:04] [V] [TRT] QuantizeDoubleInputNodes: fusing QuantizeLinear_188_quantize_scale_node into Conv_176 [03/25/2022-13:24:04] [V] [TRT] QuantizeDoubleInputNodes: fusing (DequantizeLinear_169_dequantize_scale_node and DequantizeLinear_175_dequantize_scale_node) into Conv_176 [03/25/2022-13:24:04] [V] [TRT] Removing QuantizeLinear_188_quantize_scale_node [03/25/2022-13:24:04] [V] [TRT] Removing DequantizeLinear_169_dequantize_scale_node [03/25/2022-13:24:04] [V] [TRT] Removing DequantizeLinear_175_dequantize_scale_node [03/25/2022-13:24:04] [V] [TRT] ConstWeightsFusion: Fusing sections.0.2.conv3.module.weight + QuantizeLinear_172_quantize_scale_node with Conv_176 [03/25/2022-13:24:04] [V] [TRT] ConvEltwiseSumFusion: Fusing sections.0.2.conv3.module.weight + QuantizeLinear_172_quantize_scale_node + Conv_176 with Add_184 + Relu_185 [03/25/2022-13:24:04] [V] [TRT] Removing DequantizeLinear_183_dequantize_scale_node [03/25/2022-13:24:04] [V] [TRT] Running: QuantizeConvWithResidualAdd [03/25/2022-13:24:04] [V] [TRT] Swapping Add_250 + Relu_251 with QuantizeLinear_254_quantize_scale_node [03/25/2022-13:24:04] [V] [TRT] QuantizeDoubleInputNodes: fusing QuantizeLinear_254_quantize_scale_node into Conv_228 [03/25/2022-13:24:04] [V] [TRT] QuantizeDoubleInputNodes: fusing (DequantizeLinear_221_dequantize_scale_node and 
DequantizeLinear_227_dequantize_scale_node) into Conv_228 [03/25/2022-13:24:04] [V] [TRT] Removing QuantizeLinear_254_quantize_scale_node [03/25/2022-13:24:04] [V] [TRT] Removing DequantizeLinear_221_dequantize_scale_node [03/25/2022-13:24:04] [V] [TRT] Removing DequantizeLinear_227_dequantize_scale_node [03/25/2022-13:24:04] [V] [TRT] ConstWeightsFusion: Fusing sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node with Conv_228 [03/25/2022-13:24:04] [V] [TRT] ConvEltwiseSumFusion: Fusing sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 with Add_250 + Relu_251 [03/25/2022-13:24:04] [V] [TRT] Removing DequantizeLinear_249_dequantize_scale_node [03/25/2022-13:24:04] [V] [TRT] Running: QuantizeConvWithResidualAdd [03/25/2022-13:24:04] [V] [TRT] Swapping Add_302 + Relu_303 with QuantizeLinear_306_quantize_scale_node [03/25/2022-13:24:04] [V] [TRT] QuantizeDoubleInputNodes: fusing QuantizeLinear_306_quantize_scale_node into Conv_294 [03/25/2022-13:24:04] [V] [TRT] QuantizeDoubleInputNodes: fusing (DequantizeLinear_287_dequantize_scale_node and DequantizeLinear_293_dequantize_scale_node) into Conv_294 [03/25/2022-13:24:04] [V] [TRT] Removing QuantizeLinear_306_quantize_scale_node [03/25/2022-13:24:04] [V] [TRT] Removing DequantizeLinear_287_dequantize_scale_node [03/25/2022-13:24:04] [V] [TRT] Removing DequantizeLinear_293_dequantize_scale_node [03/25/2022-13:24:04] [V] [TRT] ConstWeightsFusion: Fusing sections.1.1.conv3.module.weight + QuantizeLinear_290_quantize_scale_node with Conv_294 [03/25/2022-13:24:04] [V] [TRT] ConvEltwiseSumFusion: Fusing sections.1.1.conv3.module.weight + QuantizeLinear_290_quantize_scale_node + Conv_294 with Add_302 + Relu_303 [03/25/2022-13:24:04] [V] [TRT] Removing DequantizeLinear_301_dequantize_scale_node [03/25/2022-13:24:04] [V] [TRT] Running: QuantizeConvWithResidualAdd [03/25/2022-13:24:04] [V] [TRT] Swapping Add_354 + Relu_355 with QuantizeLinear_358_quantize_scale_node [03/25/2022-13:24:04] [V] [TRT] QuantizeDoubleInputNodes: fusing QuantizeLinear_358_quantize_scale_node into Conv_346 [03/25/2022-13:24:04] [V] [TRT] QuantizeDoubleInputNodes: fusing (DequantizeLinear_339_dequantize_scale_node and DequantizeLinear_345_dequantize_scale_node) into Conv_346 [03/25/2022-13:24:04] [V] [TRT] Removing QuantizeLinear_358_quantize_scale_node [03/25/2022-13:24:04] [V] [TRT] Removing DequantizeLinear_339_dequantize_scale_node [03/25/2022-13:24:04] [V] [TRT] Removing DequantizeLinear_345_dequantize_scale_node [03/25/2022-13:24:04] [V] [TRT] ConstWeightsFusion: Fusing sections.1.2.conv3.module.weight + QuantizeLinear_342_quantize_scale_node with Conv_346 [03/25/2022-13:24:04] [V] [TRT] ConvEltwiseSumFusion: Fusing sections.1.2.conv3.module.weight + QuantizeLinear_342_quantize_scale_node + Conv_346 with Add_354 + Relu_355 [03/25/2022-13:24:04] [V] [TRT] Removing DequantizeLinear_353_dequantize_scale_node [03/25/2022-13:24:04] [V] [TRT] Running: QuantizeConvWithResidualAdd [03/25/2022-13:24:04] [V] [TRT] Swapping Add_406 + Relu_407 with QuantizeLinear_410_quantize_scale_node [03/25/2022-13:24:04] [V] [TRT] QuantizeDoubleInputNodes: fusing QuantizeLinear_410_quantize_scale_node into Conv_398 [03/25/2022-13:24:04] [V] [TRT] QuantizeDoubleInputNodes: fusing (DequantizeLinear_391_dequantize_scale_node and DequantizeLinear_397_dequantize_scale_node) into Conv_398 [03/25/2022-13:24:04] [V] [TRT] Removing QuantizeLinear_410_quantize_scale_node [03/25/2022-13:24:04] [V] [TRT] Removing DequantizeLinear_391_dequantize_scale_node 
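Every QuantizeConvWithResidualAdd block in this stretch follows the same script: swap the elementwise Add + Relu past the following QuantizeLinear, pull that Q plus the conv's two input DQ nodes into the convolution (QuantizeDoubleInputNodes), fuse in the quantized weights (ConstWeightsFusion), then absorb the residual Add + Relu (ConvEltwiseSumFusion), leaving a single quantized convolution per bottleneck tail. A minimal PyTorch sketch of the float-level subgraph being matched (module names and sizes are illustrative):

import torch
from torch import nn

class BottleneckTail(nn.Module):
    # The pattern behind each fusion above: the last 1x1 conv of a ResNet
    # bottleneck, a residual add, then ReLU. With Q/DQ nodes on the conv's
    # input and weights and after the ReLU, TensorRT can emit this whole
    # tail as one fused INT8 convolution.
    def __init__(self, cin: int, cout: int):
        super().__init__()
        self.conv3 = nn.Conv2d(cin, cout, kernel_size=1, bias=False)

    def forward(self, x: torch.Tensor, identity: torch.Tensor) -> torch.Tensor:
        return torch.relu(self.conv3(x) + identity)

tail = BottleneckTail(64, 256)
y = tail(torch.randn(1, 64, 56, 56), torch.randn(1, 256, 56, 56))

The Removing DequantizeLinear_* lines bracketing each block are the input-side DQ nodes disappearing, since the fused convolution consumes int8 inputs directly.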
[03/25/2022-13:24:04] [V] [TRT] Removing DequantizeLinear_397_dequantize_scale_node [03/25/2022-13:24:04] [V] [TRT] ConstWeightsFusion: Fusing sections.1.3.conv3.module.weight + QuantizeLinear_394_quantize_scale_node with Conv_398 [03/25/2022-13:24:04] [V] [TRT] ConvEltwiseSumFusion: Fusing sections.1.3.conv3.module.weight + QuantizeLinear_394_quantize_scale_node + Conv_398 with Add_406 + Relu_407 [03/25/2022-13:24:04] [V] [TRT] Removing DequantizeLinear_405_dequantize_scale_node [03/25/2022-13:24:04] [V] [TRT] Running: QuantizeConvWithResidualAdd [03/25/2022-13:24:04] [V] [TRT] Swapping Add_472 + Relu_473 with QuantizeLinear_476_quantize_scale_node [03/25/2022-13:24:04] [V] [TRT] QuantizeDoubleInputNodes: fusing QuantizeLinear_476_quantize_scale_node into Conv_450 [03/25/2022-13:24:04] [V] [TRT] QuantizeDoubleInputNodes: fusing (DequantizeLinear_443_dequantize_scale_node and DequantizeLinear_449_dequantize_scale_node) into Conv_450 [03/25/2022-13:24:04] [V] [TRT] Removing QuantizeLinear_476_quantize_scale_node [03/25/2022-13:24:04] [V] [TRT] Removing DequantizeLinear_443_dequantize_scale_node [03/25/2022-13:24:04] [V] [TRT] Removing DequantizeLinear_449_dequantize_scale_node [03/25/2022-13:24:04] [V] [TRT] ConstWeightsFusion: Fusing sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node with Conv_450 [03/25/2022-13:24:04] [V] [TRT] ConvEltwiseSumFusion: Fusing sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 with Add_472 + Relu_473 [03/25/2022-13:24:04] [V] [TRT] Removing DequantizeLinear_471_dequantize_scale_node [03/25/2022-13:24:04] [V] [TRT] Running: QuantizeConvWithResidualAdd [03/25/2022-13:24:04] [V] [TRT] Swapping Add_524 + Relu_525 with QuantizeLinear_528_quantize_scale_node [03/25/2022-13:24:04] [V] [TRT] QuantizeDoubleInputNodes: fusing QuantizeLinear_528_quantize_scale_node into Conv_516 [03/25/2022-13:24:04] [V] [TRT] QuantizeDoubleInputNodes: fusing (DequantizeLinear_509_dequantize_scale_node and DequantizeLinear_515_dequantize_scale_node) into Conv_516 [03/25/2022-13:24:04] [V] [TRT] Removing QuantizeLinear_528_quantize_scale_node [03/25/2022-13:24:04] [V] [TRT] Removing DequantizeLinear_509_dequantize_scale_node [03/25/2022-13:24:04] [V] [TRT] Removing DequantizeLinear_515_dequantize_scale_node [03/25/2022-13:24:04] [V] [TRT] ConstWeightsFusion: Fusing sections.2.1.conv3.module.weight + QuantizeLinear_512_quantize_scale_node with Conv_516 [03/25/2022-13:24:04] [V] [TRT] ConvEltwiseSumFusion: Fusing sections.2.1.conv3.module.weight + QuantizeLinear_512_quantize_scale_node + Conv_516 with Add_524 + Relu_525 [03/25/2022-13:24:04] [V] [TRT] Removing DequantizeLinear_523_dequantize_scale_node [03/25/2022-13:24:04] [V] [TRT] Running: QuantizeConvWithResidualAdd [03/25/2022-13:24:04] [V] [TRT] Swapping Add_576 + Relu_577 with QuantizeLinear_580_quantize_scale_node [03/25/2022-13:24:04] [V] [TRT] QuantizeDoubleInputNodes: fusing QuantizeLinear_580_quantize_scale_node into Conv_568 [03/25/2022-13:24:04] [V] [TRT] QuantizeDoubleInputNodes: fusing (DequantizeLinear_561_dequantize_scale_node and DequantizeLinear_567_dequantize_scale_node) into Conv_568 [03/25/2022-13:24:04] [V] [TRT] Removing QuantizeLinear_580_quantize_scale_node [03/25/2022-13:24:04] [V] [TRT] Removing DequantizeLinear_561_dequantize_scale_node [03/25/2022-13:24:04] [V] [TRT] Removing DequantizeLinear_567_dequantize_scale_node [03/25/2022-13:24:04] [V] [TRT] ConstWeightsFusion: Fusing sections.2.2.conv3.module.weight + QuantizeLinear_564_quantize_scale_node with 
Conv_568 [03/25/2022-13:24:04] [V] [TRT] ConvEltwiseSumFusion: Fusing sections.2.2.conv3.module.weight + QuantizeLinear_564_quantize_scale_node + Conv_568 with Add_576 + Relu_577 [03/25/2022-13:24:04] [V] [TRT] Removing DequantizeLinear_575_dequantize_scale_node [03/25/2022-13:24:04] [V] [TRT] Running: QuantizeConvWithResidualAdd [03/25/2022-13:24:04] [V] [TRT] Swapping Add_628 + Relu_629 with QuantizeLinear_632_quantize_scale_node [03/25/2022-13:24:04] [V] [TRT] QuantizeDoubleInputNodes: fusing QuantizeLinear_632_quantize_scale_node into Conv_620 [03/25/2022-13:24:04] [V] [TRT] QuantizeDoubleInputNodes: fusing (DequantizeLinear_613_dequantize_scale_node and DequantizeLinear_619_dequantize_scale_node) into Conv_620 [03/25/2022-13:24:04] [V] [TRT] Removing QuantizeLinear_632_quantize_scale_node [03/25/2022-13:24:04] [V] [TRT] Removing DequantizeLinear_613_dequantize_scale_node [03/25/2022-13:24:04] [V] [TRT] Removing DequantizeLinear_619_dequantize_scale_node [03/25/2022-13:24:04] [V] [TRT] ConstWeightsFusion: Fusing sections.2.3.conv3.module.weight + QuantizeLinear_616_quantize_scale_node with Conv_620 [03/25/2022-13:24:04] [V] [TRT] ConvEltwiseSumFusion: Fusing sections.2.3.conv3.module.weight + QuantizeLinear_616_quantize_scale_node + Conv_620 with Add_628 + Relu_629 [03/25/2022-13:24:04] [V] [TRT] Removing DequantizeLinear_627_dequantize_scale_node [03/25/2022-13:24:04] [V] [TRT] Running: QuantizeConvWithResidualAdd [03/25/2022-13:24:04] [V] [TRT] Swapping Add_680 + Relu_681 with QuantizeLinear_684_quantize_scale_node [03/25/2022-13:24:04] [V] [TRT] QuantizeDoubleInputNodes: fusing QuantizeLinear_684_quantize_scale_node into Conv_672 [03/25/2022-13:24:04] [V] [TRT] QuantizeDoubleInputNodes: fusing (DequantizeLinear_665_dequantize_scale_node and DequantizeLinear_671_dequantize_scale_node) into Conv_672 [03/25/2022-13:24:04] [V] [TRT] Removing QuantizeLinear_684_quantize_scale_node [03/25/2022-13:24:04] [V] [TRT] Removing DequantizeLinear_665_dequantize_scale_node [03/25/2022-13:24:04] [V] [TRT] Removing DequantizeLinear_671_dequantize_scale_node [03/25/2022-13:24:04] [V] [TRT] ConstWeightsFusion: Fusing sections.2.4.conv3.module.weight + QuantizeLinear_668_quantize_scale_node with Conv_672 [03/25/2022-13:24:04] [V] [TRT] ConvEltwiseSumFusion: Fusing sections.2.4.conv3.module.weight + QuantizeLinear_668_quantize_scale_node + Conv_672 with Add_680 + Relu_681 [03/25/2022-13:24:04] [V] [TRT] Removing DequantizeLinear_679_dequantize_scale_node [03/25/2022-13:24:04] [V] [TRT] Running: QuantizeConvWithResidualAdd [03/25/2022-13:24:04] [V] [TRT] Swapping Add_732 + Relu_733 with QuantizeLinear_736_quantize_scale_node [03/25/2022-13:24:04] [V] [TRT] QuantizeDoubleInputNodes: fusing QuantizeLinear_736_quantize_scale_node into Conv_724 [03/25/2022-13:24:04] [V] [TRT] QuantizeDoubleInputNodes: fusing (DequantizeLinear_717_dequantize_scale_node and DequantizeLinear_723_dequantize_scale_node) into Conv_724 [03/25/2022-13:24:04] [V] [TRT] Removing QuantizeLinear_736_quantize_scale_node [03/25/2022-13:24:04] [V] [TRT] Removing DequantizeLinear_717_dequantize_scale_node [03/25/2022-13:24:04] [V] [TRT] Removing DequantizeLinear_723_dequantize_scale_node [03/25/2022-13:24:04] [V] [TRT] ConstWeightsFusion: Fusing sections.2.5.conv3.module.weight + QuantizeLinear_720_quantize_scale_node with Conv_724 [03/25/2022-13:24:04] [V] [TRT] ConvEltwiseSumFusion: Fusing sections.2.5.conv3.module.weight + QuantizeLinear_720_quantize_scale_node + Conv_724 with Add_732 + Relu_733 [03/25/2022-13:24:04] [V] [TRT] Removing 
DequantizeLinear_731_dequantize_scale_node [03/25/2022-13:24:04] [V] [TRT] Running: QuantizeConvWithResidualAdd [03/25/2022-13:24:04] [V] [TRT] Swapping Add_798 + Relu_799 with QuantizeLinear_802_quantize_scale_node [03/25/2022-13:24:04] [V] [TRT] QuantizeDoubleInputNodes: fusing QuantizeLinear_802_quantize_scale_node into Conv_776 [03/25/2022-13:24:04] [V] [TRT] QuantizeDoubleInputNodes: fusing (DequantizeLinear_769_dequantize_scale_node and DequantizeLinear_775_dequantize_scale_node) into Conv_776 [03/25/2022-13:24:04] [V] [TRT] Removing QuantizeLinear_802_quantize_scale_node [03/25/2022-13:24:04] [V] [TRT] Removing DequantizeLinear_769_dequantize_scale_node [03/25/2022-13:24:04] [V] [TRT] Removing DequantizeLinear_775_dequantize_scale_node [03/25/2022-13:24:04] [V] [TRT] ConstWeightsFusion: Fusing sections.3.0.conv3.module.weight + QuantizeLinear_772_quantize_scale_node with Conv_776 [03/25/2022-13:24:04] [V] [TRT] ConvEltwiseSumFusion: Fusing sections.3.0.conv3.module.weight + QuantizeLinear_772_quantize_scale_node + Conv_776 with Add_798 + Relu_799 [03/25/2022-13:24:04] [V] [TRT] Removing DequantizeLinear_797_dequantize_scale_node [03/25/2022-13:24:04] [V] [TRT] Running: QuantizeConvWithResidualAdd [03/25/2022-13:24:04] [V] [TRT] Swapping Add_850 + Relu_851 with QuantizeLinear_854_quantize_scale_node [03/25/2022-13:24:04] [V] [TRT] QuantizeDoubleInputNodes: fusing QuantizeLinear_854_quantize_scale_node into Conv_842 [03/25/2022-13:24:04] [V] [TRT] QuantizeDoubleInputNodes: fusing (DequantizeLinear_835_dequantize_scale_node and DequantizeLinear_841_dequantize_scale_node) into Conv_842 [03/25/2022-13:24:04] [V] [TRT] Removing QuantizeLinear_854_quantize_scale_node [03/25/2022-13:24:04] [V] [TRT] Removing DequantizeLinear_835_dequantize_scale_node [03/25/2022-13:24:04] [V] [TRT] Removing DequantizeLinear_841_dequantize_scale_node [03/25/2022-13:24:04] [V] [TRT] ConstWeightsFusion: Fusing sections.3.1.conv3.module.weight + QuantizeLinear_838_quantize_scale_node with Conv_842 [03/25/2022-13:24:04] [V] [TRT] ConvEltwiseSumFusion: Fusing sections.3.1.conv3.module.weight + QuantizeLinear_838_quantize_scale_node + Conv_842 with Add_850 + Relu_851 [03/25/2022-13:24:04] [V] [TRT] Removing DequantizeLinear_849_dequantize_scale_node [03/25/2022-13:24:05] [V] [TRT] Running: QuantizeDoubleInputNodes [03/25/2022-13:24:05] [V] [TRT] QuantizeDoubleInputNodes: fusing QuantizeLinear_18_quantize_scale_node into Conv_12 [03/25/2022-13:24:05] [V] [TRT] QuantizeDoubleInputNodes: fusing (DequantizeLinear_5_dequantize_scale_node and DequantizeLinear_11_dequantize_scale_node) into Conv_12 [03/25/2022-13:24:05] [V] [TRT] Removing QuantizeLinear_18_quantize_scale_node [03/25/2022-13:24:05] [V] [TRT] Removing DequantizeLinear_5_dequantize_scale_node [03/25/2022-13:24:05] [V] [TRT] Removing DequantizeLinear_11_dequantize_scale_node [03/25/2022-13:24:05] [V] [TRT] Running: QuantizeDoubleInputNodes [03/25/2022-13:24:05] [V] [TRT] QuantizeDoubleInputNodes: fusing QuantizeLinear_33_quantize_scale_node into Conv_28 [03/25/2022-13:24:05] [V] [TRT] QuantizeDoubleInputNodes: fusing (DequantizeLinear_21_dequantize_scale_node and DequantizeLinear_27_dequantize_scale_node) into Conv_28 [03/25/2022-13:24:05] [V] [TRT] Removing QuantizeLinear_33_quantize_scale_node [03/25/2022-13:24:05] [V] [TRT] Removing DequantizeLinear_21_dequantize_scale_node [03/25/2022-13:24:05] [V] [TRT] Removing DequantizeLinear_27_dequantize_scale_node [03/25/2022-13:24:05] [V] [TRT] Running: QuantizeDoubleInputNodes [03/25/2022-13:24:05] [V] [TRT] 
QuantizeDoubleInputNodes: fusing QuantizeLinear_48_quantize_scale_node into Conv_43 [03/25/2022-13:24:05] [V] [TRT] QuantizeDoubleInputNodes: fusing (DequantizeLinear_36_dequantize_scale_node and DequantizeLinear_42_dequantize_scale_node) into Conv_43 [03/25/2022-13:24:05] [V] [TRT] Removing QuantizeLinear_48_quantize_scale_node [03/25/2022-13:24:05] [V] [TRT] Removing DequantizeLinear_36_dequantize_scale_node [03/25/2022-13:24:05] [V] [TRT] Removing DequantizeLinear_42_dequantize_scale_node [03/25/2022-13:24:05] [V] [TRT] Running: QuantizeDoubleInputNodes [03/25/2022-13:24:05] [V] [TRT] QuantizeDoubleInputNodes: fusing QuantizeLinear_99_quantize_scale_node into Conv_94 [03/25/2022-13:24:05] [V] [TRT] QuantizeDoubleInputNodes: fusing (DequantizeLinear_87_dequantize_scale_node and DequantizeLinear_93_dequantize_scale_node) into Conv_94 [03/25/2022-13:24:05] [V] [TRT] Removing QuantizeLinear_99_quantize_scale_node [03/25/2022-13:24:05] [V] [TRT] Removing DequantizeLinear_87_dequantize_scale_node [03/25/2022-13:24:05] [V] [TRT] Removing DequantizeLinear_93_dequantize_scale_node [03/25/2022-13:24:05] [V] [TRT] Running: QuantizeDoubleInputNodes [03/25/2022-13:24:05] [V] [TRT] QuantizeDoubleInputNodes: fusing QuantizeLinear_114_quantize_scale_node into Conv_109 [03/25/2022-13:24:05] [V] [TRT] QuantizeDoubleInputNodes: fusing (DequantizeLinear_102_dequantize_scale_node and DequantizeLinear_108_dequantize_scale_node) into Conv_109 [03/25/2022-13:24:05] [V] [TRT] Removing QuantizeLinear_114_quantize_scale_node [03/25/2022-13:24:05] [V] [TRT] Removing DequantizeLinear_102_dequantize_scale_node [03/25/2022-13:24:05] [V] [TRT] Removing DequantizeLinear_108_dequantize_scale_node [03/25/2022-13:24:05] [V] [TRT] Running: QuantizeDoubleInputNodes [03/25/2022-13:24:05] [V] [TRT] QuantizeDoubleInputNodes: fusing QuantizeLinear_151_quantize_scale_node into Conv_146 [03/25/2022-13:24:05] [V] [TRT] QuantizeDoubleInputNodes: fusing (DequantizeLinear_139_dequantize_scale_node and DequantizeLinear_145_dequantize_scale_node) into Conv_146 [03/25/2022-13:24:05] [V] [TRT] Removing QuantizeLinear_151_quantize_scale_node [03/25/2022-13:24:05] [V] [TRT] Removing DequantizeLinear_139_dequantize_scale_node [03/25/2022-13:24:05] [V] [TRT] Removing DequantizeLinear_145_dequantize_scale_node [03/25/2022-13:24:05] [V] [TRT] Running: QuantizeDoubleInputNodes [03/25/2022-13:24:05] [V] [TRT] QuantizeDoubleInputNodes: fusing QuantizeLinear_166_quantize_scale_node into Conv_161 [03/25/2022-13:24:05] [V] [TRT] QuantizeDoubleInputNodes: fusing (DequantizeLinear_154_dequantize_scale_node and DequantizeLinear_160_dequantize_scale_node) into Conv_161 [03/25/2022-13:24:05] [V] [TRT] Removing QuantizeLinear_166_quantize_scale_node [03/25/2022-13:24:05] [V] [TRT] Removing DequantizeLinear_154_dequantize_scale_node [03/25/2022-13:24:05] [V] [TRT] Removing DequantizeLinear_160_dequantize_scale_node [03/25/2022-13:24:05] [V] [TRT] Running: QuantizeDoubleInputNodes [03/25/2022-13:24:05] [V] [TRT] QuantizeDoubleInputNodes: fusing QuantizeLinear_203_quantize_scale_node into Conv_198 [03/25/2022-13:24:05] [V] [TRT] QuantizeDoubleInputNodes: fusing (DequantizeLinear_191_dequantize_scale_node and DequantizeLinear_197_dequantize_scale_node) into Conv_198 [03/25/2022-13:24:05] [V] [TRT] Removing QuantizeLinear_203_quantize_scale_node [03/25/2022-13:24:05] [V] [TRT] Removing DequantizeLinear_191_dequantize_scale_node [03/25/2022-13:24:05] [V] [TRT] Removing DequantizeLinear_197_dequantize_scale_node [03/25/2022-13:24:05] [V] [TRT] Running: 
QuantizeDoubleInputNodes [03/25/2022-13:24:05] [V] [TRT] QuantizeDoubleInputNodes: fusing QuantizeLinear_218_quantize_scale_node into Conv_213 [03/25/2022-13:24:05] [V] [TRT] QuantizeDoubleInputNodes: fusing (DequantizeLinear_206_dequantize_scale_node and DequantizeLinear_212_dequantize_scale_node) into Conv_213 [03/25/2022-13:24:05] [V] [TRT] Removing QuantizeLinear_218_quantize_scale_node [03/25/2022-13:24:05] [V] [TRT] Removing DequantizeLinear_206_dequantize_scale_node [03/25/2022-13:24:05] [V] [TRT] Removing DequantizeLinear_212_dequantize_scale_node [03/25/2022-13:24:05] [V] [TRT] Running: QuantizeDoubleInputNodes [03/25/2022-13:24:05] [V] [TRT] QuantizeDoubleInputNodes: fusing QuantizeLinear_269_quantize_scale_node into Conv_264 [03/25/2022-13:24:05] [V] [TRT] QuantizeDoubleInputNodes: fusing (DequantizeLinear_257_dequantize_scale_node and DequantizeLinear_263_dequantize_scale_node) into Conv_264 [03/25/2022-13:24:05] [V] [TRT] Removing QuantizeLinear_269_quantize_scale_node [03/25/2022-13:24:05] [V] [TRT] Removing DequantizeLinear_257_dequantize_scale_node [03/25/2022-13:24:05] [V] [TRT] Removing DequantizeLinear_263_dequantize_scale_node [03/25/2022-13:24:05] [V] [TRT] Running: QuantizeDoubleInputNodes [03/25/2022-13:24:05] [V] [TRT] QuantizeDoubleInputNodes: fusing QuantizeLinear_284_quantize_scale_node into Conv_279 [03/25/2022-13:24:05] [V] [TRT] QuantizeDoubleInputNodes: fusing (DequantizeLinear_272_dequantize_scale_node and DequantizeLinear_278_dequantize_scale_node) into Conv_279 [03/25/2022-13:24:05] [V] [TRT] Removing QuantizeLinear_284_quantize_scale_node [03/25/2022-13:24:05] [V] [TRT] Removing DequantizeLinear_272_dequantize_scale_node [03/25/2022-13:24:05] [V] [TRT] Removing DequantizeLinear_278_dequantize_scale_node [03/25/2022-13:24:05] [V] [TRT] Running: QuantizeDoubleInputNodes [03/25/2022-13:24:05] [V] [TRT] QuantizeDoubleInputNodes: fusing QuantizeLinear_321_quantize_scale_node into Conv_316 [03/25/2022-13:24:05] [V] [TRT] QuantizeDoubleInputNodes: fusing (DequantizeLinear_309_dequantize_scale_node and DequantizeLinear_315_dequantize_scale_node) into Conv_316 [03/25/2022-13:24:05] [V] [TRT] Removing QuantizeLinear_321_quantize_scale_node [03/25/2022-13:24:05] [V] [TRT] Removing DequantizeLinear_309_dequantize_scale_node [03/25/2022-13:24:05] [V] [TRT] Removing DequantizeLinear_315_dequantize_scale_node [03/25/2022-13:24:05] [V] [TRT] Running: QuantizeDoubleInputNodes [03/25/2022-13:24:05] [V] [TRT] QuantizeDoubleInputNodes: fusing QuantizeLinear_336_quantize_scale_node into Conv_331 [03/25/2022-13:24:05] [V] [TRT] QuantizeDoubleInputNodes: fusing (DequantizeLinear_324_dequantize_scale_node and DequantizeLinear_330_dequantize_scale_node) into Conv_331 [03/25/2022-13:24:05] [V] [TRT] Removing QuantizeLinear_336_quantize_scale_node [03/25/2022-13:24:05] [V] [TRT] Removing DequantizeLinear_324_dequantize_scale_node [03/25/2022-13:24:05] [V] [TRT] Removing DequantizeLinear_330_dequantize_scale_node [03/25/2022-13:24:05] [V] [TRT] Running: QuantizeDoubleInputNodes [03/25/2022-13:24:05] [V] [TRT] QuantizeDoubleInputNodes: fusing QuantizeLinear_373_quantize_scale_node into Conv_368 [03/25/2022-13:24:05] [V] [TRT] QuantizeDoubleInputNodes: fusing (DequantizeLinear_361_dequantize_scale_node and DequantizeLinear_367_dequantize_scale_node) into Conv_368 [03/25/2022-13:24:05] [V] [TRT] Removing QuantizeLinear_373_quantize_scale_node [03/25/2022-13:24:05] [V] [TRT] Removing DequantizeLinear_361_dequantize_scale_node [03/25/2022-13:24:05] [V] [TRT] Removing 
DequantizeLinear_367_dequantize_scale_node [03/25/2022-13:24:05] [V] [TRT] Running: QuantizeDoubleInputNodes [03/25/2022-13:24:05] [V] [TRT] QuantizeDoubleInputNodes: fusing QuantizeLinear_388_quantize_scale_node into Conv_383 [03/25/2022-13:24:05] [V] [TRT] QuantizeDoubleInputNodes: fusing (DequantizeLinear_376_dequantize_scale_node and DequantizeLinear_382_dequantize_scale_node) into Conv_383 [03/25/2022-13:24:05] [V] [TRT] Removing QuantizeLinear_388_quantize_scale_node [03/25/2022-13:24:05] [V] [TRT] Removing DequantizeLinear_376_dequantize_scale_node [03/25/2022-13:24:05] [V] [TRT] Removing DequantizeLinear_382_dequantize_scale_node [03/25/2022-13:24:05] [V] [TRT] Running: QuantizeDoubleInputNodes [03/25/2022-13:24:05] [V] [TRT] QuantizeDoubleInputNodes: fusing QuantizeLinear_425_quantize_scale_node into Conv_420 [03/25/2022-13:24:05] [V] [TRT] QuantizeDoubleInputNodes: fusing (DequantizeLinear_413_dequantize_scale_node and DequantizeLinear_419_dequantize_scale_node) into Conv_420 [03/25/2022-13:24:05] [V] [TRT] Removing QuantizeLinear_425_quantize_scale_node [03/25/2022-13:24:05] [V] [TRT] Removing DequantizeLinear_413_dequantize_scale_node [03/25/2022-13:24:05] [V] [TRT] Removing DequantizeLinear_419_dequantize_scale_node [03/25/2022-13:24:05] [V] [TRT] Running: QuantizeDoubleInputNodes [03/25/2022-13:24:05] [V] [TRT] QuantizeDoubleInputNodes: fusing QuantizeLinear_440_quantize_scale_node into Conv_435 [03/25/2022-13:24:05] [V] [TRT] QuantizeDoubleInputNodes: fusing (DequantizeLinear_428_dequantize_scale_node and DequantizeLinear_434_dequantize_scale_node) into Conv_435 [03/25/2022-13:24:05] [V] [TRT] Removing QuantizeLinear_440_quantize_scale_node [03/25/2022-13:24:05] [V] [TRT] Removing DequantizeLinear_428_dequantize_scale_node [03/25/2022-13:24:05] [V] [TRT] Removing DequantizeLinear_434_dequantize_scale_node [03/25/2022-13:24:05] [V] [TRT] Running: QuantizeDoubleInputNodes [03/25/2022-13:24:05] [V] [TRT] QuantizeDoubleInputNodes: fusing QuantizeLinear_491_quantize_scale_node into Conv_486 [03/25/2022-13:24:05] [V] [TRT] QuantizeDoubleInputNodes: fusing (DequantizeLinear_479_dequantize_scale_node and DequantizeLinear_485_dequantize_scale_node) into Conv_486 [03/25/2022-13:24:05] [V] [TRT] Removing QuantizeLinear_491_quantize_scale_node [03/25/2022-13:24:05] [V] [TRT] Removing DequantizeLinear_479_dequantize_scale_node [03/25/2022-13:24:05] [V] [TRT] Removing DequantizeLinear_485_dequantize_scale_node [03/25/2022-13:24:05] [V] [TRT] Running: QuantizeDoubleInputNodes [03/25/2022-13:24:05] [V] [TRT] QuantizeDoubleInputNodes: fusing QuantizeLinear_506_quantize_scale_node into Conv_501 [03/25/2022-13:24:05] [V] [TRT] QuantizeDoubleInputNodes: fusing (DequantizeLinear_494_dequantize_scale_node and DequantizeLinear_500_dequantize_scale_node) into Conv_501 [03/25/2022-13:24:05] [V] [TRT] Removing QuantizeLinear_506_quantize_scale_node [03/25/2022-13:24:05] [V] [TRT] Removing DequantizeLinear_494_dequantize_scale_node [03/25/2022-13:24:05] [V] [TRT] Removing DequantizeLinear_500_dequantize_scale_node [03/25/2022-13:24:05] [V] [TRT] Running: QuantizeDoubleInputNodes [03/25/2022-13:24:05] [V] [TRT] QuantizeDoubleInputNodes: fusing QuantizeLinear_543_quantize_scale_node into Conv_538 [03/25/2022-13:24:05] [V] [TRT] QuantizeDoubleInputNodes: fusing (DequantizeLinear_531_dequantize_scale_node and DequantizeLinear_537_dequantize_scale_node) into Conv_538 [03/25/2022-13:24:05] [V] [TRT] Removing QuantizeLinear_543_quantize_scale_node [03/25/2022-13:24:05] [V] [TRT] Removing 
DequantizeLinear_531_dequantize_scale_node [03/25/2022-13:24:05] [V] [TRT] Removing DequantizeLinear_537_dequantize_scale_node [03/25/2022-13:24:05] [V] [TRT] Running: QuantizeDoubleInputNodes [03/25/2022-13:24:05] [V] [TRT] QuantizeDoubleInputNodes: fusing QuantizeLinear_558_quantize_scale_node into Conv_553 [03/25/2022-13:24:05] [V] [TRT] QuantizeDoubleInputNodes: fusing (DequantizeLinear_546_dequantize_scale_node and DequantizeLinear_552_dequantize_scale_node) into Conv_553 [03/25/2022-13:24:05] [V] [TRT] Removing QuantizeLinear_558_quantize_scale_node [03/25/2022-13:24:05] [V] [TRT] Removing DequantizeLinear_546_dequantize_scale_node [03/25/2022-13:24:05] [V] [TRT] Removing DequantizeLinear_552_dequantize_scale_node [03/25/2022-13:24:05] [V] [TRT] Running: QuantizeDoubleInputNodes [03/25/2022-13:24:05] [V] [TRT] QuantizeDoubleInputNodes: fusing QuantizeLinear_595_quantize_scale_node into Conv_590 [03/25/2022-13:24:05] [V] [TRT] QuantizeDoubleInputNodes: fusing (DequantizeLinear_583_dequantize_scale_node and DequantizeLinear_589_dequantize_scale_node) into Conv_590 [03/25/2022-13:24:05] [V] [TRT] Removing QuantizeLinear_595_quantize_scale_node [03/25/2022-13:24:05] [V] [TRT] Removing DequantizeLinear_583_dequantize_scale_node [03/25/2022-13:24:05] [V] [TRT] Removing DequantizeLinear_589_dequantize_scale_node [03/25/2022-13:24:05] [V] [TRT] Running: QuantizeDoubleInputNodes [03/25/2022-13:24:05] [V] [TRT] QuantizeDoubleInputNodes: fusing QuantizeLinear_610_quantize_scale_node into Conv_605 [03/25/2022-13:24:05] [V] [TRT] QuantizeDoubleInputNodes: fusing (DequantizeLinear_598_dequantize_scale_node and DequantizeLinear_604_dequantize_scale_node) into Conv_605 [03/25/2022-13:24:05] [V] [TRT] Removing QuantizeLinear_610_quantize_scale_node [03/25/2022-13:24:05] [V] [TRT] Removing DequantizeLinear_598_dequantize_scale_node [03/25/2022-13:24:05] [V] [TRT] Removing DequantizeLinear_604_dequantize_scale_node [03/25/2022-13:24:05] [V] [TRT] Running: QuantizeDoubleInputNodes [03/25/2022-13:24:05] [V] [TRT] QuantizeDoubleInputNodes: fusing QuantizeLinear_647_quantize_scale_node into Conv_642 [03/25/2022-13:24:05] [V] [TRT] QuantizeDoubleInputNodes: fusing (DequantizeLinear_635_dequantize_scale_node and DequantizeLinear_641_dequantize_scale_node) into Conv_642 [03/25/2022-13:24:05] [V] [TRT] Removing QuantizeLinear_647_quantize_scale_node [03/25/2022-13:24:05] [V] [TRT] Removing DequantizeLinear_635_dequantize_scale_node [03/25/2022-13:24:05] [V] [TRT] Removing DequantizeLinear_641_dequantize_scale_node [03/25/2022-13:24:05] [V] [TRT] Running: QuantizeDoubleInputNodes [03/25/2022-13:24:05] [V] [TRT] QuantizeDoubleInputNodes: fusing QuantizeLinear_662_quantize_scale_node into Conv_657 [03/25/2022-13:24:05] [V] [TRT] QuantizeDoubleInputNodes: fusing (DequantizeLinear_650_dequantize_scale_node and DequantizeLinear_656_dequantize_scale_node) into Conv_657 [03/25/2022-13:24:05] [V] [TRT] Removing QuantizeLinear_662_quantize_scale_node [03/25/2022-13:24:05] [V] [TRT] Removing DequantizeLinear_650_dequantize_scale_node [03/25/2022-13:24:05] [V] [TRT] Removing DequantizeLinear_656_dequantize_scale_node [03/25/2022-13:24:05] [V] [TRT] Running: QuantizeDoubleInputNodes [03/25/2022-13:24:05] [V] [TRT] QuantizeDoubleInputNodes: fusing QuantizeLinear_699_quantize_scale_node into Conv_694 [03/25/2022-13:24:05] [V] [TRT] QuantizeDoubleInputNodes: fusing (DequantizeLinear_687_dequantize_scale_node and DequantizeLinear_693_dequantize_scale_node) into Conv_694 [03/25/2022-13:24:05] [V] [TRT] Removing 
QuantizeLinear_699_quantize_scale_node [03/25/2022-13:24:05] [V] [TRT] Removing DequantizeLinear_687_dequantize_scale_node [03/25/2022-13:24:05] [V] [TRT] Removing DequantizeLinear_693_dequantize_scale_node [03/25/2022-13:24:05] [V] [TRT] Running: QuantizeDoubleInputNodes [03/25/2022-13:24:05] [V] [TRT] QuantizeDoubleInputNodes: fusing QuantizeLinear_714_quantize_scale_node into Conv_709 [03/25/2022-13:24:05] [V] [TRT] QuantizeDoubleInputNodes: fusing (DequantizeLinear_702_dequantize_scale_node and DequantizeLinear_708_dequantize_scale_node) into Conv_709 [03/25/2022-13:24:05] [V] [TRT] Removing QuantizeLinear_714_quantize_scale_node [03/25/2022-13:24:05] [V] [TRT] Removing DequantizeLinear_702_dequantize_scale_node [03/25/2022-13:24:05] [V] [TRT] Removing DequantizeLinear_708_dequantize_scale_node [03/25/2022-13:24:05] [V] [TRT] Running: QuantizeDoubleInputNodes [03/25/2022-13:24:05] [V] [TRT] QuantizeDoubleInputNodes: fusing QuantizeLinear_751_quantize_scale_node into Conv_746 [03/25/2022-13:24:05] [V] [TRT] QuantizeDoubleInputNodes: fusing (DequantizeLinear_739_dequantize_scale_node and DequantizeLinear_745_dequantize_scale_node) into Conv_746 [03/25/2022-13:24:05] [V] [TRT] Removing QuantizeLinear_751_quantize_scale_node [03/25/2022-13:24:05] [V] [TRT] Removing DequantizeLinear_739_dequantize_scale_node [03/25/2022-13:24:05] [V] [TRT] Removing DequantizeLinear_745_dequantize_scale_node [03/25/2022-13:24:05] [V] [TRT] Running: QuantizeDoubleInputNodes [03/25/2022-13:24:05] [V] [TRT] QuantizeDoubleInputNodes: fusing QuantizeLinear_766_quantize_scale_node into Conv_761 [03/25/2022-13:24:05] [V] [TRT] QuantizeDoubleInputNodes: fusing (DequantizeLinear_754_dequantize_scale_node and DequantizeLinear_760_dequantize_scale_node) into Conv_761 [03/25/2022-13:24:05] [V] [TRT] Removing QuantizeLinear_766_quantize_scale_node [03/25/2022-13:24:05] [V] [TRT] Removing DequantizeLinear_754_dequantize_scale_node [03/25/2022-13:24:05] [V] [TRT] Removing DequantizeLinear_760_dequantize_scale_node [03/25/2022-13:24:05] [V] [TRT] Running: QuantizeDoubleInputNodes [03/25/2022-13:24:05] [V] [TRT] QuantizeDoubleInputNodes: fusing QuantizeLinear_817_quantize_scale_node into Conv_812 [03/25/2022-13:24:05] [V] [TRT] QuantizeDoubleInputNodes: fusing (DequantizeLinear_805_dequantize_scale_node and DequantizeLinear_811_dequantize_scale_node) into Conv_812 [03/25/2022-13:24:05] [V] [TRT] Removing QuantizeLinear_817_quantize_scale_node [03/25/2022-13:24:05] [V] [TRT] Removing DequantizeLinear_805_dequantize_scale_node [03/25/2022-13:24:05] [V] [TRT] Removing DequantizeLinear_811_dequantize_scale_node [03/25/2022-13:24:05] [V] [TRT] Running: QuantizeDoubleInputNodes [03/25/2022-13:24:05] [V] [TRT] QuantizeDoubleInputNodes: fusing QuantizeLinear_832_quantize_scale_node into Conv_827 [03/25/2022-13:24:05] [V] [TRT] QuantizeDoubleInputNodes: fusing (DequantizeLinear_820_dequantize_scale_node and DequantizeLinear_826_dequantize_scale_node) into Conv_827 [03/25/2022-13:24:05] [V] [TRT] Removing QuantizeLinear_832_quantize_scale_node [03/25/2022-13:24:05] [V] [TRT] Removing DequantizeLinear_820_dequantize_scale_node [03/25/2022-13:24:05] [V] [TRT] Removing DequantizeLinear_826_dequantize_scale_node [03/25/2022-13:24:05] [V] [TRT] Running: QuantizeDoubleInputNodes [03/25/2022-13:24:05] [V] [TRT] QuantizeDoubleInputNodes: fusing QuantizeLinear_869_quantize_scale_node into Conv_864 [03/25/2022-13:24:05] [V] [TRT] QuantizeDoubleInputNodes: fusing (DequantizeLinear_857_dequantize_scale_node and 
DequantizeLinear_863_dequantize_scale_node) into Conv_864 [03/25/2022-13:24:05] [V] [TRT] Removing QuantizeLinear_869_quantize_scale_node [03/25/2022-13:24:05] [V] [TRT] Removing DequantizeLinear_857_dequantize_scale_node [03/25/2022-13:24:05] [V] [TRT] Removing DequantizeLinear_863_dequantize_scale_node [03/25/2022-13:24:05] [V] [TRT] Running: QuantizeDoubleInputNodes [03/25/2022-13:24:05] [V] [TRT] QuantizeDoubleInputNodes: fusing QuantizeLinear_884_quantize_scale_node into Conv_879 [03/25/2022-13:24:05] [V] [TRT] QuantizeDoubleInputNodes: fusing (DequantizeLinear_872_dequantize_scale_node and DequantizeLinear_878_dequantize_scale_node) into Conv_879 [03/25/2022-13:24:05] [V] [TRT] Removing QuantizeLinear_884_quantize_scale_node [03/25/2022-13:24:05] [V] [TRT] Removing DequantizeLinear_872_dequantize_scale_node [03/25/2022-13:24:05] [V] [TRT] Removing DequantizeLinear_878_dequantize_scale_node [03/25/2022-13:24:05] [V] [TRT] Running: QuantizeDoubleInputNodes [03/25/2022-13:24:05] [V] [TRT] QuantizeDoubleInputNodes: fusing (DequantizeLinear_887_dequantize_scale_node and DequantizeLinear_893_dequantize_scale_node) into Conv_894 [03/25/2022-13:24:05] [V] [TRT] Removing DequantizeLinear_887_dequantize_scale_node [03/25/2022-13:24:05] [V] [TRT] Removing DequantizeLinear_893_dequantize_scale_node [03/25/2022-13:24:05] [V] [TRT] Running: QuantizeDoubleInputNodes [03/25/2022-13:24:05] [V] [TRT] QuantizeDoubleInputNodes: fusing QuantizeLinear_76_quantize_scale_node into Conv_72 [03/25/2022-13:24:05] [V] [TRT] QuantizeDoubleInputNodes: fusing (DequantizeLinear_65_dequantize_scale_node and DequantizeLinear_71_dequantize_scale_node) into Conv_72 [03/25/2022-13:24:05] [V] [TRT] Removing QuantizeLinear_76_quantize_scale_node [03/25/2022-13:24:05] [V] [TRT] Removing DequantizeLinear_65_dequantize_scale_node [03/25/2022-13:24:05] [V] [TRT] Removing DequantizeLinear_71_dequantize_scale_node [03/25/2022-13:24:05] [V] [TRT] Running: QuantizeDoubleInputNodes [03/25/2022-13:24:05] [V] [TRT] QuantizeDoubleInputNodes: fusing QuantizeLinear_246_quantize_scale_node into Conv_242 [03/25/2022-13:24:05] [V] [TRT] QuantizeDoubleInputNodes: fusing (DequantizeLinear_235_dequantize_scale_node and DequantizeLinear_241_dequantize_scale_node) into Conv_242 [03/25/2022-13:24:05] [V] [TRT] Removing QuantizeLinear_246_quantize_scale_node [03/25/2022-13:24:05] [V] [TRT] Removing DequantizeLinear_235_dequantize_scale_node [03/25/2022-13:24:05] [V] [TRT] Removing DequantizeLinear_241_dequantize_scale_node [03/25/2022-13:24:05] [V] [TRT] Running: QuantizeDoubleInputNodes [03/25/2022-13:24:05] [V] [TRT] QuantizeDoubleInputNodes: fusing QuantizeLinear_468_quantize_scale_node into Conv_464 [03/25/2022-13:24:05] [V] [TRT] QuantizeDoubleInputNodes: fusing (DequantizeLinear_457_dequantize_scale_node and DequantizeLinear_463_dequantize_scale_node) into Conv_464 [03/25/2022-13:24:05] [V] [TRT] Removing QuantizeLinear_468_quantize_scale_node [03/25/2022-13:24:05] [V] [TRT] Removing DequantizeLinear_457_dequantize_scale_node [03/25/2022-13:24:05] [V] [TRT] Removing DequantizeLinear_463_dequantize_scale_node [03/25/2022-13:24:05] [V] [TRT] Running: QuantizeDoubleInputNodes [03/25/2022-13:24:05] [V] [TRT] QuantizeDoubleInputNodes: fusing QuantizeLinear_794_quantize_scale_node into Conv_790 [03/25/2022-13:24:05] [V] [TRT] QuantizeDoubleInputNodes: fusing (DequantizeLinear_783_dequantize_scale_node and DequantizeLinear_789_dequantize_scale_node) into Conv_790 [03/25/2022-13:24:05] [V] [TRT] Removing QuantizeLinear_794_quantize_scale_node 
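(The final two Removing entries of the Conv_790 pass continue below.) Each QuantizeDoubleInputNodes pass in this stretch of the log folds a convolution's surrounding Q/DQ nodes into the convolution itself: the two DequantizeLinear nodes feeding it (activations and weights) and the QuantizeLinear consuming its output are absorbed, so the fused kernel reads and writes INT8 directly instead of round-tripping through FP32. Weight names of the form sections.X.Y.convZ.module.weight suggest the ONNX file was exported from a model quantized with NVIDIA's pytorch-quantization toolkit; a minimal sketch of such an export, assuming that toolkit (the model constructor is a hypothetical stand-in):

    # Sketch: exporting a QAT model so the ONNX graph carries the
    # QuantizeLinear/DequantizeLinear pairs that TensorRT fuses here.
    # Assumes NVIDIA's pytorch-quantization toolkit; build_resnet50() is
    # a hypothetical stand-in for the calibrated, 2:4-pruned QAT model.
    import torch
    from pytorch_quantization import nn as quant_nn
    from pytorch_quantization import quant_modules

    quant_modules.initialize()   # swap nn.Conv2d etc. for quantized wrappers
    model = build_resnet50().eval()

    # Emit fake-quant ops as ONNX QuantizeLinear/DequantizeLinear pairs.
    quant_nn.TensorQuantizer.use_fb_fake_quant = True
    dummy = torch.randn(128, 3, 224, 224)
    torch.onnx.export(model, dummy, "resnet50_quant_sparse.onnx", opset_version=13)

In a real flow the quantizers are calibrated and the weights pruned 2:4 before export; both steps are omitted here.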
[03/25/2022-13:24:05] [V] [TRT] Removing DequantizeLinear_783_dequantize_scale_node [03/25/2022-13:24:05] [V] [TRT] Removing DequantizeLinear_789_dequantize_scale_node [03/25/2022-13:24:05] [V] [TRT] Running: ConstWeightsFusion [03/25/2022-13:24:05] [V] [TRT] ConstWeightsFusion: Fusing input.conv.module.weight + QuantizeLinear_8_quantize_scale_node with Conv_12 [03/25/2022-13:24:05] [V] [TRT] Running: ConstWeightsFusion [03/25/2022-13:24:05] [V] [TRT] ConstWeightsFusion: Fusing sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node with Conv_28 [03/25/2022-13:24:05] [V] [TRT] Running: ConstWeightsFusion [03/25/2022-13:24:05] [V] [TRT] ConstWeightsFusion: Fusing sections.0.0.conv2.module.weight + QuantizeLinear_39_quantize_scale_node with Conv_43 [03/25/2022-13:24:05] [V] [TRT] Running: ConstWeightsFusion [03/25/2022-13:24:05] [V] [TRT] ConstWeightsFusion: Fusing sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node with Conv_72 [03/25/2022-13:24:05] [V] [TRT] Running: ConstWeightsFusion [03/25/2022-13:24:05] [V] [TRT] ConstWeightsFusion: Fusing sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node with Conv_94 [03/25/2022-13:24:05] [V] [TRT] Running: ConstWeightsFusion [03/25/2022-13:24:05] [V] [TRT] ConstWeightsFusion: Fusing sections.0.1.conv2.module.weight + QuantizeLinear_105_quantize_scale_node with Conv_109 [03/25/2022-13:24:05] [V] [TRT] Running: ConstWeightsFusion [03/25/2022-13:24:05] [V] [TRT] ConstWeightsFusion: Fusing sections.0.2.conv1.module.weight + QuantizeLinear_142_quantize_scale_node with Conv_146 [03/25/2022-13:24:05] [V] [TRT] Running: ConstWeightsFusion [03/25/2022-13:24:05] [V] [TRT] ConstWeightsFusion: Fusing sections.0.2.conv2.module.weight + QuantizeLinear_157_quantize_scale_node with Conv_161 [03/25/2022-13:24:05] [V] [TRT] Running: ConstWeightsFusion [03/25/2022-13:24:05] [V] [TRT] ConstWeightsFusion: Fusing sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node with Conv_198 [03/25/2022-13:24:05] [V] [TRT] Running: ConstWeightsFusion [03/25/2022-13:24:05] [V] [TRT] ConstWeightsFusion: Fusing sections.1.0.conv2.module.weight + QuantizeLinear_209_quantize_scale_node with Conv_213 [03/25/2022-13:24:05] [V] [TRT] Running: ConstWeightsFusion [03/25/2022-13:24:05] [V] [TRT] ConstWeightsFusion: Fusing sections.1.0.identity.conv.module.weight + QuantizeLinear_238_quantize_scale_node with Conv_242 [03/25/2022-13:24:05] [V] [TRT] Running: ConstWeightsFusion [03/25/2022-13:24:05] [V] [TRT] ConstWeightsFusion: Fusing sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node with Conv_264 [03/25/2022-13:24:05] [V] [TRT] Running: ConstWeightsFusion [03/25/2022-13:24:05] [V] [TRT] ConstWeightsFusion: Fusing sections.1.1.conv2.module.weight + QuantizeLinear_275_quantize_scale_node with Conv_279 [03/25/2022-13:24:05] [V] [TRT] Running: ConstWeightsFusion [03/25/2022-13:24:05] [V] [TRT] ConstWeightsFusion: Fusing sections.1.2.conv1.module.weight + QuantizeLinear_312_quantize_scale_node with Conv_316 [03/25/2022-13:24:05] [V] [TRT] Running: ConstWeightsFusion [03/25/2022-13:24:05] [V] [TRT] ConstWeightsFusion: Fusing sections.1.2.conv2.module.weight + QuantizeLinear_327_quantize_scale_node with Conv_331 [03/25/2022-13:24:05] [V] [TRT] Running: ConstWeightsFusion [03/25/2022-13:24:05] [V] [TRT] ConstWeightsFusion: Fusing sections.1.3.conv1.module.weight + QuantizeLinear_364_quantize_scale_node with Conv_368 [03/25/2022-13:24:05] [V] [TRT] Running: ConstWeightsFusion 
[03/25/2022-13:24:05] [V] [TRT] ConstWeightsFusion: Fusing sections.1.3.conv2.module.weight + QuantizeLinear_379_quantize_scale_node with Conv_383 [03/25/2022-13:24:05] [V] [TRT] Running: ConstWeightsFusion [03/25/2022-13:24:05] [V] [TRT] ConstWeightsFusion: Fusing sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node with Conv_420 [03/25/2022-13:24:05] [V] [TRT] Running: ConstWeightsFusion [03/25/2022-13:24:05] [V] [TRT] ConstWeightsFusion: Fusing sections.2.0.conv2.module.weight + QuantizeLinear_431_quantize_scale_node with Conv_435 [03/25/2022-13:24:05] [V] [TRT] Running: ConstWeightsFusion [03/25/2022-13:24:05] [V] [TRT] ConstWeightsFusion: Fusing sections.2.0.identity.conv.module.weight + QuantizeLinear_460_quantize_scale_node with Conv_464 [03/25/2022-13:24:05] [V] [TRT] Running: ConstWeightsFusion [03/25/2022-13:24:05] [V] [TRT] ConstWeightsFusion: Fusing sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node with Conv_486 [03/25/2022-13:24:05] [V] [TRT] Running: ConstWeightsFusion [03/25/2022-13:24:05] [V] [TRT] ConstWeightsFusion: Fusing sections.2.1.conv2.module.weight + QuantizeLinear_497_quantize_scale_node with Conv_501 [03/25/2022-13:24:05] [V] [TRT] Running: ConstWeightsFusion [03/25/2022-13:24:05] [V] [TRT] ConstWeightsFusion: Fusing sections.2.2.conv1.module.weight + QuantizeLinear_534_quantize_scale_node with Conv_538 [03/25/2022-13:24:05] [V] [TRT] Running: ConstWeightsFusion [03/25/2022-13:24:05] [V] [TRT] ConstWeightsFusion: Fusing sections.2.2.conv2.module.weight + QuantizeLinear_549_quantize_scale_node with Conv_553 [03/25/2022-13:24:05] [V] [TRT] Running: ConstWeightsFusion [03/25/2022-13:24:05] [V] [TRT] ConstWeightsFusion: Fusing sections.2.3.conv1.module.weight + QuantizeLinear_586_quantize_scale_node with Conv_590 [03/25/2022-13:24:05] [V] [TRT] Running: ConstWeightsFusion [03/25/2022-13:24:05] [V] [TRT] ConstWeightsFusion: Fusing sections.2.3.conv2.module.weight + QuantizeLinear_601_quantize_scale_node with Conv_605 [03/25/2022-13:24:05] [V] [TRT] Running: ConstWeightsFusion [03/25/2022-13:24:05] [V] [TRT] ConstWeightsFusion: Fusing sections.2.4.conv1.module.weight + QuantizeLinear_638_quantize_scale_node with Conv_642 [03/25/2022-13:24:05] [V] [TRT] Running: ConstWeightsFusion [03/25/2022-13:24:05] [V] [TRT] ConstWeightsFusion: Fusing sections.2.4.conv2.module.weight + QuantizeLinear_653_quantize_scale_node with Conv_657 [03/25/2022-13:24:05] [V] [TRT] Running: ConstWeightsFusion [03/25/2022-13:24:05] [V] [TRT] ConstWeightsFusion: Fusing sections.2.5.conv1.module.weight + QuantizeLinear_690_quantize_scale_node with Conv_694 [03/25/2022-13:24:05] [V] [TRT] Running: ConstWeightsFusion [03/25/2022-13:24:05] [V] [TRT] ConstWeightsFusion: Fusing sections.2.5.conv2.module.weight + QuantizeLinear_705_quantize_scale_node with Conv_709 [03/25/2022-13:24:05] [V] [TRT] Running: ConstWeightsFusion [03/25/2022-13:24:05] [V] [TRT] ConstWeightsFusion: Fusing sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node with Conv_746 [03/25/2022-13:24:05] [V] [TRT] Running: ConstWeightsFusion [03/25/2022-13:24:05] [V] [TRT] ConstWeightsFusion: Fusing sections.3.0.conv2.module.weight + QuantizeLinear_757_quantize_scale_node with Conv_761 [03/25/2022-13:24:05] [V] [TRT] Running: ConstWeightsFusion [03/25/2022-13:24:05] [V] [TRT] ConstWeightsFusion: Fusing sections.3.0.identity.conv.module.weight + QuantizeLinear_786_quantize_scale_node with Conv_790 [03/25/2022-13:24:05] [V] [TRT] Running: ConstWeightsFusion 
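(The ConstWeightsFusion pass announced above continues below.) These passes fold each weight tensor's QuantizeLinear into its convolution, so the weights are quantized once at build time instead of on every inference. Per-tensor INT8 quantization itself is only a scale, round, and clamp; a toy NumPy illustration (TensorRT actually uses per-channel scales for convolution weights, so this is not its exact arithmetic):

    # Toy per-tensor INT8 weight quantization of the kind folded in at build time.
    import numpy as np

    def quantize_weights(w: np.ndarray, scale: float) -> np.ndarray:
        return np.clip(np.round(w / scale), -127, 127).astype(np.int8)

    w = np.random.randn(64, 3, 7, 7).astype(np.float32)   # illustrative conv weights
    scale = np.abs(w).max() / 127.0                        # simple max calibration
    w_int8 = quantize_weights(w, scale)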
[03/25/2022-13:24:05] [V] [TRT] ConstWeightsFusion: Fusing sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node with Conv_812 [03/25/2022-13:24:05] [V] [TRT] Running: ConstWeightsFusion [03/25/2022-13:24:05] [V] [TRT] ConstWeightsFusion: Fusing sections.3.1.conv2.module.weight + QuantizeLinear_823_quantize_scale_node with Conv_827 [03/25/2022-13:24:05] [V] [TRT] Running: ConstWeightsFusion [03/25/2022-13:24:05] [V] [TRT] ConstWeightsFusion: Fusing sections.3.2.conv1.module.weight + QuantizeLinear_860_quantize_scale_node with Conv_864 [03/25/2022-13:24:05] [V] [TRT] Running: ConstWeightsFusion [03/25/2022-13:24:05] [V] [TRT] ConstWeightsFusion: Fusing sections.3.2.conv2.module.weight + QuantizeLinear_875_quantize_scale_node with Conv_879 [03/25/2022-13:24:05] [V] [TRT] Running: ConstWeightsFusion [03/25/2022-13:24:05] [V] [TRT] ConstWeightsFusion: Fusing sections.3.2.conv3.module.weight + QuantizeLinear_890_quantize_scale_node with Conv_894 [03/25/2022-13:24:05] [V] [TRT] Running: ConvReluFusion [03/25/2022-13:24:05] [V] [TRT] ConvReluFusion: Fusing input.conv.module.weight + QuantizeLinear_8_quantize_scale_node + Conv_12 with Relu_14 [03/25/2022-13:24:05] [V] [TRT] Running: ConvReluFusion [03/25/2022-13:24:05] [V] [TRT] ConvReluFusion: Fusing sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 with Relu_30 [03/25/2022-13:24:05] [V] [TRT] Running: ConvReluFusion [03/25/2022-13:24:05] [V] [TRT] ConvReluFusion: Fusing sections.0.0.conv2.module.weight + QuantizeLinear_39_quantize_scale_node + Conv_43 with Relu_45 [03/25/2022-13:24:05] [V] [TRT] Running: ConvReluFusion [03/25/2022-13:24:05] [V] [TRT] ConvReluFusion: Fusing sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 with Relu_96 [03/25/2022-13:24:05] [V] [TRT] Running: ConvReluFusion [03/25/2022-13:24:05] [V] [TRT] ConvReluFusion: Fusing sections.0.1.conv2.module.weight + QuantizeLinear_105_quantize_scale_node + Conv_109 with Relu_111 [03/25/2022-13:24:05] [V] [TRT] Running: ConvReluFusion [03/25/2022-13:24:05] [V] [TRT] ConvReluFusion: Fusing sections.0.2.conv1.module.weight + QuantizeLinear_142_quantize_scale_node + Conv_146 with Relu_148 [03/25/2022-13:24:05] [V] [TRT] Running: ConvReluFusion [03/25/2022-13:24:05] [V] [TRT] ConvReluFusion: Fusing sections.0.2.conv2.module.weight + QuantizeLinear_157_quantize_scale_node + Conv_161 with Relu_163 [03/25/2022-13:24:05] [V] [TRT] Running: ConvReluFusion [03/25/2022-13:24:05] [V] [TRT] ConvReluFusion: Fusing sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 with Relu_200 [03/25/2022-13:24:05] [V] [TRT] Running: ConvReluFusion [03/25/2022-13:24:05] [V] [TRT] ConvReluFusion: Fusing sections.1.0.conv2.module.weight + QuantizeLinear_209_quantize_scale_node + Conv_213 with Relu_215 [03/25/2022-13:24:05] [V] [TRT] Running: ConvReluFusion [03/25/2022-13:24:05] [V] [TRT] ConvReluFusion: Fusing sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 with Relu_266 [03/25/2022-13:24:05] [V] [TRT] Running: ConvReluFusion [03/25/2022-13:24:05] [V] [TRT] ConvReluFusion: Fusing sections.1.1.conv2.module.weight + QuantizeLinear_275_quantize_scale_node + Conv_279 with Relu_281 [03/25/2022-13:24:05] [V] [TRT] Running: ConvReluFusion [03/25/2022-13:24:05] [V] [TRT] ConvReluFusion: Fusing sections.1.2.conv1.module.weight + QuantizeLinear_312_quantize_scale_node + Conv_316 with Relu_318 [03/25/2022-13:24:05] [V] [TRT] Running: ConvReluFusion [03/25/2022-13:24:05] 
[V] [TRT] ConvReluFusion: Fusing sections.1.2.conv2.module.weight + QuantizeLinear_327_quantize_scale_node + Conv_331 with Relu_333 [03/25/2022-13:24:05] [V] [TRT] Running: ConvReluFusion [03/25/2022-13:24:05] [V] [TRT] ConvReluFusion: Fusing sections.1.3.conv1.module.weight + QuantizeLinear_364_quantize_scale_node + Conv_368 with Relu_370 [03/25/2022-13:24:05] [V] [TRT] Running: ConvReluFusion [03/25/2022-13:24:05] [V] [TRT] ConvReluFusion: Fusing sections.1.3.conv2.module.weight + QuantizeLinear_379_quantize_scale_node + Conv_383 with Relu_385 [03/25/2022-13:24:05] [V] [TRT] Running: ConvReluFusion [03/25/2022-13:24:05] [V] [TRT] ConvReluFusion: Fusing sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 with Relu_422 [03/25/2022-13:24:05] [V] [TRT] Running: ConvReluFusion [03/25/2022-13:24:05] [V] [TRT] ConvReluFusion: Fusing sections.2.0.conv2.module.weight + QuantizeLinear_431_quantize_scale_node + Conv_435 with Relu_437 [03/25/2022-13:24:05] [V] [TRT] Running: ConvReluFusion [03/25/2022-13:24:05] [V] [TRT] ConvReluFusion: Fusing sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 with Relu_488 [03/25/2022-13:24:05] [V] [TRT] Running: ConvReluFusion [03/25/2022-13:24:05] [V] [TRT] ConvReluFusion: Fusing sections.2.1.conv2.module.weight + QuantizeLinear_497_quantize_scale_node + Conv_501 with Relu_503 [03/25/2022-13:24:05] [V] [TRT] Running: ConvReluFusion [03/25/2022-13:24:05] [V] [TRT] ConvReluFusion: Fusing sections.2.2.conv1.module.weight + QuantizeLinear_534_quantize_scale_node + Conv_538 with Relu_540 [03/25/2022-13:24:05] [V] [TRT] Running: ConvReluFusion [03/25/2022-13:24:05] [V] [TRT] ConvReluFusion: Fusing sections.2.2.conv2.module.weight + QuantizeLinear_549_quantize_scale_node + Conv_553 with Relu_555 [03/25/2022-13:24:05] [V] [TRT] Running: ConvReluFusion [03/25/2022-13:24:05] [V] [TRT] ConvReluFusion: Fusing sections.2.3.conv1.module.weight + QuantizeLinear_586_quantize_scale_node + Conv_590 with Relu_592 [03/25/2022-13:24:05] [V] [TRT] Running: ConvReluFusion [03/25/2022-13:24:05] [V] [TRT] ConvReluFusion: Fusing sections.2.3.conv2.module.weight + QuantizeLinear_601_quantize_scale_node + Conv_605 with Relu_607 [03/25/2022-13:24:05] [V] [TRT] Running: ConvReluFusion [03/25/2022-13:24:05] [V] [TRT] ConvReluFusion: Fusing sections.2.4.conv1.module.weight + QuantizeLinear_638_quantize_scale_node + Conv_642 with Relu_644 [03/25/2022-13:24:05] [V] [TRT] Running: ConvReluFusion [03/25/2022-13:24:05] [V] [TRT] ConvReluFusion: Fusing sections.2.4.conv2.module.weight + QuantizeLinear_653_quantize_scale_node + Conv_657 with Relu_659 [03/25/2022-13:24:05] [V] [TRT] Running: ConvReluFusion [03/25/2022-13:24:05] [V] [TRT] ConvReluFusion: Fusing sections.2.5.conv1.module.weight + QuantizeLinear_690_quantize_scale_node + Conv_694 with Relu_696 [03/25/2022-13:24:05] [V] [TRT] Running: ConvReluFusion [03/25/2022-13:24:05] [V] [TRT] ConvReluFusion: Fusing sections.2.5.conv2.module.weight + QuantizeLinear_705_quantize_scale_node + Conv_709 with Relu_711 [03/25/2022-13:24:05] [V] [TRT] Running: ConvReluFusion [03/25/2022-13:24:05] [V] [TRT] ConvReluFusion: Fusing sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 with Relu_748 [03/25/2022-13:24:05] [V] [TRT] Running: ConvReluFusion [03/25/2022-13:24:05] [V] [TRT] ConvReluFusion: Fusing sections.3.0.conv2.module.weight + QuantizeLinear_757_quantize_scale_node + Conv_761 with Relu_763 [03/25/2022-13:24:05] [V] [TRT] Running: ConvReluFusion 
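(The ConvReluFusion pass announced above completes below; each of these fusions attaches a ReLU to its convolution so activation and requantization happen inside one kernel.) For reference, the build that produced this log can also be driven through the TensorRT Python API rather than trtexec; a minimal sketch against the 8.2 bindings, with the same INT8 and sparsity settings:

    # Sketch: the equivalent engine build via the TensorRT 8.2 Python API.
    import tensorrt as trt

    logger = trt.Logger(trt.Logger.VERBOSE)
    builder = trt.Builder(logger)
    network = builder.create_network(
        1 << int(trt.NetworkDefinitionCreationFlag.EXPLICIT_BATCH))
    parser = trt.OnnxParser(network, logger)
    with open("resnet50_quant_sparse.onnx", "rb") as f:
        if not parser.parse(f.read()):
            raise RuntimeError(parser.get_error(0))

    config = builder.create_builder_config()
    config.set_flag(trt.BuilderFlag.INT8)            # INT8 kernels, as in this run
    config.set_flag(trt.BuilderFlag.SPARSE_WEIGHTS)  # allow 2:4 sparse tactics
    config.max_workspace_size = 16 << 20             # 16 MiB workspace

    engine_bytes = builder.build_serialized_network(network, config)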
[03/25/2022-13:24:05] [V] [TRT] ConvReluFusion: Fusing sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 with Relu_814 [03/25/2022-13:24:05] [V] [TRT] Running: ConvReluFusion [03/25/2022-13:24:05] [V] [TRT] ConvReluFusion: Fusing sections.3.1.conv2.module.weight + QuantizeLinear_823_quantize_scale_node + Conv_827 with Relu_829 [03/25/2022-13:24:05] [V] [TRT] Running: ConvReluFusion [03/25/2022-13:24:05] [V] [TRT] ConvReluFusion: Fusing sections.3.2.conv1.module.weight + QuantizeLinear_860_quantize_scale_node + Conv_864 with Relu_866 [03/25/2022-13:24:05] [V] [TRT] Running: ConvReluFusion [03/25/2022-13:24:05] [V] [TRT] ConvReluFusion: Fusing sections.3.2.conv2.module.weight + QuantizeLinear_875_quantize_scale_node + Conv_879 with Relu_881 [03/25/2022-13:24:06] [V] [TRT] Running: ConvEltwiseSumFusion [03/25/2022-13:24:06] [V] [TRT] ConvEltwiseSumFusion: Fusing sections.3.2.conv3.module.weight + QuantizeLinear_890_quantize_scale_node + Conv_894 with Add_902 + Relu_903 [03/25/2022-13:24:06] [V] [TRT] Running: SetConvPrecision [03/25/2022-13:24:06] [V] [TRT] After vertical fusions: 60 layers [03/25/2022-13:24:06] [V] [TRT] After dupe layer removal: 60 layers [03/25/2022-13:24:06] [V] [TRT] After final dead-layer removal: 60 layers [03/25/2022-13:24:06] [V] [TRT] After tensor merging: 60 layers [03/25/2022-13:24:06] [V] [TRT] After concat removal: 60 layers [03/25/2022-13:24:06] [V] [TRT] Graph construction and optimization completed in 2.9934 seconds. [03/25/2022-13:24:07] [V] [TRT] Using cublasLt as a tactic source [03/25/2022-13:24:07] [I] [TRT] [MemUsageChange] Init cuBLAS/cuBLASLt: CPU +808, GPU +350, now: CPU 1668, GPU 1188 (MiB) [03/25/2022-13:24:07] [V] [TRT] Using cuDNN as a tactic source [03/25/2022-13:24:07] [I] [TRT] [MemUsageChange] Init cuDNN: CPU +126, GPU +60, now: CPU 1794, GPU 1248 (MiB) [03/25/2022-13:24:07] [I] [TRT] Local timing cache in use. Profiling results in this builder pass will not be stored. [03/25/2022-13:24:07] [V] [TRT] Constructing optimization profile number 0 [1/1]. [03/25/2022-13:24:07] [V] [TRT] Reserving memory for activation tensors. 
Host: 0 bytes Device: 78094336 bytes [03/25/2022-13:24:07] [V] [TRT] =============== Computing reformatting costs [03/25/2022-13:24:07] [V] [TRT] =============== Computing reformatting costs [03/25/2022-13:24:07] [V] [TRT] *************** Autotuning Reformat: Int8(150528,50176,224,1) -> Int8(50176,50176:4,224,1) *************** [03/25/2022-13:24:07] [V] [TRT] --------------- Timing Runner: Optimizer Reformat(1177 -> ) (Reformat) [03/25/2022-13:24:07] [V] [TRT] Tactic: 1002 Time: 0.137856 [03/25/2022-13:24:07] [V] [TRT] Tactic: 0 Time: 0.10048 [03/25/2022-13:24:07] [V] [TRT] Fastest Tactic: 0 Time: 0.10048 [03/25/2022-13:24:07] [V] [TRT] *************** Autotuning Reformat: Int8(150528,50176,224,1) -> Int8(50176,50176:32,224,1) *************** [03/25/2022-13:24:07] [V] [TRT] --------------- Timing Runner: Optimizer Reformat(1177 -> ) (Reformat) [03/25/2022-13:24:07] [V] [TRT] Tactic: 1002 Time: 0.977664 [03/25/2022-13:24:07] [V] [TRT] Tactic: 0 Time: 0.37696 [03/25/2022-13:24:07] [V] [TRT] Fastest Tactic: 0 Time: 0.37696 [03/25/2022-13:24:07] [V] [TRT] *************** Autotuning Reformat: Int8(50176,50176:4,224,1) -> Int8(150528,50176,224,1) *************** [03/25/2022-13:24:07] [V] [TRT] --------------- Timing Runner: Optimizer Reformat(1177 -> ) (Reformat) [03/25/2022-13:24:07] [V] [TRT] Tactic: 1002 Time: 0.179712 [03/25/2022-13:24:07] [V] [TRT] Tactic: 0 Time: 0.248704 [03/25/2022-13:24:07] [V] [TRT] Fastest Tactic: 1002 Time: 0.179712 [03/25/2022-13:24:07] [V] [TRT] *************** Autotuning Reformat: Int8(50176,50176:4,224,1) -> Int8(50176,50176:32,224,1) *************** [03/25/2022-13:24:07] [V] [TRT] --------------- Timing Runner: Optimizer Reformat(1177 -> ) (Reformat) [03/25/2022-13:24:07] [V] [TRT] Tactic: 1002 Time: 1.41402 [03/25/2022-13:24:07] [V] [TRT] Tactic: 0 Time: 0.320128 [03/25/2022-13:24:07] [V] [TRT] Fastest Tactic: 0 Time: 0.320128 [03/25/2022-13:24:07] [V] [TRT] *************** Autotuning Reformat: Int8(50176,50176:32,224,1) -> Int8(150528,50176,224,1) *************** [03/25/2022-13:24:07] [V] [TRT] --------------- Timing Runner: Optimizer Reformat(1177 -> ) (Reformat) [03/25/2022-13:24:07] [V] [TRT] Tactic: 1002 Time: 2.26227 [03/25/2022-13:24:07] [V] [TRT] Tactic: 0 Time: 0.254592 [03/25/2022-13:24:07] [V] [TRT] Fastest Tactic: 0 Time: 0.254592 [03/25/2022-13:24:07] [V] [TRT] *************** Autotuning Reformat: Int8(50176,50176:32,224,1) -> Int8(50176,50176:4,224,1) *************** [03/25/2022-13:24:07] [V] [TRT] --------------- Timing Runner: Optimizer Reformat(1177 -> ) (Reformat) [03/25/2022-13:24:07] [V] [TRT] Tactic: 1002 Time: 1.50054 [03/25/2022-13:24:07] [V] [TRT] Tactic: 0 Time: 0.173184 [03/25/2022-13:24:07] [V] [TRT] Fastest Tactic: 0 Time: 0.173184 [03/25/2022-13:24:07] [V] [TRT] =============== Computing reformatting costs [03/25/2022-13:24:07] [V] [TRT] *************** Autotuning Reformat: Int8(200704,12544:4,112,1) -> Int8(25088,12544:32,112,1) *************** [03/25/2022-13:24:07] [V] [TRT] --------------- Timing Runner: Optimizer Reformat(1190 -> ) (Reformat) [03/25/2022-13:24:07] [V] [TRT] Tactic: 1002 Time: 0.68416 [03/25/2022-13:24:07] [V] [TRT] Tactic: 0 Time: 0.226304 [03/25/2022-13:24:07] [V] [TRT] Fastest Tactic: 0 Time: 0.226304 [03/25/2022-13:24:07] [V] [TRT] *************** Autotuning Reformat: Int8(25088,12544:32,112,1) -> Int8(200704,12544:4,112,1) *************** [03/25/2022-13:24:07] [V] [TRT] --------------- Timing Runner: Optimizer Reformat(1190 -> ) (Reformat) [03/25/2022-13:24:07] [V] [TRT] Tactic: 1002 Time: 0.683904 
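(The timing pair above finishes below.) In these "Computing reformatting costs" blocks the builder times conversion kernels between candidate INT8 layouts: a plain stride tuple such as Int8(150528,50176,224,1) is linear NCHW, while the :4 and :32 suffixes mark channel-vectorized formats (groups of 4 or 32 channels packed together) that INT8 tensor-core kernels typically prefer; the fastest tactic per conversion is kept. The linear strides are just products of the trailing dimensions, which is easy to check:

    # Quick check of the linear-layout stride tuple for a 3x224x224 INT8 input.
    def linear_strides(c: int, h: int, w: int):
        # (elements per image, C stride, H stride, W stride)
        return (c * h * w, h * w, w, 1)

    assert linear_strides(3, 224, 224) == (150528, 50176, 224, 1)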
[03/25/2022-13:24:07] [V] [TRT] Tactic: 0 Time: 0.157568 [03/25/2022-13:24:07] [V] [TRT] Fastest Tactic: 0 Time: 0.157568 [03/25/2022-13:24:07] [V] [TRT] =============== Computing reformatting costs [03/25/2022-13:24:07] [V] [TRT] *************** Autotuning Reformat: Int8(50176,3136:4,56,1) -> Int8(6272,3136:32,56,1) *************** [03/25/2022-13:24:07] [V] [TRT] --------------- Timing Runner: Optimizer Reformat( -> 1193) (Reformat) [03/25/2022-13:24:07] [V] [TRT] Tactic: 1002 Time: 0.180352 [03/25/2022-13:24:07] [V] [TRT] Tactic: 0 Time: 0.063872 [03/25/2022-13:24:07] [V] [TRT] Fastest Tactic: 0 Time: 0.063872 [03/25/2022-13:24:07] [V] [TRT] *************** Autotuning Reformat: Int8(6272,3136:32,56,1) -> Int8(50176,3136:4,56,1) *************** [03/25/2022-13:24:07] [V] [TRT] --------------- Timing Runner: Optimizer Reformat( -> 1193) (Reformat) [03/25/2022-13:24:07] [V] [TRT] Tactic: 1002 Time: 0.180352 [03/25/2022-13:24:07] [V] [TRT] Tactic: 0 Time: 0.045824 [03/25/2022-13:24:07] [V] [TRT] Fastest Tactic: 0 Time: 0.045824 [03/25/2022-13:24:07] [V] [TRT] =============== Computing reformatting costs [03/25/2022-13:24:07] [V] [TRT] *************** Autotuning Reformat: Int8(50176,3136:4,56,1) -> Int8(6272,3136:32,56,1) *************** [03/25/2022-13:24:07] [V] [TRT] --------------- Timing Runner: Optimizer Reformat(1193 -> ) (Reformat) [03/25/2022-13:24:07] [V] [TRT] Tactic: 1002 Time: 0.140544 [03/25/2022-13:24:07] [V] [TRT] Tactic: 0 Time: 0.052608 [03/25/2022-13:24:07] [V] [TRT] Fastest Tactic: 0 Time: 0.052608 [03/25/2022-13:24:07] [V] [TRT] *************** Autotuning Reformat: Int8(6272,3136:32,56,1) -> Int8(50176,3136:4,56,1) *************** [03/25/2022-13:24:07] [V] [TRT] --------------- Timing Runner: Optimizer Reformat(1193 -> ) (Reformat) [03/25/2022-13:24:07] [V] [TRT] Tactic: 1002 Time: 0.140544 [03/25/2022-13:24:07] [V] [TRT] Tactic: 0 Time: 0.045696 [03/25/2022-13:24:07] [V] [TRT] Fastest Tactic: 0 Time: 0.045696 [03/25/2022-13:24:07] [V] [TRT] =============== Computing reformatting costs [03/25/2022-13:24:07] [V] [TRT] *************** Autotuning Reformat: Int8(50176,3136:4,56,1) -> Int8(6272,3136:32,56,1) *************** [03/25/2022-13:24:07] [V] [TRT] *************** Autotuning Reformat: Int8(6272,3136:32,56,1) -> Int8(50176,3136:4,56,1) *************** [03/25/2022-13:24:07] [V] [TRT] =============== Computing reformatting costs [03/25/2022-13:24:07] [V] [TRT] *************** Autotuning Reformat: Int8(50176,3136:4,56,1) -> Int8(6272,3136:32,56,1) *************** [03/25/2022-13:24:07] [V] [TRT] *************** Autotuning Reformat: Int8(6272,3136:32,56,1) -> Int8(50176,3136:4,56,1) *************** [03/25/2022-13:24:07] [V] [TRT] =============== Computing reformatting costs [03/25/2022-13:24:07] [V] [TRT] *************** Autotuning Reformat: Int8(50176,3136:4,56,1) -> Int8(6272,3136:32,56,1) *************** [03/25/2022-13:24:07] [V] [TRT] *************** Autotuning Reformat: Int8(6272,3136:32,56,1) -> Int8(50176,3136:4,56,1) *************** [03/25/2022-13:24:07] [V] [TRT] =============== Computing reformatting costs [03/25/2022-13:24:07] [V] [TRT] *************** Autotuning Reformat: Int8(200704,3136:4,56,1) -> Int8(25088,3136:32,56,1) *************** [03/25/2022-13:24:07] [V] [TRT] --------------- Timing Runner: Optimizer Reformat(1251 -> ) (Reformat) [03/25/2022-13:24:07] [V] [TRT] Tactic: 1002 Time: 0.532736 [03/25/2022-13:24:07] [V] [TRT] Tactic: 0 Time: 0.258304 [03/25/2022-13:24:07] [V] [TRT] Fastest Tactic: 0 Time: 0.258304 [03/25/2022-13:24:07] [V] [TRT] *************** 
Autotuning Reformat: Int8(25088,3136:32,56,1) -> Int8(200704,3136:4,56,1) *************** [03/25/2022-13:24:07] [V] [TRT] --------------- Timing Runner: Optimizer Reformat(1251 -> ) (Reformat) [03/25/2022-13:24:07] [V] [TRT] Tactic: 1002 Time: 0.532352 [03/25/2022-13:24:07] [V] [TRT] Tactic: 0 Time: 0.157312 [03/25/2022-13:24:07] [V] [TRT] Fastest Tactic: 0 Time: 0.157312 [03/25/2022-13:24:07] [V] [TRT] =============== Computing reformatting costs [03/25/2022-13:24:07] [V] [TRT] *************** Autotuning Reformat: Int8(200704,3136:4,56,1) -> Int8(25088,3136:32,56,1) *************** [03/25/2022-13:24:07] [V] [TRT] --------------- Timing Runner: Optimizer Reformat( -> 1259) (Reformat) [03/25/2022-13:24:08] [V] [TRT] Tactic: 1002 Time: 0.532352 [03/25/2022-13:24:08] [V] [TRT] Tactic: 0 Time: 0.258304 [03/25/2022-13:24:08] [V] [TRT] Fastest Tactic: 0 Time: 0.258304 [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning Reformat: Int8(25088,3136:32,56,1) -> Int8(200704,3136:4,56,1) *************** [03/25/2022-13:24:08] [V] [TRT] --------------- Timing Runner: Optimizer Reformat( -> 1259) (Reformat) [03/25/2022-13:24:08] [V] [TRT] Tactic: 1002 Time: 0.532608 [03/25/2022-13:24:08] [V] [TRT] Tactic: 0 Time: 0.157312 [03/25/2022-13:24:08] [V] [TRT] Fastest Tactic: 0 Time: 0.157312 [03/25/2022-13:24:08] [V] [TRT] =============== Computing reformatting costs [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning Reformat: Int8(200704,3136:4,56,1) -> Int8(25088,3136:32,56,1) *************** [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning Reformat: Int8(25088,3136:32,56,1) -> Int8(200704,3136:4,56,1) *************** [03/25/2022-13:24:08] [V] [TRT] =============== Computing reformatting costs [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning Reformat: Int8(50176,3136:4,56,1) -> Int8(6272,3136:32,56,1) *************** [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning Reformat: Int8(6272,3136:32,56,1) -> Int8(50176,3136:4,56,1) *************** [03/25/2022-13:24:08] [V] [TRT] =============== Computing reformatting costs [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning Reformat: Int8(50176,3136:4,56,1) -> Int8(6272,3136:32,56,1) *************** [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning Reformat: Int8(6272,3136:32,56,1) -> Int8(50176,3136:4,56,1) *************** [03/25/2022-13:24:08] [V] [TRT] =============== Computing reformatting costs [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning Reformat: Int8(200704,3136:4,56,1) -> Int8(25088,3136:32,56,1) *************** [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning Reformat: Int8(25088,3136:32,56,1) -> Int8(200704,3136:4,56,1) *************** [03/25/2022-13:24:08] [V] [TRT] =============== Computing reformatting costs [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning Reformat: Int8(200704,3136:4,56,1) -> Int8(25088,3136:32,56,1) *************** [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning Reformat: Int8(25088,3136:32,56,1) -> Int8(200704,3136:4,56,1) *************** [03/25/2022-13:24:08] [V] [TRT] =============== Computing reformatting costs [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning Reformat: Int8(200704,3136:4,56,1) -> Int8(25088,3136:32,56,1) *************** [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning Reformat: Int8(25088,3136:32,56,1) -> Int8(200704,3136:4,56,1) *************** [03/25/2022-13:24:08] [V] [TRT] =============== Computing reformatting costs [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning Reformat: 
Int8(50176,3136:4,56,1) -> Int8(6272,3136:32,56,1) *************** [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning Reformat: Int8(6272,3136:32,56,1) -> Int8(50176,3136:4,56,1) *************** [03/25/2022-13:24:08] [V] [TRT] =============== Computing reformatting costs [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning Reformat: Int8(50176,3136:4,56,1) -> Int8(6272,3136:32,56,1) *************** [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning Reformat: Int8(6272,3136:32,56,1) -> Int8(50176,3136:4,56,1) *************** [03/25/2022-13:24:08] [V] [TRT] =============== Computing reformatting costs [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning Reformat: Int8(200704,3136:4,56,1) -> Int8(25088,3136:32,56,1) *************** [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning Reformat: Int8(25088,3136:32,56,1) -> Int8(200704,3136:4,56,1) *************** [03/25/2022-13:24:08] [V] [TRT] =============== Computing reformatting costs [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning Reformat: Int8(200704,3136:4,56,1) -> Int8(25088,3136:32,56,1) *************** [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning Reformat: Int8(25088,3136:32,56,1) -> Int8(200704,3136:4,56,1) *************** [03/25/2022-13:24:08] [V] [TRT] =============== Computing reformatting costs [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning Reformat: Int8(200704,3136:4,56,1) -> Int8(25088,3136:32,56,1) *************** [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning Reformat: Int8(25088,3136:32,56,1) -> Int8(200704,3136:4,56,1) *************** [03/25/2022-13:24:08] [V] [TRT] =============== Computing reformatting costs [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning Reformat: Int8(200704,3136:4,56,1) -> Int8(25088,3136:32,56,1) *************** [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning Reformat: Int8(25088,3136:32,56,1) -> Int8(200704,3136:4,56,1) *************** [03/25/2022-13:24:08] [V] [TRT] =============== Computing reformatting costs [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning Reformat: Int8(100352,3136:4,56,1) -> Int8(12544,3136:32,56,1) *************** [03/25/2022-13:24:08] [V] [TRT] --------------- Timing Runner: Optimizer Reformat(1378 -> ) (Reformat) [03/25/2022-13:24:08] [V] [TRT] Tactic: 1002 Time: 0.270976 [03/25/2022-13:24:08] [V] [TRT] Tactic: 0 Time: 0.132352 [03/25/2022-13:24:08] [V] [TRT] Fastest Tactic: 0 Time: 0.132352 [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning Reformat: Int8(12544,3136:32,56,1) -> Int8(100352,3136:4,56,1) *************** [03/25/2022-13:24:08] [V] [TRT] --------------- Timing Runner: Optimizer Reformat(1378 -> ) (Reformat) [03/25/2022-13:24:08] [V] [TRT] Tactic: 1002 Time: 0.270976 [03/25/2022-13:24:08] [V] [TRT] Tactic: 0 Time: 0.082944 [03/25/2022-13:24:08] [V] [TRT] Fastest Tactic: 0 Time: 0.082944 [03/25/2022-13:24:08] [V] [TRT] =============== Computing reformatting costs [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning Reformat: Int8(25088,784:4,28,1) -> Int8(3136,784:32,28,1) *************** [03/25/2022-13:24:08] [V] [TRT] --------------- Timing Runner: Optimizer Reformat(1393 -> ) (Reformat) [03/25/2022-13:24:08] [V] [TRT] Tactic: 1002 Time: 0.077056 [03/25/2022-13:24:08] [V] [TRT] Tactic: 0 Time: 0.033024 [03/25/2022-13:24:08] [V] [TRT] Fastest Tactic: 0 Time: 0.033024 [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning Reformat: Int8(3136,784:32,28,1) -> Int8(25088,784:4,28,1) *************** [03/25/2022-13:24:08] [V] [TRT] 
--------------- Timing Runner: Optimizer Reformat(1393 -> ) (Reformat) [03/25/2022-13:24:08] [V] [TRT] Tactic: 1002 Time: 0.076672 [03/25/2022-13:24:08] [V] [TRT] Tactic: 0 Time: 0.022784 [03/25/2022-13:24:08] [V] [TRT] Fastest Tactic: 0 Time: 0.022784 [03/25/2022-13:24:08] [V] [TRT] =============== Computing reformatting costs [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning Reformat: Int8(100352,784:4,28,1) -> Int8(12544,784:32,28,1) *************** [03/25/2022-13:24:08] [V] [TRT] --------------- Timing Runner: Optimizer Reformat(1421 -> ) (Reformat) [03/25/2022-13:24:08] [V] [TRT] Tactic: 1002 Time: 0.278784 [03/25/2022-13:24:08] [V] [TRT] Tactic: 0 Time: 0.172288 [03/25/2022-13:24:08] [V] [TRT] Fastest Tactic: 0 Time: 0.172288 [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning Reformat: Int8(12544,784:32,28,1) -> Int8(100352,784:4,28,1) *************** [03/25/2022-13:24:08] [V] [TRT] --------------- Timing Runner: Optimizer Reformat(1421 -> ) (Reformat) [03/25/2022-13:24:08] [V] [TRT] Tactic: 1002 Time: 0.277888 [03/25/2022-13:24:08] [V] [TRT] Tactic: 0 Time: 0.082048 [03/25/2022-13:24:08] [V] [TRT] Fastest Tactic: 0 Time: 0.082048 [03/25/2022-13:24:08] [V] [TRT] =============== Computing reformatting costs [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning Reformat: Int8(100352,784:4,28,1) -> Int8(12544,784:32,28,1) *************** [03/25/2022-13:24:08] [V] [TRT] --------------- Timing Runner: Optimizer Reformat( -> 1429) (Reformat) [03/25/2022-13:24:08] [V] [TRT] Tactic: 1002 Time: 0.278784 [03/25/2022-13:24:08] [V] [TRT] Tactic: 0 Time: 0.172288 [03/25/2022-13:24:08] [V] [TRT] Fastest Tactic: 0 Time: 0.172288 [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning Reformat: Int8(12544,784:32,28,1) -> Int8(100352,784:4,28,1) *************** [03/25/2022-13:24:08] [V] [TRT] --------------- Timing Runner: Optimizer Reformat( -> 1429) (Reformat) [03/25/2022-13:24:08] [V] [TRT] Tactic: 1002 Time: 0.27776 [03/25/2022-13:24:08] [V] [TRT] Tactic: 0 Time: 0.08192 [03/25/2022-13:24:08] [V] [TRT] Fastest Tactic: 0 Time: 0.08192 [03/25/2022-13:24:08] [V] [TRT] =============== Computing reformatting costs [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning Reformat: Int8(100352,784:4,28,1) -> Int8(12544,784:32,28,1) *************** [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning Reformat: Int8(12544,784:32,28,1) -> Int8(100352,784:4,28,1) *************** [03/25/2022-13:24:08] [V] [TRT] =============== Computing reformatting costs [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning Reformat: Int8(25088,784:4,28,1) -> Int8(3136,784:32,28,1) *************** [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning Reformat: Int8(3136,784:32,28,1) -> Int8(25088,784:4,28,1) *************** [03/25/2022-13:24:08] [V] [TRT] =============== Computing reformatting costs [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning Reformat: Int8(25088,784:4,28,1) -> Int8(3136,784:32,28,1) *************** [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning Reformat: Int8(3136,784:32,28,1) -> Int8(25088,784:4,28,1) *************** [03/25/2022-13:24:08] [V] [TRT] =============== Computing reformatting costs [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning Reformat: Int8(100352,784:4,28,1) -> Int8(12544,784:32,28,1) *************** [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning Reformat: Int8(12544,784:32,28,1) -> Int8(100352,784:4,28,1) *************** [03/25/2022-13:24:08] [V] [TRT] =============== Computing reformatting costs 
[03/25/2022-13:24:08] [V] [TRT] *************** Autotuning Reformat: Int8(100352,784:4,28,1) -> Int8(12544,784:32,28,1) *************** [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning Reformat: Int8(12544,784:32,28,1) -> Int8(100352,784:4,28,1) *************** [03/25/2022-13:24:08] [V] [TRT] =============== Computing reformatting costs [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning Reformat: Int8(100352,784:4,28,1) -> Int8(12544,784:32,28,1) *************** [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning Reformat: Int8(12544,784:32,28,1) -> Int8(100352,784:4,28,1) *************** [03/25/2022-13:24:08] [V] [TRT] =============== Computing reformatting costs [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning Reformat: Int8(25088,784:4,28,1) -> Int8(3136,784:32,28,1) *************** [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning Reformat: Int8(3136,784:32,28,1) -> Int8(25088,784:4,28,1) *************** [03/25/2022-13:24:08] [V] [TRT] =============== Computing reformatting costs [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning Reformat: Int8(25088,784:4,28,1) -> Int8(3136,784:32,28,1) *************** [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning Reformat: Int8(3136,784:32,28,1) -> Int8(25088,784:4,28,1) *************** [03/25/2022-13:24:08] [V] [TRT] =============== Computing reformatting costs [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning Reformat: Int8(100352,784:4,28,1) -> Int8(12544,784:32,28,1) *************** [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning Reformat: Int8(12544,784:32,28,1) -> Int8(100352,784:4,28,1) *************** [03/25/2022-13:24:08] [V] [TRT] =============== Computing reformatting costs [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning Reformat: Int8(100352,784:4,28,1) -> Int8(12544,784:32,28,1) *************** [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning Reformat: Int8(12544,784:32,28,1) -> Int8(100352,784:4,28,1) *************** [03/25/2022-13:24:08] [V] [TRT] =============== Computing reformatting costs [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning Reformat: Int8(100352,784:4,28,1) -> Int8(12544,784:32,28,1) *************** [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning Reformat: Int8(12544,784:32,28,1) -> Int8(100352,784:4,28,1) *************** [03/25/2022-13:24:08] [V] [TRT] =============== Computing reformatting costs [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning Reformat: Int8(25088,784:4,28,1) -> Int8(3136,784:32,28,1) *************** [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning Reformat: Int8(3136,784:32,28,1) -> Int8(25088,784:4,28,1) *************** [03/25/2022-13:24:08] [V] [TRT] =============== Computing reformatting costs [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning Reformat: Int8(25088,784:4,28,1) -> Int8(3136,784:32,28,1) *************** [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning Reformat: Int8(3136,784:32,28,1) -> Int8(25088,784:4,28,1) *************** [03/25/2022-13:24:08] [V] [TRT] =============== Computing reformatting costs [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning Reformat: Int8(100352,784:4,28,1) -> Int8(12544,784:32,28,1) *************** [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning Reformat: Int8(12544,784:32,28,1) -> Int8(100352,784:4,28,1) *************** [03/25/2022-13:24:08] [V] [TRT] =============== Computing reformatting costs [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning Reformat: 
Int8(100352,784:4,28,1) -> Int8(12544,784:32,28,1) *************** [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning Reformat: Int8(12544,784:32,28,1) -> Int8(100352,784:4,28,1) *************** [03/25/2022-13:24:08] [V] [TRT] =============== Computing reformatting costs [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning Reformat: Int8(100352,784:4,28,1) -> Int8(12544,784:32,28,1) *************** [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning Reformat: Int8(12544,784:32,28,1) -> Int8(100352,784:4,28,1) *************** [03/25/2022-13:24:08] [V] [TRT] =============== Computing reformatting costs [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning Reformat: Int8(100352,784:4,28,1) -> Int8(12544,784:32,28,1) *************** [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning Reformat: Int8(12544,784:32,28,1) -> Int8(100352,784:4,28,1) *************** [03/25/2022-13:24:08] [V] [TRT] =============== Computing reformatting costs [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning Reformat: Int8(50176,784:4,28,1) -> Int8(6272,784:32,28,1) *************** [03/25/2022-13:24:08] [V] [TRT] --------------- Timing Runner: Optimizer Reformat(1600 -> ) (Reformat) [03/25/2022-13:24:08] [V] [TRT] Tactic: 1002 Time: 0.144 [03/25/2022-13:24:08] [V] [TRT] Tactic: 0 Time: 0.059136 [03/25/2022-13:24:08] [V] [TRT] Fastest Tactic: 0 Time: 0.059136 [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning Reformat: Int8(6272,784:32,28,1) -> Int8(50176,784:4,28,1) *************** [03/25/2022-13:24:08] [V] [TRT] --------------- Timing Runner: Optimizer Reformat(1600 -> ) (Reformat) [03/25/2022-13:24:08] [V] [TRT] Tactic: 1002 Time: 0.143616 [03/25/2022-13:24:08] [V] [TRT] Tactic: 0 Time: 0.045824 [03/25/2022-13:24:08] [V] [TRT] Fastest Tactic: 0 Time: 0.045824 [03/25/2022-13:24:08] [V] [TRT] =============== Computing reformatting costs [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning Reformat: Int8(12544,196:4,14,1) -> Int8(1568,196:32,14,1) *************** [03/25/2022-13:24:08] [V] [TRT] --------------- Timing Runner: Optimizer Reformat(1615 -> ) (Reformat) [03/25/2022-13:24:08] [V] [TRT] Tactic: 1002 Time: 0.047744 [03/25/2022-13:24:08] [V] [TRT] Tactic: 0 Time: 0.018432 [03/25/2022-13:24:08] [V] [TRT] Fastest Tactic: 0 Time: 0.018432 [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning Reformat: Int8(1568,196:32,14,1) -> Int8(12544,196:4,14,1) *************** [03/25/2022-13:24:08] [V] [TRT] --------------- Timing Runner: Optimizer Reformat(1615 -> ) (Reformat) [03/25/2022-13:24:08] [V] [TRT] Tactic: 1002 Time: 0.047104 [03/25/2022-13:24:08] [V] [TRT] Tactic: 0 Time: 0.017408 [03/25/2022-13:24:08] [V] [TRT] Fastest Tactic: 0 Time: 0.017408 [03/25/2022-13:24:08] [V] [TRT] =============== Computing reformatting costs [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning Reformat: Int8(50176,196:4,14,1) -> Int8(6272,196:32,14,1) *************** [03/25/2022-13:24:08] [V] [TRT] --------------- Timing Runner: Optimizer Reformat(1643 -> ) (Reformat) [03/25/2022-13:24:08] [V] [TRT] Tactic: 1002 Time: 0.1632 [03/25/2022-13:24:08] [V] [TRT] Tactic: 0 Time: 0.091008 [03/25/2022-13:24:08] [V] [TRT] Fastest Tactic: 0 Time: 0.091008 [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning Reformat: Int8(6272,196:32,14,1) -> Int8(50176,196:4,14,1) *************** [03/25/2022-13:24:08] [V] [TRT] --------------- Timing Runner: Optimizer Reformat(1643 -> ) (Reformat) [03/25/2022-13:24:08] [V] [TRT] Tactic: 1002 Time: 0.162304 [03/25/2022-13:24:08] [V] [TRT] 
Tactic: 0 Time: 0.051712 [03/25/2022-13:24:08] [V] [TRT] Fastest Tactic: 0 Time: 0.051712 [03/25/2022-13:24:08] [V] [TRT] =============== Computing reformatting costs [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning Reformat: Int8(50176,196:4,14,1) -> Int8(6272,196:32,14,1) *************** [03/25/2022-13:24:08] [V] [TRT] --------------- Timing Runner: Optimizer Reformat( -> 1651) (Reformat) [03/25/2022-13:24:08] [V] [TRT] Tactic: 1002 Time: 0.163072 [03/25/2022-13:24:08] [V] [TRT] Tactic: 0 Time: 0.091136 [03/25/2022-13:24:08] [V] [TRT] Fastest Tactic: 0 Time: 0.091136 [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning Reformat: Int8(6272,196:32,14,1) -> Int8(50176,196:4,14,1) *************** [03/25/2022-13:24:08] [V] [TRT] --------------- Timing Runner: Optimizer Reformat( -> 1651) (Reformat) [03/25/2022-13:24:08] [V] [TRT] Tactic: 1002 Time: 0.162304 [03/25/2022-13:24:08] [V] [TRT] Tactic: 0 Time: 0.05184 [03/25/2022-13:24:08] [V] [TRT] Fastest Tactic: 0 Time: 0.05184 [03/25/2022-13:24:08] [V] [TRT] =============== Computing reformatting costs [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning Reformat: Int8(50176,196:4,14,1) -> Int8(6272,196:32,14,1) *************** [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning Reformat: Int8(6272,196:32,14,1) -> Int8(50176,196:4,14,1) *************** [03/25/2022-13:24:08] [V] [TRT] =============== Computing reformatting costs [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning Reformat: Int8(12544,196:4,14,1) -> Int8(1568,196:32,14,1) *************** [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning Reformat: Int8(1568,196:32,14,1) -> Int8(12544,196:4,14,1) *************** [03/25/2022-13:24:08] [V] [TRT] =============== Computing reformatting costs [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning Reformat: Int8(12544,196:4,14,1) -> Int8(1568,196:32,14,1) *************** [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning Reformat: Int8(1568,196:32,14,1) -> Int8(12544,196:4,14,1) *************** [03/25/2022-13:24:08] [V] [TRT] =============== Computing reformatting costs [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning Reformat: Int8(50176,196:4,14,1) -> Int8(6272,196:32,14,1) *************** [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning Reformat: Int8(6272,196:32,14,1) -> Int8(50176,196:4,14,1) *************** [03/25/2022-13:24:08] [V] [TRT] =============== Computing reformatting costs [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning Reformat: Int8(50176,196:4,14,1) -> Int8(6272,196:32,14,1) *************** [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning Reformat: Int8(6272,196:32,14,1) -> Int8(50176,196:4,14,1) *************** [03/25/2022-13:24:08] [V] [TRT] =============== Computing reformatting costs [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning Reformat: Int8(50176,196:4,14,1) -> Int8(6272,196:32,14,1) *************** [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning Reformat: Int8(6272,196:32,14,1) -> Int8(50176,196:4,14,1) *************** [03/25/2022-13:24:08] [V] [TRT] =============== Computing reformatting costs [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning Reformat: Int8(12544,196:4,14,1) -> Int8(1568,196:32,14,1) *************** [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning Reformat: Int8(1568,196:32,14,1) -> Int8(12544,196:4,14,1) *************** [03/25/2022-13:24:08] [V] [TRT] =============== Computing reformatting costs [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning 
Reformat: Int8(12544,196:4,14,1) -> Int8(1568,196:32,14,1) *************** [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning Reformat: Int8(1568,196:32,14,1) -> Int8(12544,196:4,14,1) *************** [03/25/2022-13:24:08] [V] [TRT] =============== Computing reformatting costs [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning Reformat: Int8(50176,196:4,14,1) -> Int8(6272,196:32,14,1) *************** [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning Reformat: Int8(6272,196:32,14,1) -> Int8(50176,196:4,14,1) *************** [03/25/2022-13:24:08] [V] [TRT] =============== Computing reformatting costs [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning Reformat: Int8(50176,196:4,14,1) -> Int8(6272,196:32,14,1) *************** [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning Reformat: Int8(6272,196:32,14,1) -> Int8(50176,196:4,14,1) *************** [03/25/2022-13:24:08] [V] [TRT] =============== Computing reformatting costs [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning Reformat: Int8(50176,196:4,14,1) -> Int8(6272,196:32,14,1) *************** [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning Reformat: Int8(6272,196:32,14,1) -> Int8(50176,196:4,14,1) *************** [03/25/2022-13:24:08] [V] [TRT] =============== Computing reformatting costs [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning Reformat: Int8(12544,196:4,14,1) -> Int8(1568,196:32,14,1) *************** [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning Reformat: Int8(1568,196:32,14,1) -> Int8(12544,196:4,14,1) *************** [03/25/2022-13:24:08] [V] [TRT] =============== Computing reformatting costs [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning Reformat: Int8(12544,196:4,14,1) -> Int8(1568,196:32,14,1) *************** [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning Reformat: Int8(1568,196:32,14,1) -> Int8(12544,196:4,14,1) *************** [03/25/2022-13:24:08] [V] [TRT] =============== Computing reformatting costs [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning Reformat: Int8(50176,196:4,14,1) -> Int8(6272,196:32,14,1) *************** [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning Reformat: Int8(6272,196:32,14,1) -> Int8(50176,196:4,14,1) *************** [03/25/2022-13:24:08] [V] [TRT] =============== Computing reformatting costs [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning Reformat: Int8(50176,196:4,14,1) -> Int8(6272,196:32,14,1) *************** [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning Reformat: Int8(6272,196:32,14,1) -> Int8(50176,196:4,14,1) *************** [03/25/2022-13:24:08] [V] [TRT] =============== Computing reformatting costs [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning Reformat: Int8(50176,196:4,14,1) -> Int8(6272,196:32,14,1) *************** [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning Reformat: Int8(6272,196:32,14,1) -> Int8(50176,196:4,14,1) *************** [03/25/2022-13:24:08] [V] [TRT] =============== Computing reformatting costs [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning Reformat: Int8(12544,196:4,14,1) -> Int8(1568,196:32,14,1) *************** [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning Reformat: Int8(1568,196:32,14,1) -> Int8(12544,196:4,14,1) *************** [03/25/2022-13:24:08] [V] [TRT] =============== Computing reformatting costs [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning Reformat: Int8(12544,196:4,14,1) -> Int8(1568,196:32,14,1) *************** [03/25/2022-13:24:08] [V] 
[TRT] *************** Autotuning Reformat: Int8(1568,196:32,14,1) -> Int8(12544,196:4,14,1) *************** [03/25/2022-13:24:08] [V] [TRT] =============== Computing reformatting costs [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning Reformat: Int8(50176,196:4,14,1) -> Int8(6272,196:32,14,1) *************** [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning Reformat: Int8(6272,196:32,14,1) -> Int8(50176,196:4,14,1) *************** [03/25/2022-13:24:08] [V] [TRT] =============== Computing reformatting costs [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning Reformat: Int8(50176,196:4,14,1) -> Int8(6272,196:32,14,1) *************** [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning Reformat: Int8(6272,196:32,14,1) -> Int8(50176,196:4,14,1) *************** [03/25/2022-13:24:08] [V] [TRT] =============== Computing reformatting costs [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning Reformat: Int8(50176,196:4,14,1) -> Int8(6272,196:32,14,1) *************** [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning Reformat: Int8(6272,196:32,14,1) -> Int8(50176,196:4,14,1) *************** [03/25/2022-13:24:08] [V] [TRT] =============== Computing reformatting costs [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning Reformat: Int8(12544,196:4,14,1) -> Int8(1568,196:32,14,1) *************** [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning Reformat: Int8(1568,196:32,14,1) -> Int8(12544,196:4,14,1) *************** [03/25/2022-13:24:08] [V] [TRT] =============== Computing reformatting costs [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning Reformat: Int8(12544,196:4,14,1) -> Int8(1568,196:32,14,1) *************** [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning Reformat: Int8(1568,196:32,14,1) -> Int8(12544,196:4,14,1) *************** [03/25/2022-13:24:08] [V] [TRT] =============== Computing reformatting costs [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning Reformat: Int8(50176,196:4,14,1) -> Int8(6272,196:32,14,1) *************** [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning Reformat: Int8(6272,196:32,14,1) -> Int8(50176,196:4,14,1) *************** [03/25/2022-13:24:08] [V] [TRT] =============== Computing reformatting costs [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning Reformat: Int8(50176,196:4,14,1) -> Int8(6272,196:32,14,1) *************** [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning Reformat: Int8(6272,196:32,14,1) -> Int8(50176,196:4,14,1) *************** [03/25/2022-13:24:08] [V] [TRT] =============== Computing reformatting costs [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning Reformat: Int8(50176,196:4,14,1) -> Int8(6272,196:32,14,1) *************** [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning Reformat: Int8(6272,196:32,14,1) -> Int8(50176,196:4,14,1) *************** [03/25/2022-13:24:08] [V] [TRT] =============== Computing reformatting costs [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning Reformat: Int8(50176,196:4,14,1) -> Int8(6272,196:32,14,1) *************** [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning Reformat: Int8(6272,196:32,14,1) -> Int8(50176,196:4,14,1) *************** [03/25/2022-13:24:08] [V] [TRT] =============== Computing reformatting costs [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning Reformat: Int8(25088,196:4,14,1) -> Int8(3136,196:32,14,1) *************** [03/25/2022-13:24:08] [V] [TRT] --------------- Timing Runner: Optimizer Reformat(1926 -> ) (Reformat) [03/25/2022-13:24:08] [V] 
[TRT] Tactic: 1002 Time: 0.086272 [03/25/2022-13:24:08] [V] [TRT] Tactic: 0 Time: 0.046592 [03/25/2022-13:24:08] [V] [TRT] Fastest Tactic: 0 Time: 0.046592 [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning Reformat: Int8(3136,196:32,14,1) -> Int8(25088,196:4,14,1) *************** [03/25/2022-13:24:08] [V] [TRT] --------------- Timing Runner: Optimizer Reformat(1926 -> ) (Reformat) [03/25/2022-13:24:08] [V] [TRT] Tactic: 1002 Time: 0.085376 [03/25/2022-13:24:08] [V] [TRT] Tactic: 0 Time: 0.028928 [03/25/2022-13:24:08] [V] [TRT] Fastest Tactic: 0 Time: 0.028928 [03/25/2022-13:24:08] [V] [TRT] =============== Computing reformatting costs [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning Reformat: Int8(6272,49:4,7,1) -> Int8(784,49:32,7,1) *************** [03/25/2022-13:24:08] [V] [TRT] --------------- Timing Runner: Optimizer Reformat(1941 -> ) (Reformat) [03/25/2022-13:24:08] [V] [TRT] Tactic: 1002 Time: 0.032768 [03/25/2022-13:24:08] [V] [TRT] Tactic: 0 Time: 0.01728 [03/25/2022-13:24:08] [V] [TRT] Fastest Tactic: 0 Time: 0.01728 [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning Reformat: Int8(784,49:32,7,1) -> Int8(6272,49:4,7,1) *************** [03/25/2022-13:24:08] [V] [TRT] --------------- Timing Runner: Optimizer Reformat(1941 -> ) (Reformat) [03/25/2022-13:24:08] [V] [TRT] Tactic: 1002 Time: 0.032256 [03/25/2022-13:24:08] [V] [TRT] Tactic: 0 Time: 0.012288 [03/25/2022-13:24:08] [V] [TRT] Fastest Tactic: 0 Time: 0.012288 [03/25/2022-13:24:08] [V] [TRT] =============== Computing reformatting costs [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning Reformat: Int8(25088,49:4,7,1) -> Int8(3136,49:32,7,1) *************** [03/25/2022-13:24:08] [V] [TRT] --------------- Timing Runner: Optimizer Reformat(1969 -> ) (Reformat) [03/25/2022-13:24:08] [V] [TRT] Tactic: 1002 Time: 0.102272 [03/25/2022-13:24:08] [V] [TRT] Tactic: 0 Time: 0.048 [03/25/2022-13:24:08] [V] [TRT] Fastest Tactic: 0 Time: 0.048 [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning Reformat: Int8(3136,49:32,7,1) -> Int8(25088,49:4,7,1) *************** [03/25/2022-13:24:08] [V] [TRT] --------------- Timing Runner: Optimizer Reformat(1969 -> ) (Reformat) [03/25/2022-13:24:08] [V] [TRT] Tactic: 1002 Time: 0.100736 [03/25/2022-13:24:08] [V] [TRT] Tactic: 0 Time: 0.027904 [03/25/2022-13:24:08] [V] [TRT] Fastest Tactic: 0 Time: 0.027904 [03/25/2022-13:24:08] [V] [TRT] =============== Computing reformatting costs [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning Reformat: Int8(25088,49:4,7,1) -> Int8(3136,49:32,7,1) *************** [03/25/2022-13:24:08] [V] [TRT] --------------- Timing Runner: Optimizer Reformat( -> 1977) (Reformat) [03/25/2022-13:24:08] [V] [TRT] Tactic: 1002 Time: 0.102144 [03/25/2022-13:24:08] [V] [TRT] Tactic: 0 Time: 0.048256 [03/25/2022-13:24:08] [V] [TRT] Fastest Tactic: 0 Time: 0.048256 [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning Reformat: Int8(3136,49:32,7,1) -> Int8(25088,49:4,7,1) *************** [03/25/2022-13:24:08] [V] [TRT] --------------- Timing Runner: Optimizer Reformat( -> 1977) (Reformat) [03/25/2022-13:24:08] [V] [TRT] Tactic: 1002 Time: 0.100608 [03/25/2022-13:24:08] [V] [TRT] Tactic: 0 Time: 0.027904 [03/25/2022-13:24:08] [V] [TRT] Fastest Tactic: 0 Time: 0.027904 [03/25/2022-13:24:08] [V] [TRT] =============== Computing reformatting costs [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning Reformat: Int8(25088,49:4,7,1) -> Int8(3136,49:32,7,1) *************** [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning 
Reformat: Int8(3136,49:32,7,1) -> Int8(25088,49:4,7,1) *************** [03/25/2022-13:24:08] [V] [TRT] =============== Computing reformatting costs [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning Reformat: Int8(6272,49:4,7,1) -> Int8(784,49:32,7,1) *************** [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning Reformat: Int8(784,49:32,7,1) -> Int8(6272,49:4,7,1) *************** [03/25/2022-13:24:08] [V] [TRT] =============== Computing reformatting costs [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning Reformat: Int8(6272,49:4,7,1) -> Int8(784,49:32,7,1) *************** [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning Reformat: Int8(784,49:32,7,1) -> Int8(6272,49:4,7,1) *************** [03/25/2022-13:24:08] [V] [TRT] =============== Computing reformatting costs [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning Reformat: Int8(25088,49:4,7,1) -> Int8(3136,49:32,7,1) *************** [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning Reformat: Int8(3136,49:32,7,1) -> Int8(25088,49:4,7,1) *************** [03/25/2022-13:24:08] [V] [TRT] =============== Computing reformatting costs [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning Reformat: Int8(25088,49:4,7,1) -> Int8(100352,49,7,1) *************** [03/25/2022-13:24:08] [V] [TRT] --------------- Timing Runner: Optimizer Reformat( -> 2029) (Reformat) [03/25/2022-13:24:08] [V] [TRT] Tactic: 1002 Time: 0.087296 [03/25/2022-13:24:08] [V] [TRT] Tactic: 0 Time: 0.133376 [03/25/2022-13:24:08] [V] [TRT] Fastest Tactic: 1002 Time: 0.087296 [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning Reformat: Int8(25088,49:4,7,1) -> Int8(3136,49:32,7,1) *************** [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning Reformat: Int8(3136,49:32,7,1) -> Int8(100352,49,7,1) *************** [03/25/2022-13:24:08] [V] [TRT] --------------- Timing Runner: Optimizer Reformat( -> 2029) (Reformat) [03/25/2022-13:24:08] [V] [TRT] Tactic: 1002 Time: 0.189184 [03/25/2022-13:24:08] [V] [TRT] Tactic: 0 Time: 0.134272 [03/25/2022-13:24:08] [V] [TRT] Fastest Tactic: 0 Time: 0.134272 [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning Reformat: Int8(3136,49:32,7,1) -> Int8(25088,49:4,7,1) *************** [03/25/2022-13:24:08] [V] [TRT] =============== Computing reformatting costs [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning Reformat: Int8(100352,49,7,1) -> Int8(25088,49:4,7,1) *************** [03/25/2022-13:24:08] [V] [TRT] --------------- Timing Runner: Optimizer Reformat(2029 -> ) (Reformat) [03/25/2022-13:24:08] [V] [TRT] Tactic: 1002 Time: 0.084992 [03/25/2022-13:24:08] [V] [TRT] Tactic: 0 Time: 0.061696 [03/25/2022-13:24:08] [V] [TRT] Fastest Tactic: 0 Time: 0.061696 [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning Reformat: Int8(100352,49,7,1) -> Int8(3136,49:32,7,1) *************** [03/25/2022-13:24:08] [V] [TRT] --------------- Timing Runner: Optimizer Reformat(2029 -> ) (Reformat) [03/25/2022-13:24:08] [V] [TRT] Tactic: 1002 Time: 0.077952 [03/25/2022-13:24:08] [V] [TRT] Tactic: 0 Time: 0.217088 [03/25/2022-13:24:08] [V] [TRT] Fastest Tactic: 1002 Time: 0.077952 [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning Reformat: Int8(25088,49:4,7,1) -> Int8(100352,49,7,1) *************** [03/25/2022-13:24:08] [V] [TRT] --------------- Timing Runner: Optimizer Reformat(2029 -> ) (Reformat) [03/25/2022-13:24:08] [V] [TRT] Tactic: 1002 Time: 0.08704 [03/25/2022-13:24:08] [V] [TRT] Tactic: 0 Time: 0.133376 [03/25/2022-13:24:08] [V] [TRT] Fastest Tactic: 1002 
Time: 0.08704 [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning Reformat: Int8(25088,49:4,7,1) -> Int8(3136,49:32,7,1) *************** [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning Reformat: Int8(3136,49:32,7,1) -> Int8(100352,49,7,1) *************** [03/25/2022-13:24:08] [V] [TRT] --------------- Timing Runner: Optimizer Reformat(2029 -> ) (Reformat) [03/25/2022-13:24:08] [V] [TRT] Tactic: 1002 Time: 0.188928 [03/25/2022-13:24:08] [V] [TRT] Tactic: 0 Time: 0.134272 [03/25/2022-13:24:08] [V] [TRT] Fastest Tactic: 0 Time: 0.134272 [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning Reformat: Int8(3136,49:32,7,1) -> Int8(25088,49:4,7,1) *************** [03/25/2022-13:24:08] [V] [TRT] =============== Computing reformatting costs [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning Reformat: Int8(100352,49,7,1) -> Int8(25088,49:4,7,1) *************** [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning Reformat: Int8(100352,49,7,1) -> Int8(3136,49:32,7,1) *************** [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning Reformat: Int8(25088,49:4,7,1) -> Int8(3136,49:32,7,1) *************** [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning Reformat: Int8(3136,49:32,7,1) -> Int8(25088,49:4,7,1) *************** [03/25/2022-13:24:08] [V] [TRT] =============== Computing reformatting costs [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning Reformat: Int8(6272,49:4,7,1) -> Int8(784,49:32,7,1) *************** [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning Reformat: Int8(784,49:32,7,1) -> Int8(6272,49:4,7,1) *************** [03/25/2022-13:24:08] [V] [TRT] =============== Computing reformatting costs [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning Reformat: Int8(6272,49:4,7,1) -> Int8(784,49:32,7,1) *************** [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning Reformat: Int8(784,49:32,7,1) -> Int8(6272,49:4,7,1) *************** [03/25/2022-13:24:08] [V] [TRT] =============== Computing reformatting costs [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning Reformat: Float(100352,49,7,1) -> Float(3136,49:32,7,1) *************** [03/25/2022-13:24:08] [V] [TRT] --------------- Timing Runner: Optimizer Reformat(2076 -> ) (Reformat) [03/25/2022-13:24:08] [V] [TRT] Tactic: 1002 Time: 0.094336 [03/25/2022-13:24:08] [V] [TRT] Tactic: 0 Time: 0.323072 [03/25/2022-13:24:08] [V] [TRT] Fastest Tactic: 1002 Time: 0.094336 [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning Reformat: Float(3136,49:32,7,1) -> Float(100352,49,7,1) *************** [03/25/2022-13:24:08] [V] [TRT] --------------- Timing Runner: Optimizer Reformat(2076 -> ) (Reformat) [03/25/2022-13:24:08] [V] [TRT] Tactic: 1002 Time: 0.100224 [03/25/2022-13:24:08] [V] [TRT] Tactic: 0 Time: 0.107264 [03/25/2022-13:24:08] [V] [TRT] Fastest Tactic: 1002 Time: 0.100224 [03/25/2022-13:24:08] [V] [TRT] =============== Computing reformatting costs [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning Reformat: Float(3136,49:32,7,1) -> Float(100352,49,7,1) *************** [03/25/2022-13:24:08] [V] [TRT] =============== Computing reformatting costs [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning Reformat: Float(2048,1,1,1) -> Float(2048,1,2048,2048) *************** [03/25/2022-13:24:08] [V] [TRT] --------------- Timing Runner: Optimizer Reformat( -> 2079) (Reformat) [03/25/2022-13:24:08] [V] [TRT] Tactic: 1002 Time: 0.01024 [03/25/2022-13:24:08] [V] [TRT] Tactic: 0 Time: 0.009216 [03/25/2022-13:24:08] [V] [TRT] Fastest Tactic: 0 Time: 
0.009216 [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning Reformat: Float(2048,1,1,1) -> Float(512,1:4,512,512) *************** [03/25/2022-13:24:08] [V] [TRT] --------------- Timing Runner: Optimizer Reformat( -> 2079) (Reformat) [03/25/2022-13:24:08] [V] [TRT] Tactic: 1002 Time: 0.0128 [03/25/2022-13:24:08] [V] [TRT] Tactic: 0 Time: 0.010112 [03/25/2022-13:24:08] [V] [TRT] Fastest Tactic: 0 Time: 0.010112 [03/25/2022-13:24:08] [V] [TRT] =============== Computing reformatting costs [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning Reformat: Float(2048,1,1,1) -> Float(2048,1,2048,2048) *************** [03/25/2022-13:24:08] [V] [TRT] --------------- Timing Runner: Optimizer Reformat(2079 -> ) (Reformat) [03/25/2022-13:24:08] [V] [TRT] Tactic: 1002 Time: 0.01024 [03/25/2022-13:24:08] [V] [TRT] Tactic: 0 Time: 0.009216 [03/25/2022-13:24:08] [V] [TRT] Fastest Tactic: 0 Time: 0.009216 [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning Reformat: Float(2048,1,1,1) -> Float(512,1:4,512,512) *************** [03/25/2022-13:24:08] [V] [TRT] --------------- Timing Runner: Optimizer Reformat(2079 -> ) (Reformat) [03/25/2022-13:24:08] [V] [TRT] Tactic: 1002 Time: 0.013184 [03/25/2022-13:24:08] [V] [TRT] Tactic: 0 Time: 0.009984 [03/25/2022-13:24:08] [V] [TRT] Fastest Tactic: 0 Time: 0.009984 [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning Reformat: Float(2048,1,2048,2048) -> Float(2048,1,1,1) *************** [03/25/2022-13:24:08] [V] [TRT] --------------- Timing Runner: Optimizer Reformat(2079 -> ) (Reformat) [03/25/2022-13:24:08] [V] [TRT] Tactic: 1002 Time: 0.01024 [03/25/2022-13:24:08] [V] [TRT] Tactic: 0 Time: 0.009216 [03/25/2022-13:24:08] [V] [TRT] Fastest Tactic: 0 Time: 0.009216 [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning Reformat: Float(2048,1,2048,2048) -> Float(512,1:4,512,512) *************** [03/25/2022-13:24:08] [V] [TRT] --------------- Timing Runner: Optimizer Reformat(2079 -> ) (Reformat) [03/25/2022-13:24:08] [V] [TRT] Tactic: 1002 Time: 0.012288 [03/25/2022-13:24:08] [V] [TRT] Tactic: 0 Time: 0.01024 [03/25/2022-13:24:08] [V] [TRT] Fastest Tactic: 0 Time: 0.01024 [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning Reformat: Float(512,1:4,512,512) -> Float(2048,1,1,1) *************** [03/25/2022-13:24:08] [V] [TRT] --------------- Timing Runner: Optimizer Reformat(2079 -> ) (Reformat) [03/25/2022-13:24:08] [V] [TRT] Tactic: 1002 Time: 0.084992 [03/25/2022-13:24:08] [V] [TRT] Tactic: 0 Time: 0.009472 [03/25/2022-13:24:08] [V] [TRT] Fastest Tactic: 0 Time: 0.009472 [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning Reformat: Float(512,1:4,512,512) -> Float(2048,1,2048,2048) *************** [03/25/2022-13:24:08] [V] [TRT] --------------- Timing Runner: Optimizer Reformat(2079 -> ) (Reformat) [03/25/2022-13:24:08] [V] [TRT] Tactic: 1002 Time: 0.012288 [03/25/2022-13:24:08] [V] [TRT] Tactic: 0 Time: 0.010112 [03/25/2022-13:24:08] [V] [TRT] Fastest Tactic: 0 Time: 0.010112 [03/25/2022-13:24:08] [V] [TRT] =============== Computing reformatting costs [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning Reformat: Float(1000,1,1000,1000) -> Float(1000,1,1,1) *************** [03/25/2022-13:24:08] [V] [TRT] --------------- Timing Runner: Optimizer Reformat((Unnamed Layer* 971) [Fully Connected]_output -> ) (Reformat) [03/25/2022-13:24:08] [V] [TRT] Tactic: 1002 Time: 0.010112 [03/25/2022-13:24:08] [V] [TRT] Tactic: 0 Time: 0.008832 [03/25/2022-13:24:08] [V] [TRT] Fastest Tactic: 0 Time: 0.008832 [03/25/2022-13:24:08] [V] 
[TRT] *************** Autotuning Reformat: Float(250,1:4,250,250) -> Float(1000,1,1,1) *************** [03/25/2022-13:24:08] [V] [TRT] --------------- Timing Runner: Optimizer Reformat((Unnamed Layer* 971) [Fully Connected]_output -> ) (Reformat) [03/25/2022-13:24:08] [V] [TRT] Tactic: 1002 Time: 0.047104 [03/25/2022-13:24:08] [V] [TRT] Tactic: 0 Time: 0.00896 [03/25/2022-13:24:08] [V] [TRT] Fastest Tactic: 0 Time: 0.00896 [03/25/2022-13:24:08] [V] [TRT] =============== Computing reformatting costs [03/25/2022-13:24:08] [V] [TRT] =============== Computing reformatting costs [03/25/2022-13:24:08] [V] [TRT] =============== Computing reformatting costs [03/25/2022-13:24:08] [V] [TRT] =============== Computing costs for [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning format combination: Float(150528,50176,224,1) -> Int8(150528,50176,224,1) *************** [03/25/2022-13:24:08] [V] [TRT] --------------- Timing Runner: QuantizeLinear_2_quantize_scale_node (Scale) [03/25/2022-13:24:08] [V] [TRT] Tactic: 0 Time: 0.077824 [03/25/2022-13:24:08] [V] [TRT] Fastest Tactic: 0 Time: 0.077824 [03/25/2022-13:24:08] [V] [TRT] >>>>>>>>>>>>>>> Chose Runner Type: Scale Tactic: 0 [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning format combination: Float(150528,50176,224,1) -> Int8(50176,50176:4,224,1) *************** [03/25/2022-13:24:08] [V] [TRT] --------------- Timing Runner: QuantizeLinear_2_quantize_scale_node (Scale) [03/25/2022-13:24:08] [V] [TRT] Tactic: 0 Time: 0.106624 [03/25/2022-13:24:08] [V] [TRT] Fastest Tactic: 0 Time: 0.106624 [03/25/2022-13:24:08] [V] [TRT] >>>>>>>>>>>>>>> Chose Runner Type: Scale Tactic: 0 [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning format combination: Float(150528,50176,224,1) -> Int8(50176,50176:32,224,1) *************** [03/25/2022-13:24:08] [V] [TRT] --------------- Timing Runner: QuantizeLinear_2_quantize_scale_node (Scale) [03/25/2022-13:24:08] [V] [TRT] Tactic: 0 Time: 0.655616 [03/25/2022-13:24:08] [V] [TRT] Fastest Tactic: 0 Time: 0.655616 [03/25/2022-13:24:08] [V] [TRT] >>>>>>>>>>>>>>> Chose Runner Type: Scale Tactic: 0 [03/25/2022-13:24:08] [V] [TRT] =============== Computing costs for [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning format combination: Int8(150528,50176,224,1) -> Int8(25088,12544:32,112,1) *************** [03/25/2022-13:24:08] [V] [TRT] --------------- Timing Runner: input.conv.module.weight + QuantizeLinear_8_quantize_scale_node + Conv_12 + Relu_14 (CaskConvolution) [03/25/2022-13:24:08] [V] [TRT] input.conv.module.weight + QuantizeLinear_8_quantize_scale_node + Conv_12 + Relu_14 Set Tactic Name: volta_first_layer_filter7x7_imma_fwd Tactic: -5510956450195747703 [03/25/2022-13:24:08] [V] [TRT] Tactic: -5510956450195747703 Time: 0.316928 [03/25/2022-13:24:08] [V] [TRT] Fastest Tactic: -5510956450195747703 Time: 0.316928 [03/25/2022-13:24:08] [V] [TRT] >>>>>>>>>>>>>>> Chose Runner Type: CaskConvolution Tactic: -5510956450195747703 [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning format combination: Int8(50176,50176:4,224,1) -> Int8(200704,12544:4,112,1) *************** [03/25/2022-13:24:08] [V] [TRT] --------------- Timing Runner: input.conv.module.weight + QuantizeLinear_8_quantize_scale_node + Conv_12 + Relu_14 (CudaDepthwiseConvolution) [03/25/2022-13:24:08] [V] [TRT] CudaDepthwiseConvolution has no valid tactics for this config, skipping [03/25/2022-13:24:08] [V] [TRT] --------------- Timing Runner: input.conv.module.weight + QuantizeLinear_8_quantize_scale_node + Conv_12 + Relu_14 
(FusedConvActConvolution) [03/25/2022-13:24:08] [V] [TRT] FusedConvActConvolution has no valid tactics for this config, skipping [03/25/2022-13:24:08] [V] [TRT] --------------- Timing Runner: input.conv.module.weight + QuantizeLinear_8_quantize_scale_node + Conv_12 + Relu_14 (CaskConvolution) [03/25/2022-13:24:08] [V] [TRT] input.conv.module.weight + QuantizeLinear_8_quantize_scale_node + Conv_12 + Relu_14 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_medium_nn_v1 Tactic: 175853789719975416 [03/25/2022-13:24:08] [V] [TRT] Tactic: 175853789719975416 Time: 0.695552 [03/25/2022-13:24:08] [V] [TRT] input.conv.module.weight + QuantizeLinear_8_quantize_scale_node + Conv_12 + Relu_14 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_xregs_medium_nn_v1 Tactic: 2171150287007712632 [03/25/2022-13:24:08] [V] [TRT] Tactic: 2171150287007712632 Time: 0.76608 [03/25/2022-13:24:08] [V] [TRT] input.conv.module.weight + QuantizeLinear_8_quantize_scale_node + Conv_12 + Relu_14 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x64_relu_medium_nn_v1 Tactic: 5834048089706882838 [03/25/2022-13:24:08] [V] [TRT] Tactic: 5834048089706882838 Time: 0.683392 [03/25/2022-13:24:08] [V] [TRT] input.conv.module.weight + QuantizeLinear_8_quantize_scale_node + Conv_12 + Relu_14 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x128_relu_medium_nn_v1 Tactic: -6585664687867083638 [03/25/2022-13:24:08] [V] [TRT] Tactic: -6585664687867083638 Time: 1.31738 [03/25/2022-13:24:08] [V] [TRT] input.conv.module.weight + QuantizeLinear_8_quantize_scale_node + Conv_12 + Relu_14 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x64_relu_xregs_large_nn_v1 Tactic: -3730012925709297561 [03/25/2022-13:24:08] [V] [TRT] Tactic: -3730012925709297561 Time: 0.684288 [03/25/2022-13:24:08] [V] [TRT] input.conv.module.weight + QuantizeLinear_8_quantize_scale_node + Conv_12 + Relu_14 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x128_relu_xregs_large_nn_v1 Tactic: -2277259417488004546 [03/25/2022-13:24:08] [V] [TRT] Tactic: -2277259417488004546 Time: 1.7961 [03/25/2022-13:24:08] [V] [TRT] Fastest Tactic: 5834048089706882838 Time: 0.683392 [03/25/2022-13:24:08] [V] [TRT] >>>>>>>>>>>>>>> Chose Runner Type: CaskConvolution Tactic: 5834048089706882838 [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning format combination: Int8(50176,50176:4,224,1) -> Int8(25088,12544:32,112,1) *************** [03/25/2022-13:24:08] [V] [TRT] --------------- Timing Runner: input.conv.module.weight + QuantizeLinear_8_quantize_scale_node + Conv_12 + Relu_14 (CaskConvolution) [03/25/2022-13:24:08] [V] [TRT] input.conv.module.weight + QuantizeLinear_8_quantize_scale_node + Conv_12 + Relu_14 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x64_relu_xregs_large_c32_nn_v1 Tactic: 984309058095623735 [03/25/2022-13:24:08] [V] [TRT] Tactic: 984309058095623735 Time: 0.68096 [03/25/2022-13:24:08] [V] [TRT] input.conv.module.weight + QuantizeLinear_8_quantize_scale_node + Conv_12 + Relu_14 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x128_relu_xregs_large_c32_nn_v1 Tactic: 3238312825609165543 [03/25/2022-13:24:08] [V] [TRT] Tactic: 3238312825609165543 Time: 1.77088 [03/25/2022-13:24:08] [V] [TRT] input.conv.module.weight + QuantizeLinear_8_quantize_scale_node + Conv_12 + Relu_14 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x64_relu_medium_c32_nn_v1 Tactic: 3606311198834416176 [03/25/2022-13:24:08] [V] [TRT] Tactic: 3606311198834416176 Time: 0.680704 [03/25/2022-13:24:08] [V] [TRT] input.conv.module.weight + QuantizeLinear_8_quantize_scale_node + Conv_12 + 
Relu_14 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x128_relu_medium_c32_nn_v1 Tactic: -4255737803793506479 [03/25/2022-13:24:08] [V] [TRT] Tactic: -4255737803793506479 Time: 1.30688 [03/25/2022-13:24:08] [V] [TRT] input.conv.module.weight + QuantizeLinear_8_quantize_scale_node + Conv_12 + Relu_14 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_xregs_medium_c32_nn_v1 Tactic: -3111968753064955248 [03/25/2022-13:24:08] [V] [TRT] Tactic: -3111968753064955248 Time: 0.752 [03/25/2022-13:24:08] [V] [TRT] input.conv.module.weight + QuantizeLinear_8_quantize_scale_node + Conv_12 + Relu_14 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_medium_c32_nn_v1 Tactic: -1492575840277333548 [03/25/2022-13:24:08] [V] [TRT] Tactic: -1492575840277333548 Time: 0.690432 [03/25/2022-13:24:08] [V] [TRT] Fastest Tactic: 3606311198834416176 Time: 0.680704 [03/25/2022-13:24:08] [V] [TRT] >>>>>>>>>>>>>>> Chose Runner Type: CaskConvolution Tactic: 3606311198834416176 [03/25/2022-13:24:08] [V] [TRT] *************** Autotuning format combination: Int8(50176,50176:32,224,1) -> Int8(25088,12544:32,112,1) *************** [03/25/2022-13:24:09] [V] [TRT] --------------- Timing Runner: input.conv.module.weight + QuantizeLinear_8_quantize_scale_node + Conv_12 + Relu_14 (CudaGroupConvolution) [03/25/2022-13:24:09] [V] [TRT] CudaGroupConvolution has no valid tactics for this config, skipping [03/25/2022-13:24:09] [V] [TRT] --------------- Timing Runner: input.conv.module.weight + QuantizeLinear_8_quantize_scale_node + Conv_12 + Relu_14 (CudaDepthwiseConvolution) [03/25/2022-13:24:09] [V] [TRT] CudaDepthwiseConvolution has no valid tactics for this config, skipping [03/25/2022-13:24:09] [V] [TRT] --------------- Timing Runner: input.conv.module.weight + QuantizeLinear_8_quantize_scale_node + Conv_12 + Relu_14 (FusedConvActConvolution) [03/25/2022-13:24:09] [V] [TRT] FusedConvActConvolution has no valid tactics for this config, skipping [03/25/2022-13:24:09] [V] [TRT] --------------- Timing Runner: input.conv.module.weight + QuantizeLinear_8_quantize_scale_node + Conv_12 + Relu_14 (CaskConvolution) [03/25/2022-13:24:09] [V] [TRT] input.conv.module.weight + QuantizeLinear_8_quantize_scale_node + Conv_12 + Relu_14 Set Tactic Name: ampere_int8_i8816cudnn_int8_128x128_ldg16_relu_medium_nt_v1 Tactic: 2985940154541537814 [03/25/2022-13:24:09] [V] [TRT] Tactic: 2985940154541537814 Time: 2.52122 [03/25/2022-13:24:09] [V] [TRT] input.conv.module.weight + QuantizeLinear_8_quantize_scale_node + Conv_12 + Relu_14 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x128_ldg16_relu_medium_nt_v1 Tactic: 3899284354987683408 [03/25/2022-13:24:09] [V] [TRT] Tactic: 3899284354987683408 Time: 2.73357 [03/25/2022-13:24:09] [V] [TRT] input.conv.module.weight + QuantizeLinear_8_quantize_scale_node + Conv_12 + Relu_14 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_large_nt_v1 Tactic: 4182625619810185112 [03/25/2022-13:24:09] [V] [TRT] Tactic: 4182625619810185112 Time: 1.36563 [03/25/2022-13:24:09] [V] [TRT] input.conv.module.weight + QuantizeLinear_8_quantize_scale_node + Conv_12 + Relu_14 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_indexed_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32_epifadd Tactic: 8751622450593766232 [03/25/2022-13:24:09] [V] [TRT] Tactic: 8751622450593766232 Time: 1.48762 [03/25/2022-13:24:09] [V] [TRT] input.conv.module.weight + QuantizeLinear_8_quantize_scale_node + Conv_12 + Relu_14 Set Tactic Name: 
sm80_xmma_fprop_implicit_gemm_interleaved_indexed_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32 Tactic: 9064458886956700976 [03/25/2022-13:24:09] [V] [TRT] Tactic: 9064458886956700976 Time: 1.50144 [03/25/2022-13:24:09] [V] [TRT] input.conv.module.weight + QuantizeLinear_8_quantize_scale_node + Conv_12 + Relu_14 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_singleBuffer_medium_nt_v1 Tactic: -5766140806760372989 [03/25/2022-13:24:09] [V] [TRT] Tactic: -5766140806760372989 Time: 1.2928 [03/25/2022-13:24:09] [V] [TRT] input.conv.module.weight + QuantizeLinear_8_quantize_scale_node + Conv_12 + Relu_14 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_medium_nt_v1 Tactic: -4516822589357530549 [03/25/2022-13:24:09] [V] [TRT] Tactic: -4516822589357530549 Time: 1.3239 [03/25/2022-13:24:09] [V] [TRT] input.conv.module.weight + QuantizeLinear_8_quantize_scale_node + Conv_12 + Relu_14 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x128_ldg16_relu_large_nt_v1 Tactic: -2917455979290586480 [03/25/2022-13:24:09] [V] [TRT] Tactic: -2917455979290586480 Time: 2.72269 [03/25/2022-13:24:09] [V] [TRT] input.conv.module.weight + QuantizeLinear_8_quantize_scale_node + Conv_12 + Relu_14 Set Tactic Name: ampere_int8_i8816cudnn_int8_128x128_ldg16_relu_large_nt_v1 Tactic: -2571022005763160364 [03/25/2022-13:24:09] [V] [TRT] Tactic: -2571022005763160364 Time: 2.60058 [03/25/2022-13:24:09] [V] [TRT] input.conv.module.weight + QuantizeLinear_8_quantize_scale_node + Conv_12 + Relu_14 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_singleBuffer_large_nt_v1 Tactic: -428104331444385564 [03/25/2022-13:24:09] [V] [TRT] Tactic: -428104331444385564 Time: 1.32096 [03/25/2022-13:24:09] [V] [TRT] Fastest Tactic: -5766140806760372989 Time: 1.2928 [03/25/2022-13:24:09] [V] [TRT] >>>>>>>>>>>>>>> Chose Runner Type: CaskConvolution Tactic: -5766140806760372989 [03/25/2022-13:24:09] [V] [TRT] =============== Computing costs for [03/25/2022-13:24:09] [V] [TRT] *************** Autotuning format combination: Int8(200704,12544:4,112,1) -> Int8(50176,3136:4,56,1) *************** [03/25/2022-13:24:09] [V] [TRT] --------------- Timing Runner: MaxPool_15 (TiledPooling) [03/25/2022-13:24:09] [V] [TRT] Tactic: 257 Time: 0.352256 [03/25/2022-13:24:09] [V] [TRT] Tactic: 65793 Time: 0.286208 [03/25/2022-13:24:09] [V] [TRT] Tactic: 131329 Time: 0.368128 [03/25/2022-13:24:09] [V] [TRT] Tactic: 196865 Time: 0.392576 [03/25/2022-13:24:09] [V] [TRT] Tactic: 262401 Time: 0.313856 [03/25/2022-13:24:09] [V] [TRT] Tactic: 327937 Time: 0.331392 [03/25/2022-13:24:09] [V] [TRT] Tactic: 393473 Time: 0.35968 [03/25/2022-13:24:09] [V] [TRT] Tactic: 459009 Time: 0.291456 [03/25/2022-13:24:09] [V] [TRT] Tactic: 524545 Time: 0.219392 [03/25/2022-13:24:09] [V] [TRT] Tactic: 590081 Time: 0.25984 [03/25/2022-13:24:09] [V] [TRT] Tactic: 655617 Time: 0.27328 [03/25/2022-13:24:09] [V] [TRT] Tactic: 721153 Time: 0.215552 [03/25/2022-13:24:09] [V] [TRT] Tactic: 786689 Time: 0.208 [03/25/2022-13:24:09] [V] [TRT] Tactic: 852225 Time: 0.244992 [03/25/2022-13:24:09] [V] [TRT] Tactic: 917761 Time: 0.285568 [03/25/2022-13:24:09] [V] [TRT] Tactic: 983297 Time: 0.21568 [03/25/2022-13:24:09] [V] [TRT] Tactic: 1048833 Time: 0.233216 [03/25/2022-13:24:09] [V] [TRT] Tactic: 1114369 Time: 0.223616 [03/25/2022-13:24:09] [V] [TRT] Tactic: 1179905 Time: 0.204032 [03/25/2022-13:24:09] [V] [TRT] Tactic: 1245441 Time: 0.192384 [03/25/2022-13:24:09] [V] [TRT] Tactic: 1310977 
Time: 0.211328 [03/25/2022-13:24:09] [V] [TRT] Tactic: 1376513 Time: 0.285056 [03/25/2022-13:24:09] [V] [TRT] Tactic: 1442049 Time: 0.210816 [03/25/2022-13:24:09] [V] [TRT] Tactic: 1507585 Time: 0.217472 [03/25/2022-13:24:09] [V] [TRT] Tactic: 1573121 Time: 0.197888 [03/25/2022-13:24:09] [V] [TRT] Tactic: 1638657 Time: 0.183552 [03/25/2022-13:24:10] [V] [TRT] Tactic: 1704193 Time: 0.17664 [03/25/2022-13:24:10] [V] [TRT] Tactic: 1769729 Time: 0.188544 [03/25/2022-13:24:10] [V] [TRT] Tactic: 1835265 Time: 0.288768 [03/25/2022-13:24:10] [V] [TRT] Tactic: 1900801 Time: 0.215424 [03/25/2022-13:24:10] [V] [TRT] Tactic: 1966337 Time: 0.225792 [03/25/2022-13:24:10] [V] [TRT] Tactic: 2031873 Time: 0.20928 [03/25/2022-13:24:10] [V] [TRT] Tactic: 2097409 Time: 0.192256 [03/25/2022-13:24:10] [V] [TRT] Tactic: 2162945 Time: 0.182272 [03/25/2022-13:24:10] [V] [TRT] Tactic: 2228481 Time: 0.201856 [03/25/2022-13:24:10] [V] [TRT] Tactic: 2294017 Time: 0.291072 [03/25/2022-13:24:10] [V] [TRT] Tactic: 2359553 Time: 0.213504 [03/25/2022-13:24:10] [V] [TRT] Tactic: 2425089 Time: 0.217344 [03/25/2022-13:24:10] [V] [TRT] Tactic: 2490625 Time: 0.186624 [03/25/2022-13:24:10] [V] [TRT] Tactic: 2556161 Time: 0.176768 [03/25/2022-13:24:10] [V] [TRT] Tactic: 2621697 Time: 0.17536 [03/25/2022-13:24:10] [V] [TRT] Tactic: 2687233 Time: 0.18688 [03/25/2022-13:24:10] [V] [TRT] Tactic: 6947073 Time: 0.139648 [03/25/2022-13:24:10] [V] [TRT] Fastest Tactic: 6947073 Time: 0.139648 [03/25/2022-13:24:10] [V] [TRT] --------------- Timing Runner: MaxPool_15 (CudaPooling) [03/25/2022-13:24:10] [V] [TRT] Tactic: -3 Time: 0.140928 [03/25/2022-13:24:10] [V] [TRT] Fastest Tactic: -3 Time: 0.140928 [03/25/2022-13:24:10] [V] [TRT] >>>>>>>>>>>>>>> Chose Runner Type: TiledPooling Tactic: 6947073 [03/25/2022-13:24:10] [V] [TRT] *************** Autotuning format combination: Int8(25088,12544:32,112,1) -> Int8(6272,3136:32,56,1) *************** [03/25/2022-13:24:10] [V] [TRT] --------------- Timing Runner: MaxPool_15 (TiledPooling) [03/25/2022-13:24:10] [V] [TRT] TiledPooling has no valid tactics for this config, skipping [03/25/2022-13:24:10] [V] [TRT] --------------- Timing Runner: MaxPool_15 (CudaPooling) [03/25/2022-13:24:10] [V] [TRT] Tactic: -4 Time: 0.15872 [03/25/2022-13:24:10] [V] [TRT] Fastest Tactic: -4 Time: 0.15872 [03/25/2022-13:24:10] [V] [TRT] >>>>>>>>>>>>>>> Chose Runner Type: CudaPooling Tactic: -4 [03/25/2022-13:24:10] [V] [TRT] =============== Computing costs for [03/25/2022-13:24:10] [V] [TRT] *************** Autotuning format combination: Int8(50176,3136:4,56,1) -> Int8(50176,3136:4,56,1) *************** [03/25/2022-13:24:10] [V] [TRT] --------------- Timing Runner: sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 (CudaDepthwiseConvolution) [03/25/2022-13:24:10] [V] [TRT] CudaDepthwiseConvolution has no valid tactics for this config, skipping [03/25/2022-13:24:10] [V] [TRT] --------------- Timing Runner: sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 (FusedConvActConvolution) [03/25/2022-13:24:10] [V] [TRT] FusedConvActConvolution has no valid tactics for this config, skipping [03/25/2022-13:24:10] [V] [TRT] --------------- Timing Runner: sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 (CaskConvolution) [03/25/2022-13:24:10] [V] [TRT] sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 Set Tactic Name: 
ampere_int8x4_icudnn_int8x4_128x32_relu_medium_nn_v1 Tactic: 175853789719975416 [03/25/2022-13:24:10] [V] [TRT] Tactic: 175853789719975416 Time: 0.096256 [03/25/2022-13:24:10] [V] [TRT] sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_xregs_medium_nn_v1 Tactic: 2171150287007712632 [03/25/2022-13:24:10] [V] [TRT] Tactic: 2171150287007712632 Time: 0.107904 [03/25/2022-13:24:10] [V] [TRT] sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x64_relu_small_nn_v1 Tactic: 2234457234705232274 [03/25/2022-13:24:10] [V] [TRT] Tactic: 2234457234705232274 Time: 0.076928 [03/25/2022-13:24:10] [V] [TRT] sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x64_relu_medium_nn_v1 Tactic: 5834048089706882838 [03/25/2022-13:24:10] [V] [TRT] Tactic: 5834048089706882838 Time: 0.078848 [03/25/2022-13:24:10] [V] [TRT] sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x128_relu_interior_nn_v1 Tactic: 6299962968199310600 [03/25/2022-13:24:10] [V] [TRT] Tactic: 6299962968199310600 Time: 0.146048 [03/25/2022-13:24:10] [V] [TRT] sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_xregs_interior_nn_v1 Tactic: 6341572697076960911 [03/25/2022-13:24:10] [V] [TRT] Tactic: 6341572697076960911 Time: 0.099712 [03/25/2022-13:24:10] [V] [TRT] sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_small_nn_v1 Tactic: -8626990807754934295 [03/25/2022-13:24:10] [V] [TRT] Tactic: -8626990807754934295 Time: 0.094592 [03/25/2022-13:24:10] [V] [TRT] sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x64_relu_interior_nn_v1 Tactic: -8498217049614706532 [03/25/2022-13:24:10] [V] [TRT] Tactic: -8498217049614706532 Time: 0.075008 [03/25/2022-13:24:10] [V] [TRT] sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_xregs_small_nn_v1 Tactic: -7303593854972602201 [03/25/2022-13:24:10] [V] [TRT] Tactic: -7303593854972602201 Time: 0.102784 [03/25/2022-13:24:10] [V] [TRT] sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x128_relu_medium_nn_v1 Tactic: -6585664687867083638 [03/25/2022-13:24:10] [V] [TRT] Tactic: -6585664687867083638 Time: 0.14976 [03/25/2022-13:24:10] [V] [TRT] sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_interior_nn_v1 Tactic: -3326139578711341011 [03/25/2022-13:24:10] [V] [TRT] Tactic: -3326139578711341011 Time: 0.091776 [03/25/2022-13:24:10] [V] [TRT] sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x128_relu_small_nn_v1 Tactic: -683636008127039856 [03/25/2022-13:24:10] [V] [TRT] Tactic: -683636008127039856 Time: 0.148608 [03/25/2022-13:24:10] [V] [TRT] Fastest Tactic: -8498217049614706532 Time: 0.075008 
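
From the "Computing costs for" entries onward the builder leaves layout costing and autotunes the layers themselves: the QuantizeLinear at the network input runs as a Scale node timed once per candidate output format, and each fused block (the 7x7 stem convolution folded with its weight quantization and ReLU, MaxPool_15, sections.0.0.conv1 here) is swept across runner types (CaskConvolution, CudaDepthwiseConvolution, TiledPooling, CudaPooling, ...) and format combinations before the fastest tactic is chosen. In the sweeps that follow, kernel names containing `sparse_conv`/`sptensor` are the structured-sparsity (2:4) paths that `--sparsity=enable` makes eligible. Below is a sketch of an equivalent build through the TensorRT 8.2 Python API, assuming the same explicitly quantized Q/DQ ONNX (so the INT8 flag needs no calibrator); file names and the workspace size are placeholders, and the timing cache persists exactly the tactic measurements this log shows so that rebuilds can skip them:

```python
import tensorrt as trt

logger = trt.Logger(trt.Logger.VERBOSE)     # VERBOSE reproduces these tactic logs
builder = trt.Builder(logger)
network = builder.create_network(
    1 << int(trt.NetworkDefinitionCreationFlag.EXPLICIT_BATCH))
parser = trt.OnnxParser(network, logger)
if not parser.parse_from_file("resnet50_quant_sparse.onnx"):
    for i in range(parser.num_errors):
        print(parser.get_error(i))
    raise RuntimeError("ONNX parse failed")

config = builder.create_builder_config()
config.max_workspace_size = 1 << 30              # scratch space for tactic timing
config.set_flag(trt.BuilderFlag.INT8)            # Q/DQ network: no calibrator needed
config.set_flag(trt.BuilderFlag.SPARSE_WEIGHTS)  # admit the sparse_conv tactics

# Only needed if the model's input is dynamic (mirrors trtexec --shapes).
profile = builder.create_optimization_profile()
profile.set_shape("input", (128, 3, 224, 224), (128, 3, 224, 224),
                  (128, 3, 224, 224))
config.add_optimization_profile(profile)

# Persist the autotuning results; feed the saved bytes back into
# create_timing_cache() on a later build to skip re-timing every tactic.
cache = config.create_timing_cache(b"")
config.set_timing_cache(cache, ignore_mismatch=False)

engine = builder.build_serialized_network(network, config)
with open("resnet50_quant_sparse.engine", "wb") as f:
    f.write(engine)
with open("timing.cache", "wb") as f:
    f.write(memoryview(cache.serialize()))
```

On a rebuild, constructing the cache from the saved file (`config.create_timing_cache(open("timing.cache", "rb").read())`) lets the builder reuse these measurements instead of repeating the sweeps logged below.
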
[03/25/2022-13:24:10] [V] [TRT] >>>>>>>>>>>>>>> Chose Runner Type: CaskConvolution Tactic: -8498217049614706532 [03/25/2022-13:24:10] [V] [TRT] *************** Autotuning format combination: Int8(50176,3136:4,56,1) -> Int8(6272,3136:32,56,1) *************** [03/25/2022-13:24:10] [V] [TRT] --------------- Timing Runner: sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 (CaskConvolution) [03/25/2022-13:24:10] [V] [TRT] sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_small_c32_nn_v1 Tactic: 1100922622480907544 [03/25/2022-13:24:10] [V] [TRT] Tactic: 1100922622480907544 Time: 0.092032 [03/25/2022-13:24:10] [V] [TRT] sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x128_relu_interior_c32_nn_v1 Tactic: 2855900226702061782 [03/25/2022-13:24:10] [V] [TRT] Tactic: 2855900226702061782 Time: 0.141568 [03/25/2022-13:24:10] [V] [TRT] sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x64_relu_medium_c32_nn_v1 Tactic: 3606311198834416176 [03/25/2022-13:24:10] [V] [TRT] Tactic: 3606311198834416176 Time: 0.077056 [03/25/2022-13:24:10] [V] [TRT] sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x128_relu_small_c32_nn_v1 Tactic: 4325765560739862899 [03/25/2022-13:24:10] [V] [TRT] Tactic: 4325765560739862899 Time: 0.143104 [03/25/2022-13:24:10] [V] [TRT] sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x64_relu_interior_c32_nn_v1 Tactic: 8803458114157674373 [03/25/2022-13:24:10] [V] [TRT] Tactic: 8803458114157674373 Time: 0.073984 [03/25/2022-13:24:10] [V] [TRT] sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_interior_c32_nn_v1 Tactic: -6934773036503365000 [03/25/2022-13:24:10] [V] [TRT] Tactic: -6934773036503365000 Time: 0.089856 [03/25/2022-13:24:10] [V] [TRT] sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_xregs_interior_c32_nn_v1 Tactic: -4431642509665791294 [03/25/2022-13:24:10] [V] [TRT] Tactic: -4431642509665791294 Time: 0.097536 [03/25/2022-13:24:10] [V] [TRT] sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x128_relu_medium_c32_nn_v1 Tactic: -4255737803793506479 [03/25/2022-13:24:10] [V] [TRT] Tactic: -4255737803793506479 Time: 0.142848 [03/25/2022-13:24:10] [V] [TRT] sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_xregs_small_c32_nn_v1 Tactic: -3958182351168863467 [03/25/2022-13:24:10] [V] [TRT] Tactic: -3958182351168863467 Time: 0.100224 [03/25/2022-13:24:10] [V] [TRT] sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_xregs_medium_c32_nn_v1 Tactic: -3111968753064955248 [03/25/2022-13:24:10] [V] [TRT] Tactic: -3111968753064955248 Time: 0.105088 [03/25/2022-13:24:10] [V] [TRT] sections.0.0.conv1.module.weight + 
QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_medium_c32_nn_v1 Tactic: -1492575840277333548 [03/25/2022-13:24:10] [V] [TRT] Tactic: -1492575840277333548 Time: 0.093952 [03/25/2022-13:24:10] [V] [TRT] sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x64_relu_small_c32_nn_v1 Tactic: -868495160148524802 [03/25/2022-13:24:10] [V] [TRT] Tactic: -868495160148524802 Time: 0.075264 [03/25/2022-13:24:10] [V] [TRT] Fastest Tactic: 8803458114157674373 Time: 0.073984 [03/25/2022-13:24:10] [V] [TRT] >>>>>>>>>>>>>>> Chose Runner Type: CaskConvolution Tactic: 8803458114157674373 [03/25/2022-13:24:10] [V] [TRT] *************** Autotuning format combination: Int8(6272,3136:32,56,1) -> Int8(6272,3136:32,56,1) *************** [03/25/2022-13:24:10] [V] [TRT] --------------- Timing Runner: sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 (CudaGroupConvolution) [03/25/2022-13:24:10] [V] [TRT] CudaGroupConvolution has no valid tactics for this config, skipping [03/25/2022-13:24:10] [V] [TRT] --------------- Timing Runner: sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 (CudaDepthwiseConvolution) [03/25/2022-13:24:10] [V] [TRT] CudaDepthwiseConvolution has no valid tactics for this config, skipping [03/25/2022-13:24:10] [V] [TRT] --------------- Timing Runner: sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 (FusedConvActConvolution) [03/25/2022-13:24:10] [V] [TRT] FusedConvActConvolution has no valid tactics for this config, skipping [03/25/2022-13:24:10] [V] [TRT] --------------- Timing Runner: sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 (CaskConvolution) [03/25/2022-13:24:10] [V] [TRT] sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage1_warpsize4x1x1_g1_tensor8x8x16_t1r1s1 Tactic: 68468667201176803 [03/25/2022-13:24:10] [V] [TRT] Tactic: 68468667201176803 Time: 0.058112 [03/25/2022-13:24:10] [V] [TRT] sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x192x64_stage3_warpsize4x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 125145153013230687 [03/25/2022-13:24:10] [V] [TRT] Tactic: 125145153013230687 Time: 0.168448 [03/25/2022-13:24:10] [V] [TRT] sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x128_ldg16_relu_interior_nt_v1 Tactic: 434957160407688216 [03/25/2022-13:24:10] [V] [TRT] Tactic: 434957160407688216 Time: 0.120064 [03/25/2022-13:24:10] [V] [TRT] sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage4_warpsize4x2x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: 805889586762897346 [03/25/2022-13:24:10] [V] [TRT] Tactic: 805889586762897346 Time: 0.088192 [03/25/2022-13:24:10] [V] [TRT] sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + 
[03/25/2022-13:24:10] [V] [TRT] *************** Autotuning format combination: Int8(6272,3136:32,56,1) -> Int8(6272,3136:32,56,1) *************** [03/25/2022-13:24:10] [V] [TRT] --------------- Timing Runner: sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 (CudaGroupConvolution) [03/25/2022-13:24:10] [V] [TRT] CudaGroupConvolution has no valid tactics for this config, skipping [03/25/2022-13:24:10] [V] [TRT] --------------- Timing Runner: sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 (CudaDepthwiseConvolution) [03/25/2022-13:24:10] [V] [TRT] CudaDepthwiseConvolution has no valid tactics for this config, skipping [03/25/2022-13:24:10] [V] [TRT] --------------- Timing Runner: sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 (FusedConvActConvolution) [03/25/2022-13:24:10] [V] [TRT] FusedConvActConvolution has no valid tactics for this config, skipping [03/25/2022-13:24:10] [V] [TRT] --------------- Timing Runner: sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 (CaskConvolution) [03/25/2022-13:24:10] [V] [TRT] sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage1_warpsize4x1x1_g1_tensor8x8x16_t1r1s1 Tactic: 68468667201176803 [03/25/2022-13:24:10] [V] [TRT] Tactic: 68468667201176803 Time: 0.058112 [03/25/2022-13:24:10] [V] [TRT] sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x192x64_stage3_warpsize4x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 125145153013230687 [03/25/2022-13:24:10] [V] [TRT] Tactic: 125145153013230687 Time: 0.168448 [03/25/2022-13:24:10] [V] [TRT] sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x128_ldg16_relu_interior_nt_v1 Tactic: 434957160407688216 [03/25/2022-13:24:10] [V] [TRT] Tactic: 434957160407688216 Time: 0.120064 [03/25/2022-13:24:10] [V] [TRT] sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage4_warpsize4x2x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: 805889586762897346 [03/25/2022-13:24:10] [V] [TRT] Tactic: 805889586762897346 Time: 0.088192 [03/25/2022-13:24:10] [V] [TRT] sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + 
Relu_30 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x128_stage3_warpsize4x2x1_g1_sptensor16x8x64_t1r1s1 Tactic: 857001784974286465 [03/25/2022-13:24:10] [V] [TRT] Tactic: 857001784974286465 Time: 0.19008 [03/25/2022-13:24:10] [V] [TRT] sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 1214130898909872671 [03/25/2022-13:24:10] [V] [TRT] Tactic: 1214130898909872671 Time: 0.070016 [03/25/2022-13:24:10] [V] [TRT] sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage4_warpsize4x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 1278425129871930205 [03/25/2022-13:24:10] [V] [TRT] Tactic: 1278425129871930205 Time: 0.081408 [03/25/2022-13:24:10] [V] [TRT] sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 1583811548148740665 [03/25/2022-13:24:10] [V] [TRT] Tactic: 1583811548148740665 Time: 0.146944 [03/25/2022-13:24:10] [V] [TRT] sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 1701344857577810806 [03/25/2022-13:24:10] [V] [TRT] Tactic: 1701344857577810806 Time: 0.082176 [03/25/2022-13:24:10] [V] [TRT] sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x32x64_stage4_warpsize2x1x1_g1_tensor16x8x32_t1r1s1 Tactic: 1797231177354918208 [03/25/2022-13:24:10] [V] [TRT] Tactic: 1797231177354918208 Time: 0.06592 [03/25/2022-13:24:10] [V] [TRT] sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x192x64_stage3_warpsize4x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 2004812516525036381 [03/25/2022-13:24:10] [V] [TRT] Tactic: 2004812516525036381 Time: 0.11648 [03/25/2022-13:24:10] [V] [TRT] sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x128x64_stage3_warpsize4x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 2030033463723799063 [03/25/2022-13:24:10] [V] [TRT] Tactic: 2030033463723799063 Time: 0.080128 [03/25/2022-13:24:10] [V] [TRT] sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 Set Tactic Name: ampere_int8_i8816cudnn_int8_128x128_ldg16_relu_interior_nt_v1 Tactic: 2346437292116182513 [03/25/2022-13:24:10] [V] [TRT] Tactic: 2346437292116182513 Time: 0.095872 [03/25/2022-13:24:10] [V] [TRT] 
sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 2376898825218218566 [03/25/2022-13:24:10] [V] [TRT] Tactic: 2376898825218218566 Time: 0.071936 [03/25/2022-13:24:10] [V] [TRT] sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_singleBuffer_interior_nt_v1 Tactic: 2522133112320625287 [03/25/2022-13:24:10] [V] [TRT] Tactic: 2522133112320625287 Time: 0.060672 [03/25/2022-13:24:10] [V] [TRT] sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage4_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 2548171972648455240 [03/25/2022-13:24:10] [V] [TRT] Tactic: 2548171972648455240 Time: 0.050816 [03/25/2022-13:24:10] [V] [TRT] sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage1_warpsize2x1x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: 2548946449357458230 [03/25/2022-13:24:10] [V] [TRT] Tactic: 2548946449357458230 Time: 0.0576 [03/25/2022-13:24:10] [V] [TRT] sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x16x128_stage3_warpsize1x1x1_g1_sptensor16x8x64_simple_t1r1s1 Tactic: 2678520742286844763 [03/25/2022-13:24:10] [V] [TRT] Tactic: 2678520742286844763 Time: 0.20352 [03/25/2022-13:24:10] [V] [TRT] sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 2756291002030759362 [03/25/2022-13:24:10] [V] [TRT] Tactic: 2756291002030759362 Time: 0.064 [03/25/2022-13:24:10] [V] [TRT] sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x128x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 2972948223367788520 [03/25/2022-13:24:10] [V] [TRT] Tactic: 2972948223367788520 Time: 0.07232 [03/25/2022-13:24:10] [V] [TRT] sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 Set Tactic Name: ampere_int8_i8816cudnn_int8_128x128_ldg16_relu_medium_nt_v1 Tactic: 2985940154541537814 [03/25/2022-13:24:10] [V] [TRT] Tactic: 2985940154541537814 Time: 0.097792 [03/25/2022-13:24:10] [V] [TRT] sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x96x64_stage5_warpsize2x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 3043273137345374664 [03/25/2022-13:24:10] [V] [TRT] Tactic: 3043273137345374664 Time: 0.083072 [03/25/2022-13:24:10] [V] [TRT] 
sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 3221677093659484230 [03/25/2022-13:24:10] [V] [TRT] Tactic: 3221677093659484230 Time: 0.115584 [03/25/2022-13:24:10] [V] [TRT] sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r1s1 Tactic: 3242897809704328258 [03/25/2022-13:24:10] [V] [TRT] Tactic: 3242897809704328258 Time: 0.055808 [03/25/2022-13:24:10] [V] [TRT] sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: 3312456766204252694 [03/25/2022-13:24:10] [V] [TRT] Tactic: 3312456766204252694 Time: 0.057984 [03/25/2022-13:24:10] [V] [TRT] sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 3538565962642681625 [03/25/2022-13:24:10] [V] [TRT] Tactic: 3538565962642681625 Time: 0.0608 [03/25/2022-13:24:10] [V] [TRT] sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage1_warpsize2x2x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: 3541919052468401776 [03/25/2022-13:24:10] [V] [TRT] Tactic: 3541919052468401776 Time: 0.083968 [03/25/2022-13:24:10] [V] [TRT] sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 3593397928177382100 [03/25/2022-13:24:10] [V] [TRT] Tactic: 3593397928177382100 Time: 0.070144 [03/25/2022-13:24:10] [V] [TRT] sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x64x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 3670282018109435863 [03/25/2022-13:24:10] [V] [TRT] Tactic: 3670282018109435863 Time: 0.049664 [03/25/2022-13:24:10] [V] [TRT] sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 3671413346254027573 [03/25/2022-13:24:10] [V] [TRT] Tactic: 3671413346254027573 Time: 0.071936 [03/25/2022-13:24:10] [V] [TRT] sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x128_ldg16_relu_medium_nt_v1 Tactic: 3899284354987683408 [03/25/2022-13:24:10] [V] 
[TRT] Tactic: 3899284354987683408 Time: 0.12224 [03/25/2022-13:24:10] [V] [TRT] sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: 3927509214678622419 [03/25/2022-13:24:10] [V] [TRT] Tactic: 3927509214678622419 Time: 0.052224 [03/25/2022-13:24:10] [V] [TRT] sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 4112572034735311841 [03/25/2022-13:24:10] [V] [TRT] Tactic: 4112572034735311841 Time: 0.083072 [03/25/2022-13:24:10] [V] [TRT] sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 4239974928951431644 [03/25/2022-13:24:10] [V] [TRT] Tactic: 4239974928951431644 Time: 0.073088 [03/25/2022-13:24:10] [V] [TRT] sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage6_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 4610760414797216079 [03/25/2022-13:24:10] [V] [TRT] Tactic: 4610760414797216079 Time: 0.092032 [03/25/2022-13:24:10] [V] [TRT] sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_small_nt_v1 Tactic: 4717285412741024953 [03/25/2022-13:24:10] [V] [TRT] Tactic: 4717285412741024953 Time: 0.061952 [03/25/2022-13:24:10] [V] [TRT] sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 4796956614760326119 [03/25/2022-13:24:10] [V] [TRT] Tactic: 4796956614760326119 Time: 0.082176 [03/25/2022-13:24:10] [V] [TRT] sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 4919361344804309192 [03/25/2022-13:24:10] [V] [TRT] Tactic: 4919361344804309192 Time: 0.068352 [03/25/2022-13:24:10] [V] [TRT] sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x128_stage3_warpsize1x2x1_g1_sptensor16x8x64_simple_t1r1s1 Tactic: 5043674678294309681 [03/25/2022-13:24:10] [V] [TRT] Tactic: 5043674678294309681 Time: 0.071424 [03/25/2022-13:24:10] [V] [TRT] sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_interior_nt_v1 Tactic: 5126565865931538390 [03/25/2022-13:24:10] [V] [TRT] Tactic: 
5126565865931538390 Time: 0.060544 [03/25/2022-13:24:10] [V] [TRT] sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x64x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 5204702486885981735 [03/25/2022-13:24:10] [V] [TRT] Tactic: 5204702486885981735 Time: 0.0512 [03/25/2022-13:24:10] [V] [TRT] sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: 5375256703210220108 [03/25/2022-13:24:10] [V] [TRT] Tactic: 5375256703210220108 Time: 0.0544 [03/25/2022-13:24:10] [V] [TRT] sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x128_stage3_warpsize2x2x1_g1_sptensor16x8x64_simple_t1r1s1 Tactic: 5424258848951129084 [03/25/2022-13:24:10] [V] [TRT] Tactic: 5424258848951129084 Time: 0.085376 [03/25/2022-13:24:10] [V] [TRT] sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 5443897483205284103 [03/25/2022-13:24:10] [V] [TRT] Tactic: 5443897483205284103 Time: 0.078592 [03/25/2022-13:24:10] [V] [TRT] sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage4_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 5707566217891294846 [03/25/2022-13:24:10] [V] [TRT] Tactic: 5707566217891294846 Time: 0.055296 [03/25/2022-13:24:10] [V] [TRT] sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x4x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 5986622376339202983 [03/25/2022-13:24:10] [V] [TRT] Tactic: 5986622376339202983 Time: 0.084352 [03/25/2022-13:24:10] [V] [TRT] sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x128x64_stage3_warpsize4x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 6007888770437705057 [03/25/2022-13:24:10] [V] [TRT] Tactic: 6007888770437705057 Time: 0.104832 [03/25/2022-13:24:10] [V] [TRT] sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32_t1r1s1 Tactic: 6405251167055673379 [03/25/2022-13:24:10] [V] [TRT] Tactic: 6405251167055673379 Time: 0.085248 [03/25/2022-13:24:10] [V] [TRT] sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 Set Tactic Name: 
sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage4_warpsize4x1x1_g1_tensor16x8x32_t1r1s1 Tactic: 6433368103202497147 [03/25/2022-13:24:10] [V] [TRT] Tactic: 6433368103202497147 Time: 0.057728 [03/25/2022-13:24:10] [V] [TRT] sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 6441948709525127755 [03/25/2022-13:24:10] [V] [TRT] Tactic: 6441948709525127755 Time: 0.074112 [03/25/2022-13:24:10] [V] [TRT] sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x256x64_stage3_warpsize2x4x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 6443933097134654777 [03/25/2022-13:24:10] [V] [TRT] Tactic: 6443933097134654777 Time: 0.188672 [03/25/2022-13:24:10] [V] [TRT] sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 6457435868048963632 [03/25/2022-13:24:10] [V] [TRT] Tactic: 6457435868048963632 Time: 0.095488 [03/25/2022-13:24:10] [V] [TRT] sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x128_stage3_warpsize1x1x1_g1_sptensor16x8x64_t1r1s1 Tactic: 6510345569544721081 [03/25/2022-13:24:10] [V] [TRT] Tactic: 6510345569544721081 Time: 0.112768 [03/25/2022-13:24:10] [V] [TRT] sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 6793988781414507278 [03/25/2022-13:24:10] [V] [TRT] Tactic: 6793988781414507278 Time: 0.07808 [03/25/2022-13:24:10] [V] [TRT] sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 6880710371738875469 [03/25/2022-13:24:10] [V] [TRT] Tactic: 6880710371738875469 Time: 0.107776 [03/25/2022-13:24:10] [V] [TRT] sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 6925201228918187099 [03/25/2022-13:24:10] [V] [TRT] Tactic: 6925201228918187099 Time: 0.083968 [03/25/2022-13:24:10] [V] [TRT] sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 6991524515605108718 [03/25/2022-13:24:10] [V] [TRT] Tactic: 6991524515605108718 
Time: 0.176128 [03/25/2022-13:24:10] [V] [TRT] sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x4x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 7245509442265271220 [03/25/2022-13:24:10] [V] [TRT] Tactic: 7245509442265271220 Time: 0.07744 [03/25/2022-13:24:10] [V] [TRT] sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x64x64_stage3_warpsize4x1x1_g1_tensor16x8x32_t1r1s1 Tactic: 7318929579222925725 [03/25/2022-13:24:10] [V] [TRT] Tactic: 7318929579222925725 Time: 0.053376 [03/25/2022-13:24:10] [V] [TRT] sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: 7731430299029542276 [03/25/2022-13:24:10] [V] [TRT] Tactic: 7731430299029542276 Time: 0.079488 [03/25/2022-13:24:10] [V] [TRT] sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage4_warpsize2x4x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: 7738495016763012180 [03/25/2022-13:24:10] [V] [TRT] Tactic: 7738495016763012180 Time: 0.15296 [03/25/2022-13:24:10] [V] [TRT] sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage4_warpsize4x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 8142283985160822229 [03/25/2022-13:24:11] [V] [TRT] Tactic: 8142283985160822229 Time: 0.102656 [03/25/2022-13:24:11] [V] [TRT] sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage4_warpsize2x4x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 8173975624668590862 [03/25/2022-13:24:11] [V] [TRT] Tactic: 8173975624668590862 Time: 0.186624 [03/25/2022-13:24:11] [V] [TRT] sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage4_warpsize4x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 8234775147403903473 [03/25/2022-13:24:11] [V] [TRT] Tactic: 8234775147403903473 Time: 0.109824 [03/25/2022-13:24:11] [V] [TRT] sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x128x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 8524082966802584889 [03/25/2022-13:24:11] [V] [TRT] Tactic: 8524082966802584889 Time: 0.078208 [03/25/2022-13:24:11] [V] [TRT] sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 Set Tactic Name: 
sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage1_warpsize4x1x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: 8684013308930763400 [03/25/2022-13:24:11] [V] [TRT] Tactic: 8684013308930763400 Time: 0.054784 [03/25/2022-13:24:11] [V] [TRT] sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x128x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 8765382722978397630 [03/25/2022-13:24:11] [V] [TRT] Tactic: 8765382722978397630 Time: 0.082176 [03/25/2022-13:24:11] [V] [TRT] sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 8843193587782643431 [03/25/2022-13:24:11] [V] [TRT] Tactic: 8843193587782643431 Time: 0.081152 [03/25/2022-13:24:11] [V] [TRT] sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x128_stage3_warpsize1x4x1_g1_sptensor16x8x64_simple_t1r1s1 Tactic: 8883810517410230831 [03/25/2022-13:24:11] [V] [TRT] Tactic: 8883810517410230831 Time: 0.064768 [03/25/2022-13:24:11] [V] [TRT] sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 8930797211803511337 [03/25/2022-13:24:11] [V] [TRT] Tactic: 8930797211803511337 Time: 0.164352 [03/25/2022-13:24:11] [V] [TRT] sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 8935070489925739043 [03/25/2022-13:24:11] [V] [TRT] Tactic: 8935070489925739043 Time: 0.061056 [03/25/2022-13:24:11] [V] [TRT] sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x16x128_stage3_warpsize1x1x1_g1_sptensor16x8x64_t1r1s1 Tactic: 9062173295331155069 [03/25/2022-13:24:11] [V] [TRT] Tactic: 9062173295331155069 Time: 0.203904 [03/25/2022-13:24:11] [V] [TRT] sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_singleBuffer_small_nt_v1 Tactic: -9118785798277698619 [03/25/2022-13:24:11] [V] [TRT] Tactic: -9118785798277698619 Time: 0.061568 [03/25/2022-13:24:11] [V] [TRT] sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -8985599729413291927 [03/25/2022-13:24:11] [V] [TRT] Tactic: -8985599729413291927 Time: 0.052352 [03/25/2022-13:24:11] [V] [TRT] 
sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x32x64_stage4_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -8972697510150675429 [03/25/2022-13:24:11] [V] [TRT] Tactic: -8972697510150675429 Time: 0.05248 [03/25/2022-13:24:11] [V] [TRT] sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r1s1 Tactic: -8943710627305202139 [03/25/2022-13:24:11] [V] [TRT] Tactic: -8943710627305202139 Time: 0.087424 [03/25/2022-13:24:11] [V] [TRT] sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: -8859846367886814331 [03/25/2022-13:24:11] [V] [TRT] Tactic: -8859846367886814331 Time: 0.054528 [03/25/2022-13:24:11] [V] [TRT] sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: -8638624340850784688 [03/25/2022-13:24:11] [V] [TRT] Tactic: -8638624340850784688 Time: 0.095488 [03/25/2022-13:24:11] [V] [TRT] sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage4_warpsize4x1x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: -8556775352640313933 [03/25/2022-13:24:11] [V] [TRT] Tactic: -8556775352640313933 Time: 0.052992 [03/25/2022-13:24:11] [V] [TRT] sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage1_warpsize2x1x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: -8382298409581540699 [03/25/2022-13:24:11] [V] [TRT] Tactic: -8382298409581540699 Time: 0.06272 [03/25/2022-13:24:11] [V] [TRT] sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage1_warpsize4x2x1_g1_tensor8x8x16_t1r1s1 Tactic: -8172318747337038866 [03/25/2022-13:24:11] [V] [TRT] Tactic: -8172318747337038866 Time: 0.118272 [03/25/2022-13:24:11] [V] [TRT] sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x256x64_stage3_warpsize2x4x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -8038164441468184723 [03/25/2022-13:24:11] [V] [TRT] Tactic: -8038164441468184723 Time: 0.144256 [03/25/2022-13:24:11] [V] [TRT] sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 Set Tactic Name: 
sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x96x64_stage5_warpsize2x1x1_g1_tensor16x8x32_t1r1s1 Tactic: -7844028314176826857 [03/25/2022-13:24:11] [V] [TRT] Tactic: -7844028314176826857 Time: 0.0864 [03/25/2022-13:24:11] [V] [TRT] sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage4_warpsize2x4x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -7364286662638617917 [03/25/2022-13:24:11] [V] [TRT] Tactic: -7364286662638617917 Time: 0.181888 [03/25/2022-13:24:11] [V] [TRT] sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage1_warpsize2x1x1_g1_tensor8x8x16_t1r1s1 Tactic: -7361755530333096258 [03/25/2022-13:24:11] [V] [TRT] Tactic: -7361755530333096258 Time: 0.07808 [03/25/2022-13:24:11] [V] [TRT] sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x128_stage3_warpsize1x4x1_g1_sptensor16x8x64_t1r1s1 Tactic: -7289760022626653388 [03/25/2022-13:24:11] [V] [TRT] Tactic: -7289760022626653388 Time: 0.080256 [03/25/2022-13:24:11] [V] [TRT] sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage1_warpsize2x4x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: -7106539943789766885 [03/25/2022-13:24:11] [V] [TRT] Tactic: -7106539943789766885 Time: 0.2592 [03/25/2022-13:24:11] [V] [TRT] sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage1_warpsize4x2x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: -6969478418607271266 [03/25/2022-13:24:11] [V] [TRT] Tactic: -6969478418607271266 Time: 0.141568 [03/25/2022-13:24:11] [V] [TRT] sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -6930438165437733000 [03/25/2022-13:24:11] [V] [TRT] Tactic: -6930438165437733000 Time: 0.100608 [03/25/2022-13:24:11] [V] [TRT] sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x64x64_stage3_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -6879607992933502380 [03/25/2022-13:24:11] [V] [TRT] Tactic: -6879607992933502380 Time: 0.060544 [03/25/2022-13:24:11] [V] [TRT] sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: -6527178416855951297 [03/25/2022-13:24:11] 
[V] [TRT] Tactic: -6527178416855951297 Time: 0.090496 [03/25/2022-13:24:11] [V] [TRT] sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_t1r1s1 Tactic: -6510232214299595844 [03/25/2022-13:24:11] [V] [TRT] Tactic: -6510232214299595844 Time: 0.096896 [03/25/2022-13:24:11] [V] [TRT] sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 Set Tactic Name: ampere_int8_i8816cudnn_int8_128x128_ldg16_relu_small_nt_v1 Tactic: -6400348606759295499 [03/25/2022-13:24:11] [V] [TRT] Tactic: -6400348606759295499 Time: 0.122624 [03/25/2022-13:24:11] [V] [TRT] sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage1_warpsize2x2x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: -6346247605026339453 [03/25/2022-13:24:11] [V] [TRT] Tactic: -6346247605026339453 Time: 0.104448 [03/25/2022-13:24:11] [V] [TRT] sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x96x64_stage4_warpsize4x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -6232597026469067819 [03/25/2022-13:24:11] [V] [TRT] Tactic: -6232597026469067819 Time: 0.130176 [03/25/2022-13:24:11] [V] [TRT] sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x128_ldg16_relu_small_nt_v1 Tactic: -5980889159865208399 [03/25/2022-13:24:11] [V] [TRT] Tactic: -5980889159865208399 Time: 0.153856 [03/25/2022-13:24:11] [V] [TRT] sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_singleBuffer_medium_nt_v1 Tactic: -5766140806760372989 [03/25/2022-13:24:11] [V] [TRT] Tactic: -5766140806760372989 Time: 0.077824 [03/25/2022-13:24:11] [V] [TRT] sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r1s1 Tactic: -5697614955743334137 [03/25/2022-13:24:11] [V] [TRT] Tactic: -5697614955743334137 Time: 0.11392 [03/25/2022-13:24:11] [V] [TRT] sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -5671123121710113970 [03/25/2022-13:24:11] [V] [TRT] Tactic: -5671123121710113970 Time: 0.105344 [03/25/2022-13:24:11] [V] [TRT] sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x32x64_stage4_warpsize2x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -5615581362569252260 [03/25/2022-13:24:11] [V] [TRT] Tactic: -5615581362569252260 Time: 0.070144 [03/25/2022-13:24:11] [V] [TRT] sections.0.0.conv1.module.weight + 
QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x128x64_stage3_warpsize4x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -5562968047117507056 [03/25/2022-13:24:11] [V] [TRT] Tactic: -5562968047117507056 Time: 0.128768 [03/25/2022-13:24:11] [V] [TRT] sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -5516472881360101487 [03/25/2022-13:24:11] [V] [TRT] Tactic: -5516472881360101487 Time: 0.115072 [03/25/2022-13:24:11] [V] [TRT] sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r1s1 Tactic: -5311474420963248369 [03/25/2022-13:24:11] [V] [TRT] Tactic: -5311474420963248369 Time: 0.079616 [03/25/2022-13:24:11] [V] [TRT] sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: -5170003087447722174 [03/25/2022-13:24:11] [V] [TRT] Tactic: -5170003087447722174 Time: 0.093184 [03/25/2022-13:24:11] [V] [TRT] sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage6_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -4889586143772361690 [03/25/2022-13:24:11] [V] [TRT] Tactic: -4889586143772361690 Time: 0.101376 [03/25/2022-13:24:11] [V] [TRT] sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: -4849712423393454704 [03/25/2022-13:24:11] [V] [TRT] Tactic: -4849712423393454704 Time: 0.109824 [03/25/2022-13:24:11] [V] [TRT] sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x128_stage3_warpsize4x2x1_g1_sptensor16x8x64_simple_t1r1s1 Tactic: -4681913707320020520 [03/25/2022-13:24:11] [V] [TRT] Tactic: -4681913707320020520 Time: 0.232064 [03/25/2022-13:24:11] [V] [TRT] sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_medium_nt_v1 Tactic: -4516822589357530549 [03/25/2022-13:24:11] [V] [TRT] Tactic: -4516822589357530549 Time: 0.078336 [03/25/2022-13:24:11] [V] [TRT] sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -4455415102719506646 [03/25/2022-13:24:11] [V] [TRT] Tactic: 
-4455415102719506646 Time: 0.071168 [03/25/2022-13:24:11] [V] [TRT] sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32_t1r1s1 Tactic: -4425346730823666456 [03/25/2022-13:24:11] [V] [TRT] Tactic: -4425346730823666456 Time: 0.131712 [03/25/2022-13:24:11] [V] [TRT] sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage1_warpsize2x1x1_g1_tensor8x8x16_t1r1s1 Tactic: -4260476497340370474 [03/25/2022-13:24:11] [V] [TRT] Tactic: -4260476497340370474 Time: 0.086016 [03/25/2022-13:24:11] [V] [TRT] sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x2x1_g1_tensor16x8x32_t1r1s1 Tactic: -4182501876984672402 [03/25/2022-13:24:11] [V] [TRT] Tactic: -4182501876984672402 Time: 0.108928 [03/25/2022-13:24:11] [V] [TRT] sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x256x64_stage3_warpsize2x4x1_g1_tensor16x8x32_t1r1s1 Tactic: -4151617293257698859 [03/25/2022-13:24:11] [V] [TRT] Tactic: -4151617293257698859 Time: 0.244096 [03/25/2022-13:24:11] [V] [TRT] sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -3825889760337461729 [03/25/2022-13:24:11] [V] [TRT] Tactic: -3825889760337461729 Time: 0.08832 [03/25/2022-13:24:11] [V] [TRT] sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -3797022944823726673 [03/25/2022-13:24:11] [V] [TRT] Tactic: -3797022944823726673 Time: 0.068608 [03/25/2022-13:24:11] [V] [TRT] sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_t1r1s1 Tactic: -3613322253849278738 [03/25/2022-13:24:11] [V] [TRT] Tactic: -3613322253849278738 Time: 0.108672 [03/25/2022-13:24:11] [V] [TRT] sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x32x64_stage4_warpsize4x1x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: -3577322188448771475 [03/25/2022-13:24:11] [V] [TRT] Tactic: -3577322188448771475 Time: 0.075008 [03/25/2022-13:24:11] [V] [TRT] sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 Set Tactic Name: 
sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage1_warpsize2x4x1_g1_tensor8x8x16_t1r1s1 Tactic: -3531681826488401618 [03/25/2022-13:24:11] [V] [TRT] Tactic: -3531681826488401618 Time: 0.272896 [03/25/2022-13:24:11] [V] [TRT] sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x192x64_stage3_warpsize4x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -3288585994448820820 [03/25/2022-13:24:11] [V] [TRT] Tactic: -3288585994448820820 Time: 0.206976 [03/25/2022-13:24:11] [V] [TRT] sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x32x64_stage4_warpsize4x1x1_g1_tensor16x8x32_t1r1s1 Tactic: -2754311112012636251 [03/25/2022-13:24:11] [V] [TRT] Tactic: -2754311112012636251 Time: 0.079744 [03/25/2022-13:24:11] [V] [TRT] sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -2432868635536396215 [03/25/2022-13:24:11] [V] [TRT] Tactic: -2432868635536396215 Time: 0.182016 [03/25/2022-13:24:11] [V] [TRT] sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x96x64_stage5_warpsize2x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -2379804152300264660 [03/25/2022-13:24:11] [V] [TRT] Tactic: -2379804152300264660 Time: 0.09536 [03/25/2022-13:24:11] [V] [TRT] sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x128_stage3_warpsize2x2x1_g1_sptensor16x8x64_t1r1s1 Tactic: -2352253835013627337 [03/25/2022-13:24:11] [V] [TRT] Tactic: -2352253835013627337 Time: 0.106752 [03/25/2022-13:24:11] [V] [TRT] sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x96x64_stage4_warpsize4x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -2335587136911650799 [03/25/2022-13:24:11] [V] [TRT] Tactic: -2335587136911650799 Time: 0.100736 [03/25/2022-13:24:11] [V] [TRT] sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage4_warpsize2x4x1_g1_tensor16x8x32_t1r1s1 Tactic: -2315453944962430928 [03/25/2022-13:24:11] [V] [TRT] Tactic: -2315453944962430928 Time: 0.248832 [03/25/2022-13:24:11] [V] [TRT] sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x128_stage3_warpsize1x1x1_g1_sptensor16x8x64_simple_t1r1s1 Tactic: -2238364958919154661 
[03/25/2022-13:24:11] [V] [TRT] Tactic: -2238364958919154661 Time: 0.142464 [03/25/2022-13:24:11] [V] [TRT] sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -1916483171117495388 [03/25/2022-13:24:11] [V] [TRT] Tactic: -1916483171117495388 Time: 0.214784 [03/25/2022-13:24:11] [V] [TRT] sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x96x64_stage4_warpsize4x2x1_g1_tensor16x8x32_t1r1s1 Tactic: -1549742793039499659 [03/25/2022-13:24:11] [V] [TRT] Tactic: -1549742793039499659 Time: 0.1344 [03/25/2022-13:24:11] [V] [TRT] sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: -1499578657823798783 [03/25/2022-13:24:11] [V] [TRT] Tactic: -1499578657823798783 Time: 0.073856 [03/25/2022-13:24:11] [V] [TRT] sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r1s1 Tactic: -1494157908358500249 [03/25/2022-13:24:11] [V] [TRT] Tactic: -1494157908358500249 Time: 0.073984 [03/25/2022-13:24:11] [V] [TRT] sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x4x1_g1_tensor16x8x32_t1r1s1 Tactic: -1328736756812546664 [03/25/2022-13:24:11] [V] [TRT] Tactic: -1328736756812546664 Time: 0.11392 [03/25/2022-13:24:11] [V] [TRT] sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x32x64_stage4_warpsize2x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -1006589727652607355 [03/25/2022-13:24:11] [V] [TRT] Tactic: -1006589727652607355 Time: 0.07488 [03/25/2022-13:24:11] [V] [TRT] sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: -713022856474991236 [03/25/2022-13:24:11] [V] [TRT] Tactic: -713022856474991236 Time: 0.107776 [03/25/2022-13:24:11] [V] [TRT] sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x64x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: -405554772060757402 [03/25/2022-13:24:11] [V] [TRT] Tactic: -405554772060757402 Time: 0.0672 [03/25/2022-13:24:11] [V] [TRT] sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 Set Tactic Name: 
sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: -375949437730908730 [03/25/2022-13:24:11] [V] [TRT] Tactic: -375949437730908730 Time: 0.072832 [03/25/2022-13:24:11] [V] [TRT] sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x128_stage3_warpsize1x2x1_g1_sptensor16x8x64_t1r1s1 Tactic: -233227833606287806 [03/25/2022-13:24:11] [V] [TRT] Tactic: -233227833606287806 Time: 0.089216 [03/25/2022-13:24:11] [V] [TRT] sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -111878368089469751 [03/25/2022-13:24:11] [V] [TRT] Tactic: -111878368089469751 Time: 0.1248 [03/25/2022-13:24:11] [V] [TRT] sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x64x64_stage3_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -48936598874722005 [03/25/2022-13:24:11] [V] [TRT] Tactic: -48936598874722005 Time: 0.057088 [03/25/2022-13:24:11] [V] [TRT] sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x32x64_stage4_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -19707840769375107 [03/25/2022-13:24:11] [V] [TRT] Tactic: -19707840769375107 Time: 0.070912 [03/25/2022-13:24:11] [V] [TRT] Fastest Tactic: 3670282018109435863 Time: 0.049664 [03/25/2022-13:24:11] [V] [TRT] >>>>>>>>>>>>>>> Chose Runner Type: CaskConvolution Tactic: 3670282018109435863
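
Note how the candidate list for this format combination mixes dense kernels with 2:4 structured-sparse ones: the tactics named sm80_xmma_fprop_sparse_conv_* with sptensor16x8x64 are the sparse Tensor Core kernels that --sparsity=enable makes eligible. For this layer the best sparse candidate (8883810517410230831 at 0.064768) still loses to the dense implicit-GEMM tactic 3670282018109435863 at 0.049664, so whether sparsity pays off is decided per layer, not globally. For reference, a minimal sketch of reproducing this build through the TensorRT Python API rather than trtexec; this assumes the TensorRT 8.2 Python bindings matching this log, with the ONNX file name taken from the command line at the top:

    import tensorrt as trt

    logger = trt.Logger(trt.Logger.VERBOSE)
    builder = trt.Builder(logger)
    network = builder.create_network(
        1 << int(trt.NetworkDefinitionCreationFlag.EXPLICIT_BATCH))
    parser = trt.OnnxParser(network, logger)

    with open("resnet50_quant_sparse.onnx", "rb") as f:
        if not parser.parse(f.read()):
            raise RuntimeError(parser.get_error(0))

    config = builder.create_builder_config()
    config.max_workspace_size = 16 << 20             # 16 MiB, as reported above
    config.set_flag(trt.BuilderFlag.INT8)            # --int8
    config.set_flag(trt.BuilderFlag.SPARSE_WEIGHTS)  # --sparsity=enable

    # The Q/DQ nodes already in the graph (QuantizeLinear_24 and friends)
    # carry the scales, so no separate INT8 calibrator is needed.
    engine_bytes = builder.build_serialized_network(network, config)

With SPARSE_WEIGHTS set, the builder checks each convolution's weights for the 2:4 pattern and, as the timings above show, keeps a sparse tactic only where it beats every dense candidate for that layer.
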
[03/25/2022-13:24:11] [V] [TRT] =============== Computing costs for [03/25/2022-13:24:11] [V] [TRT] *************** Autotuning format combination: Int8(50176,3136:4,56,1) -> Int8(200704,3136:4,56,1) *************** [03/25/2022-13:24:11] [V] [TRT] --------------- Timing Runner: sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node + Conv_72 (CudaDepthwiseConvolution) [03/25/2022-13:24:11] [V] [TRT] CudaDepthwiseConvolution has no valid tactics for this config, skipping [03/25/2022-13:24:11] [V] [TRT] --------------- Timing Runner: sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node + Conv_72 (FusedConvActConvolution) [03/25/2022-13:24:11] [V] [TRT] FusedConvActConvolution has no valid tactics for this config, skipping [03/25/2022-13:24:11] [V] [TRT] --------------- Timing Runner: sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node + Conv_72 (CaskConvolution) [03/25/2022-13:24:11] [V] [TRT] sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node + Conv_72 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_medium_nn_v1 Tactic: 175853789719975416 [03/25/2022-13:24:11] [V] [TRT] Tactic: 175853789719975416 Time: 0.416384 [03/25/2022-13:24:11] [V] [TRT] sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node + Conv_72 Set Tactic Name: 
ampere_int8x4_icudnn_int8x4_128x32_relu_xregs_medium_nn_v1 Tactic: 2171150287007712632 [03/25/2022-13:24:11] [V] [TRT] Tactic: 2171150287007712632 Time: 0.481408 [03/25/2022-13:24:11] [V] [TRT] sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node + Conv_72 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x64_relu_small_nn_v1 Tactic: 2234457234705232274 [03/25/2022-13:24:11] [V] [TRT] Tactic: 2234457234705232274 Time: 0.339328 [03/25/2022-13:24:11] [V] [TRT] sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node + Conv_72 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x64_relu_medium_nn_v1 Tactic: 5834048089706882838 [03/25/2022-13:24:11] [V] [TRT] Tactic: 5834048089706882838 Time: 0.348416 [03/25/2022-13:24:11] [V] [TRT] sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node + Conv_72 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x128_relu_interior_nn_v1 Tactic: 6299962968199310600 [03/25/2022-13:24:11] [V] [TRT] Tactic: 6299962968199310600 Time: 0.363264 [03/25/2022-13:24:11] [V] [TRT] sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node + Conv_72 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_xregs_interior_nn_v1 Tactic: 6341572697076960911 [03/25/2022-13:24:11] [V] [TRT] Tactic: 6341572697076960911 Time: 0.442496 [03/25/2022-13:24:11] [V] [TRT] sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node + Conv_72 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_small_nn_v1 Tactic: -8626990807754934295 [03/25/2022-13:24:11] [V] [TRT] Tactic: -8626990807754934295 Time: 0.406912 [03/25/2022-13:24:11] [V] [TRT] sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node + Conv_72 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x64_relu_interior_nn_v1 Tactic: -8498217049614706532 [03/25/2022-13:24:11] [V] [TRT] Tactic: -8498217049614706532 Time: 0.326528 [03/25/2022-13:24:11] [V] [TRT] sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node + Conv_72 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_xregs_small_nn_v1 Tactic: -7303593854972602201 [03/25/2022-13:24:11] [V] [TRT] Tactic: -7303593854972602201 Time: 0.452352 [03/25/2022-13:24:11] [V] [TRT] sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node + Conv_72 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x128_relu_medium_nn_v1 Tactic: -6585664687867083638 [03/25/2022-13:24:11] [V] [TRT] Tactic: -6585664687867083638 Time: 0.37056 [03/25/2022-13:24:11] [V] [TRT] sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node + Conv_72 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_interior_nn_v1 Tactic: -3326139578711341011 [03/25/2022-13:24:11] [V] [TRT] Tactic: -3326139578711341011 Time: 0.388864 [03/25/2022-13:24:11] [V] [TRT] sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node + Conv_72 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x128_relu_small_nn_v1 Tactic: -683636008127039856 [03/25/2022-13:24:11] [V] [TRT] Tactic: -683636008127039856 Time: 0.368 [03/25/2022-13:24:11] [V] [TRT] Fastest Tactic: -8498217049614706532 Time: 0.326528 [03/25/2022-13:24:11] [V] [TRT] >>>>>>>>>>>>>>> Chose Runner Type: CaskConvolution Tactic: -8498217049614706532 [03/25/2022-13:24:11] [V] [TRT] *************** Autotuning format combination: Int8(50176,3136:4,56,1) -> Int8(25088,3136:32,56,1) *************** [03/25/2022-13:24:11] [V] [TRT] --------------- Timing Runner: 
sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node + Conv_72 (CaskConvolution) [03/25/2022-13:24:11] [V] [TRT] sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node + Conv_72 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_small_c32_nn_v1 Tactic: 1100922622480907544 [03/25/2022-13:24:11] [V] [TRT] Tactic: 1100922622480907544 Time: 0.4 [03/25/2022-13:24:11] [V] [TRT] sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node + Conv_72 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x128_relu_interior_c32_nn_v1 Tactic: 2855900226702061782 [03/25/2022-13:24:11] [V] [TRT] Tactic: 2855900226702061782 Time: 0.362112 [03/25/2022-13:24:11] [V] [TRT] sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node + Conv_72 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x64_relu_medium_c32_nn_v1 Tactic: 3606311198834416176 [03/25/2022-13:24:11] [V] [TRT] Tactic: 3606311198834416176 Time: 0.343552 [03/25/2022-13:24:11] [V] [TRT] sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node + Conv_72 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x128_relu_small_c32_nn_v1 Tactic: 4325765560739862899 [03/25/2022-13:24:11] [V] [TRT] Tactic: 4325765560739862899 Time: 0.366976 [03/25/2022-13:24:11] [V] [TRT] sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node + Conv_72 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x64_relu_interior_c32_nn_v1 Tactic: 8803458114157674373 [03/25/2022-13:24:11] [V] [TRT] Tactic: 8803458114157674373 Time: 0.324736 [03/25/2022-13:24:11] [V] [TRT] sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node + Conv_72 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_interior_c32_nn_v1 Tactic: -6934773036503365000 [03/25/2022-13:24:11] [V] [TRT] Tactic: -6934773036503365000 Time: 0.383872 [03/25/2022-13:24:11] [V] [TRT] sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node + Conv_72 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_xregs_interior_c32_nn_v1 Tactic: -4431642509665791294 [03/25/2022-13:24:11] [V] [TRT] Tactic: -4431642509665791294 Time: 0.339968 [03/25/2022-13:24:11] [V] [TRT] sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node + Conv_72 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x128_relu_medium_c32_nn_v1 Tactic: -4255737803793506479 [03/25/2022-13:24:11] [V] [TRT] Tactic: -4255737803793506479 Time: 0.289024 [03/25/2022-13:24:11] [V] [TRT] sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node + Conv_72 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_xregs_small_c32_nn_v1 Tactic: -3958182351168863467 [03/25/2022-13:24:11] [V] [TRT] Tactic: -3958182351168863467 Time: 0.3584 [03/25/2022-13:24:11] [V] [TRT] sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node + Conv_72 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_xregs_medium_c32_nn_v1 Tactic: -3111968753064955248 [03/25/2022-13:24:11] [V] [TRT] Tactic: -3111968753064955248 Time: 0.371712 [03/25/2022-13:24:11] [V] [TRT] sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node + Conv_72 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_medium_c32_nn_v1 Tactic: -1492575840277333548 [03/25/2022-13:24:11] [V] [TRT] Tactic: -1492575840277333548 Time: 0.33088 [03/25/2022-13:24:11] [V] [TRT] sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node + 
Conv_72 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x64_relu_small_c32_nn_v1 Tactic: -868495160148524802 [03/25/2022-13:24:11] [V] [TRT] Tactic: -868495160148524802 Time: 0.264832 [03/25/2022-13:24:11] [V] [TRT] Fastest Tactic: -868495160148524802 Time: 0.264832 [03/25/2022-13:24:11] [V] [TRT] >>>>>>>>>>>>>>> Chose Runner Type: CaskConvolution Tactic: -868495160148524802 [03/25/2022-13:24:11] [V] [TRT] *************** Autotuning format combination: Int8(6272,3136:32,56,1) -> Int8(25088,3136:32,56,1) *************** [03/25/2022-13:24:11] [V] [TRT] --------------- Timing Runner: sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node + Conv_72 (CudaGroupConvolution) [03/25/2022-13:24:11] [V] [TRT] CudaGroupConvolution has no valid tactics for this config, skipping [03/25/2022-13:24:11] [V] [TRT] --------------- Timing Runner: sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node + Conv_72 (CudaDepthwiseConvolution) [03/25/2022-13:24:11] [V] [TRT] CudaDepthwiseConvolution has no valid tactics for this config, skipping [03/25/2022-13:24:11] [V] [TRT] --------------- Timing Runner: sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node + Conv_72 (FusedConvActConvolution) [03/25/2022-13:24:11] [V] [TRT] FusedConvActConvolution has no valid tactics for this config, skipping [03/25/2022-13:24:11] [V] [TRT] --------------- Timing Runner: sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node + Conv_72 (CaskConvolution) [03/25/2022-13:24:11] [V] [TRT] sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node + Conv_72 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage1_warpsize4x1x1_g1_tensor8x8x16_t1r1s1 Tactic: 68468667201176803 [03/25/2022-13:24:11] [V] [TRT] Tactic: 68468667201176803 Time: 0.184064 [03/25/2022-13:24:11] [V] [TRT] sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node + Conv_72 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x192x64_stage3_warpsize4x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 125145153013230687 [03/25/2022-13:24:11] [V] [TRT] Tactic: 125145153013230687 Time: 0.32064 [03/25/2022-13:24:11] [V] [TRT] sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node + Conv_72 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x128_ldg16_relu_interior_nt_v1 Tactic: 434957160407688216 [03/25/2022-13:24:11] [V] [TRT] Tactic: 434957160407688216 Time: 0.23104 [03/25/2022-13:24:11] [V] [TRT] sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node + Conv_72 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage4_warpsize4x2x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: 805889586762897346 [03/25/2022-13:24:11] [V] [TRT] Tactic: 805889586762897346 Time: 0.169088 [03/25/2022-13:24:11] [V] [TRT] sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node + Conv_72 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x128_stage3_warpsize4x2x1_g1_sptensor16x8x64_t1r1s1 Tactic: 857001784974286465 [03/25/2022-13:24:11] [V] [TRT] Tactic: 857001784974286465 Time: 0.199552 [03/25/2022-13:24:11] [V] [TRT] sections.0.0.identity.conv.module.weight + 
QuantizeLinear_68_quantize_scale_node + Conv_72 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 1214130898909872671 [03/25/2022-13:24:11] [V] [TRT] Tactic: 1214130898909872671 Time: 0.245248 [03/25/2022-13:24:11] [V] [TRT] sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node + Conv_72 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage4_warpsize4x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 1278425129871930205 [03/25/2022-13:24:11] [V] [TRT] Tactic: 1278425129871930205 Time: 0.157056 [03/25/2022-13:24:11] [V] [TRT] sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node + Conv_72 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 1583811548148740665 [03/25/2022-13:24:11] [V] [TRT] Tactic: 1583811548148740665 Time: 0.272384 [03/25/2022-13:24:11] [V] [TRT] sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node + Conv_72 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 1701344857577810806 [03/25/2022-13:24:11] [V] [TRT] Tactic: 1701344857577810806 Time: 0.223488 [03/25/2022-13:24:11] [V] [TRT] sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node + Conv_72 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x32x64_stage4_warpsize2x1x1_g1_tensor16x8x32_t1r1s1 Tactic: 1797231177354918208 [03/25/2022-13:24:11] [V] [TRT] Tactic: 1797231177354918208 Time: 0.225152 [03/25/2022-13:24:11] [V] [TRT] sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node + Conv_72 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x192x64_stage3_warpsize4x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 2004812516525036381 [03/25/2022-13:24:11] [V] [TRT] Tactic: 2004812516525036381 Time: 0.234112 [03/25/2022-13:24:11] [V] [TRT] sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node + Conv_72 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x128x64_stage3_warpsize4x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 2030033463723799063 [03/25/2022-13:24:11] [V] [TRT] Tactic: 2030033463723799063 Time: 0.16128 [03/25/2022-13:24:11] [V] [TRT] sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node + Conv_72 Set Tactic Name: ampere_int8_i8816cudnn_int8_128x128_ldg16_relu_interior_nt_v1 Tactic: 2346437292116182513 [03/25/2022-13:24:11] [V] [TRT] Tactic: 2346437292116182513 Time: 0.182528 [03/25/2022-13:24:11] [V] [TRT] sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node + Conv_72 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 2376898825218218566 [03/25/2022-13:24:11] [V] [TRT] Tactic: 2376898825218218566 Time: 
0.140544 [03/25/2022-13:24:11] [V] [TRT] sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node + Conv_72 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_singleBuffer_interior_nt_v1 Tactic: 2522133112320625287 [03/25/2022-13:24:11] [V] [TRT] Tactic: 2522133112320625287 Time: 0.193536 [03/25/2022-13:24:11] [V] [TRT] sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node + Conv_72 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage4_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 2548171972648455240 [03/25/2022-13:24:11] [V] [TRT] Tactic: 2548171972648455240 Time: 0.144384 [03/25/2022-13:24:11] [V] [TRT] sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node + Conv_72 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage1_warpsize2x1x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: 2548946449357458230 [03/25/2022-13:24:11] [V] [TRT] Tactic: 2548946449357458230 Time: 0.193664 [03/25/2022-13:24:11] [V] [TRT] sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node + Conv_72 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x16x128_stage3_warpsize1x1x1_g1_sptensor16x8x64_simple_t1r1s1 Tactic: 2678520742286844763 [03/25/2022-13:24:11] [V] [TRT] Tactic: 2678520742286844763 Time: 0.377344 [03/25/2022-13:24:11] [V] [TRT] sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node + Conv_72 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 2756291002030759362 [03/25/2022-13:24:11] [V] [TRT] Tactic: 2756291002030759362 Time: 0.210688 [03/25/2022-13:24:11] [V] [TRT] sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node + Conv_72 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x128x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 2972948223367788520 [03/25/2022-13:24:11] [V] [TRT] Tactic: 2972948223367788520 Time: 0.13568 [03/25/2022-13:24:11] [V] [TRT] sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node + Conv_72 Set Tactic Name: ampere_int8_i8816cudnn_int8_128x128_ldg16_relu_medium_nt_v1 Tactic: 2985940154541537814 [03/25/2022-13:24:11] [V] [TRT] Tactic: 2985940154541537814 Time: 0.186624 [03/25/2022-13:24:11] [V] [TRT] sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node + Conv_72 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x96x64_stage5_warpsize2x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 3043273137345374664 [03/25/2022-13:24:11] [V] [TRT] Tactic: 3043273137345374664 Time: 0.227584 [03/25/2022-13:24:11] [V] [TRT] sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node + Conv_72 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 3221677093659484230 [03/25/2022-13:24:12] [V] [TRT] Tactic: 3221677093659484230 Time: 0.299392 
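
(The sweep above is the builder autotuning the fused Conv_72 node: every eligible kernel, or "tactic", is timed for each candidate input/output format, dense tensor-core kernels (tensor16x8x32) alongside 2:4-sparse ones (sptensor16x8x64), and only the fastest survives into the engine; the sweep for this layer continues below. For reference, a minimal Python sketch of a builder configuration that produces this kind of log, assuming TensorRT 8.2's Python bindings and a Q/DQ ONNX export; the function name build_engine is illustrative:

    import tensorrt as trt

    TRT_LOGGER = trt.Logger(trt.Logger.VERBOSE)  # VERBOSE is what surfaces per-tactic timings

    def build_engine(onnx_path):
        builder = trt.Builder(TRT_LOGGER)
        # Explicit-batch network definition, as required for ONNX models.
        network = builder.create_network(
            1 << int(trt.NetworkDefinitionCreationFlag.EXPLICIT_BATCH))
        parser = trt.OnnxParser(network, TRT_LOGGER)
        with open(onnx_path, "rb") as f:
            if not parser.parse(f.read()):
                raise RuntimeError("\n".join(
                    str(parser.get_error(i)) for i in range(parser.num_errors)))
        config = builder.create_builder_config()
        config.set_flag(trt.BuilderFlag.INT8)            # enable INT8 (i8i8) tactics
        config.set_flag(trt.BuilderFlag.SPARSE_WEIGHTS)  # enable sparse (sptensor) tactics
        # Scales come from the QuantizeLinear/DequantizeLinear nodes already in
        # the graph, so no calibrator is attached here.
        return builder.build_serialized_network(network, config)

    plan = build_engine("resnet50_quant_sparse.onnx")

Because the scales come from the Q/DQ nodes, INT8 here is explicit quantization; SPARSE_WEIGHTS only widens the tactic search to the sparse kernels, it does not prune any weights itself.)
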
[03/25/2022-13:24:12] [V] [TRT] sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node + Conv_72 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r1s1 Tactic: 3242897809704328258 [03/25/2022-13:24:12] [V] [TRT] Tactic: 3242897809704328258 Time: 0.18112 [03/25/2022-13:24:12] [V] [TRT] sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node + Conv_72 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: 3312456766204252694 [03/25/2022-13:24:12] [V] [TRT] Tactic: 3312456766204252694 Time: 0.198144 [03/25/2022-13:24:12] [V] [TRT] sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node + Conv_72 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 3538565962642681625 [03/25/2022-13:24:12] [V] [TRT] Tactic: 3538565962642681625 Time: 0.209536 [03/25/2022-13:24:12] [V] [TRT] sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node + Conv_72 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage1_warpsize2x2x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: 3541919052468401776 [03/25/2022-13:24:12] [V] [TRT] Tactic: 3541919052468401776 Time: 0.159872 [03/25/2022-13:24:12] [V] [TRT] sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node + Conv_72 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 3593397928177382100 [03/25/2022-13:24:12] [V] [TRT] Tactic: 3593397928177382100 Time: 0.247936 [03/25/2022-13:24:12] [V] [TRT] sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node + Conv_72 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x64x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 3670282018109435863 [03/25/2022-13:24:12] [V] [TRT] Tactic: 3670282018109435863 Time: 0.15296 [03/25/2022-13:24:12] [V] [TRT] sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node + Conv_72 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 3671413346254027573 [03/25/2022-13:24:12] [V] [TRT] Tactic: 3671413346254027573 Time: 0.194688 [03/25/2022-13:24:12] [V] [TRT] sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node + Conv_72 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x128_ldg16_relu_medium_nt_v1 Tactic: 3899284354987683408 [03/25/2022-13:24:12] [V] [TRT] Tactic: 3899284354987683408 Time: 0.236032 [03/25/2022-13:24:12] [V] [TRT] sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node + Conv_72 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: 3927509214678622419 [03/25/2022-13:24:12] 
[V] [TRT] Tactic: 3927509214678622419 Time: 0.162432 [03/25/2022-13:24:12] [V] [TRT] sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node + Conv_72 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 4112572034735311841 [03/25/2022-13:24:12] [V] [TRT] Tactic: 4112572034735311841 Time: 0.29696 [03/25/2022-13:24:12] [V] [TRT] sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node + Conv_72 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 4239974928951431644 [03/25/2022-13:24:12] [V] [TRT] Tactic: 4239974928951431644 Time: 0.137984 [03/25/2022-13:24:12] [V] [TRT] sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node + Conv_72 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage6_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 4610760414797216079 [03/25/2022-13:24:12] [V] [TRT] Tactic: 4610760414797216079 Time: 0.179456 [03/25/2022-13:24:12] [V] [TRT] sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node + Conv_72 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_small_nt_v1 Tactic: 4717285412741024953 [03/25/2022-13:24:12] [V] [TRT] Tactic: 4717285412741024953 Time: 0.198656 [03/25/2022-13:24:12] [V] [TRT] sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node + Conv_72 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 4796956614760326119 [03/25/2022-13:24:12] [V] [TRT] Tactic: 4796956614760326119 Time: 0.202496 [03/25/2022-13:24:12] [V] [TRT] sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node + Conv_72 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 4919361344804309192 [03/25/2022-13:24:12] [V] [TRT] Tactic: 4919361344804309192 Time: 0.236928 [03/25/2022-13:24:12] [V] [TRT] sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node + Conv_72 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x128_stage3_warpsize1x2x1_g1_sptensor16x8x64_simple_t1r1s1 Tactic: 5043674678294309681 [03/25/2022-13:24:12] [V] [TRT] Tactic: 5043674678294309681 Time: 0.221312 [03/25/2022-13:24:12] [V] [TRT] sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node + Conv_72 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_interior_nt_v1 Tactic: 5126565865931538390 [03/25/2022-13:24:12] [V] [TRT] Tactic: 5126565865931538390 Time: 0.193024 [03/25/2022-13:24:12] [V] [TRT] sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node + Conv_72 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x64x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 5204702486885981735 [03/25/2022-13:24:12] [V] [TRT] Tactic: 5204702486885981735 
Time: 0.161536 [03/25/2022-13:24:12] [V] [TRT] sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node + Conv_72 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: 5375256703210220108 [03/25/2022-13:24:12] [V] [TRT] Tactic: 5375256703210220108 Time: 0.181504 [03/25/2022-13:24:12] [V] [TRT] sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node + Conv_72 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x128_stage3_warpsize2x2x1_g1_sptensor16x8x64_simple_t1r1s1 Tactic: 5424258848951129084 [03/25/2022-13:24:12] [V] [TRT] Tactic: 5424258848951129084 Time: 0.16896 [03/25/2022-13:24:12] [V] [TRT] sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node + Conv_72 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 5443897483205284103 [03/25/2022-13:24:12] [V] [TRT] Tactic: 5443897483205284103 Time: 0.212608 [03/25/2022-13:24:12] [V] [TRT] sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node + Conv_72 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage4_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 5707566217891294846 [03/25/2022-13:24:12] [V] [TRT] Tactic: 5707566217891294846 Time: 0.163328 [03/25/2022-13:24:12] [V] [TRT] sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node + Conv_72 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x4x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 5986622376339202983 [03/25/2022-13:24:12] [V] [TRT] Tactic: 5986622376339202983 Time: 0.161664 [03/25/2022-13:24:12] [V] [TRT] sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node + Conv_72 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x128x64_stage3_warpsize4x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 6007888770437705057 [03/25/2022-13:24:12] [V] [TRT] Tactic: 6007888770437705057 Time: 0.201728 [03/25/2022-13:24:12] [V] [TRT] sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node + Conv_72 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32_t1r1s1 Tactic: 6405251167055673379 [03/25/2022-13:24:12] [V] [TRT] Tactic: 6405251167055673379 Time: 0.211584 [03/25/2022-13:24:12] [V] [TRT] sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node + Conv_72 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage4_warpsize4x1x1_g1_tensor16x8x32_t1r1s1 Tactic: 6433368103202497147 [03/25/2022-13:24:12] [V] [TRT] Tactic: 6433368103202497147 Time: 0.1792 [03/25/2022-13:24:12] [V] [TRT] sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node + Conv_72 Set Tactic Name: 
sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 6441948709525127755 [03/25/2022-13:24:12] [V] [TRT] Tactic: 6441948709525127755 Time: 0.263424 [03/25/2022-13:24:12] [V] [TRT] sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node + Conv_72 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x256x64_stage3_warpsize2x4x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 6443933097134654777 [03/25/2022-13:24:12] [V] [TRT] Tactic: 6443933097134654777 Time: 0.19968 [03/25/2022-13:24:12] [V] [TRT] sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node + Conv_72 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 6457435868048963632 [03/25/2022-13:24:12] [V] [TRT] Tactic: 6457435868048963632 Time: 0.186624 [03/25/2022-13:24:12] [V] [TRT] sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node + Conv_72 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x128_stage3_warpsize1x1x1_g1_sptensor16x8x64_t1r1s1 Tactic: 6510345569544721081 [03/25/2022-13:24:12] [V] [TRT] Tactic: 6510345569544721081 Time: 0.294784 [03/25/2022-13:24:12] [V] [TRT] sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node + Conv_72 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 6793988781414507278 [03/25/2022-13:24:12] [V] [TRT] Tactic: 6793988781414507278 Time: 0.157568 [03/25/2022-13:24:12] [V] [TRT] sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node + Conv_72 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 6880710371738875469 [03/25/2022-13:24:12] [V] [TRT] Tactic: 6880710371738875469 Time: 0.20992 [03/25/2022-13:24:12] [V] [TRT] sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node + Conv_72 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 6925201228918187099 [03/25/2022-13:24:12] [V] [TRT] Tactic: 6925201228918187099 Time: 0.167168 [03/25/2022-13:24:12] [V] [TRT] sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node + Conv_72 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 6991524515605108718 [03/25/2022-13:24:12] [V] [TRT] Tactic: 6991524515605108718 Time: 0.338048 [03/25/2022-13:24:12] [V] [TRT] sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node + Conv_72 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x4x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 7245509442265271220 [03/25/2022-13:24:12] [V] [TRT] Tactic: 7245509442265271220 
Time: 0.147968 [03/25/2022-13:24:12] [V] [TRT] sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node + Conv_72 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x64x64_stage3_warpsize4x1x1_g1_tensor16x8x32_t1r1s1 Tactic: 7318929579222925725 [03/25/2022-13:24:12] [V] [TRT] Tactic: 7318929579222925725 Time: 0.17216 [03/25/2022-13:24:12] [V] [TRT] sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node + Conv_72 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: 7731430299029542276 [03/25/2022-13:24:12] [V] [TRT] Tactic: 7731430299029542276 Time: 0.14592 [03/25/2022-13:24:12] [V] [TRT] sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node + Conv_72 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage4_warpsize2x4x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: 7738495016763012180 [03/25/2022-13:24:12] [V] [TRT] Tactic: 7738495016763012180 Time: 0.16384 [03/25/2022-13:24:12] [V] [TRT] sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node + Conv_72 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage4_warpsize4x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 8142283985160822229 [03/25/2022-13:24:12] [V] [TRT] Tactic: 8142283985160822229 Time: 0.195968 [03/25/2022-13:24:12] [V] [TRT] sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node + Conv_72 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage4_warpsize2x4x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 8173975624668590862 [03/25/2022-13:24:12] [V] [TRT] Tactic: 8173975624668590862 Time: 0.194304 [03/25/2022-13:24:12] [V] [TRT] sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node + Conv_72 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage4_warpsize4x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 8234775147403903473 [03/25/2022-13:24:12] [V] [TRT] Tactic: 8234775147403903473 Time: 0.211328 [03/25/2022-13:24:12] [V] [TRT] sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node + Conv_72 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x128x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 8524082966802584889 [03/25/2022-13:24:12] [V] [TRT] Tactic: 8524082966802584889 Time: 0.14784 [03/25/2022-13:24:12] [V] [TRT] sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node + Conv_72 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage1_warpsize4x1x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: 8684013308930763400 [03/25/2022-13:24:12] [V] [TRT] Tactic: 8684013308930763400 Time: 0.167296 [03/25/2022-13:24:12] [V] [TRT] sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node + Conv_72 Set Tactic Name: 
sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x128x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 8765382722978397630 [03/25/2022-13:24:12] [V] [TRT] Tactic: 8765382722978397630 Time: 0.155648 [03/25/2022-13:24:12] [V] [TRT] sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node + Conv_72 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 8843193587782643431 [03/25/2022-13:24:12] [V] [TRT] Tactic: 8843193587782643431 Time: 0.225152 [03/25/2022-13:24:12] [V] [TRT] sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node + Conv_72 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x128_stage3_warpsize1x4x1_g1_sptensor16x8x64_simple_t1r1s1 Tactic: 8883810517410230831 [03/25/2022-13:24:12] [V] [TRT] Tactic: 8883810517410230831 Time: 0.21824 [03/25/2022-13:24:12] [V] [TRT] sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node + Conv_72 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 8930797211803511337 [03/25/2022-13:24:12] [V] [TRT] Tactic: 8930797211803511337 Time: 0.322688 [03/25/2022-13:24:12] [V] [TRT] sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node + Conv_72 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 8935070489925739043 [03/25/2022-13:24:12] [V] [TRT] Tactic: 8935070489925739043 Time: 0.16 [03/25/2022-13:24:12] [V] [TRT] sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node + Conv_72 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x16x128_stage3_warpsize1x1x1_g1_sptensor16x8x64_t1r1s1 Tactic: 9062173295331155069 [03/25/2022-13:24:12] [V] [TRT] Tactic: 9062173295331155069 Time: 0.381056 [03/25/2022-13:24:12] [V] [TRT] sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node + Conv_72 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_singleBuffer_small_nt_v1 Tactic: -9118785798277698619 [03/25/2022-13:24:12] [V] [TRT] Tactic: -9118785798277698619 Time: 0.197888 [03/25/2022-13:24:12] [V] [TRT] sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node + Conv_72 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -8985599729413291927 [03/25/2022-13:24:12] [V] [TRT] Tactic: -8985599729413291927 Time: 0.167424 [03/25/2022-13:24:12] [V] [TRT] sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node + Conv_72 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x32x64_stage4_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -8972697510150675429 [03/25/2022-13:24:12] [V] [TRT] Tactic: -8972697510150675429 Time: 0.169728 [03/25/2022-13:24:12] [V] [TRT] 
sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node + Conv_72 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r1s1 Tactic: -8943710627305202139 [03/25/2022-13:24:12] [V] [TRT] Tactic: -8943710627305202139 Time: 0.168192 [03/25/2022-13:24:12] [V] [TRT] sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node + Conv_72 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: -8859846367886814331 [03/25/2022-13:24:12] [V] [TRT] Tactic: -8859846367886814331 Time: 0.181504 [03/25/2022-13:24:12] [V] [TRT] sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node + Conv_72 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: -8638624340850784688 [03/25/2022-13:24:12] [V] [TRT] Tactic: -8638624340850784688 Time: 0.266624 [03/25/2022-13:24:12] [V] [TRT] sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node + Conv_72 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage4_warpsize4x1x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: -8556775352640313933 [03/25/2022-13:24:12] [V] [TRT] Tactic: -8556775352640313933 Time: 0.158336 [03/25/2022-13:24:12] [V] [TRT] sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node + Conv_72 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage1_warpsize2x1x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: -8382298409581540699 [03/25/2022-13:24:12] [V] [TRT] Tactic: -8382298409581540699 Time: 0.218368 [03/25/2022-13:24:12] [V] [TRT] sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node + Conv_72 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage1_warpsize4x2x1_g1_tensor8x8x16_t1r1s1 Tactic: -8172318747337038866 [03/25/2022-13:24:12] [V] [TRT] Tactic: -8172318747337038866 Time: 0.230144 [03/25/2022-13:24:12] [V] [TRT] sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node + Conv_72 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x256x64_stage3_warpsize2x4x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -8038164441468184723 [03/25/2022-13:24:12] [V] [TRT] Tactic: -8038164441468184723 Time: 0.159232 [03/25/2022-13:24:12] [V] [TRT] sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node + Conv_72 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x96x64_stage5_warpsize2x1x1_g1_tensor16x8x32_t1r1s1 Tactic: -7844028314176826857 [03/25/2022-13:24:12] [V] [TRT] Tactic: -7844028314176826857 Time: 0.236544 [03/25/2022-13:24:12] [V] [TRT] sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node + Conv_72 Set Tactic Name: 
sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage4_warpsize2x4x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -7364286662638617917 [03/25/2022-13:24:12] [V] [TRT] Tactic: -7364286662638617917 Time: 0.155648 [03/25/2022-13:24:12] [V] [TRT] sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node + Conv_72 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage1_warpsize2x1x1_g1_tensor8x8x16_t1r1s1 Tactic: -7361755530333096258 [03/25/2022-13:24:12] [V] [TRT] Tactic: -7361755530333096258 Time: 0.21248 [03/25/2022-13:24:12] [V] [TRT] sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node + Conv_72 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x128_stage3_warpsize1x4x1_g1_sptensor16x8x64_t1r1s1 Tactic: -7289760022626653388 [03/25/2022-13:24:12] [V] [TRT] Tactic: -7289760022626653388 Time: 0.217344 [03/25/2022-13:24:12] [V] [TRT] sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node + Conv_72 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage1_warpsize2x4x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: -7106539943789766885 [03/25/2022-13:24:12] [V] [TRT] Tactic: -7106539943789766885 Time: 0.210304 [03/25/2022-13:24:12] [V] [TRT] sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node + Conv_72 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage1_warpsize4x2x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: -6969478418607271266 [03/25/2022-13:24:12] [V] [TRT] Tactic: -6969478418607271266 Time: 0.214912 [03/25/2022-13:24:12] [V] [TRT] sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node + Conv_72 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -6930438165437733000 [03/25/2022-13:24:12] [V] [TRT] Tactic: -6930438165437733000 Time: 0.29504 [03/25/2022-13:24:12] [V] [TRT] sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node + Conv_72 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x64x64_stage3_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -6879607992933502380 [03/25/2022-13:24:12] [V] [TRT] Tactic: -6879607992933502380 Time: 0.153472 [03/25/2022-13:24:12] [V] [TRT] sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node + Conv_72 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: -6527178416855951297 [03/25/2022-13:24:12] [V] [TRT] Tactic: -6527178416855951297 Time: 0.257024 [03/25/2022-13:24:12] [V] [TRT] sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node + Conv_72 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_t1r1s1 Tactic: -6510232214299595844 [03/25/2022-13:24:12] [V] [TRT] 
Tactic: -6510232214299595844 Time: 0.275968 [03/25/2022-13:24:12] [V] [TRT] sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node + Conv_72 Set Tactic Name: ampere_int8_i8816cudnn_int8_128x128_ldg16_relu_small_nt_v1 Tactic: -6400348606759295499 [03/25/2022-13:24:12] [V] [TRT] Tactic: -6400348606759295499 Time: 0.184576 [03/25/2022-13:24:12] [V] [TRT] sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node + Conv_72 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage1_warpsize2x2x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: -6346247605026339453 [03/25/2022-13:24:12] [V] [TRT] Tactic: -6346247605026339453 Time: 0.158976 [03/25/2022-13:24:12] [V] [TRT] sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node + Conv_72 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x96x64_stage4_warpsize4x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -6232597026469067819 [03/25/2022-13:24:12] [V] [TRT] Tactic: -6232597026469067819 Time: 0.286848 [03/25/2022-13:24:12] [V] [TRT] sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node + Conv_72 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x128_ldg16_relu_small_nt_v1 Tactic: -5980889159865208399 [03/25/2022-13:24:12] [V] [TRT] Tactic: -5980889159865208399 Time: 0.234112 [03/25/2022-13:24:12] [V] [TRT] sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node + Conv_72 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_singleBuffer_medium_nt_v1 Tactic: -5766140806760372989 [03/25/2022-13:24:12] [V] [TRT] Tactic: -5766140806760372989 Time: 0.200192 [03/25/2022-13:24:12] [V] [TRT] sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node + Conv_72 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r1s1 Tactic: -5697614955743334137 [03/25/2022-13:24:12] [V] [TRT] Tactic: -5697614955743334137 Time: 0.171264 [03/25/2022-13:24:12] [V] [TRT] sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node + Conv_72 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -5671123121710113970 [03/25/2022-13:24:12] [V] [TRT] Tactic: -5671123121710113970 Time: 0.158976 [03/25/2022-13:24:12] [V] [TRT] sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node + Conv_72 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x32x64_stage4_warpsize2x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -5615581362569252260 [03/25/2022-13:24:12] [V] [TRT] Tactic: -5615581362569252260 Time: 0.186496 [03/25/2022-13:24:12] [V] [TRT] sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node + Conv_72 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x128x64_stage3_warpsize4x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -5562968047117507056 [03/25/2022-13:24:12] [V] [TRT] Tactic: -5562968047117507056 Time: 0.198912 [03/25/2022-13:24:12] [V] [TRT] sections.0.0.identity.conv.module.weight + 
QuantizeLinear_68_quantize_scale_node + Conv_72 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -5516472881360101487 [03/25/2022-13:24:12] [V] [TRT] Tactic: -5516472881360101487 Time: 0.256384 [03/25/2022-13:24:12] [V] [TRT] sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node + Conv_72 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r1s1 Tactic: -5311474420963248369 [03/25/2022-13:24:12] [V] [TRT] Tactic: -5311474420963248369 Time: 0.216576 [03/25/2022-13:24:12] [V] [TRT] sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node + Conv_72 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: -5170003087447722174 [03/25/2022-13:24:12] [V] [TRT] Tactic: -5170003087447722174 Time: 0.260352 [03/25/2022-13:24:12] [V] [TRT] sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node + Conv_72 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage6_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -4889586143772361690 [03/25/2022-13:24:12] [V] [TRT] Tactic: -4889586143772361690 Time: 0.161408 [03/25/2022-13:24:12] [V] [TRT] sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node + Conv_72 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: -4849712423393454704 [03/25/2022-13:24:12] [V] [TRT] Tactic: -4849712423393454704 Time: 0.169856 [03/25/2022-13:24:12] [V] [TRT] sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node + Conv_72 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x128_stage3_warpsize4x2x1_g1_sptensor16x8x64_simple_t1r1s1 Tactic: -4681913707320020520 [03/25/2022-13:24:12] [V] [TRT] Tactic: -4681913707320020520 Time: 0.192896 [03/25/2022-13:24:12] [V] [TRT] sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node + Conv_72 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_medium_nt_v1 Tactic: -4516822589357530549 [03/25/2022-13:24:12] [V] [TRT] Tactic: -4516822589357530549 Time: 0.200832 [03/25/2022-13:24:12] [V] [TRT] sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node + Conv_72 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -4455415102719506646 [03/25/2022-13:24:12] [V] [TRT] Tactic: -4455415102719506646 Time: 0.194816 [03/25/2022-13:24:12] [V] [TRT] sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node + Conv_72 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32_t1r1s1 Tactic: -4425346730823666456 [03/25/2022-13:24:12] [V] [TRT] Tactic: -4425346730823666456 Time: 0.26752 
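
(Each measurement above is a "Tactic: <id> Time: <milliseconds>" pair, and each sweep closes with a "Fastest Tactic" summary; the one for this Conv_72 format combination concludes further below. When auditing a long verbose build log, a rough parsing sketch like the following pulls the measurements out; the log filename is illustrative, and associating tactics with layer names (via the preceding "Timing Runner" lines) is left out for brevity:

    import re

    # A measurement is "Tactic: <id> Time: <ms>"; the lookbehind skips the
    # "Fastest Tactic: ..." summaries so winners are not double-counted.
    TACTIC = re.compile(r"(?<!Fastest )Tactic: (-?\d+) Time: ([0-9.]+)")
    FASTEST = re.compile(r"Fastest Tactic: (-?\d+) Time: ([0-9.]+)")

    def tactic_times(log_text):
        measured = [(int(t), float(ms)) for t, ms in TACTIC.findall(log_text)]
        winners = [(int(t), float(ms)) for t, ms in FASTEST.findall(log_text)]
        return measured, winners

    if __name__ == "__main__":
        measured, winners = tactic_times(open("build.log").read())
        for tid, ms in sorted(measured, key=lambda p: p[1])[:5]:
            print(f"{tid:>22d}  {ms:.6f} ms")

Note the "Set Tactic Name: ... Tactic: <id>" announcement lines carry no Time field, so the pattern above skips them automatically.)
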
[03/25/2022-13:24:12] [V] [TRT] sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node + Conv_72 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage1_warpsize2x1x1_g1_tensor8x8x16_t1r1s1 Tactic: -4260476497340370474 [03/25/2022-13:24:12] [V] [TRT] Tactic: -4260476497340370474 Time: 0.239744 [03/25/2022-13:24:12] [V] [TRT] sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node + Conv_72 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x2x1_g1_tensor16x8x32_t1r1s1 Tactic: -4182501876984672402 [03/25/2022-13:24:12] [V] [TRT] Tactic: -4182501876984672402 Time: 0.163072 [03/25/2022-13:24:12] [V] [TRT] sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node + Conv_72 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x256x64_stage3_warpsize2x4x1_g1_tensor16x8x32_t1r1s1 Tactic: -4151617293257698859 [03/25/2022-13:24:12] [V] [TRT] Tactic: -4151617293257698859 Time: 0.20096 [03/25/2022-13:24:12] [V] [TRT] sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node + Conv_72 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -3825889760337461729 [03/25/2022-13:24:12] [V] [TRT] Tactic: -3825889760337461729 Time: 0.253056 [03/25/2022-13:24:12] [V] [TRT] sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node + Conv_72 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -3797022944823726673 [03/25/2022-13:24:12] [V] [TRT] Tactic: -3797022944823726673 Time: 0.185472 [03/25/2022-13:24:12] [V] [TRT] sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node + Conv_72 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_t1r1s1 Tactic: -3613322253849278738 [03/25/2022-13:24:12] [V] [TRT] Tactic: -3613322253849278738 Time: 0.313984 [03/25/2022-13:24:12] [V] [TRT] sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node + Conv_72 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x32x64_stage4_warpsize4x1x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: -3577322188448771475 [03/25/2022-13:24:12] [V] [TRT] Tactic: -3577322188448771475 Time: 0.200448 [03/25/2022-13:24:12] [V] [TRT] sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node + Conv_72 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage1_warpsize2x4x1_g1_tensor8x8x16_t1r1s1 Tactic: -3531681826488401618 [03/25/2022-13:24:12] [V] [TRT] Tactic: -3531681826488401618 Time: 0.223744 [03/25/2022-13:24:12] [V] [TRT] sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node + Conv_72 Set Tactic Name: 
sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x192x64_stage3_warpsize4x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -3288585994448820820 [03/25/2022-13:24:12] [V] [TRT] Tactic: -3288585994448820820 Time: 0.315392 [03/25/2022-13:24:12] [V] [TRT] sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node + Conv_72 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x32x64_stage4_warpsize4x1x1_g1_tensor16x8x32_t1r1s1 Tactic: -2754311112012636251 [03/25/2022-13:24:12] [V] [TRT] Tactic: -2754311112012636251 Time: 0.217728 [03/25/2022-13:24:12] [V] [TRT] sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node + Conv_72 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -2432868635536396215 [03/25/2022-13:24:12] [V] [TRT] Tactic: -2432868635536396215 Time: 0.267648 [03/25/2022-13:24:12] [V] [TRT] sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node + Conv_72 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x96x64_stage5_warpsize2x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -2379804152300264660 [03/25/2022-13:24:12] [V] [TRT] Tactic: -2379804152300264660 Time: 0.206208 [03/25/2022-13:24:12] [V] [TRT] sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node + Conv_72 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x128_stage3_warpsize2x2x1_g1_sptensor16x8x64_t1r1s1 Tactic: -2352253835013627337 [03/25/2022-13:24:12] [V] [TRT] Tactic: -2352253835013627337 Time: 0.164736 [03/25/2022-13:24:12] [V] [TRT] sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node + Conv_72 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x96x64_stage4_warpsize4x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -2335587136911650799 [03/25/2022-13:24:12] [V] [TRT] Tactic: -2335587136911650799 Time: 0.22272 [03/25/2022-13:24:12] [V] [TRT] sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node + Conv_72 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage4_warpsize2x4x1_g1_tensor16x8x32_t1r1s1 Tactic: -2315453944962430928 [03/25/2022-13:24:12] [V] [TRT] Tactic: -2315453944962430928 Time: 0.205568 [03/25/2022-13:24:12] [V] [TRT] sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node + Conv_72 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x128_stage3_warpsize1x1x1_g1_sptensor16x8x64_simple_t1r1s1 Tactic: -2238364958919154661 [03/25/2022-13:24:12] [V] [TRT] Tactic: -2238364958919154661 Time: 0.292224 [03/25/2022-13:24:12] [V] [TRT] sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node + Conv_72 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -1916483171117495388 [03/25/2022-13:24:12] [V] 
[TRT] Tactic: -1916483171117495388 Time: 0.33216 [03/25/2022-13:24:12] [V] [TRT] sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node + Conv_72 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x96x64_stage4_warpsize4x2x1_g1_tensor16x8x32_t1r1s1 Tactic: -1549742793039499659 [03/25/2022-13:24:12] [V] [TRT] Tactic: -1549742793039499659 Time: 0.295168 [03/25/2022-13:24:12] [V] [TRT] sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node + Conv_72 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: -1499578657823798783 [03/25/2022-13:24:12] [V] [TRT] Tactic: -1499578657823798783 Time: 0.199936 [03/25/2022-13:24:12] [V] [TRT] sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node + Conv_72 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r1s1 Tactic: -1494157908358500249 [03/25/2022-13:24:12] [V] [TRT] Tactic: -1494157908358500249 Time: 0.197888 [03/25/2022-13:24:12] [V] [TRT] sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node + Conv_72 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x4x1_g1_tensor16x8x32_t1r1s1 Tactic: -1328736756812546664 [03/25/2022-13:24:12] [V] [TRT] Tactic: -1328736756812546664 Time: 0.172416 [03/25/2022-13:24:12] [V] [TRT] sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node + Conv_72 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x32x64_stage4_warpsize2x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -1006589727652607355 [03/25/2022-13:24:12] [V] [TRT] Tactic: -1006589727652607355 Time: 0.204032 [03/25/2022-13:24:12] [V] [TRT] sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node + Conv_72 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: -713022856474991236 [03/25/2022-13:24:12] [V] [TRT] Tactic: -713022856474991236 Time: 0.30848 [03/25/2022-13:24:12] [V] [TRT] sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node + Conv_72 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x64x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: -405554772060757402 [03/25/2022-13:24:12] [V] [TRT] Tactic: -405554772060757402 Time: 0.176384 [03/25/2022-13:24:12] [V] [TRT] sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node + Conv_72 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: -375949437730908730 [03/25/2022-13:24:12] [V] [TRT] Tactic: -375949437730908730 Time: 0.196864 [03/25/2022-13:24:12] [V] [TRT] sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node + Conv_72 Set Tactic Name: 
sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x128_stage3_warpsize1x2x1_g1_sptensor16x8x64_t1r1s1 Tactic: -233227833606287806 [03/25/2022-13:24:12] [V] [TRT] Tactic: -233227833606287806 Time: 0.2208 [03/25/2022-13:24:12] [V] [TRT] sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node + Conv_72 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -111878368089469751 [03/25/2022-13:24:12] [V] [TRT] Tactic: -111878368089469751 Time: 0.249344 [03/25/2022-13:24:12] [V] [TRT] sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node + Conv_72 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x64x64_stage3_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -48936598874722005 [03/25/2022-13:24:12] [V] [TRT] Tactic: -48936598874722005 Time: 0.142336 [03/25/2022-13:24:12] [V] [TRT] sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node + Conv_72 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x32x64_stage4_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -19707840769375107 [03/25/2022-13:24:12] [V] [TRT] Tactic: -19707840769375107 Time: 0.190592 [03/25/2022-13:24:12] [V] [TRT] Fastest Tactic: 2972948223367788520 Time: 0.13568 [03/25/2022-13:24:12] [V] [TRT] >>>>>>>>>>>>>>> Chose Runner Type: CaskConvolution Tactic: 2972948223367788520 [03/25/2022-13:24:12] [V] [TRT] =============== Computing costs for [03/25/2022-13:24:12] [V] [TRT] *************** Autotuning format combination: Int8(50176,3136:4,56,1) -> Int8(50176,3136:4,56,1) *************** [03/25/2022-13:24:12] [V] [TRT] --------------- Timing Runner: sections.0.0.conv2.module.weight + QuantizeLinear_39_quantize_scale_node + Conv_43 + Relu_45 (CudaDepthwiseConvolution) [03/25/2022-13:24:12] [V] [TRT] CudaDepthwiseConvolution has no valid tactics for this config, skipping [03/25/2022-13:24:12] [V] [TRT] --------------- Timing Runner: sections.0.0.conv2.module.weight + QuantizeLinear_39_quantize_scale_node + Conv_43 + Relu_45 (FusedConvActConvolution) [03/25/2022-13:24:12] [V] [TRT] FusedConvActConvolution has no valid tactics for this config, skipping [03/25/2022-13:24:12] [V] [TRT] --------------- Timing Runner: sections.0.0.conv2.module.weight + QuantizeLinear_39_quantize_scale_node + Conv_43 + Relu_45 (CaskConvolution) [03/25/2022-13:24:12] [V] [TRT] sections.0.0.conv2.module.weight + QuantizeLinear_39_quantize_scale_node + Conv_43 + Relu_45 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_medium_nn_v1 Tactic: 175853789719975416 [03/25/2022-13:24:12] [V] [TRT] Tactic: 175853789719975416 Time: 0.471424 [03/25/2022-13:24:12] [V] [TRT] sections.0.0.conv2.module.weight + QuantizeLinear_39_quantize_scale_node + Conv_43 + Relu_45 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_xregs_medium_nn_v1 Tactic: 2171150287007712632 [03/25/2022-13:24:12] [V] [TRT] Tactic: 2171150287007712632 Time: 0.463104 [03/25/2022-13:24:12] [V] [TRT] sections.0.0.conv2.module.weight + QuantizeLinear_39_quantize_scale_node + Conv_43 + Relu_45 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x64_relu_small_nn_v1 Tactic: 2234457234705232274 [03/25/2022-13:24:12] [V] [TRT] Tactic: 2234457234705232274 Time: 
0.433152 [03/25/2022-13:24:12] [V] [TRT] sections.0.0.conv2.module.weight + QuantizeLinear_39_quantize_scale_node + Conv_43 + Relu_45 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x64_relu_medium_nn_v1 Tactic: 5834048089706882838 [03/25/2022-13:24:12] [V] [TRT] Tactic: 5834048089706882838 Time: 0.435456 [03/25/2022-13:24:12] [V] [TRT] sections.0.0.conv2.module.weight + QuantizeLinear_39_quantize_scale_node + Conv_43 + Relu_45 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_small_nn_v1 Tactic: -8626990807754934295 [03/25/2022-13:24:12] [V] [TRT] Tactic: -8626990807754934295 Time: 0.4672 [03/25/2022-13:24:12] [V] [TRT] sections.0.0.conv2.module.weight + QuantizeLinear_39_quantize_scale_node + Conv_43 + Relu_45 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_xregs_small_nn_v1 Tactic: -7303593854972602201 [03/25/2022-13:24:12] [V] [TRT] Tactic: -7303593854972602201 Time: 0.43584 [03/25/2022-13:24:12] [V] [TRT] sections.0.0.conv2.module.weight + QuantizeLinear_39_quantize_scale_node + Conv_43 + Relu_45 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x128_relu_medium_nn_v1 Tactic: -6585664687867083638 [03/25/2022-13:24:13] [V] [TRT] Tactic: -6585664687867083638 Time: 0.831744 [03/25/2022-13:24:13] [V] [TRT] sections.0.0.conv2.module.weight + QuantizeLinear_39_quantize_scale_node + Conv_43 + Relu_45 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x64_relu_xregs_large_nn_v1 Tactic: -3730012925709297561 [03/25/2022-13:24:13] [V] [TRT] Tactic: -3730012925709297561 Time: 0.42944 [03/25/2022-13:24:13] [V] [TRT] sections.0.0.conv2.module.weight + QuantizeLinear_39_quantize_scale_node + Conv_43 + Relu_45 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x128_relu_xregs_large_nn_v1 Tactic: -2277259417488004546 [03/25/2022-13:24:13] [V] [TRT] Tactic: -2277259417488004546 Time: 0.96896 [03/25/2022-13:24:13] [V] [TRT] sections.0.0.conv2.module.weight + QuantizeLinear_39_quantize_scale_node + Conv_43 + Relu_45 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x128_relu_small_nn_v1 Tactic: -683636008127039856 [03/25/2022-13:24:13] [V] [TRT] Tactic: -683636008127039856 Time: 0.830336 [03/25/2022-13:24:13] [V] [TRT] Fastest Tactic: -3730012925709297561 Time: 0.42944 [03/25/2022-13:24:13] [V] [TRT] >>>>>>>>>>>>>>> Chose Runner Type: CaskConvolution Tactic: -3730012925709297561 [03/25/2022-13:24:13] [V] [TRT] *************** Autotuning format combination: Int8(50176,3136:4,56,1) -> Int8(6272,3136:32,56,1) *************** [03/25/2022-13:24:13] [V] [TRT] --------------- Timing Runner: sections.0.0.conv2.module.weight + QuantizeLinear_39_quantize_scale_node + Conv_43 + Relu_45 (CaskConvolution) [03/25/2022-13:24:13] [V] [TRT] sections.0.0.conv2.module.weight + QuantizeLinear_39_quantize_scale_node + Conv_43 + Relu_45 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x64_relu_xregs_large_c32_nn_v1 Tactic: 984309058095623735 [03/25/2022-13:24:13] [V] [TRT] Tactic: 984309058095623735 Time: 0.428928 [03/25/2022-13:24:13] [V] [TRT] sections.0.0.conv2.module.weight + QuantizeLinear_39_quantize_scale_node + Conv_43 + Relu_45 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_small_c32_nn_v1 Tactic: 1100922622480907544 [03/25/2022-13:24:13] [V] [TRT] Tactic: 1100922622480907544 Time: 0.465408 [03/25/2022-13:24:13] [V] [TRT] sections.0.0.conv2.module.weight + QuantizeLinear_39_quantize_scale_node + Conv_43 + Relu_45 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x128_relu_xregs_large_c32_nn_v1 Tactic: 3238312825609165543 [03/25/2022-13:24:13] [V] [TRT] Tactic: 3238312825609165543 Time: 0.962432 
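[annotation] Each autotuning block above follows the same shape: the builder times every candidate tactic for one layer/format pair ("Tactic: <id> Time: <ms>"), then closes the block with a "Fastest Tactic" line and a "Chose Runner Type" line naming the winner. A minimal sketch for tabulating these results offline, assuming the verbose output has been captured to a file with one log entry per line (the function name and invocation are illustrative, not part of trtexec):

    import re
    import sys

    # "Fastest Tactic: <id> Time: <ms>" closes an autotuning block;
    # a plain "Tactic: <id> Time: <ms>" is one per-candidate measurement.
    FASTEST_RE = re.compile(r"Fastest Tactic: (-?\d+) Time: ([0-9.]+)")
    TIMING_RE = re.compile(r"Tactic: (-?\d+) Time: ([0-9.]+)")

    def summarize(log_path):
        candidate_times = {}   # tactic id -> last measured time
        winners = []           # (tactic id, time) per autotuning block
        with open(log_path) as f:
            for line in f:
                m = FASTEST_RE.search(line)
                if m:  # check first: the plain pattern also matches this line
                    winners.append((int(m.group(1)), float(m.group(2))))
                    continue
                m = TIMING_RE.search(line)
                if m:
                    candidate_times[int(m.group(1))] = float(m.group(2))
        print(len(candidate_times), "candidate timings,", len(winners), "blocks")
        for tactic, t in winners:
            print("  chose tactic", tactic, "at", t, "ms")

    if __name__ == "__main__":
        summarize(sys.argv[1])

Note that the "Set Tactic Name: ... Tactic: <id>" lines carry no "Time:" field, so the pattern above skips them and only counts actual measurements. Times are measured at the batch-128 build shape given in the build options, so a winning time is per 128-image batch, not per image.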
[03/25/2022-13:24:13] [V] [TRT] sections.0.0.conv2.module.weight + QuantizeLinear_39_quantize_scale_node + Conv_43 + Relu_45 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x64_relu_medium_c32_nn_v1 Tactic: 3606311198834416176 [03/25/2022-13:24:13] [V] [TRT] Tactic: 3606311198834416176 Time: 0.434432 [03/25/2022-13:24:13] [V] [TRT] sections.0.0.conv2.module.weight + QuantizeLinear_39_quantize_scale_node + Conv_43 + Relu_45 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x128_relu_small_c32_nn_v1 Tactic: 4325765560739862899 [03/25/2022-13:24:13] [V] [TRT] Tactic: 4325765560739862899 Time: 0.828672 [03/25/2022-13:24:13] [V] [TRT] sections.0.0.conv2.module.weight + QuantizeLinear_39_quantize_scale_node + Conv_43 + Relu_45 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x128_relu_medium_c32_nn_v1 Tactic: -4255737803793506479 [03/25/2022-13:24:13] [V] [TRT] Tactic: -4255737803793506479 Time: 0.829312 [03/25/2022-13:24:13] [V] [TRT] sections.0.0.conv2.module.weight + QuantizeLinear_39_quantize_scale_node + Conv_43 + Relu_45 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_xregs_small_c32_nn_v1 Tactic: -3958182351168863467 [03/25/2022-13:24:13] [V] [TRT] Tactic: -3958182351168863467 Time: 0.434816 [03/25/2022-13:24:13] [V] [TRT] sections.0.0.conv2.module.weight + QuantizeLinear_39_quantize_scale_node + Conv_43 + Relu_45 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_xregs_medium_c32_nn_v1 Tactic: -3111968753064955248 [03/25/2022-13:24:13] [V] [TRT] Tactic: -3111968753064955248 Time: 0.467072 [03/25/2022-13:24:13] [V] [TRT] sections.0.0.conv2.module.weight + QuantizeLinear_39_quantize_scale_node + Conv_43 + Relu_45 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_medium_c32_nn_v1 Tactic: -1492575840277333548 [03/25/2022-13:24:13] [V] [TRT] Tactic: -1492575840277333548 Time: 0.469248 [03/25/2022-13:24:13] [V] [TRT] sections.0.0.conv2.module.weight + QuantizeLinear_39_quantize_scale_node + Conv_43 + Relu_45 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x64_relu_small_c32_nn_v1 Tactic: -868495160148524802 [03/25/2022-13:24:13] [V] [TRT] Tactic: -868495160148524802 Time: 0.432384 [03/25/2022-13:24:13] [V] [TRT] Fastest Tactic: 984309058095623735 Time: 0.428928 [03/25/2022-13:24:13] [V] [TRT] >>>>>>>>>>>>>>> Chose Runner Type: CaskConvolution Tactic: 984309058095623735 [03/25/2022-13:24:13] [V] [TRT] *************** Autotuning format combination: Int8(6272,3136:32,56,1) -> Int8(6272,3136:32,56,1) *************** [03/25/2022-13:24:13] [V] [TRT] --------------- Timing Runner: sections.0.0.conv2.module.weight + QuantizeLinear_39_quantize_scale_node + Conv_43 + Relu_45 (CudaGroupConvolution) [03/25/2022-13:24:13] [V] [TRT] CudaGroupConvolution has no valid tactics for this config, skipping [03/25/2022-13:24:13] [V] [TRT] --------------- Timing Runner: sections.0.0.conv2.module.weight + QuantizeLinear_39_quantize_scale_node + Conv_43 + Relu_45 (CudaDepthwiseConvolution) [03/25/2022-13:24:13] [V] [TRT] CudaDepthwiseConvolution has no valid tactics for this config, skipping [03/25/2022-13:24:13] [V] [TRT] --------------- Timing Runner: sections.0.0.conv2.module.weight + QuantizeLinear_39_quantize_scale_node + Conv_43 + Relu_45 (FusedConvActConvolution) [03/25/2022-13:24:13] [V] [TRT] FusedConvActConvolution has no valid tactics for this config, skipping [03/25/2022-13:24:13] [V] [TRT] --------------- Timing Runner: sections.0.0.conv2.module.weight + QuantizeLinear_39_quantize_scale_node + Conv_43 + Relu_45 (CaskConvolution) [03/25/2022-13:24:13] [V] [TRT] 
sections.0.0.conv2.module.weight + QuantizeLinear_39_quantize_scale_node + Conv_43 + Relu_45 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r3s3 Tactic: 184229963126259101 [03/25/2022-13:24:13] [V] [TRT] Tactic: 184229963126259101 Time: 0.19392 [03/25/2022-13:24:13] [V] [TRT] sections.0.0.conv2.module.weight + QuantizeLinear_39_quantize_scale_node + Conv_43 + Relu_45 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage1_warpsize4x2x1_g1_tensor8x8x16_t1r3s3 Tactic: 289888059097454627 [03/25/2022-13:24:13] [V] [TRT] Tactic: 289888059097454627 Time: 0.30336 [03/25/2022-13:24:13] [V] [TRT] sections.0.0.conv2.module.weight + QuantizeLinear_39_quantize_scale_node + Conv_43 + Relu_45 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32 Tactic: 328135613486708155 [03/25/2022-13:24:13] [V] [TRT] Tactic: 328135613486708155 Time: 0.310144 [03/25/2022-13:24:13] [V] [TRT] sections.0.0.conv2.module.weight + QuantizeLinear_39_quantize_scale_node + Conv_43 + Relu_45 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r3s3 Tactic: 680740992583869928 [03/25/2022-13:24:13] [V] [TRT] Tactic: 680740992583869928 Time: 0.267264 [03/25/2022-13:24:13] [V] [TRT] sections.0.0.conv2.module.weight + QuantizeLinear_39_quantize_scale_node + Conv_43 + Relu_45 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x192x64_stage3_warpsize4x2x1_g1_tensor16x8x32 Tactic: 1111159740952609683 [03/25/2022-13:24:13] [V] [TRT] Tactic: 1111159740952609683 Time: 0.318976 [03/25/2022-13:24:13] [V] [TRT] sections.0.0.conv2.module.weight + QuantizeLinear_39_quantize_scale_node + Conv_43 + Relu_45 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32 Tactic: 1134860903395928905 [03/25/2022-13:24:13] [V] [TRT] Tactic: 1134860903395928905 Time: 0.168192 [03/25/2022-13:24:13] [V] [TRT] sections.0.0.conv2.module.weight + QuantizeLinear_39_quantize_scale_node + Conv_43 + Relu_45 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32 Tactic: 1276591930377039442 [03/25/2022-13:24:13] [V] [TRT] Tactic: 1276591930377039442 Time: 0.205184 [03/25/2022-13:24:13] [V] [TRT] sections.0.0.conv2.module.weight + QuantizeLinear_39_quantize_scale_node + Conv_43 + Relu_45 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage1_warpsize2x1x1_g1_tensor8x8x16_t1r3s3 Tactic: 1388866374720163187 [03/25/2022-13:24:13] [V] [TRT] Tactic: 1388866374720163187 Time: 0.18752 [03/25/2022-13:24:13] [V] [TRT] sections.0.0.conv2.module.weight + QuantizeLinear_39_quantize_scale_node + Conv_43 + Relu_45 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x96x64_stage5_warpsize2x1x1_g1_tensor16x8x32 Tactic: 1399501420456320585 [03/25/2022-13:24:13] [V] [TRT] 
Tactic: 1399501420456320585 Time: 0.208 [03/25/2022-13:24:13] [V] [TRT] sections.0.0.conv2.module.weight + QuantizeLinear_39_quantize_scale_node + Conv_43 + Relu_45 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x96x64_stage5_warpsize2x1x1_g1_tensor16x8x32_t1r3s3 Tactic: 1853122447892949466 [03/25/2022-13:24:13] [V] [TRT] Tactic: 1853122447892949466 Time: 0.204416 [03/25/2022-13:24:13] [V] [TRT] sections.0.0.conv2.module.weight + QuantizeLinear_39_quantize_scale_node + Conv_43 + Relu_45 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage1_warpsize2x2x1_g1_tensor8x8x16 Tactic: 2133329569091732311 [03/25/2022-13:24:13] [V] [TRT] Tactic: 2133329569091732311 Time: 0.27008 [03/25/2022-13:24:13] [V] [TRT] sections.0.0.conv2.module.weight + QuantizeLinear_39_quantize_scale_node + Conv_43 + Relu_45 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage4_warpsize4x1x1_g1_tensor16x8x32 Tactic: 2325023763229477890 [03/25/2022-13:24:13] [V] [TRT] Tactic: 2325023763229477890 Time: 0.09984 [03/25/2022-13:24:13] [V] [TRT] sections.0.0.conv2.module.weight + QuantizeLinear_39_quantize_scale_node + Conv_43 + Relu_45 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16 Tactic: 2579824863892891529 [03/25/2022-13:24:13] [V] [TRT] Tactic: 2579824863892891529 Time: 0.212224 [03/25/2022-13:24:13] [V] [TRT] sections.0.0.conv2.module.weight + QuantizeLinear_39_quantize_scale_node + Conv_43 + Relu_45 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x64x64_stage3_warpsize4x1x1_g1_tensor16x8x32 Tactic: 2783960536172159663 [03/25/2022-13:24:13] [V] [TRT] Tactic: 2783960536172159663 Time: 0.104576 [03/25/2022-13:24:13] [V] [TRT] sections.0.0.conv2.module.weight + QuantizeLinear_39_quantize_scale_node + Conv_43 + Relu_45 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x2x1_g1_tensor16x8x32 Tactic: 2821711838552913693 [03/25/2022-13:24:13] [V] [TRT] Tactic: 2821711838552913693 Time: 0.18176 [03/25/2022-13:24:13] [V] [TRT] sections.0.0.conv2.module.weight + QuantizeLinear_39_quantize_scale_node + Conv_43 + Relu_45 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x128_stage3_warpsize1x2x1_g1_sptensor16x8x64 Tactic: 2945009978756227538 [03/25/2022-13:24:13] [V] [TRT] Tactic: 2945009978756227538 Time: 0.132992 [03/25/2022-13:24:13] [V] [TRT] sections.0.0.conv2.module.weight + QuantizeLinear_39_quantize_scale_node + Conv_43 + Relu_45 Set Tactic Name: ampere_int8_i8816cudnn_int8_128x128_ldg16_relu_medium_nt_v1 Tactic: 2985940154541537814 [03/25/2022-13:24:13] [V] [TRT] Tactic: 2985940154541537814 Time: 0.270592 [03/25/2022-13:24:13] [V] [TRT] sections.0.0.conv2.module.weight + QuantizeLinear_39_quantize_scale_node + Conv_43 + Relu_45 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_t1r3s3 Tactic: 3401614690060226673 [03/25/2022-13:24:13] [V] [TRT] Tactic: 3401614690060226673 
Time: 0.17856 [03/25/2022-13:24:13] [V] [TRT] sections.0.0.conv2.module.weight + QuantizeLinear_39_quantize_scale_node + Conv_43 + Relu_45 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32 Tactic: 3456719996792527006 [03/25/2022-13:24:13] [V] [TRT] Tactic: 3456719996792527006 Time: 0.169344 [03/25/2022-13:24:13] [V] [TRT] sections.0.0.conv2.module.weight + QuantizeLinear_39_quantize_scale_node + Conv_43 + Relu_45 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x128_stage3_warpsize4x2x1_g1_sptensor16x8x64 Tactic: 3651043333819148268 [03/25/2022-13:24:13] [V] [TRT] Tactic: 3651043333819148268 Time: 0.351104 [03/25/2022-13:24:13] [V] [TRT] sections.0.0.conv2.module.weight + QuantizeLinear_39_quantize_scale_node + Conv_43 + Relu_45 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x128_ldg16_relu_medium_nt_v1 Tactic: 3899284354987683408 [03/25/2022-13:24:13] [V] [TRT] Tactic: 3899284354987683408 Time: 0.313984 [03/25/2022-13:24:13] [V] [TRT] sections.0.0.conv2.module.weight + QuantizeLinear_39_quantize_scale_node + Conv_43 + Relu_45 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage6_warpsize2x2x1_g1_tensor16x8x32 Tactic: 4042202769383439184 [03/25/2022-13:24:13] [V] [TRT] Tactic: 4042202769383439184 Time: 0.209152 [03/25/2022-13:24:13] [V] [TRT] sections.0.0.conv2.module.weight + QuantizeLinear_39_quantize_scale_node + Conv_43 + Relu_45 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_large_nt_v1 Tactic: 4182625619810185112 [03/25/2022-13:24:13] [V] [TRT] Tactic: 4182625619810185112 Time: 0.156032 [03/25/2022-13:24:13] [V] [TRT] sections.0.0.conv2.module.weight + QuantizeLinear_39_quantize_scale_node + Conv_43 + Relu_45 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r3s3 Tactic: 4214794893922618058 [03/25/2022-13:24:13] [V] [TRT] Tactic: 4214794893922618058 Time: 0.267776 [03/25/2022-13:24:13] [V] [TRT] sections.0.0.conv2.module.weight + QuantizeLinear_39_quantize_scale_node + Conv_43 + Relu_45 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r3s3 Tactic: 4384868749799132354 [03/25/2022-13:24:13] [V] [TRT] Tactic: 4384868749799132354 Time: 0.210048 [03/25/2022-13:24:13] [V] [TRT] sections.0.0.conv2.module.weight + QuantizeLinear_39_quantize_scale_node + Conv_43 + Relu_45 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x128_stage3_warpsize2x2x1_g1_sptensor16x8x64 Tactic: 4414594337986714263 [03/25/2022-13:24:13] [V] [TRT] Tactic: 4414594337986714263 Time: 0.170624 [03/25/2022-13:24:13] [V] [TRT] sections.0.0.conv2.module.weight + QuantizeLinear_39_quantize_scale_node + Conv_43 + Relu_45 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_small_nt_v1 Tactic: 4717285412741024953 [03/25/2022-13:24:13] [V] [TRT] Tactic: 4717285412741024953 Time: 0.1472 [03/25/2022-13:24:13] [V] [TRT] sections.0.0.conv2.module.weight + QuantizeLinear_39_quantize_scale_node + Conv_43 + Relu_45 Set Tactic Name: 
sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage4_warpsize2x4x1_g1_tensor16x8x32 Tactic: 4734519122557206480 [03/25/2022-13:24:13] [V] [TRT] Tactic: 4734519122557206480 Time: 0.38592 [03/25/2022-13:24:13] [V] [TRT] sections.0.0.conv2.module.weight + QuantizeLinear_39_quantize_scale_node + Conv_43 + Relu_45 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32 Tactic: 4922297020351187339 [03/25/2022-13:24:13] [V] [TRT] Tactic: 4922297020351187339 Time: 0.272256 [03/25/2022-13:24:13] [V] [TRT] sections.0.0.conv2.module.weight + QuantizeLinear_39_quantize_scale_node + Conv_43 + Relu_45 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage1_warpsize2x4x1_g1_tensor8x8x16_t1r3s3 Tactic: 4931167631624420067 [03/25/2022-13:24:13] [V] [TRT] Tactic: 4931167631624420067 Time: 0.583936 [03/25/2022-13:24:13] [V] [TRT] sections.0.0.conv2.module.weight + QuantizeLinear_39_quantize_scale_node + Conv_43 + Relu_45 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_t1r3s3 Tactic: 5136656982162849059 [03/25/2022-13:24:13] [V] [TRT] Tactic: 5136656982162849059 Time: 0.225536 [03/25/2022-13:24:13] [V] [TRT] sections.0.0.conv2.module.weight + QuantizeLinear_39_quantize_scale_node + Conv_43 + Relu_45 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32_t1r3s3 Tactic: 5189825015507701541 [03/25/2022-13:24:13] [V] [TRT] Tactic: 5189825015507701541 Time: 0.321792 [03/25/2022-13:24:13] [V] [TRT] sections.0.0.conv2.module.weight + QuantizeLinear_39_quantize_scale_node + Conv_43 + Relu_45 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x32x64_stage4_warpsize2x1x1_g1_tensor16x8x32 Tactic: 5424417905073460656 [03/25/2022-13:24:13] [V] [TRT] Tactic: 5424417905073460656 Time: 0.153344 [03/25/2022-13:24:13] [V] [TRT] sections.0.0.conv2.module.weight + QuantizeLinear_39_quantize_scale_node + Conv_43 + Relu_45 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x256x64_stage3_warpsize2x4x1_g1_tensor16x8x32 Tactic: 5442043907221427810 [03/25/2022-13:24:13] [V] [TRT] Tactic: 5442043907221427810 Time: 0.389632 [03/25/2022-13:24:13] [V] [TRT] sections.0.0.conv2.module.weight + QuantizeLinear_39_quantize_scale_node + Conv_43 + Relu_45 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x256x64_stage3_warpsize2x4x1_g1_tensor16x8x32_t1r3s3 Tactic: 5544365258913999384 [03/25/2022-13:24:13] [V] [TRT] Tactic: 5544365258913999384 Time: 0.373888 [03/25/2022-13:24:13] [V] [TRT] sections.0.0.conv2.module.weight + QuantizeLinear_39_quantize_scale_node + Conv_43 + Relu_45 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage1_warpsize2x1x1_g1_tensor8x8x16_t1r3s3 Tactic: 5641967928706599451 [03/25/2022-13:24:13] [V] [TRT] Tactic: 5641967928706599451 Time: 0.242816 [03/25/2022-13:24:13] [V] [TRT] 
sections.0.0.conv2.module.weight + QuantizeLinear_39_quantize_scale_node + Conv_43 + Relu_45 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x2x1_g1_tensor16x8x32_t1r3s3 Tactic: 5721595115357140131 [03/25/2022-13:24:13] [V] [TRT] Tactic: 5721595115357140131 Time: 0.183168 [03/25/2022-13:24:13] [V] [TRT] sections.0.0.conv2.module.weight + QuantizeLinear_39_quantize_scale_node + Conv_43 + Relu_45 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x32x64_stage4_warpsize4x1x1_g1_tensor16x8x32_t1r3s3 Tactic: 6004789655466615912 [03/25/2022-13:24:13] [V] [TRT] Tactic: 6004789655466615912 Time: 0.127872 [03/25/2022-13:24:13] [V] [TRT] sections.0.0.conv2.module.weight + QuantizeLinear_39_quantize_scale_node + Conv_43 + Relu_45 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage4_warpsize2x4x1_g1_tensor16x8x32_t1r3s3 Tactic: 6146901278630392829 [03/25/2022-13:24:13] [V] [TRT] Tactic: 6146901278630392829 Time: 0.37696 [03/25/2022-13:24:13] [V] [TRT] sections.0.0.conv2.module.weight + QuantizeLinear_39_quantize_scale_node + Conv_43 + Relu_45 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage1_warpsize2x1x1_g1_tensor8x8x16 Tactic: 6394572396369862482 [03/25/2022-13:24:13] [V] [TRT] Tactic: 6394572396369862482 Time: 0.249088 [03/25/2022-13:24:13] [V] [TRT] sections.0.0.conv2.module.weight + QuantizeLinear_39_quantize_scale_node + Conv_43 + Relu_45 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32 Tactic: 6781129591847482048 [03/25/2022-13:24:13] [V] [TRT] Tactic: 6781129591847482048 Time: 0.121088 [03/25/2022-13:24:13] [V] [TRT] sections.0.0.conv2.module.weight + QuantizeLinear_39_quantize_scale_node + Conv_43 + Relu_45 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x96x64_stage4_warpsize4x2x1_g1_tensor16x8x32_t1r3s3 Tactic: 6984451771200230840 [03/25/2022-13:24:13] [V] [TRT] Tactic: 6984451771200230840 Time: 0.204032 [03/25/2022-13:24:13] [V] [TRT] sections.0.0.conv2.module.weight + QuantizeLinear_39_quantize_scale_node + Conv_43 + Relu_45 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r3s3 Tactic: 7048234086361926570 [03/25/2022-13:24:13] [V] [TRT] Tactic: 7048234086361926570 Time: 0.166144 [03/25/2022-13:24:13] [V] [TRT] sections.0.0.conv2.module.weight + QuantizeLinear_39_quantize_scale_node + Conv_43 + Relu_45 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x128x64_stage3_warpsize4x2x1_g1_tensor16x8x32 Tactic: 7077570591813340966 [03/25/2022-13:24:13] [V] [TRT] Tactic: 7077570591813340966 Time: 0.20608 [03/25/2022-13:24:13] [V] [TRT] sections.0.0.conv2.module.weight + QuantizeLinear_39_quantize_scale_node + Conv_43 + Relu_45 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x4x1_g1_tensor16x8x32_t1r3s3 Tactic: 7429976449747682901 
[03/25/2022-13:24:13] [V] [TRT] Tactic: 7429976449747682901 Time: 0.196096 [03/25/2022-13:24:13] [V] [TRT] sections.0.0.conv2.module.weight + QuantizeLinear_39_quantize_scale_node + Conv_43 + Relu_45 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r3s3 Tactic: 8096257414008860171 [03/25/2022-13:24:13] [V] [TRT] Tactic: 8096257414008860171 Time: 0.111872 [03/25/2022-13:24:13] [V] [TRT] sections.0.0.conv2.module.weight + QuantizeLinear_39_quantize_scale_node + Conv_43 + Relu_45 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x64x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r3s3 Tactic: 8128112048355596715 [03/25/2022-13:24:13] [V] [TRT] Tactic: 8128112048355596715 Time: 0.1088 [03/25/2022-13:24:13] [V] [TRT] sections.0.0.conv2.module.weight + QuantizeLinear_39_quantize_scale_node + Conv_43 + Relu_45 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_indexed_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32 Tactic: 9064458886956700976 [03/25/2022-13:24:13] [V] [TRT] Tactic: 9064458886956700976 Time: 0.181504 [03/25/2022-13:24:13] [V] [TRT] sections.0.0.conv2.module.weight + QuantizeLinear_39_quantize_scale_node + Conv_43 + Relu_45 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage4_warpsize4x2x1_g1_tensor16x8x32 Tactic: -9165697322068360861 [03/25/2022-13:24:13] [V] [TRT] Tactic: -9165697322068360861 Time: 0.207744 [03/25/2022-13:24:13] [V] [TRT] sections.0.0.conv2.module.weight + QuantizeLinear_39_quantize_scale_node + Conv_43 + Relu_45 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_singleBuffer_small_nt_v1 Tactic: -9118785798277698619 [03/25/2022-13:24:13] [V] [TRT] Tactic: -9118785798277698619 Time: 0.14272 [03/25/2022-13:24:13] [V] [TRT] sections.0.0.conv2.module.weight + QuantizeLinear_39_quantize_scale_node + Conv_43 + Relu_45 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16 Tactic: -9108166971364503411 [03/25/2022-13:24:13] [V] [TRT] Tactic: -9108166971364503411 Time: 0.17024 [03/25/2022-13:24:13] [V] [TRT] sections.0.0.conv2.module.weight + QuantizeLinear_39_quantize_scale_node + Conv_43 + Relu_45 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage1_warpsize4x2x1_g1_tensor8x8x16 Tactic: -8861822316054763526 [03/25/2022-13:24:13] [V] [TRT] Tactic: -8861822316054763526 Time: 0.309632 [03/25/2022-13:24:13] [V] [TRT] sections.0.0.conv2.module.weight + QuantizeLinear_39_quantize_scale_node + Conv_43 + Relu_45 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x128_stage3_warpsize1x1x1_g1_sptensor16x8x64 Tactic: -8791277710877987710 [03/25/2022-13:24:13] [V] [TRT] Tactic: -8791277710877987710 Time: 0.225664 [03/25/2022-13:24:13] [V] [TRT] sections.0.0.conv2.module.weight + QuantizeLinear_39_quantize_scale_node + Conv_43 + Relu_45 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x128x64_stage3_warpsize2x2x1_g1_tensor16x8x32 Tactic: 
-8691377209893505057 [03/25/2022-13:24:13] [V] [TRT] Tactic: -8691377209893505057 Time: 0.169216 [03/25/2022-13:24:13] [V] [TRT] sections.0.0.conv2.module.weight + QuantizeLinear_39_quantize_scale_node + Conv_43 + Relu_45 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32 Tactic: -8520292213102999339 [03/25/2022-13:24:13] [V] [TRT] Tactic: -8520292213102999339 Time: 0.303872 [03/25/2022-13:24:13] [V] [TRT] sections.0.0.conv2.module.weight + QuantizeLinear_39_quantize_scale_node + Conv_43 + Relu_45 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage1_warpsize4x1x1_g1_tensor8x8x16_t1r3s3 Tactic: -8475551154769412306 [03/25/2022-13:24:13] [V] [TRT] Tactic: -8475551154769412306 Time: 0.14656 [03/25/2022-13:24:13] [V] [TRT] sections.0.0.conv2.module.weight + QuantizeLinear_39_quantize_scale_node + Conv_43 + Relu_45 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x192x64_stage3_warpsize4x2x1_g1_tensor16x8x32_t1r3s3 Tactic: -8417388128970254446 [03/25/2022-13:24:13] [V] [TRT] Tactic: -8417388128970254446 Time: 0.306432 [03/25/2022-13:24:13] [V] [TRT] sections.0.0.conv2.module.weight + QuantizeLinear_39_quantize_scale_node + Conv_43 + Relu_45 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32 Tactic: -8263994888336646547 [03/25/2022-13:24:13] [V] [TRT] Tactic: -8263994888336646547 Time: 0.165632 [03/25/2022-13:24:13] [V] [TRT] sections.0.0.conv2.module.weight + QuantizeLinear_39_quantize_scale_node + Conv_43 + Relu_45 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32 Tactic: -8205948405243401049 [03/25/2022-13:24:13] [V] [TRT] Tactic: -8205948405243401049 Time: 0.200448 [03/25/2022-13:24:13] [V] [TRT] sections.0.0.conv2.module.weight + QuantizeLinear_39_quantize_scale_node + Conv_43 + Relu_45 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x32x64_stage4_warpsize2x1x1_g1_tensor16x8x32_t1r3s3 Tactic: -7898477046581738867 [03/25/2022-13:24:13] [V] [TRT] Tactic: -7898477046581738867 Time: 0.136448 [03/25/2022-13:24:13] [V] [TRT] sections.0.0.conv2.module.weight + QuantizeLinear_39_quantize_scale_node + Conv_43 + Relu_45 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32 Tactic: -7683887278997527517 [03/25/2022-13:24:13] [V] [TRT] Tactic: -7683887278997527517 Time: 0.179584 [03/25/2022-13:24:13] [V] [TRT] sections.0.0.conv2.module.weight + QuantizeLinear_39_quantize_scale_node + Conv_43 + Relu_45 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x64x64_stage3_warpsize2x2x1_g1_tensor16x8x32 Tactic: -7381370635708568663 [03/25/2022-13:24:13] [V] [TRT] Tactic: -7381370635708568663 Time: 0.114432 [03/25/2022-13:24:13] [V] [TRT] sections.0.0.conv2.module.weight + QuantizeLinear_39_quantize_scale_node + Conv_43 + Relu_45 Set Tactic Name: 
sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r3s3 Tactic: -7129320389887881029 [03/25/2022-13:24:13] [V] [TRT] Tactic: -7129320389887881029 Time: 0.158976 [03/25/2022-13:24:13] [V] [TRT] sections.0.0.conv2.module.weight + QuantizeLinear_39_quantize_scale_node + Conv_43 + Relu_45 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r3s3 Tactic: -6959995514028471820 [03/25/2022-13:24:13] [V] [TRT] Tactic: -6959995514028471820 Time: 0.265856 [03/25/2022-13:24:13] [V] [TRT] sections.0.0.conv2.module.weight + QuantizeLinear_39_quantize_scale_node + Conv_43 + Relu_45 Set Tactic Name: ampere_int8_i8816cudnn_int8_128x128_ldg16_relu_small_nt_v1 Tactic: -6400348606759295499 [03/25/2022-13:24:13] [V] [TRT] Tactic: -6400348606759295499 Time: 0.26176 [03/25/2022-13:24:13] [V] [TRT] sections.0.0.conv2.module.weight + QuantizeLinear_39_quantize_scale_node + Conv_43 + Relu_45 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r3s3 Tactic: -6371781333659293809 [03/25/2022-13:24:13] [V] [TRT] Tactic: -6371781333659293809 Time: 0.168192 [03/25/2022-13:24:13] [V] [TRT] sections.0.0.conv2.module.weight + QuantizeLinear_39_quantize_scale_node + Conv_43 + Relu_45 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x96x64_stage4_warpsize4x2x1_g1_tensor16x8x32 Tactic: -6256128573036943404 [03/25/2022-13:24:13] [V] [TRT] Tactic: -6256128573036943404 Time: 0.208256 [03/25/2022-13:24:13] [V] [TRT] sections.0.0.conv2.module.weight + QuantizeLinear_39_quantize_scale_node + Conv_43 + Relu_45 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x128_ldg16_relu_small_nt_v1 Tactic: -5980889159865208399 [03/25/2022-13:24:13] [V] [TRT] Tactic: -5980889159865208399 Time: 0.309504 [03/25/2022-13:24:13] [V] [TRT] sections.0.0.conv2.module.weight + QuantizeLinear_39_quantize_scale_node + Conv_43 + Relu_45 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_singleBuffer_medium_nt_v1 Tactic: -5766140806760372989 [03/25/2022-13:24:13] [V] [TRT] Tactic: -5766140806760372989 Time: 0.148736 [03/25/2022-13:24:13] [V] [TRT] sections.0.0.conv2.module.weight + QuantizeLinear_39_quantize_scale_node + Conv_43 + Relu_45 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage1_warpsize4x1x1_g1_tensor8x8x16 Tactic: -5180570335464125033 [03/25/2022-13:24:13] [V] [TRT] Tactic: -5180570335464125033 Time: 0.154368 [03/25/2022-13:24:13] [V] [TRT] sections.0.0.conv2.module.weight + QuantizeLinear_39_quantize_scale_node + Conv_43 + Relu_45 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r3s3 Tactic: -4933563390723451692 [03/25/2022-13:24:13] [V] [TRT] Tactic: -4933563390723451692 Time: 0.132224 [03/25/2022-13:24:13] [V] [TRT] sections.0.0.conv2.module.weight + QuantizeLinear_39_quantize_scale_node + Conv_43 + Relu_45 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_medium_nt_v1 Tactic: -4516822589357530549 [03/25/2022-13:24:13] [V] [TRT] Tactic: -4516822589357530549 Time: 0.152832 
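[annotation] The candidate set for this layer mixes dense kernels (sm80_xmma_fprop_implicit_gemm_...) with 2:4 structured-sparse ones (sm80_xmma_fprop_sparse_conv_..., sptensor16x8x64). Because the build ran with --sparsity=enable, both populations compete in the same timing run, and a sparse kernel only wins a layer when it actually measures faster; above, for example, the dense tactic 2325023763229477890 came in at 0.09984 against 0.132992 for the sparse tactic 2945009978756227538. A minimal sketch of the builder-API equivalent of this trtexec invocation, against the TensorRT 8.x Python bindings (the function name is illustrative; the Q/DQ nodes in the quantized ONNX supply the INT8 scales, so no calibrator is attached):

    import tensorrt as trt

    def build_serialized(onnx_path):
        logger = trt.Logger(trt.Logger.VERBOSE)  # mirrors --verbose
        builder = trt.Builder(logger)
        network = builder.create_network(
            1 << int(trt.NetworkDefinitionCreationFlag.EXPLICIT_BATCH))
        parser = trt.OnnxParser(network, logger)
        with open(onnx_path, "rb") as f:
            if not parser.parse(f.read()):
                raise RuntimeError(str(parser.get_error(0)))

        config = builder.create_builder_config()
        config.set_flag(trt.BuilderFlag.INT8)            # --int8
        config.set_flag(trt.BuilderFlag.SPARSE_WEIGHTS)  # --sparsity=enable
        # For a dynamic-shape model, an optimization profile matching
        # --shapes=input:128x3x224x224 would also be added here.
        return builder.build_serialized_network(network, config)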
[03/25/2022-13:24:13] [V] [TRT] sections.0.0.conv2.module.weight + QuantizeLinear_39_quantize_scale_node + Conv_43 + Relu_45 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x16x128_stage3_warpsize1x1x1_g1_sptensor16x8x64 Tactic: -4232916483289779353 [03/25/2022-13:24:13] [V] [TRT] Tactic: -4232916483289779353 Time: 0.415872 [03/25/2022-13:24:13] [V] [TRT] sections.0.0.conv2.module.weight + QuantizeLinear_39_quantize_scale_node + Conv_43 + Relu_45 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x128_stage3_warpsize1x4x1_g1_sptensor16x8x64 Tactic: -3460842194336717186 [03/25/2022-13:24:13] [V] [TRT] Tactic: -3460842194336717186 Time: 0.1152 [03/25/2022-13:24:13] [V] [TRT] sections.0.0.conv2.module.weight + QuantizeLinear_39_quantize_scale_node + Conv_43 + Relu_45 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32_t1r3s3 Tactic: -3280888557222886418 [03/25/2022-13:24:13] [V] [TRT] Tactic: -3280888557222886418 Time: 0.14592 [03/25/2022-13:24:13] [V] [TRT] sections.0.0.conv2.module.weight + QuantizeLinear_39_quantize_scale_node + Conv_43 + Relu_45 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r3s3 Tactic: -3238475748440751107 [03/25/2022-13:24:13] [V] [TRT] Tactic: -3238475748440751107 Time: 0.194048 [03/25/2022-13:24:13] [V] [TRT] sections.0.0.conv2.module.weight + QuantizeLinear_39_quantize_scale_node + Conv_43 + Relu_45 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage4_warpsize4x1x1_g1_tensor16x8x32_t1r3s3 Tactic: -3182884991006484042 [03/25/2022-13:24:13] [V] [TRT] Tactic: -3182884991006484042 Time: 0.095872 [03/25/2022-13:24:13] [V] [TRT] sections.0.0.conv2.module.weight + QuantizeLinear_39_quantize_scale_node + Conv_43 + Relu_45 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r3s3 Tactic: -3173468756112541306 [03/25/2022-13:24:13] [V] [TRT] Tactic: -3173468756112541306 Time: 0.192256 [03/25/2022-13:24:13] [V] [TRT] sections.0.0.conv2.module.weight + QuantizeLinear_39_quantize_scale_node + Conv_43 + Relu_45 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x128_ldg16_relu_large_nt_v1 Tactic: -2917455979290586480 [03/25/2022-13:24:13] [V] [TRT] Tactic: -2917455979290586480 Time: 0.314624 [03/25/2022-13:24:13] [V] [TRT] sections.0.0.conv2.module.weight + QuantizeLinear_39_quantize_scale_node + Conv_43 + Relu_45 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x128x64_stage3_warpsize4x2x1_g1_tensor16x8x32_t1r3s3 Tactic: -2741641298163591508 [03/25/2022-13:24:13] [V] [TRT] Tactic: -2741641298163591508 Time: 0.20032 [03/25/2022-13:24:13] [V] [TRT] sections.0.0.conv2.module.weight + QuantizeLinear_39_quantize_scale_node + Conv_43 + Relu_45 Set Tactic Name: ampere_int8_i8816cudnn_int8_128x128_ldg16_relu_large_nt_v1 Tactic: -2571022005763160364 [03/25/2022-13:24:13] [V] [TRT] Tactic: -2571022005763160364 Time: 0.280576 [03/25/2022-13:24:13] [V] [TRT] sections.0.0.conv2.module.weight + 
QuantizeLinear_39_quantize_scale_node + Conv_43 + Relu_45 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage1_warpsize2x2x1_g1_tensor8x8x16 Tactic: -2499089240293650188 [03/25/2022-13:24:13] [V] [TRT] Tactic: -2499089240293650188 Time: 0.26752 [03/25/2022-13:24:13] [V] [TRT] sections.0.0.conv2.module.weight + QuantizeLinear_39_quantize_scale_node + Conv_43 + Relu_45 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16 Tactic: -2328318099174473157 [03/25/2022-13:24:13] [V] [TRT] Tactic: -2328318099174473157 Time: 0.173312 [03/25/2022-13:24:13] [V] [TRT] sections.0.0.conv2.module.weight + QuantizeLinear_39_quantize_scale_node + Conv_43 + Relu_45 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x4x1_g1_tensor16x8x32 Tactic: -2054375205435666404 [03/25/2022-13:24:13] [V] [TRT] Tactic: -2054375205435666404 Time: 0.204032 [03/25/2022-13:24:13] [V] [TRT] sections.0.0.conv2.module.weight + QuantizeLinear_39_quantize_scale_node + Conv_43 + Relu_45 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32_t1r3s3 Tactic: -1546787387293556842 [03/25/2022-13:24:13] [V] [TRT] Tactic: -1546787387293556842 Time: 0.158848 [03/25/2022-13:24:13] [V] [TRT] sections.0.0.conv2.module.weight + QuantizeLinear_39_quantize_scale_node + Conv_43 + Relu_45 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x32x64_stage4_warpsize4x1x1_g1_tensor16x8x32 Tactic: -1498626619443284096 [03/25/2022-13:24:13] [V] [TRT] Tactic: -1498626619443284096 Time: 0.137088 [03/25/2022-13:24:13] [V] [TRT] sections.0.0.conv2.module.weight + QuantizeLinear_39_quantize_scale_node + Conv_43 + Relu_45 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r3s3 Tactic: -1471245223605064669 [03/25/2022-13:24:13] [V] [TRT] Tactic: -1471245223605064669 Time: 0.27776 [03/25/2022-13:24:13] [V] [TRT] sections.0.0.conv2.module.weight + QuantizeLinear_39_quantize_scale_node + Conv_43 + Relu_45 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32 Tactic: -1283580231568512025 [03/25/2022-13:24:13] [V] [TRT] Tactic: -1283580231568512025 Time: 0.220288 [03/25/2022-13:24:13] [V] [TRT] sections.0.0.conv2.module.weight + QuantizeLinear_39_quantize_scale_node + Conv_43 + Relu_45 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x64x64_stage3_warpsize4x1x1_g1_tensor16x8x32_t1r3s3 Tactic: -1224421172675151280 [03/25/2022-13:24:13] [V] [TRT] Tactic: -1224421172675151280 Time: 0.097536 [03/25/2022-13:24:13] [V] [TRT] sections.0.0.conv2.module.weight + QuantizeLinear_39_quantize_scale_node + Conv_43 + Relu_45 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x128x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r3s3 Tactic: -921247911551089037 [03/25/2022-13:24:13] [V] [TRT] 
Tactic: -921247911551089037 Time: 0.16192 [03/25/2022-13:24:13] [V] [TRT] sections.0.0.conv2.module.weight + QuantizeLinear_39_quantize_scale_node + Conv_43 + Relu_45 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32 Tactic: -762222380308749469 [03/25/2022-13:24:13] [V] [TRT] Tactic: -762222380308749469 Time: 0.138496 [03/25/2022-13:24:13] [V] [TRT] sections.0.0.conv2.module.weight + QuantizeLinear_39_quantize_scale_node + Conv_43 + Relu_45 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage4_warpsize4x2x1_g1_tensor16x8x32_t1r3s3 Tactic: -516725800067794372 [03/25/2022-13:24:13] [V] [TRT] Tactic: -516725800067794372 Time: 0.199552 [03/25/2022-13:24:13] [V] [TRT] sections.0.0.conv2.module.weight + QuantizeLinear_39_quantize_scale_node + Conv_43 + Relu_45 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_singleBuffer_large_nt_v1 Tactic: -428104331444385564 [03/25/2022-13:24:13] [V] [TRT] Tactic: -428104331444385564 Time: 0.150784 [03/25/2022-13:24:13] [V] [TRT] sections.0.0.conv2.module.weight + QuantizeLinear_39_quantize_scale_node + Conv_43 + Relu_45 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage1_warpsize2x1x1_g1_tensor8x8x16 Tactic: -366411318217594794 [03/25/2022-13:24:13] [V] [TRT] Tactic: -366411318217594794 Time: 0.190976 [03/25/2022-13:24:13] [V] [TRT] sections.0.0.conv2.module.weight + QuantizeLinear_39_quantize_scale_node + Conv_43 + Relu_45 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage1_warpsize2x4x1_g1_tensor8x8x16 Tactic: -351548418071036983 [03/25/2022-13:24:13] [V] [TRT] Tactic: -351548418071036983 Time: 0.593792 [03/25/2022-13:24:13] [V] [TRT] Fastest Tactic: -3182884991006484042 Time: 0.095872 [03/25/2022-13:24:13] [V] [TRT] >>>>>>>>>>>>>>> Chose Runner Type: CaskConvolution Tactic: -3182884991006484042 [03/25/2022-13:24:13] [V] [TRT] =============== Computing costs for [03/25/2022-13:24:13] [V] [TRT] *************** Autotuning format combination: Int8(50176,3136:4,56,1), Int8(200704,3136:4,56,1) -> Int8(200704,3136:4,56,1) *************** [03/25/2022-13:24:13] [V] [TRT] --------------- Timing Runner: sections.0.0.conv3.module.weight + QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81 (CudaDepthwiseConvolution) [03/25/2022-13:24:13] [V] [TRT] CudaDepthwiseConvolution has no valid tactics for this config, skipping [03/25/2022-13:24:14] [V] [TRT] --------------- Timing Runner: sections.0.0.conv3.module.weight + QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81 (FusedConvActConvolution) [03/25/2022-13:24:14] [V] [TRT] FusedConvActConvolution has no valid tactics for this config, skipping [03/25/2022-13:24:14] [V] [TRT] --------------- Timing Runner: sections.0.0.conv3.module.weight + QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81 (CaskConvolution) [03/25/2022-13:24:14] [V] [TRT] sections.0.0.conv3.module.weight + QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_medium_nn_v1 Tactic: 175853789719975416 [03/25/2022-13:24:14] [V] [TRT] Tactic: 175853789719975416 Time: 0.447616 [03/25/2022-13:24:14] [V] [TRT] sections.0.0.conv3.module.weight + 
QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_xregs_medium_nn_v1 Tactic: 2171150287007712632 [03/25/2022-13:24:14] [V] [TRT] Tactic: 2171150287007712632 Time: 0.523264 [03/25/2022-13:24:14] [V] [TRT] sections.0.0.conv3.module.weight + QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x64_relu_small_nn_v1 Tactic: 2234457234705232274 [03/25/2022-13:24:14] [V] [TRT] Tactic: 2234457234705232274 Time: 0.368 [03/25/2022-13:24:14] [V] [TRT] sections.0.0.conv3.module.weight + QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x64_relu_medium_nn_v1 Tactic: 5834048089706882838 [03/25/2022-13:24:14] [V] [TRT] Tactic: 5834048089706882838 Time: 0.374272 [03/25/2022-13:24:14] [V] [TRT] sections.0.0.conv3.module.weight + QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x128_relu_interior_nn_v1 Tactic: 6299962968199310600 [03/25/2022-13:24:14] [V] [TRT] Tactic: 6299962968199310600 Time: 0.38848 [03/25/2022-13:24:14] [V] [TRT] sections.0.0.conv3.module.weight + QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_xregs_interior_nn_v1 Tactic: 6341572697076960911 [03/25/2022-13:24:14] [V] [TRT] Tactic: 6341572697076960911 Time: 0.50624 [03/25/2022-13:24:14] [V] [TRT] sections.0.0.conv3.module.weight + QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_small_nn_v1 Tactic: -8626990807754934295 [03/25/2022-13:24:14] [V] [TRT] Tactic: -8626990807754934295 Time: 0.441856 [03/25/2022-13:24:14] [V] [TRT] sections.0.0.conv3.module.weight + QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x64_relu_interior_nn_v1 Tactic: -8498217049614706532 [03/25/2022-13:24:14] [V] [TRT] Tactic: -8498217049614706532 Time: 0.36288 [03/25/2022-13:24:14] [V] [TRT] sections.0.0.conv3.module.weight + QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_xregs_small_nn_v1 Tactic: -7303593854972602201 [03/25/2022-13:24:14] [V] [TRT] Tactic: -7303593854972602201 Time: 0.50432 [03/25/2022-13:24:14] [V] [TRT] sections.0.0.conv3.module.weight + QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x128_relu_medium_nn_v1 Tactic: -6585664687867083638 [03/25/2022-13:24:14] [V] [TRT] Tactic: -6585664687867083638 Time: 0.393216 [03/25/2022-13:24:14] [V] [TRT] sections.0.0.conv3.module.weight + QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_interior_nn_v1 Tactic: -3326139578711341011 [03/25/2022-13:24:14] [V] [TRT] Tactic: -3326139578711341011 Time: 0.432256 [03/25/2022-13:24:14] [V] [TRT] sections.0.0.conv3.module.weight + QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x128_relu_small_nn_v1 Tactic: -683636008127039856 [03/25/2022-13:24:14] [V] [TRT] Tactic: -683636008127039856 Time: 0.391808 [03/25/2022-13:24:14] [V] [TRT] Fastest Tactic: -8498217049614706532 Time: 0.36288 [03/25/2022-13:24:14] [V] [TRT] >>>>>>>>>>>>>>> Chose Runner Type: CaskConvolution Tactic: -8498217049614706532 [03/25/2022-13:24:14] 
[V] [TRT] *************** Autotuning format combination: Int8(50176,3136:4,56,1), Int8(25088,3136:32,56,1) -> Int8(25088,3136:32,56,1) *************** [03/25/2022-13:24:14] [V] [TRT] --------------- Timing Runner: sections.0.0.conv3.module.weight + QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81 (CaskConvolution) [03/25/2022-13:24:14] [V] [TRT] sections.0.0.conv3.module.weight + QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_small_c32_nn_v1 Tactic: 1100922622480907544 [03/25/2022-13:24:14] [V] [TRT] Tactic: 1100922622480907544 Time: 0.431232 [03/25/2022-13:24:14] [V] [TRT] sections.0.0.conv3.module.weight + QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x128_relu_interior_c32_nn_v1 Tactic: 2855900226702061782 [03/25/2022-13:24:14] [V] [TRT] Tactic: 2855900226702061782 Time: 0.388352 [03/25/2022-13:24:14] [V] [TRT] sections.0.0.conv3.module.weight + QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x64_relu_medium_c32_nn_v1 Tactic: 3606311198834416176 [03/25/2022-13:24:14] [V] [TRT] Tactic: 3606311198834416176 Time: 0.37312 [03/25/2022-13:24:14] [V] [TRT] sections.0.0.conv3.module.weight + QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x128_relu_small_c32_nn_v1 Tactic: 4325765560739862899 [03/25/2022-13:24:14] [V] [TRT] Tactic: 4325765560739862899 Time: 0.392576 [03/25/2022-13:24:14] [V] [TRT] sections.0.0.conv3.module.weight + QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x64_relu_interior_c32_nn_v1 Tactic: 8803458114157674373 [03/25/2022-13:24:14] [V] [TRT] Tactic: 8803458114157674373 Time: 0.3616 [03/25/2022-13:24:14] [V] [TRT] sections.0.0.conv3.module.weight + QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_interior_c32_nn_v1 Tactic: -6934773036503365000 [03/25/2022-13:24:14] [V] [TRT] Tactic: -6934773036503365000 Time: 0.423296 [03/25/2022-13:24:14] [V] [TRT] sections.0.0.conv3.module.weight + QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_xregs_interior_c32_nn_v1 Tactic: -4431642509665791294 [03/25/2022-13:24:14] [V] [TRT] Tactic: -4431642509665791294 Time: 0.475136 [03/25/2022-13:24:14] [V] [TRT] sections.0.0.conv3.module.weight + QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x128_relu_medium_c32_nn_v1 Tactic: -4255737803793506479 [03/25/2022-13:24:14] [V] [TRT] Tactic: -4255737803793506479 Time: 0.394112 [03/25/2022-13:24:14] [V] [TRT] sections.0.0.conv3.module.weight + QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_xregs_small_c32_nn_v1 Tactic: -3958182351168863467 [03/25/2022-13:24:14] [V] [TRT] Tactic: -3958182351168863467 Time: 0.494592 [03/25/2022-13:24:14] [V] [TRT] sections.0.0.conv3.module.weight + QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_xregs_medium_c32_nn_v1 Tactic: -3111968753064955248 [03/25/2022-13:24:14] [V] [TRT] Tactic: -3111968753064955248 Time: 0.500992 [03/25/2022-13:24:14] [V] [TRT] sections.0.0.conv3.module.weight + 
QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_medium_c32_nn_v1 Tactic: -1492575840277333548 [03/25/2022-13:24:14] [V] [TRT] Tactic: -1492575840277333548 Time: 0.43584 [03/25/2022-13:24:14] [V] [TRT] sections.0.0.conv3.module.weight + QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x64_relu_small_c32_nn_v1 Tactic: -868495160148524802 [03/25/2022-13:24:14] [V] [TRT] Tactic: -868495160148524802 Time: 0.366976 [03/25/2022-13:24:14] [V] [TRT] Fastest Tactic: 8803458114157674373 Time: 0.3616 [03/25/2022-13:24:14] [V] [TRT] >>>>>>>>>>>>>>> Chose Runner Type: CaskConvolution Tactic: 8803458114157674373 [03/25/2022-13:24:14] [V] [TRT] *************** Autotuning format combination: Int8(6272,3136:32,56,1), Int8(25088,3136:32,56,1) -> Int8(25088,3136:32,56,1) *************** [03/25/2022-13:24:14] [V] [TRT] --------------- Timing Runner: sections.0.0.conv3.module.weight + QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81 (CudaGroupConvolution) [03/25/2022-13:24:14] [V] [TRT] CudaGroupConvolution has no valid tactics for this config, skipping [03/25/2022-13:24:14] [V] [TRT] --------------- Timing Runner: sections.0.0.conv3.module.weight + QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81 (CudaDepthwiseConvolution) [03/25/2022-13:24:14] [V] [TRT] CudaDepthwiseConvolution has no valid tactics for this config, skipping [03/25/2022-13:24:14] [V] [TRT] --------------- Timing Runner: sections.0.0.conv3.module.weight + QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81 (FusedConvActConvolution) [03/25/2022-13:24:14] [V] [TRT] FusedConvActConvolution has no valid tactics for this config, skipping [03/25/2022-13:24:14] [V] [TRT] --------------- Timing Runner: sections.0.0.conv3.module.weight + QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81 (CaskConvolution) [03/25/2022-13:24:14] [V] [TRT] sections.0.0.conv3.module.weight + QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage1_warpsize4x1x1_g1_tensor8x8x16_t1r1s1 Tactic: 68468667201176803 [03/25/2022-13:24:14] [V] [TRT] Tactic: 68468667201176803 Time: 0.2816 [03/25/2022-13:24:14] [V] [TRT] sections.0.0.conv3.module.weight + QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x192x64_stage3_warpsize4x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 125145153013230687 [03/25/2022-13:24:14] [V] [TRT] Tactic: 125145153013230687 Time: 0.36992 [03/25/2022-13:24:14] [V] [TRT] sections.0.0.conv3.module.weight + QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x128_ldg16_relu_interior_nt_v1 Tactic: 434957160407688216 [03/25/2022-13:24:14] [V] [TRT] Tactic: 434957160407688216 Time: 0.294144 [03/25/2022-13:24:14] [V] [TRT] sections.0.0.conv3.module.weight + QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage4_warpsize4x2x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: 805889586762897346 [03/25/2022-13:24:14] [V] [TRT] Tactic: 805889586762897346 Time: 0.242816 
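[annotation] The "Autotuning format combination: Int8(...:4, ...) -> Int8(...:32, ...)" headers show the other dimension of the search: each layer is costed once per candidate I/O layout (CHW4 ":4" versus CHW32 ":32" vectorized INT8 here), and the per-layer winners determine where the builder must insert reformat layers. To observe or constrain these decisions programmatically instead of scraping the verbose log, TensorRT 8 exposes an algorithm-selector hook; a minimal observing sketch (the class name is illustrative):

    import tensorrt as trt

    class LoggingSelector(trt.IAlgorithmSelector):
        """Accepts every candidate, then records what the builder picked."""

        def __init__(self):
            trt.IAlgorithmSelector.__init__(self)  # required by the bindings

        def select_algorithms(self, context, choices):
            # Returning every index leaves the normal autotuner in charge.
            return list(range(len(choices)))

        def report_algorithms(self, contexts, choices):
            for ctx, algo in zip(contexts, choices):
                variant = algo.algorithm_variant
                print(ctx.name, "-> implementation",
                      variant.implementation, "tactic", variant.tactic)

    # Attach before building: config.algorithm_selector = LoggingSelector()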
[03/25/2022-13:24:14] [V] [TRT] sections.0.0.conv3.module.weight + QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x128_stage3_warpsize4x2x1_g1_sptensor16x8x64_t1r1s1 Tactic: 857001784974286465 [03/25/2022-13:24:14] [V] [TRT] Tactic: 857001784974286465 Time: 0.28032 [03/25/2022-13:24:14] [V] [TRT] sections.0.0.conv3.module.weight + QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 1214130898909872671 [03/25/2022-13:24:14] [V] [TRT] Tactic: 1214130898909872671 Time: 0.322688 [03/25/2022-13:24:14] [V] [TRT] sections.0.0.conv3.module.weight + QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage4_warpsize4x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 1278425129871930205 [03/25/2022-13:24:14] [V] [TRT] Tactic: 1278425129871930205 Time: 0.226176 [03/25/2022-13:24:14] [V] [TRT] sections.0.0.conv3.module.weight + QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 1583811548148740665 [03/25/2022-13:24:14] [V] [TRT] Tactic: 1583811548148740665 Time: 0.33024 [03/25/2022-13:24:14] [V] [TRT] sections.0.0.conv3.module.weight + QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 1701344857577810806 [03/25/2022-13:24:14] [V] [TRT] Tactic: 1701344857577810806 Time: 0.309376 [03/25/2022-13:24:14] [V] [TRT] sections.0.0.conv3.module.weight + QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x32x64_stage4_warpsize2x1x1_g1_tensor16x8x32_t1r1s1 Tactic: 1797231177354918208 [03/25/2022-13:24:14] [V] [TRT] Tactic: 1797231177354918208 Time: 0.32512 [03/25/2022-13:24:14] [V] [TRT] sections.0.0.conv3.module.weight + QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x192x64_stage3_warpsize4x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 2004812516525036381 [03/25/2022-13:24:14] [V] [TRT] Tactic: 2004812516525036381 Time: 0.33088 [03/25/2022-13:24:14] [V] [TRT] sections.0.0.conv3.module.weight + QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x128x64_stage3_warpsize4x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 2030033463723799063 [03/25/2022-13:24:14] [V] [TRT] Tactic: 2030033463723799063 Time: 0.233344 [03/25/2022-13:24:14] [V] [TRT] sections.0.0.conv3.module.weight + QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81 Set Tactic Name: 
ampere_int8_i8816cudnn_int8_128x128_ldg16_relu_interior_nt_v1 Tactic: 2346437292116182513 [03/25/2022-13:24:14] [V] [TRT] Tactic: 2346437292116182513 Time: 0.251776 [03/25/2022-13:24:14] [V] [TRT] sections.0.0.conv3.module.weight + QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 2376898825218218566 [03/25/2022-13:24:14] [V] [TRT] Tactic: 2376898825218218566 Time: 0.208 [03/25/2022-13:24:14] [V] [TRT] sections.0.0.conv3.module.weight + QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_singleBuffer_interior_nt_v1 Tactic: 2522133112320625287 [03/25/2022-13:24:14] [V] [TRT] Tactic: 2522133112320625287 Time: 0.260224 [03/25/2022-13:24:14] [V] [TRT] sections.0.0.conv3.module.weight + QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage4_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 2548171972648455240 [03/25/2022-13:24:14] [V] [TRT] Tactic: 2548171972648455240 Time: 0.210944 [03/25/2022-13:24:14] [V] [TRT] sections.0.0.conv3.module.weight + QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage1_warpsize2x1x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: 2548946449357458230 [03/25/2022-13:24:14] [V] [TRT] Tactic: 2548946449357458230 Time: 0.269696 [03/25/2022-13:24:14] [V] [TRT] sections.0.0.conv3.module.weight + QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x16x128_stage3_warpsize1x1x1_g1_sptensor16x8x64_simple_t1r1s1 Tactic: 2678520742286844763 [03/25/2022-13:24:14] [V] [TRT] Tactic: 2678520742286844763 Time: 0.43968 [03/25/2022-13:24:14] [V] [TRT] sections.0.0.conv3.module.weight + QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 2756291002030759362 [03/25/2022-13:24:14] [V] [TRT] Tactic: 2756291002030759362 Time: 0.297728 [03/25/2022-13:24:14] [V] [TRT] sections.0.0.conv3.module.weight + QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x128x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 2972948223367788520 [03/25/2022-13:24:14] [V] [TRT] Tactic: 2972948223367788520 Time: 0.208768 [03/25/2022-13:24:14] [V] [TRT] sections.0.0.conv3.module.weight + QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81 Set Tactic Name: ampere_int8_i8816cudnn_int8_128x128_ldg16_relu_medium_nt_v1 Tactic: 2985940154541537814 [03/25/2022-13:24:14] [V] [TRT] Tactic: 2985940154541537814 Time: 0.255104 [03/25/2022-13:24:14] [V] [TRT] sections.0.0.conv3.module.weight + QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81 Set Tactic Name: 
sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x96x64_stage5_warpsize2x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 3043273137345374664 [03/25/2022-13:24:14] [V] [TRT] Tactic: 3043273137345374664 Time: 0.284928 [03/25/2022-13:24:14] [V] [TRT] sections.0.0.conv3.module.weight + QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 3221677093659484230 [03/25/2022-13:24:14] [V] [TRT] Tactic: 3221677093659484230 Time: 0.350976 [03/25/2022-13:24:14] [V] [TRT] sections.0.0.conv3.module.weight + QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r1s1 Tactic: 3242897809704328258 [03/25/2022-13:24:14] [V] [TRT] Tactic: 3242897809704328258 Time: 0.294272 [03/25/2022-13:24:14] [V] [TRT] sections.0.0.conv3.module.weight + QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: 3312456766204252694 [03/25/2022-13:24:14] [V] [TRT] Tactic: 3312456766204252694 Time: 0.24576 [03/25/2022-13:24:14] [V] [TRT] sections.0.0.conv3.module.weight + QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 3538565962642681625 [03/25/2022-13:24:14] [V] [TRT] Tactic: 3538565962642681625 Time: 0.28928 [03/25/2022-13:24:14] [V] [TRT] sections.0.0.conv3.module.weight + QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage1_warpsize2x2x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: 3541919052468401776 [03/25/2022-13:24:14] [V] [TRT] Tactic: 3541919052468401776 Time: 0.25856 [03/25/2022-13:24:14] [V] [TRT] sections.0.0.conv3.module.weight + QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 3593397928177382100 [03/25/2022-13:24:14] [V] [TRT] Tactic: 3593397928177382100 Time: 0.318208 [03/25/2022-13:24:14] [V] [TRT] sections.0.0.conv3.module.weight + QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x64x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 3670282018109435863 [03/25/2022-13:24:14] [V] [TRT] Tactic: 3670282018109435863 Time: 0.244608 [03/25/2022-13:24:14] [V] [TRT] sections.0.0.conv3.module.weight + QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81 Set Tactic Name: 
sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 3671413346254027573 [03/25/2022-13:24:14] [V] [TRT] Tactic: 3671413346254027573 Time: 0.28672 [03/25/2022-13:24:14] [V] [TRT] sections.0.0.conv3.module.weight + QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x128_ldg16_relu_medium_nt_v1 Tactic: 3899284354987683408 [03/25/2022-13:24:14] [V] [TRT] Tactic: 3899284354987683408 Time: 0.298112 [03/25/2022-13:24:14] [V] [TRT] sections.0.0.conv3.module.weight + QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: 3927509214678622419 [03/25/2022-13:24:14] [V] [TRT] Tactic: 3927509214678622419 Time: 0.279168 [03/25/2022-13:24:14] [V] [TRT] sections.0.0.conv3.module.weight + QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 4112572034735311841 [03/25/2022-13:24:14] [V] [TRT] Tactic: 4112572034735311841 Time: 0.372992 [03/25/2022-13:24:14] [V] [TRT] sections.0.0.conv3.module.weight + QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 4239974928951431644 [03/25/2022-13:24:14] [V] [TRT] Tactic: 4239974928951431644 Time: 0.215808 [03/25/2022-13:24:14] [V] [TRT] sections.0.0.conv3.module.weight + QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage6_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 4610760414797216079 [03/25/2022-13:24:14] [V] [TRT] Tactic: 4610760414797216079 Time: 0.245248 [03/25/2022-13:24:14] [V] [TRT] sections.0.0.conv3.module.weight + QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_small_nt_v1 Tactic: 4717285412741024953 [03/25/2022-13:24:14] [V] [TRT] Tactic: 4717285412741024953 Time: 0.266112 [03/25/2022-13:24:14] [V] [TRT] sections.0.0.conv3.module.weight + QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 4796956614760326119 [03/25/2022-13:24:14] [V] [TRT] Tactic: 4796956614760326119 Time: 0.244224 [03/25/2022-13:24:14] [V] [TRT] sections.0.0.conv3.module.weight + QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 4919361344804309192 [03/25/2022-13:24:14] [V] [TRT] Tactic: 4919361344804309192 Time: 0.344064 [03/25/2022-13:24:14] [V] [TRT] sections.0.0.conv3.module.weight + 
QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x128_stage3_warpsize1x2x1_g1_sptensor16x8x64_simple_t1r1s1 Tactic: 5043674678294309681 [03/25/2022-13:24:14] [V] [TRT] Tactic: 5043674678294309681 Time: 0.308352 [03/25/2022-13:24:14] [V] [TRT] sections.0.0.conv3.module.weight + QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_interior_nt_v1 Tactic: 5126565865931538390 [03/25/2022-13:24:14] [V] [TRT] Tactic: 5126565865931538390 Time: 0.261504 [03/25/2022-13:24:14] [V] [TRT] sections.0.0.conv3.module.weight + QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x64x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 5204702486885981735 [03/25/2022-13:24:14] [V] [TRT] Tactic: 5204702486885981735 Time: 0.244736 [03/25/2022-13:24:14] [V] [TRT] sections.0.0.conv3.module.weight + QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: 5375256703210220108 [03/25/2022-13:24:14] [V] [TRT] Tactic: 5375256703210220108 Time: 0.284032 [03/25/2022-13:24:14] [V] [TRT] sections.0.0.conv3.module.weight + QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x128_stage3_warpsize2x2x1_g1_sptensor16x8x64_simple_t1r1s1 Tactic: 5424258848951129084 [03/25/2022-13:24:14] [V] [TRT] Tactic: 5424258848951129084 Time: 0.264832 [03/25/2022-13:24:14] [V] [TRT] sections.0.0.conv3.module.weight + QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 5443897483205284103 [03/25/2022-13:24:14] [V] [TRT] Tactic: 5443897483205284103 Time: 0.302848 [03/25/2022-13:24:14] [V] [TRT] sections.0.0.conv3.module.weight + QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage4_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 5707566217891294846 [03/25/2022-13:24:14] [V] [TRT] Tactic: 5707566217891294846 Time: 0.22656 [03/25/2022-13:24:14] [V] [TRT] sections.0.0.conv3.module.weight + QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x4x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 5986622376339202983 [03/25/2022-13:24:14] [V] [TRT] Tactic: 5986622376339202983 Time: 0.243584 [03/25/2022-13:24:14] [V] [TRT] sections.0.0.conv3.module.weight + QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x128x64_stage3_warpsize4x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 
6007888770437705057 [03/25/2022-13:24:14] [V] [TRT] Tactic: 6007888770437705057 Time: 0.254464 [03/25/2022-13:24:14] [V] [TRT] sections.0.0.conv3.module.weight + QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32_t1r1s1 Tactic: 6405251167055673379 [03/25/2022-13:24:14] [V] [TRT] Tactic: 6405251167055673379 Time: 0.25536 [03/25/2022-13:24:14] [V] [TRT] sections.0.0.conv3.module.weight + QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage4_warpsize4x1x1_g1_tensor16x8x32_t1r1s1 Tactic: 6433368103202497147 [03/25/2022-13:24:14] [V] [TRT] Tactic: 6433368103202497147 Time: 0.237056 [03/25/2022-13:24:14] [V] [TRT] sections.0.0.conv3.module.weight + QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 6441948709525127755 [03/25/2022-13:24:14] [V] [TRT] Tactic: 6441948709525127755 Time: 0.327936 [03/25/2022-13:24:14] [V] [TRT] sections.0.0.conv3.module.weight + QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x256x64_stage3_warpsize2x4x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 6443933097134654777 [03/25/2022-13:24:14] [V] [TRT] Tactic: 6443933097134654777 Time: 0.249344 [03/25/2022-13:24:14] [V] [TRT] sections.0.0.conv3.module.weight + QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 6457435868048963632 [03/25/2022-13:24:14] [V] [TRT] Tactic: 6457435868048963632 Time: 0.249984 [03/25/2022-13:24:14] [V] [TRT] sections.0.0.conv3.module.weight + QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x128_stage3_warpsize1x1x1_g1_sptensor16x8x64_t1r1s1 Tactic: 6510345569544721081 [03/25/2022-13:24:14] [V] [TRT] Tactic: 6510345569544721081 Time: 0.439424 [03/25/2022-13:24:14] [V] [TRT] sections.0.0.conv3.module.weight + QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 6793988781414507278 [03/25/2022-13:24:14] [V] [TRT] Tactic: 6793988781414507278 Time: 0.221696 [03/25/2022-13:24:14] [V] [TRT] sections.0.0.conv3.module.weight + QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 6880710371738875469 [03/25/2022-13:24:14] [V] [TRT] Tactic: 6880710371738875469 Time: 0.304512 [03/25/2022-13:24:14] [V] [TRT] sections.0.0.conv3.module.weight + 
QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 6925201228918187099 [03/25/2022-13:24:14] [V] [TRT] Tactic: 6925201228918187099 Time: 0.231552 [03/25/2022-13:24:14] [V] [TRT] sections.0.0.conv3.module.weight + QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 6991524515605108718 [03/25/2022-13:24:14] [V] [TRT] Tactic: 6991524515605108718 Time: 0.425728 [03/25/2022-13:24:14] [V] [TRT] sections.0.0.conv3.module.weight + QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x4x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 7245509442265271220 [03/25/2022-13:24:14] [V] [TRT] Tactic: 7245509442265271220 Time: 0.23872 [03/25/2022-13:24:14] [V] [TRT] sections.0.0.conv3.module.weight + QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x64x64_stage3_warpsize4x1x1_g1_tensor16x8x32_t1r1s1 Tactic: 7318929579222925725 [03/25/2022-13:24:14] [V] [TRT] Tactic: 7318929579222925725 Time: 0.230912 [03/25/2022-13:24:14] [V] [TRT] sections.0.0.conv3.module.weight + QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: 7731430299029542276 [03/25/2022-13:24:14] [V] [TRT] Tactic: 7731430299029542276 Time: 0.217472 [03/25/2022-13:24:14] [V] [TRT] sections.0.0.conv3.module.weight + QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage4_warpsize2x4x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: 7738495016763012180 [03/25/2022-13:24:14] [V] [TRT] Tactic: 7738495016763012180 Time: 0.2336 [03/25/2022-13:24:14] [V] [TRT] sections.0.0.conv3.module.weight + QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage4_warpsize4x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 8142283985160822229 [03/25/2022-13:24:14] [V] [TRT] Tactic: 8142283985160822229 Time: 0.24576 [03/25/2022-13:24:14] [V] [TRT] sections.0.0.conv3.module.weight + QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage4_warpsize2x4x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 8173975624668590862 [03/25/2022-13:24:14] [V] [TRT] Tactic: 8173975624668590862 Time: 0.243712 [03/25/2022-13:24:14] [V] [TRT] sections.0.0.conv3.module.weight + QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81 Set Tactic Name: 
sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage4_warpsize4x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 8234775147403903473 [03/25/2022-13:24:14] [V] [TRT] Tactic: 8234775147403903473 Time: 0.258176 [03/25/2022-13:24:14] [V] [TRT] sections.0.0.conv3.module.weight + QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x128x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 8524082966802584889 [03/25/2022-13:24:14] [V] [TRT] Tactic: 8524082966802584889 Time: 0.21696 [03/25/2022-13:24:14] [V] [TRT] sections.0.0.conv3.module.weight + QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage1_warpsize4x1x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: 8684013308930763400 [03/25/2022-13:24:14] [V] [TRT] Tactic: 8684013308930763400 Time: 0.265344 [03/25/2022-13:24:14] [V] [TRT] sections.0.0.conv3.module.weight + QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x128x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 8765382722978397630 [03/25/2022-13:24:14] [V] [TRT] Tactic: 8765382722978397630 Time: 0.220288 [03/25/2022-13:24:14] [V] [TRT] sections.0.0.conv3.module.weight + QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 8843193587782643431 [03/25/2022-13:24:14] [V] [TRT] Tactic: 8843193587782643431 Time: 0.319744 [03/25/2022-13:24:14] [V] [TRT] sections.0.0.conv3.module.weight + QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x128_stage3_warpsize1x4x1_g1_sptensor16x8x64_simple_t1r1s1 Tactic: 8883810517410230831 [03/25/2022-13:24:14] [V] [TRT] Tactic: 8883810517410230831 Time: 0.307456 [03/25/2022-13:24:14] [V] [TRT] sections.0.0.conv3.module.weight + QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 8930797211803511337 [03/25/2022-13:24:14] [V] [TRT] Tactic: 8930797211803511337 Time: 0.398336 [03/25/2022-13:24:14] [V] [TRT] sections.0.0.conv3.module.weight + QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 8935070489925739043 [03/25/2022-13:24:14] [V] [TRT] Tactic: 8935070489925739043 Time: 0.227584 [03/25/2022-13:24:14] [V] [TRT] sections.0.0.conv3.module.weight + QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81 Set Tactic Name: 
sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x16x128_stage3_warpsize1x1x1_g1_sptensor16x8x64_t1r1s1 Tactic: 9062173295331155069 [03/25/2022-13:24:15] [V] [TRT] Tactic: 9062173295331155069 Time: 0.445952 [03/25/2022-13:24:15] [V] [TRT] sections.0.0.conv3.module.weight + QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_singleBuffer_small_nt_v1 Tactic: -9118785798277698619 [03/25/2022-13:24:15] [V] [TRT] Tactic: -9118785798277698619 Time: 0.264192 [03/25/2022-13:24:15] [V] [TRT] sections.0.0.conv3.module.weight + QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -8985599729413291927 [03/25/2022-13:24:15] [V] [TRT] Tactic: -8985599729413291927 Time: 0.279424 [03/25/2022-13:24:15] [V] [TRT] sections.0.0.conv3.module.weight + QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x32x64_stage4_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -8972697510150675429 [03/25/2022-13:24:15] [V] [TRT] Tactic: -8972697510150675429 Time: 0.28224 [03/25/2022-13:24:15] [V] [TRT] sections.0.0.conv3.module.weight + QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r1s1 Tactic: -8943710627305202139 [03/25/2022-13:24:15] [V] [TRT] Tactic: -8943710627305202139 Time: 0.234368 [03/25/2022-13:24:15] [V] [TRT] sections.0.0.conv3.module.weight + QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: -8859846367886814331 [03/25/2022-13:24:15] [V] [TRT] Tactic: -8859846367886814331 Time: 0.263552 [03/25/2022-13:24:15] [V] [TRT] sections.0.0.conv3.module.weight + QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: -8638624340850784688 [03/25/2022-13:24:15] [V] [TRT] Tactic: -8638624340850784688 Time: 0.339712 [03/25/2022-13:24:15] [V] [TRT] sections.0.0.conv3.module.weight + QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage4_warpsize4x1x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: -8556775352640313933 [03/25/2022-13:24:15] [V] [TRT] Tactic: -8556775352640313933 Time: 0.22272 [03/25/2022-13:24:15] [V] [TRT] sections.0.0.conv3.module.weight + QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage1_warpsize2x1x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: -8382298409581540699 [03/25/2022-13:24:15] [V] [TRT] Tactic: 
-8382298409581540699 Time: 0.27072 [03/25/2022-13:24:15] [V] [TRT] sections.0.0.conv3.module.weight + QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage1_warpsize4x2x1_g1_tensor8x8x16_t1r1s1 Tactic: -8172318747337038866 [03/25/2022-13:24:15] [V] [TRT] Tactic: -8172318747337038866 Time: 0.32064 [03/25/2022-13:24:15] [V] [TRT] sections.0.0.conv3.module.weight + QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x256x64_stage3_warpsize2x4x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -8038164441468184723 [03/25/2022-13:24:15] [V] [TRT] Tactic: -8038164441468184723 Time: 0.231808 [03/25/2022-13:24:15] [V] [TRT] sections.0.0.conv3.module.weight + QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x96x64_stage5_warpsize2x1x1_g1_tensor16x8x32_t1r1s1 Tactic: -7844028314176826857 [03/25/2022-13:24:15] [V] [TRT] Tactic: -7844028314176826857 Time: 0.293376 [03/25/2022-13:24:15] [V] [TRT] sections.0.0.conv3.module.weight + QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage4_warpsize2x4x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -7364286662638617917 [03/25/2022-13:24:15] [V] [TRT] Tactic: -7364286662638617917 Time: 0.226432 [03/25/2022-13:24:15] [V] [TRT] sections.0.0.conv3.module.weight + QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage1_warpsize2x1x1_g1_tensor8x8x16_t1r1s1 Tactic: -7361755530333096258 [03/25/2022-13:24:15] [V] [TRT] Tactic: -7361755530333096258 Time: 0.286336 [03/25/2022-13:24:15] [V] [TRT] sections.0.0.conv3.module.weight + QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x128_stage3_warpsize1x4x1_g1_sptensor16x8x64_t1r1s1 Tactic: -7289760022626653388 [03/25/2022-13:24:15] [V] [TRT] Tactic: -7289760022626653388 Time: 0.307584 [03/25/2022-13:24:15] [V] [TRT] sections.0.0.conv3.module.weight + QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage1_warpsize2x4x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: -7106539943789766885 [03/25/2022-13:24:15] [V] [TRT] Tactic: -7106539943789766885 Time: 0.30656 [03/25/2022-13:24:15] [V] [TRT] sections.0.0.conv3.module.weight + QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage1_warpsize4x2x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: -6969478418607271266 [03/25/2022-13:24:15] [V] [TRT] Tactic: -6969478418607271266 Time: 0.309888 [03/25/2022-13:24:15] [V] [TRT] sections.0.0.conv3.module.weight + QuantizeLinear_54_quantize_scale_node + Conv_58 + 
Add_80 + Relu_81 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -6930438165437733000 [03/25/2022-13:24:15] [V] [TRT] Tactic: -6930438165437733000 Time: 0.369152 [03/25/2022-13:24:15] [V] [TRT] sections.0.0.conv3.module.weight + QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x64x64_stage3_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -6879607992933502380 [03/25/2022-13:24:15] [V] [TRT] Tactic: -6879607992933502380 Time: 0.219776 [03/25/2022-13:24:15] [V] [TRT] sections.0.0.conv3.module.weight + QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: -6527178416855951297 [03/25/2022-13:24:15] [V] [TRT] Tactic: -6527178416855951297 Time: 0.357504 [03/25/2022-13:24:15] [V] [TRT] sections.0.0.conv3.module.weight + QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_t1r1s1 Tactic: -6510232214299595844 [03/25/2022-13:24:15] [V] [TRT] Tactic: -6510232214299595844 Time: 0.37504 [03/25/2022-13:24:15] [V] [TRT] sections.0.0.conv3.module.weight + QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81 Set Tactic Name: ampere_int8_i8816cudnn_int8_128x128_ldg16_relu_small_nt_v1 Tactic: -6400348606759295499 [03/25/2022-13:24:15] [V] [TRT] Tactic: -6400348606759295499 Time: 0.25344 [03/25/2022-13:24:15] [V] [TRT] sections.0.0.conv3.module.weight + QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage1_warpsize2x2x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: -6346247605026339453 [03/25/2022-13:24:15] [V] [TRT] Tactic: -6346247605026339453 Time: 0.228992 [03/25/2022-13:24:15] [V] [TRT] sections.0.0.conv3.module.weight + QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x96x64_stage4_warpsize4x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -6232597026469067819 [03/25/2022-13:24:15] [V] [TRT] Tactic: -6232597026469067819 Time: 0.332928 [03/25/2022-13:24:15] [V] [TRT] sections.0.0.conv3.module.weight + QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x128_ldg16_relu_small_nt_v1 Tactic: -5980889159865208399 [03/25/2022-13:24:15] [V] [TRT] Tactic: -5980889159865208399 Time: 0.297088 [03/25/2022-13:24:15] [V] [TRT] sections.0.0.conv3.module.weight + QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_singleBuffer_medium_nt_v1 Tactic: -5766140806760372989 [03/25/2022-13:24:15] [V] [TRT] Tactic: -5766140806760372989 Time: 0.266624 [03/25/2022-13:24:15] [V] [TRT] sections.0.0.conv3.module.weight + QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81 Set Tactic Name: 
sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r1s1 Tactic: -5697614955743334137 [03/25/2022-13:24:15] [V] [TRT] Tactic: -5697614955743334137 Time: 0.270848 [03/25/2022-13:24:15] [V] [TRT] sections.0.0.conv3.module.weight + QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -5671123121710113970 [03/25/2022-13:24:15] [V] [TRT] Tactic: -5671123121710113970 Time: 0.222464 [03/25/2022-13:24:15] [V] [TRT] sections.0.0.conv3.module.weight + QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x32x64_stage4_warpsize2x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -5615581362569252260 [03/25/2022-13:24:15] [V] [TRT] Tactic: -5615581362569252260 Time: 0.304 [03/25/2022-13:24:15] [V] [TRT] sections.0.0.conv3.module.weight + QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x128x64_stage3_warpsize4x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -5562968047117507056 [03/25/2022-13:24:15] [V] [TRT] Tactic: -5562968047117507056 Time: 0.251008 [03/25/2022-13:24:15] [V] [TRT] sections.0.0.conv3.module.weight + QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -5516472881360101487 [03/25/2022-13:24:15] [V] [TRT] Tactic: -5516472881360101487 Time: 0.335104 [03/25/2022-13:24:15] [V] [TRT] sections.0.0.conv3.module.weight + QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r1s1 Tactic: -5311474420963248369 [03/25/2022-13:24:15] [V] [TRT] Tactic: -5311474420963248369 Time: 0.2624 [03/25/2022-13:24:15] [V] [TRT] sections.0.0.conv3.module.weight + QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: -5170003087447722174 [03/25/2022-13:24:15] [V] [TRT] Tactic: -5170003087447722174 Time: 0.331904 [03/25/2022-13:24:15] [V] [TRT] sections.0.0.conv3.module.weight + QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage6_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -4889586143772361690 [03/25/2022-13:24:15] [V] [TRT] Tactic: -4889586143772361690 Time: 0.2304 [03/25/2022-13:24:15] [V] [TRT] sections.0.0.conv3.module.weight + QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81 Set Tactic Name: 
sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: -4849712423393454704 [03/25/2022-13:24:15] [V] [TRT] Tactic: -4849712423393454704 Time: 0.243584 [03/25/2022-13:24:15] [V] [TRT] sections.0.0.conv3.module.weight + QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x128_stage3_warpsize4x2x1_g1_sptensor16x8x64_simple_t1r1s1 Tactic: -4681913707320020520 [03/25/2022-13:24:15] [V] [TRT] Tactic: -4681913707320020520 Time: 0.278528 [03/25/2022-13:24:15] [V] [TRT] sections.0.0.conv3.module.weight + QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_medium_nt_v1 Tactic: -4516822589357530549 [03/25/2022-13:24:15] [V] [TRT] Tactic: -4516822589357530549 Time: 0.26816 [03/25/2022-13:24:15] [V] [TRT] sections.0.0.conv3.module.weight + QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -4455415102719506646 [03/25/2022-13:24:15] [V] [TRT] Tactic: -4455415102719506646 Time: 0.276352 [03/25/2022-13:24:15] [V] [TRT] sections.0.0.conv3.module.weight + QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32_t1r1s1 Tactic: -4425346730823666456 [03/25/2022-13:24:15] [V] [TRT] Tactic: -4425346730823666456 Time: 0.34688 [03/25/2022-13:24:15] [V] [TRT] sections.0.0.conv3.module.weight + QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage1_warpsize2x1x1_g1_tensor8x8x16_t1r1s1 Tactic: -4260476497340370474 [03/25/2022-13:24:15] [V] [TRT] Tactic: -4260476497340370474 Time: 0.288512 [03/25/2022-13:24:15] [V] [TRT] sections.0.0.conv3.module.weight + QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x2x1_g1_tensor16x8x32_t1r1s1 Tactic: -4182501876984672402 [03/25/2022-13:24:15] [V] [TRT] Tactic: -4182501876984672402 Time: 0.226432 [03/25/2022-13:24:15] [V] [TRT] sections.0.0.conv3.module.weight + QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x256x64_stage3_warpsize2x4x1_g1_tensor16x8x32_t1r1s1 Tactic: -4151617293257698859 [03/25/2022-13:24:15] [V] [TRT] Tactic: -4151617293257698859 Time: 0.252032 [03/25/2022-13:24:15] [V] [TRT] sections.0.0.conv3.module.weight + QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -3825889760337461729 [03/25/2022-13:24:15] [V] [TRT] Tactic: -3825889760337461729 Time: 0.361088 
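Two kernel families are interleaved in this timing pass: dense Tensor Core tactics (ampere_int8_i8816cudnn_* and sm75/sm80_xmma_fprop_implicit_gemm_*) and 2:4 structured-sparsity tactics (sm80_xmma_fprop_sparse_conv_* with sptensor16x8x64 in the name), which --sparsity=enable makes eligible. Both are timed on equal terms and a sparse kernel is kept only if it measures faster; as the conclusion of this runner below shows, a dense 128x128 implicit-GEMM tactic wins this combination. For reference, a minimal sketch of the equivalent build through the TensorRT Python API, assuming the 8.x bindings; the model path is illustrative, and INT8 scales come from the Q/DQ nodes already in the graph, so no calibrator is set:

# Minimal sketch (TensorRT 8.x Python bindings) of the builder configuration
# behind trtexec --int8 --sparsity=enable on a Q/DQ-quantized ONNX model.
import tensorrt as trt

logger = trt.Logger(trt.Logger.VERBOSE)  # VERBOSE emits per-tactic timings like those above
builder = trt.Builder(logger)
network = builder.create_network(
    1 << int(trt.NetworkDefinitionCreationFlag.EXPLICIT_BATCH))
parser = trt.OnnxParser(network, logger)

with open("model_qat_sparse.onnx", "rb") as f:  # illustrative path
    if not parser.parse(f.read()):
        raise RuntimeError(str(parser.get_error(0)))

config = builder.create_builder_config()
config.set_flag(trt.BuilderFlag.INT8)            # make INT8 tactics eligible
config.set_flag(trt.BuilderFlag.SPARSE_WEIGHTS)  # let 2:4 sparse tactics compete with dense ones
engine_bytes = builder.build_serialized_network(network, config)

With a VERBOSE logger attached, build_serialized_network emits the same per-tactic timing lines shown in this log.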
[03/25/2022-13:24:15] [V] [TRT] sections.0.0.conv3.module.weight + QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -3797022944823726673 [03/25/2022-13:24:15] [V] [TRT] Tactic: -3797022944823726673 Time: 0.270208 [03/25/2022-13:24:15] [V] [TRT] sections.0.0.conv3.module.weight + QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_t1r1s1 Tactic: -3613322253849278738 [03/25/2022-13:24:15] [V] [TRT] Tactic: -3613322253849278738 Time: 0.396032 [03/25/2022-13:24:15] [V] [TRT] sections.0.0.conv3.module.weight + QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x32x64_stage4_warpsize4x1x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: -3577322188448771475 [03/25/2022-13:24:15] [V] [TRT] Tactic: -3577322188448771475 Time: 0.295936 [03/25/2022-13:24:15] [V] [TRT] sections.0.0.conv3.module.weight + QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage1_warpsize2x4x1_g1_tensor8x8x16_t1r1s1 Tactic: -3531681826488401618 [03/25/2022-13:24:15] [V] [TRT] Tactic: -3531681826488401618 Time: 0.314368 [03/25/2022-13:24:15] [V] [TRT] sections.0.0.conv3.module.weight + QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x192x64_stage3_warpsize4x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -3288585994448820820 [03/25/2022-13:24:15] [V] [TRT] Tactic: -3288585994448820820 Time: 0.361728 [03/25/2022-13:24:15] [V] [TRT] sections.0.0.conv3.module.weight + QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x32x64_stage4_warpsize4x1x1_g1_tensor16x8x32_t1r1s1 Tactic: -2754311112012636251 [03/25/2022-13:24:15] [V] [TRT] Tactic: -2754311112012636251 Time: 0.308352 [03/25/2022-13:24:15] [V] [TRT] sections.0.0.conv3.module.weight + QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -2432868635536396215 [03/25/2022-13:24:15] [V] [TRT] Tactic: -2432868635536396215 Time: 0.32192 [03/25/2022-13:24:15] [V] [TRT] sections.0.0.conv3.module.weight + QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x96x64_stage5_warpsize2x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -2379804152300264660 [03/25/2022-13:24:15] [V] [TRT] Tactic: -2379804152300264660 Time: 0.279808 [03/25/2022-13:24:15] [V] [TRT] sections.0.0.conv3.module.weight + QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81 Set Tactic 
Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x128_stage3_warpsize2x2x1_g1_sptensor16x8x64_t1r1s1 Tactic: -2352253835013627337 [03/25/2022-13:24:15] [V] [TRT] Tactic: -2352253835013627337 Time: 0.265216 [03/25/2022-13:24:15] [V] [TRT] sections.0.0.conv3.module.weight + QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x96x64_stage4_warpsize4x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -2335587136911650799 [03/25/2022-13:24:15] [V] [TRT] Tactic: -2335587136911650799 Time: 0.307456 [03/25/2022-13:24:15] [V] [TRT] sections.0.0.conv3.module.weight + QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage4_warpsize2x4x1_g1_tensor16x8x32_t1r1s1 Tactic: -2315453944962430928 [03/25/2022-13:24:15] [V] [TRT] Tactic: -2315453944962430928 Time: 0.253312 [03/25/2022-13:24:15] [V] [TRT] sections.0.0.conv3.module.weight + QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x128_stage3_warpsize1x1x1_g1_sptensor16x8x64_simple_t1r1s1 Tactic: -2238364958919154661 [03/25/2022-13:24:15] [V] [TRT] Tactic: -2238364958919154661 Time: 0.439168 [03/25/2022-13:24:15] [V] [TRT] sections.0.0.conv3.module.weight + QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -1916483171117495388 [03/25/2022-13:24:15] [V] [TRT] Tactic: -1916483171117495388 Time: 0.419328 [03/25/2022-13:24:15] [V] [TRT] sections.0.0.conv3.module.weight + QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x96x64_stage4_warpsize4x2x1_g1_tensor16x8x32_t1r1s1 Tactic: -1549742793039499659 [03/25/2022-13:24:15] [V] [TRT] Tactic: -1549742793039499659 Time: 0.342912 [03/25/2022-13:24:15] [V] [TRT] sections.0.0.conv3.module.weight + QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: -1499578657823798783 [03/25/2022-13:24:15] [V] [TRT] Tactic: -1499578657823798783 Time: 0.276736 [03/25/2022-13:24:15] [V] [TRT] sections.0.0.conv3.module.weight + QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r1s1 Tactic: -1494157908358500249 [03/25/2022-13:24:15] [V] [TRT] Tactic: -1494157908358500249 Time: 0.276736 [03/25/2022-13:24:15] [V] [TRT] sections.0.0.conv3.module.weight + QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81 Set Tactic Name: 
sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x4x1_g1_tensor16x8x32_t1r1s1 Tactic: -1328736756812546664 [03/25/2022-13:24:15] [V] [TRT] Tactic: -1328736756812546664 Time: 0.251904 [03/25/2022-13:24:15] [V] [TRT] sections.0.0.conv3.module.weight + QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x32x64_stage4_warpsize2x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -1006589727652607355 [03/25/2022-13:24:15] [V] [TRT] Tactic: -1006589727652607355 Time: 0.317056 [03/25/2022-13:24:15] [V] [TRT] sections.0.0.conv3.module.weight + QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: -713022856474991236 [03/25/2022-13:24:15] [V] [TRT] Tactic: -713022856474991236 Time: 0.385152 [03/25/2022-13:24:15] [V] [TRT] sections.0.0.conv3.module.weight + QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x64x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: -405554772060757402 [03/25/2022-13:24:15] [V] [TRT] Tactic: -405554772060757402 Time: 0.254976 [03/25/2022-13:24:15] [V] [TRT] sections.0.0.conv3.module.weight + QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: -375949437730908730 [03/25/2022-13:24:15] [V] [TRT] Tactic: -375949437730908730 Time: 0.296064 [03/25/2022-13:24:15] [V] [TRT] sections.0.0.conv3.module.weight + QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x128_stage3_warpsize1x2x1_g1_sptensor16x8x64_t1r1s1 Tactic: -233227833606287806 [03/25/2022-13:24:15] [V] [TRT] Tactic: -233227833606287806 Time: 0.307456 [03/25/2022-13:24:15] [V] [TRT] sections.0.0.conv3.module.weight + QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -111878368089469751 [03/25/2022-13:24:15] [V] [TRT] Tactic: -111878368089469751 Time: 0.33024 [03/25/2022-13:24:15] [V] [TRT] sections.0.0.conv3.module.weight + QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x64x64_stage3_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -48936598874722005 [03/25/2022-13:24:15] [V] [TRT] Tactic: -48936598874722005 Time: 0.2176 [03/25/2022-13:24:15] [V] [TRT] sections.0.0.conv3.module.weight + QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x32x64_stage4_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1 
Tactic: -19707840769375107 [03/25/2022-13:24:15] [V] [TRT] Tactic: -19707840769375107 Time: 0.291456 [03/25/2022-13:24:15] [V] [TRT] Fastest Tactic: 2376898825218218566 Time: 0.208 [03/25/2022-13:24:15] [V] [TRT] >>>>>>>>>>>>>>> Chose Runner Type: CaskConvolution Tactic: 2376898825218218566 [03/25/2022-13:24:15] [V] [TRT] =============== Computing costs for [03/25/2022-13:24:15] [V] [TRT] *************** Autotuning format combination: Int8(200704,3136:4,56,1) -> Int8(50176,3136:4,56,1) *************** [03/25/2022-13:24:15] [V] [TRT] --------------- Timing Runner: sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 (CudaDepthwiseConvolution) [03/25/2022-13:24:15] [V] [TRT] CudaDepthwiseConvolution has no valid tactics for this config, skipping [03/25/2022-13:24:15] [V] [TRT] --------------- Timing Runner: sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 (FusedConvActConvolution) [03/25/2022-13:24:15] [V] [TRT] FusedConvActConvolution has no valid tactics for this config, skipping [03/25/2022-13:24:15] [V] [TRT] --------------- Timing Runner: sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 (CaskConvolution) [03/25/2022-13:24:15] [V] [TRT] sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_medium_nn_v1 Tactic: 175853789719975416 [03/25/2022-13:24:15] [V] [TRT] Tactic: 175853789719975416 Time: 0.248576 [03/25/2022-13:24:15] [V] [TRT] sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_xregs_medium_nn_v1 Tactic: 2171150287007712632 [03/25/2022-13:24:15] [V] [TRT] Tactic: 2171150287007712632 Time: 0.251264 [03/25/2022-13:24:15] [V] [TRT] sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x64_relu_small_nn_v1 Tactic: 2234457234705232274 [03/25/2022-13:24:15] [V] [TRT] Tactic: 2234457234705232274 Time: 0.206336 [03/25/2022-13:24:15] [V] [TRT] sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x64_relu_medium_nn_v1 Tactic: 5834048089706882838 [03/25/2022-13:24:15] [V] [TRT] Tactic: 5834048089706882838 Time: 0.207872 [03/25/2022-13:24:15] [V] [TRT] sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x128_relu_interior_nn_v1 Tactic: 6299962968199310600 [03/25/2022-13:24:15] [V] [TRT] Tactic: 6299962968199310600 Time: 0.3872 [03/25/2022-13:24:15] [V] [TRT] sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_xregs_interior_nn_v1 Tactic: 6341572697076960911 [03/25/2022-13:24:15] [V] [TRT] Tactic: 6341572697076960911 Time: 0.233728 [03/25/2022-13:24:15] [V] [TRT] sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_small_nn_v1 Tactic: -8626990807754934295 [03/25/2022-13:24:15] [V] [TRT] Tactic: -8626990807754934295 Time: 0.246016 [03/25/2022-13:24:15] [V] [TRT] sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x64_relu_interior_nn_v1 
Tactic: -8498217049614706532 [03/25/2022-13:24:15] [V] [TRT] Tactic: -8498217049614706532 Time: 0.198016 [03/25/2022-13:24:15] [V] [TRT] sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_xregs_small_nn_v1 Tactic: -7303593854972602201 [03/25/2022-13:24:15] [V] [TRT] Tactic: -7303593854972602201 Time: 0.239232 [03/25/2022-13:24:15] [V] [TRT] sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x128_relu_medium_nn_v1 Tactic: -6585664687867083638 [03/25/2022-13:24:15] [V] [TRT] Tactic: -6585664687867083638 Time: 0.389376 [03/25/2022-13:24:15] [V] [TRT] sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_interior_nn_v1 Tactic: -3326139578711341011 [03/25/2022-13:24:15] [V] [TRT] Tactic: -3326139578711341011 Time: 0.236928 [03/25/2022-13:24:15] [V] [TRT] sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x128_relu_small_nn_v1 Tactic: -683636008127039856 [03/25/2022-13:24:15] [V] [TRT] Tactic: -683636008127039856 Time: 0.389376 [03/25/2022-13:24:15] [V] [TRT] Fastest Tactic: -8498217049614706532 Time: 0.198016 [03/25/2022-13:24:15] [V] [TRT] >>>>>>>>>>>>>>> Chose Runner Type: CaskConvolution Tactic: -8498217049614706532 [03/25/2022-13:24:15] [V] [TRT] *************** Autotuning format combination: Int8(200704,3136:4,56,1) -> Int8(6272,3136:32,56,1) *************** [03/25/2022-13:24:15] [V] [TRT] --------------- Timing Runner: sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 (CaskConvolution) [03/25/2022-13:24:15] [V] [TRT] sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_small_c32_nn_v1 Tactic: 1100922622480907544 [03/25/2022-13:24:15] [V] [TRT] Tactic: 1100922622480907544 Time: 0.24512 [03/25/2022-13:24:15] [V] [TRT] sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x128_relu_interior_c32_nn_v1 Tactic: 2855900226702061782 [03/25/2022-13:24:15] [V] [TRT] Tactic: 2855900226702061782 Time: 0.385408 [03/25/2022-13:24:15] [V] [TRT] sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x64_relu_medium_c32_nn_v1 Tactic: 3606311198834416176 [03/25/2022-13:24:15] [V] [TRT] Tactic: 3606311198834416176 Time: 0.206464 [03/25/2022-13:24:15] [V] [TRT] sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x128_relu_small_c32_nn_v1 Tactic: 4325765560739862899 [03/25/2022-13:24:15] [V] [TRT] Tactic: 4325765560739862899 Time: 0.388096 [03/25/2022-13:24:15] [V] [TRT] sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x64_relu_interior_c32_nn_v1 Tactic: 8803458114157674373 [03/25/2022-13:24:15] [V] [TRT] Tactic: 8803458114157674373 Time: 0.197504 [03/25/2022-13:24:15] [V] [TRT] sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_interior_c32_nn_v1 Tactic: 
-6934773036503365000 [03/25/2022-13:24:15] [V] [TRT] Tactic: -6934773036503365000 Time: 0.23616 [03/25/2022-13:24:15] [V] [TRT] sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_xregs_interior_c32_nn_v1 Tactic: -4431642509665791294 [03/25/2022-13:24:15] [V] [TRT] Tactic: -4431642509665791294 Time: 0.23168 [03/25/2022-13:24:15] [V] [TRT] sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x128_relu_medium_c32_nn_v1 Tactic: -4255737803793506479 [03/25/2022-13:24:15] [V] [TRT] Tactic: -4255737803793506479 Time: 0.386432 [03/25/2022-13:24:15] [V] [TRT] sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_xregs_small_c32_nn_v1 Tactic: -3958182351168863467 [03/25/2022-13:24:15] [V] [TRT] Tactic: -3958182351168863467 Time: 0.236288 [03/25/2022-13:24:15] [V] [TRT] sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_xregs_medium_c32_nn_v1 Tactic: -3111968753064955248 [03/25/2022-13:24:15] [V] [TRT] Tactic: -3111968753064955248 Time: 0.248832 [03/25/2022-13:24:15] [V] [TRT] sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_medium_c32_nn_v1 Tactic: -1492575840277333548 [03/25/2022-13:24:15] [V] [TRT] Tactic: -1492575840277333548 Time: 0.247552 [03/25/2022-13:24:15] [V] [TRT] sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x64_relu_small_c32_nn_v1 Tactic: -868495160148524802 [03/25/2022-13:24:15] [V] [TRT] Tactic: -868495160148524802 Time: 0.205696 [03/25/2022-13:24:15] [V] [TRT] Fastest Tactic: 8803458114157674373 Time: 0.197504 [03/25/2022-13:24:15] [V] [TRT] >>>>>>>>>>>>>>> Chose Runner Type: CaskConvolution Tactic: 8803458114157674373 [03/25/2022-13:24:15] [V] [TRT] *************** Autotuning format combination: Int8(25088,3136:32,56,1) -> Int8(6272,3136:32,56,1) *************** [03/25/2022-13:24:15] [V] [TRT] --------------- Timing Runner: sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 (CudaGroupConvolution) [03/25/2022-13:24:15] [V] [TRT] CudaGroupConvolution has no valid tactics for this config, skipping [03/25/2022-13:24:15] [V] [TRT] --------------- Timing Runner: sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 (CudaDepthwiseConvolution) [03/25/2022-13:24:15] [V] [TRT] CudaDepthwiseConvolution has no valid tactics for this config, skipping [03/25/2022-13:24:15] [V] [TRT] --------------- Timing Runner: sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 (FusedConvActConvolution) [03/25/2022-13:24:15] [V] [TRT] FusedConvActConvolution has no valid tactics for this config, skipping [03/25/2022-13:24:15] [V] [TRT] --------------- Timing Runner: sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 (CaskConvolution) [03/25/2022-13:24:15] [V] [TRT] sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 Set Tactic Name: 
sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage1_warpsize4x1x1_g1_tensor8x8x16_t1r1s1 Tactic: 68468667201176803 [03/25/2022-13:24:15] [V] [TRT] Tactic: 68468667201176803 Time: 0.115712 [03/25/2022-13:24:15] [V] [TRT] sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x192x64_stage3_warpsize4x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 125145153013230687 [03/25/2022-13:24:15] [V] [TRT] Tactic: 125145153013230687 Time: 0.224256 [03/25/2022-13:24:15] [V] [TRT] sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x128_ldg16_relu_interior_nt_v1 Tactic: 434957160407688216 [03/25/2022-13:24:15] [V] [TRT] Tactic: 434957160407688216 Time: 0.204544 [03/25/2022-13:24:15] [V] [TRT] sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage4_warpsize4x2x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: 805889586762897346 [03/25/2022-13:24:15] [V] [TRT] Tactic: 805889586762897346 Time: 0.128128 [03/25/2022-13:24:15] [V] [TRT] sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x128_stage3_warpsize4x2x1_g1_sptensor16x8x64_t1r1s1 Tactic: 857001784974286465 [03/25/2022-13:24:15] [V] [TRT] Tactic: 857001784974286465 Time: 0.22144 [03/25/2022-13:24:15] [V] [TRT] sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 1214130898909872671 [03/25/2022-13:24:15] [V] [TRT] Tactic: 1214130898909872671 Time: 0.151808 [03/25/2022-13:24:15] [V] [TRT] sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage4_warpsize4x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 1278425129871930205 [03/25/2022-13:24:15] [V] [TRT] Tactic: 1278425129871930205 Time: 0.121088 [03/25/2022-13:24:15] [V] [TRT] sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 1583811548148740665 [03/25/2022-13:24:15] [V] [TRT] Tactic: 1583811548148740665 Time: 0.19968 [03/25/2022-13:24:15] [V] [TRT] sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 1701344857577810806 [03/25/2022-13:24:15] [V] [TRT] Tactic: 1701344857577810806 Time: 0.120704 [03/25/2022-13:24:15] [V] [TRT] sections.0.1.conv1.module.weight + 
QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x32x64_stage4_warpsize2x1x1_g1_tensor16x8x32_t1r1s1 Tactic: 1797231177354918208 [03/25/2022-13:24:15] [V] [TRT] Tactic: 1797231177354918208 Time: 0.108288 [03/25/2022-13:24:15] [V] [TRT] sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x192x64_stage3_warpsize4x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 2004812516525036381 [03/25/2022-13:24:15] [V] [TRT] Tactic: 2004812516525036381 Time: 0.17024 [03/25/2022-13:24:15] [V] [TRT] sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x128x64_stage3_warpsize4x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 2030033463723799063 [03/25/2022-13:24:15] [V] [TRT] Tactic: 2030033463723799063 Time: 0.126336 [03/25/2022-13:24:15] [V] [TRT] sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 Set Tactic Name: ampere_int8_i8816cudnn_int8_128x128_ldg16_relu_interior_nt_v1 Tactic: 2346437292116182513 [03/25/2022-13:24:15] [V] [TRT] Tactic: 2346437292116182513 Time: 0.168192 [03/25/2022-13:24:15] [V] [TRT] sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 2376898825218218566 [03/25/2022-13:24:16] [V] [TRT] Tactic: 2376898825218218566 Time: 0.108672 [03/25/2022-13:24:16] [V] [TRT] sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_singleBuffer_interior_nt_v1 Tactic: 2522133112320625287 [03/25/2022-13:24:16] [V] [TRT] Tactic: 2522133112320625287 Time: 0.116992 [03/25/2022-13:24:16] [V] [TRT] sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage4_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 2548171972648455240 [03/25/2022-13:24:16] [V] [TRT] Tactic: 2548171972648455240 Time: 0.104448 [03/25/2022-13:24:16] [V] [TRT] sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage1_warpsize2x1x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: 2548946449357458230 [03/25/2022-13:24:16] [V] [TRT] Tactic: 2548946449357458230 Time: 0.118912 [03/25/2022-13:24:16] [V] [TRT] sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x128_stage3_warpsize1x1x1_g1_sptensor16x8x64_simple_t1r1s1_no_preds Tactic: 2570666021825229009 [03/25/2022-13:24:16] [V] [TRT] Tactic: 2570666021825229009 Time: 0.116992 [03/25/2022-13:24:16] [V] [TRT] 
sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x16x128_stage3_warpsize1x1x1_g1_sptensor16x8x64_simple_t1r1s1 Tactic: 2678520742286844763 [03/25/2022-13:24:16] [V] [TRT] Tactic: 2678520742286844763 Time: 0.192896 [03/25/2022-13:24:16] [V] [TRT] sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 2756291002030759362 [03/25/2022-13:24:16] [V] [TRT] Tactic: 2756291002030759362 Time: 0.103552 [03/25/2022-13:24:16] [V] [TRT] sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x128x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 2972948223367788520 [03/25/2022-13:24:16] [V] [TRT] Tactic: 2972948223367788520 Time: 0.1152 [03/25/2022-13:24:16] [V] [TRT] sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 Set Tactic Name: ampere_int8_i8816cudnn_int8_128x128_ldg16_relu_medium_nt_v1 Tactic: 2985940154541537814 [03/25/2022-13:24:16] [V] [TRT] Tactic: 2985940154541537814 Time: 0.167296 [03/25/2022-13:24:16] [V] [TRT] sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x96x64_stage5_warpsize2x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 3043273137345374664 [03/25/2022-13:24:16] [V] [TRT] Tactic: 3043273137345374664 Time: 0.153088 [03/25/2022-13:24:16] [V] [TRT] sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 3221677093659484230 [03/25/2022-13:24:16] [V] [TRT] Tactic: 3221677093659484230 Time: 0.186368 [03/25/2022-13:24:16] [V] [TRT] sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r1s1 Tactic: 3242897809704328258 [03/25/2022-13:24:16] [V] [TRT] Tactic: 3242897809704328258 Time: 0.11264 [03/25/2022-13:24:16] [V] [TRT] sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: 3312456766204252694 [03/25/2022-13:24:16] [V] [TRT] Tactic: 3312456766204252694 Time: 0.120448 [03/25/2022-13:24:16] [V] [TRT] sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 3538565962642681625 [03/25/2022-13:24:16] [V] [TRT] 
Tactic: 3538565962642681625 Time: 0.102784 [03/25/2022-13:24:16] [V] [TRT] sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage1_warpsize2x2x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: 3541919052468401776 [03/25/2022-13:24:16] [V] [TRT] Tactic: 3541919052468401776 Time: 0.150912 [03/25/2022-13:24:16] [V] [TRT] sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 3593397928177382100 [03/25/2022-13:24:16] [V] [TRT] Tactic: 3593397928177382100 Time: 0.151808 [03/25/2022-13:24:16] [V] [TRT] sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x64x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 3670282018109435863 [03/25/2022-13:24:16] [V] [TRT] Tactic: 3670282018109435863 Time: 0.103168 [03/25/2022-13:24:16] [V] [TRT] sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 3671413346254027573 [03/25/2022-13:24:16] [V] [TRT] Tactic: 3671413346254027573 Time: 0.115712 [03/25/2022-13:24:16] [V] [TRT] sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x128_ldg16_relu_medium_nt_v1 Tactic: 3899284354987683408 [03/25/2022-13:24:16] [V] [TRT] Tactic: 3899284354987683408 Time: 0.202752 [03/25/2022-13:24:16] [V] [TRT] sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: 3927509214678622419 [03/25/2022-13:24:16] [V] [TRT] Tactic: 3927509214678622419 Time: 0.109952 [03/25/2022-13:24:16] [V] [TRT] sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 4112572034735311841 [03/25/2022-13:24:16] [V] [TRT] Tactic: 4112572034735311841 Time: 0.168448 [03/25/2022-13:24:16] [V] [TRT] sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 4239974928951431644 [03/25/2022-13:24:16] [V] [TRT] Tactic: 4239974928951431644 Time: 0.121856 [03/25/2022-13:24:16] [V] [TRT] sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 Set Tactic Name: 
sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage6_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 4610760414797216079 [03/25/2022-13:24:16] [V] [TRT] Tactic: 4610760414797216079 Time: 0.131328 [03/25/2022-13:24:16] [V] [TRT] sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_small_nt_v1 Tactic: 4717285412741024953 [03/25/2022-13:24:16] [V] [TRT] Tactic: 4717285412741024953 Time: 0.119424 [03/25/2022-13:24:16] [V] [TRT] sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 4796956614760326119 [03/25/2022-13:24:16] [V] [TRT] Tactic: 4796956614760326119 Time: 0.140032 [03/25/2022-13:24:16] [V] [TRT] sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 4919361344804309192 [03/25/2022-13:24:16] [V] [TRT] Tactic: 4919361344804309192 Time: 0.115456 [03/25/2022-13:24:16] [V] [TRT] sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x128_stage3_warpsize1x2x1_g1_sptensor16x8x64_simple_t1r1s1 Tactic: 5043674678294309681 [03/25/2022-13:24:16] [V] [TRT] Tactic: 5043674678294309681 Time: 0.103296 [03/25/2022-13:24:16] [V] [TRT] sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_interior_nt_v1 Tactic: 5126565865931538390 [03/25/2022-13:24:16] [V] [TRT] Tactic: 5126565865931538390 Time: 0.11776 [03/25/2022-13:24:16] [V] [TRT] sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x64x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 5204702486885981735 [03/25/2022-13:24:16] [V] [TRT] Tactic: 5204702486885981735 Time: 0.107136 [03/25/2022-13:24:16] [V] [TRT] sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: 5375256703210220108 [03/25/2022-13:24:16] [V] [TRT] Tactic: 5375256703210220108 Time: 0.101888 [03/25/2022-13:24:16] [V] [TRT] sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x128_stage3_warpsize2x2x1_g1_sptensor16x8x64_simple_t1r1s1 Tactic: 5424258848951129084 [03/25/2022-13:24:16] [V] [TRT] Tactic: 5424258848951129084 Time: 0.107648 [03/25/2022-13:24:16] [V] [TRT] sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 Set Tactic Name: 
sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 5443897483205284103 [03/25/2022-13:24:16] [V] [TRT] Tactic: 5443897483205284103 Time: 0.118912 [03/25/2022-13:24:16] [V] [TRT] sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage4_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 5707566217891294846 [03/25/2022-13:24:16] [V] [TRT] Tactic: 5707566217891294846 Time: 0.107008 [03/25/2022-13:24:16] [V] [TRT] sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x4x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 5986622376339202983 [03/25/2022-13:24:16] [V] [TRT] Tactic: 5986622376339202983 Time: 0.132352 [03/25/2022-13:24:16] [V] [TRT] sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x128x64_stage3_warpsize4x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 6007888770437705057 [03/25/2022-13:24:16] [V] [TRT] Tactic: 6007888770437705057 Time: 0.148992 [03/25/2022-13:24:16] [V] [TRT] sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32_t1r1s1 Tactic: 6405251167055673379 [03/25/2022-13:24:16] [V] [TRT] Tactic: 6405251167055673379 Time: 0.14272 [03/25/2022-13:24:16] [V] [TRT] sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage4_warpsize4x1x1_g1_tensor16x8x32_t1r1s1 Tactic: 6433368103202497147 [03/25/2022-13:24:16] [V] [TRT] Tactic: 6433368103202497147 Time: 0.109696 [03/25/2022-13:24:16] [V] [TRT] sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 6441948709525127755 [03/25/2022-13:24:16] [V] [TRT] Tactic: 6441948709525127755 Time: 0.152064 [03/25/2022-13:24:16] [V] [TRT] sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x256x64_stage3_warpsize2x4x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 6443933097134654777 [03/25/2022-13:24:16] [V] [TRT] Tactic: 6443933097134654777 Time: 0.255232 [03/25/2022-13:24:16] [V] [TRT] sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 6457435868048963632 [03/25/2022-13:24:16] [V] [TRT] Tactic: 6457435868048963632 
Time: 0.133504 [03/25/2022-13:24:16] [V] [TRT] sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x128_stage3_warpsize1x1x1_g1_sptensor16x8x64_t1r1s1 Tactic: 6510345569544721081 [03/25/2022-13:24:16] [V] [TRT] Tactic: 6510345569544721081 Time: 0.11712 [03/25/2022-13:24:16] [V] [TRT] sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 6793988781414507278 [03/25/2022-13:24:16] [V] [TRT] Tactic: 6793988781414507278 Time: 0.11584 [03/25/2022-13:24:16] [V] [TRT] sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 6880710371738875469 [03/25/2022-13:24:16] [V] [TRT] Tactic: 6880710371738875469 Time: 0.161152 [03/25/2022-13:24:16] [V] [TRT] sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 6925201228918187099 [03/25/2022-13:24:16] [V] [TRT] Tactic: 6925201228918187099 Time: 0.11904 [03/25/2022-13:24:16] [V] [TRT] sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 6991524515605108718 [03/25/2022-13:24:16] [V] [TRT] Tactic: 6991524515605108718 Time: 0.22144 [03/25/2022-13:24:16] [V] [TRT] sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x4x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 7245509442265271220 [03/25/2022-13:24:16] [V] [TRT] Tactic: 7245509442265271220 Time: 0.129408 [03/25/2022-13:24:16] [V] [TRT] sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x64x64_stage3_warpsize4x1x1_g1_tensor16x8x32_t1r1s1 Tactic: 7318929579222925725 [03/25/2022-13:24:16] [V] [TRT] Tactic: 7318929579222925725 Time: 0.107136 [03/25/2022-13:24:16] [V] [TRT] sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: 7731430299029542276 [03/25/2022-13:24:16] [V] [TRT] Tactic: 7731430299029542276 Time: 0.112768 [03/25/2022-13:24:16] [V] [TRT] sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 Set Tactic Name: 
sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage4_warpsize2x4x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: 7738495016763012180 [03/25/2022-13:24:16] [V] [TRT] Tactic: 7738495016763012180 Time: 0.22208 [03/25/2022-13:24:16] [V] [TRT] sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x128_stage3_warpsize1x4x1_g1_sptensor16x8x64_t1r1s1_no_preds Tactic: 7886967395128926382 [03/25/2022-13:24:16] [V] [TRT] Tactic: 7886967395128926382 Time: 0.101632 [03/25/2022-13:24:16] [V] [TRT] sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage4_warpsize4x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 8142283985160822229 [03/25/2022-13:24:16] [V] [TRT] Tactic: 8142283985160822229 Time: 0.141312 [03/25/2022-13:24:16] [V] [TRT] sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage4_warpsize2x4x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 8173975624668590862 [03/25/2022-13:24:16] [V] [TRT] Tactic: 8173975624668590862 Time: 0.252288 [03/25/2022-13:24:16] [V] [TRT] sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage4_warpsize4x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 8234775147403903473 [03/25/2022-13:24:16] [V] [TRT] Tactic: 8234775147403903473 Time: 0.147968 [03/25/2022-13:24:16] [V] [TRT] sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x128x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 8524082966802584889 [03/25/2022-13:24:16] [V] [TRT] Tactic: 8524082966802584889 Time: 0.119808 [03/25/2022-13:24:16] [V] [TRT] sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage1_warpsize4x1x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: 8684013308930763400 [03/25/2022-13:24:16] [V] [TRT] Tactic: 8684013308930763400 Time: 0.113024 [03/25/2022-13:24:16] [V] [TRT] sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x128x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 8765382722978397630 [03/25/2022-13:24:16] [V] [TRT] Tactic: 8765382722978397630 Time: 0.120576 [03/25/2022-13:24:16] [V] [TRT] sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 8843193587782643431 [03/25/2022-13:24:16] [V] 
[TRT] Tactic: 8843193587782643431 Time: 0.126336 [03/25/2022-13:24:16] [V] [TRT] sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x128_stage3_warpsize1x4x1_g1_sptensor16x8x64_simple_t1r1s1 Tactic: 8883810517410230831 [03/25/2022-13:24:16] [V] [TRT] Tactic: 8883810517410230831 Time: 0.10176 [03/25/2022-13:24:16] [V] [TRT] sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 8930797211803511337 [03/25/2022-13:24:16] [V] [TRT] Tactic: 8930797211803511337 Time: 0.208768 [03/25/2022-13:24:16] [V] [TRT] sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 8935070489925739043 [03/25/2022-13:24:16] [V] [TRT] Tactic: 8935070489925739043 Time: 0.107648 [03/25/2022-13:24:16] [V] [TRT] sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x16x128_stage3_warpsize1x1x1_g1_sptensor16x8x64_t1r1s1 Tactic: 9062173295331155069 [03/25/2022-13:24:16] [V] [TRT] Tactic: 9062173295331155069 Time: 0.192896 [03/25/2022-13:24:16] [V] [TRT] sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_singleBuffer_small_nt_v1 Tactic: -9118785798277698619 [03/25/2022-13:24:16] [V] [TRT] Tactic: -9118785798277698619 Time: 0.118016 [03/25/2022-13:24:16] [V] [TRT] sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -8985599729413291927 [03/25/2022-13:24:16] [V] [TRT] Tactic: -8985599729413291927 Time: 0.101888 [03/25/2022-13:24:16] [V] [TRT] sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x32x64_stage4_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -8972697510150675429 [03/25/2022-13:24:16] [V] [TRT] Tactic: -8972697510150675429 Time: 0.10304 [03/25/2022-13:24:16] [V] [TRT] sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r1s1 Tactic: -8943710627305202139 [03/25/2022-13:24:16] [V] [TRT] Tactic: -8943710627305202139 Time: 0.160768 [03/25/2022-13:24:16] [V] [TRT] sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 Set Tactic Name: 
sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: -8859846367886814331 [03/25/2022-13:24:16] [V] [TRT] Tactic: -8859846367886814331 Time: 0.114816 [03/25/2022-13:24:16] [V] [TRT] sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: -8638624340850784688 [03/25/2022-13:24:16] [V] [TRT] Tactic: -8638624340850784688 Time: 0.143104 [03/25/2022-13:24:16] [V] [TRT] sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage4_warpsize4x1x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: -8556775352640313933 [03/25/2022-13:24:16] [V] [TRT] Tactic: -8556775352640313933 Time: 0.105728 [03/25/2022-13:24:16] [V] [TRT] sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage1_warpsize2x1x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: -8382298409581540699 [03/25/2022-13:24:16] [V] [TRT] Tactic: -8382298409581540699 Time: 0.132352 [03/25/2022-13:24:16] [V] [TRT] sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage1_warpsize4x2x1_g1_tensor8x8x16_t1r1s1 Tactic: -8172318747337038866 [03/25/2022-13:24:16] [V] [TRT] Tactic: -8172318747337038866 Time: 0.194048 [03/25/2022-13:24:16] [V] [TRT] sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x256x64_stage3_warpsize2x4x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -8038164441468184723 [03/25/2022-13:24:16] [V] [TRT] Tactic: -8038164441468184723 Time: 0.213248 [03/25/2022-13:24:16] [V] [TRT] sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x96x64_stage5_warpsize2x1x1_g1_tensor16x8x32_t1r1s1 Tactic: -7844028314176826857 [03/25/2022-13:24:16] [V] [TRT] Tactic: -7844028314176826857 Time: 0.153984 [03/25/2022-13:24:16] [V] [TRT] sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage4_warpsize2x4x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -7364286662638617917 [03/25/2022-13:24:16] [V] [TRT] Tactic: -7364286662638617917 Time: 0.2112 [03/25/2022-13:24:16] [V] [TRT] sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage1_warpsize2x1x1_g1_tensor8x8x16_t1r1s1 Tactic: -7361755530333096258 [03/25/2022-13:24:16] [V] 
[TRT] Tactic: -7361755530333096258 Time: 0.122112 [03/25/2022-13:24:16] [V] [TRT] sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x128_stage3_warpsize1x4x1_g1_sptensor16x8x64_t1r1s1 Tactic: -7289760022626653388 [03/25/2022-13:24:16] [V] [TRT] Tactic: -7289760022626653388 Time: 0.10176 [03/25/2022-13:24:16] [V] [TRT] sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage1_warpsize2x4x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: -7106539943789766885 [03/25/2022-13:24:16] [V] [TRT] Tactic: -7106539943789766885 Time: 0.34048 [03/25/2022-13:24:16] [V] [TRT] sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage1_warpsize4x2x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: -6969478418607271266 [03/25/2022-13:24:16] [V] [TRT] Tactic: -6969478418607271266 Time: 0.188544 [03/25/2022-13:24:16] [V] [TRT] sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -6930438165437733000 [03/25/2022-13:24:16] [V] [TRT] Tactic: -6930438165437733000 Time: 0.168064 [03/25/2022-13:24:16] [V] [TRT] sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x64x64_stage3_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -6879607992933502380 [03/25/2022-13:24:16] [V] [TRT] Tactic: -6879607992933502380 Time: 0.106112 [03/25/2022-13:24:16] [V] [TRT] sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x128_stage3_warpsize1x2x1_g1_sptensor16x8x64_simple_t1r1s1_no_preds Tactic: -6839669803644810934 [03/25/2022-13:24:16] [V] [TRT] Tactic: -6839669803644810934 Time: 0.103168 [03/25/2022-13:24:16] [V] [TRT] sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x128_stage3_warpsize1x2x1_g1_sptensor16x8x64_t1r1s1_no_preds Tactic: -6812830108414456369 [03/25/2022-13:24:16] [V] [TRT] Tactic: -6812830108414456369 Time: 0.102784 [03/25/2022-13:24:16] [V] [TRT] sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: -6527178416855951297 [03/25/2022-13:24:16] [V] [TRT] Tactic: -6527178416855951297 Time: 0.117888 [03/25/2022-13:24:16] [V] [TRT] sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 Set Tactic Name: 
sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_t1r1s1 Tactic: -6510232214299595844 [03/25/2022-13:24:16] [V] [TRT] Tactic: -6510232214299595844 Time: 0.118144 [03/25/2022-13:24:16] [V] [TRT] sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 Set Tactic Name: ampere_int8_i8816cudnn_int8_128x128_ldg16_relu_small_nt_v1 Tactic: -6400348606759295499 [03/25/2022-13:24:16] [V] [TRT] Tactic: -6400348606759295499 Time: 0.161152 [03/25/2022-13:24:16] [V] [TRT] sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage1_warpsize2x2x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: -6346247605026339453 [03/25/2022-13:24:16] [V] [TRT] Tactic: -6346247605026339453 Time: 0.157824 [03/25/2022-13:24:16] [V] [TRT] sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x96x64_stage4_warpsize4x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -6232597026469067819 [03/25/2022-13:24:16] [V] [TRT] Tactic: -6232597026469067819 Time: 0.14208 [03/25/2022-13:24:16] [V] [TRT] sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x128_ldg16_relu_small_nt_v1 Tactic: -5980889159865208399 [03/25/2022-13:24:16] [V] [TRT] Tactic: -5980889159865208399 Time: 0.19776 [03/25/2022-13:24:16] [V] [TRT] sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_singleBuffer_medium_nt_v1 Tactic: -5766140806760372989 [03/25/2022-13:24:16] [V] [TRT] Tactic: -5766140806760372989 Time: 0.1184 [03/25/2022-13:24:16] [V] [TRT] sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r1s1 Tactic: -5697614955743334137 [03/25/2022-13:24:16] [V] [TRT] Tactic: -5697614955743334137 Time: 0.155904 [03/25/2022-13:24:16] [V] [TRT] sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -5671123121710113970 [03/25/2022-13:24:16] [V] [TRT] Tactic: -5671123121710113970 Time: 0.126336 [03/25/2022-13:24:16] [V] [TRT] sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x32x64_stage4_warpsize2x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -5615581362569252260 [03/25/2022-13:24:16] [V] [TRT] Tactic: -5615581362569252260 Time: 0.105728 [03/25/2022-13:24:16] [V] [TRT] sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 Set Tactic Name: 
sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x128x64_stage3_warpsize4x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -5562968047117507056 [03/25/2022-13:24:16] [V] [TRT] Tactic: -5562968047117507056 Time: 0.147072 [03/25/2022-13:24:16] [V] [TRT] sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -5516472881360101487 [03/25/2022-13:24:16] [V] [TRT] Tactic: -5516472881360101487 Time: 0.142848 [03/25/2022-13:24:16] [V] [TRT] sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r1s1 Tactic: -5311474420963248369 [03/25/2022-13:24:16] [V] [TRT] Tactic: -5311474420963248369 Time: 0.12288 [03/25/2022-13:24:16] [V] [TRT] sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: -5170003087447722174 [03/25/2022-13:24:16] [V] [TRT] Tactic: -5170003087447722174 Time: 0.152064 [03/25/2022-13:24:16] [V] [TRT] sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage6_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -4889586143772361690 [03/25/2022-13:24:16] [V] [TRT] Tactic: -4889586143772361690 Time: 0.120576 [03/25/2022-13:24:16] [V] [TRT] sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x128_stage3_warpsize1x4x1_g1_sptensor16x8x64_simple_t1r1s1_no_preds Tactic: -4889498558023475527 [03/25/2022-13:24:16] [V] [TRT] Tactic: -4889498558023475527 Time: 0.101376 [03/25/2022-13:24:16] [V] [TRT] sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: -4849712423393454704 [03/25/2022-13:24:16] [V] [TRT] Tactic: -4849712423393454704 Time: 0.127232 [03/25/2022-13:24:16] [V] [TRT] sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x128_stage3_warpsize4x2x1_g1_sptensor16x8x64_simple_t1r1s1 Tactic: -4681913707320020520 [03/25/2022-13:24:16] [V] [TRT] Tactic: -4681913707320020520 Time: 0.218752 [03/25/2022-13:24:16] [V] [TRT] sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_medium_nt_v1 Tactic: -4516822589357530549 [03/25/2022-13:24:16] [V] [TRT] Tactic: -4516822589357530549 Time: 0.12096 [03/25/2022-13:24:16] [V] [TRT] 
sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -4455415102719506646 [03/25/2022-13:24:16] [V] [TRT] Tactic: -4455415102719506646 Time: 0.102656 [03/25/2022-13:24:16] [V] [TRT] sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32_t1r1s1 Tactic: -4425346730823666456 [03/25/2022-13:24:16] [V] [TRT] Tactic: -4425346730823666456 Time: 0.185728 [03/25/2022-13:24:16] [V] [TRT] sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage1_warpsize2x1x1_g1_tensor8x8x16_t1r1s1 Tactic: -4260476497340370474 [03/25/2022-13:24:16] [V] [TRT] Tactic: -4260476497340370474 Time: 0.134912 [03/25/2022-13:24:16] [V] [TRT] sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x2x1_g1_tensor16x8x32_t1r1s1 Tactic: -4182501876984672402 [03/25/2022-13:24:16] [V] [TRT] Tactic: -4182501876984672402 Time: 0.127232 [03/25/2022-13:24:16] [V] [TRT] sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x256x64_stage3_warpsize2x4x1_g1_tensor16x8x32_t1r1s1 Tactic: -4151617293257698859 [03/25/2022-13:24:16] [V] [TRT] Tactic: -4151617293257698859 Time: 0.259968 [03/25/2022-13:24:16] [V] [TRT] sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -3825889760337461729 [03/25/2022-13:24:16] [V] [TRT] Tactic: -3825889760337461729 Time: 0.114944 [03/25/2022-13:24:16] [V] [TRT] sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -3797022944823726673 [03/25/2022-13:24:16] [V] [TRT] Tactic: -3797022944823726673 Time: 0.102656 [03/25/2022-13:24:16] [V] [TRT] sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_t1r1s1 Tactic: -3613322253849278738 [03/25/2022-13:24:16] [V] [TRT] Tactic: -3613322253849278738 Time: 0.169472 [03/25/2022-13:24:16] [V] [TRT] sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 Set Tactic Name: 
sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x32x64_stage4_warpsize4x1x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: -3577322188448771475 [03/25/2022-13:24:16] [V] [TRT] Tactic: -3577322188448771475 Time: 0.104832 [03/25/2022-13:24:16] [V] [TRT] sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage1_warpsize2x4x1_g1_tensor8x8x16_t1r1s1 Tactic: -3531681826488401618 [03/25/2022-13:24:16] [V] [TRT] Tactic: -3531681826488401618 Time: 0.350976 [03/25/2022-13:24:16] [V] [TRT] sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x16x128_stage3_warpsize1x1x1_g1_sptensor16x8x64_simple_t1r1s1_no_preds Tactic: -3305554949874552860 [03/25/2022-13:24:16] [V] [TRT] Tactic: -3305554949874552860 Time: 0.192768 [03/25/2022-13:24:16] [V] [TRT] sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x192x64_stage3_warpsize4x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -3288585994448820820 [03/25/2022-13:24:16] [V] [TRT] Tactic: -3288585994448820820 Time: 0.219136 [03/25/2022-13:24:16] [V] [TRT] sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x32x64_stage4_warpsize4x1x1_g1_tensor16x8x32_t1r1s1 Tactic: -2754311112012636251 [03/25/2022-13:24:16] [V] [TRT] Tactic: -2754311112012636251 Time: 0.106112 [03/25/2022-13:24:16] [V] [TRT] sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -2432868635536396215 [03/25/2022-13:24:16] [V] [TRT] Tactic: -2432868635536396215 Time: 0.198144 [03/25/2022-13:24:16] [V] [TRT] sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x96x64_stage5_warpsize2x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -2379804152300264660 [03/25/2022-13:24:16] [V] [TRT] Tactic: -2379804152300264660 Time: 0.152832 [03/25/2022-13:24:16] [V] [TRT] sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x128_stage3_warpsize2x2x1_g1_sptensor16x8x64_t1r1s1 Tactic: -2352253835013627337 [03/25/2022-13:24:16] [V] [TRT] Tactic: -2352253835013627337 Time: 0.107776 [03/25/2022-13:24:16] [V] [TRT] sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x96x64_stage4_warpsize4x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -2335587136911650799 
[03/25/2022-13:24:16] [V] [TRT] Tactic: -2335587136911650799 Time: 0.12096 [03/25/2022-13:24:16] [V] [TRT] sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage4_warpsize2x4x1_g1_tensor16x8x32_t1r1s1 Tactic: -2315453944962430928 [03/25/2022-13:24:16] [V] [TRT] Tactic: -2315453944962430928 Time: 0.261376 [03/25/2022-13:24:16] [V] [TRT] sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x128_stage3_warpsize1x1x1_g1_sptensor16x8x64_simple_t1r1s1 Tactic: -2238364958919154661 [03/25/2022-13:24:16] [V] [TRT] Tactic: -2238364958919154661 Time: 0.116992 [03/25/2022-13:24:16] [V] [TRT] sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -1916483171117495388 [03/25/2022-13:24:16] [V] [TRT] Tactic: -1916483171117495388 Time: 0.215552 [03/25/2022-13:24:16] [V] [TRT] sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x16x128_stage3_warpsize1x1x1_g1_sptensor16x8x64_t1r1s1_no_preds Tactic: -1740762957710554518 [03/25/2022-13:24:16] [V] [TRT] Tactic: -1740762957710554518 Time: 0.192896 [03/25/2022-13:24:16] [V] [TRT] sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x96x64_stage4_warpsize4x2x1_g1_tensor16x8x32_t1r1s1 Tactic: -1549742793039499659 [03/25/2022-13:24:16] [V] [TRT] Tactic: -1549742793039499659 Time: 0.144256 [03/25/2022-13:24:16] [V] [TRT] sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: -1499578657823798783 [03/25/2022-13:24:16] [V] [TRT] Tactic: -1499578657823798783 Time: 0.103168 [03/25/2022-13:24:16] [V] [TRT] sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r1s1 Tactic: -1494157908358500249 [03/25/2022-13:24:16] [V] [TRT] Tactic: -1494157908358500249 Time: 0.117248 [03/25/2022-13:24:16] [V] [TRT] sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x4x1_g1_tensor16x8x32_t1r1s1 Tactic: -1328736756812546664 [03/25/2022-13:24:16] [V] [TRT] Tactic: -1328736756812546664 Time: 0.135296 [03/25/2022-13:24:16] [V] [TRT] sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 Set Tactic Name: 
sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x32x64_stage4_warpsize2x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -1006589727652607355 [03/25/2022-13:24:16] [V] [TRT] Tactic: -1006589727652607355 Time: 0.105728 [03/25/2022-13:24:16] [V] [TRT] sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: -713022856474991236 [03/25/2022-13:24:16] [V] [TRT] Tactic: -713022856474991236 Time: 0.169088 [03/25/2022-13:24:16] [V] [TRT] sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x128_stage3_warpsize1x1x1_g1_sptensor16x8x64_t1r1s1_no_preds Tactic: -619668460699260222 [03/25/2022-13:24:16] [V] [TRT] Tactic: -619668460699260222 Time: 0.116992 [03/25/2022-13:24:16] [V] [TRT] sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x64x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: -405554772060757402 [03/25/2022-13:24:16] [V] [TRT] Tactic: -405554772060757402 Time: 0.10944 [03/25/2022-13:24:16] [V] [TRT] sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: -375949437730908730 [03/25/2022-13:24:16] [V] [TRT] Tactic: -375949437730908730 Time: 0.102656 [03/25/2022-13:24:16] [V] [TRT] sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x128_stage3_warpsize1x2x1_g1_sptensor16x8x64_t1r1s1 Tactic: -233227833606287806 [03/25/2022-13:24:16] [V] [TRT] Tactic: -233227833606287806 Time: 0.103168 [03/25/2022-13:24:16] [V] [TRT] sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -111878368089469751 [03/25/2022-13:24:16] [V] [TRT] Tactic: -111878368089469751 Time: 0.167808 [03/25/2022-13:24:16] [V] [TRT] sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x64x64_stage3_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -48936598874722005 [03/25/2022-13:24:16] [V] [TRT] Tactic: -48936598874722005 Time: 0.102912 [03/25/2022-13:24:16] [V] [TRT] sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x32x64_stage4_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -19707840769375107 [03/25/2022-13:24:16] [V] [TRT] Tactic: 
-19707840769375107 Time: 0.104448
[03/25/2022-13:24:16] [V] [TRT] Fastest Tactic: -4889498558023475527 Time: 0.101376
[03/25/2022-13:24:16] [V] [TRT] >>>>>>>>>>>>>>> Chose Runner Type: CaskConvolution Tactic: -4889498558023475527
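The two records above close out the tactic search for the fused node sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96: each candidate kernel is announced with a "Set Tactic Name: ... Tactic: <id>" record, measured in a matching "Tactic: <id> Time: <ms>" record, and the builder keeps the minimum. Dense implicit-GEMM kernels and the 2:4-sparse kernels (names containing "sptensor") compete in the same pool, so the verbose log is the natural place to check how close sparsity came on each layer. A minimal scraper sketch, assuming the log was captured with one record per line to a file named build_verbose.log (both the helper and the file name are illustrative, not part of trtexec):

import re
from collections import defaultdict

SET_RE = re.compile(r"\[TRT\] (.+?) Set Tactic Name: (\S+) Tactic: (-?\d+)")
TIME_RE = re.compile(r"\[TRT\] Tactic: (-?\d+) Time: ([0-9.]+)")

def best_tactics(path="build_verbose.log"):
    kernel, layer_of = {}, {}      # tactic id -> kernel name / fused layer
    best = defaultdict(dict)       # layer -> {"dense"|"sparse": (ms, id)}
    with open(path) as log:
        for line in log:
            m = SET_RE.search(line)
            if m:
                layer_of[m.group(3)] = m.group(1)
                kernel[m.group(3)] = m.group(2)
                continue
            m = TIME_RE.search(line)
            if m and m.group(1) in kernel:
                kind = "sparse" if "sptensor" in kernel[m.group(1)] else "dense"
                ms, prev = float(m.group(2)), best[layer_of[m.group(1)]].get(kind)
                if prev is None or ms < prev[0]:
                    best[layer_of[m.group(1)]][kind] = (ms, m.group(1))
    return best

for layer, kinds in best_tactics().items():
    print(layer)
    for kind, (ms, tid) in sorted(kinds.items()):
        print(f"  fastest {kind}: {ms:.6f} ms (tactic {tid})")

Layers where the best sparse time never beats the best dense time keep a dense tactic, which is why enabling sparsity does not guarantee a per-layer speedup.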
[03/25/2022-13:24:16] [V] [TRT] =============== Computing costs for
[03/25/2022-13:24:16] [V] [TRT] *************** Autotuning format combination: Int8(50176,3136:4,56,1) -> Int8(50176,3136:4,56,1) ***************
[03/25/2022-13:24:16] [V] [TRT] *************** Autotuning format combination: Int8(50176,3136:4,56,1) -> Int8(6272,3136:32,56,1) ***************
[03/25/2022-13:24:16] [V] [TRT] *************** Autotuning format combination: Int8(6272,3136:32,56,1) -> Int8(6272,3136:32,56,1) ***************
[03/25/2022-13:24:16] [V] [TRT] Skip timing cache hit (epiFadd mismatch) for node: sections.0.1.conv2.module.weight + QuantizeLinear_105_quantize_scale_node + Conv_109 + Relu_111, LayerImpl: CaskConvolution, tactic: -3182884991006484042
[03/25/2022-13:24:16] [V] [TRT] --------------- Timing Runner: sections.0.1.conv2.module.weight + QuantizeLinear_105_quantize_scale_node + Conv_109 + Relu_111 (CudaGroupConvolution)
[03/25/2022-13:24:16] [V] [TRT] CudaGroupConvolution has no valid tactics for this config, skipping
[03/25/2022-13:24:16] [V] [TRT] --------------- Timing Runner: sections.0.1.conv2.module.weight + QuantizeLinear_105_quantize_scale_node + Conv_109 + Relu_111 (CudaDepthwiseConvolution)
[03/25/2022-13:24:16] [V] [TRT] CudaDepthwiseConvolution has no valid tactics for this config, skipping
[03/25/2022-13:24:16] [V] [TRT] --------------- Timing Runner: sections.0.1.conv2.module.weight + QuantizeLinear_105_quantize_scale_node + Conv_109 + Relu_111 (FusedConvActConvolution)
[03/25/2022-13:24:16] [V] [TRT] FusedConvActConvolution has no valid tactics for this config, skipping
[03/25/2022-13:24:16] [V] [TRT] --------------- Timing Runner: sections.0.1.conv2.module.weight + QuantizeLinear_105_quantize_scale_node + Conv_109 + Relu_111 (CaskConvolution)
[03/25/2022-13:24:16] [V] [TRT] sections.0.1.conv2.module.weight + QuantizeLinear_105_quantize_scale_node + Conv_109 + Relu_111 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_epifadd Tactic: 177040020707947851
[03/25/2022-13:24:16] [V] [TRT] Tactic: 177040020707947851 Time: 0.236672
[03/25/2022-13:24:16] [V] [TRT] sections.0.1.conv2.module.weight + QuantizeLinear_105_quantize_scale_node + Conv_109 + Relu_111 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r3s3 Tactic: 184229963126259101
[03/25/2022-13:24:17] [V] [TRT] Tactic: 184229963126259101 Time: 0.19392
[03/25/2022-13:24:17] [V] [TRT] sections.0.1.conv2.module.weight + QuantizeLinear_105_quantize_scale_node + Conv_109 + Relu_111 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage1_warpsize4x2x1_g1_tensor8x8x16_t1r3s3 Tactic: 289888059097454627
[03/25/2022-13:24:17] [V] [TRT] Tactic: 289888059097454627 Time: 0.30336
[03/25/2022-13:24:17] [V] [TRT] sections.0.1.conv2.module.weight + QuantizeLinear_105_quantize_scale_node + Conv_109 + Relu_111 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32 Tactic: 328135613486708155
[03/25/2022-13:24:17] [V] [TRT] Tactic: 328135613486708155 Time: 0.310656
[03/25/2022-13:24:17] [V] [TRT] sections.0.1.conv2.module.weight + QuantizeLinear_105_quantize_scale_node + Conv_109 + Relu_111 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r3s3 Tactic: 680740992583869928
[03/25/2022-13:24:17] [V] [TRT] Tactic: 680740992583869928 Time: 0.267264
[03/25/2022-13:24:17] [V] [TRT] sections.0.1.conv2.module.weight + QuantizeLinear_105_quantize_scale_node + Conv_109 + Relu_111 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x192x64_stage3_warpsize4x2x1_g1_tensor16x8x32 Tactic: 1111159740952609683
[03/25/2022-13:24:17] [V] [TRT] Tactic: 1111159740952609683 Time: 0.318848
[03/25/2022-13:24:17] [V] [TRT] sections.0.1.conv2.module.weight + QuantizeLinear_105_quantize_scale_node + Conv_109 + Relu_111 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32 Tactic: 1134860903395928905
[03/25/2022-13:24:17] [V] [TRT] Tactic: 1134860903395928905 Time: 0.168192
[03/25/2022-13:24:17] [V] [TRT] sections.0.1.conv2.module.weight + QuantizeLinear_105_quantize_scale_node + Conv_109 + Relu_111 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32 Tactic: 1276591930377039442
[03/25/2022-13:24:17] [V] [TRT] Tactic: 1276591930377039442 Time: 0.205312
[03/25/2022-13:24:17] [V] [TRT] sections.0.1.conv2.module.weight + QuantizeLinear_105_quantize_scale_node + Conv_109 + Relu_111 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage1_warpsize2x1x1_g1_tensor8x8x16_t1r3s3 Tactic: 1388866374720163187
[03/25/2022-13:24:17] [V] [TRT] Tactic: 1388866374720163187 Time: 0.187648
[03/25/2022-13:24:17] [V] [TRT] sections.0.1.conv2.module.weight + QuantizeLinear_105_quantize_scale_node + Conv_109 + Relu_111 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x96x64_stage5_warpsize2x1x1_g1_tensor16x8x32 Tactic: 1399501420456320585
[03/25/2022-13:24:17] [V] [TRT] Tactic: 1399501420456320585 Time: 0.207744
[03/25/2022-13:24:17] [V] [TRT] sections.0.1.conv2.module.weight + QuantizeLinear_105_quantize_scale_node + Conv_109 + Relu_111 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_epifadd Tactic: 1550399266192842845
[03/25/2022-13:24:17] [V] [TRT] Tactic: 1550399266192842845 Time: 0.19968
[03/25/2022-13:24:17] [V] [TRT] sections.0.1.conv2.module.weight + QuantizeLinear_105_quantize_scale_node + Conv_109 + Relu_111 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x32x64_stage4_warpsize4x1x1_g1_tensor16x8x32_t1r3s3_epifadd Tactic: 1572887561103143487
[03/25/2022-13:24:17] [V] [TRT] Tactic: 1572887561103143487 Time: 0.127488
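The sm80_xmma_fprop_sparse_conv_*_sptensor16x8x64 entries interleaved through these sweeps are Ampere Sparse Tensor Core kernels. They are only eligible for a layer whose weights follow the 2:4 structured-sparsity pattern: at most two nonzeros in every group of four consecutive weights along the input-channel (C) axis. Whether a pruned checkpoint actually satisfies that constraint can be checked before ONNX export; a sketch under the OIHW weight-layout assumption (NumPy; the function name is illustrative):

import numpy as np

def satisfies_2to4(weight: np.ndarray) -> bool:
    # OIHW conv weight: test every group of four consecutive input
    # channels, at each output channel and spatial position.
    if weight.shape[1] % 4:
        return False
    grouped = np.moveaxis(weight, 1, -1).reshape(-1, 4)
    return bool(((grouped != 0).sum(axis=1) <= 2).all())

w = np.random.randn(64, 64, 3, 3).astype(np.float32)
w.reshape(64, 16, 4, 9)[:, :, 2:, :] = 0.0  # zero 2 of every 4 input channels
print(satisfies_2to4(w))                    # True

Non-compliant layers are still built; they simply never see the sptensor tactics and fall back to the dense pool timed here.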
[03/25/2022-13:24:17] [V] [TRT] sections.0.1.conv2.module.weight + QuantizeLinear_105_quantize_scale_node + Conv_109 + Relu_111 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x96x64_stage5_warpsize2x1x1_g1_tensor16x8x32_t1r3s3 Tactic: 1853122447892949466
[03/25/2022-13:24:17] [V] [TRT] Tactic: 1853122447892949466 Time: 0.2048
[03/25/2022-13:24:17] [V] [TRT] sections.0.1.conv2.module.weight + QuantizeLinear_105_quantize_scale_node + Conv_109 + Relu_111 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage1_warpsize2x2x1_g1_tensor8x8x16 Tactic: 2133329569091732311
[03/25/2022-13:24:17] [V] [TRT] Tactic: 2133329569091732311 Time: 0.270208
[03/25/2022-13:24:17] [V] [TRT] sections.0.1.conv2.module.weight + QuantizeLinear_105_quantize_scale_node + Conv_109 + Relu_111 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage4_warpsize4x1x1_g1_tensor16x8x32 Tactic: 2325023763229477890
[03/25/2022-13:24:17] [V] [TRT] Tactic: 2325023763229477890 Time: 0.100096
[03/25/2022-13:24:17] [V] [TRT] sections.0.1.conv2.module.weight + QuantizeLinear_105_quantize_scale_node + Conv_109 + Relu_111 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16 Tactic: 2579824863892891529
[03/25/2022-13:24:17] [V] [TRT] Tactic: 2579824863892891529 Time: 0.212224
[03/25/2022-13:24:17] [V] [TRT] sections.0.1.conv2.module.weight + QuantizeLinear_105_quantize_scale_node + Conv_109 + Relu_111 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x64x64_stage3_warpsize4x1x1_g1_tensor16x8x32 Tactic: 2783960536172159663
[03/25/2022-13:24:17] [V] [TRT] Tactic: 2783960536172159663 Time: 0.104704
[03/25/2022-13:24:17] [V] [TRT] sections.0.1.conv2.module.weight + QuantizeLinear_105_quantize_scale_node + Conv_109 + Relu_111 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x2x1_g1_tensor16x8x32 Tactic: 2821711838552913693
[03/25/2022-13:24:17] [V] [TRT] Tactic: 2821711838552913693 Time: 0.181376
[03/25/2022-13:24:17] [V] [TRT] sections.0.1.conv2.module.weight + QuantizeLinear_105_quantize_scale_node + Conv_109 + Relu_111 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x128_stage3_warpsize1x2x1_g1_sptensor16x8x64 Tactic: 2945009978756227538
[03/25/2022-13:24:17] [V] [TRT] Tactic: 2945009978756227538 Time: 0.13312
[03/25/2022-13:24:17] [V] [TRT] sections.0.1.conv2.module.weight + QuantizeLinear_105_quantize_scale_node + Conv_109 + Relu_111 Set Tactic Name: ampere_int8_i8816cudnn_int8_128x128_ldg16_relu_medium_nt_v1 Tactic: 2985940154541537814
[03/25/2022-13:24:17] [V] [TRT] Tactic: 2985940154541537814 Time: 0.270848
[03/25/2022-13:24:17] [V] [TRT] sections.0.1.conv2.module.weight + QuantizeLinear_105_quantize_scale_node + Conv_109 + Relu_111 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_t1r3s3_epifadd Tactic: 3284282970967328046
[03/25/2022-13:24:17] [V] [TRT] Tactic: 3284282970967328046 Time: 0.224384
[03/25/2022-13:24:17] [V] [TRT] sections.0.1.conv2.module.weight + QuantizeLinear_105_quantize_scale_node + Conv_109 + Relu_111 Set 
Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_t1r3s3 Tactic: 3401614690060226673 [03/25/2022-13:24:17] [V] [TRT] Tactic: 3401614690060226673 Time: 0.178688 [03/25/2022-13:24:17] [V] [TRT] sections.0.1.conv2.module.weight + QuantizeLinear_105_quantize_scale_node + Conv_109 + Relu_111 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32 Tactic: 3456719996792527006 [03/25/2022-13:24:17] [V] [TRT] Tactic: 3456719996792527006 Time: 0.169472 [03/25/2022-13:24:17] [V] [TRT] sections.0.1.conv2.module.weight + QuantizeLinear_105_quantize_scale_node + Conv_109 + Relu_111 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r3s3_epifadd Tactic: 3512426920013359699 [03/25/2022-13:24:17] [V] [TRT] Tactic: 3512426920013359699 Time: 0.130816 [03/25/2022-13:24:17] [V] [TRT] sections.0.1.conv2.module.weight + QuantizeLinear_105_quantize_scale_node + Conv_109 + Relu_111 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x128_stage3_warpsize4x2x1_g1_sptensor16x8x64 Tactic: 3651043333819148268 [03/25/2022-13:24:17] [V] [TRT] Tactic: 3651043333819148268 Time: 0.35136 [03/25/2022-13:24:17] [V] [TRT] sections.0.1.conv2.module.weight + QuantizeLinear_105_quantize_scale_node + Conv_109 + Relu_111 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x128_ldg16_relu_medium_nt_v1 Tactic: 3899284354987683408 [03/25/2022-13:24:17] [V] [TRT] Tactic: 3899284354987683408 Time: 0.313984 [03/25/2022-13:24:17] [V] [TRT] sections.0.1.conv2.module.weight + QuantizeLinear_105_quantize_scale_node + Conv_109 + Relu_111 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage6_warpsize2x2x1_g1_tensor16x8x32 Tactic: 4042202769383439184 [03/25/2022-13:24:17] [V] [TRT] Tactic: 4042202769383439184 Time: 0.209024 [03/25/2022-13:24:17] [V] [TRT] sections.0.1.conv2.module.weight + QuantizeLinear_105_quantize_scale_node + Conv_109 + Relu_111 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_large_nt_v1 Tactic: 4182625619810185112 [03/25/2022-13:24:17] [V] [TRT] Tactic: 4182625619810185112 Time: 0.156032 [03/25/2022-13:24:17] [V] [TRT] sections.0.1.conv2.module.weight + QuantizeLinear_105_quantize_scale_node + Conv_109 + Relu_111 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r3s3 Tactic: 4214794893922618058 [03/25/2022-13:24:17] [V] [TRT] Tactic: 4214794893922618058 Time: 0.26752 [03/25/2022-13:24:17] [V] [TRT] sections.0.1.conv2.module.weight + QuantizeLinear_105_quantize_scale_node + Conv_109 + Relu_111 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x32x64_stage4_warpsize4x1x1_g1_tensor16x8x32_epifadd Tactic: 4259547356717612415 [03/25/2022-13:24:17] [V] [TRT] Tactic: 4259547356717612415 Time: 0.136192 [03/25/2022-13:24:17] [V] [TRT] sections.0.1.conv2.module.weight + QuantizeLinear_105_quantize_scale_node + Conv_109 + Relu_111 Set Tactic Name: 
sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r3s3 Tactic: 4384868749799132354 [03/25/2022-13:24:17] [V] [TRT] Tactic: 4384868749799132354 Time: 0.210304 [03/25/2022-13:24:17] [V] [TRT] sections.0.1.conv2.module.weight + QuantizeLinear_105_quantize_scale_node + Conv_109 + Relu_111 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x128_stage3_warpsize2x2x1_g1_sptensor16x8x64 Tactic: 4414594337986714263 [03/25/2022-13:24:17] [V] [TRT] Tactic: 4414594337986714263 Time: 0.170496 [03/25/2022-13:24:17] [V] [TRT] sections.0.1.conv2.module.weight + QuantizeLinear_105_quantize_scale_node + Conv_109 + Relu_111 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_small_nt_v1 Tactic: 4717285412741024953 [03/25/2022-13:24:17] [V] [TRT] Tactic: 4717285412741024953 Time: 0.147456 [03/25/2022-13:24:17] [V] [TRT] sections.0.1.conv2.module.weight + QuantizeLinear_105_quantize_scale_node + Conv_109 + Relu_111 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage4_warpsize2x4x1_g1_tensor16x8x32 Tactic: 4734519122557206480 [03/25/2022-13:24:17] [V] [TRT] Tactic: 4734519122557206480 Time: 0.385792 [03/25/2022-13:24:17] [V] [TRT] sections.0.1.conv2.module.weight + QuantizeLinear_105_quantize_scale_node + Conv_109 + Relu_111 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32 Tactic: 4922297020351187339 [03/25/2022-13:24:17] [V] [TRT] Tactic: 4922297020351187339 Time: 0.272512 [03/25/2022-13:24:17] [V] [TRT] sections.0.1.conv2.module.weight + QuantizeLinear_105_quantize_scale_node + Conv_109 + Relu_111 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage1_warpsize2x4x1_g1_tensor8x8x16_t1r3s3 Tactic: 4931167631624420067 [03/25/2022-13:24:17] [V] [TRT] Tactic: 4931167631624420067 Time: 0.583424 [03/25/2022-13:24:17] [V] [TRT] sections.0.1.conv2.module.weight + QuantizeLinear_105_quantize_scale_node + Conv_109 + Relu_111 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage4_warpsize2x4x1_g1_tensor16x8x32_epifadd Tactic: 5121596860264626879 [03/25/2022-13:24:17] [V] [TRT] Tactic: 5121596860264626879 Time: 0.344576 [03/25/2022-13:24:17] [V] [TRT] sections.0.1.conv2.module.weight + QuantizeLinear_105_quantize_scale_node + Conv_109 + Relu_111 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_t1r3s3 Tactic: 5136656982162849059 [03/25/2022-13:24:17] [V] [TRT] Tactic: 5136656982162849059 Time: 0.22528 [03/25/2022-13:24:17] [V] [TRT] sections.0.1.conv2.module.weight + QuantizeLinear_105_quantize_scale_node + Conv_109 + Relu_111 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage6_warpsize2x2x1_g1_tensor16x8x32_epifadd Tactic: 5158259316594207439 [03/25/2022-13:24:17] [V] [TRT] Tactic: 5158259316594207439 Time: 0.207232 [03/25/2022-13:24:17] [V] [TRT] sections.0.1.conv2.module.weight + QuantizeLinear_105_quantize_scale_node + Conv_109 
+ Relu_111 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32_t1r3s3 Tactic: 5189825015507701541 [03/25/2022-13:24:17] [V] [TRT] Tactic: 5189825015507701541 Time: 0.321792 [03/25/2022-13:24:17] [V] [TRT] sections.0.1.conv2.module.weight + QuantizeLinear_105_quantize_scale_node + Conv_109 + Relu_111 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x32x64_stage4_warpsize2x1x1_g1_tensor16x8x32 Tactic: 5424417905073460656 [03/25/2022-13:24:17] [V] [TRT] Tactic: 5424417905073460656 Time: 0.153344 [03/25/2022-13:24:17] [V] [TRT] sections.0.1.conv2.module.weight + QuantizeLinear_105_quantize_scale_node + Conv_109 + Relu_111 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x256x64_stage3_warpsize2x4x1_g1_tensor16x8x32 Tactic: 5442043907221427810 [03/25/2022-13:24:17] [V] [TRT] Tactic: 5442043907221427810 Time: 0.389888 [03/25/2022-13:24:17] [V] [TRT] sections.0.1.conv2.module.weight + QuantizeLinear_105_quantize_scale_node + Conv_109 + Relu_111 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x256x64_stage3_warpsize2x4x1_g1_tensor16x8x32_t1r3s3 Tactic: 5544365258913999384 [03/25/2022-13:24:17] [V] [TRT] Tactic: 5544365258913999384 Time: 0.373504 [03/25/2022-13:24:17] [V] [TRT] sections.0.1.conv2.module.weight + QuantizeLinear_105_quantize_scale_node + Conv_109 + Relu_111 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage1_warpsize2x1x1_g1_tensor8x8x16_t1r3s3 Tactic: 5641967928706599451 [03/25/2022-13:24:17] [V] [TRT] Tactic: 5641967928706599451 Time: 0.243328 [03/25/2022-13:24:17] [V] [TRT] sections.0.1.conv2.module.weight + QuantizeLinear_105_quantize_scale_node + Conv_109 + Relu_111 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x2x1_g1_tensor16x8x32_t1r3s3 Tactic: 5721595115357140131 [03/25/2022-13:24:17] [V] [TRT] Tactic: 5721595115357140131 Time: 0.1824 [03/25/2022-13:24:17] [V] [TRT] sections.0.1.conv2.module.weight + QuantizeLinear_105_quantize_scale_node + Conv_109 + Relu_111 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage4_warpsize4x1x1_g1_tensor16x8x32_t1r3s3_epifadd Tactic: 5966973378912044513 [03/25/2022-13:24:17] [V] [TRT] Tactic: 5966973378912044513 Time: 0.094208 [03/25/2022-13:24:17] [V] [TRT] sections.0.1.conv2.module.weight + QuantizeLinear_105_quantize_scale_node + Conv_109 + Relu_111 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x32x64_stage4_warpsize4x1x1_g1_tensor16x8x32_t1r3s3 Tactic: 6004789655466615912 [03/25/2022-13:24:17] [V] [TRT] Tactic: 6004789655466615912 Time: 0.127872 [03/25/2022-13:24:17] [V] [TRT] sections.0.1.conv2.module.weight + QuantizeLinear_105_quantize_scale_node + Conv_109 + Relu_111 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage4_warpsize2x4x1_g1_tensor16x8x32_t1r3s3 Tactic: 6146901278630392829 [03/25/2022-13:24:17] [V] [TRT] Tactic: 
6146901278630392829 Time: 0.377088 [03/25/2022-13:24:17] [V] [TRT] sections.0.1.conv2.module.weight + QuantizeLinear_105_quantize_scale_node + Conv_109 + Relu_111 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage1_warpsize2x1x1_g1_tensor8x8x16 Tactic: 6394572396369862482 [03/25/2022-13:24:17] [V] [TRT] Tactic: 6394572396369862482 Time: 0.247552 [03/25/2022-13:24:17] [V] [TRT] sections.0.1.conv2.module.weight + QuantizeLinear_105_quantize_scale_node + Conv_109 + Relu_111 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage4_warpsize4x2x1_g1_tensor16x8x32_epifadd Tactic: 6434020722187266170 [03/25/2022-13:24:17] [V] [TRT] Tactic: 6434020722187266170 Time: 0.186496 [03/25/2022-13:24:17] [V] [TRT] sections.0.1.conv2.module.weight + QuantizeLinear_105_quantize_scale_node + Conv_109 + Relu_111 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32 Tactic: 6781129591847482048 [03/25/2022-13:24:17] [V] [TRT] Tactic: 6781129591847482048 Time: 0.121344 [03/25/2022-13:24:17] [V] [TRT] sections.0.1.conv2.module.weight + QuantizeLinear_105_quantize_scale_node + Conv_109 + Relu_111 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x96x64_stage4_warpsize4x2x1_g1_tensor16x8x32_t1r3s3 Tactic: 6984451771200230840 [03/25/2022-13:24:17] [V] [TRT] Tactic: 6984451771200230840 Time: 0.204032 [03/25/2022-13:24:17] [V] [TRT] sections.0.1.conv2.module.weight + QuantizeLinear_105_quantize_scale_node + Conv_109 + Relu_111 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r3s3 Tactic: 7048234086361926570 [03/25/2022-13:24:17] [V] [TRT] Tactic: 7048234086361926570 Time: 0.166272 [03/25/2022-13:24:17] [V] [TRT] sections.0.1.conv2.module.weight + QuantizeLinear_105_quantize_scale_node + Conv_109 + Relu_111 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x128x64_stage3_warpsize4x2x1_g1_tensor16x8x32 Tactic: 7077570591813340966 [03/25/2022-13:24:17] [V] [TRT] Tactic: 7077570591813340966 Time: 0.205568 [03/25/2022-13:24:17] [V] [TRT] sections.0.1.conv2.module.weight + QuantizeLinear_105_quantize_scale_node + Conv_109 + Relu_111 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r3s3_epifadd Tactic: 7191893591576074000 [03/25/2022-13:24:17] [V] [TRT] Tactic: 7191893591576074000 Time: 0.193024 [03/25/2022-13:24:17] [V] [TRT] sections.0.1.conv2.module.weight + QuantizeLinear_105_quantize_scale_node + Conv_109 + Relu_111 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x4x1_g1_tensor16x8x32_t1r3s3 Tactic: 7429976449747682901 [03/25/2022-13:24:17] [V] [TRT] Tactic: 7429976449747682901 Time: 0.196096 [03/25/2022-13:24:17] [V] [TRT] sections.0.1.conv2.module.weight + QuantizeLinear_105_quantize_scale_node + Conv_109 + Relu_111 Set Tactic Name: 
sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r3s3_epifadd Tactic: 7438984192263206338 [03/25/2022-13:24:17] [V] [TRT] Tactic: 7438984192263206338 Time: 0.194048 [03/25/2022-13:24:17] [V] [TRT] sections.0.1.conv2.module.weight + QuantizeLinear_105_quantize_scale_node + Conv_109 + Relu_111 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32_epifadd Tactic: 7504901284678552178 [03/25/2022-13:24:17] [V] [TRT] Tactic: 7504901284678552178 Time: 0.164608 [03/25/2022-13:24:17] [V] [TRT] sections.0.1.conv2.module.weight + QuantizeLinear_105_quantize_scale_node + Conv_109 + Relu_111 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r3s3 Tactic: 8096257414008860171 [03/25/2022-13:24:17] [V] [TRT] Tactic: 8096257414008860171 Time: 0.111872 [03/25/2022-13:24:17] [V] [TRT] sections.0.1.conv2.module.weight + QuantizeLinear_105_quantize_scale_node + Conv_109 + Relu_111 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x64x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r3s3 Tactic: 8128112048355596715 [03/25/2022-13:24:17] [V] [TRT] Tactic: 8128112048355596715 Time: 0.108928 [03/25/2022-13:24:17] [V] [TRT] sections.0.1.conv2.module.weight + QuantizeLinear_105_quantize_scale_node + Conv_109 + Relu_111 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_indexed_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32_epifadd Tactic: 8751622450593766232 [03/25/2022-13:24:17] [V] [TRT] Tactic: 8751622450593766232 Time: 0.177536 [03/25/2022-13:24:17] [V] [TRT] sections.0.1.conv2.module.weight + QuantizeLinear_105_quantize_scale_node + Conv_109 + Relu_111 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_indexed_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32 Tactic: 9064458886956700976 [03/25/2022-13:24:17] [V] [TRT] Tactic: 9064458886956700976 Time: 0.181504 [03/25/2022-13:24:17] [V] [TRT] sections.0.1.conv2.module.weight + QuantizeLinear_105_quantize_scale_node + Conv_109 + Relu_111 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_t1r3s3_epifadd Tactic: 9143438935315839085 [03/25/2022-13:24:17] [V] [TRT] Tactic: 9143438935315839085 Time: 0.1792 [03/25/2022-13:24:17] [V] [TRT] sections.0.1.conv2.module.weight + QuantizeLinear_105_quantize_scale_node + Conv_109 + Relu_111 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage4_warpsize4x2x1_g1_tensor16x8x32 Tactic: -9165697322068360861 [03/25/2022-13:24:17] [V] [TRT] Tactic: -9165697322068360861 Time: 0.207744 [03/25/2022-13:24:17] [V] [TRT] sections.0.1.conv2.module.weight + QuantizeLinear_105_quantize_scale_node + Conv_109 + Relu_111 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_singleBuffer_small_nt_v1 Tactic: -9118785798277698619 [03/25/2022-13:24:17] [V] [TRT] Tactic: -9118785798277698619 Time: 0.142592 [03/25/2022-13:24:17] [V] [TRT] 
sections.0.1.conv2.module.weight + QuantizeLinear_105_quantize_scale_node + Conv_109 + Relu_111 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16 Tactic: -9108166971364503411 [03/25/2022-13:24:17] [V] [TRT] Tactic: -9108166971364503411 Time: 0.170368 [03/25/2022-13:24:17] [V] [TRT] sections.0.1.conv2.module.weight + QuantizeLinear_105_quantize_scale_node + Conv_109 + Relu_111 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage1_warpsize4x2x1_g1_tensor8x8x16 Tactic: -8861822316054763526 [03/25/2022-13:24:17] [V] [TRT] Tactic: -8861822316054763526 Time: 0.309504 [03/25/2022-13:24:17] [V] [TRT] sections.0.1.conv2.module.weight + QuantizeLinear_105_quantize_scale_node + Conv_109 + Relu_111 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x128_stage3_warpsize1x1x1_g1_sptensor16x8x64 Tactic: -8791277710877987710 [03/25/2022-13:24:17] [V] [TRT] Tactic: -8791277710877987710 Time: 0.225536 [03/25/2022-13:24:17] [V] [TRT] sections.0.1.conv2.module.weight + QuantizeLinear_105_quantize_scale_node + Conv_109 + Relu_111 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x128x64_stage3_warpsize2x2x1_g1_tensor16x8x32 Tactic: -8691377209893505057 [03/25/2022-13:24:17] [V] [TRT] Tactic: -8691377209893505057 Time: 0.169216 [03/25/2022-13:24:17] [V] [TRT] sections.0.1.conv2.module.weight + QuantizeLinear_105_quantize_scale_node + Conv_109 + Relu_111 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32 Tactic: -8520292213102999339 [03/25/2022-13:24:17] [V] [TRT] Tactic: -8520292213102999339 Time: 0.304128 [03/25/2022-13:24:17] [V] [TRT] sections.0.1.conv2.module.weight + QuantizeLinear_105_quantize_scale_node + Conv_109 + Relu_111 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage1_warpsize4x1x1_g1_tensor8x8x16_t1r3s3 Tactic: -8475551154769412306 [03/25/2022-13:24:17] [V] [TRT] Tactic: -8475551154769412306 Time: 0.14656 [03/25/2022-13:24:17] [V] [TRT] sections.0.1.conv2.module.weight + QuantizeLinear_105_quantize_scale_node + Conv_109 + Relu_111 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x192x64_stage3_warpsize4x2x1_g1_tensor16x8x32_t1r3s3 Tactic: -8417388128970254446 [03/25/2022-13:24:17] [V] [TRT] Tactic: -8417388128970254446 Time: 0.306432 [03/25/2022-13:24:17] [V] [TRT] sections.0.1.conv2.module.weight + QuantizeLinear_105_quantize_scale_node + Conv_109 + Relu_111 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32 Tactic: -8263994888336646547 [03/25/2022-13:24:17] [V] [TRT] Tactic: -8263994888336646547 Time: 0.16576 [03/25/2022-13:24:17] [V] [TRT] sections.0.1.conv2.module.weight + QuantizeLinear_105_quantize_scale_node + Conv_109 + Relu_111 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32 Tactic: 
-8205948405243401049 [03/25/2022-13:24:17] [V] [TRT] Tactic: -8205948405243401049 Time: 0.200448 [03/25/2022-13:24:17] [V] [TRT] sections.0.1.conv2.module.weight + QuantizeLinear_105_quantize_scale_node + Conv_109 + Relu_111 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r3s3_epifadd Tactic: -7992068592656168418 [03/25/2022-13:24:17] [V] [TRT] Tactic: -7992068592656168418 Time: 0.109824 [03/25/2022-13:24:17] [V] [TRT] sections.0.1.conv2.module.weight + QuantizeLinear_105_quantize_scale_node + Conv_109 + Relu_111 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x32x64_stage4_warpsize2x1x1_g1_tensor16x8x32_t1r3s3 Tactic: -7898477046581738867 [03/25/2022-13:24:17] [V] [TRT] Tactic: -7898477046581738867 Time: 0.136448 [03/25/2022-13:24:17] [V] [TRT] sections.0.1.conv2.module.weight + QuantizeLinear_105_quantize_scale_node + Conv_109 + Relu_111 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage4_warpsize4x1x1_g1_tensor16x8x32_epifadd Tactic: -7842775553137511386 [03/25/2022-13:24:17] [V] [TRT] Tactic: -7842775553137511386 Time: 0.098688 [03/25/2022-13:24:17] [V] [TRT] sections.0.1.conv2.module.weight + QuantizeLinear_105_quantize_scale_node + Conv_109 + Relu_111 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32 Tactic: -7683887278997527517 [03/25/2022-13:24:17] [V] [TRT] Tactic: -7683887278997527517 Time: 0.179456 [03/25/2022-13:24:17] [V] [TRT] sections.0.1.conv2.module.weight + QuantizeLinear_105_quantize_scale_node + Conv_109 + Relu_111 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x64x64_stage3_warpsize2x2x1_g1_tensor16x8x32 Tactic: -7381370635708568663 [03/25/2022-13:24:17] [V] [TRT] Tactic: -7381370635708568663 Time: 0.114304 [03/25/2022-13:24:17] [V] [TRT] sections.0.1.conv2.module.weight + QuantizeLinear_105_quantize_scale_node + Conv_109 + Relu_111 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r3s3 Tactic: -7129320389887881029 [03/25/2022-13:24:17] [V] [TRT] Tactic: -7129320389887881029 Time: 0.158976 [03/25/2022-13:24:17] [V] [TRT] sections.0.1.conv2.module.weight + QuantizeLinear_105_quantize_scale_node + Conv_109 + Relu_111 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r3s3 Tactic: -6959995514028471820 [03/25/2022-13:24:17] [V] [TRT] Tactic: -6959995514028471820 Time: 0.265856 [03/25/2022-13:24:17] [V] [TRT] sections.0.1.conv2.module.weight + QuantizeLinear_105_quantize_scale_node + Conv_109 + Relu_111 Set Tactic Name: ampere_int8_i8816cudnn_int8_128x128_ldg16_relu_small_nt_v1 Tactic: -6400348606759295499 [03/25/2022-13:24:17] [V] [TRT] Tactic: -6400348606759295499 Time: 0.26176 [03/25/2022-13:24:17] [V] [TRT] sections.0.1.conv2.module.weight + QuantizeLinear_105_quantize_scale_node + Conv_109 + Relu_111 Set Tactic Name: 
sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r3s3 Tactic: -6371781333659293809 [03/25/2022-13:24:17] [V] [TRT] Tactic: -6371781333659293809 Time: 0.168064 [03/25/2022-13:24:17] [V] [TRT] sections.0.1.conv2.module.weight + QuantizeLinear_105_quantize_scale_node + Conv_109 + Relu_111 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x96x64_stage4_warpsize4x2x1_g1_tensor16x8x32 Tactic: -6256128573036943404 [03/25/2022-13:24:17] [V] [TRT] Tactic: -6256128573036943404 Time: 0.208256 [03/25/2022-13:24:17] [V] [TRT] sections.0.1.conv2.module.weight + QuantizeLinear_105_quantize_scale_node + Conv_109 + Relu_111 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x128_ldg16_relu_small_nt_v1 Tactic: -5980889159865208399 [03/25/2022-13:24:17] [V] [TRT] Tactic: -5980889159865208399 Time: 0.309632 [03/25/2022-13:24:17] [V] [TRT] sections.0.1.conv2.module.weight + QuantizeLinear_105_quantize_scale_node + Conv_109 + Relu_111 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_singleBuffer_medium_nt_v1 Tactic: -5766140806760372989 [03/25/2022-13:24:17] [V] [TRT] Tactic: -5766140806760372989 Time: 0.148736 [03/25/2022-13:24:17] [V] [TRT] sections.0.1.conv2.module.weight + QuantizeLinear_105_quantize_scale_node + Conv_109 + Relu_111 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32_t1r3s3_epifadd Tactic: -5709079507616090666 [03/25/2022-13:24:17] [V] [TRT] Tactic: -5709079507616090666 Time: 0.160896 [03/25/2022-13:24:17] [V] [TRT] sections.0.1.conv2.module.weight + QuantizeLinear_105_quantize_scale_node + Conv_109 + Relu_111 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage4_warpsize2x4x1_g1_tensor16x8x32_t1r3s3_epifadd Tactic: -5698636014239116282 [03/25/2022-13:24:17] [V] [TRT] Tactic: -5698636014239116282 Time: 0.332288 [03/25/2022-13:24:17] [V] [TRT] sections.0.1.conv2.module.weight + QuantizeLinear_105_quantize_scale_node + Conv_109 + Relu_111 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage1_warpsize4x1x1_g1_tensor8x8x16 Tactic: -5180570335464125033 [03/25/2022-13:24:17] [V] [TRT] Tactic: -5180570335464125033 Time: 0.154496 [03/25/2022-13:24:17] [V] [TRT] sections.0.1.conv2.module.weight + QuantizeLinear_105_quantize_scale_node + Conv_109 + Relu_111 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r3s3 Tactic: -4933563390723451692 [03/25/2022-13:24:17] [V] [TRT] Tactic: -4933563390723451692 Time: 0.131968 [03/25/2022-13:24:17] [V] [TRT] sections.0.1.conv2.module.weight + QuantizeLinear_105_quantize_scale_node + Conv_109 + Relu_111 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_medium_nt_v1 Tactic: -4516822589357530549 [03/25/2022-13:24:17] [V] [TRT] Tactic: -4516822589357530549 Time: 0.152832 [03/25/2022-13:24:17] [V] [TRT] sections.0.1.conv2.module.weight + QuantizeLinear_105_quantize_scale_node + Conv_109 + Relu_111 Set Tactic Name: 
sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x16x128_stage3_warpsize1x1x1_g1_sptensor16x8x64 Tactic: -4232916483289779353 [03/25/2022-13:24:17] [V] [TRT] Tactic: -4232916483289779353 Time: 0.415872 [03/25/2022-13:24:17] [V] [TRT] sections.0.1.conv2.module.weight + QuantizeLinear_105_quantize_scale_node + Conv_109 + Relu_111 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x128_stage3_warpsize1x4x1_g1_sptensor16x8x64 Tactic: -3460842194336717186 [03/25/2022-13:24:17] [V] [TRT] Tactic: -3460842194336717186 Time: 0.114944 [03/25/2022-13:24:17] [V] [TRT] sections.0.1.conv2.module.weight + QuantizeLinear_105_quantize_scale_node + Conv_109 + Relu_111 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage4_warpsize4x2x1_g1_tensor16x8x32_t1r3s3_epifadd Tactic: -3413217501222406256 [03/25/2022-13:24:17] [V] [TRT] Tactic: -3413217501222406256 Time: 0.178944 [03/25/2022-13:24:17] [V] [TRT] sections.0.1.conv2.module.weight + QuantizeLinear_105_quantize_scale_node + Conv_109 + Relu_111 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32_t1r3s3 Tactic: -3280888557222886418 [03/25/2022-13:24:17] [V] [TRT] Tactic: -3280888557222886418 Time: 0.146176 [03/25/2022-13:24:17] [V] [TRT] sections.0.1.conv2.module.weight + QuantizeLinear_105_quantize_scale_node + Conv_109 + Relu_111 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r3s3 Tactic: -3238475748440751107 [03/25/2022-13:24:17] [V] [TRT] Tactic: -3238475748440751107 Time: 0.193792 [03/25/2022-13:24:17] [V] [TRT] sections.0.1.conv2.module.weight + QuantizeLinear_105_quantize_scale_node + Conv_109 + Relu_111 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage4_warpsize4x1x1_g1_tensor16x8x32_t1r3s3 Tactic: -3182884991006484042 [03/25/2022-13:24:17] [V] [TRT] Tactic: -3182884991006484042 Time: 0.095744 [03/25/2022-13:24:17] [V] [TRT] sections.0.1.conv2.module.weight + QuantizeLinear_105_quantize_scale_node + Conv_109 + Relu_111 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r3s3 Tactic: -3173468756112541306 [03/25/2022-13:24:17] [V] [TRT] Tactic: -3173468756112541306 Time: 0.192512 [03/25/2022-13:24:17] [V] [TRT] sections.0.1.conv2.module.weight + QuantizeLinear_105_quantize_scale_node + Conv_109 + Relu_111 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x128_ldg16_relu_large_nt_v1 Tactic: -2917455979290586480 [03/25/2022-13:24:17] [V] [TRT] Tactic: -2917455979290586480 Time: 0.314112 [03/25/2022-13:24:17] [V] [TRT] sections.0.1.conv2.module.weight + QuantizeLinear_105_quantize_scale_node + Conv_109 + Relu_111 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x128x64_stage3_warpsize4x2x1_g1_tensor16x8x32_t1r3s3 Tactic: -2741641298163591508 [03/25/2022-13:24:17] [V] [TRT] Tactic: -2741641298163591508 Time: 0.200064 [03/25/2022-13:24:17] [V] [TRT] sections.0.1.conv2.module.weight + 
QuantizeLinear_105_quantize_scale_node + Conv_109 + Relu_111 Set Tactic Name: ampere_int8_i8816cudnn_int8_128x128_ldg16_relu_large_nt_v1 Tactic: -2571022005763160364 [03/25/2022-13:24:17] [V] [TRT] Tactic: -2571022005763160364 Time: 0.280448 [03/25/2022-13:24:17] [V] [TRT] sections.0.1.conv2.module.weight + QuantizeLinear_105_quantize_scale_node + Conv_109 + Relu_111 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage1_warpsize2x2x1_g1_tensor8x8x16 Tactic: -2499089240293650188 [03/25/2022-13:24:17] [V] [TRT] Tactic: -2499089240293650188 Time: 0.267264 [03/25/2022-13:24:17] [V] [TRT] sections.0.1.conv2.module.weight + QuantizeLinear_105_quantize_scale_node + Conv_109 + Relu_111 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16 Tactic: -2328318099174473157 [03/25/2022-13:24:17] [V] [TRT] Tactic: -2328318099174473157 Time: 0.173568 [03/25/2022-13:24:17] [V] [TRT] sections.0.1.conv2.module.weight + QuantizeLinear_105_quantize_scale_node + Conv_109 + Relu_111 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_epifadd Tactic: -2083778562631872334 [03/25/2022-13:24:17] [V] [TRT] Tactic: -2083778562631872334 Time: 0.118272 [03/25/2022-13:24:17] [V] [TRT] sections.0.1.conv2.module.weight + QuantizeLinear_105_quantize_scale_node + Conv_109 + Relu_111 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x4x1_g1_tensor16x8x32 Tactic: -2054375205435666404 [03/25/2022-13:24:17] [V] [TRT] Tactic: -2054375205435666404 Time: 0.204032 [03/25/2022-13:24:17] [V] [TRT] sections.0.1.conv2.module.weight + QuantizeLinear_105_quantize_scale_node + Conv_109 + Relu_111 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32_t1r3s3 Tactic: -1546787387293556842 [03/25/2022-13:24:17] [V] [TRT] Tactic: -1546787387293556842 Time: 0.158464 [03/25/2022-13:24:17] [V] [TRT] sections.0.1.conv2.module.weight + QuantizeLinear_105_quantize_scale_node + Conv_109 + Relu_111 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x32x64_stage4_warpsize4x1x1_g1_tensor16x8x32 Tactic: -1498626619443284096 [03/25/2022-13:24:17] [V] [TRT] Tactic: -1498626619443284096 Time: 0.137216 [03/25/2022-13:24:17] [V] [TRT] sections.0.1.conv2.module.weight + QuantizeLinear_105_quantize_scale_node + Conv_109 + Relu_111 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r3s3 Tactic: -1471245223605064669 [03/25/2022-13:24:17] [V] [TRT] Tactic: -1471245223605064669 Time: 0.277632 [03/25/2022-13:24:17] [V] [TRT] sections.0.1.conv2.module.weight + QuantizeLinear_105_quantize_scale_node + Conv_109 + Relu_111 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32 Tactic: -1283580231568512025 [03/25/2022-13:24:17] [V] [TRT] Tactic: -1283580231568512025 Time: 0.220416 [03/25/2022-13:24:17] [V] [TRT] 
sections.0.1.conv2.module.weight + QuantizeLinear_105_quantize_scale_node + Conv_109 + Relu_111 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x64x64_stage3_warpsize4x1x1_g1_tensor16x8x32_t1r3s3 Tactic: -1224421172675151280 [03/25/2022-13:24:17] [V] [TRT] Tactic: -1224421172675151280 Time: 0.097536 [03/25/2022-13:24:17] [V] [TRT] sections.0.1.conv2.module.weight + QuantizeLinear_105_quantize_scale_node + Conv_109 + Relu_111 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_epifadd Tactic: -1173968681844185579 [03/25/2022-13:24:17] [V] [TRT] Tactic: -1173968681844185579 Time: 0.226432 [03/25/2022-13:24:17] [V] [TRT] sections.0.1.conv2.module.weight + QuantizeLinear_105_quantize_scale_node + Conv_109 + Relu_111 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x128x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r3s3 Tactic: -921247911551089037 [03/25/2022-13:24:17] [V] [TRT] Tactic: -921247911551089037 Time: 0.161664 [03/25/2022-13:24:17] [V] [TRT] sections.0.1.conv2.module.weight + QuantizeLinear_105_quantize_scale_node + Conv_109 + Relu_111 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32 Tactic: -762222380308749469 [03/25/2022-13:24:17] [V] [TRT] Tactic: -762222380308749469 Time: 0.138496 [03/25/2022-13:24:17] [V] [TRT] sections.0.1.conv2.module.weight + QuantizeLinear_105_quantize_scale_node + Conv_109 + Relu_111 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_epifadd Tactic: -556794153877490941 [03/25/2022-13:24:17] [V] [TRT] Tactic: -556794153877490941 Time: 0.141568 [03/25/2022-13:24:17] [V] [TRT] sections.0.1.conv2.module.weight + QuantizeLinear_105_quantize_scale_node + Conv_109 + Relu_111 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage4_warpsize4x2x1_g1_tensor16x8x32_t1r3s3 Tactic: -516725800067794372 [03/25/2022-13:24:17] [V] [TRT] Tactic: -516725800067794372 Time: 0.199424 [03/25/2022-13:24:17] [V] [TRT] sections.0.1.conv2.module.weight + QuantizeLinear_105_quantize_scale_node + Conv_109 + Relu_111 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_singleBuffer_large_nt_v1 Tactic: -428104331444385564 [03/25/2022-13:24:17] [V] [TRT] Tactic: -428104331444385564 Time: 0.150784 [03/25/2022-13:24:17] [V] [TRT] sections.0.1.conv2.module.weight + QuantizeLinear_105_quantize_scale_node + Conv_109 + Relu_111 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage1_warpsize2x1x1_g1_tensor8x8x16 Tactic: -366411318217594794 [03/25/2022-13:24:17] [V] [TRT] Tactic: -366411318217594794 Time: 0.190592 [03/25/2022-13:24:17] [V] [TRT] sections.0.1.conv2.module.weight + QuantizeLinear_105_quantize_scale_node + Conv_109 + Relu_111 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage1_warpsize2x4x1_g1_tensor8x8x16 Tactic: -351548418071036983 [03/25/2022-13:24:17] [V] [TRT] Tactic: -351548418071036983 Time: 
0.59328
[03/25/2022-13:24:17] [V] [TRT] Fastest Tactic: 5966973378912044513 Time: 0.094208
[03/25/2022-13:24:17] [V] [TRT] >>>>>>>>>>>>>>> Chose Runner Type: CaskConvolution Tactic: 5966973378912044513
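Same outcome as sections.0.1.conv1 above: the sweep for the 3x3 sections.0.1.conv2 node ends with the lowest measured time winning (tactic 5966973378912044513 at 0.094208 ms). The "Skip timing cache hit (epiFadd mismatch)" records in this section also show why a build re-times nodes it has effectively seen before: a cached entry existed, but its epilogue variant (epiFadd) did not match, so it is discarded and the full sweep reruns. Persisting a timing cache across builds avoids most of this cost. A minimal sketch of the equivalent build through the TensorRT Python API (assuming the 8.x bindings; the Q/DQ-quantized ONNX needs no INT8 calibrator, the cache file name is illustrative, and a dynamic-shape model would also need an optimization profile):

import tensorrt as trt

ONNX, CACHE = "resnet50_quant_sparse.onnx", "resnet50.timing.cache"

logger = trt.Logger(trt.Logger.VERBOSE)
builder = trt.Builder(logger)
network = builder.create_network(
    1 << int(trt.NetworkDefinitionCreationFlag.EXPLICIT_BATCH))
parser = trt.OnnxParser(network, logger)
with open(ONNX, "rb") as f:
    if not parser.parse(f.read()):
        raise RuntimeError(parser.get_error(0))

config = builder.create_builder_config()
config.max_workspace_size = 1 << 30
config.set_flag(trt.BuilderFlag.INT8)            # Q/DQ network
config.set_flag(trt.BuilderFlag.SPARSE_WEIGHTS)  # allow sptensor tactics

# Seed the cache from a previous build if one exists; mismatched entries
# (like the epiFadd cases above) are re-measured rather than reused.
try:
    with open(CACHE, "rb") as f:
        cache = config.create_timing_cache(f.read())
except FileNotFoundError:
    cache = config.create_timing_cache(b"")
config.set_timing_cache(cache, ignore_mismatch=False)

engine = builder.build_serialized_network(network, config)
with open(CACHE, "wb") as f:
    f.write(cache.serialize())   # reusable by later builds

trtexec exposes the same mechanism through --timingCacheFile, which lets separate runs share one cache instead of the default per-run local cache.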
ampere_int8_i8816cudnn_int8_256x128_ldg16_relu_interior_nt_v1 Tactic: 434957160407688216 [03/25/2022-13:24:18] [V] [TRT] Tactic: 434957160407688216 Time: 0.204672 [03/25/2022-13:24:18] [V] [TRT] sections.0.2.conv1.module.weight + QuantizeLinear_142_quantize_scale_node + Conv_146 + Relu_148 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage4_warpsize4x2x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: 805889586762897346 [03/25/2022-13:24:18] [V] [TRT] Tactic: 805889586762897346 Time: 0.128 [03/25/2022-13:24:18] [V] [TRT] sections.0.2.conv1.module.weight + QuantizeLinear_142_quantize_scale_node + Conv_146 + Relu_148 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x128_stage3_warpsize4x2x1_g1_sptensor16x8x64_t1r1s1 Tactic: 857001784974286465 [03/25/2022-13:24:18] [V] [TRT] Tactic: 857001784974286465 Time: 0.22144 [03/25/2022-13:24:18] [V] [TRT] sections.0.2.conv1.module.weight + QuantizeLinear_142_quantize_scale_node + Conv_146 + Relu_148 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 1214130898909872671 [03/25/2022-13:24:18] [V] [TRT] Tactic: 1214130898909872671 Time: 0.151808 [03/25/2022-13:24:18] [V] [TRT] sections.0.2.conv1.module.weight + QuantizeLinear_142_quantize_scale_node + Conv_146 + Relu_148 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage4_warpsize4x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 1278425129871930205 [03/25/2022-13:24:18] [V] [TRT] Tactic: 1278425129871930205 Time: 0.121088 [03/25/2022-13:24:18] [V] [TRT] sections.0.2.conv1.module.weight + QuantizeLinear_142_quantize_scale_node + Conv_146 + Relu_148 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 1583811548148740665 [03/25/2022-13:24:18] [V] [TRT] Tactic: 1583811548148740665 Time: 0.199936 [03/25/2022-13:24:18] [V] [TRT] sections.0.2.conv1.module.weight + QuantizeLinear_142_quantize_scale_node + Conv_146 + Relu_148 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 1701344857577810806 [03/25/2022-13:24:18] [V] [TRT] Tactic: 1701344857577810806 Time: 0.120832 [03/25/2022-13:24:18] [V] [TRT] sections.0.2.conv1.module.weight + QuantizeLinear_142_quantize_scale_node + Conv_146 + Relu_148 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x32x64_stage4_warpsize2x1x1_g1_tensor16x8x32_t1r1s1 Tactic: 1797231177354918208 [03/25/2022-13:24:18] [V] [TRT] Tactic: 1797231177354918208 Time: 0.10816 [03/25/2022-13:24:18] [V] [TRT] sections.0.2.conv1.module.weight + QuantizeLinear_142_quantize_scale_node + Conv_146 + Relu_148 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x192x64_stage3_warpsize4x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 2004812516525036381 [03/25/2022-13:24:18] [V] [TRT] Tactic: 2004812516525036381 Time: 0.169984 [03/25/2022-13:24:18] [V] [TRT] 
sections.0.2.conv1.module.weight + QuantizeLinear_142_quantize_scale_node + Conv_146 + Relu_148 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x128x64_stage3_warpsize4x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 2030033463723799063 [03/25/2022-13:24:18] [V] [TRT] Tactic: 2030033463723799063 Time: 0.126208 [03/25/2022-13:24:18] [V] [TRT] sections.0.2.conv1.module.weight + QuantizeLinear_142_quantize_scale_node + Conv_146 + Relu_148 Set Tactic Name: ampere_int8_i8816cudnn_int8_128x128_ldg16_relu_interior_nt_v1 Tactic: 2346437292116182513 [03/25/2022-13:24:18] [V] [TRT] Tactic: 2346437292116182513 Time: 0.168064 [03/25/2022-13:24:18] [V] [TRT] sections.0.2.conv1.module.weight + QuantizeLinear_142_quantize_scale_node + Conv_146 + Relu_148 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 2376898825218218566 [03/25/2022-13:24:18] [V] [TRT] Tactic: 2376898825218218566 Time: 0.109056 [03/25/2022-13:24:18] [V] [TRT] sections.0.2.conv1.module.weight + QuantizeLinear_142_quantize_scale_node + Conv_146 + Relu_148 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_singleBuffer_interior_nt_v1 Tactic: 2522133112320625287 [03/25/2022-13:24:18] [V] [TRT] Tactic: 2522133112320625287 Time: 0.11648 [03/25/2022-13:24:18] [V] [TRT] sections.0.2.conv1.module.weight + QuantizeLinear_142_quantize_scale_node + Conv_146 + Relu_148 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage4_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 2548171972648455240 [03/25/2022-13:24:18] [V] [TRT] Tactic: 2548171972648455240 Time: 0.10496 [03/25/2022-13:24:18] [V] [TRT] sections.0.2.conv1.module.weight + QuantizeLinear_142_quantize_scale_node + Conv_146 + Relu_148 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage1_warpsize2x1x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: 2548946449357458230 [03/25/2022-13:24:18] [V] [TRT] Tactic: 2548946449357458230 Time: 0.118784 [03/25/2022-13:24:18] [V] [TRT] sections.0.2.conv1.module.weight + QuantizeLinear_142_quantize_scale_node + Conv_146 + Relu_148 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x128_stage3_warpsize1x1x1_g1_sptensor16x8x64_simple_t1r1s1_no_preds Tactic: 2570666021825229009 [03/25/2022-13:24:18] [V] [TRT] Tactic: 2570666021825229009 Time: 0.11712 [03/25/2022-13:24:18] [V] [TRT] sections.0.2.conv1.module.weight + QuantizeLinear_142_quantize_scale_node + Conv_146 + Relu_148 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x16x128_stage3_warpsize1x1x1_g1_sptensor16x8x64_simple_t1r1s1 Tactic: 2678520742286844763 [03/25/2022-13:24:18] [V] [TRT] Tactic: 2678520742286844763 Time: 0.192896 [03/25/2022-13:24:18] [V] [TRT] sections.0.2.conv1.module.weight + QuantizeLinear_142_quantize_scale_node + Conv_146 + Relu_148 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 2756291002030759362 [03/25/2022-13:24:18] [V] [TRT] Tactic: 2756291002030759362 Time: 
0.103552 [03/25/2022-13:24:18] [V] [TRT] sections.0.2.conv1.module.weight + QuantizeLinear_142_quantize_scale_node + Conv_146 + Relu_148 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x128x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 2972948223367788520 [03/25/2022-13:24:18] [V] [TRT] Tactic: 2972948223367788520 Time: 0.115072 [03/25/2022-13:24:18] [V] [TRT] sections.0.2.conv1.module.weight + QuantizeLinear_142_quantize_scale_node + Conv_146 + Relu_148 Set Tactic Name: ampere_int8_i8816cudnn_int8_128x128_ldg16_relu_medium_nt_v1 Tactic: 2985940154541537814 [03/25/2022-13:24:18] [V] [TRT] Tactic: 2985940154541537814 Time: 0.167296 [03/25/2022-13:24:18] [V] [TRT] sections.0.2.conv1.module.weight + QuantizeLinear_142_quantize_scale_node + Conv_146 + Relu_148 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x96x64_stage5_warpsize2x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 3043273137345374664 [03/25/2022-13:24:18] [V] [TRT] Tactic: 3043273137345374664 Time: 0.153344 [03/25/2022-13:24:18] [V] [TRT] sections.0.2.conv1.module.weight + QuantizeLinear_142_quantize_scale_node + Conv_146 + Relu_148 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 3221677093659484230 [03/25/2022-13:24:18] [V] [TRT] Tactic: 3221677093659484230 Time: 0.185728 [03/25/2022-13:24:18] [V] [TRT] sections.0.2.conv1.module.weight + QuantizeLinear_142_quantize_scale_node + Conv_146 + Relu_148 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r1s1 Tactic: 3242897809704328258 [03/25/2022-13:24:18] [V] [TRT] Tactic: 3242897809704328258 Time: 0.112512 [03/25/2022-13:24:18] [V] [TRT] sections.0.2.conv1.module.weight + QuantizeLinear_142_quantize_scale_node + Conv_146 + Relu_148 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: 3312456766204252694 [03/25/2022-13:24:18] [V] [TRT] Tactic: 3312456766204252694 Time: 0.120448 [03/25/2022-13:24:18] [V] [TRT] sections.0.2.conv1.module.weight + QuantizeLinear_142_quantize_scale_node + Conv_146 + Relu_148 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 3538565962642681625 [03/25/2022-13:24:18] [V] [TRT] Tactic: 3538565962642681625 Time: 0.103296 [03/25/2022-13:24:18] [V] [TRT] sections.0.2.conv1.module.weight + QuantizeLinear_142_quantize_scale_node + Conv_146 + Relu_148 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage1_warpsize2x2x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: 3541919052468401776 [03/25/2022-13:24:18] [V] [TRT] Tactic: 3541919052468401776 Time: 0.150784 [03/25/2022-13:24:18] [V] [TRT] sections.0.2.conv1.module.weight + QuantizeLinear_142_quantize_scale_node + Conv_146 + Relu_148 Set Tactic Name: 
sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 3593397928177382100 [03/25/2022-13:24:18] [V] [TRT] Tactic: 3593397928177382100 Time: 0.151808 [03/25/2022-13:24:18] [V] [TRT] sections.0.2.conv1.module.weight + QuantizeLinear_142_quantize_scale_node + Conv_146 + Relu_148 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x64x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 3670282018109435863 [03/25/2022-13:24:18] [V] [TRT] Tactic: 3670282018109435863 Time: 0.103296 [03/25/2022-13:24:18] [V] [TRT] sections.0.2.conv1.module.weight + QuantizeLinear_142_quantize_scale_node + Conv_146 + Relu_148 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 3671413346254027573 [03/25/2022-13:24:18] [V] [TRT] Tactic: 3671413346254027573 Time: 0.115584 [03/25/2022-13:24:18] [V] [TRT] sections.0.2.conv1.module.weight + QuantizeLinear_142_quantize_scale_node + Conv_146 + Relu_148 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x128_ldg16_relu_medium_nt_v1 Tactic: 3899284354987683408 [03/25/2022-13:24:18] [V] [TRT] Tactic: 3899284354987683408 Time: 0.202624 [03/25/2022-13:24:18] [V] [TRT] sections.0.2.conv1.module.weight + QuantizeLinear_142_quantize_scale_node + Conv_146 + Relu_148 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: 3927509214678622419 [03/25/2022-13:24:18] [V] [TRT] Tactic: 3927509214678622419 Time: 0.109696 [03/25/2022-13:24:18] [V] [TRT] sections.0.2.conv1.module.weight + QuantizeLinear_142_quantize_scale_node + Conv_146 + Relu_148 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 4112572034735311841 [03/25/2022-13:24:18] [V] [TRT] Tactic: 4112572034735311841 Time: 0.168192 [03/25/2022-13:24:18] [V] [TRT] sections.0.2.conv1.module.weight + QuantizeLinear_142_quantize_scale_node + Conv_146 + Relu_148 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 4239974928951431644 [03/25/2022-13:24:18] [V] [TRT] Tactic: 4239974928951431644 Time: 0.121728 [03/25/2022-13:24:18] [V] [TRT] sections.0.2.conv1.module.weight + QuantizeLinear_142_quantize_scale_node + Conv_146 + Relu_148 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage6_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 4610760414797216079 [03/25/2022-13:24:18] [V] [TRT] Tactic: 4610760414797216079 Time: 0.131328 [03/25/2022-13:24:18] [V] [TRT] sections.0.2.conv1.module.weight + QuantizeLinear_142_quantize_scale_node + Conv_146 + Relu_148 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_small_nt_v1 Tactic: 4717285412741024953 [03/25/2022-13:24:18] [V] [TRT] Tactic: 4717285412741024953 Time: 0.119552 [03/25/2022-13:24:18] [V] [TRT] sections.0.2.conv1.module.weight + QuantizeLinear_142_quantize_scale_node + Conv_146 + Relu_148 Set 
Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 4796956614760326119 [03/25/2022-13:24:18] [V] [TRT] Tactic: 4796956614760326119 Time: 0.139904 [03/25/2022-13:24:18] [V] [TRT] sections.0.2.conv1.module.weight + QuantizeLinear_142_quantize_scale_node + Conv_146 + Relu_148 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 4919361344804309192 [03/25/2022-13:24:18] [V] [TRT] Tactic: 4919361344804309192 Time: 0.1152 [03/25/2022-13:24:18] [V] [TRT] sections.0.2.conv1.module.weight + QuantizeLinear_142_quantize_scale_node + Conv_146 + Relu_148 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x128_stage3_warpsize1x2x1_g1_sptensor16x8x64_simple_t1r1s1 Tactic: 5043674678294309681 [03/25/2022-13:24:18] [V] [TRT] Tactic: 5043674678294309681 Time: 0.103296 [03/25/2022-13:24:18] [V] [TRT] sections.0.2.conv1.module.weight + QuantizeLinear_142_quantize_scale_node + Conv_146 + Relu_148 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_interior_nt_v1 Tactic: 5126565865931538390 [03/25/2022-13:24:18] [V] [TRT] Tactic: 5126565865931538390 Time: 0.117632 [03/25/2022-13:24:18] [V] [TRT] sections.0.2.conv1.module.weight + QuantizeLinear_142_quantize_scale_node + Conv_146 + Relu_148 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x64x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 5204702486885981735 [03/25/2022-13:24:18] [V] [TRT] Tactic: 5204702486885981735 Time: 0.107264 [03/25/2022-13:24:18] [V] [TRT] sections.0.2.conv1.module.weight + QuantizeLinear_142_quantize_scale_node + Conv_146 + Relu_148 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: 5375256703210220108 [03/25/2022-13:24:18] [V] [TRT] Tactic: 5375256703210220108 Time: 0.101888 [03/25/2022-13:24:18] [V] [TRT] sections.0.2.conv1.module.weight + QuantizeLinear_142_quantize_scale_node + Conv_146 + Relu_148 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x128_stage3_warpsize2x2x1_g1_sptensor16x8x64_simple_t1r1s1 Tactic: 5424258848951129084 [03/25/2022-13:24:18] [V] [TRT] Tactic: 5424258848951129084 Time: 0.107776 [03/25/2022-13:24:18] [V] [TRT] sections.0.2.conv1.module.weight + QuantizeLinear_142_quantize_scale_node + Conv_146 + Relu_148 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 5443897483205284103 [03/25/2022-13:24:18] [V] [TRT] Tactic: 5443897483205284103 Time: 0.118912 [03/25/2022-13:24:18] [V] [TRT] sections.0.2.conv1.module.weight + QuantizeLinear_142_quantize_scale_node + Conv_146 + Relu_148 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage4_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 5707566217891294846 [03/25/2022-13:24:18] [V] [TRT] Tactic: 5707566217891294846 Time: 0.107008 
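Every per-tactic measurement in this run follows the same "Tactic: <id> Time: <ms>" pattern, so the candidates can be sifted mechanically once the verbose output is captured to a file. A minimal Python sketch, assuming the log above was saved as trtexec_verbose.log (a hypothetical filename) and that the reported times are milliseconds; note that it mixes tactics from different layers unless the lines are first filtered by layer name:

import re

# Matches the autotuner's per-tactic timing entries, e.g.
# "Tactic: -8985599729413291927 Time: 0.101632".
TACTIC_RE = re.compile(r"Tactic: (-?\d+) Time: ([0-9.]+)")

def fastest_tactics(path, top_n=5):
    times = {}
    with open(path) as f:
        for line in f:
            for tactic_id, ms in TACTIC_RE.findall(line):
                times[tactic_id] = float(ms)
    # The builder keeps the minimum-time tactic; sort ascending to see the winners.
    return sorted(times.items(), key=lambda kv: kv[1])[:top_n]

if __name__ == "__main__":
    for tactic_id, ms in fastest_tactics("trtexec_verbose.log"):
        print(f"{tactic_id}\t{ms:.6f}")
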
[03/25/2022-13:24:18] [V] [TRT] sections.0.2.conv1.module.weight + QuantizeLinear_142_quantize_scale_node + Conv_146 + Relu_148 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x4x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 5986622376339202983 [03/25/2022-13:24:18] [V] [TRT] Tactic: 5986622376339202983 Time: 0.132352 [03/25/2022-13:24:18] [V] [TRT] sections.0.2.conv1.module.weight + QuantizeLinear_142_quantize_scale_node + Conv_146 + Relu_148 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x128x64_stage3_warpsize4x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 6007888770437705057 [03/25/2022-13:24:18] [V] [TRT] Tactic: 6007888770437705057 Time: 0.148864 [03/25/2022-13:24:18] [V] [TRT] sections.0.2.conv1.module.weight + QuantizeLinear_142_quantize_scale_node + Conv_146 + Relu_148 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32_t1r1s1 Tactic: 6405251167055673379 [03/25/2022-13:24:18] [V] [TRT] Tactic: 6405251167055673379 Time: 0.14336 [03/25/2022-13:24:18] [V] [TRT] sections.0.2.conv1.module.weight + QuantizeLinear_142_quantize_scale_node + Conv_146 + Relu_148 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage4_warpsize4x1x1_g1_tensor16x8x32_t1r1s1 Tactic: 6433368103202497147 [03/25/2022-13:24:18] [V] [TRT] Tactic: 6433368103202497147 Time: 0.1088 [03/25/2022-13:24:18] [V] [TRT] sections.0.2.conv1.module.weight + QuantizeLinear_142_quantize_scale_node + Conv_146 + Relu_148 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 6441948709525127755 [03/25/2022-13:24:18] [V] [TRT] Tactic: 6441948709525127755 Time: 0.152448 [03/25/2022-13:24:18] [V] [TRT] sections.0.2.conv1.module.weight + QuantizeLinear_142_quantize_scale_node + Conv_146 + Relu_148 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x256x64_stage3_warpsize2x4x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 6443933097134654777 [03/25/2022-13:24:18] [V] [TRT] Tactic: 6443933097134654777 Time: 0.25536 [03/25/2022-13:24:18] [V] [TRT] sections.0.2.conv1.module.weight + QuantizeLinear_142_quantize_scale_node + Conv_146 + Relu_148 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 6457435868048963632 [03/25/2022-13:24:18] [V] [TRT] Tactic: 6457435868048963632 Time: 0.133632 [03/25/2022-13:24:18] [V] [TRT] sections.0.2.conv1.module.weight + QuantizeLinear_142_quantize_scale_node + Conv_146 + Relu_148 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x128_stage3_warpsize1x1x1_g1_sptensor16x8x64_t1r1s1 Tactic: 6510345569544721081 [03/25/2022-13:24:18] [V] [TRT] Tactic: 6510345569544721081 Time: 0.116992 [03/25/2022-13:24:18] [V] [TRT] sections.0.2.conv1.module.weight + QuantizeLinear_142_quantize_scale_node + Conv_146 + Relu_148 Set Tactic Name: 
sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 6793988781414507278 [03/25/2022-13:24:18] [V] [TRT] Tactic: 6793988781414507278 Time: 0.1152 [03/25/2022-13:24:18] [V] [TRT] sections.0.2.conv1.module.weight + QuantizeLinear_142_quantize_scale_node + Conv_146 + Relu_148 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 6880710371738875469 [03/25/2022-13:24:18] [V] [TRT] Tactic: 6880710371738875469 Time: 0.161024 [03/25/2022-13:24:18] [V] [TRT] sections.0.2.conv1.module.weight + QuantizeLinear_142_quantize_scale_node + Conv_146 + Relu_148 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 6925201228918187099 [03/25/2022-13:24:18] [V] [TRT] Tactic: 6925201228918187099 Time: 0.11904 [03/25/2022-13:24:18] [V] [TRT] sections.0.2.conv1.module.weight + QuantizeLinear_142_quantize_scale_node + Conv_146 + Relu_148 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 6991524515605108718 [03/25/2022-13:24:18] [V] [TRT] Tactic: 6991524515605108718 Time: 0.221056 [03/25/2022-13:24:18] [V] [TRT] sections.0.2.conv1.module.weight + QuantizeLinear_142_quantize_scale_node + Conv_146 + Relu_148 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x4x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 7245509442265271220 [03/25/2022-13:24:18] [V] [TRT] Tactic: 7245509442265271220 Time: 0.129408 [03/25/2022-13:24:18] [V] [TRT] sections.0.2.conv1.module.weight + QuantizeLinear_142_quantize_scale_node + Conv_146 + Relu_148 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x64x64_stage3_warpsize4x1x1_g1_tensor16x8x32_t1r1s1 Tactic: 7318929579222925725 [03/25/2022-13:24:18] [V] [TRT] Tactic: 7318929579222925725 Time: 0.10688 [03/25/2022-13:24:18] [V] [TRT] sections.0.2.conv1.module.weight + QuantizeLinear_142_quantize_scale_node + Conv_146 + Relu_148 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: 7731430299029542276 [03/25/2022-13:24:18] [V] [TRT] Tactic: 7731430299029542276 Time: 0.113024 [03/25/2022-13:24:18] [V] [TRT] sections.0.2.conv1.module.weight + QuantizeLinear_142_quantize_scale_node + Conv_146 + Relu_148 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage4_warpsize2x4x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: 7738495016763012180 [03/25/2022-13:24:18] [V] [TRT] Tactic: 7738495016763012180 Time: 0.222208 [03/25/2022-13:24:18] [V] [TRT] sections.0.2.conv1.module.weight + QuantizeLinear_142_quantize_scale_node + Conv_146 + Relu_148 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x128_stage3_warpsize1x4x1_g1_sptensor16x8x64_t1r1s1_no_preds Tactic: 7886967395128926382 
[03/25/2022-13:24:18] [V] [TRT] Tactic: 7886967395128926382 Time: 0.10176 [03/25/2022-13:24:18] [V] [TRT] sections.0.2.conv1.module.weight + QuantizeLinear_142_quantize_scale_node + Conv_146 + Relu_148 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage4_warpsize4x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 8142283985160822229 [03/25/2022-13:24:18] [V] [TRT] Tactic: 8142283985160822229 Time: 0.141312 [03/25/2022-13:24:18] [V] [TRT] sections.0.2.conv1.module.weight + QuantizeLinear_142_quantize_scale_node + Conv_146 + Relu_148 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage4_warpsize2x4x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 8173975624668590862 [03/25/2022-13:24:18] [V] [TRT] Tactic: 8173975624668590862 Time: 0.252672 [03/25/2022-13:24:18] [V] [TRT] sections.0.2.conv1.module.weight + QuantizeLinear_142_quantize_scale_node + Conv_146 + Relu_148 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage4_warpsize4x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 8234775147403903473 [03/25/2022-13:24:18] [V] [TRT] Tactic: 8234775147403903473 Time: 0.14784 [03/25/2022-13:24:18] [V] [TRT] sections.0.2.conv1.module.weight + QuantizeLinear_142_quantize_scale_node + Conv_146 + Relu_148 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x128x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 8524082966802584889 [03/25/2022-13:24:18] [V] [TRT] Tactic: 8524082966802584889 Time: 0.11968 [03/25/2022-13:24:18] [V] [TRT] sections.0.2.conv1.module.weight + QuantizeLinear_142_quantize_scale_node + Conv_146 + Relu_148 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage1_warpsize4x1x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: 8684013308930763400 [03/25/2022-13:24:18] [V] [TRT] Tactic: 8684013308930763400 Time: 0.113024 [03/25/2022-13:24:18] [V] [TRT] sections.0.2.conv1.module.weight + QuantizeLinear_142_quantize_scale_node + Conv_146 + Relu_148 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x128x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 8765382722978397630 [03/25/2022-13:24:18] [V] [TRT] Tactic: 8765382722978397630 Time: 0.120704 [03/25/2022-13:24:18] [V] [TRT] sections.0.2.conv1.module.weight + QuantizeLinear_142_quantize_scale_node + Conv_146 + Relu_148 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 8843193587782643431 [03/25/2022-13:24:18] [V] [TRT] Tactic: 8843193587782643431 Time: 0.126336 [03/25/2022-13:24:18] [V] [TRT] sections.0.2.conv1.module.weight + QuantizeLinear_142_quantize_scale_node + Conv_146 + Relu_148 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x128_stage3_warpsize1x4x1_g1_sptensor16x8x64_simple_t1r1s1 Tactic: 8883810517410230831 [03/25/2022-13:24:18] [V] [TRT] Tactic: 8883810517410230831 Time: 0.101888 [03/25/2022-13:24:18] [V] [TRT] sections.0.2.conv1.module.weight + QuantizeLinear_142_quantize_scale_node + Conv_146 + Relu_148 Set 
Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 8930797211803511337 [03/25/2022-13:24:18] [V] [TRT] Tactic: 8930797211803511337 Time: 0.208384 [03/25/2022-13:24:18] [V] [TRT] sections.0.2.conv1.module.weight + QuantizeLinear_142_quantize_scale_node + Conv_146 + Relu_148 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 8935070489925739043 [03/25/2022-13:24:18] [V] [TRT] Tactic: 8935070489925739043 Time: 0.107904 [03/25/2022-13:24:18] [V] [TRT] sections.0.2.conv1.module.weight + QuantizeLinear_142_quantize_scale_node + Conv_146 + Relu_148 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x16x128_stage3_warpsize1x1x1_g1_sptensor16x8x64_t1r1s1 Tactic: 9062173295331155069 [03/25/2022-13:24:18] [V] [TRT] Tactic: 9062173295331155069 Time: 0.192768 [03/25/2022-13:24:18] [V] [TRT] sections.0.2.conv1.module.weight + QuantizeLinear_142_quantize_scale_node + Conv_146 + Relu_148 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_singleBuffer_small_nt_v1 Tactic: -9118785798277698619 [03/25/2022-13:24:18] [V] [TRT] Tactic: -9118785798277698619 Time: 0.117888 [03/25/2022-13:24:18] [V] [TRT] sections.0.2.conv1.module.weight + QuantizeLinear_142_quantize_scale_node + Conv_146 + Relu_148 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -8985599729413291927 [03/25/2022-13:24:18] [V] [TRT] Tactic: -8985599729413291927 Time: 0.101632 [03/25/2022-13:24:18] [V] [TRT] sections.0.2.conv1.module.weight + QuantizeLinear_142_quantize_scale_node + Conv_146 + Relu_148 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x32x64_stage4_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -8972697510150675429 [03/25/2022-13:24:18] [V] [TRT] Tactic: -8972697510150675429 Time: 0.103424 [03/25/2022-13:24:18] [V] [TRT] sections.0.2.conv1.module.weight + QuantizeLinear_142_quantize_scale_node + Conv_146 + Relu_148 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r1s1 Tactic: -8943710627305202139 [03/25/2022-13:24:18] [V] [TRT] Tactic: -8943710627305202139 Time: 0.161024 [03/25/2022-13:24:18] [V] [TRT] sections.0.2.conv1.module.weight + QuantizeLinear_142_quantize_scale_node + Conv_146 + Relu_148 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: -8859846367886814331 [03/25/2022-13:24:18] [V] [TRT] Tactic: -8859846367886814331 Time: 0.114944 [03/25/2022-13:24:18] [V] [TRT] sections.0.2.conv1.module.weight + QuantizeLinear_142_quantize_scale_node + Conv_146 + Relu_148 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: -8638624340850784688 [03/25/2022-13:24:18] [V] [TRT] Tactic: -8638624340850784688 Time: 
0.142976 [03/25/2022-13:24:18] [V] [TRT] sections.0.2.conv1.module.weight + QuantizeLinear_142_quantize_scale_node + Conv_146 + Relu_148 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage4_warpsize4x1x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: -8556775352640313933 [03/25/2022-13:24:18] [V] [TRT] Tactic: -8556775352640313933 Time: 0.106112 [03/25/2022-13:24:18] [V] [TRT] sections.0.2.conv1.module.weight + QuantizeLinear_142_quantize_scale_node + Conv_146 + Relu_148 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage1_warpsize2x1x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: -8382298409581540699 [03/25/2022-13:24:18] [V] [TRT] Tactic: -8382298409581540699 Time: 0.132352 [03/25/2022-13:24:18] [V] [TRT] sections.0.2.conv1.module.weight + QuantizeLinear_142_quantize_scale_node + Conv_146 + Relu_148 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage1_warpsize4x2x1_g1_tensor8x8x16_t1r1s1 Tactic: -8172318747337038866 [03/25/2022-13:24:18] [V] [TRT] Tactic: -8172318747337038866 Time: 0.194688 [03/25/2022-13:24:18] [V] [TRT] sections.0.2.conv1.module.weight + QuantizeLinear_142_quantize_scale_node + Conv_146 + Relu_148 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x256x64_stage3_warpsize2x4x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -8038164441468184723 [03/25/2022-13:24:18] [V] [TRT] Tactic: -8038164441468184723 Time: 0.213248 [03/25/2022-13:24:18] [V] [TRT] sections.0.2.conv1.module.weight + QuantizeLinear_142_quantize_scale_node + Conv_146 + Relu_148 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x96x64_stage5_warpsize2x1x1_g1_tensor16x8x32_t1r1s1 Tactic: -7844028314176826857 [03/25/2022-13:24:18] [V] [TRT] Tactic: -7844028314176826857 Time: 0.153984 [03/25/2022-13:24:18] [V] [TRT] sections.0.2.conv1.module.weight + QuantizeLinear_142_quantize_scale_node + Conv_146 + Relu_148 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage4_warpsize2x4x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -7364286662638617917 [03/25/2022-13:24:18] [V] [TRT] Tactic: -7364286662638617917 Time: 0.2112 [03/25/2022-13:24:18] [V] [TRT] sections.0.2.conv1.module.weight + QuantizeLinear_142_quantize_scale_node + Conv_146 + Relu_148 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage1_warpsize2x1x1_g1_tensor8x8x16_t1r1s1 Tactic: -7361755530333096258 [03/25/2022-13:24:18] [V] [TRT] Tactic: -7361755530333096258 Time: 0.122368 [03/25/2022-13:24:18] [V] [TRT] sections.0.2.conv1.module.weight + QuantizeLinear_142_quantize_scale_node + Conv_146 + Relu_148 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x128_stage3_warpsize1x4x1_g1_sptensor16x8x64_t1r1s1 Tactic: -7289760022626653388 [03/25/2022-13:24:18] [V] [TRT] Tactic: -7289760022626653388 Time: 0.10176 [03/25/2022-13:24:18] [V] [TRT] sections.0.2.conv1.module.weight + QuantizeLinear_142_quantize_scale_node + Conv_146 + Relu_148 Set Tactic Name: 
sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage1_warpsize2x4x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: -7106539943789766885 [03/25/2022-13:24:18] [V] [TRT] Tactic: -7106539943789766885 Time: 0.34048 [03/25/2022-13:24:18] [V] [TRT] sections.0.2.conv1.module.weight + QuantizeLinear_142_quantize_scale_node + Conv_146 + Relu_148 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage1_warpsize4x2x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: -6969478418607271266 [03/25/2022-13:24:18] [V] [TRT] Tactic: -6969478418607271266 Time: 0.188288 [03/25/2022-13:24:18] [V] [TRT] sections.0.2.conv1.module.weight + QuantizeLinear_142_quantize_scale_node + Conv_146 + Relu_148 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -6930438165437733000 [03/25/2022-13:24:18] [V] [TRT] Tactic: -6930438165437733000 Time: 0.168448 [03/25/2022-13:24:18] [V] [TRT] sections.0.2.conv1.module.weight + QuantizeLinear_142_quantize_scale_node + Conv_146 + Relu_148 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x64x64_stage3_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -6879607992933502380 [03/25/2022-13:24:18] [V] [TRT] Tactic: -6879607992933502380 Time: 0.105472 [03/25/2022-13:24:18] [V] [TRT] sections.0.2.conv1.module.weight + QuantizeLinear_142_quantize_scale_node + Conv_146 + Relu_148 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x128_stage3_warpsize1x2x1_g1_sptensor16x8x64_simple_t1r1s1_no_preds Tactic: -6839669803644810934 [03/25/2022-13:24:18] [V] [TRT] Tactic: -6839669803644810934 Time: 0.102912 [03/25/2022-13:24:18] [V] [TRT] sections.0.2.conv1.module.weight + QuantizeLinear_142_quantize_scale_node + Conv_146 + Relu_148 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x128_stage3_warpsize1x2x1_g1_sptensor16x8x64_t1r1s1_no_preds Tactic: -6812830108414456369 [03/25/2022-13:24:18] [V] [TRT] Tactic: -6812830108414456369 Time: 0.102912 [03/25/2022-13:24:18] [V] [TRT] sections.0.2.conv1.module.weight + QuantizeLinear_142_quantize_scale_node + Conv_146 + Relu_148 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: -6527178416855951297 [03/25/2022-13:24:18] [V] [TRT] Tactic: -6527178416855951297 Time: 0.118016 [03/25/2022-13:24:18] [V] [TRT] sections.0.2.conv1.module.weight + QuantizeLinear_142_quantize_scale_node + Conv_146 + Relu_148 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_t1r1s1 Tactic: -6510232214299595844 [03/25/2022-13:24:18] [V] [TRT] Tactic: -6510232214299595844 Time: 0.118016 [03/25/2022-13:24:18] [V] [TRT] sections.0.2.conv1.module.weight + QuantizeLinear_142_quantize_scale_node + Conv_146 + Relu_148 Set Tactic Name: ampere_int8_i8816cudnn_int8_128x128_ldg16_relu_small_nt_v1 Tactic: -6400348606759295499 [03/25/2022-13:24:18] [V] [TRT] Tactic: -6400348606759295499 Time: 0.161152 
[03/25/2022-13:24:18] [V] [TRT] sections.0.2.conv1.module.weight + QuantizeLinear_142_quantize_scale_node + Conv_146 + Relu_148 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage1_warpsize2x2x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: -6346247605026339453 [03/25/2022-13:24:18] [V] [TRT] Tactic: -6346247605026339453 Time: 0.157824 [03/25/2022-13:24:18] [V] [TRT] sections.0.2.conv1.module.weight + QuantizeLinear_142_quantize_scale_node + Conv_146 + Relu_148 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x96x64_stage4_warpsize4x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -6232597026469067819 [03/25/2022-13:24:18] [V] [TRT] Tactic: -6232597026469067819 Time: 0.14208 [03/25/2022-13:24:18] [V] [TRT] sections.0.2.conv1.module.weight + QuantizeLinear_142_quantize_scale_node + Conv_146 + Relu_148 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x128_ldg16_relu_small_nt_v1 Tactic: -5980889159865208399 [03/25/2022-13:24:18] [V] [TRT] Tactic: -5980889159865208399 Time: 0.19776 [03/25/2022-13:24:18] [V] [TRT] sections.0.2.conv1.module.weight + QuantizeLinear_142_quantize_scale_node + Conv_146 + Relu_148 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_singleBuffer_medium_nt_v1 Tactic: -5766140806760372989 [03/25/2022-13:24:18] [V] [TRT] Tactic: -5766140806760372989 Time: 0.118656 [03/25/2022-13:24:18] [V] [TRT] sections.0.2.conv1.module.weight + QuantizeLinear_142_quantize_scale_node + Conv_146 + Relu_148 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r1s1 Tactic: -5697614955743334137 [03/25/2022-13:24:18] [V] [TRT] Tactic: -5697614955743334137 Time: 0.155776 [03/25/2022-13:24:18] [V] [TRT] sections.0.2.conv1.module.weight + QuantizeLinear_142_quantize_scale_node + Conv_146 + Relu_148 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -5671123121710113970 [03/25/2022-13:24:18] [V] [TRT] Tactic: -5671123121710113970 Time: 0.126208 [03/25/2022-13:24:18] [V] [TRT] sections.0.2.conv1.module.weight + QuantizeLinear_142_quantize_scale_node + Conv_146 + Relu_148 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x32x64_stage4_warpsize2x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -5615581362569252260 [03/25/2022-13:24:18] [V] [TRT] Tactic: -5615581362569252260 Time: 0.1056 [03/25/2022-13:24:18] [V] [TRT] sections.0.2.conv1.module.weight + QuantizeLinear_142_quantize_scale_node + Conv_146 + Relu_148 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x128x64_stage3_warpsize4x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -5562968047117507056 [03/25/2022-13:24:18] [V] [TRT] Tactic: -5562968047117507056 Time: 0.146688 [03/25/2022-13:24:18] [V] [TRT] sections.0.2.conv1.module.weight + QuantizeLinear_142_quantize_scale_node + Conv_146 + Relu_148 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -5516472881360101487 [03/25/2022-13:24:18] [V] [TRT] Tactic: 
-5516472881360101487 Time: 0.143616 [03/25/2022-13:24:18] [V] [TRT] sections.0.2.conv1.module.weight + QuantizeLinear_142_quantize_scale_node + Conv_146 + Relu_148 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r1s1 Tactic: -5311474420963248369 [03/25/2022-13:24:18] [V] [TRT] Tactic: -5311474420963248369 Time: 0.123136 [03/25/2022-13:24:18] [V] [TRT] sections.0.2.conv1.module.weight + QuantizeLinear_142_quantize_scale_node + Conv_146 + Relu_148 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: -5170003087447722174 [03/25/2022-13:24:18] [V] [TRT] Tactic: -5170003087447722174 Time: 0.152064 [03/25/2022-13:24:18] [V] [TRT] sections.0.2.conv1.module.weight + QuantizeLinear_142_quantize_scale_node + Conv_146 + Relu_148 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage6_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -4889586143772361690 [03/25/2022-13:24:18] [V] [TRT] Tactic: -4889586143772361690 Time: 0.12096 [03/25/2022-13:24:18] [V] [TRT] sections.0.2.conv1.module.weight + QuantizeLinear_142_quantize_scale_node + Conv_146 + Relu_148 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x128_stage3_warpsize1x4x1_g1_sptensor16x8x64_simple_t1r1s1_no_preds Tactic: -4889498558023475527 [03/25/2022-13:24:18] [V] [TRT] Tactic: -4889498558023475527 Time: 0.10176 [03/25/2022-13:24:18] [V] [TRT] sections.0.2.conv1.module.weight + QuantizeLinear_142_quantize_scale_node + Conv_146 + Relu_148 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: -4849712423393454704 [03/25/2022-13:24:18] [V] [TRT] Tactic: -4849712423393454704 Time: 0.127232 [03/25/2022-13:24:18] [V] [TRT] sections.0.2.conv1.module.weight + QuantizeLinear_142_quantize_scale_node + Conv_146 + Relu_148 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x128_stage3_warpsize4x2x1_g1_sptensor16x8x64_simple_t1r1s1 Tactic: -4681913707320020520 [03/25/2022-13:24:18] [V] [TRT] Tactic: -4681913707320020520 Time: 0.218752 [03/25/2022-13:24:18] [V] [TRT] sections.0.2.conv1.module.weight + QuantizeLinear_142_quantize_scale_node + Conv_146 + Relu_148 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_medium_nt_v1 Tactic: -4516822589357530549 [03/25/2022-13:24:18] [V] [TRT] Tactic: -4516822589357530549 Time: 0.120832 [03/25/2022-13:24:18] [V] [TRT] sections.0.2.conv1.module.weight + QuantizeLinear_142_quantize_scale_node + Conv_146 + Relu_148 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -4455415102719506646 [03/25/2022-13:24:18] [V] [TRT] Tactic: -4455415102719506646 Time: 0.102912 [03/25/2022-13:24:18] [V] [TRT] sections.0.2.conv1.module.weight + QuantizeLinear_142_quantize_scale_node + Conv_146 + Relu_148 Set Tactic Name: 
sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32_t1r1s1 Tactic: -4425346730823666456 [03/25/2022-13:24:18] [V] [TRT] Tactic: -4425346730823666456 Time: 0.185088 [03/25/2022-13:24:18] [V] [TRT] sections.0.2.conv1.module.weight + QuantizeLinear_142_quantize_scale_node + Conv_146 + Relu_148 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage1_warpsize2x1x1_g1_tensor8x8x16_t1r1s1 Tactic: -4260476497340370474 [03/25/2022-13:24:18] [V] [TRT] Tactic: -4260476497340370474 Time: 0.13504 [03/25/2022-13:24:18] [V] [TRT] sections.0.2.conv1.module.weight + QuantizeLinear_142_quantize_scale_node + Conv_146 + Relu_148 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x2x1_g1_tensor16x8x32_t1r1s1 Tactic: -4182501876984672402 [03/25/2022-13:24:18] [V] [TRT] Tactic: -4182501876984672402 Time: 0.12736 [03/25/2022-13:24:18] [V] [TRT] sections.0.2.conv1.module.weight + QuantizeLinear_142_quantize_scale_node + Conv_146 + Relu_148 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x256x64_stage3_warpsize2x4x1_g1_tensor16x8x32_t1r1s1 Tactic: -4151617293257698859 [03/25/2022-13:24:18] [V] [TRT] Tactic: -4151617293257698859 Time: 0.260224 [03/25/2022-13:24:18] [V] [TRT] sections.0.2.conv1.module.weight + QuantizeLinear_142_quantize_scale_node + Conv_146 + Relu_148 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -3825889760337461729 [03/25/2022-13:24:18] [V] [TRT] Tactic: -3825889760337461729 Time: 0.114944 [03/25/2022-13:24:18] [V] [TRT] sections.0.2.conv1.module.weight + QuantizeLinear_142_quantize_scale_node + Conv_146 + Relu_148 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -3797022944823726673 [03/25/2022-13:24:18] [V] [TRT] Tactic: -3797022944823726673 Time: 0.102656 [03/25/2022-13:24:18] [V] [TRT] sections.0.2.conv1.module.weight + QuantizeLinear_142_quantize_scale_node + Conv_146 + Relu_148 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_t1r1s1 Tactic: -3613322253849278738 [03/25/2022-13:24:18] [V] [TRT] Tactic: -3613322253849278738 Time: 0.169088 [03/25/2022-13:24:18] [V] [TRT] sections.0.2.conv1.module.weight + QuantizeLinear_142_quantize_scale_node + Conv_146 + Relu_148 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x32x64_stage4_warpsize4x1x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: -3577322188448771475 [03/25/2022-13:24:18] [V] [TRT] Tactic: -3577322188448771475 Time: 0.104832 [03/25/2022-13:24:18] [V] [TRT] sections.0.2.conv1.module.weight + QuantizeLinear_142_quantize_scale_node + Conv_146 + Relu_148 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage1_warpsize2x4x1_g1_tensor8x8x16_t1r1s1 Tactic: -3531681826488401618 [03/25/2022-13:24:18] [V] 
[TRT] Tactic: -3531681826488401618 Time: 0.351488 [03/25/2022-13:24:18] [V] [TRT] sections.0.2.conv1.module.weight + QuantizeLinear_142_quantize_scale_node + Conv_146 + Relu_148 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x16x128_stage3_warpsize1x1x1_g1_sptensor16x8x64_simple_t1r1s1_no_preds Tactic: -3305554949874552860 [03/25/2022-13:24:18] [V] [TRT] Tactic: -3305554949874552860 Time: 0.192768 [03/25/2022-13:24:18] [V] [TRT] sections.0.2.conv1.module.weight + QuantizeLinear_142_quantize_scale_node + Conv_146 + Relu_148 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x192x64_stage3_warpsize4x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -3288585994448820820 [03/25/2022-13:24:18] [V] [TRT] Tactic: -3288585994448820820 Time: 0.219136 [03/25/2022-13:24:18] [V] [TRT] sections.0.2.conv1.module.weight + QuantizeLinear_142_quantize_scale_node + Conv_146 + Relu_148 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x32x64_stage4_warpsize4x1x1_g1_tensor16x8x32_t1r1s1 Tactic: -2754311112012636251 [03/25/2022-13:24:18] [V] [TRT] Tactic: -2754311112012636251 Time: 0.10624 [03/25/2022-13:24:18] [V] [TRT] sections.0.2.conv1.module.weight + QuantizeLinear_142_quantize_scale_node + Conv_146 + Relu_148 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -2432868635536396215 [03/25/2022-13:24:18] [V] [TRT] Tactic: -2432868635536396215 Time: 0.198144 [03/25/2022-13:24:18] [V] [TRT] sections.0.2.conv1.module.weight + QuantizeLinear_142_quantize_scale_node + Conv_146 + Relu_148 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x96x64_stage5_warpsize2x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -2379804152300264660 [03/25/2022-13:24:18] [V] [TRT] Tactic: -2379804152300264660 Time: 0.152448 [03/25/2022-13:24:18] [V] [TRT] sections.0.2.conv1.module.weight + QuantizeLinear_142_quantize_scale_node + Conv_146 + Relu_148 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x128_stage3_warpsize2x2x1_g1_sptensor16x8x64_t1r1s1 Tactic: -2352253835013627337 [03/25/2022-13:24:18] [V] [TRT] Tactic: -2352253835013627337 Time: 0.107776 [03/25/2022-13:24:18] [V] [TRT] sections.0.2.conv1.module.weight + QuantizeLinear_142_quantize_scale_node + Conv_146 + Relu_148 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x96x64_stage4_warpsize4x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -2335587136911650799 [03/25/2022-13:24:18] [V] [TRT] Tactic: -2335587136911650799 Time: 0.121088 [03/25/2022-13:24:18] [V] [TRT] sections.0.2.conv1.module.weight + QuantizeLinear_142_quantize_scale_node + Conv_146 + Relu_148 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage4_warpsize2x4x1_g1_tensor16x8x32_t1r1s1 Tactic: -2315453944962430928 [03/25/2022-13:24:18] [V] [TRT] Tactic: -2315453944962430928 Time: 0.261376 [03/25/2022-13:24:18] [V] [TRT] sections.0.2.conv1.module.weight + QuantizeLinear_142_quantize_scale_node + Conv_146 + Relu_148 
Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x128_stage3_warpsize1x1x1_g1_sptensor16x8x64_simple_t1r1s1 Tactic: -2238364958919154661 [03/25/2022-13:24:18] [V] [TRT] Tactic: -2238364958919154661 Time: 0.116864 [03/25/2022-13:24:18] [V] [TRT] sections.0.2.conv1.module.weight + QuantizeLinear_142_quantize_scale_node + Conv_146 + Relu_148 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -1916483171117495388 [03/25/2022-13:24:18] [V] [TRT] Tactic: -1916483171117495388 Time: 0.216576 [03/25/2022-13:24:18] [V] [TRT] sections.0.2.conv1.module.weight + QuantizeLinear_142_quantize_scale_node + Conv_146 + Relu_148 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x16x128_stage3_warpsize1x1x1_g1_sptensor16x8x64_t1r1s1_no_preds Tactic: -1740762957710554518 [03/25/2022-13:24:18] [V] [TRT] Tactic: -1740762957710554518 Time: 0.192768 [03/25/2022-13:24:18] [V] [TRT] sections.0.2.conv1.module.weight + QuantizeLinear_142_quantize_scale_node + Conv_146 + Relu_148 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x96x64_stage4_warpsize4x2x1_g1_tensor16x8x32_t1r1s1 Tactic: -1549742793039499659 [03/25/2022-13:24:18] [V] [TRT] Tactic: -1549742793039499659 Time: 0.144128 [03/25/2022-13:24:18] [V] [TRT] sections.0.2.conv1.module.weight + QuantizeLinear_142_quantize_scale_node + Conv_146 + Relu_148 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: -1499578657823798783 [03/25/2022-13:24:18] [V] [TRT] Tactic: -1499578657823798783 Time: 0.102784 [03/25/2022-13:24:18] [V] [TRT] sections.0.2.conv1.module.weight + QuantizeLinear_142_quantize_scale_node + Conv_146 + Relu_148 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r1s1 Tactic: -1494157908358500249 [03/25/2022-13:24:18] [V] [TRT] Tactic: -1494157908358500249 Time: 0.117376 [03/25/2022-13:24:18] [V] [TRT] sections.0.2.conv1.module.weight + QuantizeLinear_142_quantize_scale_node + Conv_146 + Relu_148 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x4x1_g1_tensor16x8x32_t1r1s1 Tactic: -1328736756812546664 [03/25/2022-13:24:18] [V] [TRT] Tactic: -1328736756812546664 Time: 0.135424 [03/25/2022-13:24:18] [V] [TRT] sections.0.2.conv1.module.weight + QuantizeLinear_142_quantize_scale_node + Conv_146 + Relu_148 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x32x64_stage4_warpsize2x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -1006589727652607355 [03/25/2022-13:24:18] [V] [TRT] Tactic: -1006589727652607355 Time: 0.106112 [03/25/2022-13:24:18] [V] [TRT] sections.0.2.conv1.module.weight + QuantizeLinear_142_quantize_scale_node + Conv_146 + Relu_148 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: 
-713022856474991236 [03/25/2022-13:24:19] [V] [TRT] Tactic: -713022856474991236 Time: 0.169344 [03/25/2022-13:24:19] [V] [TRT] sections.0.2.conv1.module.weight + QuantizeLinear_142_quantize_scale_node + Conv_146 + Relu_148 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x128_stage3_warpsize1x1x1_g1_sptensor16x8x64_t1r1s1_no_preds Tactic: -619668460699260222 [03/25/2022-13:24:19] [V] [TRT] Tactic: -619668460699260222 Time: 0.116992 [03/25/2022-13:24:19] [V] [TRT] sections.0.2.conv1.module.weight + QuantizeLinear_142_quantize_scale_node + Conv_146 + Relu_148 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x64x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: -405554772060757402 [03/25/2022-13:24:19] [V] [TRT] Tactic: -405554772060757402 Time: 0.109824 [03/25/2022-13:24:19] [V] [TRT] sections.0.2.conv1.module.weight + QuantizeLinear_142_quantize_scale_node + Conv_146 + Relu_148 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: -375949437730908730 [03/25/2022-13:24:19] [V] [TRT] Tactic: -375949437730908730 Time: 0.102144 [03/25/2022-13:24:19] [V] [TRT] sections.0.2.conv1.module.weight + QuantizeLinear_142_quantize_scale_node + Conv_146 + Relu_148 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x128_stage3_warpsize1x2x1_g1_sptensor16x8x64_t1r1s1 Tactic: -233227833606287806 [03/25/2022-13:24:19] [V] [TRT] Tactic: -233227833606287806 Time: 0.10304 [03/25/2022-13:24:19] [V] [TRT] sections.0.2.conv1.module.weight + QuantizeLinear_142_quantize_scale_node + Conv_146 + Relu_148 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -111878368089469751 [03/25/2022-13:24:19] [V] [TRT] Tactic: -111878368089469751 Time: 0.16832 [03/25/2022-13:24:19] [V] [TRT] sections.0.2.conv1.module.weight + QuantizeLinear_142_quantize_scale_node + Conv_146 + Relu_148 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x64x64_stage3_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -48936598874722005 [03/25/2022-13:24:19] [V] [TRT] Tactic: -48936598874722005 Time: 0.102528 [03/25/2022-13:24:19] [V] [TRT] sections.0.2.conv1.module.weight + QuantizeLinear_142_quantize_scale_node + Conv_146 + Relu_148 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x32x64_stage4_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -19707840769375107 [03/25/2022-13:24:19] [V] [TRT] Tactic: -19707840769375107 Time: 0.104704 [03/25/2022-13:24:19] [V] [TRT] Fastest Tactic: -8985599729413291927 Time: 0.101632 [03/25/2022-13:24:19] [V] [TRT] >>>>>>>>>>>>>>> Chose Runner Type: CaskConvolution Tactic: -8985599729413291927 [03/25/2022-13:24:19] [V] [TRT] =============== Computing costs for [03/25/2022-13:24:19] [V] [TRT] *************** Autotuning format combination: Int8(50176,3136:4,56,1) -> Int8(50176,3136:4,56,1) *************** [03/25/2022-13:24:19] [V] [TRT] *************** Autotuning format combination: Int8(50176,3136:4,56,1) -> Int8(6272,3136:32,56,1) 
*************** [03/25/2022-13:24:19] [V] [TRT] *************** Autotuning format combination: Int8(6272,3136:32,56,1) -> Int8(6272,3136:32,56,1) *************** [03/25/2022-13:24:19] [V] [TRT] =============== Computing costs for [03/25/2022-13:24:19] [V] [TRT] *************** Autotuning format combination: Int8(50176,3136:4,56,1), Int8(200704,3136:4,56,1) -> Int8(200704,3136:4,56,1) *************** [03/25/2022-13:24:19] [V] [TRT] *************** Autotuning format combination: Int8(50176,3136:4,56,1), Int8(25088,3136:32,56,1) -> Int8(25088,3136:32,56,1) *************** [03/25/2022-13:24:19] [V] [TRT] *************** Autotuning format combination: Int8(6272,3136:32,56,1), Int8(25088,3136:32,56,1) -> Int8(25088,3136:32,56,1) *************** [03/25/2022-13:24:19] [V] [TRT] =============== Computing costs for [03/25/2022-13:24:19] [V] [TRT] *************** Autotuning format combination: Int8(200704,3136:4,56,1) -> Int8(100352,3136:4,56,1) *************** [03/25/2022-13:24:19] [V] [TRT] --------------- Timing Runner: sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 (CudaDepthwiseConvolution) [03/25/2022-13:24:19] [V] [TRT] CudaDepthwiseConvolution has no valid tactics for this config, skipping [03/25/2022-13:24:19] [V] [TRT] --------------- Timing Runner: sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 (FusedConvActConvolution) [03/25/2022-13:24:19] [V] [TRT] FusedConvActConvolution has no valid tactics for this config, skipping [03/25/2022-13:24:19] [V] [TRT] --------------- Timing Runner: sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 (CaskConvolution) [03/25/2022-13:24:19] [V] [TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_medium_nn_v1 Tactic: 175853789719975416 [03/25/2022-13:24:19] [V] [TRT] Tactic: 175853789719975416 Time: 0.477696 [03/25/2022-13:24:19] [V] [TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_xregs_medium_nn_v1 Tactic: 2171150287007712632 [03/25/2022-13:24:19] [V] [TRT] Tactic: 2171150287007712632 Time: 0.479872 [03/25/2022-13:24:19] [V] [TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x64_relu_small_nn_v1 Tactic: 2234457234705232274 [03/25/2022-13:24:19] [V] [TRT] Tactic: 2234457234705232274 Time: 0.393984 [03/25/2022-13:24:19] [V] [TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x64_relu_medium_nn_v1 Tactic: 5834048089706882838 [03/25/2022-13:24:19] [V] [TRT] Tactic: 5834048089706882838 Time: 0.395776 [03/25/2022-13:24:19] [V] [TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x128_relu_interior_nn_v1 Tactic: 6299962968199310600 [03/25/2022-13:24:19] [V] [TRT] Tactic: 6299962968199310600 Time: 0.387328 [03/25/2022-13:24:19] [V] [TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_xregs_interior_nn_v1 Tactic: 6341572697076960911 [03/25/2022-13:24:19] [V] [TRT] Tactic: 6341572697076960911 Time: 0.443136 
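
[Annotation] The sweep above is the builder auto-tuning a single fused block (weights + QuantizeLinear + Conv + Relu): implementations that cannot serve a format combination are skipped ("no valid tactics"), the rest are timed tactic by tactic. The i8i8 kernel names mix dense (tensor16x8x32) and 2:4-sparse (sptensor16x8x64) variants, which points at a build with both INT8 and sparse weights enabled. A minimal sketch of an equivalent programmatic build, assuming TensorRT 8.2 Python bindings; "model.onnx" and "build_engine" are placeholder names, and static input shapes plus Q/DQ-quantized weights are assumed, so no calibrator or optimization profile is attached:

```python
import tensorrt as trt

TRT_LOGGER = trt.Logger(trt.Logger.VERBOSE)  # verbose -> per-tactic logs like the ones above

def build_engine(onnx_path="model.onnx"):
    builder = trt.Builder(TRT_LOGGER)
    # Explicit-batch network, as required for ONNX models with Q/DQ nodes.
    network = builder.create_network(
        1 << int(trt.NetworkDefinitionCreationFlag.EXPLICIT_BATCH))
    parser = trt.OnnxParser(network, TRT_LOGGER)
    with open(onnx_path, "rb") as f:
        if not parser.parse(f.read()):
            for i in range(parser.num_errors):
                print(parser.get_error(i))
            raise RuntimeError("ONNX parse failed")

    config = builder.create_builder_config()
    config.set_flag(trt.BuilderFlag.INT8)            # allow int8 tactics
    config.set_flag(trt.BuilderFlag.SPARSE_WEIGHTS)  # allow 2:4 sparse tactics
    config.max_workspace_size = 1 << 30              # scratch limit; tune as needed

    return builder.build_serialized_network(network, config)
```
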
[03/25/2022-13:24:19] [V] [TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_small_nn_v1 Tactic: -8626990807754934295 [03/25/2022-13:24:19] [V] [TRT] Tactic: -8626990807754934295 Time: 0.473344 [03/25/2022-13:24:19] [V] [TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x64_relu_interior_nn_v1 Tactic: -8498217049614706532 [03/25/2022-13:24:19] [V] [TRT] Tactic: -8498217049614706532 Time: 0.37824 [03/25/2022-13:24:19] [V] [TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_xregs_small_nn_v1 Tactic: -7303593854972602201 [03/25/2022-13:24:19] [V] [TRT] Tactic: -7303593854972602201 Time: 0.452608 [03/25/2022-13:24:19] [V] [TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x128_relu_medium_nn_v1 Tactic: -6585664687867083638 [03/25/2022-13:24:19] [V] [TRT] Tactic: -6585664687867083638 Time: 0.389632 [03/25/2022-13:24:19] [V] [TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_interior_nn_v1 Tactic: -3326139578711341011 [03/25/2022-13:24:19] [V] [TRT] Tactic: -3326139578711341011 Time: 0.453504 [03/25/2022-13:24:19] [V] [TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x128_relu_small_nn_v1 Tactic: -683636008127039856 [03/25/2022-13:24:19] [V] [TRT] Tactic: -683636008127039856 Time: 0.389888 [03/25/2022-13:24:19] [V] [TRT] Fastest Tactic: -8498217049614706532 Time: 0.37824 [03/25/2022-13:24:19] [V] [TRT] >>>>>>>>>>>>>>> Chose Runner Type: CaskConvolution Tactic: -8498217049614706532 [03/25/2022-13:24:19] [V] [TRT] *************** Autotuning format combination: Int8(200704,3136:4,56,1) -> Int8(12544,3136:32,56,1) *************** [03/25/2022-13:24:19] [V] [TRT] --------------- Timing Runner: sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 (CaskConvolution) [03/25/2022-13:24:19] [V] [TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_small_c32_nn_v1 Tactic: 1100922622480907544 [03/25/2022-13:24:19] [V] [TRT] Tactic: 1100922622480907544 Time: 0.469504 [03/25/2022-13:24:19] [V] [TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x128_relu_interior_c32_nn_v1 Tactic: 2855900226702061782 [03/25/2022-13:24:19] [V] [TRT] Tactic: 2855900226702061782 Time: 0.38848 [03/25/2022-13:24:19] [V] [TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x64_relu_medium_c32_nn_v1 Tactic: 3606311198834416176 [03/25/2022-13:24:19] [V] [TRT] Tactic: 3606311198834416176 Time: 0.393984 [03/25/2022-13:24:19] [V] [TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x128_relu_small_c32_nn_v1 Tactic: 4325765560739862899 [03/25/2022-13:24:19] [V] [TRT] Tactic: 4325765560739862899 
Time: 0.390528 [03/25/2022-13:24:19] [V] [TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x64_relu_interior_c32_nn_v1 Tactic: 8803458114157674373 [03/25/2022-13:24:19] [V] [TRT] Tactic: 8803458114157674373 Time: 0.37696 [03/25/2022-13:24:19] [V] [TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_interior_c32_nn_v1 Tactic: -6934773036503365000 [03/25/2022-13:24:19] [V] [TRT] Tactic: -6934773036503365000 Time: 0.45184 [03/25/2022-13:24:19] [V] [TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_xregs_interior_c32_nn_v1 Tactic: -4431642509665791294 [03/25/2022-13:24:19] [V] [TRT] Tactic: -4431642509665791294 Time: 0.438784 [03/25/2022-13:24:19] [V] [TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x128_relu_medium_c32_nn_v1 Tactic: -4255737803793506479 [03/25/2022-13:24:19] [V] [TRT] Tactic: -4255737803793506479 Time: 0.3904 [03/25/2022-13:24:19] [V] [TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_xregs_small_c32_nn_v1 Tactic: -3958182351168863467 [03/25/2022-13:24:19] [V] [TRT] Tactic: -3958182351168863467 Time: 0.451712 [03/25/2022-13:24:19] [V] [TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_xregs_medium_c32_nn_v1 Tactic: -3111968753064955248 [03/25/2022-13:24:19] [V] [TRT] Tactic: -3111968753064955248 Time: 0.47424 [03/25/2022-13:24:19] [V] [TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_medium_c32_nn_v1 Tactic: -1492575840277333548 [03/25/2022-13:24:19] [V] [TRT] Tactic: -1492575840277333548 Time: 0.474496 [03/25/2022-13:24:19] [V] [TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x64_relu_small_c32_nn_v1 Tactic: -868495160148524802 [03/25/2022-13:24:19] [V] [TRT] Tactic: -868495160148524802 Time: 0.39232 [03/25/2022-13:24:19] [V] [TRT] Fastest Tactic: 8803458114157674373 Time: 0.37696 [03/25/2022-13:24:19] [V] [TRT] >>>>>>>>>>>>>>> Chose Runner Type: CaskConvolution Tactic: 8803458114157674373 [03/25/2022-13:24:19] [V] [TRT] *************** Autotuning format combination: Int8(25088,3136:32,56,1) -> Int8(12544,3136:32,56,1) *************** [03/25/2022-13:24:19] [V] [TRT] --------------- Timing Runner: sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 (CudaGroupConvolution) [03/25/2022-13:24:19] [V] [TRT] CudaGroupConvolution has no valid tactics for this config, skipping [03/25/2022-13:24:19] [V] [TRT] --------------- Timing Runner: sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 (CudaDepthwiseConvolution) [03/25/2022-13:24:19] [V] [TRT] CudaDepthwiseConvolution has no valid tactics for this config, skipping [03/25/2022-13:24:19] [V] [TRT] --------------- Timing Runner: sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + 
Relu_200 (FusedConvActConvolution) [03/25/2022-13:24:19] [V] [TRT] FusedConvActConvolution has no valid tactics for this config, skipping [03/25/2022-13:24:19] [V] [TRT] --------------- Timing Runner: sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 (CaskConvolution) [03/25/2022-13:24:19] [V] [TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage1_warpsize4x1x1_g1_tensor8x8x16_t1r1s1 Tactic: 68468667201176803 [03/25/2022-13:24:19] [V] [TRT] Tactic: 68468667201176803 Time: 0.177792 [03/25/2022-13:24:19] [V] [TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x192x64_stage3_warpsize4x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 125145153013230687 [03/25/2022-13:24:19] [V] [TRT] Tactic: 125145153013230687 Time: 0.230528 [03/25/2022-13:24:19] [V] [TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x128_ldg16_relu_interior_nt_v1 Tactic: 434957160407688216 [03/25/2022-13:24:19] [V] [TRT] Tactic: 434957160407688216 Time: 0.211712 [03/25/2022-13:24:19] [V] [TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage4_warpsize4x2x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: 805889586762897346 [03/25/2022-13:24:19] [V] [TRT] Tactic: 805889586762897346 Time: 0.135296 [03/25/2022-13:24:19] [V] [TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x128_stage3_warpsize4x2x1_g1_sptensor16x8x64_t1r1s1 Tactic: 857001784974286465 [03/25/2022-13:24:19] [V] [TRT] Tactic: 857001784974286465 Time: 0.225536 [03/25/2022-13:24:19] [V] [TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 1214130898909872671 [03/25/2022-13:24:19] [V] [TRT] Tactic: 1214130898909872671 Time: 0.2176 [03/25/2022-13:24:19] [V] [TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage4_warpsize4x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 1278425129871930205 [03/25/2022-13:24:19] [V] [TRT] Tactic: 1278425129871930205 Time: 0.130304 [03/25/2022-13:24:19] [V] [TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 1583811548148740665 [03/25/2022-13:24:19] [V] [TRT] Tactic: 1583811548148740665 Time: 0.205952 
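
[Annotation] Each candidate appears as a "Set Tactic Name: <kernel> Tactic: <id>" record, its measurement as "Tactic: <id> Time: <ms>", and the per-layer decision as "Fastest Tactic: <id> Time: <ms>". A throwaway script like the following (hypothetical, not part of trtexec; "summarize" is an illustrative name) can pull a quick summary out of a saved verbose log, including whether each winner was a sparse kernel:

```python
import re
import sys

NAME_RE    = re.compile(r"Set Tactic Name: (\S+) Tactic: (-?\d+)")
# Negative lookbehind keeps "Fastest Tactic: ..." out of the raw measurements.
TIMING_RE  = re.compile(r"(?<!Fastest )Tactic: (-?\d+) Time: ([\d.]+)")
FASTEST_RE = re.compile(r"Fastest Tactic: (-?\d+) Time: ([\d.]+)")

def summarize(log_text: str) -> None:
    names = {tid: name for name, tid in NAME_RE.findall(log_text)}
    timings = TIMING_RE.findall(log_text)
    print(f"{len(timings)} tactic measurements")
    for tid, ms in FASTEST_RE.findall(log_text):
        kind = "sparse" if "_sparse_" in names.get(tid, "") else "dense"
        print(f"fastest: {ms} ms  ({kind})  {names.get(tid, tid)}")

if __name__ == "__main__":
    summarize(sys.stdin.read())
```
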
[03/25/2022-13:24:19] [V] [TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 1701344857577810806 [03/25/2022-13:24:19] [V] [TRT] Tactic: 1701344857577810806 Time: 0.212736 [03/25/2022-13:24:19] [V] [TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x32x64_stage4_warpsize2x1x1_g1_tensor16x8x32_t1r1s1 Tactic: 1797231177354918208 [03/25/2022-13:24:19] [V] [TRT] Tactic: 1797231177354918208 Time: 0.17664 [03/25/2022-13:24:19] [V] [TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x192x64_stage3_warpsize4x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 2004812516525036381 [03/25/2022-13:24:19] [V] [TRT] Tactic: 2004812516525036381 Time: 0.17536 [03/25/2022-13:24:19] [V] [TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x128x64_stage3_warpsize4x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 2030033463723799063 [03/25/2022-13:24:19] [V] [TRT] Tactic: 2030033463723799063 Time: 0.135808 [03/25/2022-13:24:19] [V] [TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: ampere_int8_i8816cudnn_int8_128x128_ldg16_relu_interior_nt_v1 Tactic: 2346437292116182513 [03/25/2022-13:24:19] [V] [TRT] Tactic: 2346437292116182513 Time: 0.17472 [03/25/2022-13:24:19] [V] [TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 2376898825218218566 [03/25/2022-13:24:19] [V] [TRT] Tactic: 2376898825218218566 Time: 0.126208 [03/25/2022-13:24:19] [V] [TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_singleBuffer_interior_nt_v1 Tactic: 2522133112320625287 [03/25/2022-13:24:19] [V] [TRT] Tactic: 2522133112320625287 Time: 0.176128 [03/25/2022-13:24:19] [V] [TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage4_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 2548171972648455240 [03/25/2022-13:24:19] [V] [TRT] Tactic: 2548171972648455240 Time: 0.126848 [03/25/2022-13:24:19] [V] [TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage1_warpsize2x1x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: 2548946449357458230 [03/25/2022-13:24:19] [V] [TRT] Tactic: 
2548946449357458230 Time: 0.201088 [03/25/2022-13:24:19] [V] [TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x128_stage3_warpsize1x1x1_g1_sptensor16x8x64_simple_t1r1s1_no_preds Tactic: 2570666021825229009 [03/25/2022-13:24:19] [V] [TRT] Tactic: 2570666021825229009 Time: 0.214272 [03/25/2022-13:24:19] [V] [TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x16x128_stage3_warpsize1x1x1_g1_sptensor16x8x64_simple_t1r1s1 Tactic: 2678520742286844763 [03/25/2022-13:24:19] [V] [TRT] Tactic: 2678520742286844763 Time: 0.3648 [03/25/2022-13:24:19] [V] [TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 2756291002030759362 [03/25/2022-13:24:19] [V] [TRT] Tactic: 2756291002030759362 Time: 0.150144 [03/25/2022-13:24:19] [V] [TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x128x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 2972948223367788520 [03/25/2022-13:24:19] [V] [TRT] Tactic: 2972948223367788520 Time: 0.128768 [03/25/2022-13:24:19] [V] [TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: ampere_int8_i8816cudnn_int8_128x128_ldg16_relu_medium_nt_v1 Tactic: 2985940154541537814 [03/25/2022-13:24:19] [V] [TRT] Tactic: 2985940154541537814 Time: 0.173952 [03/25/2022-13:24:19] [V] [TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x96x64_stage5_warpsize2x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 3043273137345374664 [03/25/2022-13:24:19] [V] [TRT] Tactic: 3043273137345374664 Time: 0.233728 [03/25/2022-13:24:19] [V] [TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 3221677093659484230 [03/25/2022-13:24:19] [V] [TRT] Tactic: 3221677093659484230 Time: 0.301184 [03/25/2022-13:24:19] [V] [TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r1s1 Tactic: 3242897809704328258 [03/25/2022-13:24:19] [V] [TRT] Tactic: 3242897809704328258 Time: 0.176256 [03/25/2022-13:24:19] [V] [TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: 
sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: 3312456766204252694 [03/25/2022-13:24:19] [V] [TRT] Tactic: 3312456766204252694 Time: 0.20928 [03/25/2022-13:24:19] [V] [TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 3538565962642681625 [03/25/2022-13:24:19] [V] [TRT] Tactic: 3538565962642681625 Time: 0.15616 [03/25/2022-13:24:19] [V] [TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage1_warpsize2x2x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: 3541919052468401776 [03/25/2022-13:24:19] [V] [TRT] Tactic: 3541919052468401776 Time: 0.158336 [03/25/2022-13:24:19] [V] [TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 3593397928177382100 [03/25/2022-13:24:19] [V] [TRT] Tactic: 3593397928177382100 Time: 0.2176 [03/25/2022-13:24:19] [V] [TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x64x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 3670282018109435863 [03/25/2022-13:24:19] [V] [TRT] Tactic: 3670282018109435863 Time: 0.13888 [03/25/2022-13:24:19] [V] [TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 3671413346254027573 [03/25/2022-13:24:19] [V] [TRT] Tactic: 3671413346254027573 Time: 0.192256 [03/25/2022-13:24:19] [V] [TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x128_ldg16_relu_medium_nt_v1 Tactic: 3899284354987683408 [03/25/2022-13:24:19] [V] [TRT] Tactic: 3899284354987683408 Time: 0.209664 [03/25/2022-13:24:19] [V] [TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: 3927509214678622419 [03/25/2022-13:24:19] [V] [TRT] Tactic: 3927509214678622419 Time: 0.169344 [03/25/2022-13:24:19] [V] [TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 4112572034735311841 [03/25/2022-13:24:19] [V] [TRT] Tactic: 4112572034735311841 Time: 0.250368 [03/25/2022-13:24:19] [V] [TRT] 
sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 4239974928951431644 [03/25/2022-13:24:19] [V] [TRT] Tactic: 4239974928951431644 Time: 0.148224 [03/25/2022-13:24:19] [V] [TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage6_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 4610760414797216079 [03/25/2022-13:24:19] [V] [TRT] Tactic: 4610760414797216079 Time: 0.1344 [03/25/2022-13:24:19] [V] [TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_small_nt_v1 Tactic: 4717285412741024953 [03/25/2022-13:24:19] [V] [TRT] Tactic: 4717285412741024953 Time: 0.184704 [03/25/2022-13:24:19] [V] [TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 4796956614760326119 [03/25/2022-13:24:19] [V] [TRT] Tactic: 4796956614760326119 Time: 0.192896 [03/25/2022-13:24:19] [V] [TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x128_stage3_warpsize2x2x1_g1_sptensor16x8x64_t1r1s1_no_preds Tactic: 4909502217677847353 [03/25/2022-13:24:19] [V] [TRT] Tactic: 4909502217677847353 Time: 0.124544 [03/25/2022-13:24:19] [V] [TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 4919361344804309192 [03/25/2022-13:24:19] [V] [TRT] Tactic: 4919361344804309192 Time: 0.186112 [03/25/2022-13:24:19] [V] [TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x128_stage3_warpsize1x2x1_g1_sptensor16x8x64_simple_t1r1s1 Tactic: 5043674678294309681 [03/25/2022-13:24:19] [V] [TRT] Tactic: 5043674678294309681 Time: 0.14464 [03/25/2022-13:24:19] [V] [TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_interior_nt_v1 Tactic: 5126565865931538390 [03/25/2022-13:24:19] [V] [TRT] Tactic: 5126565865931538390 Time: 0.179968 [03/25/2022-13:24:19] [V] [TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x64x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 5204702486885981735 [03/25/2022-13:24:19] [V] [TRT] Tactic: 5204702486885981735 Time: 0.145792 [03/25/2022-13:24:19] 
[V] [TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: 5375256703210220108 [03/25/2022-13:24:19] [V] [TRT] Tactic: 5375256703210220108 Time: 0.136832 [03/25/2022-13:24:19] [V] [TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x128_stage3_warpsize2x2x1_g1_sptensor16x8x64_simple_t1r1s1 Tactic: 5424258848951129084 [03/25/2022-13:24:19] [V] [TRT] Tactic: 5424258848951129084 Time: 0.124928 [03/25/2022-13:24:19] [V] [TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 5443897483205284103 [03/25/2022-13:24:19] [V] [TRT] Tactic: 5443897483205284103 Time: 0.20608 [03/25/2022-13:24:19] [V] [TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage4_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 5707566217891294846 [03/25/2022-13:24:19] [V] [TRT] Tactic: 5707566217891294846 Time: 0.147712 [03/25/2022-13:24:19] [V] [TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x4x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 5986622376339202983 [03/25/2022-13:24:19] [V] [TRT] Tactic: 5986622376339202983 Time: 0.153088 [03/25/2022-13:24:19] [V] [TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x128x64_stage3_warpsize4x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 6007888770437705057 [03/25/2022-13:24:19] [V] [TRT] Tactic: 6007888770437705057 Time: 0.15488 [03/25/2022-13:24:19] [V] [TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32_t1r1s1 Tactic: 6405251167055673379 [03/25/2022-13:24:19] [V] [TRT] Tactic: 6405251167055673379 Time: 0.198016 [03/25/2022-13:24:19] [V] [TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage4_warpsize4x1x1_g1_tensor16x8x32_t1r1s1 Tactic: 6433368103202497147 [03/25/2022-13:24:19] [V] [TRT] Tactic: 6433368103202497147 Time: 0.150144 [03/25/2022-13:24:19] [V] [TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: 
sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 6441948709525127755 [03/25/2022-13:24:19] [V] [TRT] Tactic: 6441948709525127755 Time: 0.223872 [03/25/2022-13:24:19] [V] [TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x256x64_stage3_warpsize2x4x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 6443933097134654777 [03/25/2022-13:24:19] [V] [TRT] Tactic: 6443933097134654777 Time: 0.260352 [03/25/2022-13:24:19] [V] [TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 6457435868048963632 [03/25/2022-13:24:19] [V] [TRT] Tactic: 6457435868048963632 Time: 0.136832 [03/25/2022-13:24:19] [V] [TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x128_stage3_warpsize1x1x1_g1_sptensor16x8x64_t1r1s1 Tactic: 6510345569544721081 [03/25/2022-13:24:19] [V] [TRT] Tactic: 6510345569544721081 Time: 0.214784 [03/25/2022-13:24:19] [V] [TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 6793988781414507278 [03/25/2022-13:24:19] [V] [TRT] Tactic: 6793988781414507278 Time: 0.141312 [03/25/2022-13:24:19] [V] [TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 6880710371738875469 [03/25/2022-13:24:19] [V] [TRT] Tactic: 6880710371738875469 Time: 0.1664 [03/25/2022-13:24:19] [V] [TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 6925201228918187099 [03/25/2022-13:24:19] [V] [TRT] Tactic: 6925201228918187099 Time: 0.137984 [03/25/2022-13:24:19] [V] [TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 6991524515605108718 [03/25/2022-13:24:19] [V] [TRT] Tactic: 6991524515605108718 Time: 0.234624 [03/25/2022-13:24:19] [V] [TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x4x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 7245509442265271220 [03/25/2022-13:24:19] 
[V] [TRT] Tactic: 7245509442265271220 Time: 0.153984 [03/25/2022-13:24:19] [V] [TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x64x64_stage3_warpsize4x1x1_g1_tensor16x8x32_t1r1s1 Tactic: 7318929579222925725 [03/25/2022-13:24:19] [V] [TRT] Tactic: 7318929579222925725 Time: 0.14016 [03/25/2022-13:24:19] [V] [TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: 7731430299029542276 [03/25/2022-13:24:19] [V] [TRT] Tactic: 7731430299029542276 Time: 0.128 [03/25/2022-13:24:19] [V] [TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage4_warpsize2x4x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: 7738495016763012180 [03/25/2022-13:24:19] [V] [TRT] Tactic: 7738495016763012180 Time: 0.225664 [03/25/2022-13:24:19] [V] [TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x128_stage3_warpsize1x4x1_g1_sptensor16x8x64_t1r1s1_no_preds Tactic: 7886967395128926382 [03/25/2022-13:24:19] [V] [TRT] Tactic: 7886967395128926382 Time: 0.133632 [03/25/2022-13:24:19] [V] [TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage4_warpsize4x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 8142283985160822229 [03/25/2022-13:24:19] [V] [TRT] Tactic: 8142283985160822229 Time: 0.147328 [03/25/2022-13:24:19] [V] [TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage4_warpsize2x4x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 8173975624668590862 [03/25/2022-13:24:19] [V] [TRT] Tactic: 8173975624668590862 Time: 0.255616 [03/25/2022-13:24:19] [V] [TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage4_warpsize4x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 8234775147403903473 [03/25/2022-13:24:19] [V] [TRT] Tactic: 8234775147403903473 Time: 0.152576 [03/25/2022-13:24:19] [V] [TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x128x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 8524082966802584889 [03/25/2022-13:24:19] [V] [TRT] Tactic: 8524082966802584889 Time: 0.13312 [03/25/2022-13:24:19] [V] [TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: 
sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage1_warpsize4x1x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: 8684013308930763400 [03/25/2022-13:24:19] [V] [TRT] Tactic: 8684013308930763400 Time: 0.169344 [03/25/2022-13:24:19] [V] [TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x128x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 8765382722978397630 [03/25/2022-13:24:19] [V] [TRT] Tactic: 8765382722978397630 Time: 0.132224 [03/25/2022-13:24:19] [V] [TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 8843193587782643431 [03/25/2022-13:24:20] [V] [TRT] Tactic: 8843193587782643431 Time: 0.229504 [03/25/2022-13:24:20] [V] [TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x128_stage3_warpsize1x4x1_g1_sptensor16x8x64_simple_t1r1s1 Tactic: 8883810517410230831 [03/25/2022-13:24:20] [V] [TRT] Tactic: 8883810517410230831 Time: 0.135552 [03/25/2022-13:24:20] [V] [TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 8930797211803511337 [03/25/2022-13:24:20] [V] [TRT] Tactic: 8930797211803511337 Time: 0.218496 [03/25/2022-13:24:20] [V] [TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 8935070489925739043 [03/25/2022-13:24:20] [V] [TRT] Tactic: 8935070489925739043 Time: 0.165504 [03/25/2022-13:24:20] [V] [TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x16x128_stage3_warpsize1x1x1_g1_sptensor16x8x64_t1r1s1 Tactic: 9062173295331155069 [03/25/2022-13:24:20] [V] [TRT] Tactic: 9062173295331155069 Time: 0.364928 [03/25/2022-13:24:20] [V] [TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_singleBuffer_small_nt_v1 Tactic: -9118785798277698619 [03/25/2022-13:24:20] [V] [TRT] Tactic: -9118785798277698619 Time: 0.178432 [03/25/2022-13:24:20] [V] [TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -8985599729413291927 [03/25/2022-13:24:20] [V] [TRT] Tactic: -8985599729413291927 Time: 0.128768 
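
[Annotation] All of these per-tactic measurements are redone on every build unless the builder is handed a timing cache (trtexec exposes the same mechanism through its timing-cache file option). A sketch of persisting one across builds, under the same TensorRT 8.2 Python assumption; "build_with_timing_cache" and "timing.cache" are illustrative names:

```python
import tensorrt as trt

def build_with_timing_cache(builder, network, cache_path="timing.cache"):
    """Build an engine, reusing tactic timings from earlier builds."""
    config = builder.create_builder_config()
    config.set_flag(trt.BuilderFlag.INT8)
    config.set_flag(trt.BuilderFlag.SPARSE_WEIGHTS)

    # Seed the cache from disk when a previous build left one behind.
    try:
        with open(cache_path, "rb") as f:
            blob = f.read()
    except FileNotFoundError:
        blob = b""  # empty blob -> start a fresh cache
    cache = config.create_timing_cache(blob)
    config.set_timing_cache(cache, False)  # False: reject mismatched caches

    engine = builder.build_serialized_network(network, config)

    # Persist the (possibly updated) cache so the next build skips
    # re-timing tactics for layers it has already seen.
    with open(cache_path, "wb") as f:
        f.write(cache.serialize())
    return engine
```
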
[03/25/2022-13:24:20] [V] [TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x32x64_stage4_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -8972697510150675429 [03/25/2022-13:24:20] [V] [TRT] Tactic: -8972697510150675429 Time: 0.15296 [03/25/2022-13:24:20] [V] [TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r1s1 Tactic: -8943710627305202139 [03/25/2022-13:24:20] [V] [TRT] Tactic: -8943710627305202139 Time: 0.16896 [03/25/2022-13:24:20] [V] [TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: -8859846367886814331 [03/25/2022-13:24:20] [V] [TRT] Tactic: -8859846367886814331 Time: 0.190208 [03/25/2022-13:24:20] [V] [TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: -8638624340850784688 [03/25/2022-13:24:20] [V] [TRT] Tactic: -8638624340850784688 Time: 0.260224 [03/25/2022-13:24:20] [V] [TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage4_warpsize4x1x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: -8556775352640313933 [03/25/2022-13:24:20] [V] [TRT] Tactic: -8556775352640313933 Time: 0.131328 [03/25/2022-13:24:20] [V] [TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage1_warpsize2x1x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: -8382298409581540699 [03/25/2022-13:24:20] [V] [TRT] Tactic: -8382298409581540699 Time: 0.238336 [03/25/2022-13:24:20] [V] [TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage1_warpsize4x2x1_g1_tensor8x8x16_t1r1s1 Tactic: -8172318747337038866 [03/25/2022-13:24:20] [V] [TRT] Tactic: -8172318747337038866 Time: 0.201856 [03/25/2022-13:24:20] [V] [TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x256x64_stage3_warpsize2x4x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -8038164441468184723 [03/25/2022-13:24:20] [V] [TRT] Tactic: -8038164441468184723 Time: 0.217984 [03/25/2022-13:24:20] [V] [TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: 
sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x96x64_stage5_warpsize2x1x1_g1_tensor16x8x32_t1r1s1 Tactic: -7844028314176826857 [03/25/2022-13:24:20] [V] [TRT] Tactic: -7844028314176826857 Time: 0.23616 [03/25/2022-13:24:20] [V] [TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x128_stage3_warpsize2x2x1_g1_sptensor16x8x64_simple_t1r1s1_no_preds Tactic: -7674507941016740570 [03/25/2022-13:24:20] [V] [TRT] Tactic: -7674507941016740570 Time: 0.124672 [03/25/2022-13:24:20] [V] [TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage4_warpsize2x4x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -7364286662638617917 [03/25/2022-13:24:20] [V] [TRT] Tactic: -7364286662638617917 Time: 0.214912 [03/25/2022-13:24:20] [V] [TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage1_warpsize2x1x1_g1_tensor8x8x16_t1r1s1 Tactic: -7361755530333096258 [03/25/2022-13:24:20] [V] [TRT] Tactic: -7361755530333096258 Time: 0.208256 [03/25/2022-13:24:20] [V] [TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x128_stage3_warpsize1x4x1_g1_sptensor16x8x64_t1r1s1 Tactic: -7289760022626653388 [03/25/2022-13:24:20] [V] [TRT] Tactic: -7289760022626653388 Time: 0.135552 [03/25/2022-13:24:20] [V] [TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage1_warpsize2x4x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: -7106539943789766885 [03/25/2022-13:24:20] [V] [TRT] Tactic: -7106539943789766885 Time: 0.342656 [03/25/2022-13:24:20] [V] [TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage1_warpsize4x2x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: -6969478418607271266 [03/25/2022-13:24:20] [V] [TRT] Tactic: -6969478418607271266 Time: 0.195584 [03/25/2022-13:24:20] [V] [TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -6930438165437733000 [03/25/2022-13:24:20] [V] [TRT] Tactic: -6930438165437733000 Time: 0.250752 [03/25/2022-13:24:20] [V] [TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x64x64_stage3_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 
-6879607992933502380 [03/25/2022-13:24:20] [V] [TRT] Tactic: -6879607992933502380 Time: 0.140544 [03/25/2022-13:24:20] [V] [TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x128_stage3_warpsize1x2x1_g1_sptensor16x8x64_simple_t1r1s1_no_preds Tactic: -6839669803644810934 [03/25/2022-13:24:20] [V] [TRT] Tactic: -6839669803644810934 Time: 0.144 [03/25/2022-13:24:20] [V] [TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x128_stage3_warpsize1x2x1_g1_sptensor16x8x64_t1r1s1_no_preds Tactic: -6812830108414456369 [03/25/2022-13:24:20] [V] [TRT] Tactic: -6812830108414456369 Time: 0.143616 [03/25/2022-13:24:20] [V] [TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: sm80_xmma_conv_fprop_smallk_i8i8_i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_warptilesize32x32_stage4_warpsize2x4x1_r1s1_u1v1_hw0_c256_scalebias_relu Tactic: -6620675299995493092 [03/25/2022-13:24:20] [V] [TRT] Tactic: -6620675299995493092 Time: 0.12416 [03/25/2022-13:24:20] [V] [TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: -6527178416855951297 [03/25/2022-13:24:20] [V] [TRT] Tactic: -6527178416855951297 Time: 0.196736 [03/25/2022-13:24:20] [V] [TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_t1r1s1 Tactic: -6510232214299595844 [03/25/2022-13:24:20] [V] [TRT] Tactic: -6510232214299595844 Time: 0.195712 [03/25/2022-13:24:20] [V] [TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: ampere_int8_i8816cudnn_int8_128x128_ldg16_relu_small_nt_v1 Tactic: -6400348606759295499 [03/25/2022-13:24:20] [V] [TRT] Tactic: -6400348606759295499 Time: 0.168192 [03/25/2022-13:24:20] [V] [TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage1_warpsize2x2x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: -6346247605026339453 [03/25/2022-13:24:20] [V] [TRT] Tactic: -6346247605026339453 Time: 0.166144 [03/25/2022-13:24:20] [V] [TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x96x64_stage4_warpsize4x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -6232597026469067819 [03/25/2022-13:24:20] [V] [TRT] Tactic: -6232597026469067819 Time: 0.26176 [03/25/2022-13:24:20] [V] [TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x128_ldg16_relu_small_nt_v1 Tactic: 
-5980889159865208399 [03/25/2022-13:24:20] [V] [TRT] Tactic: -5980889159865208399 Time: 0.204288 [03/25/2022-13:24:20] [V] [TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_singleBuffer_medium_nt_v1 Tactic: -5766140806760372989 [03/25/2022-13:24:20] [V] [TRT] Tactic: -5766140806760372989 Time: 0.181504 [03/25/2022-13:24:20] [V] [TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r1s1 Tactic: -5697614955743334137 [03/25/2022-13:24:20] [V] [TRT] Tactic: -5697614955743334137 Time: 0.164352 [03/25/2022-13:24:20] [V] [TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -5671123121710113970 [03/25/2022-13:24:20] [V] [TRT] Tactic: -5671123121710113970 Time: 0.147968 [03/25/2022-13:24:20] [V] [TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x32x64_stage4_warpsize2x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -5615581362569252260 [03/25/2022-13:24:20] [V] [TRT] Tactic: -5615581362569252260 Time: 0.167936 [03/25/2022-13:24:20] [V] [TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x128x64_stage3_warpsize4x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -5562968047117507056 [03/25/2022-13:24:20] [V] [TRT] Tactic: -5562968047117507056 Time: 0.153088 [03/25/2022-13:24:20] [V] [TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -5516472881360101487 [03/25/2022-13:24:20] [V] [TRT] Tactic: -5516472881360101487 Time: 0.258432 [03/25/2022-13:24:20] [V] [TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r1s1 Tactic: -5311474420963248369 [03/25/2022-13:24:20] [V] [TRT] Tactic: -5311474420963248369 Time: 0.214272 [03/25/2022-13:24:20] [V] [TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: -5170003087447722174 [03/25/2022-13:24:20] [V] [TRT] Tactic: -5170003087447722174 Time: 0.223104 [03/25/2022-13:24:20] [V] [TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: 
sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage6_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -4889586143772361690 [03/25/2022-13:24:20] [V] [TRT] Tactic: -4889586143772361690 Time: 0.12864 [03/25/2022-13:24:20] [V] [TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x128_stage3_warpsize1x4x1_g1_sptensor16x8x64_simple_t1r1s1_no_preds Tactic: -4889498558023475527 [03/25/2022-13:24:20] [V] [TRT] Tactic: -4889498558023475527 Time: 0.134016 [03/25/2022-13:24:20] [V] [TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: -4849712423393454704 [03/25/2022-13:24:20] [V] [TRT] Tactic: -4849712423393454704 Time: 0.130688 [03/25/2022-13:24:20] [V] [TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x128_stage3_warpsize4x2x1_g1_sptensor16x8x64_simple_t1r1s1 Tactic: -4681913707320020520 [03/25/2022-13:24:20] [V] [TRT] Tactic: -4681913707320020520 Time: 0.223232 [03/25/2022-13:24:20] [V] [TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_medium_nt_v1 Tactic: -4516822589357530549 [03/25/2022-13:24:20] [V] [TRT] Tactic: -4516822589357530549 Time: 0.188544 [03/25/2022-13:24:20] [V] [TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -4455415102719506646 [03/25/2022-13:24:20] [V] [TRT] Tactic: -4455415102719506646 Time: 0.14976 [03/25/2022-13:24:20] [V] [TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32_t1r1s1 Tactic: -4425346730823666456 [03/25/2022-13:24:20] [V] [TRT] Tactic: -4425346730823666456 Time: 0.280448 [03/25/2022-13:24:20] [V] [TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage1_warpsize2x1x1_g1_tensor8x8x16_t1r1s1 Tactic: -4260476497340370474 [03/25/2022-13:24:20] [V] [TRT] Tactic: -4260476497340370474 Time: 0.243328 [03/25/2022-13:24:20] [V] [TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x2x1_g1_tensor16x8x32_t1r1s1 Tactic: -4182501876984672402 [03/25/2022-13:24:20] [V] [TRT] Tactic: -4182501876984672402 Time: 0.148608 [03/25/2022-13:24:20] [V] 
[TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x256x64_stage3_warpsize2x4x1_g1_tensor16x8x32_t1r1s1 Tactic: -4151617293257698859 [03/25/2022-13:24:20] [V] [TRT] Tactic: -4151617293257698859 Time: 0.263424 [03/25/2022-13:24:20] [V] [TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -3825889760337461729 [03/25/2022-13:24:20] [V] [TRT] Tactic: -3825889760337461729 Time: 0.185472 [03/25/2022-13:24:20] [V] [TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -3797022944823726673 [03/25/2022-13:24:20] [V] [TRT] Tactic: -3797022944823726673 Time: 0.148864 [03/25/2022-13:24:20] [V] [TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_t1r1s1 Tactic: -3613322253849278738 [03/25/2022-13:24:20] [V] [TRT] Tactic: -3613322253849278738 Time: 0.255744 [03/25/2022-13:24:20] [V] [TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x32x64_stage4_warpsize4x1x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: -3577322188448771475 [03/25/2022-13:24:20] [V] [TRT] Tactic: -3577322188448771475 Time: 0.167296 [03/25/2022-13:24:20] [V] [TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage1_warpsize2x4x1_g1_tensor8x8x16_t1r1s1 Tactic: -3531681826488401618 [03/25/2022-13:24:20] [V] [TRT] Tactic: -3531681826488401618 Time: 0.353024 [03/25/2022-13:24:20] [V] [TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x16x128_stage3_warpsize1x1x1_g1_sptensor16x8x64_simple_t1r1s1_no_preds Tactic: -3305554949874552860 [03/25/2022-13:24:20] [V] [TRT] Tactic: -3305554949874552860 Time: 0.364672 [03/25/2022-13:24:20] [V] [TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x192x64_stage3_warpsize4x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -3288585994448820820 [03/25/2022-13:24:20] [V] [TRT] Tactic: -3288585994448820820 Time: 0.226048 [03/25/2022-13:24:20] [V] [TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: 
sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x32x64_stage4_warpsize4x1x1_g1_tensor16x8x32_t1r1s1 Tactic: -2754311112012636251 [03/25/2022-13:24:20] [V] [TRT] Tactic: -2754311112012636251 Time: 0.172928 [03/25/2022-13:24:20] [V] [TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -2432868635536396215 [03/25/2022-13:24:20] [V] [TRT] Tactic: -2432868635536396215 Time: 0.204288 [03/25/2022-13:24:20] [V] [TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x96x64_stage5_warpsize2x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -2379804152300264660 [03/25/2022-13:24:20] [V] [TRT] Tactic: -2379804152300264660 Time: 0.230656 [03/25/2022-13:24:20] [V] [TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x128_stage3_warpsize2x2x1_g1_sptensor16x8x64_t1r1s1 Tactic: -2352253835013627337 [03/25/2022-13:24:20] [V] [TRT] Tactic: -2352253835013627337 Time: 0.124928 [03/25/2022-13:24:20] [V] [TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x96x64_stage4_warpsize4x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -2335587136911650799 [03/25/2022-13:24:20] [V] [TRT] Tactic: -2335587136911650799 Time: 0.21824 [03/25/2022-13:24:20] [V] [TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage4_warpsize2x4x1_g1_tensor16x8x32_t1r1s1 Tactic: -2315453944962430928 [03/25/2022-13:24:20] [V] [TRT] Tactic: -2315453944962430928 Time: 0.264064 [03/25/2022-13:24:20] [V] [TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x128_stage3_warpsize1x1x1_g1_sptensor16x8x64_simple_t1r1s1 Tactic: -2238364958919154661 [03/25/2022-13:24:20] [V] [TRT] Tactic: -2238364958919154661 Time: 0.214272 [03/25/2022-13:24:20] [V] [TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -1916483171117495388 [03/25/2022-13:24:20] [V] [TRT] Tactic: -1916483171117495388 Time: 0.230016 [03/25/2022-13:24:20] [V] [TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x16x128_stage3_warpsize1x1x1_g1_sptensor16x8x64_t1r1s1_no_preds Tactic: 
-1740762957710554518 [03/25/2022-13:24:20] [V] [TRT] Tactic: -1740762957710554518 Time: 0.364928 [03/25/2022-13:24:20] [V] [TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x96x64_stage4_warpsize4x2x1_g1_tensor16x8x32_t1r1s1 Tactic: -1549742793039499659 [03/25/2022-13:24:20] [V] [TRT] Tactic: -1549742793039499659 Time: 0.267904 [03/25/2022-13:24:20] [V] [TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: -1499578657823798783 [03/25/2022-13:24:20] [V] [TRT] Tactic: -1499578657823798783 Time: 0.15616 [03/25/2022-13:24:20] [V] [TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r1s1 Tactic: -1494157908358500249 [03/25/2022-13:24:20] [V] [TRT] Tactic: -1494157908358500249 Time: 0.196352 [03/25/2022-13:24:20] [V] [TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x4x1_g1_tensor16x8x32_t1r1s1 Tactic: -1328736756812546664 [03/25/2022-13:24:20] [V] [TRT] Tactic: -1328736756812546664 Time: 0.15488 [03/25/2022-13:24:20] [V] [TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x32x64_stage4_warpsize2x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -1006589727652607355 [03/25/2022-13:24:20] [V] [TRT] Tactic: -1006589727652607355 Time: 0.169728 [03/25/2022-13:24:20] [V] [TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: -713022856474991236 [03/25/2022-13:24:20] [V] [TRT] Tactic: -713022856474991236 Time: 0.257408 [03/25/2022-13:24:20] [V] [TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x128_stage3_warpsize1x1x1_g1_sptensor16x8x64_t1r1s1_no_preds Tactic: -619668460699260222 [03/25/2022-13:24:20] [V] [TRT] Tactic: -619668460699260222 Time: 0.214272 [03/25/2022-13:24:20] [V] [TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x64x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: -405554772060757402 [03/25/2022-13:24:20] [V] [TRT] Tactic: -405554772060757402 Time: 0.14976 [03/25/2022-13:24:20] [V] [TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 
Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: -375949437730908730 [03/25/2022-13:24:20] [V] [TRT] Tactic: -375949437730908730 Time: 0.143744 [03/25/2022-13:24:20] [V] [TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x128_stage3_warpsize1x2x1_g1_sptensor16x8x64_t1r1s1 Tactic: -233227833606287806 [03/25/2022-13:24:20] [V] [TRT] Tactic: -233227833606287806 Time: 0.14464 [03/25/2022-13:24:20] [V] [TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -111878368089469751 [03/25/2022-13:24:20] [V] [TRT] Tactic: -111878368089469751 Time: 0.256128 [03/25/2022-13:24:20] [V] [TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x64x64_stage3_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -48936598874722005 [03/25/2022-13:24:20] [V] [TRT] Tactic: -48936598874722005 Time: 0.134656 [03/25/2022-13:24:20] [V] [TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x32x64_stage4_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -19707840769375107 [03/25/2022-13:24:20] [V] [TRT] Tactic: -19707840769375107 Time: 0.166272 [03/25/2022-13:24:20] [V] [TRT] Fastest Tactic: -6620675299995493092 Time: 0.12416 [03/25/2022-13:24:20] [V] [TRT] >>>>>>>>>>>>>>> Chose Runner Type: CaskConvolution Tactic: -6620675299995493092 [03/25/2022-13:24:20] [V] [TRT] =============== Computing costs for [03/25/2022-13:24:20] [V] [TRT] *************** Autotuning format combination: Int8(200704,3136:4,56,1) -> Int8(100352,784:4,28,1) *************** [03/25/2022-13:24:20] [V] [TRT] --------------- Timing Runner: sections.1.0.identity.conv.module.weight + QuantizeLinear_238_quantize_scale_node + Conv_242 (CudaDepthwiseConvolution) [03/25/2022-13:24:20] [V] [TRT] CudaDepthwiseConvolution has no valid tactics for this config, skipping [03/25/2022-13:24:20] [V] [TRT] --------------- Timing Runner: sections.1.0.identity.conv.module.weight + QuantizeLinear_238_quantize_scale_node + Conv_242 (FusedConvActConvolution) [03/25/2022-13:24:20] [V] [TRT] FusedConvActConvolution has no valid tactics for this config, skipping [03/25/2022-13:24:20] [V] [TRT] --------------- Timing Runner: sections.1.0.identity.conv.module.weight + QuantizeLinear_238_quantize_scale_node + Conv_242 (CaskConvolution) [03/25/2022-13:24:20] [V] [TRT] sections.1.0.identity.conv.module.weight + QuantizeLinear_238_quantize_scale_node + Conv_242 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_medium_nn_v1 Tactic: 175853789719975416 [03/25/2022-13:24:20] [V] [TRT] Tactic: 175853789719975416 Time: 0.910976 [03/25/2022-13:24:20] [V] [TRT] sections.1.0.identity.conv.module.weight + QuantizeLinear_238_quantize_scale_node + Conv_242 Set Tactic Name: 
ampere_int8x4_icudnn_int8x4_128x32_relu_xregs_medium_nn_v1 Tactic: 2171150287007712632 [03/25/2022-13:24:20] [V] [TRT] Tactic: 2171150287007712632 Time: 0.919424 [03/25/2022-13:24:20] [V] [TRT] sections.1.0.identity.conv.module.weight + QuantizeLinear_238_quantize_scale_node + Conv_242 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x64_relu_small_nn_v1 Tactic: 2234457234705232274 [03/25/2022-13:24:20] [V] [TRT] Tactic: 2234457234705232274 Time: 0.494336 [03/25/2022-13:24:20] [V] [TRT] sections.1.0.identity.conv.module.weight + QuantizeLinear_238_quantize_scale_node + Conv_242 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x64_relu_medium_nn_v1 Tactic: 5834048089706882838 [03/25/2022-13:24:20] [V] [TRT] Tactic: 5834048089706882838 Time: 0.495104 [03/25/2022-13:24:20] [V] [TRT] sections.1.0.identity.conv.module.weight + QuantizeLinear_238_quantize_scale_node + Conv_242 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x128_relu_interior_nn_v1 Tactic: 6299962968199310600 [03/25/2022-13:24:20] [V] [TRT] Tactic: 6299962968199310600 Time: 0.389376 [03/25/2022-13:24:20] [V] [TRT] sections.1.0.identity.conv.module.weight + QuantizeLinear_238_quantize_scale_node + Conv_242 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_xregs_interior_nn_v1 Tactic: 6341572697076960911 [03/25/2022-13:24:20] [V] [TRT] Tactic: 6341572697076960911 Time: 0.91456 [03/25/2022-13:24:20] [V] [TRT] sections.1.0.identity.conv.module.weight + QuantizeLinear_238_quantize_scale_node + Conv_242 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_small_nn_v1 Tactic: -8626990807754934295 [03/25/2022-13:24:20] [V] [TRT] Tactic: -8626990807754934295 Time: 0.91072 [03/25/2022-13:24:20] [V] [TRT] sections.1.0.identity.conv.module.weight + QuantizeLinear_238_quantize_scale_node + Conv_242 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x64_relu_interior_nn_v1 Tactic: -8498217049614706532 [03/25/2022-13:24:20] [V] [TRT] Tactic: -8498217049614706532 Time: 0.495488 [03/25/2022-13:24:20] [V] [TRT] sections.1.0.identity.conv.module.weight + QuantizeLinear_238_quantize_scale_node + Conv_242 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_xregs_small_nn_v1 Tactic: -7303593854972602201 [03/25/2022-13:24:20] [V] [TRT] Tactic: -7303593854972602201 Time: 0.9184 [03/25/2022-13:24:20] [V] [TRT] sections.1.0.identity.conv.module.weight + QuantizeLinear_238_quantize_scale_node + Conv_242 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x128_relu_medium_nn_v1 Tactic: -6585664687867083638 [03/25/2022-13:24:20] [V] [TRT] Tactic: -6585664687867083638 Time: 0.392448 [03/25/2022-13:24:20] [V] [TRT] sections.1.0.identity.conv.module.weight + QuantizeLinear_238_quantize_scale_node + Conv_242 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_interior_nn_v1 Tactic: -3326139578711341011 [03/25/2022-13:24:20] [V] [TRT] Tactic: -3326139578711341011 Time: 0.906496 [03/25/2022-13:24:20] [V] [TRT] sections.1.0.identity.conv.module.weight + QuantizeLinear_238_quantize_scale_node + Conv_242 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x128_relu_small_nn_v1 Tactic: -683636008127039856 [03/25/2022-13:24:20] [V] [TRT] Tactic: -683636008127039856 Time: 0.391808 [03/25/2022-13:24:20] [V] [TRT] Fastest Tactic: 6299962968199310600 Time: 0.389376 [03/25/2022-13:24:20] [V] [TRT] >>>>>>>>>>>>>>> Chose Runner Type: CaskConvolution Tactic: 6299962968199310600 [03/25/2022-13:24:20] [V] [TRT] *************** Autotuning format combination: Int8(200704,3136:4,56,1) -> Int8(12544,784:32,28,1) *************** [03/25/2022-13:24:20] [V] [TRT] 
--------------- Timing Runner: sections.1.0.identity.conv.module.weight + QuantizeLinear_238_quantize_scale_node + Conv_242 (CaskConvolution) [03/25/2022-13:24:20] [V] [TRT] sections.1.0.identity.conv.module.weight + QuantizeLinear_238_quantize_scale_node + Conv_242 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_small_c32_nn_v1 Tactic: 1100922622480907544 [03/25/2022-13:24:20] [V] [TRT] Tactic: 1100922622480907544 Time: 0.909696 [03/25/2022-13:24:20] [V] [TRT] sections.1.0.identity.conv.module.weight + QuantizeLinear_238_quantize_scale_node + Conv_242 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x128_relu_interior_c32_nn_v1 Tactic: 2855900226702061782 [03/25/2022-13:24:20] [V] [TRT] Tactic: 2855900226702061782 Time: 0.391168 [03/25/2022-13:24:20] [V] [TRT] sections.1.0.identity.conv.module.weight + QuantizeLinear_238_quantize_scale_node + Conv_242 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x64_relu_medium_c32_nn_v1 Tactic: 3606311198834416176 [03/25/2022-13:24:20] [V] [TRT] Tactic: 3606311198834416176 Time: 0.493824 [03/25/2022-13:24:20] [V] [TRT] sections.1.0.identity.conv.module.weight + QuantizeLinear_238_quantize_scale_node + Conv_242 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x128_relu_small_c32_nn_v1 Tactic: 4325765560739862899 [03/25/2022-13:24:20] [V] [TRT] Tactic: 4325765560739862899 Time: 0.393472 [03/25/2022-13:24:20] [V] [TRT] sections.1.0.identity.conv.module.weight + QuantizeLinear_238_quantize_scale_node + Conv_242 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x64_relu_interior_c32_nn_v1 Tactic: 8803458114157674373 [03/25/2022-13:24:20] [V] [TRT] Tactic: 8803458114157674373 Time: 0.494464 [03/25/2022-13:24:20] [V] [TRT] sections.1.0.identity.conv.module.weight + QuantizeLinear_238_quantize_scale_node + Conv_242 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_interior_c32_nn_v1 Tactic: -6934773036503365000 [03/25/2022-13:24:20] [V] [TRT] Tactic: -6934773036503365000 Time: 0.905344 [03/25/2022-13:24:20] [V] [TRT] sections.1.0.identity.conv.module.weight + QuantizeLinear_238_quantize_scale_node + Conv_242 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_xregs_interior_c32_nn_v1 Tactic: -4431642509665791294 [03/25/2022-13:24:20] [V] [TRT] Tactic: -4431642509665791294 Time: 0.913664 [03/25/2022-13:24:20] [V] [TRT] sections.1.0.identity.conv.module.weight + QuantizeLinear_238_quantize_scale_node + Conv_242 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x128_relu_medium_c32_nn_v1 Tactic: -4255737803793506479 [03/25/2022-13:24:20] [V] [TRT] Tactic: -4255737803793506479 Time: 0.393344 [03/25/2022-13:24:20] [V] [TRT] sections.1.0.identity.conv.module.weight + QuantizeLinear_238_quantize_scale_node + Conv_242 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_xregs_small_c32_nn_v1 Tactic: -3958182351168863467 [03/25/2022-13:24:21] [V] [TRT] Tactic: -3958182351168863467 Time: 0.915328 [03/25/2022-13:24:21] [V] [TRT] sections.1.0.identity.conv.module.weight + QuantizeLinear_238_quantize_scale_node + Conv_242 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_xregs_medium_c32_nn_v1 Tactic: -3111968753064955248 [03/25/2022-13:24:21] [V] [TRT] Tactic: -3111968753064955248 Time: 0.91904 [03/25/2022-13:24:21] [V] [TRT] sections.1.0.identity.conv.module.weight + QuantizeLinear_238_quantize_scale_node + Conv_242 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_medium_c32_nn_v1 Tactic: -1492575840277333548 [03/25/2022-13:24:21] [V] [TRT] Tactic: -1492575840277333548 Time: 0.909312 [03/25/2022-13:24:21] [V] [TRT] 
sections.1.0.identity.conv.module.weight + QuantizeLinear_238_quantize_scale_node + Conv_242 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x64_relu_small_c32_nn_v1 Tactic: -868495160148524802 [03/25/2022-13:24:21] [V] [TRT] Tactic: -868495160148524802 Time: 0.493056 [03/25/2022-13:24:21] [V] [TRT] Fastest Tactic: 2855900226702061782 Time: 0.391168 [03/25/2022-13:24:21] [V] [TRT] >>>>>>>>>>>>>>> Chose Runner Type: CaskConvolution Tactic: 2855900226702061782 [03/25/2022-13:24:21] [V] [TRT] *************** Autotuning format combination: Int8(25088,3136:32,56,1) -> Int8(12544,784:32,28,1) *************** [03/25/2022-13:24:21] [V] [TRT] --------------- Timing Runner: sections.1.0.identity.conv.module.weight + QuantizeLinear_238_quantize_scale_node + Conv_242 (CudaGroupConvolution) [03/25/2022-13:24:21] [V] [TRT] CudaGroupConvolution has no valid tactics for this config, skipping [03/25/2022-13:24:21] [V] [TRT] --------------- Timing Runner: sections.1.0.identity.conv.module.weight + QuantizeLinear_238_quantize_scale_node + Conv_242 (CudaDepthwiseConvolution) [03/25/2022-13:24:21] [V] [TRT] CudaDepthwiseConvolution has no valid tactics for this config, skipping [03/25/2022-13:24:21] [V] [TRT] --------------- Timing Runner: sections.1.0.identity.conv.module.weight + QuantizeLinear_238_quantize_scale_node + Conv_242 (FusedConvActConvolution) [03/25/2022-13:24:21] [V] [TRT] FusedConvActConvolution has no valid tactics for this config, skipping [03/25/2022-13:24:21] [V] [TRT] --------------- Timing Runner: sections.1.0.identity.conv.module.weight + QuantizeLinear_238_quantize_scale_node + Conv_242 (CaskConvolution) [03/25/2022-13:24:21] [V] [TRT] sections.1.0.identity.conv.module.weight + QuantizeLinear_238_quantize_scale_node + Conv_242 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage1_warpsize4x1x1_g1_tensor8x8x16_t1r1s1 Tactic: 68468667201176803 [03/25/2022-13:24:21] [V] [TRT] Tactic: 68468667201176803 Time: 0.173824 [03/25/2022-13:24:21] [V] [TRT] sections.1.0.identity.conv.module.weight + QuantizeLinear_238_quantize_scale_node + Conv_242 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x192x64_stage3_warpsize4x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 125145153013230687 [03/25/2022-13:24:21] [V] [TRT] Tactic: 125145153013230687 Time: 0.174464 [03/25/2022-13:24:21] [V] [TRT] sections.1.0.identity.conv.module.weight + QuantizeLinear_238_quantize_scale_node + Conv_242 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_epifadd Tactic: 177040020707947851 [03/25/2022-13:24:21] [V] [TRT] Tactic: 177040020707947851 Time: 0.363392 [03/25/2022-13:24:21] [V] [TRT] sections.1.0.identity.conv.module.weight + QuantizeLinear_238_quantize_scale_node + Conv_242 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32 Tactic: 328135613486708155 [03/25/2022-13:24:21] [V] [TRT] Tactic: 328135613486708155 Time: 0.304384 [03/25/2022-13:24:21] [V] [TRT] sections.1.0.identity.conv.module.weight + QuantizeLinear_238_quantize_scale_node + Conv_242 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x128_ldg16_relu_interior_nt_v1 Tactic: 434957160407688216 [03/25/2022-13:24:21] [V] [TRT] Tactic: 434957160407688216 Time: 
0.208768 [03/25/2022-13:24:21] [V] [TRT] sections.1.0.identity.conv.module.weight + QuantizeLinear_238_quantize_scale_node + Conv_242 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage4_warpsize4x2x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: 805889586762897346 [03/25/2022-13:24:21] [V] [TRT] Tactic: 805889586762897346 Time: 0.134144 [03/25/2022-13:24:21] [V] [TRT] sections.1.0.identity.conv.module.weight + QuantizeLinear_238_quantize_scale_node + Conv_242 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x128_stage3_warpsize4x2x1_g1_sptensor16x8x64_t1r1s1 Tactic: 857001784974286465 [03/25/2022-13:24:21] [V] [TRT] Tactic: 857001784974286465 Time: 0.127744 [03/25/2022-13:24:21] [V] [TRT] sections.1.0.identity.conv.module.weight + QuantizeLinear_238_quantize_scale_node + Conv_242 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x192x64_stage3_warpsize4x2x1_g1_tensor16x8x32 Tactic: 1111159740952609683 [03/25/2022-13:24:21] [V] [TRT] Tactic: 1111159740952609683 Time: 0.18944 [03/25/2022-13:24:21] [V] [TRT] sections.1.0.identity.conv.module.weight + QuantizeLinear_238_quantize_scale_node + Conv_242 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32 Tactic: 1134860903395928905 [03/25/2022-13:24:21] [V] [TRT] Tactic: 1134860903395928905 Time: 0.184448 [03/25/2022-13:24:21] [V] [TRT] sections.1.0.identity.conv.module.weight + QuantizeLinear_238_quantize_scale_node + Conv_242 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32 Tactic: 1276591930377039442 [03/25/2022-13:24:21] [V] [TRT] Tactic: 1276591930377039442 Time: 0.222848 [03/25/2022-13:24:21] [V] [TRT] sections.1.0.identity.conv.module.weight + QuantizeLinear_238_quantize_scale_node + Conv_242 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x96x64_stage5_warpsize2x1x1_g1_tensor16x8x32 Tactic: 1399501420456320585 [03/25/2022-13:24:21] [V] [TRT] Tactic: 1399501420456320585 Time: 0.20608 [03/25/2022-13:24:21] [V] [TRT] sections.1.0.identity.conv.module.weight + QuantizeLinear_238_quantize_scale_node + Conv_242 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_epifadd Tactic: 1550399266192842845 [03/25/2022-13:24:21] [V] [TRT] Tactic: 1550399266192842845 Time: 0.243968 [03/25/2022-13:24:21] [V] [TRT] sections.1.0.identity.conv.module.weight + QuantizeLinear_238_quantize_scale_node + Conv_242 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 1583811548148740665 [03/25/2022-13:24:21] [V] [TRT] Tactic: 1583811548148740665 Time: 0.156928 [03/25/2022-13:24:21] [V] [TRT] sections.1.0.identity.conv.module.weight + QuantizeLinear_238_quantize_scale_node + Conv_242 Set Tactic Name: 
sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 1701344857577810806 [03/25/2022-13:24:21] [V] [TRT] Tactic: 1701344857577810806 Time: 0.168192 [03/25/2022-13:24:21] [V] [TRT] sections.1.0.identity.conv.module.weight + QuantizeLinear_238_quantize_scale_node + Conv_242 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x32x64_stage4_warpsize2x1x1_g1_tensor16x8x32_t1r1s1 Tactic: 1797231177354918208 [03/25/2022-13:24:21] [V] [TRT] Tactic: 1797231177354918208 Time: 0.192896 [03/25/2022-13:24:21] [V] [TRT] sections.1.0.identity.conv.module.weight + QuantizeLinear_238_quantize_scale_node + Conv_242 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage1_warpsize2x2x1_g1_tensor8x8x16 Tactic: 2133329569091732311 [03/25/2022-13:24:21] [V] [TRT] Tactic: 2133329569091732311 Time: 0.186752 [03/25/2022-13:24:21] [V] [TRT] sections.1.0.identity.conv.module.weight + QuantizeLinear_238_quantize_scale_node + Conv_242 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x128_stage3_warpsize4x2x1_g1_sptensor16x8x64_no_preds Tactic: 2186058294798640800 [03/25/2022-13:24:21] [V] [TRT] Tactic: 2186058294798640800 Time: 0.13184 [03/25/2022-13:24:21] [V] [TRT] sections.1.0.identity.conv.module.weight + QuantizeLinear_238_quantize_scale_node + Conv_242 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage4_warpsize4x1x1_g1_tensor16x8x32 Tactic: 2325023763229477890 [03/25/2022-13:24:21] [V] [TRT] Tactic: 2325023763229477890 Time: 0.184576 [03/25/2022-13:24:21] [V] [TRT] sections.1.0.identity.conv.module.weight + QuantizeLinear_238_quantize_scale_node + Conv_242 Set Tactic Name: ampere_int8_i8816cudnn_int8_128x128_ldg16_relu_interior_nt_v1 Tactic: 2346437292116182513 [03/25/2022-13:24:21] [V] [TRT] Tactic: 2346437292116182513 Time: 0.17536 [03/25/2022-13:24:21] [V] [TRT] sections.1.0.identity.conv.module.weight + QuantizeLinear_238_quantize_scale_node + Conv_242 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x128_stage3_warpsize2x2x1_g1_sptensor16x8x64_no_preds Tactic: 2434539343777234419 [03/25/2022-13:24:21] [V] [TRT] Tactic: 2434539343777234419 Time: 0.128896 [03/25/2022-13:24:21] [V] [TRT] sections.1.0.identity.conv.module.weight + QuantizeLinear_238_quantize_scale_node + Conv_242 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_singleBuffer_interior_nt_v1 Tactic: 2522133112320625287 [03/25/2022-13:24:21] [V] [TRT] Tactic: 2522133112320625287 Time: 0.176256 [03/25/2022-13:24:21] [V] [TRT] sections.1.0.identity.conv.module.weight + QuantizeLinear_238_quantize_scale_node + Conv_242 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16 Tactic: 2579824863892891529 [03/25/2022-13:24:21] [V] [TRT] Tactic: 2579824863892891529 Time: 0.249216 [03/25/2022-13:24:21] [V] [TRT] sections.1.0.identity.conv.module.weight + QuantizeLinear_238_quantize_scale_node + Conv_242 Set Tactic Name: 
sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x64x64_stage3_warpsize4x1x1_g1_tensor16x8x32 Tactic: 2783960536172159663 [03/25/2022-13:24:21] [V] [TRT] Tactic: 2783960536172159663 Time: 0.155904 [03/25/2022-13:24:21] [V] [TRT] sections.1.0.identity.conv.module.weight + QuantizeLinear_238_quantize_scale_node + Conv_242 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x2x1_g1_tensor16x8x32 Tactic: 2821711838552913693 [03/25/2022-13:24:21] [V] [TRT] Tactic: 2821711838552913693 Time: 0.13376 [03/25/2022-13:24:21] [V] [TRT] sections.1.0.identity.conv.module.weight + QuantizeLinear_238_quantize_scale_node + Conv_242 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x128_stage3_warpsize1x2x1_g1_sptensor16x8x64 Tactic: 2945009978756227538 [03/25/2022-13:24:21] [V] [TRT] Tactic: 2945009978756227538 Time: 0.163328 [03/25/2022-13:24:21] [V] [TRT] sections.1.0.identity.conv.module.weight + QuantizeLinear_238_quantize_scale_node + Conv_242 Set Tactic Name: ampere_int8_i8816cudnn_int8_128x128_ldg16_relu_medium_nt_v1 Tactic: 2985940154541537814 [03/25/2022-13:24:21] [V] [TRT] Tactic: 2985940154541537814 Time: 0.173056 [03/25/2022-13:24:21] [V] [TRT] sections.1.0.identity.conv.module.weight + QuantizeLinear_238_quantize_scale_node + Conv_242 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r1s1 Tactic: 3242897809704328258 [03/25/2022-13:24:21] [V] [TRT] Tactic: 3242897809704328258 Time: 0.17664 [03/25/2022-13:24:21] [V] [TRT] sections.1.0.identity.conv.module.weight + QuantizeLinear_238_quantize_scale_node + Conv_242 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32 Tactic: 3456719996792527006 [03/25/2022-13:24:21] [V] [TRT] Tactic: 3456719996792527006 Time: 0.18752 [03/25/2022-13:24:21] [V] [TRT] sections.1.0.identity.conv.module.weight + QuantizeLinear_238_quantize_scale_node + Conv_242 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 3538565962642681625 [03/25/2022-13:24:21] [V] [TRT] Tactic: 3538565962642681625 Time: 0.162304 [03/25/2022-13:24:21] [V] [TRT] sections.1.0.identity.conv.module.weight + QuantizeLinear_238_quantize_scale_node + Conv_242 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x128_stage3_warpsize4x2x1_g1_sptensor16x8x64 Tactic: 3651043333819148268 [03/25/2022-13:24:21] [V] [TRT] Tactic: 3651043333819148268 Time: 0.157696 [03/25/2022-13:24:21] [V] [TRT] sections.1.0.identity.conv.module.weight + QuantizeLinear_238_quantize_scale_node + Conv_242 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x128_stage3_warpsize1x4x1_g1_sptensor16x8x64_no_preds Tactic: 3866129666720518662 [03/25/2022-13:24:21] [V] [TRT] Tactic: 3866129666720518662 Time: 0.159232 [03/25/2022-13:24:21] [V] [TRT] sections.1.0.identity.conv.module.weight + QuantizeLinear_238_quantize_scale_node + Conv_242 Set Tactic Name: 
ampere_int8_i8816cudnn_int8_256x128_ldg16_relu_medium_nt_v1 Tactic: 3899284354987683408 [03/25/2022-13:24:21] [V] [TRT] Tactic: 3899284354987683408 Time: 0.20544 [03/25/2022-13:24:21] [V] [TRT] sections.1.0.identity.conv.module.weight + QuantizeLinear_238_quantize_scale_node + Conv_242 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage6_warpsize2x2x1_g1_tensor16x8x32 Tactic: 4042202769383439184 [03/25/2022-13:24:21] [V] [TRT] Tactic: 4042202769383439184 Time: 0.163072 [03/25/2022-13:24:21] [V] [TRT] sections.1.0.identity.conv.module.weight + QuantizeLinear_238_quantize_scale_node + Conv_242 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x32x64_stage4_warpsize4x1x1_g1_tensor16x8x32_epifadd Tactic: 4259547356717612415 [03/25/2022-13:24:21] [V] [TRT] Tactic: 4259547356717612415 Time: 0.251776 [03/25/2022-13:24:21] [V] [TRT] sections.1.0.identity.conv.module.weight + QuantizeLinear_238_quantize_scale_node + Conv_242 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x128_stage3_warpsize2x2x1_g1_sptensor16x8x64 Tactic: 4414594337986714263 [03/25/2022-13:24:21] [V] [TRT] Tactic: 4414594337986714263 Time: 0.148864 [03/25/2022-13:24:21] [V] [TRT] sections.1.0.identity.conv.module.weight + QuantizeLinear_238_quantize_scale_node + Conv_242 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_small_nt_v1 Tactic: 4717285412741024953 [03/25/2022-13:24:21] [V] [TRT] Tactic: 4717285412741024953 Time: 0.184704 [03/25/2022-13:24:21] [V] [TRT] sections.1.0.identity.conv.module.weight + QuantizeLinear_238_quantize_scale_node + Conv_242 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage4_warpsize2x4x1_g1_tensor16x8x32 Tactic: 4734519122557206480 [03/25/2022-13:24:21] [V] [TRT] Tactic: 4734519122557206480 Time: 0.159104 [03/25/2022-13:24:21] [V] [TRT] sections.1.0.identity.conv.module.weight + QuantizeLinear_238_quantize_scale_node + Conv_242 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x128_stage3_warpsize2x2x1_g1_sptensor16x8x64_t1r1s1_no_preds Tactic: 4909502217677847353 [03/25/2022-13:24:21] [V] [TRT] Tactic: 4909502217677847353 Time: 0.118912 [03/25/2022-13:24:21] [V] [TRT] sections.1.0.identity.conv.module.weight + QuantizeLinear_238_quantize_scale_node + Conv_242 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32 Tactic: 4922297020351187339 [03/25/2022-13:24:21] [V] [TRT] Tactic: 4922297020351187339 Time: 0.167296 [03/25/2022-13:24:21] [V] [TRT] sections.1.0.identity.conv.module.weight + QuantizeLinear_238_quantize_scale_node + Conv_242 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage4_warpsize2x4x1_g1_tensor16x8x32_epifadd Tactic: 5121596860264626879 [03/25/2022-13:24:21] [V] [TRT] Tactic: 5121596860264626879 Time: 0.138752 [03/25/2022-13:24:21] [V] [TRT] sections.1.0.identity.conv.module.weight + QuantizeLinear_238_quantize_scale_node + Conv_242 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_interior_nt_v1 Tactic: 5126565865931538390 
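The listings above follow a fixed grammar: each candidate kernel is announced as "Set Tactic Name: <kernel> Tactic: <id>", its measurement is reported as "Tactic: <id> Time: <t>", and each format combination closes with a "Fastest Tactic" summary before the runner is chosen. A minimal Python sketch for mining those timings out of a verbose log like this one (the file name trtexec_verbose.log is a stand-in, not a file from this run):

# Minimal sketch: extract per-tactic timings from a verbose trtexec log.
# It parses exactly the two record shapes visible above:
#   "Set Tactic Name: <kernel> Tactic: <id>"  and  "Tactic: <id> Time: <t>".
# "trtexec_verbose.log" is a hypothetical file name.
import re
from collections import defaultdict

name_re = re.compile(r"Set Tactic Name: (\S+) Tactic: (-?\d+)")
time_re = re.compile(r"Tactic: (-?\d+) Time: ([0-9.]+)")

names = {}                 # tactic id -> kernel name
times = defaultdict(list)  # tactic id -> all measured times

with open("trtexec_verbose.log") as f:
    for line in f:
        for kernel, tid in name_re.findall(line):
            names[tid] = kernel
        for tid, t in time_re.findall(line):
            times[tid].append(float(t))

# Rank tactics by their best measurement, fastest first.
# NOTE: tactic ids recur across layers, so this pools every layer
# that tried the tactic; filter lines on a layer name first (e.g.
# "Conv_242") if a per-layer ranking is wanted.
best = sorted((min(v), k) for k, v in times.items())
for t, tid in best[:10]:
    print(f"{t:10.6f}  {tid:>22}  {names.get(tid, '<unnamed>')}")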
[03/25/2022-13:24:21] [V] [TRT] Tactic: 5126565865931538390 Time: 0.1824 [03/25/2022-13:24:21] [V] [TRT] sections.1.0.identity.conv.module.weight + QuantizeLinear_238_quantize_scale_node + Conv_242 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage6_warpsize2x2x1_g1_tensor16x8x32_epifadd Tactic: 5158259316594207439 [03/25/2022-13:24:21] [V] [TRT] Tactic: 5158259316594207439 Time: 0.154496 [03/25/2022-13:24:21] [V] [TRT] sections.1.0.identity.conv.module.weight + QuantizeLinear_238_quantize_scale_node + Conv_242 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: 5375256703210220108 [03/25/2022-13:24:21] [V] [TRT] Tactic: 5375256703210220108 Time: 0.15296 [03/25/2022-13:24:21] [V] [TRT] sections.1.0.identity.conv.module.weight + QuantizeLinear_238_quantize_scale_node + Conv_242 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x16x128_stage3_warpsize1x1x1_g1_sptensor16x8x64_no_preds Tactic: 5380489069875971144 [03/25/2022-13:24:21] [V] [TRT] Tactic: 5380489069875971144 Time: 0.2688 [03/25/2022-13:24:21] [V] [TRT] sections.1.0.identity.conv.module.weight + QuantizeLinear_238_quantize_scale_node + Conv_242 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x32x64_stage4_warpsize2x1x1_g1_tensor16x8x32 Tactic: 5424417905073460656 [03/25/2022-13:24:21] [V] [TRT] Tactic: 5424417905073460656 Time: 0.213632 [03/25/2022-13:24:21] [V] [TRT] sections.1.0.identity.conv.module.weight + QuantizeLinear_238_quantize_scale_node + Conv_242 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x256x64_stage3_warpsize2x4x1_g1_tensor16x8x32 Tactic: 5442043907221427810 [03/25/2022-13:24:21] [V] [TRT] Tactic: 5442043907221427810 Time: 0.163072 [03/25/2022-13:24:21] [V] [TRT] sections.1.0.identity.conv.module.weight + QuantizeLinear_238_quantize_scale_node + Conv_242 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x128_stage3_warpsize1x1x1_g1_sptensor16x8x64_no_preds Tactic: 5698083265414543143 [03/25/2022-13:24:21] [V] [TRT] Tactic: 5698083265414543143 Time: 0.194432 [03/25/2022-13:24:21] [V] [TRT] sections.1.0.identity.conv.module.weight + QuantizeLinear_238_quantize_scale_node + Conv_242 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x128x64_stage3_warpsize4x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 6007888770437705057 [03/25/2022-13:24:21] [V] [TRT] Tactic: 6007888770437705057 Time: 0.1568 [03/25/2022-13:24:21] [V] [TRT] sections.1.0.identity.conv.module.weight + QuantizeLinear_238_quantize_scale_node + Conv_242 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage1_warpsize2x1x1_g1_tensor8x8x16 Tactic: 6394572396369862482 [03/25/2022-13:24:21] [V] [TRT] Tactic: 6394572396369862482 Time: 0.298368 [03/25/2022-13:24:21] [V] [TRT] sections.1.0.identity.conv.module.weight + QuantizeLinear_238_quantize_scale_node + Conv_242 Set Tactic Name: 
sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32_t1r1s1 Tactic: 6405251167055673379 [03/25/2022-13:24:21] [V] [TRT] Tactic: 6405251167055673379 Time: 0.160768 [03/25/2022-13:24:21] [V] [TRT] sections.1.0.identity.conv.module.weight + QuantizeLinear_238_quantize_scale_node + Conv_242 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage4_warpsize4x1x1_g1_tensor16x8x32_t1r1s1 Tactic: 6433368103202497147 [03/25/2022-13:24:21] [V] [TRT] Tactic: 6433368103202497147 Time: 0.146816 [03/25/2022-13:24:21] [V] [TRT] sections.1.0.identity.conv.module.weight + QuantizeLinear_238_quantize_scale_node + Conv_242 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage4_warpsize4x2x1_g1_tensor16x8x32_epifadd Tactic: 6434020722187266170 [03/25/2022-13:24:21] [V] [TRT] Tactic: 6434020722187266170 Time: 0.165248 [03/25/2022-13:24:21] [V] [TRT] sections.1.0.identity.conv.module.weight + QuantizeLinear_238_quantize_scale_node + Conv_242 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 6441948709525127755 [03/25/2022-13:24:21] [V] [TRT] Tactic: 6441948709525127755 Time: 0.205952 [03/25/2022-13:24:21] [V] [TRT] sections.1.0.identity.conv.module.weight + QuantizeLinear_238_quantize_scale_node + Conv_242 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 6457435868048963632 [03/25/2022-13:24:21] [V] [TRT] Tactic: 6457435868048963632 Time: 0.141568 [03/25/2022-13:24:21] [V] [TRT] sections.1.0.identity.conv.module.weight + QuantizeLinear_238_quantize_scale_node + Conv_242 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x128_stage3_warpsize1x1x1_g1_sptensor16x8x64_t1r1s1 Tactic: 6510345569544721081 [03/25/2022-13:24:21] [V] [TRT] Tactic: 6510345569544721081 Time: 0.194304 [03/25/2022-13:24:21] [V] [TRT] sections.1.0.identity.conv.module.weight + QuantizeLinear_238_quantize_scale_node + Conv_242 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32 Tactic: 6781129591847482048 [03/25/2022-13:24:21] [V] [TRT] Tactic: 6781129591847482048 Time: 0.206976 [03/25/2022-13:24:21] [V] [TRT] sections.1.0.identity.conv.module.weight + QuantizeLinear_238_quantize_scale_node + Conv_242 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 6925201228918187099 [03/25/2022-13:24:21] [V] [TRT] Tactic: 6925201228918187099 Time: 0.12544 [03/25/2022-13:24:21] [V] [TRT] sections.1.0.identity.conv.module.weight + QuantizeLinear_238_quantize_scale_node + Conv_242 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 6991524515605108718 [03/25/2022-13:24:21] [V] [TRT] Tactic: 6991524515605108718 Time: 0.184576 [03/25/2022-13:24:21] 
[V] [TRT] sections.1.0.identity.conv.module.weight + QuantizeLinear_238_quantize_scale_node + Conv_242 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x128x64_stage3_warpsize4x2x1_g1_tensor16x8x32 Tactic: 7077570591813340966 [03/25/2022-13:24:21] [V] [TRT] Tactic: 7077570591813340966 Time: 0.170752 [03/25/2022-13:24:21] [V] [TRT] sections.1.0.identity.conv.module.weight + QuantizeLinear_238_quantize_scale_node + Conv_242 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x64x64_stage3_warpsize4x1x1_g1_tensor16x8x32_t1r1s1 Tactic: 7318929579222925725 [03/25/2022-13:24:21] [V] [TRT] Tactic: 7318929579222925725 Time: 0.14144 [03/25/2022-13:24:21] [V] [TRT] sections.1.0.identity.conv.module.weight + QuantizeLinear_238_quantize_scale_node + Conv_242 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32_epifadd Tactic: 7504901284678552178 [03/25/2022-13:24:21] [V] [TRT] Tactic: 7504901284678552178 Time: 0.135936 [03/25/2022-13:24:21] [V] [TRT] sections.1.0.identity.conv.module.weight + QuantizeLinear_238_quantize_scale_node + Conv_242 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: 7731430299029542276 [03/25/2022-13:24:21] [V] [TRT] Tactic: 7731430299029542276 Time: 0.116864 [03/25/2022-13:24:21] [V] [TRT] sections.1.0.identity.conv.module.weight + QuantizeLinear_238_quantize_scale_node + Conv_242 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage4_warpsize2x4x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: 7738495016763012180 [03/25/2022-13:24:21] [V] [TRT] Tactic: 7738495016763012180 Time: 0.12736 [03/25/2022-13:24:21] [V] [TRT] sections.1.0.identity.conv.module.weight + QuantizeLinear_238_quantize_scale_node + Conv_242 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x128_stage3_warpsize1x4x1_g1_sptensor16x8x64_t1r1s1_no_preds Tactic: 7886967395128926382 [03/25/2022-13:24:21] [V] [TRT] Tactic: 7886967395128926382 Time: 0.157184 [03/25/2022-13:24:21] [V] [TRT] sections.1.0.identity.conv.module.weight + QuantizeLinear_238_quantize_scale_node + Conv_242 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage4_warpsize4x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 8234775147403903473 [03/25/2022-13:24:21] [V] [TRT] Tactic: 8234775147403903473 Time: 0.15488 [03/25/2022-13:24:21] [V] [TRT] sections.1.0.identity.conv.module.weight + QuantizeLinear_238_quantize_scale_node + Conv_242 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_indexed_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32_epifadd Tactic: 8751622450593766232 [03/25/2022-13:24:21] [V] [TRT] Tactic: 8751622450593766232 Time: 0.135424 [03/25/2022-13:24:21] [V] [TRT] sections.1.0.identity.conv.module.weight + QuantizeLinear_238_quantize_scale_node + Conv_242 Set Tactic Name: 
sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x128x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 8765382722978397630 [03/25/2022-13:24:21] [V] [TRT] Tactic: 8765382722978397630 Time: 0.121728 [03/25/2022-13:24:21] [V] [TRT] sections.1.0.identity.conv.module.weight + QuantizeLinear_238_quantize_scale_node + Conv_242 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x16x128_stage3_warpsize1x1x1_g1_sptensor16x8x64_t1r1s1 Tactic: 9062173295331155069 [03/25/2022-13:24:21] [V] [TRT] Tactic: 9062173295331155069 Time: 0.268544 [03/25/2022-13:24:21] [V] [TRT] sections.1.0.identity.conv.module.weight + QuantizeLinear_238_quantize_scale_node + Conv_242 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_indexed_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32 Tactic: 9064458886956700976 [03/25/2022-13:24:21] [V] [TRT] Tactic: 9064458886956700976 Time: 0.1472 [03/25/2022-13:24:21] [V] [TRT] sections.1.0.identity.conv.module.weight + QuantizeLinear_238_quantize_scale_node + Conv_242 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage4_warpsize4x2x1_g1_tensor16x8x32 Tactic: -9165697322068360861 [03/25/2022-13:24:21] [V] [TRT] Tactic: -9165697322068360861 Time: 0.186496 [03/25/2022-13:24:21] [V] [TRT] sections.1.0.identity.conv.module.weight + QuantizeLinear_238_quantize_scale_node + Conv_242 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_singleBuffer_small_nt_v1 Tactic: -9118785798277698619 [03/25/2022-13:24:21] [V] [TRT] Tactic: -9118785798277698619 Time: 0.177792 [03/25/2022-13:24:21] [V] [TRT] sections.1.0.identity.conv.module.weight + QuantizeLinear_238_quantize_scale_node + Conv_242 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16 Tactic: -9108166971364503411 [03/25/2022-13:24:21] [V] [TRT] Tactic: -9108166971364503411 Time: 0.208896 [03/25/2022-13:24:21] [V] [TRT] sections.1.0.identity.conv.module.weight + QuantizeLinear_238_quantize_scale_node + Conv_242 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r1s1 Tactic: -8943710627305202139 [03/25/2022-13:24:21] [V] [TRT] Tactic: -8943710627305202139 Time: 0.168192 [03/25/2022-13:24:21] [V] [TRT] sections.1.0.identity.conv.module.weight + QuantizeLinear_238_quantize_scale_node + Conv_242 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage1_warpsize4x2x1_g1_tensor8x8x16 Tactic: -8861822316054763526 [03/25/2022-13:24:21] [V] [TRT] Tactic: -8861822316054763526 Time: 0.200448 [03/25/2022-13:24:21] [V] [TRT] sections.1.0.identity.conv.module.weight + QuantizeLinear_238_quantize_scale_node + Conv_242 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x128_stage3_warpsize1x1x1_g1_sptensor16x8x64 Tactic: -8791277710877987710 [03/25/2022-13:24:21] [V] [TRT] Tactic: -8791277710877987710 Time: 0.214016 [03/25/2022-13:24:21] [V] [TRT] sections.1.0.identity.conv.module.weight + QuantizeLinear_238_quantize_scale_node + Conv_242 
Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x128x64_stage3_warpsize2x2x1_g1_tensor16x8x32 Tactic: -8691377209893505057 [03/25/2022-13:24:21] [V] [TRT] Tactic: -8691377209893505057 Time: 0.132224 [03/25/2022-13:24:21] [V] [TRT] sections.1.0.identity.conv.module.weight + QuantizeLinear_238_quantize_scale_node + Conv_242 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: -8638624340850784688 [03/25/2022-13:24:21] [V] [TRT] Tactic: -8638624340850784688 Time: 0.206592 [03/25/2022-13:24:21] [V] [TRT] sections.1.0.identity.conv.module.weight + QuantizeLinear_238_quantize_scale_node + Conv_242 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage4_warpsize4x1x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: -8556775352640313933 [03/25/2022-13:24:21] [V] [TRT] Tactic: -8556775352640313933 Time: 0.13376 [03/25/2022-13:24:21] [V] [TRT] sections.1.0.identity.conv.module.weight + QuantizeLinear_238_quantize_scale_node + Conv_242 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32 Tactic: -8520292213102999339 [03/25/2022-13:24:21] [V] [TRT] Tactic: -8520292213102999339 Time: 0.20224 [03/25/2022-13:24:21] [V] [TRT] sections.1.0.identity.conv.module.weight + QuantizeLinear_238_quantize_scale_node + Conv_242 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32 Tactic: -8263994888336646547 [03/25/2022-13:24:21] [V] [TRT] Tactic: -8263994888336646547 Time: 0.146688 [03/25/2022-13:24:21] [V] [TRT] sections.1.0.identity.conv.module.weight + QuantizeLinear_238_quantize_scale_node + Conv_242 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32 Tactic: -8205948405243401049 [03/25/2022-13:24:21] [V] [TRT] Tactic: -8205948405243401049 Time: 0.242304 [03/25/2022-13:24:21] [V] [TRT] sections.1.0.identity.conv.module.weight + QuantizeLinear_238_quantize_scale_node + Conv_242 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage1_warpsize4x2x1_g1_tensor8x8x16_t1r1s1 Tactic: -8172318747337038866 [03/25/2022-13:24:21] [V] [TRT] Tactic: -8172318747337038866 Time: 0.193792 [03/25/2022-13:24:21] [V] [TRT] sections.1.0.identity.conv.module.weight + QuantizeLinear_238_quantize_scale_node + Conv_242 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x96x64_stage5_warpsize2x1x1_g1_tensor16x8x32_t1r1s1 Tactic: -7844028314176826857 [03/25/2022-13:24:21] [V] [TRT] Tactic: -7844028314176826857 Time: 0.182656 [03/25/2022-13:24:21] [V] [TRT] sections.1.0.identity.conv.module.weight + QuantizeLinear_238_quantize_scale_node + Conv_242 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage4_warpsize4x1x1_g1_tensor16x8x32_epifadd Tactic: -7842775553137511386 [03/25/2022-13:24:21] [V] [TRT] Tactic: -7842775553137511386 Time: 0.175872 
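Two things worth noting in these format-combination passes. First, the shape annotations such as Int8(25088,3136:32,56,1) -> Int8(12544,784:32,28,1) are stride tuples, where ":4" and ":32" mark a channel-vectorized layout (4 or 32 channels per vector) that the interleaved kernels consume. Second, the 2:4 structured-sparse kernels (sm80_xmma_fprop_sparse_conv_*, sptensor16x8x64) are timed side by side with dense implicit-GEMM and cuDNN kernels and win or lose purely on measured time; they are only eligible at all because the engine is being built with both INT8 and sparse weights enabled. A minimal sketch of the equivalent build through the TensorRT Python API, assuming TensorRT 8.x; the model path is hypothetical:

# Minimal sketch (TensorRT 8.x Python API): the builder configuration that
# makes both the INT8 and the sm80_xmma_fprop_sparse_conv_* tactics above
# eligible. "model.onnx" is a hypothetical path, not the file from this run.
import tensorrt as trt

logger = trt.Logger(trt.Logger.VERBOSE)  # VERBOSE emits tactic logs like the above
builder = trt.Builder(logger)
network = builder.create_network(
    1 << int(trt.NetworkDefinitionCreationFlag.EXPLICIT_BATCH))
parser = trt.OnnxParser(network, logger)

with open("model.onnx", "rb") as f:
    if not parser.parse(f.read()):
        raise RuntimeError(parser.get_error(0))

config = builder.create_builder_config()
config.set_flag(trt.BuilderFlag.INT8)            # INT8 tactics
config.set_flag(trt.BuilderFlag.SPARSE_WEIGHTS)  # 2:4 sparse tactics

engine_bytes = builder.build_serialized_network(network, config)

Sparse tactics are still only selected when a layer's weights actually satisfy the 2:4 pattern and the sparse kernel measures fastest, which is exactly the competition the surrounding log records.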
[03/25/2022-13:24:21] [V] [TRT] sections.1.0.identity.conv.module.weight + QuantizeLinear_238_quantize_scale_node + Conv_242 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32 Tactic: -7683887278997527517 [03/25/2022-13:24:21] [V] [TRT] Tactic: -7683887278997527517 Time: 0.281728 [03/25/2022-13:24:21] [V] [TRT] sections.1.0.identity.conv.module.weight + QuantizeLinear_238_quantize_scale_node + Conv_242 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x64x64_stage3_warpsize2x2x1_g1_tensor16x8x32 Tactic: -7381370635708568663 [03/25/2022-13:24:21] [V] [TRT] Tactic: -7381370635708568663 Time: 0.153856 [03/25/2022-13:24:21] [V] [TRT] sections.1.0.identity.conv.module.weight + QuantizeLinear_238_quantize_scale_node + Conv_242 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage1_warpsize2x1x1_g1_tensor8x8x16_t1r1s1 Tactic: -7361755530333096258 [03/25/2022-13:24:21] [V] [TRT] Tactic: -7361755530333096258 Time: 0.217472 [03/25/2022-13:24:21] [V] [TRT] sections.1.0.identity.conv.module.weight + QuantizeLinear_238_quantize_scale_node + Conv_242 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x128_stage3_warpsize1x4x1_g1_sptensor16x8x64_t1r1s1 Tactic: -7289760022626653388 [03/25/2022-13:24:21] [V] [TRT] Tactic: -7289760022626653388 Time: 0.160384 [03/25/2022-13:24:21] [V] [TRT] sections.1.0.identity.conv.module.weight + QuantizeLinear_238_quantize_scale_node + Conv_242 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x128_stage3_warpsize1x2x1_g1_sptensor16x8x64_t1r1s1_no_preds Tactic: -6812830108414456369 [03/25/2022-13:24:21] [V] [TRT] Tactic: -6812830108414456369 Time: 0.151168 [03/25/2022-13:24:21] [V] [TRT] sections.1.0.identity.conv.module.weight + QuantizeLinear_238_quantize_scale_node + Conv_242 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: -6527178416855951297 [03/25/2022-13:24:21] [V] [TRT] Tactic: -6527178416855951297 Time: 0.2208 [03/25/2022-13:24:21] [V] [TRT] sections.1.0.identity.conv.module.weight + QuantizeLinear_238_quantize_scale_node + Conv_242 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_t1r1s1 Tactic: -6510232214299595844 [03/25/2022-13:24:21] [V] [TRT] Tactic: -6510232214299595844 Time: 0.209792 [03/25/2022-13:24:21] [V] [TRT] sections.1.0.identity.conv.module.weight + QuantizeLinear_238_quantize_scale_node + Conv_242 Set Tactic Name: ampere_int8_i8816cudnn_int8_128x128_ldg16_relu_small_nt_v1 Tactic: -6400348606759295499 [03/25/2022-13:24:21] [V] [TRT] Tactic: -6400348606759295499 Time: 0.167552 [03/25/2022-13:24:21] [V] [TRT] sections.1.0.identity.conv.module.weight + QuantizeLinear_238_quantize_scale_node + Conv_242 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x96x64_stage4_warpsize4x2x1_g1_tensor16x8x32 Tactic: -6256128573036943404 [03/25/2022-13:24:21] [V] [TRT] Tactic: 
-6256128573036943404 Time: 0.252288 [03/25/2022-13:24:21] [V] [TRT] sections.1.0.identity.conv.module.weight + QuantizeLinear_238_quantize_scale_node + Conv_242 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x128_ldg16_relu_small_nt_v1 Tactic: -5980889159865208399 [03/25/2022-13:24:21] [V] [TRT] Tactic: -5980889159865208399 Time: 0.199168 [03/25/2022-13:24:21] [V] [TRT] sections.1.0.identity.conv.module.weight + QuantizeLinear_238_quantize_scale_node + Conv_242 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_singleBuffer_medium_nt_v1 Tactic: -5766140806760372989 [03/25/2022-13:24:21] [V] [TRT] Tactic: -5766140806760372989 Time: 0.180352 [03/25/2022-13:24:21] [V] [TRT] sections.1.0.identity.conv.module.weight + QuantizeLinear_238_quantize_scale_node + Conv_242 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r1s1 Tactic: -5697614955743334137 [03/25/2022-13:24:21] [V] [TRT] Tactic: -5697614955743334137 Time: 0.162304 [03/25/2022-13:24:21] [V] [TRT] sections.1.0.identity.conv.module.weight + QuantizeLinear_238_quantize_scale_node + Conv_242 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r1s1 Tactic: -5311474420963248369 [03/25/2022-13:24:21] [V] [TRT] Tactic: -5311474420963248369 Time: 0.222976 [03/25/2022-13:24:21] [V] [TRT] sections.1.0.identity.conv.module.weight + QuantizeLinear_238_quantize_scale_node + Conv_242 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage1_warpsize4x1x1_g1_tensor8x8x16 Tactic: -5180570335464125033 [03/25/2022-13:24:21] [V] [TRT] Tactic: -5180570335464125033 Time: 0.19712 [03/25/2022-13:24:21] [V] [TRT] sections.1.0.identity.conv.module.weight + QuantizeLinear_238_quantize_scale_node + Conv_242 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: -5170003087447722174 [03/25/2022-13:24:21] [V] [TRT] Tactic: -5170003087447722174 Time: 0.205056 [03/25/2022-13:24:21] [V] [TRT] sections.1.0.identity.conv.module.weight + QuantizeLinear_238_quantize_scale_node + Conv_242 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: -4849712423393454704 [03/25/2022-13:24:21] [V] [TRT] Tactic: -4849712423393454704 Time: 0.13312 [03/25/2022-13:24:21] [V] [TRT] sections.1.0.identity.conv.module.weight + QuantizeLinear_238_quantize_scale_node + Conv_242 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_medium_nt_v1 Tactic: -4516822589357530549 [03/25/2022-13:24:21] [V] [TRT] Tactic: -4516822589357530549 Time: 0.188544 [03/25/2022-13:24:21] [V] [TRT] sections.1.0.identity.conv.module.weight + QuantizeLinear_238_quantize_scale_node + Conv_242 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32_t1r1s1 Tactic: -4425346730823666456 [03/25/2022-13:24:21] [V] [TRT] Tactic: -4425346730823666456 Time: 0.214144 [03/25/2022-13:24:21] [V] [TRT] sections.1.0.identity.conv.module.weight + QuantizeLinear_238_quantize_scale_node + 
Conv_242 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage1_warpsize2x1x1_g1_tensor8x8x16_t1r1s1 Tactic: -4260476497340370474 [03/25/2022-13:24:21] [V] [TRT] Tactic: -4260476497340370474 Time: 0.259456 [03/25/2022-13:24:21] [V] [TRT] sections.1.0.identity.conv.module.weight + QuantizeLinear_238_quantize_scale_node + Conv_242 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x16x128_stage3_warpsize1x1x1_g1_sptensor16x8x64 Tactic: -4232916483289779353 [03/25/2022-13:24:21] [V] [TRT] Tactic: -4232916483289779353 Time: 0.27328 [03/25/2022-13:24:21] [V] [TRT] sections.1.0.identity.conv.module.weight + QuantizeLinear_238_quantize_scale_node + Conv_242 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x2x1_g1_tensor16x8x32_t1r1s1 Tactic: -4182501876984672402 [03/25/2022-13:24:21] [V] [TRT] Tactic: -4182501876984672402 Time: 0.12928 [03/25/2022-13:24:21] [V] [TRT] sections.1.0.identity.conv.module.weight + QuantizeLinear_238_quantize_scale_node + Conv_242 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x256x64_stage3_warpsize2x4x1_g1_tensor16x8x32_t1r1s1 Tactic: -4151617293257698859 [03/25/2022-13:24:21] [V] [TRT] Tactic: -4151617293257698859 Time: 0.151808 [03/25/2022-13:24:21] [V] [TRT] sections.1.0.identity.conv.module.weight + QuantizeLinear_238_quantize_scale_node + Conv_242 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x128_stage3_warpsize4x2x1_g1_sptensor16x8x64_t1r1s1_no_preds Tactic: -3862908719298381451 [03/25/2022-13:24:21] [V] [TRT] Tactic: -3862908719298381451 Time: 0.127488 [03/25/2022-13:24:21] [V] [TRT] sections.1.0.identity.conv.module.weight + QuantizeLinear_238_quantize_scale_node + Conv_242 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_t1r1s1 Tactic: -3613322253849278738 [03/25/2022-13:24:21] [V] [TRT] Tactic: -3613322253849278738 Time: 0.2464 [03/25/2022-13:24:21] [V] [TRT] sections.1.0.identity.conv.module.weight + QuantizeLinear_238_quantize_scale_node + Conv_242 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x32x64_stage4_warpsize4x1x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: -3577322188448771475 [03/25/2022-13:24:21] [V] [TRT] Tactic: -3577322188448771475 Time: 0.195456 [03/25/2022-13:24:21] [V] [TRT] sections.1.0.identity.conv.module.weight + QuantizeLinear_238_quantize_scale_node + Conv_242 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage1_warpsize2x4x1_g1_tensor8x8x16_t1r1s1 Tactic: -3531681826488401618 [03/25/2022-13:24:21] [V] [TRT] Tactic: -3531681826488401618 Time: 0.188288 [03/25/2022-13:24:21] [V] [TRT] sections.1.0.identity.conv.module.weight + QuantizeLinear_238_quantize_scale_node + Conv_242 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x128_stage3_warpsize1x4x1_g1_sptensor16x8x64 Tactic: -3460842194336717186 [03/25/2022-13:24:21] [V] [TRT] Tactic: 
-3460842194336717186 Time: 0.180992 [03/25/2022-13:24:21] [V] [TRT] sections.1.0.identity.conv.module.weight + QuantizeLinear_238_quantize_scale_node + Conv_242 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x32x64_stage4_warpsize4x1x1_g1_tensor16x8x32_t1r1s1 Tactic: -2754311112012636251 [03/25/2022-13:24:21] [V] [TRT] Tactic: -2754311112012636251 Time: 0.204032 [03/25/2022-13:24:21] [V] [TRT] sections.1.0.identity.conv.module.weight + QuantizeLinear_238_quantize_scale_node + Conv_242 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage1_warpsize2x2x1_g1_tensor8x8x16 Tactic: -2499089240293650188 [03/25/2022-13:24:21] [V] [TRT] Tactic: -2499089240293650188 Time: 0.170368 [03/25/2022-13:24:21] [V] [TRT] sections.1.0.identity.conv.module.weight + QuantizeLinear_238_quantize_scale_node + Conv_242 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x128_stage3_warpsize2x2x1_g1_sptensor16x8x64_t1r1s1 Tactic: -2352253835013627337 [03/25/2022-13:24:21] [V] [TRT] Tactic: -2352253835013627337 Time: 0.119168 [03/25/2022-13:24:21] [V] [TRT] sections.1.0.identity.conv.module.weight + QuantizeLinear_238_quantize_scale_node + Conv_242 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16 Tactic: -2328318099174473157 [03/25/2022-13:24:21] [V] [TRT] Tactic: -2328318099174473157 Time: 0.23488 [03/25/2022-13:24:21] [V] [TRT] sections.1.0.identity.conv.module.weight + QuantizeLinear_238_quantize_scale_node + Conv_242 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage4_warpsize2x4x1_g1_tensor16x8x32_t1r1s1 Tactic: -2315453944962430928 [03/25/2022-13:24:21] [V] [TRT] Tactic: -2315453944962430928 Time: 0.148352 [03/25/2022-13:24:21] [V] [TRT] sections.1.0.identity.conv.module.weight + QuantizeLinear_238_quantize_scale_node + Conv_242 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_epifadd Tactic: -2083778562631872334 [03/25/2022-13:24:21] [V] [TRT] Tactic: -2083778562631872334 Time: 0.192 [03/25/2022-13:24:21] [V] [TRT] sections.1.0.identity.conv.module.weight + QuantizeLinear_238_quantize_scale_node + Conv_242 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x4x1_g1_tensor16x8x32 Tactic: -2054375205435666404 [03/25/2022-13:24:22] [V] [TRT] Tactic: -2054375205435666404 Time: 0.1504 [03/25/2022-13:24:22] [V] [TRT] sections.1.0.identity.conv.module.weight + QuantizeLinear_238_quantize_scale_node + Conv_242 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x16x128_stage3_warpsize1x1x1_g1_sptensor16x8x64_t1r1s1_no_preds Tactic: -1740762957710554518 [03/25/2022-13:24:22] [V] [TRT] Tactic: -1740762957710554518 Time: 0.268544 [03/25/2022-13:24:22] [V] [TRT] sections.1.0.identity.conv.module.weight + QuantizeLinear_238_quantize_scale_node + Conv_242 Set Tactic Name: 
sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x96x64_stage4_warpsize4x2x1_g1_tensor16x8x32_t1r1s1 Tactic: -1549742793039499659 [03/25/2022-13:24:22] [V] [TRT] Tactic: -1549742793039499659 Time: 0.21952 [03/25/2022-13:24:22] [V] [TRT] sections.1.0.identity.conv.module.weight + QuantizeLinear_238_quantize_scale_node + Conv_242 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: -1499578657823798783 [03/25/2022-13:24:22] [V] [TRT] Tactic: -1499578657823798783 Time: 0.159744 [03/25/2022-13:24:22] [V] [TRT] sections.1.0.identity.conv.module.weight + QuantizeLinear_238_quantize_scale_node + Conv_242 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x32x64_stage4_warpsize4x1x1_g1_tensor16x8x32 Tactic: -1498626619443284096 [03/25/2022-13:24:22] [V] [TRT] Tactic: -1498626619443284096 Time: 0.25984 [03/25/2022-13:24:22] [V] [TRT] sections.1.0.identity.conv.module.weight + QuantizeLinear_238_quantize_scale_node + Conv_242 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r1s1 Tactic: -1494157908358500249 [03/25/2022-13:24:22] [V] [TRT] Tactic: -1494157908358500249 Time: 0.200576 [03/25/2022-13:24:22] [V] [TRT] sections.1.0.identity.conv.module.weight + QuantizeLinear_238_quantize_scale_node + Conv_242 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x128_stage3_warpsize1x2x1_g1_sptensor16x8x64_no_preds Tactic: -1465330458665632513 [03/25/2022-13:24:22] [V] [TRT] Tactic: -1465330458665632513 Time: 0.15296 [03/25/2022-13:24:22] [V] [TRT] sections.1.0.identity.conv.module.weight + QuantizeLinear_238_quantize_scale_node + Conv_242 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x4x1_g1_tensor16x8x32_t1r1s1 Tactic: -1328736756812546664 [03/25/2022-13:24:22] [V] [TRT] Tactic: -1328736756812546664 Time: 0.14144 [03/25/2022-13:24:22] [V] [TRT] sections.1.0.identity.conv.module.weight + QuantizeLinear_238_quantize_scale_node + Conv_242 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32 Tactic: -1283580231568512025 [03/25/2022-13:24:22] [V] [TRT] Tactic: -1283580231568512025 Time: 0.331904 [03/25/2022-13:24:22] [V] [TRT] sections.1.0.identity.conv.module.weight + QuantizeLinear_238_quantize_scale_node + Conv_242 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_epifadd Tactic: -1173968681844185579 [03/25/2022-13:24:22] [V] [TRT] Tactic: -1173968681844185579 Time: 0.354304 [03/25/2022-13:24:22] [V] [TRT] sections.1.0.identity.conv.module.weight + QuantizeLinear_238_quantize_scale_node + Conv_242 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32 Tactic: -762222380308749469 [03/25/2022-13:24:22] [V] [TRT] Tactic: -762222380308749469 Time: 0.19776 
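Three kernel families are being raced against each other for this one convolution: dense implicit-GEMM tensor-core kernels (sm80_xmma_fprop_implicit_gemm_*, plus sm75_* variants built for the older 8x8x16 tensor-core shape), cuDNN-derived kernels (ampere_int8_i8816cudnn_*), and 2:4 structured-sparsity kernels (sm80_xmma_fprop_sparse_conv_* using sptensor16x8x64 instructions). The sparse candidates only join the pool when the build is allowed to use sparse weights. As a minimal sketch, assuming the TensorRT 8.x Python API, such a build could look like the following; the model and engine file names are placeholders and the workspace size is illustrative.

```python
import tensorrt as trt  # TensorRT 8.x Python API

logger = trt.Logger(trt.Logger.VERBOSE)  # VERBOSE emits per-tactic timings like the above
builder = trt.Builder(logger)
network = builder.create_network(
    1 << int(trt.NetworkDefinitionCreationFlag.EXPLICIT_BATCH))
parser = trt.OnnxParser(network, logger)

with open("model.onnx", "rb") as f:      # placeholder path to a Q/DQ-quantized model
    if not parser.parse(f.read()):
        for i in range(parser.num_errors):
            print(parser.get_error(i))
        raise SystemExit("ONNX parse failed")

config = builder.create_builder_config()
config.max_workspace_size = 1 << 30      # illustrative limit (TRT 8.x attribute)
# Admit INT8 kernels; QuantizeLinear/DequantizeLinear nodes in the network
# carry their own scales, so no calibrator needs to be attached.
config.set_flag(trt.BuilderFlag.INT8)
# Admit the *_sparse_conv_* tactics timed above.
config.set_flag(trt.BuilderFlag.SPARSE_WEIGHTS)

engine = builder.build_serialized_network(network, config)
with open("model.engine", "wb") as f:    # placeholder output path
    f.write(engine)
```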
[03/25/2022-13:24:22] [V] [TRT] sections.1.0.identity.conv.module.weight + QuantizeLinear_238_quantize_scale_node + Conv_242 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: -713022856474991236 [03/25/2022-13:24:22] [V] [TRT] Tactic: -713022856474991236 Time: 0.254336 [03/25/2022-13:24:22] [V] [TRT] sections.1.0.identity.conv.module.weight + QuantizeLinear_238_quantize_scale_node + Conv_242 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x128_stage3_warpsize1x1x1_g1_sptensor16x8x64_t1r1s1_no_preds Tactic: -619668460699260222 [03/25/2022-13:24:22] [V] [TRT] Tactic: -619668460699260222 Time: 0.193408 [03/25/2022-13:24:22] [V] [TRT] sections.1.0.identity.conv.module.weight + QuantizeLinear_238_quantize_scale_node + Conv_242 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_epifadd Tactic: -556794153877490941 [03/25/2022-13:24:22] [V] [TRT] Tactic: -556794153877490941 Time: 0.19328 [03/25/2022-13:24:22] [V] [TRT] sections.1.0.identity.conv.module.weight + QuantizeLinear_238_quantize_scale_node + Conv_242 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x64x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: -405554772060757402 [03/25/2022-13:24:22] [V] [TRT] Tactic: -405554772060757402 Time: 0.147712 [03/25/2022-13:24:22] [V] [TRT] sections.1.0.identity.conv.module.weight + QuantizeLinear_238_quantize_scale_node + Conv_242 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: -375949437730908730 [03/25/2022-13:24:22] [V] [TRT] Tactic: -375949437730908730 Time: 0.166144 [03/25/2022-13:24:22] [V] [TRT] sections.1.0.identity.conv.module.weight + QuantizeLinear_238_quantize_scale_node + Conv_242 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage1_warpsize2x1x1_g1_tensor8x8x16 Tactic: -366411318217594794 [03/25/2022-13:24:22] [V] [TRT] Tactic: -366411318217594794 Time: 0.228864 [03/25/2022-13:24:22] [V] [TRT] sections.1.0.identity.conv.module.weight + QuantizeLinear_238_quantize_scale_node + Conv_242 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage1_warpsize2x4x1_g1_tensor8x8x16 Tactic: -351548418071036983 [03/25/2022-13:24:22] [V] [TRT] Tactic: -351548418071036983 Time: 0.192256 [03/25/2022-13:24:22] [V] [TRT] sections.1.0.identity.conv.module.weight + QuantizeLinear_238_quantize_scale_node + Conv_242 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x128_stage3_warpsize1x2x1_g1_sptensor16x8x64_t1r1s1 Tactic: -233227833606287806 [03/25/2022-13:24:22] [V] [TRT] Tactic: -233227833606287806 Time: 0.15232 [03/25/2022-13:24:22] [V] [TRT] Fastest Tactic: 7731430299029542276 Time: 0.116864 [03/25/2022-13:24:22] [V] [TRT] >>>>>>>>>>>>>>> Chose Runner Type: CaskConvolution Tactic: 7731430299029542276 [03/25/2022-13:24:22] [V] [TRT] =============== Computing costs for [03/25/2022-13:24:22] 
[V] [TRT] *************** Autotuning format combination: Int8(100352,3136:4,56,1) -> Int8(25088,784:4,28,1) *************** [03/25/2022-13:24:22] [V] [TRT] --------------- Timing Runner: sections.1.0.conv2.module.weight + QuantizeLinear_209_quantize_scale_node + Conv_213 + Relu_215 (CudaDepthwiseConvolution) [03/25/2022-13:24:22] [V] [TRT] CudaDepthwiseConvolution has no valid tactics for this config, skipping [03/25/2022-13:24:22] [V] [TRT] --------------- Timing Runner: sections.1.0.conv2.module.weight + QuantizeLinear_209_quantize_scale_node + Conv_213 + Relu_215 (FusedConvActConvolution) [03/25/2022-13:24:22] [V] [TRT] FusedConvActConvolution has no valid tactics for this config, skipping [03/25/2022-13:24:22] [V] [TRT] --------------- Timing Runner: sections.1.0.conv2.module.weight + QuantizeLinear_209_quantize_scale_node + Conv_213 + Relu_215 (CaskConvolution) [03/25/2022-13:24:22] [V] [TRT] sections.1.0.conv2.module.weight + QuantizeLinear_209_quantize_scale_node + Conv_213 + Relu_215 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_medium_nn_v1 Tactic: 175853789719975416 [03/25/2022-13:24:22] [V] [TRT] Tactic: 175853789719975416 Time: 0.48 [03/25/2022-13:24:22] [V] [TRT] sections.1.0.conv2.module.weight + QuantizeLinear_209_quantize_scale_node + Conv_213 + Relu_215 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_xregs_medium_nn_v1 Tactic: 2171150287007712632 [03/25/2022-13:24:22] [V] [TRT] Tactic: 2171150287007712632 Time: 0.478336 [03/25/2022-13:24:22] [V] [TRT] sections.1.0.conv2.module.weight + QuantizeLinear_209_quantize_scale_node + Conv_213 + Relu_215 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x64_relu_small_nn_v1 Tactic: 2234457234705232274 [03/25/2022-13:24:22] [V] [TRT] Tactic: 2234457234705232274 Time: 0.431616 [03/25/2022-13:24:22] [V] [TRT] sections.1.0.conv2.module.weight + QuantizeLinear_209_quantize_scale_node + Conv_213 + Relu_215 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x64_relu_medium_nn_v1 Tactic: 5834048089706882838 [03/25/2022-13:24:22] [V] [TRT] Tactic: 5834048089706882838 Time: 0.434816 [03/25/2022-13:24:22] [V] [TRT] sections.1.0.conv2.module.weight + QuantizeLinear_209_quantize_scale_node + Conv_213 + Relu_215 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_small_nn_v1 Tactic: -8626990807754934295 [03/25/2022-13:24:22] [V] [TRT] Tactic: -8626990807754934295 Time: 0.478336 [03/25/2022-13:24:22] [V] [TRT] sections.1.0.conv2.module.weight + QuantizeLinear_209_quantize_scale_node + Conv_213 + Relu_215 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_xregs_small_nn_v1 Tactic: -7303593854972602201 [03/25/2022-13:24:22] [V] [TRT] Tactic: -7303593854972602201 Time: 0.445952 [03/25/2022-13:24:22] [V] [TRT] sections.1.0.conv2.module.weight + QuantizeLinear_209_quantize_scale_node + Conv_213 + Relu_215 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x128_relu_medium_nn_v1 Tactic: -6585664687867083638 [03/25/2022-13:24:22] [V] [TRT] Tactic: -6585664687867083638 Time: 0.443776 [03/25/2022-13:24:22] [V] [TRT] sections.1.0.conv2.module.weight + QuantizeLinear_209_quantize_scale_node + Conv_213 + Relu_215 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x64_relu_xregs_large_nn_v1 Tactic: -3730012925709297561 [03/25/2022-13:24:22] [V] [TRT] Tactic: -3730012925709297561 Time: 0.427008 [03/25/2022-13:24:22] [V] [TRT] sections.1.0.conv2.module.weight + QuantizeLinear_209_quantize_scale_node + Conv_213 + Relu_215 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x128_relu_xregs_large_nn_v1 Tactic: -2277259417488004546 
[03/25/2022-13:24:22] [V] [TRT] Tactic: -2277259417488004546 Time: 0.48768 [03/25/2022-13:24:22] [V] [TRT] sections.1.0.conv2.module.weight + QuantizeLinear_209_quantize_scale_node + Conv_213 + Relu_215 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x128_relu_small_nn_v1 Tactic: -683636008127039856 [03/25/2022-13:24:22] [V] [TRT] Tactic: -683636008127039856 Time: 0.443392 [03/25/2022-13:24:22] [V] [TRT] Fastest Tactic: -3730012925709297561 Time: 0.427008 [03/25/2022-13:24:22] [V] [TRT] >>>>>>>>>>>>>>> Chose Runner Type: CaskConvolution Tactic: -3730012925709297561 [03/25/2022-13:24:22] [V] [TRT] *************** Autotuning format combination: Int8(100352,3136:4,56,1) -> Int8(3136,784:32,28,1) *************** [03/25/2022-13:24:22] [V] [TRT] --------------- Timing Runner: sections.1.0.conv2.module.weight + QuantizeLinear_209_quantize_scale_node + Conv_213 + Relu_215 (CaskConvolution) [03/25/2022-13:24:22] [V] [TRT] sections.1.0.conv2.module.weight + QuantizeLinear_209_quantize_scale_node + Conv_213 + Relu_215 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x64_relu_xregs_large_c32_nn_v1 Tactic: 984309058095623735 [03/25/2022-13:24:22] [V] [TRT] Tactic: 984309058095623735 Time: 0.426752 [03/25/2022-13:24:22] [V] [TRT] sections.1.0.conv2.module.weight + QuantizeLinear_209_quantize_scale_node + Conv_213 + Relu_215 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_small_c32_nn_v1 Tactic: 1100922622480907544 [03/25/2022-13:24:22] [V] [TRT] Tactic: 1100922622480907544 Time: 0.477568 [03/25/2022-13:24:22] [V] [TRT] sections.1.0.conv2.module.weight + QuantizeLinear_209_quantize_scale_node + Conv_213 + Relu_215 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x128_relu_xregs_large_c32_nn_v1 Tactic: 3238312825609165543 [03/25/2022-13:24:22] [V] [TRT] Tactic: 3238312825609165543 Time: 0.487552 [03/25/2022-13:24:22] [V] [TRT] sections.1.0.conv2.module.weight + QuantizeLinear_209_quantize_scale_node + Conv_213 + Relu_215 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x64_relu_medium_c32_nn_v1 Tactic: 3606311198834416176 [03/25/2022-13:24:22] [V] [TRT] Tactic: 3606311198834416176 Time: 0.43456 [03/25/2022-13:24:22] [V] [TRT] sections.1.0.conv2.module.weight + QuantizeLinear_209_quantize_scale_node + Conv_213 + Relu_215 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x128_relu_small_c32_nn_v1 Tactic: 4325765560739862899 [03/25/2022-13:24:22] [V] [TRT] Tactic: 4325765560739862899 Time: 0.443776 [03/25/2022-13:24:22] [V] [TRT] sections.1.0.conv2.module.weight + QuantizeLinear_209_quantize_scale_node + Conv_213 + Relu_215 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x128_relu_medium_c32_nn_v1 Tactic: -4255737803793506479 [03/25/2022-13:24:22] [V] [TRT] Tactic: -4255737803793506479 Time: 0.444672 [03/25/2022-13:24:22] [V] [TRT] sections.1.0.conv2.module.weight + QuantizeLinear_209_quantize_scale_node + Conv_213 + Relu_215 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_xregs_small_c32_nn_v1 Tactic: -3958182351168863467 [03/25/2022-13:24:22] [V] [TRT] Tactic: -3958182351168863467 Time: 0.44608 [03/25/2022-13:24:22] [V] [TRT] sections.1.0.conv2.module.weight + QuantizeLinear_209_quantize_scale_node + Conv_213 + Relu_215 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_xregs_medium_c32_nn_v1 Tactic: -3111968753064955248 [03/25/2022-13:24:22] [V] [TRT] Tactic: -3111968753064955248 Time: 0.46976 [03/25/2022-13:24:22] [V] [TRT] sections.1.0.conv2.module.weight + QuantizeLinear_209_quantize_scale_node + Conv_213 + Relu_215 Set Tactic Name: 
ampere_int8x4_icudnn_int8x4_128x32_relu_medium_c32_nn_v1 Tactic: -1492575840277333548 [03/25/2022-13:24:22] [V] [TRT] Tactic: -1492575840277333548 Time: 0.479232 [03/25/2022-13:24:22] [V] [TRT] sections.1.0.conv2.module.weight + QuantizeLinear_209_quantize_scale_node + Conv_213 + Relu_215 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x64_relu_small_c32_nn_v1 Tactic: -868495160148524802 [03/25/2022-13:24:22] [V] [TRT] Tactic: -868495160148524802 Time: 0.431232 [03/25/2022-13:24:22] [V] [TRT] Fastest Tactic: 984309058095623735 Time: 0.426752 [03/25/2022-13:24:22] [V] [TRT] >>>>>>>>>>>>>>> Chose Runner Type: CaskConvolution Tactic: 984309058095623735 [03/25/2022-13:24:22] [V] [TRT] *************** Autotuning format combination: Int8(12544,3136:32,56,1) -> Int8(3136,784:32,28,1) *************** [03/25/2022-13:24:22] [V] [TRT] --------------- Timing Runner: sections.1.0.conv2.module.weight + QuantizeLinear_209_quantize_scale_node + Conv_213 + Relu_215 (CudaGroupConvolution) [03/25/2022-13:24:22] [V] [TRT] CudaGroupConvolution has no valid tactics for this config, skipping [03/25/2022-13:24:22] [V] [TRT] --------------- Timing Runner: sections.1.0.conv2.module.weight + QuantizeLinear_209_quantize_scale_node + Conv_213 + Relu_215 (CudaDepthwiseConvolution) [03/25/2022-13:24:22] [V] [TRT] CudaDepthwiseConvolution has no valid tactics for this config, skipping [03/25/2022-13:24:22] [V] [TRT] --------------- Timing Runner: sections.1.0.conv2.module.weight + QuantizeLinear_209_quantize_scale_node + Conv_213 + Relu_215 (FusedConvActConvolution) [03/25/2022-13:24:22] [V] [TRT] FusedConvActConvolution has no valid tactics for this config, skipping [03/25/2022-13:24:22] [V] [TRT] --------------- Timing Runner: sections.1.0.conv2.module.weight + QuantizeLinear_209_quantize_scale_node + Conv_213 + Relu_215 (CaskConvolution) [03/25/2022-13:24:22] [V] [TRT] sections.1.0.conv2.module.weight + QuantizeLinear_209_quantize_scale_node + Conv_213 + Relu_215 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_epifadd Tactic: 177040020707947851 [03/25/2022-13:24:22] [V] [TRT] Tactic: 177040020707947851 Time: 0.2144 [03/25/2022-13:24:22] [V] [TRT] sections.1.0.conv2.module.weight + QuantizeLinear_209_quantize_scale_node + Conv_213 + Relu_215 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r3s3 Tactic: 184229963126259101 [03/25/2022-13:24:22] [V] [TRT] Tactic: 184229963126259101 Time: 0.165632 [03/25/2022-13:24:22] [V] [TRT] sections.1.0.conv2.module.weight + QuantizeLinear_209_quantize_scale_node + Conv_213 + Relu_215 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage1_warpsize4x2x1_g1_tensor8x8x16_t1r3s3 Tactic: 289888059097454627 [03/25/2022-13:24:22] [V] [TRT] Tactic: 289888059097454627 Time: 0.147584 [03/25/2022-13:24:22] [V] [TRT] sections.1.0.conv2.module.weight + QuantizeLinear_209_quantize_scale_node + Conv_213 + Relu_215 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32 Tactic: 328135613486708155 [03/25/2022-13:24:22] [V] [TRT] Tactic: 328135613486708155 Time: 0.324224 [03/25/2022-13:24:22] [V] [TRT] sections.1.0.conv2.module.weight + 
QuantizeLinear_209_quantize_scale_node + Conv_213 + Relu_215 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r3s3 Tactic: 680740992583869928 [03/25/2022-13:24:22] [V] [TRT] Tactic: 680740992583869928 Time: 0.14336 [03/25/2022-13:24:22] [V] [TRT] sections.1.0.conv2.module.weight + QuantizeLinear_209_quantize_scale_node + Conv_213 + Relu_215 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x192x64_stage3_warpsize4x2x1_g1_tensor16x8x32 Tactic: 1111159740952609683 [03/25/2022-13:24:22] [V] [TRT] Tactic: 1111159740952609683 Time: 0.140544 [03/25/2022-13:24:22] [V] [TRT] sections.1.0.conv2.module.weight + QuantizeLinear_209_quantize_scale_node + Conv_213 + Relu_215 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32 Tactic: 1134860903395928905 [03/25/2022-13:24:22] [V] [TRT] Tactic: 1134860903395928905 Time: 0.147712 [03/25/2022-13:24:22] [V] [TRT] sections.1.0.conv2.module.weight + QuantizeLinear_209_quantize_scale_node + Conv_213 + Relu_215 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32 Tactic: 1276591930377039442 [03/25/2022-13:24:22] [V] [TRT] Tactic: 1276591930377039442 Time: 0.1728 [03/25/2022-13:24:22] [V] [TRT] sections.1.0.conv2.module.weight + QuantizeLinear_209_quantize_scale_node + Conv_213 + Relu_215 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage1_warpsize2x1x1_g1_tensor8x8x16_t1r3s3 Tactic: 1388866374720163187 [03/25/2022-13:24:22] [V] [TRT] Tactic: 1388866374720163187 Time: 0.192768 [03/25/2022-13:24:22] [V] [TRT] sections.1.0.conv2.module.weight + QuantizeLinear_209_quantize_scale_node + Conv_213 + Relu_215 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x96x64_stage5_warpsize2x1x1_g1_tensor16x8x32 Tactic: 1399501420456320585 [03/25/2022-13:24:22] [V] [TRT] Tactic: 1399501420456320585 Time: 0.1888 [03/25/2022-13:24:22] [V] [TRT] sections.1.0.conv2.module.weight + QuantizeLinear_209_quantize_scale_node + Conv_213 + Relu_215 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_epifadd Tactic: 1550399266192842845 [03/25/2022-13:24:22] [V] [TRT] Tactic: 1550399266192842845 Time: 0.175104 [03/25/2022-13:24:22] [V] [TRT] sections.1.0.conv2.module.weight + QuantizeLinear_209_quantize_scale_node + Conv_213 + Relu_215 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x32x64_stage4_warpsize4x1x1_g1_tensor16x8x32_t1r3s3_epifadd Tactic: 1572887561103143487 [03/25/2022-13:24:22] [V] [TRT] Tactic: 1572887561103143487 Time: 0.154624 [03/25/2022-13:24:22] [V] [TRT] sections.1.0.conv2.module.weight + QuantizeLinear_209_quantize_scale_node + Conv_213 + Relu_215 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x96x64_stage5_warpsize2x1x1_g1_tensor16x8x32_t1r3s3 Tactic: 1853122447892949466 [03/25/2022-13:24:22] 
[V] [TRT] Tactic: 1853122447892949466 Time: 0.187776 [03/25/2022-13:24:22] [V] [TRT] sections.1.0.conv2.module.weight + QuantizeLinear_209_quantize_scale_node + Conv_213 + Relu_215 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage1_warpsize2x2x1_g1_tensor8x8x16 Tactic: 2133329569091732311 [03/25/2022-13:24:22] [V] [TRT] Tactic: 2133329569091732311 Time: 0.150912 [03/25/2022-13:24:22] [V] [TRT] sections.1.0.conv2.module.weight + QuantizeLinear_209_quantize_scale_node + Conv_213 + Relu_215 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage4_warpsize4x1x1_g1_tensor16x8x32 Tactic: 2325023763229477890 [03/25/2022-13:24:22] [V] [TRT] Tactic: 2325023763229477890 Time: 0.108672 [03/25/2022-13:24:22] [V] [TRT] sections.1.0.conv2.module.weight + QuantizeLinear_209_quantize_scale_node + Conv_213 + Relu_215 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16 Tactic: 2579824863892891529 [03/25/2022-13:24:22] [V] [TRT] Tactic: 2579824863892891529 Time: 0.215296 [03/25/2022-13:24:22] [V] [TRT] sections.1.0.conv2.module.weight + QuantizeLinear_209_quantize_scale_node + Conv_213 + Relu_215 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x64x64_stage3_warpsize4x1x1_g1_tensor16x8x32 Tactic: 2783960536172159663 [03/25/2022-13:24:22] [V] [TRT] Tactic: 2783960536172159663 Time: 0.096512 [03/25/2022-13:24:22] [V] [TRT] sections.1.0.conv2.module.weight + QuantizeLinear_209_quantize_scale_node + Conv_213 + Relu_215 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x2x1_g1_tensor16x8x32 Tactic: 2821711838552913693 [03/25/2022-13:24:22] [V] [TRT] Tactic: 2821711838552913693 Time: 0.130816 [03/25/2022-13:24:22] [V] [TRT] sections.1.0.conv2.module.weight + QuantizeLinear_209_quantize_scale_node + Conv_213 + Relu_215 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x128_stage3_warpsize1x2x1_g1_sptensor16x8x64 Tactic: 2945009978756227538 [03/25/2022-13:24:22] [V] [TRT] Tactic: 2945009978756227538 Time: 0.107648 [03/25/2022-13:24:22] [V] [TRT] sections.1.0.conv2.module.weight + QuantizeLinear_209_quantize_scale_node + Conv_213 + Relu_215 Set Tactic Name: ampere_int8_i8816cudnn_int8_128x128_ldg16_relu_medium_nt_v1 Tactic: 2985940154541537814 [03/25/2022-13:24:22] [V] [TRT] Tactic: 2985940154541537814 Time: 0.145792 [03/25/2022-13:24:22] [V] [TRT] sections.1.0.conv2.module.weight + QuantizeLinear_209_quantize_scale_node + Conv_213 + Relu_215 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_t1r3s3_epifadd Tactic: 3284282970967328046 [03/25/2022-13:24:22] [V] [TRT] Tactic: 3284282970967328046 Time: 0.210304 [03/25/2022-13:24:22] [V] [TRT] sections.1.0.conv2.module.weight + QuantizeLinear_209_quantize_scale_node + Conv_213 + Relu_215 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_t1r3s3 Tactic: 3401614690060226673 
[03/25/2022-13:24:22] [V] [TRT] Tactic: 3401614690060226673 Time: 0.17984 [03/25/2022-13:24:22] [V] [TRT] sections.1.0.conv2.module.weight + QuantizeLinear_209_quantize_scale_node + Conv_213 + Relu_215 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32 Tactic: 3456719996792527006 [03/25/2022-13:24:22] [V] [TRT] Tactic: 3456719996792527006 Time: 0.146304 [03/25/2022-13:24:22] [V] [TRT] sections.1.0.conv2.module.weight + QuantizeLinear_209_quantize_scale_node + Conv_213 + Relu_215 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r3s3_epifadd Tactic: 3512426920013359699 [03/25/2022-13:24:22] [V] [TRT] Tactic: 3512426920013359699 Time: 0.119808 [03/25/2022-13:24:22] [V] [TRT] sections.1.0.conv2.module.weight + QuantizeLinear_209_quantize_scale_node + Conv_213 + Relu_215 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x128_stage3_warpsize4x2x1_g1_sptensor16x8x64 Tactic: 3651043333819148268 [03/25/2022-13:24:22] [V] [TRT] Tactic: 3651043333819148268 Time: 0.142592 [03/25/2022-13:24:22] [V] [TRT] sections.1.0.conv2.module.weight + QuantizeLinear_209_quantize_scale_node + Conv_213 + Relu_215 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x128_ldg16_relu_medium_nt_v1 Tactic: 3899284354987683408 [03/25/2022-13:24:22] [V] [TRT] Tactic: 3899284354987683408 Time: 0.153728 [03/25/2022-13:24:22] [V] [TRT] sections.1.0.conv2.module.weight + QuantizeLinear_209_quantize_scale_node + Conv_213 + Relu_215 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage6_warpsize2x2x1_g1_tensor16x8x32 Tactic: 4042202769383439184 [03/25/2022-13:24:22] [V] [TRT] Tactic: 4042202769383439184 Time: 0.102272 [03/25/2022-13:24:22] [V] [TRT] sections.1.0.conv2.module.weight + QuantizeLinear_209_quantize_scale_node + Conv_213 + Relu_215 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_large_nt_v1 Tactic: 4182625619810185112 [03/25/2022-13:24:22] [V] [TRT] Tactic: 4182625619810185112 Time: 0.157952 [03/25/2022-13:24:22] [V] [TRT] sections.1.0.conv2.module.weight + QuantizeLinear_209_quantize_scale_node + Conv_213 + Relu_215 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r3s3 Tactic: 4214794893922618058 [03/25/2022-13:24:22] [V] [TRT] Tactic: 4214794893922618058 Time: 0.147712 [03/25/2022-13:24:22] [V] [TRT] sections.1.0.conv2.module.weight + QuantizeLinear_209_quantize_scale_node + Conv_213 + Relu_215 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x32x64_stage4_warpsize4x1x1_g1_tensor16x8x32_epifadd Tactic: 4259547356717612415 [03/25/2022-13:24:22] [V] [TRT] Tactic: 4259547356717612415 Time: 0.157056 [03/25/2022-13:24:22] [V] [TRT] sections.1.0.conv2.module.weight + QuantizeLinear_209_quantize_scale_node + Conv_213 + Relu_215 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r3s3 Tactic: 4384868749799132354 [03/25/2022-13:24:22] [V] [TRT] Tactic: 4384868749799132354 Time: 
0.214656 [03/25/2022-13:24:22] [V] [TRT] sections.1.0.conv2.module.weight + QuantizeLinear_209_quantize_scale_node + Conv_213 + Relu_215 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x128_stage3_warpsize2x2x1_g1_sptensor16x8x64 Tactic: 4414594337986714263 [03/25/2022-13:24:22] [V] [TRT] Tactic: 4414594337986714263 Time: 0.093056 [03/25/2022-13:24:22] [V] [TRT] sections.1.0.conv2.module.weight + QuantizeLinear_209_quantize_scale_node + Conv_213 + Relu_215 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_small_nt_v1 Tactic: 4717285412741024953 [03/25/2022-13:24:22] [V] [TRT] Tactic: 4717285412741024953 Time: 0.1472 [03/25/2022-13:24:22] [V] [TRT] sections.1.0.conv2.module.weight + QuantizeLinear_209_quantize_scale_node + Conv_213 + Relu_215 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage4_warpsize2x4x1_g1_tensor16x8x32 Tactic: 4734519122557206480 [03/25/2022-13:24:22] [V] [TRT] Tactic: 4734519122557206480 Time: 0.166144 [03/25/2022-13:24:22] [V] [TRT] sections.1.0.conv2.module.weight + QuantizeLinear_209_quantize_scale_node + Conv_213 + Relu_215 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32 Tactic: 4922297020351187339 [03/25/2022-13:24:22] [V] [TRT] Tactic: 4922297020351187339 Time: 0.129792 [03/25/2022-13:24:22] [V] [TRT] sections.1.0.conv2.module.weight + QuantizeLinear_209_quantize_scale_node + Conv_213 + Relu_215 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage1_warpsize2x4x1_g1_tensor8x8x16_t1r3s3 Tactic: 4931167631624420067 [03/25/2022-13:24:22] [V] [TRT] Tactic: 4931167631624420067 Time: 0.274688 [03/25/2022-13:24:22] [V] [TRT] sections.1.0.conv2.module.weight + QuantizeLinear_209_quantize_scale_node + Conv_213 + Relu_215 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage4_warpsize2x4x1_g1_tensor16x8x32_epifadd Tactic: 5121596860264626879 [03/25/2022-13:24:22] [V] [TRT] Tactic: 5121596860264626879 Time: 0.15488 [03/25/2022-13:24:22] [V] [TRT] sections.1.0.conv2.module.weight + QuantizeLinear_209_quantize_scale_node + Conv_213 + Relu_215 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_t1r3s3 Tactic: 5136656982162849059 [03/25/2022-13:24:22] [V] [TRT] Tactic: 5136656982162849059 Time: 0.210432 [03/25/2022-13:24:22] [V] [TRT] sections.1.0.conv2.module.weight + QuantizeLinear_209_quantize_scale_node + Conv_213 + Relu_215 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage6_warpsize2x2x1_g1_tensor16x8x32_epifadd Tactic: 5158259316594207439 [03/25/2022-13:24:22] [V] [TRT] Tactic: 5158259316594207439 Time: 0.102144 [03/25/2022-13:24:22] [V] [TRT] sections.1.0.conv2.module.weight + QuantizeLinear_209_quantize_scale_node + Conv_213 + Relu_215 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32_t1r3s3 Tactic: 5189825015507701541 [03/25/2022-13:24:22] [V] [TRT] 
Tactic: 5189825015507701541 Time: 0.345344 [03/25/2022-13:24:22] [V] [TRT] sections.1.0.conv2.module.weight + QuantizeLinear_209_quantize_scale_node + Conv_213 + Relu_215 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x32x64_stage4_warpsize2x1x1_g1_tensor16x8x32 Tactic: 5424417905073460656 [03/25/2022-13:24:22] [V] [TRT] Tactic: 5424417905073460656 Time: 0.154752 [03/25/2022-13:24:22] [V] [TRT] sections.1.0.conv2.module.weight + QuantizeLinear_209_quantize_scale_node + Conv_213 + Relu_215 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x256x64_stage3_warpsize2x4x1_g1_tensor16x8x32 Tactic: 5442043907221427810 [03/25/2022-13:24:22] [V] [TRT] Tactic: 5442043907221427810 Time: 0.16512 [03/25/2022-13:24:22] [V] [TRT] sections.1.0.conv2.module.weight + QuantizeLinear_209_quantize_scale_node + Conv_213 + Relu_215 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x256x64_stage3_warpsize2x4x1_g1_tensor16x8x32_t1r3s3 Tactic: 5544365258913999384 [03/25/2022-13:24:22] [V] [TRT] Tactic: 5544365258913999384 Time: 0.161024 [03/25/2022-13:24:22] [V] [TRT] sections.1.0.conv2.module.weight + QuantizeLinear_209_quantize_scale_node + Conv_213 + Relu_215 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage1_warpsize2x1x1_g1_tensor8x8x16_t1r3s3 Tactic: 5641967928706599451 [03/25/2022-13:24:22] [V] [TRT] Tactic: 5641967928706599451 Time: 0.252288 [03/25/2022-13:24:22] [V] [TRT] sections.1.0.conv2.module.weight + QuantizeLinear_209_quantize_scale_node + Conv_213 + Relu_215 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x2x1_g1_tensor16x8x32_t1r3s3 Tactic: 5721595115357140131 [03/25/2022-13:24:22] [V] [TRT] Tactic: 5721595115357140131 Time: 0.12864 [03/25/2022-13:24:22] [V] [TRT] sections.1.0.conv2.module.weight + QuantizeLinear_209_quantize_scale_node + Conv_213 + Relu_215 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage4_warpsize4x1x1_g1_tensor16x8x32_t1r3s3_epifadd Tactic: 5966973378912044513 [03/25/2022-13:24:22] [V] [TRT] Tactic: 5966973378912044513 Time: 0.1088 [03/25/2022-13:24:22] [V] [TRT] sections.1.0.conv2.module.weight + QuantizeLinear_209_quantize_scale_node + Conv_213 + Relu_215 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x32x64_stage4_warpsize4x1x1_g1_tensor16x8x32_t1r3s3 Tactic: 6004789655466615912 [03/25/2022-13:24:22] [V] [TRT] Tactic: 6004789655466615912 Time: 0.15424 [03/25/2022-13:24:22] [V] [TRT] sections.1.0.conv2.module.weight + QuantizeLinear_209_quantize_scale_node + Conv_213 + Relu_215 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage4_warpsize2x4x1_g1_tensor16x8x32_t1r3s3 Tactic: 6146901278630392829 [03/25/2022-13:24:22] [V] [TRT] Tactic: 6146901278630392829 Time: 0.164224 [03/25/2022-13:24:22] [V] [TRT] sections.1.0.conv2.module.weight + QuantizeLinear_209_quantize_scale_node + Conv_213 + Relu_215 Set Tactic Name: 
sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage1_warpsize2x1x1_g1_tensor8x8x16 Tactic: 6394572396369862482 [03/25/2022-13:24:22] [V] [TRT] Tactic: 6394572396369862482 Time: 0.255616 [03/25/2022-13:24:22] [V] [TRT] sections.1.0.conv2.module.weight + QuantizeLinear_209_quantize_scale_node + Conv_213 + Relu_215 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage4_warpsize4x2x1_g1_tensor16x8x32_epifadd Tactic: 6434020722187266170 [03/25/2022-13:24:22] [V] [TRT] Tactic: 6434020722187266170 Time: 0.095872 [03/25/2022-13:24:22] [V] [TRT] sections.1.0.conv2.module.weight + QuantizeLinear_209_quantize_scale_node + Conv_213 + Relu_215 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32 Tactic: 6781129591847482048 [03/25/2022-13:24:22] [V] [TRT] Tactic: 6781129591847482048 Time: 0.109824 [03/25/2022-13:24:22] [V] [TRT] sections.1.0.conv2.module.weight + QuantizeLinear_209_quantize_scale_node + Conv_213 + Relu_215 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x96x64_stage4_warpsize4x2x1_g1_tensor16x8x32_t1r3s3 Tactic: 6984451771200230840 [03/25/2022-13:24:22] [V] [TRT] Tactic: 6984451771200230840 Time: 0.166656 [03/25/2022-13:24:22] [V] [TRT] sections.1.0.conv2.module.weight + QuantizeLinear_209_quantize_scale_node + Conv_213 + Relu_215 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r3s3 Tactic: 7048234086361926570 [03/25/2022-13:24:22] [V] [TRT] Tactic: 7048234086361926570 Time: 0.168704 [03/25/2022-13:24:22] [V] [TRT] sections.1.0.conv2.module.weight + QuantizeLinear_209_quantize_scale_node + Conv_213 + Relu_215 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x128x64_stage3_warpsize4x2x1_g1_tensor16x8x32 Tactic: 7077570591813340966 [03/25/2022-13:24:22] [V] [TRT] Tactic: 7077570591813340966 Time: 0.09344 [03/25/2022-13:24:22] [V] [TRT] sections.1.0.conv2.module.weight + QuantizeLinear_209_quantize_scale_node + Conv_213 + Relu_215 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r3s3_epifadd Tactic: 7191893591576074000 [03/25/2022-13:24:22] [V] [TRT] Tactic: 7191893591576074000 Time: 0.174336 [03/25/2022-13:24:22] [V] [TRT] sections.1.0.conv2.module.weight + QuantizeLinear_209_quantize_scale_node + Conv_213 + Relu_215 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x4x1_g1_tensor16x8x32_t1r3s3 Tactic: 7429976449747682901 [03/25/2022-13:24:22] [V] [TRT] Tactic: 7429976449747682901 Time: 0.124544 [03/25/2022-13:24:22] [V] [TRT] sections.1.0.conv2.module.weight + QuantizeLinear_209_quantize_scale_node + Conv_213 + Relu_215 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r3s3_epifadd Tactic: 7438984192263206338 [03/25/2022-13:24:22] [V] [TRT] Tactic: 7438984192263206338 Time: 0.101248 
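None of these per-tactic measurements are persisted by default, so a later clean build of the same network repeats the whole sweep. TensorRT 8 can keep the results in a timing cache so a rebuild of the same network on the same GPU skips most of the re-timing. A short sketch follows, under the assumption of the TensorRT 8.x Python API; the cache file name is a placeholder and "config" is the builder configuration from the previous sketch.

```python
import os
import tensorrt as trt

CACHE_PATH = "tactics.cache"   # placeholder file name

def attach_timing_cache(config):
    """Load a previously saved tactic-timing cache into this build, if any."""
    blob = b""
    if os.path.exists(CACHE_PATH):
        with open(CACHE_PATH, "rb") as f:
            blob = f.read()
    cache = config.create_timing_cache(blob)   # empty blob starts a fresh cache
    config.set_timing_cache(cache, ignore_mismatch=False)
    return cache

def save_timing_cache(cache):
    """Persist the per-tactic measurements gathered during this build."""
    with open(CACHE_PATH, "wb") as f:
        f.write(memoryview(cache.serialize()))
```

Calling attach_timing_cache(config) before build_serialized_network() and save_timing_cache(...) afterwards should turn most of the tactic sweeps above into cache hits on the next build of the same model.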
[03/25/2022-13:24:22] [V] [TRT] sections.1.0.conv2.module.weight + QuantizeLinear_209_quantize_scale_node + Conv_213 + Relu_215 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32_epifadd Tactic: 7504901284678552178 [03/25/2022-13:24:22] [V] [TRT] Tactic: 7504901284678552178 Time: 0.087168 [03/25/2022-13:24:22] [V] [TRT] sections.1.0.conv2.module.weight + QuantizeLinear_209_quantize_scale_node + Conv_213 + Relu_215 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r3s3 Tactic: 8096257414008860171 [03/25/2022-13:24:22] [V] [TRT] Tactic: 8096257414008860171 Time: 0.107776 [03/25/2022-13:24:22] [V] [TRT] sections.1.0.conv2.module.weight + QuantizeLinear_209_quantize_scale_node + Conv_213 + Relu_215 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x64x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r3s3 Tactic: 8128112048355596715 [03/25/2022-13:24:22] [V] [TRT] Tactic: 8128112048355596715 Time: 0.102656 [03/25/2022-13:24:22] [V] [TRT] sections.1.0.conv2.module.weight + QuantizeLinear_209_quantize_scale_node + Conv_213 + Relu_215 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_indexed_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32_epifadd Tactic: 8751622450593766232 [03/25/2022-13:24:22] [V] [TRT] Tactic: 8751622450593766232 Time: 0.093568 [03/25/2022-13:24:22] [V] [TRT] sections.1.0.conv2.module.weight + QuantizeLinear_209_quantize_scale_node + Conv_213 + Relu_215 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_indexed_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32 Tactic: 9064458886956700976 [03/25/2022-13:24:22] [V] [TRT] Tactic: 9064458886956700976 Time: 0.094464 [03/25/2022-13:24:22] [V] [TRT] sections.1.0.conv2.module.weight + QuantizeLinear_209_quantize_scale_node + Conv_213 + Relu_215 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_t1r3s3_epifadd Tactic: 9143438935315839085 [03/25/2022-13:24:22] [V] [TRT] Tactic: 9143438935315839085 Time: 0.178048 [03/25/2022-13:24:22] [V] [TRT] sections.1.0.conv2.module.weight + QuantizeLinear_209_quantize_scale_node + Conv_213 + Relu_215 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage4_warpsize4x2x1_g1_tensor16x8x32 Tactic: -9165697322068360861 [03/25/2022-13:24:22] [V] [TRT] Tactic: -9165697322068360861 Time: 0.10176 [03/25/2022-13:24:22] [V] [TRT] sections.1.0.conv2.module.weight + QuantizeLinear_209_quantize_scale_node + Conv_213 + Relu_215 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_singleBuffer_small_nt_v1 Tactic: -9118785798277698619 [03/25/2022-13:24:22] [V] [TRT] Tactic: -9118785798277698619 Time: 0.144512 [03/25/2022-13:24:22] [V] [TRT] sections.1.0.conv2.module.weight + QuantizeLinear_209_quantize_scale_node + Conv_213 + Relu_215 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16 Tactic: -9108166971364503411 
[03/25/2022-13:24:22] [V] [TRT] Tactic: -9108166971364503411 Time: 0.169856
[03/25/2022-13:24:22] [V] [TRT] sections.1.0.conv2.module.weight + QuantizeLinear_209_quantize_scale_node + Conv_213 + Relu_215 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage1_warpsize4x2x1_g1_tensor8x8x16 Tactic: -8861822316054763526
[03/25/2022-13:24:22] [V] [TRT] Tactic: -8861822316054763526 Time: 0.149632
[03/25/2022-13:24:22] [V] [TRT] sections.1.0.conv2.module.weight + QuantizeLinear_209_quantize_scale_node + Conv_213 + Relu_215 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x128_stage3_warpsize1x1x1_g1_sptensor16x8x64 Tactic: -8791277710877987710
[03/25/2022-13:24:22] [V] [TRT] Tactic: -8791277710877987710 Time: 0.152192
[03/25/2022-13:24:22] [V] [TRT] sections.1.0.conv2.module.weight + QuantizeLinear_209_quantize_scale_node + Conv_213 + Relu_215 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x128x64_stage3_warpsize2x2x1_g1_tensor16x8x32 Tactic: -8691377209893505057
[03/25/2022-13:24:22] [V] [TRT] Tactic: -8691377209893505057 Time: 0.0928
[03/25/2022-13:24:22] [V] [TRT] sections.1.0.conv2.module.weight + QuantizeLinear_209_quantize_scale_node + Conv_213 + Relu_215 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32 Tactic: -8520292213102999339
[03/25/2022-13:24:22] [V] [TRT] Tactic: -8520292213102999339 Time: 0.140288
[03/25/2022-13:24:22] [V] [TRT] sections.1.0.conv2.module.weight + QuantizeLinear_209_quantize_scale_node + Conv_213 + Relu_215 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage1_warpsize4x1x1_g1_tensor8x8x16_t1r3s3 Tactic: -8475551154769412306
[03/25/2022-13:24:22] [V] [TRT] Tactic: -8475551154769412306 Time: 0.147968
[03/25/2022-13:24:22] [V] [TRT] sections.1.0.conv2.module.weight + QuantizeLinear_209_quantize_scale_node + Conv_213 + Relu_215 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x192x64_stage3_warpsize4x2x1_g1_tensor16x8x32_t1r3s3 Tactic: -8417388128970254446
[03/25/2022-13:24:22] [V] [TRT] Tactic: -8417388128970254446 Time: 0.137472
[03/25/2022-13:24:22] [V] [TRT] sections.1.0.conv2.module.weight + QuantizeLinear_209_quantize_scale_node + Conv_213 + Relu_215 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32 Tactic: -8263994888336646547
[03/25/2022-13:24:22] [V] [TRT] Tactic: -8263994888336646547 Time: 0.08832
[03/25/2022-13:24:22] [V] [TRT] sections.1.0.conv2.module.weight + QuantizeLinear_209_quantize_scale_node + Conv_213 + Relu_215 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32 Tactic: -8205948405243401049
[03/25/2022-13:24:22] [V] [TRT] Tactic: -8205948405243401049 Time: 0.175616
[03/25/2022-13:24:22] [V] [TRT] sections.1.0.conv2.module.weight + QuantizeLinear_209_quantize_scale_node + Conv_213 + Relu_215 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r3s3_epifadd Tactic: -7992068592656168418
[03/25/2022-13:24:23] [V] [TRT] Tactic: -7992068592656168418 Time: 0.1088
[03/25/2022-13:24:23] [V] [TRT] sections.1.0.conv2.module.weight + QuantizeLinear_209_quantize_scale_node + Conv_213 + Relu_215 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x32x64_stage4_warpsize2x1x1_g1_tensor16x8x32_t1r3s3 Tactic: -7898477046581738867
[03/25/2022-13:24:23] [V] [TRT] Tactic: -7898477046581738867 Time: 0.142848
[03/25/2022-13:24:23] [V] [TRT] sections.1.0.conv2.module.weight + QuantizeLinear_209_quantize_scale_node + Conv_213 + Relu_215 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage4_warpsize4x1x1_g1_tensor16x8x32_epifadd Tactic: -7842775553137511386
[03/25/2022-13:24:23] [V] [TRT] Tactic: -7842775553137511386 Time: 0.109056
[03/25/2022-13:24:23] [V] [TRT] sections.1.0.conv2.module.weight + QuantizeLinear_209_quantize_scale_node + Conv_213 + Relu_215 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32 Tactic: -7683887278997527517
[03/25/2022-13:24:23] [V] [TRT] Tactic: -7683887278997527517 Time: 0.169984
[03/25/2022-13:24:23] [V] [TRT] sections.1.0.conv2.module.weight + QuantizeLinear_209_quantize_scale_node + Conv_213 + Relu_215 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x64x64_stage3_warpsize2x2x1_g1_tensor16x8x32 Tactic: -7381370635708568663
[03/25/2022-13:24:23] [V] [TRT] Tactic: -7381370635708568663 Time: 0.105984
[03/25/2022-13:24:23] [V] [TRT] sections.1.0.conv2.module.weight + QuantizeLinear_209_quantize_scale_node + Conv_213 + Relu_215 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r3s3 Tactic: -7129320389887881029
[03/25/2022-13:24:23] [V] [TRT] Tactic: -7129320389887881029 Time: 0.142976
[03/25/2022-13:24:23] [V] [TRT] sections.1.0.conv2.module.weight + QuantizeLinear_209_quantize_scale_node + Conv_213 + Relu_215 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r3s3 Tactic: -6959995514028471820
[03/25/2022-13:24:23] [V] [TRT] Tactic: -6959995514028471820 Time: 0.131968
[03/25/2022-13:24:23] [V] [TRT] sections.1.0.conv2.module.weight + QuantizeLinear_209_quantize_scale_node + Conv_213 + Relu_215 Set Tactic Name: ampere_int8_i8816cudnn_int8_128x128_ldg16_relu_small_nt_v1 Tactic: -6400348606759295499
[03/25/2022-13:24:23] [V] [TRT] Tactic: -6400348606759295499 Time: 0.141056
[03/25/2022-13:24:23] [V] [TRT] sections.1.0.conv2.module.weight + QuantizeLinear_209_quantize_scale_node + Conv_213 + Relu_215 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r3s3 Tactic: -6371781333659293809
[03/25/2022-13:24:23] [V] [TRT] Tactic: -6371781333659293809 Time: 0.171264
[03/25/2022-13:24:23] [V] [TRT] sections.1.0.conv2.module.weight + QuantizeLinear_209_quantize_scale_node + Conv_213 + Relu_215 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x96x64_stage4_warpsize4x2x1_g1_tensor16x8x32 Tactic: -6256128573036943404
[03/25/2022-13:24:23] [V] [TRT] Tactic: -6256128573036943404 Time: 0.169088
[03/25/2022-13:24:23] [V] [TRT] sections.1.0.conv2.module.weight + QuantizeLinear_209_quantize_scale_node + Conv_213 + Relu_215 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x128_ldg16_relu_small_nt_v1 Tactic: -5980889159865208399
[03/25/2022-13:24:23] [V] [TRT] Tactic: -5980889159865208399 Time: 0.14976
[03/25/2022-13:24:23] [V] [TRT] sections.1.0.conv2.module.weight + QuantizeLinear_209_quantize_scale_node + Conv_213 + Relu_215 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_singleBuffer_medium_nt_v1 Tactic: -5766140806760372989
[03/25/2022-13:24:23] [V] [TRT] Tactic: -5766140806760372989 Time: 0.149376
[03/25/2022-13:24:23] [V] [TRT] sections.1.0.conv2.module.weight + QuantizeLinear_209_quantize_scale_node + Conv_213 + Relu_215 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32_t1r3s3_epifadd Tactic: -5709079507616090666
[03/25/2022-13:24:23] [V] [TRT] Tactic: -5709079507616090666 Time: 0.086016
[03/25/2022-13:24:23] [V] [TRT] sections.1.0.conv2.module.weight + QuantizeLinear_209_quantize_scale_node + Conv_213 + Relu_215 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage4_warpsize2x4x1_g1_tensor16x8x32_t1r3s3_epifadd Tactic: -5698636014239116282
[03/25/2022-13:24:23] [V] [TRT] Tactic: -5698636014239116282 Time: 0.151808
[03/25/2022-13:24:23] [V] [TRT] sections.1.0.conv2.module.weight + QuantizeLinear_209_quantize_scale_node + Conv_213 + Relu_215 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage1_warpsize4x1x1_g1_tensor8x8x16 Tactic: -5180570335464125033
[03/25/2022-13:24:23] [V] [TRT] Tactic: -5180570335464125033 Time: 0.155904
[03/25/2022-13:24:23] [V] [TRT] sections.1.0.conv2.module.weight + QuantizeLinear_209_quantize_scale_node + Conv_213 + Relu_215 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r3s3 Tactic: -4933563390723451692
[03/25/2022-13:24:23] [V] [TRT] Tactic: -4933563390723451692 Time: 0.11904
[03/25/2022-13:24:23] [V] [TRT] sections.1.0.conv2.module.weight + QuantizeLinear_209_quantize_scale_node + Conv_213 + Relu_215 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_medium_nt_v1 Tactic: -4516822589357530549
[03/25/2022-13:24:23] [V] [TRT] Tactic: -4516822589357530549 Time: 0.153728
[03/25/2022-13:24:23] [V] [TRT] sections.1.0.conv2.module.weight + QuantizeLinear_209_quantize_scale_node + Conv_213 + Relu_215 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x16x128_stage3_warpsize1x1x1_g1_sptensor16x8x64 Tactic: -4232916483289779353
[03/25/2022-13:24:23] [V] [TRT] Tactic: -4232916483289779353 Time: 0.240768
[03/25/2022-13:24:23] [V] [TRT] sections.1.0.conv2.module.weight + QuantizeLinear_209_quantize_scale_node + Conv_213 + Relu_215 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x128_stage3_warpsize1x4x1_g1_sptensor16x8x64 Tactic: -3460842194336717186
[03/25/2022-13:24:23] [V] [TRT] Tactic: -3460842194336717186 Time: 0.111744
[03/25/2022-13:24:23] [V] [TRT] sections.1.0.conv2.module.weight + QuantizeLinear_209_quantize_scale_node + Conv_213 + Relu_215 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage4_warpsize4x2x1_g1_tensor16x8x32_t1r3s3_epifadd Tactic: -3413217501222406256
[03/25/2022-13:24:23] [V] [TRT] Tactic: -3413217501222406256 Time: 0.092544
[03/25/2022-13:24:23] [V] [TRT] sections.1.0.conv2.module.weight + QuantizeLinear_209_quantize_scale_node + Conv_213 + Relu_215 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32_t1r3s3 Tactic: -3280888557222886418
[03/25/2022-13:24:23] [V] [TRT] Tactic: -3280888557222886418 Time: 0.128384
[03/25/2022-13:24:23] [V] [TRT] sections.1.0.conv2.module.weight + QuantizeLinear_209_quantize_scale_node + Conv_213 + Relu_215 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r3s3 Tactic: -3238475748440751107
[03/25/2022-13:24:23] [V] [TRT] Tactic: -3238475748440751107 Time: 0.100608
[03/25/2022-13:24:23] [V] [TRT] sections.1.0.conv2.module.weight + QuantizeLinear_209_quantize_scale_node + Conv_213 + Relu_215 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage4_warpsize4x1x1_g1_tensor16x8x32_t1r3s3 Tactic: -3182884991006484042
[03/25/2022-13:24:23] [V] [TRT] Tactic: -3182884991006484042 Time: 0.10752
[03/25/2022-13:24:23] [V] [TRT] sections.1.0.conv2.module.weight + QuantizeLinear_209_quantize_scale_node + Conv_213 + Relu_215 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r3s3 Tactic: -3173468756112541306
[03/25/2022-13:24:23] [V] [TRT] Tactic: -3173468756112541306 Time: 0.17344
[03/25/2022-13:24:23] [V] [TRT] sections.1.0.conv2.module.weight + QuantizeLinear_209_quantize_scale_node + Conv_213 + Relu_215 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x128_ldg16_relu_large_nt_v1 Tactic: -2917455979290586480
[03/25/2022-13:24:23] [V] [TRT] Tactic: -2917455979290586480 Time: 0.160896
[03/25/2022-13:24:23] [V] [TRT] sections.1.0.conv2.module.weight + QuantizeLinear_209_quantize_scale_node + Conv_213 + Relu_215 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x128x64_stage3_warpsize4x2x1_g1_tensor16x8x32_t1r3s3 Tactic: -2741641298163591508
[03/25/2022-13:24:23] [V] [TRT] Tactic: -2741641298163591508 Time: 0.091648
[03/25/2022-13:24:23] [V] [TRT] sections.1.0.conv2.module.weight + QuantizeLinear_209_quantize_scale_node + Conv_213 + Relu_215 Set Tactic Name: ampere_int8_i8816cudnn_int8_128x128_ldg16_relu_large_nt_v1 Tactic: -2571022005763160364
[03/25/2022-13:24:23] [V] [TRT] Tactic: -2571022005763160364 Time: 0.156416
[03/25/2022-13:24:23] [V] [TRT] sections.1.0.conv2.module.weight + QuantizeLinear_209_quantize_scale_node + Conv_213 + Relu_215 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage1_warpsize2x2x1_g1_tensor8x8x16 Tactic: -2499089240293650188
[03/25/2022-13:24:23] [V] [TRT] Tactic: -2499089240293650188 Time: 0.144128
[03/25/2022-13:24:23] [V] [TRT] sections.1.0.conv2.module.weight + QuantizeLinear_209_quantize_scale_node + Conv_213 + Relu_215 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16 Tactic: -2328318099174473157
[03/25/2022-13:24:23] [V] [TRT] Tactic: -2328318099174473157 Time: 0.17344
[03/25/2022-13:24:23] [V] [TRT] sections.1.0.conv2.module.weight + QuantizeLinear_209_quantize_scale_node + Conv_213 + Relu_215 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_epifadd Tactic: -2083778562631872334
[03/25/2022-13:24:23] [V] [TRT] Tactic: -2083778562631872334 Time: 0.110464
[03/25/2022-13:24:23] [V] [TRT] sections.1.0.conv2.module.weight + QuantizeLinear_209_quantize_scale_node + Conv_213 + Relu_215 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x4x1_g1_tensor16x8x32 Tactic: -2054375205435666404
[03/25/2022-13:24:23] [V] [TRT] Tactic: -2054375205435666404 Time: 0.126464
[03/25/2022-13:24:23] [V] [TRT] sections.1.0.conv2.module.weight + QuantizeLinear_209_quantize_scale_node + Conv_213 + Relu_215 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32_t1r3s3 Tactic: -1546787387293556842
[03/25/2022-13:24:23] [V] [TRT] Tactic: -1546787387293556842 Time: 0.086272
[03/25/2022-13:24:23] [V] [TRT] sections.1.0.conv2.module.weight + QuantizeLinear_209_quantize_scale_node + Conv_213 + Relu_215 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x32x64_stage4_warpsize4x1x1_g1_tensor16x8x32 Tactic: -1498626619443284096
[03/25/2022-13:24:23] [V] [TRT] Tactic: -1498626619443284096 Time: 0.1568
[03/25/2022-13:24:23] [V] [TRT] sections.1.0.conv2.module.weight + QuantizeLinear_209_quantize_scale_node + Conv_213 + Relu_215 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r3s3 Tactic: -1471245223605064669
[03/25/2022-13:24:23] [V] [TRT] Tactic: -1471245223605064669 Time: 0.124288
[03/25/2022-13:24:23] [V] [TRT] sections.1.0.conv2.module.weight + QuantizeLinear_209_quantize_scale_node + Conv_213 + Relu_215 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32 Tactic: -1283580231568512025
[03/25/2022-13:24:23] [V] [TRT] Tactic: -1283580231568512025 Time: 0.211456
[03/25/2022-13:24:23] [V] [TRT] sections.1.0.conv2.module.weight + QuantizeLinear_209_quantize_scale_node + Conv_213 + Relu_215 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x64x64_stage3_warpsize4x1x1_g1_tensor16x8x32_t1r3s3 Tactic: -1224421172675151280
[03/25/2022-13:24:23] [V] [TRT] Tactic: -1224421172675151280 Time: 0.092544
[03/25/2022-13:24:23] [V] [TRT] sections.1.0.conv2.module.weight + QuantizeLinear_209_quantize_scale_node + Conv_213 + Relu_215 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_epifadd Tactic: -1173968681844185579
[03/25/2022-13:24:23] [V] [TRT] Tactic: -1173968681844185579 Time: 0.214656
[03/25/2022-13:24:23] [V] [TRT] sections.1.0.conv2.module.weight + QuantizeLinear_209_quantize_scale_node + Conv_213 + Relu_215 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x128x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r3s3 Tactic: -921247911551089037
[03/25/2022-13:24:23] [V] [TRT] Tactic: -921247911551089037 Time: 0.091776
[03/25/2022-13:24:23] [V] [TRT] sections.1.0.conv2.module.weight + QuantizeLinear_209_quantize_scale_node + Conv_213 + Relu_215 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32 Tactic: -762222380308749469
[03/25/2022-13:24:23] [V] [TRT] Tactic: -762222380308749469 Time: 0.121088
[03/25/2022-13:24:23] [V] [TRT] sections.1.0.conv2.module.weight + QuantizeLinear_209_quantize_scale_node + Conv_213 + Relu_215 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_epifadd Tactic: -556794153877490941
[03/25/2022-13:24:23] [V] [TRT] Tactic: -556794153877490941 Time: 0.123136
[03/25/2022-13:24:23] [V] [TRT] sections.1.0.conv2.module.weight + QuantizeLinear_209_quantize_scale_node + Conv_213 + Relu_215 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage4_warpsize4x2x1_g1_tensor16x8x32_t1r3s3 Tactic: -516725800067794372
[03/25/2022-13:24:23] [V] [TRT] Tactic: -516725800067794372 Time: 0.0992
[03/25/2022-13:24:23] [V] [TRT] sections.1.0.conv2.module.weight + QuantizeLinear_209_quantize_scale_node + Conv_213 + Relu_215 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_singleBuffer_large_nt_v1 Tactic: -428104331444385564
[03/25/2022-13:24:23] [V] [TRT] Tactic: -428104331444385564 Time: 0.151808
[03/25/2022-13:24:23] [V] [TRT] sections.1.0.conv2.module.weight + QuantizeLinear_209_quantize_scale_node + Conv_213 + Relu_215 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage1_warpsize2x1x1_g1_tensor8x8x16 Tactic: -366411318217594794
[03/25/2022-13:24:23] [V] [TRT] Tactic: -366411318217594794 Time: 0.19456
[03/25/2022-13:24:23] [V] [TRT] sections.1.0.conv2.module.weight + QuantizeLinear_209_quantize_scale_node + Conv_213 + Relu_215 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage1_warpsize2x4x1_g1_tensor8x8x16 Tactic: -351548418071036983
[03/25/2022-13:24:23] [V] [TRT] Tactic: -351548418071036983 Time: 0.27776
[03/25/2022-13:24:23] [V] [TRT] Fastest Tactic: -5709079507616090666 Time: 0.086016
[03/25/2022-13:24:23] [V] [TRT] >>>>>>>>>>>>>>> Chose Runner Type: CaskConvolution Tactic: -5709079507616090666
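
That settles the conv2 kernel: the fused INT8 convolution runs fastest with the dense sm80 128x128x64 implicit-GEMM tactic (0.086016 ms), and every 2:4 sparse candidate timed above came in slower, so sparsity buys nothing for this particular layer. For anyone reproducing this build outside of trtexec, here is a minimal sketch of the equivalent TensorRT 8.2 Python API calls; it assumes the same resnet50_quant_sparse.onnx used in this run, and since the model already carries QuantizeLinear/DequantizeLinear scales, no INT8 calibrator is wired up. Error handling is trimmed.

import tensorrt as trt

# VERBOSE logging is what produces the per-tactic trace shown in this log.
TRT_LOGGER = trt.Logger(trt.Logger.VERBOSE)

builder = trt.Builder(TRT_LOGGER)
network = builder.create_network(
    1 << int(trt.NetworkDefinitionCreationFlag.EXPLICIT_BATCH))
parser = trt.OnnxParser(network, TRT_LOGGER)

with open("resnet50_quant_sparse.onnx", "rb") as f:
    if not parser.parse(f.read()):
        raise RuntimeError(parser.get_error(0))

config = builder.create_builder_config()
config.max_workspace_size = 16 << 20             # 16 MiB, as reported above
config.set_flag(trt.BuilderFlag.INT8)            # --int8
config.set_flag(trt.BuilderFlag.SPARSE_WEIGHTS)  # --sparsity=enable

# If the exported ONNX had a dynamic batch dimension, an optimization
# profile pinning the input to 128x3x224x224 would be added here
# (the equivalent of --shapes=input:128x3x224x224).
engine_bytes = builder.build_serialized_network(network, config)

Built this way, the engine goes through the same per-layer tactic timing seen here; the trace continues with the next fused convolution of the block.
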
[03/25/2022-13:24:23] [V] [TRT] =============== Computing costs for
[03/25/2022-13:24:23] [V] [TRT] *************** Autotuning format combination: Int8(25088,784:4,28,1), Int8(100352,784:4,28,1) -> Int8(100352,784:4,28,1) ***************
[03/25/2022-13:24:23] [V] [TRT] --------------- Timing Runner: sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251 (CudaDepthwiseConvolution)
[03/25/2022-13:24:23] [V] [TRT] CudaDepthwiseConvolution has no valid tactics for this config, skipping
[03/25/2022-13:24:23] [V] [TRT] --------------- Timing Runner: sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251 (FusedConvActConvolution)
[03/25/2022-13:24:23] [V] [TRT] FusedConvActConvolution has no valid tactics for this config, skipping
[03/25/2022-13:24:23] [V] [TRT] --------------- Timing Runner: sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251 (CaskConvolution)
[03/25/2022-13:24:23] [V] [TRT] sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_medium_nn_v1 Tactic: 175853789719975416
[03/25/2022-13:24:23] [V] [TRT] Tactic: 175853789719975416 Time: 0.292992
[03/25/2022-13:24:23] [V] [TRT] sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_xregs_medium_nn_v1 Tactic: 2171150287007712632
[03/25/2022-13:24:23] [V] [TRT] Tactic: 2171150287007712632 Time: 0.333056
[03/25/2022-13:24:23] [V] [TRT] sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x64_relu_small_nn_v1 Tactic: 2234457234705232274
[03/25/2022-13:24:23] [V] [TRT] Tactic: 2234457234705232274 Time: 0.251008
[03/25/2022-13:24:23] [V] [TRT] sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x64_relu_medium_nn_v1 Tactic: 5834048089706882838
[03/25/2022-13:24:23] [V] [TRT] Tactic: 5834048089706882838 Time: 0.254848
[03/25/2022-13:24:23] [V] [TRT] sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x128_relu_interior_nn_v1 Tactic: 6299962968199310600
[03/25/2022-13:24:23] [V] [TRT] Tactic: 6299962968199310600 Time: 0.263552
[03/25/2022-13:24:23] [V] [TRT] sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_xregs_interior_nn_v1 Tactic: 6341572697076960911
[03/25/2022-13:24:23] [V] [TRT] Tactic: 6341572697076960911 Time: 0.302976
[03/25/2022-13:24:23] [V] [TRT] sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_small_nn_v1 Tactic: -8626990807754934295
[03/25/2022-13:24:23] [V] [TRT] Tactic: -8626990807754934295 Time: 0.28736
[03/25/2022-13:24:23] [V] [TRT] sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x64_relu_interior_nn_v1 Tactic: -8498217049614706532
[03/25/2022-13:24:23] [V] [TRT] Tactic: -8498217049614706532 Time: 0.24512
[03/25/2022-13:24:23] [V] [TRT] sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_xregs_small_nn_v1 Tactic: -7303593854972602201
[03/25/2022-13:24:23] [V] [TRT] Tactic: -7303593854972602201 Time: 0.319872
[03/25/2022-13:24:23] [V] [TRT] sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x128_relu_medium_nn_v1 Tactic: -6585664687867083638
[03/25/2022-13:24:23] [V] [TRT] Tactic: -6585664687867083638 Time: 0.266624
[03/25/2022-13:24:23] [V] [TRT] sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_interior_nn_v1 Tactic: -3326139578711341011
[03/25/2022-13:24:23] [V] [TRT] Tactic: -3326139578711341011 Time: 0.276352
[03/25/2022-13:24:23] [V] [TRT] sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x128_relu_small_nn_v1 Tactic: -683636008127039856
[03/25/2022-13:24:23] [V] [TRT] Tactic: -683636008127039856 Time: 0.265728
[03/25/2022-13:24:23] [V] [TRT] Fastest Tactic: -8498217049614706532 Time: 0.24512
[03/25/2022-13:24:23] [V] [TRT] >>>>>>>>>>>>>>> Chose Runner Type: CaskConvolution Tactic: -8498217049614706532
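
Every format combination ends in a Fastest Tactic / Chose Runner pair like the one just above, and over a full ResNet-50 build the pattern repeats hundreds of times. A throwaway Python sketch for boiling a saved trace down to just those decisions (assuming the verbose output was captured to a hypothetical build.log; the two regexes match exactly the line shapes shown in this trace):

import re

# "Fastest Tactic: <id> Time: <t>" closes each autotuned format combination.
fastest = re.compile(r"Fastest Tactic: (-?\d+) Time: ([\d.]+)")
# "Chose Runner Type: <runner> Tactic: <id>" records the winning backend.
chose = re.compile(r"Chose Runner Type: (\w+) Tactic: (-?\d+)")

with open("build.log") as log:
    for line in log:
        m = fastest.search(line)
        if m:
            print(f"fastest tactic {m.group(1)}: {float(m.group(2)):.6f} ms")
        m = chose.search(line)
        if m:
            print(f"  -> chose {m.group(1)}, tactic {m.group(2)}")

On this build it would print, among others, tactic -8498217049614706532 at 0.245120 ms for the combination above.
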
[03/25/2022-13:24:23] [V] [TRT] *************** Autotuning format combination: Int8(25088,784:4,28,1), Int8(12544,784:32,28,1) -> Int8(12544,784:32,28,1) ***************
[03/25/2022-13:24:23] [V] [TRT] --------------- Timing Runner: sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251 (CaskConvolution)
[03/25/2022-13:24:23] [V] [TRT] sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_small_c32_nn_v1 Tactic: 1100922622480907544
[03/25/2022-13:24:23] [V] [TRT] Tactic: 1100922622480907544 Time: 0.278272
[03/25/2022-13:24:23] [V] [TRT] sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x128_relu_interior_c32_nn_v1 Tactic: 2855900226702061782
[03/25/2022-13:24:23] [V] [TRT] Tactic: 2855900226702061782 Time: 0.263168
[03/25/2022-13:24:23] [V] [TRT] sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x64_relu_medium_c32_nn_v1 Tactic: 3606311198834416176
[03/25/2022-13:24:23] [V] [TRT] Tactic: 3606311198834416176 Time: 0.248064
[03/25/2022-13:24:23] [V] [TRT] sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x128_relu_small_c32_nn_v1 Tactic: 4325765560739862899
[03/25/2022-13:24:23] [V] [TRT] Tactic: 4325765560739862899 Time: 0.266112
[03/25/2022-13:24:23] [V] [TRT] sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x64_relu_interior_c32_nn_v1 Tactic: 8803458114157674373
[03/25/2022-13:24:23] [V] [TRT] Tactic: 8803458114157674373 Time: 0.239872
[03/25/2022-13:24:23] [V] [TRT] sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_interior_c32_nn_v1 Tactic: -6934773036503365000
[03/25/2022-13:24:23] [V] [TRT] Tactic: -6934773036503365000 Time: 0.268544
[03/25/2022-13:24:23] [V] [TRT] sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_xregs_interior_c32_nn_v1 Tactic: -4431642509665791294
[03/25/2022-13:24:23] [V] [TRT] Tactic: -4431642509665791294 Time: 0.300288
[03/25/2022-13:24:23] [V] [TRT] sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x128_relu_medium_c32_nn_v1 Tactic: -4255737803793506479
[03/25/2022-13:24:23] [V] [TRT] Tactic: -4255737803793506479 Time: 0.26688
[03/25/2022-13:24:23] [V] [TRT] sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_xregs_small_c32_nn_v1 Tactic: -3958182351168863467
[03/25/2022-13:24:23] [V] [TRT] Tactic: -3958182351168863467 Time: 0.30656
[03/25/2022-13:24:23] [V] [TRT] sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_xregs_medium_c32_nn_v1 Tactic: -3111968753064955248
[03/25/2022-13:24:23] [V] [TRT] Tactic: -3111968753064955248 Time: 0.32128
[03/25/2022-13:24:23] [V] [TRT] sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_medium_c32_nn_v1 Tactic: -1492575840277333548
[03/25/2022-13:24:23] [V] [TRT] Tactic: -1492575840277333548 Time: 0.284288
[03/25/2022-13:24:23] [V] [TRT] sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x64_relu_small_c32_nn_v1 Tactic: -868495160148524802
[03/25/2022-13:24:23] [V] [TRT] Tactic: -868495160148524802 Time: 0.244096
[03/25/2022-13:24:23] [V] [TRT] Fastest Tactic: 8803458114157674373 Time: 0.239872
[03/25/2022-13:24:23] [V] [TRT] >>>>>>>>>>>>>>> Chose Runner Type: CaskConvolution Tactic: 8803458114157674373
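
A rough sanity check on the number just chosen. The format strings decode as stride tuples of vectorized tensors: 784 = 28x28 spatial positions, Int8(25088,784:4,28,1) gives 25088/784 = 32 vectors of 4, i.e. 128 input channels, and Int8(12544,784:32,28,1) gives 12544/784 = 16 vectors of 32, i.e. 512 output channels. conv3 of a ResNet bottleneck block is a 1x1 convolution, and the build shape pins the batch at 128. Taking the logged time as milliseconds, the chosen tactic sustains roughly 55 INT8 TOP/s, far below the A100's 624 TOP/s INT8 peak, which is plausible for a small 1x1 convolution that is bandwidth- rather than math-limited. The arithmetic, with every dimension being an inference from the log rather than read from the model file:

# Back-of-envelope throughput of the tactic chosen above (all dims inferred).
N, C, K, H, W = 128, 128, 512, 28, 28   # batch, in/out channels, output size
R = S = 1                               # conv3 of a bottleneck block is 1x1
macs = N * K * C * R * S * H * W        # multiply-accumulates
ops = 2 * macs                          # count each MAC as two ops
t_ms = 0.239872                         # "Fastest Tactic ... Time" above
tops = ops / (t_ms * 1e-3) / 1e12
print(f"{macs / 1e9:.2f} GMACs in {t_ms} ms -> {tops:.1f} TOP/s")  # ~54.8

The same arithmetic applied to the 0.086016 ms conv2 winner earlier, if that layer has the standard 3x3, stride-2, 128-channel bottleneck shape, lands near 340 TOP/s, so the 3x3 layers sit much closer to the tensor-core roofline than these 1x1 projections.
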
[03/25/2022-13:24:23] [V] [TRT] *************** Autotuning format combination: Int8(3136,784:32,28,1), Int8(12544,784:32,28,1) -> Int8(12544,784:32,28,1) ***************
[03/25/2022-13:24:23] [V] [TRT] --------------- Timing Runner: sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251 (CudaGroupConvolution)
[03/25/2022-13:24:23] [V] [TRT] CudaGroupConvolution has no valid tactics for this config, skipping
[03/25/2022-13:24:23] [V] [TRT] --------------- Timing Runner: sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251 (CudaDepthwiseConvolution)
[03/25/2022-13:24:23] [V] [TRT] CudaDepthwiseConvolution has no valid tactics for this config, skipping
[03/25/2022-13:24:23] [V] [TRT] --------------- Timing Runner: sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251 (FusedConvActConvolution)
[03/25/2022-13:24:23] [V] [TRT] FusedConvActConvolution has no valid tactics for this config, skipping
[03/25/2022-13:24:23] [V] [TRT] --------------- Timing Runner: sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251 (CaskConvolution)
[03/25/2022-13:24:23] [V] [TRT] sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage1_warpsize4x1x1_g1_tensor8x8x16_t1r1s1 Tactic: 68468667201176803
[03/25/2022-13:24:23] [V] [TRT] Tactic: 68468667201176803 Time: 0.167296
[03/25/2022-13:24:23] [V] [TRT] sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x192x64_stage3_warpsize4x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 125145153013230687
[03/25/2022-13:24:23] [V] [TRT] Tactic: 125145153013230687 Time: 0.166144
[03/25/2022-13:24:23] [V] [TRT] sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x128_ldg16_relu_interior_nt_v1 Tactic: 434957160407688216
[03/25/2022-13:24:23] [V] [TRT] Tactic: 434957160407688216 Time: 0.180096
[03/25/2022-13:24:23] [V] [TRT] sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage4_warpsize4x2x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: 805889586762897346
[03/25/2022-13:24:23] [V] [TRT] Tactic: 805889586762897346 Time: 0.140672
[03/25/2022-13:24:23] [V] [TRT] sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x128_stage3_warpsize4x2x1_g1_sptensor16x8x64_t1r1s1 Tactic: 857001784974286465
[03/25/2022-13:24:23] [V] [TRT] Tactic: 857001784974286465 Time: 0.14848
[03/25/2022-13:24:23] [V] [TRT] sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 1214130898909872671
[03/25/2022-13:24:23] [V] [TRT] Tactic: 1214130898909872671 Time: 0.187264
[03/25/2022-13:24:23] [V] [TRT] sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage4_warpsize4x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 1278425129871930205
[03/25/2022-13:24:23] [V] [TRT] Tactic: 1278425129871930205 Time: 0.132096
[03/25/2022-13:24:23] [V] [TRT] sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 1583811548148740665
[03/25/2022-13:24:23] [V] [TRT] Tactic: 1583811548148740665 Time: 0.151936
[03/25/2022-13:24:23] [V] [TRT] sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 1701344857577810806
[03/25/2022-13:24:23] [V] [TRT] Tactic: 1701344857577810806 Time: 0.177792
[03/25/2022-13:24:23] [V] [TRT] sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x32x64_stage4_warpsize2x1x1_g1_tensor16x8x32_t1r1s1 Tactic: 1797231177354918208
[03/25/2022-13:24:23] [V] [TRT] Tactic: 1797231177354918208 Time: 0.18432
[03/25/2022-13:24:23] [V] [TRT] sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x192x64_stage3_warpsize4x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 2004812516525036381
[03/25/2022-13:24:23] [V] [TRT] Tactic: 2004812516525036381 Time: 0.15232
[03/25/2022-13:24:23] [V] [TRT] sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x128x64_stage3_warpsize4x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 2030033463723799063
[03/25/2022-13:24:23] [V] [TRT] Tactic: 2030033463723799063 Time: 0.132992
[03/25/2022-13:24:23] [V] [TRT] sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251 Set Tactic Name: ampere_int8_i8816cudnn_int8_128x128_ldg16_relu_interior_nt_v1 Tactic: 2346437292116182513
[03/25/2022-13:24:23] [V] [TRT] Tactic: 2346437292116182513 Time: 0.15424
[03/25/2022-13:24:23] [V] [TRT] sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 2376898825218218566
[03/25/2022-13:24:23] [V] [TRT] Tactic: 2376898825218218566 Time: 0.1184
[03/25/2022-13:24:23] [V] [TRT] sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_singleBuffer_interior_nt_v1 Tactic: 2522133112320625287
[03/25/2022-13:24:23] [V] [TRT] Tactic: 2522133112320625287 Time: 0.15808
[03/25/2022-13:24:23] [V] [TRT] sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage4_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 2548171972648455240
[03/25/2022-13:24:23] [V] [TRT] Tactic: 2548171972648455240 Time: 0.12288
[03/25/2022-13:24:23] [V] [TRT] sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage1_warpsize2x1x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: 2548946449357458230
[03/25/2022-13:24:23] [V] [TRT] Tactic: 2548946449357458230 Time: 0.169344
[03/25/2022-13:24:23] [V] [TRT] sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x16x128_stage3_warpsize1x1x1_g1_sptensor16x8x64_simple_t1r1s1 Tactic: 2678520742286844763
[03/25/2022-13:24:23] [V] [TRT] Tactic: 2678520742286844763 Time: 0.233088
[03/25/2022-13:24:23] [V] [TRT] sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 2756291002030759362
[03/25/2022-13:24:23] [V] [TRT] Tactic: 2756291002030759362 Time: 0.168192
[03/25/2022-13:24:23] [V] [TRT] sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x128x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 2972948223367788520
[03/25/2022-13:24:23] [V] [TRT] Tactic: 2972948223367788520 Time: 0.115584
[03/25/2022-13:24:23] [V] [TRT] sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251 Set Tactic Name: ampere_int8_i8816cudnn_int8_128x128_ldg16_relu_medium_nt_v1 Tactic: 2985940154541537814
[03/25/2022-13:24:23] [V] [TRT] Tactic: 2985940154541537814 Time: 0.155776
[03/25/2022-13:24:23] [V] [TRT] sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x96x64_stage5_warpsize2x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 3043273137345374664
[03/25/2022-13:24:23] [V] [TRT] Tactic: 3043273137345374664 Time: 0.16704
[03/25/2022-13:24:23] [V] [TRT] sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 3221677093659484230
[03/25/2022-13:24:23] [V] [TRT] Tactic: 3221677093659484230 Time: 0.203904
[03/25/2022-13:24:23] [V] [TRT] sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r1s1 Tactic: 3242897809704328258
[03/25/2022-13:24:23] [V] [TRT] Tactic: 3242897809704328258 Time: 0.174848
[03/25/2022-13:24:23] [V] [TRT] sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: 3312456766204252694
[03/25/2022-13:24:23] [V] [TRT] Tactic: 3312456766204252694 Time: 0.158592
[03/25/2022-13:24:23] [V] [TRT] sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 3538565962642681625
[03/25/2022-13:24:23] [V] [TRT] Tactic: 3538565962642681625 Time: 0.163584
[03/25/2022-13:24:23] [V] [TRT] sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage1_warpsize2x2x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: 3541919052468401776
[03/25/2022-13:24:23] [V] [TRT] Tactic: 3541919052468401776 Time: 0.153984
[03/25/2022-13:24:23] [V] [TRT] sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 3593397928177382100
[03/25/2022-13:24:23] [V] [TRT] Tactic: 3593397928177382100 Time: 0.188544
[03/25/2022-13:24:23] [V] [TRT] sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x64x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 3670282018109435863
[03/25/2022-13:24:23] [V] [TRT] Tactic: 3670282018109435863 Time: 0.135296
[03/25/2022-13:24:23] [V] [TRT] sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 3671413346254027573
[03/25/2022-13:24:23] [V] [TRT] Tactic: 3671413346254027573 Time: 0.163456
[03/25/2022-13:24:23] [V] [TRT] sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x128_ldg16_relu_medium_nt_v1 Tactic: 3899284354987683408
[03/25/2022-13:24:23] [V] [TRT] Tactic: 3899284354987683408 Time: 0.180096
[03/25/2022-13:24:23] [V] [TRT] sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: 3927509214678622419
[03/25/2022-13:24:23] [V] [TRT] Tactic: 3927509214678622419 Time: 0.167168
[03/25/2022-13:24:23] [V] [TRT] sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 4112572034735311841
[03/25/2022-13:24:23] [V] [TRT] Tactic: 4112572034735311841 Time: 0.222592
[03/25/2022-13:24:23] [V] [TRT] sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 4239974928951431644
[03/25/2022-13:24:23] [V] [TRT] Tactic: 4239974928951431644 Time: 0.123392
[03/25/2022-13:24:23] [V] [TRT] sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage6_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 4610760414797216079
[03/25/2022-13:24:23] [V] [TRT] Tactic: 4610760414797216079 Time: 0.13952
[03/25/2022-13:24:23] [V] [TRT] sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_small_nt_v1 Tactic: 4717285412741024953
[03/25/2022-13:24:23] [V] [TRT] Tactic: 4717285412741024953 Time: 0.162048
[03/25/2022-13:24:23] [V] [TRT] sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 4796956614760326119
[03/25/2022-13:24:23] [V] [TRT] Tactic: 4796956614760326119 Time: 0.142592
[03/25/2022-13:24:23] [V] [TRT] sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 4919361344804309192
[03/25/2022-13:24:23] [V] [TRT] Tactic: 4919361344804309192 Time: 0.194944
[03/25/2022-13:24:23] [V] [TRT] sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x128_stage3_warpsize1x2x1_g1_sptensor16x8x64_simple_t1r1s1 Tactic: 5043674678294309681
[03/25/2022-13:24:23] [V] [TRT] Tactic: 5043674678294309681 Time: 0.161792
[03/25/2022-13:24:23] [V] [TRT] sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_interior_nt_v1 Tactic: 5126565865931538390
[03/25/2022-13:24:23] [V] [TRT] Tactic: 5126565865931538390 Time: 0.159744
[03/25/2022-13:24:23] [V] [TRT] sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x64x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 5204702486885981735
[03/25/2022-13:24:23] [V] [TRT] Tactic: 5204702486885981735 Time: 0.143104
[03/25/2022-13:24:23] [V] [TRT] sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: 5375256703210220108
[03/25/2022-13:24:23] [V] [TRT] Tactic: 5375256703210220108 Time: 0.159744
[03/25/2022-13:24:23] [V] [TRT] sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x128_stage3_warpsize2x2x1_g1_sptensor16x8x64_simple_t1r1s1 Tactic: 5424258848951129084
[03/25/2022-13:24:23] [V] [TRT] Tactic: 5424258848951129084 Time: 0.13952
[03/25/2022-13:24:23] [V] [TRT] sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 5443897483205284103
[03/25/2022-13:24:23] [V] [TRT] Tactic: 5443897483205284103 Time: 0.173312
[03/25/2022-13:24:23] [V] [TRT] sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage4_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 5707566217891294846
[03/25/2022-13:24:23] [V] [TRT] Tactic: 5707566217891294846 Time: 0.128256
[03/25/2022-13:24:23] [V] [TRT] sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x4x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 5986622376339202983
[03/25/2022-13:24:23] [V] [TRT] Tactic: 5986622376339202983 Time: 0.143232
[03/25/2022-13:24:23] [V] [TRT] sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x128x64_stage3_warpsize4x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 6007888770437705057
[03/25/2022-13:24:23] [V] [TRT] Tactic: 6007888770437705057 Time: 0.14656
[03/25/2022-13:24:23] [V] [TRT] sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32_t1r1s1 Tactic: 6405251167055673379
[03/25/2022-13:24:23] [V] [TRT] Tactic: 6405251167055673379 Time: 0.14784
[03/25/2022-13:24:23] [V] [TRT] sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage4_warpsize4x1x1_g1_tensor16x8x32_t1r1s1 Tactic: 6433368103202497147
[03/25/2022-13:24:23] [V] [TRT] Tactic: 6433368103202497147 Time: 0.134656
[03/25/2022-13:24:23] [V] [TRT] sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 6441948709525127755
[03/25/2022-13:24:23] [V] [TRT] Tactic: 6441948709525127755 Time: 0.193536
[03/25/2022-13:24:23] [V] [TRT] sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x256x64_stage3_warpsize2x4x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 6443933097134654777
[03/25/2022-13:24:23] [V] [TRT] Tactic: 6443933097134654777 Time: 0.14592
[03/25/2022-13:24:23] [V] [TRT] sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 6457435868048963632
[03/25/2022-13:24:23] [V] [TRT] Tactic: 6457435868048963632 Time: 0.140416
[03/25/2022-13:24:23] [V] [TRT] sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x128_stage3_warpsize1x1x1_g1_sptensor16x8x64_t1r1s1 Tactic: 6510345569544721081
[03/25/2022-13:24:23] [V] [TRT] Tactic: 6510345569544721081 Time: 0.228992
[03/25/2022-13:24:23] [V] [TRT] sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 6793988781414507278
[03/25/2022-13:24:23] [V] [TRT] Tactic: 6793988781414507278 Time: 0.12416
[03/25/2022-13:24:23] [V] [TRT] sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 6880710371738875469
[03/25/2022-13:24:23] [V] [TRT] Tactic: 6880710371738875469 Time: 0.141568
[03/25/2022-13:24:23] [V] [TRT] sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 6925201228918187099
[03/25/2022-13:24:23] [V] [TRT] Tactic: 6925201228918187099 Time: 0.12928
[03/25/2022-13:24:23] [V] [TRT] sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 6991524515605108718
[03/25/2022-13:24:23] [V] [TRT] Tactic: 6991524515605108718 Time: 0.19264
[03/25/2022-13:24:23] [V] [TRT] sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x4x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 7245509442265271220
[03/25/2022-13:24:23] [V] [TRT] Tactic: 7245509442265271220 Time: 0.133248
[03/25/2022-13:24:23] [V] [TRT] sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x64x64_stage3_warpsize4x1x1_g1_tensor16x8x32_t1r1s1 Tactic: 7318929579222925725
[03/25/2022-13:24:23] [V] [TRT] Tactic: 7318929579222925725 Time: 0.1344
[03/25/2022-13:24:23] [V] [TRT] sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: 7731430299029542276
[03/25/2022-13:24:24] [V] [TRT] Tactic: 7731430299029542276 Time: 0.123904
[03/25/2022-13:24:24] [V] [TRT] sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage4_warpsize2x4x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: 7738495016763012180
[03/25/2022-13:24:24] [V] [TRT] Tactic: 7738495016763012180 Time: 0.13632
[03/25/2022-13:24:24] [V] [TRT] sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage4_warpsize4x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 8142283985160822229
[03/25/2022-13:24:24] [V] [TRT] Tactic: 8142283985160822229 Time: 0.140672
[03/25/2022-13:24:24] [V] [TRT] sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage4_warpsize2x4x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 8173975624668590862
[03/25/2022-13:24:24] [V] [TRT] Tactic: 8173975624668590862 Time: 0.141952
[03/25/2022-13:24:24] [V] [TRT] sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage4_warpsize4x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 8234775147403903473
[03/25/2022-13:24:24] [V] [TRT] Tactic: 8234775147403903473 Time: 0.147712
[03/25/2022-13:24:24] [V] [TRT] sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x128x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 8524082966802584889
[03/25/2022-13:24:24] [V] [TRT] Tactic: 8524082966802584889 Time: 0.127232
[03/25/2022-13:24:24] [V] [TRT] sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage1_warpsize4x1x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: 8684013308930763400
[03/25/2022-13:24:24] [V] [TRT] Tactic: 8684013308930763400 Time: 0.159616
[03/25/2022-13:24:24] [V] [TRT] sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x128x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 8765382722978397630
[03/25/2022-13:24:24] [V] [TRT] Tactic: 8765382722978397630 Time: 0.12672
[03/25/2022-13:24:24] [V] [TRT] sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 8843193587782643431
[03/25/2022-13:24:24] [V] [TRT] Tactic: 8843193587782643431 Time: 0.181504
[03/25/2022-13:24:24] [V] [TRT] sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x128_stage3_warpsize1x4x1_g1_sptensor16x8x64_simple_t1r1s1 Tactic: 8883810517410230831
[03/25/2022-13:24:24] [V] [TRT] Tactic: 8883810517410230831 Time: 0.162048
[03/25/2022-13:24:24] [V] [TRT] sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 8930797211803511337
[03/25/2022-13:24:24] [V] [TRT] Tactic: 8930797211803511337 Time: 0.179712
[03/25/2022-13:24:24] [V] [TRT] sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 8935070489925739043
[03/25/2022-13:24:24] [V] [TRT] Tactic: 8935070489925739043 Time: 0.13504
[03/25/2022-13:24:24] [V] [TRT] sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x16x128_stage3_warpsize1x1x1_g1_sptensor16x8x64_t1r1s1 Tactic: 9062173295331155069
[03/25/2022-13:24:24] [V] [TRT] Tactic: 9062173295331155069 Time: 0.235648
[03/25/2022-13:24:24] [V] [TRT] sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_singleBuffer_small_nt_v1 Tactic: -9118785798277698619
[03/25/2022-13:24:24] [V] [TRT] Tactic: -9118785798277698619 Time: 0.160512
[03/25/2022-13:24:24] [V] [TRT] sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -8985599729413291927
[03/25/2022-13:24:24] [V] [TRT] Tactic: -8985599729413291927 Time: 0.1536
[03/25/2022-13:24:24] [V] [TRT] sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x32x64_stage4_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -8972697510150675429
[03/25/2022-13:24:24] [V] [TRT] Tactic: -8972697510150675429 Time: 0.158976
[03/25/2022-13:24:24] [V] [TRT] sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r1s1 Tactic: -8943710627305202139
[03/25/2022-13:24:24] [V] [TRT] Tactic: -8943710627305202139 Time: 0.142976
[03/25/2022-13:24:24] [V] [TRT] sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: -8859846367886814331
[03/25/2022-13:24:24] [V] [TRT] Tactic: -8859846367886814331 Time: 0.163968
[03/25/2022-13:24:24] [V] [TRT] sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: -8638624340850784688
[03/25/2022-13:24:24] [V] [TRT] Tactic: -8638624340850784688 Time: 0.1984
[03/25/2022-13:24:24] [V] [TRT] sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage4_warpsize4x1x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: -8556775352640313933
[03/25/2022-13:24:24] [V] [TRT] Tactic: -8556775352640313933 Time: 0.127744
[03/25/2022-13:24:24] [V] [TRT] sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage1_warpsize2x1x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: -8382298409581540699
[03/25/2022-13:24:24] [V] [TRT] Tactic: -8382298409581540699 Time: 0.177536
[03/25/2022-13:24:24] [V] [TRT] sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage1_warpsize4x2x1_g1_tensor8x8x16_t1r1s1 Tactic: -8172318747337038866
[03/25/2022-13:24:24] [V] [TRT] Tactic: -8172318747337038866 Time: 0.190848
[03/25/2022-13:24:24] [V] [TRT] sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x256x64_stage3_warpsize2x4x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -8038164441468184723
[03/25/2022-13:24:24] [V] [TRT] Tactic: -8038164441468184723 Time: 0.1344
[03/25/2022-13:24:24] [V] [TRT] sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x96x64_stage5_warpsize2x1x1_g1_tensor16x8x32_t1r1s1 Tactic: -7844028314176826857
[03/25/2022-13:24:24] [V] [TRT] Tactic: -7844028314176826857 Time: 0.172416
[03/25/2022-13:24:24] [V] [TRT] sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage4_warpsize2x4x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -7364286662638617917
[03/25/2022-13:24:24] [V] [TRT] Tactic: -7364286662638617917 Time: 0.132096
[03/25/2022-13:24:24] [V] [TRT] sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage1_warpsize2x1x1_g1_tensor8x8x16_t1r1s1 Tactic: -7361755530333096258
[03/25/2022-13:24:24] [V] [TRT] Tactic: -7361755530333096258 Time: 0.178048
[03/25/2022-13:24:24] [V] [TRT] sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x128_stage3_warpsize1x4x1_g1_sptensor16x8x64_t1r1s1 Tactic: -7289760022626653388
[03/25/2022-13:24:24] [V] [TRT] Tactic: -7289760022626653388 Time: 0.162816
[03/25/2022-13:24:24] [V] [TRT] sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage1_warpsize2x4x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: -7106539943789766885
[03/25/2022-13:24:24] [V] [TRT] Tactic: -7106539943789766885 Time: 0.18368
[03/25/2022-13:24:24] [V] [TRT] sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage1_warpsize4x2x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: -6969478418607271266
[03/25/2022-13:24:24] [V] [TRT] Tactic: -6969478418607271266 Time: 0.184704
[03/25/2022-13:24:24] [V] [TRT] sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -6930438165437733000
[03/25/2022-13:24:24] [V] [TRT] Tactic: -6930438165437733000 Time: 0.217984
[03/25/2022-13:24:24] [V] [TRT] sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x64x64_stage3_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -6879607992933502380
[03/25/2022-13:24:24] [V] [TRT] Tactic: -6879607992933502380 Time: 0.130176
[03/25/2022-13:24:24] [V] [TRT] sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: -6527178416855951297
[03/25/2022-13:24:24] [V] [TRT] Tactic: -6527178416855951297 Time: 0.204288
[03/25/2022-13:24:24] [V] [TRT] sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_t1r1s1 Tactic: -6510232214299595844
[03/25/2022-13:24:24] [V] [TRT] Tactic: -6510232214299595844 Time: 0.210304
[03/25/2022-13:24:24] [V] [TRT] sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251 Set Tactic Name: ampere_int8_i8816cudnn_int8_128x128_ldg16_relu_small_nt_v1 Tactic: -6400348606759295499
[03/25/2022-13:24:24] [V] [TRT] Tactic: -6400348606759295499 Time: 0.152832
[03/25/2022-13:24:24] [V] [TRT] sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage1_warpsize2x2x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: -6346247605026339453
[03/25/2022-13:24:24] [V] [TRT] Tactic: -6346247605026339453 Time: 0.139648
[03/25/2022-13:24:24] [V] [TRT] sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x96x64_stage4_warpsize4x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -6232597026469067819
[03/25/2022-13:24:24] [V] [TRT] Tactic: -6232597026469067819 Time: 0.188672
[03/25/2022-13:24:24] [V] [TRT] sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x128_ldg16_relu_small_nt_v1 Tactic: -5980889159865208399
[03/25/2022-13:24:24] [V] [TRT] Tactic: -5980889159865208399 Time: 0.177792
[03/25/2022-13:24:24] [V] [TRT] sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_singleBuffer_medium_nt_v1 Tactic: -5766140806760372989
[03/25/2022-13:24:24] [V] [TRT] Tactic: -5766140806760372989 Time: 0.162816
[03/25/2022-13:24:24] [V] [TRT] sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r1s1 Tactic: -5697614955743334137
[03/25/2022-13:24:24] [V] [TRT] Tactic: -5697614955743334137 Time: 0.158976
[03/25/2022-13:24:24] [V] [TRT] sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -5671123121710113970
[03/25/2022-13:24:24] [V] [TRT] Tactic: -5671123121710113970 Time: 0.131968
[03/25/2022-13:24:24] [V] [TRT] sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x32x64_stage4_warpsize2x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -5615581362569252260
[03/25/2022-13:24:24] [V] [TRT] Tactic: -5615581362569252260 Time: 0.173312
[03/25/2022-13:24:24] [V] [TRT] sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x128x64_stage3_warpsize4x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -5562968047117507056
[03/25/2022-13:24:24] [V] [TRT] Tactic: -5562968047117507056 Time: 0.144384
[03/25/2022-13:24:24] [V] [TRT] sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251
Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -5516472881360101487 [03/25/2022-13:24:24] [V] [TRT] Tactic: -5516472881360101487 Time: 0.191616 [03/25/2022-13:24:24] [V] [TRT] sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r1s1 Tactic: -5311474420963248369 [03/25/2022-13:24:24] [V] [TRT] Tactic: -5311474420963248369 Time: 0.16512 [03/25/2022-13:24:24] [V] [TRT] sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: -5170003087447722174 [03/25/2022-13:24:24] [V] [TRT] Tactic: -5170003087447722174 Time: 0.193024 [03/25/2022-13:24:24] [V] [TRT] sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage6_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -4889586143772361690 [03/25/2022-13:24:24] [V] [TRT] Tactic: -4889586143772361690 Time: 0.130816 [03/25/2022-13:24:24] [V] [TRT] sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: -4849712423393454704 [03/25/2022-13:24:24] [V] [TRT] Tactic: -4849712423393454704 Time: 0.136448 [03/25/2022-13:24:24] [V] [TRT] sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x128_stage3_warpsize4x2x1_g1_sptensor16x8x64_simple_t1r1s1 Tactic: -4681913707320020520 [03/25/2022-13:24:24] [V] [TRT] Tactic: -4681913707320020520 Time: 0.147456 [03/25/2022-13:24:24] [V] [TRT] sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_medium_nt_v1 Tactic: -4516822589357530549 [03/25/2022-13:24:24] [V] [TRT] Tactic: -4516822589357530549 Time: 0.16448 [03/25/2022-13:24:24] [V] [TRT] sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -4455415102719506646 [03/25/2022-13:24:24] [V] [TRT] Tactic: -4455415102719506646 Time: 0.16 [03/25/2022-13:24:24] [V] [TRT] sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32_t1r1s1 Tactic: -4425346730823666456 
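
The kernel names being timed here encode their launch configuration directly: an architecture prefix (`sm75_`/`sm80_` xmma kernels, plus `ampere_*cudnn` kernels), the CTA tile size (`tilesize64x96x64`, an MxNxK tile), the pipeline depth (`stage3`), the warp arrangement per CTA (`warpsize2x2x1`), and the tensor-core MMA shape (`tensor16x8x32`; `sptensor16x8x64` marks the 2:4 structured-sparsity variants, and `t1r1s1` a 1x1-filter specialization). The helper below is a minimal sketch whose field meanings are inferred from this naming convention rather than from any documented API:

```python
import re

# Hedged helper: decode the tile/stage/warp fields embedded in the xmma
# kernel names timed in this log. Field meanings are inferred from the
# naming convention and may not cover every kernel family.
FIELDS = re.compile(
    r"tilesize(?P<m>\d+)x(?P<n>\d+)x(?P<k>\d+)"
    r"_stage(?P<stages>\d+)"
    r"_warpsize(?P<wm>\d+)x(?P<wn>\d+)x(?P<wk>\d+)"
)

def decode(kernel_name: str) -> dict:
    m = FIELDS.search(kernel_name)
    return {k: int(v) for k, v in m.groupdict().items()} if m else {}

# Abbreviated name for illustration:
print(decode("sm80_xmma_fprop_implicit_gemm_tilesize96x128x64"
             "_stage3_warpsize2x2x1_g1_tensor16x8x32"))
```
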
[03/25/2022-13:24:24] [V] [TRT] Tactic: -4425346730823666456 Time: 0.200832 [03/25/2022-13:24:24] [V] [TRT] sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage1_warpsize2x1x1_g1_tensor8x8x16_t1r1s1 Tactic: -4260476497340370474 [03/25/2022-13:24:24] [V] [TRT] Tactic: -4260476497340370474 Time: 0.18432 [03/25/2022-13:24:24] [V] [TRT] sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x2x1_g1_tensor16x8x32_t1r1s1 Tactic: -4182501876984672402 [03/25/2022-13:24:24] [V] [TRT] Tactic: -4182501876984672402 Time: 0.134144 [03/25/2022-13:24:24] [V] [TRT] sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x256x64_stage3_warpsize2x4x1_g1_tensor16x8x32_t1r1s1 Tactic: -4151617293257698859 [03/25/2022-13:24:24] [V] [TRT] Tactic: -4151617293257698859 Time: 0.148224 [03/25/2022-13:24:24] [V] [TRT] sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -3825889760337461729 [03/25/2022-13:24:24] [V] [TRT] Tactic: -3825889760337461729 Time: 0.199424 [03/25/2022-13:24:24] [V] [TRT] sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -3797022944823726673 [03/25/2022-13:24:24] [V] [TRT] Tactic: -3797022944823726673 Time: 0.15488 [03/25/2022-13:24:24] [V] [TRT] sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_t1r1s1 Tactic: -3613322253849278738 [03/25/2022-13:24:24] [V] [TRT] Tactic: -3613322253849278738 Time: 0.233728 [03/25/2022-13:24:24] [V] [TRT] sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x32x64_stage4_warpsize4x1x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: -3577322188448771475 [03/25/2022-13:24:24] [V] [TRT] Tactic: -3577322188448771475 Time: 0.168576 [03/25/2022-13:24:24] [V] [TRT] sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage1_warpsize2x4x1_g1_tensor8x8x16_t1r1s1 Tactic: -3531681826488401618 [03/25/2022-13:24:24] [V] [TRT] Tactic: -3531681826488401618 Time: 0.188288 [03/25/2022-13:24:24] [V] [TRT] 
sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x192x64_stage3_warpsize4x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -3288585994448820820 [03/25/2022-13:24:24] [V] [TRT] Tactic: -3288585994448820820 Time: 0.163328 [03/25/2022-13:24:24] [V] [TRT] sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x32x64_stage4_warpsize4x1x1_g1_tensor16x8x32_t1r1s1 Tactic: -2754311112012636251 [03/25/2022-13:24:24] [V] [TRT] Tactic: -2754311112012636251 Time: 0.175872 [03/25/2022-13:24:24] [V] [TRT] sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -2432868635536396215 [03/25/2022-13:24:24] [V] [TRT] Tactic: -2432868635536396215 Time: 0.148352 [03/25/2022-13:24:24] [V] [TRT] sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x96x64_stage5_warpsize2x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -2379804152300264660 [03/25/2022-13:24:24] [V] [TRT] Tactic: -2379804152300264660 Time: 0.164096 [03/25/2022-13:24:24] [V] [TRT] sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x128_stage3_warpsize2x2x1_g1_sptensor16x8x64_t1r1s1 Tactic: -2352253835013627337 [03/25/2022-13:24:24] [V] [TRT] Tactic: -2352253835013627337 Time: 0.13952 [03/25/2022-13:24:24] [V] [TRT] sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x96x64_stage4_warpsize4x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -2335587136911650799 [03/25/2022-13:24:24] [V] [TRT] Tactic: -2335587136911650799 Time: 0.173824 [03/25/2022-13:24:24] [V] [TRT] sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage4_warpsize2x4x1_g1_tensor16x8x32_t1r1s1 Tactic: -2315453944962430928 [03/25/2022-13:24:24] [V] [TRT] Tactic: -2315453944962430928 Time: 0.146432 [03/25/2022-13:24:24] [V] [TRT] sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x128_stage3_warpsize1x1x1_g1_sptensor16x8x64_simple_t1r1s1 Tactic: -2238364958919154661 [03/25/2022-13:24:24] [V] [TRT] Tactic: -2238364958919154661 Time: 0.228736 [03/25/2022-13:24:24] [V] [TRT] sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251 Set 
Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -1916483171117495388 [03/25/2022-13:24:24] [V] [TRT] Tactic: -1916483171117495388 Time: 0.1888 [03/25/2022-13:24:24] [V] [TRT] sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x96x64_stage4_warpsize4x2x1_g1_tensor16x8x32_t1r1s1 Tactic: -1549742793039499659 [03/25/2022-13:24:24] [V] [TRT] Tactic: -1549742793039499659 Time: 0.194816 [03/25/2022-13:24:24] [V] [TRT] sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: -1499578657823798783 [03/25/2022-13:24:24] [V] [TRT] Tactic: -1499578657823798783 Time: 0.158848 [03/25/2022-13:24:24] [V] [TRT] sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r1s1 Tactic: -1494157908358500249 [03/25/2022-13:24:24] [V] [TRT] Tactic: -1494157908358500249 Time: 0.170368 [03/25/2022-13:24:24] [V] [TRT] sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x4x1_g1_tensor16x8x32_t1r1s1 Tactic: -1328736756812546664 [03/25/2022-13:24:24] [V] [TRT] Tactic: -1328736756812546664 Time: 0.146048 [03/25/2022-13:24:24] [V] [TRT] sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x32x64_stage4_warpsize2x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -1006589727652607355 [03/25/2022-13:24:24] [V] [TRT] Tactic: -1006589727652607355 Time: 0.176384 [03/25/2022-13:24:24] [V] [TRT] sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: -713022856474991236 [03/25/2022-13:24:24] [V] [TRT] Tactic: -713022856474991236 Time: 0.22656 [03/25/2022-13:24:24] [V] [TRT] sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x64x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: -405554772060757402 [03/25/2022-13:24:24] [V] [TRT] Tactic: -405554772060757402 Time: 0.146816 [03/25/2022-13:24:24] [V] [TRT] sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251 Set Tactic Name: 
sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: -375949437730908730 [03/25/2022-13:24:24] [V] [TRT] Tactic: -375949437730908730 Time: 0.166144 [03/25/2022-13:24:24] [V] [TRT] sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x128_stage3_warpsize1x2x1_g1_sptensor16x8x64_t1r1s1 Tactic: -233227833606287806 [03/25/2022-13:24:24] [V] [TRT] Tactic: -233227833606287806 Time: 0.161536 [03/25/2022-13:24:24] [V] [TRT] sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -111878368089469751 [03/25/2022-13:24:24] [V] [TRT] Tactic: -111878368089469751 Time: 0.191232 [03/25/2022-13:24:24] [V] [TRT] sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x64x64_stage3_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -48936598874722005 [03/25/2022-13:24:24] [V] [TRT] Tactic: -48936598874722005 Time: 0.12032 [03/25/2022-13:24:24] [V] [TRT] sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x32x64_stage4_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -19707840769375107 [03/25/2022-13:24:24] [V] [TRT] Tactic: -19707840769375107 Time: 0.16576 [03/25/2022-13:24:24] [V] [TRT] Fastest Tactic: 2972948223367788520 Time: 0.115584 [03/25/2022-13:24:24] [V] [TRT] >>>>>>>>>>>>>>> Chose Runner Type: CaskConvolution Tactic: 2972948223367788520 [03/25/2022-13:24:24] [V] [TRT] =============== Computing costs for [03/25/2022-13:24:24] [V] [TRT] *************** Autotuning format combination: Int8(100352,784:4,28,1) -> Int8(25088,784:4,28,1) *************** [03/25/2022-13:24:24] [V] [TRT] --------------- Timing Runner: sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 (CudaDepthwiseConvolution) [03/25/2022-13:24:24] [V] [TRT] CudaDepthwiseConvolution has no valid tactics for this config, skipping [03/25/2022-13:24:24] [V] [TRT] --------------- Timing Runner: sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 (FusedConvActConvolution) [03/25/2022-13:24:24] [V] [TRT] FusedConvActConvolution has no valid tactics for this config, skipping [03/25/2022-13:24:24] [V] [TRT] --------------- Timing Runner: sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 (CaskConvolution) [03/25/2022-13:24:24] [V] [TRT] sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_medium_nn_v1 Tactic: 175853789719975416 [03/25/2022-13:24:24] [V] [TRT] Tactic: 175853789719975416 Time: 0.238336 [03/25/2022-13:24:24] [V] [TRT] sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + 
Conv_264 + Relu_266 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_xregs_medium_nn_v1 Tactic: 2171150287007712632 [03/25/2022-13:24:24] [V] [TRT] Tactic: 2171150287007712632 Time: 0.237824 [03/25/2022-13:24:24] [V] [TRT] sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x64_relu_small_nn_v1 Tactic: 2234457234705232274 [03/25/2022-13:24:24] [V] [TRT] Tactic: 2234457234705232274 Time: 0.20288 [03/25/2022-13:24:24] [V] [TRT] sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x64_relu_medium_nn_v1 Tactic: 5834048089706882838 [03/25/2022-13:24:24] [V] [TRT] Tactic: 5834048089706882838 Time: 0.203904 [03/25/2022-13:24:24] [V] [TRT] sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x128_relu_interior_nn_v1 Tactic: 6299962968199310600 [03/25/2022-13:24:24] [V] [TRT] Tactic: 6299962968199310600 Time: 0.206464 [03/25/2022-13:24:24] [V] [TRT] sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_xregs_interior_nn_v1 Tactic: 6341572697076960911 [03/25/2022-13:24:24] [V] [TRT] Tactic: 6341572697076960911 Time: 0.222848 [03/25/2022-13:24:24] [V] [TRT] sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_small_nn_v1 Tactic: -8626990807754934295 [03/25/2022-13:24:24] [V] [TRT] Tactic: -8626990807754934295 Time: 0.237696 [03/25/2022-13:24:24] [V] [TRT] sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x64_relu_interior_nn_v1 Tactic: -8498217049614706532 [03/25/2022-13:24:24] [V] [TRT] Tactic: -8498217049614706532 Time: 0.196096 [03/25/2022-13:24:24] [V] [TRT] sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_xregs_small_nn_v1 Tactic: -7303593854972602201 [03/25/2022-13:24:24] [V] [TRT] Tactic: -7303593854972602201 Time: 0.228096 [03/25/2022-13:24:24] [V] [TRT] sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x128_relu_medium_nn_v1 Tactic: -6585664687867083638 [03/25/2022-13:24:24] [V] [TRT] Tactic: -6585664687867083638 Time: 0.207744 [03/25/2022-13:24:24] [V] [TRT] sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_interior_nn_v1 Tactic: -3326139578711341011 [03/25/2022-13:24:24] [V] [TRT] Tactic: -3326139578711341011 Time: 0.22528 [03/25/2022-13:24:24] [V] [TRT] sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x128_relu_small_nn_v1 Tactic: -683636008127039856 [03/25/2022-13:24:24] [V] [TRT] Tactic: -683636008127039856 Time: 0.207488 [03/25/2022-13:24:24] [V] [TRT] Fastest Tactic: -8498217049614706532 Time: 0.196096 [03/25/2022-13:24:24] [V] [TRT] >>>>>>>>>>>>>>> Chose Runner Type: CaskConvolution Tactic: -8498217049614706532 [03/25/2022-13:24:24] [V] [TRT] *************** Autotuning format combination: Int8(100352,784:4,28,1) -> 
Int8(3136,784:32,28,1) *************** [03/25/2022-13:24:24] [V] [TRT] --------------- Timing Runner: sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 (CaskConvolution) [03/25/2022-13:24:24] [V] [TRT] sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_small_c32_nn_v1 Tactic: 1100922622480907544 [03/25/2022-13:24:24] [V] [TRT] Tactic: 1100922622480907544 Time: 0.237056 [03/25/2022-13:24:24] [V] [TRT] sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x128_relu_interior_c32_nn_v1 Tactic: 2855900226702061782 [03/25/2022-13:24:24] [V] [TRT] Tactic: 2855900226702061782 Time: 0.207104 [03/25/2022-13:24:24] [V] [TRT] sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x64_relu_medium_c32_nn_v1 Tactic: 3606311198834416176 [03/25/2022-13:24:24] [V] [TRT] Tactic: 3606311198834416176 Time: 0.203008 [03/25/2022-13:24:24] [V] [TRT] sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x128_relu_small_c32_nn_v1 Tactic: 4325765560739862899 [03/25/2022-13:24:24] [V] [TRT] Tactic: 4325765560739862899 Time: 0.208128 [03/25/2022-13:24:24] [V] [TRT] sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x64_relu_interior_c32_nn_v1 Tactic: 8803458114157674373 [03/25/2022-13:24:24] [V] [TRT] Tactic: 8803458114157674373 Time: 0.195584 [03/25/2022-13:24:24] [V] [TRT] sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_interior_c32_nn_v1 Tactic: -6934773036503365000 [03/25/2022-13:24:24] [V] [TRT] Tactic: -6934773036503365000 Time: 0.225024 [03/25/2022-13:24:24] [V] [TRT] sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_xregs_interior_c32_nn_v1 Tactic: -4431642509665791294 [03/25/2022-13:24:24] [V] [TRT] Tactic: -4431642509665791294 Time: 0.221312 [03/25/2022-13:24:24] [V] [TRT] sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x128_relu_medium_c32_nn_v1 Tactic: -4255737803793506479 [03/25/2022-13:24:24] [V] [TRT] Tactic: -4255737803793506479 Time: 0.208128 [03/25/2022-13:24:24] [V] [TRT] sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_xregs_small_c32_nn_v1 Tactic: -3958182351168863467 [03/25/2022-13:24:24] [V] [TRT] Tactic: -3958182351168863467 Time: 0.226432 [03/25/2022-13:24:24] [V] [TRT] sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_xregs_medium_c32_nn_v1 Tactic: -3111968753064955248 [03/25/2022-13:24:24] [V] [TRT] Tactic: -3111968753064955248 Time: 0.237184 [03/25/2022-13:24:24] [V] [TRT] sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_medium_c32_nn_v1 Tactic: -1492575840277333548 [03/25/2022-13:24:24] 
[V] [TRT] Tactic: -1492575840277333548 Time: 0.237056 [03/25/2022-13:24:24] [V] [TRT] sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x64_relu_small_c32_nn_v1 Tactic: -868495160148524802 [03/25/2022-13:24:24] [V] [TRT] Tactic: -868495160148524802 Time: 0.20224 [03/25/2022-13:24:24] [V] [TRT] Fastest Tactic: 8803458114157674373 Time: 0.195584 [03/25/2022-13:24:24] [V] [TRT] >>>>>>>>>>>>>>> Chose Runner Type: CaskConvolution Tactic: 8803458114157674373 [03/25/2022-13:24:24] [V] [TRT] *************** Autotuning format combination: Int8(12544,784:32,28,1) -> Int8(3136,784:32,28,1) *************** [03/25/2022-13:24:24] [V] [TRT] --------------- Timing Runner: sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 (CudaGroupConvolution) [03/25/2022-13:24:24] [V] [TRT] CudaGroupConvolution has no valid tactics for this config, skipping [03/25/2022-13:24:24] [V] [TRT] --------------- Timing Runner: sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 (CudaDepthwiseConvolution) [03/25/2022-13:24:24] [V] [TRT] CudaDepthwiseConvolution has no valid tactics for this config, skipping [03/25/2022-13:24:24] [V] [TRT] --------------- Timing Runner: sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 (FusedConvActConvolution) [03/25/2022-13:24:24] [V] [TRT] FusedConvActConvolution has no valid tactics for this config, skipping [03/25/2022-13:24:24] [V] [TRT] --------------- Timing Runner: sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 (CaskConvolution) [03/25/2022-13:24:24] [V] [TRT] sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage1_warpsize4x1x1_g1_tensor8x8x16_t1r1s1 Tactic: 68468667201176803 [03/25/2022-13:24:24] [V] [TRT] Tactic: 68468667201176803 Time: 0.083072 [03/25/2022-13:24:24] [V] [TRT] sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x192x64_stage3_warpsize4x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 125145153013230687 [03/25/2022-13:24:24] [V] [TRT] Tactic: 125145153013230687 Time: 0.09088 [03/25/2022-13:24:24] [V] [TRT] sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x128_ldg16_relu_interior_nt_v1 Tactic: 434957160407688216 [03/25/2022-13:24:24] [V] [TRT] Tactic: 434957160407688216 Time: 0.0928 [03/25/2022-13:24:24] [V] [TRT] sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage4_warpsize4x2x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: 805889586762897346 [03/25/2022-13:24:24] [V] [TRT] Tactic: 805889586762897346 Time: 0.066944 [03/25/2022-13:24:24] [V] [TRT] sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Set Tactic Name: 
sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x128_stage3_warpsize4x2x1_g1_sptensor16x8x64_t1r1s1 Tactic: 857001784974286465 [03/25/2022-13:24:24] [V] [TRT] Tactic: 857001784974286465 Time: 0.08704 [03/25/2022-13:24:24] [V] [TRT] sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 1214130898909872671 [03/25/2022-13:24:24] [V] [TRT] Tactic: 1214130898909872671 Time: 0.113792 [03/25/2022-13:24:24] [V] [TRT] sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage4_warpsize4x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 1278425129871930205 [03/25/2022-13:24:24] [V] [TRT] Tactic: 1278425129871930205 Time: 0.065536 [03/25/2022-13:24:24] [V] [TRT] sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 1583811548148740665 [03/25/2022-13:24:24] [V] [TRT] Tactic: 1583811548148740665 Time: 0.085248 [03/25/2022-13:24:24] [V] [TRT] sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 1701344857577810806 [03/25/2022-13:24:24] [V] [TRT] Tactic: 1701344857577810806 Time: 0.086912 [03/25/2022-13:24:24] [V] [TRT] sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x32x64_stage4_warpsize2x1x1_g1_tensor16x8x32_t1r1s1 Tactic: 1797231177354918208 [03/25/2022-13:24:24] [V] [TRT] Tactic: 1797231177354918208 Time: 0.087168 [03/25/2022-13:24:24] [V] [TRT] sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Set Tactic Name: sm80_xmma_conv_fprop_smallk_i8i8_i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_warptilesize16x32_stage4_warpsize2x4x1_r1s1_u1v1_hw0_c512_scalebias_relu Tactic: 1913026264725750683 [03/25/2022-13:24:24] [V] [TRT] Tactic: 1913026264725750683 Time: 0.059008 [03/25/2022-13:24:24] [V] [TRT] sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x192x64_stage3_warpsize4x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 2004812516525036381 [03/25/2022-13:24:24] [V] [TRT] Tactic: 2004812516525036381 Time: 0.076288 [03/25/2022-13:24:24] [V] [TRT] sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x128x64_stage3_warpsize4x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 2030033463723799063 
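
The `Autotuning format combination` records bracketing these timing runs describe tensor layouts as a stride tuple with the vectorized channel dimension annotated: `Int8(100352,784:4,28,1)` is the CHW4 int8 layout, `Int8(3136,784:32,28,1)` the CHW32 layout. A minimal sketch of how those strides arise, with channel counts (C=512 into conv1, C=128 out of it, consistent with this 28x28 bottleneck stage) read off the surrounding records:

```python
# Hedged sketch: reproduce the stride tuples TensorRT prints for its
# vectorized int8 layouts, e.g. Int8(100352,784:4,28,1). Channels are
# packed `vect` at a time (CHW4 / CHW32), so the layout is
# N x ceil(C/vect) x H x W x vect and strides are counted in packs.
def chw_vect_strides(C: int, H: int, W: int, vect: int):
    packs = -(-C // vect)  # ceil division
    return (packs * H * W, H * W, W, 1)  # strides for N, C(:vect), H, W

# These match the format combinations announced in this block:
assert chw_vect_strides(512, 28, 28, 4) == (100352, 784, 28, 1)
assert chw_vect_strides(128, 28, 28, 4) == (25088, 784, 28, 1)
assert chw_vect_strides(512, 28, 28, 32) == (12544, 784, 28, 1)
assert chw_vect_strides(128, 28, 28, 32) == (3136, 784, 28, 1)
```
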
[03/25/2022-13:24:24] [V] [TRT] Tactic: 2030033463723799063 Time: 0.065792 [03/25/2022-13:24:24] [V] [TRT] sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Set Tactic Name: ampere_int8_i8816cudnn_int8_128x128_ldg16_relu_interior_nt_v1 Tactic: 2346437292116182513 [03/25/2022-13:24:24] [V] [TRT] Tactic: 2346437292116182513 Time: 0.084352 [03/25/2022-13:24:24] [V] [TRT] sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 2376898825218218566 [03/25/2022-13:24:24] [V] [TRT] Tactic: 2376898825218218566 Time: 0.062336 [03/25/2022-13:24:24] [V] [TRT] sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_singleBuffer_interior_nt_v1 Tactic: 2522133112320625287 [03/25/2022-13:24:24] [V] [TRT] Tactic: 2522133112320625287 Time: 0.083456 [03/25/2022-13:24:24] [V] [TRT] sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage4_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 2548171972648455240 [03/25/2022-13:24:24] [V] [TRT] Tactic: 2548171972648455240 Time: 0.063232 [03/25/2022-13:24:24] [V] [TRT] sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage1_warpsize2x1x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: 2548946449357458230 [03/25/2022-13:24:24] [V] [TRT] Tactic: 2548946449357458230 Time: 0.094848 [03/25/2022-13:24:24] [V] [TRT] sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x128_stage3_warpsize1x1x1_g1_sptensor16x8x64_simple_t1r1s1_no_preds Tactic: 2570666021825229009 [03/25/2022-13:24:24] [V] [TRT] Tactic: 2570666021825229009 Time: 0.078208 [03/25/2022-13:24:24] [V] [TRT] sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x16x128_stage3_warpsize1x1x1_g1_sptensor16x8x64_simple_t1r1s1 Tactic: 2678520742286844763 [03/25/2022-13:24:24] [V] [TRT] Tactic: 2678520742286844763 Time: 0.112768 [03/25/2022-13:24:24] [V] [TRT] sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 2756291002030759362 [03/25/2022-13:24:24] [V] [TRT] Tactic: 2756291002030759362 Time: 0.067584 [03/25/2022-13:24:24] [V] [TRT] sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Set Tactic Name: 
sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x128x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 2972948223367788520 [03/25/2022-13:24:24] [V] [TRT] Tactic: 2972948223367788520 Time: 0.061824 [03/25/2022-13:24:24] [V] [TRT] sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Set Tactic Name: ampere_int8_i8816cudnn_int8_128x128_ldg16_relu_medium_nt_v1 Tactic: 2985940154541537814 [03/25/2022-13:24:24] [V] [TRT] Tactic: 2985940154541537814 Time: 0.083712 [03/25/2022-13:24:24] [V] [TRT] sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x96x64_stage5_warpsize2x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 3043273137345374664 [03/25/2022-13:24:24] [V] [TRT] Tactic: 3043273137345374664 Time: 0.115328 [03/25/2022-13:24:24] [V] [TRT] sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 3221677093659484230 [03/25/2022-13:24:24] [V] [TRT] Tactic: 3221677093659484230 Time: 0.1184 [03/25/2022-13:24:24] [V] [TRT] sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r1s1 Tactic: 3242897809704328258 [03/25/2022-13:24:24] [V] [TRT] Tactic: 3242897809704328258 Time: 0.082432 [03/25/2022-13:24:24] [V] [TRT] sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: 3312456766204252694 [03/25/2022-13:24:24] [V] [TRT] Tactic: 3312456766204252694 Time: 0.099456 [03/25/2022-13:24:24] [V] [TRT] sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 3538565962642681625 [03/25/2022-13:24:24] [V] [TRT] Tactic: 3538565962642681625 Time: 0.076544 [03/25/2022-13:24:24] [V] [TRT] sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage1_warpsize2x2x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: 3541919052468401776 [03/25/2022-13:24:24] [V] [TRT] Tactic: 3541919052468401776 Time: 0.076928 [03/25/2022-13:24:24] [V] [TRT] sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 3593397928177382100 [03/25/2022-13:24:24] [V] [TRT] Tactic: 3593397928177382100 Time: 0.113664 [03/25/2022-13:24:24] [V] [TRT] 
sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x64x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 3670282018109435863 [03/25/2022-13:24:25] [V] [TRT] Tactic: 3670282018109435863 Time: 0.066048 [03/25/2022-13:24:25] [V] [TRT] sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 3671413346254027573 [03/25/2022-13:24:25] [V] [TRT] Tactic: 3671413346254027573 Time: 0.081792 [03/25/2022-13:24:25] [V] [TRT] sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x128_ldg16_relu_medium_nt_v1 Tactic: 3899284354987683408 [03/25/2022-13:24:25] [V] [TRT] Tactic: 3899284354987683408 Time: 0.091392 [03/25/2022-13:24:25] [V] [TRT] sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: 3927509214678622419 [03/25/2022-13:24:25] [V] [TRT] Tactic: 3927509214678622419 Time: 0.080384 [03/25/2022-13:24:25] [V] [TRT] sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 4112572034735311841 [03/25/2022-13:24:25] [V] [TRT] Tactic: 4112572034735311841 Time: 0.135168 [03/25/2022-13:24:25] [V] [TRT] sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 4239974928951431644 [03/25/2022-13:24:25] [V] [TRT] Tactic: 4239974928951431644 Time: 0.081408 [03/25/2022-13:24:25] [V] [TRT] sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage6_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 4610760414797216079 [03/25/2022-13:24:25] [V] [TRT] Tactic: 4610760414797216079 Time: 0.069376 [03/25/2022-13:24:25] [V] [TRT] sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_small_nt_v1 Tactic: 4717285412741024953 [03/25/2022-13:24:25] [V] [TRT] Tactic: 4717285412741024953 Time: 0.084992 [03/25/2022-13:24:25] [V] [TRT] sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 4796956614760326119 [03/25/2022-13:24:25] [V] [TRT] Tactic: 4796956614760326119 Time: 0.080512 
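
This exhaustive per-tactic timing is what makes engine building slow, and it is repeated on every build unless the measurements are persisted. A minimal sketch of doing that through the TensorRT 8.x Python API (file names are placeholders; the network is assumed to come from a Q/DQ ONNX model like the one being built here, so no calibrator is configured):

```python
import tensorrt as trt

# Hedged sketch (TensorRT 8.x Python API): persist tactic timings in a
# timing cache so later builds skip the autotuning pass logged above.
logger = trt.Logger(trt.Logger.VERBOSE)
builder = trt.Builder(logger)
network = builder.create_network(
    1 << int(trt.NetworkDefinitionCreationFlag.EXPLICIT_BATCH))
parser = trt.OnnxParser(network, logger)
with open("model_qdq.onnx", "rb") as f:   # Q/DQ (QAT) model: no calibrator
    parser.parse(f.read())

config = builder.create_builder_config()
config.set_flag(trt.BuilderFlag.INT8)
config.set_flag(trt.BuilderFlag.SPARSE_WEIGHTS)

cache = config.create_timing_cache(b"")   # empty cache on the first build
config.set_timing_cache(cache, ignore_mismatch=False)
engine = builder.build_serialized_network(network, config)
with open("timing.cache", "wb") as f:
    f.write(cache.serialize())
```

On a subsequent build, loading the serialized bytes back through `config.create_timing_cache(...)` lets the builder skip most of the timing runs shown in this log.
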
[03/25/2022-13:24:25] [V] [TRT] sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x128_stage3_warpsize2x2x1_g1_sptensor16x8x64_t1r1s1_no_preds Tactic: 4909502217677847353 [03/25/2022-13:24:25] [V] [TRT] Tactic: 4909502217677847353 Time: 0.06208 [03/25/2022-13:24:25] [V] [TRT] sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 4919361344804309192 [03/25/2022-13:24:25] [V] [TRT] Tactic: 4919361344804309192 Time: 0.103168 [03/25/2022-13:24:25] [V] [TRT] sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x128_stage3_warpsize1x2x1_g1_sptensor16x8x64_simple_t1r1s1 Tactic: 5043674678294309681 [03/25/2022-13:24:25] [V] [TRT] Tactic: 5043674678294309681 Time: 0.062976 [03/25/2022-13:24:25] [V] [TRT] sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_interior_nt_v1 Tactic: 5126565865931538390 [03/25/2022-13:24:25] [V] [TRT] Tactic: 5126565865931538390 Time: 0.085248 [03/25/2022-13:24:25] [V] [TRT] sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x64x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 5204702486885981735 [03/25/2022-13:24:25] [V] [TRT] Tactic: 5204702486885981735 Time: 0.067456 [03/25/2022-13:24:25] [V] [TRT] sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: 5375256703210220108 [03/25/2022-13:24:25] [V] [TRT] Tactic: 5375256703210220108 Time: 0.062848 [03/25/2022-13:24:25] [V] [TRT] sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x128_stage3_warpsize2x2x1_g1_sptensor16x8x64_simple_t1r1s1 Tactic: 5424258848951129084 [03/25/2022-13:24:25] [V] [TRT] Tactic: 5424258848951129084 Time: 0.062464 [03/25/2022-13:24:25] [V] [TRT] sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 5443897483205284103 [03/25/2022-13:24:25] [V] [TRT] Tactic: 5443897483205284103 Time: 0.085376 [03/25/2022-13:24:25] [V] [TRT] sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Set Tactic Name: 
sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage4_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 5707566217891294846 [03/25/2022-13:24:25] [V] [TRT] Tactic: 5707566217891294846 Time: 0.066304 [03/25/2022-13:24:25] [V] [TRT] sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x4x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 5986622376339202983 [03/25/2022-13:24:25] [V] [TRT] Tactic: 5986622376339202983 Time: 0.082048 [03/25/2022-13:24:25] [V] [TRT] sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x128x64_stage3_warpsize4x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 6007888770437705057 [03/25/2022-13:24:25] [V] [TRT] Tactic: 6007888770437705057 Time: 0.070912 [03/25/2022-13:24:25] [V] [TRT] sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32_t1r1s1 Tactic: 6405251167055673379 [03/25/2022-13:24:25] [V] [TRT] Tactic: 6405251167055673379 Time: 0.082176 [03/25/2022-13:24:25] [V] [TRT] sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage4_warpsize4x1x1_g1_tensor16x8x32_t1r1s1 Tactic: 6433368103202497147 [03/25/2022-13:24:25] [V] [TRT] Tactic: 6433368103202497147 Time: 0.06848 [03/25/2022-13:24:25] [V] [TRT] sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 6441948709525127755 [03/25/2022-13:24:25] [V] [TRT] Tactic: 6441948709525127755 Time: 0.114944 [03/25/2022-13:24:25] [V] [TRT] sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x256x64_stage3_warpsize2x4x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 6443933097134654777 [03/25/2022-13:24:25] [V] [TRT] Tactic: 6443933097134654777 Time: 0.0992 [03/25/2022-13:24:25] [V] [TRT] sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 6457435868048963632 [03/25/2022-13:24:25] [V] [TRT] Tactic: 6457435868048963632 Time: 0.06976 [03/25/2022-13:24:25] [V] [TRT] sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x128_stage3_warpsize1x1x1_g1_sptensor16x8x64_t1r1s1 Tactic: 6510345569544721081 [03/25/2022-13:24:25] [V] [TRT] Tactic: 
6510345569544721081 Time: 0.079232 [03/25/2022-13:24:25] [V] [TRT] sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 6793988781414507278 [03/25/2022-13:24:25] [V] [TRT] Tactic: 6793988781414507278 Time: 0.06528 [03/25/2022-13:24:25] [V] [TRT] sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 6880710371738875469 [03/25/2022-13:24:25] [V] [TRT] Tactic: 6880710371738875469 Time: 0.076288 [03/25/2022-13:24:25] [V] [TRT] sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 6925201228918187099 [03/25/2022-13:24:25] [V] [TRT] Tactic: 6925201228918187099 Time: 0.066688 [03/25/2022-13:24:25] [V] [TRT] sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 6991524515605108718 [03/25/2022-13:24:25] [V] [TRT] Tactic: 6991524515605108718 Time: 0.087936 [03/25/2022-13:24:25] [V] [TRT] sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x4x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 7245509442265271220 [03/25/2022-13:24:25] [V] [TRT] Tactic: 7245509442265271220 Time: 0.082176 [03/25/2022-13:24:25] [V] [TRT] sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x64x64_stage3_warpsize4x1x1_g1_tensor16x8x32_t1r1s1 Tactic: 7318929579222925725 [03/25/2022-13:24:25] [V] [TRT] Tactic: 7318929579222925725 Time: 0.066816 [03/25/2022-13:24:25] [V] [TRT] sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: 7731430299029542276 [03/25/2022-13:24:25] [V] [TRT] Tactic: 7731430299029542276 Time: 0.063616 [03/25/2022-13:24:25] [V] [TRT] sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage4_warpsize2x4x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: 7738495016763012180 [03/25/2022-13:24:25] [V] [TRT] Tactic: 7738495016763012180 Time: 0.091392 [03/25/2022-13:24:25] [V] [TRT] sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Set Tactic Name: 
sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x128_stage3_warpsize1x4x1_g1_sptensor16x8x64_t1r1s1_no_preds Tactic: 7886967395128926382 [03/25/2022-13:24:25] [V] [TRT] Tactic: 7886967395128926382 Time: 0.06336 [03/25/2022-13:24:25] [V] [TRT] sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage4_warpsize4x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 8142283985160822229 [03/25/2022-13:24:25] [V] [TRT] Tactic: 8142283985160822229 Time: 0.070528 [03/25/2022-13:24:25] [V] [TRT] sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage4_warpsize2x4x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 8173975624668590862 [03/25/2022-13:24:25] [V] [TRT] Tactic: 8173975624668590862 Time: 0.099456 [03/25/2022-13:24:25] [V] [TRT] sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage4_warpsize4x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 8234775147403903473 [03/25/2022-13:24:25] [V] [TRT] Tactic: 8234775147403903473 Time: 0.072064 [03/25/2022-13:24:25] [V] [TRT] sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x128x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 8524082966802584889 [03/25/2022-13:24:25] [V] [TRT] Tactic: 8524082966802584889 Time: 0.06272 [03/25/2022-13:24:25] [V] [TRT] sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage1_warpsize4x1x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: 8684013308930763400 [03/25/2022-13:24:25] [V] [TRT] Tactic: 8684013308930763400 Time: 0.08128 [03/25/2022-13:24:25] [V] [TRT] sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x128x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 8765382722978397630 [03/25/2022-13:24:25] [V] [TRT] Tactic: 8765382722978397630 Time: 0.06336 [03/25/2022-13:24:25] [V] [TRT] sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 8843193587782643431 [03/25/2022-13:24:25] [V] [TRT] Tactic: 8843193587782643431 Time: 0.095872 [03/25/2022-13:24:25] [V] [TRT] sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x128_stage3_warpsize1x4x1_g1_sptensor16x8x64_simple_t1r1s1 Tactic: 8883810517410230831 
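
The kernels in this sweep whose names contain _sparse_conv_ and sptensor16x8x64 are Ampere 2:4 structured-sparsity tactics; the builder only times them, alongside the dense implicit-GEMM kernels, when sparse weights are enabled at build time. A minimal sketch of a build configuration that produces this kind of INT8 + sparsity sweep through the TensorRT 8.x Python API, assuming the standard tensorrt package and an illustrative model path, not the exact invocation behind this log:

    import tensorrt as trt

    logger = trt.Logger(trt.Logger.VERBOSE)   # VERBOSE emits per-tactic timings like those above
    builder = trt.Builder(logger)
    network = builder.create_network(
        1 << int(trt.NetworkDefinitionCreationFlag.EXPLICIT_BATCH))
    parser = trt.OnnxParser(network, logger)
    with open("model.onnx", "rb") as f:       # illustrative path
        assert parser.parse(f.read())

    config = builder.create_builder_config()
    config.set_flag(trt.BuilderFlag.INT8)            # time i8i8 tactics (Q/DQ networks need no calibrator)
    config.set_flag(trt.BuilderFlag.SPARSE_WEIGHTS)  # admit the ..._sparse_conv_... kernels to the sweep
    engine_bytes = builder.build_serialized_network(network, config)

With verbose logging enabled, a build configured this way prints the same per-tactic Set Tactic Name / Time pairs seen throughout this log.
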
[03/25/2022-13:24:25] [V] [TRT] Tactic: 8883810517410230831 Time: 0.063744 [03/25/2022-13:24:25] [V] [TRT] sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 8930797211803511337 [03/25/2022-13:24:25] [V] [TRT] Tactic: 8930797211803511337 Time: 0.085888 [03/25/2022-13:24:25] [V] [TRT] sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 8935070489925739043 [03/25/2022-13:24:25] [V] [TRT] Tactic: 8935070489925739043 Time: 0.0736 [03/25/2022-13:24:25] [V] [TRT] sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x16x128_stage3_warpsize1x1x1_g1_sptensor16x8x64_t1r1s1 Tactic: 9062173295331155069 [03/25/2022-13:24:25] [V] [TRT] Tactic: 9062173295331155069 Time: 0.113408 [03/25/2022-13:24:25] [V] [TRT] sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_singleBuffer_small_nt_v1 Tactic: -9118785798277698619 [03/25/2022-13:24:25] [V] [TRT] Tactic: -9118785798277698619 Time: 0.084352 [03/25/2022-13:24:25] [V] [TRT] sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -8985599729413291927 [03/25/2022-13:24:25] [V] [TRT] Tactic: -8985599729413291927 Time: 0.06144 [03/25/2022-13:24:25] [V] [TRT] sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x32x64_stage4_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -8972697510150675429 [03/25/2022-13:24:25] [V] [TRT] Tactic: -8972697510150675429 Time: 0.073856 [03/25/2022-13:24:25] [V] [TRT] sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r1s1 Tactic: -8943710627305202139 [03/25/2022-13:24:25] [V] [TRT] Tactic: -8943710627305202139 Time: 0.080128 [03/25/2022-13:24:25] [V] [TRT] sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: -8859846367886814331 [03/25/2022-13:24:25] [V] [TRT] Tactic: -8859846367886814331 Time: 0.090624 [03/25/2022-13:24:25] [V] [TRT] sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Set Tactic Name: 
sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: -8638624340850784688 [03/25/2022-13:24:25] [V] [TRT] Tactic: -8638624340850784688 Time: 0.100992 [03/25/2022-13:24:25] [V] [TRT] sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage4_warpsize4x1x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: -8556775352640313933 [03/25/2022-13:24:25] [V] [TRT] Tactic: -8556775352640313933 Time: 0.06528 [03/25/2022-13:24:25] [V] [TRT] sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage1_warpsize2x1x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: -8382298409581540699 [03/25/2022-13:24:25] [V] [TRT] Tactic: -8382298409581540699 Time: 0.112896 [03/25/2022-13:24:25] [V] [TRT] sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage1_warpsize4x2x1_g1_tensor8x8x16_t1r1s1 Tactic: -8172318747337038866 [03/25/2022-13:24:25] [V] [TRT] Tactic: -8172318747337038866 Time: 0.087424 [03/25/2022-13:24:25] [V] [TRT] sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x256x64_stage3_warpsize2x4x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -8038164441468184723 [03/25/2022-13:24:25] [V] [TRT] Tactic: -8038164441468184723 Time: 0.088192 [03/25/2022-13:24:25] [V] [TRT] sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x96x64_stage5_warpsize2x1x1_g1_tensor16x8x32_t1r1s1 Tactic: -7844028314176826857 [03/25/2022-13:24:25] [V] [TRT] Tactic: -7844028314176826857 Time: 0.115328 [03/25/2022-13:24:25] [V] [TRT] sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x128_stage3_warpsize2x2x1_g1_sptensor16x8x64_simple_t1r1s1_no_preds Tactic: -7674507941016740570 [03/25/2022-13:24:25] [V] [TRT] Tactic: -7674507941016740570 Time: 0.061824 [03/25/2022-13:24:25] [V] [TRT] sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage4_warpsize2x4x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -7364286662638617917 [03/25/2022-13:24:25] [V] [TRT] Tactic: -7364286662638617917 Time: 0.088192 [03/25/2022-13:24:25] [V] [TRT] sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage1_warpsize2x1x1_g1_tensor8x8x16_t1r1s1 Tactic: 
-7361755530333096258 [03/25/2022-13:24:25] [V] [TRT] Tactic: -7361755530333096258 Time: 0.09728 [03/25/2022-13:24:25] [V] [TRT] sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x128_stage3_warpsize1x4x1_g1_sptensor16x8x64_t1r1s1 Tactic: -7289760022626653388 [03/25/2022-13:24:25] [V] [TRT] Tactic: -7289760022626653388 Time: 0.063872 [03/25/2022-13:24:25] [V] [TRT] sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage1_warpsize2x4x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: -7106539943789766885 [03/25/2022-13:24:25] [V] [TRT] Tactic: -7106539943789766885 Time: 0.146816 [03/25/2022-13:24:25] [V] [TRT] sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage1_warpsize4x2x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: -6969478418607271266 [03/25/2022-13:24:25] [V] [TRT] Tactic: -6969478418607271266 Time: 0.0864 [03/25/2022-13:24:25] [V] [TRT] sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -6930438165437733000 [03/25/2022-13:24:25] [V] [TRT] Tactic: -6930438165437733000 Time: 0.13696 [03/25/2022-13:24:25] [V] [TRT] sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x64x64_stage3_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -6879607992933502380 [03/25/2022-13:24:25] [V] [TRT] Tactic: -6879607992933502380 Time: 0.065536 [03/25/2022-13:24:25] [V] [TRT] sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x128_stage3_warpsize1x2x1_g1_sptensor16x8x64_simple_t1r1s1_no_preds Tactic: -6839669803644810934 [03/25/2022-13:24:25] [V] [TRT] Tactic: -6839669803644810934 Time: 0.062592 [03/25/2022-13:24:25] [V] [TRT] sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x128_stage3_warpsize1x2x1_g1_sptensor16x8x64_t1r1s1_no_preds Tactic: -6812830108414456369 [03/25/2022-13:24:25] [V] [TRT] Tactic: -6812830108414456369 Time: 0.062592 [03/25/2022-13:24:25] [V] [TRT] sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: -6527178416855951297 [03/25/2022-13:24:25] [V] [TRT] Tactic: -6527178416855951297 Time: 0.105216 [03/25/2022-13:24:25] [V] [TRT] sections.1.1.conv1.module.weight + 
QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_t1r1s1 Tactic: -6510232214299595844 [03/25/2022-13:24:25] [V] [TRT] Tactic: -6510232214299595844 Time: 0.102272 [03/25/2022-13:24:25] [V] [TRT] sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Set Tactic Name: ampere_int8_i8816cudnn_int8_128x128_ldg16_relu_small_nt_v1 Tactic: -6400348606759295499 [03/25/2022-13:24:25] [V] [TRT] Tactic: -6400348606759295499 Time: 0.081024 [03/25/2022-13:24:25] [V] [TRT] sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage1_warpsize2x2x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: -6346247605026339453 [03/25/2022-13:24:25] [V] [TRT] Tactic: -6346247605026339453 Time: 0.078976 [03/25/2022-13:24:25] [V] [TRT] sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x96x64_stage4_warpsize4x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -6232597026469067819 [03/25/2022-13:24:25] [V] [TRT] Tactic: -6232597026469067819 Time: 0.09792 [03/25/2022-13:24:25] [V] [TRT] sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x128_ldg16_relu_small_nt_v1 Tactic: -5980889159865208399 [03/25/2022-13:24:25] [V] [TRT] Tactic: -5980889159865208399 Time: 0.088448 [03/25/2022-13:24:25] [V] [TRT] sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_singleBuffer_medium_nt_v1 Tactic: -5766140806760372989 [03/25/2022-13:24:25] [V] [TRT] Tactic: -5766140806760372989 Time: 0.085888 [03/25/2022-13:24:25] [V] [TRT] sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r1s1 Tactic: -5697614955743334137 [03/25/2022-13:24:25] [V] [TRT] Tactic: -5697614955743334137 Time: 0.07872 [03/25/2022-13:24:25] [V] [TRT] sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -5671123121710113970 [03/25/2022-13:24:25] [V] [TRT] Tactic: -5671123121710113970 Time: 0.083456 [03/25/2022-13:24:25] [V] [TRT] sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x32x64_stage4_warpsize2x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -5615581362569252260 [03/25/2022-13:24:25] [V] [TRT] Tactic: -5615581362569252260 Time: 0.086912 [03/25/2022-13:24:25] [V] [TRT] sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Set Tactic Name: 
sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x128x64_stage3_warpsize4x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -5562968047117507056 [03/25/2022-13:24:25] [V] [TRT] Tactic: -5562968047117507056 Time: 0.070528 [03/25/2022-13:24:25] [V] [TRT] sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -5516472881360101487 [03/25/2022-13:24:25] [V] [TRT] Tactic: -5516472881360101487 Time: 0.100608 [03/25/2022-13:24:25] [V] [TRT] sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r1s1 Tactic: -5311474420963248369 [03/25/2022-13:24:25] [V] [TRT] Tactic: -5311474420963248369 Time: 0.100736 [03/25/2022-13:24:25] [V] [TRT] sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: -5170003087447722174 [03/25/2022-13:24:25] [V] [TRT] Tactic: -5170003087447722174 Time: 0.115072 [03/25/2022-13:24:25] [V] [TRT] sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage6_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -4889586143772361690 [03/25/2022-13:24:25] [V] [TRT] Tactic: -4889586143772361690 Time: 0.067072 [03/25/2022-13:24:25] [V] [TRT] sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x128_stage3_warpsize1x4x1_g1_sptensor16x8x64_simple_t1r1s1_no_preds Tactic: -4889498558023475527 [03/25/2022-13:24:25] [V] [TRT] Tactic: -4889498558023475527 Time: 0.063232 [03/25/2022-13:24:25] [V] [TRT] sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: -4849712423393454704 [03/25/2022-13:24:25] [V] [TRT] Tactic: -4849712423393454704 Time: 0.066944 [03/25/2022-13:24:25] [V] [TRT] sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x128_stage3_warpsize4x2x1_g1_sptensor16x8x64_simple_t1r1s1 Tactic: -4681913707320020520 [03/25/2022-13:24:25] [V] [TRT] Tactic: -4681913707320020520 Time: 0.0864 [03/25/2022-13:24:25] [V] [TRT] sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_medium_nt_v1 Tactic: -4516822589357530549 [03/25/2022-13:24:25] [V] [TRT] Tactic: -4516822589357530549 Time: 0.088064 
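
Every candidate in a sweep like this one is reported as a "Set Tactic Name: ... Tactic: <id>" entry followed by a "Tactic: <id> Time: <t>" measurement (milliseconds), and the sweep closes with a "Fastest Tactic" summary and a "Chose Runner Type" line, as below. That selection can be reproduced from the log text itself; a small sketch, assuming only the line layout shown here and an illustrative log path:

    import re

    PAIR = re.compile(r"Tactic: (-?\d+) Time: ([\d.]+)")

    def fastest_tactic(sweep_text):
        # One (id, ms) pair per timed tactic. The trailing "Fastest Tactic"
        # summary also matches the pattern, which is harmless: it is the
        # minimum anyway.
        timings = [(int(i), float(ms)) for i, ms in PAIR.findall(sweep_text)]
        return min(timings, key=lambda t: t[1])

    with open("trtexec_verbose.log") as f:    # illustrative path
        print(fastest_tactic(f.read()))

The sketch assumes the text covers a single layer's sweep; a fuller parser would first split the log on the "Timing Runner:" headers so each layer and format combination is minimized separately.
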
[03/25/2022-13:24:25] [V] [TRT] sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -4455415102719506646 [03/25/2022-13:24:25] [V] [TRT] Tactic: -4455415102719506646 Time: 0.078592 [03/25/2022-13:24:25] [V] [TRT] sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32_t1r1s1 Tactic: -4425346730823666456 [03/25/2022-13:24:25] [V] [TRT] Tactic: -4425346730823666456 Time: 0.108032 [03/25/2022-13:24:25] [V] [TRT] sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage1_warpsize2x1x1_g1_tensor8x8x16_t1r1s1 Tactic: -4260476497340370474 [03/25/2022-13:24:25] [V] [TRT] Tactic: -4260476497340370474 Time: 0.11392 [03/25/2022-13:24:25] [V] [TRT] sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x2x1_g1_tensor16x8x32_t1r1s1 Tactic: -4182501876984672402 [03/25/2022-13:24:25] [V] [TRT] Tactic: -4182501876984672402 Time: 0.08448 [03/25/2022-13:24:25] [V] [TRT] sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x256x64_stage3_warpsize2x4x1_g1_tensor16x8x32_t1r1s1 Tactic: -4151617293257698859 [03/25/2022-13:24:25] [V] [TRT] Tactic: -4151617293257698859 Time: 0.09984 [03/25/2022-13:24:25] [V] [TRT] sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -3825889760337461729 [03/25/2022-13:24:25] [V] [TRT] Tactic: -3825889760337461729 Time: 0.100352 [03/25/2022-13:24:25] [V] [TRT] sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -3797022944823726673 [03/25/2022-13:24:25] [V] [TRT] Tactic: -3797022944823726673 Time: 0.076288 [03/25/2022-13:24:25] [V] [TRT] sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_t1r1s1 Tactic: -3613322253849278738 [03/25/2022-13:24:25] [V] [TRT] Tactic: -3613322253849278738 Time: 0.139648 [03/25/2022-13:24:25] [V] [TRT] sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Set Tactic Name: 
sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x32x64_stage4_warpsize4x1x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: -3577322188448771475 [03/25/2022-13:24:25] [V] [TRT] Tactic: -3577322188448771475 Time: 0.077184 [03/25/2022-13:24:25] [V] [TRT] sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage1_warpsize2x4x1_g1_tensor8x8x16_t1r1s1 Tactic: -3531681826488401618 [03/25/2022-13:24:25] [V] [TRT] Tactic: -3531681826488401618 Time: 0.149888 [03/25/2022-13:24:25] [V] [TRT] sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x16x128_stage3_warpsize1x1x1_g1_sptensor16x8x64_simple_t1r1s1_no_preds Tactic: -3305554949874552860 [03/25/2022-13:24:25] [V] [TRT] Tactic: -3305554949874552860 Time: 0.112768 [03/25/2022-13:24:25] [V] [TRT] sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x192x64_stage3_warpsize4x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -3288585994448820820 [03/25/2022-13:24:25] [V] [TRT] Tactic: -3288585994448820820 Time: 0.09024 [03/25/2022-13:24:25] [V] [TRT] sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x32x64_stage4_warpsize4x1x1_g1_tensor16x8x32_t1r1s1 Tactic: -2754311112012636251 [03/25/2022-13:24:25] [V] [TRT] Tactic: -2754311112012636251 Time: 0.078848 [03/25/2022-13:24:25] [V] [TRT] sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -2432868635536396215 [03/25/2022-13:24:25] [V] [TRT] Tactic: -2432868635536396215 Time: 0.085248 [03/25/2022-13:24:25] [V] [TRT] sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x96x64_stage5_warpsize2x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -2379804152300264660 [03/25/2022-13:24:25] [V] [TRT] Tactic: -2379804152300264660 Time: 0.114176 [03/25/2022-13:24:25] [V] [TRT] sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x128_stage3_warpsize2x2x1_g1_sptensor16x8x64_t1r1s1 Tactic: -2352253835013627337 [03/25/2022-13:24:25] [V] [TRT] Tactic: -2352253835013627337 Time: 0.062464 [03/25/2022-13:24:25] [V] [TRT] sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x96x64_stage4_warpsize4x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 
-2335587136911650799 [03/25/2022-13:24:25] [V] [TRT] Tactic: -2335587136911650799 Time: 0.086784 [03/25/2022-13:24:25] [V] [TRT] sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage4_warpsize2x4x1_g1_tensor16x8x32_t1r1s1 Tactic: -2315453944962430928 [03/25/2022-13:24:25] [V] [TRT] Tactic: -2315453944962430928 Time: 0.101632 [03/25/2022-13:24:25] [V] [TRT] sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x128_stage3_warpsize1x1x1_g1_sptensor16x8x64_simple_t1r1s1 Tactic: -2238364958919154661 [03/25/2022-13:24:25] [V] [TRT] Tactic: -2238364958919154661 Time: 0.078592 [03/25/2022-13:24:25] [V] [TRT] sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -1916483171117495388 [03/25/2022-13:24:25] [V] [TRT] Tactic: -1916483171117495388 Time: 0.08832 [03/25/2022-13:24:25] [V] [TRT] sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x16x128_stage3_warpsize1x1x1_g1_sptensor16x8x64_t1r1s1_no_preds Tactic: -1740762957710554518 [03/25/2022-13:24:25] [V] [TRT] Tactic: -1740762957710554518 Time: 0.112768 [03/25/2022-13:24:25] [V] [TRT] sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x96x64_stage4_warpsize4x2x1_g1_tensor16x8x32_t1r1s1 Tactic: -1549742793039499659 [03/25/2022-13:24:25] [V] [TRT] Tactic: -1549742793039499659 Time: 0.099584 [03/25/2022-13:24:25] [V] [TRT] sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: -1499578657823798783 [03/25/2022-13:24:25] [V] [TRT] Tactic: -1499578657823798783 Time: 0.077952 [03/25/2022-13:24:25] [V] [TRT] sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r1s1 Tactic: -1494157908358500249 [03/25/2022-13:24:25] [V] [TRT] Tactic: -1494157908358500249 Time: 0.092288 [03/25/2022-13:24:25] [V] [TRT] sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x4x1_g1_tensor16x8x32_t1r1s1 Tactic: -1328736756812546664 [03/25/2022-13:24:25] [V] [TRT] Tactic: -1328736756812546664 Time: 0.082816 [03/25/2022-13:24:25] [V] [TRT] sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 
+ Relu_266 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x32x64_stage4_warpsize2x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -1006589727652607355 [03/25/2022-13:24:25] [V] [TRT] Tactic: -1006589727652607355 Time: 0.086656 [03/25/2022-13:24:25] [V] [TRT] sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: -713022856474991236 [03/25/2022-13:24:25] [V] [TRT] Tactic: -713022856474991236 Time: 0.139904 [03/25/2022-13:24:25] [V] [TRT] sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x128_stage3_warpsize1x1x1_g1_sptensor16x8x64_t1r1s1_no_preds Tactic: -619668460699260222 [03/25/2022-13:24:25] [V] [TRT] Tactic: -619668460699260222 Time: 0.078464 [03/25/2022-13:24:25] [V] [TRT] sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x64x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: -405554772060757402 [03/25/2022-13:24:25] [V] [TRT] Tactic: -405554772060757402 Time: 0.068736 [03/25/2022-13:24:25] [V] [TRT] sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: -375949437730908730 [03/25/2022-13:24:25] [V] [TRT] Tactic: -375949437730908730 Time: 0.063872 [03/25/2022-13:24:25] [V] [TRT] sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x128_stage3_warpsize1x2x1_g1_sptensor16x8x64_t1r1s1 Tactic: -233227833606287806 [03/25/2022-13:24:25] [V] [TRT] Tactic: -233227833606287806 Time: 0.062976 [03/25/2022-13:24:25] [V] [TRT] sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -111878368089469751 [03/25/2022-13:24:25] [V] [TRT] Tactic: -111878368089469751 Time: 0.100736 [03/25/2022-13:24:25] [V] [TRT] sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x64x64_stage3_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -48936598874722005 [03/25/2022-13:24:25] [V] [TRT] Tactic: -48936598874722005 Time: 0.063872 [03/25/2022-13:24:25] [V] [TRT] sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x32x64_stage4_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 
-19707840769375107 [03/25/2022-13:24:25] [V] [TRT] Tactic: -19707840769375107 Time: 0.076672 [03/25/2022-13:24:25] [V] [TRT] Fastest Tactic: 1913026264725750683 Time: 0.059008 [03/25/2022-13:24:25] [V] [TRT] >>>>>>>>>>>>>>> Chose Runner Type: CaskConvolution Tactic: 1913026264725750683 [03/25/2022-13:24:25] [V] [TRT] =============== Computing costs for [03/25/2022-13:24:25] [V] [TRT] *************** Autotuning format combination: Int8(25088,784:4,28,1) -> Int8(25088,784:4,28,1) *************** [03/25/2022-13:24:25] [V] [TRT] --------------- Timing Runner: sections.1.1.conv2.module.weight + QuantizeLinear_275_quantize_scale_node + Conv_279 + Relu_281 (CudaDepthwiseConvolution) [03/25/2022-13:24:25] [V] [TRT] CudaDepthwiseConvolution has no valid tactics for this config, skipping [03/25/2022-13:24:25] [V] [TRT] --------------- Timing Runner: sections.1.1.conv2.module.weight + QuantizeLinear_275_quantize_scale_node + Conv_279 + Relu_281 (FusedConvActConvolution) [03/25/2022-13:24:25] [V] [TRT] FusedConvActConvolution has no valid tactics for this config, skipping [03/25/2022-13:24:25] [V] [TRT] --------------- Timing Runner: sections.1.1.conv2.module.weight + QuantizeLinear_275_quantize_scale_node + Conv_279 + Relu_281 (CaskConvolution) [03/25/2022-13:24:25] [V] [TRT] sections.1.1.conv2.module.weight + QuantizeLinear_275_quantize_scale_node + Conv_279 + Relu_281 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_medium_nn_v1 Tactic: 175853789719975416 [03/25/2022-13:24:25] [V] [TRT] Tactic: 175853789719975416 Time: 0.45888 [03/25/2022-13:24:25] [V] [TRT] sections.1.1.conv2.module.weight + QuantizeLinear_275_quantize_scale_node + Conv_279 + Relu_281 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_xregs_medium_nn_v1 Tactic: 2171150287007712632 [03/25/2022-13:24:25] [V] [TRT] Tactic: 2171150287007712632 Time: 0.473216 [03/25/2022-13:24:25] [V] [TRT] sections.1.1.conv2.module.weight + QuantizeLinear_275_quantize_scale_node + Conv_279 + Relu_281 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x64_relu_small_nn_v1 Tactic: 2234457234705232274 [03/25/2022-13:24:25] [V] [TRT] Tactic: 2234457234705232274 Time: 0.428288 [03/25/2022-13:24:25] [V] [TRT] sections.1.1.conv2.module.weight + QuantizeLinear_275_quantize_scale_node + Conv_279 + Relu_281 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x64_relu_medium_nn_v1 Tactic: 5834048089706882838 [03/25/2022-13:24:25] [V] [TRT] Tactic: 5834048089706882838 Time: 0.433792 [03/25/2022-13:24:25] [V] [TRT] sections.1.1.conv2.module.weight + QuantizeLinear_275_quantize_scale_node + Conv_279 + Relu_281 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_small_nn_v1 Tactic: -8626990807754934295 [03/25/2022-13:24:25] [V] [TRT] Tactic: -8626990807754934295 Time: 0.456064 [03/25/2022-13:24:25] [V] [TRT] sections.1.1.conv2.module.weight + QuantizeLinear_275_quantize_scale_node + Conv_279 + Relu_281 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_xregs_small_nn_v1 Tactic: -7303593854972602201 [03/25/2022-13:24:25] [V] [TRT] Tactic: -7303593854972602201 Time: 0.435584 [03/25/2022-13:24:25] [V] [TRT] sections.1.1.conv2.module.weight + QuantizeLinear_275_quantize_scale_node + Conv_279 + Relu_281 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x128_relu_medium_nn_v1 Tactic: -6585664687867083638 [03/25/2022-13:24:25] [V] [TRT] Tactic: -6585664687867083638 Time: 0.444672 [03/25/2022-13:24:25] [V] [TRT] sections.1.1.conv2.module.weight + QuantizeLinear_275_quantize_scale_node + Conv_279 + Relu_281 Set Tactic Name: 
ampere_int8x4_icudnn_int8x4_128x64_relu_xregs_large_nn_v1 Tactic: -3730012925709297561 [03/25/2022-13:24:25] [V] [TRT] Tactic: -3730012925709297561 Time: 0.425472 [03/25/2022-13:24:25] [V] [TRT] sections.1.1.conv2.module.weight + QuantizeLinear_275_quantize_scale_node + Conv_279 + Relu_281 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x128_relu_xregs_large_nn_v1 Tactic: -2277259417488004546 [03/25/2022-13:24:25] [V] [TRT] Tactic: -2277259417488004546 Time: 0.489728 [03/25/2022-13:24:25] [V] [TRT] sections.1.1.conv2.module.weight + QuantizeLinear_275_quantize_scale_node + Conv_279 + Relu_281 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x128_relu_small_nn_v1 Tactic: -683636008127039856 [03/25/2022-13:24:25] [V] [TRT] Tactic: -683636008127039856 Time: 0.442752 [03/25/2022-13:24:25] [V] [TRT] Fastest Tactic: -3730012925709297561 Time: 0.425472 [03/25/2022-13:24:25] [V] [TRT] >>>>>>>>>>>>>>> Chose Runner Type: CaskConvolution Tactic: -3730012925709297561 [03/25/2022-13:24:25] [V] [TRT] *************** Autotuning format combination: Int8(25088,784:4,28,1) -> Int8(3136,784:32,28,1) *************** [03/25/2022-13:24:25] [V] [TRT] --------------- Timing Runner: sections.1.1.conv2.module.weight + QuantizeLinear_275_quantize_scale_node + Conv_279 + Relu_281 (CaskConvolution) [03/25/2022-13:24:25] [V] [TRT] sections.1.1.conv2.module.weight + QuantizeLinear_275_quantize_scale_node + Conv_279 + Relu_281 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x64_relu_xregs_large_c32_nn_v1 Tactic: 984309058095623735 [03/25/2022-13:24:25] [V] [TRT] Tactic: 984309058095623735 Time: 0.42496 [03/25/2022-13:24:25] [V] [TRT] sections.1.1.conv2.module.weight + QuantizeLinear_275_quantize_scale_node + Conv_279 + Relu_281 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_small_c32_nn_v1 Tactic: 1100922622480907544 [03/25/2022-13:24:25] [V] [TRT] Tactic: 1100922622480907544 Time: 0.45248 [03/25/2022-13:24:25] [V] [TRT] sections.1.1.conv2.module.weight + QuantizeLinear_275_quantize_scale_node + Conv_279 + Relu_281 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x128_relu_xregs_large_c32_nn_v1 Tactic: 3238312825609165543 [03/25/2022-13:24:25] [V] [TRT] Tactic: 3238312825609165543 Time: 0.489472 [03/25/2022-13:24:25] [V] [TRT] sections.1.1.conv2.module.weight + QuantizeLinear_275_quantize_scale_node + Conv_279 + Relu_281 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x64_relu_medium_c32_nn_v1 Tactic: 3606311198834416176 [03/25/2022-13:24:25] [V] [TRT] Tactic: 3606311198834416176 Time: 0.433152 [03/25/2022-13:24:25] [V] [TRT] sections.1.1.conv2.module.weight + QuantizeLinear_275_quantize_scale_node + Conv_279 + Relu_281 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x128_relu_small_c32_nn_v1 Tactic: 4325765560739862899 [03/25/2022-13:24:25] [V] [TRT] Tactic: 4325765560739862899 Time: 0.443392 [03/25/2022-13:24:25] [V] [TRT] sections.1.1.conv2.module.weight + QuantizeLinear_275_quantize_scale_node + Conv_279 + Relu_281 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x128_relu_medium_c32_nn_v1 Tactic: -4255737803793506479 [03/25/2022-13:24:25] [V] [TRT] Tactic: -4255737803793506479 Time: 0.44544 [03/25/2022-13:24:25] [V] [TRT] sections.1.1.conv2.module.weight + QuantizeLinear_275_quantize_scale_node + Conv_279 + Relu_281 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_xregs_small_c32_nn_v1 Tactic: -3958182351168863467 [03/25/2022-13:24:25] [V] [TRT] Tactic: -3958182351168863467 Time: 0.433664 [03/25/2022-13:24:25] [V] [TRT] sections.1.1.conv2.module.weight + QuantizeLinear_275_quantize_scale_node + 
Conv_279 + Relu_281 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_xregs_medium_c32_nn_v1 Tactic: -3111968753064955248 [03/25/2022-13:24:25] [V] [TRT] Tactic: -3111968753064955248 Time: 0.471552 [03/25/2022-13:24:25] [V] [TRT] sections.1.1.conv2.module.weight + QuantizeLinear_275_quantize_scale_node + Conv_279 + Relu_281 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_medium_c32_nn_v1 Tactic: -1492575840277333548 [03/25/2022-13:24:25] [V] [TRT] Tactic: -1492575840277333548 Time: 0.45632 [03/25/2022-13:24:25] [V] [TRT] sections.1.1.conv2.module.weight + QuantizeLinear_275_quantize_scale_node + Conv_279 + Relu_281 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x64_relu_small_c32_nn_v1 Tactic: -868495160148524802 [03/25/2022-13:24:25] [V] [TRT] Tactic: -868495160148524802 Time: 0.427776 [03/25/2022-13:24:25] [V] [TRT] Fastest Tactic: 984309058095623735 Time: 0.42496 [03/25/2022-13:24:25] [V] [TRT] >>>>>>>>>>>>>>> Chose Runner Type: CaskConvolution Tactic: 984309058095623735 [03/25/2022-13:24:25] [V] [TRT] *************** Autotuning format combination: Int8(3136,784:32,28,1) -> Int8(3136,784:32,28,1) *************** [03/25/2022-13:24:26] [V] [TRT] --------------- Timing Runner: sections.1.1.conv2.module.weight + QuantizeLinear_275_quantize_scale_node + Conv_279 + Relu_281 (CudaGroupConvolution) [03/25/2022-13:24:26] [V] [TRT] CudaGroupConvolution has no valid tactics for this config, skipping [03/25/2022-13:24:26] [V] [TRT] --------------- Timing Runner: sections.1.1.conv2.module.weight + QuantizeLinear_275_quantize_scale_node + Conv_279 + Relu_281 (CudaDepthwiseConvolution) [03/25/2022-13:24:26] [V] [TRT] CudaDepthwiseConvolution has no valid tactics for this config, skipping [03/25/2022-13:24:26] [V] [TRT] --------------- Timing Runner: sections.1.1.conv2.module.weight + QuantizeLinear_275_quantize_scale_node + Conv_279 + Relu_281 (FusedConvActConvolution) [03/25/2022-13:24:26] [V] [TRT] FusedConvActConvolution has no valid tactics for this config, skipping [03/25/2022-13:24:26] [V] [TRT] --------------- Timing Runner: sections.1.1.conv2.module.weight + QuantizeLinear_275_quantize_scale_node + Conv_279 + Relu_281 (CaskConvolution) [03/25/2022-13:24:26] [V] [TRT] sections.1.1.conv2.module.weight + QuantizeLinear_275_quantize_scale_node + Conv_279 + Relu_281 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_epifadd Tactic: 177040020707947851 [03/25/2022-13:24:26] [V] [TRT] Tactic: 177040020707947851 Time: 0.166656 [03/25/2022-13:24:26] [V] [TRT] sections.1.1.conv2.module.weight + QuantizeLinear_275_quantize_scale_node + Conv_279 + Relu_281 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r3s3 Tactic: 184229963126259101 [03/25/2022-13:24:26] [V] [TRT] Tactic: 184229963126259101 Time: 0.147584 [03/25/2022-13:24:26] [V] [TRT] sections.1.1.conv2.module.weight + QuantizeLinear_275_quantize_scale_node + Conv_279 + Relu_281 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage1_warpsize4x2x1_g1_tensor8x8x16_t1r3s3 Tactic: 289888059097454627 [03/25/2022-13:24:26] [V] [TRT] Tactic: 289888059097454627 Time: 0.141824 [03/25/2022-13:24:26] [V] [TRT] sections.1.1.conv2.module.weight + QuantizeLinear_275_quantize_scale_node + Conv_279 + Relu_281 Set Tactic 
Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32 Tactic: 328135613486708155 [03/25/2022-13:24:26] [V] [TRT] Tactic: 328135613486708155 Time: 0.262784 [03/25/2022-13:24:26] [V] [TRT] sections.1.1.conv2.module.weight + QuantizeLinear_275_quantize_scale_node + Conv_279 + Relu_281 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r3s3 Tactic: 680740992583869928 [03/25/2022-13:24:26] [V] [TRT] Tactic: 680740992583869928 Time: 0.132352 [03/25/2022-13:24:26] [V] [TRT] sections.1.1.conv2.module.weight + QuantizeLinear_275_quantize_scale_node + Conv_279 + Relu_281 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x192x64_stage3_warpsize4x2x1_g1_tensor16x8x32 Tactic: 1111159740952609683 [03/25/2022-13:24:26] [V] [TRT] Tactic: 1111159740952609683 Time: 0.132864 [03/25/2022-13:24:26] [V] [TRT] sections.1.1.conv2.module.weight + QuantizeLinear_275_quantize_scale_node + Conv_279 + Relu_281 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32 Tactic: 1134860903395928905 [03/25/2022-13:24:26] [V] [TRT] Tactic: 1134860903395928905 Time: 0.138496 [03/25/2022-13:24:26] [V] [TRT] sections.1.1.conv2.module.weight + QuantizeLinear_275_quantize_scale_node + Conv_279 + Relu_281 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32 Tactic: 1276591930377039442 [03/25/2022-13:24:26] [V] [TRT] Tactic: 1276591930377039442 Time: 0.156544 [03/25/2022-13:24:26] [V] [TRT] sections.1.1.conv2.module.weight + QuantizeLinear_275_quantize_scale_node + Conv_279 + Relu_281 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage1_warpsize2x1x1_g1_tensor8x8x16_t1r3s3 Tactic: 1388866374720163187 [03/25/2022-13:24:26] [V] [TRT] Tactic: 1388866374720163187 Time: 0.169472 [03/25/2022-13:24:26] [V] [TRT] sections.1.1.conv2.module.weight + QuantizeLinear_275_quantize_scale_node + Conv_279 + Relu_281 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x96x64_stage5_warpsize2x1x1_g1_tensor16x8x32 Tactic: 1399501420456320585 [03/25/2022-13:24:26] [V] [TRT] Tactic: 1399501420456320585 Time: 0.176256 [03/25/2022-13:24:26] [V] [TRT] sections.1.1.conv2.module.weight + QuantizeLinear_275_quantize_scale_node + Conv_279 + Relu_281 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_epifadd Tactic: 1550399266192842845 [03/25/2022-13:24:26] [V] [TRT] Tactic: 1550399266192842845 Time: 0.164352 [03/25/2022-13:24:26] [V] [TRT] sections.1.1.conv2.module.weight + QuantizeLinear_275_quantize_scale_node + Conv_279 + Relu_281 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x32x64_stage4_warpsize4x1x1_g1_tensor16x8x32_t1r3s3_epifadd Tactic: 1572887561103143487 [03/25/2022-13:24:26] [V] [TRT] Tactic: 1572887561103143487 Time: 0.103936 [03/25/2022-13:24:26] 
[V] [TRT] sections.1.1.conv2.module.weight + QuantizeLinear_275_quantize_scale_node + Conv_279 + Relu_281 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x96x64_stage5_warpsize2x1x1_g1_tensor16x8x32_t1r3s3 Tactic: 1853122447892949466 [03/25/2022-13:24:26] [V] [TRT] Tactic: 1853122447892949466 Time: 0.17536 [03/25/2022-13:24:26] [V] [TRT] sections.1.1.conv2.module.weight + QuantizeLinear_275_quantize_scale_node + Conv_279 + Relu_281 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage1_warpsize2x2x1_g1_tensor8x8x16 Tactic: 2133329569091732311 [03/25/2022-13:24:26] [V] [TRT] Tactic: 2133329569091732311 Time: 0.136832 [03/25/2022-13:24:26] [V] [TRT] sections.1.1.conv2.module.weight + QuantizeLinear_275_quantize_scale_node + Conv_279 + Relu_281 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage4_warpsize4x1x1_g1_tensor16x8x32 Tactic: 2325023763229477890 [03/25/2022-13:24:26] [V] [TRT] Tactic: 2325023763229477890 Time: 0.082176 [03/25/2022-13:24:26] [V] [TRT] sections.1.1.conv2.module.weight + QuantizeLinear_275_quantize_scale_node + Conv_279 + Relu_281 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16 Tactic: 2579824863892891529 [03/25/2022-13:24:26] [V] [TRT] Tactic: 2579824863892891529 Time: 0.193408 [03/25/2022-13:24:26] [V] [TRT] sections.1.1.conv2.module.weight + QuantizeLinear_275_quantize_scale_node + Conv_279 + Relu_281 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x64x64_stage3_warpsize4x1x1_g1_tensor16x8x32 Tactic: 2783960536172159663 [03/25/2022-13:24:26] [V] [TRT] Tactic: 2783960536172159663 Time: 0.080896 [03/25/2022-13:24:26] [V] [TRT] sections.1.1.conv2.module.weight + QuantizeLinear_275_quantize_scale_node + Conv_279 + Relu_281 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x2x1_g1_tensor16x8x32 Tactic: 2821711838552913693 [03/25/2022-13:24:26] [V] [TRT] Tactic: 2821711838552913693 Time: 0.121088 [03/25/2022-13:24:26] [V] [TRT] sections.1.1.conv2.module.weight + QuantizeLinear_275_quantize_scale_node + Conv_279 + Relu_281 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x128_stage3_warpsize1x2x1_g1_sptensor16x8x64 Tactic: 2945009978756227538 [03/25/2022-13:24:26] [V] [TRT] Tactic: 2945009978756227538 Time: 0.08832 [03/25/2022-13:24:26] [V] [TRT] sections.1.1.conv2.module.weight + QuantizeLinear_275_quantize_scale_node + Conv_279 + Relu_281 Set Tactic Name: ampere_int8_i8816cudnn_int8_128x128_ldg16_relu_medium_nt_v1 Tactic: 2985940154541537814 [03/25/2022-13:24:26] [V] [TRT] Tactic: 2985940154541537814 Time: 0.13824 [03/25/2022-13:24:26] [V] [TRT] sections.1.1.conv2.module.weight + QuantizeLinear_275_quantize_scale_node + Conv_279 + Relu_281 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_t1r3s3_epifadd Tactic: 3284282970967328046 [03/25/2022-13:24:26] [V] [TRT] Tactic: 3284282970967328046 Time: 0.179584 
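
The format combinations being autotuned here, e.g. Int8(25088,784:4,28,1) -> Int8(3136,784:32,28,1), describe vectorized INT8 layouts: the :4 / :32 suffix marks the vector width of the channel dimension (CHW4 versus CHW32 packing), and the four numbers read as N, C, H, W strides. The figures in this log are consistent with strides counted in units of one channel vector for the 128x28x28 activations of this layer; that reading is an inference from the numbers here, not a documented guarantee:

    import math

    def vectorized_strides(c, h, w, vec):
        # (N, C, H, W) strides in units of one vec-wide channel group
        c_groups = math.ceil(c / vec)
        return (c_groups * h * w, h * w, w, 1)

    print(vectorized_strides(128, 28, 28, 4))   # -> (25088, 784, 28, 1), cf. Int8(25088,784:4,28,1)
    print(vectorized_strides(128, 28, 28, 32))  # -> (3136, 784, 28, 1),  cf. Int8(3136,784:32,28,1)
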
[03/25/2022-13:24:26] [V] [TRT] sections.1.1.conv2.module.weight + QuantizeLinear_275_quantize_scale_node + Conv_279 + Relu_281 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_t1r3s3 Tactic: 3401614690060226673 [03/25/2022-13:24:26] [V] [TRT] Tactic: 3401614690060226673 Time: 0.141184 [03/25/2022-13:24:26] [V] [TRT] sections.1.1.conv2.module.weight + QuantizeLinear_275_quantize_scale_node + Conv_279 + Relu_281 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32 Tactic: 3456719996792527006 [03/25/2022-13:24:26] [V] [TRT] Tactic: 3456719996792527006 Time: 0.136448 [03/25/2022-13:24:26] [V] [TRT] sections.1.1.conv2.module.weight + QuantizeLinear_275_quantize_scale_node + Conv_279 + Relu_281 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r3s3_epifadd Tactic: 3512426920013359699 [03/25/2022-13:24:26] [V] [TRT] Tactic: 3512426920013359699 Time: 0.1088 [03/25/2022-13:24:26] [V] [TRT] sections.1.1.conv2.module.weight + QuantizeLinear_275_quantize_scale_node + Conv_279 + Relu_281 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x128_stage3_warpsize4x2x1_g1_sptensor16x8x64 Tactic: 3651043333819148268 [03/25/2022-13:24:26] [V] [TRT] Tactic: 3651043333819148268 Time: 0.133888 [03/25/2022-13:24:26] [V] [TRT] sections.1.1.conv2.module.weight + QuantizeLinear_275_quantize_scale_node + Conv_279 + Relu_281 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x128_ldg16_relu_medium_nt_v1 Tactic: 3899284354987683408 [03/25/2022-13:24:26] [V] [TRT] Tactic: 3899284354987683408 Time: 0.14784 [03/25/2022-13:24:26] [V] [TRT] sections.1.1.conv2.module.weight + QuantizeLinear_275_quantize_scale_node + Conv_279 + Relu_281 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage6_warpsize2x2x1_g1_tensor16x8x32 Tactic: 4042202769383439184 [03/25/2022-13:24:26] [V] [TRT] Tactic: 4042202769383439184 Time: 0.096896 [03/25/2022-13:24:26] [V] [TRT] sections.1.1.conv2.module.weight + QuantizeLinear_275_quantize_scale_node + Conv_279 + Relu_281 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_large_nt_v1 Tactic: 4182625619810185112 [03/25/2022-13:24:26] [V] [TRT] Tactic: 4182625619810185112 Time: 0.147712 [03/25/2022-13:24:26] [V] [TRT] sections.1.1.conv2.module.weight + QuantizeLinear_275_quantize_scale_node + Conv_279 + Relu_281 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r3s3 Tactic: 4214794893922618058 [03/25/2022-13:24:26] [V] [TRT] Tactic: 4214794893922618058 Time: 0.13504 [03/25/2022-13:24:26] [V] [TRT] sections.1.1.conv2.module.weight + QuantizeLinear_275_quantize_scale_node + Conv_279 + Relu_281 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x32x64_stage4_warpsize4x1x1_g1_tensor16x8x32_epifadd Tactic: 4259547356717612415 [03/25/2022-13:24:26] [V] [TRT] Tactic: 4259547356717612415 Time: 0.107904 [03/25/2022-13:24:26] [V] [TRT] sections.1.1.conv2.module.weight + 
QuantizeLinear_275_quantize_scale_node + Conv_279 + Relu_281 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r3s3 Tactic: 4384868749799132354 [03/25/2022-13:24:26] [V] [TRT] Tactic: 4384868749799132354 Time: 0.19328 [03/25/2022-13:24:26] [V] [TRT] sections.1.1.conv2.module.weight + QuantizeLinear_275_quantize_scale_node + Conv_279 + Relu_281 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x128_stage3_warpsize2x2x1_g1_sptensor16x8x64 Tactic: 4414594337986714263 [03/25/2022-13:24:26] [V] [TRT] Tactic: 4414594337986714263 Time: 0.069632 [03/25/2022-13:24:26] [V] [TRT] sections.1.1.conv2.module.weight + QuantizeLinear_275_quantize_scale_node + Conv_279 + Relu_281 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_small_nt_v1 Tactic: 4717285412741024953 [03/25/2022-13:24:26] [V] [TRT] Tactic: 4717285412741024953 Time: 0.137856 [03/25/2022-13:24:26] [V] [TRT] sections.1.1.conv2.module.weight + QuantizeLinear_275_quantize_scale_node + Conv_279 + Relu_281 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage4_warpsize2x4x1_g1_tensor16x8x32 Tactic: 4734519122557206480 [03/25/2022-13:24:26] [V] [TRT] Tactic: 4734519122557206480 Time: 0.160896 [03/25/2022-13:24:26] [V] [TRT] sections.1.1.conv2.module.weight + QuantizeLinear_275_quantize_scale_node + Conv_279 + Relu_281 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32 Tactic: 4922297020351187339 [03/25/2022-13:24:26] [V] [TRT] Tactic: 4922297020351187339 Time: 0.120704 [03/25/2022-13:24:26] [V] [TRT] sections.1.1.conv2.module.weight + QuantizeLinear_275_quantize_scale_node + Conv_279 + Relu_281 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage1_warpsize2x4x1_g1_tensor8x8x16_t1r3s3 Tactic: 4931167631624420067 [03/25/2022-13:24:26] [V] [TRT] Tactic: 4931167631624420067 Time: 0.270464 [03/25/2022-13:24:26] [V] [TRT] sections.1.1.conv2.module.weight + QuantizeLinear_275_quantize_scale_node + Conv_279 + Relu_281 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage4_warpsize2x4x1_g1_tensor16x8x32_epifadd Tactic: 5121596860264626879 [03/25/2022-13:24:26] [V] [TRT] Tactic: 5121596860264626879 Time: 0.150528 [03/25/2022-13:24:26] [V] [TRT] sections.1.1.conv2.module.weight + QuantizeLinear_275_quantize_scale_node + Conv_279 + Relu_281 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_t1r3s3 Tactic: 5136656982162849059 [03/25/2022-13:24:26] [V] [TRT] Tactic: 5136656982162849059 Time: 0.17984 [03/25/2022-13:24:26] [V] [TRT] sections.1.1.conv2.module.weight + QuantizeLinear_275_quantize_scale_node + Conv_279 + Relu_281 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage6_warpsize2x2x1_g1_tensor16x8x32_epifadd Tactic: 5158259316594207439 [03/25/2022-13:24:26] [V] [TRT] Tactic: 5158259316594207439 Time: 0.096384 [03/25/2022-13:24:26] [V] [TRT] 
sections.1.1.conv2.module.weight + QuantizeLinear_275_quantize_scale_node + Conv_279 + Relu_281 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32_t1r3s3 Tactic: 5189825015507701541 [03/25/2022-13:24:26] [V] [TRT] Tactic: 5189825015507701541 Time: 0.28288 [03/25/2022-13:24:26] [V] [TRT] sections.1.1.conv2.module.weight + QuantizeLinear_275_quantize_scale_node + Conv_279 + Relu_281 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x32x64_stage4_warpsize2x1x1_g1_tensor16x8x32 Tactic: 5424417905073460656 [03/25/2022-13:24:26] [V] [TRT] Tactic: 5424417905073460656 Time: 0.121088 [03/25/2022-13:24:26] [V] [TRT] sections.1.1.conv2.module.weight + QuantizeLinear_275_quantize_scale_node + Conv_279 + Relu_281 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x256x64_stage3_warpsize2x4x1_g1_tensor16x8x32 Tactic: 5442043907221427810 [03/25/2022-13:24:26] [V] [TRT] Tactic: 5442043907221427810 Time: 0.1568 [03/25/2022-13:24:26] [V] [TRT] sections.1.1.conv2.module.weight + QuantizeLinear_275_quantize_scale_node + Conv_279 + Relu_281 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x256x64_stage3_warpsize2x4x1_g1_tensor16x8x32_t1r3s3 Tactic: 5544365258913999384 [03/25/2022-13:24:26] [V] [TRT] Tactic: 5544365258913999384 Time: 0.152576 [03/25/2022-13:24:26] [V] [TRT] sections.1.1.conv2.module.weight + QuantizeLinear_275_quantize_scale_node + Conv_279 + Relu_281 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage1_warpsize2x1x1_g1_tensor8x8x16_t1r3s3 Tactic: 5641967928706599451 [03/25/2022-13:24:26] [V] [TRT] Tactic: 5641967928706599451 Time: 0.223104 [03/25/2022-13:24:26] [V] [TRT] sections.1.1.conv2.module.weight + QuantizeLinear_275_quantize_scale_node + Conv_279 + Relu_281 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x2x1_g1_tensor16x8x32_t1r3s3 Tactic: 5721595115357140131 [03/25/2022-13:24:26] [V] [TRT] Tactic: 5721595115357140131 Time: 0.119552 [03/25/2022-13:24:26] [V] [TRT] sections.1.1.conv2.module.weight + QuantizeLinear_275_quantize_scale_node + Conv_279 + Relu_281 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage4_warpsize4x1x1_g1_tensor16x8x32_t1r3s3_epifadd Tactic: 5966973378912044513 [03/25/2022-13:24:26] [V] [TRT] Tactic: 5966973378912044513 Time: 0.080128 [03/25/2022-13:24:26] [V] [TRT] sections.1.1.conv2.module.weight + QuantizeLinear_275_quantize_scale_node + Conv_279 + Relu_281 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x32x64_stage4_warpsize4x1x1_g1_tensor16x8x32_t1r3s3 Tactic: 6004789655466615912 [03/25/2022-13:24:26] [V] [TRT] Tactic: 6004789655466615912 Time: 0.104064 [03/25/2022-13:24:26] [V] [TRT] sections.1.1.conv2.module.weight + QuantizeLinear_275_quantize_scale_node + Conv_279 + Relu_281 Set Tactic Name: 
sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage4_warpsize2x4x1_g1_tensor16x8x32_t1r3s3 Tactic: 6146901278630392829 [03/25/2022-13:24:26] [V] [TRT] Tactic: 6146901278630392829 Time: 0.158976 [03/25/2022-13:24:26] [V] [TRT] sections.1.1.conv2.module.weight + QuantizeLinear_275_quantize_scale_node + Conv_279 + Relu_281 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage1_warpsize2x1x1_g1_tensor8x8x16 Tactic: 6394572396369862482 [03/25/2022-13:24:26] [V] [TRT] Tactic: 6394572396369862482 Time: 0.225408 [03/25/2022-13:24:26] [V] [TRT] sections.1.1.conv2.module.weight + QuantizeLinear_275_quantize_scale_node + Conv_279 + Relu_281 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage4_warpsize4x2x1_g1_tensor16x8x32_epifadd Tactic: 6434020722187266170 [03/25/2022-13:24:26] [V] [TRT] Tactic: 6434020722187266170 Time: 0.082048 [03/25/2022-13:24:26] [V] [TRT] sections.1.1.conv2.module.weight + QuantizeLinear_275_quantize_scale_node + Conv_279 + Relu_281 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32 Tactic: 6781129591847482048 [03/25/2022-13:24:26] [V] [TRT] Tactic: 6781129591847482048 Time: 0.094208 [03/25/2022-13:24:26] [V] [TRT] sections.1.1.conv2.module.weight + QuantizeLinear_275_quantize_scale_node + Conv_279 + Relu_281 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x96x64_stage4_warpsize4x2x1_g1_tensor16x8x32_t1r3s3 Tactic: 6984451771200230840 [03/25/2022-13:24:26] [V] [TRT] Tactic: 6984451771200230840 Time: 0.157696 [03/25/2022-13:24:26] [V] [TRT] sections.1.1.conv2.module.weight + QuantizeLinear_275_quantize_scale_node + Conv_279 + Relu_281 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r3s3 Tactic: 7048234086361926570 [03/25/2022-13:24:26] [V] [TRT] Tactic: 7048234086361926570 Time: 0.147328 [03/25/2022-13:24:26] [V] [TRT] sections.1.1.conv2.module.weight + QuantizeLinear_275_quantize_scale_node + Conv_279 + Relu_281 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x128x64_stage3_warpsize4x2x1_g1_tensor16x8x32 Tactic: 7077570591813340966 [03/25/2022-13:24:26] [V] [TRT] Tactic: 7077570591813340966 Time: 0.083968 [03/25/2022-13:24:26] [V] [TRT] sections.1.1.conv2.module.weight + QuantizeLinear_275_quantize_scale_node + Conv_279 + Relu_281 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r3s3_epifadd Tactic: 7191893591576074000 [03/25/2022-13:24:26] [V] [TRT] Tactic: 7191893591576074000 Time: 0.162816 [03/25/2022-13:24:26] [V] [TRT] sections.1.1.conv2.module.weight + QuantizeLinear_275_quantize_scale_node + Conv_279 + Relu_281 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x4x1_g1_tensor16x8x32_t1r3s3 Tactic: 7429976449747682901 [03/25/2022-13:24:26] [V] [TRT] Tactic: 7429976449747682901 Time: 0.1152 
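
The sweep above shows the builder auto-tuning one fused node (Conv_279 + Relu_281): each candidate kernel is set, timed, and recorded, including the sm80 2:4 sparse kernels (…sptensor16x8x64…) that are only tried because sparsity is enabled. The same build can be driven from the TensorRT Python API instead of trtexec; what follows is a minimal sketch, assuming the TensorRT 8.2 Python bindings and the same resnet50_quant_sparse.onnx model, with flags matching --int8 --sparsity=enable:

    import tensorrt as trt  # assumes TensorRT 8.2 Python bindings

    logger = trt.Logger(trt.Logger.VERBOSE)  # VERBOSE produces tactic-timing lines like those in this log
    builder = trt.Builder(logger)
    network = builder.create_network(
        1 << int(trt.NetworkDefinitionCreationFlag.EXPLICIT_BATCH))  # explicit batch, as in the build options
    parser = trt.OnnxParser(network, logger)

    with open("resnet50_quant_sparse.onnx", "rb") as f:
        if not parser.parse(f.read()):
            raise RuntimeError(parser.get_error(0))

    config = builder.create_builder_config()
    config.max_workspace_size = 16 << 20              # 16 MiB workspace, matching the build options above
    config.set_flag(trt.BuilderFlag.INT8)             # --int8
    config.set_flag(trt.BuilderFlag.SPARSE_WEIGHTS)   # --sparsity=enable
    engine_bytes = builder.build_serialized_network(network, config)

Note that with a Q/DQ-quantized ONNX graph like this one, no INT8 calibrator is required: the scales come from the QuantizeLinear/DequantizeLinear nodes themselves (explicit quantization).
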
[03/25/2022-13:24:26] [V] [TRT] sections.1.1.conv2.module.weight + QuantizeLinear_275_quantize_scale_node + Conv_279 + Relu_281 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r3s3_epifadd Tactic: 7438984192263206338 [03/25/2022-13:24:26] [V] [TRT] Tactic: 7438984192263206338 Time: 0.093824 [03/25/2022-13:24:26] [V] [TRT] sections.1.1.conv2.module.weight + QuantizeLinear_275_quantize_scale_node + Conv_279 + Relu_281 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32_epifadd Tactic: 7504901284678552178 [03/25/2022-13:24:26] [V] [TRT] Tactic: 7504901284678552178 Time: 0.07872 [03/25/2022-13:24:26] [V] [TRT] sections.1.1.conv2.module.weight + QuantizeLinear_275_quantize_scale_node + Conv_279 + Relu_281 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r3s3 Tactic: 8096257414008860171 [03/25/2022-13:24:26] [V] [TRT] Tactic: 8096257414008860171 Time: 0.089984 [03/25/2022-13:24:26] [V] [TRT] sections.1.1.conv2.module.weight + QuantizeLinear_275_quantize_scale_node + Conv_279 + Relu_281 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x64x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r3s3 Tactic: 8128112048355596715 [03/25/2022-13:24:26] [V] [TRT] Tactic: 8128112048355596715 Time: 0.088064 [03/25/2022-13:24:26] [V] [TRT] sections.1.1.conv2.module.weight + QuantizeLinear_275_quantize_scale_node + Conv_279 + Relu_281 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_indexed_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32_epifadd Tactic: 8751622450593766232 [03/25/2022-13:24:26] [V] [TRT] Tactic: 8751622450593766232 Time: 0.083968 [03/25/2022-13:24:26] [V] [TRT] sections.1.1.conv2.module.weight + QuantizeLinear_275_quantize_scale_node + Conv_279 + Relu_281 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_indexed_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32 Tactic: 9064458886956700976 [03/25/2022-13:24:26] [V] [TRT] Tactic: 9064458886956700976 Time: 0.085248 [03/25/2022-13:24:26] [V] [TRT] sections.1.1.conv2.module.weight + QuantizeLinear_275_quantize_scale_node + Conv_279 + Relu_281 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_t1r3s3_epifadd Tactic: 9143438935315839085 [03/25/2022-13:24:26] [V] [TRT] Tactic: 9143438935315839085 Time: 0.139904 [03/25/2022-13:24:26] [V] [TRT] sections.1.1.conv2.module.weight + QuantizeLinear_275_quantize_scale_node + Conv_279 + Relu_281 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage4_warpsize4x2x1_g1_tensor16x8x32 Tactic: -9165697322068360861 [03/25/2022-13:24:26] [V] [TRT] Tactic: -9165697322068360861 Time: 0.08704 [03/25/2022-13:24:26] [V] [TRT] sections.1.1.conv2.module.weight + QuantizeLinear_275_quantize_scale_node + Conv_279 + Relu_281 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_singleBuffer_small_nt_v1 Tactic: 
-9118785798277698619 [03/25/2022-13:24:26] [V] [TRT] Tactic: -9118785798277698619 Time: 0.133632 [03/25/2022-13:24:26] [V] [TRT] sections.1.1.conv2.module.weight + QuantizeLinear_275_quantize_scale_node + Conv_279 + Relu_281 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16 Tactic: -9108166971364503411 [03/25/2022-13:24:26] [V] [TRT] Tactic: -9108166971364503411 Time: 0.14976 [03/25/2022-13:24:26] [V] [TRT] sections.1.1.conv2.module.weight + QuantizeLinear_275_quantize_scale_node + Conv_279 + Relu_281 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage1_warpsize4x2x1_g1_tensor8x8x16 Tactic: -8861822316054763526 [03/25/2022-13:24:26] [V] [TRT] Tactic: -8861822316054763526 Time: 0.14336 [03/25/2022-13:24:26] [V] [TRT] sections.1.1.conv2.module.weight + QuantizeLinear_275_quantize_scale_node + Conv_279 + Relu_281 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x128_stage3_warpsize1x1x1_g1_sptensor16x8x64 Tactic: -8791277710877987710 [03/25/2022-13:24:26] [V] [TRT] Tactic: -8791277710877987710 Time: 0.135936 [03/25/2022-13:24:26] [V] [TRT] sections.1.1.conv2.module.weight + QuantizeLinear_275_quantize_scale_node + Conv_279 + Relu_281 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x128x64_stage3_warpsize2x2x1_g1_tensor16x8x32 Tactic: -8691377209893505057 [03/25/2022-13:24:26] [V] [TRT] Tactic: -8691377209893505057 Time: 0.082176 [03/25/2022-13:24:26] [V] [TRT] sections.1.1.conv2.module.weight + QuantizeLinear_275_quantize_scale_node + Conv_279 + Relu_281 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32 Tactic: -8520292213102999339 [03/25/2022-13:24:26] [V] [TRT] Tactic: -8520292213102999339 Time: 0.12736 [03/25/2022-13:24:26] [V] [TRT] sections.1.1.conv2.module.weight + QuantizeLinear_275_quantize_scale_node + Conv_279 + Relu_281 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage1_warpsize4x1x1_g1_tensor8x8x16_t1r3s3 Tactic: -8475551154769412306 [03/25/2022-13:24:26] [V] [TRT] Tactic: -8475551154769412306 Time: 0.13824 [03/25/2022-13:24:26] [V] [TRT] sections.1.1.conv2.module.weight + QuantizeLinear_275_quantize_scale_node + Conv_279 + Relu_281 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x192x64_stage3_warpsize4x2x1_g1_tensor16x8x32_t1r3s3 Tactic: -8417388128970254446 [03/25/2022-13:24:26] [V] [TRT] Tactic: -8417388128970254446 Time: 0.12864 [03/25/2022-13:24:26] [V] [TRT] sections.1.1.conv2.module.weight + QuantizeLinear_275_quantize_scale_node + Conv_279 + Relu_281 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32 Tactic: -8263994888336646547 [03/25/2022-13:24:26] [V] [TRT] Tactic: -8263994888336646547 Time: 0.07872 [03/25/2022-13:24:26] [V] [TRT] sections.1.1.conv2.module.weight + QuantizeLinear_275_quantize_scale_node + Conv_279 + Relu_281 Set Tactic Name: 
sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32 Tactic: -8205948405243401049 [03/25/2022-13:24:26] [V] [TRT] Tactic: -8205948405243401049 Time: 0.164224 [03/25/2022-13:24:26] [V] [TRT] sections.1.1.conv2.module.weight + QuantizeLinear_275_quantize_scale_node + Conv_279 + Relu_281 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r3s3_epifadd Tactic: -7992068592656168418 [03/25/2022-13:24:26] [V] [TRT] Tactic: -7992068592656168418 Time: 0.089088 [03/25/2022-13:24:26] [V] [TRT] sections.1.1.conv2.module.weight + QuantizeLinear_275_quantize_scale_node + Conv_279 + Relu_281 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x32x64_stage4_warpsize2x1x1_g1_tensor16x8x32_t1r3s3 Tactic: -7898477046581738867 [03/25/2022-13:24:26] [V] [TRT] Tactic: -7898477046581738867 Time: 0.111232 [03/25/2022-13:24:26] [V] [TRT] sections.1.1.conv2.module.weight + QuantizeLinear_275_quantize_scale_node + Conv_279 + Relu_281 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage4_warpsize4x1x1_g1_tensor16x8x32_epifadd Tactic: -7842775553137511386 [03/25/2022-13:24:26] [V] [TRT] Tactic: -7842775553137511386 Time: 0.08192 [03/25/2022-13:24:26] [V] [TRT] sections.1.1.conv2.module.weight + QuantizeLinear_275_quantize_scale_node + Conv_279 + Relu_281 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32 Tactic: -7683887278997527517 [03/25/2022-13:24:26] [V] [TRT] Tactic: -7683887278997527517 Time: 0.137344 [03/25/2022-13:24:26] [V] [TRT] sections.1.1.conv2.module.weight + QuantizeLinear_275_quantize_scale_node + Conv_279 + Relu_281 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x64x64_stage3_warpsize2x2x1_g1_tensor16x8x32 Tactic: -7381370635708568663 [03/25/2022-13:24:26] [V] [TRT] Tactic: -7381370635708568663 Time: 0.090496 [03/25/2022-13:24:26] [V] [TRT] sections.1.1.conv2.module.weight + QuantizeLinear_275_quantize_scale_node + Conv_279 + Relu_281 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r3s3 Tactic: -7129320389887881029 [03/25/2022-13:24:26] [V] [TRT] Tactic: -7129320389887881029 Time: 0.13312 [03/25/2022-13:24:26] [V] [TRT] sections.1.1.conv2.module.weight + QuantizeLinear_275_quantize_scale_node + Conv_279 + Relu_281 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r3s3 Tactic: -6959995514028471820 [03/25/2022-13:24:26] [V] [TRT] Tactic: -6959995514028471820 Time: 0.126464 [03/25/2022-13:24:26] [V] [TRT] sections.1.1.conv2.module.weight + QuantizeLinear_275_quantize_scale_node + Conv_279 + Relu_281 Set Tactic Name: ampere_int8_i8816cudnn_int8_128x128_ldg16_relu_small_nt_v1 Tactic: -6400348606759295499 [03/25/2022-13:24:26] [V] [TRT] Tactic: -6400348606759295499 Time: 0.135168 [03/25/2022-13:24:26] [V] [TRT] sections.1.1.conv2.module.weight + 
QuantizeLinear_275_quantize_scale_node + Conv_279 + Relu_281 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r3s3 Tactic: -6371781333659293809 [03/25/2022-13:24:26] [V] [TRT] Tactic: -6371781333659293809 Time: 0.142464 [03/25/2022-13:24:26] [V] [TRT] sections.1.1.conv2.module.weight + QuantizeLinear_275_quantize_scale_node + Conv_279 + Relu_281 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x96x64_stage4_warpsize4x2x1_g1_tensor16x8x32 Tactic: -6256128573036943404 [03/25/2022-13:24:26] [V] [TRT] Tactic: -6256128573036943404 Time: 0.15936 [03/25/2022-13:24:26] [V] [TRT] sections.1.1.conv2.module.weight + QuantizeLinear_275_quantize_scale_node + Conv_279 + Relu_281 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x128_ldg16_relu_small_nt_v1 Tactic: -5980889159865208399 [03/25/2022-13:24:26] [V] [TRT] Tactic: -5980889159865208399 Time: 0.145408 [03/25/2022-13:24:26] [V] [TRT] sections.1.1.conv2.module.weight + QuantizeLinear_275_quantize_scale_node + Conv_279 + Relu_281 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_singleBuffer_medium_nt_v1 Tactic: -5766140806760372989 [03/25/2022-13:24:26] [V] [TRT] Tactic: -5766140806760372989 Time: 0.141056 [03/25/2022-13:24:26] [V] [TRT] sections.1.1.conv2.module.weight + QuantizeLinear_275_quantize_scale_node + Conv_279 + Relu_281 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32_t1r3s3_epifadd Tactic: -5709079507616090666 [03/25/2022-13:24:26] [V] [TRT] Tactic: -5709079507616090666 Time: 0.077696 [03/25/2022-13:24:26] [V] [TRT] sections.1.1.conv2.module.weight + QuantizeLinear_275_quantize_scale_node + Conv_279 + Relu_281 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage4_warpsize2x4x1_g1_tensor16x8x32_t1r3s3_epifadd Tactic: -5698636014239116282 [03/25/2022-13:24:26] [V] [TRT] Tactic: -5698636014239116282 Time: 0.147328 [03/25/2022-13:24:26] [V] [TRT] sections.1.1.conv2.module.weight + QuantizeLinear_275_quantize_scale_node + Conv_279 + Relu_281 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage1_warpsize4x1x1_g1_tensor8x8x16 Tactic: -5180570335464125033 [03/25/2022-13:24:26] [V] [TRT] Tactic: -5180570335464125033 Time: 0.143872 [03/25/2022-13:24:26] [V] [TRT] sections.1.1.conv2.module.weight + QuantizeLinear_275_quantize_scale_node + Conv_279 + Relu_281 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r3s3 Tactic: -4933563390723451692 [03/25/2022-13:24:26] [V] [TRT] Tactic: -4933563390723451692 Time: 0.109312 [03/25/2022-13:24:26] [V] [TRT] sections.1.1.conv2.module.weight + QuantizeLinear_275_quantize_scale_node + Conv_279 + Relu_281 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_medium_nt_v1 Tactic: -4516822589357530549 [03/25/2022-13:24:26] [V] [TRT] Tactic: -4516822589357530549 Time: 0.142464 [03/25/2022-13:24:26] [V] [TRT] sections.1.1.conv2.module.weight + QuantizeLinear_275_quantize_scale_node + Conv_279 + Relu_281 Set Tactic Name: 
sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x16x128_stage3_warpsize1x1x1_g1_sptensor16x8x64 Tactic: -4232916483289779353 [03/25/2022-13:24:26] [V] [TRT] Tactic: -4232916483289779353 Time: 0.230528 [03/25/2022-13:24:26] [V] [TRT] sections.1.1.conv2.module.weight + QuantizeLinear_275_quantize_scale_node + Conv_279 + Relu_281 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x128_stage3_warpsize1x4x1_g1_sptensor16x8x64 Tactic: -3460842194336717186 [03/25/2022-13:24:26] [V] [TRT] Tactic: -3460842194336717186 Time: 0.083712 [03/25/2022-13:24:26] [V] [TRT] sections.1.1.conv2.module.weight + QuantizeLinear_275_quantize_scale_node + Conv_279 + Relu_281 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage4_warpsize4x2x1_g1_tensor16x8x32_t1r3s3_epifadd Tactic: -3413217501222406256 [03/25/2022-13:24:26] [V] [TRT] Tactic: -3413217501222406256 Time: 0.079872 [03/25/2022-13:24:26] [V] [TRT] sections.1.1.conv2.module.weight + QuantizeLinear_275_quantize_scale_node + Conv_279 + Relu_281 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32_t1r3s3 Tactic: -3280888557222886418 [03/25/2022-13:24:26] [V] [TRT] Tactic: -3280888557222886418 Time: 0.116736 [03/25/2022-13:24:26] [V] [TRT] sections.1.1.conv2.module.weight + QuantizeLinear_275_quantize_scale_node + Conv_279 + Relu_281 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r3s3 Tactic: -3238475748440751107 [03/25/2022-13:24:26] [V] [TRT] Tactic: -3238475748440751107 Time: 0.093568 [03/25/2022-13:24:26] [V] [TRT] sections.1.1.conv2.module.weight + QuantizeLinear_275_quantize_scale_node + Conv_279 + Relu_281 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage4_warpsize4x1x1_g1_tensor16x8x32_t1r3s3 Tactic: -3182884991006484042 [03/25/2022-13:24:26] [V] [TRT] Tactic: -3182884991006484042 Time: 0.080256 [03/25/2022-13:24:26] [V] [TRT] sections.1.1.conv2.module.weight + QuantizeLinear_275_quantize_scale_node + Conv_279 + Relu_281 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r3s3 Tactic: -3173468756112541306 [03/25/2022-13:24:26] [V] [TRT] Tactic: -3173468756112541306 Time: 0.16192 [03/25/2022-13:24:26] [V] [TRT] sections.1.1.conv2.module.weight + QuantizeLinear_275_quantize_scale_node + Conv_279 + Relu_281 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x128_ldg16_relu_large_nt_v1 Tactic: -2917455979290586480 [03/25/2022-13:24:26] [V] [TRT] Tactic: -2917455979290586480 Time: 0.146432 [03/25/2022-13:24:26] [V] [TRT] sections.1.1.conv2.module.weight + QuantizeLinear_275_quantize_scale_node + Conv_279 + Relu_281 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x128x64_stage3_warpsize4x2x1_g1_tensor16x8x32_t1r3s3 Tactic: -2741641298163591508 [03/25/2022-13:24:26] [V] [TRT] Tactic: -2741641298163591508 Time: 0.082816 [03/25/2022-13:24:26] [V] [TRT] sections.1.1.conv2.module.weight + 
QuantizeLinear_275_quantize_scale_node + Conv_279 + Relu_281 Set Tactic Name: ampere_int8_i8816cudnn_int8_128x128_ldg16_relu_large_nt_v1 Tactic: -2571022005763160364 [03/25/2022-13:24:26] [V] [TRT] Tactic: -2571022005763160364 Time: 0.14208 [03/25/2022-13:24:26] [V] [TRT] sections.1.1.conv2.module.weight + QuantizeLinear_275_quantize_scale_node + Conv_279 + Relu_281 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage1_warpsize2x2x1_g1_tensor8x8x16 Tactic: -2499089240293650188 [03/25/2022-13:24:26] [V] [TRT] Tactic: -2499089240293650188 Time: 0.133504 [03/25/2022-13:24:26] [V] [TRT] sections.1.1.conv2.module.weight + QuantizeLinear_275_quantize_scale_node + Conv_279 + Relu_281 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16 Tactic: -2328318099174473157 [03/25/2022-13:24:26] [V] [TRT] Tactic: -2328318099174473157 Time: 0.145152 [03/25/2022-13:24:26] [V] [TRT] sections.1.1.conv2.module.weight + QuantizeLinear_275_quantize_scale_node + Conv_279 + Relu_281 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_epifadd Tactic: -2083778562631872334 [03/25/2022-13:24:26] [V] [TRT] Tactic: -2083778562631872334 Time: 0.092928 [03/25/2022-13:24:26] [V] [TRT] sections.1.1.conv2.module.weight + QuantizeLinear_275_quantize_scale_node + Conv_279 + Relu_281 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x4x1_g1_tensor16x8x32 Tactic: -2054375205435666404 [03/25/2022-13:24:26] [V] [TRT] Tactic: -2054375205435666404 Time: 0.116352 [03/25/2022-13:24:26] [V] [TRT] sections.1.1.conv2.module.weight + QuantizeLinear_275_quantize_scale_node + Conv_279 + Relu_281 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32_t1r3s3 Tactic: -1546787387293556842 [03/25/2022-13:24:26] [V] [TRT] Tactic: -1546787387293556842 Time: 0.077568 [03/25/2022-13:24:26] [V] [TRT] sections.1.1.conv2.module.weight + QuantizeLinear_275_quantize_scale_node + Conv_279 + Relu_281 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x32x64_stage4_warpsize4x1x1_g1_tensor16x8x32 Tactic: -1498626619443284096 [03/25/2022-13:24:26] [V] [TRT] Tactic: -1498626619443284096 Time: 0.108416 [03/25/2022-13:24:26] [V] [TRT] sections.1.1.conv2.module.weight + QuantizeLinear_275_quantize_scale_node + Conv_279 + Relu_281 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r3s3 Tactic: -1471245223605064669 [03/25/2022-13:24:26] [V] [TRT] Tactic: -1471245223605064669 Time: 0.121984 [03/25/2022-13:24:26] [V] [TRT] sections.1.1.conv2.module.weight + QuantizeLinear_275_quantize_scale_node + Conv_279 + Relu_281 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32 Tactic: -1283580231568512025 [03/25/2022-13:24:26] [V] [TRT] Tactic: -1283580231568512025 Time: 0.1824 [03/25/2022-13:24:26] [V] [TRT] 
sections.1.1.conv2.module.weight + QuantizeLinear_275_quantize_scale_node + Conv_279 + Relu_281 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x64x64_stage3_warpsize4x1x1_g1_tensor16x8x32_t1r3s3 Tactic: -1224421172675151280 [03/25/2022-13:24:26] [V] [TRT] Tactic: -1224421172675151280 Time: 0.077952 [03/25/2022-13:24:26] [V] [TRT] sections.1.1.conv2.module.weight + QuantizeLinear_275_quantize_scale_node + Conv_279 + Relu_281 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_epifadd Tactic: -1173968681844185579 [03/25/2022-13:24:26] [V] [TRT] Tactic: -1173968681844185579 Time: 0.183808 [03/25/2022-13:24:26] [V] [TRT] sections.1.1.conv2.module.weight + QuantizeLinear_275_quantize_scale_node + Conv_279 + Relu_281 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x128x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r3s3 Tactic: -921247911551089037 [03/25/2022-13:24:26] [V] [TRT] Tactic: -921247911551089037 Time: 0.080512 [03/25/2022-13:24:26] [V] [TRT] sections.1.1.conv2.module.weight + QuantizeLinear_275_quantize_scale_node + Conv_279 + Relu_281 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32 Tactic: -762222380308749469 [03/25/2022-13:24:26] [V] [TRT] Tactic: -762222380308749469 Time: 0.112 [03/25/2022-13:24:26] [V] [TRT] sections.1.1.conv2.module.weight + QuantizeLinear_275_quantize_scale_node + Conv_279 + Relu_281 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_epifadd Tactic: -556794153877490941 [03/25/2022-13:24:26] [V] [TRT] Tactic: -556794153877490941 Time: 0.113792 [03/25/2022-13:24:26] [V] [TRT] sections.1.1.conv2.module.weight + QuantizeLinear_275_quantize_scale_node + Conv_279 + Relu_281 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage4_warpsize4x2x1_g1_tensor16x8x32_t1r3s3 Tactic: -516725800067794372 [03/25/2022-13:24:26] [V] [TRT] Tactic: -516725800067794372 Time: 0.084992 [03/25/2022-13:24:26] [V] [TRT] sections.1.1.conv2.module.weight + QuantizeLinear_275_quantize_scale_node + Conv_279 + Relu_281 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_singleBuffer_large_nt_v1 Tactic: -428104331444385564 [03/25/2022-13:24:26] [V] [TRT] Tactic: -428104331444385564 Time: 0.142848 [03/25/2022-13:24:26] [V] [TRT] sections.1.1.conv2.module.weight + QuantizeLinear_275_quantize_scale_node + Conv_279 + Relu_281 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage1_warpsize2x1x1_g1_tensor8x8x16 Tactic: -366411318217594794 [03/25/2022-13:24:26] [V] [TRT] Tactic: -366411318217594794 Time: 0.171648 [03/25/2022-13:24:26] [V] [TRT] sections.1.1.conv2.module.weight + QuantizeLinear_275_quantize_scale_node + Conv_279 + Relu_281 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage1_warpsize2x4x1_g1_tensor8x8x16 Tactic: -351548418071036983 [03/25/2022-13:24:26] [V] [TRT] Tactic: -351548418071036983 Time: 
0.274048
[03/25/2022-13:24:26] [V] [TRT] Fastest Tactic: 4414594337986714263 Time: 0.069632
[03/25/2022-13:24:26] [V] [TRT] >>>>>>>>>>>>>>> Chose Runner Type: CaskConvolution Tactic: 4414594337986714263
[03/25/2022-13:24:26] [V] [TRT] =============== Computing costs for
[03/25/2022-13:24:26] [V] [TRT] *************** Autotuning format combination: Int8(25088,784:4,28,1), Int8(100352,784:4,28,1) -> Int8(100352,784:4,28,1) ***************
[03/25/2022-13:24:26] [V] [TRT] *************** Autotuning format combination: Int8(25088,784:4,28,1), Int8(12544,784:32,28,1) -> Int8(12544,784:32,28,1) ***************
[03/25/2022-13:24:26] [V] [TRT] *************** Autotuning format combination: Int8(3136,784:32,28,1), Int8(12544,784:32,28,1) -> Int8(12544,784:32,28,1) ***************
[03/25/2022-13:24:26] [V] [TRT] =============== Computing costs for
[03/25/2022-13:24:26] [V] [TRT] *************** Autotuning format combination: Int8(100352,784:4,28,1) -> Int8(25088,784:4,28,1) ***************
[03/25/2022-13:24:26] [V] [TRT] *************** Autotuning format combination: Int8(100352,784:4,28,1) -> Int8(3136,784:32,28,1) ***************
[03/25/2022-13:24:26] [V] [TRT] *************** Autotuning format combination: Int8(12544,784:32,28,1) -> Int8(3136,784:32,28,1) ***************
[03/25/2022-13:24:26] [V] [TRT] Skip timing cache hit (epiFadd mismatch) for node: sections.1.2.conv1.module.weight + QuantizeLinear_312_quantize_scale_node + Conv_316 + Relu_318, LayerImpl: CaskConvolution, tactic: 1913026264725750683
[03/25/2022-13:24:26] [V] [TRT] --------------- Timing Runner: sections.1.2.conv1.module.weight + QuantizeLinear_312_quantize_scale_node + Conv_316 + Relu_318 (CudaGroupConvolution)
[03/25/2022-13:24:26] [V] [TRT] CudaGroupConvolution has no valid tactics for this config, skipping
[03/25/2022-13:24:26] [V] [TRT] --------------- Timing Runner: sections.1.2.conv1.module.weight + QuantizeLinear_312_quantize_scale_node + Conv_316 + Relu_318 (CudaDepthwiseConvolution)
[03/25/2022-13:24:26] [V] [TRT] CudaDepthwiseConvolution has no valid tactics for this config, skipping
[03/25/2022-13:24:26] [V] [TRT] --------------- Timing Runner: sections.1.2.conv1.module.weight + QuantizeLinear_312_quantize_scale_node + Conv_316 + Relu_318 (FusedConvActConvolution)
[03/25/2022-13:24:26] [V] [TRT] FusedConvActConvolution has no valid tactics for this config, skipping
[03/25/2022-13:24:26] [V] [TRT] --------------- Timing Runner: sections.1.2.conv1.module.weight + QuantizeLinear_312_quantize_scale_node + Conv_316 + Relu_318 (CaskConvolution)
[03/25/2022-13:24:26] [V] [TRT] sections.1.2.conv1.module.weight + QuantizeLinear_312_quantize_scale_node + Conv_316 + Relu_318 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage1_warpsize4x1x1_g1_tensor8x8x16_t1r1s1 Tactic: 68468667201176803
[03/25/2022-13:24:26] [V] [TRT] Tactic: 68468667201176803 Time: 0.083072
[03/25/2022-13:24:26] [V] [TRT] sections.1.2.conv1.module.weight + QuantizeLinear_312_quantize_scale_node + Conv_316 + Relu_318 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x192x64_stage3_warpsize4x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 125145153013230687
[03/25/2022-13:24:26] [V] [TRT] Tactic: 125145153013230687 Time: 0.09088
[03/25/2022-13:24:26] [V] [TRT] sections.1.2.conv1.module.weight + QuantizeLinear_312_quantize_scale_node + Conv_316 + Relu_318 Set Tactic Name:
ampere_int8_i8816cudnn_int8_256x128_ldg16_relu_interior_nt_v1 Tactic: 434957160407688216 [03/25/2022-13:24:26] [V] [TRT] Tactic: 434957160407688216 Time: 0.092672 [03/25/2022-13:24:26] [V] [TRT] sections.1.2.conv1.module.weight + QuantizeLinear_312_quantize_scale_node + Conv_316 + Relu_318 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage4_warpsize4x2x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: 805889586762897346 [03/25/2022-13:24:26] [V] [TRT] Tactic: 805889586762897346 Time: 0.067072 [03/25/2022-13:24:26] [V] [TRT] sections.1.2.conv1.module.weight + QuantizeLinear_312_quantize_scale_node + Conv_316 + Relu_318 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x128_stage3_warpsize4x2x1_g1_sptensor16x8x64_t1r1s1 Tactic: 857001784974286465 [03/25/2022-13:24:26] [V] [TRT] Tactic: 857001784974286465 Time: 0.087552 [03/25/2022-13:24:26] [V] [TRT] sections.1.2.conv1.module.weight + QuantizeLinear_312_quantize_scale_node + Conv_316 + Relu_318 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 1214130898909872671 [03/25/2022-13:24:26] [V] [TRT] Tactic: 1214130898909872671 Time: 0.113408 [03/25/2022-13:24:26] [V] [TRT] sections.1.2.conv1.module.weight + QuantizeLinear_312_quantize_scale_node + Conv_316 + Relu_318 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage4_warpsize4x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 1278425129871930205 [03/25/2022-13:24:26] [V] [TRT] Tactic: 1278425129871930205 Time: 0.065664 [03/25/2022-13:24:26] [V] [TRT] sections.1.2.conv1.module.weight + QuantizeLinear_312_quantize_scale_node + Conv_316 + Relu_318 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 1583811548148740665 [03/25/2022-13:24:26] [V] [TRT] Tactic: 1583811548148740665 Time: 0.085376 [03/25/2022-13:24:26] [V] [TRT] sections.1.2.conv1.module.weight + QuantizeLinear_312_quantize_scale_node + Conv_316 + Relu_318 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 1701344857577810806 [03/25/2022-13:24:26] [V] [TRT] Tactic: 1701344857577810806 Time: 0.086784 [03/25/2022-13:24:26] [V] [TRT] sections.1.2.conv1.module.weight + QuantizeLinear_312_quantize_scale_node + Conv_316 + Relu_318 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x32x64_stage4_warpsize2x1x1_g1_tensor16x8x32_t1r1s1 Tactic: 1797231177354918208 [03/25/2022-13:24:26] [V] [TRT] Tactic: 1797231177354918208 Time: 0.087936 [03/25/2022-13:24:26] [V] [TRT] sections.1.2.conv1.module.weight + QuantizeLinear_312_quantize_scale_node + Conv_316 + Relu_318 Set Tactic Name: sm80_xmma_conv_fprop_smallk_i8i8_i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_warptilesize16x32_stage4_warpsize2x4x1_r1s1_u1v1_hw0_c512_scalebias_relu Tactic: 1913026264725750683 [03/25/2022-13:24:26] [V] [TRT] Tactic: 1913026264725750683 Time: 0.05888 [03/25/2022-13:24:26] [V] [TRT] sections.1.2.conv1.module.weight + 
QuantizeLinear_312_quantize_scale_node + Conv_316 + Relu_318 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x192x64_stage3_warpsize4x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 2004812516525036381 [03/25/2022-13:24:26] [V] [TRT] Tactic: 2004812516525036381 Time: 0.076032 [03/25/2022-13:24:26] [V] [TRT] sections.1.2.conv1.module.weight + QuantizeLinear_312_quantize_scale_node + Conv_316 + Relu_318 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x128x64_stage3_warpsize4x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 2030033463723799063 [03/25/2022-13:24:27] [V] [TRT] Tactic: 2030033463723799063 Time: 0.065536 [03/25/2022-13:24:27] [V] [TRT] sections.1.2.conv1.module.weight + QuantizeLinear_312_quantize_scale_node + Conv_316 + Relu_318 Set Tactic Name: ampere_int8_i8816cudnn_int8_128x128_ldg16_relu_interior_nt_v1 Tactic: 2346437292116182513 [03/25/2022-13:24:27] [V] [TRT] Tactic: 2346437292116182513 Time: 0.083968 [03/25/2022-13:24:27] [V] [TRT] sections.1.2.conv1.module.weight + QuantizeLinear_312_quantize_scale_node + Conv_316 + Relu_318 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 2376898825218218566 [03/25/2022-13:24:27] [V] [TRT] Tactic: 2376898825218218566 Time: 0.06272 [03/25/2022-13:24:27] [V] [TRT] sections.1.2.conv1.module.weight + QuantizeLinear_312_quantize_scale_node + Conv_316 + Relu_318 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_singleBuffer_interior_nt_v1 Tactic: 2522133112320625287 [03/25/2022-13:24:27] [V] [TRT] Tactic: 2522133112320625287 Time: 0.083456 [03/25/2022-13:24:27] [V] [TRT] sections.1.2.conv1.module.weight + QuantizeLinear_312_quantize_scale_node + Conv_316 + Relu_318 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage4_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 2548171972648455240 [03/25/2022-13:24:27] [V] [TRT] Tactic: 2548171972648455240 Time: 0.063232 [03/25/2022-13:24:27] [V] [TRT] sections.1.2.conv1.module.weight + QuantizeLinear_312_quantize_scale_node + Conv_316 + Relu_318 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage1_warpsize2x1x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: 2548946449357458230 [03/25/2022-13:24:27] [V] [TRT] Tactic: 2548946449357458230 Time: 0.095104 [03/25/2022-13:24:27] [V] [TRT] sections.1.2.conv1.module.weight + QuantizeLinear_312_quantize_scale_node + Conv_316 + Relu_318 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x128_stage3_warpsize1x1x1_g1_sptensor16x8x64_simple_t1r1s1_no_preds Tactic: 2570666021825229009 [03/25/2022-13:24:27] [V] [TRT] Tactic: 2570666021825229009 Time: 0.078208 [03/25/2022-13:24:27] [V] [TRT] sections.1.2.conv1.module.weight + QuantizeLinear_312_quantize_scale_node + Conv_316 + Relu_318 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x16x128_stage3_warpsize1x1x1_g1_sptensor16x8x64_simple_t1r1s1 Tactic: 2678520742286844763 [03/25/2022-13:24:27] [V] [TRT] Tactic: 2678520742286844763 Time: 0.113152 
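
Each candidate appears twice in these lines: once on a "Set Tactic Name: … Tactic: <id>" line when it is selected for timing, and once on a "Tactic: <id> Time: <ms>" line with the measurement; per node and format combination the builder keeps the minimum (for the previous node it reported "Fastest Tactic: 4414594337986714263 Time: 0.069632"). Sifting these winners out of a multi-megabyte verbose log by hand is tedious; below is a hypothetical post-processing sketch (the file name and regexes are illustrative, not part of trtexec) that recovers the best measured tactic per timing runner, assuming one log record per line:

    import re

    timing_runner = re.compile(r"Timing Runner: (.+?) \((\w+)\)")
    measurement = re.compile(r"Tactic: (-?\d+) Time: ([\d.]+)")

    best = {}    # node name -> (time in ms, tactic id)
    node = None
    with open("trtexec_verbose.log") as f:  # illustrative file name
        for line in f:
            m = timing_runner.search(line)
            if m:
                node = m.group(1)           # start of a new per-node sweep
                continue
            m = measurement.search(line)
            if m and node is not None:      # "Fastest Tactic" lines also match; they equal the min anyway
                tactic, ms = int(m.group(1)), float(m.group(2))
                if node not in best or ms < best[node][0]:
                    best[node] = (ms, tactic)

    for node, (ms, tactic) in sorted(best.items(), key=lambda kv: kv[1][0]):
        print(f"{node}: tactic {tactic} ({ms:.6f} ms)")
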
[03/25/2022-13:24:27] [V] [TRT] sections.1.2.conv1.module.weight + QuantizeLinear_312_quantize_scale_node + Conv_316 + Relu_318 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 2756291002030759362 [03/25/2022-13:24:27] [V] [TRT] Tactic: 2756291002030759362 Time: 0.067456 [03/25/2022-13:24:27] [V] [TRT] sections.1.2.conv1.module.weight + QuantizeLinear_312_quantize_scale_node + Conv_316 + Relu_318 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x128x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 2972948223367788520 [03/25/2022-13:24:27] [V] [TRT] Tactic: 2972948223367788520 Time: 0.06208 [03/25/2022-13:24:27] [V] [TRT] sections.1.2.conv1.module.weight + QuantizeLinear_312_quantize_scale_node + Conv_316 + Relu_318 Set Tactic Name: ampere_int8_i8816cudnn_int8_128x128_ldg16_relu_medium_nt_v1 Tactic: 2985940154541537814 [03/25/2022-13:24:27] [V] [TRT] Tactic: 2985940154541537814 Time: 0.083712 [03/25/2022-13:24:27] [V] [TRT] sections.1.2.conv1.module.weight + QuantizeLinear_312_quantize_scale_node + Conv_316 + Relu_318 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x96x64_stage5_warpsize2x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 3043273137345374664 [03/25/2022-13:24:27] [V] [TRT] Tactic: 3043273137345374664 Time: 0.115584 [03/25/2022-13:24:27] [V] [TRT] sections.1.2.conv1.module.weight + QuantizeLinear_312_quantize_scale_node + Conv_316 + Relu_318 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 3221677093659484230 [03/25/2022-13:24:27] [V] [TRT] Tactic: 3221677093659484230 Time: 0.118528 [03/25/2022-13:24:27] [V] [TRT] sections.1.2.conv1.module.weight + QuantizeLinear_312_quantize_scale_node + Conv_316 + Relu_318 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r1s1 Tactic: 3242897809704328258 [03/25/2022-13:24:27] [V] [TRT] Tactic: 3242897809704328258 Time: 0.082432 [03/25/2022-13:24:27] [V] [TRT] sections.1.2.conv1.module.weight + QuantizeLinear_312_quantize_scale_node + Conv_316 + Relu_318 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: 3312456766204252694 [03/25/2022-13:24:27] [V] [TRT] Tactic: 3312456766204252694 Time: 0.099584 [03/25/2022-13:24:27] [V] [TRT] sections.1.2.conv1.module.weight + QuantizeLinear_312_quantize_scale_node + Conv_316 + Relu_318 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 3538565962642681625 [03/25/2022-13:24:27] [V] [TRT] Tactic: 3538565962642681625 Time: 0.076672 [03/25/2022-13:24:27] [V] [TRT] sections.1.2.conv1.module.weight + QuantizeLinear_312_quantize_scale_node + Conv_316 + Relu_318 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage1_warpsize2x2x1_g1_tensor8x8x16_simple_t1r1s1 
Tactic: 3541919052468401776 [03/25/2022-13:24:27] [V] [TRT] Tactic: 3541919052468401776 Time: 0.077056 [03/25/2022-13:24:27] [V] [TRT] sections.1.2.conv1.module.weight + QuantizeLinear_312_quantize_scale_node + Conv_316 + Relu_318 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 3593397928177382100 [03/25/2022-13:24:27] [V] [TRT] Tactic: 3593397928177382100 Time: 0.113536 [03/25/2022-13:24:27] [V] [TRT] sections.1.2.conv1.module.weight + QuantizeLinear_312_quantize_scale_node + Conv_316 + Relu_318 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x64x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 3670282018109435863 [03/25/2022-13:24:27] [V] [TRT] Tactic: 3670282018109435863 Time: 0.066304 [03/25/2022-13:24:27] [V] [TRT] sections.1.2.conv1.module.weight + QuantizeLinear_312_quantize_scale_node + Conv_316 + Relu_318 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 3671413346254027573 [03/25/2022-13:24:27] [V] [TRT] Tactic: 3671413346254027573 Time: 0.081792 [03/25/2022-13:24:27] [V] [TRT] sections.1.2.conv1.module.weight + QuantizeLinear_312_quantize_scale_node + Conv_316 + Relu_318 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x128_ldg16_relu_medium_nt_v1 Tactic: 3899284354987683408 [03/25/2022-13:24:27] [V] [TRT] Tactic: 3899284354987683408 Time: 0.091392 [03/25/2022-13:24:27] [V] [TRT] sections.1.2.conv1.module.weight + QuantizeLinear_312_quantize_scale_node + Conv_316 + Relu_318 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: 3927509214678622419 [03/25/2022-13:24:27] [V] [TRT] Tactic: 3927509214678622419 Time: 0.080384 [03/25/2022-13:24:27] [V] [TRT] sections.1.2.conv1.module.weight + QuantizeLinear_312_quantize_scale_node + Conv_316 + Relu_318 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 4112572034735311841 [03/25/2022-13:24:27] [V] [TRT] Tactic: 4112572034735311841 Time: 0.136192 [03/25/2022-13:24:27] [V] [TRT] sections.1.2.conv1.module.weight + QuantizeLinear_312_quantize_scale_node + Conv_316 + Relu_318 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 4239974928951431644 [03/25/2022-13:24:27] [V] [TRT] Tactic: 4239974928951431644 Time: 0.081408 [03/25/2022-13:24:27] [V] [TRT] sections.1.2.conv1.module.weight + QuantizeLinear_312_quantize_scale_node + Conv_316 + Relu_318 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage6_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 4610760414797216079 [03/25/2022-13:24:27] [V] [TRT] Tactic: 4610760414797216079 Time: 0.069888 [03/25/2022-13:24:27] [V] [TRT] sections.1.2.conv1.module.weight + QuantizeLinear_312_quantize_scale_node + Conv_316 + Relu_318 Set Tactic Name: 
ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_small_nt_v1 Tactic: 4717285412741024953 [03/25/2022-13:24:27] [V] [TRT] Tactic: 4717285412741024953 Time: 0.085248 [03/25/2022-13:24:27] [V] [TRT] sections.1.2.conv1.module.weight + QuantizeLinear_312_quantize_scale_node + Conv_316 + Relu_318 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 4796956614760326119 [03/25/2022-13:24:27] [V] [TRT] Tactic: 4796956614760326119 Time: 0.079616 [03/25/2022-13:24:27] [V] [TRT] sections.1.2.conv1.module.weight + QuantizeLinear_312_quantize_scale_node + Conv_316 + Relu_318 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x128_stage3_warpsize2x2x1_g1_sptensor16x8x64_t1r1s1_no_preds Tactic: 4909502217677847353 [03/25/2022-13:24:27] [V] [TRT] Tactic: 4909502217677847353 Time: 0.06208 [03/25/2022-13:24:27] [V] [TRT] sections.1.2.conv1.module.weight + QuantizeLinear_312_quantize_scale_node + Conv_316 + Relu_318 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 4919361344804309192 [03/25/2022-13:24:27] [V] [TRT] Tactic: 4919361344804309192 Time: 0.1024 [03/25/2022-13:24:27] [V] [TRT] sections.1.2.conv1.module.weight + QuantizeLinear_312_quantize_scale_node + Conv_316 + Relu_318 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x128_stage3_warpsize1x2x1_g1_sptensor16x8x64_simple_t1r1s1 Tactic: 5043674678294309681 [03/25/2022-13:24:27] [V] [TRT] Tactic: 5043674678294309681 Time: 0.062848 [03/25/2022-13:24:27] [V] [TRT] sections.1.2.conv1.module.weight + QuantizeLinear_312_quantize_scale_node + Conv_316 + Relu_318 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_interior_nt_v1 Tactic: 5126565865931538390 [03/25/2022-13:24:27] [V] [TRT] Tactic: 5126565865931538390 Time: 0.085248 [03/25/2022-13:24:27] [V] [TRT] sections.1.2.conv1.module.weight + QuantizeLinear_312_quantize_scale_node + Conv_316 + Relu_318 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x64x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 5204702486885981735 [03/25/2022-13:24:27] [V] [TRT] Tactic: 5204702486885981735 Time: 0.06784 [03/25/2022-13:24:27] [V] [TRT] sections.1.2.conv1.module.weight + QuantizeLinear_312_quantize_scale_node + Conv_316 + Relu_318 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: 5375256703210220108 [03/25/2022-13:24:27] [V] [TRT] Tactic: 5375256703210220108 Time: 0.062848 [03/25/2022-13:24:27] [V] [TRT] sections.1.2.conv1.module.weight + QuantizeLinear_312_quantize_scale_node + Conv_316 + Relu_318 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x128_stage3_warpsize2x2x1_g1_sptensor16x8x64_simple_t1r1s1 Tactic: 5424258848951129084 [03/25/2022-13:24:27] [V] [TRT] Tactic: 5424258848951129084 Time: 0.06272 [03/25/2022-13:24:27] [V] [TRT] sections.1.2.conv1.module.weight + QuantizeLinear_312_quantize_scale_node + Conv_316 + Relu_318 Set Tactic 
Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 5443897483205284103
[03/25/2022-13:24:27] [V] [TRT] Tactic: 5443897483205284103 Time: 0.085248
[03/25/2022-13:24:27] [V] [TRT] sections.1.2.conv1.module.weight + QuantizeLinear_312_quantize_scale_node + Conv_316 + Relu_318 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage4_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 5707566217891294846
[03/25/2022-13:24:27] [V] [TRT] Tactic: 5707566217891294846 Time: 0.06656
[03/25/2022-13:24:27] [V] [TRT] sections.1.2.conv1.module.weight + QuantizeLinear_312_quantize_scale_node + Conv_316 + Relu_318 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x4x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 5986622376339202983
[03/25/2022-13:24:27] [V] [TRT] Tactic: 5986622376339202983 Time: 0.082176
[03/25/2022-13:24:27] [V] [TRT] sections.1.2.conv1.module.weight + QuantizeLinear_312_quantize_scale_node + Conv_316 + Relu_318 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x128x64_stage3_warpsize4x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 6007888770437705057
[03/25/2022-13:24:27] [V] [TRT] Tactic: 6007888770437705057 Time: 0.070912
[03/25/2022-13:24:27] [V] [TRT] sections.1.2.conv1.module.weight + QuantizeLinear_312_quantize_scale_node + Conv_316 + Relu_318 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32_t1r1s1 Tactic: 6405251167055673379
[03/25/2022-13:24:27] [V] [TRT] Tactic: 6405251167055673379 Time: 0.082176
[03/25/2022-13:24:27] [V] [TRT] sections.1.2.conv1.module.weight + QuantizeLinear_312_quantize_scale_node + Conv_316 + Relu_318 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage4_warpsize4x1x1_g1_tensor16x8x32_t1r1s1 Tactic: 6433368103202497147
[03/25/2022-13:24:27] [V] [TRT] Tactic: 6433368103202497147 Time: 0.068352
[03/25/2022-13:24:27] [V] [TRT] sections.1.2.conv1.module.weight + QuantizeLinear_312_quantize_scale_node + Conv_316 + Relu_318 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 6441948709525127755
[03/25/2022-13:24:27] [V] [TRT] Tactic: 6441948709525127755 Time: 0.115328
[03/25/2022-13:24:27] [V] [TRT] sections.1.2.conv1.module.weight + QuantizeLinear_312_quantize_scale_node + Conv_316 + Relu_318 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x256x64_stage3_warpsize2x4x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 6443933097134654777
[03/25/2022-13:24:27] [V] [TRT] Tactic: 6443933097134654777 Time: 0.098688
[03/25/2022-13:24:27] [V] [TRT] sections.1.2.conv1.module.weight + QuantizeLinear_312_quantize_scale_node + Conv_316 + Relu_318 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 6457435868048963632
[03/25/2022-13:24:27] [V] [TRT] Tactic: 6457435868048963632 Time: 0.069632
[03/25/2022-13:24:27] [V] [TRT] sections.1.2.conv1.module.weight + QuantizeLinear_312_quantize_scale_node + Conv_316 + Relu_318 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x128_stage3_warpsize1x1x1_g1_sptensor16x8x64_t1r1s1 Tactic: 6510345569544721081
[03/25/2022-13:24:27] [V] [TRT] Tactic: 6510345569544721081 Time: 0.078976
[03/25/2022-13:24:27] [V] [TRT] sections.1.2.conv1.module.weight + QuantizeLinear_312_quantize_scale_node + Conv_316 + Relu_318 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 6793988781414507278
[03/25/2022-13:24:27] [V] [TRT] Tactic: 6793988781414507278 Time: 0.065408
[03/25/2022-13:24:27] [V] [TRT] sections.1.2.conv1.module.weight + QuantizeLinear_312_quantize_scale_node + Conv_316 + Relu_318 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 6880710371738875469
[03/25/2022-13:24:27] [V] [TRT] Tactic: 6880710371738875469 Time: 0.076288
[03/25/2022-13:24:27] [V] [TRT] sections.1.2.conv1.module.weight + QuantizeLinear_312_quantize_scale_node + Conv_316 + Relu_318 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 6925201228918187099
[03/25/2022-13:24:27] [V] [TRT] Tactic: 6925201228918187099 Time: 0.066432
[03/25/2022-13:24:27] [V] [TRT] sections.1.2.conv1.module.weight + QuantizeLinear_312_quantize_scale_node + Conv_316 + Relu_318 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 6991524515605108718
[03/25/2022-13:24:27] [V] [TRT] Tactic: 6991524515605108718 Time: 0.087552
[03/25/2022-13:24:27] [V] [TRT] sections.1.2.conv1.module.weight + QuantizeLinear_312_quantize_scale_node + Conv_316 + Relu_318 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x4x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 7245509442265271220
[03/25/2022-13:24:27] [V] [TRT] Tactic: 7245509442265271220 Time: 0.081792
[03/25/2022-13:24:27] [V] [TRT] sections.1.2.conv1.module.weight + QuantizeLinear_312_quantize_scale_node + Conv_316 + Relu_318 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x64x64_stage3_warpsize4x1x1_g1_tensor16x8x32_t1r1s1 Tactic: 7318929579222925725
[03/25/2022-13:24:27] [V] [TRT] Tactic: 7318929579222925725 Time: 0.066816
[03/25/2022-13:24:27] [V] [TRT] sections.1.2.conv1.module.weight + QuantizeLinear_312_quantize_scale_node + Conv_316 + Relu_318 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: 7731430299029542276
[03/25/2022-13:24:27] [V] [TRT] Tactic: 7731430299029542276 Time: 0.063744
[03/25/2022-13:24:27] [V] [TRT] sections.1.2.conv1.module.weight + QuantizeLinear_312_quantize_scale_node + Conv_316 + Relu_318 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage4_warpsize2x4x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: 7738495016763012180
[03/25/2022-13:24:27] [V] [TRT] Tactic: 7738495016763012180 Time: 0.091392
[03/25/2022-13:24:27] [V] [TRT] sections.1.2.conv1.module.weight + QuantizeLinear_312_quantize_scale_node + Conv_316 + Relu_318 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x128_stage3_warpsize1x4x1_g1_sptensor16x8x64_t1r1s1_no_preds Tactic: 7886967395128926382
[03/25/2022-13:24:27] [V] [TRT] Tactic: 7886967395128926382 Time: 0.063488
[03/25/2022-13:24:27] [V] [TRT] sections.1.2.conv1.module.weight + QuantizeLinear_312_quantize_scale_node + Conv_316 + Relu_318 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage4_warpsize4x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 8142283985160822229
[03/25/2022-13:24:27] [V] [TRT] Tactic: 8142283985160822229 Time: 0.070656
[03/25/2022-13:24:27] [V] [TRT] sections.1.2.conv1.module.weight + QuantizeLinear_312_quantize_scale_node + Conv_316 + Relu_318 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage4_warpsize2x4x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 8173975624668590862
[03/25/2022-13:24:27] [V] [TRT] Tactic: 8173975624668590862 Time: 0.099456
[03/25/2022-13:24:27] [V] [TRT] sections.1.2.conv1.module.weight + QuantizeLinear_312_quantize_scale_node + Conv_316 + Relu_318 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage4_warpsize4x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 8234775147403903473
[03/25/2022-13:24:27] [V] [TRT] Tactic: 8234775147403903473 Time: 0.072192
[03/25/2022-13:24:27] [V] [TRT] sections.1.2.conv1.module.weight + QuantizeLinear_312_quantize_scale_node + Conv_316 + Relu_318 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x128x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 8524082966802584889
[03/25/2022-13:24:27] [V] [TRT] Tactic: 8524082966802584889 Time: 0.062848
[03/25/2022-13:24:27] [V] [TRT] sections.1.2.conv1.module.weight + QuantizeLinear_312_quantize_scale_node + Conv_316 + Relu_318 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage1_warpsize4x1x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: 8684013308930763400
[03/25/2022-13:24:27] [V] [TRT] Tactic: 8684013308930763400 Time: 0.080768
[03/25/2022-13:24:27] [V] [TRT] sections.1.2.conv1.module.weight + QuantizeLinear_312_quantize_scale_node + Conv_316 + Relu_318 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x128x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 8765382722978397630
[03/25/2022-13:24:27] [V] [TRT] Tactic: 8765382722978397630 Time: 0.063104
[03/25/2022-13:24:27] [V] [TRT] sections.1.2.conv1.module.weight + QuantizeLinear_312_quantize_scale_node + Conv_316 + Relu_318 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 8843193587782643431
[03/25/2022-13:24:27] [V] [TRT] Tactic: 8843193587782643431 Time: 0.096
[03/25/2022-13:24:27] [V] [TRT] sections.1.2.conv1.module.weight + QuantizeLinear_312_quantize_scale_node + Conv_316 + Relu_318 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x128_stage3_warpsize1x4x1_g1_sptensor16x8x64_simple_t1r1s1 Tactic: 8883810517410230831
[03/25/2022-13:24:27] [V] [TRT] Tactic: 8883810517410230831 Time: 0.063488
[03/25/2022-13:24:27] [V] [TRT] sections.1.2.conv1.module.weight + QuantizeLinear_312_quantize_scale_node + Conv_316 + Relu_318 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 8930797211803511337
[03/25/2022-13:24:27] [V] [TRT] Tactic: 8930797211803511337 Time: 0.086528
[03/25/2022-13:24:27] [V] [TRT] sections.1.2.conv1.module.weight + QuantizeLinear_312_quantize_scale_node + Conv_316 + Relu_318 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 8935070489925739043
[03/25/2022-13:24:27] [V] [TRT] Tactic: 8935070489925739043 Time: 0.073728
[03/25/2022-13:24:27] [V] [TRT] sections.1.2.conv1.module.weight + QuantizeLinear_312_quantize_scale_node + Conv_316 + Relu_318 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x16x128_stage3_warpsize1x1x1_g1_sptensor16x8x64_t1r1s1 Tactic: 9062173295331155069
[03/25/2022-13:24:27] [V] [TRT] Tactic: 9062173295331155069 Time: 0.113408
[03/25/2022-13:24:27] [V] [TRT] sections.1.2.conv1.module.weight + QuantizeLinear_312_quantize_scale_node + Conv_316 + Relu_318 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_singleBuffer_small_nt_v1 Tactic: -9118785798277698619
[03/25/2022-13:24:27] [V] [TRT] Tactic: -9118785798277698619 Time: 0.084224
[03/25/2022-13:24:27] [V] [TRT] sections.1.2.conv1.module.weight + QuantizeLinear_312_quantize_scale_node + Conv_316 + Relu_318 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -8985599729413291927
[03/25/2022-13:24:27] [V] [TRT] Tactic: -8985599729413291927 Time: 0.061568
[03/25/2022-13:24:27] [V] [TRT] sections.1.2.conv1.module.weight + QuantizeLinear_312_quantize_scale_node + Conv_316 + Relu_318 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x32x64_stage4_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -8972697510150675429
[03/25/2022-13:24:27] [V] [TRT] Tactic: -8972697510150675429 Time: 0.073856
[03/25/2022-13:24:27] [V] [TRT] sections.1.2.conv1.module.weight + QuantizeLinear_312_quantize_scale_node + Conv_316 + Relu_318 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r1s1 Tactic: -8943710627305202139
[03/25/2022-13:24:27] [V] [TRT] Tactic: -8943710627305202139 Time: 0.08
[03/25/2022-13:24:27] [V] [TRT] sections.1.2.conv1.module.weight + QuantizeLinear_312_quantize_scale_node + Conv_316 + Relu_318 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: -8859846367886814331
[03/25/2022-13:24:27] [V] [TRT] Tactic: -8859846367886814331 Time: 0.090496
[03/25/2022-13:24:27] [V] [TRT] sections.1.2.conv1.module.weight + QuantizeLinear_312_quantize_scale_node + Conv_316 + Relu_318 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: -8638624340850784688
[03/25/2022-13:24:27] [V] [TRT] Tactic: -8638624340850784688 Time: 0.10112
[03/25/2022-13:24:27] [V] [TRT] sections.1.2.conv1.module.weight + QuantizeLinear_312_quantize_scale_node + Conv_316 + Relu_318 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage4_warpsize4x1x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: -8556775352640313933
[03/25/2022-13:24:27] [V] [TRT] Tactic: -8556775352640313933 Time: 0.065664
[03/25/2022-13:24:27] [V] [TRT] sections.1.2.conv1.module.weight + QuantizeLinear_312_quantize_scale_node + Conv_316 + Relu_318 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage1_warpsize2x1x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: -8382298409581540699
[03/25/2022-13:24:27] [V] [TRT] Tactic: -8382298409581540699 Time: 0.112896
[03/25/2022-13:24:27] [V] [TRT] sections.1.2.conv1.module.weight + QuantizeLinear_312_quantize_scale_node + Conv_316 + Relu_318 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage1_warpsize4x2x1_g1_tensor8x8x16_t1r1s1 Tactic: -8172318747337038866
[03/25/2022-13:24:27] [V] [TRT] Tactic: -8172318747337038866 Time: 0.087424
[03/25/2022-13:24:27] [V] [TRT] sections.1.2.conv1.module.weight + QuantizeLinear_312_quantize_scale_node + Conv_316 + Relu_318 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x256x64_stage3_warpsize2x4x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -8038164441468184723
[03/25/2022-13:24:27] [V] [TRT] Tactic: -8038164441468184723 Time: 0.087936
[03/25/2022-13:24:27] [V] [TRT] sections.1.2.conv1.module.weight + QuantizeLinear_312_quantize_scale_node + Conv_316 + Relu_318 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x96x64_stage5_warpsize2x1x1_g1_tensor16x8x32_t1r1s1 Tactic: -7844028314176826857
[03/25/2022-13:24:27] [V] [TRT] Tactic: -7844028314176826857 Time: 0.116864
[03/25/2022-13:24:27] [V] [TRT] sections.1.2.conv1.module.weight + QuantizeLinear_312_quantize_scale_node + Conv_316 + Relu_318 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x128_stage3_warpsize2x2x1_g1_sptensor16x8x64_simple_t1r1s1_no_preds Tactic: -7674507941016740570
[03/25/2022-13:24:27] [V] [TRT] Tactic: -7674507941016740570 Time: 0.06208
[03/25/2022-13:24:27] [V] [TRT] sections.1.2.conv1.module.weight + QuantizeLinear_312_quantize_scale_node + Conv_316 + Relu_318 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage4_warpsize2x4x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -7364286662638617917
[03/25/2022-13:24:27] [V] [TRT] Tactic: -7364286662638617917 Time: 0.08832
[03/25/2022-13:24:27] [V] [TRT] sections.1.2.conv1.module.weight + QuantizeLinear_312_quantize_scale_node + Conv_316 + Relu_318 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage1_warpsize2x1x1_g1_tensor8x8x16_t1r1s1 Tactic: -7361755530333096258
[03/25/2022-13:24:27] [V] [TRT] Tactic: -7361755530333096258 Time: 0.09728
[03/25/2022-13:24:27] [V] [TRT] sections.1.2.conv1.module.weight + QuantizeLinear_312_quantize_scale_node + Conv_316 + Relu_318 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x128_stage3_warpsize1x4x1_g1_sptensor16x8x64_t1r1s1 Tactic: -7289760022626653388
[03/25/2022-13:24:27] [V] [TRT] Tactic: -7289760022626653388 Time: 0.064256
[03/25/2022-13:24:27] [V] [TRT] sections.1.2.conv1.module.weight + QuantizeLinear_312_quantize_scale_node + Conv_316 + Relu_318 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage1_warpsize2x4x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: -7106539943789766885
[03/25/2022-13:24:27] [V] [TRT] Tactic: -7106539943789766885 Time: 0.147072
[03/25/2022-13:24:27] [V] [TRT] sections.1.2.conv1.module.weight + QuantizeLinear_312_quantize_scale_node + Conv_316 + Relu_318 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage1_warpsize4x2x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: -6969478418607271266
[03/25/2022-13:24:27] [V] [TRT] Tactic: -6969478418607271266 Time: 0.086272
[03/25/2022-13:24:27] [V] [TRT] sections.1.2.conv1.module.weight + QuantizeLinear_312_quantize_scale_node + Conv_316 + Relu_318 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -6930438165437733000
[03/25/2022-13:24:27] [V] [TRT] Tactic: -6930438165437733000 Time: 0.136832
[03/25/2022-13:24:27] [V] [TRT] sections.1.2.conv1.module.weight + QuantizeLinear_312_quantize_scale_node + Conv_316 + Relu_318 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x64x64_stage3_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -6879607992933502380
[03/25/2022-13:24:27] [V] [TRT] Tactic: -6879607992933502380 Time: 0.065792
[03/25/2022-13:24:27] [V] [TRT] sections.1.2.conv1.module.weight + QuantizeLinear_312_quantize_scale_node + Conv_316 + Relu_318 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x128_stage3_warpsize1x2x1_g1_sptensor16x8x64_simple_t1r1s1_no_preds Tactic: -6839669803644810934
[03/25/2022-13:24:27] [V] [TRT] Tactic: -6839669803644810934 Time: 0.062592
[03/25/2022-13:24:27] [V] [TRT] sections.1.2.conv1.module.weight + QuantizeLinear_312_quantize_scale_node + Conv_316 + Relu_318 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x128_stage3_warpsize1x2x1_g1_sptensor16x8x64_t1r1s1_no_preds Tactic: -6812830108414456369
[03/25/2022-13:24:27] [V] [TRT] Tactic: -6812830108414456369 Time: 0.062848
[03/25/2022-13:24:27] [V] [TRT] sections.1.2.conv1.module.weight + QuantizeLinear_312_quantize_scale_node + Conv_316 + Relu_318 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: -6527178416855951297
[03/25/2022-13:24:27] [V] [TRT] Tactic: -6527178416855951297 Time: 0.104064
[03/25/2022-13:24:27] [V] [TRT] sections.1.2.conv1.module.weight + QuantizeLinear_312_quantize_scale_node + Conv_316 + Relu_318 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_t1r1s1 Tactic: -6510232214299595844
[03/25/2022-13:24:27] [V] [TRT] Tactic: -6510232214299595844 Time: 0.101888
[03/25/2022-13:24:27] [V] [TRT] sections.1.2.conv1.module.weight + QuantizeLinear_312_quantize_scale_node + Conv_316 + Relu_318 Set Tactic Name: ampere_int8_i8816cudnn_int8_128x128_ldg16_relu_small_nt_v1 Tactic: -6400348606759295499
[03/25/2022-13:24:27] [V] [TRT] Tactic: -6400348606759295499 Time: 0.080896
[03/25/2022-13:24:27] [V] [TRT] sections.1.2.conv1.module.weight + QuantizeLinear_312_quantize_scale_node + Conv_316 + Relu_318 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage1_warpsize2x2x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: -6346247605026339453
[03/25/2022-13:24:27] [V] [TRT] Tactic: -6346247605026339453 Time: 0.079104
[03/25/2022-13:24:27] [V] [TRT] sections.1.2.conv1.module.weight + QuantizeLinear_312_quantize_scale_node + Conv_316 + Relu_318 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x96x64_stage4_warpsize4x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -6232597026469067819
[03/25/2022-13:24:27] [V] [TRT] Tactic: -6232597026469067819 Time: 0.098048
[03/25/2022-13:24:27] [V] [TRT] sections.1.2.conv1.module.weight + QuantizeLinear_312_quantize_scale_node + Conv_316 + Relu_318 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x128_ldg16_relu_small_nt_v1 Tactic: -5980889159865208399
[03/25/2022-13:24:27] [V] [TRT] Tactic: -5980889159865208399 Time: 0.08896
[03/25/2022-13:24:27] [V] [TRT] sections.1.2.conv1.module.weight + QuantizeLinear_312_quantize_scale_node + Conv_316 + Relu_318 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_singleBuffer_medium_nt_v1 Tactic: -5766140806760372989
[03/25/2022-13:24:27] [V] [TRT] Tactic: -5766140806760372989 Time: 0.086272
[03/25/2022-13:24:27] [V] [TRT] sections.1.2.conv1.module.weight + QuantizeLinear_312_quantize_scale_node + Conv_316 + Relu_318 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r1s1 Tactic: -5697614955743334137
[03/25/2022-13:24:27] [V] [TRT] Tactic: -5697614955743334137 Time: 0.078336
[03/25/2022-13:24:27] [V] [TRT] sections.1.2.conv1.module.weight + QuantizeLinear_312_quantize_scale_node + Conv_316 + Relu_318 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -5671123121710113970
[03/25/2022-13:24:27] [V] [TRT] Tactic: -5671123121710113970 Time: 0.083328
[03/25/2022-13:24:27] [V] [TRT] sections.1.2.conv1.module.weight + QuantizeLinear_312_quantize_scale_node + Conv_316 + Relu_318 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x32x64_stage4_warpsize2x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -5615581362569252260
[03/25/2022-13:24:27] [V] [TRT] Tactic: -5615581362569252260 Time: 0.086656
[03/25/2022-13:24:27] [V] [TRT] sections.1.2.conv1.module.weight + QuantizeLinear_312_quantize_scale_node + Conv_316 + Relu_318 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x128x64_stage3_warpsize4x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -5562968047117507056
[03/25/2022-13:24:27] [V] [TRT] Tactic: -5562968047117507056 Time: 0.070528
[03/25/2022-13:24:27] [V] [TRT] sections.1.2.conv1.module.weight + QuantizeLinear_312_quantize_scale_node + Conv_316 + Relu_318 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -5516472881360101487
[03/25/2022-13:24:27] [V] [TRT] Tactic: -5516472881360101487 Time: 0.100352
[03/25/2022-13:24:27] [V] [TRT] sections.1.2.conv1.module.weight + QuantizeLinear_312_quantize_scale_node + Conv_316 + Relu_318 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r1s1 Tactic: -5311474420963248369
[03/25/2022-13:24:27] [V] [TRT] Tactic: -5311474420963248369 Time: 0.100736
[03/25/2022-13:24:27] [V] [TRT] sections.1.2.conv1.module.weight + QuantizeLinear_312_quantize_scale_node + Conv_316 + Relu_318 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: -5170003087447722174
[03/25/2022-13:24:27] [V] [TRT] Tactic: -5170003087447722174 Time: 0.115328
[03/25/2022-13:24:27] [V] [TRT] sections.1.2.conv1.module.weight + QuantizeLinear_312_quantize_scale_node + Conv_316 + Relu_318 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage6_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -4889586143772361690
[03/25/2022-13:24:27] [V] [TRT] Tactic: -4889586143772361690 Time: 0.066816
[03/25/2022-13:24:27] [V] [TRT] sections.1.2.conv1.module.weight + QuantizeLinear_312_quantize_scale_node + Conv_316 + Relu_318 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x128_stage3_warpsize1x4x1_g1_sptensor16x8x64_simple_t1r1s1_no_preds Tactic: -4889498558023475527
[03/25/2022-13:24:27] [V] [TRT] Tactic: -4889498558023475527 Time: 0.063872
[03/25/2022-13:24:27] [V] [TRT] sections.1.2.conv1.module.weight + QuantizeLinear_312_quantize_scale_node + Conv_316 + Relu_318 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: -4849712423393454704
[03/25/2022-13:24:27] [V] [TRT] Tactic: -4849712423393454704 Time: 0.0672
[03/25/2022-13:24:27] [V] [TRT] sections.1.2.conv1.module.weight + QuantizeLinear_312_quantize_scale_node + Conv_316 + Relu_318 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x128_stage3_warpsize4x2x1_g1_sptensor16x8x64_simple_t1r1s1 Tactic: -4681913707320020520
[03/25/2022-13:24:27] [V] [TRT] Tactic: -4681913707320020520 Time: 0.086144
[03/25/2022-13:24:27] [V] [TRT] sections.1.2.conv1.module.weight + QuantizeLinear_312_quantize_scale_node + Conv_316 + Relu_318 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_medium_nt_v1 Tactic: -4516822589357530549
[03/25/2022-13:24:27] [V] [TRT] Tactic: -4516822589357530549 Time: 0.088192
[03/25/2022-13:24:27] [V] [TRT] sections.1.2.conv1.module.weight + QuantizeLinear_312_quantize_scale_node + Conv_316 + Relu_318 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -4455415102719506646
[03/25/2022-13:24:27] [V] [TRT] Tactic: -4455415102719506646 Time: 0.077696
[03/25/2022-13:24:27] [V] [TRT] sections.1.2.conv1.module.weight + QuantizeLinear_312_quantize_scale_node + Conv_316 + Relu_318 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32_t1r1s1 Tactic: -4425346730823666456
[03/25/2022-13:24:27] [V] [TRT] Tactic: -4425346730823666456 Time: 0.1088
[03/25/2022-13:24:27] [V] [TRT] sections.1.2.conv1.module.weight + QuantizeLinear_312_quantize_scale_node + Conv_316 + Relu_318 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage1_warpsize2x1x1_g1_tensor8x8x16_t1r1s1 Tactic: -4260476497340370474
[03/25/2022-13:24:27] [V] [TRT] Tactic: -4260476497340370474 Time: 0.114048
[03/25/2022-13:24:27] [V] [TRT] sections.1.2.conv1.module.weight + QuantizeLinear_312_quantize_scale_node + Conv_316 + Relu_318 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x2x1_g1_tensor16x8x32_t1r1s1 Tactic: -4182501876984672402
[03/25/2022-13:24:27] [V] [TRT] Tactic: -4182501876984672402 Time: 0.084352
[03/25/2022-13:24:27] [V] [TRT] sections.1.2.conv1.module.weight + QuantizeLinear_312_quantize_scale_node + Conv_316 + Relu_318 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x256x64_stage3_warpsize2x4x1_g1_tensor16x8x32_t1r1s1 Tactic: -4151617293257698859
[03/25/2022-13:24:27] [V] [TRT] Tactic: -4151617293257698859 Time: 0.099584
[03/25/2022-13:24:27] [V] [TRT] sections.1.2.conv1.module.weight + QuantizeLinear_312_quantize_scale_node + Conv_316 + Relu_318 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -3825889760337461729
[03/25/2022-13:24:27] [V] [TRT] Tactic: -3825889760337461729 Time: 0.10112
[03/25/2022-13:24:27] [V] [TRT] sections.1.2.conv1.module.weight + QuantizeLinear_312_quantize_scale_node + Conv_316 + Relu_318 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -3797022944823726673
[03/25/2022-13:24:27] [V] [TRT] Tactic: -3797022944823726673 Time: 0.076288
[03/25/2022-13:24:27] [V] [TRT] sections.1.2.conv1.module.weight + QuantizeLinear_312_quantize_scale_node + Conv_316 + Relu_318 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_t1r1s1 Tactic: -3613322253849278738
[03/25/2022-13:24:27] [V] [TRT] Tactic: -3613322253849278738 Time: 0.139904
[03/25/2022-13:24:27] [V] [TRT] sections.1.2.conv1.module.weight + QuantizeLinear_312_quantize_scale_node + Conv_316 + Relu_318 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x32x64_stage4_warpsize4x1x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: -3577322188448771475
[03/25/2022-13:24:27] [V] [TRT] Tactic: -3577322188448771475 Time: 0.077952
[03/25/2022-13:24:27] [V] [TRT] sections.1.2.conv1.module.weight + QuantizeLinear_312_quantize_scale_node + Conv_316 + Relu_318 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage1_warpsize2x4x1_g1_tensor8x8x16_t1r1s1 Tactic: -3531681826488401618
[03/25/2022-13:24:27] [V] [TRT] Tactic: -3531681826488401618 Time: 0.150016
[03/25/2022-13:24:27] [V] [TRT] sections.1.2.conv1.module.weight + QuantizeLinear_312_quantize_scale_node + Conv_316 + Relu_318 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x16x128_stage3_warpsize1x1x1_g1_sptensor16x8x64_simple_t1r1s1_no_preds Tactic: -3305554949874552860
[03/25/2022-13:24:27] [V] [TRT] Tactic: -3305554949874552860 Time: 0.112768
[03/25/2022-13:24:27] [V] [TRT] sections.1.2.conv1.module.weight + QuantizeLinear_312_quantize_scale_node + Conv_316 + Relu_318 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x192x64_stage3_warpsize4x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -3288585994448820820
[03/25/2022-13:24:27] [V] [TRT] Tactic: -3288585994448820820 Time: 0.0896
[03/25/2022-13:24:27] [V] [TRT] sections.1.2.conv1.module.weight + QuantizeLinear_312_quantize_scale_node + Conv_316 + Relu_318 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x32x64_stage4_warpsize4x1x1_g1_tensor16x8x32_t1r1s1 Tactic: -2754311112012636251
[03/25/2022-13:24:27] [V] [TRT] Tactic: -2754311112012636251 Time: 0.078976
[03/25/2022-13:24:27] [V] [TRT] sections.1.2.conv1.module.weight + QuantizeLinear_312_quantize_scale_node + Conv_316 + Relu_318 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -2432868635536396215
[03/25/2022-13:24:27] [V] [TRT] Tactic: -2432868635536396215 Time: 0.08512
[03/25/2022-13:24:27] [V] [TRT] sections.1.2.conv1.module.weight + QuantizeLinear_312_quantize_scale_node + Conv_316 + Relu_318 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x96x64_stage5_warpsize2x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -2379804152300264660
[03/25/2022-13:24:27] [V] [TRT] Tactic: -2379804152300264660 Time: 0.114048
[03/25/2022-13:24:27] [V] [TRT] sections.1.2.conv1.module.weight + QuantizeLinear_312_quantize_scale_node + Conv_316 + Relu_318 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x128_stage3_warpsize2x2x1_g1_sptensor16x8x64_t1r1s1 Tactic: -2352253835013627337
[03/25/2022-13:24:27] [V] [TRT] Tactic: -2352253835013627337 Time: 0.062848
[03/25/2022-13:24:27] [V] [TRT] sections.1.2.conv1.module.weight + QuantizeLinear_312_quantize_scale_node + Conv_316 + Relu_318 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x96x64_stage4_warpsize4x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -2335587136911650799
[03/25/2022-13:24:27] [V] [TRT] Tactic: -2335587136911650799 Time: 0.086784
[03/25/2022-13:24:27] [V] [TRT] sections.1.2.conv1.module.weight + QuantizeLinear_312_quantize_scale_node + Conv_316 + Relu_318 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage4_warpsize2x4x1_g1_tensor16x8x32_t1r1s1 Tactic: -2315453944962430928
[03/25/2022-13:24:27] [V] [TRT] Tactic: -2315453944962430928 Time: 0.101632
[03/25/2022-13:24:27] [V] [TRT] sections.1.2.conv1.module.weight + QuantizeLinear_312_quantize_scale_node + Conv_316 + Relu_318 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x128_stage3_warpsize1x1x1_g1_sptensor16x8x64_simple_t1r1s1 Tactic: -2238364958919154661
[03/25/2022-13:24:27] [V] [TRT] Tactic: -2238364958919154661 Time: 0.078592
[03/25/2022-13:24:27] [V] [TRT] sections.1.2.conv1.module.weight + QuantizeLinear_312_quantize_scale_node + Conv_316 + Relu_318 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -1916483171117495388
[03/25/2022-13:24:27] [V] [TRT] Tactic: -1916483171117495388 Time: 0.08832
[03/25/2022-13:24:27] [V] [TRT] sections.1.2.conv1.module.weight + QuantizeLinear_312_quantize_scale_node + Conv_316 + Relu_318 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x16x128_stage3_warpsize1x1x1_g1_sptensor16x8x64_t1r1s1_no_preds Tactic: -1740762957710554518
[03/25/2022-13:24:27] [V] [TRT] Tactic: -1740762957710554518 Time: 0.113024
[03/25/2022-13:24:27] [V] [TRT] sections.1.2.conv1.module.weight + QuantizeLinear_312_quantize_scale_node + Conv_316 + Relu_318 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x96x64_stage4_warpsize4x2x1_g1_tensor16x8x32_t1r1s1 Tactic: -1549742793039499659
[03/25/2022-13:24:27] [V] [TRT] Tactic: -1549742793039499659 Time: 0.099456
[03/25/2022-13:24:27] [V] [TRT] sections.1.2.conv1.module.weight + QuantizeLinear_312_quantize_scale_node + Conv_316 + Relu_318 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: -1499578657823798783
[03/25/2022-13:24:27] [V] [TRT] Tactic: -1499578657823798783 Time: 0.077952
[03/25/2022-13:24:27] [V] [TRT] sections.1.2.conv1.module.weight + QuantizeLinear_312_quantize_scale_node + Conv_316 + Relu_318 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r1s1 Tactic: -1494157908358500249
[03/25/2022-13:24:27] [V] [TRT] Tactic: -1494157908358500249 Time: 0.092288
[03/25/2022-13:24:27] [V] [TRT] sections.1.2.conv1.module.weight + QuantizeLinear_312_quantize_scale_node + Conv_316 + Relu_318 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x4x1_g1_tensor16x8x32_t1r1s1 Tactic: -1328736756812546664
[03/25/2022-13:24:27] [V] [TRT] Tactic: -1328736756812546664 Time: 0.08256
[03/25/2022-13:24:27] [V] [TRT] sections.1.2.conv1.module.weight + QuantizeLinear_312_quantize_scale_node + Conv_316 + Relu_318 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x32x64_stage4_warpsize2x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -1006589727652607355
[03/25/2022-13:24:27] [V] [TRT] Tactic: -1006589727652607355 Time: 0.086144
[03/25/2022-13:24:27] [V] [TRT] sections.1.2.conv1.module.weight + QuantizeLinear_312_quantize_scale_node + Conv_316 + Relu_318 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: -713022856474991236
[03/25/2022-13:24:27] [V] [TRT] Tactic: -713022856474991236 Time: 0.140672
[03/25/2022-13:24:27] [V] [TRT] sections.1.2.conv1.module.weight + QuantizeLinear_312_quantize_scale_node + Conv_316 + Relu_318 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x128_stage3_warpsize1x1x1_g1_sptensor16x8x64_t1r1s1_no_preds Tactic: -619668460699260222
[03/25/2022-13:24:27] [V] [TRT] Tactic: -619668460699260222 Time: 0.078464
[03/25/2022-13:24:27] [V] [TRT] sections.1.2.conv1.module.weight + QuantizeLinear_312_quantize_scale_node + Conv_316 + Relu_318 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x64x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: -405554772060757402
[03/25/2022-13:24:27] [V] [TRT] Tactic: -405554772060757402 Time: 0.068352
[03/25/2022-13:24:27] [V] [TRT] sections.1.2.conv1.module.weight + QuantizeLinear_312_quantize_scale_node + Conv_316 + Relu_318 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: -375949437730908730
[03/25/2022-13:24:27] [V] [TRT] Tactic: -375949437730908730 Time: 0.064
[03/25/2022-13:24:27] [V] [TRT] sections.1.2.conv1.module.weight + QuantizeLinear_312_quantize_scale_node + Conv_316 + Relu_318 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x128_stage3_warpsize1x2x1_g1_sptensor16x8x64_t1r1s1 Tactic: -233227833606287806
[03/25/2022-13:24:27] [V] [TRT] Tactic: -233227833606287806 Time: 0.06336
[03/25/2022-13:24:27] [V] [TRT] sections.1.2.conv1.module.weight + QuantizeLinear_312_quantize_scale_node + Conv_316 + Relu_318 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -111878368089469751
[03/25/2022-13:24:27] [V] [TRT] Tactic: -111878368089469751 Time: 0.100736
[03/25/2022-13:24:27] [V] [TRT] sections.1.2.conv1.module.weight + QuantizeLinear_312_quantize_scale_node + Conv_316 + Relu_318 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x64x64_stage3_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -48936598874722005
[03/25/2022-13:24:27] [V] [TRT] Tactic: -48936598874722005 Time: 0.063616
[03/25/2022-13:24:27] [V] [TRT] sections.1.2.conv1.module.weight + QuantizeLinear_312_quantize_scale_node + Conv_316 + Relu_318 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x32x64_stage4_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -19707840769375107
[03/25/2022-13:24:27] [V] [TRT] Tactic: -19707840769375107 Time: 0.076672
[03/25/2022-13:24:27] [V] [TRT] Fastest Tactic: 1913026264725750683 Time: 0.05888
[03/25/2022-13:24:27] [V] [TRT] >>>>>>>>>>>>>>> Chose Runner Type: CaskConvolution Tactic: 1913026264725750683
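The run above is one complete autotuning pass for the fused node sections.1.2.conv1.module.weight + QuantizeLinear_312_quantize_scale_node + Conv_316 + Relu_318: every eligible kernel (dense implicit-GEMM, 2:4 sparse, and cuDNN-style tactics) is timed at the build shape, and the builder keeps the fastest candidate, here tactic 1913026264725750683 at 0.05888 ms, which was measured earlier in the run, before this excerpt. The verbose lines follow a fixed "Set Tactic Name: <kernel> Tactic: <id>" / "Tactic: <id> Time: <ms>" pattern, so per-node winners can be extracted mechanically. A minimal sketch in Python, assuming the TensorRT 8.2 trtexec --verbose format shown here (the regexes and the fastest_tactics helper are illustrative, not part of any TensorRT API):

import re
import sys

# An announcement line ties a tactic id to a node and kernel name, e.g.
#   [V] [TRT] <node> Set Tactic Name: <kernel> Tactic: <id>
NAMED = re.compile(r"\[V\] \[TRT\] (.+?) Set Tactic Name: (\S+) Tactic: (-?\d+)")
# A measurement line then reports the time for that tactic id, e.g.
#   [V] [TRT] Tactic: <id> Time: <time>
TIMED = re.compile(r"\[V\] \[TRT\] Tactic: (-?\d+) Time: ([0-9.]+)")

def fastest_tactics(lines):
    """Map each fused node to its fastest (tactic id, kernel name, time)."""
    announced = {}  # tactic id -> (node, kernel) from the latest announcement
    best = {}       # node -> (tactic id, kernel, time)
    for line in lines:
        m = NAMED.search(line)
        if m:
            announced[int(m.group(3))] = (m.group(1), m.group(2))
            continue
        m = TIMED.search(line)
        if m:
            tid, t = int(m.group(1)), float(m.group(2))
            node, kernel = announced.get(tid, ("<unknown>", "<unnamed>"))
            if node not in best or t < best[node][2]:
                best[node] = (tid, kernel, t)
    return best

if __name__ == "__main__":
    # e.g. trtexec --onnx=... --int8 --sparsity=enable --verbose | python3 tactics.py
    for node, (tid, kernel, t) in fastest_tactics(sys.stdin).items():
        print(f"{t:.6f}  tactic {tid}  {kernel}")
        print(f"    node: {node}")

In the "Autotuning format combination" lines that follow, the tuples appear to be per-dimension element strides, with the vectorized channel dimension annotated by its vector width: for this 128-channel 28x28 activation, Int8(25088,784:4,28,1) is the CHW4 layout ((128/4)*28*28 = 25088 elements per image) and Int8(3136,784:32,28,1) is CHW32 ((128/32)*28*28 = 3136), so the builder times CHW4->CHW4, CHW4->CHW32, and CHW32->CHW32 variants of the next convolution.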
[03/25/2022-13:24:27] [V] [TRT] =============== Computing costs for
[03/25/2022-13:24:27] [V] [TRT] *************** Autotuning format combination: Int8(25088,784:4,28,1) -> Int8(25088,784:4,28,1) ***************
[03/25/2022-13:24:27] [V] [TRT] *************** Autotuning format combination: Int8(25088,784:4,28,1) -> Int8(3136,784:32,28,1) ***************
[03/25/2022-13:24:27] [V] [TRT] *************** Autotuning format combination: Int8(3136,784:32,28,1) -> Int8(3136,784:32,28,1) ***************
[03/25/2022-13:24:27] [V] [TRT] Skip timing cache hit (epiFadd mismatch) for node: sections.1.2.conv2.module.weight + QuantizeLinear_327_quantize_scale_node + Conv_331 + Relu_333, LayerImpl: CaskConvolution, tactic: 4414594337986714263
[03/25/2022-13:24:27] [V] [TRT] --------------- Timing Runner: sections.1.2.conv2.module.weight + QuantizeLinear_327_quantize_scale_node + Conv_331 + Relu_333 (CudaGroupConvolution)
[03/25/2022-13:24:27] [V] [TRT] CudaGroupConvolution has no valid tactics for this config, skipping
[03/25/2022-13:24:27] [V] [TRT] --------------- Timing Runner: sections.1.2.conv2.module.weight + QuantizeLinear_327_quantize_scale_node + Conv_331 + Relu_333 (CudaDepthwiseConvolution)
[03/25/2022-13:24:27] [V] [TRT] CudaDepthwiseConvolution has no valid tactics for this config, skipping
[03/25/2022-13:24:27] [V] [TRT] --------------- Timing Runner: sections.1.2.conv2.module.weight + QuantizeLinear_327_quantize_scale_node + Conv_331 + Relu_333 (FusedConvActConvolution)
[03/25/2022-13:24:27] [V] [TRT] FusedConvActConvolution has no valid tactics for this config, skipping
[03/25/2022-13:24:27] [V] [TRT] --------------- Timing Runner: sections.1.2.conv2.module.weight + QuantizeLinear_327_quantize_scale_node + Conv_331 + Relu_333 (CaskConvolution)
[03/25/2022-13:24:27] [V] [TRT] sections.1.2.conv2.module.weight + QuantizeLinear_327_quantize_scale_node + Conv_331 + Relu_333 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_epifadd Tactic: 177040020707947851
[03/25/2022-13:24:27] [V] [TRT] Tactic: 177040020707947851 Time: 0.212352
[03/25/2022-13:24:27] [V] [TRT] sections.1.2.conv2.module.weight + QuantizeLinear_327_quantize_scale_node + Conv_331 + Relu_333 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r3s3 Tactic: 184229963126259101
[03/25/2022-13:24:27] [V] [TRT] Tactic: 184229963126259101 Time: 0.18944
[03/25/2022-13:24:27] [V] [TRT] sections.1.2.conv2.module.weight + QuantizeLinear_327_quantize_scale_node + Conv_331 + Relu_333 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage1_warpsize4x2x1_g1_tensor8x8x16_t1r3s3 Tactic: 289888059097454627
[03/25/2022-13:24:27] [V] [TRT] Tactic: 289888059097454627 Time: 0.182272
[03/25/2022-13:24:27] [V] [TRT] sections.1.2.conv2.module.weight + QuantizeLinear_327_quantize_scale_node + Conv_331 + Relu_333 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32 Tactic: 328135613486708155
[03/25/2022-13:24:27] [V] [TRT] Tactic: 328135613486708155 Time: 0.330496
[03/25/2022-13:24:27] [V] [TRT] sections.1.2.conv2.module.weight + QuantizeLinear_327_quantize_scale_node + Conv_331 + Relu_333 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r3s3 Tactic: 680740992583869928
[03/25/2022-13:24:27] [V] [TRT] Tactic: 680740992583869928 Time: 0.170112
[03/25/2022-13:24:27] [V] [TRT] sections.1.2.conv2.module.weight + QuantizeLinear_327_quantize_scale_node + Conv_331 + Relu_333 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x192x64_stage3_warpsize4x2x1_g1_tensor16x8x32 Tactic: 1111159740952609683
[03/25/2022-13:24:27] [V] [TRT] Tactic: 1111159740952609683 Time: 0.169728
[03/25/2022-13:24:27] [V] [TRT] sections.1.2.conv2.module.weight + QuantizeLinear_327_quantize_scale_node + Conv_331 + Relu_333 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32 Tactic: 1134860903395928905
[03/25/2022-13:24:27] [V] [TRT] Tactic: 1134860903395928905 Time: 0.178048
[03/25/2022-13:24:27] [V] [TRT] sections.1.2.conv2.module.weight + QuantizeLinear_327_quantize_scale_node + Conv_331 + Relu_333 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32 Tactic: 1276591930377039442
[03/25/2022-13:24:27] [V] [TRT] Tactic: 1276591930377039442 Time: 0.200576
[03/25/2022-13:24:27] [V] [TRT] sections.1.2.conv2.module.weight + QuantizeLinear_327_quantize_scale_node + Conv_331 + Relu_333 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage1_warpsize2x1x1_g1_tensor8x8x16_t1r3s3 Tactic: 1388866374720163187
[03/25/2022-13:24:27] [V] [TRT] Tactic: 1388866374720163187 Time: 0.21568
[03/25/2022-13:24:27] [V] [TRT] sections.1.2.conv2.module.weight + QuantizeLinear_327_quantize_scale_node + Conv_331 + Relu_333 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x96x64_stage5_warpsize2x1x1_g1_tensor16x8x32 Tactic: 1399501420456320585
[03/25/2022-13:24:27] [V] [TRT] Tactic: 1399501420456320585 Time: 0.226176
[03/25/2022-13:24:27] [V] [TRT] sections.1.2.conv2.module.weight + QuantizeLinear_327_quantize_scale_node + Conv_331 + Relu_333 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_epifadd Tactic: 1550399266192842845
[03/25/2022-13:24:27] [V] [TRT] Tactic: 1550399266192842845 Time: 0.209664
[03/25/2022-13:24:27] [V] [TRT] sections.1.2.conv2.module.weight + QuantizeLinear_327_quantize_scale_node + Conv_331 + Relu_333 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x32x64_stage4_warpsize4x1x1_g1_tensor16x8x32_t1r3s3_epifadd Tactic: 1572887561103143487
[03/25/2022-13:24:27] [V] [TRT] Tactic: 1572887561103143487 Time: 0.133504
[03/25/2022-13:24:27] [V] [TRT] sections.1.2.conv2.module.weight + QuantizeLinear_327_quantize_scale_node + Conv_331 + Relu_333 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x96x64_stage5_warpsize2x1x1_g1_tensor16x8x32_t1r3s3 Tactic: 1853122447892949466
[03/25/2022-13:24:27] [V] [TRT] Tactic: 1853122447892949466 Time: 0.22464
[03/25/2022-13:24:27] [V] [TRT] sections.1.2.conv2.module.weight + QuantizeLinear_327_quantize_scale_node + Conv_331 + Relu_333 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage1_warpsize2x2x1_g1_tensor8x8x16 Tactic: 2133329569091732311
[03/25/2022-13:24:27] [V] [TRT] Tactic: 2133329569091732311 Time: 0.174336
[03/25/2022-13:24:27] [V] [TRT] sections.1.2.conv2.module.weight + QuantizeLinear_327_quantize_scale_node + Conv_331 + Relu_333 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage4_warpsize4x1x1_g1_tensor16x8x32 Tactic: 2325023763229477890
[03/25/2022-13:24:27] [V] [TRT] Tactic: 2325023763229477890 Time: 0.1056
[03/25/2022-13:24:27] [V] [TRT] sections.1.2.conv2.module.weight + QuantizeLinear_327_quantize_scale_node + Conv_331 + Relu_333 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16 Tactic: 2579824863892891529
[03/25/2022-13:24:28] [V] [TRT] Tactic: 2579824863892891529 Time: 0.24512
[03/25/2022-13:24:28] [V] [TRT] sections.1.2.conv2.module.weight + QuantizeLinear_327_quantize_scale_node + Conv_331 + Relu_333 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x64x64_stage3_warpsize4x1x1_g1_tensor16x8x32 Tactic: 2783960536172159663
[03/25/2022-13:24:28] [V] [TRT] Tactic: 2783960536172159663 Time: 0.103552
[03/25/2022-13:24:28] [V] [TRT] sections.1.2.conv2.module.weight + QuantizeLinear_327_quantize_scale_node + Conv_331 + Relu_333 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x2x1_g1_tensor16x8x32 Tactic: 2821711838552913693
[03/25/2022-13:24:28] [V] [TRT] Tactic: 2821711838552913693 Time: 0.154112
[03/25/2022-13:24:28] [V] [TRT] sections.1.2.conv2.module.weight + QuantizeLinear_327_quantize_scale_node + Conv_331 + Relu_333 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x128_stage3_warpsize1x2x1_g1_sptensor16x8x64 Tactic: 2945009978756227538
[03/25/2022-13:24:28] [V] [TRT] Tactic: 2945009978756227538 Time: 0.113024
[03/25/2022-13:24:28] [V] [TRT] sections.1.2.conv2.module.weight + QuantizeLinear_327_quantize_scale_node + Conv_331 + Relu_333 Set Tactic Name: ampere_int8_i8816cudnn_int8_128x128_ldg16_relu_medium_nt_v1 Tactic: 2985940154541537814
[03/25/2022-13:24:28] [V] [TRT] Tactic: 2985940154541537814 Time: 0.177536
[03/25/2022-13:24:28] [V] [TRT] sections.1.2.conv2.module.weight + QuantizeLinear_327_quantize_scale_node + Conv_331 + Relu_333 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_t1r3s3_epifadd Tactic: 3284282970967328046
[03/25/2022-13:24:28] [V] [TRT] Tactic: 3284282970967328046 Time: 0.228736
[03/25/2022-13:24:28] [V] [TRT] sections.1.2.conv2.module.weight + QuantizeLinear_327_quantize_scale_node + Conv_331 + Relu_333 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_t1r3s3 Tactic: 3401614690060226673
[03/25/2022-13:24:28] [V] [TRT] Tactic: 3401614690060226673 Time: 0.180992
[03/25/2022-13:24:28] [V] [TRT] sections.1.2.conv2.module.weight + QuantizeLinear_327_quantize_scale_node + Conv_331 + Relu_333 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32 Tactic: 3456719996792527006
[03/25/2022-13:24:28] [V] [TRT] Tactic: 3456719996792527006 Time: 0.174976
[03/25/2022-13:24:28] [V] [TRT] sections.1.2.conv2.module.weight + QuantizeLinear_327_quantize_scale_node + Conv_331 + Relu_333 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r3s3_epifadd Tactic: 3512426920013359699
[03/25/2022-13:24:28] [V] [TRT] Tactic: 3512426920013359699 Time: 0.139392
[03/25/2022-13:24:28] [V] [TRT] sections.1.2.conv2.module.weight + QuantizeLinear_327_quantize_scale_node + Conv_331 + Relu_333 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x128_stage3_warpsize4x2x1_g1_sptensor16x8x64 Tactic: 3651043333819148268
[03/25/2022-13:24:28] [V] [TRT] Tactic: 3651043333819148268 Time: 0.171648
[03/25/2022-13:24:28] [V] [TRT] sections.1.2.conv2.module.weight + QuantizeLinear_327_quantize_scale_node + Conv_331 + Relu_333 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x128_ldg16_relu_medium_nt_v1 Tactic: 3899284354987683408
[03/25/2022-13:24:28] [V] [TRT] Tactic: 3899284354987683408 Time: 0.190208
[03/25/2022-13:24:28] [V] [TRT] sections.1.2.conv2.module.weight + QuantizeLinear_327_quantize_scale_node + Conv_331 + Relu_333 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage6_warpsize2x2x1_g1_tensor16x8x32 Tactic: 4042202769383439184
[03/25/2022-13:24:28] [V] [TRT] Tactic: 4042202769383439184 Time: 0.124416
[03/25/2022-13:24:28] [V] [TRT] sections.1.2.conv2.module.weight + QuantizeLinear_327_quantize_scale_node + Conv_331 + Relu_333 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_large_nt_v1 Tactic: 4182625619810185112
[03/25/2022-13:24:28] [V] [TRT] Tactic: 4182625619810185112 Time: 0.19008
[03/25/2022-13:24:28] [V] [TRT] sections.1.2.conv2.module.weight + QuantizeLinear_327_quantize_scale_node + Conv_331 + Relu_333 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r3s3 Tactic: 4214794893922618058
[03/25/2022-13:24:28] [V] [TRT] Tactic: 4214794893922618058 Time: 0.172928
[03/25/2022-13:24:28] [V] [TRT] sections.1.2.conv2.module.weight + QuantizeLinear_327_quantize_scale_node + Conv_331 + Relu_333 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x32x64_stage4_warpsize4x1x1_g1_tensor16x8x32_epifadd Tactic: 4259547356717612415
[03/25/2022-13:24:28] [V] [TRT] Tactic: 4259547356717612415 Time: 0.138624
[03/25/2022-13:24:28] [V] [TRT] sections.1.2.conv2.module.weight + QuantizeLinear_327_quantize_scale_node + Conv_331 + Relu_333 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r3s3 Tactic: 4384868749799132354
[03/25/2022-13:24:28] [V] [TRT] Tactic: 4384868749799132354 Time: 0.244736
[03/25/2022-13:24:28] [V] [TRT] sections.1.2.conv2.module.weight + QuantizeLinear_327_quantize_scale_node + Conv_331 + Relu_333 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x128_stage3_warpsize2x2x1_g1_sptensor16x8x64 Tactic: 4414594337986714263
[03/25/2022-13:24:28] [V] [TRT] Tactic: 4414594337986714263 Time: 0.0896
[03/25/2022-13:24:28] [V] [TRT] sections.1.2.conv2.module.weight + QuantizeLinear_327_quantize_scale_node + Conv_331 + Relu_333 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_small_nt_v1 Tactic: 4717285412741024953
[03/25/2022-13:24:28] [V] [TRT] Tactic: 4717285412741024953 Time: 0.17728
[03/25/2022-13:24:28] [V] [TRT] sections.1.2.conv2.module.weight + QuantizeLinear_327_quantize_scale_node + Conv_331 + Relu_333 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage4_warpsize2x4x1_g1_tensor16x8x32 Tactic: 4734519122557206480
[03/25/2022-13:24:28] [V] [TRT] Tactic: 4734519122557206480 Time: 0.207232
[03/25/2022-13:24:28] [V] [TRT] sections.1.2.conv2.module.weight + QuantizeLinear_327_quantize_scale_node + Conv_331 + Relu_333 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32 Tactic: 4922297020351187339
[03/25/2022-13:24:28] [V] [TRT] Tactic: 4922297020351187339 Time: 0.153856
[03/25/2022-13:24:28] [V] [TRT] sections.1.2.conv2.module.weight + QuantizeLinear_327_quantize_scale_node + Conv_331 + Relu_333 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage1_warpsize2x4x1_g1_tensor8x8x16_t1r3s3 Tactic: 4931167631624420067
[03/25/2022-13:24:28] [V] [TRT] Tactic: 4931167631624420067 Time: 0.34816
[03/25/2022-13:24:28] [V] [TRT] sections.1.2.conv2.module.weight + QuantizeLinear_327_quantize_scale_node + Conv_331 + Relu_333 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage4_warpsize2x4x1_g1_tensor16x8x32_epifadd Tactic: 5121596860264626879
[03/25/2022-13:24:28] [V] [TRT] Tactic: 5121596860264626879 Time: 0.193664
[03/25/2022-13:24:28] [V] [TRT] sections.1.2.conv2.module.weight + QuantizeLinear_327_quantize_scale_node + Conv_331 + Relu_333 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_t1r3s3 Tactic: 5136656982162849059
[03/25/2022-13:24:28] [V] [TRT] Tactic: 5136656982162849059 Time: 0.229632
[03/25/2022-13:24:28] [V] [TRT] sections.1.2.conv2.module.weight + QuantizeLinear_327_quantize_scale_node + Conv_331 + Relu_333 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage6_warpsize2x2x1_g1_tensor16x8x32_epifadd Tactic: 5158259316594207439
[03/25/2022-13:24:28] [V] [TRT] Tactic: 5158259316594207439 Time: 0.123008
[03/25/2022-13:24:28] [V] [TRT] sections.1.2.conv2.module.weight + QuantizeLinear_327_quantize_scale_node + Conv_331 + Relu_333 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32_t1r3s3 Tactic: 5189825015507701541
[03/25/2022-13:24:28] [V] [TRT] Tactic: 5189825015507701541 Time: 0.355456
[03/25/2022-13:24:28] [V] [TRT] sections.1.2.conv2.module.weight + QuantizeLinear_327_quantize_scale_node + Conv_331 + Relu_333 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x32x64_stage4_warpsize2x1x1_g1_tensor16x8x32 Tactic: 5424417905073460656
[03/25/2022-13:24:28] [V] [TRT] Tactic: 5424417905073460656 Time: 0.15552
[03/25/2022-13:24:28] [V] [TRT] sections.1.2.conv2.module.weight + QuantizeLinear_327_quantize_scale_node + Conv_331 + Relu_333 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x256x64_stage3_warpsize2x4x1_g1_tensor16x8x32 Tactic: 5442043907221427810
[03/25/2022-13:24:28] [V] [TRT] Tactic: 5442043907221427810 Time: 0.201856
[03/25/2022-13:24:28] [V] [TRT] sections.1.2.conv2.module.weight + QuantizeLinear_327_quantize_scale_node + Conv_331 + Relu_333 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x256x64_stage3_warpsize2x4x1_g1_tensor16x8x32_t1r3s3 Tactic: 5544365258913999384
[03/25/2022-13:24:28] [V] [TRT] Tactic: 5544365258913999384 Time: 0.196096
[03/25/2022-13:24:28] [V] [TRT] sections.1.2.conv2.module.weight + QuantizeLinear_327_quantize_scale_node + Conv_331 + Relu_333 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage1_warpsize2x1x1_g1_tensor8x8x16_t1r3s3 Tactic: 5641967928706599451
[03/25/2022-13:24:28] [V] [TRT] Tactic: 5641967928706599451 Time: 0.28288
[03/25/2022-13:24:28] [V] [TRT] sections.1.2.conv2.module.weight + QuantizeLinear_327_quantize_scale_node + Conv_331 + Relu_333 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x2x1_g1_tensor16x8x32_t1r3s3 Tactic: 5721595115357140131
[03/25/2022-13:24:28] [V] [TRT] Tactic: 5721595115357140131 Time: 0.151808
[03/25/2022-13:24:28] [V] [TRT] sections.1.2.conv2.module.weight + QuantizeLinear_327_quantize_scale_node + Conv_331 + Relu_333 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage4_warpsize4x1x1_g1_tensor16x8x32_t1r3s3_epifadd Tactic: 5966973378912044513
[03/25/2022-13:24:28] [V] [TRT] Tactic: 5966973378912044513 Time: 0.1024
[03/25/2022-13:24:28] [V] [TRT] sections.1.2.conv2.module.weight + QuantizeLinear_327_quantize_scale_node + Conv_331 + Relu_333 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x32x64_stage4_warpsize4x1x1_g1_tensor16x8x32_t1r3s3 Tactic: 6004789655466615912
[03/25/2022-13:24:28] [V] [TRT] Tactic: 6004789655466615912 Time: 0.133632
[03/25/2022-13:24:28] [V] [TRT] sections.1.2.conv2.module.weight + QuantizeLinear_327_quantize_scale_node + Conv_331 + Relu_333 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage4_warpsize2x4x1_g1_tensor16x8x32_t1r3s3 Tactic: 6146901278630392829
[03/25/2022-13:24:28] [V] [TRT] Tactic: 6146901278630392829 Time: 0.204672
[03/25/2022-13:24:28] [V] [TRT] sections.1.2.conv2.module.weight + QuantizeLinear_327_quantize_scale_node + Conv_331 + Relu_333 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage1_warpsize2x1x1_g1_tensor8x8x16 Tactic: 6394572396369862482
[03/25/2022-13:24:28] [V] [TRT] Tactic: 6394572396369862482 Time: 0.285312
[03/25/2022-13:24:28] [V] [TRT] sections.1.2.conv2.module.weight + QuantizeLinear_327_quantize_scale_node + Conv_331 + Relu_333 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage4_warpsize4x2x1_g1_tensor16x8x32_epifadd Tactic: 6434020722187266170
[03/25/2022-13:24:28] [V] [TRT] Tactic: 6434020722187266170 Time: 0.105472
[03/25/2022-13:24:28] [V] [TRT] sections.1.2.conv2.module.weight + QuantizeLinear_327_quantize_scale_node + Conv_331 + Relu_333 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32 Tactic: 6781129591847482048
[03/25/2022-13:24:28] [V] [TRT] Tactic: 6781129591847482048 Time: 0.119296
[03/25/2022-13:24:28] [V] [TRT] sections.1.2.conv2.module.weight + QuantizeLinear_327_quantize_scale_node + Conv_331 + Relu_333 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x96x64_stage4_warpsize4x2x1_g1_tensor16x8x32_t1r3s3 Tactic: 6984451771200230840
[03/25/2022-13:24:28] [V] [TRT] Tactic: 6984451771200230840 Time: 0.197376
[03/25/2022-13:24:28] [V] [TRT] sections.1.2.conv2.module.weight + QuantizeLinear_327_quantize_scale_node + Conv_331 + Relu_333 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r3s3 Tactic: 7048234086361926570
[03/25/2022-13:24:28] [V] [TRT] Tactic: 7048234086361926570 Time: 0.18304
[03/25/2022-13:24:28] [V] [TRT] sections.1.2.conv2.module.weight + QuantizeLinear_327_quantize_scale_node + Conv_331 + Relu_333 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x128x64_stage3_warpsize4x2x1_g1_tensor16x8x32 Tactic: 7077570591813340966
[03/25/2022-13:24:28] [V] [TRT] Tactic: 7077570591813340966 Time: 0.104576
[03/25/2022-13:24:28] [V] [TRT] sections.1.2.conv2.module.weight + QuantizeLinear_327_quantize_scale_node + Conv_331 + Relu_333 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r3s3_epifadd Tactic: 7191893591576074000
[03/25/2022-13:24:28] [V] [TRT] Tactic: 7191893591576074000 Time: 0.203392
[03/25/2022-13:24:28] [V] [TRT] sections.1.2.conv2.module.weight + QuantizeLinear_327_quantize_scale_node + Conv_331 + Relu_333 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x4x1_g1_tensor16x8x32_t1r3s3 Tactic: 7429976449747682901
[03/25/2022-13:24:28] [V] [TRT] Tactic: 7429976449747682901 Time: 0.142848
[03/25/2022-13:24:28] [V] [TRT] sections.1.2.conv2.module.weight + QuantizeLinear_327_quantize_scale_node + Conv_331 + Relu_333 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r3s3_epifadd Tactic: 7438984192263206338
[03/25/2022-13:24:28] [V] [TRT] Tactic: 7438984192263206338 Time: 0.116352
[03/25/2022-13:24:28] [V] [TRT] sections.1.2.conv2.module.weight + QuantizeLinear_327_quantize_scale_node + Conv_331 + Relu_333 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32_epifadd Tactic: 7504901284678552178
[03/25/2022-13:24:28] [V] [TRT] Tactic: 7504901284678552178 Time: 0.09792
[03/25/2022-13:24:28] [V] [TRT] sections.1.2.conv2.module.weight + QuantizeLinear_327_quantize_scale_node + Conv_331 + Relu_333 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r3s3 Tactic: 8096257414008860171
[03/25/2022-13:24:28] [V] [TRT] Tactic: 8096257414008860171 Time: 0.112
[03/25/2022-13:24:28] [V] [TRT] sections.1.2.conv2.module.weight + QuantizeLinear_327_quantize_scale_node + Conv_331 + Relu_333 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x64x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r3s3 Tactic: 8128112048355596715
[03/25/2022-13:24:28] [V] [TRT] Tactic: 8128112048355596715 Time: 0.109696
[03/25/2022-13:24:28] [V] [TRT] sections.1.2.conv2.module.weight + QuantizeLinear_327_quantize_scale_node + Conv_331 + Relu_333 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_indexed_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32_epifadd Tactic: 8751622450593766232
[03/25/2022-13:24:28] [V] [TRT] Tactic: 8751622450593766232 Time: 0.105088
[03/25/2022-13:24:28] [V] [TRT] sections.1.2.conv2.module.weight + QuantizeLinear_327_quantize_scale_node + Conv_331 + Relu_333 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_indexed_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32 Tactic: 9064458886956700976
[03/25/2022-13:24:28] [V] [TRT] Tactic: 9064458886956700976 Time: 0.106496
[03/25/2022-13:24:28] [V] [TRT] sections.1.2.conv2.module.weight + QuantizeLinear_327_quantize_scale_node + Conv_331 + Relu_333 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_t1r3s3_epifadd Tactic: 9143438935315839085
[03/25/2022-13:24:28] [V] [TRT] Tactic: 9143438935315839085 Time: 0.174336
[03/25/2022-13:24:28] [V] [TRT] sections.1.2.conv2.module.weight + QuantizeLinear_327_quantize_scale_node + Conv_331 + Relu_333 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage4_warpsize4x2x1_g1_tensor16x8x32 Tactic: -9165697322068360861
[03/25/2022-13:24:28] [V] [TRT] Tactic: -9165697322068360861 Time: 0.1088
[03/25/2022-13:24:28] [V] [TRT] sections.1.2.conv2.module.weight + QuantizeLinear_327_quantize_scale_node + Conv_331 + Relu_333 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_singleBuffer_small_nt_v1 Tactic: -9118785798277698619
[03/25/2022-13:24:28] [V] [TRT] Tactic: -9118785798277698619 Time: 0.166912
[03/25/2022-13:24:28] [V] [TRT] sections.1.2.conv2.module.weight + QuantizeLinear_327_quantize_scale_node + Conv_331 + Relu_333 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16 Tactic: -9108166971364503411
[03/25/2022-13:24:28] [V] [TRT] Tactic: -9108166971364503411 Time: 0.185472
[03/25/2022-13:24:28] [V] [TRT] sections.1.2.conv2.module.weight + QuantizeLinear_327_quantize_scale_node + Conv_331 + Relu_333 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage1_warpsize4x2x1_g1_tensor8x8x16 Tactic: -8861822316054763526
[03/25/2022-13:24:28] [V] [TRT] Tactic: -8861822316054763526 Time: 0.179456
[03/25/2022-13:24:28] [V] [TRT] sections.1.2.conv2.module.weight + QuantizeLinear_327_quantize_scale_node + Conv_331 + Relu_333 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x128_stage3_warpsize1x1x1_g1_sptensor16x8x64 Tactic: -8791277710877987710
[03/25/2022-13:24:28] [V] [TRT] Tactic: -8791277710877987710 Time: 0.167936
[03/25/2022-13:24:28] [V] [TRT] sections.1.2.conv2.module.weight + QuantizeLinear_327_quantize_scale_node + Conv_331 + Relu_333 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x128x64_stage3_warpsize2x2x1_g1_tensor16x8x32 Tactic: -8691377209893505057
[03/25/2022-13:24:28] [V] [TRT] Tactic: -8691377209893505057 Time: 0.102144
[03/25/2022-13:24:28] [V] [TRT] sections.1.2.conv2.module.weight + QuantizeLinear_327_quantize_scale_node + Conv_331 + Relu_333 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32 Tactic: -8520292213102999339
[03/25/2022-13:24:28] [V] [TRT] Tactic: -8520292213102999339 Time: 0.157056
[03/25/2022-13:24:28] [V] [TRT] sections.1.2.conv2.module.weight + QuantizeLinear_327_quantize_scale_node + Conv_331 + Relu_333 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage1_warpsize4x1x1_g1_tensor8x8x16_t1r3s3 Tactic: -8475551154769412306
[03/25/2022-13:24:28] [V] [TRT] Tactic: -8475551154769412306 Time: 0.172544
[03/25/2022-13:24:28] [V] [TRT] sections.1.2.conv2.module.weight + QuantizeLinear_327_quantize_scale_node + Conv_331 + Relu_333 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x192x64_stage3_warpsize4x2x1_g1_tensor16x8x32_t1r3s3 Tactic: -8417388128970254446
[03/25/2022-13:24:28] [V] [TRT] Tactic: -8417388128970254446 Time: 0.16064
[03/25/2022-13:24:28] [V] [TRT] sections.1.2.conv2.module.weight + QuantizeLinear_327_quantize_scale_node + 
Conv_331 + Relu_333 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32 Tactic: -8263994888336646547 [03/25/2022-13:24:28] [V] [TRT] Tactic: -8263994888336646547 Time: 0.098432 [03/25/2022-13:24:28] [V] [TRT] sections.1.2.conv2.module.weight + QuantizeLinear_327_quantize_scale_node + Conv_331 + Relu_333 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32 Tactic: -8205948405243401049 [03/25/2022-13:24:28] [V] [TRT] Tactic: -8205948405243401049 Time: 0.20544 [03/25/2022-13:24:28] [V] [TRT] sections.1.2.conv2.module.weight + QuantizeLinear_327_quantize_scale_node + Conv_331 + Relu_333 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r3s3_epifadd Tactic: -7992068592656168418 [03/25/2022-13:24:28] [V] [TRT] Tactic: -7992068592656168418 Time: 0.111616 [03/25/2022-13:24:28] [V] [TRT] sections.1.2.conv2.module.weight + QuantizeLinear_327_quantize_scale_node + Conv_331 + Relu_333 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x32x64_stage4_warpsize2x1x1_g1_tensor16x8x32_t1r3s3 Tactic: -7898477046581738867 [03/25/2022-13:24:28] [V] [TRT] Tactic: -7898477046581738867 Time: 0.139136 [03/25/2022-13:24:28] [V] [TRT] sections.1.2.conv2.module.weight + QuantizeLinear_327_quantize_scale_node + Conv_331 + Relu_333 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage4_warpsize4x1x1_g1_tensor16x8x32_epifadd Tactic: -7842775553137511386 [03/25/2022-13:24:28] [V] [TRT] Tactic: -7842775553137511386 Time: 0.102528 [03/25/2022-13:24:28] [V] [TRT] sections.1.2.conv2.module.weight + QuantizeLinear_327_quantize_scale_node + Conv_331 + Relu_333 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32 Tactic: -7683887278997527517 [03/25/2022-13:24:28] [V] [TRT] Tactic: -7683887278997527517 Time: 0.172672 [03/25/2022-13:24:28] [V] [TRT] sections.1.2.conv2.module.weight + QuantizeLinear_327_quantize_scale_node + Conv_331 + Relu_333 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x64x64_stage3_warpsize2x2x1_g1_tensor16x8x32 Tactic: -7381370635708568663 [03/25/2022-13:24:28] [V] [TRT] Tactic: -7381370635708568663 Time: 0.113536 [03/25/2022-13:24:28] [V] [TRT] sections.1.2.conv2.module.weight + QuantizeLinear_327_quantize_scale_node + Conv_331 + Relu_333 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r3s3 Tactic: -7129320389887881029 [03/25/2022-13:24:28] [V] [TRT] Tactic: -7129320389887881029 Time: 0.166528 [03/25/2022-13:24:28] [V] [TRT] sections.1.2.conv2.module.weight + QuantizeLinear_327_quantize_scale_node + Conv_331 + Relu_333 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r3s3 Tactic: -6959995514028471820 [03/25/2022-13:24:28] [V] [TRT] 
Tactic: -6959995514028471820 Time: 0.15808 [03/25/2022-13:24:28] [V] [TRT] sections.1.2.conv2.module.weight + QuantizeLinear_327_quantize_scale_node + Conv_331 + Relu_333 Set Tactic Name: ampere_int8_i8816cudnn_int8_128x128_ldg16_relu_small_nt_v1 Tactic: -6400348606759295499 [03/25/2022-13:24:28] [V] [TRT] Tactic: -6400348606759295499 Time: 0.16896 [03/25/2022-13:24:28] [V] [TRT] sections.1.2.conv2.module.weight + QuantizeLinear_327_quantize_scale_node + Conv_331 + Relu_333 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r3s3 Tactic: -6371781333659293809 [03/25/2022-13:24:28] [V] [TRT] Tactic: -6371781333659293809 Time: 0.17728 [03/25/2022-13:24:28] [V] [TRT] sections.1.2.conv2.module.weight + QuantizeLinear_327_quantize_scale_node + Conv_331 + Relu_333 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x96x64_stage4_warpsize4x2x1_g1_tensor16x8x32 Tactic: -6256128573036943404 [03/25/2022-13:24:28] [V] [TRT] Tactic: -6256128573036943404 Time: 0.19968 [03/25/2022-13:24:28] [V] [TRT] sections.1.2.conv2.module.weight + QuantizeLinear_327_quantize_scale_node + Conv_331 + Relu_333 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x128_ldg16_relu_small_nt_v1 Tactic: -5980889159865208399 [03/25/2022-13:24:28] [V] [TRT] Tactic: -5980889159865208399 Time: 0.182272 [03/25/2022-13:24:28] [V] [TRT] sections.1.2.conv2.module.weight + QuantizeLinear_327_quantize_scale_node + Conv_331 + Relu_333 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_singleBuffer_medium_nt_v1 Tactic: -5766140806760372989 [03/25/2022-13:24:28] [V] [TRT] Tactic: -5766140806760372989 Time: 0.176384 [03/25/2022-13:24:28] [V] [TRT] sections.1.2.conv2.module.weight + QuantizeLinear_327_quantize_scale_node + Conv_331 + Relu_333 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32_t1r3s3_epifadd Tactic: -5709079507616090666 [03/25/2022-13:24:28] [V] [TRT] Tactic: -5709079507616090666 Time: 0.097152 [03/25/2022-13:24:28] [V] [TRT] sections.1.2.conv2.module.weight + QuantizeLinear_327_quantize_scale_node + Conv_331 + Relu_333 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage4_warpsize2x4x1_g1_tensor16x8x32_t1r3s3_epifadd Tactic: -5698636014239116282 [03/25/2022-13:24:28] [V] [TRT] Tactic: -5698636014239116282 Time: 0.18432 [03/25/2022-13:24:28] [V] [TRT] sections.1.2.conv2.module.weight + QuantizeLinear_327_quantize_scale_node + Conv_331 + Relu_333 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage1_warpsize4x1x1_g1_tensor8x8x16 Tactic: -5180570335464125033 [03/25/2022-13:24:28] [V] [TRT] Tactic: -5180570335464125033 Time: 0.17984 [03/25/2022-13:24:28] [V] [TRT] sections.1.2.conv2.module.weight + QuantizeLinear_327_quantize_scale_node + Conv_331 + Relu_333 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r3s3 Tactic: -4933563390723451692 [03/25/2022-13:24:28] [V] [TRT] Tactic: -4933563390723451692 Time: 0.136576 [03/25/2022-13:24:28] [V] [TRT] sections.1.2.conv2.module.weight + 
QuantizeLinear_327_quantize_scale_node + Conv_331 + Relu_333 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_medium_nt_v1 Tactic: -4516822589357530549 [03/25/2022-13:24:28] [V] [TRT] Tactic: -4516822589357530549 Time: 0.178048 [03/25/2022-13:24:28] [V] [TRT] sections.1.2.conv2.module.weight + QuantizeLinear_327_quantize_scale_node + Conv_331 + Relu_333 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x16x128_stage3_warpsize1x1x1_g1_sptensor16x8x64 Tactic: -4232916483289779353 [03/25/2022-13:24:28] [V] [TRT] Tactic: -4232916483289779353 Time: 0.287616 [03/25/2022-13:24:28] [V] [TRT] sections.1.2.conv2.module.weight + QuantizeLinear_327_quantize_scale_node + Conv_331 + Relu_333 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x128_stage3_warpsize1x4x1_g1_sptensor16x8x64 Tactic: -3460842194336717186 [03/25/2022-13:24:28] [V] [TRT] Tactic: -3460842194336717186 Time: 0.104576 [03/25/2022-13:24:28] [V] [TRT] sections.1.2.conv2.module.weight + QuantizeLinear_327_quantize_scale_node + Conv_331 + Relu_333 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage4_warpsize4x2x1_g1_tensor16x8x32_t1r3s3_epifadd Tactic: -3413217501222406256 [03/25/2022-13:24:28] [V] [TRT] Tactic: -3413217501222406256 Time: 0.099584 [03/25/2022-13:24:28] [V] [TRT] sections.1.2.conv2.module.weight + QuantizeLinear_327_quantize_scale_node + Conv_331 + Relu_333 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32_t1r3s3 Tactic: -3280888557222886418 [03/25/2022-13:24:28] [V] [TRT] Tactic: -3280888557222886418 Time: 0.14592 [03/25/2022-13:24:28] [V] [TRT] sections.1.2.conv2.module.weight + QuantizeLinear_327_quantize_scale_node + Conv_331 + Relu_333 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r3s3 Tactic: -3238475748440751107 [03/25/2022-13:24:28] [V] [TRT] Tactic: -3238475748440751107 Time: 0.116608 [03/25/2022-13:24:28] [V] [TRT] sections.1.2.conv2.module.weight + QuantizeLinear_327_quantize_scale_node + Conv_331 + Relu_333 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage4_warpsize4x1x1_g1_tensor16x8x32_t1r3s3 Tactic: -3182884991006484042 [03/25/2022-13:24:28] [V] [TRT] Tactic: -3182884991006484042 Time: 0.100224 [03/25/2022-13:24:28] [V] [TRT] sections.1.2.conv2.module.weight + QuantizeLinear_327_quantize_scale_node + Conv_331 + Relu_333 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r3s3 Tactic: -3173468756112541306 [03/25/2022-13:24:28] [V] [TRT] Tactic: -3173468756112541306 Time: 0.20224 [03/25/2022-13:24:28] [V] [TRT] sections.1.2.conv2.module.weight + QuantizeLinear_327_quantize_scale_node + Conv_331 + Relu_333 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x128_ldg16_relu_large_nt_v1 Tactic: -2917455979290586480 [03/25/2022-13:24:28] [V] [TRT] Tactic: -2917455979290586480 Time: 0.183296 [03/25/2022-13:24:28] [V] [TRT] sections.1.2.conv2.module.weight + QuantizeLinear_327_quantize_scale_node + 
Conv_331 + Relu_333 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x128x64_stage3_warpsize4x2x1_g1_tensor16x8x32_t1r3s3 Tactic: -2741641298163591508 [03/25/2022-13:24:28] [V] [TRT] Tactic: -2741641298163591508 Time: 0.103296 [03/25/2022-13:24:28] [V] [TRT] sections.1.2.conv2.module.weight + QuantizeLinear_327_quantize_scale_node + Conv_331 + Relu_333 Set Tactic Name: ampere_int8_i8816cudnn_int8_128x128_ldg16_relu_large_nt_v1 Tactic: -2571022005763160364 [03/25/2022-13:24:28] [V] [TRT] Tactic: -2571022005763160364 Time: 0.177408 [03/25/2022-13:24:28] [V] [TRT] sections.1.2.conv2.module.weight + QuantizeLinear_327_quantize_scale_node + Conv_331 + Relu_333 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage1_warpsize2x2x1_g1_tensor8x8x16 Tactic: -2499089240293650188 [03/25/2022-13:24:28] [V] [TRT] Tactic: -2499089240293650188 Time: 0.166784 [03/25/2022-13:24:28] [V] [TRT] sections.1.2.conv2.module.weight + QuantizeLinear_327_quantize_scale_node + Conv_331 + Relu_333 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16 Tactic: -2328318099174473157 [03/25/2022-13:24:28] [V] [TRT] Tactic: -2328318099174473157 Time: 0.180864 [03/25/2022-13:24:28] [V] [TRT] sections.1.2.conv2.module.weight + QuantizeLinear_327_quantize_scale_node + Conv_331 + Relu_333 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_epifadd Tactic: -2083778562631872334 [03/25/2022-13:24:28] [V] [TRT] Tactic: -2083778562631872334 Time: 0.115968 [03/25/2022-13:24:28] [V] [TRT] sections.1.2.conv2.module.weight + QuantizeLinear_327_quantize_scale_node + Conv_331 + Relu_333 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x4x1_g1_tensor16x8x32 Tactic: -2054375205435666404 [03/25/2022-13:24:28] [V] [TRT] Tactic: -2054375205435666404 Time: 0.145152 [03/25/2022-13:24:28] [V] [TRT] sections.1.2.conv2.module.weight + QuantizeLinear_327_quantize_scale_node + Conv_331 + Relu_333 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32_t1r3s3 Tactic: -1546787387293556842 [03/25/2022-13:24:28] [V] [TRT] Tactic: -1546787387293556842 Time: 0.096512 [03/25/2022-13:24:28] [V] [TRT] sections.1.2.conv2.module.weight + QuantizeLinear_327_quantize_scale_node + Conv_331 + Relu_333 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x32x64_stage4_warpsize4x1x1_g1_tensor16x8x32 Tactic: -1498626619443284096 [03/25/2022-13:24:28] [V] [TRT] Tactic: -1498626619443284096 Time: 0.135168 [03/25/2022-13:24:28] [V] [TRT] sections.1.2.conv2.module.weight + QuantizeLinear_327_quantize_scale_node + Conv_331 + Relu_333 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r3s3 Tactic: -1471245223605064669 [03/25/2022-13:24:28] [V] [TRT] Tactic: -1471245223605064669 Time: 0.151808 [03/25/2022-13:24:28] [V] [TRT] sections.1.2.conv2.module.weight + 
QuantizeLinear_327_quantize_scale_node + Conv_331 + Relu_333 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32 Tactic: -1283580231568512025 [03/25/2022-13:24:28] [V] [TRT] Tactic: -1283580231568512025 Time: 0.2272 [03/25/2022-13:24:28] [V] [TRT] sections.1.2.conv2.module.weight + QuantizeLinear_327_quantize_scale_node + Conv_331 + Relu_333 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x64x64_stage3_warpsize4x1x1_g1_tensor16x8x32_t1r3s3 Tactic: -1224421172675151280 [03/25/2022-13:24:28] [V] [TRT] Tactic: -1224421172675151280 Time: 0.097152 [03/25/2022-13:24:28] [V] [TRT] sections.1.2.conv2.module.weight + QuantizeLinear_327_quantize_scale_node + Conv_331 + Relu_333 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_epifadd Tactic: -1173968681844185579 [03/25/2022-13:24:28] [V] [TRT] Tactic: -1173968681844185579 Time: 0.229888 [03/25/2022-13:24:28] [V] [TRT] sections.1.2.conv2.module.weight + QuantizeLinear_327_quantize_scale_node + Conv_331 + Relu_333 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x128x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r3s3 Tactic: -921247911551089037 [03/25/2022-13:24:28] [V] [TRT] Tactic: -921247911551089037 Time: 0.10048 [03/25/2022-13:24:28] [V] [TRT] sections.1.2.conv2.module.weight + QuantizeLinear_327_quantize_scale_node + Conv_331 + Relu_333 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32 Tactic: -762222380308749469 [03/25/2022-13:24:28] [V] [TRT] Tactic: -762222380308749469 Time: 0.139904 [03/25/2022-13:24:28] [V] [TRT] sections.1.2.conv2.module.weight + QuantizeLinear_327_quantize_scale_node + Conv_331 + Relu_333 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_epifadd Tactic: -556794153877490941 [03/25/2022-13:24:28] [V] [TRT] Tactic: -556794153877490941 Time: 0.142464 [03/25/2022-13:24:28] [V] [TRT] sections.1.2.conv2.module.weight + QuantizeLinear_327_quantize_scale_node + Conv_331 + Relu_333 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage4_warpsize4x2x1_g1_tensor16x8x32_t1r3s3 Tactic: -516725800067794372 [03/25/2022-13:24:28] [V] [TRT] Tactic: -516725800067794372 Time: 0.106496 [03/25/2022-13:24:28] [V] [TRT] sections.1.2.conv2.module.weight + QuantizeLinear_327_quantize_scale_node + Conv_331 + Relu_333 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_singleBuffer_large_nt_v1 Tactic: -428104331444385564 [03/25/2022-13:24:28] [V] [TRT] Tactic: -428104331444385564 Time: 0.178432 [03/25/2022-13:24:28] [V] [TRT] sections.1.2.conv2.module.weight + QuantizeLinear_327_quantize_scale_node + Conv_331 + Relu_333 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage1_warpsize2x1x1_g1_tensor8x8x16 Tactic: -366411318217594794 [03/25/2022-13:24:28] [V] [TRT] Tactic: -366411318217594794 Time: 0.212224 [03/25/2022-13:24:28] [V] 
[TRT] sections.1.2.conv2.module.weight + QuantizeLinear_327_quantize_scale_node + Conv_331 + Relu_333 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage1_warpsize2x4x1_g1_tensor8x8x16 Tactic: -351548418071036983 [03/25/2022-13:24:28] [V] [TRT] Tactic: -351548418071036983 Time: 0.34304 [03/25/2022-13:24:28] [V] [TRT] Fastest Tactic: 4414594337986714263 Time: 0.0896 [03/25/2022-13:24:28] [V] [TRT] >>>>>>>>>>>>>>> Chose Runner Type: CaskConvolution Tactic: 4414594337986714263 [03/25/2022-13:24:28] [V] [TRT] =============== Computing costs for [03/25/2022-13:24:28] [V] [TRT] *************** Autotuning format combination: Int8(25088,784:4,28,1), Int8(100352,784:4,28,1) -> Int8(100352,784:4,28,1) *************** [03/25/2022-13:24:28] [V] [TRT] *************** Autotuning format combination: Int8(25088,784:4,28,1), Int8(12544,784:32,28,1) -> Int8(12544,784:32,28,1) *************** [03/25/2022-13:24:28] [V] [TRT] *************** Autotuning format combination: Int8(3136,784:32,28,1), Int8(12544,784:32,28,1) -> Int8(12544,784:32,28,1) *************** [03/25/2022-13:24:28] [V] [TRT] =============== Computing costs for [03/25/2022-13:24:28] [V] [TRT] *************** Autotuning format combination: Int8(100352,784:4,28,1) -> Int8(25088,784:4,28,1) *************** [03/25/2022-13:24:28] [V] [TRT] *************** Autotuning format combination: Int8(100352,784:4,28,1) -> Int8(3136,784:32,28,1) *************** [03/25/2022-13:24:28] [V] [TRT] *************** Autotuning format combination: Int8(12544,784:32,28,1) -> Int8(3136,784:32,28,1) *************** [03/25/2022-13:24:28] [V] [TRT] Skip timing cache hit (epiFadd mismatch) for node: sections.1.3.conv1.module.weight + QuantizeLinear_364_quantize_scale_node + Conv_368 + Relu_370, LayerImpl: CaskConvolution, tactic: 1913026264725750683 [03/25/2022-13:24:28] [V] [TRT] --------------- Timing Runner: sections.1.3.conv1.module.weight + QuantizeLinear_364_quantize_scale_node + Conv_368 + Relu_370 (CudaGroupConvolution) [03/25/2022-13:24:28] [V] [TRT] CudaGroupConvolution has no valid tactics for this config, skipping [03/25/2022-13:24:28] [V] [TRT] --------------- Timing Runner: sections.1.3.conv1.module.weight + QuantizeLinear_364_quantize_scale_node + Conv_368 + Relu_370 (CudaDepthwiseConvolution) [03/25/2022-13:24:28] [V] [TRT] CudaDepthwiseConvolution has no valid tactics for this config, skipping [03/25/2022-13:24:28] [V] [TRT] --------------- Timing Runner: sections.1.3.conv1.module.weight + QuantizeLinear_364_quantize_scale_node + Conv_368 + Relu_370 (FusedConvActConvolution) [03/25/2022-13:24:28] [V] [TRT] FusedConvActConvolution has no valid tactics for this config, skipping [03/25/2022-13:24:28] [V] [TRT] --------------- Timing Runner: sections.1.3.conv1.module.weight + QuantizeLinear_364_quantize_scale_node + Conv_368 + Relu_370 (CaskConvolution) [03/25/2022-13:24:28] [V] [TRT] sections.1.3.conv1.module.weight + QuantizeLinear_364_quantize_scale_node + Conv_368 + Relu_370 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage1_warpsize4x1x1_g1_tensor8x8x16_t1r1s1 Tactic: 68468667201176803 [03/25/2022-13:24:28] [V] [TRT] Tactic: 68468667201176803 Time: 0.0992 [03/25/2022-13:24:28] [V] [TRT] sections.1.3.conv1.module.weight + QuantizeLinear_364_quantize_scale_node + Conv_368 + Relu_370 Set Tactic Name: 
sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x192x64_stage3_warpsize4x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 125145153013230687 [03/25/2022-13:24:28] [V] [TRT] Tactic: 125145153013230687 Time: 0.105984 [03/25/2022-13:24:28] [V] [TRT] sections.1.3.conv1.module.weight + QuantizeLinear_364_quantize_scale_node + Conv_368 + Relu_370 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x128_ldg16_relu_interior_nt_v1 Tactic: 434957160407688216 [03/25/2022-13:24:28] [V] [TRT] Tactic: 434957160407688216 Time: 0.1088 [03/25/2022-13:24:28] [V] [TRT] sections.1.3.conv1.module.weight + QuantizeLinear_364_quantize_scale_node + Conv_368 + Relu_370 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage4_warpsize4x2x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: 805889586762897346 [03/25/2022-13:24:28] [V] [TRT] Tactic: 805889586762897346 Time: 0.074496 [03/25/2022-13:24:28] [V] [TRT] sections.1.3.conv1.module.weight + QuantizeLinear_364_quantize_scale_node + Conv_368 + Relu_370 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x128_stage3_warpsize4x2x1_g1_sptensor16x8x64_t1r1s1 Tactic: 857001784974286465 [03/25/2022-13:24:28] [V] [TRT] Tactic: 857001784974286465 Time: 0.103552 [03/25/2022-13:24:28] [V] [TRT] sections.1.3.conv1.module.weight + QuantizeLinear_364_quantize_scale_node + Conv_368 + Relu_370 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 1214130898909872671 [03/25/2022-13:24:28] [V] [TRT] Tactic: 1214130898909872671 Time: 0.141952 [03/25/2022-13:24:28] [V] [TRT] sections.1.3.conv1.module.weight + QuantizeLinear_364_quantize_scale_node + Conv_368 + Relu_370 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage4_warpsize4x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 1278425129871930205 [03/25/2022-13:24:28] [V] [TRT] Tactic: 1278425129871930205 Time: 0.072192 [03/25/2022-13:24:28] [V] [TRT] sections.1.3.conv1.module.weight + QuantizeLinear_364_quantize_scale_node + Conv_368 + Relu_370 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 1583811548148740665 [03/25/2022-13:24:28] [V] [TRT] Tactic: 1583811548148740665 Time: 0.10432 [03/25/2022-13:24:28] [V] [TRT] sections.1.3.conv1.module.weight + QuantizeLinear_364_quantize_scale_node + Conv_368 + Relu_370 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 1701344857577810806 [03/25/2022-13:24:28] [V] [TRT] Tactic: 1701344857577810806 Time: 0.103552 [03/25/2022-13:24:28] [V] [TRT] sections.1.3.conv1.module.weight + QuantizeLinear_364_quantize_scale_node + Conv_368 + Relu_370 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x32x64_stage4_warpsize2x1x1_g1_tensor16x8x32_t1r1s1 Tactic: 1797231177354918208 [03/25/2022-13:24:28] [V] [TRT] Tactic: 1797231177354918208 Time: 0.108672 [03/25/2022-13:24:28] [V] [TRT] sections.1.3.conv1.module.weight 
+ QuantizeLinear_364_quantize_scale_node + Conv_368 + Relu_370 Set Tactic Name: sm80_xmma_conv_fprop_smallk_i8i8_i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_warptilesize16x32_stage4_warpsize2x4x1_r1s1_u1v1_hw0_c512_scalebias_relu Tactic: 1913026264725750683 [03/25/2022-13:24:28] [V] [TRT] Tactic: 1913026264725750683 Time: 0.06208 [03/25/2022-13:24:28] [V] [TRT] sections.1.3.conv1.module.weight + QuantizeLinear_364_quantize_scale_node + Conv_368 + Relu_370 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x192x64_stage3_warpsize4x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 2004812516525036381 [03/25/2022-13:24:28] [V] [TRT] Tactic: 2004812516525036381 Time: 0.088448 [03/25/2022-13:24:28] [V] [TRT] sections.1.3.conv1.module.weight + QuantizeLinear_364_quantize_scale_node + Conv_368 + Relu_370 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x128x64_stage3_warpsize4x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 2030033463723799063 [03/25/2022-13:24:28] [V] [TRT] Tactic: 2030033463723799063 Time: 0.071808 [03/25/2022-13:24:28] [V] [TRT] sections.1.3.conv1.module.weight + QuantizeLinear_364_quantize_scale_node + Conv_368 + Relu_370 Set Tactic Name: ampere_int8_i8816cudnn_int8_128x128_ldg16_relu_interior_nt_v1 Tactic: 2346437292116182513 [03/25/2022-13:24:28] [V] [TRT] Tactic: 2346437292116182513 Time: 0.09984 [03/25/2022-13:24:28] [V] [TRT] sections.1.3.conv1.module.weight + QuantizeLinear_364_quantize_scale_node + Conv_368 + Relu_370 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 2376898825218218566 [03/25/2022-13:24:28] [V] [TRT] Tactic: 2376898825218218566 Time: 0.068864 [03/25/2022-13:24:28] [V] [TRT] sections.1.3.conv1.module.weight + QuantizeLinear_364_quantize_scale_node + Conv_368 + Relu_370 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_singleBuffer_interior_nt_v1 Tactic: 2522133112320625287 [03/25/2022-13:24:28] [V] [TRT] Tactic: 2522133112320625287 Time: 0.097792 [03/25/2022-13:24:28] [V] [TRT] sections.1.3.conv1.module.weight + QuantizeLinear_364_quantize_scale_node + Conv_368 + Relu_370 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage4_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 2548171972648455240 [03/25/2022-13:24:28] [V] [TRT] Tactic: 2548171972648455240 Time: 0.070528 [03/25/2022-13:24:28] [V] [TRT] sections.1.3.conv1.module.weight + QuantizeLinear_364_quantize_scale_node + Conv_368 + Relu_370 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage1_warpsize2x1x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: 2548946449357458230 [03/25/2022-13:24:28] [V] [TRT] Tactic: 2548946449357458230 Time: 0.114048 [03/25/2022-13:24:28] [V] [TRT] sections.1.3.conv1.module.weight + QuantizeLinear_364_quantize_scale_node + Conv_368 + Relu_370 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x128_stage3_warpsize1x1x1_g1_sptensor16x8x64_simple_t1r1s1_no_preds Tactic: 2570666021825229009 [03/25/2022-13:24:28] [V] [TRT] Tactic: 2570666021825229009 Time: 0.097792 [03/25/2022-13:24:28] [V] 
[TRT] sections.1.3.conv1.module.weight + QuantizeLinear_364_quantize_scale_node + Conv_368 + Relu_370 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x16x128_stage3_warpsize1x1x1_g1_sptensor16x8x64_simple_t1r1s1 Tactic: 2678520742286844763 [03/25/2022-13:24:28] [V] [TRT] Tactic: 2678520742286844763 Time: 0.141056 [03/25/2022-13:24:28] [V] [TRT] sections.1.3.conv1.module.weight + QuantizeLinear_364_quantize_scale_node + Conv_368 + Relu_370 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 2756291002030759362 [03/25/2022-13:24:28] [V] [TRT] Tactic: 2756291002030759362 Time: 0.083968 [03/25/2022-13:24:28] [V] [TRT] sections.1.3.conv1.module.weight + QuantizeLinear_364_quantize_scale_node + Conv_368 + Relu_370 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x128x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 2972948223367788520 [03/25/2022-13:24:28] [V] [TRT] Tactic: 2972948223367788520 Time: 0.07104 [03/25/2022-13:24:28] [V] [TRT] sections.1.3.conv1.module.weight + QuantizeLinear_364_quantize_scale_node + Conv_368 + Relu_370 Set Tactic Name: ampere_int8_i8816cudnn_int8_128x128_ldg16_relu_medium_nt_v1 Tactic: 2985940154541537814 [03/25/2022-13:24:28] [V] [TRT] Tactic: 2985940154541537814 Time: 0.099328 [03/25/2022-13:24:28] [V] [TRT] sections.1.3.conv1.module.weight + QuantizeLinear_364_quantize_scale_node + Conv_368 + Relu_370 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x96x64_stage5_warpsize2x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 3043273137345374664 [03/25/2022-13:24:28] [V] [TRT] Tactic: 3043273137345374664 Time: 0.143488 [03/25/2022-13:24:28] [V] [TRT] sections.1.3.conv1.module.weight + QuantizeLinear_364_quantize_scale_node + Conv_368 + Relu_370 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 3221677093659484230 [03/25/2022-13:24:28] [V] [TRT] Tactic: 3221677093659484230 Time: 0.142976 [03/25/2022-13:24:28] [V] [TRT] sections.1.3.conv1.module.weight + QuantizeLinear_364_quantize_scale_node + Conv_368 + Relu_370 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r1s1 Tactic: 3242897809704328258 [03/25/2022-13:24:28] [V] [TRT] Tactic: 3242897809704328258 Time: 0.097664 [03/25/2022-13:24:28] [V] [TRT] sections.1.3.conv1.module.weight + QuantizeLinear_364_quantize_scale_node + Conv_368 + Relu_370 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: 3312456766204252694 [03/25/2022-13:24:29] [V] [TRT] Tactic: 3312456766204252694 Time: 0.120192 [03/25/2022-13:24:29] [V] [TRT] sections.1.3.conv1.module.weight + QuantizeLinear_364_quantize_scale_node + Conv_368 + Relu_370 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 3538565962642681625 
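The `sm80_xmma_fprop_sparse_conv_…_sptensor16x8x64` entries in this sweep are Ampere sparse-tensor-core kernels; the builder only considers them because the layer's weights satisfy the 2:4 structured-sparsity pattern, i.e. at most two non-zeros in every group of four consecutive values along the reduction dimension. A small NumPy sketch of that property check — written here for illustration, not taken from any TensorRT API, and simplified to treat the last axis as the reduction axis:

```python
import numpy as np

def is_2to4_sparse(weights: np.ndarray) -> bool:
    """True if every group of 4 values along the last (reduction) axis
    has at most 2 non-zeros -- the pattern Ampere sparse tensor cores need."""
    k = weights.reshape(-1, weights.shape[-1])   # collapse to (rows, reduction)
    if k.shape[-1] % 4:
        return False
    groups = k.reshape(k.shape[0], -1, 4)        # (rows, reduction/4, 4)
    return bool(((groups != 0).sum(axis=-1) <= 2).all())

# Example: a 64x64 1x1-conv weight, pruned so each group of 4 keeps its
# 2 largest-magnitude values (the usual magnitude-based 2:4 pruning rule).
w = np.random.randn(64, 64).astype(np.float32)
grouped = w.reshape(64, -1, 4)                              # view into w
drop = np.argsort(np.abs(grouped), axis=-1)[..., :2]        # 2 smallest per group
np.put_along_axis(grouped, drop, 0.0, axis=-1)              # zero them in place
assert is_2to4_sparse(w)
```

When the pattern holds, the sparse tactics above simply join the timing competition; in this sweep several of them land within a few microseconds of the best dense kernels.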
[03/25/2022-13:24:29] [V] [TRT] Tactic: 3538565962642681625 Time: 0.095616 [03/25/2022-13:24:29] [V] [TRT] sections.1.3.conv1.module.weight + QuantizeLinear_364_quantize_scale_node + Conv_368 + Relu_370 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage1_warpsize2x2x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: 3541919052468401776 [03/25/2022-13:24:29] [V] [TRT] Tactic: 3541919052468401776 Time: 0.091264 [03/25/2022-13:24:29] [V] [TRT] sections.1.3.conv1.module.weight + QuantizeLinear_364_quantize_scale_node + Conv_368 + Relu_370 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 3593397928177382100 [03/25/2022-13:24:29] [V] [TRT] Tactic: 3593397928177382100 Time: 0.141696 [03/25/2022-13:24:29] [V] [TRT] sections.1.3.conv1.module.weight + QuantizeLinear_364_quantize_scale_node + Conv_368 + Relu_370 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x64x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 3670282018109435863 [03/25/2022-13:24:29] [V] [TRT] Tactic: 3670282018109435863 Time: 0.080384 [03/25/2022-13:24:29] [V] [TRT] sections.1.3.conv1.module.weight + QuantizeLinear_364_quantize_scale_node + Conv_368 + Relu_370 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 3671413346254027573 [03/25/2022-13:24:29] [V] [TRT] Tactic: 3671413346254027573 Time: 0.098432 [03/25/2022-13:24:29] [V] [TRT] sections.1.3.conv1.module.weight + QuantizeLinear_364_quantize_scale_node + Conv_368 + Relu_370 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x128_ldg16_relu_medium_nt_v1 Tactic: 3899284354987683408 [03/25/2022-13:24:29] [V] [TRT] Tactic: 3899284354987683408 Time: 0.107648 [03/25/2022-13:24:29] [V] [TRT] sections.1.3.conv1.module.weight + QuantizeLinear_364_quantize_scale_node + Conv_368 + Relu_370 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: 3927509214678622419 [03/25/2022-13:24:29] [V] [TRT] Tactic: 3927509214678622419 Time: 0.094976 [03/25/2022-13:24:29] [V] [TRT] sections.1.3.conv1.module.weight + QuantizeLinear_364_quantize_scale_node + Conv_368 + Relu_370 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 4112572034735311841 [03/25/2022-13:24:29] [V] [TRT] Tactic: 4112572034735311841 Time: 0.16832 [03/25/2022-13:24:29] [V] [TRT] sections.1.3.conv1.module.weight + QuantizeLinear_364_quantize_scale_node + Conv_368 + Relu_370 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 4239974928951431644 [03/25/2022-13:24:29] [V] [TRT] Tactic: 4239974928951431644 Time: 0.10112 [03/25/2022-13:24:29] [V] [TRT] sections.1.3.conv1.module.weight + QuantizeLinear_364_quantize_scale_node + Conv_368 + Relu_370 Set Tactic Name: 
sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage6_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 4610760414797216079 [03/25/2022-13:24:29] [V] [TRT] Tactic: 4610760414797216079 Time: 0.086784 [03/25/2022-13:24:29] [V] [TRT] sections.1.3.conv1.module.weight + QuantizeLinear_364_quantize_scale_node + Conv_368 + Relu_370 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_small_nt_v1 Tactic: 4717285412741024953 [03/25/2022-13:24:29] [V] [TRT] Tactic: 4717285412741024953 Time: 0.100736 [03/25/2022-13:24:29] [V] [TRT] sections.1.3.conv1.module.weight + QuantizeLinear_364_quantize_scale_node + Conv_368 + Relu_370 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 4796956614760326119 [03/25/2022-13:24:29] [V] [TRT] Tactic: 4796956614760326119 Time: 0.09408 [03/25/2022-13:24:29] [V] [TRT] sections.1.3.conv1.module.weight + QuantizeLinear_364_quantize_scale_node + Conv_368 + Relu_370 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x128_stage3_warpsize2x2x1_g1_sptensor16x8x64_t1r1s1_no_preds Tactic: 4909502217677847353 [03/25/2022-13:24:29] [V] [TRT] Tactic: 4909502217677847353 Time: 0.067584 [03/25/2022-13:24:29] [V] [TRT] sections.1.3.conv1.module.weight + QuantizeLinear_364_quantize_scale_node + Conv_368 + Relu_370 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 4919361344804309192 [03/25/2022-13:24:29] [V] [TRT] Tactic: 4919361344804309192 Time: 0.128 [03/25/2022-13:24:29] [V] [TRT] sections.1.3.conv1.module.weight + QuantizeLinear_364_quantize_scale_node + Conv_368 + Relu_370 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x128_stage3_warpsize1x2x1_g1_sptensor16x8x64_simple_t1r1s1 Tactic: 5043674678294309681 [03/25/2022-13:24:29] [V] [TRT] Tactic: 5043674678294309681 Time: 0.076544 [03/25/2022-13:24:29] [V] [TRT] sections.1.3.conv1.module.weight + QuantizeLinear_364_quantize_scale_node + Conv_368 + Relu_370 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_interior_nt_v1 Tactic: 5126565865931538390 [03/25/2022-13:24:29] [V] [TRT] Tactic: 5126565865931538390 Time: 0.101632 [03/25/2022-13:24:29] [V] [TRT] sections.1.3.conv1.module.weight + QuantizeLinear_364_quantize_scale_node + Conv_368 + Relu_370 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x64x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 5204702486885981735 [03/25/2022-13:24:29] [V] [TRT] Tactic: 5204702486885981735 Time: 0.080896 [03/25/2022-13:24:29] [V] [TRT] sections.1.3.conv1.module.weight + QuantizeLinear_364_quantize_scale_node + Conv_368 + Relu_370 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: 5375256703210220108 [03/25/2022-13:24:29] [V] [TRT] Tactic: 5375256703210220108 Time: 0.078464 [03/25/2022-13:24:29] [V] [TRT] sections.1.3.conv1.module.weight + QuantizeLinear_364_quantize_scale_node + Conv_368 + Relu_370 Set Tactic Name: 
sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x128_stage3_warpsize2x2x1_g1_sptensor16x8x64_simple_t1r1s1 Tactic: 5424258848951129084 [03/25/2022-13:24:29] [V] [TRT] Tactic: 5424258848951129084 Time: 0.067456 [03/25/2022-13:24:29] [V] [TRT] sections.1.3.conv1.module.weight + QuantizeLinear_364_quantize_scale_node + Conv_368 + Relu_370 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 5443897483205284103 [03/25/2022-13:24:29] [V] [TRT] Tactic: 5443897483205284103 Time: 0.102016 [03/25/2022-13:24:29] [V] [TRT] sections.1.3.conv1.module.weight + QuantizeLinear_364_quantize_scale_node + Conv_368 + Relu_370 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage4_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 5707566217891294846 [03/25/2022-13:24:29] [V] [TRT] Tactic: 5707566217891294846 Time: 0.074624 [03/25/2022-13:24:29] [V] [TRT] sections.1.3.conv1.module.weight + QuantizeLinear_364_quantize_scale_node + Conv_368 + Relu_370 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x4x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 5986622376339202983 [03/25/2022-13:24:29] [V] [TRT] Tactic: 5986622376339202983 Time: 0.102016 [03/25/2022-13:24:29] [V] [TRT] sections.1.3.conv1.module.weight + QuantizeLinear_364_quantize_scale_node + Conv_368 + Relu_370 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x128x64_stage3_warpsize4x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 6007888770437705057 [03/25/2022-13:24:29] [V] [TRT] Tactic: 6007888770437705057 Time: 0.079104 [03/25/2022-13:24:29] [V] [TRT] sections.1.3.conv1.module.weight + QuantizeLinear_364_quantize_scale_node + Conv_368 + Relu_370 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32_t1r1s1 Tactic: 6405251167055673379 [03/25/2022-13:24:29] [V] [TRT] Tactic: 6405251167055673379 Time: 0.096 [03/25/2022-13:24:29] [V] [TRT] sections.1.3.conv1.module.weight + QuantizeLinear_364_quantize_scale_node + Conv_368 + Relu_370 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage4_warpsize4x1x1_g1_tensor16x8x32_t1r1s1 Tactic: 6433368103202497147 [03/25/2022-13:24:29] [V] [TRT] Tactic: 6433368103202497147 Time: 0.077568 [03/25/2022-13:24:29] [V] [TRT] sections.1.3.conv1.module.weight + QuantizeLinear_364_quantize_scale_node + Conv_368 + Relu_370 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 6441948709525127755 [03/25/2022-13:24:29] [V] [TRT] Tactic: 6441948709525127755 Time: 0.144256 [03/25/2022-13:24:29] [V] [TRT] sections.1.3.conv1.module.weight + QuantizeLinear_364_quantize_scale_node + Conv_368 + Relu_370 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x256x64_stage3_warpsize2x4x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 6443933097134654777 [03/25/2022-13:24:29] [V] [TRT] 
Tactic: 6443933097134654777 Time: 0.120576 [03/25/2022-13:24:29] [V] [TRT] sections.1.3.conv1.module.weight + QuantizeLinear_364_quantize_scale_node + Conv_368 + Relu_370 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 6457435868048963632 [03/25/2022-13:24:29] [V] [TRT] Tactic: 6457435868048963632 Time: 0.087424 [03/25/2022-13:24:29] [V] [TRT] sections.1.3.conv1.module.weight + QuantizeLinear_364_quantize_scale_node + Conv_368 + Relu_370 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x128_stage3_warpsize1x1x1_g1_sptensor16x8x64_t1r1s1 Tactic: 6510345569544721081 [03/25/2022-13:24:29] [V] [TRT] Tactic: 6510345569544721081 Time: 0.098688 [03/25/2022-13:24:29] [V] [TRT] sections.1.3.conv1.module.weight + QuantizeLinear_364_quantize_scale_node + Conv_368 + Relu_370 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 6793988781414507278 [03/25/2022-13:24:29] [V] [TRT] Tactic: 6793988781414507278 Time: 0.071552 [03/25/2022-13:24:29] [V] [TRT] sections.1.3.conv1.module.weight + QuantizeLinear_364_quantize_scale_node + Conv_368 + Relu_370 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 6880710371738875469 [03/25/2022-13:24:29] [V] [TRT] Tactic: 6880710371738875469 Time: 0.094848 [03/25/2022-13:24:29] [V] [TRT] sections.1.3.conv1.module.weight + QuantizeLinear_364_quantize_scale_node + Conv_368 + Relu_370 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 6925201228918187099 [03/25/2022-13:24:29] [V] [TRT] Tactic: 6925201228918187099 Time: 0.073856 [03/25/2022-13:24:29] [V] [TRT] sections.1.3.conv1.module.weight + QuantizeLinear_364_quantize_scale_node + Conv_368 + Relu_370 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 6991524515605108718 [03/25/2022-13:24:29] [V] [TRT] Tactic: 6991524515605108718 Time: 0.106368 [03/25/2022-13:24:29] [V] [TRT] sections.1.3.conv1.module.weight + QuantizeLinear_364_quantize_scale_node + Conv_368 + Relu_370 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x4x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 7245509442265271220 [03/25/2022-13:24:29] [V] [TRT] Tactic: 7245509442265271220 Time: 0.10176 [03/25/2022-13:24:29] [V] [TRT] sections.1.3.conv1.module.weight + QuantizeLinear_364_quantize_scale_node + Conv_368 + Relu_370 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x64x64_stage3_warpsize4x1x1_g1_tensor16x8x32_t1r1s1 Tactic: 7318929579222925725 [03/25/2022-13:24:29] [V] [TRT] Tactic: 7318929579222925725 Time: 0.07424 [03/25/2022-13:24:29] [V] [TRT] sections.1.3.conv1.module.weight + QuantizeLinear_364_quantize_scale_node + Conv_368 + Relu_370 Set Tactic Name: 
sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: 7731430299029542276 [03/25/2022-13:24:29] [V] [TRT] Tactic: 7731430299029542276 Time: 0.07104 [03/25/2022-13:24:29] [V] [TRT] sections.1.3.conv1.module.weight + QuantizeLinear_364_quantize_scale_node + Conv_368 + Relu_370 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage4_warpsize2x4x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: 7738495016763012180 [03/25/2022-13:24:29] [V] [TRT] Tactic: 7738495016763012180 Time: 0.11328 [03/25/2022-13:24:29] [V] [TRT] sections.1.3.conv1.module.weight + QuantizeLinear_364_quantize_scale_node + Conv_368 + Relu_370 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x128_stage3_warpsize1x4x1_g1_sptensor16x8x64_t1r1s1_no_preds Tactic: 7886967395128926382 [03/25/2022-13:24:29] [V] [TRT] Tactic: 7886967395128926382 Time: 0.070912 [03/25/2022-13:24:29] [V] [TRT] sections.1.3.conv1.module.weight + QuantizeLinear_364_quantize_scale_node + Conv_368 + Relu_370 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage4_warpsize4x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 8142283985160822229 [03/25/2022-13:24:29] [V] [TRT] Tactic: 8142283985160822229 Time: 0.078976 [03/25/2022-13:24:29] [V] [TRT] sections.1.3.conv1.module.weight + QuantizeLinear_364_quantize_scale_node + Conv_368 + Relu_370 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage4_warpsize2x4x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 8173975624668590862 [03/25/2022-13:24:29] [V] [TRT] Tactic: 8173975624668590862 Time: 0.123008 [03/25/2022-13:24:29] [V] [TRT] sections.1.3.conv1.module.weight + QuantizeLinear_364_quantize_scale_node + Conv_368 + Relu_370 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage4_warpsize4x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 8234775147403903473 [03/25/2022-13:24:29] [V] [TRT] Tactic: 8234775147403903473 Time: 0.081152 [03/25/2022-13:24:29] [V] [TRT] sections.1.3.conv1.module.weight + QuantizeLinear_364_quantize_scale_node + Conv_368 + Relu_370 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x128x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 8524082966802584889 [03/25/2022-13:24:29] [V] [TRT] Tactic: 8524082966802584889 Time: 0.071168 [03/25/2022-13:24:29] [V] [TRT] sections.1.3.conv1.module.weight + QuantizeLinear_364_quantize_scale_node + Conv_368 + Relu_370 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage1_warpsize4x1x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: 8684013308930763400 [03/25/2022-13:24:29] [V] [TRT] Tactic: 8684013308930763400 Time: 0.096 [03/25/2022-13:24:29] [V] [TRT] sections.1.3.conv1.module.weight + QuantizeLinear_364_quantize_scale_node + Conv_368 + Relu_370 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x128x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 8765382722978397630 
[03/25/2022-13:24:29] [V] [TRT] Tactic: 8765382722978397630 Time: 0.071936 [03/25/2022-13:24:29] [V] [TRT] sections.1.3.conv1.module.weight + QuantizeLinear_364_quantize_scale_node + Conv_368 + Relu_370 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 8843193587782643431 [03/25/2022-13:24:29] [V] [TRT] Tactic: 8843193587782643431 Time: 0.117632 [03/25/2022-13:24:29] [V] [TRT] sections.1.3.conv1.module.weight + QuantizeLinear_364_quantize_scale_node + Conv_368 + Relu_370 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x128_stage3_warpsize1x4x1_g1_sptensor16x8x64_simple_t1r1s1 Tactic: 8883810517410230831 [03/25/2022-13:24:29] [V] [TRT] Tactic: 8883810517410230831 Time: 0.071424 [03/25/2022-13:24:29] [V] [TRT] sections.1.3.conv1.module.weight + QuantizeLinear_364_quantize_scale_node + Conv_368 + Relu_370 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 8930797211803511337 [03/25/2022-13:24:29] [V] [TRT] Tactic: 8930797211803511337 Time: 0.104704 [03/25/2022-13:24:29] [V] [TRT] sections.1.3.conv1.module.weight + QuantizeLinear_364_quantize_scale_node + Conv_368 + Relu_370 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 8935070489925739043 [03/25/2022-13:24:29] [V] [TRT] Tactic: 8935070489925739043 Time: 0.0864 [03/25/2022-13:24:29] [V] [TRT] sections.1.3.conv1.module.weight + QuantizeLinear_364_quantize_scale_node + Conv_368 + Relu_370 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x16x128_stage3_warpsize1x1x1_g1_sptensor16x8x64_t1r1s1 Tactic: 9062173295331155069 [03/25/2022-13:24:29] [V] [TRT] Tactic: 9062173295331155069 Time: 0.141568 [03/25/2022-13:24:29] [V] [TRT] sections.1.3.conv1.module.weight + QuantizeLinear_364_quantize_scale_node + Conv_368 + Relu_370 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_singleBuffer_small_nt_v1 Tactic: -9118785798277698619 [03/25/2022-13:24:29] [V] [TRT] Tactic: -9118785798277698619 Time: 0.099584 [03/25/2022-13:24:29] [V] [TRT] sections.1.3.conv1.module.weight + QuantizeLinear_364_quantize_scale_node + Conv_368 + Relu_370 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -8985599729413291927 [03/25/2022-13:24:29] [V] [TRT] Tactic: -8985599729413291927 Time: 0.07616 [03/25/2022-13:24:29] [V] [TRT] sections.1.3.conv1.module.weight + QuantizeLinear_364_quantize_scale_node + Conv_368 + Relu_370 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x32x64_stage4_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -8972697510150675429 [03/25/2022-13:24:29] [V] [TRT] Tactic: -8972697510150675429 Time: 0.091904 [03/25/2022-13:24:29] [V] [TRT] sections.1.3.conv1.module.weight + QuantizeLinear_364_quantize_scale_node + Conv_368 + Relu_370 Set Tactic Name: 
sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r1s1 Tactic: -8943710627305202139 [03/25/2022-13:24:29] [V] [TRT] Tactic: -8943710627305202139 Time: 0.094336 [03/25/2022-13:24:29] [V] [TRT] sections.1.3.conv1.module.weight + QuantizeLinear_364_quantize_scale_node + Conv_368 + Relu_370 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: -8859846367886814331 [03/25/2022-13:24:29] [V] [TRT] Tactic: -8859846367886814331 Time: 0.105984 [03/25/2022-13:24:29] [V] [TRT] sections.1.3.conv1.module.weight + QuantizeLinear_364_quantize_scale_node + Conv_368 + Relu_370 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: -8638624340850784688 [03/25/2022-13:24:29] [V] [TRT] Tactic: -8638624340850784688 Time: 0.121472 [03/25/2022-13:24:29] [V] [TRT] sections.1.3.conv1.module.weight + QuantizeLinear_364_quantize_scale_node + Conv_368 + Relu_370 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage4_warpsize4x1x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: -8556775352640313933 [03/25/2022-13:24:29] [V] [TRT] Tactic: -8556775352640313933 Time: 0.074112 [03/25/2022-13:24:29] [V] [TRT] sections.1.3.conv1.module.weight + QuantizeLinear_364_quantize_scale_node + Conv_368 + Relu_370 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage1_warpsize2x1x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: -8382298409581540699 [03/25/2022-13:24:29] [V] [TRT] Tactic: -8382298409581540699 Time: 0.137344 [03/25/2022-13:24:29] [V] [TRT] sections.1.3.conv1.module.weight + QuantizeLinear_364_quantize_scale_node + Conv_368 + Relu_370 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage1_warpsize4x2x1_g1_tensor8x8x16_t1r1s1 Tactic: -8172318747337038866 [03/25/2022-13:24:29] [V] [TRT] Tactic: -8172318747337038866 Time: 0.102784 [03/25/2022-13:24:29] [V] [TRT] sections.1.3.conv1.module.weight + QuantizeLinear_364_quantize_scale_node + Conv_368 + Relu_370 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x256x64_stage3_warpsize2x4x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -8038164441468184723 [03/25/2022-13:24:29] [V] [TRT] Tactic: -8038164441468184723 Time: 0.106624 [03/25/2022-13:24:29] [V] [TRT] sections.1.3.conv1.module.weight + QuantizeLinear_364_quantize_scale_node + Conv_368 + Relu_370 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x96x64_stage5_warpsize2x1x1_g1_tensor16x8x32_t1r1s1 Tactic: -7844028314176826857 [03/25/2022-13:24:29] [V] [TRT] Tactic: -7844028314176826857 Time: 0.145408 [03/25/2022-13:24:29] [V] [TRT] sections.1.3.conv1.module.weight + QuantizeLinear_364_quantize_scale_node + Conv_368 + Relu_370 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x128_stage3_warpsize2x2x1_g1_sptensor16x8x64_simple_t1r1s1_no_preds Tactic: -7674507941016740570 
[03/25/2022-13:24:29] [V] [TRT] Tactic: -7674507941016740570 Time: 0.067456 [03/25/2022-13:24:29] [V] [TRT] sections.1.3.conv1.module.weight + QuantizeLinear_364_quantize_scale_node + Conv_368 + Relu_370 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage4_warpsize2x4x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -7364286662638617917 [03/25/2022-13:24:29] [V] [TRT] Tactic: -7364286662638617917 Time: 0.10944 [03/25/2022-13:24:29] [V] [TRT] sections.1.3.conv1.module.weight + QuantizeLinear_364_quantize_scale_node + Conv_368 + Relu_370 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage1_warpsize2x1x1_g1_tensor8x8x16_t1r1s1 Tactic: -7361755530333096258 [03/25/2022-13:24:29] [V] [TRT] Tactic: -7361755530333096258 Time: 0.116736 [03/25/2022-13:24:29] [V] [TRT] sections.1.3.conv1.module.weight + QuantizeLinear_364_quantize_scale_node + Conv_368 + Relu_370 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x128_stage3_warpsize1x4x1_g1_sptensor16x8x64_t1r1s1 Tactic: -7289760022626653388 [03/25/2022-13:24:29] [V] [TRT] Tactic: -7289760022626653388 Time: 0.071936 [03/25/2022-13:24:29] [V] [TRT] sections.1.3.conv1.module.weight + QuantizeLinear_364_quantize_scale_node + Conv_368 + Relu_370 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage1_warpsize2x4x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: -7106539943789766885 [03/25/2022-13:24:29] [V] [TRT] Tactic: -7106539943789766885 Time: 0.182912 [03/25/2022-13:24:29] [V] [TRT] sections.1.3.conv1.module.weight + QuantizeLinear_364_quantize_scale_node + Conv_368 + Relu_370 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage1_warpsize4x2x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: -6969478418607271266 [03/25/2022-13:24:29] [V] [TRT] Tactic: -6969478418607271266 Time: 0.100864 [03/25/2022-13:24:29] [V] [TRT] sections.1.3.conv1.module.weight + QuantizeLinear_364_quantize_scale_node + Conv_368 + Relu_370 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -6930438165437733000 [03/25/2022-13:24:29] [V] [TRT] Tactic: -6930438165437733000 Time: 0.170112 [03/25/2022-13:24:29] [V] [TRT] sections.1.3.conv1.module.weight + QuantizeLinear_364_quantize_scale_node + Conv_368 + Relu_370 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x64x64_stage3_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -6879607992933502380 [03/25/2022-13:24:29] [V] [TRT] Tactic: -6879607992933502380 Time: 0.072576 [03/25/2022-13:24:29] [V] [TRT] sections.1.3.conv1.module.weight + QuantizeLinear_364_quantize_scale_node + Conv_368 + Relu_370 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x128_stage3_warpsize1x2x1_g1_sptensor16x8x64_simple_t1r1s1_no_preds Tactic: -6839669803644810934 [03/25/2022-13:24:29] [V] [TRT] Tactic: -6839669803644810934 Time: 0.07616 [03/25/2022-13:24:29] [V] [TRT] sections.1.3.conv1.module.weight + 
QuantizeLinear_364_quantize_scale_node + Conv_368 + Relu_370 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x128_stage3_warpsize1x2x1_g1_sptensor16x8x64_t1r1s1_no_preds Tactic: -6812830108414456369 [03/25/2022-13:24:29] [V] [TRT] Tactic: -6812830108414456369 Time: 0.076288 [03/25/2022-13:24:29] [V] [TRT] sections.1.3.conv1.module.weight + QuantizeLinear_364_quantize_scale_node + Conv_368 + Relu_370 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: -6527178416855951297 [03/25/2022-13:24:29] [V] [TRT] Tactic: -6527178416855951297 Time: 0.131456 [03/25/2022-13:24:29] [V] [TRT] sections.1.3.conv1.module.weight + QuantizeLinear_364_quantize_scale_node + Conv_368 + Relu_370 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_t1r1s1 Tactic: -6510232214299595844 [03/25/2022-13:24:29] [V] [TRT] Tactic: -6510232214299595844 Time: 0.127872 [03/25/2022-13:24:29] [V] [TRT] sections.1.3.conv1.module.weight + QuantizeLinear_364_quantize_scale_node + Conv_368 + Relu_370 Set Tactic Name: ampere_int8_i8816cudnn_int8_128x128_ldg16_relu_small_nt_v1 Tactic: -6400348606759295499 [03/25/2022-13:24:29] [V] [TRT] Tactic: -6400348606759295499 Time: 0.094976 [03/25/2022-13:24:29] [V] [TRT] sections.1.3.conv1.module.weight + QuantizeLinear_364_quantize_scale_node + Conv_368 + Relu_370 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage1_warpsize2x2x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: -6346247605026339453 [03/25/2022-13:24:29] [V] [TRT] Tactic: -6346247605026339453 Time: 0.093056 [03/25/2022-13:24:29] [V] [TRT] sections.1.3.conv1.module.weight + QuantizeLinear_364_quantize_scale_node + Conv_368 + Relu_370 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x96x64_stage4_warpsize4x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -6232597026469067819 [03/25/2022-13:24:29] [V] [TRT] Tactic: -6232597026469067819 Time: 0.120832 [03/25/2022-13:24:29] [V] [TRT] sections.1.3.conv1.module.weight + QuantizeLinear_364_quantize_scale_node + Conv_368 + Relu_370 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x128_ldg16_relu_small_nt_v1 Tactic: -5980889159865208399 [03/25/2022-13:24:29] [V] [TRT] Tactic: -5980889159865208399 Time: 0.104832 [03/25/2022-13:24:29] [V] [TRT] sections.1.3.conv1.module.weight + QuantizeLinear_364_quantize_scale_node + Conv_368 + Relu_370 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_singleBuffer_medium_nt_v1 Tactic: -5766140806760372989 [03/25/2022-13:24:29] [V] [TRT] Tactic: -5766140806760372989 Time: 0.10176 [03/25/2022-13:24:29] [V] [TRT] sections.1.3.conv1.module.weight + QuantizeLinear_364_quantize_scale_node + Conv_368 + Relu_370 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r1s1 Tactic: -5697614955743334137 [03/25/2022-13:24:29] [V] [TRT] Tactic: -5697614955743334137 Time: 0.092928 [03/25/2022-13:24:29] [V] [TRT] sections.1.3.conv1.module.weight + QuantizeLinear_364_quantize_scale_node + Conv_368 + Relu_370 Set Tactic Name: 
sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -5671123121710113970 [03/25/2022-13:24:29] [V] [TRT] Tactic: -5671123121710113970 Time: 0.104448 [03/25/2022-13:24:29] [V] [TRT] sections.1.3.conv1.module.weight + QuantizeLinear_364_quantize_scale_node + Conv_368 + Relu_370 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x32x64_stage4_warpsize2x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -5615581362569252260 [03/25/2022-13:24:29] [V] [TRT] Tactic: -5615581362569252260 Time: 0.107392 [03/25/2022-13:24:29] [V] [TRT] sections.1.3.conv1.module.weight + QuantizeLinear_364_quantize_scale_node + Conv_368 + Relu_370 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x128x64_stage3_warpsize4x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -5562968047117507056 [03/25/2022-13:24:29] [V] [TRT] Tactic: -5562968047117507056 Time: 0.078208 [03/25/2022-13:24:29] [V] [TRT] sections.1.3.conv1.module.weight + QuantizeLinear_364_quantize_scale_node + Conv_368 + Relu_370 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -5516472881360101487 [03/25/2022-13:24:29] [V] [TRT] Tactic: -5516472881360101487 Time: 0.120576 [03/25/2022-13:24:29] [V] [TRT] sections.1.3.conv1.module.weight + QuantizeLinear_364_quantize_scale_node + Conv_368 + Relu_370 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r1s1 Tactic: -5311474420963248369 [03/25/2022-13:24:29] [V] [TRT] Tactic: -5311474420963248369 Time: 0.121856 [03/25/2022-13:24:29] [V] [TRT] sections.1.3.conv1.module.weight + QuantizeLinear_364_quantize_scale_node + Conv_368 + Relu_370 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: -5170003087447722174 [03/25/2022-13:24:29] [V] [TRT] Tactic: -5170003087447722174 Time: 0.143872 [03/25/2022-13:24:29] [V] [TRT] sections.1.3.conv1.module.weight + QuantizeLinear_364_quantize_scale_node + Conv_368 + Relu_370 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage6_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -4889586143772361690 [03/25/2022-13:24:29] [V] [TRT] Tactic: -4889586143772361690 Time: 0.08384 [03/25/2022-13:24:29] [V] [TRT] sections.1.3.conv1.module.weight + QuantizeLinear_364_quantize_scale_node + Conv_368 + Relu_370 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x128_stage3_warpsize1x4x1_g1_sptensor16x8x64_simple_t1r1s1_no_preds Tactic: -4889498558023475527 [03/25/2022-13:24:29] [V] [TRT] Tactic: -4889498558023475527 Time: 0.071808 [03/25/2022-13:24:29] [V] [TRT] sections.1.3.conv1.module.weight + QuantizeLinear_364_quantize_scale_node + Conv_368 + Relu_370 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1_epifadd 
Tactic: -4849712423393454704 [03/25/2022-13:24:29] [V] [TRT] Tactic: -4849712423393454704 Time: 0.08384 [03/25/2022-13:24:29] [V] [TRT] sections.1.3.conv1.module.weight + QuantizeLinear_364_quantize_scale_node + Conv_368 + Relu_370 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x128_stage3_warpsize4x2x1_g1_sptensor16x8x64_simple_t1r1s1 Tactic: -4681913707320020520 [03/25/2022-13:24:29] [V] [TRT] Tactic: -4681913707320020520 Time: 0.10176 [03/25/2022-13:24:29] [V] [TRT] sections.1.3.conv1.module.weight + QuantizeLinear_364_quantize_scale_node + Conv_368 + Relu_370 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_medium_nt_v1 Tactic: -4516822589357530549 [03/25/2022-13:24:29] [V] [TRT] Tactic: -4516822589357530549 Time: 0.10496 [03/25/2022-13:24:29] [V] [TRT] sections.1.3.conv1.module.weight + QuantizeLinear_364_quantize_scale_node + Conv_368 + Relu_370 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -4455415102719506646 [03/25/2022-13:24:29] [V] [TRT] Tactic: -4455415102719506646 Time: 0.097408 [03/25/2022-13:24:29] [V] [TRT] sections.1.3.conv1.module.weight + QuantizeLinear_364_quantize_scale_node + Conv_368 + Relu_370 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32_t1r1s1 Tactic: -4425346730823666456 [03/25/2022-13:24:29] [V] [TRT] Tactic: -4425346730823666456 Time: 0.129792 [03/25/2022-13:24:29] [V] [TRT] sections.1.3.conv1.module.weight + QuantizeLinear_364_quantize_scale_node + Conv_368 + Relu_370 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage1_warpsize2x1x1_g1_tensor8x8x16_t1r1s1 Tactic: -4260476497340370474 [03/25/2022-13:24:29] [V] [TRT] Tactic: -4260476497340370474 Time: 0.138496 [03/25/2022-13:24:29] [V] [TRT] sections.1.3.conv1.module.weight + QuantizeLinear_364_quantize_scale_node + Conv_368 + Relu_370 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x2x1_g1_tensor16x8x32_t1r1s1 Tactic: -4182501876984672402 [03/25/2022-13:24:29] [V] [TRT] Tactic: -4182501876984672402 Time: 0.104832 [03/25/2022-13:24:29] [V] [TRT] sections.1.3.conv1.module.weight + QuantizeLinear_364_quantize_scale_node + Conv_368 + Relu_370 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x256x64_stage3_warpsize2x4x1_g1_tensor16x8x32_t1r1s1 Tactic: -4151617293257698859 [03/25/2022-13:24:29] [V] [TRT] Tactic: -4151617293257698859 Time: 0.122112 [03/25/2022-13:24:29] [V] [TRT] sections.1.3.conv1.module.weight + QuantizeLinear_364_quantize_scale_node + Conv_368 + Relu_370 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -3825889760337461729 [03/25/2022-13:24:29] [V] [TRT] Tactic: -3825889760337461729 Time: 0.12672 [03/25/2022-13:24:29] [V] [TRT] sections.1.3.conv1.module.weight + QuantizeLinear_364_quantize_scale_node + Conv_368 + Relu_370 Set Tactic Name: 
sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -3797022944823726673 [03/25/2022-13:24:29] [V] [TRT] Tactic: -3797022944823726673 Time: 0.09536 [03/25/2022-13:24:29] [V] [TRT] sections.1.3.conv1.module.weight + QuantizeLinear_364_quantize_scale_node + Conv_368 + Relu_370 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_t1r1s1 Tactic: -3613322253849278738 [03/25/2022-13:24:29] [V] [TRT] Tactic: -3613322253849278738 Time: 0.172416 [03/25/2022-13:24:29] [V] [TRT] sections.1.3.conv1.module.weight + QuantizeLinear_364_quantize_scale_node + Conv_368 + Relu_370 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x32x64_stage4_warpsize4x1x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: -3577322188448771475 [03/25/2022-13:24:29] [V] [TRT] Tactic: -3577322188448771475 Time: 0.097536 [03/25/2022-13:24:29] [V] [TRT] sections.1.3.conv1.module.weight + QuantizeLinear_364_quantize_scale_node + Conv_368 + Relu_370 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage1_warpsize2x4x1_g1_tensor8x8x16_t1r1s1 Tactic: -3531681826488401618 [03/25/2022-13:24:29] [V] [TRT] Tactic: -3531681826488401618 Time: 0.18688 [03/25/2022-13:24:29] [V] [TRT] sections.1.3.conv1.module.weight + QuantizeLinear_364_quantize_scale_node + Conv_368 + Relu_370 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x16x128_stage3_warpsize1x1x1_g1_sptensor16x8x64_simple_t1r1s1_no_preds Tactic: -3305554949874552860 [03/25/2022-13:24:29] [V] [TRT] Tactic: -3305554949874552860 Time: 0.140544 [03/25/2022-13:24:29] [V] [TRT] sections.1.3.conv1.module.weight + QuantizeLinear_364_quantize_scale_node + Conv_368 + Relu_370 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x192x64_stage3_warpsize4x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -3288585994448820820 [03/25/2022-13:24:29] [V] [TRT] Tactic: -3288585994448820820 Time: 0.104192 [03/25/2022-13:24:29] [V] [TRT] sections.1.3.conv1.module.weight + QuantizeLinear_364_quantize_scale_node + Conv_368 + Relu_370 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x32x64_stage4_warpsize4x1x1_g1_tensor16x8x32_t1r1s1 Tactic: -2754311112012636251 [03/25/2022-13:24:29] [V] [TRT] Tactic: -2754311112012636251 Time: 0.097664 [03/25/2022-13:24:29] [V] [TRT] sections.1.3.conv1.module.weight + QuantizeLinear_364_quantize_scale_node + Conv_368 + Relu_370 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -2432868635536396215 [03/25/2022-13:24:29] [V] [TRT] Tactic: -2432868635536396215 Time: 0.103808 [03/25/2022-13:24:29] [V] [TRT] sections.1.3.conv1.module.weight + QuantizeLinear_364_quantize_scale_node + Conv_368 + Relu_370 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x96x64_stage5_warpsize2x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 
-2379804152300264660 [03/25/2022-13:24:29] [V] [TRT] Tactic: -2379804152300264660 Time: 0.142976 [03/25/2022-13:24:29] [V] [TRT] sections.1.3.conv1.module.weight + QuantizeLinear_364_quantize_scale_node + Conv_368 + Relu_370 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x128_stage3_warpsize2x2x1_g1_sptensor16x8x64_t1r1s1 Tactic: -2352253835013627337 [03/25/2022-13:24:29] [V] [TRT] Tactic: -2352253835013627337 Time: 0.068224 [03/25/2022-13:24:29] [V] [TRT] sections.1.3.conv1.module.weight + QuantizeLinear_364_quantize_scale_node + Conv_368 + Relu_370 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x96x64_stage4_warpsize4x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -2335587136911650799 [03/25/2022-13:24:29] [V] [TRT] Tactic: -2335587136911650799 Time: 0.106624 [03/25/2022-13:24:29] [V] [TRT] sections.1.3.conv1.module.weight + QuantizeLinear_364_quantize_scale_node + Conv_368 + Relu_370 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage4_warpsize2x4x1_g1_tensor16x8x32_t1r1s1 Tactic: -2315453944962430928 [03/25/2022-13:24:29] [V] [TRT] Tactic: -2315453944962430928 Time: 0.126208 [03/25/2022-13:24:29] [V] [TRT] sections.1.3.conv1.module.weight + QuantizeLinear_364_quantize_scale_node + Conv_368 + Relu_370 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x128_stage3_warpsize1x1x1_g1_sptensor16x8x64_simple_t1r1s1 Tactic: -2238364958919154661 [03/25/2022-13:24:29] [V] [TRT] Tactic: -2238364958919154661 Time: 0.098176 [03/25/2022-13:24:29] [V] [TRT] sections.1.3.conv1.module.weight + QuantizeLinear_364_quantize_scale_node + Conv_368 + Relu_370 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -1916483171117495388 [03/25/2022-13:24:29] [V] [TRT] Tactic: -1916483171117495388 Time: 0.107264 [03/25/2022-13:24:29] [V] [TRT] sections.1.3.conv1.module.weight + QuantizeLinear_364_quantize_scale_node + Conv_368 + Relu_370 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x16x128_stage3_warpsize1x1x1_g1_sptensor16x8x64_t1r1s1_no_preds Tactic: -1740762957710554518 [03/25/2022-13:24:29] [V] [TRT] Tactic: -1740762957710554518 Time: 0.1408 [03/25/2022-13:24:29] [V] [TRT] sections.1.3.conv1.module.weight + QuantizeLinear_364_quantize_scale_node + Conv_368 + Relu_370 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x96x64_stage4_warpsize4x2x1_g1_tensor16x8x32_t1r1s1 Tactic: -1549742793039499659 [03/25/2022-13:24:29] [V] [TRT] Tactic: -1549742793039499659 Time: 0.122112 [03/25/2022-13:24:29] [V] [TRT] sections.1.3.conv1.module.weight + QuantizeLinear_364_quantize_scale_node + Conv_368 + Relu_370 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: -1499578657823798783 [03/25/2022-13:24:29] [V] [TRT] Tactic: -1499578657823798783 Time: 0.097024 [03/25/2022-13:24:29] [V] [TRT] sections.1.3.conv1.module.weight + 
QuantizeLinear_364_quantize_scale_node + Conv_368 + Relu_370 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r1s1 Tactic: -1494157908358500249 [03/25/2022-13:24:29] [V] [TRT] Tactic: -1494157908358500249 Time: 0.108032 [03/25/2022-13:24:29] [V] [TRT] sections.1.3.conv1.module.weight + QuantizeLinear_364_quantize_scale_node + Conv_368 + Relu_370 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x4x1_g1_tensor16x8x32_t1r1s1 Tactic: -1328736756812546664 [03/25/2022-13:24:29] [V] [TRT] Tactic: -1328736756812546664 Time: 0.102912 [03/25/2022-13:24:29] [V] [TRT] sections.1.3.conv1.module.weight + QuantizeLinear_364_quantize_scale_node + Conv_368 + Relu_370 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x32x64_stage4_warpsize2x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -1006589727652607355 [03/25/2022-13:24:29] [V] [TRT] Tactic: -1006589727652607355 Time: 0.10752 [03/25/2022-13:24:29] [V] [TRT] sections.1.3.conv1.module.weight + QuantizeLinear_364_quantize_scale_node + Conv_368 + Relu_370 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: -713022856474991236 [03/25/2022-13:24:29] [V] [TRT] Tactic: -713022856474991236 Time: 0.174336 [03/25/2022-13:24:29] [V] [TRT] sections.1.3.conv1.module.weight + QuantizeLinear_364_quantize_scale_node + Conv_368 + Relu_370 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x128_stage3_warpsize1x1x1_g1_sptensor16x8x64_t1r1s1_no_preds Tactic: -619668460699260222 [03/25/2022-13:24:29] [V] [TRT] Tactic: -619668460699260222 Time: 0.098048 [03/25/2022-13:24:29] [V] [TRT] sections.1.3.conv1.module.weight + QuantizeLinear_364_quantize_scale_node + Conv_368 + Relu_370 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x64x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: -405554772060757402 [03/25/2022-13:24:29] [V] [TRT] Tactic: -405554772060757402 Time: 0.082176 [03/25/2022-13:24:29] [V] [TRT] sections.1.3.conv1.module.weight + QuantizeLinear_364_quantize_scale_node + Conv_368 + Relu_370 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: -375949437730908730 [03/25/2022-13:24:29] [V] [TRT] Tactic: -375949437730908730 Time: 0.078976 [03/25/2022-13:24:29] [V] [TRT] sections.1.3.conv1.module.weight + QuantizeLinear_364_quantize_scale_node + Conv_368 + Relu_370 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x128_stage3_warpsize1x2x1_g1_sptensor16x8x64_t1r1s1 Tactic: -233227833606287806 [03/25/2022-13:24:29] [V] [TRT] Tactic: -233227833606287806 Time: 0.076544 [03/25/2022-13:24:29] [V] [TRT] sections.1.3.conv1.module.weight + QuantizeLinear_364_quantize_scale_node + Conv_368 + Relu_370 Set Tactic Name: 
sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -111878368089469751 [03/25/2022-13:24:29] [V] [TRT] Tactic: -111878368089469751 Time: 0.120064 [03/25/2022-13:24:29] [V] [TRT] sections.1.3.conv1.module.weight + QuantizeLinear_364_quantize_scale_node + Conv_368 + Relu_370 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x64x64_stage3_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -48936598874722005 [03/25/2022-13:24:29] [V] [TRT] Tactic: -48936598874722005 Time: 0.0704 [03/25/2022-13:24:29] [V] [TRT] sections.1.3.conv1.module.weight + QuantizeLinear_364_quantize_scale_node + Conv_368 + Relu_370 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x32x64_stage4_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -19707840769375107 [03/25/2022-13:24:29] [V] [TRT] Tactic: -19707840769375107 Time: 0.094848 [03/25/2022-13:24:29] [V] [TRT] Fastest Tactic: 1913026264725750683 Time: 0.06208 [03/25/2022-13:24:29] [V] [TRT] >>>>>>>>>>>>>>> Chose Runner Type: CaskConvolution Tactic: 1913026264725750683 [03/25/2022-13:24:29] [V] [TRT] =============== Computing costs for [03/25/2022-13:24:29] [V] [TRT] *************** Autotuning format combination: Int8(25088,784:4,28,1) -> Int8(25088,784:4,28,1) *************** [03/25/2022-13:24:29] [V] [TRT] *************** Autotuning format combination: Int8(25088,784:4,28,1) -> Int8(3136,784:32,28,1) *************** [03/25/2022-13:24:29] [V] [TRT] *************** Autotuning format combination: Int8(3136,784:32,28,1) -> Int8(3136,784:32,28,1) *************** [03/25/2022-13:24:29] [V] [TRT] Skip timing cache hit (epiFadd mismatch) for node: sections.1.3.conv2.module.weight + QuantizeLinear_379_quantize_scale_node + Conv_383 + Relu_385, LayerImpl: CaskConvolution, tactic: 4414594337986714263 [03/25/2022-13:24:29] [V] [TRT] --------------- Timing Runner: sections.1.3.conv2.module.weight + QuantizeLinear_379_quantize_scale_node + Conv_383 + Relu_385 (CudaGroupConvolution) [03/25/2022-13:24:29] [V] [TRT] CudaGroupConvolution has no valid tactics for this config, skipping [03/25/2022-13:24:29] [V] [TRT] --------------- Timing Runner: sections.1.3.conv2.module.weight + QuantizeLinear_379_quantize_scale_node + Conv_383 + Relu_385 (CudaDepthwiseConvolution) [03/25/2022-13:24:29] [V] [TRT] CudaDepthwiseConvolution has no valid tactics for this config, skipping [03/25/2022-13:24:29] [V] [TRT] --------------- Timing Runner: sections.1.3.conv2.module.weight + QuantizeLinear_379_quantize_scale_node + Conv_383 + Relu_385 (FusedConvActConvolution) [03/25/2022-13:24:29] [V] [TRT] FusedConvActConvolution has no valid tactics for this config, skipping [03/25/2022-13:24:29] [V] [TRT] --------------- Timing Runner: sections.1.3.conv2.module.weight + QuantizeLinear_379_quantize_scale_node + Conv_383 + Relu_385 (CaskConvolution) [03/25/2022-13:24:29] [V] [TRT] sections.1.3.conv2.module.weight + QuantizeLinear_379_quantize_scale_node + Conv_383 + Relu_385 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_epifadd Tactic: 177040020707947851 [03/25/2022-13:24:29] [V] [TRT] Tactic: 177040020707947851 Time: 0.206592 [03/25/2022-13:24:29] [V] [TRT] sections.1.3.conv2.module.weight 
+ QuantizeLinear_379_quantize_scale_node + Conv_383 + Relu_385 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r3s3 Tactic: 184229963126259101 [03/25/2022-13:24:29] [V] [TRT] Tactic: 184229963126259101 Time: 0.184704 [03/25/2022-13:24:29] [V] [TRT] sections.1.3.conv2.module.weight + QuantizeLinear_379_quantize_scale_node + Conv_383 + Relu_385 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage1_warpsize4x2x1_g1_tensor8x8x16_t1r3s3 Tactic: 289888059097454627 [03/25/2022-13:24:29] [V] [TRT] Tactic: 289888059097454627 Time: 0.177664 [03/25/2022-13:24:29] [V] [TRT] sections.1.3.conv2.module.weight + QuantizeLinear_379_quantize_scale_node + Conv_383 + Relu_385 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32 Tactic: 328135613486708155 [03/25/2022-13:24:29] [V] [TRT] Tactic: 328135613486708155 Time: 0.3232 [03/25/2022-13:24:29] [V] [TRT] sections.1.3.conv2.module.weight + QuantizeLinear_379_quantize_scale_node + Conv_383 + Relu_385 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r3s3 Tactic: 680740992583869928 [03/25/2022-13:24:29] [V] [TRT] Tactic: 680740992583869928 Time: 0.165504 [03/25/2022-13:24:29] [V] [TRT] sections.1.3.conv2.module.weight + QuantizeLinear_379_quantize_scale_node + Conv_383 + Relu_385 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x192x64_stage3_warpsize4x2x1_g1_tensor16x8x32 Tactic: 1111159740952609683 [03/25/2022-13:24:29] [V] [TRT] Tactic: 1111159740952609683 Time: 0.165888 [03/25/2022-13:24:29] [V] [TRT] sections.1.3.conv2.module.weight + QuantizeLinear_379_quantize_scale_node + Conv_383 + Relu_385 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32 Tactic: 1134860903395928905 [03/25/2022-13:24:29] [V] [TRT] Tactic: 1134860903395928905 Time: 0.173696 [03/25/2022-13:24:29] [V] [TRT] sections.1.3.conv2.module.weight + QuantizeLinear_379_quantize_scale_node + Conv_383 + Relu_385 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32 Tactic: 1276591930377039442 [03/25/2022-13:24:29] [V] [TRT] Tactic: 1276591930377039442 Time: 0.195328 [03/25/2022-13:24:29] [V] [TRT] sections.1.3.conv2.module.weight + QuantizeLinear_379_quantize_scale_node + Conv_383 + Relu_385 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage1_warpsize2x1x1_g1_tensor8x8x16_t1r3s3 Tactic: 1388866374720163187 [03/25/2022-13:24:29] [V] [TRT] Tactic: 1388866374720163187 Time: 0.210176 [03/25/2022-13:24:29] [V] [TRT] sections.1.3.conv2.module.weight + QuantizeLinear_379_quantize_scale_node + Conv_383 + Relu_385 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x96x64_stage5_warpsize2x1x1_g1_tensor16x8x32 Tactic: 1399501420456320585 [03/25/2022-13:24:29] [V] [TRT] 
Tactic: 1399501420456320585 Time: 0.221056 [03/25/2022-13:24:29] [V] [TRT] sections.1.3.conv2.module.weight + QuantizeLinear_379_quantize_scale_node + Conv_383 + Relu_385 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_epifadd Tactic: 1550399266192842845 [03/25/2022-13:24:29] [V] [TRT] Tactic: 1550399266192842845 Time: 0.205184 [03/25/2022-13:24:29] [V] [TRT] sections.1.3.conv2.module.weight + QuantizeLinear_379_quantize_scale_node + Conv_383 + Relu_385 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x32x64_stage4_warpsize4x1x1_g1_tensor16x8x32_t1r3s3_epifadd Tactic: 1572887561103143487 [03/25/2022-13:24:29] [V] [TRT] Tactic: 1572887561103143487 Time: 0.130048 [03/25/2022-13:24:29] [V] [TRT] sections.1.3.conv2.module.weight + QuantizeLinear_379_quantize_scale_node + Conv_383 + Relu_385 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x96x64_stage5_warpsize2x1x1_g1_tensor16x8x32_t1r3s3 Tactic: 1853122447892949466 [03/25/2022-13:24:29] [V] [TRT] Tactic: 1853122447892949466 Time: 0.219264 [03/25/2022-13:24:29] [V] [TRT] sections.1.3.conv2.module.weight + QuantizeLinear_379_quantize_scale_node + Conv_383 + Relu_385 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage1_warpsize2x2x1_g1_tensor8x8x16 Tactic: 2133329569091732311 [03/25/2022-13:24:29] [V] [TRT] Tactic: 2133329569091732311 Time: 0.17024 [03/25/2022-13:24:29] [V] [TRT] sections.1.3.conv2.module.weight + QuantizeLinear_379_quantize_scale_node + Conv_383 + Relu_385 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage4_warpsize4x1x1_g1_tensor16x8x32 Tactic: 2325023763229477890 [03/25/2022-13:24:29] [V] [TRT] Tactic: 2325023763229477890 Time: 0.102784 [03/25/2022-13:24:29] [V] [TRT] sections.1.3.conv2.module.weight + QuantizeLinear_379_quantize_scale_node + Conv_383 + Relu_385 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16 Tactic: 2579824863892891529 [03/25/2022-13:24:29] [V] [TRT] Tactic: 2579824863892891529 Time: 0.238592 [03/25/2022-13:24:29] [V] [TRT] sections.1.3.conv2.module.weight + QuantizeLinear_379_quantize_scale_node + Conv_383 + Relu_385 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x64x64_stage3_warpsize4x1x1_g1_tensor16x8x32 Tactic: 2783960536172159663 [03/25/2022-13:24:29] [V] [TRT] Tactic: 2783960536172159663 Time: 0.101248 [03/25/2022-13:24:29] [V] [TRT] sections.1.3.conv2.module.weight + QuantizeLinear_379_quantize_scale_node + Conv_383 + Relu_385 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x2x1_g1_tensor16x8x32 Tactic: 2821711838552913693 [03/25/2022-13:24:29] [V] [TRT] Tactic: 2821711838552913693 Time: 0.15168 [03/25/2022-13:24:29] [V] [TRT] sections.1.3.conv2.module.weight + QuantizeLinear_379_quantize_scale_node + Conv_383 + Relu_385 Set Tactic Name: 
sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x128_stage3_warpsize1x2x1_g1_sptensor16x8x64 Tactic: 2945009978756227538 [03/25/2022-13:24:29] [V] [TRT] Tactic: 2945009978756227538 Time: 0.11008 [03/25/2022-13:24:29] [V] [TRT] sections.1.3.conv2.module.weight + QuantizeLinear_379_quantize_scale_node + Conv_383 + Relu_385 Set Tactic Name: ampere_int8_i8816cudnn_int8_128x128_ldg16_relu_medium_nt_v1 Tactic: 2985940154541537814 [03/25/2022-13:24:29] [V] [TRT] Tactic: 2985940154541537814 Time: 0.172928 [03/25/2022-13:24:29] [V] [TRT] sections.1.3.conv2.module.weight + QuantizeLinear_379_quantize_scale_node + Conv_383 + Relu_385 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_t1r3s3_epifadd Tactic: 3284282970967328046 [03/25/2022-13:24:29] [V] [TRT] Tactic: 3284282970967328046 Time: 0.223616 [03/25/2022-13:24:29] [V] [TRT] sections.1.3.conv2.module.weight + QuantizeLinear_379_quantize_scale_node + Conv_383 + Relu_385 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_t1r3s3 Tactic: 3401614690060226673 [03/25/2022-13:24:30] [V] [TRT] Tactic: 3401614690060226673 Time: 0.176128 [03/25/2022-13:24:30] [V] [TRT] sections.1.3.conv2.module.weight + QuantizeLinear_379_quantize_scale_node + Conv_383 + Relu_385 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32 Tactic: 3456719996792527006 [03/25/2022-13:24:30] [V] [TRT] Tactic: 3456719996792527006 Time: 0.17024 [03/25/2022-13:24:30] [V] [TRT] sections.1.3.conv2.module.weight + QuantizeLinear_379_quantize_scale_node + Conv_383 + Relu_385 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r3s3_epifadd Tactic: 3512426920013359699 [03/25/2022-13:24:30] [V] [TRT] Tactic: 3512426920013359699 Time: 0.135808 [03/25/2022-13:24:30] [V] [TRT] sections.1.3.conv2.module.weight + QuantizeLinear_379_quantize_scale_node + Conv_383 + Relu_385 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x128_stage3_warpsize4x2x1_g1_sptensor16x8x64 Tactic: 3651043333819148268 [03/25/2022-13:24:30] [V] [TRT] Tactic: 3651043333819148268 Time: 0.167552 [03/25/2022-13:24:30] [V] [TRT] sections.1.3.conv2.module.weight + QuantizeLinear_379_quantize_scale_node + Conv_383 + Relu_385 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x128_ldg16_relu_medium_nt_v1 Tactic: 3899284354987683408 [03/25/2022-13:24:30] [V] [TRT] Tactic: 3899284354987683408 Time: 0.185216 [03/25/2022-13:24:30] [V] [TRT] sections.1.3.conv2.module.weight + QuantizeLinear_379_quantize_scale_node + Conv_383 + Relu_385 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage6_warpsize2x2x1_g1_tensor16x8x32 Tactic: 4042202769383439184 [03/25/2022-13:24:30] [V] [TRT] Tactic: 4042202769383439184 Time: 0.12096 [03/25/2022-13:24:30] [V] [TRT] sections.1.3.conv2.module.weight + QuantizeLinear_379_quantize_scale_node + Conv_383 + Relu_385 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_large_nt_v1 Tactic: 
4182625619810185112 [03/25/2022-13:24:30] [V] [TRT] Tactic: 4182625619810185112 Time: 0.184832 [03/25/2022-13:24:30] [V] [TRT] sections.1.3.conv2.module.weight + QuantizeLinear_379_quantize_scale_node + Conv_383 + Relu_385 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r3s3 Tactic: 4214794893922618058 [03/25/2022-13:24:30] [V] [TRT] Tactic: 4214794893922618058 Time: 0.168448 [03/25/2022-13:24:30] [V] [TRT] sections.1.3.conv2.module.weight + QuantizeLinear_379_quantize_scale_node + Conv_383 + Relu_385 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x32x64_stage4_warpsize4x1x1_g1_tensor16x8x32_epifadd Tactic: 4259547356717612415 [03/25/2022-13:24:30] [V] [TRT] Tactic: 4259547356717612415 Time: 0.135296 [03/25/2022-13:24:30] [V] [TRT] sections.1.3.conv2.module.weight + QuantizeLinear_379_quantize_scale_node + Conv_383 + Relu_385 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r3s3 Tactic: 4384868749799132354 [03/25/2022-13:24:30] [V] [TRT] Tactic: 4384868749799132354 Time: 0.239104 [03/25/2022-13:24:30] [V] [TRT] sections.1.3.conv2.module.weight + QuantizeLinear_379_quantize_scale_node + Conv_383 + Relu_385 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x128_stage3_warpsize2x2x1_g1_sptensor16x8x64 Tactic: 4414594337986714263 [03/25/2022-13:24:30] [V] [TRT] Tactic: 4414594337986714263 Time: 0.088448 [03/25/2022-13:24:30] [V] [TRT] sections.1.3.conv2.module.weight + QuantizeLinear_379_quantize_scale_node + Conv_383 + Relu_385 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_small_nt_v1 Tactic: 4717285412741024953 [03/25/2022-13:24:30] [V] [TRT] Tactic: 4717285412741024953 Time: 0.177152 [03/25/2022-13:24:30] [V] [TRT] sections.1.3.conv2.module.weight + QuantizeLinear_379_quantize_scale_node + Conv_383 + Relu_385 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage4_warpsize2x4x1_g1_tensor16x8x32 Tactic: 4734519122557206480 [03/25/2022-13:24:30] [V] [TRT] Tactic: 4734519122557206480 Time: 0.207232 [03/25/2022-13:24:30] [V] [TRT] sections.1.3.conv2.module.weight + QuantizeLinear_379_quantize_scale_node + Conv_383 + Relu_385 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32 Tactic: 4922297020351187339 [03/25/2022-13:24:30] [V] [TRT] Tactic: 4922297020351187339 Time: 0.154368 [03/25/2022-13:24:30] [V] [TRT] sections.1.3.conv2.module.weight + QuantizeLinear_379_quantize_scale_node + Conv_383 + Relu_385 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage1_warpsize2x4x1_g1_tensor8x8x16_t1r3s3 Tactic: 4931167631624420067 [03/25/2022-13:24:30] [V] [TRT] Tactic: 4931167631624420067 Time: 0.34816 [03/25/2022-13:24:30] [V] [TRT] sections.1.3.conv2.module.weight + QuantizeLinear_379_quantize_scale_node + Conv_383 + Relu_385 Set Tactic Name: 
sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage4_warpsize2x4x1_g1_tensor16x8x32_epifadd Tactic: 5121596860264626879 [03/25/2022-13:24:30] [V] [TRT] Tactic: 5121596860264626879 Time: 0.193664 [03/25/2022-13:24:30] [V] [TRT] sections.1.3.conv2.module.weight + QuantizeLinear_379_quantize_scale_node + Conv_383 + Relu_385 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_t1r3s3 Tactic: 5136656982162849059 [03/25/2022-13:24:30] [V] [TRT] Tactic: 5136656982162849059 Time: 0.229632 [03/25/2022-13:24:30] [V] [TRT] sections.1.3.conv2.module.weight + QuantizeLinear_379_quantize_scale_node + Conv_383 + Relu_385 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage6_warpsize2x2x1_g1_tensor16x8x32_epifadd Tactic: 5158259316594207439 [03/25/2022-13:24:30] [V] [TRT] Tactic: 5158259316594207439 Time: 0.12288 [03/25/2022-13:24:30] [V] [TRT] sections.1.3.conv2.module.weight + QuantizeLinear_379_quantize_scale_node + Conv_383 + Relu_385 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32_t1r3s3 Tactic: 5189825015507701541 [03/25/2022-13:24:30] [V] [TRT] Tactic: 5189825015507701541 Time: 0.3552 [03/25/2022-13:24:30] [V] [TRT] sections.1.3.conv2.module.weight + QuantizeLinear_379_quantize_scale_node + Conv_383 + Relu_385 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x32x64_stage4_warpsize2x1x1_g1_tensor16x8x32 Tactic: 5424417905073460656 [03/25/2022-13:24:30] [V] [TRT] Tactic: 5424417905073460656 Time: 0.15552 [03/25/2022-13:24:30] [V] [TRT] sections.1.3.conv2.module.weight + QuantizeLinear_379_quantize_scale_node + Conv_383 + Relu_385 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x256x64_stage3_warpsize2x4x1_g1_tensor16x8x32 Tactic: 5442043907221427810 [03/25/2022-13:24:30] [V] [TRT] Tactic: 5442043907221427810 Time: 0.201728 [03/25/2022-13:24:30] [V] [TRT] sections.1.3.conv2.module.weight + QuantizeLinear_379_quantize_scale_node + Conv_383 + Relu_385 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x256x64_stage3_warpsize2x4x1_g1_tensor16x8x32_t1r3s3 Tactic: 5544365258913999384 [03/25/2022-13:24:30] [V] [TRT] Tactic: 5544365258913999384 Time: 0.195968 [03/25/2022-13:24:30] [V] [TRT] sections.1.3.conv2.module.weight + QuantizeLinear_379_quantize_scale_node + Conv_383 + Relu_385 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage1_warpsize2x1x1_g1_tensor8x8x16_t1r3s3 Tactic: 5641967928706599451 [03/25/2022-13:24:30] [V] [TRT] Tactic: 5641967928706599451 Time: 0.28288 [03/25/2022-13:24:30] [V] [TRT] sections.1.3.conv2.module.weight + QuantizeLinear_379_quantize_scale_node + Conv_383 + Relu_385 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x2x1_g1_tensor16x8x32_t1r3s3 Tactic: 5721595115357140131 [03/25/2022-13:24:30] [V] [TRT] Tactic: 5721595115357140131 Time: 0.152192 
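
Two details in the transition above are easy to miss. First, once every candidate was timed for conv1, the builder simply kept the minimum ("Fastest Tactic: 1913026264725750683 Time: 0.06208") and recorded the winner ("Chose Runner Type: CaskConvolution") in its timing cache. Second, for conv2 it reported "Skip timing cache hit (epiFadd mismatch)", apparently because the cached entry disagreed on the epilogue variant, so the full candidate list is being re-timed; that re-timing is what fills the surrounding lines. Because this exhaustive measurement dominates engine-build time, it is worth persisting the timing cache across builds: trtexec exposes this as --timingCacheFile=<file>, and in the Python API it looks roughly like the sketch below (TensorRT 8.x; the file name timing.cache and the elided network setup are assumptions):

    import tensorrt as trt

    # Hedged sketch: reuse a persistent timing cache across builds
    # (TensorRT 8.x Python API; "timing.cache" is an assumed file name).
    logger = trt.Logger(trt.Logger.VERBOSE)
    builder = trt.Builder(logger)
    config = builder.create_builder_config()

    try:
        blob = open("timing.cache", "rb").read()
    except FileNotFoundError:
        blob = b""  # first build starts from an empty cache
    cache = config.create_timing_cache(blob)
    config.set_timing_cache(cache, ignore_mismatch=False)

    # ... define the network, then builder.build_serialized_network(...) ...

    # Persist the (possibly updated) cache for the next build.
    with open("timing.cache", "wb") as f:
        f.write(cache.serialize())

With a warm cache that matches the GPU, TensorRT version, and build flags, most of the per-tactic timing in this log is skipped entirely.
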
[03/25/2022-13:24:30] [V] [TRT] sections.1.3.conv2.module.weight + QuantizeLinear_379_quantize_scale_node + Conv_383 + Relu_385 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage4_warpsize4x1x1_g1_tensor16x8x32_t1r3s3_epifadd Tactic: 5966973378912044513 [03/25/2022-13:24:30] [V] [TRT] Tactic: 5966973378912044513 Time: 0.102784 [03/25/2022-13:24:30] [V] [TRT] sections.1.3.conv2.module.weight + QuantizeLinear_379_quantize_scale_node + Conv_383 + Relu_385 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x32x64_stage4_warpsize4x1x1_g1_tensor16x8x32_t1r3s3 Tactic: 6004789655466615912 [03/25/2022-13:24:30] [V] [TRT] Tactic: 6004789655466615912 Time: 0.133888 [03/25/2022-13:24:30] [V] [TRT] sections.1.3.conv2.module.weight + QuantizeLinear_379_quantize_scale_node + Conv_383 + Relu_385 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage4_warpsize2x4x1_g1_tensor16x8x32_t1r3s3 Tactic: 6146901278630392829 [03/25/2022-13:24:30] [V] [TRT] Tactic: 6146901278630392829 Time: 0.2048 [03/25/2022-13:24:30] [V] [TRT] sections.1.3.conv2.module.weight + QuantizeLinear_379_quantize_scale_node + Conv_383 + Relu_385 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage1_warpsize2x1x1_g1_tensor8x8x16 Tactic: 6394572396369862482 [03/25/2022-13:24:30] [V] [TRT] Tactic: 6394572396369862482 Time: 0.285056 [03/25/2022-13:24:30] [V] [TRT] sections.1.3.conv2.module.weight + QuantizeLinear_379_quantize_scale_node + Conv_383 + Relu_385 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage4_warpsize4x2x1_g1_tensor16x8x32_epifadd Tactic: 6434020722187266170 [03/25/2022-13:24:30] [V] [TRT] Tactic: 6434020722187266170 Time: 0.105856 [03/25/2022-13:24:30] [V] [TRT] sections.1.3.conv2.module.weight + QuantizeLinear_379_quantize_scale_node + Conv_383 + Relu_385 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32 Tactic: 6781129591847482048 [03/25/2022-13:24:30] [V] [TRT] Tactic: 6781129591847482048 Time: 0.120704 [03/25/2022-13:24:30] [V] [TRT] sections.1.3.conv2.module.weight + QuantizeLinear_379_quantize_scale_node + Conv_383 + Relu_385 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x96x64_stage4_warpsize4x2x1_g1_tensor16x8x32_t1r3s3 Tactic: 6984451771200230840 [03/25/2022-13:24:30] [V] [TRT] Tactic: 6984451771200230840 Time: 0.202624 [03/25/2022-13:24:30] [V] [TRT] sections.1.3.conv2.module.weight + QuantizeLinear_379_quantize_scale_node + Conv_383 + Relu_385 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r3s3 Tactic: 7048234086361926570 [03/25/2022-13:24:30] [V] [TRT] Tactic: 7048234086361926570 Time: 0.187136 [03/25/2022-13:24:30] [V] [TRT] sections.1.3.conv2.module.weight + QuantizeLinear_379_quantize_scale_node + Conv_383 + Relu_385 Set Tactic Name: 
sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x128x64_stage3_warpsize4x2x1_g1_tensor16x8x32 Tactic: 7077570591813340966 [03/25/2022-13:24:30] [V] [TRT] Tactic: 7077570591813340966 Time: 0.107648 [03/25/2022-13:24:30] [V] [TRT] sections.1.3.conv2.module.weight + QuantizeLinear_379_quantize_scale_node + Conv_383 + Relu_385 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r3s3_epifadd Tactic: 7191893591576074000 [03/25/2022-13:24:30] [V] [TRT] Tactic: 7191893591576074000 Time: 0.207488 [03/25/2022-13:24:30] [V] [TRT] sections.1.3.conv2.module.weight + QuantizeLinear_379_quantize_scale_node + Conv_383 + Relu_385 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x4x1_g1_tensor16x8x32_t1r3s3 Tactic: 7429976449747682901 [03/25/2022-13:24:30] [V] [TRT] Tactic: 7429976449747682901 Time: 0.146048 [03/25/2022-13:24:30] [V] [TRT] sections.1.3.conv2.module.weight + QuantizeLinear_379_quantize_scale_node + Conv_383 + Relu_385 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r3s3_epifadd Tactic: 7438984192263206338 [03/25/2022-13:24:30] [V] [TRT] Tactic: 7438984192263206338 Time: 0.119552 [03/25/2022-13:24:30] [V] [TRT] sections.1.3.conv2.module.weight + QuantizeLinear_379_quantize_scale_node + Conv_383 + Relu_385 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32_epifadd Tactic: 7504901284678552178 [03/25/2022-13:24:30] [V] [TRT] Tactic: 7504901284678552178 Time: 0.100608 [03/25/2022-13:24:30] [V] [TRT] sections.1.3.conv2.module.weight + QuantizeLinear_379_quantize_scale_node + Conv_383 + Relu_385 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r3s3 Tactic: 8096257414008860171 [03/25/2022-13:24:30] [V] [TRT] Tactic: 8096257414008860171 Time: 0.115328 [03/25/2022-13:24:30] [V] [TRT] sections.1.3.conv2.module.weight + QuantizeLinear_379_quantize_scale_node + Conv_383 + Relu_385 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x64x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r3s3 Tactic: 8128112048355596715 [03/25/2022-13:24:30] [V] [TRT] Tactic: 8128112048355596715 Time: 0.113152 [03/25/2022-13:24:30] [V] [TRT] sections.1.3.conv2.module.weight + QuantizeLinear_379_quantize_scale_node + Conv_383 + Relu_385 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_indexed_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32_epifadd Tactic: 8751622450593766232 [03/25/2022-13:24:30] [V] [TRT] Tactic: 8751622450593766232 Time: 0.107648 [03/25/2022-13:24:30] [V] [TRT] sections.1.3.conv2.module.weight + QuantizeLinear_379_quantize_scale_node + Conv_383 + Relu_385 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_indexed_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32 Tactic: 9064458886956700976 [03/25/2022-13:24:30] [V] [TRT] Tactic: 
9064458886956700976 Time: 0.109568 [03/25/2022-13:24:30] [V] [TRT] sections.1.3.conv2.module.weight + QuantizeLinear_379_quantize_scale_node + Conv_383 + Relu_385 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_t1r3s3_epifadd Tactic: 9143438935315839085 [03/25/2022-13:24:30] [V] [TRT] Tactic: 9143438935315839085 Time: 0.179072 [03/25/2022-13:24:30] [V] [TRT] sections.1.3.conv2.module.weight + QuantizeLinear_379_quantize_scale_node + Conv_383 + Relu_385 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage4_warpsize4x2x1_g1_tensor16x8x32 Tactic: -9165697322068360861 [03/25/2022-13:24:30] [V] [TRT] Tactic: -9165697322068360861 Time: 0.112512 [03/25/2022-13:24:30] [V] [TRT] sections.1.3.conv2.module.weight + QuantizeLinear_379_quantize_scale_node + Conv_383 + Relu_385 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_singleBuffer_small_nt_v1 Tactic: -9118785798277698619 [03/25/2022-13:24:30] [V] [TRT] Tactic: -9118785798277698619 Time: 0.171776 [03/25/2022-13:24:30] [V] [TRT] sections.1.3.conv2.module.weight + QuantizeLinear_379_quantize_scale_node + Conv_383 + Relu_385 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16 Tactic: -9108166971364503411 [03/25/2022-13:24:30] [V] [TRT] Tactic: -9108166971364503411 Time: 0.190464 [03/25/2022-13:24:30] [V] [TRT] sections.1.3.conv2.module.weight + QuantizeLinear_379_quantize_scale_node + Conv_383 + Relu_385 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage1_warpsize4x2x1_g1_tensor8x8x16 Tactic: -8861822316054763526 [03/25/2022-13:24:30] [V] [TRT] Tactic: -8861822316054763526 Time: 0.184576 [03/25/2022-13:24:30] [V] [TRT] sections.1.3.conv2.module.weight + QuantizeLinear_379_quantize_scale_node + Conv_383 + Relu_385 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x128_stage3_warpsize1x1x1_g1_sptensor16x8x64 Tactic: -8791277710877987710 [03/25/2022-13:24:30] [V] [TRT] Tactic: -8791277710877987710 Time: 0.172544 [03/25/2022-13:24:30] [V] [TRT] sections.1.3.conv2.module.weight + QuantizeLinear_379_quantize_scale_node + Conv_383 + Relu_385 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x128x64_stage3_warpsize2x2x1_g1_tensor16x8x32 Tactic: -8691377209893505057 [03/25/2022-13:24:30] [V] [TRT] Tactic: -8691377209893505057 Time: 0.105344 [03/25/2022-13:24:30] [V] [TRT] sections.1.3.conv2.module.weight + QuantizeLinear_379_quantize_scale_node + Conv_383 + Relu_385 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32 Tactic: -8520292213102999339 [03/25/2022-13:24:30] [V] [TRT] Tactic: -8520292213102999339 Time: 0.161408 [03/25/2022-13:24:30] [V] [TRT] sections.1.3.conv2.module.weight + QuantizeLinear_379_quantize_scale_node + Conv_383 + Relu_385 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage1_warpsize4x1x1_g1_tensor8x8x16_t1r3s3 Tactic: -8475551154769412306 
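
The candidate set also changed between the two nodes: conv1's kernels carried t1r1s1 (a 1x1 filter), while conv2's carry t1r3s3 and generally slower times, consistent with a 3x3 convolution at the same activation size. Interleaved with the dense kernels, the sm80_xmma_fprop_sparse_conv_*_sptensor16x8x64 candidates are Ampere 2:4 structured-sparsity implementations. They are eligible only because the model's weights were pruned to the 2:4 pattern and sparse kernels were allowed at build time, and they still must win the per-node timing contest; so far for conv2 the sparse 128x128x128 tile at 0.088448 ms is ahead of every dense candidate. A rough Python-API equivalent of such a build configuration, as a hedged sketch with the ONNX parsing and engine serialization elided:

    import tensorrt as trt

    # Hedged sketch: the Python-API counterpart of an INT8 + sparse build
    # (TensorRT 8.x; ONNX parsing and engine serialization elided).
    logger = trt.Logger(trt.Logger.INFO)
    builder = trt.Builder(logger)
    network = builder.create_network(
        1 << int(trt.NetworkDefinitionCreationFlag.EXPLICIT_BATCH))
    config = builder.create_builder_config()

    config.set_flag(trt.BuilderFlag.INT8)            # like trtexec --int8
    config.set_flag(trt.BuilderFlag.SPARSE_WEIGHTS)  # like --sparsity=enable

    # A Q/DQ (explicitly quantized) graph such as this one needs no INT8
    # calibrator: the QuantizeLinear_* nodes seen in the log carry the scales.

Enabling SPARSE_WEIGHTS only lets the sparse tactics compete; as the timings here show, the builder still checks per layer that the 2:4 kernel actually beats the best dense one before choosing it.
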
[03/25/2022-13:24:30] [V] [TRT] Tactic: -8475551154769412306 Time: 0.177152 [03/25/2022-13:24:30] [V] [TRT] sections.1.3.conv2.module.weight + QuantizeLinear_379_quantize_scale_node + Conv_383 + Relu_385 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x192x64_stage3_warpsize4x2x1_g1_tensor16x8x32_t1r3s3 Tactic: -8417388128970254446 [03/25/2022-13:24:30] [V] [TRT] Tactic: -8417388128970254446 Time: 0.164992 [03/25/2022-13:24:30] [V] [TRT] sections.1.3.conv2.module.weight + QuantizeLinear_379_quantize_scale_node + Conv_383 + Relu_385 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32 Tactic: -8263994888336646547 [03/25/2022-13:24:30] [V] [TRT] Tactic: -8263994888336646547 Time: 0.101248 [03/25/2022-13:24:30] [V] [TRT] sections.1.3.conv2.module.weight + QuantizeLinear_379_quantize_scale_node + Conv_383 + Relu_385 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32 Tactic: -8205948405243401049 [03/25/2022-13:24:30] [V] [TRT] Tactic: -8205948405243401049 Time: 0.20992 [03/25/2022-13:24:30] [V] [TRT] sections.1.3.conv2.module.weight + QuantizeLinear_379_quantize_scale_node + Conv_383 + Relu_385 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r3s3_epifadd Tactic: -7992068592656168418 [03/25/2022-13:24:30] [V] [TRT] Tactic: -7992068592656168418 Time: 0.11456 [03/25/2022-13:24:30] [V] [TRT] sections.1.3.conv2.module.weight + QuantizeLinear_379_quantize_scale_node + Conv_383 + Relu_385 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x32x64_stage4_warpsize2x1x1_g1_tensor16x8x32_t1r3s3 Tactic: -7898477046581738867 [03/25/2022-13:24:30] [V] [TRT] Tactic: -7898477046581738867 Time: 0.14272 [03/25/2022-13:24:30] [V] [TRT] sections.1.3.conv2.module.weight + QuantizeLinear_379_quantize_scale_node + Conv_383 + Relu_385 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage4_warpsize4x1x1_g1_tensor16x8x32_epifadd Tactic: -7842775553137511386 [03/25/2022-13:24:30] [V] [TRT] Tactic: -7842775553137511386 Time: 0.105088 [03/25/2022-13:24:30] [V] [TRT] sections.1.3.conv2.module.weight + QuantizeLinear_379_quantize_scale_node + Conv_383 + Relu_385 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32 Tactic: -7683887278997527517 [03/25/2022-13:24:30] [V] [TRT] Tactic: -7683887278997527517 Time: 0.176768 [03/25/2022-13:24:30] [V] [TRT] sections.1.3.conv2.module.weight + QuantizeLinear_379_quantize_scale_node + Conv_383 + Relu_385 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x64x64_stage3_warpsize2x2x1_g1_tensor16x8x32 Tactic: -7381370635708568663 [03/25/2022-13:24:30] [V] [TRT] Tactic: -7381370635708568663 Time: 0.11648 [03/25/2022-13:24:30] [V] [TRT] sections.1.3.conv2.module.weight + QuantizeLinear_379_quantize_scale_node + Conv_383 + Relu_385 Set Tactic Name: 
sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r3s3 Tactic: -7129320389887881029 [03/25/2022-13:24:30] [V] [TRT] Tactic: -7129320389887881029 Time: 0.17088 [03/25/2022-13:24:30] [V] [TRT] sections.1.3.conv2.module.weight + QuantizeLinear_379_quantize_scale_node + Conv_383 + Relu_385 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r3s3 Tactic: -6959995514028471820 [03/25/2022-13:24:30] [V] [TRT] Tactic: -6959995514028471820 Time: 0.162304 [03/25/2022-13:24:30] [V] [TRT] sections.1.3.conv2.module.weight + QuantizeLinear_379_quantize_scale_node + Conv_383 + Relu_385 Set Tactic Name: ampere_int8_i8816cudnn_int8_128x128_ldg16_relu_small_nt_v1 Tactic: -6400348606759295499 [03/25/2022-13:24:30] [V] [TRT] Tactic: -6400348606759295499 Time: 0.173696 [03/25/2022-13:24:30] [V] [TRT] sections.1.3.conv2.module.weight + QuantizeLinear_379_quantize_scale_node + Conv_383 + Relu_385 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r3s3 Tactic: -6371781333659293809 [03/25/2022-13:24:30] [V] [TRT] Tactic: -6371781333659293809 Time: 0.18176 [03/25/2022-13:24:30] [V] [TRT] sections.1.3.conv2.module.weight + QuantizeLinear_379_quantize_scale_node + Conv_383 + Relu_385 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x96x64_stage4_warpsize4x2x1_g1_tensor16x8x32 Tactic: -6256128573036943404 [03/25/2022-13:24:30] [V] [TRT] Tactic: -6256128573036943404 Time: 0.204672 [03/25/2022-13:24:30] [V] [TRT] sections.1.3.conv2.module.weight + QuantizeLinear_379_quantize_scale_node + Conv_383 + Relu_385 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x128_ldg16_relu_small_nt_v1 Tactic: -5980889159865208399 [03/25/2022-13:24:30] [V] [TRT] Tactic: -5980889159865208399 Time: 0.187008 [03/25/2022-13:24:30] [V] [TRT] sections.1.3.conv2.module.weight + QuantizeLinear_379_quantize_scale_node + Conv_383 + Relu_385 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_singleBuffer_medium_nt_v1 Tactic: -5766140806760372989 [03/25/2022-13:24:30] [V] [TRT] Tactic: -5766140806760372989 Time: 0.181248 [03/25/2022-13:24:30] [V] [TRT] sections.1.3.conv2.module.weight + QuantizeLinear_379_quantize_scale_node + Conv_383 + Relu_385 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32_t1r3s3_epifadd Tactic: -5709079507616090666 [03/25/2022-13:24:30] [V] [TRT] Tactic: -5709079507616090666 Time: 0.099456 [03/25/2022-13:24:30] [V] [TRT] sections.1.3.conv2.module.weight + QuantizeLinear_379_quantize_scale_node + Conv_383 + Relu_385 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage4_warpsize2x4x1_g1_tensor16x8x32_t1r3s3_epifadd Tactic: -5698636014239116282 [03/25/2022-13:24:30] [V] [TRT] Tactic: -5698636014239116282 Time: 0.189312 [03/25/2022-13:24:30] [V] [TRT] sections.1.3.conv2.module.weight + QuantizeLinear_379_quantize_scale_node + Conv_383 + Relu_385 Set Tactic Name: 
sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage1_warpsize4x1x1_g1_tensor8x8x16 Tactic: -5180570335464125033 [03/25/2022-13:24:30] [V] [TRT] Tactic: -5180570335464125033 Time: 0.184576 [03/25/2022-13:24:30] [V] [TRT] sections.1.3.conv2.module.weight + QuantizeLinear_379_quantize_scale_node + Conv_383 + Relu_385 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r3s3 Tactic: -4933563390723451692 [03/25/2022-13:24:30] [V] [TRT] Tactic: -4933563390723451692 Time: 0.140416 [03/25/2022-13:24:30] [V] [TRT] sections.1.3.conv2.module.weight + QuantizeLinear_379_quantize_scale_node + Conv_383 + Relu_385 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_medium_nt_v1 Tactic: -4516822589357530549 [03/25/2022-13:24:30] [V] [TRT] Tactic: -4516822589357530549 Time: 0.18304 [03/25/2022-13:24:30] [V] [TRT] sections.1.3.conv2.module.weight + QuantizeLinear_379_quantize_scale_node + Conv_383 + Relu_385 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x16x128_stage3_warpsize1x1x1_g1_sptensor16x8x64 Tactic: -4232916483289779353 [03/25/2022-13:24:30] [V] [TRT] Tactic: -4232916483289779353 Time: 0.292864 [03/25/2022-13:24:30] [V] [TRT] sections.1.3.conv2.module.weight + QuantizeLinear_379_quantize_scale_node + Conv_383 + Relu_385 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x128_stage3_warpsize1x4x1_g1_sptensor16x8x64 Tactic: -3460842194336717186 [03/25/2022-13:24:30] [V] [TRT] Tactic: -3460842194336717186 Time: 0.107008 [03/25/2022-13:24:30] [V] [TRT] sections.1.3.conv2.module.weight + QuantizeLinear_379_quantize_scale_node + Conv_383 + Relu_385 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage4_warpsize4x2x1_g1_tensor16x8x32_t1r3s3_epifadd Tactic: -3413217501222406256 [03/25/2022-13:24:30] [V] [TRT] Tactic: -3413217501222406256 Time: 0.1024 [03/25/2022-13:24:30] [V] [TRT] sections.1.3.conv2.module.weight + QuantizeLinear_379_quantize_scale_node + Conv_383 + Relu_385 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32_t1r3s3 Tactic: -3280888557222886418 [03/25/2022-13:24:30] [V] [TRT] Tactic: -3280888557222886418 Time: 0.149504 [03/25/2022-13:24:30] [V] [TRT] sections.1.3.conv2.module.weight + QuantizeLinear_379_quantize_scale_node + Conv_383 + Relu_385 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r3s3 Tactic: -3238475748440751107 [03/25/2022-13:24:30] [V] [TRT] Tactic: -3238475748440751107 Time: 0.119552 [03/25/2022-13:24:30] [V] [TRT] sections.1.3.conv2.module.weight + QuantizeLinear_379_quantize_scale_node + Conv_383 + Relu_385 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage4_warpsize4x1x1_g1_tensor16x8x32_t1r3s3 Tactic: -3182884991006484042 [03/25/2022-13:24:30] [V] [TRT] Tactic: -3182884991006484042 Time: 0.10304 [03/25/2022-13:24:30] [V] [TRT] sections.1.3.conv2.module.weight + 
QuantizeLinear_379_quantize_scale_node + Conv_383 + Relu_385 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r3s3 Tactic: -3173468756112541306 [03/25/2022-13:24:30] [V] [TRT] Tactic: -3173468756112541306 Time: 0.206336 [03/25/2022-13:24:30] [V] [TRT] sections.1.3.conv2.module.weight + QuantizeLinear_379_quantize_scale_node + Conv_383 + Relu_385 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x128_ldg16_relu_large_nt_v1 Tactic: -2917455979290586480 [03/25/2022-13:24:30] [V] [TRT] Tactic: -2917455979290586480 Time: 0.188288 [03/25/2022-13:24:30] [V] [TRT] sections.1.3.conv2.module.weight + QuantizeLinear_379_quantize_scale_node + Conv_383 + Relu_385 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x128x64_stage3_warpsize4x2x1_g1_tensor16x8x32_t1r3s3 Tactic: -2741641298163591508 [03/25/2022-13:24:30] [V] [TRT] Tactic: -2741641298163591508 Time: 0.105856 [03/25/2022-13:24:30] [V] [TRT] sections.1.3.conv2.module.weight + QuantizeLinear_379_quantize_scale_node + Conv_383 + Relu_385 Set Tactic Name: ampere_int8_i8816cudnn_int8_128x128_ldg16_relu_large_nt_v1 Tactic: -2571022005763160364 [03/25/2022-13:24:30] [V] [TRT] Tactic: -2571022005763160364 Time: 0.182144 [03/25/2022-13:24:30] [V] [TRT] sections.1.3.conv2.module.weight + QuantizeLinear_379_quantize_scale_node + Conv_383 + Relu_385 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage1_warpsize2x2x1_g1_tensor8x8x16 Tactic: -2499089240293650188 [03/25/2022-13:24:30] [V] [TRT] Tactic: -2499089240293650188 Time: 0.17152 [03/25/2022-13:24:30] [V] [TRT] sections.1.3.conv2.module.weight + QuantizeLinear_379_quantize_scale_node + Conv_383 + Relu_385 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16 Tactic: -2328318099174473157 [03/25/2022-13:24:30] [V] [TRT] Tactic: -2328318099174473157 Time: 0.185856 [03/25/2022-13:24:30] [V] [TRT] sections.1.3.conv2.module.weight + QuantizeLinear_379_quantize_scale_node + Conv_383 + Relu_385 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_epifadd Tactic: -2083778562631872334 [03/25/2022-13:24:30] [V] [TRT] Tactic: -2083778562631872334 Time: 0.119296 [03/25/2022-13:24:30] [V] [TRT] sections.1.3.conv2.module.weight + QuantizeLinear_379_quantize_scale_node + Conv_383 + Relu_385 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x4x1_g1_tensor16x8x32 Tactic: -2054375205435666404 [03/25/2022-13:24:30] [V] [TRT] Tactic: -2054375205435666404 Time: 0.148096 [03/25/2022-13:24:30] [V] [TRT] sections.1.3.conv2.module.weight + QuantizeLinear_379_quantize_scale_node + Conv_383 + Relu_385 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32_t1r3s3 Tactic: -1546787387293556842 [03/25/2022-13:24:30] [V] [TRT] Tactic: -1546787387293556842 Time: 0.099328 [03/25/2022-13:24:30] [V] [TRT] sections.1.3.conv2.module.weight + QuantizeLinear_379_quantize_scale_node + Conv_383 + Relu_385 
Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x32x64_stage4_warpsize4x1x1_g1_tensor16x8x32 Tactic: -1498626619443284096 [03/25/2022-13:24:30] [V] [TRT] Tactic: -1498626619443284096 Time: 0.139008 [03/25/2022-13:24:30] [V] [TRT] sections.1.3.conv2.module.weight + QuantizeLinear_379_quantize_scale_node + Conv_383 + Relu_385 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r3s3 Tactic: -1471245223605064669 [03/25/2022-13:24:30] [V] [TRT] Tactic: -1471245223605064669 Time: 0.155136 [03/25/2022-13:24:30] [V] [TRT] sections.1.3.conv2.module.weight + QuantizeLinear_379_quantize_scale_node + Conv_383 + Relu_385 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32 Tactic: -1283580231568512025 [03/25/2022-13:24:30] [V] [TRT] Tactic: -1283580231568512025 Time: 0.232576 [03/25/2022-13:24:30] [V] [TRT] sections.1.3.conv2.module.weight + QuantizeLinear_379_quantize_scale_node + Conv_383 + Relu_385 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x64x64_stage3_warpsize4x1x1_g1_tensor16x8x32_t1r3s3 Tactic: -1224421172675151280 [03/25/2022-13:24:30] [V] [TRT] Tactic: -1224421172675151280 Time: 0.099968 [03/25/2022-13:24:30] [V] [TRT] sections.1.3.conv2.module.weight + QuantizeLinear_379_quantize_scale_node + Conv_383 + Relu_385 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_epifadd Tactic: -1173968681844185579 [03/25/2022-13:24:30] [V] [TRT] Tactic: -1173968681844185579 Time: 0.23552 [03/25/2022-13:24:30] [V] [TRT] sections.1.3.conv2.module.weight + QuantizeLinear_379_quantize_scale_node + Conv_383 + Relu_385 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x128x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r3s3 Tactic: -921247911551089037 [03/25/2022-13:24:30] [V] [TRT] Tactic: -921247911551089037 Time: 0.102784 [03/25/2022-13:24:30] [V] [TRT] sections.1.3.conv2.module.weight + QuantizeLinear_379_quantize_scale_node + Conv_383 + Relu_385 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32 Tactic: -762222380308749469 [03/25/2022-13:24:30] [V] [TRT] Tactic: -762222380308749469 Time: 0.143616 [03/25/2022-13:24:30] [V] [TRT] sections.1.3.conv2.module.weight + QuantizeLinear_379_quantize_scale_node + Conv_383 + Relu_385 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_epifadd Tactic: -556794153877490941 [03/25/2022-13:24:30] [V] [TRT] Tactic: -556794153877490941 Time: 0.146176 [03/25/2022-13:24:30] [V] [TRT] sections.1.3.conv2.module.weight + QuantizeLinear_379_quantize_scale_node + Conv_383 + Relu_385 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage4_warpsize4x2x1_g1_tensor16x8x32_t1r3s3 Tactic: -516725800067794372 [03/25/2022-13:24:30] [V] [TRT] Tactic: -516725800067794372 
Time: 0.108544 [03/25/2022-13:24:30] [V] [TRT] sections.1.3.conv2.module.weight + QuantizeLinear_379_quantize_scale_node + Conv_383 + Relu_385 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_singleBuffer_large_nt_v1 Tactic: -428104331444385564 [03/25/2022-13:24:30] [V] [TRT] Tactic: -428104331444385564 Time: 0.183552 [03/25/2022-13:24:30] [V] [TRT] sections.1.3.conv2.module.weight + QuantizeLinear_379_quantize_scale_node + Conv_383 + Relu_385 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage1_warpsize2x1x1_g1_tensor8x8x16 Tactic: -366411318217594794 [03/25/2022-13:24:30] [V] [TRT] Tactic: -366411318217594794 Time: 0.217472 [03/25/2022-13:24:30] [V] [TRT] sections.1.3.conv2.module.weight + QuantizeLinear_379_quantize_scale_node + Conv_383 + Relu_385 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage1_warpsize2x4x1_g1_tensor8x8x16 Tactic: -351548418071036983 [03/25/2022-13:24:30] [V] [TRT] Tactic: -351548418071036983 Time: 0.352256 [03/25/2022-13:24:30] [V] [TRT] Fastest Tactic: 4414594337986714263 Time: 0.088448 [03/25/2022-13:24:30] [V] [TRT] >>>>>>>>>>>>>>> Chose Runner Type: CaskConvolution Tactic: 4414594337986714263 [03/25/2022-13:24:30] [V] [TRT] =============== Computing costs for [03/25/2022-13:24:30] [V] [TRT] *************** Autotuning format combination: Int8(25088,784:4,28,1), Int8(100352,784:4,28,1) -> Int8(100352,784:4,28,1) *************** [03/25/2022-13:24:30] [V] [TRT] *************** Autotuning format combination: Int8(25088,784:4,28,1), Int8(12544,784:32,28,1) -> Int8(12544,784:32,28,1) *************** [03/25/2022-13:24:30] [V] [TRT] *************** Autotuning format combination: Int8(3136,784:32,28,1), Int8(12544,784:32,28,1) -> Int8(12544,784:32,28,1) *************** [03/25/2022-13:24:30] [V] [TRT] =============== Computing costs for [03/25/2022-13:24:30] [V] [TRT] *************** Autotuning format combination: Int8(100352,784:4,28,1) -> Int8(50176,784:4,28,1) *************** [03/25/2022-13:24:30] [V] [TRT] --------------- Timing Runner: sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 (CudaDepthwiseConvolution) [03/25/2022-13:24:30] [V] [TRT] CudaDepthwiseConvolution has no valid tactics for this config, skipping [03/25/2022-13:24:30] [V] [TRT] --------------- Timing Runner: sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 (FusedConvActConvolution) [03/25/2022-13:24:30] [V] [TRT] FusedConvActConvolution has no valid tactics for this config, skipping [03/25/2022-13:24:30] [V] [TRT] --------------- Timing Runner: sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 (CaskConvolution) [03/25/2022-13:24:30] [V] [TRT] sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_medium_nn_v1 Tactic: 175853789719975416 [03/25/2022-13:24:30] [V] [TRT] Tactic: 175853789719975416 Time: 0.558848 [03/25/2022-13:24:30] [V] [TRT] sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_xregs_medium_nn_v1 Tactic: 2171150287007712632 [03/25/2022-13:24:30] [V] [TRT] Tactic: 2171150287007712632 Time: 0.55168 [03/25/2022-13:24:30] [V] [TRT] sections.2.0.conv1.module.weight + 
QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x64_relu_small_nn_v1 Tactic: 2234457234705232274 [03/25/2022-13:24:30] [V] [TRT] Tactic: 2234457234705232274 Time: 0.499072 [03/25/2022-13:24:30] [V] [TRT] sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x64_relu_medium_nn_v1 Tactic: 5834048089706882838 [03/25/2022-13:24:30] [V] [TRT] Tactic: 5834048089706882838 Time: 0.500992 [03/25/2022-13:24:30] [V] [TRT] sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x128_relu_interior_nn_v1 Tactic: 6299962968199310600 [03/25/2022-13:24:30] [V] [TRT] Tactic: 6299962968199310600 Time: 0.48448 [03/25/2022-13:24:30] [V] [TRT] sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_xregs_interior_nn_v1 Tactic: 6341572697076960911 [03/25/2022-13:24:30] [V] [TRT] Tactic: 6341572697076960911 Time: 0.506112 [03/25/2022-13:24:30] [V] [TRT] sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_small_nn_v1 Tactic: -8626990807754934295 [03/25/2022-13:24:30] [V] [TRT] Tactic: -8626990807754934295 Time: 0.553088 [03/25/2022-13:24:30] [V] [TRT] sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x64_relu_interior_nn_v1 Tactic: -8498217049614706532 [03/25/2022-13:24:30] [V] [TRT] Tactic: -8498217049614706532 Time: 0.482304 [03/25/2022-13:24:30] [V] [TRT] sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_xregs_small_nn_v1 Tactic: -7303593854972602201 [03/25/2022-13:24:30] [V] [TRT] Tactic: -7303593854972602201 Time: 0.520704 [03/25/2022-13:24:30] [V] [TRT] sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x128_relu_medium_nn_v1 Tactic: -6585664687867083638 [03/25/2022-13:24:30] [V] [TRT] Tactic: -6585664687867083638 Time: 0.487296 [03/25/2022-13:24:30] [V] [TRT] sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_interior_nn_v1 Tactic: -3326139578711341011 [03/25/2022-13:24:30] [V] [TRT] Tactic: -3326139578711341011 Time: 0.522368 [03/25/2022-13:24:30] [V] [TRT] sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x128_relu_small_nn_v1 Tactic: -683636008127039856 [03/25/2022-13:24:30] [V] [TRT] Tactic: -683636008127039856 Time: 0.486656 [03/25/2022-13:24:30] [V] [TRT] Fastest Tactic: -8498217049614706532 Time: 0.482304 [03/25/2022-13:24:30] [V] [TRT] >>>>>>>>>>>>>>> Chose Runner Type: CaskConvolution Tactic: -8498217049614706532 [03/25/2022-13:24:30] [V] [TRT] *************** Autotuning format combination: Int8(100352,784:4,28,1) -> Int8(6272,784:32,28,1) *************** [03/25/2022-13:24:30] [V] [TRT] --------------- Timing Runner: sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 (CaskConvolution) [03/25/2022-13:24:30] [V] [TRT] sections.2.0.conv1.module.weight 
+ QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_small_c32_nn_v1 Tactic: 1100922622480907544 [03/25/2022-13:24:30] [V] [TRT] Tactic: 1100922622480907544 Time: 0.55104 [03/25/2022-13:24:30] [V] [TRT] sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x128_relu_interior_c32_nn_v1 Tactic: 2855900226702061782 [03/25/2022-13:24:30] [V] [TRT] Tactic: 2855900226702061782 Time: 0.48704 [03/25/2022-13:24:30] [V] [TRT] sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x64_relu_medium_c32_nn_v1 Tactic: 3606311198834416176 [03/25/2022-13:24:31] [V] [TRT] Tactic: 3606311198834416176 Time: 0.499712 [03/25/2022-13:24:31] [V] [TRT] sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x128_relu_small_c32_nn_v1 Tactic: 4325765560739862899 [03/25/2022-13:24:31] [V] [TRT] Tactic: 4325765560739862899 Time: 0.489216 [03/25/2022-13:24:31] [V] [TRT] sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x64_relu_interior_c32_nn_v1 Tactic: 8803458114157674373 [03/25/2022-13:24:31] [V] [TRT] Tactic: 8803458114157674373 Time: 0.48128 [03/25/2022-13:24:31] [V] [TRT] sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_interior_c32_nn_v1 Tactic: -6934773036503365000 [03/25/2022-13:24:31] [V] [TRT] Tactic: -6934773036503365000 Time: 0.523136 [03/25/2022-13:24:31] [V] [TRT] sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_xregs_interior_c32_nn_v1 Tactic: -4431642509665791294 [03/25/2022-13:24:31] [V] [TRT] Tactic: -4431642509665791294 Time: 0.504704 [03/25/2022-13:24:31] [V] [TRT] sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x128_relu_medium_c32_nn_v1 Tactic: -4255737803793506479 [03/25/2022-13:24:31] [V] [TRT] Tactic: -4255737803793506479 Time: 0.488192 [03/25/2022-13:24:31] [V] [TRT] sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_xregs_small_c32_nn_v1 Tactic: -3958182351168863467 [03/25/2022-13:24:31] [V] [TRT] Tactic: -3958182351168863467 Time: 0.516992 [03/25/2022-13:24:31] [V] [TRT] sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_xregs_medium_c32_nn_v1 Tactic: -3111968753064955248 [03/25/2022-13:24:31] [V] [TRT] Tactic: -3111968753064955248 Time: 0.44992 [03/25/2022-13:24:31] [V] [TRT] sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_medium_c32_nn_v1 Tactic: -1492575840277333548 [03/25/2022-13:24:31] [V] [TRT] Tactic: -1492575840277333548 Time: 0.456064 [03/25/2022-13:24:31] [V] [TRT] sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x64_relu_small_c32_nn_v1 Tactic: -868495160148524802 
[03/25/2022-13:24:31] [V] [TRT] Tactic: -868495160148524802 Time: 0.38784 [03/25/2022-13:24:31] [V] [TRT] Fastest Tactic: -868495160148524802 Time: 0.38784 [03/25/2022-13:24:31] [V] [TRT] >>>>>>>>>>>>>>> Chose Runner Type: CaskConvolution Tactic: -868495160148524802 [03/25/2022-13:24:31] [V] [TRT] *************** Autotuning format combination: Int8(12544,784:32,28,1) -> Int8(6272,784:32,28,1) *************** [03/25/2022-13:24:31] [V] [TRT] --------------- Timing Runner: sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 (CudaGroupConvolution) [03/25/2022-13:24:31] [V] [TRT] CudaGroupConvolution has no valid tactics for this config, skipping [03/25/2022-13:24:31] [V] [TRT] --------------- Timing Runner: sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 (CudaDepthwiseConvolution) [03/25/2022-13:24:31] [V] [TRT] CudaDepthwiseConvolution has no valid tactics for this config, skipping [03/25/2022-13:24:31] [V] [TRT] --------------- Timing Runner: sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 (FusedConvActConvolution) [03/25/2022-13:24:31] [V] [TRT] FusedConvActConvolution has no valid tactics for this config, skipping [03/25/2022-13:24:31] [V] [TRT] --------------- Timing Runner: sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 (CaskConvolution) [03/25/2022-13:24:31] [V] [TRT] sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage1_warpsize4x1x1_g1_tensor8x8x16_t1r1s1 Tactic: 68468667201176803 [03/25/2022-13:24:31] [V] [TRT] Tactic: 68468667201176803 Time: 0.13888 [03/25/2022-13:24:31] [V] [TRT] sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x192x64_stage3_warpsize4x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 125145153013230687 [03/25/2022-13:24:31] [V] [TRT] Tactic: 125145153013230687 Time: 0.147712 [03/25/2022-13:24:31] [V] [TRT] sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x128_ldg16_relu_interior_nt_v1 Tactic: 434957160407688216 [03/25/2022-13:24:31] [V] [TRT] Tactic: 434957160407688216 Time: 0.164736 [03/25/2022-13:24:31] [V] [TRT] sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage4_warpsize4x2x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: 805889586762897346 [03/25/2022-13:24:31] [V] [TRT] Tactic: 805889586762897346 Time: 0.097792 [03/25/2022-13:24:31] [V] [TRT] sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x128_stage3_warpsize4x2x1_g1_sptensor16x8x64_t1r1s1 Tactic: 857001784974286465 [03/25/2022-13:24:31] [V] [TRT] Tactic: 857001784974286465 Time: 0.088704 [03/25/2022-13:24:31] [V] [TRT] sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: 
sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 1214130898909872671 [03/25/2022-13:24:31] [V] [TRT] Tactic: 1214130898909872671 Time: 0.163584 [03/25/2022-13:24:31] [V] [TRT] sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage4_warpsize4x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 1278425129871930205 [03/25/2022-13:24:31] [V] [TRT] Tactic: 1278425129871930205 Time: 0.094464 [03/25/2022-13:24:31] [V] [TRT] sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 1583811548148740665 [03/25/2022-13:24:31] [V] [TRT] Tactic: 1583811548148740665 Time: 0.137472 [03/25/2022-13:24:31] [V] [TRT] sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 1701344857577810806 [03/25/2022-13:24:31] [V] [TRT] Tactic: 1701344857577810806 Time: 0.121216 [03/25/2022-13:24:31] [V] [TRT] sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x32x64_stage4_warpsize2x1x1_g1_tensor16x8x32_t1r1s1 Tactic: 1797231177354918208 [03/25/2022-13:24:31] [V] [TRT] Tactic: 1797231177354918208 Time: 0.14912 [03/25/2022-13:24:31] [V] [TRT] sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: sm80_xmma_conv_fprop_smallk_i8i8_i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_warptilesize16x32_stage4_warpsize2x4x1_r1s1_u1v1_hw0_c512_scalebias_relu Tactic: 1913026264725750683 [03/25/2022-13:24:31] [V] [TRT] Tactic: 1913026264725750683 Time: 0.07232 [03/25/2022-13:24:31] [V] [TRT] sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x192x64_stage3_warpsize4x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 2004812516525036381 [03/25/2022-13:24:31] [V] [TRT] Tactic: 2004812516525036381 Time: 0.126208 [03/25/2022-13:24:31] [V] [TRT] sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x128x64_stage3_warpsize4x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 2030033463723799063 [03/25/2022-13:24:31] [V] [TRT] Tactic: 2030033463723799063 Time: 0.094208 [03/25/2022-13:24:31] [V] [TRT] sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: ampere_int8_i8816cudnn_int8_128x128_ldg16_relu_interior_nt_v1 Tactic: 2346437292116182513 [03/25/2022-13:24:31] [V] [TRT] Tactic: 2346437292116182513 Time: 0.14144 [03/25/2022-13:24:31] [V] [TRT] 
sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 2376898825218218566 [03/25/2022-13:24:31] [V] [TRT] Tactic: 2376898825218218566 Time: 0.081536 [03/25/2022-13:24:31] [V] [TRT] sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_singleBuffer_interior_nt_v1 Tactic: 2522133112320625287 [03/25/2022-13:24:31] [V] [TRT] Tactic: 2522133112320625287 Time: 0.137216 [03/25/2022-13:24:31] [V] [TRT] sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage4_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 2548171972648455240 [03/25/2022-13:24:31] [V] [TRT] Tactic: 2548171972648455240 Time: 0.08832 [03/25/2022-13:24:31] [V] [TRT] sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage1_warpsize2x1x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: 2548946449357458230 [03/25/2022-13:24:31] [V] [TRT] Tactic: 2548946449357458230 Time: 0.171008 [03/25/2022-13:24:31] [V] [TRT] sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x128_stage3_warpsize1x1x1_g1_sptensor16x8x64_simple_t1r1s1_no_preds Tactic: 2570666021825229009 [03/25/2022-13:24:31] [V] [TRT] Tactic: 2570666021825229009 Time: 0.158464 [03/25/2022-13:24:31] [V] [TRT] sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x16x128_stage3_warpsize1x1x1_g1_sptensor16x8x64_simple_t1r1s1 Tactic: 2678520742286844763 [03/25/2022-13:24:31] [V] [TRT] Tactic: 2678520742286844763 Time: 0.260224 [03/25/2022-13:24:31] [V] [TRT] sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 2756291002030759362 [03/25/2022-13:24:31] [V] [TRT] Tactic: 2756291002030759362 Time: 0.111232 [03/25/2022-13:24:31] [V] [TRT] sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x128x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 2972948223367788520 [03/25/2022-13:24:31] [V] [TRT] Tactic: 2972948223367788520 Time: 0.087552 [03/25/2022-13:24:31] [V] [TRT] sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: ampere_int8_i8816cudnn_int8_128x128_ldg16_relu_medium_nt_v1 Tactic: 2985940154541537814 [03/25/2022-13:24:31] [V] [TRT] Tactic: 2985940154541537814 Time: 
0.14016 [03/25/2022-13:24:31] [V] [TRT] sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x96x64_stage5_warpsize2x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 3043273137345374664 [03/25/2022-13:24:31] [V] [TRT] Tactic: 3043273137345374664 Time: 0.16704 [03/25/2022-13:24:31] [V] [TRT] sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 3221677093659484230 [03/25/2022-13:24:31] [V] [TRT] Tactic: 3221677093659484230 Time: 0.164608 [03/25/2022-13:24:31] [V] [TRT] sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r1s1 Tactic: 3242897809704328258 [03/25/2022-13:24:31] [V] [TRT] Tactic: 3242897809704328258 Time: 0.142464 [03/25/2022-13:24:31] [V] [TRT] sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: 3312456766204252694 [03/25/2022-13:24:31] [V] [TRT] Tactic: 3312456766204252694 Time: 0.18496 [03/25/2022-13:24:31] [V] [TRT] sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 3538565962642681625 [03/25/2022-13:24:31] [V] [TRT] Tactic: 3538565962642681625 Time: 0.116992 [03/25/2022-13:24:31] [V] [TRT] sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage1_warpsize2x2x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: 3541919052468401776 [03/25/2022-13:24:31] [V] [TRT] Tactic: 3541919052468401776 Time: 0.128768 [03/25/2022-13:24:31] [V] [TRT] sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 3593397928177382100 [03/25/2022-13:24:31] [V] [TRT] Tactic: 3593397928177382100 Time: 0.164096 [03/25/2022-13:24:31] [V] [TRT] sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x64x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 3670282018109435863 [03/25/2022-13:24:31] [V] [TRT] Tactic: 3670282018109435863 Time: 0.102528 [03/25/2022-13:24:31] [V] [TRT] sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: 
sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 3671413346254027573 [03/25/2022-13:24:31] [V] [TRT] Tactic: 3671413346254027573 Time: 0.115712 [03/25/2022-13:24:31] [V] [TRT] sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x128_ldg16_relu_medium_nt_v1 Tactic: 3899284354987683408 [03/25/2022-13:24:31] [V] [TRT] Tactic: 3899284354987683408 Time: 0.162048 [03/25/2022-13:24:31] [V] [TRT] sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: 3927509214678622419 [03/25/2022-13:24:31] [V] [TRT] Tactic: 3927509214678622419 Time: 0.138496 [03/25/2022-13:24:31] [V] [TRT] sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 4112572034735311841 [03/25/2022-13:24:31] [V] [TRT] Tactic: 4112572034735311841 Time: 0.216064 [03/25/2022-13:24:31] [V] [TRT] sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 4239974928951431644 [03/25/2022-13:24:31] [V] [TRT] Tactic: 4239974928951431644 Time: 0.120832 [03/25/2022-13:24:31] [V] [TRT] sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage6_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 4610760414797216079 [03/25/2022-13:24:31] [V] [TRT] Tactic: 4610760414797216079 Time: 0.109824 [03/25/2022-13:24:31] [V] [TRT] sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_small_nt_v1 Tactic: 4717285412741024953 [03/25/2022-13:24:31] [V] [TRT] Tactic: 4717285412741024953 Time: 0.14208 [03/25/2022-13:24:31] [V] [TRT] sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 4796956614760326119 [03/25/2022-13:24:31] [V] [TRT] Tactic: 4796956614760326119 Time: 0.105984 [03/25/2022-13:24:31] [V] [TRT] sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x128_stage3_warpsize2x2x1_g1_sptensor16x8x64_t1r1s1_no_preds Tactic: 4909502217677847353 [03/25/2022-13:24:31] [V] [TRT] Tactic: 4909502217677847353 Time: 0.082304 [03/25/2022-13:24:31] [V] [TRT] sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set 
Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 4919361344804309192 [03/25/2022-13:24:31] [V] [TRT] Tactic: 4919361344804309192 Time: 0.175488 [03/25/2022-13:24:31] [V] [TRT] sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x128_stage3_warpsize1x2x1_g1_sptensor16x8x64_simple_t1r1s1 Tactic: 5043674678294309681 [03/25/2022-13:24:31] [V] [TRT] Tactic: 5043674678294309681 Time: 0.113152 [03/25/2022-13:24:31] [V] [TRT] sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_interior_nt_v1 Tactic: 5126565865931538390 [03/25/2022-13:24:31] [V] [TRT] Tactic: 5126565865931538390 Time: 0.142976 [03/25/2022-13:24:31] [V] [TRT] sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x64x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 5204702486885981735 [03/25/2022-13:24:31] [V] [TRT] Tactic: 5204702486885981735 Time: 0.104832 [03/25/2022-13:24:31] [V] [TRT] sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: 5375256703210220108 [03/25/2022-13:24:31] [V] [TRT] Tactic: 5375256703210220108 Time: 0.099968 [03/25/2022-13:24:31] [V] [TRT] sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x128_stage3_warpsize2x2x1_g1_sptensor16x8x64_simple_t1r1s1 Tactic: 5424258848951129084 [03/25/2022-13:24:31] [V] [TRT] Tactic: 5424258848951129084 Time: 0.08256 [03/25/2022-13:24:31] [V] [TRT] sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 5443897483205284103 [03/25/2022-13:24:31] [V] [TRT] Tactic: 5443897483205284103 Time: 0.119296 [03/25/2022-13:24:31] [V] [TRT] sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage4_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 5707566217891294846 [03/25/2022-13:24:31] [V] [TRT] Tactic: 5707566217891294846 Time: 0.096128 [03/25/2022-13:24:31] [V] [TRT] sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x4x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 5986622376339202983 [03/25/2022-13:24:31] [V] [TRT] Tactic: 5986622376339202983 Time: 0.121344 
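The sweeps above are representative of what the builder does for every fused Q/DQ convolution in this network: each candidate kernel ("tactic") is timed, including dense Ampere implicit-GEMM tiles (sm80_xmma_fprop_implicit_gemm_*), 2:4 structured-sparse kernels (sm80_xmma_fprop_sparse_conv_*), Turing-generation fallbacks (sm75_xmma_*), and cuDNN-derived kernels (ampere_int8_i8816cudnn_*), and the minimum wins, reported as "Fastest Tactic". To audit which winners are sparse kernels without reading a multi-megabyte log, a short script can reduce it to a summary. The sketch below is illustrative and not part of trtexec; the regular expressions assume exactly the record shapes visible in this log ("Set Tactic Name: ... Tactic: N", "Tactic: N Time: T", "Fastest Tactic: N Time: T"), and the file name audit_tactics.py is hypothetical.

import re
import sys
from collections import defaultdict

# audit_tactics.py (hypothetical helper): reduce a verbose trtexec build log
# to a summary of autotuning winners. Times are the milliseconds printed in
# the "Time:" fields above.
NAME_RE = re.compile(r"Set Tactic Name: (\S+) Tactic: (-?\d+)")
# Negative lookbehind keeps "Fastest Tactic: N Time: T" records out of the
# per-candidate timing list.
TIME_RE = re.compile(r"(?<!Fastest )Tactic: (-?\d+) Time: ([\d.]+)")
BEST_RE = re.compile(r"Fastest Tactic: (-?\d+) Time: ([\d.]+)")

def summarize(log_text):
    names = {}                   # tactic id -> kernel name
    times = defaultdict(list)    # tactic id -> all measured times (ms)
    for m in NAME_RE.finditer(log_text):
        names[m.group(2)] = m.group(1)
    for m in TIME_RE.finditer(log_text):
        times[m.group(1)].append(float(m.group(2)))
    winners = [(m.group(1), float(m.group(2)))
               for m in BEST_RE.finditer(log_text)]
    return names, times, winners

if __name__ == "__main__":
    names, times, winners = summarize(sys.stdin.read())
    total = sum(len(v) for v in times.values())
    print(f"{len(winners)} autotuned regions, {total} tactic timings")
    for tactic_id, ms in winners:
        # A winner is only resolvable to a kernel name if its
        # "Set Tactic Name" record falls inside the captured window.
        kernel = names.get(tactic_id, "<announced outside this window>")
        kind = "sparse" if "sparse" in kernel else "dense"
        print(f"{ms:9.6f} ms  {kind:6s}  {kernel}")

Run as, for example: python audit_tactics.py < build.log.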
[03/25/2022-13:24:31] [V] [TRT] sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x128x64_stage3_warpsize4x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 6007888770437705057 [03/25/2022-13:24:31] [V] [TRT] Tactic: 6007888770437705057 Time: 0.104832 [03/25/2022-13:24:31] [V] [TRT] sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32_t1r1s1 Tactic: 6405251167055673379 [03/25/2022-13:24:31] [V] [TRT] Tactic: 6405251167055673379 Time: 0.108672 [03/25/2022-13:24:31] [V] [TRT] sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage4_warpsize4x1x1_g1_tensor16x8x32_t1r1s1 Tactic: 6433368103202497147 [03/25/2022-13:24:31] [V] [TRT] Tactic: 6433368103202497147 Time: 0.101888 [03/25/2022-13:24:31] [V] [TRT] sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 6441948709525127755 [03/25/2022-13:24:31] [V] [TRT] Tactic: 6441948709525127755 Time: 0.1664 [03/25/2022-13:24:31] [V] [TRT] sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x256x64_stage3_warpsize2x4x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 6443933097134654777 [03/25/2022-13:24:31] [V] [TRT] Tactic: 6443933097134654777 Time: 0.105088 [03/25/2022-13:24:31] [V] [TRT] sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 6457435868048963632 [03/25/2022-13:24:31] [V] [TRT] Tactic: 6457435868048963632 Time: 0.111232 [03/25/2022-13:24:31] [V] [TRT] sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x128_stage3_warpsize1x1x1_g1_sptensor16x8x64_t1r1s1 Tactic: 6510345569544721081 [03/25/2022-13:24:31] [V] [TRT] Tactic: 6510345569544721081 Time: 0.158976 [03/25/2022-13:24:31] [V] [TRT] sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 6793988781414507278 [03/25/2022-13:24:31] [V] [TRT] Tactic: 6793988781414507278 Time: 0.085504 [03/25/2022-13:24:31] [V] [TRT] sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: 
sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 6880710371738875469 [03/25/2022-13:24:31] [V] [TRT] Tactic: 6880710371738875469 Time: 0.124032 [03/25/2022-13:24:31] [V] [TRT] sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 6925201228918187099 [03/25/2022-13:24:31] [V] [TRT] Tactic: 6925201228918187099 Time: 0.088832 [03/25/2022-13:24:31] [V] [TRT] sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 6991524515605108718 [03/25/2022-13:24:31] [V] [TRT] Tactic: 6991524515605108718 Time: 0.144256 [03/25/2022-13:24:31] [V] [TRT] sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x4x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 7245509442265271220 [03/25/2022-13:24:31] [V] [TRT] Tactic: 7245509442265271220 Time: 0.122368 [03/25/2022-13:24:31] [V] [TRT] sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x64x64_stage3_warpsize4x1x1_g1_tensor16x8x32_t1r1s1 Tactic: 7318929579222925725 [03/25/2022-13:24:31] [V] [TRT] Tactic: 7318929579222925725 Time: 0.098304 [03/25/2022-13:24:31] [V] [TRT] sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: 7731430299029542276 [03/25/2022-13:24:31] [V] [TRT] Tactic: 7731430299029542276 Time: 0.08448 [03/25/2022-13:24:31] [V] [TRT] sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage4_warpsize2x4x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: 7738495016763012180 [03/25/2022-13:24:31] [V] [TRT] Tactic: 7738495016763012180 Time: 0.09472 [03/25/2022-13:24:31] [V] [TRT] sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x128_stage3_warpsize1x4x1_g1_sptensor16x8x64_t1r1s1_no_preds Tactic: 7886967395128926382 [03/25/2022-13:24:31] [V] [TRT] Tactic: 7886967395128926382 Time: 0.102272 [03/25/2022-13:24:31] [V] [TRT] sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage4_warpsize4x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 8142283985160822229 
[03/25/2022-13:24:31] [V] [TRT] Tactic: 8142283985160822229 Time: 0.104448 [03/25/2022-13:24:31] [V] [TRT] sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage4_warpsize2x4x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 8173975624668590862 [03/25/2022-13:24:31] [V] [TRT] Tactic: 8173975624668590862 Time: 0.1024 [03/25/2022-13:24:31] [V] [TRT] sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage4_warpsize4x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 8234775147403903473 [03/25/2022-13:24:31] [V] [TRT] Tactic: 8234775147403903473 Time: 0.108288 [03/25/2022-13:24:31] [V] [TRT] sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x128x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 8524082966802584889 [03/25/2022-13:24:31] [V] [TRT] Tactic: 8524082966802584889 Time: 0.090112 [03/25/2022-13:24:31] [V] [TRT] sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage1_warpsize4x1x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: 8684013308930763400 [03/25/2022-13:24:31] [V] [TRT] Tactic: 8684013308930763400 Time: 0.135552 [03/25/2022-13:24:31] [V] [TRT] sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x128x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 8765382722978397630 [03/25/2022-13:24:31] [V] [TRT] Tactic: 8765382722978397630 Time: 0.091136 [03/25/2022-13:24:31] [V] [TRT] sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 8843193587782643431 [03/25/2022-13:24:31] [V] [TRT] Tactic: 8843193587782643431 Time: 0.135424 [03/25/2022-13:24:31] [V] [TRT] sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x128_stage3_warpsize1x4x1_g1_sptensor16x8x64_simple_t1r1s1 Tactic: 8883810517410230831 [03/25/2022-13:24:31] [V] [TRT] Tactic: 8883810517410230831 Time: 0.10368 [03/25/2022-13:24:31] [V] [TRT] sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 8930797211803511337 [03/25/2022-13:24:31] [V] [TRT] Tactic: 8930797211803511337 Time: 0.1408 [03/25/2022-13:24:31] [V] [TRT] sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 
Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 8935070489925739043 [03/25/2022-13:24:31] [V] [TRT] Tactic: 8935070489925739043 Time: 0.096 [03/25/2022-13:24:31] [V] [TRT] sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x16x128_stage3_warpsize1x1x1_g1_sptensor16x8x64_t1r1s1 Tactic: 9062173295331155069 [03/25/2022-13:24:31] [V] [TRT] Tactic: 9062173295331155069 Time: 0.26048 [03/25/2022-13:24:31] [V] [TRT] sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_singleBuffer_small_nt_v1 Tactic: -9118785798277698619 [03/25/2022-13:24:31] [V] [TRT] Tactic: -9118785798277698619 Time: 0.139392 [03/25/2022-13:24:31] [V] [TRT] sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -8985599729413291927 [03/25/2022-13:24:31] [V] [TRT] Tactic: -8985599729413291927 Time: 0.095104 [03/25/2022-13:24:31] [V] [TRT] sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x32x64_stage4_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -8972697510150675429 [03/25/2022-13:24:31] [V] [TRT] Tactic: -8972697510150675429 Time: 0.129408 [03/25/2022-13:24:31] [V] [TRT] sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r1s1 Tactic: -8943710627305202139 [03/25/2022-13:24:31] [V] [TRT] Tactic: -8943710627305202139 Time: 0.139008 [03/25/2022-13:24:31] [V] [TRT] sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: -8859846367886814331 [03/25/2022-13:24:31] [V] [TRT] Tactic: -8859846367886814331 Time: 0.157696 [03/25/2022-13:24:31] [V] [TRT] sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: -8638624340850784688 [03/25/2022-13:24:31] [V] [TRT] Tactic: -8638624340850784688 Time: 0.14464 [03/25/2022-13:24:31] [V] [TRT] sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage4_warpsize4x1x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: -8556775352640313933 [03/25/2022-13:24:31] [V] [TRT] Tactic: -8556775352640313933 Time: 0.093824 
[03/25/2022-13:24:31] [V] [TRT] sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage1_warpsize2x1x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: -8382298409581540699 [03/25/2022-13:24:31] [V] [TRT] Tactic: -8382298409581540699 Time: 0.213248 [03/25/2022-13:24:31] [V] [TRT] sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage1_warpsize4x2x1_g1_tensor8x8x16_t1r1s1 Tactic: -8172318747337038866 [03/25/2022-13:24:31] [V] [TRT] Tactic: -8172318747337038866 Time: 0.154112 [03/25/2022-13:24:31] [V] [TRT] sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x256x64_stage3_warpsize2x4x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -8038164441468184723 [03/25/2022-13:24:31] [V] [TRT] Tactic: -8038164441468184723 Time: 0.094976 [03/25/2022-13:24:31] [V] [TRT] sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x96x64_stage5_warpsize2x1x1_g1_tensor16x8x32_t1r1s1 Tactic: -7844028314176826857 [03/25/2022-13:24:31] [V] [TRT] Tactic: -7844028314176826857 Time: 0.16832 [03/25/2022-13:24:31] [V] [TRT] sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x128_stage3_warpsize2x2x1_g1_sptensor16x8x64_simple_t1r1s1_no_preds Tactic: -7674507941016740570 [03/25/2022-13:24:31] [V] [TRT] Tactic: -7674507941016740570 Time: 0.08192 [03/25/2022-13:24:31] [V] [TRT] sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage4_warpsize2x4x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -7364286662638617917 [03/25/2022-13:24:31] [V] [TRT] Tactic: -7364286662638617917 Time: 0.09152 [03/25/2022-13:24:31] [V] [TRT] sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage1_warpsize2x1x1_g1_tensor8x8x16_t1r1s1 Tactic: -7361755530333096258 [03/25/2022-13:24:31] [V] [TRT] Tactic: -7361755530333096258 Time: 0.17472 [03/25/2022-13:24:31] [V] [TRT] sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x128_stage3_warpsize1x4x1_g1_sptensor16x8x64_t1r1s1 Tactic: -7289760022626653388 [03/25/2022-13:24:31] [V] [TRT] Tactic: -7289760022626653388 Time: 0.103552 [03/25/2022-13:24:31] [V] [TRT] sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: 
sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage1_warpsize2x4x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: -7106539943789766885 [03/25/2022-13:24:31] [V] [TRT] Tactic: -7106539943789766885 Time: 0.14976 [03/25/2022-13:24:31] [V] [TRT] sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage1_warpsize4x2x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: -6969478418607271266 [03/25/2022-13:24:31] [V] [TRT] Tactic: -6969478418607271266 Time: 0.150912 [03/25/2022-13:24:31] [V] [TRT] sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -6930438165437733000 [03/25/2022-13:24:31] [V] [TRT] Tactic: -6930438165437733000 Time: 0.2176 [03/25/2022-13:24:31] [V] [TRT] sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x64x64_stage3_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -6879607992933502380 [03/25/2022-13:24:31] [V] [TRT] Tactic: -6879607992933502380 Time: 0.095616 [03/25/2022-13:24:31] [V] [TRT] sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x128_stage3_warpsize1x2x1_g1_sptensor16x8x64_simple_t1r1s1_no_preds Tactic: -6839669803644810934 [03/25/2022-13:24:31] [V] [TRT] Tactic: -6839669803644810934 Time: 0.11264 [03/25/2022-13:24:31] [V] [TRT] sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x128_stage3_warpsize1x2x1_g1_sptensor16x8x64_t1r1s1_no_preds Tactic: -6812830108414456369 [03/25/2022-13:24:31] [V] [TRT] Tactic: -6812830108414456369 Time: 0.112256 [03/25/2022-13:24:31] [V] [TRT] sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x128_stage3_warpsize4x2x1_g1_sptensor16x8x64_simple_t1r1s1_no_preds Tactic: -6779804930216439173 [03/25/2022-13:24:31] [V] [TRT] Tactic: -6779804930216439173 Time: 0.087424 [03/25/2022-13:24:31] [V] [TRT] sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: -6527178416855951297 [03/25/2022-13:24:31] [V] [TRT] Tactic: -6527178416855951297 Time: 0.180352 [03/25/2022-13:24:31] [V] [TRT] sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: 
sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_t1r1s1 Tactic: -6510232214299595844 [03/25/2022-13:24:31] [V] [TRT] Tactic: -6510232214299595844 Time: 0.173184 [03/25/2022-13:24:31] [V] [TRT] sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: ampere_int8_i8816cudnn_int8_128x128_ldg16_relu_small_nt_v1 Tactic: -6400348606759295499 [03/25/2022-13:24:31] [V] [TRT] Tactic: -6400348606759295499 Time: 0.133376 [03/25/2022-13:24:31] [V] [TRT] sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage1_warpsize2x2x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: -6346247605026339453 [03/25/2022-13:24:31] [V] [TRT] Tactic: -6346247605026339453 Time: 0.137088 [03/25/2022-13:24:31] [V] [TRT] sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x96x64_stage4_warpsize4x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -6232597026469067819 [03/25/2022-13:24:31] [V] [TRT] Tactic: -6232597026469067819 Time: 0.140928 [03/25/2022-13:24:31] [V] [TRT] sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x128_ldg16_relu_small_nt_v1 Tactic: -5980889159865208399 [03/25/2022-13:24:31] [V] [TRT] Tactic: -5980889159865208399 Time: 0.15808 [03/25/2022-13:24:31] [V] [TRT] sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_singleBuffer_medium_nt_v1 Tactic: -5766140806760372989 [03/25/2022-13:24:31] [V] [TRT] Tactic: -5766140806760372989 Time: 0.14336 [03/25/2022-13:24:31] [V] [TRT] sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r1s1 Tactic: -5697614955743334137 [03/25/2022-13:24:31] [V] [TRT] Tactic: -5697614955743334137 Time: 0.1312 [03/25/2022-13:24:31] [V] [TRT] sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -5671123121710113970 [03/25/2022-13:24:31] [V] [TRT] Tactic: -5671123121710113970 Time: 0.1216 [03/25/2022-13:24:31] [V] [TRT] sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x32x64_stage4_warpsize2x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -5615581362569252260 [03/25/2022-13:24:31] [V] [TRT] Tactic: -5615581362569252260 Time: 0.146176 [03/25/2022-13:24:31] [V] [TRT] sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: 
sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x128x64_stage3_warpsize4x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -5562968047117507056 [03/25/2022-13:24:31] [V] [TRT] Tactic: -5562968047117507056 Time: 0.103936 [03/25/2022-13:24:31] [V] [TRT] sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -5516472881360101487 [03/25/2022-13:24:31] [V] [TRT] Tactic: -5516472881360101487 Time: 0.143872 [03/25/2022-13:24:31] [V] [TRT] sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r1s1 Tactic: -5311474420963248369 [03/25/2022-13:24:31] [V] [TRT] Tactic: -5311474420963248369 Time: 0.186752 [03/25/2022-13:24:31] [V] [TRT] sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: -5170003087447722174 [03/25/2022-13:24:31] [V] [TRT] Tactic: -5170003087447722174 Time: 0.17024 [03/25/2022-13:24:31] [V] [TRT] sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage6_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -4889586143772361690 [03/25/2022-13:24:31] [V] [TRT] Tactic: -4889586143772361690 Time: 0.099584 [03/25/2022-13:24:31] [V] [TRT] sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x128_stage3_warpsize1x4x1_g1_sptensor16x8x64_simple_t1r1s1_no_preds Tactic: -4889498558023475527 [03/25/2022-13:24:31] [V] [TRT] Tactic: -4889498558023475527 Time: 0.102656 [03/25/2022-13:24:31] [V] [TRT] sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: -4849712423393454704 [03/25/2022-13:24:31] [V] [TRT] Tactic: -4849712423393454704 Time: 0.099712 [03/25/2022-13:24:31] [V] [TRT] sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x128_stage3_warpsize4x2x1_g1_sptensor16x8x64_simple_t1r1s1 Tactic: -4681913707320020520 [03/25/2022-13:24:31] [V] [TRT] Tactic: -4681913707320020520 Time: 0.087168 [03/25/2022-13:24:31] [V] [TRT] sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_medium_nt_v1 Tactic: -4516822589357530549 [03/25/2022-13:24:31] [V] [TRT] Tactic: -4516822589357530549 Time: 0.148864 
[03/25/2022-13:24:31] [V] [TRT] sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -4455415102719506646 [03/25/2022-13:24:31] [V] [TRT] Tactic: -4455415102719506646 Time: 0.116224 [03/25/2022-13:24:31] [V] [TRT] sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32_t1r1s1 Tactic: -4425346730823666456 [03/25/2022-13:24:31] [V] [TRT] Tactic: -4425346730823666456 Time: 0.149248 [03/25/2022-13:24:31] [V] [TRT] sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage1_warpsize2x1x1_g1_tensor8x8x16_t1r1s1 Tactic: -4260476497340370474 [03/25/2022-13:24:31] [V] [TRT] Tactic: -4260476497340370474 Time: 0.21504 [03/25/2022-13:24:31] [V] [TRT] sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x2x1_g1_tensor16x8x32_t1r1s1 Tactic: -4182501876984672402 [03/25/2022-13:24:31] [V] [TRT] Tactic: -4182501876984672402 Time: 0.121856 [03/25/2022-13:24:31] [V] [TRT] sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x256x64_stage3_warpsize2x4x1_g1_tensor16x8x32_t1r1s1 Tactic: -4151617293257698859 [03/25/2022-13:24:31] [V] [TRT] Tactic: -4151617293257698859 Time: 0.105344 [03/25/2022-13:24:31] [V] [TRT] sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x128_stage3_warpsize4x2x1_g1_sptensor16x8x64_t1r1s1_no_preds Tactic: -3862908719298381451 [03/25/2022-13:24:31] [V] [TRT] Tactic: -3862908719298381451 Time: 0.089216 [03/25/2022-13:24:31] [V] [TRT] sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -3825889760337461729 [03/25/2022-13:24:31] [V] [TRT] Tactic: -3825889760337461729 Time: 0.167552 [03/25/2022-13:24:31] [V] [TRT] sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -3797022944823726673 [03/25/2022-13:24:31] [V] [TRT] Tactic: -3797022944823726673 Time: 0.114176 [03/25/2022-13:24:31] [V] [TRT] sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: 
sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_t1r1s1 Tactic: -3613322253849278738 [03/25/2022-13:24:31] [V] [TRT] Tactic: -3613322253849278738 Time: 0.224768 [03/25/2022-13:24:31] [V] [TRT] sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x32x64_stage4_warpsize4x1x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: -3577322188448771475 [03/25/2022-13:24:32] [V] [TRT] Tactic: -3577322188448771475 Time: 0.137344 [03/25/2022-13:24:32] [V] [TRT] sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage1_warpsize2x4x1_g1_tensor8x8x16_t1r1s1 Tactic: -3531681826488401618 [03/25/2022-13:24:32] [V] [TRT] Tactic: -3531681826488401618 Time: 0.152448 [03/25/2022-13:24:32] [V] [TRT] sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x16x128_stage3_warpsize1x1x1_g1_sptensor16x8x64_simple_t1r1s1_no_preds Tactic: -3305554949874552860 [03/25/2022-13:24:32] [V] [TRT] Tactic: -3305554949874552860 Time: 0.260224 [03/25/2022-13:24:32] [V] [TRT] sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x192x64_stage3_warpsize4x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -3288585994448820820 [03/25/2022-13:24:32] [V] [TRT] Tactic: -3288585994448820820 Time: 0.145664 [03/25/2022-13:24:32] [V] [TRT] sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x32x64_stage4_warpsize4x1x1_g1_tensor16x8x32_t1r1s1 Tactic: -2754311112012636251 [03/25/2022-13:24:32] [V] [TRT] Tactic: -2754311112012636251 Time: 0.137856 [03/25/2022-13:24:32] [V] [TRT] sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -2432868635536396215 [03/25/2022-13:24:32] [V] [TRT] Tactic: -2432868635536396215 Time: 0.13632 [03/25/2022-13:24:32] [V] [TRT] sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x96x64_stage5_warpsize2x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -2379804152300264660 [03/25/2022-13:24:32] [V] [TRT] Tactic: -2379804152300264660 Time: 0.16704 [03/25/2022-13:24:32] [V] [TRT] sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x128_stage3_warpsize2x2x1_g1_sptensor16x8x64_t1r1s1 Tactic: 
-2352253835013627337 [03/25/2022-13:24:32] [V] [TRT] Tactic: -2352253835013627337 Time: 0.082944 [03/25/2022-13:24:32] [V] [TRT] sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x96x64_stage4_warpsize4x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -2335587136911650799 [03/25/2022-13:24:32] [V] [TRT] Tactic: -2335587136911650799 Time: 0.123904 [03/25/2022-13:24:32] [V] [TRT] sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage4_warpsize2x4x1_g1_tensor16x8x32_t1r1s1 Tactic: -2315453944962430928 [03/25/2022-13:24:32] [V] [TRT] Tactic: -2315453944962430928 Time: 0.104832 [03/25/2022-13:24:32] [V] [TRT] sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x128_stage3_warpsize1x1x1_g1_sptensor16x8x64_simple_t1r1s1 Tactic: -2238364958919154661 [03/25/2022-13:24:32] [V] [TRT] Tactic: -2238364958919154661 Time: 0.158464 [03/25/2022-13:24:32] [V] [TRT] sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -1916483171117495388 [03/25/2022-13:24:32] [V] [TRT] Tactic: -1916483171117495388 Time: 0.144384 [03/25/2022-13:24:32] [V] [TRT] sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x16x128_stage3_warpsize1x1x1_g1_sptensor16x8x64_t1r1s1_no_preds Tactic: -1740762957710554518 [03/25/2022-13:24:32] [V] [TRT] Tactic: -1740762957710554518 Time: 0.260224 [03/25/2022-13:24:32] [V] [TRT] sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x96x64_stage4_warpsize4x2x1_g1_tensor16x8x32_t1r1s1 Tactic: -1549742793039499659 [03/25/2022-13:24:32] [V] [TRT] Tactic: -1549742793039499659 Time: 0.14272 [03/25/2022-13:24:32] [V] [TRT] sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: -1499578657823798783 [03/25/2022-13:24:32] [V] [TRT] Tactic: -1499578657823798783 Time: 0.118528 [03/25/2022-13:24:32] [V] [TRT] sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r1s1 Tactic: -1494157908358500249 [03/25/2022-13:24:32] [V] [TRT] Tactic: -1494157908358500249 Time: 0.160896 [03/25/2022-13:24:32] [V] [TRT] sections.2.0.conv1.module.weight + 
QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x4x1_g1_tensor16x8x32_t1r1s1 Tactic: -1328736756812546664 [03/25/2022-13:24:32] [V] [TRT] Tactic: -1328736756812546664 Time: 0.120192 [03/25/2022-13:24:32] [V] [TRT] sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x32x64_stage4_warpsize2x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -1006589727652607355 [03/25/2022-13:24:32] [V] [TRT] Tactic: -1006589727652607355 Time: 0.145024 [03/25/2022-13:24:32] [V] [TRT] sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: -713022856474991236 [03/25/2022-13:24:32] [V] [TRT] Tactic: -713022856474991236 Time: 0.23168 [03/25/2022-13:24:32] [V] [TRT] sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x128_stage3_warpsize1x1x1_g1_sptensor16x8x64_t1r1s1_no_preds Tactic: -619668460699260222 [03/25/2022-13:24:32] [V] [TRT] Tactic: -619668460699260222 Time: 0.158208 [03/25/2022-13:24:32] [V] [TRT] sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x64x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: -405554772060757402 [03/25/2022-13:24:32] [V] [TRT] Tactic: -405554772060757402 Time: 0.105984 [03/25/2022-13:24:32] [V] [TRT] sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: -375949437730908730 [03/25/2022-13:24:32] [V] [TRT] Tactic: -375949437730908730 Time: 0.104448 [03/25/2022-13:24:32] [V] [TRT] sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x128_stage3_warpsize1x2x1_g1_sptensor16x8x64_t1r1s1 Tactic: -233227833606287806 [03/25/2022-13:24:32] [V] [TRT] Tactic: -233227833606287806 Time: 0.113024 [03/25/2022-13:24:32] [V] [TRT] sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -111878368089469751 [03/25/2022-13:24:32] [V] [TRT] Tactic: -111878368089469751 Time: 0.138112 [03/25/2022-13:24:32] [V] [TRT] sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: 
sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x64x64_stage3_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -48936598874722005 [03/25/2022-13:24:32] [V] [TRT] Tactic: -48936598874722005 Time: 0.090368 [03/25/2022-13:24:32] [V] [TRT] sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x32x64_stage4_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -19707840769375107 [03/25/2022-13:24:32] [V] [TRT] Tactic: -19707840769375107 Time: 0.133248 [03/25/2022-13:24:32] [V] [TRT] Fastest Tactic: 1913026264725750683 Time: 0.07232 [03/25/2022-13:24:32] [V] [TRT] >>>>>>>>>>>>>>> Chose Runner Type: CaskConvolution Tactic: 1913026264725750683
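Every autotuning block above follows the same pattern: for one fused layer and one format combination, TensorRT times each eligible kernel (dense implicit-GEMM "xmma" kernels, sparse "sptensor" variants, and cuDNN-style "i8816cudnn" fallbacks) and keeps the fastest measurement, here tactic 1913026264725750683 at 0.07232 for the fused Conv_420 + Relu_422. The sparse tactics only enter this competition because the build enables structured sparsity alongside INT8. As a point of reference, a minimal sketch of the builder-API configuration that drives this search, assuming the standard TensorRT 8.x Python bindings (file name illustrative; error handling trimmed):

import tensorrt as trt

# Hedged sketch, not taken from the log: VERBOSE logging is what surfaces
# the per-tactic timings shown above.
logger = trt.Logger(trt.Logger.VERBOSE)
builder = trt.Builder(logger)
network = builder.create_network(
    1 << int(trt.NetworkDefinitionCreationFlag.EXPLICIT_BATCH))
parser = trt.OnnxParser(network, logger)
with open("resnet50_quant_sparse.onnx", "rb") as f:  # illustrative model path
    if not parser.parse(f.read()):
        raise RuntimeError("ONNX parse failed")

config = builder.create_builder_config()
config.set_flag(trt.BuilderFlag.INT8)            # scales come from the Q/DQ nodes
config.set_flag(trt.BuilderFlag.SPARSE_WEIGHTS)  # lets sparse tactics compete
engine_bytes = builder.build_serialized_network(network, config)

Note that enabling SPARSE_WEIGHTS only widens the candidate pool; a sparse kernel is used for a layer only when it actually wins the timing contest for that layer.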
[03/25/2022-13:24:32] [V] [TRT] =============== Computing costs for [03/25/2022-13:24:32] [V] [TRT] *************** Autotuning format combination: Int8(100352,784:4,28,1) -> Int8(50176,196:4,14,1) *************** [03/25/2022-13:24:32] [V] [TRT] --------------- Timing Runner: sections.2.0.identity.conv.module.weight + QuantizeLinear_460_quantize_scale_node + Conv_464 (CudaDepthwiseConvolution) [03/25/2022-13:24:32] [V] [TRT] CudaDepthwiseConvolution has no valid tactics for this config, skipping [03/25/2022-13:24:32] [V] [TRT] --------------- Timing Runner: sections.2.0.identity.conv.module.weight + QuantizeLinear_460_quantize_scale_node + Conv_464 (FusedConvActConvolution) [03/25/2022-13:24:32] [V] [TRT] FusedConvActConvolution has no valid tactics for this config, skipping [03/25/2022-13:24:32] [V] [TRT] --------------- Timing Runner: sections.2.0.identity.conv.module.weight + QuantizeLinear_460_quantize_scale_node + Conv_464 (CaskConvolution) [03/25/2022-13:24:32] [V] [TRT] sections.2.0.identity.conv.module.weight + QuantizeLinear_460_quantize_scale_node + Conv_464 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_medium_nn_v1 Tactic: 175853789719975416 [03/25/2022-13:24:32] [V] [TRT] Tactic: 175853789719975416 Time: 0.758016 [03/25/2022-13:24:32] [V] [TRT] sections.2.0.identity.conv.module.weight + QuantizeLinear_460_quantize_scale_node + Conv_464 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_xregs_medium_nn_v1 Tactic: 2171150287007712632 [03/25/2022-13:24:32] [V] [TRT] Tactic: 2171150287007712632 Time: 0.806656 [03/25/2022-13:24:32] [V] [TRT] sections.2.0.identity.conv.module.weight + QuantizeLinear_460_quantize_scale_node + Conv_464 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x64_relu_small_nn_v1 Tactic: 2234457234705232274 [03/25/2022-13:24:32] [V] [TRT] Tactic: 2234457234705232274 Time: 0.58304 [03/25/2022-13:24:32] [V] [TRT] sections.2.0.identity.conv.module.weight + QuantizeLinear_460_quantize_scale_node + Conv_464 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x64_relu_medium_nn_v1 Tactic: 5834048089706882838 [03/25/2022-13:24:32] [V] [TRT] Tactic: 5834048089706882838 Time: 0.582784 [03/25/2022-13:24:32] [V] [TRT] sections.2.0.identity.conv.module.weight + QuantizeLinear_460_quantize_scale_node + Conv_464 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x128_relu_interior_nn_v1 Tactic: 6299962968199310600 [03/25/2022-13:24:32] [V] [TRT] Tactic: 6299962968199310600 Time: 0.393984 [03/25/2022-13:24:32] [V] [TRT] sections.2.0.identity.conv.module.weight + QuantizeLinear_460_quantize_scale_node + Conv_464 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_xregs_interior_nn_v1 Tactic: 6341572697076960911 [03/25/2022-13:24:32] [V] [TRT] Tactic: 6341572697076960911 Time: 0.855552 [03/25/2022-13:24:32] [V] [TRT] sections.2.0.identity.conv.module.weight + QuantizeLinear_460_quantize_scale_node + Conv_464 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_small_nn_v1 Tactic: -8626990807754934295 [03/25/2022-13:24:32] [V] [TRT] Tactic: -8626990807754934295 Time: 0.764672 [03/25/2022-13:24:32] [V] [TRT] sections.2.0.identity.conv.module.weight + QuantizeLinear_460_quantize_scale_node + Conv_464 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x64_relu_interior_nn_v1 Tactic: -8498217049614706532 [03/25/2022-13:24:32] [V] [TRT] Tactic: -8498217049614706532 Time: 0.562304 [03/25/2022-13:24:32] [V] [TRT] sections.2.0.identity.conv.module.weight + QuantizeLinear_460_quantize_scale_node + Conv_464 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_xregs_small_nn_v1 Tactic: -7303593854972602201 [03/25/2022-13:24:32] [V] [TRT] Tactic: -7303593854972602201 Time: 0.888704 [03/25/2022-13:24:32] [V] [TRT] sections.2.0.identity.conv.module.weight + QuantizeLinear_460_quantize_scale_node + Conv_464 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x128_relu_medium_nn_v1 Tactic: -6585664687867083638 [03/25/2022-13:24:32] [V] [TRT] Tactic: -6585664687867083638 Time: 0.396672 [03/25/2022-13:24:32] [V] [TRT] sections.2.0.identity.conv.module.weight + QuantizeLinear_460_quantize_scale_node + Conv_464 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_interior_nn_v1 Tactic: -3326139578711341011 [03/25/2022-13:24:32] [V] [TRT] Tactic: -3326139578711341011 Time: 0.67328 [03/25/2022-13:24:32] [V] [TRT] sections.2.0.identity.conv.module.weight + QuantizeLinear_460_quantize_scale_node + Conv_464 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x128_relu_small_nn_v1 Tactic: -683636008127039856 [03/25/2022-13:24:32] [V] [TRT] Tactic: -683636008127039856 Time: 0.395392 [03/25/2022-13:24:32] [V] [TRT] Fastest Tactic: 6299962968199310600 Time: 0.393984 [03/25/2022-13:24:32] [V] [TRT] >>>>>>>>>>>>>>> Chose Runner Type: CaskConvolution Tactic: 6299962968199310600
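The same fused Conv_464 is then retimed for further input/output format combinations (the :4 and :32 channel-vectorized Int8 layouts seen in the "Autotuning format combination" headers), and the per-format winners feed the later graph-level format selection. All of these measurements are redone on every build unless they are cached; trtexec exposes this via --timingCacheFile, and through the API the same is done with a timing cache attached to the builder config. A minimal sketch, assuming the TensorRT 8.x Python bindings (cache path illustrative; builder and network as in the previous sketch):

import os
import tensorrt as trt

# Hedged sketch: persist the tactic timings above so a rebuild can skip
# re-measuring layers whose timing entries are already in the cache.
def build_with_timing_cache(builder, network, path="conv_tactics.cache"):
    config = builder.create_builder_config()
    config.set_flag(trt.BuilderFlag.INT8)
    config.set_flag(trt.BuilderFlag.SPARSE_WEIGHTS)

    blob = open(path, "rb").read() if os.path.exists(path) else b""
    cache = config.create_timing_cache(blob)      # b"" starts an empty cache
    config.set_timing_cache(cache, ignore_mismatch=False)

    engine_bytes = builder.build_serialized_network(network, config)

    with open(path, "wb") as f:                   # reuse on the next build
        f.write(memoryview(cache.serialize()))
    return engine_bytes

The cache is keyed to the device and TensorRT version, so mismatched entries are ignored (or rejected) rather than silently reused.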
[03/25/2022-13:24:32] [V] [TRT] *************** Autotuning format combination: Int8(100352,784:4,28,1) -> Int8(6272,196:32,14,1) *************** [03/25/2022-13:24:32] [V] [TRT] --------------- Timing Runner: sections.2.0.identity.conv.module.weight + QuantizeLinear_460_quantize_scale_node + Conv_464 (CaskConvolution) [03/25/2022-13:24:32] [V] [TRT] sections.2.0.identity.conv.module.weight + QuantizeLinear_460_quantize_scale_node + Conv_464 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_small_c32_nn_v1 Tactic: 1100922622480907544 [03/25/2022-13:24:32] [V] [TRT] Tactic: 1100922622480907544 Time: 0.767104 [03/25/2022-13:24:32] [V] [TRT] sections.2.0.identity.conv.module.weight + QuantizeLinear_460_quantize_scale_node + Conv_464 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x128_relu_interior_c32_nn_v1 Tactic: 2855900226702061782 [03/25/2022-13:24:32] [V] [TRT] Tactic: 2855900226702061782 Time: 0.395136 [03/25/2022-13:24:32] [V] [TRT] sections.2.0.identity.conv.module.weight + QuantizeLinear_460_quantize_scale_node + Conv_464 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x64_relu_medium_c32_nn_v1 Tactic: 3606311198834416176 [03/25/2022-13:24:32] [V] [TRT] Tactic: 3606311198834416176 Time: 0.5792 [03/25/2022-13:24:32] [V] [TRT] sections.2.0.identity.conv.module.weight + QuantizeLinear_460_quantize_scale_node + Conv_464 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x128_relu_small_c32_nn_v1 Tactic: 4325765560739862899 [03/25/2022-13:24:32] [V] [TRT] Tactic: 4325765560739862899 Time: 0.396544 [03/25/2022-13:24:32] [V] [TRT] sections.2.0.identity.conv.module.weight + QuantizeLinear_460_quantize_scale_node + Conv_464 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x64_relu_interior_c32_nn_v1 Tactic: 8803458114157674373 [03/25/2022-13:24:32] [V] [TRT] Tactic: 8803458114157674373 Time: 0.559872 [03/25/2022-13:24:32] [V] [TRT] sections.2.0.identity.conv.module.weight + QuantizeLinear_460_quantize_scale_node + Conv_464 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_interior_c32_nn_v1 Tactic: -6934773036503365000 [03/25/2022-13:24:32] [V] [TRT] Tactic: -6934773036503365000 Time: 0.675712 [03/25/2022-13:24:32] [V] [TRT] sections.2.0.identity.conv.module.weight + QuantizeLinear_460_quantize_scale_node + Conv_464 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_xregs_interior_c32_nn_v1 Tactic: -4431642509665791294 [03/25/2022-13:24:32] [V] [TRT] Tactic: -4431642509665791294 Time: 0.864512 [03/25/2022-13:24:32] [V] [TRT] sections.2.0.identity.conv.module.weight + QuantizeLinear_460_quantize_scale_node + Conv_464 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x128_relu_medium_c32_nn_v1 Tactic: -4255737803793506479 [03/25/2022-13:24:32] [V] [TRT] Tactic: -4255737803793506479 Time: 0.397056 [03/25/2022-13:24:32] [V] [TRT] sections.2.0.identity.conv.module.weight + QuantizeLinear_460_quantize_scale_node + Conv_464 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_xregs_small_c32_nn_v1 Tactic: -3958182351168863467 [03/25/2022-13:24:32] [V] [TRT] Tactic: -3958182351168863467 Time: 0.905088 [03/25/2022-13:24:32] [V] [TRT] sections.2.0.identity.conv.module.weight + QuantizeLinear_460_quantize_scale_node + Conv_464 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_xregs_medium_c32_nn_v1 Tactic: -3111968753064955248 [03/25/2022-13:24:32] [V] [TRT] Tactic: -3111968753064955248 Time: 0.825216 [03/25/2022-13:24:32] [V] [TRT] sections.2.0.identity.conv.module.weight + QuantizeLinear_460_quantize_scale_node + Conv_464 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_medium_c32_nn_v1 Tactic: -1492575840277333548 [03/25/2022-13:24:32] [V] [TRT] Tactic: -1492575840277333548 Time: 0.762368 [03/25/2022-13:24:32] [V] [TRT] sections.2.0.identity.conv.module.weight + QuantizeLinear_460_quantize_scale_node + Conv_464 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x64_relu_small_c32_nn_v1 Tactic: -868495160148524802 [03/25/2022-13:24:32] [V] [TRT] Tactic: -868495160148524802 Time: 0.581632 [03/25/2022-13:24:32] [V] [TRT] Fastest Tactic: 2855900226702061782 Time: 0.395136 [03/25/2022-13:24:32] [V] [TRT] >>>>>>>>>>>>>>> Chose Runner Type: CaskConvolution Tactic: 2855900226702061782 [03/25/2022-13:24:32] [V] [TRT] *************** Autotuning format combination: Int8(12544,784:32,28,1) -> Int8(6272,196:32,14,1) *************** [03/25/2022-13:24:32] [V] [TRT] --------------- Timing Runner: sections.2.0.identity.conv.module.weight + QuantizeLinear_460_quantize_scale_node + Conv_464 (CudaGroupConvolution) [03/25/2022-13:24:32] [V] [TRT] CudaGroupConvolution has no valid tactics for this config, skipping [03/25/2022-13:24:32] [V] [TRT] --------------- Timing Runner: sections.2.0.identity.conv.module.weight + QuantizeLinear_460_quantize_scale_node + Conv_464 (CudaDepthwiseConvolution) [03/25/2022-13:24:32] [V] [TRT] CudaDepthwiseConvolution has no valid tactics for this config, skipping [03/25/2022-13:24:32] [V] [TRT] --------------- Timing Runner: sections.2.0.identity.conv.module.weight +
QuantizeLinear_460_quantize_scale_node + Conv_464 (FusedConvActConvolution) [03/25/2022-13:24:32] [V] [TRT] FusedConvActConvolution has no valid tactics for this config, skipping [03/25/2022-13:24:32] [V] [TRT] --------------- Timing Runner: sections.2.0.identity.conv.module.weight + QuantizeLinear_460_quantize_scale_node + Conv_464 (CaskConvolution) [03/25/2022-13:24:32] [V] [TRT] sections.2.0.identity.conv.module.weight + QuantizeLinear_460_quantize_scale_node + Conv_464 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage1_warpsize4x1x1_g1_tensor8x8x16_t1r1s1 Tactic: 68468667201176803 [03/25/2022-13:24:32] [V] [TRT] Tactic: 68468667201176803 Time: 0.142592 [03/25/2022-13:24:32] [V] [TRT] sections.2.0.identity.conv.module.weight + QuantizeLinear_460_quantize_scale_node + Conv_464 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x192x64_stage3_warpsize4x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 125145153013230687 [03/25/2022-13:24:32] [V] [TRT] Tactic: 125145153013230687 Time: 0.116224 [03/25/2022-13:24:32] [V] [TRT] sections.2.0.identity.conv.module.weight + QuantizeLinear_460_quantize_scale_node + Conv_464 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32 Tactic: 328135613486708155 [03/25/2022-13:24:32] [V] [TRT] Tactic: 328135613486708155 Time: 0.24256 [03/25/2022-13:24:32] [V] [TRT] sections.2.0.identity.conv.module.weight + QuantizeLinear_460_quantize_scale_node + Conv_464 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x128_ldg16_relu_interior_nt_v1 Tactic: 434957160407688216 [03/25/2022-13:24:32] [V] [TRT] Tactic: 434957160407688216 Time: 0.167296 [03/25/2022-13:24:32] [V] [TRT] sections.2.0.identity.conv.module.weight + QuantizeLinear_460_quantize_scale_node + Conv_464 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x128_stage3_warpsize4x2x1_g1_sptensor16x8x64_t1r1s1 Tactic: 857001784974286465 [03/25/2022-13:24:32] [V] [TRT] Tactic: 857001784974286465 Time: 0.086272 [03/25/2022-13:24:32] [V] [TRT] sections.2.0.identity.conv.module.weight + QuantizeLinear_460_quantize_scale_node + Conv_464 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x192x64_stage3_warpsize4x2x1_g1_tensor16x8x32 Tactic: 1111159740952609683 [03/25/2022-13:24:32] [V] [TRT] Tactic: 1111159740952609683 Time: 0.124288 [03/25/2022-13:24:32] [V] [TRT] sections.2.0.identity.conv.module.weight + QuantizeLinear_460_quantize_scale_node + Conv_464 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32 Tactic: 1134860903395928905 [03/25/2022-13:24:32] [V] [TRT] Tactic: 1134860903395928905 Time: 0.122496 [03/25/2022-13:24:32] [V] [TRT] sections.2.0.identity.conv.module.weight + QuantizeLinear_460_quantize_scale_node + Conv_464 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32 Tactic: 1276591930377039442 [03/25/2022-13:24:32] [V] [TRT] Tactic: 1276591930377039442 Time: 0.144896 [03/25/2022-13:24:32] [V] [TRT] 
sections.2.0.identity.conv.module.weight + QuantizeLinear_460_quantize_scale_node + Conv_464 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x96x64_stage5_warpsize2x1x1_g1_tensor16x8x32 Tactic: 1399501420456320585 [03/25/2022-13:24:32] [V] [TRT] Tactic: 1399501420456320585 Time: 0.144256 [03/25/2022-13:24:32] [V] [TRT] sections.2.0.identity.conv.module.weight + QuantizeLinear_460_quantize_scale_node + Conv_464 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 1583811548148740665 [03/25/2022-13:24:32] [V] [TRT] Tactic: 1583811548148740665 Time: 0.11072 [03/25/2022-13:24:32] [V] [TRT] sections.2.0.identity.conv.module.weight + QuantizeLinear_460_quantize_scale_node + Conv_464 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 1701344857577810806 [03/25/2022-13:24:32] [V] [TRT] Tactic: 1701344857577810806 Time: 0.117632 [03/25/2022-13:24:32] [V] [TRT] sections.2.0.identity.conv.module.weight + QuantizeLinear_460_quantize_scale_node + Conv_464 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x32x64_stage4_warpsize2x1x1_g1_tensor16x8x32_t1r1s1 Tactic: 1797231177354918208 [03/25/2022-13:24:32] [V] [TRT] Tactic: 1797231177354918208 Time: 0.173312 [03/25/2022-13:24:32] [V] [TRT] sections.2.0.identity.conv.module.weight + QuantizeLinear_460_quantize_scale_node + Conv_464 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage1_warpsize2x2x1_g1_tensor8x8x16 Tactic: 2133329569091732311 [03/25/2022-13:24:32] [V] [TRT] Tactic: 2133329569091732311 Time: 0.146688 [03/25/2022-13:24:32] [V] [TRT] sections.2.0.identity.conv.module.weight + QuantizeLinear_460_quantize_scale_node + Conv_464 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x128_stage3_warpsize4x2x1_g1_sptensor16x8x64_no_preds Tactic: 2186058294798640800 [03/25/2022-13:24:32] [V] [TRT] Tactic: 2186058294798640800 Time: 0.090368 [03/25/2022-13:24:32] [V] [TRT] sections.2.0.identity.conv.module.weight + QuantizeLinear_460_quantize_scale_node + Conv_464 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage4_warpsize4x1x1_g1_tensor16x8x32 Tactic: 2325023763229477890 [03/25/2022-13:24:32] [V] [TRT] Tactic: 2325023763229477890 Time: 0.127104 [03/25/2022-13:24:32] [V] [TRT] sections.2.0.identity.conv.module.weight + QuantizeLinear_460_quantize_scale_node + Conv_464 Set Tactic Name: ampere_int8_i8816cudnn_int8_128x128_ldg16_relu_interior_nt_v1 Tactic: 2346437292116182513 [03/25/2022-13:24:32] [V] [TRT] Tactic: 2346437292116182513 Time: 0.141696 [03/25/2022-13:24:32] [V] [TRT] sections.2.0.identity.conv.module.weight + QuantizeLinear_460_quantize_scale_node + Conv_464 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x128_stage3_warpsize2x2x1_g1_sptensor16x8x64_no_preds Tactic: 2434539343777234419 [03/25/2022-13:24:32] [V] [TRT] Tactic: 2434539343777234419 Time: 0.091136 [03/25/2022-13:24:32] 
[V] [TRT] sections.2.0.identity.conv.module.weight + QuantizeLinear_460_quantize_scale_node + Conv_464 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_singleBuffer_interior_nt_v1 Tactic: 2522133112320625287 [03/25/2022-13:24:32] [V] [TRT] Tactic: 2522133112320625287 Time: 0.141568 [03/25/2022-13:24:32] [V] [TRT] sections.2.0.identity.conv.module.weight + QuantizeLinear_460_quantize_scale_node + Conv_464 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16 Tactic: 2579824863892891529 [03/25/2022-13:24:32] [V] [TRT] Tactic: 2579824863892891529 Time: 0.211456 [03/25/2022-13:24:32] [V] [TRT] sections.2.0.identity.conv.module.weight + QuantizeLinear_460_quantize_scale_node + Conv_464 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x64x64_stage3_warpsize4x1x1_g1_tensor16x8x32 Tactic: 2783960536172159663 [03/25/2022-13:24:32] [V] [TRT] Tactic: 2783960536172159663 Time: 0.112896 [03/25/2022-13:24:32] [V] [TRT] sections.2.0.identity.conv.module.weight + QuantizeLinear_460_quantize_scale_node + Conv_464 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x2x1_g1_tensor16x8x32 Tactic: 2821711838552913693 [03/25/2022-13:24:32] [V] [TRT] Tactic: 2821711838552913693 Time: 0.103808 [03/25/2022-13:24:32] [V] [TRT] sections.2.0.identity.conv.module.weight + QuantizeLinear_460_quantize_scale_node + Conv_464 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x128_stage3_warpsize1x2x1_g1_sptensor16x8x64 Tactic: 2945009978756227538 [03/25/2022-13:24:32] [V] [TRT] Tactic: 2945009978756227538 Time: 0.117504 [03/25/2022-13:24:32] [V] [TRT] sections.2.0.identity.conv.module.weight + QuantizeLinear_460_quantize_scale_node + Conv_464 Set Tactic Name: ampere_int8_i8816cudnn_int8_128x128_ldg16_relu_medium_nt_v1 Tactic: 2985940154541537814 [03/25/2022-13:24:32] [V] [TRT] Tactic: 2985940154541537814 Time: 0.140288 [03/25/2022-13:24:32] [V] [TRT] sections.2.0.identity.conv.module.weight + QuantizeLinear_460_quantize_scale_node + Conv_464 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r1s1 Tactic: 3242897809704328258 [03/25/2022-13:24:32] [V] [TRT] Tactic: 3242897809704328258 Time: 0.146432 [03/25/2022-13:24:32] [V] [TRT] sections.2.0.identity.conv.module.weight + QuantizeLinear_460_quantize_scale_node + Conv_464 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32 Tactic: 3456719996792527006 [03/25/2022-13:24:32] [V] [TRT] Tactic: 3456719996792527006 Time: 0.122112 [03/25/2022-13:24:32] [V] [TRT] sections.2.0.identity.conv.module.weight + QuantizeLinear_460_quantize_scale_node + Conv_464 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 3538565962642681625 [03/25/2022-13:24:32] [V] [TRT] Tactic: 3538565962642681625 Time: 0.122112 [03/25/2022-13:24:32] [V] [TRT] sections.2.0.identity.conv.module.weight + QuantizeLinear_460_quantize_scale_node + Conv_464 Set 
Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x128_stage3_warpsize4x2x1_g1_sptensor16x8x64 Tactic: 3651043333819148268 [03/25/2022-13:24:32] [V] [TRT] Tactic: 3651043333819148268 Time: 0.107776 [03/25/2022-13:24:32] [V] [TRT] sections.2.0.identity.conv.module.weight + QuantizeLinear_460_quantize_scale_node + Conv_464 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x128_stage3_warpsize1x4x1_g1_sptensor16x8x64_no_preds Tactic: 3866129666720518662 [03/25/2022-13:24:32] [V] [TRT] Tactic: 3866129666720518662 Time: 0.118016 [03/25/2022-13:24:32] [V] [TRT] sections.2.0.identity.conv.module.weight + QuantizeLinear_460_quantize_scale_node + Conv_464 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x128_ldg16_relu_medium_nt_v1 Tactic: 3899284354987683408 [03/25/2022-13:24:32] [V] [TRT] Tactic: 3899284354987683408 Time: 0.163072 [03/25/2022-13:24:32] [V] [TRT] sections.2.0.identity.conv.module.weight + QuantizeLinear_460_quantize_scale_node + Conv_464 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage6_warpsize2x2x1_g1_tensor16x8x32 Tactic: 4042202769383439184 [03/25/2022-13:24:32] [V] [TRT] Tactic: 4042202769383439184 Time: 0.119936 [03/25/2022-13:24:32] [V] [TRT] sections.2.0.identity.conv.module.weight + QuantizeLinear_460_quantize_scale_node + Conv_464 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x128_stage3_warpsize2x2x1_g1_sptensor16x8x64 Tactic: 4414594337986714263 [03/25/2022-13:24:32] [V] [TRT] Tactic: 4414594337986714263 Time: 0.107264 [03/25/2022-13:24:32] [V] [TRT] sections.2.0.identity.conv.module.weight + QuantizeLinear_460_quantize_scale_node + Conv_464 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_small_nt_v1 Tactic: 4717285412741024953 [03/25/2022-13:24:32] [V] [TRT] Tactic: 4717285412741024953 Time: 0.147072 [03/25/2022-13:24:32] [V] [TRT] sections.2.0.identity.conv.module.weight + QuantizeLinear_460_quantize_scale_node + Conv_464 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage4_warpsize2x4x1_g1_tensor16x8x32 Tactic: 4734519122557206480 [03/25/2022-13:24:32] [V] [TRT] Tactic: 4734519122557206480 Time: 0.11072 [03/25/2022-13:24:32] [V] [TRT] sections.2.0.identity.conv.module.weight + QuantizeLinear_460_quantize_scale_node + Conv_464 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x128_stage3_warpsize2x2x1_g1_sptensor16x8x64_t1r1s1_no_preds Tactic: 4909502217677847353 [03/25/2022-13:24:32] [V] [TRT] Tactic: 4909502217677847353 Time: 0.081152 [03/25/2022-13:24:32] [V] [TRT] sections.2.0.identity.conv.module.weight + QuantizeLinear_460_quantize_scale_node + Conv_464 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32 Tactic: 4922297020351187339 [03/25/2022-13:24:32] [V] [TRT] Tactic: 4922297020351187339 Time: 0.115456 [03/25/2022-13:24:32] [V] [TRT] sections.2.0.identity.conv.module.weight + QuantizeLinear_460_quantize_scale_node + Conv_464 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_interior_nt_v1 Tactic: 
5126565865931538390 [03/25/2022-13:24:32] [V] [TRT] Tactic: 5126565865931538390 Time: 0.148352 [03/25/2022-13:24:32] [V] [TRT] sections.2.0.identity.conv.module.weight + QuantizeLinear_460_quantize_scale_node + Conv_464 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x16x128_stage3_warpsize1x1x1_g1_sptensor16x8x64_no_preds Tactic: 5380489069875971144 [03/25/2022-13:24:32] [V] [TRT] Tactic: 5380489069875971144 Time: 0.185728 [03/25/2022-13:24:32] [V] [TRT] sections.2.0.identity.conv.module.weight + QuantizeLinear_460_quantize_scale_node + Conv_464 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x32x64_stage4_warpsize2x1x1_g1_tensor16x8x32 Tactic: 5424417905073460656 [03/25/2022-13:24:32] [V] [TRT] Tactic: 5424417905073460656 Time: 0.15872 [03/25/2022-13:24:32] [V] [TRT] sections.2.0.identity.conv.module.weight + QuantizeLinear_460_quantize_scale_node + Conv_464 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x256x64_stage3_warpsize2x4x1_g1_tensor16x8x32 Tactic: 5442043907221427810 [03/25/2022-13:24:33] [V] [TRT] Tactic: 5442043907221427810 Time: 0.11072 [03/25/2022-13:24:33] [V] [TRT] sections.2.0.identity.conv.module.weight + QuantizeLinear_460_quantize_scale_node + Conv_464 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x128_stage3_warpsize1x1x1_g1_sptensor16x8x64_no_preds Tactic: 5698083265414543143 [03/25/2022-13:24:33] [V] [TRT] Tactic: 5698083265414543143 Time: 0.146944 [03/25/2022-13:24:33] [V] [TRT] sections.2.0.identity.conv.module.weight + QuantizeLinear_460_quantize_scale_node + Conv_464 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x128x64_stage3_warpsize4x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 6007888770437705057 [03/25/2022-13:24:33] [V] [TRT] Tactic: 6007888770437705057 Time: 0.104448 [03/25/2022-13:24:33] [V] [TRT] sections.2.0.identity.conv.module.weight + QuantizeLinear_460_quantize_scale_node + Conv_464 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage1_warpsize2x1x1_g1_tensor8x8x16 Tactic: 6394572396369862482 [03/25/2022-13:24:33] [V] [TRT] Tactic: 6394572396369862482 Time: 0.252288 [03/25/2022-13:24:33] [V] [TRT] sections.2.0.identity.conv.module.weight + QuantizeLinear_460_quantize_scale_node + Conv_464 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32_t1r1s1 Tactic: 6405251167055673379 [03/25/2022-13:24:33] [V] [TRT] Tactic: 6405251167055673379 Time: 0.107008 [03/25/2022-13:24:33] [V] [TRT] sections.2.0.identity.conv.module.weight + QuantizeLinear_460_quantize_scale_node + Conv_464 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage4_warpsize4x1x1_g1_tensor16x8x32_t1r1s1 Tactic: 6433368103202497147 [03/25/2022-13:24:33] [V] [TRT] Tactic: 6433368103202497147 Time: 0.109056 [03/25/2022-13:24:33] [V] [TRT] sections.2.0.identity.conv.module.weight + QuantizeLinear_460_quantize_scale_node + Conv_464 Set Tactic Name: 
sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 6441948709525127755 [03/25/2022-13:24:33] [V] [TRT] Tactic: 6441948709525127755 Time: 0.1568 [03/25/2022-13:24:33] [V] [TRT] sections.2.0.identity.conv.module.weight + QuantizeLinear_460_quantize_scale_node + Conv_464 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 6457435868048963632 [03/25/2022-13:24:33] [V] [TRT] Tactic: 6457435868048963632 Time: 0.103936 [03/25/2022-13:24:33] [V] [TRT] sections.2.0.identity.conv.module.weight + QuantizeLinear_460_quantize_scale_node + Conv_464 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x128_stage3_warpsize1x1x1_g1_sptensor16x8x64_t1r1s1 Tactic: 6510345569544721081 [03/25/2022-13:24:33] [V] [TRT] Tactic: 6510345569544721081 Time: 0.14592 [03/25/2022-13:24:33] [V] [TRT] sections.2.0.identity.conv.module.weight + QuantizeLinear_460_quantize_scale_node + Conv_464 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32 Tactic: 6781129591847482048 [03/25/2022-13:24:33] [V] [TRT] Tactic: 6781129591847482048 Time: 0.150016 [03/25/2022-13:24:33] [V] [TRT] sections.2.0.identity.conv.module.weight + QuantizeLinear_460_quantize_scale_node + Conv_464 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 6925201228918187099 [03/25/2022-13:24:33] [V] [TRT] Tactic: 6925201228918187099 Time: 0.086144 [03/25/2022-13:24:33] [V] [TRT] sections.2.0.identity.conv.module.weight + QuantizeLinear_460_quantize_scale_node + Conv_464 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 6991524515605108718 [03/25/2022-13:24:33] [V] [TRT] Tactic: 6991524515605108718 Time: 0.119552 [03/25/2022-13:24:33] [V] [TRT] sections.2.0.identity.conv.module.weight + QuantizeLinear_460_quantize_scale_node + Conv_464 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x128x64_stage3_warpsize4x2x1_g1_tensor16x8x32 Tactic: 7077570591813340966 [03/25/2022-13:24:33] [V] [TRT] Tactic: 7077570591813340966 Time: 0.115968 [03/25/2022-13:24:33] [V] [TRT] sections.2.0.identity.conv.module.weight + QuantizeLinear_460_quantize_scale_node + Conv_464 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x64x64_stage3_warpsize4x1x1_g1_tensor16x8x32_t1r1s1 Tactic: 7318929579222925725 [03/25/2022-13:24:33] [V] [TRT] Tactic: 7318929579222925725 Time: 0.111104 [03/25/2022-13:24:33] [V] [TRT] sections.2.0.identity.conv.module.weight + QuantizeLinear_460_quantize_scale_node + Conv_464 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x128_stage3_warpsize1x4x1_g1_sptensor16x8x64_t1r1s1_no_preds Tactic: 7886967395128926382 [03/25/2022-13:24:33] [V] [TRT] Tactic: 7886967395128926382 Time: 0.116096 [03/25/2022-13:24:33] 
[V] [TRT] sections.2.0.identity.conv.module.weight + QuantizeLinear_460_quantize_scale_node + Conv_464 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage4_warpsize4x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 8234775147403903473 [03/25/2022-13:24:33] [V] [TRT] Tactic: 8234775147403903473 Time: 0.10752 [03/25/2022-13:24:33] [V] [TRT] sections.2.0.identity.conv.module.weight + QuantizeLinear_460_quantize_scale_node + Conv_464 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x128x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 8765382722978397630 [03/25/2022-13:24:33] [V] [TRT] Tactic: 8765382722978397630 Time: 0.090624 [03/25/2022-13:24:33] [V] [TRT] sections.2.0.identity.conv.module.weight + QuantizeLinear_460_quantize_scale_node + Conv_464 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x16x128_stage3_warpsize1x1x1_g1_sptensor16x8x64_t1r1s1 Tactic: 9062173295331155069 [03/25/2022-13:24:33] [V] [TRT] Tactic: 9062173295331155069 Time: 0.18432 [03/25/2022-13:24:33] [V] [TRT] sections.2.0.identity.conv.module.weight + QuantizeLinear_460_quantize_scale_node + Conv_464 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_indexed_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32 Tactic: 9064458886956700976 [03/25/2022-13:24:33] [V] [TRT] Tactic: 9064458886956700976 Time: 0.103936 [03/25/2022-13:24:33] [V] [TRT] sections.2.0.identity.conv.module.weight + QuantizeLinear_460_quantize_scale_node + Conv_464 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage4_warpsize4x2x1_g1_tensor16x8x32 Tactic: -9165697322068360861 [03/25/2022-13:24:33] [V] [TRT] Tactic: -9165697322068360861 Time: 0.12864 [03/25/2022-13:24:33] [V] [TRT] sections.2.0.identity.conv.module.weight + QuantizeLinear_460_quantize_scale_node + Conv_464 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_singleBuffer_small_nt_v1 Tactic: -9118785798277698619 [03/25/2022-13:24:33] [V] [TRT] Tactic: -9118785798277698619 Time: 0.143104 [03/25/2022-13:24:33] [V] [TRT] sections.2.0.identity.conv.module.weight + QuantizeLinear_460_quantize_scale_node + Conv_464 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16 Tactic: -9108166971364503411 [03/25/2022-13:24:33] [V] [TRT] Tactic: -9108166971364503411 Time: 0.171008 [03/25/2022-13:24:33] [V] [TRT] sections.2.0.identity.conv.module.weight + QuantizeLinear_460_quantize_scale_node + Conv_464 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r1s1 Tactic: -8943710627305202139 [03/25/2022-13:24:33] [V] [TRT] Tactic: -8943710627305202139 Time: 0.140672 [03/25/2022-13:24:33] [V] [TRT] sections.2.0.identity.conv.module.weight + QuantizeLinear_460_quantize_scale_node + Conv_464 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage1_warpsize4x2x1_g1_tensor8x8x16 Tactic: -8861822316054763526 [03/25/2022-13:24:33] [V] [TRT] Tactic: -8861822316054763526 Time: 0.15936 
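(Editor's note: the kernel names being timed here encode their launch configuration directly: target architecture (sm80 vs. the older sm75 kernels), CTA tile size MxNxK, pipeline stage count, warp arrangement, group count, and the MMA instruction shape — tensor16x8x32 for dense Tensor Core kernels, sptensor16x8x64 for the structured-sparse ones; suffixes such as _t1r1s1 and _no_preds appear to mark filter-size specializations and predicate-free variants. A hypothetical decoder, with field meanings read off the names themselves rather than any documented schema:

import re

# Hypothetical decoder for the cask kernel names timed in this log. Field
# meanings are inferred from the names (tile / stages / warps / MMA shape),
# not from any official schema.
FIELDS = re.compile(
    r"sm(?P<arch>\d+).*?"
    r"tilesize(?P<tile>\d+x\d+x\d+)_stage(?P<stages>\d+)"
    r"_warpsize(?P<warps>\d+x\d+x\d+)_g(?P<groups>\d+)"
    r"_(?P<sparse>sp)?tensor(?P<mma>\d+x\d+x\d+)")

name = ("sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32"
        "kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x128_stage3_warpsize2x2x1"
        "_g1_sptensor16x8x64_t1r1s1_no_preds")
print(FIELDS.search(name).groupdict())
# {'arch': '80', 'tile': '128x128x128', 'stages': '3', 'warps': '2x2x1',
#  'groups': '1', 'sparse': 'sp', 'mma': '16x8x64'}

End of note; the log continues below.)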
[03/25/2022-13:24:33] [V] [TRT] sections.2.0.identity.conv.module.weight + QuantizeLinear_460_quantize_scale_node + Conv_464 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x128_stage3_warpsize1x1x1_g1_sptensor16x8x64 Tactic: -8791277710877987710 [03/25/2022-13:24:33] [V] [TRT] Tactic: -8791277710877987710 Time: 0.147456 [03/25/2022-13:24:33] [V] [TRT] sections.2.0.identity.conv.module.weight + QuantizeLinear_460_quantize_scale_node + Conv_464 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x128x64_stage3_warpsize2x2x1_g1_tensor16x8x32 Tactic: -8691377209893505057 [03/25/2022-13:24:33] [V] [TRT] Tactic: -8691377209893505057 Time: 0.095488 [03/25/2022-13:24:33] [V] [TRT] sections.2.0.identity.conv.module.weight + QuantizeLinear_460_quantize_scale_node + Conv_464 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: -8638624340850784688 [03/25/2022-13:24:33] [V] [TRT] Tactic: -8638624340850784688 Time: 0.139264 [03/25/2022-13:24:33] [V] [TRT] sections.2.0.identity.conv.module.weight + QuantizeLinear_460_quantize_scale_node + Conv_464 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32 Tactic: -8520292213102999339 [03/25/2022-13:24:33] [V] [TRT] Tactic: -8520292213102999339 Time: 0.137728 [03/25/2022-13:24:33] [V] [TRT] sections.2.0.identity.conv.module.weight + QuantizeLinear_460_quantize_scale_node + Conv_464 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32 Tactic: -8263994888336646547 [03/25/2022-13:24:33] [V] [TRT] Tactic: -8263994888336646547 Time: 0.101504 [03/25/2022-13:24:33] [V] [TRT] sections.2.0.identity.conv.module.weight + QuantizeLinear_460_quantize_scale_node + Conv_464 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32 Tactic: -8205948405243401049 [03/25/2022-13:24:33] [V] [TRT] Tactic: -8205948405243401049 Time: 0.176512 [03/25/2022-13:24:33] [V] [TRT] sections.2.0.identity.conv.module.weight + QuantizeLinear_460_quantize_scale_node + Conv_464 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage1_warpsize4x2x1_g1_tensor8x8x16_t1r1s1 Tactic: -8172318747337038866 [03/25/2022-13:24:33] [V] [TRT] Tactic: -8172318747337038866 Time: 0.15488 [03/25/2022-13:24:33] [V] [TRT] sections.2.0.identity.conv.module.weight + QuantizeLinear_460_quantize_scale_node + Conv_464 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x96x64_stage5_warpsize2x1x1_g1_tensor16x8x32_t1r1s1 Tactic: -7844028314176826857 [03/25/2022-13:24:33] [V] [TRT] Tactic: -7844028314176826857 Time: 0.137216 [03/25/2022-13:24:33] [V] [TRT] sections.2.0.identity.conv.module.weight + QuantizeLinear_460_quantize_scale_node + Conv_464 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32 
Tactic: -7683887278997527517 [03/25/2022-13:24:33] [V] [TRT] Tactic: -7683887278997527517 Time: 0.205056 [03/25/2022-13:24:33] [V] [TRT] sections.2.0.identity.conv.module.weight + QuantizeLinear_460_quantize_scale_node + Conv_464 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x64x64_stage3_warpsize2x2x1_g1_tensor16x8x32 Tactic: -7381370635708568663 [03/25/2022-13:24:33] [V] [TRT] Tactic: -7381370635708568663 Time: 0.117376 [03/25/2022-13:24:33] [V] [TRT] sections.2.0.identity.conv.module.weight + QuantizeLinear_460_quantize_scale_node + Conv_464 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage1_warpsize2x1x1_g1_tensor8x8x16_t1r1s1 Tactic: -7361755530333096258 [03/25/2022-13:24:33] [V] [TRT] Tactic: -7361755530333096258 Time: 0.184576 [03/25/2022-13:24:33] [V] [TRT] sections.2.0.identity.conv.module.weight + QuantizeLinear_460_quantize_scale_node + Conv_464 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x128_stage3_warpsize1x4x1_g1_sptensor16x8x64_t1r1s1 Tactic: -7289760022626653388 [03/25/2022-13:24:33] [V] [TRT] Tactic: -7289760022626653388 Time: 0.119808 [03/25/2022-13:24:33] [V] [TRT] sections.2.0.identity.conv.module.weight + QuantizeLinear_460_quantize_scale_node + Conv_464 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x128_stage3_warpsize1x2x1_g1_sptensor16x8x64_t1r1s1_no_preds Tactic: -6812830108414456369 [03/25/2022-13:24:33] [V] [TRT] Tactic: -6812830108414456369 Time: 0.119424 [03/25/2022-13:24:33] [V] [TRT] sections.2.0.identity.conv.module.weight + QuantizeLinear_460_quantize_scale_node + Conv_464 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_t1r1s1 Tactic: -6510232214299595844 [03/25/2022-13:24:33] [V] [TRT] Tactic: -6510232214299595844 Time: 0.181376 [03/25/2022-13:24:33] [V] [TRT] sections.2.0.identity.conv.module.weight + QuantizeLinear_460_quantize_scale_node + Conv_464 Set Tactic Name: ampere_int8_i8816cudnn_int8_128x128_ldg16_relu_small_nt_v1 Tactic: -6400348606759295499 [03/25/2022-13:24:33] [V] [TRT] Tactic: -6400348606759295499 Time: 0.134016 [03/25/2022-13:24:33] [V] [TRT] sections.2.0.identity.conv.module.weight + QuantizeLinear_460_quantize_scale_node + Conv_464 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x96x64_stage4_warpsize4x2x1_g1_tensor16x8x32 Tactic: -6256128573036943404 [03/25/2022-13:24:33] [V] [TRT] Tactic: -6256128573036943404 Time: 0.159872 [03/25/2022-13:24:33] [V] [TRT] sections.2.0.identity.conv.module.weight + QuantizeLinear_460_quantize_scale_node + Conv_464 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x128_ldg16_relu_small_nt_v1 Tactic: -5980889159865208399 [03/25/2022-13:24:33] [V] [TRT] Tactic: -5980889159865208399 Time: 0.158976 [03/25/2022-13:24:33] [V] [TRT] sections.2.0.identity.conv.module.weight + QuantizeLinear_460_quantize_scale_node + Conv_464 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_singleBuffer_medium_nt_v1 Tactic: -5766140806760372989 [03/25/2022-13:24:33] [V] [TRT] Tactic: -5766140806760372989 Time: 0.145792 [03/25/2022-13:24:33] [V] [TRT] 
sections.2.0.identity.conv.module.weight + QuantizeLinear_460_quantize_scale_node + Conv_464 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r1s1 Tactic: -5697614955743334137 [03/25/2022-13:24:33] [V] [TRT] Tactic: -5697614955743334137 Time: 0.132096 [03/25/2022-13:24:33] [V] [TRT] sections.2.0.identity.conv.module.weight + QuantizeLinear_460_quantize_scale_node + Conv_464 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r1s1 Tactic: -5311474420963248369 [03/25/2022-13:24:33] [V] [TRT] Tactic: -5311474420963248369 Time: 0.197504 [03/25/2022-13:24:33] [V] [TRT] sections.2.0.identity.conv.module.weight + QuantizeLinear_460_quantize_scale_node + Conv_464 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage1_warpsize4x1x1_g1_tensor8x8x16 Tactic: -5180570335464125033 [03/25/2022-13:24:33] [V] [TRT] Tactic: -5180570335464125033 Time: 0.156544 [03/25/2022-13:24:33] [V] [TRT] sections.2.0.identity.conv.module.weight + QuantizeLinear_460_quantize_scale_node + Conv_464 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_medium_nt_v1 Tactic: -4516822589357530549 [03/25/2022-13:24:33] [V] [TRT] Tactic: -4516822589357530549 Time: 0.152448 [03/25/2022-13:24:33] [V] [TRT] sections.2.0.identity.conv.module.weight + QuantizeLinear_460_quantize_scale_node + Conv_464 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32_t1r1s1 Tactic: -4425346730823666456 [03/25/2022-13:24:33] [V] [TRT] Tactic: -4425346730823666456 Time: 0.134784 [03/25/2022-13:24:33] [V] [TRT] sections.2.0.identity.conv.module.weight + QuantizeLinear_460_quantize_scale_node + Conv_464 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage1_warpsize2x1x1_g1_tensor8x8x16_t1r1s1 Tactic: -4260476497340370474 [03/25/2022-13:24:33] [V] [TRT] Tactic: -4260476497340370474 Time: 0.231552 [03/25/2022-13:24:33] [V] [TRT] sections.2.0.identity.conv.module.weight + QuantizeLinear_460_quantize_scale_node + Conv_464 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x16x128_stage3_warpsize1x1x1_g1_sptensor16x8x64 Tactic: -4232916483289779353 [03/25/2022-13:24:33] [V] [TRT] Tactic: -4232916483289779353 Time: 0.187776 [03/25/2022-13:24:33] [V] [TRT] sections.2.0.identity.conv.module.weight + QuantizeLinear_460_quantize_scale_node + Conv_464 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x2x1_g1_tensor16x8x32_t1r1s1 Tactic: -4182501876984672402 [03/25/2022-13:24:33] [V] [TRT] Tactic: -4182501876984672402 Time: 0.103936 [03/25/2022-13:24:33] [V] [TRT] sections.2.0.identity.conv.module.weight + QuantizeLinear_460_quantize_scale_node + Conv_464 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x256x64_stage3_warpsize2x4x1_g1_tensor16x8x32_t1r1s1 Tactic: -4151617293257698859 [03/25/2022-13:24:33] [V] [TRT] Tactic: -4151617293257698859 Time: 0.10304 
[03/25/2022-13:24:33] [V] [TRT] sections.2.0.identity.conv.module.weight + QuantizeLinear_460_quantize_scale_node + Conv_464 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x128_stage3_warpsize4x2x1_g1_sptensor16x8x64_t1r1s1_no_preds Tactic: -3862908719298381451 [03/25/2022-13:24:33] [V] [TRT] Tactic: -3862908719298381451 Time: 0.086272 [03/25/2022-13:24:33] [V] [TRT] sections.2.0.identity.conv.module.weight + QuantizeLinear_460_quantize_scale_node + Conv_464 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_t1r1s1 Tactic: -3613322253849278738 [03/25/2022-13:24:33] [V] [TRT] Tactic: -3613322253849278738 Time: 0.202368 [03/25/2022-13:24:33] [V] [TRT] sections.2.0.identity.conv.module.weight + QuantizeLinear_460_quantize_scale_node + Conv_464 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage1_warpsize2x4x1_g1_tensor8x8x16_t1r1s1 Tactic: -3531681826488401618 [03/25/2022-13:24:33] [V] [TRT] Tactic: -3531681826488401618 Time: 0.152832 [03/25/2022-13:24:33] [V] [TRT] sections.2.0.identity.conv.module.weight + QuantizeLinear_460_quantize_scale_node + Conv_464 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x128_stage3_warpsize1x4x1_g1_sptensor16x8x64 Tactic: -3460842194336717186 [03/25/2022-13:24:33] [V] [TRT] Tactic: -3460842194336717186 Time: 0.134784 [03/25/2022-13:24:33] [V] [TRT] sections.2.0.identity.conv.module.weight + QuantizeLinear_460_quantize_scale_node + Conv_464 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x32x64_stage4_warpsize4x1x1_g1_tensor16x8x32_t1r1s1 Tactic: -2754311112012636251 [03/25/2022-13:24:33] [V] [TRT] Tactic: -2754311112012636251 Time: 0.165504 [03/25/2022-13:24:33] [V] [TRT] sections.2.0.identity.conv.module.weight + QuantizeLinear_460_quantize_scale_node + Conv_464 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage1_warpsize2x2x1_g1_tensor8x8x16 Tactic: -2499089240293650188 [03/25/2022-13:24:33] [V] [TRT] Tactic: -2499089240293650188 Time: 0.1408 [03/25/2022-13:24:33] [V] [TRT] sections.2.0.identity.conv.module.weight + QuantizeLinear_460_quantize_scale_node + Conv_464 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x128_stage3_warpsize2x2x1_g1_sptensor16x8x64_t1r1s1 Tactic: -2352253835013627337 [03/25/2022-13:24:33] [V] [TRT] Tactic: -2352253835013627337 Time: 0.081664 [03/25/2022-13:24:33] [V] [TRT] sections.2.0.identity.conv.module.weight + QuantizeLinear_460_quantize_scale_node + Conv_464 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16 Tactic: -2328318099174473157 [03/25/2022-13:24:33] [V] [TRT] Tactic: -2328318099174473157 Time: 0.189568 [03/25/2022-13:24:33] [V] [TRT] sections.2.0.identity.conv.module.weight + QuantizeLinear_460_quantize_scale_node + Conv_464 Set Tactic Name: 
sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage4_warpsize2x4x1_g1_tensor16x8x32_t1r1s1 Tactic: -2315453944962430928 [03/25/2022-13:24:33] [V] [TRT] Tactic: -2315453944962430928 Time: 0.104832 [03/25/2022-13:24:33] [V] [TRT] sections.2.0.identity.conv.module.weight + QuantizeLinear_460_quantize_scale_node + Conv_464 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x4x1_g1_tensor16x8x32 Tactic: -2054375205435666404 [03/25/2022-13:24:33] [V] [TRT] Tactic: -2054375205435666404 Time: 0.11904 [03/25/2022-13:24:33] [V] [TRT] sections.2.0.identity.conv.module.weight + QuantizeLinear_460_quantize_scale_node + Conv_464 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x16x128_stage3_warpsize1x1x1_g1_sptensor16x8x64_t1r1s1_no_preds Tactic: -1740762957710554518 [03/25/2022-13:24:33] [V] [TRT] Tactic: -1740762957710554518 Time: 0.182912 [03/25/2022-13:24:33] [V] [TRT] sections.2.0.identity.conv.module.weight + QuantizeLinear_460_quantize_scale_node + Conv_464 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x96x64_stage4_warpsize4x2x1_g1_tensor16x8x32_t1r1s1 Tactic: -1549742793039499659 [03/25/2022-13:24:33] [V] [TRT] Tactic: -1549742793039499659 Time: 0.138752 [03/25/2022-13:24:33] [V] [TRT] sections.2.0.identity.conv.module.weight + QuantizeLinear_460_quantize_scale_node + Conv_464 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x32x64_stage4_warpsize4x1x1_g1_tensor16x8x32 Tactic: -1498626619443284096 [03/25/2022-13:24:33] [V] [TRT] Tactic: -1498626619443284096 Time: 0.190592 [03/25/2022-13:24:33] [V] [TRT] sections.2.0.identity.conv.module.weight + QuantizeLinear_460_quantize_scale_node + Conv_464 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r1s1 Tactic: -1494157908358500249 [03/25/2022-13:24:33] [V] [TRT] Tactic: -1494157908358500249 Time: 0.166528 [03/25/2022-13:24:33] [V] [TRT] sections.2.0.identity.conv.module.weight + QuantizeLinear_460_quantize_scale_node + Conv_464 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x128_stage3_warpsize1x2x1_g1_sptensor16x8x64_no_preds Tactic: -1465330458665632513 [03/25/2022-13:24:33] [V] [TRT] Tactic: -1465330458665632513 Time: 0.119936 [03/25/2022-13:24:33] [V] [TRT] sections.2.0.identity.conv.module.weight + QuantizeLinear_460_quantize_scale_node + Conv_464 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x4x1_g1_tensor16x8x32_t1r1s1 Tactic: -1328736756812546664 [03/25/2022-13:24:33] [V] [TRT] Tactic: -1328736756812546664 Time: 0.113152 [03/25/2022-13:24:33] [V] [TRT] sections.2.0.identity.conv.module.weight + QuantizeLinear_460_quantize_scale_node + Conv_464 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32 Tactic: -1283580231568512025 [03/25/2022-13:24:33] [V] [TRT] Tactic: -1283580231568512025 Time: 0.24128 
[03/25/2022-13:24:33] [V] [TRT] sections.2.0.identity.conv.module.weight + QuantizeLinear_460_quantize_scale_node + Conv_464 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32 Tactic: -762222380308749469 [03/25/2022-13:24:33] [V] [TRT] Tactic: -762222380308749469 Time: 0.148992 [03/25/2022-13:24:33] [V] [TRT] sections.2.0.identity.conv.module.weight + QuantizeLinear_460_quantize_scale_node + Conv_464 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x128_stage3_warpsize1x1x1_g1_sptensor16x8x64_t1r1s1_no_preds Tactic: -619668460699260222 [03/25/2022-13:24:33] [V] [TRT] Tactic: -619668460699260222 Time: 0.145152 [03/25/2022-13:24:33] [V] [TRT] sections.2.0.identity.conv.module.weight + QuantizeLinear_460_quantize_scale_node + Conv_464 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x64x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: -405554772060757402 [03/25/2022-13:24:33] [V] [TRT] Tactic: -405554772060757402 Time: 0.117632 [03/25/2022-13:24:33] [V] [TRT] sections.2.0.identity.conv.module.weight + QuantizeLinear_460_quantize_scale_node + Conv_464 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: -375949437730908730 [03/25/2022-13:24:33] [V] [TRT] Tactic: -375949437730908730 Time: 0.123392 [03/25/2022-13:24:33] [V] [TRT] sections.2.0.identity.conv.module.weight + QuantizeLinear_460_quantize_scale_node + Conv_464 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage1_warpsize2x1x1_g1_tensor8x8x16 Tactic: -366411318217594794 [03/25/2022-13:24:33] [V] [TRT] Tactic: -366411318217594794 Time: 0.19136 [03/25/2022-13:24:33] [V] [TRT] sections.2.0.identity.conv.module.weight + QuantizeLinear_460_quantize_scale_node + Conv_464 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage1_warpsize2x4x1_g1_tensor8x8x16 Tactic: -351548418071036983 [03/25/2022-13:24:33] [V] [TRT] Tactic: -351548418071036983 Time: 0.155264 [03/25/2022-13:24:33] [V] [TRT] sections.2.0.identity.conv.module.weight + QuantizeLinear_460_quantize_scale_node + Conv_464 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x128_stage3_warpsize1x2x1_g1_sptensor16x8x64_t1r1s1 Tactic: -233227833606287806 [03/25/2022-13:24:33] [V] [TRT] Tactic: -233227833606287806 Time: 0.120448 [03/25/2022-13:24:33] [V] [TRT] Fastest Tactic: 4909502217677847353 Time: 0.081152 [03/25/2022-13:24:33] [V] [TRT] >>>>>>>>>>>>>>> Chose Runner Type: CaskConvolution Tactic: 4909502217677847353 [03/25/2022-13:24:33] [V] [TRT] =============== Computing costs for [03/25/2022-13:24:33] [V] [TRT] *************** Autotuning format combination: Int8(50176,784:4,28,1) -> Int8(12544,196:4,14,1) *************** [03/25/2022-13:24:33] [V] [TRT] --------------- Timing Runner: sections.2.0.conv2.module.weight + QuantizeLinear_431_quantize_scale_node + Conv_435 + Relu_437 (CudaDepthwiseConvolution) [03/25/2022-13:24:33] [V] [TRT] CudaDepthwiseConvolution has no valid tactics for this config, skipping 
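(Editor's note: each autotuning block ends the way the Conv_464 block just did — a "Fastest Tactic" verdict followed by the chosen runner type, here CaskConvolution with the sparse _t1r1s1_no_preds kernel at 0.081152 ms. When auditing long builds it can help to pull those verdicts out of a saved log; a sketch assuming the output was captured to a placeholder file trtexec.log:

import re

# Sketch: extract the autotuning verdicts from a saved verbose log. The
# patterns mirror the lines above; "trtexec.log" is a placeholder path, and
# the times are the builder's per-tactic averages in milliseconds.
FASTEST = re.compile(r"Fastest Tactic: (-?\d+) Time: ([\d.]+)")
CHOSEN = re.compile(r">+ Chose Runner Type: (\w+) Tactic: (-?\d+)")

log = open("trtexec.log").read()
for tactic, ms in FASTEST.findall(log):
    print(f"fastest tactic {tactic}: {ms} ms")
for runner, tactic in CHOSEN.findall(log):
    print(f"chosen runner {runner}, tactic {tactic}")

End of note; the log continues below.)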
[03/25/2022-13:24:33] [V] [TRT] --------------- Timing Runner: sections.2.0.conv2.module.weight + QuantizeLinear_431_quantize_scale_node + Conv_435 + Relu_437 (FusedConvActConvolution) [03/25/2022-13:24:33] [V] [TRT] FusedConvActConvolution has no valid tactics for this config, skipping [03/25/2022-13:24:33] [V] [TRT] --------------- Timing Runner: sections.2.0.conv2.module.weight + QuantizeLinear_431_quantize_scale_node + Conv_435 + Relu_437 (CaskConvolution) [03/25/2022-13:24:33] [V] [TRT] sections.2.0.conv2.module.weight + QuantizeLinear_431_quantize_scale_node + Conv_435 + Relu_437 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_medium_nn_v1 Tactic: 175853789719975416 [03/25/2022-13:24:33] [V] [TRT] Tactic: 175853789719975416 Time: 0.50688 [03/25/2022-13:24:33] [V] [TRT] sections.2.0.conv2.module.weight + QuantizeLinear_431_quantize_scale_node + Conv_435 + Relu_437 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_xregs_medium_nn_v1 Tactic: 2171150287007712632 [03/25/2022-13:24:33] [V] [TRT] Tactic: 2171150287007712632 Time: 0.471552 [03/25/2022-13:24:33] [V] [TRT] sections.2.0.conv2.module.weight + QuantizeLinear_431_quantize_scale_node + Conv_435 + Relu_437 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x64_relu_small_nn_v1 Tactic: 2234457234705232274 [03/25/2022-13:24:33] [V] [TRT] Tactic: 2234457234705232274 Time: 0.45696 [03/25/2022-13:24:33] [V] [TRT] sections.2.0.conv2.module.weight + QuantizeLinear_431_quantize_scale_node + Conv_435 + Relu_437 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x64_relu_medium_nn_v1 Tactic: 5834048089706882838 [03/25/2022-13:24:33] [V] [TRT] Tactic: 5834048089706882838 Time: 0.46336 [03/25/2022-13:24:33] [V] [TRT] sections.2.0.conv2.module.weight + QuantizeLinear_431_quantize_scale_node + Conv_435 + Relu_437 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_small_nn_v1 Tactic: -8626990807754934295 [03/25/2022-13:24:33] [V] [TRT] Tactic: -8626990807754934295 Time: 0.507648 [03/25/2022-13:24:33] [V] [TRT] sections.2.0.conv2.module.weight + QuantizeLinear_431_quantize_scale_node + Conv_435 + Relu_437 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_xregs_small_nn_v1 Tactic: -7303593854972602201 [03/25/2022-13:24:33] [V] [TRT] Tactic: -7303593854972602201 Time: 0.45568 [03/25/2022-13:24:33] [V] [TRT] sections.2.0.conv2.module.weight + QuantizeLinear_431_quantize_scale_node + Conv_435 + Relu_437 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x128_relu_medium_nn_v1 Tactic: -6585664687867083638 [03/25/2022-13:24:33] [V] [TRT] Tactic: -6585664687867083638 Time: 0.442752 [03/25/2022-13:24:33] [V] [TRT] sections.2.0.conv2.module.weight + QuantizeLinear_431_quantize_scale_node + Conv_435 + Relu_437 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x64_relu_xregs_large_nn_v1 Tactic: -3730012925709297561 [03/25/2022-13:24:33] [V] [TRT] Tactic: -3730012925709297561 Time: 0.45184 [03/25/2022-13:24:33] [V] [TRT] sections.2.0.conv2.module.weight + QuantizeLinear_431_quantize_scale_node + Conv_435 + Relu_437 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x128_relu_xregs_large_nn_v1 Tactic: -2277259417488004546 [03/25/2022-13:24:33] [V] [TRT] Tactic: -2277259417488004546 Time: 0.470272 [03/25/2022-13:24:33] [V] [TRT] sections.2.0.conv2.module.weight + QuantizeLinear_431_quantize_scale_node + Conv_435 + Relu_437 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x128_relu_small_nn_v1 Tactic: -683636008127039856 [03/25/2022-13:24:33] [V] [TRT] Tactic: -683636008127039856 Time: 0.440576 [03/25/2022-13:24:33] [V] [TRT] Fastest Tactic: 
-683636008127039856 Time: 0.440576 [03/25/2022-13:24:33] [V] [TRT] >>>>>>>>>>>>>>> Chose Runner Type: CaskConvolution Tactic: -683636008127039856 [03/25/2022-13:24:33] [V] [TRT] *************** Autotuning format combination: Int8(50176,784:4,28,1) -> Int8(1568,196:32,14,1) *************** [03/25/2022-13:24:33] [V] [TRT] --------------- Timing Runner: sections.2.0.conv2.module.weight + QuantizeLinear_431_quantize_scale_node + Conv_435 + Relu_437 (CaskConvolution) [03/25/2022-13:24:33] [V] [TRT] sections.2.0.conv2.module.weight + QuantizeLinear_431_quantize_scale_node + Conv_435 + Relu_437 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x64_relu_xregs_large_c32_nn_v1 Tactic: 984309058095623735 [03/25/2022-13:24:33] [V] [TRT] Tactic: 984309058095623735 Time: 0.45184 [03/25/2022-13:24:33] [V] [TRT] sections.2.0.conv2.module.weight + QuantizeLinear_431_quantize_scale_node + Conv_435 + Relu_437 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_small_c32_nn_v1 Tactic: 1100922622480907544 [03/25/2022-13:24:33] [V] [TRT] Tactic: 1100922622480907544 Time: 0.505088 [03/25/2022-13:24:33] [V] [TRT] sections.2.0.conv2.module.weight + QuantizeLinear_431_quantize_scale_node + Conv_435 + Relu_437 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x128_relu_xregs_large_c32_nn_v1 Tactic: 3238312825609165543 [03/25/2022-13:24:33] [V] [TRT] Tactic: 3238312825609165543 Time: 0.470272 [03/25/2022-13:24:33] [V] [TRT] sections.2.0.conv2.module.weight + QuantizeLinear_431_quantize_scale_node + Conv_435 + Relu_437 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x64_relu_medium_c32_nn_v1 Tactic: 3606311198834416176 [03/25/2022-13:24:33] [V] [TRT] Tactic: 3606311198834416176 Time: 0.462976 [03/25/2022-13:24:33] [V] [TRT] sections.2.0.conv2.module.weight + QuantizeLinear_431_quantize_scale_node + Conv_435 + Relu_437 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x128_relu_small_c32_nn_v1 Tactic: 4325765560739862899 [03/25/2022-13:24:33] [V] [TRT] Tactic: 4325765560739862899 Time: 0.441344 [03/25/2022-13:24:33] [V] [TRT] sections.2.0.conv2.module.weight + QuantizeLinear_431_quantize_scale_node + Conv_435 + Relu_437 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x128_relu_medium_c32_nn_v1 Tactic: -4255737803793506479 [03/25/2022-13:24:33] [V] [TRT] Tactic: -4255737803793506479 Time: 0.443136 [03/25/2022-13:24:33] [V] [TRT] sections.2.0.conv2.module.weight + QuantizeLinear_431_quantize_scale_node + Conv_435 + Relu_437 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_xregs_small_c32_nn_v1 Tactic: -3958182351168863467 [03/25/2022-13:24:33] [V] [TRT] Tactic: -3958182351168863467 Time: 0.45504 [03/25/2022-13:24:33] [V] [TRT] sections.2.0.conv2.module.weight + QuantizeLinear_431_quantize_scale_node + Conv_435 + Relu_437 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_xregs_medium_c32_nn_v1 Tactic: -3111968753064955248 [03/25/2022-13:24:33] [V] [TRT] Tactic: -3111968753064955248 Time: 0.470912 [03/25/2022-13:24:33] [V] [TRT] sections.2.0.conv2.module.weight + QuantizeLinear_431_quantize_scale_node + Conv_435 + Relu_437 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_medium_c32_nn_v1 Tactic: -1492575840277333548 [03/25/2022-13:24:33] [V] [TRT] Tactic: -1492575840277333548 Time: 0.50624 [03/25/2022-13:24:33] [V] [TRT] sections.2.0.conv2.module.weight + QuantizeLinear_431_quantize_scale_node + Conv_435 + Relu_437 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x64_relu_small_c32_nn_v1 Tactic: -868495160148524802 [03/25/2022-13:24:33] [V] [TRT] Tactic: -868495160148524802 Time: 0.456192 
[03/25/2022-13:24:33] [V] [TRT] Fastest Tactic: 4325765560739862899 Time: 0.441344 [03/25/2022-13:24:33] [V] [TRT] >>>>>>>>>>>>>>> Chose Runner Type: CaskConvolution Tactic: 4325765560739862899 [03/25/2022-13:24:33] [V] [TRT] *************** Autotuning format combination: Int8(6272,784:32,28,1) -> Int8(1568,196:32,14,1) *************** [03/25/2022-13:24:33] [V] [TRT] --------------- Timing Runner: sections.2.0.conv2.module.weight + QuantizeLinear_431_quantize_scale_node + Conv_435 + Relu_437 (CudaGroupConvolution) [03/25/2022-13:24:33] [V] [TRT] CudaGroupConvolution has no valid tactics for this config, skipping [03/25/2022-13:24:33] [V] [TRT] --------------- Timing Runner: sections.2.0.conv2.module.weight + QuantizeLinear_431_quantize_scale_node + Conv_435 + Relu_437 (CudaDepthwiseConvolution) [03/25/2022-13:24:33] [V] [TRT] CudaDepthwiseConvolution has no valid tactics for this config, skipping [03/25/2022-13:24:33] [V] [TRT] --------------- Timing Runner: sections.2.0.conv2.module.weight + QuantizeLinear_431_quantize_scale_node + Conv_435 + Relu_437 (FusedConvActConvolution) [03/25/2022-13:24:33] [V] [TRT] FusedConvActConvolution has no valid tactics for this config, skipping [03/25/2022-13:24:34] [V] [TRT] --------------- Timing Runner: sections.2.0.conv2.module.weight + QuantizeLinear_431_quantize_scale_node + Conv_435 + Relu_437 (CaskConvolution) [03/25/2022-13:24:34] [V] [TRT] sections.2.0.conv2.module.weight + QuantizeLinear_431_quantize_scale_node + Conv_435 + Relu_437 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_epifadd Tactic: 177040020707947851 [03/25/2022-13:24:34] [V] [TRT] Tactic: 177040020707947851 Time: 0.185472 [03/25/2022-13:24:34] [V] [TRT] sections.2.0.conv2.module.weight + QuantizeLinear_431_quantize_scale_node + Conv_435 + Relu_437 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r3s3 Tactic: 184229963126259101 [03/25/2022-13:24:34] [V] [TRT] Tactic: 184229963126259101 Time: 0.1184 [03/25/2022-13:24:34] [V] [TRT] sections.2.0.conv2.module.weight + QuantizeLinear_431_quantize_scale_node + Conv_435 + Relu_437 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage1_warpsize4x2x1_g1_tensor8x8x16_t1r3s3 Tactic: 289888059097454627 [03/25/2022-13:24:34] [V] [TRT] Tactic: 289888059097454627 Time: 0.132736 [03/25/2022-13:24:34] [V] [TRT] sections.2.0.conv2.module.weight + QuantizeLinear_431_quantize_scale_node + Conv_435 + Relu_437 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32 Tactic: 328135613486708155 [03/25/2022-13:24:34] [V] [TRT] Tactic: 328135613486708155 Time: 0.30336 [03/25/2022-13:24:34] [V] [TRT] sections.2.0.conv2.module.weight + QuantizeLinear_431_quantize_scale_node + Conv_435 + Relu_437 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r3s3 Tactic: 680740992583869928 [03/25/2022-13:24:34] [V] [TRT] Tactic: 680740992583869928 Time: 0.144128 [03/25/2022-13:24:34] [V] [TRT] sections.2.0.conv2.module.weight + QuantizeLinear_431_quantize_scale_node + Conv_435 + Relu_437 Set Tactic 
Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x192x64_stage3_warpsize4x2x1_g1_tensor16x8x32 Tactic: 1111159740952609683 [03/25/2022-13:24:34] [V] [TRT] Tactic: 1111159740952609683 Time: 0.118272 [03/25/2022-13:24:34] [V] [TRT] sections.2.0.conv2.module.weight + QuantizeLinear_431_quantize_scale_node + Conv_435 + Relu_437 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32 Tactic: 1134860903395928905 [03/25/2022-13:24:34] [V] [TRT] Tactic: 1134860903395928905 Time: 0.107392 [03/25/2022-13:24:34] [V] [TRT] sections.2.0.conv2.module.weight + QuantizeLinear_431_quantize_scale_node + Conv_435 + Relu_437 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32 Tactic: 1276591930377039442 [03/25/2022-13:24:34] [V] [TRT] Tactic: 1276591930377039442 Time: 0.122112 [03/25/2022-13:24:34] [V] [TRT] sections.2.0.conv2.module.weight + QuantizeLinear_431_quantize_scale_node + Conv_435 + Relu_437 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage1_warpsize2x1x1_g1_tensor8x8x16_t1r3s3 Tactic: 1388866374720163187 [03/25/2022-13:24:34] [V] [TRT] Tactic: 1388866374720163187 Time: 0.18496 [03/25/2022-13:24:34] [V] [TRT] sections.2.0.conv2.module.weight + QuantizeLinear_431_quantize_scale_node + Conv_435 + Relu_437 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x96x64_stage5_warpsize2x1x1_g1_tensor16x8x32 Tactic: 1399501420456320585 [03/25/2022-13:24:34] [V] [TRT] Tactic: 1399501420456320585 Time: 0.14336 [03/25/2022-13:24:34] [V] [TRT] sections.2.0.conv2.module.weight + QuantizeLinear_431_quantize_scale_node + Conv_435 + Relu_437 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_epifadd Tactic: 1550399266192842845 [03/25/2022-13:24:34] [V] [TRT] Tactic: 1550399266192842845 Time: 0.144768 [03/25/2022-13:24:34] [V] [TRT] sections.2.0.conv2.module.weight + QuantizeLinear_431_quantize_scale_node + Conv_435 + Relu_437 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x32x64_stage4_warpsize4x1x1_g1_tensor16x8x32_t1r3s3_epifadd Tactic: 1572887561103143487 [03/25/2022-13:24:34] [V] [TRT] Tactic: 1572887561103143487 Time: 0.148608 [03/25/2022-13:24:34] [V] [TRT] sections.2.0.conv2.module.weight + QuantizeLinear_431_quantize_scale_node + Conv_435 + Relu_437 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x96x64_stage5_warpsize2x1x1_g1_tensor16x8x32_t1r3s3 Tactic: 1853122447892949466 [03/25/2022-13:24:34] [V] [TRT] Tactic: 1853122447892949466 Time: 0.143232 [03/25/2022-13:24:34] [V] [TRT] sections.2.0.conv2.module.weight + QuantizeLinear_431_quantize_scale_node + Conv_435 + Relu_437 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage1_warpsize2x2x1_g1_tensor8x8x16 Tactic: 2133329569091732311 [03/25/2022-13:24:34] [V] [TRT] Tactic: 2133329569091732311 Time: 0.13952 
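(Editor's note: the "Autotuning format combination" headers above describe tensor layouts as one stride per dimension, with a ":k" suffix on the axis that is packed k elements wide — so Int8(50176,784:4,28,1) reads as a CHW4-style vectorized layout and Int8(6272,784:32,28,1) as a CHW32-style one — and the builder times each layer once per viable input/output pairing. A small, hypothetical helper that decodes the notation:

import re

# Hypothetical helper: decode a format string such as "Int8(6272,784:32,28,1)"
# into (dtype, strides, vectorized-axis info). The ":k" suffix marks the axis
# packed k-wide (k=4 -> CHW4-like, k=32 -> CHW32-like).
FMT = re.compile(r"(\w+)\(([^)]*)\)")

def decode_format(s: str):
    dtype, body = FMT.match(s).groups()
    strides, vect = [], None
    for i, tok in enumerate(body.split(",")):
        if ":" in tok:
            stride, width = tok.split(":")
            strides.append(int(stride))
            vect = (i, int(width))   # axis index and packing width
        else:
            strides.append(int(tok))
    return dtype, strides, vect

print(decode_format("Int8(6272,784:32,28,1)"))
# -> ('Int8', [6272, 784, 28, 1], (1, 32)): channel axis packed 32-wide

End of note; the log continues below.)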
[03/25/2022-13:24:34] [V] [TRT] sections.2.0.conv2.module.weight + QuantizeLinear_431_quantize_scale_node + Conv_435 + Relu_437 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage4_warpsize4x1x1_g1_tensor16x8x32 Tactic: 2325023763229477890 [03/25/2022-13:24:34] [V] [TRT] Tactic: 2325023763229477890 Time: 0.094464 [03/25/2022-13:24:34] [V] [TRT] sections.2.0.conv2.module.weight + QuantizeLinear_431_quantize_scale_node + Conv_435 + Relu_437 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16 Tactic: 2579824863892891529 [03/25/2022-13:24:34] [V] [TRT] Tactic: 2579824863892891529 Time: 0.20672 [03/25/2022-13:24:34] [V] [TRT] sections.2.0.conv2.module.weight + QuantizeLinear_431_quantize_scale_node + Conv_435 + Relu_437 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x64x64_stage3_warpsize4x1x1_g1_tensor16x8x32 Tactic: 2783960536172159663 [03/25/2022-13:24:34] [V] [TRT] Tactic: 2783960536172159663 Time: 0.084864 [03/25/2022-13:24:34] [V] [TRT] sections.2.0.conv2.module.weight + QuantizeLinear_431_quantize_scale_node + Conv_435 + Relu_437 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x2x1_g1_tensor16x8x32 Tactic: 2821711838552913693 [03/25/2022-13:24:34] [V] [TRT] Tactic: 2821711838552913693 Time: 0.118784 [03/25/2022-13:24:34] [V] [TRT] sections.2.0.conv2.module.weight + QuantizeLinear_431_quantize_scale_node + Conv_435 + Relu_437 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x128_stage3_warpsize1x2x1_g1_sptensor16x8x64 Tactic: 2945009978756227538 [03/25/2022-13:24:34] [V] [TRT] Tactic: 2945009978756227538 Time: 0.091392 [03/25/2022-13:24:34] [V] [TRT] sections.2.0.conv2.module.weight + QuantizeLinear_431_quantize_scale_node + Conv_435 + Relu_437 Set Tactic Name: ampere_int8_i8816cudnn_int8_128x128_ldg16_relu_medium_nt_v1 Tactic: 2985940154541537814 [03/25/2022-13:24:34] [V] [TRT] Tactic: 2985940154541537814 Time: 0.13952 [03/25/2022-13:24:34] [V] [TRT] sections.2.0.conv2.module.weight + QuantizeLinear_431_quantize_scale_node + Conv_435 + Relu_437 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_t1r3s3_epifadd Tactic: 3284282970967328046 [03/25/2022-13:24:34] [V] [TRT] Tactic: 3284282970967328046 Time: 0.18368 [03/25/2022-13:24:34] [V] [TRT] sections.2.0.conv2.module.weight + QuantizeLinear_431_quantize_scale_node + Conv_435 + Relu_437 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_t1r3s3 Tactic: 3401614690060226673 [03/25/2022-13:24:34] [V] [TRT] Tactic: 3401614690060226673 Time: 0.165504 [03/25/2022-13:24:34] [V] [TRT] sections.2.0.conv2.module.weight + QuantizeLinear_431_quantize_scale_node + Conv_435 + Relu_437 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32 Tactic: 3456719996792527006 [03/25/2022-13:24:34] [V] [TRT] Tactic: 3456719996792527006 Time: 
0.102656 [03/25/2022-13:24:34] [V] [TRT] sections.2.0.conv2.module.weight + QuantizeLinear_431_quantize_scale_node + Conv_435 + Relu_437 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r3s3_epifadd Tactic: 3512426920013359699 [03/25/2022-13:24:34] [V] [TRT] Tactic: 3512426920013359699 Time: 0.109952 [03/25/2022-13:24:34] [V] [TRT] sections.2.0.conv2.module.weight + QuantizeLinear_431_quantize_scale_node + Conv_435 + Relu_437 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x128_stage3_warpsize4x2x1_g1_sptensor16x8x64 Tactic: 3651043333819148268 [03/25/2022-13:24:34] [V] [TRT] Tactic: 3651043333819148268 Time: 0.064512 [03/25/2022-13:24:34] [V] [TRT] sections.2.0.conv2.module.weight + QuantizeLinear_431_quantize_scale_node + Conv_435 + Relu_437 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x128_ldg16_relu_medium_nt_v1 Tactic: 3899284354987683408 [03/25/2022-13:24:34] [V] [TRT] Tactic: 3899284354987683408 Time: 0.138496 [03/25/2022-13:24:34] [V] [TRT] sections.2.0.conv2.module.weight + QuantizeLinear_431_quantize_scale_node + Conv_435 + Relu_437 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage6_warpsize2x2x1_g1_tensor16x8x32 Tactic: 4042202769383439184 [03/25/2022-13:24:34] [V] [TRT] Tactic: 4042202769383439184 Time: 0.08896 [03/25/2022-13:24:34] [V] [TRT] sections.2.0.conv2.module.weight + QuantizeLinear_431_quantize_scale_node + Conv_435 + Relu_437 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_large_nt_v1 Tactic: 4182625619810185112 [03/25/2022-13:24:34] [V] [TRT] Tactic: 4182625619810185112 Time: 0.151808 [03/25/2022-13:24:34] [V] [TRT] sections.2.0.conv2.module.weight + QuantizeLinear_431_quantize_scale_node + Conv_435 + Relu_437 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r3s3 Tactic: 4214794893922618058 [03/25/2022-13:24:34] [V] [TRT] Tactic: 4214794893922618058 Time: 0.1376 [03/25/2022-13:24:34] [V] [TRT] sections.2.0.conv2.module.weight + QuantizeLinear_431_quantize_scale_node + Conv_435 + Relu_437 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x32x64_stage4_warpsize4x1x1_g1_tensor16x8x32_epifadd Tactic: 4259547356717612415 [03/25/2022-13:24:34] [V] [TRT] Tactic: 4259547356717612415 Time: 0.150272 [03/25/2022-13:24:34] [V] [TRT] sections.2.0.conv2.module.weight + QuantizeLinear_431_quantize_scale_node + Conv_435 + Relu_437 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r3s3 Tactic: 4384868749799132354 [03/25/2022-13:24:34] [V] [TRT] Tactic: 4384868749799132354 Time: 0.20672 [03/25/2022-13:24:34] [V] [TRT] sections.2.0.conv2.module.weight + QuantizeLinear_431_quantize_scale_node + Conv_435 + Relu_437 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x128_stage3_warpsize2x2x1_g1_sptensor16x8x64 Tactic: 4414594337986714263 [03/25/2022-13:24:34] [V] [TRT] Tactic: 4414594337986714263 Time: 0.073728 [03/25/2022-13:24:34] [V] [TRT] 
sections.2.0.conv2.module.weight + QuantizeLinear_431_quantize_scale_node + Conv_435 + Relu_437 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_small_nt_v1 Tactic: 4717285412741024953 [03/25/2022-13:24:34] [V] [TRT] Tactic: 4717285412741024953 Time: 0.13952 [03/25/2022-13:24:34] [V] [TRT] sections.2.0.conv2.module.weight + QuantizeLinear_431_quantize_scale_node + Conv_435 + Relu_437 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage4_warpsize2x4x1_g1_tensor16x8x32 Tactic: 4734519122557206480 [03/25/2022-13:24:34] [V] [TRT] Tactic: 4734519122557206480 Time: 0.07744 [03/25/2022-13:24:34] [V] [TRT] sections.2.0.conv2.module.weight + QuantizeLinear_431_quantize_scale_node + Conv_435 + Relu_437 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32 Tactic: 4922297020351187339 [03/25/2022-13:24:34] [V] [TRT] Tactic: 4922297020351187339 Time: 0.115712 [03/25/2022-13:24:34] [V] [TRT] sections.2.0.conv2.module.weight + QuantizeLinear_431_quantize_scale_node + Conv_435 + Relu_437 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage1_warpsize2x4x1_g1_tensor8x8x16_t1r3s3 Tactic: 4931167631624420067 [03/25/2022-13:24:34] [V] [TRT] Tactic: 4931167631624420067 Time: 0.132352 [03/25/2022-13:24:34] [V] [TRT] sections.2.0.conv2.module.weight + QuantizeLinear_431_quantize_scale_node + Conv_435 + Relu_437 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage4_warpsize2x4x1_g1_tensor16x8x32_epifadd Tactic: 5121596860264626879 [03/25/2022-13:24:34] [V] [TRT] Tactic: 5121596860264626879 Time: 0.075008 [03/25/2022-13:24:34] [V] [TRT] sections.2.0.conv2.module.weight + QuantizeLinear_431_quantize_scale_node + Conv_435 + Relu_437 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_t1r3s3 Tactic: 5136656982162849059 [03/25/2022-13:24:34] [V] [TRT] Tactic: 5136656982162849059 Time: 0.183808 [03/25/2022-13:24:34] [V] [TRT] sections.2.0.conv2.module.weight + QuantizeLinear_431_quantize_scale_node + Conv_435 + Relu_437 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage6_warpsize2x2x1_g1_tensor16x8x32_epifadd Tactic: 5158259316594207439 [03/25/2022-13:24:34] [V] [TRT] Tactic: 5158259316594207439 Time: 0.087936 [03/25/2022-13:24:34] [V] [TRT] sections.2.0.conv2.module.weight + QuantizeLinear_431_quantize_scale_node + Conv_435 + Relu_437 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32_t1r3s3 Tactic: 5189825015507701541 [03/25/2022-13:24:34] [V] [TRT] Tactic: 5189825015507701541 Time: 0.323712 [03/25/2022-13:24:34] [V] [TRT] sections.2.0.conv2.module.weight + QuantizeLinear_431_quantize_scale_node + Conv_435 + Relu_437 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x32x64_stage4_warpsize2x1x1_g1_tensor16x8x32 Tactic: 5424417905073460656 [03/25/2022-13:24:34] [V] [TRT] Tactic: 5424417905073460656 Time: 0.13824 
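(Editor's note: dense (tensor16x8x32) and sparse (sptensor16x8x64) kernels compete in the same pool here, and the sparse 256x128x128 tile at 0.064512 ms is currently the one to beat for this layer. Sparse tactics are only enumerated when the build allows them; a minimal sketch of a build configuration that would surface this kind of tactic pool through the TensorRT 8.x Python API — "model.onnx" is a placeholder for a quantized, 2:4-pruned network:

import tensorrt as trt

# Assumptions: TensorRT 8.x Python API; "model.onnx" stands in for a
# quantized (Q/DQ) network whose weights were pruned to 2:4 sparsity.
logger = trt.Logger(trt.Logger.VERBOSE)   # VERBOSE emits the [V] tactic lines
builder = trt.Builder(logger)
network = builder.create_network(
    1 << int(trt.NetworkDefinitionCreationFlag.EXPLICIT_BATCH))
parser = trt.OnnxParser(network, logger)
with open("model.onnx", "rb") as f:
    if not parser.parse(f.read()):
        raise RuntimeError(str(parser.get_error(0)))

config = builder.create_builder_config()
config.set_flag(trt.BuilderFlag.INT8)            # admit INT8 tactics
config.set_flag(trt.BuilderFlag.SPARSE_WEIGHTS)  # admit the ..._sparse_... tactics
engine = builder.build_serialized_network(network, config)

End of note; the log continues below.)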
[03/25/2022-13:24:34] [V] [TRT] sections.2.0.conv2.module.weight + QuantizeLinear_431_quantize_scale_node + Conv_435 + Relu_437 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x256x64_stage3_warpsize2x4x1_g1_tensor16x8x32 Tactic: 5442043907221427810 [03/25/2022-13:24:34] [V] [TRT] Tactic: 5442043907221427810 Time: 0.090368 [03/25/2022-13:24:34] [V] [TRT] sections.2.0.conv2.module.weight + QuantizeLinear_431_quantize_scale_node + Conv_435 + Relu_437 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x256x64_stage3_warpsize2x4x1_g1_tensor16x8x32_t1r3s3 Tactic: 5544365258913999384 [03/25/2022-13:24:34] [V] [TRT] Tactic: 5544365258913999384 Time: 0.089344 [03/25/2022-13:24:34] [V] [TRT] sections.2.0.conv2.module.weight + QuantizeLinear_431_quantize_scale_node + Conv_435 + Relu_437 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage1_warpsize2x1x1_g1_tensor8x8x16_t1r3s3 Tactic: 5641967928706599451 [03/25/2022-13:24:34] [V] [TRT] Tactic: 5641967928706599451 Time: 0.241664 [03/25/2022-13:24:34] [V] [TRT] sections.2.0.conv2.module.weight + QuantizeLinear_431_quantize_scale_node + Conv_435 + Relu_437 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x2x1_g1_tensor16x8x32_t1r3s3 Tactic: 5721595115357140131 [03/25/2022-13:24:34] [V] [TRT] Tactic: 5721595115357140131 Time: 0.116736 [03/25/2022-13:24:34] [V] [TRT] sections.2.0.conv2.module.weight + QuantizeLinear_431_quantize_scale_node + Conv_435 + Relu_437 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage4_warpsize4x1x1_g1_tensor16x8x32_t1r3s3_epifadd Tactic: 5966973378912044513 [03/25/2022-13:24:34] [V] [TRT] Tactic: 5966973378912044513 Time: 0.094464 [03/25/2022-13:24:34] [V] [TRT] sections.2.0.conv2.module.weight + QuantizeLinear_431_quantize_scale_node + Conv_435 + Relu_437 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x32x64_stage4_warpsize4x1x1_g1_tensor16x8x32_t1r3s3 Tactic: 6004789655466615912 [03/25/2022-13:24:34] [V] [TRT] Tactic: 6004789655466615912 Time: 0.148736 [03/25/2022-13:24:34] [V] [TRT] sections.2.0.conv2.module.weight + QuantizeLinear_431_quantize_scale_node + Conv_435 + Relu_437 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage4_warpsize2x4x1_g1_tensor16x8x32_t1r3s3 Tactic: 6146901278630392829 [03/25/2022-13:24:34] [V] [TRT] Tactic: 6146901278630392829 Time: 0.077184 [03/25/2022-13:24:34] [V] [TRT] sections.2.0.conv2.module.weight + QuantizeLinear_431_quantize_scale_node + Conv_435 + Relu_437 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage1_warpsize2x1x1_g1_tensor8x8x16 Tactic: 6394572396369862482 [03/25/2022-13:24:34] [V] [TRT] Tactic: 6394572396369862482 Time: 0.242944 [03/25/2022-13:24:34] [V] [TRT] sections.2.0.conv2.module.weight + QuantizeLinear_431_quantize_scale_node + Conv_435 + Relu_437 Set Tactic Name: 
sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage4_warpsize4x2x1_g1_tensor16x8x32_epifadd Tactic: 6434020722187266170 [03/25/2022-13:24:34] [V] [TRT] Tactic: 6434020722187266170 Time: 0.077696 [03/25/2022-13:24:34] [V] [TRT] sections.2.0.conv2.module.weight + QuantizeLinear_431_quantize_scale_node + Conv_435 + Relu_437 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32 Tactic: 6781129591847482048 [03/25/2022-13:24:34] [V] [TRT] Tactic: 6781129591847482048 Time: 0.100864 [03/25/2022-13:24:34] [V] [TRT] sections.2.0.conv2.module.weight + QuantizeLinear_431_quantize_scale_node + Conv_435 + Relu_437 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x96x64_stage4_warpsize4x2x1_g1_tensor16x8x32_t1r3s3 Tactic: 6984451771200230840 [03/25/2022-13:24:34] [V] [TRT] Tactic: 6984451771200230840 Time: 0.114816 [03/25/2022-13:24:34] [V] [TRT] sections.2.0.conv2.module.weight + QuantizeLinear_431_quantize_scale_node + Conv_435 + Relu_437 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r3s3 Tactic: 7048234086361926570 [03/25/2022-13:24:34] [V] [TRT] Tactic: 7048234086361926570 Time: 0.159488 [03/25/2022-13:24:34] [V] [TRT] sections.2.0.conv2.module.weight + QuantizeLinear_431_quantize_scale_node + Conv_435 + Relu_437 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x128x64_stage3_warpsize4x2x1_g1_tensor16x8x32 Tactic: 7077570591813340966 [03/25/2022-13:24:34] [V] [TRT] Tactic: 7077570591813340966 Time: 0.088576 [03/25/2022-13:24:34] [V] [TRT] sections.2.0.conv2.module.weight + QuantizeLinear_431_quantize_scale_node + Conv_435 + Relu_437 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r3s3_epifadd Tactic: 7191893591576074000 [03/25/2022-13:24:34] [V] [TRT] Tactic: 7191893591576074000 Time: 0.14272 [03/25/2022-13:24:34] [V] [TRT] sections.2.0.conv2.module.weight + QuantizeLinear_431_quantize_scale_node + Conv_435 + Relu_437 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x4x1_g1_tensor16x8x32_t1r3s3 Tactic: 7429976449747682901 [03/25/2022-13:24:34] [V] [TRT] Tactic: 7429976449747682901 Time: 0.111488 [03/25/2022-13:24:34] [V] [TRT] sections.2.0.conv2.module.weight + QuantizeLinear_431_quantize_scale_node + Conv_435 + Relu_437 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r3s3_epifadd Tactic: 7438984192263206338 [03/25/2022-13:24:34] [V] [TRT] Tactic: 7438984192263206338 Time: 0.08704 [03/25/2022-13:24:34] [V] [TRT] sections.2.0.conv2.module.weight + QuantizeLinear_431_quantize_scale_node + Conv_435 + Relu_437 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32_epifadd Tactic: 7504901284678552178 [03/25/2022-13:24:34] [V] [TRT] Tactic: 7504901284678552178 
Time: 0.076032 [03/25/2022-13:24:34] [V] [TRT] sections.2.0.conv2.module.weight + QuantizeLinear_431_quantize_scale_node + Conv_435 + Relu_437 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r3s3 Tactic: 8096257414008860171 [03/25/2022-13:24:34] [V] [TRT] Tactic: 8096257414008860171 Time: 0.099584 [03/25/2022-13:24:34] [V] [TRT] sections.2.0.conv2.module.weight + QuantizeLinear_431_quantize_scale_node + Conv_435 + Relu_437 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x64x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r3s3 Tactic: 8128112048355596715 [03/25/2022-13:24:34] [V] [TRT] Tactic: 8128112048355596715 Time: 0.094976 [03/25/2022-13:24:34] [V] [TRT] sections.2.0.conv2.module.weight + QuantizeLinear_431_quantize_scale_node + Conv_435 + Relu_437 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_indexed_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32_epifadd Tactic: 8751622450593766232 [03/25/2022-13:24:34] [V] [TRT] Tactic: 8751622450593766232 Time: 0.082176 [03/25/2022-13:24:34] [V] [TRT] sections.2.0.conv2.module.weight + QuantizeLinear_431_quantize_scale_node + Conv_435 + Relu_437 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_indexed_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32 Tactic: 9064458886956700976 [03/25/2022-13:24:34] [V] [TRT] Tactic: 9064458886956700976 Time: 0.083072 [03/25/2022-13:24:34] [V] [TRT] sections.2.0.conv2.module.weight + QuantizeLinear_431_quantize_scale_node + Conv_435 + Relu_437 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_t1r3s3_epifadd Tactic: 9143438935315839085 [03/25/2022-13:24:34] [V] [TRT] Tactic: 9143438935315839085 Time: 0.164096 [03/25/2022-13:24:34] [V] [TRT] sections.2.0.conv2.module.weight + QuantizeLinear_431_quantize_scale_node + Conv_435 + Relu_437 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage4_warpsize4x2x1_g1_tensor16x8x32 Tactic: -9165697322068360861 [03/25/2022-13:24:34] [V] [TRT] Tactic: -9165697322068360861 Time: 0.08064 [03/25/2022-13:24:34] [V] [TRT] sections.2.0.conv2.module.weight + QuantizeLinear_431_quantize_scale_node + Conv_435 + Relu_437 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_singleBuffer_small_nt_v1 Tactic: -9118785798277698619 [03/25/2022-13:24:34] [V] [TRT] Tactic: -9118785798277698619 Time: 0.138624 [03/25/2022-13:24:34] [V] [TRT] sections.2.0.conv2.module.weight + QuantizeLinear_431_quantize_scale_node + Conv_435 + Relu_437 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16 Tactic: -9108166971364503411 [03/25/2022-13:24:34] [V] [TRT] Tactic: -9108166971364503411 Time: 0.159616 [03/25/2022-13:24:34] [V] [TRT] sections.2.0.conv2.module.weight + QuantizeLinear_431_quantize_scale_node + Conv_435 + Relu_437 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage1_warpsize4x2x1_g1_tensor8x8x16 Tactic: 
-8861822316054763526 [03/25/2022-13:24:34] [V] [TRT] Tactic: -8861822316054763526 Time: 0.133632 [03/25/2022-13:24:34] [V] [TRT] sections.2.0.conv2.module.weight + QuantizeLinear_431_quantize_scale_node + Conv_435 + Relu_437 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x128_stage3_warpsize1x1x1_g1_sptensor16x8x64 Tactic: -8791277710877987710 [03/25/2022-13:24:34] [V] [TRT] Tactic: -8791277710877987710 Time: 0.112512 [03/25/2022-13:24:34] [V] [TRT] sections.2.0.conv2.module.weight + QuantizeLinear_431_quantize_scale_node + Conv_435 + Relu_437 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x128x64_stage3_warpsize2x2x1_g1_tensor16x8x32 Tactic: -8691377209893505057 [03/25/2022-13:24:34] [V] [TRT] Tactic: -8691377209893505057 Time: 0.084096 [03/25/2022-13:24:34] [V] [TRT] sections.2.0.conv2.module.weight + QuantizeLinear_431_quantize_scale_node + Conv_435 + Relu_437 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32 Tactic: -8520292213102999339 [03/25/2022-13:24:34] [V] [TRT] Tactic: -8520292213102999339 Time: 0.119936 [03/25/2022-13:24:34] [V] [TRT] sections.2.0.conv2.module.weight + QuantizeLinear_431_quantize_scale_node + Conv_435 + Relu_437 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage1_warpsize4x1x1_g1_tensor8x8x16_t1r3s3 Tactic: -8475551154769412306 [03/25/2022-13:24:34] [V] [TRT] Tactic: -8475551154769412306 Time: 0.139392 [03/25/2022-13:24:34] [V] [TRT] sections.2.0.conv2.module.weight + QuantizeLinear_431_quantize_scale_node + Conv_435 + Relu_437 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x192x64_stage3_warpsize4x2x1_g1_tensor16x8x32_t1r3s3 Tactic: -8417388128970254446 [03/25/2022-13:24:34] [V] [TRT] Tactic: -8417388128970254446 Time: 0.116864 [03/25/2022-13:24:34] [V] [TRT] sections.2.0.conv2.module.weight + QuantizeLinear_431_quantize_scale_node + Conv_435 + Relu_437 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32 Tactic: -8263994888336646547 [03/25/2022-13:24:34] [V] [TRT] Tactic: -8263994888336646547 Time: 0.076672 [03/25/2022-13:24:34] [V] [TRT] sections.2.0.conv2.module.weight + QuantizeLinear_431_quantize_scale_node + Conv_435 + Relu_437 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32 Tactic: -8205948405243401049 [03/25/2022-13:24:34] [V] [TRT] Tactic: -8205948405243401049 Time: 0.144896 [03/25/2022-13:24:34] [V] [TRT] sections.2.0.conv2.module.weight + QuantizeLinear_431_quantize_scale_node + Conv_435 + Relu_437 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r3s3_epifadd Tactic: -7992068592656168418 [03/25/2022-13:24:34] [V] [TRT] Tactic: -7992068592656168418 Time: 0.099584 [03/25/2022-13:24:34] [V] [TRT] sections.2.0.conv2.module.weight + QuantizeLinear_431_quantize_scale_node + Conv_435 + Relu_437 Set Tactic Name: 
sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x32x64_stage4_warpsize2x1x1_g1_tensor16x8x32_t1r3s3 Tactic: -7898477046581738867 [03/25/2022-13:24:34] [V] [TRT] Tactic: -7898477046581738867 Time: 0.132352 [03/25/2022-13:24:34] [V] [TRT] sections.2.0.conv2.module.weight + QuantizeLinear_431_quantize_scale_node + Conv_435 + Relu_437 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage4_warpsize4x1x1_g1_tensor16x8x32_epifadd Tactic: -7842775553137511386 [03/25/2022-13:24:34] [V] [TRT] Tactic: -7842775553137511386 Time: 0.094336 [03/25/2022-13:24:34] [V] [TRT] sections.2.0.conv2.module.weight + QuantizeLinear_431_quantize_scale_node + Conv_435 + Relu_437 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32 Tactic: -7683887278997527517 [03/25/2022-13:24:34] [V] [TRT] Tactic: -7683887278997527517 Time: 0.153728 [03/25/2022-13:24:34] [V] [TRT] sections.2.0.conv2.module.weight + QuantizeLinear_431_quantize_scale_node + Conv_435 + Relu_437 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x64x64_stage3_warpsize2x2x1_g1_tensor16x8x32 Tactic: -7381370635708568663 [03/25/2022-13:24:34] [V] [TRT] Tactic: -7381370635708568663 Time: 0.095872 [03/25/2022-13:24:34] [V] [TRT] sections.2.0.conv2.module.weight + QuantizeLinear_431_quantize_scale_node + Conv_435 + Relu_437 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r3s3 Tactic: -7129320389887881029 [03/25/2022-13:24:34] [V] [TRT] Tactic: -7129320389887881029 Time: 0.105088 [03/25/2022-13:24:34] [V] [TRT] sections.2.0.conv2.module.weight + QuantizeLinear_431_quantize_scale_node + Conv_435 + Relu_437 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r3s3 Tactic: -6959995514028471820 [03/25/2022-13:24:34] [V] [TRT] Tactic: -6959995514028471820 Time: 0.119552 [03/25/2022-13:24:34] [V] [TRT] sections.2.0.conv2.module.weight + QuantizeLinear_431_quantize_scale_node + Conv_435 + Relu_437 Set Tactic Name: ampere_int8_i8816cudnn_int8_128x128_ldg16_relu_small_nt_v1 Tactic: -6400348606759295499 [03/25/2022-13:24:34] [V] [TRT] Tactic: -6400348606759295499 Time: 0.135296 [03/25/2022-13:24:34] [V] [TRT] sections.2.0.conv2.module.weight + QuantizeLinear_431_quantize_scale_node + Conv_435 + Relu_437 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r3s3 Tactic: -6371781333659293809 [03/25/2022-13:24:34] [V] [TRT] Tactic: -6371781333659293809 Time: 0.160256 [03/25/2022-13:24:34] [V] [TRT] sections.2.0.conv2.module.weight + QuantizeLinear_431_quantize_scale_node + Conv_435 + Relu_437 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x96x64_stage4_warpsize4x2x1_g1_tensor16x8x32 Tactic: -6256128573036943404 [03/25/2022-13:24:34] [V] [TRT] Tactic: -6256128573036943404 Time: 0.114688 [03/25/2022-13:24:34] [V] [TRT] sections.2.0.conv2.module.weight + 
QuantizeLinear_431_quantize_scale_node + Conv_435 + Relu_437 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x128_ldg16_relu_small_nt_v1 Tactic: -5980889159865208399 [03/25/2022-13:24:34] [V] [TRT] Tactic: -5980889159865208399 Time: 0.135424 [03/25/2022-13:24:34] [V] [TRT] sections.2.0.conv2.module.weight + QuantizeLinear_431_quantize_scale_node + Conv_435 + Relu_437 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_singleBuffer_medium_nt_v1 Tactic: -5766140806760372989 [03/25/2022-13:24:34] [V] [TRT] Tactic: -5766140806760372989 Time: 0.143872 [03/25/2022-13:24:34] [V] [TRT] sections.2.0.conv2.module.weight + QuantizeLinear_431_quantize_scale_node + Conv_435 + Relu_437 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32_t1r3s3_epifadd Tactic: -5709079507616090666 [03/25/2022-13:24:34] [V] [TRT] Tactic: -5709079507616090666 Time: 0.07552 [03/25/2022-13:24:34] [V] [TRT] sections.2.0.conv2.module.weight + QuantizeLinear_431_quantize_scale_node + Conv_435 + Relu_437 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage4_warpsize2x4x1_g1_tensor16x8x32_t1r3s3_epifadd Tactic: -5698636014239116282 [03/25/2022-13:24:34] [V] [TRT] Tactic: -5698636014239116282 Time: 0.073984 [03/25/2022-13:24:34] [V] [TRT] sections.2.0.conv2.module.weight + QuantizeLinear_431_quantize_scale_node + Conv_435 + Relu_437 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage1_warpsize4x1x1_g1_tensor8x8x16 Tactic: -5180570335464125033 [03/25/2022-13:24:34] [V] [TRT] Tactic: -5180570335464125033 Time: 0.145152 [03/25/2022-13:24:34] [V] [TRT] sections.2.0.conv2.module.weight + QuantizeLinear_431_quantize_scale_node + Conv_435 + Relu_437 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r3s3 Tactic: -4933563390723451692 [03/25/2022-13:24:34] [V] [TRT] Tactic: -4933563390723451692 Time: 0.11072 [03/25/2022-13:24:34] [V] [TRT] sections.2.0.conv2.module.weight + QuantizeLinear_431_quantize_scale_node + Conv_435 + Relu_437 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_medium_nt_v1 Tactic: -4516822589357530549 [03/25/2022-13:24:34] [V] [TRT] Tactic: -4516822589357530549 Time: 0.146304 [03/25/2022-13:24:34] [V] [TRT] sections.2.0.conv2.module.weight + QuantizeLinear_431_quantize_scale_node + Conv_435 + Relu_437 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x16x128_stage3_warpsize1x1x1_g1_sptensor16x8x64 Tactic: -4232916483289779353 [03/25/2022-13:24:34] [V] [TRT] Tactic: -4232916483289779353 Time: 0.170624 [03/25/2022-13:24:34] [V] [TRT] sections.2.0.conv2.module.weight + QuantizeLinear_431_quantize_scale_node + Conv_435 + Relu_437 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x128_stage3_warpsize1x4x1_g1_sptensor16x8x64 Tactic: -3460842194336717186 [03/25/2022-13:24:34] [V] [TRT] Tactic: -3460842194336717186 Time: 0.095872 [03/25/2022-13:24:34] [V] [TRT] sections.2.0.conv2.module.weight + QuantizeLinear_431_quantize_scale_node + Conv_435 + Relu_437 Set Tactic Name: 
sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage4_warpsize4x2x1_g1_tensor16x8x32_t1r3s3_epifadd Tactic: -3413217501222406256 [03/25/2022-13:24:34] [V] [TRT] Tactic: -3413217501222406256 Time: 0.07616 [03/25/2022-13:24:34] [V] [TRT] sections.2.0.conv2.module.weight + QuantizeLinear_431_quantize_scale_node + Conv_435 + Relu_437 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32_t1r3s3 Tactic: -3280888557222886418 [03/25/2022-13:24:34] [V] [TRT] Tactic: -3280888557222886418 Time: 0.094592 [03/25/2022-13:24:34] [V] [TRT] sections.2.0.conv2.module.weight + QuantizeLinear_431_quantize_scale_node + Conv_435 + Relu_437 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r3s3 Tactic: -3238475748440751107 [03/25/2022-13:24:34] [V] [TRT] Tactic: -3238475748440751107 Time: 0.087936 [03/25/2022-13:24:34] [V] [TRT] sections.2.0.conv2.module.weight + QuantizeLinear_431_quantize_scale_node + Conv_435 + Relu_437 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage4_warpsize4x1x1_g1_tensor16x8x32_t1r3s3 Tactic: -3182884991006484042 [03/25/2022-13:24:34] [V] [TRT] Tactic: -3182884991006484042 Time: 0.09408 [03/25/2022-13:24:34] [V] [TRT] sections.2.0.conv2.module.weight + QuantizeLinear_431_quantize_scale_node + Conv_435 + Relu_437 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r3s3 Tactic: -3173468756112541306 [03/25/2022-13:24:34] [V] [TRT] Tactic: -3173468756112541306 Time: 0.142592 [03/25/2022-13:24:34] [V] [TRT] sections.2.0.conv2.module.weight + QuantizeLinear_431_quantize_scale_node + Conv_435 + Relu_437 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x128_ldg16_relu_large_nt_v1 Tactic: -2917455979290586480 [03/25/2022-13:24:34] [V] [TRT] Tactic: -2917455979290586480 Time: 0.144512 [03/25/2022-13:24:34] [V] [TRT] sections.2.0.conv2.module.weight + QuantizeLinear_431_quantize_scale_node + Conv_435 + Relu_437 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x128x64_stage3_warpsize4x2x1_g1_tensor16x8x32_t1r3s3 Tactic: -2741641298163591508 [03/25/2022-13:24:34] [V] [TRT] Tactic: -2741641298163591508 Time: 0.088064 [03/25/2022-13:24:34] [V] [TRT] sections.2.0.conv2.module.weight + QuantizeLinear_431_quantize_scale_node + Conv_435 + Relu_437 Set Tactic Name: ampere_int8_i8816cudnn_int8_128x128_ldg16_relu_large_nt_v1 Tactic: -2571022005763160364 [03/25/2022-13:24:34] [V] [TRT] Tactic: -2571022005763160364 Time: 0.148224 [03/25/2022-13:24:34] [V] [TRT] sections.2.0.conv2.module.weight + QuantizeLinear_431_quantize_scale_node + Conv_435 + Relu_437 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage1_warpsize2x2x1_g1_tensor8x8x16 Tactic: -2499089240293650188 [03/25/2022-13:24:34] [V] [TRT] Tactic: -2499089240293650188 Time: 0.14464 [03/25/2022-13:24:34] [V] [TRT] sections.2.0.conv2.module.weight + QuantizeLinear_431_quantize_scale_node + Conv_435 + Relu_437 Set Tactic Name: 
sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16 Tactic: -2328318099174473157 [03/25/2022-13:24:34] [V] [TRT] Tactic: -2328318099174473157 Time: 0.162176 [03/25/2022-13:24:34] [V] [TRT] sections.2.0.conv2.module.weight + QuantizeLinear_431_quantize_scale_node + Conv_435 + Relu_437 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_epifadd Tactic: -2083778562631872334 [03/25/2022-13:24:34] [V] [TRT] Tactic: -2083778562631872334 Time: 0.101248 [03/25/2022-13:24:34] [V] [TRT] sections.2.0.conv2.module.weight + QuantizeLinear_431_quantize_scale_node + Conv_435 + Relu_437 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x4x1_g1_tensor16x8x32 Tactic: -2054375205435666404 [03/25/2022-13:24:34] [V] [TRT] Tactic: -2054375205435666404 Time: 0.113152 [03/25/2022-13:24:34] [V] [TRT] sections.2.0.conv2.module.weight + QuantizeLinear_431_quantize_scale_node + Conv_435 + Relu_437 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32_t1r3s3 Tactic: -1546787387293556842 [03/25/2022-13:24:34] [V] [TRT] Tactic: -1546787387293556842 Time: 0.075904 [03/25/2022-13:24:34] [V] [TRT] sections.2.0.conv2.module.weight + QuantizeLinear_431_quantize_scale_node + Conv_435 + Relu_437 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x32x64_stage4_warpsize4x1x1_g1_tensor16x8x32 Tactic: -1498626619443284096 [03/25/2022-13:24:34] [V] [TRT] Tactic: -1498626619443284096 Time: 0.14976 [03/25/2022-13:24:34] [V] [TRT] sections.2.0.conv2.module.weight + QuantizeLinear_431_quantize_scale_node + Conv_435 + Relu_437 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r3s3 Tactic: -1471245223605064669 [03/25/2022-13:24:34] [V] [TRT] Tactic: -1471245223605064669 Time: 0.107776 [03/25/2022-13:24:34] [V] [TRT] sections.2.0.conv2.module.weight + QuantizeLinear_431_quantize_scale_node + Conv_435 + Relu_437 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32 Tactic: -1283580231568512025 [03/25/2022-13:24:34] [V] [TRT] Tactic: -1283580231568512025 Time: 0.18688 [03/25/2022-13:24:34] [V] [TRT] sections.2.0.conv2.module.weight + QuantizeLinear_431_quantize_scale_node + Conv_435 + Relu_437 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x64x64_stage3_warpsize4x1x1_g1_tensor16x8x32_t1r3s3 Tactic: -1224421172675151280 [03/25/2022-13:24:34] [V] [TRT] Tactic: -1224421172675151280 Time: 0.082944 [03/25/2022-13:24:34] [V] [TRT] sections.2.0.conv2.module.weight + QuantizeLinear_431_quantize_scale_node + Conv_435 + Relu_437 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_epifadd Tactic: -1173968681844185579 [03/25/2022-13:24:34] [V] [TRT] Tactic: -1173968681844185579 Time: 0.188416 
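[Annotation] Reading guide for the autotuning block above: each candidate kernel appears as a record pair — "Set Tactic Name: <kernel> Tactic: <id>" announcing the kernel, then "Tactic: <id> Time: <t>" with its measured time (milliseconds, as emitted by this TensorRT 8.2 build) — and the block closes with a "Fastest Tactic" line and the chosen runner. The sketch below is a hypothetical log-scraping helper (rank_tactics is not part of any TensorRT tooling) that ranks candidates in one such block; note that the same tactic id recurs in other layers' blocks, so it should be fed one autotuning block at a time.

```python
import re

# Hypothetical helper: rank autotuner candidates from one block of a
# trtexec --verbose log. Assumes the TensorRT 8.2 record format seen above:
#   "... Set Tactic Name: <kernel> Tactic: <id>"
#   "... Tactic: <id> Time: <ms>"
NAME_RE = re.compile(r"Set Tactic Name:\s+(\S+)\s+Tactic:\s+(-?\d+)")
TIME_RE = re.compile(r"Tactic:\s+(-?\d+)\s+Time:\s+([0-9]*\.?[0-9]+)")

def rank_tactics(block_text: str, top: int = 5) -> None:
    # Map tactic id -> kernel name from the "Set Tactic Name" records.
    names = {tid: name for name, tid in NAME_RE.findall(block_text)}
    # Collect measured times; keep the best if an id is timed twice.
    times: dict[str, float] = {}
    for tid, ms in TIME_RE.findall(block_text):
        t = float(ms)
        times[tid] = min(t, times.get(tid, t))
    for tid, ms in sorted(times.items(), key=lambda kv: kv[1])[:top]:
        print(f"{ms:.6f} ms  {tid}  {names.get(tid, '<unnamed>')}")

# Usage (file name is a placeholder):
# rank_tactics(open("trtexec_verbose.log").read())
```

The regexes match across wrapped lines because \s also matches newlines, so the helper works on a raw saved log as well as on this concatenated dump.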
[03/25/2022-13:24:34] [V] [TRT] sections.2.0.conv2.module.weight + QuantizeLinear_431_quantize_scale_node + Conv_435 + Relu_437 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x128x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r3s3 Tactic: -921247911551089037 [03/25/2022-13:24:34] [V] [TRT] Tactic: -921247911551089037 Time: 0.084608 [03/25/2022-13:24:34] [V] [TRT] sections.2.0.conv2.module.weight + QuantizeLinear_431_quantize_scale_node + Conv_435 + Relu_437 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32 Tactic: -762222380308749469 [03/25/2022-13:24:34] [V] [TRT] Tactic: -762222380308749469 Time: 0.110976 [03/25/2022-13:24:34] [V] [TRT] sections.2.0.conv2.module.weight + QuantizeLinear_431_quantize_scale_node + Conv_435 + Relu_437 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_epifadd Tactic: -556794153877490941 [03/25/2022-13:24:34] [V] [TRT] Tactic: -556794153877490941 Time: 0.111616 [03/25/2022-13:24:34] [V] [TRT] sections.2.0.conv2.module.weight + QuantizeLinear_431_quantize_scale_node + Conv_435 + Relu_437 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage4_warpsize4x2x1_g1_tensor16x8x32_t1r3s3 Tactic: -516725800067794372 [03/25/2022-13:24:34] [V] [TRT] Tactic: -516725800067794372 Time: 0.079872 [03/25/2022-13:24:34] [V] [TRT] sections.2.0.conv2.module.weight + QuantizeLinear_431_quantize_scale_node + Conv_435 + Relu_437 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_singleBuffer_large_nt_v1 Tactic: -428104331444385564 [03/25/2022-13:24:34] [V] [TRT] Tactic: -428104331444385564 Time: 0.146048 [03/25/2022-13:24:34] [V] [TRT] sections.2.0.conv2.module.weight + QuantizeLinear_431_quantize_scale_node + Conv_435 + Relu_437 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage1_warpsize2x1x1_g1_tensor8x8x16 Tactic: -366411318217594794 [03/25/2022-13:24:34] [V] [TRT] Tactic: -366411318217594794 Time: 0.186112 [03/25/2022-13:24:34] [V] [TRT] sections.2.0.conv2.module.weight + QuantizeLinear_431_quantize_scale_node + Conv_435 + Relu_437 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage1_warpsize2x4x1_g1_tensor8x8x16 Tactic: -351548418071036983 [03/25/2022-13:24:34] [V] [TRT] Tactic: -351548418071036983 Time: 0.133248 [03/25/2022-13:24:34] [V] [TRT] Fastest Tactic: 3651043333819148268 Time: 0.064512 [03/25/2022-13:24:34] [V] [TRT] >>>>>>>>>>>>>>> Chose Runner Type: CaskConvolution Tactic: 3651043333819148268 [03/25/2022-13:24:34] [V] [TRT] =============== Computing costs for [03/25/2022-13:24:34] [V] [TRT] *************** Autotuning format combination: Int8(12544,196:4,14,1), Int8(50176,196:4,14,1) -> Int8(50176,196:4,14,1) *************** [03/25/2022-13:24:34] [V] [TRT] --------------- Timing Runner: sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 (CudaDepthwiseConvolution) [03/25/2022-13:24:34] [V] [TRT] CudaDepthwiseConvolution has no valid tactics for this config, skipping [03/25/2022-13:24:34] [V] [TRT] --------------- Timing 
Runner: sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 (FusedConvActConvolution) [03/25/2022-13:24:34] [V] [TRT] FusedConvActConvolution has no valid tactics for this config, skipping [03/25/2022-13:24:34] [V] [TRT] --------------- Timing Runner: sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 (CaskConvolution) [03/25/2022-13:24:34] [V] [TRT] sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_medium_nn_v1 Tactic: 175853789719975416 [03/25/2022-13:24:34] [V] [TRT] Tactic: 175853789719975416 Time: 0.241792 [03/25/2022-13:24:34] [V] [TRT] sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_xregs_medium_nn_v1 Tactic: 2171150287007712632 [03/25/2022-13:24:34] [V] [TRT] Tactic: 2171150287007712632 Time: 0.253952 [03/25/2022-13:24:34] [V] [TRT] sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x64_relu_small_nn_v1 Tactic: 2234457234705232274 [03/25/2022-13:24:34] [V] [TRT] Tactic: 2234457234705232274 Time: 0.217344 [03/25/2022-13:24:34] [V] [TRT] sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x64_relu_medium_nn_v1 Tactic: 5834048089706882838 [03/25/2022-13:24:35] [V] [TRT] Tactic: 5834048089706882838 Time: 0.21888 [03/25/2022-13:24:35] [V] [TRT] sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x128_relu_interior_nn_v1 Tactic: 6299962968199310600 [03/25/2022-13:24:35] [V] [TRT] Tactic: 6299962968199310600 Time: 0.212224 [03/25/2022-13:24:35] [V] [TRT] sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_xregs_interior_nn_v1 Tactic: 6341572697076960911 [03/25/2022-13:24:35] [V] [TRT] Tactic: 6341572697076960911 Time: 0.232704 [03/25/2022-13:24:35] [V] [TRT] sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_small_nn_v1 Tactic: -8626990807754934295 [03/25/2022-13:24:35] [V] [TRT] Tactic: -8626990807754934295 Time: 0.23808 [03/25/2022-13:24:35] [V] [TRT] sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x64_relu_interior_nn_v1 Tactic: -8498217049614706532 [03/25/2022-13:24:35] [V] [TRT] Tactic: -8498217049614706532 Time: 0.209664 [03/25/2022-13:24:35] [V] [TRT] sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_xregs_small_nn_v1 Tactic: -7303593854972602201 [03/25/2022-13:24:35] [V] [TRT] Tactic: -7303593854972602201 Time: 0.240384 [03/25/2022-13:24:35] [V] [TRT] sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x128_relu_medium_nn_v1 Tactic: -6585664687867083638 [03/25/2022-13:24:35] [V] [TRT] Tactic: -6585664687867083638 
Time: 0.213888 [03/25/2022-13:24:35] [V] [TRT] sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_interior_nn_v1 Tactic: -3326139578711341011 [03/25/2022-13:24:35] [V] [TRT] Tactic: -3326139578711341011 Time: 0.226432 [03/25/2022-13:24:35] [V] [TRT] sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x128_relu_small_nn_v1 Tactic: -683636008127039856 [03/25/2022-13:24:35] [V] [TRT] Tactic: -683636008127039856 Time: 0.213504 [03/25/2022-13:24:35] [V] [TRT] Fastest Tactic: -8498217049614706532 Time: 0.209664 [03/25/2022-13:24:35] [V] [TRT] >>>>>>>>>>>>>>> Chose Runner Type: CaskConvolution Tactic: -8498217049614706532 [03/25/2022-13:24:35] [V] [TRT] *************** Autotuning format combination: Int8(12544,196:4,14,1), Int8(6272,196:32,14,1) -> Int8(6272,196:32,14,1) *************** [03/25/2022-13:24:35] [V] [TRT] --------------- Timing Runner: sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 (CaskConvolution) [03/25/2022-13:24:35] [V] [TRT] sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_small_c32_nn_v1 Tactic: 1100922622480907544 [03/25/2022-13:24:35] [V] [TRT] Tactic: 1100922622480907544 Time: 0.233984 [03/25/2022-13:24:35] [V] [TRT] sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x128_relu_interior_c32_nn_v1 Tactic: 2855900226702061782 [03/25/2022-13:24:35] [V] [TRT] Tactic: 2855900226702061782 Time: 0.211968 [03/25/2022-13:24:35] [V] [TRT] sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x64_relu_medium_c32_nn_v1 Tactic: 3606311198834416176 [03/25/2022-13:24:35] [V] [TRT] Tactic: 3606311198834416176 Time: 0.216704 [03/25/2022-13:24:35] [V] [TRT] sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x128_relu_small_c32_nn_v1 Tactic: 4325765560739862899 [03/25/2022-13:24:35] [V] [TRT] Tactic: 4325765560739862899 Time: 0.213248 [03/25/2022-13:24:35] [V] [TRT] sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x64_relu_interior_c32_nn_v1 Tactic: 8803458114157674373 [03/25/2022-13:24:35] [V] [TRT] Tactic: 8803458114157674373 Time: 0.208 [03/25/2022-13:24:35] [V] [TRT] sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_interior_c32_nn_v1 Tactic: -6934773036503365000 [03/25/2022-13:24:35] [V] [TRT] Tactic: -6934773036503365000 Time: 0.221952 [03/25/2022-13:24:35] [V] [TRT] sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_xregs_interior_c32_nn_v1 Tactic: -4431642509665791294 [03/25/2022-13:24:35] [V] [TRT] Tactic: -4431642509665791294 Time: 0.230016 [03/25/2022-13:24:35] [V] [TRT] sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set 
Tactic Name: ampere_int8x4_icudnn_int8x4_128x128_relu_medium_c32_nn_v1 Tactic: -4255737803793506479 [03/25/2022-13:24:35] [V] [TRT] Tactic: -4255737803793506479 Time: 0.213376 [03/25/2022-13:24:35] [V] [TRT] sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_xregs_small_c32_nn_v1 Tactic: -3958182351168863467 [03/25/2022-13:24:35] [V] [TRT] Tactic: -3958182351168863467 Time: 0.23936 [03/25/2022-13:24:35] [V] [TRT] sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_xregs_medium_c32_nn_v1 Tactic: -3111968753064955248 [03/25/2022-13:24:35] [V] [TRT] Tactic: -3111968753064955248 Time: 0.246912 [03/25/2022-13:24:35] [V] [TRT] sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_medium_c32_nn_v1 Tactic: -1492575840277333548 [03/25/2022-13:24:35] [V] [TRT] Tactic: -1492575840277333548 Time: 0.236032 [03/25/2022-13:24:35] [V] [TRT] sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x64_relu_small_c32_nn_v1 Tactic: -868495160148524802 [03/25/2022-13:24:35] [V] [TRT] Tactic: -868495160148524802 Time: 0.215296 [03/25/2022-13:24:35] [V] [TRT] Fastest Tactic: 8803458114157674373 Time: 0.208 [03/25/2022-13:24:35] [V] [TRT] >>>>>>>>>>>>>>> Chose Runner Type: CaskConvolution Tactic: 8803458114157674373 [03/25/2022-13:24:35] [V] [TRT] *************** Autotuning format combination: Int8(1568,196:32,14,1), Int8(6272,196:32,14,1) -> Int8(6272,196:32,14,1) *************** [03/25/2022-13:24:35] [V] [TRT] --------------- Timing Runner: sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 (CudaGroupConvolution) [03/25/2022-13:24:35] [V] [TRT] CudaGroupConvolution has no valid tactics for this config, skipping [03/25/2022-13:24:35] [V] [TRT] --------------- Timing Runner: sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 (CudaDepthwiseConvolution) [03/25/2022-13:24:35] [V] [TRT] CudaDepthwiseConvolution has no valid tactics for this config, skipping [03/25/2022-13:24:35] [V] [TRT] --------------- Timing Runner: sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 (FusedConvActConvolution) [03/25/2022-13:24:35] [V] [TRT] FusedConvActConvolution has no valid tactics for this config, skipping [03/25/2022-13:24:35] [V] [TRT] --------------- Timing Runner: sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 (CaskConvolution) [03/25/2022-13:24:35] [V] [TRT] sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage1_warpsize4x1x1_g1_tensor8x8x16_t1r1s1 Tactic: 68468667201176803 [03/25/2022-13:24:35] [V] [TRT] Tactic: 68468667201176803 Time: 0.113536 [03/25/2022-13:24:35] [V] [TRT] sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set Tactic Name: 
sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x192x64_stage3_warpsize4x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 125145153013230687 [03/25/2022-13:24:35] [V] [TRT] Tactic: 125145153013230687 Time: 0.1024 [03/25/2022-13:24:35] [V] [TRT] sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x128_ldg16_relu_interior_nt_v1 Tactic: 434957160407688216 [03/25/2022-13:24:35] [V] [TRT] Tactic: 434957160407688216 Time: 0.125568 [03/25/2022-13:24:35] [V] [TRT] sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage4_warpsize4x2x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: 805889586762897346 [03/25/2022-13:24:35] [V] [TRT] Tactic: 805889586762897346 Time: 0.09088 [03/25/2022-13:24:35] [V] [TRT] sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x128_stage3_warpsize4x2x1_g1_sptensor16x8x64_t1r1s1 Tactic: 857001784974286465 [03/25/2022-13:24:35] [V] [TRT] Tactic: 857001784974286465 Time: 0.091904 [03/25/2022-13:24:35] [V] [TRT] sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 1214130898909872671 [03/25/2022-13:24:35] [V] [TRT] Tactic: 1214130898909872671 Time: 0.12096 [03/25/2022-13:24:35] [V] [TRT] sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage4_warpsize4x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 1278425129871930205 [03/25/2022-13:24:35] [V] [TRT] Tactic: 1278425129871930205 Time: 0.0864 [03/25/2022-13:24:35] [V] [TRT] sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 1583811548148740665 [03/25/2022-13:24:35] [V] [TRT] Tactic: 1583811548148740665 Time: 0.095872 [03/25/2022-13:24:35] [V] [TRT] sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 1701344857577810806 [03/25/2022-13:24:35] [V] [TRT] Tactic: 1701344857577810806 Time: 0.104576 [03/25/2022-13:24:35] [V] [TRT] sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x32x64_stage4_warpsize2x1x1_g1_tensor16x8x32_t1r1s1 Tactic: 1797231177354918208 [03/25/2022-13:24:35] [V] [TRT] Tactic: 1797231177354918208 
Time: 0.131328 [03/25/2022-13:24:35] [V] [TRT] sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x192x64_stage3_warpsize4x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 2004812516525036381 [03/25/2022-13:24:35] [V] [TRT] Tactic: 2004812516525036381 Time: 0.094464 [03/25/2022-13:24:35] [V] [TRT] sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x128x64_stage3_warpsize4x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 2030033463723799063 [03/25/2022-13:24:35] [V] [TRT] Tactic: 2030033463723799063 Time: 0.089728 [03/25/2022-13:24:35] [V] [TRT] sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set Tactic Name: ampere_int8_i8816cudnn_int8_128x128_ldg16_relu_interior_nt_v1 Tactic: 2346437292116182513 [03/25/2022-13:24:35] [V] [TRT] Tactic: 2346437292116182513 Time: 0.107648 [03/25/2022-13:24:35] [V] [TRT] sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 2376898825218218566 [03/25/2022-13:24:35] [V] [TRT] Tactic: 2376898825218218566 Time: 0.07616 [03/25/2022-13:24:35] [V] [TRT] sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_singleBuffer_interior_nt_v1 Tactic: 2522133112320625287 [03/25/2022-13:24:35] [V] [TRT] Tactic: 2522133112320625287 Time: 0.106624 [03/25/2022-13:24:35] [V] [TRT] sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage4_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 2548171972648455240 [03/25/2022-13:24:35] [V] [TRT] Tactic: 2548171972648455240 Time: 0.079872 [03/25/2022-13:24:35] [V] [TRT] sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage1_warpsize2x1x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: 2548946449357458230 [03/25/2022-13:24:35] [V] [TRT] Tactic: 2548946449357458230 Time: 0.121088 [03/25/2022-13:24:35] [V] [TRT] sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x128_stage3_warpsize1x1x1_g1_sptensor16x8x64_simple_t1r1s1_no_preds Tactic: 2570666021825229009 [03/25/2022-13:24:35] [V] [TRT] Tactic: 2570666021825229009 Time: 0.129536 [03/25/2022-13:24:35] [V] [TRT] sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set Tactic Name: 
sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x16x128_stage3_warpsize1x1x1_g1_sptensor16x8x64_simple_t1r1s1 Tactic: 2678520742286844763 [03/25/2022-13:24:35] [V] [TRT] Tactic: 2678520742286844763 Time: 0.140672 [03/25/2022-13:24:35] [V] [TRT] sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 2756291002030759362 [03/25/2022-13:24:35] [V] [TRT] Tactic: 2756291002030759362 Time: 0.106752 [03/25/2022-13:24:35] [V] [TRT] sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x128x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 2972948223367788520 [03/25/2022-13:24:35] [V] [TRT] Tactic: 2972948223367788520 Time: 0.077824 [03/25/2022-13:24:35] [V] [TRT] sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set Tactic Name: ampere_int8_i8816cudnn_int8_128x128_ldg16_relu_medium_nt_v1 Tactic: 2985940154541537814 [03/25/2022-13:24:35] [V] [TRT] Tactic: 2985940154541537814 Time: 0.10688 [03/25/2022-13:24:35] [V] [TRT] sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x96x64_stage5_warpsize2x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 3043273137345374664 [03/25/2022-13:24:35] [V] [TRT] Tactic: 3043273137345374664 Time: 0.108288 [03/25/2022-13:24:35] [V] [TRT] sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 3221677093659484230 [03/25/2022-13:24:35] [V] [TRT] Tactic: 3221677093659484230 Time: 0.133504 [03/25/2022-13:24:35] [V] [TRT] sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r1s1 Tactic: 3242897809704328258 [03/25/2022-13:24:35] [V] [TRT] Tactic: 3242897809704328258 Time: 0.116736 [03/25/2022-13:24:35] [V] [TRT] sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: 3312456766204252694 [03/25/2022-13:24:35] [V] [TRT] Tactic: 3312456766204252694 Time: 0.122112 [03/25/2022-13:24:35] [V] [TRT] sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 3538565962642681625 [03/25/2022-13:24:35] [V] [TRT] 
Tactic: 3538565962642681625 Time: 0.104064 [03/25/2022-13:24:35] [V] [TRT] sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage1_warpsize2x2x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: 3541919052468401776 [03/25/2022-13:24:35] [V] [TRT] Tactic: 3541919052468401776 Time: 0.103936 [03/25/2022-13:24:35] [V] [TRT] sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 3593397928177382100 [03/25/2022-13:24:35] [V] [TRT] Tactic: 3593397928177382100 Time: 0.121728 [03/25/2022-13:24:35] [V] [TRT] sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x64x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 3670282018109435863 [03/25/2022-13:24:35] [V] [TRT] Tactic: 3670282018109435863 Time: 0.087552 [03/25/2022-13:24:35] [V] [TRT] sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 3671413346254027573 [03/25/2022-13:24:35] [V] [TRT] Tactic: 3671413346254027573 Time: 0.096896 [03/25/2022-13:24:35] [V] [TRT] sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x128_ldg16_relu_medium_nt_v1 Tactic: 3899284354987683408 [03/25/2022-13:24:35] [V] [TRT] Tactic: 3899284354987683408 Time: 0.1248 [03/25/2022-13:24:35] [V] [TRT] sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: 3927509214678622419 [03/25/2022-13:24:35] [V] [TRT] Tactic: 3927509214678622419 Time: 0.112896 [03/25/2022-13:24:35] [V] [TRT] sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 4112572034735311841 [03/25/2022-13:24:35] [V] [TRT] Tactic: 4112572034735311841 Time: 0.155136 [03/25/2022-13:24:35] [V] [TRT] sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 4239974928951431644 [03/25/2022-13:24:35] [V] [TRT] Tactic: 4239974928951431644 Time: 0.083968 [03/25/2022-13:24:35] [V] [TRT] sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set Tactic Name: 
sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage6_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 4610760414797216079 [03/25/2022-13:24:35] [V] [TRT] Tactic: 4610760414797216079 Time: 0.089216 [03/25/2022-13:24:35] [V] [TRT] sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_small_nt_v1 Tactic: 4717285412741024953 [03/25/2022-13:24:35] [V] [TRT] Tactic: 4717285412741024953 Time: 0.110976 [03/25/2022-13:24:35] [V] [TRT] sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 4796956614760326119 [03/25/2022-13:24:35] [V] [TRT] Tactic: 4796956614760326119 Time: 0.084224 [03/25/2022-13:24:35] [V] [TRT] sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x128_stage3_warpsize2x2x1_g1_sptensor16x8x64_t1r1s1_no_preds Tactic: 4909502217677847353 [03/25/2022-13:24:35] [V] [TRT] Tactic: 4909502217677847353 Time: 0.0832 [03/25/2022-13:24:35] [V] [TRT] sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 4919361344804309192 [03/25/2022-13:24:35] [V] [TRT] Tactic: 4919361344804309192 Time: 0.134656 [03/25/2022-13:24:35] [V] [TRT] sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x128_stage3_warpsize1x2x1_g1_sptensor16x8x64_simple_t1r1s1 Tactic: 5043674678294309681 [03/25/2022-13:24:35] [V] [TRT] Tactic: 5043674678294309681 Time: 0.095744 [03/25/2022-13:24:35] [V] [TRT] sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_interior_nt_v1 Tactic: 5126565865931538390 [03/25/2022-13:24:35] [V] [TRT] Tactic: 5126565865931538390 Time: 0.109184 [03/25/2022-13:24:35] [V] [TRT] sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x64x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 5204702486885981735 [03/25/2022-13:24:35] [V] [TRT] Tactic: 5204702486885981735 Time: 0.09152 [03/25/2022-13:24:35] [V] [TRT] sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: 5375256703210220108 [03/25/2022-13:24:35] [V] [TRT] Tactic: 5375256703210220108 Time: 0.099456 [03/25/2022-13:24:35] [V] [TRT] 
sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x128_stage3_warpsize2x2x1_g1_sptensor16x8x64_simple_t1r1s1 Tactic: 5424258848951129084 [03/25/2022-13:24:35] [V] [TRT] Tactic: 5424258848951129084 Time: 0.084352 [03/25/2022-13:24:35] [V] [TRT] sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 5443897483205284103 [03/25/2022-13:24:35] [V] [TRT] Tactic: 5443897483205284103 Time: 0.101376 [03/25/2022-13:24:35] [V] [TRT] sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage4_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 5707566217891294846 [03/25/2022-13:24:35] [V] [TRT] Tactic: 5707566217891294846 Time: 0.083328 [03/25/2022-13:24:35] [V] [TRT] sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x4x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 5986622376339202983 [03/25/2022-13:24:35] [V] [TRT] Tactic: 5986622376339202983 Time: 0.093952 [03/25/2022-13:24:35] [V] [TRT] sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x128x64_stage3_warpsize4x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 6007888770437705057 [03/25/2022-13:24:35] [V] [TRT] Tactic: 6007888770437705057 Time: 0.094464 [03/25/2022-13:24:35] [V] [TRT] sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32_t1r1s1 Tactic: 6405251167055673379 [03/25/2022-13:24:35] [V] [TRT] Tactic: 6405251167055673379 Time: 0.08704 [03/25/2022-13:24:35] [V] [TRT] sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage4_warpsize4x1x1_g1_tensor16x8x32_t1r1s1 Tactic: 6433368103202497147 [03/25/2022-13:24:35] [V] [TRT] Tactic: 6433368103202497147 Time: 0.087424 [03/25/2022-13:24:35] [V] [TRT] sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 6441948709525127755 [03/25/2022-13:24:35] [V] [TRT] Tactic: 6441948709525127755 Time: 0.127232 [03/25/2022-13:24:35] [V] [TRT] sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set Tactic Name: 
sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x256x64_stage3_warpsize2x4x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 6443933097134654777 [03/25/2022-13:24:35] [V] [TRT] Tactic: 6443933097134654777 Time: 0.094336 [03/25/2022-13:24:35] [V] [TRT] sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 6457435868048963632 [03/25/2022-13:24:35] [V] [TRT] Tactic: 6457435868048963632 Time: 0.089728 [03/25/2022-13:24:35] [V] [TRT] sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x128_stage3_warpsize1x1x1_g1_sptensor16x8x64_t1r1s1 Tactic: 6510345569544721081 [03/25/2022-13:24:35] [V] [TRT] Tactic: 6510345569544721081 Time: 0.131328 [03/25/2022-13:24:35] [V] [TRT] sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 6793988781414507278 [03/25/2022-13:24:35] [V] [TRT] Tactic: 6793988781414507278 Time: 0.079104 [03/25/2022-13:24:35] [V] [TRT] sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 6880710371738875469 [03/25/2022-13:24:35] [V] [TRT] Tactic: 6880710371738875469 Time: 0.090368 [03/25/2022-13:24:35] [V] [TRT] sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 6925201228918187099 [03/25/2022-13:24:35] [V] [TRT] Tactic: 6925201228918187099 Time: 0.081664 [03/25/2022-13:24:35] [V] [TRT] sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 6991524515605108718 [03/25/2022-13:24:35] [V] [TRT] Tactic: 6991524515605108718 Time: 0.11776 [03/25/2022-13:24:35] [V] [TRT] sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x4x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 7245509442265271220 [03/25/2022-13:24:35] [V] [TRT] Tactic: 7245509442265271220 Time: 0.089088 [03/25/2022-13:24:35] [V] [TRT] sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set Tactic Name: 
sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x64x64_stage3_warpsize4x1x1_g1_tensor16x8x32_t1r1s1 Tactic: 7318929579222925725 [03/25/2022-13:24:35] [V] [TRT] Tactic: 7318929579222925725 Time: 0.086528 [03/25/2022-13:24:35] [V] [TRT] sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: 7731430299029542276 [03/25/2022-13:24:35] [V] [TRT] Tactic: 7731430299029542276 Time: 0.079104 [03/25/2022-13:24:35] [V] [TRT] sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage4_warpsize2x4x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: 7738495016763012180 [03/25/2022-13:24:35] [V] [TRT] Tactic: 7738495016763012180 Time: 0.089344 [03/25/2022-13:24:35] [V] [TRT] sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x128_stage3_warpsize1x4x1_g1_sptensor16x8x64_t1r1s1_no_preds Tactic: 7886967395128926382 [03/25/2022-13:24:35] [V] [TRT] Tactic: 7886967395128926382 Time: 0.0928 [03/25/2022-13:24:35] [V] [TRT] sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage4_warpsize4x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 8142283985160822229 [03/25/2022-13:24:35] [V] [TRT] Tactic: 8142283985160822229 Time: 0.091264 [03/25/2022-13:24:35] [V] [TRT] sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage4_warpsize2x4x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 8173975624668590862 [03/25/2022-13:24:35] [V] [TRT] Tactic: 8173975624668590862 Time: 0.09216 [03/25/2022-13:24:35] [V] [TRT] sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage4_warpsize4x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 8234775147403903473 [03/25/2022-13:24:35] [V] [TRT] Tactic: 8234775147403903473 Time: 0.094592 [03/25/2022-13:24:35] [V] [TRT] sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x128x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 8524082966802584889 [03/25/2022-13:24:35] [V] [TRT] Tactic: 8524082966802584889 Time: 0.081408 [03/25/2022-13:24:35] [V] [TRT] sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set Tactic Name: 
sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage1_warpsize4x1x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: 8684013308930763400 [03/25/2022-13:24:35] [V] [TRT] Tactic: 8684013308930763400 Time: 0.109312 [03/25/2022-13:24:35] [V] [TRT] sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x128x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 8765382722978397630 [03/25/2022-13:24:35] [V] [TRT] Tactic: 8765382722978397630 Time: 0.08256 [03/25/2022-13:24:35] [V] [TRT] sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 8843193587782643431 [03/25/2022-13:24:35] [V] [TRT] Tactic: 8843193587782643431 Time: 0.13312 [03/25/2022-13:24:35] [V] [TRT] sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x128_stage3_warpsize1x4x1_g1_sptensor16x8x64_simple_t1r1s1 Tactic: 8883810517410230831 [03/25/2022-13:24:35] [V] [TRT] Tactic: 8883810517410230831 Time: 0.116992 [03/25/2022-13:24:35] [V] [TRT] sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 8930797211803511337 [03/25/2022-13:24:35] [V] [TRT] Tactic: 8930797211803511337 Time: 0.138368 [03/25/2022-13:24:35] [V] [TRT] sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 8935070489925739043 [03/25/2022-13:24:35] [V] [TRT] Tactic: 8935070489925739043 Time: 0.099328 [03/25/2022-13:24:35] [V] [TRT] sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x16x128_stage3_warpsize1x1x1_g1_sptensor16x8x64_t1r1s1 Tactic: 9062173295331155069 [03/25/2022-13:24:35] [V] [TRT] Tactic: 9062173295331155069 Time: 0.177024 [03/25/2022-13:24:35] [V] [TRT] sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_singleBuffer_small_nt_v1 Tactic: -9118785798277698619 [03/25/2022-13:24:35] [V] [TRT] Tactic: -9118785798277698619 Time: 0.132224 [03/25/2022-13:24:35] [V] [TRT] sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -8985599729413291927 
[03/25/2022-13:24:35] [V] [TRT] Tactic: -8985599729413291927 Time: 0.120576 [03/25/2022-13:24:35] [V] [TRT] sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x32x64_stage4_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -8972697510150675429 [03/25/2022-13:24:35] [V] [TRT] Tactic: -8972697510150675429 Time: 0.135936 [03/25/2022-13:24:35] [V] [TRT] sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r1s1 Tactic: -8943710627305202139 [03/25/2022-13:24:35] [V] [TRT] Tactic: -8943710627305202139 Time: 0.12352 [03/25/2022-13:24:35] [V] [TRT] sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: -8859846367886814331 [03/25/2022-13:24:35] [V] [TRT] Tactic: -8859846367886814331 Time: 0.140672 [03/25/2022-13:24:35] [V] [TRT] sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: -8638624340850784688 [03/25/2022-13:24:35] [V] [TRT] Tactic: -8638624340850784688 Time: 0.143744 [03/25/2022-13:24:35] [V] [TRT] sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage4_warpsize4x1x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: -8556775352640313933 [03/25/2022-13:24:35] [V] [TRT] Tactic: -8556775352640313933 Time: 0.105344 [03/25/2022-13:24:35] [V] [TRT] sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage1_warpsize2x1x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: -8382298409581540699 [03/25/2022-13:24:35] [V] [TRT] Tactic: -8382298409581540699 Time: 0.168064 [03/25/2022-13:24:35] [V] [TRT] sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage1_warpsize4x2x1_g1_tensor8x8x16_t1r1s1 Tactic: -8172318747337038866 [03/25/2022-13:24:35] [V] [TRT] Tactic: -8172318747337038866 Time: 0.159232 [03/25/2022-13:24:35] [V] [TRT] sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x256x64_stage3_warpsize2x4x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -8038164441468184723 [03/25/2022-13:24:35] [V] [TRT] Tactic: -8038164441468184723 Time: 0.109696 [03/25/2022-13:24:35] [V] [TRT] 
sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x96x64_stage5_warpsize2x1x1_g1_tensor16x8x32_t1r1s1 Tactic: -7844028314176826857 [03/25/2022-13:24:35] [V] [TRT] Tactic: -7844028314176826857 Time: 0.139904 [03/25/2022-13:24:35] [V] [TRT] sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x128_stage3_warpsize2x2x1_g1_sptensor16x8x64_simple_t1r1s1_no_preds Tactic: -7674507941016740570 [03/25/2022-13:24:35] [V] [TRT] Tactic: -7674507941016740570 Time: 0.10368 [03/25/2022-13:24:35] [V] [TRT] sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage4_warpsize2x4x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -7364286662638617917 [03/25/2022-13:24:35] [V] [TRT] Tactic: -7364286662638617917 Time: 0.109824 [03/25/2022-13:24:35] [V] [TRT] sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage1_warpsize2x1x1_g1_tensor8x8x16_t1r1s1 Tactic: -7361755530333096258 [03/25/2022-13:24:35] [V] [TRT] Tactic: -7361755530333096258 Time: 0.152448 [03/25/2022-13:24:35] [V] [TRT] sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x128_stage3_warpsize1x4x1_g1_sptensor16x8x64_t1r1s1 Tactic: -7289760022626653388 [03/25/2022-13:24:35] [V] [TRT] Tactic: -7289760022626653388 Time: 0.117888 [03/25/2022-13:24:35] [V] [TRT] sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage1_warpsize2x4x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: -7106539943789766885 [03/25/2022-13:24:35] [V] [TRT] Tactic: -7106539943789766885 Time: 0.15552 [03/25/2022-13:24:35] [V] [TRT] sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage1_warpsize4x2x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: -6969478418607271266 [03/25/2022-13:24:35] [V] [TRT] Tactic: -6969478418607271266 Time: 0.155008 [03/25/2022-13:24:35] [V] [TRT] sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -6930438165437733000 [03/25/2022-13:24:35] [V] [TRT] Tactic: -6930438165437733000 Time: 0.192256 [03/25/2022-13:24:35] [V] [TRT] sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + 
Relu_473 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x64x64_stage3_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -6879607992933502380 [03/25/2022-13:24:35] [V] [TRT] Tactic: -6879607992933502380 Time: 0.100096 [03/25/2022-13:24:35] [V] [TRT] sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x128_stage3_warpsize1x2x1_g1_sptensor16x8x64_simple_t1r1s1_no_preds Tactic: -6839669803644810934 [03/25/2022-13:24:35] [V] [TRT] Tactic: -6839669803644810934 Time: 0.117888 [03/25/2022-13:24:35] [V] [TRT] sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x128_stage3_warpsize1x2x1_g1_sptensor16x8x64_t1r1s1_no_preds Tactic: -6812830108414456369 [03/25/2022-13:24:35] [V] [TRT] Tactic: -6812830108414456369 Time: 0.117248 [03/25/2022-13:24:35] [V] [TRT] sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x128_stage3_warpsize4x2x1_g1_sptensor16x8x64_simple_t1r1s1_no_preds Tactic: -6779804930216439173 [03/25/2022-13:24:35] [V] [TRT] Tactic: -6779804930216439173 Time: 0.115328 [03/25/2022-13:24:35] [V] [TRT] sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: -6527178416855951297 [03/25/2022-13:24:35] [V] [TRT] Tactic: -6527178416855951297 Time: 0.17536 [03/25/2022-13:24:35] [V] [TRT] sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_t1r1s1 Tactic: -6510232214299595844 [03/25/2022-13:24:35] [V] [TRT] Tactic: -6510232214299595844 Time: 0.181504 [03/25/2022-13:24:35] [V] [TRT] sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set Tactic Name: ampere_int8_i8816cudnn_int8_128x128_ldg16_relu_small_nt_v1 Tactic: -6400348606759295499 [03/25/2022-13:24:35] [V] [TRT] Tactic: -6400348606759295499 Time: 0.127872 [03/25/2022-13:24:35] [V] [TRT] sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage1_warpsize2x2x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: -6346247605026339453 [03/25/2022-13:24:35] [V] [TRT] Tactic: -6346247605026339453 Time: 0.12288 [03/25/2022-13:24:35] [V] [TRT] sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x96x64_stage4_warpsize4x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 
-6232597026469067819 [03/25/2022-13:24:35] [V] [TRT] Tactic: -6232597026469067819 Time: 0.142848 [03/25/2022-13:24:35] [V] [TRT] sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x128_ldg16_relu_small_nt_v1 Tactic: -5980889159865208399 [03/25/2022-13:24:35] [V] [TRT] Tactic: -5980889159865208399 Time: 0.155904 [03/25/2022-13:24:35] [V] [TRT] sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_singleBuffer_medium_nt_v1 Tactic: -5766140806760372989 [03/25/2022-13:24:35] [V] [TRT] Tactic: -5766140806760372989 Time: 0.13632 [03/25/2022-13:24:35] [V] [TRT] sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r1s1 Tactic: -5697614955743334137 [03/25/2022-13:24:35] [V] [TRT] Tactic: -5697614955743334137 Time: 0.130176 [03/25/2022-13:24:35] [V] [TRT] sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -5671123121710113970 [03/25/2022-13:24:35] [V] [TRT] Tactic: -5671123121710113970 Time: 0.108928 [03/25/2022-13:24:35] [V] [TRT] sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x32x64_stage4_warpsize2x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -5615581362569252260 [03/25/2022-13:24:35] [V] [TRT] Tactic: -5615581362569252260 Time: 0.14848 [03/25/2022-13:24:35] [V] [TRT] sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x128x64_stage3_warpsize4x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -5562968047117507056 [03/25/2022-13:24:36] [V] [TRT] Tactic: -5562968047117507056 Time: 0.114432 [03/25/2022-13:24:36] [V] [TRT] sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -5516472881360101487 [03/25/2022-13:24:36] [V] [TRT] Tactic: -5516472881360101487 Time: 0.14144 [03/25/2022-13:24:36] [V] [TRT] sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r1s1 Tactic: -5311474420963248369 [03/25/2022-13:24:36] [V] [TRT] Tactic: -5311474420963248369 Time: 0.15488 [03/25/2022-13:24:36] [V] [TRT] sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set Tactic Name: 
sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: -5170003087447722174 [03/25/2022-13:24:36] [V] [TRT] Tactic: -5170003087447722174 Time: 0.159872 [03/25/2022-13:24:36] [V] [TRT] sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage6_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -4889586143772361690 [03/25/2022-13:24:36] [V] [TRT] Tactic: -4889586143772361690 Time: 0.109696 [03/25/2022-13:24:36] [V] [TRT] sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x128_stage3_warpsize1x4x1_g1_sptensor16x8x64_simple_t1r1s1_no_preds Tactic: -4889498558023475527 [03/25/2022-13:24:36] [V] [TRT] Tactic: -4889498558023475527 Time: 0.117632 [03/25/2022-13:24:36] [V] [TRT] sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: -4849712423393454704 [03/25/2022-13:24:36] [V] [TRT] Tactic: -4849712423393454704 Time: 0.112256 [03/25/2022-13:24:36] [V] [TRT] sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x128_stage3_warpsize4x2x1_g1_sptensor16x8x64_simple_t1r1s1 Tactic: -4681913707320020520 [03/25/2022-13:24:36] [V] [TRT] Tactic: -4681913707320020520 Time: 0.115456 [03/25/2022-13:24:36] [V] [TRT] sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_medium_nt_v1 Tactic: -4516822589357530549 [03/25/2022-13:24:36] [V] [TRT] Tactic: -4516822589357530549 Time: 0.140544 [03/25/2022-13:24:36] [V] [TRT] sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -4455415102719506646 [03/25/2022-13:24:36] [V] [TRT] Tactic: -4455415102719506646 Time: 0.128 [03/25/2022-13:24:36] [V] [TRT] sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32_t1r1s1 Tactic: -4425346730823666456 [03/25/2022-13:24:36] [V] [TRT] Tactic: -4425346730823666456 Time: 0.152832 [03/25/2022-13:24:36] [V] [TRT] sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage1_warpsize2x1x1_g1_tensor8x8x16_t1r1s1 Tactic: -4260476497340370474 
[03/25/2022-13:24:36] [V] [TRT] Tactic: -4260476497340370474 Time: 0.172928 [03/25/2022-13:24:36] [V] [TRT] sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x2x1_g1_tensor16x8x32_t1r1s1 Tactic: -4182501876984672402 [03/25/2022-13:24:36] [V] [TRT] Tactic: -4182501876984672402 Time: 0.113792 [03/25/2022-13:24:36] [V] [TRT] sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x256x64_stage3_warpsize2x4x1_g1_tensor16x8x32_t1r1s1 Tactic: -4151617293257698859 [03/25/2022-13:24:36] [V] [TRT] Tactic: -4151617293257698859 Time: 0.117376 [03/25/2022-13:24:36] [V] [TRT] sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x128_stage3_warpsize4x2x1_g1_sptensor16x8x64_t1r1s1_no_preds Tactic: -3862908719298381451 [03/25/2022-13:24:36] [V] [TRT] Tactic: -3862908719298381451 Time: 0.1152 [03/25/2022-13:24:36] [V] [TRT] sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -3825889760337461729 [03/25/2022-13:24:36] [V] [TRT] Tactic: -3825889760337461729 Time: 0.169728 [03/25/2022-13:24:36] [V] [TRT] sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -3797022944823726673 [03/25/2022-13:24:36] [V] [TRT] Tactic: -3797022944823726673 Time: 0.125824 [03/25/2022-13:24:36] [V] [TRT] sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_t1r1s1 Tactic: -3613322253849278738 [03/25/2022-13:24:36] [V] [TRT] Tactic: -3613322253849278738 Time: 0.202112 [03/25/2022-13:24:36] [V] [TRT] sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x32x64_stage4_warpsize4x1x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: -3577322188448771475 [03/25/2022-13:24:36] [V] [TRT] Tactic: -3577322188448771475 Time: 0.149376 [03/25/2022-13:24:36] [V] [TRT] sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage1_warpsize2x4x1_g1_tensor8x8x16_t1r1s1 Tactic: -3531681826488401618 [03/25/2022-13:24:36] [V] [TRT] Tactic: -3531681826488401618 Time: 0.160768 [03/25/2022-13:24:36] [V] [TRT] 
sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x16x128_stage3_warpsize1x1x1_g1_sptensor16x8x64_simple_t1r1s1_no_preds Tactic: -3305554949874552860 [03/25/2022-13:24:36] [V] [TRT] Tactic: -3305554949874552860 Time: 0.176512 [03/25/2022-13:24:36] [V] [TRT] sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x192x64_stage3_warpsize4x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -3288585994448820820 [03/25/2022-13:24:36] [V] [TRT] Tactic: -3288585994448820820 Time: 0.128256 [03/25/2022-13:24:36] [V] [TRT] sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x32x64_stage4_warpsize4x1x1_g1_tensor16x8x32_t1r1s1 Tactic: -2754311112012636251 [03/25/2022-13:24:36] [V] [TRT] Tactic: -2754311112012636251 Time: 0.156544 [03/25/2022-13:24:36] [V] [TRT] sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -2432868635536396215 [03/25/2022-13:24:36] [V] [TRT] Tactic: -2432868635536396215 Time: 0.120064 [03/25/2022-13:24:36] [V] [TRT] sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x96x64_stage5_warpsize2x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -2379804152300264660 [03/25/2022-13:24:36] [V] [TRT] Tactic: -2379804152300264660 Time: 0.137984 [03/25/2022-13:24:36] [V] [TRT] sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x128_stage3_warpsize2x2x1_g1_sptensor16x8x64_t1r1s1 Tactic: -2352253835013627337 [03/25/2022-13:24:36] [V] [TRT] Tactic: -2352253835013627337 Time: 0.106496 [03/25/2022-13:24:36] [V] [TRT] sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x96x64_stage4_warpsize4x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -2335587136911650799 [03/25/2022-13:24:36] [V] [TRT] Tactic: -2335587136911650799 Time: 0.134784 [03/25/2022-13:24:36] [V] [TRT] sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage4_warpsize2x4x1_g1_tensor16x8x32_t1r1s1 Tactic: -2315453944962430928 [03/25/2022-13:24:36] [V] [TRT] Tactic: -2315453944962430928 Time: 0.120704 [03/25/2022-13:24:36] [V] [TRT] sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + 
Relu_473 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x128_stage3_warpsize1x1x1_g1_sptensor16x8x64_simple_t1r1s1 Tactic: -2238364958919154661 [03/25/2022-13:24:36] [V] [TRT] Tactic: -2238364958919154661 Time: 0.163584 [03/25/2022-13:24:36] [V] [TRT] sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -1916483171117495388 [03/25/2022-13:24:36] [V] [TRT] Tactic: -1916483171117495388 Time: 0.147584 [03/25/2022-13:24:36] [V] [TRT] sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x16x128_stage3_warpsize1x1x1_g1_sptensor16x8x64_t1r1s1_no_preds Tactic: -1740762957710554518 [03/25/2022-13:24:36] [V] [TRT] Tactic: -1740762957710554518 Time: 0.177664 [03/25/2022-13:24:36] [V] [TRT] sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x96x64_stage4_warpsize4x2x1_g1_tensor16x8x32_t1r1s1 Tactic: -1549742793039499659 [03/25/2022-13:24:36] [V] [TRT] Tactic: -1549742793039499659 Time: 0.146816 [03/25/2022-13:24:36] [V] [TRT] sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: -1499578657823798783 [03/25/2022-13:24:36] [V] [TRT] Tactic: -1499578657823798783 Time: 0.130432 [03/25/2022-13:24:36] [V] [TRT] sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r1s1 Tactic: -1494157908358500249 [03/25/2022-13:24:36] [V] [TRT] Tactic: -1494157908358500249 Time: 0.146176 [03/25/2022-13:24:36] [V] [TRT] sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x4x1_g1_tensor16x8x32_t1r1s1 Tactic: -1328736756812546664 [03/25/2022-13:24:36] [V] [TRT] Tactic: -1328736756812546664 Time: 0.12032 [03/25/2022-13:24:36] [V] [TRT] sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x32x64_stage4_warpsize2x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -1006589727652607355 [03/25/2022-13:24:36] [V] [TRT] Tactic: -1006589727652607355 Time: 0.156672 [03/25/2022-13:24:36] [V] [TRT] sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set Tactic Name: 
sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: -713022856474991236 [03/25/2022-13:24:36] [V] [TRT] Tactic: -713022856474991236 Time: 0.19712 [03/25/2022-13:24:36] [V] [TRT] sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x128_stage3_warpsize1x1x1_g1_sptensor16x8x64_t1r1s1_no_preds Tactic: -619668460699260222 [03/25/2022-13:24:36] [V] [TRT] Tactic: -619668460699260222 Time: 0.161408 [03/25/2022-13:24:36] [V] [TRT] sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x64x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: -405554772060757402 [03/25/2022-13:24:36] [V] [TRT] Tactic: -405554772060757402 Time: 0.116608 [03/25/2022-13:24:36] [V] [TRT] sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: -375949437730908730 [03/25/2022-13:24:36] [V] [TRT] Tactic: -375949437730908730 Time: 0.131072 [03/25/2022-13:24:36] [V] [TRT] sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x128_stage3_warpsize1x2x1_g1_sptensor16x8x64_t1r1s1 Tactic: -233227833606287806 [03/25/2022-13:24:36] [V] [TRT] Tactic: -233227833606287806 Time: 0.12096 [03/25/2022-13:24:36] [V] [TRT] sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -111878368089469751 [03/25/2022-13:24:36] [V] [TRT] Tactic: -111878368089469751 Time: 0.14656 [03/25/2022-13:24:36] [V] [TRT] sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x64x64_stage3_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -48936598874722005 [03/25/2022-13:24:36] [V] [TRT] Tactic: -48936598874722005 Time: 0.098688 [03/25/2022-13:24:36] [V] [TRT] sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x32x64_stage4_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -19707840769375107 [03/25/2022-13:24:36] [V] [TRT] Tactic: -19707840769375107 Time: 0.144768 [03/25/2022-13:24:36] [V] [TRT] Fastest Tactic: 2376898825218218566 Time: 0.07616 [03/25/2022-13:24:36] [V] [TRT] >>>>>>>>>>>>>>> Chose Runner Type: CaskConvolution Tactic: 2376898825218218566 [03/25/2022-13:24:36] [V] [TRT] =============== Computing costs for [03/25/2022-13:24:36] [V] [TRT] 
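[Editor's note — not part of the trtexec output. The block above follows the autotuner's fixed pattern: for each fused layer it times every candidate kernel ("Set Tactic Name: ... Tactic: <id>" followed by "Tactic: <id> Time: <ms>"), then reports "Fastest Tactic" and "Chose Runner Type". Because this dump is long and line-wrapped, the sketch below shows one way a reader might pull the (kernel, tactic id, time) triples back out of the raw text and rank them, e.g. to compare the sparse (`sptensor`) kernels against the dense ones for a layer. It is a minimal sketch under stated assumptions: the file name `trtexec_verbose.log`, the regex, and the sparse/dense heuristic are mine, inferred from the log lines shown here, not a TensorRT API.]

```python
import re

# Matches the flattened pattern visible in this log:
#   "Set Tactic Name: <kernel> Tactic: <id> ... Tactic: <id> Time: <ms>"
# The (?P=id) backreference ties the timing line to the same tactic id.
TACTIC_RE = re.compile(
    r"Set Tactic Name: (?P<name>\S+) Tactic: (?P<id>-?\d+).*?"
    r"Tactic: (?P=id) Time: (?P<ms>[0-9.]+)",
    re.DOTALL,
)

def rank_tactics(log_text: str):
    """Return (kernel name, tactic id, time in ms) triples, fastest first."""
    rows = [
        (m.group("name"), int(m.group("id")), float(m.group("ms")))
        for m in TACTIC_RE.finditer(log_text)
    ]
    return sorted(rows, key=lambda r: r[2])

if __name__ == "__main__":
    # Hypothetical file name: the verbose dump above saved to disk.
    with open("trtexec_verbose.log") as f:
        for name, tid, ms in rank_tactics(f.read())[:5]:
            # Heuristic: 2:4 structured-sparsity kernels carry "sparse"
            # ("sparse_conv ... sptensor") in their names in this log.
            kind = "sparse" if "sparse" in name else "dense "
            print(f"{ms:8.6f} ms  {kind}  {tid}  {name}")
```

For the conv3 block above, a ranking like this makes it easy to see that the chosen dense implicit-GEMM tactic (0.07616 ms) edges out the best sparse `sptensor16x8x64` candidate (0.084352 ms) for this particular layer shape, even with `--sparsity=enable`.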
*************** Autotuning format combination: Int8(50176,196:4,14,1) -> Int8(12544,196:4,14,1) *************** [03/25/2022-13:24:36] [V] [TRT] --------------- Timing Runner: sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 (CudaDepthwiseConvolution) [03/25/2022-13:24:36] [V] [TRT] CudaDepthwiseConvolution has no valid tactics for this config, skipping [03/25/2022-13:24:36] [V] [TRT] --------------- Timing Runner: sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 (FusedConvActConvolution) [03/25/2022-13:24:36] [V] [TRT] FusedConvActConvolution has no valid tactics for this config, skipping [03/25/2022-13:24:36] [V] [TRT] --------------- Timing Runner: sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 (CaskConvolution) [03/25/2022-13:24:36] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_medium_nn_v1 Tactic: 175853789719975416 [03/25/2022-13:24:36] [V] [TRT] Tactic: 175853789719975416 Time: 0.305664 [03/25/2022-13:24:36] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_xregs_medium_nn_v1 Tactic: 2171150287007712632 [03/25/2022-13:24:36] [V] [TRT] Tactic: 2171150287007712632 Time: 0.279424 [03/25/2022-13:24:36] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x64_relu_small_nn_v1 Tactic: 2234457234705232274 [03/25/2022-13:24:36] [V] [TRT] Tactic: 2234457234705232274 Time: 0.271744 [03/25/2022-13:24:36] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x64_relu_medium_nn_v1 Tactic: 5834048089706882838 [03/25/2022-13:24:36] [V] [TRT] Tactic: 5834048089706882838 Time: 0.273408 [03/25/2022-13:24:36] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x128_relu_interior_nn_v1 Tactic: 6299962968199310600 [03/25/2022-13:24:36] [V] [TRT] Tactic: 6299962968199310600 Time: 0.261504 [03/25/2022-13:24:36] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_xregs_interior_nn_v1 Tactic: 6341572697076960911 [03/25/2022-13:24:36] [V] [TRT] Tactic: 6341572697076960911 Time: 0.268032 [03/25/2022-13:24:36] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_small_nn_v1 Tactic: -8626990807754934295 [03/25/2022-13:24:36] [V] [TRT] Tactic: -8626990807754934295 Time: 0.302464 [03/25/2022-13:24:36] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x64_relu_interior_nn_v1 Tactic: -8498217049614706532 [03/25/2022-13:24:36] [V] [TRT] Tactic: -8498217049614706532 Time: 0.263168 [03/25/2022-13:24:36] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_xregs_small_nn_v1 Tactic: -7303593854972602201 [03/25/2022-13:24:36] [V] 
[TRT] Tactic: -7303593854972602201 Time: 0.27136 [03/25/2022-13:24:36] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x128_relu_medium_nn_v1 Tactic: -6585664687867083638 [03/25/2022-13:24:36] [V] [TRT] Tactic: -6585664687867083638 Time: 0.2624 [03/25/2022-13:24:36] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_interior_nn_v1 Tactic: -3326139578711341011 [03/25/2022-13:24:36] [V] [TRT] Tactic: -3326139578711341011 Time: 0.283648 [03/25/2022-13:24:36] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x128_relu_small_nn_v1 Tactic: -683636008127039856 [03/25/2022-13:24:36] [V] [TRT] Tactic: -683636008127039856 Time: 0.263168 [03/25/2022-13:24:36] [V] [TRT] Fastest Tactic: 6299962968199310600 Time: 0.261504 [03/25/2022-13:24:36] [V] [TRT] >>>>>>>>>>>>>>> Chose Runner Type: CaskConvolution Tactic: 6299962968199310600 [03/25/2022-13:24:36] [V] [TRT] *************** Autotuning format combination: Int8(50176,196:4,14,1) -> Int8(1568,196:32,14,1) *************** [03/25/2022-13:24:36] [V] [TRT] --------------- Timing Runner: sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 (CaskConvolution) [03/25/2022-13:24:36] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_small_c32_nn_v1 Tactic: 1100922622480907544 [03/25/2022-13:24:36] [V] [TRT] Tactic: 1100922622480907544 Time: 0.301952 [03/25/2022-13:24:36] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x128_relu_interior_c32_nn_v1 Tactic: 2855900226702061782 [03/25/2022-13:24:36] [V] [TRT] Tactic: 2855900226702061782 Time: 0.26176 [03/25/2022-13:24:36] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x64_relu_medium_c32_nn_v1 Tactic: 3606311198834416176 [03/25/2022-13:24:36] [V] [TRT] Tactic: 3606311198834416176 Time: 0.272768 [03/25/2022-13:24:36] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x128_relu_small_c32_nn_v1 Tactic: 4325765560739862899 [03/25/2022-13:24:36] [V] [TRT] Tactic: 4325765560739862899 Time: 0.263424 [03/25/2022-13:24:36] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x64_relu_interior_c32_nn_v1 Tactic: 8803458114157674373 [03/25/2022-13:24:36] [V] [TRT] Tactic: 8803458114157674373 Time: 0.262784 [03/25/2022-13:24:36] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_interior_c32_nn_v1 Tactic: -6934773036503365000 [03/25/2022-13:24:36] [V] [TRT] Tactic: -6934773036503365000 Time: 0.283264 [03/25/2022-13:24:36] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_xregs_interior_c32_nn_v1 Tactic: -4431642509665791294 
[03/25/2022-13:24:36] [V] [TRT] Tactic: -4431642509665791294 Time: 0.267136 [03/25/2022-13:24:36] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x128_relu_medium_c32_nn_v1 Tactic: -4255737803793506479 [03/25/2022-13:24:36] [V] [TRT] Tactic: -4255737803793506479 Time: 0.263424 [03/25/2022-13:24:36] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_xregs_small_c32_nn_v1 Tactic: -3958182351168863467 [03/25/2022-13:24:36] [V] [TRT] Tactic: -3958182351168863467 Time: 0.270848 [03/25/2022-13:24:36] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_xregs_medium_c32_nn_v1 Tactic: -3111968753064955248 [03/25/2022-13:24:36] [V] [TRT] Tactic: -3111968753064955248 Time: 0.278784 [03/25/2022-13:24:36] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_medium_c32_nn_v1 Tactic: -1492575840277333548 [03/25/2022-13:24:36] [V] [TRT] Tactic: -1492575840277333548 Time: 0.304128 [03/25/2022-13:24:36] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x64_relu_small_c32_nn_v1 Tactic: -868495160148524802 [03/25/2022-13:24:36] [V] [TRT] Tactic: -868495160148524802 Time: 0.271488 [03/25/2022-13:24:36] [V] [TRT] Fastest Tactic: 2855900226702061782 Time: 0.26176 [03/25/2022-13:24:36] [V] [TRT] >>>>>>>>>>>>>>> Chose Runner Type: CaskConvolution Tactic: 2855900226702061782 [03/25/2022-13:24:36] [V] [TRT] *************** Autotuning format combination: Int8(6272,196:32,14,1) -> Int8(1568,196:32,14,1) *************** [03/25/2022-13:24:36] [V] [TRT] --------------- Timing Runner: sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 (CudaGroupConvolution) [03/25/2022-13:24:36] [V] [TRT] CudaGroupConvolution has no valid tactics for this config, skipping [03/25/2022-13:24:36] [V] [TRT] --------------- Timing Runner: sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 (CudaDepthwiseConvolution) [03/25/2022-13:24:36] [V] [TRT] CudaDepthwiseConvolution has no valid tactics for this config, skipping [03/25/2022-13:24:36] [V] [TRT] --------------- Timing Runner: sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 (FusedConvActConvolution) [03/25/2022-13:24:36] [V] [TRT] FusedConvActConvolution has no valid tactics for this config, skipping [03/25/2022-13:24:36] [V] [TRT] --------------- Timing Runner: sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 (CaskConvolution) [03/25/2022-13:24:36] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage1_warpsize4x1x1_g1_tensor8x8x16_t1r1s1 Tactic: 68468667201176803 [03/25/2022-13:24:36] [V] [TRT] Tactic: 68468667201176803 Time: 0.09024 [03/25/2022-13:24:36] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: 
sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x192x64_stage3_warpsize4x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 125145153013230687 [03/25/2022-13:24:36] [V] [TRT] Tactic: 125145153013230687 Time: 0.08128 [03/25/2022-13:24:36] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x128_ldg16_relu_interior_nt_v1 Tactic: 434957160407688216 [03/25/2022-13:24:36] [V] [TRT] Tactic: 434957160407688216 Time: 0.093312 [03/25/2022-13:24:36] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage4_warpsize4x2x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: 805889586762897346 [03/25/2022-13:24:36] [V] [TRT] Tactic: 805889586762897346 Time: 0.055296 [03/25/2022-13:24:36] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x128_stage3_warpsize4x2x1_g1_sptensor16x8x64_t1r1s1 Tactic: 857001784974286465 [03/25/2022-13:24:36] [V] [TRT] Tactic: 857001784974286465 Time: 0.04736 [03/25/2022-13:24:36] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 1214130898909872671 [03/25/2022-13:24:36] [V] [TRT] Tactic: 1214130898909872671 Time: 0.110208 [03/25/2022-13:24:36] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: sm80_xmma_conv_fprop_smallk_i8i8_i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_warptilesize16x32_stage5_warpsize1x4x2_r1s1_u1v1_hw0_c1024_scalebias_relu Tactic: 1263683011321748626 [03/25/2022-13:24:36] [V] [TRT] Tactic: 1263683011321748626 Time: 0.047744 [03/25/2022-13:24:36] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage4_warpsize4x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 1278425129871930205 [03/25/2022-13:24:36] [V] [TRT] Tactic: 1278425129871930205 Time: 0.05376 [03/25/2022-13:24:36] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 1583811548148740665 [03/25/2022-13:24:36] [V] [TRT] Tactic: 1583811548148740665 Time: 0.082048 [03/25/2022-13:24:36] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 1701344857577810806 [03/25/2022-13:24:36] [V] [TRT] Tactic: 1701344857577810806 Time: 0.072576 [03/25/2022-13:24:36] [V] [TRT] sections.2.1.conv1.module.weight + 
QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x32x64_stage4_warpsize2x1x1_g1_tensor16x8x32_t1r1s1 Tactic: 1797231177354918208
[03/25/2022-13:24:36] [V] [TRT] Tactic: 1797231177354918208 Time: 0.097792
[03/25/2022-13:24:36] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x192x64_stage3_warpsize4x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 2004812516525036381
[03/25/2022-13:24:36] [V] [TRT] Tactic: 2004812516525036381 Time: 0.073472
[03/25/2022-13:24:36] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x128x64_stage3_warpsize4x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 2030033463723799063
[03/25/2022-13:24:36] [V] [TRT] Tactic: 2030033463723799063 Time: 0.058752
[03/25/2022-13:24:36] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: ampere_int8_i8816cudnn_int8_128x128_ldg16_relu_interior_nt_v1 Tactic: 2346437292116182513
[03/25/2022-13:24:36] [V] [TRT] Tactic: 2346437292116182513 Time: 0.091776
[03/25/2022-13:24:36] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 2376898825218218566
[03/25/2022-13:24:36] [V] [TRT] Tactic: 2376898825218218566 Time: 0.053632
[03/25/2022-13:24:36] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_singleBuffer_interior_nt_v1 Tactic: 2522133112320625287
[03/25/2022-13:24:36] [V] [TRT] Tactic: 2522133112320625287 Time: 0.089728
[03/25/2022-13:24:36] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage4_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 2548171972648455240
[03/25/2022-13:24:36] [V] [TRT] Tactic: 2548171972648455240 Time: 0.057728
[03/25/2022-13:24:36] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage1_warpsize2x1x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: 2548946449357458230
[03/25/2022-13:24:36] [V] [TRT] Tactic: 2548946449357458230 Time: 0.107008
[03/25/2022-13:24:36] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x128_stage3_warpsize1x1x1_g1_sptensor16x8x64_simple_t1r1s1_no_preds Tactic: 2570666021825229009
[03/25/2022-13:24:36] [V] [TRT] Tactic: 2570666021825229009 Time: 0.093696
[03/25/2022-13:24:36] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x16x128_stage3_warpsize1x1x1_g1_sptensor16x8x64_simple_t1r1s1 Tactic: 2678520742286844763
[03/25/2022-13:24:36] [V] [TRT] Tactic: 2678520742286844763 Time: 0.13696
[03/25/2022-13:24:36] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 2756291002030759362
[03/25/2022-13:24:36] [V] [TRT] Tactic: 2756291002030759362 Time: 0.070656
[03/25/2022-13:24:36] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x128x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 2972948223367788520
[03/25/2022-13:24:36] [V] [TRT] Tactic: 2972948223367788520 Time: 0.058624
[03/25/2022-13:24:36] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: ampere_int8_i8816cudnn_int8_128x128_ldg16_relu_medium_nt_v1 Tactic: 2985940154541537814
[03/25/2022-13:24:36] [V] [TRT] Tactic: 2985940154541537814 Time: 0.091392
[03/25/2022-13:24:36] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x96x64_stage5_warpsize2x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 3043273137345374664
[03/25/2022-13:24:36] [V] [TRT] Tactic: 3043273137345374664 Time: 0.109952
[03/25/2022-13:24:36] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 3221677093659484230
[03/25/2022-13:24:36] [V] [TRT] Tactic: 3221677093659484230 Time: 0.089728
[03/25/2022-13:24:36] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r1s1 Tactic: 3242897809704328258
[03/25/2022-13:24:36] [V] [TRT] Tactic: 3242897809704328258 Time: 0.093568
[03/25/2022-13:24:36] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: 3312456766204252694
[03/25/2022-13:24:36] [V] [TRT] Tactic: 3312456766204252694 Time: 0.119168
[03/25/2022-13:24:36] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 3538565962642681625
[03/25/2022-13:24:36] [V] [TRT] Tactic: 3538565962642681625 Time: 0.08128
[03/25/2022-13:24:36] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage1_warpsize2x2x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: 3541919052468401776
[03/25/2022-13:24:36] [V] [TRT] Tactic: 3541919052468401776 Time: 0.085632
[03/25/2022-13:24:36] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 3593397928177382100
[03/25/2022-13:24:36] [V] [TRT] Tactic: 3593397928177382100 Time: 0.111104
[03/25/2022-13:24:36] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x64x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 3670282018109435863
[03/25/2022-13:24:36] [V] [TRT] Tactic: 3670282018109435863 Time: 0.068224
[03/25/2022-13:24:36] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 3671413346254027573
[03/25/2022-13:24:36] [V] [TRT] Tactic: 3671413346254027573 Time: 0.070528
[03/25/2022-13:24:36] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x128_ldg16_relu_medium_nt_v1 Tactic: 3899284354987683408
[03/25/2022-13:24:36] [V] [TRT] Tactic: 3899284354987683408 Time: 0.092544
[03/25/2022-13:24:36] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: 3927509214678622419
[03/25/2022-13:24:36] [V] [TRT] Tactic: 3927509214678622419 Time: 0.092544
[03/25/2022-13:24:36] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 4112572034735311841
[03/25/2022-13:24:36] [V] [TRT] Tactic: 4112572034735311841 Time: 0.139136
[03/25/2022-13:24:36] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 4239974928951431644
[03/25/2022-13:24:36] [V] [TRT] Tactic: 4239974928951431644 Time: 0.084736
[03/25/2022-13:24:36] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage6_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 4610760414797216079
[03/25/2022-13:24:36] [V] [TRT] Tactic: 4610760414797216079 Time: 0.071936
[03/25/2022-13:24:36] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_small_nt_v1 Tactic: 4717285412741024953
[03/25/2022-13:24:36] [V] [TRT] Tactic: 4717285412741024953 Time: 0.091008
[03/25/2022-13:24:36] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 4796956614760326119
[03/25/2022-13:24:36] [V] [TRT] Tactic: 4796956614760326119 Time: 0.064512
[03/25/2022-13:24:36] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x128_stage3_warpsize2x2x1_g1_sptensor16x8x64_t1r1s1_no_preds Tactic: 4909502217677847353
[03/25/2022-13:24:36] [V] [TRT] Tactic: 4909502217677847353 Time: 0.04928
[03/25/2022-13:24:36] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 4919361344804309192
[03/25/2022-13:24:36] [V] [TRT] Tactic: 4919361344804309192 Time: 0.108544
[03/25/2022-13:24:36] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x128_stage3_warpsize1x2x1_g1_sptensor16x8x64_simple_t1r1s1 Tactic: 5043674678294309681
[03/25/2022-13:24:36] [V] [TRT] Tactic: 5043674678294309681 Time: 0.071552
[03/25/2022-13:24:36] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_interior_nt_v1 Tactic: 5126565865931538390
[03/25/2022-13:24:36] [V] [TRT] Tactic: 5126565865931538390 Time: 0.092672
[03/25/2022-13:24:36] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x64x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 5204702486885981735
[03/25/2022-13:24:36] [V] [TRT] Tactic: 5204702486885981735 Time: 0.069504
[03/25/2022-13:24:36] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: 5375256703210220108
[03/25/2022-13:24:36] [V] [TRT] Tactic: 5375256703210220108 Time: 0.067072
[03/25/2022-13:24:36] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x128_stage3_warpsize2x2x1_g1_sptensor16x8x64_simple_t1r1s1 Tactic: 5424258848951129084
[03/25/2022-13:24:36] [V] [TRT] Tactic: 5424258848951129084 Time: 0.04992
[03/25/2022-13:24:36] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 5443897483205284103
[03/25/2022-13:24:36] [V] [TRT] Tactic: 5443897483205284103 Time: 0.071808
[03/25/2022-13:24:36] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage4_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 5707566217891294846
[03/25/2022-13:24:36] [V] [TRT] Tactic: 5707566217891294846 Time: 0.059904
[03/25/2022-13:24:36] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x4x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 5986622376339202983
[03/25/2022-13:24:36] [V] [TRT] Tactic: 5986622376339202983 Time: 0.081024
[03/25/2022-13:24:36] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x128x64_stage3_warpsize4x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 6007888770437705057
[03/25/2022-13:24:36] [V] [TRT] Tactic: 6007888770437705057 Time: 0.062592
[03/25/2022-13:24:36] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32_t1r1s1 Tactic: 6405251167055673379
[03/25/2022-13:24:36] [V] [TRT] Tactic: 6405251167055673379 Time: 0.06528
[03/25/2022-13:24:36] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage4_warpsize4x1x1_g1_tensor16x8x32_t1r1s1 Tactic: 6433368103202497147
[03/25/2022-13:24:37] [V] [TRT] Tactic: 6433368103202497147 Time: 0.061312
[03/25/2022-13:24:37] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 6441948709525127755
[03/25/2022-13:24:37] [V] [TRT] Tactic: 6441948709525127755 Time: 0.113024
[03/25/2022-13:24:37] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x256x64_stage3_warpsize2x4x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 6443933097134654777
[03/25/2022-13:24:37] [V] [TRT] Tactic: 6443933097134654777 Time: 0.064512
[03/25/2022-13:24:37] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 6457435868048963632
[03/25/2022-13:24:37] [V] [TRT] Tactic: 6457435868048963632 Time: 0.072064
[03/25/2022-13:24:37] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x128_stage3_warpsize1x1x1_g1_sptensor16x8x64_t1r1s1 Tactic: 6510345569544721081
[03/25/2022-13:24:37] [V] [TRT] Tactic: 6510345569544721081 Time: 0.09408
[03/25/2022-13:24:37] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 6793988781414507278
[03/25/2022-13:24:37] [V] [TRT] Tactic: 6793988781414507278 Time: 0.055552
[03/25/2022-13:24:37] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 6880710371738875469
[03/25/2022-13:24:37] [V] [TRT] Tactic: 6880710371738875469 Time: 0.076672
[03/25/2022-13:24:37] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 6925201228918187099
[03/25/2022-13:24:37] [V] [TRT] Tactic: 6925201228918187099 Time: 0.056448
[03/25/2022-13:24:37] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 6991524515605108718
[03/25/2022-13:24:37] [V] [TRT] Tactic: 6991524515605108718 Time: 0.088832
[03/25/2022-13:24:37] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x4x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 7245509442265271220
[03/25/2022-13:24:37] [V] [TRT] Tactic: 7245509442265271220 Time: 0.080768
[03/25/2022-13:24:37] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x64x64_stage3_warpsize4x1x1_g1_tensor16x8x32_t1r1s1 Tactic: 7318929579222925725
[03/25/2022-13:24:37] [V] [TRT] Tactic: 7318929579222925725 Time: 0.065408
[03/25/2022-13:24:37] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: 7731430299029542276
[03/25/2022-13:24:37] [V] [TRT] Tactic: 7731430299029542276 Time: 0.055296
[03/25/2022-13:24:37] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage4_warpsize2x4x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: 7738495016763012180
[03/25/2022-13:24:37] [V] [TRT] Tactic: 7738495016763012180 Time: 0.056064
[03/25/2022-13:24:37] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x128_stage3_warpsize1x4x1_g1_sptensor16x8x64_t1r1s1_no_preds Tactic: 7886967395128926382
[03/25/2022-13:24:37] [V] [TRT] Tactic: 7886967395128926382 Time: 0.06208
[03/25/2022-13:24:37] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage4_warpsize4x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 8142283985160822229
[03/25/2022-13:24:37] [V] [TRT] Tactic: 8142283985160822229 Time: 0.057088
[03/25/2022-13:24:37] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage4_warpsize2x4x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 8173975624668590862
[03/25/2022-13:24:37] [V] [TRT] Tactic: 8173975624668590862 Time: 0.058496
[03/25/2022-13:24:37] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage4_warpsize4x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 8234775147403903473
[03/25/2022-13:24:37] [V] [TRT] Tactic: 8234775147403903473 Time: 0.058496
[03/25/2022-13:24:37] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x128x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 8524082966802584889
[03/25/2022-13:24:37] [V] [TRT] Tactic: 8524082966802584889 Time: 0.060672
[03/25/2022-13:24:37] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage1_warpsize4x1x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: 8684013308930763400
[03/25/2022-13:24:37] [V] [TRT] Tactic: 8684013308930763400 Time: 0.088832
[03/25/2022-13:24:37] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x128x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 8765382722978397630
[03/25/2022-13:24:37] [V] [TRT] Tactic: 8765382722978397630 Time: 0.061312
[03/25/2022-13:24:37] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 8843193587782643431
[03/25/2022-13:24:37] [V] [TRT] Tactic: 8843193587782643431 Time: 0.080384
[03/25/2022-13:24:37] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x128_stage3_warpsize1x4x1_g1_sptensor16x8x64_simple_t1r1s1 Tactic: 8883810517410230831
[03/25/2022-13:24:37] [V] [TRT] Tactic: 8883810517410230831 Time: 0.062464
[03/25/2022-13:24:37] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 8930797211803511337
[03/25/2022-13:24:37] [V] [TRT] Tactic: 8930797211803511337 Time: 0.083456
[03/25/2022-13:24:37] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 8935070489925739043
[03/25/2022-13:24:37] [V] [TRT] Tactic: 8935070489925739043 Time: 0.059776
[03/25/2022-13:24:37] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x16x128_stage3_warpsize1x1x1_g1_sptensor16x8x64_t1r1s1 Tactic: 9062173295331155069
[03/25/2022-13:24:37] [V] [TRT] Tactic: 9062173295331155069 Time: 0.1376
[03/25/2022-13:24:37] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_singleBuffer_small_nt_v1 Tactic: -9118785798277698619
[03/25/2022-13:24:37] [V] [TRT] Tactic: -9118785798277698619 Time: 0.090496
[03/25/2022-13:24:37] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -8985599729413291927
[03/25/2022-13:24:37] [V] [TRT] Tactic: -8985599729413291927 Time: 0.06592
[03/25/2022-13:24:37] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x32x64_stage4_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -8972697510150675429
[03/25/2022-13:24:37] [V] [TRT] Tactic: -8972697510150675429 Time: 0.084608
[03/25/2022-13:24:37] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r1s1 Tactic: -8943710627305202139
[03/25/2022-13:24:37] [V] [TRT] Tactic: -8943710627305202139 Time: 0.093056
[03/25/2022-13:24:37] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: -8859846367886814331
[03/25/2022-13:24:37] [V] [TRT] Tactic: -8859846367886814331 Time: 0.09792
[03/25/2022-13:24:37] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: -8638624340850784688
[03/25/2022-13:24:37] [V] [TRT] Tactic: -8638624340850784688 Time: 0.083072
[03/25/2022-13:24:37] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage4_warpsize4x1x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: -8556775352640313933
[03/25/2022-13:24:37] [V] [TRT] Tactic: -8556775352640313933 Time: 0.059008
[03/25/2022-13:24:37] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage1_warpsize2x1x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: -8382298409581540699
[03/25/2022-13:24:37] [V] [TRT] Tactic: -8382298409581540699 Time: 0.135168
[03/25/2022-13:24:37] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage1_warpsize4x2x1_g1_tensor8x8x16_t1r1s1 Tactic: -8172318747337038866
[03/25/2022-13:24:37] [V] [TRT] Tactic: -8172318747337038866 Time: 0.089344
[03/25/2022-13:24:37] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x256x64_stage3_warpsize2x4x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -8038164441468184723
[03/25/2022-13:24:37] [V] [TRT] Tactic: -8038164441468184723 Time: 0.060416
[03/25/2022-13:24:37] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x96x64_stage5_warpsize2x1x1_g1_tensor16x8x32_t1r1s1 Tactic: -7844028314176826857
[03/25/2022-13:24:37] [V] [TRT] Tactic: -7844028314176826857 Time: 0.110976
[03/25/2022-13:24:37] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x128_stage3_warpsize2x2x1_g1_sptensor16x8x64_simple_t1r1s1_no_preds Tactic: -7674507941016740570
[03/25/2022-13:24:37] [V] [TRT] Tactic: -7674507941016740570 Time: 0.049408
[03/25/2022-13:24:37] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage4_warpsize2x4x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -7364286662638617917
[03/25/2022-13:24:37] [V] [TRT] Tactic: -7364286662638617917 Time: 0.0544
[03/25/2022-13:24:37] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage1_warpsize2x1x1_g1_tensor8x8x16_t1r1s1 Tactic: -7361755530333096258
[03/25/2022-13:24:37] [V] [TRT] Tactic: -7361755530333096258 Time: 0.108032
[03/25/2022-13:24:37] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x128_stage3_warpsize1x4x1_g1_sptensor16x8x64_t1r1s1 Tactic: -7289760022626653388
[03/25/2022-13:24:37] [V] [TRT] Tactic: -7289760022626653388 Time: 0.062464
[03/25/2022-13:24:37] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage1_warpsize2x4x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: -7106539943789766885
[03/25/2022-13:24:37] [V] [TRT] Tactic: -7106539943789766885 Time: 0.088192
[03/25/2022-13:24:37] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage1_warpsize4x2x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: -6969478418607271266
[03/25/2022-13:24:37] [V] [TRT] Tactic: -6969478418607271266 Time: 0.08832
[03/25/2022-13:24:37] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -6930438165437733000
[03/25/2022-13:24:37] [V] [TRT] Tactic: -6930438165437733000 Time: 0.139136
[03/25/2022-13:24:37] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x64x64_stage3_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -6879607992933502380
[03/25/2022-13:24:37] [V] [TRT] Tactic: -6879607992933502380 Time: 0.064128
[03/25/2022-13:24:37] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x128_stage3_warpsize1x2x1_g1_sptensor16x8x64_simple_t1r1s1_no_preds Tactic: -6839669803644810934
[03/25/2022-13:24:37] [V] [TRT] Tactic: -6839669803644810934 Time: 0.070784
[03/25/2022-13:24:37] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x128_stage3_warpsize1x2x1_g1_sptensor16x8x64_t1r1s1_no_preds Tactic: -6812830108414456369
[03/25/2022-13:24:37] [V] [TRT] Tactic: -6812830108414456369 Time: 0.071552
[03/25/2022-13:24:37] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x128_stage3_warpsize4x2x1_g1_sptensor16x8x64_simple_t1r1s1_no_preds Tactic: -6779804930216439173
[03/25/2022-13:24:37] [V] [TRT] Tactic: -6779804930216439173 Time: 0.046464
[03/25/2022-13:24:37] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: -6527178416855951297
[03/25/2022-13:24:37] [V] [TRT] Tactic: -6527178416855951297 Time: 0.110592
[03/25/2022-13:24:37] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_t1r1s1 Tactic: -6510232214299595844
[03/25/2022-13:24:37] [V] [TRT] Tactic: -6510232214299595844 Time: 0.110976
[03/25/2022-13:24:37] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: ampere_int8_i8816cudnn_int8_128x128_ldg16_relu_small_nt_v1 Tactic: -6400348606759295499
[03/25/2022-13:24:37] [V] [TRT] Tactic: -6400348606759295499 Time: 0.08832
[03/25/2022-13:24:37] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage1_warpsize2x2x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: -6346247605026339453
[03/25/2022-13:24:37] [V] [TRT] Tactic: -6346247605026339453 Time: 0.092288
[03/25/2022-13:24:37] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x96x64_stage4_warpsize4x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -6232597026469067819
[03/25/2022-13:24:37] [V] [TRT] Tactic: -6232597026469067819 Time: 0.079616
[03/25/2022-13:24:37] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x128_ldg16_relu_small_nt_v1 Tactic: -5980889159865208399
[03/25/2022-13:24:37] [V] [TRT] Tactic: -5980889159865208399 Time: 0.091136
[03/25/2022-13:24:37] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_singleBuffer_medium_nt_v1 Tactic: -5766140806760372989
[03/25/2022-13:24:37] [V] [TRT] Tactic: -5766140806760372989 Time: 0.093312
[03/25/2022-13:24:37] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r1s1 Tactic: -5697614955743334137
[03/25/2022-13:24:37] [V] [TRT] Tactic: -5697614955743334137 Time: 0.086784
[03/25/2022-13:24:37] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -5671123121710113970
[03/25/2022-13:24:37] [V] [TRT] Tactic: -5671123121710113970 Time: 0.087296
[03/25/2022-13:24:37] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x32x64_stage4_warpsize2x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -5615581362569252260
[03/25/2022-13:24:37] [V] [TRT] Tactic: -5615581362569252260 Time: 0.095616
[03/25/2022-13:24:37] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x128x64_stage3_warpsize4x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -5562968047117507056
[03/25/2022-13:24:37] [V] [TRT] Tactic: -5562968047117507056 Time: 0.062592
[03/25/2022-13:24:37] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -5516472881360101487
[03/25/2022-13:24:37] [V] [TRT] Tactic: -5516472881360101487 Time: 0.082432
[03/25/2022-13:24:37] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r1s1 Tactic: -5311474420963248369
[03/25/2022-13:24:37] [V] [TRT] Tactic: -5311474420963248369 Time: 0.119808
[03/25/2022-13:24:37] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: -5170003087447722174
[03/25/2022-13:24:37] [V] [TRT] Tactic: -5170003087447722174 Time: 0.113536
[03/25/2022-13:24:37] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage6_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -4889586143772361690
[03/25/2022-13:24:37] [V] [TRT] Tactic: -4889586143772361690 Time: 0.070016
[03/25/2022-13:24:37] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x128_stage3_warpsize1x4x1_g1_sptensor16x8x64_simple_t1r1s1_no_preds Tactic: -4889498558023475527
[03/25/2022-13:24:37] [V] [TRT] Tactic: -4889498558023475527 Time: 0.061696
[03/25/2022-13:24:37] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: -4849712423393454704
[03/25/2022-13:24:37] [V] [TRT] Tactic: -4849712423393454704 Time: 0.070144
[03/25/2022-13:24:37] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x128_stage3_warpsize4x2x1_g1_sptensor16x8x64_simple_t1r1s1 Tactic: -4681913707320020520
[03/25/2022-13:24:37] [V] [TRT] Tactic: -4681913707320020520 Time: 0.046592
[03/25/2022-13:24:37] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_medium_nt_v1 Tactic: -4516822589357530549
[03/25/2022-13:24:37] [V] [TRT] Tactic: -4516822589357530549 Time: 0.094464
[03/25/2022-13:24:37] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -4455415102719506646
[03/25/2022-13:24:37] [V] [TRT] Tactic: -4455415102719506646 Time: 0.080512
[03/25/2022-13:24:37] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32_t1r1s1 Tactic: -4425346730823666456
[03/25/2022-13:24:37] [V] [TRT] Tactic: -4425346730823666456 Time: 0.085248
[03/25/2022-13:24:37] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage1_warpsize2x1x1_g1_tensor8x8x16_t1r1s1 Tactic: -4260476497340370474
[03/25/2022-13:24:37] [V] [TRT] Tactic: -4260476497340370474 Time: 0.135936
[03/25/2022-13:24:37] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x2x1_g1_tensor16x8x32_t1r1s1 Tactic: -4182501876984672402
[03/25/2022-13:24:37] [V] [TRT] Tactic: -4182501876984672402 Time: 0.086528
[03/25/2022-13:24:37] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x256x64_stage3_warpsize2x4x1_g1_tensor16x8x32_t1r1s1 Tactic: -4151617293257698859
[03/25/2022-13:24:37] [V] [TRT] Tactic: -4151617293257698859 Time: 0.064768
[03/25/2022-13:24:37] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x128_stage3_warpsize4x2x1_g1_sptensor16x8x64_t1r1s1_no_preds Tactic: -3862908719298381451
[03/25/2022-13:24:37] [V] [TRT] Tactic: -3862908719298381451 Time: 0.047232
[03/25/2022-13:24:37] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -3825889760337461729
[03/25/2022-13:24:37] [V] [TRT] Tactic: -3825889760337461729 Time: 0.10944
[03/25/2022-13:24:37] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -3797022944823726673
[03/25/2022-13:24:37] [V] [TRT] Tactic: -3797022944823726673 Time: 0.079232
[03/25/2022-13:24:37] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_t1r1s1 Tactic: -3613322253849278738
[03/25/2022-13:24:37] [V] [TRT] Tactic: -3613322253849278738 Time: 0.14272
[03/25/2022-13:24:37] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x32x64_stage4_warpsize4x1x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: -3577322188448771475
[03/25/2022-13:24:37] [V] [TRT] Tactic: -3577322188448771475 Time: 0.087296
[03/25/2022-13:24:37] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage1_warpsize2x4x1_g1_tensor8x8x16_t1r1s1 Tactic: -3531681826488401618
[03/25/2022-13:24:37] [V] [TRT] Tactic: -3531681826488401618 Time: 0.08896
[03/25/2022-13:24:37] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x16x128_stage3_warpsize1x1x1_g1_sptensor16x8x64_simple_t1r1s1_no_preds Tactic: -3305554949874552860
[03/25/2022-13:24:37] [V] [TRT] Tactic: -3305554949874552860 Time: 0.136576
[03/25/2022-13:24:37] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x192x64_stage3_warpsize4x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -3288585994448820820
[03/25/2022-13:24:37] [V] [TRT] Tactic: -3288585994448820820 Time: 0.080768
[03/25/2022-13:24:37] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x32x64_stage4_warpsize4x1x1_g1_tensor16x8x32_t1r1s1 Tactic: -2754311112012636251
[03/25/2022-13:24:37] [V] [TRT] Tactic: -2754311112012636251 Time: 0.08896
[03/25/2022-13:24:37] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -2432868635536396215
[03/25/2022-13:24:37] [V] [TRT] Tactic: -2432868635536396215 Time: 0.081536
[03/25/2022-13:24:37] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x96x64_stage5_warpsize2x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -2379804152300264660
[03/25/2022-13:24:37] [V] [TRT] Tactic: -2379804152300264660 Time: 0.1088
[03/25/2022-13:24:37] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x128_stage3_warpsize2x2x1_g1_sptensor16x8x64_t1r1s1 Tactic: -2352253835013627337
[03/25/2022-13:24:37] [V] [TRT] Tactic: -2352253835013627337 Time: 0.050432
[03/25/2022-13:24:37] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x96x64_stage4_warpsize4x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -2335587136911650799
[03/25/2022-13:24:37] [V] [TRT] Tactic: -2335587136911650799 Time: 0.073728
[03/25/2022-13:24:37] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage4_warpsize2x4x1_g1_tensor16x8x32_t1r1s1 Tactic: -2315453944962430928
[03/25/2022-13:24:37] [V] [TRT] Tactic: -2315453944962430928 Time: 0.059136
[03/25/2022-13:24:37] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x128_stage3_warpsize1x1x1_g1_sptensor16x8x64_simple_t1r1s1 Tactic: -2238364958919154661
[03/25/2022-13:24:37] [V] [TRT] Tactic: -2238364958919154661 Time: 0.093184
[03/25/2022-13:24:37] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -1916483171117495388
[03/25/2022-13:24:37] [V] [TRT] Tactic: -1916483171117495388 Time: 0.08896
[03/25/2022-13:24:37] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x16x128_stage3_warpsize1x1x1_g1_sptensor16x8x64_t1r1s1_no_preds Tactic: -1740762957710554518
[03/25/2022-13:24:37] [V] [TRT] Tactic: -1740762957710554518 Time: 0.136832
[03/25/2022-13:24:37] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x96x64_stage4_warpsize4x2x1_g1_tensor16x8x32_t1r1s1 Tactic: -1549742793039499659
[03/25/2022-13:24:37] [V] [TRT] Tactic: -1549742793039499659 Time: 0.080128
[03/25/2022-13:24:37] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: -1499578657823798783
[03/25/2022-13:24:37] [V] [TRT] Tactic: -1499578657823798783 Time: 0.081024
[03/25/2022-13:24:37] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r1s1 Tactic: -1494157908358500249
[03/25/2022-13:24:37] [V] [TRT] Tactic: -1494157908358500249 Time: 0.099968
[03/25/2022-13:24:37] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x4x1_g1_tensor16x8x32_t1r1s1 Tactic: -1328736756812546664
[03/25/2022-13:24:37] [V] [TRT] Tactic: -1328736756812546664 Time: 0.082048
[03/25/2022-13:24:37] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x32x64_stage4_warpsize2x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -1006589727652607355
[03/25/2022-13:24:37] [V] [TRT] Tactic: -1006589727652607355 Time: 0.096896
[03/25/2022-13:24:37] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: -713022856474991236
[03/25/2022-13:24:37] [V] [TRT] Tactic: -713022856474991236 Time: 0.141184
[03/25/2022-13:24:37] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x128_stage3_warpsize1x1x1_g1_sptensor16x8x64_t1r1s1_no_preds Tactic: -619668460699260222
[03/25/2022-13:24:37] [V] [TRT] Tactic: -619668460699260222 Time: 0.093568
[03/25/2022-13:24:37] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x64x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: -405554772060757402
[03/25/2022-13:24:37] [V] [TRT] Tactic: -405554772060757402 Time: 0.069888
[03/25/2022-13:24:37] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: -375949437730908730
[03/25/2022-13:24:37] [V] [TRT] Tactic: -375949437730908730 Time: 0.068352
[03/25/2022-13:24:37] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x128_stage3_warpsize1x2x1_g1_sptensor16x8x64_t1r1s1 Tactic: -233227833606287806
[03/25/2022-13:24:37] [V] [TRT] Tactic: -233227833606287806 Time: 0.071296
[03/25/2022-13:24:37] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -111878368089469751
[03/25/2022-13:24:37] [V] [TRT] Tactic: -111878368089469751 Time: 0.080256
[03/25/2022-13:24:37] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x64x64_stage3_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -48936598874722005
[03/25/2022-13:24:37] [V] [TRT] Tactic: -48936598874722005 Time: 0.061824
[03/25/2022-13:24:37] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x32x64_stage4_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -19707840769375107
[03/25/2022-13:24:37] [V] [TRT] Tactic: -19707840769375107 Time: 0.086912
[03/25/2022-13:24:37] [V] [TRT] Fastest Tactic: -6779804930216439173 Time: 0.046464
[03/25/2022-13:24:37] [V] [TRT] >>>>>>>>>>>>>>> Chose Runner Type: CaskConvolution Tactic: -6779804930216439173
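[Editorial aside, not part of the log: for this fused Conv_486 + Relu_488 layer the autotuner settles on a 2:4 structured-sparsity kernel. The winning tactic -6779804930216439173 (sm80_xmma_fprop_sparse_conv_..., tilesize256x128x128) runs in 0.046464, versus 0.053632 for the best dense tactic timed above, roughly a 1.15x speedup from sparsity. As a minimal sketch, assuming the standard TensorRT 8.x Python API, a build that permits the same INT8 and sparse (sptensor) tactics seen in this log can be requested as follows; the ONNX file name is reused from this run and error handling is omitted:

    import tensorrt as trt

    TRT_LOGGER = trt.Logger(trt.Logger.VERBOSE)
    builder = trt.Builder(TRT_LOGGER)
    # Explicit-batch network, as required for ONNX models with Q/DQ nodes.
    network = builder.create_network(
        1 << int(trt.NetworkDefinitionCreationFlag.EXPLICIT_BATCH))
    parser = trt.OnnxParser(network, TRT_LOGGER)
    with open("resnet50_quant_sparse.onnx", "rb") as f:
        parser.parse(f.read())

    config = builder.create_builder_config()
    config.set_flag(trt.BuilderFlag.INT8)            # allow the INT8 tactics timed above
    config.set_flag(trt.BuilderFlag.SPARSE_WEIGHTS)  # allow sparse (sptensor) tactics
    engine_bytes = builder.build_serialized_network(network, config)

Sparse tactics are only eligible for weights that already satisfy the 2:4 pattern, which is why a sparsity-pruned checkpoint is used here.]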
[03/25/2022-13:24:37] [V] [TRT] =============== Computing costs for
[03/25/2022-13:24:37] [V] [TRT] *************** Autotuning format combination: Int8(12544,196:4,14,1) -> Int8(12544,196:4,14,1) ***************
[03/25/2022-13:24:37] [V] [TRT] --------------- Timing Runner: sections.2.1.conv2.module.weight + QuantizeLinear_497_quantize_scale_node + Conv_501 + Relu_503 (CudaDepthwiseConvolution)
[03/25/2022-13:24:37] [V] [TRT] CudaDepthwiseConvolution has no valid tactics for this config, skipping
[03/25/2022-13:24:37] [V] [TRT] --------------- Timing Runner: sections.2.1.conv2.module.weight + QuantizeLinear_497_quantize_scale_node + Conv_501 + Relu_503 (FusedConvActConvolution)
[03/25/2022-13:24:37] [V] [TRT] FusedConvActConvolution has no valid tactics for this config, skipping
[03/25/2022-13:24:37] [V] [TRT] --------------- Timing Runner: sections.2.1.conv2.module.weight + QuantizeLinear_497_quantize_scale_node + Conv_501 + Relu_503 (CaskConvolution)
[03/25/2022-13:24:37] [V] [TRT] sections.2.1.conv2.module.weight + QuantizeLinear_497_quantize_scale_node + Conv_501 + Relu_503 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_medium_nn_v1 Tactic: 175853789719975416
[03/25/2022-13:24:37] [V] [TRT] Tactic: 175853789719975416 Time: 0.646144
[03/25/2022-13:24:37] [V] [TRT] sections.2.1.conv2.module.weight + QuantizeLinear_497_quantize_scale_node + Conv_501 + Relu_503 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_xregs_medium_nn_v1 Tactic: 2171150287007712632
[03/25/2022-13:24:37] [V] [TRT] Tactic: 2171150287007712632 Time: 0.620672
[03/25/2022-13:24:37] [V] [TRT] sections.2.1.conv2.module.weight + QuantizeLinear_497_quantize_scale_node + Conv_501 + Relu_503 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x64_relu_small_nn_v1 Tactic: 2234457234705232274
[03/25/2022-13:24:37] [V] [TRT] Tactic: 2234457234705232274 Time: 0.585472
[03/25/2022-13:24:37] [V] [TRT] sections.2.1.conv2.module.weight + QuantizeLinear_497_quantize_scale_node + Conv_501 + Relu_503 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x64_relu_medium_nn_v1 Tactic: 5834048089706882838
[03/25/2022-13:24:37] [V] [TRT] Tactic: 5834048089706882838 Time: 0.603392
[03/25/2022-13:24:37] [V] [TRT] sections.2.1.conv2.module.weight + QuantizeLinear_497_quantize_scale_node + Conv_501 + Relu_503 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_small_nn_v1 Tactic: -8626990807754934295
[03/25/2022-13:24:37] [V] [TRT] Tactic: -8626990807754934295 Time: 0.627328
[03/25/2022-13:24:37] [V] [TRT] sections.2.1.conv2.module.weight + QuantizeLinear_497_quantize_scale_node + Conv_501 + Relu_503 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_xregs_small_nn_v1 Tactic: -7303593854972602201
[03/25/2022-13:24:37] [V] [TRT] Tactic: -7303593854972602201 Time: 0.5824
[03/25/2022-13:24:37] [V] [TRT] sections.2.1.conv2.module.weight + QuantizeLinear_497_quantize_scale_node + Conv_501 + Relu_503 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x128_relu_medium_nn_v1 Tactic: -6585664687867083638
[03/25/2022-13:24:37] [V] [TRT] Tactic: -6585664687867083638 Time: 0.57344
[03/25/2022-13:24:37] [V] [TRT] sections.2.1.conv2.module.weight + QuantizeLinear_497_quantize_scale_node + Conv_501 + Relu_503 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x64_relu_xregs_large_nn_v1 Tactic: -3730012925709297561
[03/25/2022-13:24:37] [V] [TRT] Tactic: -3730012925709297561 Time: 0.58688
[03/25/2022-13:24:37] [V] [TRT] sections.2.1.conv2.module.weight + QuantizeLinear_497_quantize_scale_node + Conv_501 + Relu_503 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x128_relu_xregs_large_nn_v1 Tactic: -2277259417488004546
[03/25/2022-13:24:37] [V] [TRT] Tactic: -2277259417488004546 Time: 0.614912
[03/25/2022-13:24:37] [V] [TRT] sections.2.1.conv2.module.weight + QuantizeLinear_497_quantize_scale_node + Conv_501 + Relu_503 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x128_relu_small_nn_v1 Tactic: -683636008127039856
[03/25/2022-13:24:37] [V] [TRT] Tactic: -683636008127039856 Time: 0.566912
[03/25/2022-13:24:37] [V] [TRT] Fastest Tactic: -683636008127039856 Time: 0.566912
[03/25/2022-13:24:37] [V] [TRT] >>>>>>>>>>>>>>> Chose Runner Type: CaskConvolution Tactic: -683636008127039856
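[Editorial aside, not part of the log: every autotuning block in this log ends with a "Fastest Tactic" line followed by the chosen runner, as above. When skimming a long verbose build, a small script can tabulate just those winners per Timing Runner. A minimal sketch; the regular expressions simply match the log lines shown here, and "build.log" is a hypothetical file holding this output:

    import re
    import sys

    # "--------------- Timing Runner: <layer> (<runner>)" opens a timing block.
    RUNNER = re.compile(r"Timing Runner: (.+?) \((\w+)\)")
    # "Fastest Tactic: <id> Time: <t>" closes the measurements for one format combination.
    FASTEST = re.compile(r"Fastest Tactic: (-?\d+) Time: ([\d.]+)")

    layer = None
    for line in open(sys.argv[1]):          # e.g. python winners.py build.log
        m = RUNNER.search(line)
        if m:
            layer = m.group(1)
        m = FASTEST.search(line)
        if m and layer:
            print(f"{layer}\t{m.group(1)}\t{m.group(2)}")

Note that the same layer is retimed once per candidate format combination, so a layer can appear several times with different winners.]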
[03/25/2022-13:24:37] [V] [TRT] *************** Autotuning format combination: Int8(12544,196:4,14,1) -> Int8(1568,196:32,14,1) ***************
[03/25/2022-13:24:37] [V] [TRT] --------------- Timing Runner: sections.2.1.conv2.module.weight + QuantizeLinear_497_quantize_scale_node + Conv_501 + Relu_503 (CaskConvolution)
[03/25/2022-13:24:37] [V] [TRT] sections.2.1.conv2.module.weight + QuantizeLinear_497_quantize_scale_node + Conv_501 + Relu_503 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x64_relu_xregs_large_c32_nn_v1 Tactic: 984309058095623735
[03/25/2022-13:24:37] [V] [TRT] Tactic: 984309058095623735 Time: 0.482304
[03/25/2022-13:24:37] [V] [TRT] sections.2.1.conv2.module.weight + QuantizeLinear_497_quantize_scale_node + Conv_501 + Relu_503 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_small_c32_nn_v1 Tactic: 1100922622480907544
[03/25/2022-13:24:37] [V] [TRT] Tactic: 1100922622480907544 Time: 0.515584
[03/25/2022-13:24:37] [V] [TRT] sections.2.1.conv2.module.weight + QuantizeLinear_497_quantize_scale_node + Conv_501 + Relu_503 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x128_relu_xregs_large_c32_nn_v1 Tactic: 3238312825609165543
[03/25/2022-13:24:37] [V] [TRT] Tactic: 3238312825609165543 Time: 0.505472
[03/25/2022-13:24:37] [V] [TRT] sections.2.1.conv2.module.weight + QuantizeLinear_497_quantize_scale_node + Conv_501 + Relu_503 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x64_relu_medium_c32_nn_v1 Tactic: 3606311198834416176
[03/25/2022-13:24:37] [V] [TRT] Tactic: 3606311198834416176 Time: 0.496128
[03/25/2022-13:24:37] [V] [TRT] sections.2.1.conv2.module.weight + QuantizeLinear_497_quantize_scale_node + Conv_501 + Relu_503 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x128_relu_small_c32_nn_v1 Tactic: 4325765560739862899
[03/25/2022-13:24:38] [V] [TRT] Tactic: 4325765560739862899 Time: 0.466944
[03/25/2022-13:24:38] [V] [TRT] sections.2.1.conv2.module.weight + QuantizeLinear_497_quantize_scale_node + Conv_501 + Relu_503 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x128_relu_medium_c32_nn_v1 Tactic: -4255737803793506479
[03/25/2022-13:24:38] [V] [TRT] Tactic: -4255737803793506479 Time: 0.471552
[03/25/2022-13:24:38] [V] [TRT] sections.2.1.conv2.module.weight + QuantizeLinear_497_quantize_scale_node + Conv_501 + Relu_503 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_xregs_small_c32_nn_v1 Tactic: -3958182351168863467
[03/25/2022-13:24:38] [V] [TRT] Tactic: -3958182351168863467 Time: 0.47872
[03/25/2022-13:24:38] [V] [TRT] sections.2.1.conv2.module.weight + QuantizeLinear_497_quantize_scale_node + Conv_501 + Relu_503 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_xregs_medium_c32_nn_v1 Tactic: -3111968753064955248
[03/25/2022-13:24:38] [V] [TRT] Tactic: -3111968753064955248 Time: 0.509696
[03/25/2022-13:24:38] [V] [TRT] sections.2.1.conv2.module.weight + QuantizeLinear_497_quantize_scale_node + Conv_501 + Relu_503 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_medium_c32_nn_v1 Tactic: -1492575840277333548
[03/25/2022-13:24:38] [V] [TRT] Tactic: -1492575840277333548 Time: 0.531328
[03/25/2022-13:24:38] [V] [TRT] sections.2.1.conv2.module.weight + QuantizeLinear_497_quantize_scale_node + Conv_501 + Relu_503 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x64_relu_small_c32_nn_v1 Tactic: -868495160148524802
[03/25/2022-13:24:38] [V] [TRT] Tactic: -868495160148524802 Time: 0.480256
[03/25/2022-13:24:38] [V] [TRT] Fastest Tactic: 4325765560739862899 Time: 0.466944
[03/25/2022-13:24:38] [V] [TRT] >>>>>>>>>>>>>>> Chose Runner Type: CaskConvolution Tactic: 4325765560739862899
[03/25/2022-13:24:38] [V] [TRT] *************** Autotuning format combination: Int8(1568,196:32,14,1) -> Int8(1568,196:32,14,1) ***************
[03/25/2022-13:24:38] [V] [TRT] --------------- Timing Runner: sections.2.1.conv2.module.weight + QuantizeLinear_497_quantize_scale_node + Conv_501 + Relu_503 (CudaGroupConvolution)
[03/25/2022-13:24:38] [V] [TRT] CudaGroupConvolution has no valid tactics for this config, skipping
[03/25/2022-13:24:38] [V] [TRT] --------------- Timing Runner: sections.2.1.conv2.module.weight + QuantizeLinear_497_quantize_scale_node + Conv_501 + Relu_503 (CudaDepthwiseConvolution)
[03/25/2022-13:24:38] [V] [TRT] CudaDepthwiseConvolution has no valid tactics for this config, skipping
[03/25/2022-13:24:38] [V] [TRT] --------------- Timing Runner: sections.2.1.conv2.module.weight + QuantizeLinear_497_quantize_scale_node + Conv_501 + Relu_503 (FusedConvActConvolution) [03/25/2022-13:24:38] [V] [TRT] FusedConvActConvolution has no valid tactics for this config, skipping [03/25/2022-13:24:38] [V] [TRT] --------------- Timing Runner: sections.2.1.conv2.module.weight + QuantizeLinear_497_quantize_scale_node + Conv_501 + Relu_503 (CaskConvolution) [03/25/2022-13:24:38] [V] [TRT] sections.2.1.conv2.module.weight + QuantizeLinear_497_quantize_scale_node + Conv_501 + Relu_503 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_epifadd Tactic: 177040020707947851 [03/25/2022-13:24:38] [V] [TRT] Tactic: 177040020707947851 Time: 0.155136 [03/25/2022-13:24:38] [V] [TRT] sections.2.1.conv2.module.weight + QuantizeLinear_497_quantize_scale_node + Conv_501 + Relu_503 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r3s3 Tactic: 184229963126259101 [03/25/2022-13:24:38] [V] [TRT] Tactic: 184229963126259101 Time: 0.110464 [03/25/2022-13:24:38] [V] [TRT] sections.2.1.conv2.module.weight + QuantizeLinear_497_quantize_scale_node + Conv_501 + Relu_503 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage1_warpsize4x2x1_g1_tensor8x8x16_t1r3s3 Tactic: 289888059097454627 [03/25/2022-13:24:38] [V] [TRT] Tactic: 289888059097454627 Time: 0.13824 [03/25/2022-13:24:38] [V] [TRT] sections.2.1.conv2.module.weight + QuantizeLinear_497_quantize_scale_node + Conv_501 + Relu_503 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32 Tactic: 328135613486708155 [03/25/2022-13:24:38] [V] [TRT] Tactic: 328135613486708155 Time: 0.228736 [03/25/2022-13:24:38] [V] [TRT] sections.2.1.conv2.module.weight + QuantizeLinear_497_quantize_scale_node + Conv_501 + Relu_503 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r3s3 Tactic: 680740992583869928 [03/25/2022-13:24:38] [V] [TRT] Tactic: 680740992583869928 Time: 0.14336 [03/25/2022-13:24:38] [V] [TRT] sections.2.1.conv2.module.weight + QuantizeLinear_497_quantize_scale_node + Conv_501 + Relu_503 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x192x64_stage3_warpsize4x2x1_g1_tensor16x8x32 Tactic: 1111159740952609683 [03/25/2022-13:24:38] [V] [TRT] Tactic: 1111159740952609683 Time: 0.118784 [03/25/2022-13:24:38] [V] [TRT] sections.2.1.conv2.module.weight + QuantizeLinear_497_quantize_scale_node + Conv_501 + Relu_503 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32 Tactic: 1134860903395928905 [03/25/2022-13:24:38] [V] [TRT] Tactic: 1134860903395928905 Time: 0.10752 [03/25/2022-13:24:38] [V] [TRT] sections.2.1.conv2.module.weight + QuantizeLinear_497_quantize_scale_node + Conv_501 + Relu_503 Set Tactic Name: 
sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32 Tactic: 1276591930377039442 [03/25/2022-13:24:38] [V] [TRT] Tactic: 1276591930377039442 Time: 0.115072 [03/25/2022-13:24:38] [V] [TRT] sections.2.1.conv2.module.weight + QuantizeLinear_497_quantize_scale_node + Conv_501 + Relu_503 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage1_warpsize2x1x1_g1_tensor8x8x16_t1r3s3 Tactic: 1388866374720163187 [03/25/2022-13:24:38] [V] [TRT] Tactic: 1388866374720163187 Time: 0.168064 [03/25/2022-13:24:38] [V] [TRT] sections.2.1.conv2.module.weight + QuantizeLinear_497_quantize_scale_node + Conv_501 + Relu_503 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x96x64_stage5_warpsize2x1x1_g1_tensor16x8x32 Tactic: 1399501420456320585 [03/25/2022-13:24:38] [V] [TRT] Tactic: 1399501420456320585 Time: 0.140544 [03/25/2022-13:24:38] [V] [TRT] sections.2.1.conv2.module.weight + QuantizeLinear_497_quantize_scale_node + Conv_501 + Relu_503 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_epifadd Tactic: 1550399266192842845 [03/25/2022-13:24:38] [V] [TRT] Tactic: 1550399266192842845 Time: 0.142592 [03/25/2022-13:24:38] [V] [TRT] sections.2.1.conv2.module.weight + QuantizeLinear_497_quantize_scale_node + Conv_501 + Relu_503 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x32x64_stage4_warpsize4x1x1_g1_tensor16x8x32_t1r3s3_epifadd Tactic: 1572887561103143487 [03/25/2022-13:24:38] [V] [TRT] Tactic: 1572887561103143487 Time: 0.107648 [03/25/2022-13:24:38] [V] [TRT] sections.2.1.conv2.module.weight + QuantizeLinear_497_quantize_scale_node + Conv_501 + Relu_503 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x96x64_stage5_warpsize2x1x1_g1_tensor16x8x32_t1r3s3 Tactic: 1853122447892949466 [03/25/2022-13:24:38] [V] [TRT] Tactic: 1853122447892949466 Time: 0.140672 [03/25/2022-13:24:38] [V] [TRT] sections.2.1.conv2.module.weight + QuantizeLinear_497_quantize_scale_node + Conv_501 + Relu_503 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage1_warpsize2x2x1_g1_tensor8x8x16 Tactic: 2133329569091732311 [03/25/2022-13:24:38] [V] [TRT] Tactic: 2133329569091732311 Time: 0.138752 [03/25/2022-13:24:38] [V] [TRT] sections.2.1.conv2.module.weight + QuantizeLinear_497_quantize_scale_node + Conv_501 + Relu_503 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage4_warpsize4x1x1_g1_tensor16x8x32 Tactic: 2325023763229477890 [03/25/2022-13:24:38] [V] [TRT] Tactic: 2325023763229477890 Time: 0.077824 [03/25/2022-13:24:38] [V] [TRT] sections.2.1.conv2.module.weight + QuantizeLinear_497_quantize_scale_node + Conv_501 + Relu_503 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16 Tactic: 2579824863892891529 [03/25/2022-13:24:38] [V] [TRT] Tactic: 2579824863892891529 Time: 0.191616 [03/25/2022-13:24:38] [V] 
[TRT] sections.2.1.conv2.module.weight + QuantizeLinear_497_quantize_scale_node + Conv_501 + Relu_503 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x64x64_stage3_warpsize4x1x1_g1_tensor16x8x32 Tactic: 2783960536172159663 [03/25/2022-13:24:38] [V] [TRT] Tactic: 2783960536172159663 Time: 0.076416 [03/25/2022-13:24:38] [V] [TRT] sections.2.1.conv2.module.weight + QuantizeLinear_497_quantize_scale_node + Conv_501 + Relu_503 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x2x1_g1_tensor16x8x32 Tactic: 2821711838552913693 [03/25/2022-13:24:38] [V] [TRT] Tactic: 2821711838552913693 Time: 0.11392 [03/25/2022-13:24:38] [V] [TRT] sections.2.1.conv2.module.weight + QuantizeLinear_497_quantize_scale_node + Conv_501 + Relu_503 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x128_stage3_warpsize1x2x1_g1_sptensor16x8x64 Tactic: 2945009978756227538 [03/25/2022-13:24:38] [V] [TRT] Tactic: 2945009978756227538 Time: 0.080768 [03/25/2022-13:24:38] [V] [TRT] sections.2.1.conv2.module.weight + QuantizeLinear_497_quantize_scale_node + Conv_501 + Relu_503 Set Tactic Name: ampere_int8_i8816cudnn_int8_128x128_ldg16_relu_medium_nt_v1 Tactic: 2985940154541537814 [03/25/2022-13:24:38] [V] [TRT] Tactic: 2985940154541537814 Time: 0.14336 [03/25/2022-13:24:38] [V] [TRT] sections.2.1.conv2.module.weight + QuantizeLinear_497_quantize_scale_node + Conv_501 + Relu_503 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_t1r3s3_epifadd Tactic: 3284282970967328046 [03/25/2022-13:24:38] [V] [TRT] Tactic: 3284282970967328046 Time: 0.165632 [03/25/2022-13:24:38] [V] [TRT] sections.2.1.conv2.module.weight + QuantizeLinear_497_quantize_scale_node + Conv_501 + Relu_503 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_t1r3s3 Tactic: 3401614690060226673 [03/25/2022-13:24:38] [V] [TRT] Tactic: 3401614690060226673 Time: 0.139264 [03/25/2022-13:24:38] [V] [TRT] sections.2.1.conv2.module.weight + QuantizeLinear_497_quantize_scale_node + Conv_501 + Relu_503 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32 Tactic: 3456719996792527006 [03/25/2022-13:24:38] [V] [TRT] Tactic: 3456719996792527006 Time: 0.101504 [03/25/2022-13:24:38] [V] [TRT] sections.2.1.conv2.module.weight + QuantizeLinear_497_quantize_scale_node + Conv_501 + Relu_503 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r3s3_epifadd Tactic: 3512426920013359699 [03/25/2022-13:24:38] [V] [TRT] Tactic: 3512426920013359699 Time: 0.104576 [03/25/2022-13:24:38] [V] [TRT] sections.2.1.conv2.module.weight + QuantizeLinear_497_quantize_scale_node + Conv_501 + Relu_503 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x128_stage3_warpsize4x2x1_g1_sptensor16x8x64 Tactic: 3651043333819148268 [03/25/2022-13:24:38] [V] [TRT] Tactic: 3651043333819148268 Time: 
0.064384 [03/25/2022-13:24:38] [V] [TRT] sections.2.1.conv2.module.weight + QuantizeLinear_497_quantize_scale_node + Conv_501 + Relu_503 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x128_ldg16_relu_medium_nt_v1 Tactic: 3899284354987683408 [03/25/2022-13:24:38] [V] [TRT] Tactic: 3899284354987683408 Time: 0.144384 [03/25/2022-13:24:38] [V] [TRT] sections.2.1.conv2.module.weight + QuantizeLinear_497_quantize_scale_node + Conv_501 + Relu_503 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage6_warpsize2x2x1_g1_tensor16x8x32 Tactic: 4042202769383439184 [03/25/2022-13:24:38] [V] [TRT] Tactic: 4042202769383439184 Time: 0.087168 [03/25/2022-13:24:38] [V] [TRT] sections.2.1.conv2.module.weight + QuantizeLinear_497_quantize_scale_node + Conv_501 + Relu_503 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_large_nt_v1 Tactic: 4182625619810185112 [03/25/2022-13:24:38] [V] [TRT] Tactic: 4182625619810185112 Time: 0.15488 [03/25/2022-13:24:38] [V] [TRT] sections.2.1.conv2.module.weight + QuantizeLinear_497_quantize_scale_node + Conv_501 + Relu_503 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r3s3 Tactic: 4214794893922618058 [03/25/2022-13:24:38] [V] [TRT] Tactic: 4214794893922618058 Time: 0.137216 [03/25/2022-13:24:38] [V] [TRT] sections.2.1.conv2.module.weight + QuantizeLinear_497_quantize_scale_node + Conv_501 + Relu_503 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x32x64_stage4_warpsize4x1x1_g1_tensor16x8x32_epifadd Tactic: 4259547356717612415 [03/25/2022-13:24:38] [V] [TRT] Tactic: 4259547356717612415 Time: 0.109952 [03/25/2022-13:24:38] [V] [TRT] sections.2.1.conv2.module.weight + QuantizeLinear_497_quantize_scale_node + Conv_501 + Relu_503 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r3s3 Tactic: 4384868749799132354 [03/25/2022-13:24:38] [V] [TRT] Tactic: 4384868749799132354 Time: 0.192 [03/25/2022-13:24:38] [V] [TRT] sections.2.1.conv2.module.weight + QuantizeLinear_497_quantize_scale_node + Conv_501 + Relu_503 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x128_stage3_warpsize2x2x1_g1_sptensor16x8x64 Tactic: 4414594337986714263 [03/25/2022-13:24:38] [V] [TRT] Tactic: 4414594337986714263 Time: 0.063744 [03/25/2022-13:24:38] [V] [TRT] sections.2.1.conv2.module.weight + QuantizeLinear_497_quantize_scale_node + Conv_501 + Relu_503 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_small_nt_v1 Tactic: 4717285412741024953 [03/25/2022-13:24:38] [V] [TRT] Tactic: 4717285412741024953 Time: 0.141312 [03/25/2022-13:24:38] [V] [TRT] sections.2.1.conv2.module.weight + QuantizeLinear_497_quantize_scale_node + Conv_501 + Relu_503 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage4_warpsize2x4x1_g1_tensor16x8x32 Tactic: 4734519122557206480 [03/25/2022-13:24:38] [V] [TRT] Tactic: 4734519122557206480 Time: 0.079872 [03/25/2022-13:24:38] [V] [TRT] sections.2.1.conv2.module.weight + QuantizeLinear_497_quantize_scale_node + Conv_501 + Relu_503 Set Tactic Name: 
sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32 Tactic: 4922297020351187339 [03/25/2022-13:24:38] [V] [TRT] Tactic: 4922297020351187339 Time: 0.116864 [03/25/2022-13:24:38] [V] [TRT] sections.2.1.conv2.module.weight + QuantizeLinear_497_quantize_scale_node + Conv_501 + Relu_503 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage1_warpsize2x4x1_g1_tensor8x8x16_t1r3s3 Tactic: 4931167631624420067 [03/25/2022-13:24:38] [V] [TRT] Tactic: 4931167631624420067 Time: 0.13888 [03/25/2022-13:24:38] [V] [TRT] sections.2.1.conv2.module.weight + QuantizeLinear_497_quantize_scale_node + Conv_501 + Relu_503 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage4_warpsize2x4x1_g1_tensor16x8x32_epifadd Tactic: 5121596860264626879 [03/25/2022-13:24:38] [V] [TRT] Tactic: 5121596860264626879 Time: 0.0768 [03/25/2022-13:24:38] [V] [TRT] sections.2.1.conv2.module.weight + QuantizeLinear_497_quantize_scale_node + Conv_501 + Relu_503 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_t1r3s3 Tactic: 5136656982162849059 [03/25/2022-13:24:38] [V] [TRT] Tactic: 5136656982162849059 Time: 0.165888 [03/25/2022-13:24:38] [V] [TRT] sections.2.1.conv2.module.weight + QuantizeLinear_497_quantize_scale_node + Conv_501 + Relu_503 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage6_warpsize2x2x1_g1_tensor16x8x32_epifadd Tactic: 5158259316594207439 [03/25/2022-13:24:38] [V] [TRT] Tactic: 5158259316594207439 Time: 0.086528 [03/25/2022-13:24:38] [V] [TRT] sections.2.1.conv2.module.weight + QuantizeLinear_497_quantize_scale_node + Conv_501 + Relu_503 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32_t1r3s3 Tactic: 5189825015507701541 [03/25/2022-13:24:38] [V] [TRT] Tactic: 5189825015507701541 Time: 0.247424 [03/25/2022-13:24:38] [V] [TRT] sections.2.1.conv2.module.weight + QuantizeLinear_497_quantize_scale_node + Conv_501 + Relu_503 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x32x64_stage4_warpsize2x1x1_g1_tensor16x8x32 Tactic: 5424417905073460656 [03/25/2022-13:24:38] [V] [TRT] Tactic: 5424417905073460656 Time: 0.11584 [03/25/2022-13:24:38] [V] [TRT] sections.2.1.conv2.module.weight + QuantizeLinear_497_quantize_scale_node + Conv_501 + Relu_503 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x256x64_stage3_warpsize2x4x1_g1_tensor16x8x32 Tactic: 5442043907221427810 [03/25/2022-13:24:38] [V] [TRT] Tactic: 5442043907221427810 Time: 0.090624 [03/25/2022-13:24:38] [V] [TRT] sections.2.1.conv2.module.weight + QuantizeLinear_497_quantize_scale_node + Conv_501 + Relu_503 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x256x64_stage3_warpsize2x4x1_g1_tensor16x8x32_t1r3s3 Tactic: 5544365258913999384 [03/25/2022-13:24:38] [V] [TRT] Tactic: 5544365258913999384 Time: 0.08896 
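Note the sm80_xmma_fprop_sparse_conv..._sptensor16x8x64 entries being timed in the same pool as the dense kernels: for this layer the best sparse candidate so far (tactic 4414594337986714263 at 0.063744 ms) is already ahead of the best dense ones (roughly 0.076 ms). These sparse tactics are only eligible because sparsity was enabled for the build; a minimal sketch of the equivalent TensorRT Python builder setup, assuming a 2:4-pruned, quantized ONNX model at a hypothetical path model.onnx:

import tensorrt as trt

LOGGER = trt.Logger(trt.Logger.VERBOSE)

def build_sparse_int8_engine(onnx_path="model.onnx"):  # hypothetical model path
    builder = trt.Builder(LOGGER)
    network = builder.create_network(
        1 << int(trt.NetworkDefinitionCreationFlag.EXPLICIT_BATCH))
    parser = trt.OnnxParser(network, LOGGER)
    with open(onnx_path, "rb") as f:
        if not parser.parse(f.read()):
            raise RuntimeError(str(parser.get_error(0)))
    config = builder.create_builder_config()
    # INT8 kernels; the Q/DQ nodes in the graph supply the scales.
    config.set_flag(trt.BuilderFlag.INT8)
    # Makes the ..._sparse_conv..._sptensor... tactics above eligible for
    # autotuning; without this flag only the dense kernels are timed.
    config.set_flag(trt.BuilderFlag.SPARSE_WEIGHTS)
    return builder.build_serialized_network(network, config)

The builder still times the dense kernels and keeps whichever wins, so enabling SPARSE_WEIGHTS costs nothing for layers whose weights do not fit the 2:4 pattern.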
[03/25/2022-13:24:38] [V] [TRT] sections.2.1.conv2.module.weight + QuantizeLinear_497_quantize_scale_node + Conv_501 + Relu_503 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage1_warpsize2x1x1_g1_tensor8x8x16_t1r3s3 Tactic: 5641967928706599451 [03/25/2022-13:24:38] [V] [TRT] Tactic: 5641967928706599451 Time: 0.222336 [03/25/2022-13:24:38] [V] [TRT] sections.2.1.conv2.module.weight + QuantizeLinear_497_quantize_scale_node + Conv_501 + Relu_503 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x2x1_g1_tensor16x8x32_t1r3s3 Tactic: 5721595115357140131 [03/25/2022-13:24:38] [V] [TRT] Tactic: 5721595115357140131 Time: 0.110336 [03/25/2022-13:24:38] [V] [TRT] sections.2.1.conv2.module.weight + QuantizeLinear_497_quantize_scale_node + Conv_501 + Relu_503 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage4_warpsize4x1x1_g1_tensor16x8x32_t1r3s3_epifadd Tactic: 5966973378912044513 [03/25/2022-13:24:38] [V] [TRT] Tactic: 5966973378912044513 Time: 0.0768 [03/25/2022-13:24:38] [V] [TRT] sections.2.1.conv2.module.weight + QuantizeLinear_497_quantize_scale_node + Conv_501 + Relu_503 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x32x64_stage4_warpsize4x1x1_g1_tensor16x8x32_t1r3s3 Tactic: 6004789655466615912 [03/25/2022-13:24:38] [V] [TRT] Tactic: 6004789655466615912 Time: 0.107648 [03/25/2022-13:24:38] [V] [TRT] sections.2.1.conv2.module.weight + QuantizeLinear_497_quantize_scale_node + Conv_501 + Relu_503 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage4_warpsize2x4x1_g1_tensor16x8x32_t1r3s3 Tactic: 6146901278630392829 [03/25/2022-13:24:38] [V] [TRT] Tactic: 6146901278630392829 Time: 0.079744 [03/25/2022-13:24:38] [V] [TRT] sections.2.1.conv2.module.weight + QuantizeLinear_497_quantize_scale_node + Conv_501 + Relu_503 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage1_warpsize2x1x1_g1_tensor8x8x16 Tactic: 6394572396369862482 [03/25/2022-13:24:38] [V] [TRT] Tactic: 6394572396369862482 Time: 0.223488 [03/25/2022-13:24:38] [V] [TRT] sections.2.1.conv2.module.weight + QuantizeLinear_497_quantize_scale_node + Conv_501 + Relu_503 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage4_warpsize4x2x1_g1_tensor16x8x32_epifadd Tactic: 6434020722187266170 [03/25/2022-13:24:38] [V] [TRT] Tactic: 6434020722187266170 Time: 0.076032 [03/25/2022-13:24:38] [V] [TRT] sections.2.1.conv2.module.weight + QuantizeLinear_497_quantize_scale_node + Conv_501 + Relu_503 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32 Tactic: 6781129591847482048 [03/25/2022-13:24:38] [V] [TRT] Tactic: 6781129591847482048 Time: 0.09024 [03/25/2022-13:24:38] [V] [TRT] sections.2.1.conv2.module.weight + QuantizeLinear_497_quantize_scale_node + Conv_501 + Relu_503 Set Tactic Name: 
sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x96x64_stage4_warpsize4x2x1_g1_tensor16x8x32_t1r3s3 Tactic: 6984451771200230840 [03/25/2022-13:24:38] [V] [TRT] Tactic: 6984451771200230840 Time: 0.116352 [03/25/2022-13:24:38] [V] [TRT] sections.2.1.conv2.module.weight + QuantizeLinear_497_quantize_scale_node + Conv_501 + Relu_503 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r3s3 Tactic: 7048234086361926570 [03/25/2022-13:24:38] [V] [TRT] Tactic: 7048234086361926570 Time: 0.147456 [03/25/2022-13:24:38] [V] [TRT] sections.2.1.conv2.module.weight + QuantizeLinear_497_quantize_scale_node + Conv_501 + Relu_503 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x128x64_stage3_warpsize4x2x1_g1_tensor16x8x32 Tactic: 7077570591813340966 [03/25/2022-13:24:38] [V] [TRT] Tactic: 7077570591813340966 Time: 0.088064 [03/25/2022-13:24:38] [V] [TRT] sections.2.1.conv2.module.weight + QuantizeLinear_497_quantize_scale_node + Conv_501 + Relu_503 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r3s3_epifadd Tactic: 7191893591576074000 [03/25/2022-13:24:38] [V] [TRT] Tactic: 7191893591576074000 Time: 0.139264 [03/25/2022-13:24:38] [V] [TRT] sections.2.1.conv2.module.weight + QuantizeLinear_497_quantize_scale_node + Conv_501 + Relu_503 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x4x1_g1_tensor16x8x32_t1r3s3 Tactic: 7429976449747682901 [03/25/2022-13:24:38] [V] [TRT] Tactic: 7429976449747682901 Time: 0.103552 [03/25/2022-13:24:38] [V] [TRT] sections.2.1.conv2.module.weight + QuantizeLinear_497_quantize_scale_node + Conv_501 + Relu_503 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r3s3_epifadd Tactic: 7438984192263206338 [03/25/2022-13:24:38] [V] [TRT] Tactic: 7438984192263206338 Time: 0.085376 [03/25/2022-13:24:38] [V] [TRT] sections.2.1.conv2.module.weight + QuantizeLinear_497_quantize_scale_node + Conv_501 + Relu_503 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32_epifadd Tactic: 7504901284678552178 [03/25/2022-13:24:38] [V] [TRT] Tactic: 7504901284678552178 Time: 0.075776 [03/25/2022-13:24:38] [V] [TRT] sections.2.1.conv2.module.weight + QuantizeLinear_497_quantize_scale_node + Conv_501 + Relu_503 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r3s3 Tactic: 8096257414008860171 [03/25/2022-13:24:38] [V] [TRT] Tactic: 8096257414008860171 Time: 0.088064 [03/25/2022-13:24:38] [V] [TRT] sections.2.1.conv2.module.weight + QuantizeLinear_497_quantize_scale_node + Conv_501 + Relu_503 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x64x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r3s3 Tactic: 8128112048355596715 [03/25/2022-13:24:38] [V] [TRT] Tactic: 
8128112048355596715 Time: 0.088064 [03/25/2022-13:24:38] [V] [TRT] sections.2.1.conv2.module.weight + QuantizeLinear_497_quantize_scale_node + Conv_501 + Relu_503 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_indexed_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32_epifadd Tactic: 8751622450593766232 [03/25/2022-13:24:38] [V] [TRT] Tactic: 8751622450593766232 Time: 0.08192 [03/25/2022-13:24:38] [V] [TRT] sections.2.1.conv2.module.weight + QuantizeLinear_497_quantize_scale_node + Conv_501 + Relu_503 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_indexed_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32 Tactic: 9064458886956700976 [03/25/2022-13:24:38] [V] [TRT] Tactic: 9064458886956700976 Time: 0.0832 [03/25/2022-13:24:38] [V] [TRT] sections.2.1.conv2.module.weight + QuantizeLinear_497_quantize_scale_node + Conv_501 + Relu_503 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_t1r3s3_epifadd Tactic: 9143438935315839085 [03/25/2022-13:24:38] [V] [TRT] Tactic: 9143438935315839085 Time: 0.137088 [03/25/2022-13:24:38] [V] [TRT] sections.2.1.conv2.module.weight + QuantizeLinear_497_quantize_scale_node + Conv_501 + Relu_503 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage4_warpsize4x2x1_g1_tensor16x8x32 Tactic: -9165697322068360861 [03/25/2022-13:24:38] [V] [TRT] Tactic: -9165697322068360861 Time: 0.078848 [03/25/2022-13:24:38] [V] [TRT] sections.2.1.conv2.module.weight + QuantizeLinear_497_quantize_scale_node + Conv_501 + Relu_503 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_singleBuffer_small_nt_v1 Tactic: -9118785798277698619 [03/25/2022-13:24:38] [V] [TRT] Tactic: -9118785798277698619 Time: 0.137088 [03/25/2022-13:24:38] [V] [TRT] sections.2.1.conv2.module.weight + QuantizeLinear_497_quantize_scale_node + Conv_501 + Relu_503 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16 Tactic: -9108166971364503411 [03/25/2022-13:24:38] [V] [TRT] Tactic: -9108166971364503411 Time: 0.149632 [03/25/2022-13:24:38] [V] [TRT] sections.2.1.conv2.module.weight + QuantizeLinear_497_quantize_scale_node + Conv_501 + Relu_503 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage1_warpsize4x2x1_g1_tensor8x8x16 Tactic: -8861822316054763526 [03/25/2022-13:24:38] [V] [TRT] Tactic: -8861822316054763526 Time: 0.139264 [03/25/2022-13:24:38] [V] [TRT] sections.2.1.conv2.module.weight + QuantizeLinear_497_quantize_scale_node + Conv_501 + Relu_503 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x128_stage3_warpsize1x1x1_g1_sptensor16x8x64 Tactic: -8791277710877987710 [03/25/2022-13:24:38] [V] [TRT] Tactic: -8791277710877987710 Time: 0.10304 [03/25/2022-13:24:38] [V] [TRT] sections.2.1.conv2.module.weight + QuantizeLinear_497_quantize_scale_node + Conv_501 + Relu_503 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x128x64_stage3_warpsize2x2x1_g1_tensor16x8x32 Tactic: 
-8691377209893505057 [03/25/2022-13:24:38] [V] [TRT] Tactic: -8691377209893505057 Time: 0.076544 [03/25/2022-13:24:38] [V] [TRT] sections.2.1.conv2.module.weight + QuantizeLinear_497_quantize_scale_node + Conv_501 + Relu_503 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32 Tactic: -8520292213102999339 [03/25/2022-13:24:38] [V] [TRT] Tactic: -8520292213102999339 Time: 0.11712 [03/25/2022-13:24:38] [V] [TRT] sections.2.1.conv2.module.weight + QuantizeLinear_497_quantize_scale_node + Conv_501 + Relu_503 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage1_warpsize4x1x1_g1_tensor8x8x16_t1r3s3 Tactic: -8475551154769412306 [03/25/2022-13:24:38] [V] [TRT] Tactic: -8475551154769412306 Time: 0.141312 [03/25/2022-13:24:38] [V] [TRT] sections.2.1.conv2.module.weight + QuantizeLinear_497_quantize_scale_node + Conv_501 + Relu_503 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x192x64_stage3_warpsize4x2x1_g1_tensor16x8x32_t1r3s3 Tactic: -8417388128970254446 [03/25/2022-13:24:38] [V] [TRT] Tactic: -8417388128970254446 Time: 0.116736 [03/25/2022-13:24:38] [V] [TRT] sections.2.1.conv2.module.weight + QuantizeLinear_497_quantize_scale_node + Conv_501 + Relu_503 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32 Tactic: -8263994888336646547 [03/25/2022-13:24:38] [V] [TRT] Tactic: -8263994888336646547 Time: 0.0768 [03/25/2022-13:24:38] [V] [TRT] sections.2.1.conv2.module.weight + QuantizeLinear_497_quantize_scale_node + Conv_501 + Relu_503 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32 Tactic: -8205948405243401049 [03/25/2022-13:24:38] [V] [TRT] Tactic: -8205948405243401049 Time: 0.143232 [03/25/2022-13:24:38] [V] [TRT] sections.2.1.conv2.module.weight + QuantizeLinear_497_quantize_scale_node + Conv_501 + Relu_503 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r3s3_epifadd Tactic: -7992068592656168418 [03/25/2022-13:24:38] [V] [TRT] Tactic: -7992068592656168418 Time: 0.087936 [03/25/2022-13:24:38] [V] [TRT] sections.2.1.conv2.module.weight + QuantizeLinear_497_quantize_scale_node + Conv_501 + Relu_503 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x32x64_stage4_warpsize2x1x1_g1_tensor16x8x32_t1r3s3 Tactic: -7898477046581738867 [03/25/2022-13:24:38] [V] [TRT] Tactic: -7898477046581738867 Time: 0.11072 [03/25/2022-13:24:38] [V] [TRT] sections.2.1.conv2.module.weight + QuantizeLinear_497_quantize_scale_node + Conv_501 + Relu_503 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage4_warpsize4x1x1_g1_tensor16x8x32_epifadd Tactic: -7842775553137511386 [03/25/2022-13:24:38] [V] [TRT] Tactic: -7842775553137511386 Time: 0.077696 [03/25/2022-13:24:38] [V] [TRT] sections.2.1.conv2.module.weight + QuantizeLinear_497_quantize_scale_node + Conv_501 + Relu_503 Set Tactic Name: 
sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32 Tactic: -7683887278997527517 [03/25/2022-13:24:38] [V] [TRT] Tactic: -7683887278997527517 Time: 0.130176 [03/25/2022-13:24:38] [V] [TRT] sections.2.1.conv2.module.weight + QuantizeLinear_497_quantize_scale_node + Conv_501 + Relu_503 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x64x64_stage3_warpsize2x2x1_g1_tensor16x8x32 Tactic: -7381370635708568663 [03/25/2022-13:24:38] [V] [TRT] Tactic: -7381370635708568663 Time: 0.089088 [03/25/2022-13:24:38] [V] [TRT] sections.2.1.conv2.module.weight + QuantizeLinear_497_quantize_scale_node + Conv_501 + Relu_503 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r3s3 Tactic: -7129320389887881029 [03/25/2022-13:24:38] [V] [TRT] Tactic: -7129320389887881029 Time: 0.104576 [03/25/2022-13:24:38] [V] [TRT] sections.2.1.conv2.module.weight + QuantizeLinear_497_quantize_scale_node + Conv_501 + Relu_503 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r3s3 Tactic: -6959995514028471820 [03/25/2022-13:24:38] [V] [TRT] Tactic: -6959995514028471820 Time: 0.123136 [03/25/2022-13:24:38] [V] [TRT] sections.2.1.conv2.module.weight + QuantizeLinear_497_quantize_scale_node + Conv_501 + Relu_503 Set Tactic Name: ampere_int8_i8816cudnn_int8_128x128_ldg16_relu_small_nt_v1 Tactic: -6400348606759295499 [03/25/2022-13:24:38] [V] [TRT] Tactic: -6400348606759295499 Time: 0.139264 [03/25/2022-13:24:38] [V] [TRT] sections.2.1.conv2.module.weight + QuantizeLinear_497_quantize_scale_node + Conv_501 + Relu_503 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r3s3 Tactic: -6371781333659293809 [03/25/2022-13:24:38] [V] [TRT] Tactic: -6371781333659293809 Time: 0.147072 [03/25/2022-13:24:38] [V] [TRT] sections.2.1.conv2.module.weight + QuantizeLinear_497_quantize_scale_node + Conv_501 + Relu_503 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x96x64_stage4_warpsize4x2x1_g1_tensor16x8x32 Tactic: -6256128573036943404 [03/25/2022-13:24:38] [V] [TRT] Tactic: -6256128573036943404 Time: 0.115712 [03/25/2022-13:24:38] [V] [TRT] sections.2.1.conv2.module.weight + QuantizeLinear_497_quantize_scale_node + Conv_501 + Relu_503 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x128_ldg16_relu_small_nt_v1 Tactic: -5980889159865208399 [03/25/2022-13:24:38] [V] [TRT] Tactic: -5980889159865208399 Time: 0.141312 [03/25/2022-13:24:38] [V] [TRT] sections.2.1.conv2.module.weight + QuantizeLinear_497_quantize_scale_node + Conv_501 + Relu_503 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_singleBuffer_medium_nt_v1 Tactic: -5766140806760372989 [03/25/2022-13:24:38] [V] [TRT] Tactic: -5766140806760372989 Time: 0.147584 [03/25/2022-13:24:38] [V] [TRT] sections.2.1.conv2.module.weight + QuantizeLinear_497_quantize_scale_node + Conv_501 + Relu_503 Set Tactic Name: 
sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32_t1r3s3_epifadd Tactic: -5709079507616090666 [03/25/2022-13:24:38] [V] [TRT] Tactic: -5709079507616090666 Time: 0.075904 [03/25/2022-13:24:38] [V] [TRT] sections.2.1.conv2.module.weight + QuantizeLinear_497_quantize_scale_node + Conv_501 + Relu_503 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage4_warpsize2x4x1_g1_tensor16x8x32_t1r3s3_epifadd Tactic: -5698636014239116282 [03/25/2022-13:24:38] [V] [TRT] Tactic: -5698636014239116282 Time: 0.07616 [03/25/2022-13:24:38] [V] [TRT] sections.2.1.conv2.module.weight + QuantizeLinear_497_quantize_scale_node + Conv_501 + Relu_503 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage1_warpsize4x1x1_g1_tensor8x8x16 Tactic: -5180570335464125033 [03/25/2022-13:24:38] [V] [TRT] Tactic: -5180570335464125033 Time: 0.145664 [03/25/2022-13:24:38] [V] [TRT] sections.2.1.conv2.module.weight + QuantizeLinear_497_quantize_scale_node + Conv_501 + Relu_503 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r3s3 Tactic: -4933563390723451692 [03/25/2022-13:24:38] [V] [TRT] Tactic: -4933563390723451692 Time: 0.105472 [03/25/2022-13:24:38] [V] [TRT] sections.2.1.conv2.module.weight + QuantizeLinear_497_quantize_scale_node + Conv_501 + Relu_503 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_medium_nt_v1 Tactic: -4516822589357530549 [03/25/2022-13:24:38] [V] [TRT] Tactic: -4516822589357530549 Time: 0.14848 [03/25/2022-13:24:38] [V] [TRT] sections.2.1.conv2.module.weight + QuantizeLinear_497_quantize_scale_node + Conv_501 + Relu_503 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x16x128_stage3_warpsize1x1x1_g1_sptensor16x8x64 Tactic: -4232916483289779353 [03/25/2022-13:24:38] [V] [TRT] Tactic: -4232916483289779353 Time: 0.16576 [03/25/2022-13:24:38] [V] [TRT] sections.2.1.conv2.module.weight + QuantizeLinear_497_quantize_scale_node + Conv_501 + Relu_503 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x128_stage3_warpsize1x4x1_g1_sptensor16x8x64 Tactic: -3460842194336717186 [03/25/2022-13:24:38] [V] [TRT] Tactic: -3460842194336717186 Time: 0.081792 [03/25/2022-13:24:38] [V] [TRT] sections.2.1.conv2.module.weight + QuantizeLinear_497_quantize_scale_node + Conv_501 + Relu_503 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage4_warpsize4x2x1_g1_tensor16x8x32_t1r3s3_epifadd Tactic: -3413217501222406256 [03/25/2022-13:24:38] [V] [TRT] Tactic: -3413217501222406256 Time: 0.074624 [03/25/2022-13:24:38] [V] [TRT] sections.2.1.conv2.module.weight + QuantizeLinear_497_quantize_scale_node + Conv_501 + Relu_503 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32_t1r3s3 Tactic: -3280888557222886418 [03/25/2022-13:24:38] [V] [TRT] Tactic: -3280888557222886418 Time: 0.092544 [03/25/2022-13:24:38] [V] [TRT] sections.2.1.conv2.module.weight + 
QuantizeLinear_497_quantize_scale_node + Conv_501 + Relu_503 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r3s3 Tactic: -3238475748440751107 [03/25/2022-13:24:38] [V] [TRT] Tactic: -3238475748440751107 Time: 0.085248 [03/25/2022-13:24:38] [V] [TRT] sections.2.1.conv2.module.weight + QuantizeLinear_497_quantize_scale_node + Conv_501 + Relu_503 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage4_warpsize4x1x1_g1_tensor16x8x32_t1r3s3 Tactic: -3182884991006484042 [03/25/2022-13:24:38] [V] [TRT] Tactic: -3182884991006484042 Time: 0.076928 [03/25/2022-13:24:38] [V] [TRT] sections.2.1.conv2.module.weight + QuantizeLinear_497_quantize_scale_node + Conv_501 + Relu_503 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r3s3 Tactic: -3173468756112541306 [03/25/2022-13:24:38] [V] [TRT] Tactic: -3173468756112541306 Time: 0.14016 [03/25/2022-13:24:38] [V] [TRT] sections.2.1.conv2.module.weight + QuantizeLinear_497_quantize_scale_node + Conv_501 + Relu_503 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x128_ldg16_relu_large_nt_v1 Tactic: -2917455979290586480 [03/25/2022-13:24:38] [V] [TRT] Tactic: -2917455979290586480 Time: 0.142336 [03/25/2022-13:24:38] [V] [TRT] sections.2.1.conv2.module.weight + QuantizeLinear_497_quantize_scale_node + Conv_501 + Relu_503 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x128x64_stage3_warpsize4x2x1_g1_tensor16x8x32_t1r3s3 Tactic: -2741641298163591508 [03/25/2022-13:24:38] [V] [TRT] Tactic: -2741641298163591508 Time: 0.08704 [03/25/2022-13:24:38] [V] [TRT] sections.2.1.conv2.module.weight + QuantizeLinear_497_quantize_scale_node + Conv_501 + Relu_503 Set Tactic Name: ampere_int8_i8816cudnn_int8_128x128_ldg16_relu_large_nt_v1 Tactic: -2571022005763160364 [03/25/2022-13:24:38] [V] [TRT] Tactic: -2571022005763160364 Time: 0.147328 [03/25/2022-13:24:38] [V] [TRT] sections.2.1.conv2.module.weight + QuantizeLinear_497_quantize_scale_node + Conv_501 + Relu_503 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage1_warpsize2x2x1_g1_tensor8x8x16 Tactic: -2499089240293650188 [03/25/2022-13:24:38] [V] [TRT] Tactic: -2499089240293650188 Time: 0.144384 [03/25/2022-13:24:38] [V] [TRT] sections.2.1.conv2.module.weight + QuantizeLinear_497_quantize_scale_node + Conv_501 + Relu_503 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16 Tactic: -2328318099174473157 [03/25/2022-13:24:38] [V] [TRT] Tactic: -2328318099174473157 Time: 0.150528 [03/25/2022-13:24:38] [V] [TRT] sections.2.1.conv2.module.weight + QuantizeLinear_497_quantize_scale_node + Conv_501 + Relu_503 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_epifadd Tactic: -2083778562631872334 [03/25/2022-13:24:38] [V] [TRT] Tactic: -2083778562631872334 Time: 0.090112 [03/25/2022-13:24:38] [V] [TRT] sections.2.1.conv2.module.weight + QuantizeLinear_497_quantize_scale_node + Conv_501 + 
Relu_503 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x4x1_g1_tensor16x8x32 Tactic: -2054375205435666404 [03/25/2022-13:24:38] [V] [TRT] Tactic: -2054375205435666404 Time: 0.104064 [03/25/2022-13:24:38] [V] [TRT] sections.2.1.conv2.module.weight + QuantizeLinear_497_quantize_scale_node + Conv_501 + Relu_503 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32_t1r3s3 Tactic: -1546787387293556842 [03/25/2022-13:24:38] [V] [TRT] Tactic: -1546787387293556842 Time: 0.076032 [03/25/2022-13:24:38] [V] [TRT] sections.2.1.conv2.module.weight + QuantizeLinear_497_quantize_scale_node + Conv_501 + Relu_503 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x32x64_stage4_warpsize4x1x1_g1_tensor16x8x32 Tactic: -1498626619443284096 [03/25/2022-13:24:38] [V] [TRT] Tactic: -1498626619443284096 Time: 0.109824 [03/25/2022-13:24:38] [V] [TRT] sections.2.1.conv2.module.weight + QuantizeLinear_497_quantize_scale_node + Conv_501 + Relu_503 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r3s3 Tactic: -1471245223605064669 [03/25/2022-13:24:38] [V] [TRT] Tactic: -1471245223605064669 Time: 0.12672 [03/25/2022-13:24:38] [V] [TRT] sections.2.1.conv2.module.weight + QuantizeLinear_497_quantize_scale_node + Conv_501 + Relu_503 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32 Tactic: -1283580231568512025 [03/25/2022-13:24:38] [V] [TRT] Tactic: -1283580231568512025 Time: 0.169216 [03/25/2022-13:24:38] [V] [TRT] sections.2.1.conv2.module.weight + QuantizeLinear_497_quantize_scale_node + Conv_501 + Relu_503 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x64x64_stage3_warpsize4x1x1_g1_tensor16x8x32_t1r3s3 Tactic: -1224421172675151280 [03/25/2022-13:24:38] [V] [TRT] Tactic: -1224421172675151280 Time: 0.07488 [03/25/2022-13:24:38] [V] [TRT] sections.2.1.conv2.module.weight + QuantizeLinear_497_quantize_scale_node + Conv_501 + Relu_503 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_epifadd Tactic: -1173968681844185579 [03/25/2022-13:24:38] [V] [TRT] Tactic: -1173968681844185579 Time: 0.169984 [03/25/2022-13:24:38] [V] [TRT] sections.2.1.conv2.module.weight + QuantizeLinear_497_quantize_scale_node + Conv_501 + Relu_503 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x128x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r3s3 Tactic: -921247911551089037 [03/25/2022-13:24:39] [V] [TRT] Tactic: -921247911551089037 Time: 0.075904 [03/25/2022-13:24:39] [V] [TRT] sections.2.1.conv2.module.weight + QuantizeLinear_497_quantize_scale_node + Conv_501 + Relu_503 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32 Tactic: -762222380308749469 [03/25/2022-13:24:39] [V] [TRT] Tactic: 
-762222380308749469 Time: 0.106624 [03/25/2022-13:24:39] [V] [TRT] sections.2.1.conv2.module.weight + QuantizeLinear_497_quantize_scale_node + Conv_501 + Relu_503 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_epifadd Tactic: -556794153877490941 [03/25/2022-13:24:39] [V] [TRT] Tactic: -556794153877490941 Time: 0.10752 [03/25/2022-13:24:39] [V] [TRT] sections.2.1.conv2.module.weight + QuantizeLinear_497_quantize_scale_node + Conv_501 + Relu_503 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage4_warpsize4x2x1_g1_tensor16x8x32_t1r3s3 Tactic: -516725800067794372 [03/25/2022-13:24:39] [V] [TRT] Tactic: -516725800067794372 Time: 0.077312 [03/25/2022-13:24:39] [V] [TRT] sections.2.1.conv2.module.weight + QuantizeLinear_497_quantize_scale_node + Conv_501 + Relu_503 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_singleBuffer_large_nt_v1 Tactic: -428104331444385564 [03/25/2022-13:24:39] [V] [TRT] Tactic: -428104331444385564 Time: 0.1504 [03/25/2022-13:24:39] [V] [TRT] sections.2.1.conv2.module.weight + QuantizeLinear_497_quantize_scale_node + Conv_501 + Relu_503 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage1_warpsize2x1x1_g1_tensor8x8x16 Tactic: -366411318217594794 [03/25/2022-13:24:39] [V] [TRT] Tactic: -366411318217594794 Time: 0.16896 [03/25/2022-13:24:39] [V] [TRT] sections.2.1.conv2.module.weight + QuantizeLinear_497_quantize_scale_node + Conv_501 + Relu_503 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage1_warpsize2x4x1_g1_tensor8x8x16 Tactic: -351548418071036983 [03/25/2022-13:24:39] [V] [TRT] Tactic: -351548418071036983 Time: 0.13952 [03/25/2022-13:24:39] [V] [TRT] Fastest Tactic: 4414594337986714263 Time: 0.063744 [03/25/2022-13:24:39] [V] [TRT] >>>>>>>>>>>>>>> Chose Runner Type: CaskConvolution Tactic: 4414594337986714263 [03/25/2022-13:24:39] [V] [TRT] =============== Computing costs for [03/25/2022-13:24:39] [V] [TRT] *************** Autotuning format combination: Int8(12544,196:4,14,1), Int8(50176,196:4,14,1) -> Int8(50176,196:4,14,1) *************** [03/25/2022-13:24:39] [V] [TRT] *************** Autotuning format combination: Int8(12544,196:4,14,1), Int8(6272,196:32,14,1) -> Int8(6272,196:32,14,1) *************** [03/25/2022-13:24:39] [V] [TRT] *************** Autotuning format combination: Int8(1568,196:32,14,1), Int8(6272,196:32,14,1) -> Int8(6272,196:32,14,1) *************** [03/25/2022-13:24:39] [V] [TRT] =============== Computing costs for [03/25/2022-13:24:39] [V] [TRT] *************** Autotuning format combination: Int8(50176,196:4,14,1) -> Int8(12544,196:4,14,1) *************** [03/25/2022-13:24:39] [V] [TRT] *************** Autotuning format combination: Int8(50176,196:4,14,1) -> Int8(1568,196:32,14,1) *************** [03/25/2022-13:24:39] [V] [TRT] *************** Autotuning format combination: Int8(6272,196:32,14,1) -> Int8(1568,196:32,14,1) *************** [03/25/2022-13:24:39] [V] [TRT] Skip timing cache hit (epiFadd mismatch) for node: sections.2.2.conv1.module.weight + QuantizeLinear_534_quantize_scale_node + Conv_538 + Relu_540, LayerImpl: CaskConvolution, tactic: -6779804930216439173 [03/25/2022-13:24:39] [V] [TRT] --------------- Timing Runner: 
sections.2.2.conv1.module.weight + QuantizeLinear_534_quantize_scale_node + Conv_538 + Relu_540 (CudaGroupConvolution) [03/25/2022-13:24:39] [V] [TRT] CudaGroupConvolution has no valid tactics for this config, skipping [03/25/2022-13:24:39] [V] [TRT] --------------- Timing Runner: sections.2.2.conv1.module.weight + QuantizeLinear_534_quantize_scale_node + Conv_538 + Relu_540 (CudaDepthwiseConvolution) [03/25/2022-13:24:39] [V] [TRT] CudaDepthwiseConvolution has no valid tactics for this config, skipping [03/25/2022-13:24:39] [V] [TRT] --------------- Timing Runner: sections.2.2.conv1.module.weight + QuantizeLinear_534_quantize_scale_node + Conv_538 + Relu_540 (FusedConvActConvolution) [03/25/2022-13:24:39] [V] [TRT] FusedConvActConvolution has no valid tactics for this config, skipping [03/25/2022-13:24:39] [V] [TRT] --------------- Timing Runner: sections.2.2.conv1.module.weight + QuantizeLinear_534_quantize_scale_node + Conv_538 + Relu_540 (CaskConvolution) [03/25/2022-13:24:39] [V] [TRT] sections.2.2.conv1.module.weight + QuantizeLinear_534_quantize_scale_node + Conv_538 + Relu_540 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage1_warpsize4x1x1_g1_tensor8x8x16_t1r1s1 Tactic: 68468667201176803 [03/25/2022-13:24:39] [V] [TRT] Tactic: 68468667201176803 Time: 0.074752 [03/25/2022-13:24:39] [V] [TRT] sections.2.2.conv1.module.weight + QuantizeLinear_534_quantize_scale_node + Conv_538 + Relu_540 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x192x64_stage3_warpsize4x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 125145153013230687 [03/25/2022-13:24:39] [V] [TRT] Tactic: 125145153013230687 Time: 0.067072 [03/25/2022-13:24:39] [V] [TRT] sections.2.2.conv1.module.weight + QuantizeLinear_534_quantize_scale_node + Conv_538 + Relu_540 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x128_ldg16_relu_interior_nt_v1 Tactic: 434957160407688216 [03/25/2022-13:24:39] [V] [TRT] Tactic: 434957160407688216 Time: 0.078592 [03/25/2022-13:24:39] [V] [TRT] sections.2.2.conv1.module.weight + QuantizeLinear_534_quantize_scale_node + Conv_538 + Relu_540 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage4_warpsize4x2x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: 805889586762897346 [03/25/2022-13:24:39] [V] [TRT] Tactic: 805889586762897346 Time: 0.046336 [03/25/2022-13:24:39] [V] [TRT] sections.2.2.conv1.module.weight + QuantizeLinear_534_quantize_scale_node + Conv_538 + Relu_540 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x128_stage3_warpsize4x2x1_g1_sptensor16x8x64_t1r1s1 Tactic: 857001784974286465 [03/25/2022-13:24:39] [V] [TRT] Tactic: 857001784974286465 Time: 0.039168 [03/25/2022-13:24:39] [V] [TRT] sections.2.2.conv1.module.weight + QuantizeLinear_534_quantize_scale_node + Conv_538 + Relu_540 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 1214130898909872671 [03/25/2022-13:24:39] [V] [TRT] Tactic: 1214130898909872671 Time: 0.090752 [03/25/2022-13:24:39] [V] [TRT] sections.2.2.conv1.module.weight + QuantizeLinear_534_quantize_scale_node + Conv_538 + Relu_540 Set Tactic Name: 
sm80_xmma_conv_fprop_smallk_i8i8_i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_warptilesize16x32_stage5_warpsize1x4x2_r1s1_u1v1_hw0_c1024_scalebias_relu Tactic: 1263683011321748626 [03/25/2022-13:24:39] [V] [TRT] Tactic: 1263683011321748626 Time: 0.039552 [03/25/2022-13:24:39] [V] [TRT] sections.2.2.conv1.module.weight + QuantizeLinear_534_quantize_scale_node + Conv_538 + Relu_540 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage4_warpsize4x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 1278425129871930205 [03/25/2022-13:24:39] [V] [TRT] Tactic: 1278425129871930205 Time: 0.045312 [03/25/2022-13:24:39] [V] [TRT] sections.2.2.conv1.module.weight + QuantizeLinear_534_quantize_scale_node + Conv_538 + Relu_540 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 1583811548148740665 [03/25/2022-13:24:39] [V] [TRT] Tactic: 1583811548148740665 Time: 0.068224 [03/25/2022-13:24:39] [V] [TRT] sections.2.2.conv1.module.weight + QuantizeLinear_534_quantize_scale_node + Conv_538 + Relu_540 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 1701344857577810806 [03/25/2022-13:24:39] [V] [TRT] Tactic: 1701344857577810806 Time: 0.060672 [03/25/2022-13:24:39] [V] [TRT] sections.2.2.conv1.module.weight + QuantizeLinear_534_quantize_scale_node + Conv_538 + Relu_540 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x32x64_stage4_warpsize2x1x1_g1_tensor16x8x32_t1r1s1 Tactic: 1797231177354918208 [03/25/2022-13:24:39] [V] [TRT] Tactic: 1797231177354918208 Time: 0.081408 [03/25/2022-13:24:39] [V] [TRT] sections.2.2.conv1.module.weight + QuantizeLinear_534_quantize_scale_node + Conv_538 + Relu_540 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x192x64_stage3_warpsize4x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 2004812516525036381 [03/25/2022-13:24:39] [V] [TRT] Tactic: 2004812516525036381 Time: 0.060672 [03/25/2022-13:24:39] [V] [TRT] sections.2.2.conv1.module.weight + QuantizeLinear_534_quantize_scale_node + Conv_538 + Relu_540 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x128x64_stage3_warpsize4x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 2030033463723799063 [03/25/2022-13:24:39] [V] [TRT] Tactic: 2030033463723799063 Time: 0.050304 [03/25/2022-13:24:39] [V] [TRT] sections.2.2.conv1.module.weight + QuantizeLinear_534_quantize_scale_node + Conv_538 + Relu_540 Set Tactic Name: ampere_int8_i8816cudnn_int8_128x128_ldg16_relu_interior_nt_v1 Tactic: 2346437292116182513 [03/25/2022-13:24:39] [V] [TRT] Tactic: 2346437292116182513 Time: 0.07616 [03/25/2022-13:24:39] [V] [TRT] sections.2.2.conv1.module.weight + QuantizeLinear_534_quantize_scale_node + Conv_538 + Relu_540 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 2376898825218218566 [03/25/2022-13:24:39] [V] [TRT] Tactic: 2376898825218218566 Time: 0.044416 [03/25/2022-13:24:39] [V] [TRT] 
sections.2.2.conv1.module.weight + QuantizeLinear_534_quantize_scale_node + Conv_538 + Relu_540 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_singleBuffer_interior_nt_v1 Tactic: 2522133112320625287 [03/25/2022-13:24:39] [V] [TRT] Tactic: 2522133112320625287 Time: 0.074752 [03/25/2022-13:24:39] [V] [TRT] sections.2.2.conv1.module.weight + QuantizeLinear_534_quantize_scale_node + Conv_538 + Relu_540 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage4_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 2548171972648455240 [03/25/2022-13:24:39] [V] [TRT] Tactic: 2548171972648455240 Time: 0.047872 [03/25/2022-13:24:39] [V] [TRT] sections.2.2.conv1.module.weight + QuantizeLinear_534_quantize_scale_node + Conv_538 + Relu_540 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage1_warpsize2x1x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: 2548946449357458230 [03/25/2022-13:24:39] [V] [TRT] Tactic: 2548946449357458230 Time: 0.0896 [03/25/2022-13:24:39] [V] [TRT] sections.2.2.conv1.module.weight + QuantizeLinear_534_quantize_scale_node + Conv_538 + Relu_540 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x128_stage3_warpsize1x1x1_g1_sptensor16x8x64_simple_t1r1s1_no_preds Tactic: 2570666021825229009 [03/25/2022-13:24:39] [V] [TRT] Tactic: 2570666021825229009 Time: 0.077824 [03/25/2022-13:24:39] [V] [TRT] sections.2.2.conv1.module.weight + QuantizeLinear_534_quantize_scale_node + Conv_538 + Relu_540 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x16x128_stage3_warpsize1x1x1_g1_sptensor16x8x64_simple_t1r1s1 Tactic: 2678520742286844763 [03/25/2022-13:24:39] [V] [TRT] Tactic: 2678520742286844763 Time: 0.11328 [03/25/2022-13:24:39] [V] [TRT] sections.2.2.conv1.module.weight + QuantizeLinear_534_quantize_scale_node + Conv_538 + Relu_540 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 2756291002030759362 [03/25/2022-13:24:39] [V] [TRT] Tactic: 2756291002030759362 Time: 0.058496 [03/25/2022-13:24:39] [V] [TRT] sections.2.2.conv1.module.weight + QuantizeLinear_534_quantize_scale_node + Conv_538 + Relu_540 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x128x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 2972948223367788520 [03/25/2022-13:24:39] [V] [TRT] Tactic: 2972948223367788520 Time: 0.048768 [03/25/2022-13:24:39] [V] [TRT] sections.2.2.conv1.module.weight + QuantizeLinear_534_quantize_scale_node + Conv_538 + Relu_540 Set Tactic Name: ampere_int8_i8816cudnn_int8_128x128_ldg16_relu_medium_nt_v1 Tactic: 2985940154541537814 [03/25/2022-13:24:39] [V] [TRT] Tactic: 2985940154541537814 Time: 0.076032 [03/25/2022-13:24:39] [V] [TRT] sections.2.2.conv1.module.weight + QuantizeLinear_534_quantize_scale_node + Conv_538 + Relu_540 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x96x64_stage5_warpsize2x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 3043273137345374664 [03/25/2022-13:24:39] [V] [TRT] Tactic: 3043273137345374664 Time: 0.09088 
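sections.2.2.conv1 is now profiled from scratch even though many of the same kernel names were just timed for sections.2.1.conv2, and the "Skip timing cache hit (epiFadd mismatch)" line above shows the local timing cache declining to reuse a prior measurement whose epilogue variant apparently differs. For repeated builds of the same or similar networks, a persistent timing cache avoids redoing this autotuning; a minimal sketch against the TensorRT 8.x Python API (cache file name hypothetical):

import os
import tensorrt as trt

CACHE_FILE = "timing.cache"  # hypothetical path for the persisted cache

def attach_timing_cache(config):
    """Seed the builder config from a previous run's cache, if any."""
    blob = open(CACHE_FILE, "rb").read() if os.path.exists(CACHE_FILE) else b""
    cache = config.create_timing_cache(blob)
    config.set_timing_cache(cache, ignore_mismatch=False)
    return cache

def save_timing_cache(cache):
    # ITimingCache.serialize() returns a buffer that later builds on the
    # same GPU and TensorRT version can be seeded with.
    with open(CACHE_FILE, "wb") as f:
        f.write(cache.serialize())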
[03/25/2022-13:24:39] [V] [TRT] sections.2.2.conv1.module.weight + QuantizeLinear_534_quantize_scale_node + Conv_538 + Relu_540 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 3221677093659484230 [03/25/2022-13:24:39] [V] [TRT] Tactic: 3221677093659484230 Time: 0.075008 [03/25/2022-13:24:39] [V] [TRT] sections.2.2.conv1.module.weight + QuantizeLinear_534_quantize_scale_node + Conv_538 + Relu_540 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r1s1 Tactic: 3242897809704328258 [03/25/2022-13:24:39] [V] [TRT] Tactic: 3242897809704328258 Time: 0.078336 [03/25/2022-13:24:39] [V] [TRT] sections.2.2.conv1.module.weight + QuantizeLinear_534_quantize_scale_node + Conv_538 + Relu_540 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: 3312456766204252694 [03/25/2022-13:24:39] [V] [TRT] Tactic: 3312456766204252694 Time: 0.099456 [03/25/2022-13:24:39] [V] [TRT] sections.2.2.conv1.module.weight + QuantizeLinear_534_quantize_scale_node + Conv_538 + Relu_540 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 3538565962642681625 [03/25/2022-13:24:39] [V] [TRT] Tactic: 3538565962642681625 Time: 0.067072 [03/25/2022-13:24:39] [V] [TRT] sections.2.2.conv1.module.weight + QuantizeLinear_534_quantize_scale_node + Conv_538 + Relu_540 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage1_warpsize2x2x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: 3541919052468401776 [03/25/2022-13:24:39] [V] [TRT] Tactic: 3541919052468401776 Time: 0.070912 [03/25/2022-13:24:39] [V] [TRT] sections.2.2.conv1.module.weight + QuantizeLinear_534_quantize_scale_node + Conv_538 + Relu_540 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 3593397928177382100 [03/25/2022-13:24:39] [V] [TRT] Tactic: 3593397928177382100 Time: 0.091392 [03/25/2022-13:24:39] [V] [TRT] sections.2.2.conv1.module.weight + QuantizeLinear_534_quantize_scale_node + Conv_538 + Relu_540 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x64x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 3670282018109435863 [03/25/2022-13:24:39] [V] [TRT] Tactic: 3670282018109435863 Time: 0.056704 [03/25/2022-13:24:39] [V] [TRT] sections.2.2.conv1.module.weight + QuantizeLinear_534_quantize_scale_node + Conv_538 + Relu_540 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 3671413346254027573 [03/25/2022-13:24:39] [V] [TRT] Tactic: 3671413346254027573 Time: 0.058752 [03/25/2022-13:24:39] [V] [TRT] sections.2.2.conv1.module.weight + QuantizeLinear_534_quantize_scale_node + Conv_538 + Relu_540 Set Tactic Name: 
ampere_int8_i8816cudnn_int8_256x128_ldg16_relu_medium_nt_v1 Tactic: 3899284354987683408 [03/25/2022-13:24:39] [V] [TRT] Tactic: 3899284354987683408 Time: 0.077056 [03/25/2022-13:24:39] [V] [TRT] sections.2.2.conv1.module.weight + QuantizeLinear_534_quantize_scale_node + Conv_538 + Relu_540 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: 3927509214678622419 [03/25/2022-13:24:39] [V] [TRT] Tactic: 3927509214678622419 Time: 0.077056 [03/25/2022-13:24:39] [V] [TRT] sections.2.2.conv1.module.weight + QuantizeLinear_534_quantize_scale_node + Conv_538 + Relu_540 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 4112572034735311841 [03/25/2022-13:24:39] [V] [TRT] Tactic: 4112572034735311841 Time: 0.115584 [03/25/2022-13:24:39] [V] [TRT] sections.2.2.conv1.module.weight + QuantizeLinear_534_quantize_scale_node + Conv_538 + Relu_540 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 4239974928951431644 [03/25/2022-13:24:39] [V] [TRT] Tactic: 4239974928951431644 Time: 0.070016 [03/25/2022-13:24:39] [V] [TRT] sections.2.2.conv1.module.weight + QuantizeLinear_534_quantize_scale_node + Conv_538 + Relu_540 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage6_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 4610760414797216079 [03/25/2022-13:24:39] [V] [TRT] Tactic: 4610760414797216079 Time: 0.059776 [03/25/2022-13:24:39] [V] [TRT] sections.2.2.conv1.module.weight + QuantizeLinear_534_quantize_scale_node + Conv_538 + Relu_540 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_small_nt_v1 Tactic: 4717285412741024953 [03/25/2022-13:24:39] [V] [TRT] Tactic: 4717285412741024953 Time: 0.075776 [03/25/2022-13:24:39] [V] [TRT] sections.2.2.conv1.module.weight + QuantizeLinear_534_quantize_scale_node + Conv_538 + Relu_540 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 4796956614760326119 [03/25/2022-13:24:39] [V] [TRT] Tactic: 4796956614760326119 Time: 0.053632 [03/25/2022-13:24:39] [V] [TRT] sections.2.2.conv1.module.weight + QuantizeLinear_534_quantize_scale_node + Conv_538 + Relu_540 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x128_stage3_warpsize2x2x1_g1_sptensor16x8x64_t1r1s1_no_preds Tactic: 4909502217677847353 [03/25/2022-13:24:39] [V] [TRT] Tactic: 4909502217677847353 Time: 0.04096 [03/25/2022-13:24:39] [V] [TRT] sections.2.2.conv1.module.weight + QuantizeLinear_534_quantize_scale_node + Conv_538 + Relu_540 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 4919361344804309192 [03/25/2022-13:24:39] [V] [TRT] Tactic: 4919361344804309192 Time: 0.0896 [03/25/2022-13:24:39] [V] [TRT] sections.2.2.conv1.module.weight + QuantizeLinear_534_quantize_scale_node + Conv_538 + Relu_540 Set Tactic 
Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x128_stage3_warpsize1x2x1_g1_sptensor16x8x64_simple_t1r1s1 Tactic: 5043674678294309681 [03/25/2022-13:24:39] [V] [TRT] Tactic: 5043674678294309681 Time: 0.059008 [03/25/2022-13:24:39] [V] [TRT] sections.2.2.conv1.module.weight + QuantizeLinear_534_quantize_scale_node + Conv_538 + Relu_540 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_interior_nt_v1 Tactic: 5126565865931538390 [03/25/2022-13:24:39] [V] [TRT] Tactic: 5126565865931538390 Time: 0.077312 [03/25/2022-13:24:39] [V] [TRT] sections.2.2.conv1.module.weight + QuantizeLinear_534_quantize_scale_node + Conv_538 + Relu_540 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x64x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 5204702486885981735 [03/25/2022-13:24:39] [V] [TRT] Tactic: 5204702486885981735 Time: 0.058112 [03/25/2022-13:24:39] [V] [TRT] sections.2.2.conv1.module.weight + QuantizeLinear_534_quantize_scale_node + Conv_538 + Relu_540 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: 5375256703210220108 [03/25/2022-13:24:39] [V] [TRT] Tactic: 5375256703210220108 Time: 0.055296 [03/25/2022-13:24:39] [V] [TRT] sections.2.2.conv1.module.weight + QuantizeLinear_534_quantize_scale_node + Conv_538 + Relu_540 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x128_stage3_warpsize2x2x1_g1_sptensor16x8x64_simple_t1r1s1 Tactic: 5424258848951129084 [03/25/2022-13:24:39] [V] [TRT] Tactic: 5424258848951129084 Time: 0.041216 [03/25/2022-13:24:39] [V] [TRT] sections.2.2.conv1.module.weight + QuantizeLinear_534_quantize_scale_node + Conv_538 + Relu_540 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 5443897483205284103 [03/25/2022-13:24:39] [V] [TRT] Tactic: 5443897483205284103 Time: 0.060544 [03/25/2022-13:24:39] [V] [TRT] sections.2.2.conv1.module.weight + QuantizeLinear_534_quantize_scale_node + Conv_538 + Relu_540 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage4_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 5707566217891294846 [03/25/2022-13:24:39] [V] [TRT] Tactic: 5707566217891294846 Time: 0.049536 [03/25/2022-13:24:39] [V] [TRT] sections.2.2.conv1.module.weight + QuantizeLinear_534_quantize_scale_node + Conv_538 + Relu_540 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x4x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 5986622376339202983 [03/25/2022-13:24:39] [V] [TRT] Tactic: 5986622376339202983 Time: 0.0672 [03/25/2022-13:24:39] [V] [TRT] sections.2.2.conv1.module.weight + QuantizeLinear_534_quantize_scale_node + Conv_538 + Relu_540 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x128x64_stage3_warpsize4x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 6007888770437705057 [03/25/2022-13:24:39] [V] [TRT] Tactic: 6007888770437705057 Time: 0.053632 [03/25/2022-13:24:39] [V] [TRT] 
sections.2.2.conv1.module.weight + QuantizeLinear_534_quantize_scale_node + Conv_538 + Relu_540 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32_t1r1s1 Tactic: 6405251167055673379 [03/25/2022-13:24:39] [V] [TRT] Tactic: 6405251167055673379 Time: 0.054528 [03/25/2022-13:24:39] [V] [TRT] sections.2.2.conv1.module.weight + QuantizeLinear_534_quantize_scale_node + Conv_538 + Relu_540 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage4_warpsize4x1x1_g1_tensor16x8x32_t1r1s1 Tactic: 6433368103202497147 [03/25/2022-13:24:39] [V] [TRT] Tactic: 6433368103202497147 Time: 0.050688 [03/25/2022-13:24:39] [V] [TRT] sections.2.2.conv1.module.weight + QuantizeLinear_534_quantize_scale_node + Conv_538 + Relu_540 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 6441948709525127755 [03/25/2022-13:24:39] [V] [TRT] Tactic: 6441948709525127755 Time: 0.093312 [03/25/2022-13:24:39] [V] [TRT] sections.2.2.conv1.module.weight + QuantizeLinear_534_quantize_scale_node + Conv_538 + Relu_540 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x256x64_stage3_warpsize2x4x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 6443933097134654777 [03/25/2022-13:24:39] [V] [TRT] Tactic: 6443933097134654777 Time: 0.053504 [03/25/2022-13:24:39] [V] [TRT] sections.2.2.conv1.module.weight + QuantizeLinear_534_quantize_scale_node + Conv_538 + Relu_540 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 6457435868048963632 [03/25/2022-13:24:39] [V] [TRT] Tactic: 6457435868048963632 Time: 0.059648 [03/25/2022-13:24:39] [V] [TRT] sections.2.2.conv1.module.weight + QuantizeLinear_534_quantize_scale_node + Conv_538 + Relu_540 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x128_stage3_warpsize1x1x1_g1_sptensor16x8x64_t1r1s1 Tactic: 6510345569544721081 [03/25/2022-13:24:39] [V] [TRT] Tactic: 6510345569544721081 Time: 0.077952 [03/25/2022-13:24:39] [V] [TRT] sections.2.2.conv1.module.weight + QuantizeLinear_534_quantize_scale_node + Conv_538 + Relu_540 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 6793988781414507278 [03/25/2022-13:24:39] [V] [TRT] Tactic: 6793988781414507278 Time: 0.046464 [03/25/2022-13:24:39] [V] [TRT] sections.2.2.conv1.module.weight + QuantizeLinear_534_quantize_scale_node + Conv_538 + Relu_540 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 6880710371738875469 [03/25/2022-13:24:39] [V] [TRT] Tactic: 6880710371738875469 Time: 0.063232 [03/25/2022-13:24:39] [V] [TRT] sections.2.2.conv1.module.weight + QuantizeLinear_534_quantize_scale_node + Conv_538 + Relu_540 Set Tactic Name: 
sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 6925201228918187099 [03/25/2022-13:24:39] [V] [TRT] Tactic: 6925201228918187099 Time: 0.047232 [03/25/2022-13:24:39] [V] [TRT] sections.2.2.conv1.module.weight + QuantizeLinear_534_quantize_scale_node + Conv_538 + Relu_540 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 6991524515605108718 [03/25/2022-13:24:39] [V] [TRT] Tactic: 6991524515605108718 Time: 0.0736 [03/25/2022-13:24:39] [V] [TRT] sections.2.2.conv1.module.weight + QuantizeLinear_534_quantize_scale_node + Conv_538 + Relu_540 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x4x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 7245509442265271220 [03/25/2022-13:24:39] [V] [TRT] Tactic: 7245509442265271220 Time: 0.066176 [03/25/2022-13:24:39] [V] [TRT] sections.2.2.conv1.module.weight + QuantizeLinear_534_quantize_scale_node + Conv_538 + Relu_540 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x64x64_stage3_warpsize4x1x1_g1_tensor16x8x32_t1r1s1 Tactic: 7318929579222925725 [03/25/2022-13:24:39] [V] [TRT] Tactic: 7318929579222925725 Time: 0.054272 [03/25/2022-13:24:39] [V] [TRT] sections.2.2.conv1.module.weight + QuantizeLinear_534_quantize_scale_node + Conv_538 + Relu_540 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: 7731430299029542276 [03/25/2022-13:24:39] [V] [TRT] Tactic: 7731430299029542276 Time: 0.045824 [03/25/2022-13:24:39] [V] [TRT] sections.2.2.conv1.module.weight + QuantizeLinear_534_quantize_scale_node + Conv_538 + Relu_540 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage4_warpsize2x4x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: 7738495016763012180 [03/25/2022-13:24:39] [V] [TRT] Tactic: 7738495016763012180 Time: 0.046336 [03/25/2022-13:24:39] [V] [TRT] sections.2.2.conv1.module.weight + QuantizeLinear_534_quantize_scale_node + Conv_538 + Relu_540 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x128_stage3_warpsize1x4x1_g1_sptensor16x8x64_t1r1s1_no_preds Tactic: 7886967395128926382 [03/25/2022-13:24:39] [V] [TRT] Tactic: 7886967395128926382 Time: 0.0512 [03/25/2022-13:24:39] [V] [TRT] sections.2.2.conv1.module.weight + QuantizeLinear_534_quantize_scale_node + Conv_538 + Relu_540 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage4_warpsize4x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 8142283985160822229 [03/25/2022-13:24:39] [V] [TRT] Tactic: 8142283985160822229 Time: 0.048128 [03/25/2022-13:24:39] [V] [TRT] sections.2.2.conv1.module.weight + QuantizeLinear_534_quantize_scale_node + Conv_538 + Relu_540 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage4_warpsize2x4x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 8173975624668590862 
[03/25/2022-13:24:39] [V] [TRT] Tactic: 8173975624668590862 Time: 0.048384 [03/25/2022-13:24:39] [V] [TRT] sections.2.2.conv1.module.weight + QuantizeLinear_534_quantize_scale_node + Conv_538 + Relu_540 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage4_warpsize4x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 8234775147403903473 [03/25/2022-13:24:39] [V] [TRT] Tactic: 8234775147403903473 Time: 0.049024 [03/25/2022-13:24:39] [V] [TRT] sections.2.2.conv1.module.weight + QuantizeLinear_534_quantize_scale_node + Conv_538 + Relu_540 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x128x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 8524082966802584889 [03/25/2022-13:24:39] [V] [TRT] Tactic: 8524082966802584889 Time: 0.050432 [03/25/2022-13:24:39] [V] [TRT] sections.2.2.conv1.module.weight + QuantizeLinear_534_quantize_scale_node + Conv_538 + Relu_540 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage1_warpsize4x1x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: 8684013308930763400 [03/25/2022-13:24:39] [V] [TRT] Tactic: 8684013308930763400 Time: 0.073344 [03/25/2022-13:24:39] [V] [TRT] sections.2.2.conv1.module.weight + QuantizeLinear_534_quantize_scale_node + Conv_538 + Relu_540 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x128x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 8765382722978397630 [03/25/2022-13:24:39] [V] [TRT] Tactic: 8765382722978397630 Time: 0.050816 [03/25/2022-13:24:39] [V] [TRT] sections.2.2.conv1.module.weight + QuantizeLinear_534_quantize_scale_node + Conv_538 + Relu_540 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 8843193587782643431 [03/25/2022-13:24:39] [V] [TRT] Tactic: 8843193587782643431 Time: 0.067072 [03/25/2022-13:24:39] [V] [TRT] sections.2.2.conv1.module.weight + QuantizeLinear_534_quantize_scale_node + Conv_538 + Relu_540 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x128_stage3_warpsize1x4x1_g1_sptensor16x8x64_simple_t1r1s1 Tactic: 8883810517410230831 [03/25/2022-13:24:39] [V] [TRT] Tactic: 8883810517410230831 Time: 0.05184 [03/25/2022-13:24:39] [V] [TRT] sections.2.2.conv1.module.weight + QuantizeLinear_534_quantize_scale_node + Conv_538 + Relu_540 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 8930797211803511337 [03/25/2022-13:24:39] [V] [TRT] Tactic: 8930797211803511337 Time: 0.068224 [03/25/2022-13:24:39] [V] [TRT] sections.2.2.conv1.module.weight + QuantizeLinear_534_quantize_scale_node + Conv_538 + Relu_540 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 8935070489925739043 [03/25/2022-13:24:39] [V] [TRT] Tactic: 8935070489925739043 Time: 0.05056 [03/25/2022-13:24:39] [V] [TRT] sections.2.2.conv1.module.weight + QuantizeLinear_534_quantize_scale_node + Conv_538 + 
Relu_540 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x16x128_stage3_warpsize1x1x1_g1_sptensor16x8x64_t1r1s1 Tactic: 9062173295331155069 [03/25/2022-13:24:39] [V] [TRT] Tactic: 9062173295331155069 Time: 0.11328 [03/25/2022-13:24:39] [V] [TRT] sections.2.2.conv1.module.weight + QuantizeLinear_534_quantize_scale_node + Conv_538 + Relu_540 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_singleBuffer_small_nt_v1 Tactic: -9118785798277698619 [03/25/2022-13:24:39] [V] [TRT] Tactic: -9118785798277698619 Time: 0.075904 [03/25/2022-13:24:39] [V] [TRT] sections.2.2.conv1.module.weight + QuantizeLinear_534_quantize_scale_node + Conv_538 + Relu_540 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -8985599729413291927 [03/25/2022-13:24:39] [V] [TRT] Tactic: -8985599729413291927 Time: 0.054528 [03/25/2022-13:24:39] [V] [TRT] sections.2.2.conv1.module.weight + QuantizeLinear_534_quantize_scale_node + Conv_538 + Relu_540 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x32x64_stage4_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -8972697510150675429 [03/25/2022-13:24:39] [V] [TRT] Tactic: -8972697510150675429 Time: 0.06976 [03/25/2022-13:24:39] [V] [TRT] sections.2.2.conv1.module.weight + QuantizeLinear_534_quantize_scale_node + Conv_538 + Relu_540 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r1s1 Tactic: -8943710627305202139 [03/25/2022-13:24:39] [V] [TRT] Tactic: -8943710627305202139 Time: 0.07744 [03/25/2022-13:24:39] [V] [TRT] sections.2.2.conv1.module.weight + QuantizeLinear_534_quantize_scale_node + Conv_538 + Relu_540 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: -8859846367886814331 [03/25/2022-13:24:39] [V] [TRT] Tactic: -8859846367886814331 Time: 0.084096 [03/25/2022-13:24:39] [V] [TRT] sections.2.2.conv1.module.weight + QuantizeLinear_534_quantize_scale_node + Conv_538 + Relu_540 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: -8638624340850784688 [03/25/2022-13:24:39] [V] [TRT] Tactic: -8638624340850784688 Time: 0.074752 [03/25/2022-13:24:39] [V] [TRT] sections.2.2.conv1.module.weight + QuantizeLinear_534_quantize_scale_node + Conv_538 + Relu_540 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage4_warpsize4x1x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: -8556775352640313933 [03/25/2022-13:24:39] [V] [TRT] Tactic: -8556775352640313933 Time: 0.05888 [03/25/2022-13:24:39] [V] [TRT] sections.2.2.conv1.module.weight + QuantizeLinear_534_quantize_scale_node + Conv_538 + Relu_540 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage1_warpsize2x1x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: -8382298409581540699 [03/25/2022-13:24:39] [V] [TRT] Tactic: -8382298409581540699 Time: 0.13504 
[03/25/2022-13:24:39] [V] [TRT] sections.2.2.conv1.module.weight + QuantizeLinear_534_quantize_scale_node + Conv_538 + Relu_540 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage1_warpsize4x2x1_g1_tensor8x8x16_t1r1s1 Tactic: -8172318747337038866 [03/25/2022-13:24:39] [V] [TRT] Tactic: -8172318747337038866 Time: 0.08896 [03/25/2022-13:24:39] [V] [TRT] sections.2.2.conv1.module.weight + QuantizeLinear_534_quantize_scale_node + Conv_538 + Relu_540 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x256x64_stage3_warpsize2x4x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -8038164441468184723 [03/25/2022-13:24:39] [V] [TRT] Tactic: -8038164441468184723 Time: 0.060032 [03/25/2022-13:24:39] [V] [TRT] sections.2.2.conv1.module.weight + QuantizeLinear_534_quantize_scale_node + Conv_538 + Relu_540 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x96x64_stage5_warpsize2x1x1_g1_tensor16x8x32_t1r1s1 Tactic: -7844028314176826857 [03/25/2022-13:24:39] [V] [TRT] Tactic: -7844028314176826857 Time: 0.111104 [03/25/2022-13:24:39] [V] [TRT] sections.2.2.conv1.module.weight + QuantizeLinear_534_quantize_scale_node + Conv_538 + Relu_540 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x128_stage3_warpsize2x2x1_g1_sptensor16x8x64_simple_t1r1s1_no_preds Tactic: -7674507941016740570 [03/25/2022-13:24:39] [V] [TRT] Tactic: -7674507941016740570 Time: 0.049152 [03/25/2022-13:24:39] [V] [TRT] sections.2.2.conv1.module.weight + QuantizeLinear_534_quantize_scale_node + Conv_538 + Relu_540 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage4_warpsize2x4x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -7364286662638617917 [03/25/2022-13:24:39] [V] [TRT] Tactic: -7364286662638617917 Time: 0.054144 [03/25/2022-13:24:39] [V] [TRT] sections.2.2.conv1.module.weight + QuantizeLinear_534_quantize_scale_node + Conv_538 + Relu_540 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage1_warpsize2x1x1_g1_tensor8x8x16_t1r1s1 Tactic: -7361755530333096258 [03/25/2022-13:24:39] [V] [TRT] Tactic: -7361755530333096258 Time: 0.107776 [03/25/2022-13:24:39] [V] [TRT] sections.2.2.conv1.module.weight + QuantizeLinear_534_quantize_scale_node + Conv_538 + Relu_540 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x128_stage3_warpsize1x4x1_g1_sptensor16x8x64_t1r1s1 Tactic: -7289760022626653388 [03/25/2022-13:24:39] [V] [TRT] Tactic: -7289760022626653388 Time: 0.062336 [03/25/2022-13:24:39] [V] [TRT] sections.2.2.conv1.module.weight + QuantizeLinear_534_quantize_scale_node + Conv_538 + Relu_540 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage1_warpsize2x4x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: -7106539943789766885 [03/25/2022-13:24:39] [V] [TRT] Tactic: -7106539943789766885 Time: 0.088192 [03/25/2022-13:24:39] [V] [TRT] sections.2.2.conv1.module.weight + QuantizeLinear_534_quantize_scale_node + Conv_538 + Relu_540 Set Tactic Name: 
sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage1_warpsize4x2x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: -6969478418607271266 [03/25/2022-13:24:39] [V] [TRT] Tactic: -6969478418607271266 Time: 0.08832 [03/25/2022-13:24:39] [V] [TRT] sections.2.2.conv1.module.weight + QuantizeLinear_534_quantize_scale_node + Conv_538 + Relu_540 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -6930438165437733000 [03/25/2022-13:24:39] [V] [TRT] Tactic: -6930438165437733000 Time: 0.137216 [03/25/2022-13:24:39] [V] [TRT] sections.2.2.conv1.module.weight + QuantizeLinear_534_quantize_scale_node + Conv_538 + Relu_540 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x64x64_stage3_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -6879607992933502380 [03/25/2022-13:24:39] [V] [TRT] Tactic: -6879607992933502380 Time: 0.063616 [03/25/2022-13:24:39] [V] [TRT] sections.2.2.conv1.module.weight + QuantizeLinear_534_quantize_scale_node + Conv_538 + Relu_540 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x128_stage3_warpsize1x2x1_g1_sptensor16x8x64_simple_t1r1s1_no_preds Tactic: -6839669803644810934 [03/25/2022-13:24:39] [V] [TRT] Tactic: -6839669803644810934 Time: 0.070656 [03/25/2022-13:24:39] [V] [TRT] sections.2.2.conv1.module.weight + QuantizeLinear_534_quantize_scale_node + Conv_538 + Relu_540 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x128_stage3_warpsize1x2x1_g1_sptensor16x8x64_t1r1s1_no_preds Tactic: -6812830108414456369 [03/25/2022-13:24:39] [V] [TRT] Tactic: -6812830108414456369 Time: 0.07104 [03/25/2022-13:24:39] [V] [TRT] sections.2.2.conv1.module.weight + QuantizeLinear_534_quantize_scale_node + Conv_538 + Relu_540 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x128_stage3_warpsize4x2x1_g1_sptensor16x8x64_simple_t1r1s1_no_preds Tactic: -6779804930216439173 [03/25/2022-13:24:39] [V] [TRT] Tactic: -6779804930216439173 Time: 0.046336 [03/25/2022-13:24:39] [V] [TRT] sections.2.2.conv1.module.weight + QuantizeLinear_534_quantize_scale_node + Conv_538 + Relu_540 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: -6527178416855951297 [03/25/2022-13:24:39] [V] [TRT] Tactic: -6527178416855951297 Time: 0.109696 [03/25/2022-13:24:39] [V] [TRT] sections.2.2.conv1.module.weight + QuantizeLinear_534_quantize_scale_node + Conv_538 + Relu_540 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_t1r1s1 Tactic: -6510232214299595844 [03/25/2022-13:24:39] [V] [TRT] Tactic: -6510232214299595844 Time: 0.109696 [03/25/2022-13:24:39] [V] [TRT] sections.2.2.conv1.module.weight + QuantizeLinear_534_quantize_scale_node + Conv_538 + Relu_540 Set Tactic Name: ampere_int8_i8816cudnn_int8_128x128_ldg16_relu_small_nt_v1 Tactic: -6400348606759295499 [03/25/2022-13:24:39] [V] [TRT] Tactic: -6400348606759295499 Time: 0.08832 
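The tactic names encode each kernel's schedule, which makes this wall of candidates easier to scan: tilesize128x128x64 is the CTA tile (M x N x K), stage4/stage6 the depth of the software pipeline, warpsize2x2x1 the warp layout within the CTA, and tensor16x8x32 (or sptensor16x8x64 for the sparse kernels) the Tensor Core MMA instruction shape. The t1r1s1 suffix marks a kernel specialized for a 1x1 filter (the 3x3 conv2 sweep further down uses t1r3s3 instead), _epifadd appears to denote an epilogue with a fused elementwise add (the timing-cache message below speaks of an "epiFadd mismatch"), and suffixes such as simple_ and _no_preds mark further variants. A small field extractor, offered only as a sketch against the patterns visible in this log:

    import re

    FIELDS = re.compile(
        r"tilesize(\d+)x(\d+)x(\d+)_stage(\d+)_warpsize(\d+)x(\d+)x(\d+)"
        r".*?(sptensor|tensor)(\d+)x(\d+)x(\d+)"
    )

    def parse_tactic(name: str) -> dict:
        m = FIELDS.search(name)
        if not m:
            return {}                               # e.g. the cudnn tactics
        g = m.groups()
        return {
            "cta_tile": tuple(map(int, g[0:3])),    # M x N x K per CTA
            "stages": int(g[3]),                    # pipeline depth
            "warps": tuple(map(int, g[4:7])),       # warp arrangement
            "sparse": g[7] == "sptensor",           # 2:4 Sparse Tensor Cores
            "mma": tuple(map(int, g[8:11])),        # MMA instruction shape
        }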
[03/25/2022-13:24:39] [V] [TRT] sections.2.2.conv1.module.weight + QuantizeLinear_534_quantize_scale_node + Conv_538 + Relu_540 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage1_warpsize2x2x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: -6346247605026339453 [03/25/2022-13:24:39] [V] [TRT] Tactic: -6346247605026339453 Time: 0.091648 [03/25/2022-13:24:39] [V] [TRT] sections.2.2.conv1.module.weight + QuantizeLinear_534_quantize_scale_node + Conv_538 + Relu_540 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x96x64_stage4_warpsize4x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -6232597026469067819 [03/25/2022-13:24:39] [V] [TRT] Tactic: -6232597026469067819 Time: 0.079232 [03/25/2022-13:24:39] [V] [TRT] sections.2.2.conv1.module.weight + QuantizeLinear_534_quantize_scale_node + Conv_538 + Relu_540 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x128_ldg16_relu_small_nt_v1 Tactic: -5980889159865208399 [03/25/2022-13:24:39] [V] [TRT] Tactic: -5980889159865208399 Time: 0.090752 [03/25/2022-13:24:39] [V] [TRT] sections.2.2.conv1.module.weight + QuantizeLinear_534_quantize_scale_node + Conv_538 + Relu_540 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_singleBuffer_medium_nt_v1 Tactic: -5766140806760372989 [03/25/2022-13:24:39] [V] [TRT] Tactic: -5766140806760372989 Time: 0.09344 [03/25/2022-13:24:39] [V] [TRT] sections.2.2.conv1.module.weight + QuantizeLinear_534_quantize_scale_node + Conv_538 + Relu_540 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r1s1 Tactic: -5697614955743334137 [03/25/2022-13:24:39] [V] [TRT] Tactic: -5697614955743334137 Time: 0.0864 [03/25/2022-13:24:39] [V] [TRT] sections.2.2.conv1.module.weight + QuantizeLinear_534_quantize_scale_node + Conv_538 + Relu_540 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -5671123121710113970 [03/25/2022-13:24:39] [V] [TRT] Tactic: -5671123121710113970 Time: 0.086912 [03/25/2022-13:24:39] [V] [TRT] sections.2.2.conv1.module.weight + QuantizeLinear_534_quantize_scale_node + Conv_538 + Relu_540 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x32x64_stage4_warpsize2x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -5615581362569252260 [03/25/2022-13:24:39] [V] [TRT] Tactic: -5615581362569252260 Time: 0.09536 [03/25/2022-13:24:39] [V] [TRT] sections.2.2.conv1.module.weight + QuantizeLinear_534_quantize_scale_node + Conv_538 + Relu_540 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x128x64_stage3_warpsize4x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -5562968047117507056 [03/25/2022-13:24:39] [V] [TRT] Tactic: -5562968047117507056 Time: 0.062592 [03/25/2022-13:24:39] [V] [TRT] sections.2.2.conv1.module.weight + QuantizeLinear_534_quantize_scale_node + Conv_538 + Relu_540 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -5516472881360101487 [03/25/2022-13:24:39] [V] [TRT] Tactic: 
-5516472881360101487 Time: 0.082304 [03/25/2022-13:24:39] [V] [TRT] sections.2.2.conv1.module.weight + QuantizeLinear_534_quantize_scale_node + Conv_538 + Relu_540 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r1s1 Tactic: -5311474420963248369 [03/25/2022-13:24:39] [V] [TRT] Tactic: -5311474420963248369 Time: 0.119936 [03/25/2022-13:24:39] [V] [TRT] sections.2.2.conv1.module.weight + QuantizeLinear_534_quantize_scale_node + Conv_538 + Relu_540 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: -5170003087447722174 [03/25/2022-13:24:39] [V] [TRT] Tactic: -5170003087447722174 Time: 0.113408 [03/25/2022-13:24:39] [V] [TRT] sections.2.2.conv1.module.weight + QuantizeLinear_534_quantize_scale_node + Conv_538 + Relu_540 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage6_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -4889586143772361690 [03/25/2022-13:24:39] [V] [TRT] Tactic: -4889586143772361690 Time: 0.070656 [03/25/2022-13:24:39] [V] [TRT] sections.2.2.conv1.module.weight + QuantizeLinear_534_quantize_scale_node + Conv_538 + Relu_540 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x128_stage3_warpsize1x4x1_g1_sptensor16x8x64_simple_t1r1s1_no_preds Tactic: -4889498558023475527 [03/25/2022-13:24:39] [V] [TRT] Tactic: -4889498558023475527 Time: 0.061696 [03/25/2022-13:24:39] [V] [TRT] sections.2.2.conv1.module.weight + QuantizeLinear_534_quantize_scale_node + Conv_538 + Relu_540 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: -4849712423393454704 [03/25/2022-13:24:39] [V] [TRT] Tactic: -4849712423393454704 Time: 0.070272 [03/25/2022-13:24:39] [V] [TRT] sections.2.2.conv1.module.weight + QuantizeLinear_534_quantize_scale_node + Conv_538 + Relu_540 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x128_stage3_warpsize4x2x1_g1_sptensor16x8x64_simple_t1r1s1 Tactic: -4681913707320020520 [03/25/2022-13:24:39] [V] [TRT] Tactic: -4681913707320020520 Time: 0.046592 [03/25/2022-13:24:39] [V] [TRT] sections.2.2.conv1.module.weight + QuantizeLinear_534_quantize_scale_node + Conv_538 + Relu_540 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_medium_nt_v1 Tactic: -4516822589357530549 [03/25/2022-13:24:39] [V] [TRT] Tactic: -4516822589357530549 Time: 0.094592 [03/25/2022-13:24:39] [V] [TRT] sections.2.2.conv1.module.weight + QuantizeLinear_534_quantize_scale_node + Conv_538 + Relu_540 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -4455415102719506646 [03/25/2022-13:24:39] [V] [TRT] Tactic: -4455415102719506646 Time: 0.080512 [03/25/2022-13:24:39] [V] [TRT] sections.2.2.conv1.module.weight + QuantizeLinear_534_quantize_scale_node + Conv_538 + Relu_540 Set Tactic Name: 
sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32_t1r1s1 Tactic: -4425346730823666456 [03/25/2022-13:24:39] [V] [TRT] Tactic: -4425346730823666456 Time: 0.085376 [03/25/2022-13:24:39] [V] [TRT] sections.2.2.conv1.module.weight + QuantizeLinear_534_quantize_scale_node + Conv_538 + Relu_540 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage1_warpsize2x1x1_g1_tensor8x8x16_t1r1s1 Tactic: -4260476497340370474 [03/25/2022-13:24:40] [V] [TRT] Tactic: -4260476497340370474 Time: 0.135808 [03/25/2022-13:24:40] [V] [TRT] sections.2.2.conv1.module.weight + QuantizeLinear_534_quantize_scale_node + Conv_538 + Relu_540 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x2x1_g1_tensor16x8x32_t1r1s1 Tactic: -4182501876984672402 [03/25/2022-13:24:40] [V] [TRT] Tactic: -4182501876984672402 Time: 0.087296 [03/25/2022-13:24:40] [V] [TRT] sections.2.2.conv1.module.weight + QuantizeLinear_534_quantize_scale_node + Conv_538 + Relu_540 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x256x64_stage3_warpsize2x4x1_g1_tensor16x8x32_t1r1s1 Tactic: -4151617293257698859 [03/25/2022-13:24:40] [V] [TRT] Tactic: -4151617293257698859 Time: 0.06464 [03/25/2022-13:24:40] [V] [TRT] sections.2.2.conv1.module.weight + QuantizeLinear_534_quantize_scale_node + Conv_538 + Relu_540 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x128_stage3_warpsize4x2x1_g1_sptensor16x8x64_t1r1s1_no_preds Tactic: -3862908719298381451 [03/25/2022-13:24:40] [V] [TRT] Tactic: -3862908719298381451 Time: 0.04736 [03/25/2022-13:24:40] [V] [TRT] sections.2.2.conv1.module.weight + QuantizeLinear_534_quantize_scale_node + Conv_538 + Relu_540 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -3825889760337461729 [03/25/2022-13:24:40] [V] [TRT] Tactic: -3825889760337461729 Time: 0.1088 [03/25/2022-13:24:40] [V] [TRT] sections.2.2.conv1.module.weight + QuantizeLinear_534_quantize_scale_node + Conv_538 + Relu_540 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -3797022944823726673 [03/25/2022-13:24:40] [V] [TRT] Tactic: -3797022944823726673 Time: 0.079488 [03/25/2022-13:24:40] [V] [TRT] sections.2.2.conv1.module.weight + QuantizeLinear_534_quantize_scale_node + Conv_538 + Relu_540 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_t1r1s1 Tactic: -3613322253849278738 [03/25/2022-13:24:40] [V] [TRT] Tactic: -3613322253849278738 Time: 0.140288 [03/25/2022-13:24:40] [V] [TRT] sections.2.2.conv1.module.weight + QuantizeLinear_534_quantize_scale_node + Conv_538 + Relu_540 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x32x64_stage4_warpsize4x1x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: -3577322188448771475 
[03/25/2022-13:24:40] [V] [TRT] Tactic: -3577322188448771475 Time: 0.086784 [03/25/2022-13:24:40] [V] [TRT] sections.2.2.conv1.module.weight + QuantizeLinear_534_quantize_scale_node + Conv_538 + Relu_540 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage1_warpsize2x4x1_g1_tensor8x8x16_t1r1s1 Tactic: -3531681826488401618 [03/25/2022-13:24:40] [V] [TRT] Tactic: -3531681826488401618 Time: 0.089088 [03/25/2022-13:24:40] [V] [TRT] sections.2.2.conv1.module.weight + QuantizeLinear_534_quantize_scale_node + Conv_538 + Relu_540 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x16x128_stage3_warpsize1x1x1_g1_sptensor16x8x64_simple_t1r1s1_no_preds Tactic: -3305554949874552860 [03/25/2022-13:24:40] [V] [TRT] Tactic: -3305554949874552860 Time: 0.136448 [03/25/2022-13:24:40] [V] [TRT] sections.2.2.conv1.module.weight + QuantizeLinear_534_quantize_scale_node + Conv_538 + Relu_540 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x192x64_stage3_warpsize4x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -3288585994448820820 [03/25/2022-13:24:40] [V] [TRT] Tactic: -3288585994448820820 Time: 0.080896 [03/25/2022-13:24:40] [V] [TRT] sections.2.2.conv1.module.weight + QuantizeLinear_534_quantize_scale_node + Conv_538 + Relu_540 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x32x64_stage4_warpsize4x1x1_g1_tensor16x8x32_t1r1s1 Tactic: -2754311112012636251 [03/25/2022-13:24:40] [V] [TRT] Tactic: -2754311112012636251 Time: 0.088704 [03/25/2022-13:24:40] [V] [TRT] sections.2.2.conv1.module.weight + QuantizeLinear_534_quantize_scale_node + Conv_538 + Relu_540 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -2432868635536396215 [03/25/2022-13:24:40] [V] [TRT] Tactic: -2432868635536396215 Time: 0.081024 [03/25/2022-13:24:40] [V] [TRT] sections.2.2.conv1.module.weight + QuantizeLinear_534_quantize_scale_node + Conv_538 + Relu_540 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x96x64_stage5_warpsize2x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -2379804152300264660 [03/25/2022-13:24:40] [V] [TRT] Tactic: -2379804152300264660 Time: 0.108288 [03/25/2022-13:24:40] [V] [TRT] sections.2.2.conv1.module.weight + QuantizeLinear_534_quantize_scale_node + Conv_538 + Relu_540 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x128_stage3_warpsize2x2x1_g1_sptensor16x8x64_t1r1s1 Tactic: -2352253835013627337 [03/25/2022-13:24:40] [V] [TRT] Tactic: -2352253835013627337 Time: 0.050048 [03/25/2022-13:24:40] [V] [TRT] sections.2.2.conv1.module.weight + QuantizeLinear_534_quantize_scale_node + Conv_538 + Relu_540 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x96x64_stage4_warpsize4x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -2335587136911650799 [03/25/2022-13:24:40] [V] [TRT] Tactic: -2335587136911650799 Time: 0.0736 [03/25/2022-13:24:40] [V] [TRT] sections.2.2.conv1.module.weight + QuantizeLinear_534_quantize_scale_node 
+ Conv_538 + Relu_540 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage4_warpsize2x4x1_g1_tensor16x8x32_t1r1s1 Tactic: -2315453944962430928 [03/25/2022-13:24:40] [V] [TRT] Tactic: -2315453944962430928 Time: 0.05952 [03/25/2022-13:24:40] [V] [TRT] sections.2.2.conv1.module.weight + QuantizeLinear_534_quantize_scale_node + Conv_538 + Relu_540 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x128_stage3_warpsize1x1x1_g1_sptensor16x8x64_simple_t1r1s1 Tactic: -2238364958919154661 [03/25/2022-13:24:40] [V] [TRT] Tactic: -2238364958919154661 Time: 0.093312 [03/25/2022-13:24:40] [V] [TRT] sections.2.2.conv1.module.weight + QuantizeLinear_534_quantize_scale_node + Conv_538 + Relu_540 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -1916483171117495388 [03/25/2022-13:24:40] [V] [TRT] Tactic: -1916483171117495388 Time: 0.08896 [03/25/2022-13:24:40] [V] [TRT] sections.2.2.conv1.module.weight + QuantizeLinear_534_quantize_scale_node + Conv_538 + Relu_540 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x16x128_stage3_warpsize1x1x1_g1_sptensor16x8x64_t1r1s1_no_preds Tactic: -1740762957710554518 [03/25/2022-13:24:40] [V] [TRT] Tactic: -1740762957710554518 Time: 0.13696 [03/25/2022-13:24:40] [V] [TRT] sections.2.2.conv1.module.weight + QuantizeLinear_534_quantize_scale_node + Conv_538 + Relu_540 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x96x64_stage4_warpsize4x2x1_g1_tensor16x8x32_t1r1s1 Tactic: -1549742793039499659 [03/25/2022-13:24:40] [V] [TRT] Tactic: -1549742793039499659 Time: 0.08 [03/25/2022-13:24:40] [V] [TRT] sections.2.2.conv1.module.weight + QuantizeLinear_534_quantize_scale_node + Conv_538 + Relu_540 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: -1499578657823798783 [03/25/2022-13:24:40] [V] [TRT] Tactic: -1499578657823798783 Time: 0.081152 [03/25/2022-13:24:40] [V] [TRT] sections.2.2.conv1.module.weight + QuantizeLinear_534_quantize_scale_node + Conv_538 + Relu_540 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r1s1 Tactic: -1494157908358500249 [03/25/2022-13:24:40] [V] [TRT] Tactic: -1494157908358500249 Time: 0.098944 [03/25/2022-13:24:40] [V] [TRT] sections.2.2.conv1.module.weight + QuantizeLinear_534_quantize_scale_node + Conv_538 + Relu_540 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x4x1_g1_tensor16x8x32_t1r1s1 Tactic: -1328736756812546664 [03/25/2022-13:24:40] [V] [TRT] Tactic: -1328736756812546664 Time: 0.081792 [03/25/2022-13:24:40] [V] [TRT] sections.2.2.conv1.module.weight + QuantizeLinear_534_quantize_scale_node + Conv_538 + Relu_540 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x32x64_stage4_warpsize2x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 
-1006589727652607355 [03/25/2022-13:24:40] [V] [TRT] Tactic: -1006589727652607355 Time: 0.097024 [03/25/2022-13:24:40] [V] [TRT] sections.2.2.conv1.module.weight + QuantizeLinear_534_quantize_scale_node + Conv_538 + Relu_540 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: -713022856474991236 [03/25/2022-13:24:40] [V] [TRT] Tactic: -713022856474991236 Time: 0.141056 [03/25/2022-13:24:40] [V] [TRT] sections.2.2.conv1.module.weight + QuantizeLinear_534_quantize_scale_node + Conv_538 + Relu_540 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x128_stage3_warpsize1x1x1_g1_sptensor16x8x64_t1r1s1_no_preds Tactic: -619668460699260222 [03/25/2022-13:24:40] [V] [TRT] Tactic: -619668460699260222 Time: 0.093568 [03/25/2022-13:24:40] [V] [TRT] sections.2.2.conv1.module.weight + QuantizeLinear_534_quantize_scale_node + Conv_538 + Relu_540 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x64x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: -405554772060757402 [03/25/2022-13:24:40] [V] [TRT] Tactic: -405554772060757402 Time: 0.070016 [03/25/2022-13:24:40] [V] [TRT] sections.2.2.conv1.module.weight + QuantizeLinear_534_quantize_scale_node + Conv_538 + Relu_540 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: -375949437730908730 [03/25/2022-13:24:40] [V] [TRT] Tactic: -375949437730908730 Time: 0.068736 [03/25/2022-13:24:40] [V] [TRT] sections.2.2.conv1.module.weight + QuantizeLinear_534_quantize_scale_node + Conv_538 + Relu_540 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x128_stage3_warpsize1x2x1_g1_sptensor16x8x64_t1r1s1 Tactic: -233227833606287806 [03/25/2022-13:24:40] [V] [TRT] Tactic: -233227833606287806 Time: 0.07168 [03/25/2022-13:24:40] [V] [TRT] sections.2.2.conv1.module.weight + QuantizeLinear_534_quantize_scale_node + Conv_538 + Relu_540 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -111878368089469751 [03/25/2022-13:24:40] [V] [TRT] Tactic: -111878368089469751 Time: 0.080384 [03/25/2022-13:24:40] [V] [TRT] sections.2.2.conv1.module.weight + QuantizeLinear_534_quantize_scale_node + Conv_538 + Relu_540 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x64x64_stage3_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -48936598874722005 [03/25/2022-13:24:40] [V] [TRT] Tactic: -48936598874722005 Time: 0.062208 [03/25/2022-13:24:40] [V] [TRT] sections.2.2.conv1.module.weight + QuantizeLinear_534_quantize_scale_node + Conv_538 + Relu_540 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x32x64_stage4_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -19707840769375107 [03/25/2022-13:24:40] [V] [TRT] Tactic: -19707840769375107 Time: 0.086784 [03/25/2022-13:24:40] [V] [TRT] Fastest Tactic: 857001784974286465 Time: 0.039168 [03/25/2022-13:24:40] [V] [TRT] 
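That closes the sweep for this fused node: the builder timed every candidate above, keeps only the minimum (Fastest Tactic: 857001784974286465 at 0.039168, announced just above and bound to its runner on the next line), and discards the rest. Note how competitive sparsity is on this layer: the best sparse candidate visible above (0.04096, the tilesize128x128x128 no_preds kernel) comes within about five percent of the winning time. Digging the per-layer winners out of a multi-megabyte verbose log is tedious, so here is a small helper; it is a hypothetical convenience script, not part of trtexec, and it assumes the log was captured to a file (e.g. trtexec ... --verbose 2>&1 | tee build.log):

    import re
    import sys

    FASTEST = re.compile(r"Fastest Tactic: (-?\d+) Time: ([\d.]+)")
    CHOSEN = re.compile(r">+ Chose Runner Type: (\w+) Tactic: (-?\d+)")

    def summarize(path: str) -> None:
        text = open(path).read()
        # In this log each per-layer sweep ends with a "Fastest Tactic"
        # line immediately followed by "Chose Runner Type", so the two
        # match lists align one-to-one.
        for (tactic, t), (runner, _) in zip(FASTEST.findall(text),
                                            CHOSEN.findall(text)):
            print(f"{runner:>20}  tactic={tactic:>21}  time={t}")

    if __name__ == "__main__":
        summarize(sys.argv[1])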
>>>>>>>>>>>>>>> Chose Runner Type: CaskConvolution Tactic: 857001784974286465
[03/25/2022-13:24:40] [V] [TRT] =============== Computing costs for
[03/25/2022-13:24:40] [V] [TRT] *************** Autotuning format combination: Int8(12544,196:4,14,1) -> Int8(12544,196:4,14,1) ***************
[03/25/2022-13:24:40] [V] [TRT] *************** Autotuning format combination: Int8(12544,196:4,14,1) -> Int8(1568,196:32,14,1) ***************
[03/25/2022-13:24:40] [V] [TRT] *************** Autotuning format combination: Int8(1568,196:32,14,1) -> Int8(1568,196:32,14,1) ***************
[03/25/2022-13:24:40] [V] [TRT] Skip timing cache hit (epiFadd mismatch) for node: sections.2.2.conv2.module.weight + QuantizeLinear_549_quantize_scale_node + Conv_553 + Relu_555, LayerImpl: CaskConvolution, tactic: 4414594337986714263
[03/25/2022-13:24:40] [V] [TRT] --------------- Timing Runner: sections.2.2.conv2.module.weight + QuantizeLinear_549_quantize_scale_node + Conv_553 + Relu_555 (CudaGroupConvolution)
[03/25/2022-13:24:40] [V] [TRT] CudaGroupConvolution has no valid tactics for this config, skipping
[03/25/2022-13:24:40] [V] [TRT] --------------- Timing Runner: sections.2.2.conv2.module.weight + QuantizeLinear_549_quantize_scale_node + Conv_553 + Relu_555 (CudaDepthwiseConvolution)
[03/25/2022-13:24:40] [V] [TRT] CudaDepthwiseConvolution has no valid tactics for this config, skipping
[03/25/2022-13:24:40] [V] [TRT] --------------- Timing Runner: sections.2.2.conv2.module.weight + QuantizeLinear_549_quantize_scale_node + Conv_553 + Relu_555 (FusedConvActConvolution)
[03/25/2022-13:24:40] [V] [TRT] FusedConvActConvolution has no valid tactics for this config, skipping
[03/25/2022-13:24:40] [V] [TRT] --------------- Timing Runner: sections.2.2.conv2.module.weight + QuantizeLinear_549_quantize_scale_node + Conv_553 + Relu_555 (CaskConvolution)
[03/25/2022-13:24:40] [V] [TRT] sections.2.2.conv2.module.weight + QuantizeLinear_549_quantize_scale_node + Conv_553 + Relu_555 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_epifadd Tactic: 177040020707947851
[03/25/2022-13:24:40] [V] [TRT] Tactic: 177040020707947851 Time: 0.1888
[03/25/2022-13:24:40] [V] [TRT] sections.2.2.conv2.module.weight + QuantizeLinear_549_quantize_scale_node + Conv_553 + Relu_555 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r3s3 Tactic: 184229963126259101
[03/25/2022-13:24:40] [V] [TRT] Tactic: 184229963126259101 Time: 0.133888
[03/25/2022-13:24:40] [V] [TRT] sections.2.2.conv2.module.weight + QuantizeLinear_549_quantize_scale_node + Conv_553 + Relu_555 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage1_warpsize4x2x1_g1_tensor8x8x16_t1r3s3 Tactic: 289888059097454627
[03/25/2022-13:24:40] [V] [TRT] Tactic: 289888059097454627 Time: 0.167808
[03/25/2022-13:24:40] [V] [TRT] sections.2.2.conv2.module.weight + QuantizeLinear_549_quantize_scale_node + Conv_553 + Relu_555 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32 Tactic: 328135613486708155
[03/25/2022-13:24:40] [V] [TRT] Tactic: 328135613486708155 Time: 0.276608
[03/25/2022-13:24:40] [V] [TRT] sections.2.2.conv2.module.weight +
QuantizeLinear_549_quantize_scale_node + Conv_553 + Relu_555 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r3s3 Tactic: 680740992583869928 [03/25/2022-13:24:40] [V] [TRT] Tactic: 680740992583869928 Time: 0.17408 [03/25/2022-13:24:40] [V] [TRT] sections.2.2.conv2.module.weight + QuantizeLinear_549_quantize_scale_node + Conv_553 + Relu_555 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x192x64_stage3_warpsize4x2x1_g1_tensor16x8x32 Tactic: 1111159740952609683 [03/25/2022-13:24:40] [V] [TRT] Tactic: 1111159740952609683 Time: 0.145152 [03/25/2022-13:24:40] [V] [TRT] sections.2.2.conv2.module.weight + QuantizeLinear_549_quantize_scale_node + Conv_553 + Relu_555 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32 Tactic: 1134860903395928905 [03/25/2022-13:24:40] [V] [TRT] Tactic: 1134860903395928905 Time: 0.130944 [03/25/2022-13:24:40] [V] [TRT] sections.2.2.conv2.module.weight + QuantizeLinear_549_quantize_scale_node + Conv_553 + Relu_555 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32 Tactic: 1276591930377039442 [03/25/2022-13:24:40] [V] [TRT] Tactic: 1276591930377039442 Time: 0.140416 [03/25/2022-13:24:40] [V] [TRT] sections.2.2.conv2.module.weight + QuantizeLinear_549_quantize_scale_node + Conv_553 + Relu_555 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage1_warpsize2x1x1_g1_tensor8x8x16_t1r3s3 Tactic: 1388866374720163187 [03/25/2022-13:24:40] [V] [TRT] Tactic: 1388866374720163187 Time: 0.204544 [03/25/2022-13:24:40] [V] [TRT] sections.2.2.conv2.module.weight + QuantizeLinear_549_quantize_scale_node + Conv_553 + Relu_555 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x96x64_stage5_warpsize2x1x1_g1_tensor16x8x32 Tactic: 1399501420456320585 [03/25/2022-13:24:40] [V] [TRT] Tactic: 1399501420456320585 Time: 0.170496 [03/25/2022-13:24:40] [V] [TRT] sections.2.2.conv2.module.weight + QuantizeLinear_549_quantize_scale_node + Conv_553 + Relu_555 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_epifadd Tactic: 1550399266192842845 [03/25/2022-13:24:40] [V] [TRT] Tactic: 1550399266192842845 Time: 0.17344 [03/25/2022-13:24:40] [V] [TRT] sections.2.2.conv2.module.weight + QuantizeLinear_549_quantize_scale_node + Conv_553 + Relu_555 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x32x64_stage4_warpsize4x1x1_g1_tensor16x8x32_t1r3s3_epifadd Tactic: 1572887561103143487 [03/25/2022-13:24:40] [V] [TRT] Tactic: 1572887561103143487 Time: 0.130944 [03/25/2022-13:24:40] [V] [TRT] sections.2.2.conv2.module.weight + QuantizeLinear_549_quantize_scale_node + Conv_553 + Relu_555 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x96x64_stage5_warpsize2x1x1_g1_tensor16x8x32_t1r3s3 Tactic: 1853122447892949466 
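For this next node (Conv_553 + Relu_555, the 3x3 convolution of the same bottleneck block) the autotuner first enumerated the format combinations shown above: Int8(12544,196:4,14,1) and Int8(1568,196:32,14,1). Reading the notation as per-dimension strides, with :4 or :32 marking the channel-packing width, these are the CHW4 and CHW32 vectorized INT8 layouts of an N x 256 x 14 x 14 activation (the layer shape is inferred from ResNet-50's structure, not stated in the log); the arithmetic reproduces the logged strings:

    # Strides of channel-packed INT8 layouts for an N x 256 x 14 x 14 tensor.
    C, H, W = 256, 14, 14
    for vec in (4, 32):               # CHW4 vs CHW32 channel packing
        groups = -(-C // vec)         # ceil(C / vec) packed channel groups
        n_stride = groups * H * W     # step to the next image
        print(f"Int8({n_stride},{H * W}:{vec},{W},{1})")
    # -> Int8(12544,196:4,14,1) and Int8(1568,196:32,14,1)

Also worth noting above is "Skip timing cache hit (epiFadd mismatch)": the builder found a local timing-cache entry for this node but re-times it anyway, apparently because the cached tactic's epilogue variant does not match, which is why a full sweep follows here as well.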
[03/25/2022-13:24:40] [V] [TRT] Tactic: 1853122447892949466 Time: 0.170624 [03/25/2022-13:24:40] [V] [TRT] sections.2.2.conv2.module.weight + QuantizeLinear_549_quantize_scale_node + Conv_553 + Relu_555 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage1_warpsize2x2x1_g1_tensor8x8x16 Tactic: 2133329569091732311 [03/25/2022-13:24:40] [V] [TRT] Tactic: 2133329569091732311 Time: 0.168832 [03/25/2022-13:24:40] [V] [TRT] sections.2.2.conv2.module.weight + QuantizeLinear_549_quantize_scale_node + Conv_553 + Relu_555 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage4_warpsize4x1x1_g1_tensor16x8x32 Tactic: 2325023763229477890 [03/25/2022-13:24:40] [V] [TRT] Tactic: 2325023763229477890 Time: 0.095104 [03/25/2022-13:24:40] [V] [TRT] sections.2.2.conv2.module.weight + QuantizeLinear_549_quantize_scale_node + Conv_553 + Relu_555 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16 Tactic: 2579824863892891529 [03/25/2022-13:24:40] [V] [TRT] Tactic: 2579824863892891529 Time: 0.233472 [03/25/2022-13:24:40] [V] [TRT] sections.2.2.conv2.module.weight + QuantizeLinear_549_quantize_scale_node + Conv_553 + Relu_555 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x64x64_stage3_warpsize4x1x1_g1_tensor16x8x32 Tactic: 2783960536172159663 [03/25/2022-13:24:40] [V] [TRT] Tactic: 2783960536172159663 Time: 0.0928 [03/25/2022-13:24:40] [V] [TRT] sections.2.2.conv2.module.weight + QuantizeLinear_549_quantize_scale_node + Conv_553 + Relu_555 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x2x1_g1_tensor16x8x32 Tactic: 2821711838552913693 [03/25/2022-13:24:40] [V] [TRT] Tactic: 2821711838552913693 Time: 0.137472 [03/25/2022-13:24:40] [V] [TRT] sections.2.2.conv2.module.weight + QuantizeLinear_549_quantize_scale_node + Conv_553 + Relu_555 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x128_stage3_warpsize1x2x1_g1_sptensor16x8x64 Tactic: 2945009978756227538 [03/25/2022-13:24:40] [V] [TRT] Tactic: 2945009978756227538 Time: 0.098304 [03/25/2022-13:24:40] [V] [TRT] sections.2.2.conv2.module.weight + QuantizeLinear_549_quantize_scale_node + Conv_553 + Relu_555 Set Tactic Name: ampere_int8_i8816cudnn_int8_128x128_ldg16_relu_medium_nt_v1 Tactic: 2985940154541537814 [03/25/2022-13:24:40] [V] [TRT] Tactic: 2985940154541537814 Time: 0.174336 [03/25/2022-13:24:40] [V] [TRT] sections.2.2.conv2.module.weight + QuantizeLinear_549_quantize_scale_node + Conv_553 + Relu_555 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_t1r3s3_epifadd Tactic: 3284282970967328046 [03/25/2022-13:24:40] [V] [TRT] Tactic: 3284282970967328046 Time: 0.201344 [03/25/2022-13:24:40] [V] [TRT] sections.2.2.conv2.module.weight + QuantizeLinear_549_quantize_scale_node + Conv_553 + Relu_555 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_t1r3s3 Tactic: 
3401614690060226673 [03/25/2022-13:24:40] [V] [TRT] Tactic: 3401614690060226673 Time: 0.169472 [03/25/2022-13:24:40] [V] [TRT] sections.2.2.conv2.module.weight + QuantizeLinear_549_quantize_scale_node + Conv_553 + Relu_555 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32 Tactic: 3456719996792527006 [03/25/2022-13:24:40] [V] [TRT] Tactic: 3456719996792527006 Time: 0.123648 [03/25/2022-13:24:40] [V] [TRT] sections.2.2.conv2.module.weight + QuantizeLinear_549_quantize_scale_node + Conv_553 + Relu_555 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r3s3_epifadd Tactic: 3512426920013359699 [03/25/2022-13:24:40] [V] [TRT] Tactic: 3512426920013359699 Time: 0.127104 [03/25/2022-13:24:40] [V] [TRT] sections.2.2.conv2.module.weight + QuantizeLinear_549_quantize_scale_node + Conv_553 + Relu_555 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x128_stage3_warpsize4x2x1_g1_sptensor16x8x64 Tactic: 3651043333819148268 [03/25/2022-13:24:40] [V] [TRT] Tactic: 3651043333819148268 Time: 0.078208 [03/25/2022-13:24:40] [V] [TRT] sections.2.2.conv2.module.weight + QuantizeLinear_549_quantize_scale_node + Conv_553 + Relu_555 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x128_ldg16_relu_medium_nt_v1 Tactic: 3899284354987683408 [03/25/2022-13:24:40] [V] [TRT] Tactic: 3899284354987683408 Time: 0.17536 [03/25/2022-13:24:40] [V] [TRT] sections.2.2.conv2.module.weight + QuantizeLinear_549_quantize_scale_node + Conv_553 + Relu_555 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage6_warpsize2x2x1_g1_tensor16x8x32 Tactic: 4042202769383439184 [03/25/2022-13:24:40] [V] [TRT] Tactic: 4042202769383439184 Time: 0.106112 [03/25/2022-13:24:40] [V] [TRT] sections.2.2.conv2.module.weight + QuantizeLinear_549_quantize_scale_node + Conv_553 + Relu_555 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_large_nt_v1 Tactic: 4182625619810185112 [03/25/2022-13:24:40] [V] [TRT] Tactic: 4182625619810185112 Time: 0.189056 [03/25/2022-13:24:40] [V] [TRT] sections.2.2.conv2.module.weight + QuantizeLinear_549_quantize_scale_node + Conv_553 + Relu_555 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r3s3 Tactic: 4214794893922618058 [03/25/2022-13:24:40] [V] [TRT] Tactic: 4214794893922618058 Time: 0.166912 [03/25/2022-13:24:40] [V] [TRT] sections.2.2.conv2.module.weight + QuantizeLinear_549_quantize_scale_node + Conv_553 + Relu_555 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x32x64_stage4_warpsize4x1x1_g1_tensor16x8x32_epifadd Tactic: 4259547356717612415 [03/25/2022-13:24:40] [V] [TRT] Tactic: 4259547356717612415 Time: 0.134016 [03/25/2022-13:24:40] [V] [TRT] sections.2.2.conv2.module.weight + QuantizeLinear_549_quantize_scale_node + Conv_553 + Relu_555 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r3s3 Tactic: 4384868749799132354 [03/25/2022-13:24:40] [V] [TRT] Tactic: 
4384868749799132354 Time: 0.2336 [03/25/2022-13:24:40] [V] [TRT] sections.2.2.conv2.module.weight + QuantizeLinear_549_quantize_scale_node + Conv_553 + Relu_555 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x128_stage3_warpsize2x2x1_g1_sptensor16x8x64 Tactic: 4414594337986714263 [03/25/2022-13:24:40] [V] [TRT] Tactic: 4414594337986714263 Time: 0.077696 [03/25/2022-13:24:40] [V] [TRT] sections.2.2.conv2.module.weight + QuantizeLinear_549_quantize_scale_node + Conv_553 + Relu_555 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_small_nt_v1 Tactic: 4717285412741024953 [03/25/2022-13:24:40] [V] [TRT] Tactic: 4717285412741024953 Time: 0.171904 [03/25/2022-13:24:40] [V] [TRT] sections.2.2.conv2.module.weight + QuantizeLinear_549_quantize_scale_node + Conv_553 + Relu_555 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage4_warpsize2x4x1_g1_tensor16x8x32 Tactic: 4734519122557206480 [03/25/2022-13:24:40] [V] [TRT] Tactic: 4734519122557206480 Time: 0.09728 [03/25/2022-13:24:40] [V] [TRT] sections.2.2.conv2.module.weight + QuantizeLinear_549_quantize_scale_node + Conv_553 + Relu_555 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32 Tactic: 4922297020351187339 [03/25/2022-13:24:40] [V] [TRT] Tactic: 4922297020351187339 Time: 0.141952 [03/25/2022-13:24:40] [V] [TRT] sections.2.2.conv2.module.weight + QuantizeLinear_549_quantize_scale_node + Conv_553 + Relu_555 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage1_warpsize2x4x1_g1_tensor8x8x16_t1r3s3 Tactic: 4931167631624420067 [03/25/2022-13:24:40] [V] [TRT] Tactic: 4931167631624420067 Time: 0.16896 [03/25/2022-13:24:40] [V] [TRT] sections.2.2.conv2.module.weight + QuantizeLinear_549_quantize_scale_node + Conv_553 + Relu_555 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage4_warpsize2x4x1_g1_tensor16x8x32_epifadd Tactic: 5121596860264626879 [03/25/2022-13:24:40] [V] [TRT] Tactic: 5121596860264626879 Time: 0.093952 [03/25/2022-13:24:40] [V] [TRT] sections.2.2.conv2.module.weight + QuantizeLinear_549_quantize_scale_node + Conv_553 + Relu_555 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_t1r3s3 Tactic: 5136656982162849059 [03/25/2022-13:24:40] [V] [TRT] Tactic: 5136656982162849059 Time: 0.20224 [03/25/2022-13:24:40] [V] [TRT] sections.2.2.conv2.module.weight + QuantizeLinear_549_quantize_scale_node + Conv_553 + Relu_555 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage6_warpsize2x2x1_g1_tensor16x8x32_epifadd Tactic: 5158259316594207439 [03/25/2022-13:24:40] [V] [TRT] Tactic: 5158259316594207439 Time: 0.105472 [03/25/2022-13:24:40] [V] [TRT] sections.2.2.conv2.module.weight + QuantizeLinear_549_quantize_scale_node + Conv_553 + Relu_555 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32_t1r3s3 Tactic: 5189825015507701541 
[03/25/2022-13:24:40] [V] [TRT] Tactic: 5189825015507701541 Time: 0.30144 [03/25/2022-13:24:40] [V] [TRT] sections.2.2.conv2.module.weight + QuantizeLinear_549_quantize_scale_node + Conv_553 + Relu_555 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x32x64_stage4_warpsize2x1x1_g1_tensor16x8x32 Tactic: 5424417905073460656 [03/25/2022-13:24:40] [V] [TRT] Tactic: 5424417905073460656 Time: 0.141184 [03/25/2022-13:24:40] [V] [TRT] sections.2.2.conv2.module.weight + QuantizeLinear_549_quantize_scale_node + Conv_553 + Relu_555 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x256x64_stage3_warpsize2x4x1_g1_tensor16x8x32 Tactic: 5442043907221427810 [03/25/2022-13:24:40] [V] [TRT] Tactic: 5442043907221427810 Time: 0.110336 [03/25/2022-13:24:40] [V] [TRT] sections.2.2.conv2.module.weight + QuantizeLinear_549_quantize_scale_node + Conv_553 + Relu_555 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x256x64_stage3_warpsize2x4x1_g1_tensor16x8x32_t1r3s3 Tactic: 5544365258913999384 [03/25/2022-13:24:40] [V] [TRT] Tactic: 5544365258913999384 Time: 0.107904 [03/25/2022-13:24:40] [V] [TRT] sections.2.2.conv2.module.weight + QuantizeLinear_549_quantize_scale_node + Conv_553 + Relu_555 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage1_warpsize2x1x1_g1_tensor8x8x16_t1r3s3 Tactic: 5641967928706599451 [03/25/2022-13:24:40] [V] [TRT] Tactic: 5641967928706599451 Time: 0.27072 [03/25/2022-13:24:40] [V] [TRT] sections.2.2.conv2.module.weight + QuantizeLinear_549_quantize_scale_node + Conv_553 + Relu_555 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x2x1_g1_tensor16x8x32_t1r3s3 Tactic: 5721595115357140131 [03/25/2022-13:24:40] [V] [TRT] Tactic: 5721595115357140131 Time: 0.134144 [03/25/2022-13:24:40] [V] [TRT] sections.2.2.conv2.module.weight + QuantizeLinear_549_quantize_scale_node + Conv_553 + Relu_555 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage4_warpsize4x1x1_g1_tensor16x8x32_t1r3s3_epifadd Tactic: 5966973378912044513 [03/25/2022-13:24:40] [V] [TRT] Tactic: 5966973378912044513 Time: 0.093568 [03/25/2022-13:24:40] [V] [TRT] sections.2.2.conv2.module.weight + QuantizeLinear_549_quantize_scale_node + Conv_553 + Relu_555 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x32x64_stage4_warpsize4x1x1_g1_tensor16x8x32_t1r3s3 Tactic: 6004789655466615912 [03/25/2022-13:24:40] [V] [TRT] Tactic: 6004789655466615912 Time: 0.130944 [03/25/2022-13:24:40] [V] [TRT] sections.2.2.conv2.module.weight + QuantizeLinear_549_quantize_scale_node + Conv_553 + Relu_555 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage4_warpsize2x4x1_g1_tensor16x8x32_t1r3s3 Tactic: 6146901278630392829 [03/25/2022-13:24:40] [V] [TRT] Tactic: 6146901278630392829 Time: 0.097152 [03/25/2022-13:24:40] [V] [TRT] sections.2.2.conv2.module.weight + QuantizeLinear_549_quantize_scale_node + Conv_553 + Relu_555 Set Tactic Name: 
sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage1_warpsize2x1x1_g1_tensor8x8x16 Tactic: 6394572396369862482 [03/25/2022-13:24:40] [V] [TRT] Tactic: 6394572396369862482 Time: 0.272256 [03/25/2022-13:24:40] [V] [TRT] sections.2.2.conv2.module.weight + QuantizeLinear_549_quantize_scale_node + Conv_553 + Relu_555 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage4_warpsize4x2x1_g1_tensor16x8x32_epifadd Tactic: 6434020722187266170 [03/25/2022-13:24:40] [V] [TRT] Tactic: 6434020722187266170 Time: 0.092672 [03/25/2022-13:24:40] [V] [TRT] sections.2.2.conv2.module.weight + QuantizeLinear_549_quantize_scale_node + Conv_553 + Relu_555 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32 Tactic: 6781129591847482048 [03/25/2022-13:24:40] [V] [TRT] Tactic: 6781129591847482048 Time: 0.110464 [03/25/2022-13:24:40] [V] [TRT] sections.2.2.conv2.module.weight + QuantizeLinear_549_quantize_scale_node + Conv_553 + Relu_555 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x96x64_stage4_warpsize4x2x1_g1_tensor16x8x32_t1r3s3 Tactic: 6984451771200230840 [03/25/2022-13:24:40] [V] [TRT] Tactic: 6984451771200230840 Time: 0.141696 [03/25/2022-13:24:40] [V] [TRT] sections.2.2.conv2.module.weight + QuantizeLinear_549_quantize_scale_node + Conv_553 + Relu_555 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r3s3 Tactic: 7048234086361926570 [03/25/2022-13:24:40] [V] [TRT] Tactic: 7048234086361926570 Time: 0.179712 [03/25/2022-13:24:40] [V] [TRT] sections.2.2.conv2.module.weight + QuantizeLinear_549_quantize_scale_node + Conv_553 + Relu_555 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x128x64_stage3_warpsize4x2x1_g1_tensor16x8x32 Tactic: 7077570591813340966 [03/25/2022-13:24:40] [V] [TRT] Tactic: 7077570591813340966 Time: 0.106624 [03/25/2022-13:24:40] [V] [TRT] sections.2.2.conv2.module.weight + QuantizeLinear_549_quantize_scale_node + Conv_553 + Relu_555 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r3s3_epifadd Tactic: 7191893591576074000 [03/25/2022-13:24:40] [V] [TRT] Tactic: 7191893591576074000 Time: 0.1696 [03/25/2022-13:24:40] [V] [TRT] sections.2.2.conv2.module.weight + QuantizeLinear_549_quantize_scale_node + Conv_553 + Relu_555 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x4x1_g1_tensor16x8x32_t1r3s3 Tactic: 7429976449747682901 [03/25/2022-13:24:40] [V] [TRT] Tactic: 7429976449747682901 Time: 0.125696 [03/25/2022-13:24:40] [V] [TRT] sections.2.2.conv2.module.weight + QuantizeLinear_549_quantize_scale_node + Conv_553 + Relu_555 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r3s3_epifadd Tactic: 7438984192263206338 [03/25/2022-13:24:40] [V] [TRT] Tactic: 7438984192263206338 Time: 0.103808 
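
What the builder is logging here is its per-layer tactic sweep: for the fused node sections.2.2.conv2.module.weight + QuantizeLinear_549_quantize_scale_node + Conv_553 + Relu_555 it launches every eligible CaskConvolution kernel and records its measured run time (the `Time:` values, in milliseconds). Each kernel name encodes the configuration being tried: the tile size (`tilesize128x128x64`), pipeline depth (`stage4`), warp layout (`warpsize2x2x1`), the Tensor Core MMA shape (`tensor16x8x32` for dense INT8, `sptensor16x8x64` for the structured-sparse kernels), and the target architecture variant (`sm75` vs `sm80`). A hypothetical helper for mining a saved log follows — the regex, function, and file name are assumptions for illustration, not TensorRT tooling:

```python
import re

# Hypothetical helper (not part of TensorRT): recover each kernel's measured
# time from a saved verbose log. This pools every "Tactic: <id> Time: <ms>"
# pair in the file, so for real use you would also key results by node name.
TIMING = re.compile(r"Tactic: (-?\d+) Time: ([0-9.]+)")

def tactic_times(log_path: str) -> dict[str, float]:
    times: dict[str, float] = {}
    with open(log_path) as log:
        for line in log:
            for tactic, ms in TIMING.findall(line):
                t = float(ms)
                # Keep the best observed time per tactic id.
                if tactic not in times or t < times[tactic]:
                    times[tactic] = t
    return times

if __name__ == "__main__":
    times = tactic_times("trtexec_verbose.log")  # hypothetical file name
    best = min(times.items(), key=lambda item: item[1])
    print(f"fastest of {len(times)} tactics: {best[0]} at {best[1]:.6f} ms")
```

Fed this section of the log, it reports the same winner the builder itself prints at the end of each sweep.
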
[03/25/2022-13:24:40] [V] [TRT] sections.2.2.conv2.module.weight + QuantizeLinear_549_quantize_scale_node + Conv_553 + Relu_555 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32_epifadd Tactic: 7504901284678552178 [03/25/2022-13:24:40] [V] [TRT] Tactic: 7504901284678552178 Time: 0.092672 [03/25/2022-13:24:40] [V] [TRT] sections.2.2.conv2.module.weight + QuantizeLinear_549_quantize_scale_node + Conv_553 + Relu_555 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r3s3 Tactic: 8096257414008860171 [03/25/2022-13:24:40] [V] [TRT] Tactic: 8096257414008860171 Time: 0.107136 [03/25/2022-13:24:40] [V] [TRT] sections.2.2.conv2.module.weight + QuantizeLinear_549_quantize_scale_node + Conv_553 + Relu_555 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x64x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r3s3 Tactic: 8128112048355596715 [03/25/2022-13:24:40] [V] [TRT] Tactic: 8128112048355596715 Time: 0.107136 [03/25/2022-13:24:40] [V] [TRT] sections.2.2.conv2.module.weight + QuantizeLinear_549_quantize_scale_node + Conv_553 + Relu_555 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_indexed_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32_epifadd Tactic: 8751622450593766232 [03/25/2022-13:24:40] [V] [TRT] Tactic: 8751622450593766232 Time: 0.099584 [03/25/2022-13:24:40] [V] [TRT] sections.2.2.conv2.module.weight + QuantizeLinear_549_quantize_scale_node + Conv_553 + Relu_555 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_indexed_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32 Tactic: 9064458886956700976 [03/25/2022-13:24:40] [V] [TRT] Tactic: 9064458886956700976 Time: 0.101376 [03/25/2022-13:24:40] [V] [TRT] sections.2.2.conv2.module.weight + QuantizeLinear_549_quantize_scale_node + Conv_553 + Relu_555 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_t1r3s3_epifadd Tactic: 9143438935315839085 [03/25/2022-13:24:40] [V] [TRT] Tactic: 9143438935315839085 Time: 0.167424 [03/25/2022-13:24:40] [V] [TRT] sections.2.2.conv2.module.weight + QuantizeLinear_549_quantize_scale_node + Conv_553 + Relu_555 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage4_warpsize4x2x1_g1_tensor16x8x32 Tactic: -9165697322068360861 [03/25/2022-13:24:40] [V] [TRT] Tactic: -9165697322068360861 Time: 0.096256 [03/25/2022-13:24:40] [V] [TRT] sections.2.2.conv2.module.weight + QuantizeLinear_549_quantize_scale_node + Conv_553 + Relu_555 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_singleBuffer_small_nt_v1 Tactic: -9118785798277698619 [03/25/2022-13:24:40] [V] [TRT] Tactic: -9118785798277698619 Time: 0.166912 [03/25/2022-13:24:40] [V] [TRT] sections.2.2.conv2.module.weight + QuantizeLinear_549_quantize_scale_node + Conv_553 + Relu_555 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16 Tactic: -9108166971364503411 
[03/25/2022-13:24:40] [V] [TRT] Tactic: -9108166971364503411 Time: 0.182272 [03/25/2022-13:24:40] [V] [TRT] sections.2.2.conv2.module.weight + QuantizeLinear_549_quantize_scale_node + Conv_553 + Relu_555 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage1_warpsize4x2x1_g1_tensor8x8x16 Tactic: -8861822316054763526 [03/25/2022-13:24:40] [V] [TRT] Tactic: -8861822316054763526 Time: 0.169728 [03/25/2022-13:24:40] [V] [TRT] sections.2.2.conv2.module.weight + QuantizeLinear_549_quantize_scale_node + Conv_553 + Relu_555 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x128_stage3_warpsize1x1x1_g1_sptensor16x8x64 Tactic: -8791277710877987710 [03/25/2022-13:24:40] [V] [TRT] Tactic: -8791277710877987710 Time: 0.125568 [03/25/2022-13:24:40] [V] [TRT] sections.2.2.conv2.module.weight + QuantizeLinear_549_quantize_scale_node + Conv_553 + Relu_555 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x128x64_stage3_warpsize2x2x1_g1_tensor16x8x32 Tactic: -8691377209893505057 [03/25/2022-13:24:40] [V] [TRT] Tactic: -8691377209893505057 Time: 0.093312 [03/25/2022-13:24:40] [V] [TRT] sections.2.2.conv2.module.weight + QuantizeLinear_549_quantize_scale_node + Conv_553 + Relu_555 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32 Tactic: -8520292213102999339 [03/25/2022-13:24:40] [V] [TRT] Tactic: -8520292213102999339 Time: 0.142336 [03/25/2022-13:24:40] [V] [TRT] sections.2.2.conv2.module.weight + QuantizeLinear_549_quantize_scale_node + Conv_553 + Relu_555 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage1_warpsize4x1x1_g1_tensor8x8x16_t1r3s3 Tactic: -8475551154769412306 [03/25/2022-13:24:40] [V] [TRT] Tactic: -8475551154769412306 Time: 0.171904 [03/25/2022-13:24:40] [V] [TRT] sections.2.2.conv2.module.weight + QuantizeLinear_549_quantize_scale_node + Conv_553 + Relu_555 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x192x64_stage3_warpsize4x2x1_g1_tensor16x8x32_t1r3s3 Tactic: -8417388128970254446 [03/25/2022-13:24:40] [V] [TRT] Tactic: -8417388128970254446 Time: 0.142336 [03/25/2022-13:24:40] [V] [TRT] sections.2.2.conv2.module.weight + QuantizeLinear_549_quantize_scale_node + Conv_553 + Relu_555 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32 Tactic: -8263994888336646547 [03/25/2022-13:24:40] [V] [TRT] Tactic: -8263994888336646547 Time: 0.093184 [03/25/2022-13:24:40] [V] [TRT] sections.2.2.conv2.module.weight + QuantizeLinear_549_quantize_scale_node + Conv_553 + Relu_555 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32 Tactic: -8205948405243401049 [03/25/2022-13:24:40] [V] [TRT] Tactic: -8205948405243401049 Time: 0.17408 [03/25/2022-13:24:40] [V] [TRT] sections.2.2.conv2.module.weight + QuantizeLinear_549_quantize_scale_node + Conv_553 + Relu_555 Set Tactic Name: 
sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r3s3_epifadd Tactic: -7992068592656168418 [03/25/2022-13:24:40] [V] [TRT] Tactic: -7992068592656168418 Time: 0.10688 [03/25/2022-13:24:40] [V] [TRT] sections.2.2.conv2.module.weight + QuantizeLinear_549_quantize_scale_node + Conv_553 + Relu_555 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x32x64_stage4_warpsize2x1x1_g1_tensor16x8x32_t1r3s3 Tactic: -7898477046581738867 [03/25/2022-13:24:40] [V] [TRT] Tactic: -7898477046581738867 Time: 0.134912 [03/25/2022-13:24:40] [V] [TRT] sections.2.2.conv2.module.weight + QuantizeLinear_549_quantize_scale_node + Conv_553 + Relu_555 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage4_warpsize4x1x1_g1_tensor16x8x32_epifadd Tactic: -7842775553137511386 [03/25/2022-13:24:41] [V] [TRT] Tactic: -7842775553137511386 Time: 0.094208 [03/25/2022-13:24:41] [V] [TRT] sections.2.2.conv2.module.weight + QuantizeLinear_549_quantize_scale_node + Conv_553 + Relu_555 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32 Tactic: -7683887278997527517 [03/25/2022-13:24:41] [V] [TRT] Tactic: -7683887278997527517 Time: 0.15872 [03/25/2022-13:24:41] [V] [TRT] sections.2.2.conv2.module.weight + QuantizeLinear_549_quantize_scale_node + Conv_553 + Relu_555 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x64x64_stage3_warpsize2x2x1_g1_tensor16x8x32 Tactic: -7381370635708568663 [03/25/2022-13:24:41] [V] [TRT] Tactic: -7381370635708568663 Time: 0.108672 [03/25/2022-13:24:41] [V] [TRT] sections.2.2.conv2.module.weight + QuantizeLinear_549_quantize_scale_node + Conv_553 + Relu_555 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r3s3 Tactic: -7129320389887881029 [03/25/2022-13:24:41] [V] [TRT] Tactic: -7129320389887881029 Time: 0.127488 [03/25/2022-13:24:41] [V] [TRT] sections.2.2.conv2.module.weight + QuantizeLinear_549_quantize_scale_node + Conv_553 + Relu_555 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r3s3 Tactic: -6959995514028471820 [03/25/2022-13:24:41] [V] [TRT] Tactic: -6959995514028471820 Time: 0.149504 [03/25/2022-13:24:41] [V] [TRT] sections.2.2.conv2.module.weight + QuantizeLinear_549_quantize_scale_node + Conv_553 + Relu_555 Set Tactic Name: ampere_int8_i8816cudnn_int8_128x128_ldg16_relu_small_nt_v1 Tactic: -6400348606759295499 [03/25/2022-13:24:41] [V] [TRT] Tactic: -6400348606759295499 Time: 0.169088 [03/25/2022-13:24:41] [V] [TRT] sections.2.2.conv2.module.weight + QuantizeLinear_549_quantize_scale_node + Conv_553 + Relu_555 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r3s3 Tactic: -6371781333659293809 [03/25/2022-13:24:41] [V] [TRT] Tactic: -6371781333659293809 Time: 0.1792 [03/25/2022-13:24:41] [V] [TRT] sections.2.2.conv2.module.weight + 
QuantizeLinear_549_quantize_scale_node + Conv_553 + Relu_555 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x96x64_stage4_warpsize4x2x1_g1_tensor16x8x32 Tactic: -6256128573036943404 [03/25/2022-13:24:41] [V] [TRT] Tactic: -6256128573036943404 Time: 0.140928 [03/25/2022-13:24:41] [V] [TRT] sections.2.2.conv2.module.weight + QuantizeLinear_549_quantize_scale_node + Conv_553 + Relu_555 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x128_ldg16_relu_small_nt_v1 Tactic: -5980889159865208399 [03/25/2022-13:24:41] [V] [TRT] Tactic: -5980889159865208399 Time: 0.172032 [03/25/2022-13:24:41] [V] [TRT] sections.2.2.conv2.module.weight + QuantizeLinear_549_quantize_scale_node + Conv_553 + Relu_555 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_singleBuffer_medium_nt_v1 Tactic: -5766140806760372989 [03/25/2022-13:24:41] [V] [TRT] Tactic: -5766140806760372989 Time: 0.180096 [03/25/2022-13:24:41] [V] [TRT] sections.2.2.conv2.module.weight + QuantizeLinear_549_quantize_scale_node + Conv_553 + Relu_555 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32_t1r3s3_epifadd Tactic: -5709079507616090666 [03/25/2022-13:24:41] [V] [TRT] Tactic: -5709079507616090666 Time: 0.092416 [03/25/2022-13:24:41] [V] [TRT] sections.2.2.conv2.module.weight + QuantizeLinear_549_quantize_scale_node + Conv_553 + Relu_555 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage4_warpsize2x4x1_g1_tensor16x8x32_t1r3s3_epifadd Tactic: -5698636014239116282 [03/25/2022-13:24:41] [V] [TRT] Tactic: -5698636014239116282 Time: 0.092672 [03/25/2022-13:24:41] [V] [TRT] sections.2.2.conv2.module.weight + QuantizeLinear_549_quantize_scale_node + Conv_553 + Relu_555 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage1_warpsize4x1x1_g1_tensor8x8x16 Tactic: -5180570335464125033 [03/25/2022-13:24:41] [V] [TRT] Tactic: -5180570335464125033 Time: 0.177536 [03/25/2022-13:24:41] [V] [TRT] sections.2.2.conv2.module.weight + QuantizeLinear_549_quantize_scale_node + Conv_553 + Relu_555 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r3s3 Tactic: -4933563390723451692 [03/25/2022-13:24:41] [V] [TRT] Tactic: -4933563390723451692 Time: 0.128256 [03/25/2022-13:24:41] [V] [TRT] sections.2.2.conv2.module.weight + QuantizeLinear_549_quantize_scale_node + Conv_553 + Relu_555 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_medium_nt_v1 Tactic: -4516822589357530549 [03/25/2022-13:24:41] [V] [TRT] Tactic: -4516822589357530549 Time: 0.180992 [03/25/2022-13:24:41] [V] [TRT] sections.2.2.conv2.module.weight + QuantizeLinear_549_quantize_scale_node + Conv_553 + Relu_555 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x16x128_stage3_warpsize1x1x1_g1_sptensor16x8x64 Tactic: -4232916483289779353 [03/25/2022-13:24:41] [V] [TRT] Tactic: -4232916483289779353 Time: 0.20096 [03/25/2022-13:24:41] [V] [TRT] sections.2.2.conv2.module.weight + QuantizeLinear_549_quantize_scale_node + Conv_553 + Relu_555 Set Tactic Name: 
sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x128_stage3_warpsize1x4x1_g1_sptensor16x8x64 Tactic: -3460842194336717186 [03/25/2022-13:24:41] [V] [TRT] Tactic: -3460842194336717186 Time: 0.099328 [03/25/2022-13:24:41] [V] [TRT] sections.2.2.conv2.module.weight + QuantizeLinear_549_quantize_scale_node + Conv_553 + Relu_555 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage4_warpsize4x2x1_g1_tensor16x8x32_t1r3s3_epifadd Tactic: -3413217501222406256 [03/25/2022-13:24:41] [V] [TRT] Tactic: -3413217501222406256 Time: 0.090624 [03/25/2022-13:24:41] [V] [TRT] sections.2.2.conv2.module.weight + QuantizeLinear_549_quantize_scale_node + Conv_553 + Relu_555 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32_t1r3s3 Tactic: -3280888557222886418 [03/25/2022-13:24:41] [V] [TRT] Tactic: -3280888557222886418 Time: 0.112768 [03/25/2022-13:24:41] [V] [TRT] sections.2.2.conv2.module.weight + QuantizeLinear_549_quantize_scale_node + Conv_553 + Relu_555 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r3s3 Tactic: -3238475748440751107 [03/25/2022-13:24:41] [V] [TRT] Tactic: -3238475748440751107 Time: 0.103808 [03/25/2022-13:24:41] [V] [TRT] sections.2.2.conv2.module.weight + QuantizeLinear_549_quantize_scale_node + Conv_553 + Relu_555 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage4_warpsize4x1x1_g1_tensor16x8x32_t1r3s3 Tactic: -3182884991006484042 [03/25/2022-13:24:41] [V] [TRT] Tactic: -3182884991006484042 Time: 0.093824 [03/25/2022-13:24:41] [V] [TRT] sections.2.2.conv2.module.weight + QuantizeLinear_549_quantize_scale_node + Conv_553 + Relu_555 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r3s3 Tactic: -3173468756112541306 [03/25/2022-13:24:41] [V] [TRT] Tactic: -3173468756112541306 Time: 0.170624 [03/25/2022-13:24:41] [V] [TRT] sections.2.2.conv2.module.weight + QuantizeLinear_549_quantize_scale_node + Conv_553 + Relu_555 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x128_ldg16_relu_large_nt_v1 Tactic: -2917455979290586480 [03/25/2022-13:24:41] [V] [TRT] Tactic: -2917455979290586480 Time: 0.173184 [03/25/2022-13:24:41] [V] [TRT] sections.2.2.conv2.module.weight + QuantizeLinear_549_quantize_scale_node + Conv_553 + Relu_555 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x128x64_stage3_warpsize4x2x1_g1_tensor16x8x32_t1r3s3 Tactic: -2741641298163591508 [03/25/2022-13:24:41] [V] [TRT] Tactic: -2741641298163591508 Time: 0.106368 [03/25/2022-13:24:41] [V] [TRT] sections.2.2.conv2.module.weight + QuantizeLinear_549_quantize_scale_node + Conv_553 + Relu_555 Set Tactic Name: ampere_int8_i8816cudnn_int8_128x128_ldg16_relu_large_nt_v1 Tactic: -2571022005763160364 [03/25/2022-13:24:41] [V] [TRT] Tactic: -2571022005763160364 Time: 0.179072 [03/25/2022-13:24:41] [V] [TRT] sections.2.2.conv2.module.weight + QuantizeLinear_549_quantize_scale_node + Conv_553 + Relu_555 Set Tactic Name: 
sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage1_warpsize2x2x1_g1_tensor8x8x16 Tactic: -2499089240293650188 [03/25/2022-13:24:41] [V] [TRT] Tactic: -2499089240293650188 Time: 0.175616 [03/25/2022-13:24:41] [V] [TRT] sections.2.2.conv2.module.weight + QuantizeLinear_549_quantize_scale_node + Conv_553 + Relu_555 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16 Tactic: -2328318099174473157 [03/25/2022-13:24:41] [V] [TRT] Tactic: -2328318099174473157 Time: 0.183168 [03/25/2022-13:24:41] [V] [TRT] sections.2.2.conv2.module.weight + QuantizeLinear_549_quantize_scale_node + Conv_553 + Relu_555 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_epifadd Tactic: -2083778562631872334 [03/25/2022-13:24:41] [V] [TRT] Tactic: -2083778562631872334 Time: 0.109312 [03/25/2022-13:24:41] [V] [TRT] sections.2.2.conv2.module.weight + QuantizeLinear_549_quantize_scale_node + Conv_553 + Relu_555 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x4x1_g1_tensor16x8x32 Tactic: -2054375205435666404 [03/25/2022-13:24:41] [V] [TRT] Tactic: -2054375205435666404 Time: 0.126848 [03/25/2022-13:24:41] [V] [TRT] sections.2.2.conv2.module.weight + QuantizeLinear_549_quantize_scale_node + Conv_553 + Relu_555 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32_t1r3s3 Tactic: -1546787387293556842 [03/25/2022-13:24:41] [V] [TRT] Tactic: -1546787387293556842 Time: 0.092928 [03/25/2022-13:24:41] [V] [TRT] sections.2.2.conv2.module.weight + QuantizeLinear_549_quantize_scale_node + Conv_553 + Relu_555 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x32x64_stage4_warpsize4x1x1_g1_tensor16x8x32 Tactic: -1498626619443284096 [03/25/2022-13:24:41] [V] [TRT] Tactic: -1498626619443284096 Time: 0.133632 [03/25/2022-13:24:41] [V] [TRT] sections.2.2.conv2.module.weight + QuantizeLinear_549_quantize_scale_node + Conv_553 + Relu_555 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r3s3 Tactic: -1471245223605064669 [03/25/2022-13:24:41] [V] [TRT] Tactic: -1471245223605064669 Time: 0.154624 [03/25/2022-13:24:41] [V] [TRT] sections.2.2.conv2.module.weight + QuantizeLinear_549_quantize_scale_node + Conv_553 + Relu_555 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32 Tactic: -1283580231568512025 [03/25/2022-13:24:41] [V] [TRT] Tactic: -1283580231568512025 Time: 0.206464 [03/25/2022-13:24:41] [V] [TRT] sections.2.2.conv2.module.weight + QuantizeLinear_549_quantize_scale_node + Conv_553 + Relu_555 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x64x64_stage3_warpsize4x1x1_g1_tensor16x8x32_t1r3s3 Tactic: -1224421172675151280 [03/25/2022-13:24:41] [V] [TRT] Tactic: -1224421172675151280 Time: 0.09088 
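
The sweep concludes just below by selecting a sparse kernel: `Fastest Tactic: 4414594337986714263` (the `tilesize128x128x128 ... sptensor16x8x64` kernel, 0.077696 ms) beats the best dense candidate (`tilesize256x128x64 ... t1r3s3_epifadd`, 0.090624 ms) by roughly 14%, which is the payoff of building this model with sparsity enabled. A minimal sketch of the equivalent programmatic setup with the TensorRT 8.x Python API — this mirrors the build options, not trtexec's internals, and error handling is trimmed:

```python
import tensorrt as trt

# Minimal sketch, assuming the same ONNX file this engine was built from.
logger = trt.Logger(trt.Logger.VERBOSE)
builder = trt.Builder(logger)
network = builder.create_network(
    1 << int(trt.NetworkDefinitionCreationFlag.EXPLICIT_BATCH))
parser = trt.OnnxParser(network, logger)
with open("resnet50_quant_sparse.onnx", "rb") as model:
    if not parser.parse(model.read()):
        raise RuntimeError(parser.get_error(0))

config = builder.create_builder_config()
config.set_flag(trt.BuilderFlag.INT8)            # scales come from the Q/DQ nodes
config.set_flag(trt.BuilderFlag.SPARSE_WEIGHTS)  # without this flag, the sparse
                                                 # "sptensor" tactics above would
                                                 # never be enumerated
engine_bytes = builder.build_serialized_network(network, config)
```
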
[03/25/2022-13:24:41] [V] [TRT] sections.2.2.conv2.module.weight + QuantizeLinear_549_quantize_scale_node + Conv_553 + Relu_555 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_epifadd Tactic: -1173968681844185579 [03/25/2022-13:24:41] [V] [TRT] Tactic: -1173968681844185579 Time: 0.206208 [03/25/2022-13:24:41] [V] [TRT] sections.2.2.conv2.module.weight + QuantizeLinear_549_quantize_scale_node + Conv_553 + Relu_555 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x128x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r3s3 Tactic: -921247911551089037 [03/25/2022-13:24:41] [V] [TRT] Tactic: -921247911551089037 Time: 0.092672 [03/25/2022-13:24:41] [V] [TRT] sections.2.2.conv2.module.weight + QuantizeLinear_549_quantize_scale_node + Conv_553 + Relu_555 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32 Tactic: -762222380308749469 [03/25/2022-13:24:41] [V] [TRT] Tactic: -762222380308749469 Time: 0.129664 [03/25/2022-13:24:41] [V] [TRT] sections.2.2.conv2.module.weight + QuantizeLinear_549_quantize_scale_node + Conv_553 + Relu_555 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_epifadd Tactic: -556794153877490941 [03/25/2022-13:24:41] [V] [TRT] Tactic: -556794153877490941 Time: 0.130432 [03/25/2022-13:24:41] [V] [TRT] sections.2.2.conv2.module.weight + QuantizeLinear_549_quantize_scale_node + Conv_553 + Relu_555 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage4_warpsize4x2x1_g1_tensor16x8x32_t1r3s3 Tactic: -516725800067794372 [03/25/2022-13:24:41] [V] [TRT] Tactic: -516725800067794372 Time: 0.09408 [03/25/2022-13:24:41] [V] [TRT] sections.2.2.conv2.module.weight + QuantizeLinear_549_quantize_scale_node + Conv_553 + Relu_555 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_singleBuffer_large_nt_v1 Tactic: -428104331444385564 [03/25/2022-13:24:41] [V] [TRT] Tactic: -428104331444385564 Time: 0.183296 [03/25/2022-13:24:41] [V] [TRT] sections.2.2.conv2.module.weight + QuantizeLinear_549_quantize_scale_node + Conv_553 + Relu_555 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage1_warpsize2x1x1_g1_tensor8x8x16 Tactic: -366411318217594794 [03/25/2022-13:24:41] [V] [TRT] Tactic: -366411318217594794 Time: 0.205824 [03/25/2022-13:24:41] [V] [TRT] sections.2.2.conv2.module.weight + QuantizeLinear_549_quantize_scale_node + Conv_553 + Relu_555 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage1_warpsize2x4x1_g1_tensor8x8x16 Tactic: -351548418071036983 [03/25/2022-13:24:41] [V] [TRT] Tactic: -351548418071036983 Time: 0.170112 [03/25/2022-13:24:41] [V] [TRT] Fastest Tactic: 4414594337986714263 Time: 0.077696 [03/25/2022-13:24:41] [V] [TRT] >>>>>>>>>>>>>>> Chose Runner Type: CaskConvolution Tactic: 4414594337986714263 [03/25/2022-13:24:41] [V] [TRT] =============== Computing costs for [03/25/2022-13:24:41] [V] [TRT] *************** Autotuning format combination: Int8(12544,196:4,14,1), 
Int8(50176,196:4,14,1) -> Int8(50176,196:4,14,1) *************** [03/25/2022-13:24:41] [V] [TRT] *************** Autotuning format combination: Int8(12544,196:4,14,1), Int8(6272,196:32,14,1) -> Int8(6272,196:32,14,1) *************** [03/25/2022-13:24:41] [V] [TRT] *************** Autotuning format combination: Int8(1568,196:32,14,1), Int8(6272,196:32,14,1) -> Int8(6272,196:32,14,1) *************** [03/25/2022-13:24:41] [V] [TRT] =============== Computing costs for [03/25/2022-13:24:41] [V] [TRT] *************** Autotuning format combination: Int8(50176,196:4,14,1) -> Int8(12544,196:4,14,1) *************** [03/25/2022-13:24:41] [V] [TRT] *************** Autotuning format combination: Int8(50176,196:4,14,1) -> Int8(1568,196:32,14,1) *************** [03/25/2022-13:24:41] [V] [TRT] *************** Autotuning format combination: Int8(6272,196:32,14,1) -> Int8(1568,196:32,14,1) *************** [03/25/2022-13:24:41] [V] [TRT] Skip timing cache hit (epiFadd mismatch) for node: sections.2.3.conv1.module.weight + QuantizeLinear_586_quantize_scale_node + Conv_590 + Relu_592, LayerImpl: CaskConvolution, tactic: 857001784974286465 [03/25/2022-13:24:41] [V] [TRT] --------------- Timing Runner: sections.2.3.conv1.module.weight + QuantizeLinear_586_quantize_scale_node + Conv_590 + Relu_592 (CudaGroupConvolution) [03/25/2022-13:24:41] [V] [TRT] CudaGroupConvolution has no valid tactics for this config, skipping [03/25/2022-13:24:41] [V] [TRT] --------------- Timing Runner: sections.2.3.conv1.module.weight + QuantizeLinear_586_quantize_scale_node + Conv_590 + Relu_592 (CudaDepthwiseConvolution) [03/25/2022-13:24:41] [V] [TRT] CudaDepthwiseConvolution has no valid tactics for this config, skipping [03/25/2022-13:24:41] [V] [TRT] --------------- Timing Runner: sections.2.3.conv1.module.weight + QuantizeLinear_586_quantize_scale_node + Conv_590 + Relu_592 (FusedConvActConvolution) [03/25/2022-13:24:41] [V] [TRT] FusedConvActConvolution has no valid tactics for this config, skipping [03/25/2022-13:24:41] [V] [TRT] --------------- Timing Runner: sections.2.3.conv1.module.weight + QuantizeLinear_586_quantize_scale_node + Conv_590 + Relu_592 (CaskConvolution) [03/25/2022-13:24:41] [V] [TRT] sections.2.3.conv1.module.weight + QuantizeLinear_586_quantize_scale_node + Conv_590 + Relu_592 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage1_warpsize4x1x1_g1_tensor8x8x16_t1r1s1 Tactic: 68468667201176803 [03/25/2022-13:24:41] [V] [TRT] Tactic: 68468667201176803 Time: 0.089984 [03/25/2022-13:24:41] [V] [TRT] sections.2.3.conv1.module.weight + QuantizeLinear_586_quantize_scale_node + Conv_590 + Relu_592 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x192x64_stage3_warpsize4x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 125145153013230687 [03/25/2022-13:24:41] [V] [TRT] Tactic: 125145153013230687 Time: 0.081152 [03/25/2022-13:24:41] [V] [TRT] sections.2.3.conv1.module.weight + QuantizeLinear_586_quantize_scale_node + Conv_590 + Relu_592 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x128_ldg16_relu_interior_nt_v1 Tactic: 434957160407688216 [03/25/2022-13:24:41] [V] [TRT] Tactic: 434957160407688216 Time: 0.093056 [03/25/2022-13:24:41] [V] [TRT] sections.2.3.conv1.module.weight + QuantizeLinear_586_quantize_scale_node + Conv_590 + Relu_592 Set Tactic Name: 
sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage4_warpsize4x2x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: 805889586762897346 [03/25/2022-13:24:41] [V] [TRT] Tactic: 805889586762897346 Time: 0.055552 [03/25/2022-13:24:41] [V] [TRT] sections.2.3.conv1.module.weight + QuantizeLinear_586_quantize_scale_node + Conv_590 + Relu_592 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x128_stage3_warpsize4x2x1_g1_sptensor16x8x64_t1r1s1 Tactic: 857001784974286465 [03/25/2022-13:24:41] [V] [TRT] Tactic: 857001784974286465 Time: 0.04736 [03/25/2022-13:24:41] [V] [TRT] sections.2.3.conv1.module.weight + QuantizeLinear_586_quantize_scale_node + Conv_590 + Relu_592 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 1214130898909872671 [03/25/2022-13:24:41] [V] [TRT] Tactic: 1214130898909872671 Time: 0.110208 [03/25/2022-13:24:41] [V] [TRT] sections.2.3.conv1.module.weight + QuantizeLinear_586_quantize_scale_node + Conv_590 + Relu_592 Set Tactic Name: sm80_xmma_conv_fprop_smallk_i8i8_i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_warptilesize16x32_stage5_warpsize1x4x2_r1s1_u1v1_hw0_c1024_scalebias_relu Tactic: 1263683011321748626 [03/25/2022-13:24:41] [V] [TRT] Tactic: 1263683011321748626 Time: 0.047872 [03/25/2022-13:24:41] [V] [TRT] sections.2.3.conv1.module.weight + QuantizeLinear_586_quantize_scale_node + Conv_590 + Relu_592 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage4_warpsize4x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 1278425129871930205 [03/25/2022-13:24:41] [V] [TRT] Tactic: 1278425129871930205 Time: 0.053888 [03/25/2022-13:24:41] [V] [TRT] sections.2.3.conv1.module.weight + QuantizeLinear_586_quantize_scale_node + Conv_590 + Relu_592 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 1583811548148740665 [03/25/2022-13:24:41] [V] [TRT] Tactic: 1583811548148740665 Time: 0.082048 [03/25/2022-13:24:41] [V] [TRT] sections.2.3.conv1.module.weight + QuantizeLinear_586_quantize_scale_node + Conv_590 + Relu_592 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 1701344857577810806 [03/25/2022-13:24:41] [V] [TRT] Tactic: 1701344857577810806 Time: 0.072576 [03/25/2022-13:24:41] [V] [TRT] sections.2.3.conv1.module.weight + QuantizeLinear_586_quantize_scale_node + Conv_590 + Relu_592 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x32x64_stage4_warpsize2x1x1_g1_tensor16x8x32_t1r1s1 Tactic: 1797231177354918208 [03/25/2022-13:24:41] [V] [TRT] Tactic: 1797231177354918208 Time: 0.098048 [03/25/2022-13:24:41] [V] [TRT] sections.2.3.conv1.module.weight + QuantizeLinear_586_quantize_scale_node + Conv_590 + Relu_592 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x192x64_stage3_warpsize4x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 2004812516525036381 [03/25/2022-13:24:41] 
[V] [TRT] Tactic: 2004812516525036381 Time: 0.073216 [03/25/2022-13:24:41] [V] [TRT] sections.2.3.conv1.module.weight + QuantizeLinear_586_quantize_scale_node + Conv_590 + Relu_592 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x128x64_stage3_warpsize4x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 2030033463723799063 [03/25/2022-13:24:41] [V] [TRT] Tactic: 2030033463723799063 Time: 0.058496 [03/25/2022-13:24:41] [V] [TRT] sections.2.3.conv1.module.weight + QuantizeLinear_586_quantize_scale_node + Conv_590 + Relu_592 Set Tactic Name: ampere_int8_i8816cudnn_int8_128x128_ldg16_relu_interior_nt_v1 Tactic: 2346437292116182513 [03/25/2022-13:24:41] [V] [TRT] Tactic: 2346437292116182513 Time: 0.091648 [03/25/2022-13:24:41] [V] [TRT] sections.2.3.conv1.module.weight + QuantizeLinear_586_quantize_scale_node + Conv_590 + Relu_592 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 2376898825218218566 [03/25/2022-13:24:41] [V] [TRT] Tactic: 2376898825218218566 Time: 0.053504 [03/25/2022-13:24:41] [V] [TRT] sections.2.3.conv1.module.weight + QuantizeLinear_586_quantize_scale_node + Conv_590 + Relu_592 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_singleBuffer_interior_nt_v1 Tactic: 2522133112320625287 [03/25/2022-13:24:41] [V] [TRT] Tactic: 2522133112320625287 Time: 0.089856 [03/25/2022-13:24:41] [V] [TRT] sections.2.3.conv1.module.weight + QuantizeLinear_586_quantize_scale_node + Conv_590 + Relu_592 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage4_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 2548171972648455240 [03/25/2022-13:24:41] [V] [TRT] Tactic: 2548171972648455240 Time: 0.057856 [03/25/2022-13:24:41] [V] [TRT] sections.2.3.conv1.module.weight + QuantizeLinear_586_quantize_scale_node + Conv_590 + Relu_592 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage1_warpsize2x1x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: 2548946449357458230 [03/25/2022-13:24:41] [V] [TRT] Tactic: 2548946449357458230 Time: 0.10688 [03/25/2022-13:24:41] [V] [TRT] sections.2.3.conv1.module.weight + QuantizeLinear_586_quantize_scale_node + Conv_590 + Relu_592 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x128_stage3_warpsize1x1x1_g1_sptensor16x8x64_simple_t1r1s1_no_preds Tactic: 2570666021825229009 [03/25/2022-13:24:41] [V] [TRT] Tactic: 2570666021825229009 Time: 0.09408 [03/25/2022-13:24:41] [V] [TRT] sections.2.3.conv1.module.weight + QuantizeLinear_586_quantize_scale_node + Conv_590 + Relu_592 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x16x128_stage3_warpsize1x1x1_g1_sptensor16x8x64_simple_t1r1s1 Tactic: 2678520742286844763 [03/25/2022-13:24:41] [V] [TRT] Tactic: 2678520742286844763 Time: 0.13696 [03/25/2022-13:24:41] [V] [TRT] sections.2.3.conv1.module.weight + QuantizeLinear_586_quantize_scale_node + Conv_590 + Relu_592 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 
2756291002030759362 [03/25/2022-13:24:41] [V] [TRT] Tactic: 2756291002030759362 Time: 0.070272 [03/25/2022-13:24:41] [V] [TRT] sections.2.3.conv1.module.weight + QuantizeLinear_586_quantize_scale_node + Conv_590 + Relu_592 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x128x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 2972948223367788520 [03/25/2022-13:24:41] [V] [TRT] Tactic: 2972948223367788520 Time: 0.058624 [03/25/2022-13:24:41] [V] [TRT] sections.2.3.conv1.module.weight + QuantizeLinear_586_quantize_scale_node + Conv_590 + Relu_592 Set Tactic Name: ampere_int8_i8816cudnn_int8_128x128_ldg16_relu_medium_nt_v1 Tactic: 2985940154541537814 [03/25/2022-13:24:41] [V] [TRT] Tactic: 2985940154541537814 Time: 0.091136 [03/25/2022-13:24:41] [V] [TRT] sections.2.3.conv1.module.weight + QuantizeLinear_586_quantize_scale_node + Conv_590 + Relu_592 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x96x64_stage5_warpsize2x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 3043273137345374664 [03/25/2022-13:24:41] [V] [TRT] Tactic: 3043273137345374664 Time: 0.109824 [03/25/2022-13:24:41] [V] [TRT] sections.2.3.conv1.module.weight + QuantizeLinear_586_quantize_scale_node + Conv_590 + Relu_592 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 3221677093659484230 [03/25/2022-13:24:41] [V] [TRT] Tactic: 3221677093659484230 Time: 0.089472 [03/25/2022-13:24:41] [V] [TRT] sections.2.3.conv1.module.weight + QuantizeLinear_586_quantize_scale_node + Conv_590 + Relu_592 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r1s1 Tactic: 3242897809704328258 [03/25/2022-13:24:41] [V] [TRT] Tactic: 3242897809704328258 Time: 0.094208 [03/25/2022-13:24:41] [V] [TRT] sections.2.3.conv1.module.weight + QuantizeLinear_586_quantize_scale_node + Conv_590 + Relu_592 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: 3312456766204252694 [03/25/2022-13:24:41] [V] [TRT] Tactic: 3312456766204252694 Time: 0.118912 [03/25/2022-13:24:41] [V] [TRT] sections.2.3.conv1.module.weight + QuantizeLinear_586_quantize_scale_node + Conv_590 + Relu_592 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 3538565962642681625 [03/25/2022-13:24:41] [V] [TRT] Tactic: 3538565962642681625 Time: 0.080768 [03/25/2022-13:24:41] [V] [TRT] sections.2.3.conv1.module.weight + QuantizeLinear_586_quantize_scale_node + Conv_590 + Relu_592 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage1_warpsize2x2x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: 3541919052468401776 [03/25/2022-13:24:41] [V] [TRT] Tactic: 3541919052468401776 Time: 0.085632 [03/25/2022-13:24:41] [V] [TRT] sections.2.3.conv1.module.weight + QuantizeLinear_586_quantize_scale_node + Conv_590 + Relu_592 Set Tactic Name: 
sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 3593397928177382100 [03/25/2022-13:24:41] [V] [TRT] Tactic: 3593397928177382100 Time: 0.110592 [03/25/2022-13:24:41] [V] [TRT] sections.2.3.conv1.module.weight + QuantizeLinear_586_quantize_scale_node + Conv_590 + Relu_592 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x64x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 3670282018109435863 [03/25/2022-13:24:41] [V] [TRT] Tactic: 3670282018109435863 Time: 0.068096 [03/25/2022-13:24:41] [V] [TRT] sections.2.3.conv1.module.weight + QuantizeLinear_586_quantize_scale_node + Conv_590 + Relu_592 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 3671413346254027573 [03/25/2022-13:24:41] [V] [TRT] Tactic: 3671413346254027573 Time: 0.070656 [03/25/2022-13:24:41] [V] [TRT] sections.2.3.conv1.module.weight + QuantizeLinear_586_quantize_scale_node + Conv_590 + Relu_592 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x128_ldg16_relu_medium_nt_v1 Tactic: 3899284354987683408 [03/25/2022-13:24:41] [V] [TRT] Tactic: 3899284354987683408 Time: 0.092544 [03/25/2022-13:24:41] [V] [TRT] sections.2.3.conv1.module.weight + QuantizeLinear_586_quantize_scale_node + Conv_590 + Relu_592 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: 3927509214678622419 [03/25/2022-13:24:41] [V] [TRT] Tactic: 3927509214678622419 Time: 0.092544 [03/25/2022-13:24:41] [V] [TRT] sections.2.3.conv1.module.weight + QuantizeLinear_586_quantize_scale_node + Conv_590 + Relu_592 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 4112572034735311841 [03/25/2022-13:24:41] [V] [TRT] Tactic: 4112572034735311841 Time: 0.14016 [03/25/2022-13:24:41] [V] [TRT] sections.2.3.conv1.module.weight + QuantizeLinear_586_quantize_scale_node + Conv_590 + Relu_592 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 4239974928951431644 [03/25/2022-13:24:41] [V] [TRT] Tactic: 4239974928951431644 Time: 0.084352 [03/25/2022-13:24:41] [V] [TRT] sections.2.3.conv1.module.weight + QuantizeLinear_586_quantize_scale_node + Conv_590 + Relu_592 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage6_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 4610760414797216079 [03/25/2022-13:24:41] [V] [TRT] Tactic: 4610760414797216079 Time: 0.072064 [03/25/2022-13:24:41] [V] [TRT] sections.2.3.conv1.module.weight + QuantizeLinear_586_quantize_scale_node + Conv_590 + Relu_592 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_small_nt_v1 Tactic: 4717285412741024953 [03/25/2022-13:24:41] [V] [TRT] Tactic: 4717285412741024953 Time: 0.091264 [03/25/2022-13:24:41] [V] [TRT] sections.2.3.conv1.module.weight + QuantizeLinear_586_quantize_scale_node + Conv_590 + Relu_592 Set 
Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 4796956614760326119 [03/25/2022-13:24:41] [V] [TRT] Tactic: 4796956614760326119 Time: 0.064512 [03/25/2022-13:24:41] [V] [TRT] sections.2.3.conv1.module.weight + QuantizeLinear_586_quantize_scale_node + Conv_590 + Relu_592 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x128_stage3_warpsize2x2x1_g1_sptensor16x8x64_t1r1s1_no_preds Tactic: 4909502217677847353 [03/25/2022-13:24:41] [V] [TRT] Tactic: 4909502217677847353 Time: 0.04928 [03/25/2022-13:24:41] [V] [TRT] sections.2.3.conv1.module.weight + QuantizeLinear_586_quantize_scale_node + Conv_590 + Relu_592 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 4919361344804309192 [03/25/2022-13:24:41] [V] [TRT] Tactic: 4919361344804309192 Time: 0.108032 [03/25/2022-13:24:41] [V] [TRT] sections.2.3.conv1.module.weight + QuantizeLinear_586_quantize_scale_node + Conv_590 + Relu_592 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x128_stage3_warpsize1x2x1_g1_sptensor16x8x64_simple_t1r1s1 Tactic: 5043674678294309681 [03/25/2022-13:24:41] [V] [TRT] Tactic: 5043674678294309681 Time: 0.071552 [03/25/2022-13:24:41] [V] [TRT] sections.2.3.conv1.module.weight + QuantizeLinear_586_quantize_scale_node + Conv_590 + Relu_592 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_interior_nt_v1 Tactic: 5126565865931538390 [03/25/2022-13:24:41] [V] [TRT] Tactic: 5126565865931538390 Time: 0.092544 [03/25/2022-13:24:41] [V] [TRT] sections.2.3.conv1.module.weight + QuantizeLinear_586_quantize_scale_node + Conv_590 + Relu_592 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x64x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 5204702486885981735 [03/25/2022-13:24:41] [V] [TRT] Tactic: 5204702486885981735 Time: 0.069248 [03/25/2022-13:24:41] [V] [TRT] sections.2.3.conv1.module.weight + QuantizeLinear_586_quantize_scale_node + Conv_590 + Relu_592 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: 5375256703210220108 [03/25/2022-13:24:41] [V] [TRT] Tactic: 5375256703210220108 Time: 0.066816 [03/25/2022-13:24:41] [V] [TRT] sections.2.3.conv1.module.weight + QuantizeLinear_586_quantize_scale_node + Conv_590 + Relu_592 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x128_stage3_warpsize2x2x1_g1_sptensor16x8x64_simple_t1r1s1 Tactic: 5424258848951129084 [03/25/2022-13:24:41] [V] [TRT] Tactic: 5424258848951129084 Time: 0.04992 [03/25/2022-13:24:41] [V] [TRT] sections.2.3.conv1.module.weight + QuantizeLinear_586_quantize_scale_node + Conv_590 + Relu_592 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 5443897483205284103 [03/25/2022-13:24:41] [V] [TRT] Tactic: 5443897483205284103 Time: 0.071808 
[03/25/2022-13:24:41] [V] [TRT] sections.2.3.conv1.module.weight + QuantizeLinear_586_quantize_scale_node + Conv_590 + Relu_592 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage4_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 5707566217891294846 [03/25/2022-13:24:41] [V] [TRT] Tactic: 5707566217891294846 Time: 0.060032 [03/25/2022-13:24:41] [V] [TRT] sections.2.3.conv1.module.weight + QuantizeLinear_586_quantize_scale_node + Conv_590 + Relu_592 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x4x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 5986622376339202983 [03/25/2022-13:24:41] [V] [TRT] Tactic: 5986622376339202983 Time: 0.081408 [03/25/2022-13:24:41] [V] [TRT] sections.2.3.conv1.module.weight + QuantizeLinear_586_quantize_scale_node + Conv_590 + Relu_592 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x128x64_stage3_warpsize4x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 6007888770437705057 [03/25/2022-13:24:41] [V] [TRT] Tactic: 6007888770437705057 Time: 0.06272 [03/25/2022-13:24:41] [V] [TRT] sections.2.3.conv1.module.weight + QuantizeLinear_586_quantize_scale_node + Conv_590 + Relu_592 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32_t1r1s1 Tactic: 6405251167055673379 [03/25/2022-13:24:41] [V] [TRT] Tactic: 6405251167055673379 Time: 0.065408 [03/25/2022-13:24:41] [V] [TRT] sections.2.3.conv1.module.weight + QuantizeLinear_586_quantize_scale_node + Conv_590 + Relu_592 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage4_warpsize4x1x1_g1_tensor16x8x32_t1r1s1 Tactic: 6433368103202497147 [03/25/2022-13:24:41] [V] [TRT] Tactic: 6433368103202497147 Time: 0.061568 [03/25/2022-13:24:41] [V] [TRT] sections.2.3.conv1.module.weight + QuantizeLinear_586_quantize_scale_node + Conv_590 + Relu_592 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 6441948709525127755 [03/25/2022-13:24:41] [V] [TRT] Tactic: 6441948709525127755 Time: 0.112512 [03/25/2022-13:24:41] [V] [TRT] sections.2.3.conv1.module.weight + QuantizeLinear_586_quantize_scale_node + Conv_590 + Relu_592 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x256x64_stage3_warpsize2x4x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 6443933097134654777 [03/25/2022-13:24:41] [V] [TRT] Tactic: 6443933097134654777 Time: 0.064384 [03/25/2022-13:24:41] [V] [TRT] sections.2.3.conv1.module.weight + QuantizeLinear_586_quantize_scale_node + Conv_590 + Relu_592 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 6457435868048963632 [03/25/2022-13:24:41] [V] [TRT] Tactic: 6457435868048963632 Time: 0.072576 [03/25/2022-13:24:41] [V] [TRT] sections.2.3.conv1.module.weight + QuantizeLinear_586_quantize_scale_node + Conv_590 + Relu_592 Set Tactic Name: 
sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x128_stage3_warpsize1x1x1_g1_sptensor16x8x64_t1r1s1 Tactic: 6510345569544721081 [03/25/2022-13:24:41] [V] [TRT] Tactic: 6510345569544721081 Time: 0.09408 [03/25/2022-13:24:41] [V] [TRT] sections.2.3.conv1.module.weight + QuantizeLinear_586_quantize_scale_node + Conv_590 + Relu_592 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 6793988781414507278 [03/25/2022-13:24:41] [V] [TRT] Tactic: 6793988781414507278 Time: 0.05568 [03/25/2022-13:24:41] [V] [TRT] sections.2.3.conv1.module.weight + QuantizeLinear_586_quantize_scale_node + Conv_590 + Relu_592 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 6880710371738875469 [03/25/2022-13:24:41] [V] [TRT] Tactic: 6880710371738875469 Time: 0.076544 [03/25/2022-13:24:41] [V] [TRT] sections.2.3.conv1.module.weight + QuantizeLinear_586_quantize_scale_node + Conv_590 + Relu_592 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 6925201228918187099 [03/25/2022-13:24:41] [V] [TRT] Tactic: 6925201228918187099 Time: 0.056704 [03/25/2022-13:24:41] [V] [TRT] sections.2.3.conv1.module.weight + QuantizeLinear_586_quantize_scale_node + Conv_590 + Relu_592 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 6991524515605108718 [03/25/2022-13:24:41] [V] [TRT] Tactic: 6991524515605108718 Time: 0.088576 [03/25/2022-13:24:41] [V] [TRT] sections.2.3.conv1.module.weight + QuantizeLinear_586_quantize_scale_node + Conv_590 + Relu_592 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x4x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 7245509442265271220 [03/25/2022-13:24:41] [V] [TRT] Tactic: 7245509442265271220 Time: 0.081152 [03/25/2022-13:24:41] [V] [TRT] sections.2.3.conv1.module.weight + QuantizeLinear_586_quantize_scale_node + Conv_590 + Relu_592 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x64x64_stage3_warpsize4x1x1_g1_tensor16x8x32_t1r1s1 Tactic: 7318929579222925725 [03/25/2022-13:24:41] [V] [TRT] Tactic: 7318929579222925725 Time: 0.06528 [03/25/2022-13:24:41] [V] [TRT] sections.2.3.conv1.module.weight + QuantizeLinear_586_quantize_scale_node + Conv_590 + Relu_592 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: 7731430299029542276 [03/25/2022-13:24:41] [V] [TRT] Tactic: 7731430299029542276 Time: 0.055296 [03/25/2022-13:24:41] [V] [TRT] sections.2.3.conv1.module.weight + QuantizeLinear_586_quantize_scale_node + Conv_590 + Relu_592 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage4_warpsize2x4x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: 7738495016763012180 
[03/25/2022-13:24:41] [V] [TRT] Tactic: 7738495016763012180 Time: 0.055808 [03/25/2022-13:24:41] [V] [TRT] sections.2.3.conv1.module.weight + QuantizeLinear_586_quantize_scale_node + Conv_590 + Relu_592 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x128_stage3_warpsize1x4x1_g1_sptensor16x8x64_t1r1s1_no_preds Tactic: 7886967395128926382 [03/25/2022-13:24:41] [V] [TRT] Tactic: 7886967395128926382 Time: 0.061824 [03/25/2022-13:24:41] [V] [TRT] sections.2.3.conv1.module.weight + QuantizeLinear_586_quantize_scale_node + Conv_590 + Relu_592 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage4_warpsize4x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 8142283985160822229 [03/25/2022-13:24:41] [V] [TRT] Tactic: 8142283985160822229 Time: 0.057472 [03/25/2022-13:24:41] [V] [TRT] sections.2.3.conv1.module.weight + QuantizeLinear_586_quantize_scale_node + Conv_590 + Relu_592 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage4_warpsize2x4x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 8173975624668590862 [03/25/2022-13:24:41] [V] [TRT] Tactic: 8173975624668590862 Time: 0.058624 [03/25/2022-13:24:41] [V] [TRT] sections.2.3.conv1.module.weight + QuantizeLinear_586_quantize_scale_node + Conv_590 + Relu_592 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage4_warpsize4x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 8234775147403903473 [03/25/2022-13:24:41] [V] [TRT] Tactic: 8234775147403903473 Time: 0.058496 [03/25/2022-13:24:41] [V] [TRT] sections.2.3.conv1.module.weight + QuantizeLinear_586_quantize_scale_node + Conv_590 + Relu_592 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x128x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 8524082966802584889 [03/25/2022-13:24:41] [V] [TRT] Tactic: 8524082966802584889 Time: 0.060416 [03/25/2022-13:24:41] [V] [TRT] sections.2.3.conv1.module.weight + QuantizeLinear_586_quantize_scale_node + Conv_590 + Relu_592 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage1_warpsize4x1x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: 8684013308930763400 [03/25/2022-13:24:41] [V] [TRT] Tactic: 8684013308930763400 Time: 0.088832 [03/25/2022-13:24:41] [V] [TRT] sections.2.3.conv1.module.weight + QuantizeLinear_586_quantize_scale_node + Conv_590 + Relu_592 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x128x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 8765382722978397630 [03/25/2022-13:24:41] [V] [TRT] Tactic: 8765382722978397630 Time: 0.06144 [03/25/2022-13:24:41] [V] [TRT] sections.2.3.conv1.module.weight + QuantizeLinear_586_quantize_scale_node + Conv_590 + Relu_592 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 8843193587782643431 [03/25/2022-13:24:41] [V] [TRT] Tactic: 8843193587782643431 Time: 0.08 [03/25/2022-13:24:41] [V] [TRT] sections.2.3.conv1.module.weight + QuantizeLinear_586_quantize_scale_node + Conv_590 + Relu_592 Set 
Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x128_stage3_warpsize1x4x1_g1_sptensor16x8x64_simple_t1r1s1 Tactic: 8883810517410230831 [03/25/2022-13:24:41] [V] [TRT] Tactic: 8883810517410230831 Time: 0.06272 [03/25/2022-13:24:41] [V] [TRT] sections.2.3.conv1.module.weight + QuantizeLinear_586_quantize_scale_node + Conv_590 + Relu_592 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 8930797211803511337 [03/25/2022-13:24:42] [V] [TRT] Tactic: 8930797211803511337 Time: 0.082688 [03/25/2022-13:24:42] [V] [TRT] sections.2.3.conv1.module.weight + QuantizeLinear_586_quantize_scale_node + Conv_590 + Relu_592 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 8935070489925739043 [03/25/2022-13:24:42] [V] [TRT] Tactic: 8935070489925739043 Time: 0.06016 [03/25/2022-13:24:42] [V] [TRT] sections.2.3.conv1.module.weight + QuantizeLinear_586_quantize_scale_node + Conv_590 + Relu_592 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x16x128_stage3_warpsize1x1x1_g1_sptensor16x8x64_t1r1s1 Tactic: 9062173295331155069 [03/25/2022-13:24:42] [V] [TRT] Tactic: 9062173295331155069 Time: 0.137472 [03/25/2022-13:24:42] [V] [TRT] sections.2.3.conv1.module.weight + QuantizeLinear_586_quantize_scale_node + Conv_590 + Relu_592 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_singleBuffer_small_nt_v1 Tactic: -9118785798277698619 [03/25/2022-13:24:42] [V] [TRT] Tactic: -9118785798277698619 Time: 0.09088 [03/25/2022-13:24:42] [V] [TRT] sections.2.3.conv1.module.weight + QuantizeLinear_586_quantize_scale_node + Conv_590 + Relu_592 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -8985599729413291927 [03/25/2022-13:24:42] [V] [TRT] Tactic: -8985599729413291927 Time: 0.066048 [03/25/2022-13:24:42] [V] [TRT] sections.2.3.conv1.module.weight + QuantizeLinear_586_quantize_scale_node + Conv_590 + Relu_592 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x32x64_stage4_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -8972697510150675429 [03/25/2022-13:24:42] [V] [TRT] Tactic: -8972697510150675429 Time: 0.08448 [03/25/2022-13:24:42] [V] [TRT] sections.2.3.conv1.module.weight + QuantizeLinear_586_quantize_scale_node + Conv_590 + Relu_592 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r1s1 Tactic: -8943710627305202139 [03/25/2022-13:24:42] [V] [TRT] Tactic: -8943710627305202139 Time: 0.0928 [03/25/2022-13:24:42] [V] [TRT] sections.2.3.conv1.module.weight + QuantizeLinear_586_quantize_scale_node + Conv_590 + Relu_592 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: -8859846367886814331 [03/25/2022-13:24:42] [V] [TRT] Tactic: -8859846367886814331 Time: 
0.098048 [03/25/2022-13:24:42] [V] [TRT] sections.2.3.conv1.module.weight + QuantizeLinear_586_quantize_scale_node + Conv_590 + Relu_592 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: -8638624340850784688 [03/25/2022-13:24:42] [V] [TRT] Tactic: -8638624340850784688 Time: 0.083072 [03/25/2022-13:24:42] [V] [TRT] sections.2.3.conv1.module.weight + QuantizeLinear_586_quantize_scale_node + Conv_590 + Relu_592 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage4_warpsize4x1x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: -8556775352640313933 [03/25/2022-13:24:42] [V] [TRT] Tactic: -8556775352640313933 Time: 0.05888 [03/25/2022-13:24:42] [V] [TRT] sections.2.3.conv1.module.weight + QuantizeLinear_586_quantize_scale_node + Conv_590 + Relu_592 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage1_warpsize2x1x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: -8382298409581540699 [03/25/2022-13:24:42] [V] [TRT] Tactic: -8382298409581540699 Time: 0.135296 [03/25/2022-13:24:42] [V] [TRT] sections.2.3.conv1.module.weight + QuantizeLinear_586_quantize_scale_node + Conv_590 + Relu_592 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage1_warpsize4x2x1_g1_tensor8x8x16_t1r1s1 Tactic: -8172318747337038866 [03/25/2022-13:24:42] [V] [TRT] Tactic: -8172318747337038866 Time: 0.089344 [03/25/2022-13:24:42] [V] [TRT] sections.2.3.conv1.module.weight + QuantizeLinear_586_quantize_scale_node + Conv_590 + Relu_592 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x256x64_stage3_warpsize2x4x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -8038164441468184723 [03/25/2022-13:24:42] [V] [TRT] Tactic: -8038164441468184723 Time: 0.060416 [03/25/2022-13:24:42] [V] [TRT] sections.2.3.conv1.module.weight + QuantizeLinear_586_quantize_scale_node + Conv_590 + Relu_592 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x96x64_stage5_warpsize2x1x1_g1_tensor16x8x32_t1r1s1 Tactic: -7844028314176826857 [03/25/2022-13:24:42] [V] [TRT] Tactic: -7844028314176826857 Time: 0.111616 [03/25/2022-13:24:42] [V] [TRT] sections.2.3.conv1.module.weight + QuantizeLinear_586_quantize_scale_node + Conv_590 + Relu_592 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x128_stage3_warpsize2x2x1_g1_sptensor16x8x64_simple_t1r1s1_no_preds Tactic: -7674507941016740570 [03/25/2022-13:24:42] [V] [TRT] Tactic: -7674507941016740570 Time: 0.049408 [03/25/2022-13:24:42] [V] [TRT] sections.2.3.conv1.module.weight + QuantizeLinear_586_quantize_scale_node + Conv_590 + Relu_592 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage4_warpsize2x4x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -7364286662638617917 [03/25/2022-13:24:42] [V] [TRT] Tactic: -7364286662638617917 Time: 0.0544 [03/25/2022-13:24:42] [V] [TRT] sections.2.3.conv1.module.weight + QuantizeLinear_586_quantize_scale_node + Conv_590 + Relu_592 Set Tactic Name: 
sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage1_warpsize2x1x1_g1_tensor8x8x16_t1r1s1 Tactic: -7361755530333096258 [03/25/2022-13:24:42] [V] [TRT] Tactic: -7361755530333096258 Time: 0.10816 [03/25/2022-13:24:42] [V] [TRT] sections.2.3.conv1.module.weight + QuantizeLinear_586_quantize_scale_node + Conv_590 + Relu_592 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x128_stage3_warpsize1x4x1_g1_sptensor16x8x64_t1r1s1 Tactic: -7289760022626653388 [03/25/2022-13:24:42] [V] [TRT] Tactic: -7289760022626653388 Time: 0.062336 [03/25/2022-13:24:42] [V] [TRT] sections.2.3.conv1.module.weight + QuantizeLinear_586_quantize_scale_node + Conv_590 + Relu_592 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage1_warpsize2x4x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: -7106539943789766885 [03/25/2022-13:24:42] [V] [TRT] Tactic: -7106539943789766885 Time: 0.088064 [03/25/2022-13:24:42] [V] [TRT] sections.2.3.conv1.module.weight + QuantizeLinear_586_quantize_scale_node + Conv_590 + Relu_592 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage1_warpsize4x2x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: -6969478418607271266 [03/25/2022-13:24:42] [V] [TRT] Tactic: -6969478418607271266 Time: 0.088448 [03/25/2022-13:24:42] [V] [TRT] sections.2.3.conv1.module.weight + QuantizeLinear_586_quantize_scale_node + Conv_590 + Relu_592 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -6930438165437733000 [03/25/2022-13:24:42] [V] [TRT] Tactic: -6930438165437733000 Time: 0.139136 [03/25/2022-13:24:42] [V] [TRT] sections.2.3.conv1.module.weight + QuantizeLinear_586_quantize_scale_node + Conv_590 + Relu_592 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x64x64_stage3_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -6879607992933502380 [03/25/2022-13:24:42] [V] [TRT] Tactic: -6879607992933502380 Time: 0.064128 [03/25/2022-13:24:42] [V] [TRT] sections.2.3.conv1.module.weight + QuantizeLinear_586_quantize_scale_node + Conv_590 + Relu_592 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x128_stage3_warpsize1x2x1_g1_sptensor16x8x64_simple_t1r1s1_no_preds Tactic: -6839669803644810934 [03/25/2022-13:24:42] [V] [TRT] Tactic: -6839669803644810934 Time: 0.070912 [03/25/2022-13:24:42] [V] [TRT] sections.2.3.conv1.module.weight + QuantizeLinear_586_quantize_scale_node + Conv_590 + Relu_592 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x128_stage3_warpsize1x2x1_g1_sptensor16x8x64_t1r1s1_no_preds Tactic: -6812830108414456369 [03/25/2022-13:24:42] [V] [TRT] Tactic: -6812830108414456369 Time: 0.071424 [03/25/2022-13:24:42] [V] [TRT] sections.2.3.conv1.module.weight + QuantizeLinear_586_quantize_scale_node + Conv_590 + Relu_592 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x128_stage3_warpsize4x2x1_g1_sptensor16x8x64_simple_t1r1s1_no_preds 
Tactic: -6779804930216439173 [03/25/2022-13:24:42] [V] [TRT] Tactic: -6779804930216439173 Time: 0.046592 [03/25/2022-13:24:42] [V] [TRT] sections.2.3.conv1.module.weight + QuantizeLinear_586_quantize_scale_node + Conv_590 + Relu_592 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: -6527178416855951297 [03/25/2022-13:24:42] [V] [TRT] Tactic: -6527178416855951297 Time: 0.11008 [03/25/2022-13:24:42] [V] [TRT] sections.2.3.conv1.module.weight + QuantizeLinear_586_quantize_scale_node + Conv_590 + Relu_592 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_t1r1s1 Tactic: -6510232214299595844 [03/25/2022-13:24:42] [V] [TRT] Tactic: -6510232214299595844 Time: 0.11008 [03/25/2022-13:24:42] [V] [TRT] sections.2.3.conv1.module.weight + QuantizeLinear_586_quantize_scale_node + Conv_590 + Relu_592 Set Tactic Name: ampere_int8_i8816cudnn_int8_128x128_ldg16_relu_small_nt_v1 Tactic: -6400348606759295499 [03/25/2022-13:24:42] [V] [TRT] Tactic: -6400348606759295499 Time: 0.088704 [03/25/2022-13:24:42] [V] [TRT] sections.2.3.conv1.module.weight + QuantizeLinear_586_quantize_scale_node + Conv_590 + Relu_592 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage1_warpsize2x2x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: -6346247605026339453 [03/25/2022-13:24:42] [V] [TRT] Tactic: -6346247605026339453 Time: 0.091904 [03/25/2022-13:24:42] [V] [TRT] sections.2.3.conv1.module.weight + QuantizeLinear_586_quantize_scale_node + Conv_590 + Relu_592 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x96x64_stage4_warpsize4x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -6232597026469067819 [03/25/2022-13:24:42] [V] [TRT] Tactic: -6232597026469067819 Time: 0.07936 [03/25/2022-13:24:42] [V] [TRT] sections.2.3.conv1.module.weight + QuantizeLinear_586_quantize_scale_node + Conv_590 + Relu_592 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x128_ldg16_relu_small_nt_v1 Tactic: -5980889159865208399 [03/25/2022-13:24:42] [V] [TRT] Tactic: -5980889159865208399 Time: 0.091136 [03/25/2022-13:24:42] [V] [TRT] sections.2.3.conv1.module.weight + QuantizeLinear_586_quantize_scale_node + Conv_590 + Relu_592 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_singleBuffer_medium_nt_v1 Tactic: -5766140806760372989 [03/25/2022-13:24:42] [V] [TRT] Tactic: -5766140806760372989 Time: 0.093312 [03/25/2022-13:24:42] [V] [TRT] sections.2.3.conv1.module.weight + QuantizeLinear_586_quantize_scale_node + Conv_590 + Relu_592 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r1s1 Tactic: -5697614955743334137 [03/25/2022-13:24:42] [V] [TRT] Tactic: -5697614955743334137 Time: 0.086784 [03/25/2022-13:24:42] [V] [TRT] sections.2.3.conv1.module.weight + QuantizeLinear_586_quantize_scale_node + Conv_590 + Relu_592 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -5671123121710113970 [03/25/2022-13:24:42] [V] [TRT] Tactic: -5671123121710113970 Time: 0.0864 
[03/25/2022-13:24:42] [V] [TRT] sections.2.3.conv1.module.weight + QuantizeLinear_586_quantize_scale_node + Conv_590 + Relu_592 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x32x64_stage4_warpsize2x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -5615581362569252260 [03/25/2022-13:24:42] [V] [TRT] Tactic: -5615581362569252260 Time: 0.095488 [03/25/2022-13:24:42] [V] [TRT] sections.2.3.conv1.module.weight + QuantizeLinear_586_quantize_scale_node + Conv_590 + Relu_592 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x128x64_stage3_warpsize4x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -5562968047117507056 [03/25/2022-13:24:42] [V] [TRT] Tactic: -5562968047117507056 Time: 0.062592 [03/25/2022-13:24:42] [V] [TRT] sections.2.3.conv1.module.weight + QuantizeLinear_586_quantize_scale_node + Conv_590 + Relu_592 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -5516472881360101487 [03/25/2022-13:24:42] [V] [TRT] Tactic: -5516472881360101487 Time: 0.082432 [03/25/2022-13:24:42] [V] [TRT] sections.2.3.conv1.module.weight + QuantizeLinear_586_quantize_scale_node + Conv_590 + Relu_592 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r1s1 Tactic: -5311474420963248369 [03/25/2022-13:24:42] [V] [TRT] Tactic: -5311474420963248369 Time: 0.119936 [03/25/2022-13:24:42] [V] [TRT] sections.2.3.conv1.module.weight + QuantizeLinear_586_quantize_scale_node + Conv_590 + Relu_592 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: -5170003087447722174 [03/25/2022-13:24:42] [V] [TRT] Tactic: -5170003087447722174 Time: 0.112512 [03/25/2022-13:24:42] [V] [TRT] sections.2.3.conv1.module.weight + QuantizeLinear_586_quantize_scale_node + Conv_590 + Relu_592 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage6_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -4889586143772361690 [03/25/2022-13:24:42] [V] [TRT] Tactic: -4889586143772361690 Time: 0.070272 [03/25/2022-13:24:42] [V] [TRT] sections.2.3.conv1.module.weight + QuantizeLinear_586_quantize_scale_node + Conv_590 + Relu_592 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x128_stage3_warpsize1x4x1_g1_sptensor16x8x64_simple_t1r1s1_no_preds Tactic: -4889498558023475527 [03/25/2022-13:24:42] [V] [TRT] Tactic: -4889498558023475527 Time: 0.061952 [03/25/2022-13:24:42] [V] [TRT] sections.2.3.conv1.module.weight + QuantizeLinear_586_quantize_scale_node + Conv_590 + Relu_592 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: -4849712423393454704 [03/25/2022-13:24:42] [V] [TRT] Tactic: -4849712423393454704 Time: 0.070528 [03/25/2022-13:24:42] [V] [TRT] sections.2.3.conv1.module.weight + QuantizeLinear_586_quantize_scale_node + Conv_590 + Relu_592 Set Tactic Name: 
sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x128_stage3_warpsize4x2x1_g1_sptensor16x8x64_simple_t1r1s1 Tactic: -4681913707320020520 [03/25/2022-13:24:42] [V] [TRT] Tactic: -4681913707320020520 Time: 0.046336 [03/25/2022-13:24:42] [V] [TRT] sections.2.3.conv1.module.weight + QuantizeLinear_586_quantize_scale_node + Conv_590 + Relu_592 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_medium_nt_v1 Tactic: -4516822589357530549 [03/25/2022-13:24:42] [V] [TRT] Tactic: -4516822589357530549 Time: 0.094464 [03/25/2022-13:24:42] [V] [TRT] sections.2.3.conv1.module.weight + QuantizeLinear_586_quantize_scale_node + Conv_590 + Relu_592 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -4455415102719506646 [03/25/2022-13:24:42] [V] [TRT] Tactic: -4455415102719506646 Time: 0.08064 [03/25/2022-13:24:42] [V] [TRT] sections.2.3.conv1.module.weight + QuantizeLinear_586_quantize_scale_node + Conv_590 + Relu_592 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32_t1r1s1 Tactic: -4425346730823666456 [03/25/2022-13:24:42] [V] [TRT] Tactic: -4425346730823666456 Time: 0.085504 [03/25/2022-13:24:42] [V] [TRT] sections.2.3.conv1.module.weight + QuantizeLinear_586_quantize_scale_node + Conv_590 + Relu_592 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage1_warpsize2x1x1_g1_tensor8x8x16_t1r1s1 Tactic: -4260476497340370474 [03/25/2022-13:24:42] [V] [TRT] Tactic: -4260476497340370474 Time: 0.135808 [03/25/2022-13:24:42] [V] [TRT] sections.2.3.conv1.module.weight + QuantizeLinear_586_quantize_scale_node + Conv_590 + Relu_592 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x2x1_g1_tensor16x8x32_t1r1s1 Tactic: -4182501876984672402 [03/25/2022-13:24:42] [V] [TRT] Tactic: -4182501876984672402 Time: 0.08704 [03/25/2022-13:24:42] [V] [TRT] sections.2.3.conv1.module.weight + QuantizeLinear_586_quantize_scale_node + Conv_590 + Relu_592 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x256x64_stage3_warpsize2x4x1_g1_tensor16x8x32_t1r1s1 Tactic: -4151617293257698859 [03/25/2022-13:24:42] [V] [TRT] Tactic: -4151617293257698859 Time: 0.064512 [03/25/2022-13:24:42] [V] [TRT] sections.2.3.conv1.module.weight + QuantizeLinear_586_quantize_scale_node + Conv_590 + Relu_592 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x128_stage3_warpsize4x2x1_g1_sptensor16x8x64_t1r1s1_no_preds Tactic: -3862908719298381451 [03/25/2022-13:24:42] [V] [TRT] Tactic: -3862908719298381451 Time: 0.04736 [03/25/2022-13:24:42] [V] [TRT] sections.2.3.conv1.module.weight + QuantizeLinear_586_quantize_scale_node + Conv_590 + Relu_592 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -3825889760337461729 [03/25/2022-13:24:42] [V] [TRT] Tactic: -3825889760337461729 Time: 0.108928 [03/25/2022-13:24:42] [V] [TRT] 
sections.2.3.conv1.module.weight + QuantizeLinear_586_quantize_scale_node + Conv_590 + Relu_592 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -3797022944823726673 [03/25/2022-13:24:42] [V] [TRT] Tactic: -3797022944823726673 Time: 0.079232 [03/25/2022-13:24:42] [V] [TRT] sections.2.3.conv1.module.weight + QuantizeLinear_586_quantize_scale_node + Conv_590 + Relu_592 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_t1r1s1 Tactic: -3613322253849278738 [03/25/2022-13:24:42] [V] [TRT] Tactic: -3613322253849278738 Time: 0.142464 [03/25/2022-13:24:42] [V] [TRT] sections.2.3.conv1.module.weight + QuantizeLinear_586_quantize_scale_node + Conv_590 + Relu_592 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x32x64_stage4_warpsize4x1x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: -3577322188448771475 [03/25/2022-13:24:42] [V] [TRT] Tactic: -3577322188448771475 Time: 0.087296 [03/25/2022-13:24:42] [V] [TRT] sections.2.3.conv1.module.weight + QuantizeLinear_586_quantize_scale_node + Conv_590 + Relu_592 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage1_warpsize2x4x1_g1_tensor8x8x16_t1r1s1 Tactic: -3531681826488401618 [03/25/2022-13:24:42] [V] [TRT] Tactic: -3531681826488401618 Time: 0.08896 [03/25/2022-13:24:42] [V] [TRT] sections.2.3.conv1.module.weight + QuantizeLinear_586_quantize_scale_node + Conv_590 + Relu_592 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x16x128_stage3_warpsize1x1x1_g1_sptensor16x8x64_simple_t1r1s1_no_preds Tactic: -3305554949874552860 [03/25/2022-13:24:42] [V] [TRT] Tactic: -3305554949874552860 Time: 0.136704 [03/25/2022-13:24:42] [V] [TRT] sections.2.3.conv1.module.weight + QuantizeLinear_586_quantize_scale_node + Conv_590 + Relu_592 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x192x64_stage3_warpsize4x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -3288585994448820820 [03/25/2022-13:24:42] [V] [TRT] Tactic: -3288585994448820820 Time: 0.08064 [03/25/2022-13:24:42] [V] [TRT] sections.2.3.conv1.module.weight + QuantizeLinear_586_quantize_scale_node + Conv_590 + Relu_592 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x32x64_stage4_warpsize4x1x1_g1_tensor16x8x32_t1r1s1 Tactic: -2754311112012636251 [03/25/2022-13:24:42] [V] [TRT] Tactic: -2754311112012636251 Time: 0.088832 [03/25/2022-13:24:42] [V] [TRT] sections.2.3.conv1.module.weight + QuantizeLinear_586_quantize_scale_node + Conv_590 + Relu_592 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -2432868635536396215 [03/25/2022-13:24:42] [V] [TRT] Tactic: -2432868635536396215 Time: 0.081152 [03/25/2022-13:24:42] [V] [TRT] sections.2.3.conv1.module.weight + QuantizeLinear_586_quantize_scale_node + Conv_590 + Relu_592 Set Tactic Name: 
sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x96x64_stage5_warpsize2x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -2379804152300264660 [03/25/2022-13:24:42] [V] [TRT] Tactic: -2379804152300264660 Time: 0.108544 [03/25/2022-13:24:42] [V] [TRT] sections.2.3.conv1.module.weight + QuantizeLinear_586_quantize_scale_node + Conv_590 + Relu_592 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x128_stage3_warpsize2x2x1_g1_sptensor16x8x64_t1r1s1 Tactic: -2352253835013627337 [03/25/2022-13:24:42] [V] [TRT] Tactic: -2352253835013627337 Time: 0.050304 [03/25/2022-13:24:42] [V] [TRT] sections.2.3.conv1.module.weight + QuantizeLinear_586_quantize_scale_node + Conv_590 + Relu_592 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x96x64_stage4_warpsize4x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -2335587136911650799 [03/25/2022-13:24:42] [V] [TRT] Tactic: -2335587136911650799 Time: 0.073344 [03/25/2022-13:24:42] [V] [TRT] sections.2.3.conv1.module.weight + QuantizeLinear_586_quantize_scale_node + Conv_590 + Relu_592 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage4_warpsize2x4x1_g1_tensor16x8x32_t1r1s1 Tactic: -2315453944962430928 [03/25/2022-13:24:42] [V] [TRT] Tactic: -2315453944962430928 Time: 0.059264 [03/25/2022-13:24:42] [V] [TRT] sections.2.3.conv1.module.weight + QuantizeLinear_586_quantize_scale_node + Conv_590 + Relu_592 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x128_stage3_warpsize1x1x1_g1_sptensor16x8x64_simple_t1r1s1 Tactic: -2238364958919154661 [03/25/2022-13:24:42] [V] [TRT] Tactic: -2238364958919154661 Time: 0.093952 [03/25/2022-13:24:42] [V] [TRT] sections.2.3.conv1.module.weight + QuantizeLinear_586_quantize_scale_node + Conv_590 + Relu_592 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -1916483171117495388 [03/25/2022-13:24:42] [V] [TRT] Tactic: -1916483171117495388 Time: 0.088832 [03/25/2022-13:24:42] [V] [TRT] sections.2.3.conv1.module.weight + QuantizeLinear_586_quantize_scale_node + Conv_590 + Relu_592 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x16x128_stage3_warpsize1x1x1_g1_sptensor16x8x64_t1r1s1_no_preds Tactic: -1740762957710554518 [03/25/2022-13:24:42] [V] [TRT] Tactic: -1740762957710554518 Time: 0.136832 [03/25/2022-13:24:42] [V] [TRT] sections.2.3.conv1.module.weight + QuantizeLinear_586_quantize_scale_node + Conv_590 + Relu_592 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x96x64_stage4_warpsize4x2x1_g1_tensor16x8x32_t1r1s1 Tactic: -1549742793039499659 [03/25/2022-13:24:42] [V] [TRT] Tactic: -1549742793039499659 Time: 0.080128 [03/25/2022-13:24:42] [V] [TRT] sections.2.3.conv1.module.weight + QuantizeLinear_586_quantize_scale_node + Conv_590 + Relu_592 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: 
-1499578657823798783 [03/25/2022-13:24:42] [V] [TRT] Tactic: -1499578657823798783 Time: 0.080896 [03/25/2022-13:24:42] [V] [TRT] sections.2.3.conv1.module.weight + QuantizeLinear_586_quantize_scale_node + Conv_590 + Relu_592 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r1s1 Tactic: -1494157908358500249 [03/25/2022-13:24:42] [V] [TRT] Tactic: -1494157908358500249 Time: 0.098816 [03/25/2022-13:24:42] [V] [TRT] sections.2.3.conv1.module.weight + QuantizeLinear_586_quantize_scale_node + Conv_590 + Relu_592 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x4x1_g1_tensor16x8x32_t1r1s1 Tactic: -1328736756812546664 [03/25/2022-13:24:42] [V] [TRT] Tactic: -1328736756812546664 Time: 0.081792 [03/25/2022-13:24:42] [V] [TRT] sections.2.3.conv1.module.weight + QuantizeLinear_586_quantize_scale_node + Conv_590 + Relu_592 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x32x64_stage4_warpsize2x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -1006589727652607355 [03/25/2022-13:24:42] [V] [TRT] Tactic: -1006589727652607355 Time: 0.096896 [03/25/2022-13:24:42] [V] [TRT] sections.2.3.conv1.module.weight + QuantizeLinear_586_quantize_scale_node + Conv_590 + Relu_592 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: -713022856474991236 [03/25/2022-13:24:42] [V] [TRT] Tactic: -713022856474991236 Time: 0.141184 [03/25/2022-13:24:42] [V] [TRT] sections.2.3.conv1.module.weight + QuantizeLinear_586_quantize_scale_node + Conv_590 + Relu_592 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x128_stage3_warpsize1x1x1_g1_sptensor16x8x64_t1r1s1_no_preds Tactic: -619668460699260222 [03/25/2022-13:24:42] [V] [TRT] Tactic: -619668460699260222 Time: 0.093184 [03/25/2022-13:24:42] [V] [TRT] sections.2.3.conv1.module.weight + QuantizeLinear_586_quantize_scale_node + Conv_590 + Relu_592 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x64x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: -405554772060757402 [03/25/2022-13:24:42] [V] [TRT] Tactic: -405554772060757402 Time: 0.069632 [03/25/2022-13:24:42] [V] [TRT] sections.2.3.conv1.module.weight + QuantizeLinear_586_quantize_scale_node + Conv_590 + Relu_592 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: -375949437730908730 [03/25/2022-13:24:42] [V] [TRT] Tactic: -375949437730908730 Time: 0.068864 [03/25/2022-13:24:42] [V] [TRT] sections.2.3.conv1.module.weight + QuantizeLinear_586_quantize_scale_node + Conv_590 + Relu_592 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x128_stage3_warpsize1x2x1_g1_sptensor16x8x64_t1r1s1 Tactic: -233227833606287806 [03/25/2022-13:24:42] [V] [TRT] Tactic: -233227833606287806 Time: 0.07168 [03/25/2022-13:24:42] [V] [TRT] sections.2.3.conv1.module.weight + QuantizeLinear_586_quantize_scale_node + Conv_590 + Relu_592 Set 
Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -111878368089469751 [03/25/2022-13:24:42] [V] [TRT] Tactic: -111878368089469751 Time: 0.080256 [03/25/2022-13:24:42] [V] [TRT] sections.2.3.conv1.module.weight + QuantizeLinear_586_quantize_scale_node + Conv_590 + Relu_592 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x64x64_stage3_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -48936598874722005 [03/25/2022-13:24:42] [V] [TRT] Tactic: -48936598874722005 Time: 0.062208 [03/25/2022-13:24:42] [V] [TRT] sections.2.3.conv1.module.weight + QuantizeLinear_586_quantize_scale_node + Conv_590 + Relu_592 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x32x64_stage4_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -19707840769375107 [03/25/2022-13:24:42] [V] [TRT] Tactic: -19707840769375107 Time: 0.086784 [03/25/2022-13:24:42] [V] [TRT] Fastest Tactic: -4681913707320020520 Time: 0.046336 [03/25/2022-13:24:42] [V] [TRT] >>>>>>>>>>>>>>> Chose Runner Type: CaskConvolution Tactic: -4681913707320020520 [03/25/2022-13:24:42] [V] [TRT] =============== Computing costs for [03/25/2022-13:24:42] [V] [TRT] *************** Autotuning format combination: Int8(12544,196:4,14,1) -> Int8(12544,196:4,14,1) *************** [03/25/2022-13:24:42] [V] [TRT] *************** Autotuning format combination: Int8(12544,196:4,14,1) -> Int8(1568,196:32,14,1) *************** [03/25/2022-13:24:42] [V] [TRT] *************** Autotuning format combination: Int8(1568,196:32,14,1) -> Int8(1568,196:32,14,1) *************** [03/25/2022-13:24:42] [V] [TRT] Skip timing cache hit (epiFadd mismatch) for node: sections.2.3.conv2.module.weight + QuantizeLinear_601_quantize_scale_node + Conv_605 + Relu_607, LayerImpl: CaskConvolution, tactic: 4414594337986714263 [03/25/2022-13:24:42] [V] [TRT] --------------- Timing Runner: sections.2.3.conv2.module.weight + QuantizeLinear_601_quantize_scale_node + Conv_605 + Relu_607 (CudaGroupConvolution) [03/25/2022-13:24:42] [V] [TRT] CudaGroupConvolution has no valid tactics for this config, skipping [03/25/2022-13:24:42] [V] [TRT] --------------- Timing Runner: sections.2.3.conv2.module.weight + QuantizeLinear_601_quantize_scale_node + Conv_605 + Relu_607 (CudaDepthwiseConvolution) [03/25/2022-13:24:42] [V] [TRT] CudaDepthwiseConvolution has no valid tactics for this config, skipping [03/25/2022-13:24:42] [V] [TRT] --------------- Timing Runner: sections.2.3.conv2.module.weight + QuantizeLinear_601_quantize_scale_node + Conv_605 + Relu_607 (FusedConvActConvolution) [03/25/2022-13:24:42] [V] [TRT] FusedConvActConvolution has no valid tactics for this config, skipping [03/25/2022-13:24:42] [V] [TRT] --------------- Timing Runner: sections.2.3.conv2.module.weight + QuantizeLinear_601_quantize_scale_node + Conv_605 + Relu_607 (CaskConvolution) [03/25/2022-13:24:42] [V] [TRT] sections.2.3.conv2.module.weight + QuantizeLinear_601_quantize_scale_node + Conv_605 + Relu_607 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_epifadd Tactic: 177040020707947851 [03/25/2022-13:24:42] [V] [TRT] Tactic: 177040020707947851 Time: 0.189184 [03/25/2022-13:24:42] [V] [TRT] 
sections.2.3.conv2.module.weight + QuantizeLinear_601_quantize_scale_node + Conv_605 + Relu_607 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r3s3 Tactic: 184229963126259101 [03/25/2022-13:24:42] [V] [TRT] Tactic: 184229963126259101 Time: 0.134144 [03/25/2022-13:24:42] [V] [TRT] sections.2.3.conv2.module.weight + QuantizeLinear_601_quantize_scale_node + Conv_605 + Relu_607 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage1_warpsize4x2x1_g1_tensor8x8x16_t1r3s3 Tactic: 289888059097454627 [03/25/2022-13:24:42] [V] [TRT] Tactic: 289888059097454627 Time: 0.168064 [03/25/2022-13:24:42] [V] [TRT] sections.2.3.conv2.module.weight + QuantizeLinear_601_quantize_scale_node + Conv_605 + Relu_607 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32 Tactic: 328135613486708155 [03/25/2022-13:24:42] [V] [TRT] Tactic: 328135613486708155 Time: 0.276992 [03/25/2022-13:24:42] [V] [TRT] sections.2.3.conv2.module.weight + QuantizeLinear_601_quantize_scale_node + Conv_605 + Relu_607 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r3s3 Tactic: 680740992583869928 [03/25/2022-13:24:42] [V] [TRT] Tactic: 680740992583869928 Time: 0.174208 [03/25/2022-13:24:42] [V] [TRT] sections.2.3.conv2.module.weight + QuantizeLinear_601_quantize_scale_node + Conv_605 + Relu_607 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x192x64_stage3_warpsize4x2x1_g1_tensor16x8x32 Tactic: 1111159740952609683 [03/25/2022-13:24:42] [V] [TRT] Tactic: 1111159740952609683 Time: 0.144768 [03/25/2022-13:24:42] [V] [TRT] sections.2.3.conv2.module.weight + QuantizeLinear_601_quantize_scale_node + Conv_605 + Relu_607 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32 Tactic: 1134860903395928905 [03/25/2022-13:24:42] [V] [TRT] Tactic: 1134860903395928905 Time: 0.131072 [03/25/2022-13:24:42] [V] [TRT] sections.2.3.conv2.module.weight + QuantizeLinear_601_quantize_scale_node + Conv_605 + Relu_607 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32 Tactic: 1276591930377039442 [03/25/2022-13:24:42] [V] [TRT] Tactic: 1276591930377039442 Time: 0.139136 [03/25/2022-13:24:42] [V] [TRT] sections.2.3.conv2.module.weight + QuantizeLinear_601_quantize_scale_node + Conv_605 + Relu_607 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage1_warpsize2x1x1_g1_tensor8x8x16_t1r3s3 Tactic: 1388866374720163187 [03/25/2022-13:24:42] [V] [TRT] Tactic: 1388866374720163187 Time: 0.2048 [03/25/2022-13:24:42] [V] [TRT] sections.2.3.conv2.module.weight + QuantizeLinear_601_quantize_scale_node + Conv_605 + Relu_607 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x96x64_stage5_warpsize2x1x1_g1_tensor16x8x32 Tactic: 1399501420456320585 
[03/25/2022-13:24:42] [V] [TRT] Tactic: 1399501420456320585 Time: 0.170752 [03/25/2022-13:24:42] [V] [TRT] sections.2.3.conv2.module.weight + QuantizeLinear_601_quantize_scale_node + Conv_605 + Relu_607 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_epifadd Tactic: 1550399266192842845 [03/25/2022-13:24:42] [V] [TRT] Tactic: 1550399266192842845 Time: 0.173696 [03/25/2022-13:24:42] [V] [TRT] sections.2.3.conv2.module.weight + QuantizeLinear_601_quantize_scale_node + Conv_605 + Relu_607 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x32x64_stage4_warpsize4x1x1_g1_tensor16x8x32_t1r3s3_epifadd Tactic: 1572887561103143487 [03/25/2022-13:24:42] [V] [TRT] Tactic: 1572887561103143487 Time: 0.131072 [03/25/2022-13:24:42] [V] [TRT] sections.2.3.conv2.module.weight + QuantizeLinear_601_quantize_scale_node + Conv_605 + Relu_607 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x96x64_stage5_warpsize2x1x1_g1_tensor16x8x32_t1r3s3 Tactic: 1853122447892949466 [03/25/2022-13:24:42] [V] [TRT] Tactic: 1853122447892949466 Time: 0.170368 [03/25/2022-13:24:42] [V] [TRT] sections.2.3.conv2.module.weight + QuantizeLinear_601_quantize_scale_node + Conv_605 + Relu_607 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage1_warpsize2x2x1_g1_tensor8x8x16 Tactic: 2133329569091732311 [03/25/2022-13:24:42] [V] [TRT] Tactic: 2133329569091732311 Time: 0.168704 [03/25/2022-13:24:42] [V] [TRT] sections.2.3.conv2.module.weight + QuantizeLinear_601_quantize_scale_node + Conv_605 + Relu_607 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage4_warpsize4x1x1_g1_tensor16x8x32 Tactic: 2325023763229477890 [03/25/2022-13:24:42] [V] [TRT] Tactic: 2325023763229477890 Time: 0.095232 [03/25/2022-13:24:42] [V] [TRT] sections.2.3.conv2.module.weight + QuantizeLinear_601_quantize_scale_node + Conv_605 + Relu_607 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16 Tactic: 2579824863892891529 [03/25/2022-13:24:42] [V] [TRT] Tactic: 2579824863892891529 Time: 0.233344 [03/25/2022-13:24:42] [V] [TRT] sections.2.3.conv2.module.weight + QuantizeLinear_601_quantize_scale_node + Conv_605 + Relu_607 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x64x64_stage3_warpsize4x1x1_g1_tensor16x8x32 Tactic: 2783960536172159663 [03/25/2022-13:24:42] [V] [TRT] Tactic: 2783960536172159663 Time: 0.0928 [03/25/2022-13:24:42] [V] [TRT] sections.2.3.conv2.module.weight + QuantizeLinear_601_quantize_scale_node + Conv_605 + Relu_607 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x2x1_g1_tensor16x8x32 Tactic: 2821711838552913693 [03/25/2022-13:24:42] [V] [TRT] Tactic: 2821711838552913693 Time: 0.1376 [03/25/2022-13:24:42] [V] [TRT] sections.2.3.conv2.module.weight + QuantizeLinear_601_quantize_scale_node + Conv_605 + Relu_607 Set Tactic Name: 
sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x128_stage3_warpsize1x2x1_g1_sptensor16x8x64 Tactic: 2945009978756227538 [03/25/2022-13:24:42] [V] [TRT] Tactic: 2945009978756227538 Time: 0.098432 [03/25/2022-13:24:42] [V] [TRT] sections.2.3.conv2.module.weight + QuantizeLinear_601_quantize_scale_node + Conv_605 + Relu_607 Set Tactic Name: ampere_int8_i8816cudnn_int8_128x128_ldg16_relu_medium_nt_v1 Tactic: 2985940154541537814 [03/25/2022-13:24:42] [V] [TRT] Tactic: 2985940154541537814 Time: 0.174208 [03/25/2022-13:24:42] [V] [TRT] sections.2.3.conv2.module.weight + QuantizeLinear_601_quantize_scale_node + Conv_605 + Relu_607 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_t1r3s3_epifadd Tactic: 3284282970967328046 [03/25/2022-13:24:42] [V] [TRT] Tactic: 3284282970967328046 Time: 0.201472 [03/25/2022-13:24:42] [V] [TRT] sections.2.3.conv2.module.weight + QuantizeLinear_601_quantize_scale_node + Conv_605 + Relu_607 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_t1r3s3 Tactic: 3401614690060226673 [03/25/2022-13:24:42] [V] [TRT] Tactic: 3401614690060226673 Time: 0.1696 [03/25/2022-13:24:42] [V] [TRT] sections.2.3.conv2.module.weight + QuantizeLinear_601_quantize_scale_node + Conv_605 + Relu_607 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32 Tactic: 3456719996792527006 [03/25/2022-13:24:42] [V] [TRT] Tactic: 3456719996792527006 Time: 0.123648 [03/25/2022-13:24:42] [V] [TRT] sections.2.3.conv2.module.weight + QuantizeLinear_601_quantize_scale_node + Conv_605 + Relu_607 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r3s3_epifadd Tactic: 3512426920013359699 [03/25/2022-13:24:42] [V] [TRT] Tactic: 3512426920013359699 Time: 0.127232 [03/25/2022-13:24:42] [V] [TRT] sections.2.3.conv2.module.weight + QuantizeLinear_601_quantize_scale_node + Conv_605 + Relu_607 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x128_stage3_warpsize4x2x1_g1_sptensor16x8x64 Tactic: 3651043333819148268 [03/25/2022-13:24:42] [V] [TRT] Tactic: 3651043333819148268 Time: 0.077952 [03/25/2022-13:24:42] [V] [TRT] sections.2.3.conv2.module.weight + QuantizeLinear_601_quantize_scale_node + Conv_605 + Relu_607 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x128_ldg16_relu_medium_nt_v1 Tactic: 3899284354987683408 [03/25/2022-13:24:42] [V] [TRT] Tactic: 3899284354987683408 Time: 0.175744 [03/25/2022-13:24:42] [V] [TRT] sections.2.3.conv2.module.weight + QuantizeLinear_601_quantize_scale_node + Conv_605 + Relu_607 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage6_warpsize2x2x1_g1_tensor16x8x32 Tactic: 4042202769383439184 [03/25/2022-13:24:42] [V] [TRT] Tactic: 4042202769383439184 Time: 0.10624 [03/25/2022-13:24:42] [V] [TRT] sections.2.3.conv2.module.weight + QuantizeLinear_601_quantize_scale_node + Conv_605 + Relu_607 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_large_nt_v1 Tactic: 
4182625619810185112 [03/25/2022-13:24:42] [V] [TRT] Tactic: 4182625619810185112 Time: 0.1888 [03/25/2022-13:24:42] [V] [TRT] sections.2.3.conv2.module.weight + QuantizeLinear_601_quantize_scale_node + Conv_605 + Relu_607 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r3s3 Tactic: 4214794893922618058 [03/25/2022-13:24:42] [V] [TRT] Tactic: 4214794893922618058 Time: 0.167168 [03/25/2022-13:24:42] [V] [TRT] sections.2.3.conv2.module.weight + QuantizeLinear_601_quantize_scale_node + Conv_605 + Relu_607 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x32x64_stage4_warpsize4x1x1_g1_tensor16x8x32_epifadd Tactic: 4259547356717612415 [03/25/2022-13:24:42] [V] [TRT] Tactic: 4259547356717612415 Time: 0.133888 [03/25/2022-13:24:42] [V] [TRT] sections.2.3.conv2.module.weight + QuantizeLinear_601_quantize_scale_node + Conv_605 + Relu_607 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r3s3 Tactic: 4384868749799132354 [03/25/2022-13:24:42] [V] [TRT] Tactic: 4384868749799132354 Time: 0.233728 [03/25/2022-13:24:42] [V] [TRT] sections.2.3.conv2.module.weight + QuantizeLinear_601_quantize_scale_node + Conv_605 + Relu_607 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x128_stage3_warpsize2x2x1_g1_sptensor16x8x64 Tactic: 4414594337986714263 [03/25/2022-13:24:42] [V] [TRT] Tactic: 4414594337986714263 Time: 0.077568 [03/25/2022-13:24:42] [V] [TRT] sections.2.3.conv2.module.weight + QuantizeLinear_601_quantize_scale_node + Conv_605 + Relu_607 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_small_nt_v1 Tactic: 4717285412741024953 [03/25/2022-13:24:42] [V] [TRT] Tactic: 4717285412741024953 Time: 0.171904 [03/25/2022-13:24:42] [V] [TRT] sections.2.3.conv2.module.weight + QuantizeLinear_601_quantize_scale_node + Conv_605 + Relu_607 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage4_warpsize2x4x1_g1_tensor16x8x32 Tactic: 4734519122557206480 [03/25/2022-13:24:42] [V] [TRT] Tactic: 4734519122557206480 Time: 0.09728 [03/25/2022-13:24:42] [V] [TRT] sections.2.3.conv2.module.weight + QuantizeLinear_601_quantize_scale_node + Conv_605 + Relu_607 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32 Tactic: 4922297020351187339 [03/25/2022-13:24:42] [V] [TRT] Tactic: 4922297020351187339 Time: 0.142336 [03/25/2022-13:24:42] [V] [TRT] sections.2.3.conv2.module.weight + QuantizeLinear_601_quantize_scale_node + Conv_605 + Relu_607 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage1_warpsize2x4x1_g1_tensor8x8x16_t1r3s3 Tactic: 4931167631624420067 [03/25/2022-13:24:42] [V] [TRT] Tactic: 4931167631624420067 Time: 0.169088 [03/25/2022-13:24:42] [V] [TRT] sections.2.3.conv2.module.weight + QuantizeLinear_601_quantize_scale_node + Conv_605 + Relu_607 Set Tactic Name: 
sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage4_warpsize2x4x1_g1_tensor16x8x32_epifadd Tactic: 5121596860264626879 [03/25/2022-13:24:42] [V] [TRT] Tactic: 5121596860264626879 Time: 0.093696 [03/25/2022-13:24:42] [V] [TRT] sections.2.3.conv2.module.weight + QuantizeLinear_601_quantize_scale_node + Conv_605 + Relu_607 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_t1r3s3 Tactic: 5136656982162849059 [03/25/2022-13:24:42] [V] [TRT] Tactic: 5136656982162849059 Time: 0.201984 [03/25/2022-13:24:42] [V] [TRT] sections.2.3.conv2.module.weight + QuantizeLinear_601_quantize_scale_node + Conv_605 + Relu_607 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage6_warpsize2x2x1_g1_tensor16x8x32_epifadd Tactic: 5158259316594207439 [03/25/2022-13:24:42] [V] [TRT] Tactic: 5158259316594207439 Time: 0.105344 [03/25/2022-13:24:42] [V] [TRT] sections.2.3.conv2.module.weight + QuantizeLinear_601_quantize_scale_node + Conv_605 + Relu_607 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32_t1r3s3 Tactic: 5189825015507701541 [03/25/2022-13:24:43] [V] [TRT] Tactic: 5189825015507701541 Time: 0.30144 [03/25/2022-13:24:43] [V] [TRT] sections.2.3.conv2.module.weight + QuantizeLinear_601_quantize_scale_node + Conv_605 + Relu_607 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x32x64_stage4_warpsize2x1x1_g1_tensor16x8x32 Tactic: 5424417905073460656 [03/25/2022-13:24:43] [V] [TRT] Tactic: 5424417905073460656 Time: 0.141312 [03/25/2022-13:24:43] [V] [TRT] sections.2.3.conv2.module.weight + QuantizeLinear_601_quantize_scale_node + Conv_605 + Relu_607 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x256x64_stage3_warpsize2x4x1_g1_tensor16x8x32 Tactic: 5442043907221427810 [03/25/2022-13:24:43] [V] [TRT] Tactic: 5442043907221427810 Time: 0.110336 [03/25/2022-13:24:43] [V] [TRT] sections.2.3.conv2.module.weight + QuantizeLinear_601_quantize_scale_node + Conv_605 + Relu_607 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x256x64_stage3_warpsize2x4x1_g1_tensor16x8x32_t1r3s3 Tactic: 5544365258913999384 [03/25/2022-13:24:43] [V] [TRT] Tactic: 5544365258913999384 Time: 0.107904 [03/25/2022-13:24:43] [V] [TRT] sections.2.3.conv2.module.weight + QuantizeLinear_601_quantize_scale_node + Conv_605 + Relu_607 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage1_warpsize2x1x1_g1_tensor8x8x16_t1r3s3 Tactic: 5641967928706599451 [03/25/2022-13:24:43] [V] [TRT] Tactic: 5641967928706599451 Time: 0.270848 [03/25/2022-13:24:43] [V] [TRT] sections.2.3.conv2.module.weight + QuantizeLinear_601_quantize_scale_node + Conv_605 + Relu_607 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x2x1_g1_tensor16x8x32_t1r3s3 Tactic: 5721595115357140131 [03/25/2022-13:24:43] [V] [TRT] Tactic: 5721595115357140131 Time: 0.1344 
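The sweep above is the builder timing every eligible CASK kernel for the fused INT8 convolution sections.2.3.conv2 + Conv_605 + Relu_607: each candidate shows up as a "Set Tactic Name: <kernel> Tactic: <id>" line followed by a "Tactic: <id> Time: <ms>" measurement. Those pairs are easy to scrape from a saved log. A minimal sketch, assuming the log was captured one record per line as trtexec writes it (e.g. `trtexec ... --verbose > build.log 2>&1`; the file name is a placeholder):

```python
import re
from collections import defaultdict

# Rank autotuner candidates per layer from a saved trtexec --verbose log.
NAME_RE = re.compile(r"\[V\] \[TRT\] (.+?) Set Tactic Name: (\S+) Tactic: (-?\d+)")
TIME_RE = re.compile(r"\[V\] \[TRT\] Tactic: (-?\d+) Time: ([0-9.]+)")

kernels = {}                 # tactic id -> kernel name
layer_of = {}                # tactic id -> layer it was last named for
timings = defaultdict(list)  # layer name -> [(ms, tactic id)]

with open("build.log") as f:          # placeholder path
    for line in f:
        if "Fastest Tactic" in line:  # summary lines, not measurements
            continue
        m = NAME_RE.search(line)
        if m:
            layer, kernel, tid = m.groups()
            kernels[tid] = kernel
            layer_of[tid] = layer     # the Time record follows immediately
        m = TIME_RE.search(line)
        if m and m.group(1) in layer_of:
            timings[layer_of[m.group(1)]].append((float(m.group(2)), m.group(1)))

for layer, results in timings.items():
    ms, tid = min(results)
    print(f"{layer}\n  best: {ms:.6f} ms  tactic {tid}  {kernels[tid]}")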
[03/25/2022-13:24:43] [V] [TRT] sections.2.3.conv2.module.weight + QuantizeLinear_601_quantize_scale_node + Conv_605 + Relu_607 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage4_warpsize4x1x1_g1_tensor16x8x32_t1r3s3_epifadd Tactic: 5966973378912044513 [03/25/2022-13:24:43] [V] [TRT] Tactic: 5966973378912044513 Time: 0.093568 [03/25/2022-13:24:43] [V] [TRT] sections.2.3.conv2.module.weight + QuantizeLinear_601_quantize_scale_node + Conv_605 + Relu_607 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x32x64_stage4_warpsize4x1x1_g1_tensor16x8x32_t1r3s3 Tactic: 6004789655466615912 [03/25/2022-13:24:43] [V] [TRT] Tactic: 6004789655466615912 Time: 0.131072 [03/25/2022-13:24:43] [V] [TRT] sections.2.3.conv2.module.weight + QuantizeLinear_601_quantize_scale_node + Conv_605 + Relu_607 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage4_warpsize2x4x1_g1_tensor16x8x32_t1r3s3 Tactic: 6146901278630392829 [03/25/2022-13:24:43] [V] [TRT] Tactic: 6146901278630392829 Time: 0.09728 [03/25/2022-13:24:43] [V] [TRT] sections.2.3.conv2.module.weight + QuantizeLinear_601_quantize_scale_node + Conv_605 + Relu_607 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage1_warpsize2x1x1_g1_tensor8x8x16 Tactic: 6394572396369862482 [03/25/2022-13:24:43] [V] [TRT] Tactic: 6394572396369862482 Time: 0.272256 [03/25/2022-13:24:43] [V] [TRT] sections.2.3.conv2.module.weight + QuantizeLinear_601_quantize_scale_node + Conv_605 + Relu_607 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage4_warpsize4x2x1_g1_tensor16x8x32_epifadd Tactic: 6434020722187266170 [03/25/2022-13:24:43] [V] [TRT] Tactic: 6434020722187266170 Time: 0.092672 [03/25/2022-13:24:43] [V] [TRT] sections.2.3.conv2.module.weight + QuantizeLinear_601_quantize_scale_node + Conv_605 + Relu_607 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32 Tactic: 6781129591847482048 [03/25/2022-13:24:43] [V] [TRT] Tactic: 6781129591847482048 Time: 0.110464 [03/25/2022-13:24:43] [V] [TRT] sections.2.3.conv2.module.weight + QuantizeLinear_601_quantize_scale_node + Conv_605 + Relu_607 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x96x64_stage4_warpsize4x2x1_g1_tensor16x8x32_t1r3s3 Tactic: 6984451771200230840 [03/25/2022-13:24:43] [V] [TRT] Tactic: 6984451771200230840 Time: 0.14144 [03/25/2022-13:24:43] [V] [TRT] sections.2.3.conv2.module.weight + QuantizeLinear_601_quantize_scale_node + Conv_605 + Relu_607 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r3s3 Tactic: 7048234086361926570 [03/25/2022-13:24:43] [V] [TRT] Tactic: 7048234086361926570 Time: 0.17984 [03/25/2022-13:24:43] [V] [TRT] sections.2.3.conv2.module.weight + QuantizeLinear_601_quantize_scale_node + Conv_605 + Relu_607 Set Tactic Name: 
sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x128x64_stage3_warpsize4x2x1_g1_tensor16x8x32 Tactic: 7077570591813340966 [03/25/2022-13:24:43] [V] [TRT] Tactic: 7077570591813340966 Time: 0.106496 [03/25/2022-13:24:43] [V] [TRT] sections.2.3.conv2.module.weight + QuantizeLinear_601_quantize_scale_node + Conv_605 + Relu_607 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r3s3_epifadd Tactic: 7191893591576074000 [03/25/2022-13:24:43] [V] [TRT] Tactic: 7191893591576074000 Time: 0.169856 [03/25/2022-13:24:43] [V] [TRT] sections.2.3.conv2.module.weight + QuantizeLinear_601_quantize_scale_node + Conv_605 + Relu_607 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x4x1_g1_tensor16x8x32_t1r3s3 Tactic: 7429976449747682901 [03/25/2022-13:24:43] [V] [TRT] Tactic: 7429976449747682901 Time: 0.125312 [03/25/2022-13:24:43] [V] [TRT] sections.2.3.conv2.module.weight + QuantizeLinear_601_quantize_scale_node + Conv_605 + Relu_607 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r3s3_epifadd Tactic: 7438984192263206338 [03/25/2022-13:24:43] [V] [TRT] Tactic: 7438984192263206338 Time: 0.104064 [03/25/2022-13:24:43] [V] [TRT] sections.2.3.conv2.module.weight + QuantizeLinear_601_quantize_scale_node + Conv_605 + Relu_607 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32_epifadd Tactic: 7504901284678552178 [03/25/2022-13:24:43] [V] [TRT] Tactic: 7504901284678552178 Time: 0.092288 [03/25/2022-13:24:43] [V] [TRT] sections.2.3.conv2.module.weight + QuantizeLinear_601_quantize_scale_node + Conv_605 + Relu_607 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r3s3 Tactic: 8096257414008860171 [03/25/2022-13:24:43] [V] [TRT] Tactic: 8096257414008860171 Time: 0.107136 [03/25/2022-13:24:43] [V] [TRT] sections.2.3.conv2.module.weight + QuantizeLinear_601_quantize_scale_node + Conv_605 + Relu_607 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x64x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r3s3 Tactic: 8128112048355596715 [03/25/2022-13:24:43] [V] [TRT] Tactic: 8128112048355596715 Time: 0.107264 [03/25/2022-13:24:43] [V] [TRT] sections.2.3.conv2.module.weight + QuantizeLinear_601_quantize_scale_node + Conv_605 + Relu_607 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_indexed_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32_epifadd Tactic: 8751622450593766232 [03/25/2022-13:24:43] [V] [TRT] Tactic: 8751622450593766232 Time: 0.099712 [03/25/2022-13:24:43] [V] [TRT] sections.2.3.conv2.module.weight + QuantizeLinear_601_quantize_scale_node + Conv_605 + Relu_607 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_indexed_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32 Tactic: 9064458886956700976 [03/25/2022-13:24:43] [V] [TRT] Tactic: 
9064458886956700976 Time: 0.101376 [03/25/2022-13:24:43] [V] [TRT] sections.2.3.conv2.module.weight + QuantizeLinear_601_quantize_scale_node + Conv_605 + Relu_607 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_t1r3s3_epifadd Tactic: 9143438935315839085 [03/25/2022-13:24:43] [V] [TRT] Tactic: 9143438935315839085 Time: 0.166656 [03/25/2022-13:24:43] [V] [TRT] sections.2.3.conv2.module.weight + QuantizeLinear_601_quantize_scale_node + Conv_605 + Relu_607 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage4_warpsize4x2x1_g1_tensor16x8x32 Tactic: -9165697322068360861 [03/25/2022-13:24:43] [V] [TRT] Tactic: -9165697322068360861 Time: 0.096128 [03/25/2022-13:24:43] [V] [TRT] sections.2.3.conv2.module.weight + QuantizeLinear_601_quantize_scale_node + Conv_605 + Relu_607 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_singleBuffer_small_nt_v1 Tactic: -9118785798277698619 [03/25/2022-13:24:43] [V] [TRT] Tactic: -9118785798277698619 Time: 0.166912 [03/25/2022-13:24:43] [V] [TRT] sections.2.3.conv2.module.weight + QuantizeLinear_601_quantize_scale_node + Conv_605 + Relu_607 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16 Tactic: -9108166971364503411 [03/25/2022-13:24:43] [V] [TRT] Tactic: -9108166971364503411 Time: 0.182272 [03/25/2022-13:24:43] [V] [TRT] sections.2.3.conv2.module.weight + QuantizeLinear_601_quantize_scale_node + Conv_605 + Relu_607 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage1_warpsize4x2x1_g1_tensor8x8x16 Tactic: -8861822316054763526 [03/25/2022-13:24:43] [V] [TRT] Tactic: -8861822316054763526 Time: 0.169856 [03/25/2022-13:24:43] [V] [TRT] sections.2.3.conv2.module.weight + QuantizeLinear_601_quantize_scale_node + Conv_605 + Relu_607 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x128_stage3_warpsize1x1x1_g1_sptensor16x8x64 Tactic: -8791277710877987710 [03/25/2022-13:24:43] [V] [TRT] Tactic: -8791277710877987710 Time: 0.125696 [03/25/2022-13:24:43] [V] [TRT] sections.2.3.conv2.module.weight + QuantizeLinear_601_quantize_scale_node + Conv_605 + Relu_607 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x128x64_stage3_warpsize2x2x1_g1_tensor16x8x32 Tactic: -8691377209893505057 [03/25/2022-13:24:43] [V] [TRT] Tactic: -8691377209893505057 Time: 0.093184 [03/25/2022-13:24:43] [V] [TRT] sections.2.3.conv2.module.weight + QuantizeLinear_601_quantize_scale_node + Conv_605 + Relu_607 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32 Tactic: -8520292213102999339 [03/25/2022-13:24:43] [V] [TRT] Tactic: -8520292213102999339 Time: 0.14208 [03/25/2022-13:24:43] [V] [TRT] sections.2.3.conv2.module.weight + QuantizeLinear_601_quantize_scale_node + Conv_605 + Relu_607 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage1_warpsize4x1x1_g1_tensor8x8x16_t1r3s3 Tactic: -8475551154769412306 
[03/25/2022-13:24:43] [V] [TRT] Tactic: -8475551154769412306 Time: 0.171648 [03/25/2022-13:24:43] [V] [TRT] sections.2.3.conv2.module.weight + QuantizeLinear_601_quantize_scale_node + Conv_605 + Relu_607 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x192x64_stage3_warpsize4x2x1_g1_tensor16x8x32_t1r3s3 Tactic: -8417388128970254446 [03/25/2022-13:24:43] [V] [TRT] Tactic: -8417388128970254446 Time: 0.142336 [03/25/2022-13:24:43] [V] [TRT] sections.2.3.conv2.module.weight + QuantizeLinear_601_quantize_scale_node + Conv_605 + Relu_607 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32 Tactic: -8263994888336646547 [03/25/2022-13:24:43] [V] [TRT] Tactic: -8263994888336646547 Time: 0.093056 [03/25/2022-13:24:43] [V] [TRT] sections.2.3.conv2.module.weight + QuantizeLinear_601_quantize_scale_node + Conv_605 + Relu_607 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32 Tactic: -8205948405243401049 [03/25/2022-13:24:43] [V] [TRT] Tactic: -8205948405243401049 Time: 0.17408 [03/25/2022-13:24:43] [V] [TRT] sections.2.3.conv2.module.weight + QuantizeLinear_601_quantize_scale_node + Conv_605 + Relu_607 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r3s3_epifadd Tactic: -7992068592656168418 [03/25/2022-13:24:43] [V] [TRT] Tactic: -7992068592656168418 Time: 0.107008 [03/25/2022-13:24:43] [V] [TRT] sections.2.3.conv2.module.weight + QuantizeLinear_601_quantize_scale_node + Conv_605 + Relu_607 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x32x64_stage4_warpsize2x1x1_g1_tensor16x8x32_t1r3s3 Tactic: -7898477046581738867 [03/25/2022-13:24:43] [V] [TRT] Tactic: -7898477046581738867 Time: 0.134784 [03/25/2022-13:24:43] [V] [TRT] sections.2.3.conv2.module.weight + QuantizeLinear_601_quantize_scale_node + Conv_605 + Relu_607 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage4_warpsize4x1x1_g1_tensor16x8x32_epifadd Tactic: -7842775553137511386 [03/25/2022-13:24:43] [V] [TRT] Tactic: -7842775553137511386 Time: 0.09408 [03/25/2022-13:24:43] [V] [TRT] sections.2.3.conv2.module.weight + QuantizeLinear_601_quantize_scale_node + Conv_605 + Relu_607 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32 Tactic: -7683887278997527517 [03/25/2022-13:24:43] [V] [TRT] Tactic: -7683887278997527517 Time: 0.158592 [03/25/2022-13:24:43] [V] [TRT] sections.2.3.conv2.module.weight + QuantizeLinear_601_quantize_scale_node + Conv_605 + Relu_607 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x64x64_stage3_warpsize2x2x1_g1_tensor16x8x32 Tactic: -7381370635708568663 [03/25/2022-13:24:43] [V] [TRT] Tactic: -7381370635708568663 Time: 0.108544 [03/25/2022-13:24:43] [V] [TRT] sections.2.3.conv2.module.weight + QuantizeLinear_601_quantize_scale_node + Conv_605 + Relu_607 Set Tactic Name: 
sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r3s3 Tactic: -7129320389887881029 [03/25/2022-13:24:43] [V] [TRT] Tactic: -7129320389887881029 Time: 0.127104 [03/25/2022-13:24:43] [V] [TRT] sections.2.3.conv2.module.weight + QuantizeLinear_601_quantize_scale_node + Conv_605 + Relu_607 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r3s3 Tactic: -6959995514028471820 [03/25/2022-13:24:43] [V] [TRT] Tactic: -6959995514028471820 Time: 0.149888 [03/25/2022-13:24:43] [V] [TRT] sections.2.3.conv2.module.weight + QuantizeLinear_601_quantize_scale_node + Conv_605 + Relu_607 Set Tactic Name: ampere_int8_i8816cudnn_int8_128x128_ldg16_relu_small_nt_v1 Tactic: -6400348606759295499 [03/25/2022-13:24:43] [V] [TRT] Tactic: -6400348606759295499 Time: 0.169088 [03/25/2022-13:24:43] [V] [TRT] sections.2.3.conv2.module.weight + QuantizeLinear_601_quantize_scale_node + Conv_605 + Relu_607 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r3s3 Tactic: -6371781333659293809 [03/25/2022-13:24:43] [V] [TRT] Tactic: -6371781333659293809 Time: 0.179072 [03/25/2022-13:24:43] [V] [TRT] sections.2.3.conv2.module.weight + QuantizeLinear_601_quantize_scale_node + Conv_605 + Relu_607 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x96x64_stage4_warpsize4x2x1_g1_tensor16x8x32 Tactic: -6256128573036943404 [03/25/2022-13:24:43] [V] [TRT] Tactic: -6256128573036943404 Time: 0.140672 [03/25/2022-13:24:43] [V] [TRT] sections.2.3.conv2.module.weight + QuantizeLinear_601_quantize_scale_node + Conv_605 + Relu_607 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x128_ldg16_relu_small_nt_v1 Tactic: -5980889159865208399 [03/25/2022-13:24:43] [V] [TRT] Tactic: -5980889159865208399 Time: 0.172032 [03/25/2022-13:24:43] [V] [TRT] sections.2.3.conv2.module.weight + QuantizeLinear_601_quantize_scale_node + Conv_605 + Relu_607 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_singleBuffer_medium_nt_v1 Tactic: -5766140806760372989 [03/25/2022-13:24:43] [V] [TRT] Tactic: -5766140806760372989 Time: 0.179968 [03/25/2022-13:24:43] [V] [TRT] sections.2.3.conv2.module.weight + QuantizeLinear_601_quantize_scale_node + Conv_605 + Relu_607 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32_t1r3s3_epifadd Tactic: -5709079507616090666 [03/25/2022-13:24:43] [V] [TRT] Tactic: -5709079507616090666 Time: 0.092288 [03/25/2022-13:24:43] [V] [TRT] sections.2.3.conv2.module.weight + QuantizeLinear_601_quantize_scale_node + Conv_605 + Relu_607 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage4_warpsize2x4x1_g1_tensor16x8x32_t1r3s3_epifadd Tactic: -5698636014239116282 [03/25/2022-13:24:43] [V] [TRT] Tactic: -5698636014239116282 Time: 0.092672 [03/25/2022-13:24:43] [V] [TRT] sections.2.3.conv2.module.weight + QuantizeLinear_601_quantize_scale_node + Conv_605 + Relu_607 Set Tactic Name: 
sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage1_warpsize4x1x1_g1_tensor8x8x16 Tactic: -5180570335464125033 [03/25/2022-13:24:43] [V] [TRT] Tactic: -5180570335464125033 Time: 0.177536 [03/25/2022-13:24:43] [V] [TRT] sections.2.3.conv2.module.weight + QuantizeLinear_601_quantize_scale_node + Conv_605 + Relu_607 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r3s3 Tactic: -4933563390723451692 [03/25/2022-13:24:43] [V] [TRT] Tactic: -4933563390723451692 Time: 0.128128 [03/25/2022-13:24:43] [V] [TRT] sections.2.3.conv2.module.weight + QuantizeLinear_601_quantize_scale_node + Conv_605 + Relu_607 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_medium_nt_v1 Tactic: -4516822589357530549 [03/25/2022-13:24:43] [V] [TRT] Tactic: -4516822589357530549 Time: 0.180992 [03/25/2022-13:24:43] [V] [TRT] sections.2.3.conv2.module.weight + QuantizeLinear_601_quantize_scale_node + Conv_605 + Relu_607 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x16x128_stage3_warpsize1x1x1_g1_sptensor16x8x64 Tactic: -4232916483289779353 [03/25/2022-13:24:43] [V] [TRT] Tactic: -4232916483289779353 Time: 0.200832 [03/25/2022-13:24:43] [V] [TRT] sections.2.3.conv2.module.weight + QuantizeLinear_601_quantize_scale_node + Conv_605 + Relu_607 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x128_stage3_warpsize1x4x1_g1_sptensor16x8x64 Tactic: -3460842194336717186 [03/25/2022-13:24:43] [V] [TRT] Tactic: -3460842194336717186 Time: 0.099456 [03/25/2022-13:24:43] [V] [TRT] sections.2.3.conv2.module.weight + QuantizeLinear_601_quantize_scale_node + Conv_605 + Relu_607 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage4_warpsize4x2x1_g1_tensor16x8x32_t1r3s3_epifadd Tactic: -3413217501222406256 [03/25/2022-13:24:43] [V] [TRT] Tactic: -3413217501222406256 Time: 0.090496 [03/25/2022-13:24:43] [V] [TRT] sections.2.3.conv2.module.weight + QuantizeLinear_601_quantize_scale_node + Conv_605 + Relu_607 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32_t1r3s3 Tactic: -3280888557222886418 [03/25/2022-13:24:43] [V] [TRT] Tactic: -3280888557222886418 Time: 0.112512 [03/25/2022-13:24:43] [V] [TRT] sections.2.3.conv2.module.weight + QuantizeLinear_601_quantize_scale_node + Conv_605 + Relu_607 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r3s3 Tactic: -3238475748440751107 [03/25/2022-13:24:43] [V] [TRT] Tactic: -3238475748440751107 Time: 0.103936 [03/25/2022-13:24:43] [V] [TRT] sections.2.3.conv2.module.weight + QuantizeLinear_601_quantize_scale_node + Conv_605 + Relu_607 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage4_warpsize4x1x1_g1_tensor16x8x32_t1r3s3 Tactic: -3182884991006484042 [03/25/2022-13:24:43] [V] [TRT] Tactic: -3182884991006484042 Time: 0.093952 [03/25/2022-13:24:43] [V] [TRT] sections.2.3.conv2.module.weight + 
QuantizeLinear_601_quantize_scale_node + Conv_605 + Relu_607 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r3s3 Tactic: -3173468756112541306 [03/25/2022-13:24:43] [V] [TRT] Tactic: -3173468756112541306 Time: 0.170368 [03/25/2022-13:24:43] [V] [TRT] sections.2.3.conv2.module.weight + QuantizeLinear_601_quantize_scale_node + Conv_605 + Relu_607 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x128_ldg16_relu_large_nt_v1 Tactic: -2917455979290586480 [03/25/2022-13:24:43] [V] [TRT] Tactic: -2917455979290586480 Time: 0.173056 [03/25/2022-13:24:43] [V] [TRT] sections.2.3.conv2.module.weight + QuantizeLinear_601_quantize_scale_node + Conv_605 + Relu_607 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x128x64_stage3_warpsize4x2x1_g1_tensor16x8x32_t1r3s3 Tactic: -2741641298163591508 [03/25/2022-13:24:43] [V] [TRT] Tactic: -2741641298163591508 Time: 0.105856 [03/25/2022-13:24:43] [V] [TRT] sections.2.3.conv2.module.weight + QuantizeLinear_601_quantize_scale_node + Conv_605 + Relu_607 Set Tactic Name: ampere_int8_i8816cudnn_int8_128x128_ldg16_relu_large_nt_v1 Tactic: -2571022005763160364 [03/25/2022-13:24:43] [V] [TRT] Tactic: -2571022005763160364 Time: 0.179328 [03/25/2022-13:24:43] [V] [TRT] sections.2.3.conv2.module.weight + QuantizeLinear_601_quantize_scale_node + Conv_605 + Relu_607 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage1_warpsize2x2x1_g1_tensor8x8x16 Tactic: -2499089240293650188 [03/25/2022-13:24:43] [V] [TRT] Tactic: -2499089240293650188 Time: 0.175744 [03/25/2022-13:24:43] [V] [TRT] sections.2.3.conv2.module.weight + QuantizeLinear_601_quantize_scale_node + Conv_605 + Relu_607 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16 Tactic: -2328318099174473157 [03/25/2022-13:24:43] [V] [TRT] Tactic: -2328318099174473157 Time: 0.183168 [03/25/2022-13:24:43] [V] [TRT] sections.2.3.conv2.module.weight + QuantizeLinear_601_quantize_scale_node + Conv_605 + Relu_607 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_epifadd Tactic: -2083778562631872334 [03/25/2022-13:24:43] [V] [TRT] Tactic: -2083778562631872334 Time: 0.109312 [03/25/2022-13:24:43] [V] [TRT] sections.2.3.conv2.module.weight + QuantizeLinear_601_quantize_scale_node + Conv_605 + Relu_607 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x4x1_g1_tensor16x8x32 Tactic: -2054375205435666404 [03/25/2022-13:24:43] [V] [TRT] Tactic: -2054375205435666404 Time: 0.126848 [03/25/2022-13:24:43] [V] [TRT] sections.2.3.conv2.module.weight + QuantizeLinear_601_quantize_scale_node + Conv_605 + Relu_607 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32_t1r3s3 Tactic: -1546787387293556842 [03/25/2022-13:24:43] [V] [TRT] Tactic: -1546787387293556842 Time: 0.0928 [03/25/2022-13:24:43] [V] [TRT] sections.2.3.conv2.module.weight + QuantizeLinear_601_quantize_scale_node + Conv_605 + Relu_607 
Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x32x64_stage4_warpsize4x1x1_g1_tensor16x8x32 Tactic: -1498626619443284096 [03/25/2022-13:24:43] [V] [TRT] Tactic: -1498626619443284096 Time: 0.134016 [03/25/2022-13:24:43] [V] [TRT] sections.2.3.conv2.module.weight + QuantizeLinear_601_quantize_scale_node + Conv_605 + Relu_607 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r3s3 Tactic: -1471245223605064669 [03/25/2022-13:24:43] [V] [TRT] Tactic: -1471245223605064669 Time: 0.15488 [03/25/2022-13:24:43] [V] [TRT] sections.2.3.conv2.module.weight + QuantizeLinear_601_quantize_scale_node + Conv_605 + Relu_607 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32 Tactic: -1283580231568512025 [03/25/2022-13:24:43] [V] [TRT] Tactic: -1283580231568512025 Time: 0.205952 [03/25/2022-13:24:43] [V] [TRT] sections.2.3.conv2.module.weight + QuantizeLinear_601_quantize_scale_node + Conv_605 + Relu_607 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x64x64_stage3_warpsize4x1x1_g1_tensor16x8x32_t1r3s3 Tactic: -1224421172675151280 [03/25/2022-13:24:43] [V] [TRT] Tactic: -1224421172675151280 Time: 0.09088 [03/25/2022-13:24:43] [V] [TRT] sections.2.3.conv2.module.weight + QuantizeLinear_601_quantize_scale_node + Conv_605 + Relu_607 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_epifadd Tactic: -1173968681844185579 [03/25/2022-13:24:43] [V] [TRT] Tactic: -1173968681844185579 Time: 0.206976 [03/25/2022-13:24:43] [V] [TRT] sections.2.3.conv2.module.weight + QuantizeLinear_601_quantize_scale_node + Conv_605 + Relu_607 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x128x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r3s3 Tactic: -921247911551089037 [03/25/2022-13:24:43] [V] [TRT] Tactic: -921247911551089037 Time: 0.092288 [03/25/2022-13:24:43] [V] [TRT] sections.2.3.conv2.module.weight + QuantizeLinear_601_quantize_scale_node + Conv_605 + Relu_607 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32 Tactic: -762222380308749469 [03/25/2022-13:24:43] [V] [TRT] Tactic: -762222380308749469 Time: 0.129536 [03/25/2022-13:24:43] [V] [TRT] sections.2.3.conv2.module.weight + QuantizeLinear_601_quantize_scale_node + Conv_605 + Relu_607 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_epifadd Tactic: -556794153877490941 [03/25/2022-13:24:43] [V] [TRT] Tactic: -556794153877490941 Time: 0.13056 [03/25/2022-13:24:43] [V] [TRT] sections.2.3.conv2.module.weight + QuantizeLinear_601_quantize_scale_node + Conv_605 + Relu_607 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage4_warpsize4x2x1_g1_tensor16x8x32_t1r3s3 Tactic: -516725800067794372 [03/25/2022-13:24:43] [V] [TRT] Tactic: -516725800067794372 
Time: 0.094208 [03/25/2022-13:24:43] [V] [TRT] sections.2.3.conv2.module.weight + QuantizeLinear_601_quantize_scale_node + Conv_605 + Relu_607 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_singleBuffer_large_nt_v1 Tactic: -428104331444385564 [03/25/2022-13:24:43] [V] [TRT] Tactic: -428104331444385564 Time: 0.183168 [03/25/2022-13:24:43] [V] [TRT] sections.2.3.conv2.module.weight + QuantizeLinear_601_quantize_scale_node + Conv_605 + Relu_607 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage1_warpsize2x1x1_g1_tensor8x8x16 Tactic: -366411318217594794 [03/25/2022-13:24:43] [V] [TRT] Tactic: -366411318217594794 Time: 0.20544 [03/25/2022-13:24:43] [V] [TRT] sections.2.3.conv2.module.weight + QuantizeLinear_601_quantize_scale_node + Conv_605 + Relu_607 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage1_warpsize2x4x1_g1_tensor8x8x16 Tactic: -351548418071036983 [03/25/2022-13:24:43] [V] [TRT] Tactic: -351548418071036983 Time: 0.17024 [03/25/2022-13:24:43] [V] [TRT] Fastest Tactic: 4414594337986714263 Time: 0.077568 [03/25/2022-13:24:43] [V] [TRT] >>>>>>>>>>>>>>> Chose Runner Type: CaskConvolution Tactic: 4414594337986714263 [03/25/2022-13:24:43] [V] [TRT] =============== Computing costs for [03/25/2022-13:24:43] [V] [TRT] *************** Autotuning format combination: Int8(12544,196:4,14,1), Int8(50176,196:4,14,1) -> Int8(50176,196:4,14,1) *************** [03/25/2022-13:24:43] [V] [TRT] *************** Autotuning format combination: Int8(12544,196:4,14,1), Int8(6272,196:32,14,1) -> Int8(6272,196:32,14,1) *************** [03/25/2022-13:24:43] [V] [TRT] *************** Autotuning format combination: Int8(1568,196:32,14,1), Int8(6272,196:32,14,1) -> Int8(6272,196:32,14,1) *************** [03/25/2022-13:24:43] [V] [TRT] =============== Computing costs for [03/25/2022-13:24:43] [V] [TRT] *************** Autotuning format combination: Int8(50176,196:4,14,1) -> Int8(12544,196:4,14,1) *************** [03/25/2022-13:24:43] [V] [TRT] *************** Autotuning format combination: Int8(50176,196:4,14,1) -> Int8(1568,196:32,14,1) *************** [03/25/2022-13:24:43] [V] [TRT] *************** Autotuning format combination: Int8(6272,196:32,14,1) -> Int8(1568,196:32,14,1) *************** [03/25/2022-13:24:43] [V] [TRT] Skip timing cache hit (epiFadd mismatch) for node: sections.2.4.conv1.module.weight + QuantizeLinear_638_quantize_scale_node + Conv_642 + Relu_644, LayerImpl: CaskConvolution, tactic: 857001784974286465 [03/25/2022-13:24:43] [V] [TRT] --------------- Timing Runner: sections.2.4.conv1.module.weight + QuantizeLinear_638_quantize_scale_node + Conv_642 + Relu_644 (CudaGroupConvolution) [03/25/2022-13:24:43] [V] [TRT] CudaGroupConvolution has no valid tactics for this config, skipping [03/25/2022-13:24:43] [V] [TRT] --------------- Timing Runner: sections.2.4.conv1.module.weight + QuantizeLinear_638_quantize_scale_node + Conv_642 + Relu_644 (CudaDepthwiseConvolution) [03/25/2022-13:24:43] [V] [TRT] CudaDepthwiseConvolution has no valid tactics for this config, skipping [03/25/2022-13:24:43] [V] [TRT] --------------- Timing Runner: sections.2.4.conv1.module.weight + QuantizeLinear_638_quantize_scale_node + Conv_642 + Relu_644 (FusedConvActConvolution) [03/25/2022-13:24:43] [V] [TRT] FusedConvActConvolution has no valid tactics for this config, skipping [03/25/2022-13:24:43] [V] [TRT] --------------- 
Timing Runner: sections.2.4.conv1.module.weight + QuantizeLinear_638_quantize_scale_node + Conv_642 + Relu_644 (CaskConvolution) [03/25/2022-13:24:43] [V] [TRT] sections.2.4.conv1.module.weight + QuantizeLinear_638_quantize_scale_node + Conv_642 + Relu_644 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage1_warpsize4x1x1_g1_tensor8x8x16_t1r1s1 Tactic: 68468667201176803 [03/25/2022-13:24:43] [V] [TRT] Tactic: 68468667201176803 Time: 0.090112 [03/25/2022-13:24:43] [V] [TRT] sections.2.4.conv1.module.weight + QuantizeLinear_638_quantize_scale_node + Conv_642 + Relu_644 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x192x64_stage3_warpsize4x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 125145153013230687 [03/25/2022-13:24:43] [V] [TRT] Tactic: 125145153013230687 Time: 0.081408 [03/25/2022-13:24:43] [V] [TRT] sections.2.4.conv1.module.weight + QuantizeLinear_638_quantize_scale_node + Conv_642 + Relu_644 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x128_ldg16_relu_interior_nt_v1 Tactic: 434957160407688216 [03/25/2022-13:24:43] [V] [TRT] Tactic: 434957160407688216 Time: 0.092928 [03/25/2022-13:24:43] [V] [TRT] sections.2.4.conv1.module.weight + QuantizeLinear_638_quantize_scale_node + Conv_642 + Relu_644 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage4_warpsize4x2x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: 805889586762897346 [03/25/2022-13:24:43] [V] [TRT] Tactic: 805889586762897346 Time: 0.055296 [03/25/2022-13:24:43] [V] [TRT] sections.2.4.conv1.module.weight + QuantizeLinear_638_quantize_scale_node + Conv_642 + Relu_644 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x128_stage3_warpsize4x2x1_g1_sptensor16x8x64_t1r1s1 Tactic: 857001784974286465 [03/25/2022-13:24:43] [V] [TRT] Tactic: 857001784974286465 Time: 0.047104 [03/25/2022-13:24:43] [V] [TRT] sections.2.4.conv1.module.weight + QuantizeLinear_638_quantize_scale_node + Conv_642 + Relu_644 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 1214130898909872671 [03/25/2022-13:24:43] [V] [TRT] Tactic: 1214130898909872671 Time: 0.110336 [03/25/2022-13:24:43] [V] [TRT] sections.2.4.conv1.module.weight + QuantizeLinear_638_quantize_scale_node + Conv_642 + Relu_644 Set Tactic Name: sm80_xmma_conv_fprop_smallk_i8i8_i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_warptilesize16x32_stage5_warpsize1x4x2_r1s1_u1v1_hw0_c1024_scalebias_relu Tactic: 1263683011321748626 [03/25/2022-13:24:43] [V] [TRT] Tactic: 1263683011321748626 Time: 0.047744 [03/25/2022-13:24:43] [V] [TRT] sections.2.4.conv1.module.weight + QuantizeLinear_638_quantize_scale_node + Conv_642 + Relu_644 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage4_warpsize4x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 1278425129871930205 [03/25/2022-13:24:43] [V] [TRT] Tactic: 1278425129871930205 Time: 0.053504 [03/25/2022-13:24:43] [V] [TRT] sections.2.4.conv1.module.weight + QuantizeLinear_638_quantize_scale_node + Conv_642 + Relu_644 Set Tactic Name: 
sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 1583811548148740665 [03/25/2022-13:24:43] [V] [TRT] Tactic: 1583811548148740665 Time: 0.082176 [03/25/2022-13:24:43] [V] [TRT] sections.2.4.conv1.module.weight + QuantizeLinear_638_quantize_scale_node + Conv_642 + Relu_644 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 1701344857577810806 [03/25/2022-13:24:43] [V] [TRT] Tactic: 1701344857577810806 Time: 0.072704 [03/25/2022-13:24:43] [V] [TRT] sections.2.4.conv1.module.weight + QuantizeLinear_638_quantize_scale_node + Conv_642 + Relu_644 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x32x64_stage4_warpsize2x1x1_g1_tensor16x8x32_t1r1s1 Tactic: 1797231177354918208 [03/25/2022-13:24:43] [V] [TRT] Tactic: 1797231177354918208 Time: 0.09792 [03/25/2022-13:24:43] [V] [TRT] sections.2.4.conv1.module.weight + QuantizeLinear_638_quantize_scale_node + Conv_642 + Relu_644 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x192x64_stage3_warpsize4x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 2004812516525036381 [03/25/2022-13:24:43] [V] [TRT] Tactic: 2004812516525036381 Time: 0.0736 [03/25/2022-13:24:43] [V] [TRT] sections.2.4.conv1.module.weight + QuantizeLinear_638_quantize_scale_node + Conv_642 + Relu_644 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x128x64_stage3_warpsize4x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 2030033463723799063 [03/25/2022-13:24:43] [V] [TRT] Tactic: 2030033463723799063 Time: 0.058624 [03/25/2022-13:24:43] [V] [TRT] sections.2.4.conv1.module.weight + QuantizeLinear_638_quantize_scale_node + Conv_642 + Relu_644 Set Tactic Name: ampere_int8_i8816cudnn_int8_128x128_ldg16_relu_interior_nt_v1 Tactic: 2346437292116182513 [03/25/2022-13:24:43] [V] [TRT] Tactic: 2346437292116182513 Time: 0.091648 [03/25/2022-13:24:43] [V] [TRT] sections.2.4.conv1.module.weight + QuantizeLinear_638_quantize_scale_node + Conv_642 + Relu_644 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 2376898825218218566 [03/25/2022-13:24:43] [V] [TRT] Tactic: 2376898825218218566 Time: 0.05376 [03/25/2022-13:24:43] [V] [TRT] sections.2.4.conv1.module.weight + QuantizeLinear_638_quantize_scale_node + Conv_642 + Relu_644 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_singleBuffer_interior_nt_v1 Tactic: 2522133112320625287 [03/25/2022-13:24:43] [V] [TRT] Tactic: 2522133112320625287 Time: 0.089856 [03/25/2022-13:24:43] [V] [TRT] sections.2.4.conv1.module.weight + QuantizeLinear_638_quantize_scale_node + Conv_642 + Relu_644 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage4_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 2548171972648455240 [03/25/2022-13:24:43] [V] [TRT] Tactic: 2548171972648455240 Time: 0.057728 [03/25/2022-13:24:43] [V] [TRT] sections.2.4.conv1.module.weight + QuantizeLinear_638_quantize_scale_node + Conv_642 + 
Relu_644 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage1_warpsize2x1x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: 2548946449357458230 [03/25/2022-13:24:43] [V] [TRT] Tactic: 2548946449357458230 Time: 0.107264 [03/25/2022-13:24:43] [V] [TRT] sections.2.4.conv1.module.weight + QuantizeLinear_638_quantize_scale_node + Conv_642 + Relu_644 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x128_stage3_warpsize1x1x1_g1_sptensor16x8x64_simple_t1r1s1_no_preds Tactic: 2570666021825229009 [03/25/2022-13:24:43] [V] [TRT] Tactic: 2570666021825229009 Time: 0.093696 [03/25/2022-13:24:43] [V] [TRT] sections.2.4.conv1.module.weight + QuantizeLinear_638_quantize_scale_node + Conv_642 + Relu_644 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x16x128_stage3_warpsize1x1x1_g1_sptensor16x8x64_simple_t1r1s1 Tactic: 2678520742286844763 [03/25/2022-13:24:43] [V] [TRT] Tactic: 2678520742286844763 Time: 0.13696 [03/25/2022-13:24:43] [V] [TRT] sections.2.4.conv1.module.weight + QuantizeLinear_638_quantize_scale_node + Conv_642 + Relu_644 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 2756291002030759362 [03/25/2022-13:24:43] [V] [TRT] Tactic: 2756291002030759362 Time: 0.070528 [03/25/2022-13:24:43] [V] [TRT] sections.2.4.conv1.module.weight + QuantizeLinear_638_quantize_scale_node + Conv_642 + Relu_644 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x128x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 2972948223367788520 [03/25/2022-13:24:43] [V] [TRT] Tactic: 2972948223367788520 Time: 0.058752 [03/25/2022-13:24:43] [V] [TRT] sections.2.4.conv1.module.weight + QuantizeLinear_638_quantize_scale_node + Conv_642 + Relu_644 Set Tactic Name: ampere_int8_i8816cudnn_int8_128x128_ldg16_relu_medium_nt_v1 Tactic: 2985940154541537814 [03/25/2022-13:24:43] [V] [TRT] Tactic: 2985940154541537814 Time: 0.091392 [03/25/2022-13:24:43] [V] [TRT] sections.2.4.conv1.module.weight + QuantizeLinear_638_quantize_scale_node + Conv_642 + Relu_644 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x96x64_stage5_warpsize2x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 3043273137345374664 [03/25/2022-13:24:43] [V] [TRT] Tactic: 3043273137345374664 Time: 0.109824 [03/25/2022-13:24:43] [V] [TRT] sections.2.4.conv1.module.weight + QuantizeLinear_638_quantize_scale_node + Conv_642 + Relu_644 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 3221677093659484230 [03/25/2022-13:24:43] [V] [TRT] Tactic: 3221677093659484230 Time: 0.088832 [03/25/2022-13:24:43] [V] [TRT] sections.2.4.conv1.module.weight + QuantizeLinear_638_quantize_scale_node + Conv_642 + Relu_644 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r1s1 Tactic: 3242897809704328258 [03/25/2022-13:24:43] [V] [TRT] Tactic: 3242897809704328258 Time: 0.09408 
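Candidates whose kernel names contain sparse_conv/sptensor are the 2:4 structured-sparsity kernels. For Conv_605 + Relu_607 above, the best sparse tile (tactic 4414594337986714263 at 0.077568 ms) beats the best dense tile (tactic -3413217501222406256 at 0.090496 ms), which is why the "Fastest Tactic" line picks it. Reusing the `timings` and `kernels` dictionaries from the sketch above, the comparison can be automated; the substring test on the kernel name is an assumption based on how the kernels in this log happen to be named:

```python
# Follow-on to the parsing sketch: best sparse vs. best dense per layer.
def is_sparse(kernel_name: str) -> bool:
    # Heuristic keyed to names like sm80_xmma_fprop_sparse_conv_* ... sptensor16x8x64
    return "sparse" in kernel_name or "sptensor" in kernel_name

for layer, results in timings.items():
    sparse = [(ms, t) for ms, t in results if is_sparse(kernels[t])]
    dense = [(ms, t) for ms, t in results if not is_sparse(kernels[t])]
    if sparse and dense:
        s_ms, _ = min(sparse)
        d_ms, _ = min(dense)
        print(f"{layer}: sparse {s_ms:.6f} ms vs dense {d_ms:.6f} ms "
              f"({d_ms / s_ms:.2f}x)")
```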
[03/25/2022-13:24:43] [V] [TRT] sections.2.4.conv1.module.weight + QuantizeLinear_638_quantize_scale_node + Conv_642 + Relu_644 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: 3312456766204252694 [03/25/2022-13:24:43] [V] [TRT] Tactic: 3312456766204252694 Time: 0.118528 [03/25/2022-13:24:43] [V] [TRT] sections.2.4.conv1.module.weight + QuantizeLinear_638_quantize_scale_node + Conv_642 + Relu_644 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 3538565962642681625 [03/25/2022-13:24:43] [V] [TRT] Tactic: 3538565962642681625 Time: 0.080768 [03/25/2022-13:24:43] [V] [TRT] sections.2.4.conv1.module.weight + QuantizeLinear_638_quantize_scale_node + Conv_642 + Relu_644 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage1_warpsize2x2x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: 3541919052468401776 [03/25/2022-13:24:43] [V] [TRT] Tactic: 3541919052468401776 Time: 0.085376 [03/25/2022-13:24:43] [V] [TRT] sections.2.4.conv1.module.weight + QuantizeLinear_638_quantize_scale_node + Conv_642 + Relu_644 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 3593397928177382100 [03/25/2022-13:24:43] [V] [TRT] Tactic: 3593397928177382100 Time: 0.110976 [03/25/2022-13:24:43] [V] [TRT] sections.2.4.conv1.module.weight + QuantizeLinear_638_quantize_scale_node + Conv_642 + Relu_644 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x64x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 3670282018109435863 [03/25/2022-13:24:44] [V] [TRT] Tactic: 3670282018109435863 Time: 0.067712 [03/25/2022-13:24:44] [V] [TRT] sections.2.4.conv1.module.weight + QuantizeLinear_638_quantize_scale_node + Conv_642 + Relu_644 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 3671413346254027573 [03/25/2022-13:24:44] [V] [TRT] Tactic: 3671413346254027573 Time: 0.070784 [03/25/2022-13:24:44] [V] [TRT] sections.2.4.conv1.module.weight + QuantizeLinear_638_quantize_scale_node + Conv_642 + Relu_644 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x128_ldg16_relu_medium_nt_v1 Tactic: 3899284354987683408 [03/25/2022-13:24:44] [V] [TRT] Tactic: 3899284354987683408 Time: 0.092416 [03/25/2022-13:24:44] [V] [TRT] sections.2.4.conv1.module.weight + QuantizeLinear_638_quantize_scale_node + Conv_642 + Relu_644 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: 3927509214678622419 [03/25/2022-13:24:44] [V] [TRT] Tactic: 3927509214678622419 Time: 0.092416 [03/25/2022-13:24:44] [V] [TRT] sections.2.4.conv1.module.weight + QuantizeLinear_638_quantize_scale_node + Conv_642 + Relu_644 Set Tactic Name: 
sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 4112572034735311841 [03/25/2022-13:24:44] [V] [TRT] Tactic: 4112572034735311841 Time: 0.139648 [03/25/2022-13:24:44] [V] [TRT] sections.2.4.conv1.module.weight + QuantizeLinear_638_quantize_scale_node + Conv_642 + Relu_644 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 4239974928951431644 [03/25/2022-13:24:44] [V] [TRT] Tactic: 4239974928951431644 Time: 0.084736 [03/25/2022-13:24:44] [V] [TRT] sections.2.4.conv1.module.weight + QuantizeLinear_638_quantize_scale_node + Conv_642 + Relu_644 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage6_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 4610760414797216079 [03/25/2022-13:24:44] [V] [TRT] Tactic: 4610760414797216079 Time: 0.072448 [03/25/2022-13:24:44] [V] [TRT] sections.2.4.conv1.module.weight + QuantizeLinear_638_quantize_scale_node + Conv_642 + Relu_644 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_small_nt_v1 Tactic: 4717285412741024953 [03/25/2022-13:24:44] [V] [TRT] Tactic: 4717285412741024953 Time: 0.091008 [03/25/2022-13:24:44] [V] [TRT] sections.2.4.conv1.module.weight + QuantizeLinear_638_quantize_scale_node + Conv_642 + Relu_644 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 4796956614760326119 [03/25/2022-13:24:44] [V] [TRT] Tactic: 4796956614760326119 Time: 0.064512 [03/25/2022-13:24:44] [V] [TRT] sections.2.4.conv1.module.weight + QuantizeLinear_638_quantize_scale_node + Conv_642 + Relu_644 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x128_stage3_warpsize2x2x1_g1_sptensor16x8x64_t1r1s1_no_preds Tactic: 4909502217677847353 [03/25/2022-13:24:44] [V] [TRT] Tactic: 4909502217677847353 Time: 0.04928 [03/25/2022-13:24:44] [V] [TRT] sections.2.4.conv1.module.weight + QuantizeLinear_638_quantize_scale_node + Conv_642 + Relu_644 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 4919361344804309192 [03/25/2022-13:24:44] [V] [TRT] Tactic: 4919361344804309192 Time: 0.108672 [03/25/2022-13:24:44] [V] [TRT] sections.2.4.conv1.module.weight + QuantizeLinear_638_quantize_scale_node + Conv_642 + Relu_644 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x128_stage3_warpsize1x2x1_g1_sptensor16x8x64_simple_t1r1s1 Tactic: 5043674678294309681 [03/25/2022-13:24:44] [V] [TRT] Tactic: 5043674678294309681 Time: 0.071168 [03/25/2022-13:24:44] [V] [TRT] sections.2.4.conv1.module.weight + QuantizeLinear_638_quantize_scale_node + Conv_642 + Relu_644 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_interior_nt_v1 Tactic: 5126565865931538390 [03/25/2022-13:24:44] [V] [TRT] Tactic: 5126565865931538390 Time: 0.092416 [03/25/2022-13:24:44] [V] [TRT] sections.2.4.conv1.module.weight + QuantizeLinear_638_quantize_scale_node + Conv_642 + Relu_644 Set 
Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x64x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 5204702486885981735 [03/25/2022-13:24:44] [V] [TRT] Tactic: 5204702486885981735 Time: 0.069376 [03/25/2022-13:24:44] [V] [TRT] sections.2.4.conv1.module.weight + QuantizeLinear_638_quantize_scale_node + Conv_642 + Relu_644 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: 5375256703210220108 [03/25/2022-13:24:44] [V] [TRT] Tactic: 5375256703210220108 Time: 0.066816 [03/25/2022-13:24:44] [V] [TRT] sections.2.4.conv1.module.weight + QuantizeLinear_638_quantize_scale_node + Conv_642 + Relu_644 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x128_stage3_warpsize2x2x1_g1_sptensor16x8x64_simple_t1r1s1 Tactic: 5424258848951129084 [03/25/2022-13:24:44] [V] [TRT] Tactic: 5424258848951129084 Time: 0.049664 [03/25/2022-13:24:44] [V] [TRT] sections.2.4.conv1.module.weight + QuantizeLinear_638_quantize_scale_node + Conv_642 + Relu_644 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 5443897483205284103 [03/25/2022-13:24:44] [V] [TRT] Tactic: 5443897483205284103 Time: 0.071424 [03/25/2022-13:24:44] [V] [TRT] sections.2.4.conv1.module.weight + QuantizeLinear_638_quantize_scale_node + Conv_642 + Relu_644 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage4_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 5707566217891294846 [03/25/2022-13:24:44] [V] [TRT] Tactic: 5707566217891294846 Time: 0.059904 [03/25/2022-13:24:44] [V] [TRT] sections.2.4.conv1.module.weight + QuantizeLinear_638_quantize_scale_node + Conv_642 + Relu_644 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x4x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 5986622376339202983 [03/25/2022-13:24:44] [V] [TRT] Tactic: 5986622376339202983 Time: 0.081152 [03/25/2022-13:24:44] [V] [TRT] sections.2.4.conv1.module.weight + QuantizeLinear_638_quantize_scale_node + Conv_642 + Relu_644 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x128x64_stage3_warpsize4x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 6007888770437705057 [03/25/2022-13:24:44] [V] [TRT] Tactic: 6007888770437705057 Time: 0.062848 [03/25/2022-13:24:44] [V] [TRT] sections.2.4.conv1.module.weight + QuantizeLinear_638_quantize_scale_node + Conv_642 + Relu_644 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32_t1r1s1 Tactic: 6405251167055673379 [03/25/2022-13:24:44] [V] [TRT] Tactic: 6405251167055673379 Time: 0.064768 [03/25/2022-13:24:44] [V] [TRT] sections.2.4.conv1.module.weight + QuantizeLinear_638_quantize_scale_node + Conv_642 + Relu_644 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage4_warpsize4x1x1_g1_tensor16x8x32_t1r1s1 Tactic: 6433368103202497147 
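The same build can be reproduced through the TensorRT Python API instead of trtexec. A minimal sketch matching the flags in this log (--int8 --sparsity=enable, explicit batch, 16 MiB workspace) against the TensorRT 8.2 API; the file paths are placeholders, and a model with dynamic input shapes would additionally need an optimization profile, which is omitted here:

```python
import tensorrt as trt

logger = trt.Logger(trt.Logger.VERBOSE)
builder = trt.Builder(logger)
network = builder.create_network(
    1 << int(trt.NetworkDefinitionCreationFlag.EXPLICIT_BATCH))
parser = trt.OnnxParser(network, logger)

# Parse the already-quantized (Q/DQ) sparse ONNX model.
with open("resnet50_quant_sparse.onnx", "rb") as f:   # placeholder path
    if not parser.parse(f.read()):
        for i in range(parser.num_errors):
            print(parser.get_error(i))
        raise SystemExit("ONNX parse failed")

config = builder.create_builder_config()
config.max_workspace_size = 16 << 20             # 16 MiB, as in this log
config.set_flag(trt.BuilderFlag.INT8)            # scales come from the Q/DQ nodes
config.set_flag(trt.BuilderFlag.SPARSE_WEIGHTS)  # allow 2:4 sparse tactics

engine_bytes = builder.build_serialized_network(network, config)
if engine_bytes is None:
    raise SystemExit("engine build failed")
with open("resnet50_quant_sparse.engine", "wb") as f:  # placeholder path
    f.write(engine_bytes)
```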
[03/25/2022-13:24:44] [V] [TRT] Tactic: 6433368103202497147 Time: 0.061696 [03/25/2022-13:24:44] [V] [TRT] sections.2.4.conv1.module.weight + QuantizeLinear_638_quantize_scale_node + Conv_642 + Relu_644 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 6441948709525127755 [03/25/2022-13:24:44] [V] [TRT] Tactic: 6441948709525127755 Time: 0.112512 [03/25/2022-13:24:44] [V] [TRT] sections.2.4.conv1.module.weight + QuantizeLinear_638_quantize_scale_node + Conv_642 + Relu_644 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x256x64_stage3_warpsize2x4x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 6443933097134654777 [03/25/2022-13:24:44] [V] [TRT] Tactic: 6443933097134654777 Time: 0.06464 [03/25/2022-13:24:44] [V] [TRT] sections.2.4.conv1.module.weight + QuantizeLinear_638_quantize_scale_node + Conv_642 + Relu_644 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 6457435868048963632 [03/25/2022-13:24:44] [V] [TRT] Tactic: 6457435868048963632 Time: 0.072192 [03/25/2022-13:24:44] [V] [TRT] sections.2.4.conv1.module.weight + QuantizeLinear_638_quantize_scale_node + Conv_642 + Relu_644 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x128_stage3_warpsize1x1x1_g1_sptensor16x8x64_t1r1s1 Tactic: 6510345569544721081 [03/25/2022-13:24:44] [V] [TRT] Tactic: 6510345569544721081 Time: 0.094592 [03/25/2022-13:24:44] [V] [TRT] sections.2.4.conv1.module.weight + QuantizeLinear_638_quantize_scale_node + Conv_642 + Relu_644 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 6793988781414507278 [03/25/2022-13:24:44] [V] [TRT] Tactic: 6793988781414507278 Time: 0.055552 [03/25/2022-13:24:44] [V] [TRT] sections.2.4.conv1.module.weight + QuantizeLinear_638_quantize_scale_node + Conv_642 + Relu_644 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 6880710371738875469 [03/25/2022-13:24:44] [V] [TRT] Tactic: 6880710371738875469 Time: 0.076672 [03/25/2022-13:24:44] [V] [TRT] sections.2.4.conv1.module.weight + QuantizeLinear_638_quantize_scale_node + Conv_642 + Relu_644 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 6925201228918187099 [03/25/2022-13:24:44] [V] [TRT] Tactic: 6925201228918187099 Time: 0.056576 [03/25/2022-13:24:44] [V] [TRT] sections.2.4.conv1.module.weight + QuantizeLinear_638_quantize_scale_node + Conv_642 + Relu_644 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 6991524515605108718 [03/25/2022-13:24:44] [V] [TRT] Tactic: 6991524515605108718 Time: 0.088448 [03/25/2022-13:24:44] [V] [TRT] sections.2.4.conv1.module.weight + QuantizeLinear_638_quantize_scale_node + Conv_642 + Relu_644 Set Tactic Name: 
sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x4x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 7245509442265271220 [03/25/2022-13:24:44] [V] [TRT] Tactic: 7245509442265271220 Time: 0.080768 [03/25/2022-13:24:44] [V] [TRT] sections.2.4.conv1.module.weight + QuantizeLinear_638_quantize_scale_node + Conv_642 + Relu_644 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x64x64_stage3_warpsize4x1x1_g1_tensor16x8x32_t1r1s1 Tactic: 7318929579222925725 [03/25/2022-13:24:44] [V] [TRT] Tactic: 7318929579222925725 Time: 0.065024 [03/25/2022-13:24:44] [V] [TRT] sections.2.4.conv1.module.weight + QuantizeLinear_638_quantize_scale_node + Conv_642 + Relu_644 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: 7731430299029542276 [03/25/2022-13:24:44] [V] [TRT] Tactic: 7731430299029542276 Time: 0.055296 [03/25/2022-13:24:44] [V] [TRT] sections.2.4.conv1.module.weight + QuantizeLinear_638_quantize_scale_node + Conv_642 + Relu_644 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage4_warpsize2x4x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: 7738495016763012180 [03/25/2022-13:24:44] [V] [TRT] Tactic: 7738495016763012180 Time: 0.056192 [03/25/2022-13:24:44] [V] [TRT] sections.2.4.conv1.module.weight + QuantizeLinear_638_quantize_scale_node + Conv_642 + Relu_644 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x128_stage3_warpsize1x4x1_g1_sptensor16x8x64_t1r1s1_no_preds Tactic: 7886967395128926382 [03/25/2022-13:24:44] [V] [TRT] Tactic: 7886967395128926382 Time: 0.06208 [03/25/2022-13:24:44] [V] [TRT] sections.2.4.conv1.module.weight + QuantizeLinear_638_quantize_scale_node + Conv_642 + Relu_644 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage4_warpsize4x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 8142283985160822229 [03/25/2022-13:24:44] [V] [TRT] Tactic: 8142283985160822229 Time: 0.057344 [03/25/2022-13:24:44] [V] [TRT] sections.2.4.conv1.module.weight + QuantizeLinear_638_quantize_scale_node + Conv_642 + Relu_644 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage4_warpsize2x4x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 8173975624668590862 [03/25/2022-13:24:44] [V] [TRT] Tactic: 8173975624668590862 Time: 0.058624 [03/25/2022-13:24:44] [V] [TRT] sections.2.4.conv1.module.weight + QuantizeLinear_638_quantize_scale_node + Conv_642 + Relu_644 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage4_warpsize4x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 8234775147403903473 [03/25/2022-13:24:44] [V] [TRT] Tactic: 8234775147403903473 Time: 0.058496 [03/25/2022-13:24:44] [V] [TRT] sections.2.4.conv1.module.weight + QuantizeLinear_638_quantize_scale_node + Conv_642 + Relu_644 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x128x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 8524082966802584889 
[03/25/2022-13:24:44] [V] [TRT] Tactic: 8524082966802584889 Time: 0.06016 [03/25/2022-13:24:44] [V] [TRT] sections.2.4.conv1.module.weight + QuantizeLinear_638_quantize_scale_node + Conv_642 + Relu_644 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage1_warpsize4x1x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: 8684013308930763400 [03/25/2022-13:24:44] [V] [TRT] Tactic: 8684013308930763400 Time: 0.088832 [03/25/2022-13:24:44] [V] [TRT] sections.2.4.conv1.module.weight + QuantizeLinear_638_quantize_scale_node + Conv_642 + Relu_644 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x128x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 8765382722978397630 [03/25/2022-13:24:44] [V] [TRT] Tactic: 8765382722978397630 Time: 0.061184 [03/25/2022-13:24:44] [V] [TRT] sections.2.4.conv1.module.weight + QuantizeLinear_638_quantize_scale_node + Conv_642 + Relu_644 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 8843193587782643431 [03/25/2022-13:24:44] [V] [TRT] Tactic: 8843193587782643431 Time: 0.080128 [03/25/2022-13:24:44] [V] [TRT] sections.2.4.conv1.module.weight + QuantizeLinear_638_quantize_scale_node + Conv_642 + Relu_644 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x128_stage3_warpsize1x4x1_g1_sptensor16x8x64_simple_t1r1s1 Tactic: 8883810517410230831 [03/25/2022-13:24:44] [V] [TRT] Tactic: 8883810517410230831 Time: 0.062592 [03/25/2022-13:24:44] [V] [TRT] sections.2.4.conv1.module.weight + QuantizeLinear_638_quantize_scale_node + Conv_642 + Relu_644 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 8930797211803511337 [03/25/2022-13:24:44] [V] [TRT] Tactic: 8930797211803511337 Time: 0.081792 [03/25/2022-13:24:44] [V] [TRT] sections.2.4.conv1.module.weight + QuantizeLinear_638_quantize_scale_node + Conv_642 + Relu_644 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 8935070489925739043 [03/25/2022-13:24:44] [V] [TRT] Tactic: 8935070489925739043 Time: 0.060032 [03/25/2022-13:24:44] [V] [TRT] sections.2.4.conv1.module.weight + QuantizeLinear_638_quantize_scale_node + Conv_642 + Relu_644 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x16x128_stage3_warpsize1x1x1_g1_sptensor16x8x64_t1r1s1 Tactic: 9062173295331155069 [03/25/2022-13:24:44] [V] [TRT] Tactic: 9062173295331155069 Time: 0.1376 [03/25/2022-13:24:44] [V] [TRT] sections.2.4.conv1.module.weight + QuantizeLinear_638_quantize_scale_node + Conv_642 + Relu_644 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_singleBuffer_small_nt_v1 Tactic: -9118785798277698619 [03/25/2022-13:24:44] [V] [TRT] Tactic: -9118785798277698619 Time: 0.090496 [03/25/2022-13:24:44] [V] [TRT] sections.2.4.conv1.module.weight + QuantizeLinear_638_quantize_scale_node + Conv_642 + Relu_644 Set Tactic Name: 
sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -8985599729413291927 [03/25/2022-13:24:44] [V] [TRT] Tactic: -8985599729413291927 Time: 0.065664 [03/25/2022-13:24:44] [V] [TRT] sections.2.4.conv1.module.weight + QuantizeLinear_638_quantize_scale_node + Conv_642 + Relu_644 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x32x64_stage4_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -8972697510150675429 [03/25/2022-13:24:44] [V] [TRT] Tactic: -8972697510150675429 Time: 0.084096 [03/25/2022-13:24:44] [V] [TRT] sections.2.4.conv1.module.weight + QuantizeLinear_638_quantize_scale_node + Conv_642 + Relu_644 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r1s1 Tactic: -8943710627305202139 [03/25/2022-13:24:44] [V] [TRT] Tactic: -8943710627305202139 Time: 0.092672 [03/25/2022-13:24:44] [V] [TRT] sections.2.4.conv1.module.weight + QuantizeLinear_638_quantize_scale_node + Conv_642 + Relu_644 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: -8859846367886814331 [03/25/2022-13:24:44] [V] [TRT] Tactic: -8859846367886814331 Time: 0.098176 [03/25/2022-13:24:44] [V] [TRT] sections.2.4.conv1.module.weight + QuantizeLinear_638_quantize_scale_node + Conv_642 + Relu_644 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: -8638624340850784688 [03/25/2022-13:24:44] [V] [TRT] Tactic: -8638624340850784688 Time: 0.083072 [03/25/2022-13:24:44] [V] [TRT] sections.2.4.conv1.module.weight + QuantizeLinear_638_quantize_scale_node + Conv_642 + Relu_644 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage4_warpsize4x1x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: -8556775352640313933 [03/25/2022-13:24:44] [V] [TRT] Tactic: -8556775352640313933 Time: 0.05888 [03/25/2022-13:24:44] [V] [TRT] sections.2.4.conv1.module.weight + QuantizeLinear_638_quantize_scale_node + Conv_642 + Relu_644 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage1_warpsize2x1x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: -8382298409581540699 [03/25/2022-13:24:44] [V] [TRT] Tactic: -8382298409581540699 Time: 0.135168 [03/25/2022-13:24:44] [V] [TRT] sections.2.4.conv1.module.weight + QuantizeLinear_638_quantize_scale_node + Conv_642 + Relu_644 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage1_warpsize4x2x1_g1_tensor8x8x16_t1r1s1 Tactic: -8172318747337038866 [03/25/2022-13:24:44] [V] [TRT] Tactic: -8172318747337038866 Time: 0.089344 [03/25/2022-13:24:44] [V] [TRT] sections.2.4.conv1.module.weight + QuantizeLinear_638_quantize_scale_node + Conv_642 + Relu_644 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x256x64_stage3_warpsize2x4x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 
-8038164441468184723 [03/25/2022-13:24:44] [V] [TRT] Tactic: -8038164441468184723 Time: 0.060032 [03/25/2022-13:24:44] [V] [TRT] sections.2.4.conv1.module.weight + QuantizeLinear_638_quantize_scale_node + Conv_642 + Relu_644 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x96x64_stage5_warpsize2x1x1_g1_tensor16x8x32_t1r1s1 Tactic: -7844028314176826857 [03/25/2022-13:24:44] [V] [TRT] Tactic: -7844028314176826857 Time: 0.110848 [03/25/2022-13:24:44] [V] [TRT] sections.2.4.conv1.module.weight + QuantizeLinear_638_quantize_scale_node + Conv_642 + Relu_644 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x128_stage3_warpsize2x2x1_g1_sptensor16x8x64_simple_t1r1s1_no_preds Tactic: -7674507941016740570 [03/25/2022-13:24:44] [V] [TRT] Tactic: -7674507941016740570 Time: 0.049408 [03/25/2022-13:24:44] [V] [TRT] sections.2.4.conv1.module.weight + QuantizeLinear_638_quantize_scale_node + Conv_642 + Relu_644 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage4_warpsize2x4x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -7364286662638617917 [03/25/2022-13:24:44] [V] [TRT] Tactic: -7364286662638617917 Time: 0.054272 [03/25/2022-13:24:44] [V] [TRT] sections.2.4.conv1.module.weight + QuantizeLinear_638_quantize_scale_node + Conv_642 + Relu_644 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage1_warpsize2x1x1_g1_tensor8x8x16_t1r1s1 Tactic: -7361755530333096258 [03/25/2022-13:24:44] [V] [TRT] Tactic: -7361755530333096258 Time: 0.108544 [03/25/2022-13:24:44] [V] [TRT] sections.2.4.conv1.module.weight + QuantizeLinear_638_quantize_scale_node + Conv_642 + Relu_644 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x128_stage3_warpsize1x4x1_g1_sptensor16x8x64_t1r1s1 Tactic: -7289760022626653388 [03/25/2022-13:24:44] [V] [TRT] Tactic: -7289760022626653388 Time: 0.06272 [03/25/2022-13:24:44] [V] [TRT] sections.2.4.conv1.module.weight + QuantizeLinear_638_quantize_scale_node + Conv_642 + Relu_644 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage1_warpsize2x4x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: -7106539943789766885 [03/25/2022-13:24:44] [V] [TRT] Tactic: -7106539943789766885 Time: 0.088192 [03/25/2022-13:24:44] [V] [TRT] sections.2.4.conv1.module.weight + QuantizeLinear_638_quantize_scale_node + Conv_642 + Relu_644 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage1_warpsize4x2x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: -6969478418607271266 [03/25/2022-13:24:44] [V] [TRT] Tactic: -6969478418607271266 Time: 0.08832 [03/25/2022-13:24:44] [V] [TRT] sections.2.4.conv1.module.weight + QuantizeLinear_638_quantize_scale_node + Conv_642 + Relu_644 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -6930438165437733000 [03/25/2022-13:24:44] [V] [TRT] Tactic: -6930438165437733000 Time: 0.138496 [03/25/2022-13:24:44] [V] [TRT] sections.2.4.conv1.module.weight + 
QuantizeLinear_638_quantize_scale_node + Conv_642 + Relu_644 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x64x64_stage3_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -6879607992933502380 [03/25/2022-13:24:44] [V] [TRT] Tactic: -6879607992933502380 Time: 0.064128 [03/25/2022-13:24:44] [V] [TRT] sections.2.4.conv1.module.weight + QuantizeLinear_638_quantize_scale_node + Conv_642 + Relu_644 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x128_stage3_warpsize1x2x1_g1_sptensor16x8x64_simple_t1r1s1_no_preds Tactic: -6839669803644810934 [03/25/2022-13:24:44] [V] [TRT] Tactic: -6839669803644810934 Time: 0.070912 [03/25/2022-13:24:44] [V] [TRT] sections.2.4.conv1.module.weight + QuantizeLinear_638_quantize_scale_node + Conv_642 + Relu_644 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x128_stage3_warpsize1x2x1_g1_sptensor16x8x64_t1r1s1_no_preds Tactic: -6812830108414456369 [03/25/2022-13:24:44] [V] [TRT] Tactic: -6812830108414456369 Time: 0.071168 [03/25/2022-13:24:44] [V] [TRT] sections.2.4.conv1.module.weight + QuantizeLinear_638_quantize_scale_node + Conv_642 + Relu_644 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x128_stage3_warpsize4x2x1_g1_sptensor16x8x64_simple_t1r1s1_no_preds Tactic: -6779804930216439173 [03/25/2022-13:24:44] [V] [TRT] Tactic: -6779804930216439173 Time: 0.046464 [03/25/2022-13:24:44] [V] [TRT] sections.2.4.conv1.module.weight + QuantizeLinear_638_quantize_scale_node + Conv_642 + Relu_644 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: -6527178416855951297 [03/25/2022-13:24:44] [V] [TRT] Tactic: -6527178416855951297 Time: 0.110592 [03/25/2022-13:24:44] [V] [TRT] sections.2.4.conv1.module.weight + QuantizeLinear_638_quantize_scale_node + Conv_642 + Relu_644 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_t1r1s1 Tactic: -6510232214299595844 [03/25/2022-13:24:44] [V] [TRT] Tactic: -6510232214299595844 Time: 0.11008 [03/25/2022-13:24:44] [V] [TRT] sections.2.4.conv1.module.weight + QuantizeLinear_638_quantize_scale_node + Conv_642 + Relu_644 Set Tactic Name: ampere_int8_i8816cudnn_int8_128x128_ldg16_relu_small_nt_v1 Tactic: -6400348606759295499 [03/25/2022-13:24:44] [V] [TRT] Tactic: -6400348606759295499 Time: 0.088576 [03/25/2022-13:24:44] [V] [TRT] sections.2.4.conv1.module.weight + QuantizeLinear_638_quantize_scale_node + Conv_642 + Relu_644 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage1_warpsize2x2x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: -6346247605026339453 [03/25/2022-13:24:44] [V] [TRT] Tactic: -6346247605026339453 Time: 0.09216 [03/25/2022-13:24:44] [V] [TRT] sections.2.4.conv1.module.weight + QuantizeLinear_638_quantize_scale_node + Conv_642 + Relu_644 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x96x64_stage4_warpsize4x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -6232597026469067819 
[03/25/2022-13:24:44] [V] [TRT] Tactic: -6232597026469067819 Time: 0.079488 [03/25/2022-13:24:44] [V] [TRT] sections.2.4.conv1.module.weight + QuantizeLinear_638_quantize_scale_node + Conv_642 + Relu_644 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x128_ldg16_relu_small_nt_v1 Tactic: -5980889159865208399 [03/25/2022-13:24:44] [V] [TRT] Tactic: -5980889159865208399 Time: 0.091008 [03/25/2022-13:24:44] [V] [TRT] sections.2.4.conv1.module.weight + QuantizeLinear_638_quantize_scale_node + Conv_642 + Relu_644 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_singleBuffer_medium_nt_v1 Tactic: -5766140806760372989 [03/25/2022-13:24:44] [V] [TRT] Tactic: -5766140806760372989 Time: 0.09344 [03/25/2022-13:24:44] [V] [TRT] sections.2.4.conv1.module.weight + QuantizeLinear_638_quantize_scale_node + Conv_642 + Relu_644 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r1s1 Tactic: -5697614955743334137 [03/25/2022-13:24:44] [V] [TRT] Tactic: -5697614955743334137 Time: 0.086528 [03/25/2022-13:24:44] [V] [TRT] sections.2.4.conv1.module.weight + QuantizeLinear_638_quantize_scale_node + Conv_642 + Relu_644 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -5671123121710113970 [03/25/2022-13:24:44] [V] [TRT] Tactic: -5671123121710113970 Time: 0.086784 [03/25/2022-13:24:44] [V] [TRT] sections.2.4.conv1.module.weight + QuantizeLinear_638_quantize_scale_node + Conv_642 + Relu_644 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x32x64_stage4_warpsize2x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -5615581362569252260 [03/25/2022-13:24:44] [V] [TRT] Tactic: -5615581362569252260 Time: 0.095488 [03/25/2022-13:24:44] [V] [TRT] sections.2.4.conv1.module.weight + QuantizeLinear_638_quantize_scale_node + Conv_642 + Relu_644 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x128x64_stage3_warpsize4x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -5562968047117507056 [03/25/2022-13:24:44] [V] [TRT] Tactic: -5562968047117507056 Time: 0.062592 [03/25/2022-13:24:44] [V] [TRT] sections.2.4.conv1.module.weight + QuantizeLinear_638_quantize_scale_node + Conv_642 + Relu_644 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -5516472881360101487 [03/25/2022-13:24:44] [V] [TRT] Tactic: -5516472881360101487 Time: 0.082176 [03/25/2022-13:24:44] [V] [TRT] sections.2.4.conv1.module.weight + QuantizeLinear_638_quantize_scale_node + Conv_642 + Relu_644 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r1s1 Tactic: -5311474420963248369 [03/25/2022-13:24:44] [V] [TRT] Tactic: -5311474420963248369 Time: 0.119808 [03/25/2022-13:24:44] [V] [TRT] sections.2.4.conv1.module.weight + QuantizeLinear_638_quantize_scale_node + Conv_642 + Relu_644 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: 
-5170003087447722174 [03/25/2022-13:24:44] [V] [TRT] Tactic: -5170003087447722174 Time: 0.113152 [03/25/2022-13:24:44] [V] [TRT] sections.2.4.conv1.module.weight + QuantizeLinear_638_quantize_scale_node + Conv_642 + Relu_644 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage6_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -4889586143772361690 [03/25/2022-13:24:44] [V] [TRT] Tactic: -4889586143772361690 Time: 0.070272 [03/25/2022-13:24:44] [V] [TRT] sections.2.4.conv1.module.weight + QuantizeLinear_638_quantize_scale_node + Conv_642 + Relu_644 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x128_stage3_warpsize1x4x1_g1_sptensor16x8x64_simple_t1r1s1_no_preds Tactic: -4889498558023475527 [03/25/2022-13:24:44] [V] [TRT] Tactic: -4889498558023475527 Time: 0.061824 [03/25/2022-13:24:44] [V] [TRT] sections.2.4.conv1.module.weight + QuantizeLinear_638_quantize_scale_node + Conv_642 + Relu_644 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: -4849712423393454704 [03/25/2022-13:24:44] [V] [TRT] Tactic: -4849712423393454704 Time: 0.070272 [03/25/2022-13:24:44] [V] [TRT] sections.2.4.conv1.module.weight + QuantizeLinear_638_quantize_scale_node + Conv_642 + Relu_644 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x128_stage3_warpsize4x2x1_g1_sptensor16x8x64_simple_t1r1s1 Tactic: -4681913707320020520 [03/25/2022-13:24:44] [V] [TRT] Tactic: -4681913707320020520 Time: 0.046464 [03/25/2022-13:24:44] [V] [TRT] sections.2.4.conv1.module.weight + QuantizeLinear_638_quantize_scale_node + Conv_642 + Relu_644 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_medium_nt_v1 Tactic: -4516822589357530549 [03/25/2022-13:24:44] [V] [TRT] Tactic: -4516822589357530549 Time: 0.094464 [03/25/2022-13:24:44] [V] [TRT] sections.2.4.conv1.module.weight + QuantizeLinear_638_quantize_scale_node + Conv_642 + Relu_644 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -4455415102719506646 [03/25/2022-13:24:44] [V] [TRT] Tactic: -4455415102719506646 Time: 0.080512 [03/25/2022-13:24:44] [V] [TRT] sections.2.4.conv1.module.weight + QuantizeLinear_638_quantize_scale_node + Conv_642 + Relu_644 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32_t1r1s1 Tactic: -4425346730823666456 [03/25/2022-13:24:44] [V] [TRT] Tactic: -4425346730823666456 Time: 0.085504 [03/25/2022-13:24:44] [V] [TRT] sections.2.4.conv1.module.weight + QuantizeLinear_638_quantize_scale_node + Conv_642 + Relu_644 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage1_warpsize2x1x1_g1_tensor8x8x16_t1r1s1 Tactic: -4260476497340370474 [03/25/2022-13:24:44] [V] [TRT] Tactic: -4260476497340370474 Time: 0.136064 [03/25/2022-13:24:44] [V] [TRT] sections.2.4.conv1.module.weight + QuantizeLinear_638_quantize_scale_node + Conv_642 + Relu_644 Set Tactic Name: 
sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x2x1_g1_tensor16x8x32_t1r1s1 Tactic: -4182501876984672402 [03/25/2022-13:24:44] [V] [TRT] Tactic: -4182501876984672402 Time: 0.086912 [03/25/2022-13:24:44] [V] [TRT] sections.2.4.conv1.module.weight + QuantizeLinear_638_quantize_scale_node + Conv_642 + Relu_644 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x256x64_stage3_warpsize2x4x1_g1_tensor16x8x32_t1r1s1 Tactic: -4151617293257698859 [03/25/2022-13:24:44] [V] [TRT] Tactic: -4151617293257698859 Time: 0.064768 [03/25/2022-13:24:44] [V] [TRT] sections.2.4.conv1.module.weight + QuantizeLinear_638_quantize_scale_node + Conv_642 + Relu_644 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x128_stage3_warpsize4x2x1_g1_sptensor16x8x64_t1r1s1_no_preds Tactic: -3862908719298381451 [03/25/2022-13:24:44] [V] [TRT] Tactic: -3862908719298381451 Time: 0.047232 [03/25/2022-13:24:44] [V] [TRT] sections.2.4.conv1.module.weight + QuantizeLinear_638_quantize_scale_node + Conv_642 + Relu_644 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -3825889760337461729 [03/25/2022-13:24:44] [V] [TRT] Tactic: -3825889760337461729 Time: 0.108544 [03/25/2022-13:24:44] [V] [TRT] sections.2.4.conv1.module.weight + QuantizeLinear_638_quantize_scale_node + Conv_642 + Relu_644 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -3797022944823726673 [03/25/2022-13:24:44] [V] [TRT] Tactic: -3797022944823726673 Time: 0.079104 [03/25/2022-13:24:44] [V] [TRT] sections.2.4.conv1.module.weight + QuantizeLinear_638_quantize_scale_node + Conv_642 + Relu_644 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_t1r1s1 Tactic: -3613322253849278738 [03/25/2022-13:24:44] [V] [TRT] Tactic: -3613322253849278738 Time: 0.1408 [03/25/2022-13:24:44] [V] [TRT] sections.2.4.conv1.module.weight + QuantizeLinear_638_quantize_scale_node + Conv_642 + Relu_644 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x32x64_stage4_warpsize4x1x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: -3577322188448771475 [03/25/2022-13:24:44] [V] [TRT] Tactic: -3577322188448771475 Time: 0.086912 [03/25/2022-13:24:44] [V] [TRT] sections.2.4.conv1.module.weight + QuantizeLinear_638_quantize_scale_node + Conv_642 + Relu_644 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage1_warpsize2x4x1_g1_tensor8x8x16_t1r1s1 Tactic: -3531681826488401618 [03/25/2022-13:24:44] [V] [TRT] Tactic: -3531681826488401618 Time: 0.089344 [03/25/2022-13:24:44] [V] [TRT] sections.2.4.conv1.module.weight + QuantizeLinear_638_quantize_scale_node + Conv_642 + Relu_644 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x16x128_stage3_warpsize1x1x1_g1_sptensor16x8x64_simple_t1r1s1_no_preds Tactic: 
-3305554949874552860 [03/25/2022-13:24:44] [V] [TRT] Tactic: -3305554949874552860 Time: 0.136576 [03/25/2022-13:24:44] [V] [TRT] sections.2.4.conv1.module.weight + QuantizeLinear_638_quantize_scale_node + Conv_642 + Relu_644 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x192x64_stage3_warpsize4x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -3288585994448820820 [03/25/2022-13:24:44] [V] [TRT] Tactic: -3288585994448820820 Time: 0.08064 [03/25/2022-13:24:44] [V] [TRT] sections.2.4.conv1.module.weight + QuantizeLinear_638_quantize_scale_node + Conv_642 + Relu_644 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x32x64_stage4_warpsize4x1x1_g1_tensor16x8x32_t1r1s1 Tactic: -2754311112012636251 [03/25/2022-13:24:44] [V] [TRT] Tactic: -2754311112012636251 Time: 0.088576 [03/25/2022-13:24:44] [V] [TRT] sections.2.4.conv1.module.weight + QuantizeLinear_638_quantize_scale_node + Conv_642 + Relu_644 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -2432868635536396215 [03/25/2022-13:24:44] [V] [TRT] Tactic: -2432868635536396215 Time: 0.081152 [03/25/2022-13:24:44] [V] [TRT] sections.2.4.conv1.module.weight + QuantizeLinear_638_quantize_scale_node + Conv_642 + Relu_644 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x96x64_stage5_warpsize2x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -2379804152300264660 [03/25/2022-13:24:44] [V] [TRT] Tactic: -2379804152300264660 Time: 0.108544 [03/25/2022-13:24:44] [V] [TRT] sections.2.4.conv1.module.weight + QuantizeLinear_638_quantize_scale_node + Conv_642 + Relu_644 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x128_stage3_warpsize2x2x1_g1_sptensor16x8x64_t1r1s1 Tactic: -2352253835013627337 [03/25/2022-13:24:44] [V] [TRT] Tactic: -2352253835013627337 Time: 0.050304 [03/25/2022-13:24:44] [V] [TRT] sections.2.4.conv1.module.weight + QuantizeLinear_638_quantize_scale_node + Conv_642 + Relu_644 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x96x64_stage4_warpsize4x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -2335587136911650799 [03/25/2022-13:24:44] [V] [TRT] Tactic: -2335587136911650799 Time: 0.073472 [03/25/2022-13:24:44] [V] [TRT] sections.2.4.conv1.module.weight + QuantizeLinear_638_quantize_scale_node + Conv_642 + Relu_644 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage4_warpsize2x4x1_g1_tensor16x8x32_t1r1s1 Tactic: -2315453944962430928 [03/25/2022-13:24:44] [V] [TRT] Tactic: -2315453944962430928 Time: 0.059264 [03/25/2022-13:24:44] [V] [TRT] sections.2.4.conv1.module.weight + QuantizeLinear_638_quantize_scale_node + Conv_642 + Relu_644 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x128_stage3_warpsize1x1x1_g1_sptensor16x8x64_simple_t1r1s1 Tactic: -2238364958919154661 [03/25/2022-13:24:44] [V] [TRT] Tactic: -2238364958919154661 Time: 0.094208 [03/25/2022-13:24:44] [V] [TRT] sections.2.4.conv1.module.weight + 
QuantizeLinear_638_quantize_scale_node + Conv_642 + Relu_644 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -1916483171117495388 [03/25/2022-13:24:44] [V] [TRT] Tactic: -1916483171117495388 Time: 0.088832 [03/25/2022-13:24:44] [V] [TRT] sections.2.4.conv1.module.weight + QuantizeLinear_638_quantize_scale_node + Conv_642 + Relu_644 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x16x128_stage3_warpsize1x1x1_g1_sptensor16x8x64_t1r1s1_no_preds Tactic: -1740762957710554518 [03/25/2022-13:24:44] [V] [TRT] Tactic: -1740762957710554518 Time: 0.13696 [03/25/2022-13:24:44] [V] [TRT] sections.2.4.conv1.module.weight + QuantizeLinear_638_quantize_scale_node + Conv_642 + Relu_644 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x96x64_stage4_warpsize4x2x1_g1_tensor16x8x32_t1r1s1 Tactic: -1549742793039499659 [03/25/2022-13:24:44] [V] [TRT] Tactic: -1549742793039499659 Time: 0.08 [03/25/2022-13:24:44] [V] [TRT] sections.2.4.conv1.module.weight + QuantizeLinear_638_quantize_scale_node + Conv_642 + Relu_644 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: -1499578657823798783 [03/25/2022-13:24:44] [V] [TRT] Tactic: -1499578657823798783 Time: 0.081152 [03/25/2022-13:24:44] [V] [TRT] sections.2.4.conv1.module.weight + QuantizeLinear_638_quantize_scale_node + Conv_642 + Relu_644 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r1s1 Tactic: -1494157908358500249 [03/25/2022-13:24:44] [V] [TRT] Tactic: -1494157908358500249 Time: 0.099584 [03/25/2022-13:24:44] [V] [TRT] sections.2.4.conv1.module.weight + QuantizeLinear_638_quantize_scale_node + Conv_642 + Relu_644 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x4x1_g1_tensor16x8x32_t1r1s1 Tactic: -1328736756812546664 [03/25/2022-13:24:44] [V] [TRT] Tactic: -1328736756812546664 Time: 0.081536 [03/25/2022-13:24:44] [V] [TRT] sections.2.4.conv1.module.weight + QuantizeLinear_638_quantize_scale_node + Conv_642 + Relu_644 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x32x64_stage4_warpsize2x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -1006589727652607355 [03/25/2022-13:24:44] [V] [TRT] Tactic: -1006589727652607355 Time: 0.096896 [03/25/2022-13:24:44] [V] [TRT] sections.2.4.conv1.module.weight + QuantizeLinear_638_quantize_scale_node + Conv_642 + Relu_644 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: -713022856474991236 [03/25/2022-13:24:44] [V] [TRT] Tactic: -713022856474991236 Time: 0.141952 [03/25/2022-13:24:44] [V] [TRT] sections.2.4.conv1.module.weight + QuantizeLinear_638_quantize_scale_node + Conv_642 + Relu_644 Set Tactic Name: 
sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x128_stage3_warpsize1x1x1_g1_sptensor16x8x64_t1r1s1_no_preds Tactic: -619668460699260222 [03/25/2022-13:24:44] [V] [TRT] Tactic: -619668460699260222 Time: 0.093568 [03/25/2022-13:24:44] [V] [TRT] sections.2.4.conv1.module.weight + QuantizeLinear_638_quantize_scale_node + Conv_642 + Relu_644 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x64x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: -405554772060757402 [03/25/2022-13:24:44] [V] [TRT] Tactic: -405554772060757402 Time: 0.069888 [03/25/2022-13:24:44] [V] [TRT] sections.2.4.conv1.module.weight + QuantizeLinear_638_quantize_scale_node + Conv_642 + Relu_644 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: -375949437730908730 [03/25/2022-13:24:44] [V] [TRT] Tactic: -375949437730908730 Time: 0.068736 [03/25/2022-13:24:44] [V] [TRT] sections.2.4.conv1.module.weight + QuantizeLinear_638_quantize_scale_node + Conv_642 + Relu_644 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x128_stage3_warpsize1x2x1_g1_sptensor16x8x64_t1r1s1 Tactic: -233227833606287806 [03/25/2022-13:24:44] [V] [TRT] Tactic: -233227833606287806 Time: 0.071936 [03/25/2022-13:24:44] [V] [TRT] sections.2.4.conv1.module.weight + QuantizeLinear_638_quantize_scale_node + Conv_642 + Relu_644 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -111878368089469751 [03/25/2022-13:24:44] [V] [TRT] Tactic: -111878368089469751 Time: 0.080384 [03/25/2022-13:24:44] [V] [TRT] sections.2.4.conv1.module.weight + QuantizeLinear_638_quantize_scale_node + Conv_642 + Relu_644 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x64x64_stage3_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -48936598874722005 [03/25/2022-13:24:44] [V] [TRT] Tactic: -48936598874722005 Time: 0.062208 [03/25/2022-13:24:44] [V] [TRT] sections.2.4.conv1.module.weight + QuantizeLinear_638_quantize_scale_node + Conv_642 + Relu_644 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x32x64_stage4_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -19707840769375107 [03/25/2022-13:24:44] [V] [TRT] Tactic: -19707840769375107 Time: 0.086912 [03/25/2022-13:24:44] [V] [TRT] Fastest Tactic: -6779804930216439173 Time: 0.046464 [03/25/2022-13:24:44] [V] [TRT] >>>>>>>>>>>>>>> Chose Runner Type: CaskConvolution Tactic: -6779804930216439173
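The sweep above ends with the builder picking a 2:4 structured-sparsity kernel for this layer: the winning tactic -6779804930216439173 is sm80_xmma_fprop_sparse_conv_interleaved (tilesize256x128x128, sptensor16x8x64) at 0.046464 ms, while the best dense implicit-GEMM candidate visible in this excerpt times at 0.054272 ms, so the sparse kernel is roughly 14% faster here. To pull the per-layer winners out of a log like this without reading it line by line, a short script is enough. The sketch below is a minimal Python example, assuming only the "Set Tactic Name: <kernel> Tactic: <id>" and "Tactic: <id> Time: <ms>" line format shown above and the log's sequential layout (each timing line follows the Set line of the candidate it measures); the file name trtexec_verbose.log is a placeholder, and the parser is not part of trtexec.

    import re

    SET_RE  = re.compile(r"\[TRT\] (.+?) Set Tactic Name: (\S+) Tactic: (-?\d+)")
    TIME_RE = re.compile(r"\[TRT\] Tactic: (-?\d+) Time: ([0-9.]+)")

    def fastest_per_node(path):
        # Single pass over the log: remember the most recent "Set Tactic Name"
        # entry, then attribute the next "Tactic: <id> Time: <ms>" line to it.
        node = kernel = None
        best = {}  # fused node name -> (time_ms, tactic_id, kernel_name)
        with open(path) as log:
            for line in log:
                m = SET_RE.search(line)
                if m:
                    node, kernel = m.group(1), m.group(2)
                    continue
                m = TIME_RE.search(line)
                if m and node is not None:
                    t = float(m.group(2))
                    if node not in best or t < best[node][0]:
                        best[node] = (t, m.group(1), kernel)
        return best

    if __name__ == "__main__":
        for node, (t, tid, kernel) in fastest_per_node("trtexec_verbose.log").items():
            print(f"{t:.6f} ms  tactic {tid}\n  kernel: {kernel}\n  node:   {node}")

Run over the full log, the entry for sections.2.4.conv1.module.weight + QuantizeLinear_638_quantize_scale_node + Conv_642 + Relu_644 should come out at 0.046464 ms with tactic -6779804930216439173, matching the builder's own "Fastest Tactic" line above.

[03/25/2022-13:24:44] [V] [TRT] =============== Computing costs for [03/25/2022-13:24:44] [V] [TRT] *************** Autotuning format combination: Int8(12544,196:4,14,1) -> Int8(12544,196:4,14,1) *************** [03/25/2022-13:24:44] [V] [TRT] *************** Autotuning format combination: Int8(12544,196:4,14,1) -> Int8(1568,196:32,14,1) *************** [03/25/2022-13:24:44] [V] [TRT] *************** Autotuning format combination: Int8(1568,196:32,14,1) -> Int8(1568,196:32,14,1) *************** [03/25/2022-13:24:44] [V] [TRT] Skip timing cache hit (epiFadd mismatch) for node: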
sections.2.4.conv2.module.weight + QuantizeLinear_653_quantize_scale_node + Conv_657 + Relu_659, LayerImpl: CaskConvolution, tactic: 4414594337986714263 [03/25/2022-13:24:44] [V] [TRT] --------------- Timing Runner: sections.2.4.conv2.module.weight + QuantizeLinear_653_quantize_scale_node + Conv_657 + Relu_659 (CudaGroupConvolution) [03/25/2022-13:24:44] [V] [TRT] CudaGroupConvolution has no valid tactics for this config, skipping [03/25/2022-13:24:44] [V] [TRT] --------------- Timing Runner: sections.2.4.conv2.module.weight + QuantizeLinear_653_quantize_scale_node + Conv_657 + Relu_659 (CudaDepthwiseConvolution) [03/25/2022-13:24:44] [V] [TRT] CudaDepthwiseConvolution has no valid tactics for this config, skipping [03/25/2022-13:24:44] [V] [TRT] --------------- Timing Runner: sections.2.4.conv2.module.weight + QuantizeLinear_653_quantize_scale_node + Conv_657 + Relu_659 (FusedConvActConvolution) [03/25/2022-13:24:44] [V] [TRT] FusedConvActConvolution has no valid tactics for this config, skipping [03/25/2022-13:24:44] [V] [TRT] --------------- Timing Runner: sections.2.4.conv2.module.weight + QuantizeLinear_653_quantize_scale_node + Conv_657 + Relu_659 (CaskConvolution) [03/25/2022-13:24:44] [V] [TRT] sections.2.4.conv2.module.weight + QuantizeLinear_653_quantize_scale_node + Conv_657 + Relu_659 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_epifadd Tactic: 177040020707947851 [03/25/2022-13:24:44] [V] [TRT] Tactic: 177040020707947851 Time: 0.1888 [03/25/2022-13:24:44] [V] [TRT] sections.2.4.conv2.module.weight + QuantizeLinear_653_quantize_scale_node + Conv_657 + Relu_659 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r3s3 Tactic: 184229963126259101 [03/25/2022-13:24:44] [V] [TRT] Tactic: 184229963126259101 Time: 0.133888 [03/25/2022-13:24:44] [V] [TRT] sections.2.4.conv2.module.weight + QuantizeLinear_653_quantize_scale_node + Conv_657 + Relu_659 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage1_warpsize4x2x1_g1_tensor8x8x16_t1r3s3 Tactic: 289888059097454627 [03/25/2022-13:24:45] [V] [TRT] Tactic: 289888059097454627 Time: 0.167936 [03/25/2022-13:24:45] [V] [TRT] sections.2.4.conv2.module.weight + QuantizeLinear_653_quantize_scale_node + Conv_657 + Relu_659 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32 Tactic: 328135613486708155 [03/25/2022-13:24:45] [V] [TRT] Tactic: 328135613486708155 Time: 0.27648 [03/25/2022-13:24:45] [V] [TRT] sections.2.4.conv2.module.weight + QuantizeLinear_653_quantize_scale_node + Conv_657 + Relu_659 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r3s3 Tactic: 680740992583869928 [03/25/2022-13:24:45] [V] [TRT] Tactic: 680740992583869928 Time: 0.174464 [03/25/2022-13:24:45] [V] [TRT] sections.2.4.conv2.module.weight + QuantizeLinear_653_quantize_scale_node + Conv_657 + Relu_659 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x192x64_stage3_warpsize4x2x1_g1_tensor16x8x32 Tactic: 
1111159740952609683 [03/25/2022-13:24:45] [V] [TRT] Tactic: 1111159740952609683 Time: 0.14464 [03/25/2022-13:24:45] [V] [TRT] sections.2.4.conv2.module.weight + QuantizeLinear_653_quantize_scale_node + Conv_657 + Relu_659 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32 Tactic: 1134860903395928905 [03/25/2022-13:24:45] [V] [TRT] Tactic: 1134860903395928905 Time: 0.131072 [03/25/2022-13:24:45] [V] [TRT] sections.2.4.conv2.module.weight + QuantizeLinear_653_quantize_scale_node + Conv_657 + Relu_659 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32 Tactic: 1276591930377039442 [03/25/2022-13:24:45] [V] [TRT] Tactic: 1276591930377039442 Time: 0.139904 [03/25/2022-13:24:45] [V] [TRT] sections.2.4.conv2.module.weight + QuantizeLinear_653_quantize_scale_node + Conv_657 + Relu_659 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage1_warpsize2x1x1_g1_tensor8x8x16_t1r3s3 Tactic: 1388866374720163187 [03/25/2022-13:24:45] [V] [TRT] Tactic: 1388866374720163187 Time: 0.204416 [03/25/2022-13:24:45] [V] [TRT] sections.2.4.conv2.module.weight + QuantizeLinear_653_quantize_scale_node + Conv_657 + Relu_659 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x96x64_stage5_warpsize2x1x1_g1_tensor16x8x32 Tactic: 1399501420456320585 [03/25/2022-13:24:45] [V] [TRT] Tactic: 1399501420456320585 Time: 0.17088 [03/25/2022-13:24:45] [V] [TRT] sections.2.4.conv2.module.weight + QuantizeLinear_653_quantize_scale_node + Conv_657 + Relu_659 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_epifadd Tactic: 1550399266192842845 [03/25/2022-13:24:45] [V] [TRT] Tactic: 1550399266192842845 Time: 0.17344 [03/25/2022-13:24:45] [V] [TRT] sections.2.4.conv2.module.weight + QuantizeLinear_653_quantize_scale_node + Conv_657 + Relu_659 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x32x64_stage4_warpsize4x1x1_g1_tensor16x8x32_t1r3s3_epifadd Tactic: 1572887561103143487 [03/25/2022-13:24:45] [V] [TRT] Tactic: 1572887561103143487 Time: 0.1312 [03/25/2022-13:24:45] [V] [TRT] sections.2.4.conv2.module.weight + QuantizeLinear_653_quantize_scale_node + Conv_657 + Relu_659 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x96x64_stage5_warpsize2x1x1_g1_tensor16x8x32_t1r3s3 Tactic: 1853122447892949466 [03/25/2022-13:24:45] [V] [TRT] Tactic: 1853122447892949466 Time: 0.170112 [03/25/2022-13:24:45] [V] [TRT] sections.2.4.conv2.module.weight + QuantizeLinear_653_quantize_scale_node + Conv_657 + Relu_659 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage1_warpsize2x2x1_g1_tensor8x8x16 Tactic: 2133329569091732311 [03/25/2022-13:24:45] [V] [TRT] Tactic: 2133329569091732311 Time: 0.168832 [03/25/2022-13:24:45] [V] [TRT] sections.2.4.conv2.module.weight + QuantizeLinear_653_quantize_scale_node + Conv_657 + Relu_659 Set Tactic Name: 
sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage4_warpsize4x1x1_g1_tensor16x8x32 Tactic: 2325023763229477890 [03/25/2022-13:24:45] [V] [TRT] Tactic: 2325023763229477890 Time: 0.095232 [03/25/2022-13:24:45] [V] [TRT] sections.2.4.conv2.module.weight + QuantizeLinear_653_quantize_scale_node + Conv_657 + Relu_659 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16 Tactic: 2579824863892891529 [03/25/2022-13:24:45] [V] [TRT] Tactic: 2579824863892891529 Time: 0.234112 [03/25/2022-13:24:45] [V] [TRT] sections.2.4.conv2.module.weight + QuantizeLinear_653_quantize_scale_node + Conv_657 + Relu_659 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x64x64_stage3_warpsize4x1x1_g1_tensor16x8x32 Tactic: 2783960536172159663 [03/25/2022-13:24:45] [V] [TRT] Tactic: 2783960536172159663 Time: 0.0928 [03/25/2022-13:24:45] [V] [TRT] sections.2.4.conv2.module.weight + QuantizeLinear_653_quantize_scale_node + Conv_657 + Relu_659 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x2x1_g1_tensor16x8x32 Tactic: 2821711838552913693 [03/25/2022-13:24:45] [V] [TRT] Tactic: 2821711838552913693 Time: 0.13696 [03/25/2022-13:24:45] [V] [TRT] sections.2.4.conv2.module.weight + QuantizeLinear_653_quantize_scale_node + Conv_657 + Relu_659 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x128_stage3_warpsize1x2x1_g1_sptensor16x8x64 Tactic: 2945009978756227538 [03/25/2022-13:24:45] [V] [TRT] Tactic: 2945009978756227538 Time: 0.098176 [03/25/2022-13:24:45] [V] [TRT] sections.2.4.conv2.module.weight + QuantizeLinear_653_quantize_scale_node + Conv_657 + Relu_659 Set Tactic Name: ampere_int8_i8816cudnn_int8_128x128_ldg16_relu_medium_nt_v1 Tactic: 2985940154541537814 [03/25/2022-13:24:45] [V] [TRT] Tactic: 2985940154541537814 Time: 0.174336 [03/25/2022-13:24:45] [V] [TRT] sections.2.4.conv2.module.weight + QuantizeLinear_653_quantize_scale_node + Conv_657 + Relu_659 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_t1r3s3_epifadd Tactic: 3284282970967328046 [03/25/2022-13:24:45] [V] [TRT] Tactic: 3284282970967328046 Time: 0.201728 [03/25/2022-13:24:45] [V] [TRT] sections.2.4.conv2.module.weight + QuantizeLinear_653_quantize_scale_node + Conv_657 + Relu_659 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_t1r3s3 Tactic: 3401614690060226673 [03/25/2022-13:24:45] [V] [TRT] Tactic: 3401614690060226673 Time: 0.169472 [03/25/2022-13:24:45] [V] [TRT] sections.2.4.conv2.module.weight + QuantizeLinear_653_quantize_scale_node + Conv_657 + Relu_659 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32 Tactic: 3456719996792527006 [03/25/2022-13:24:45] [V] [TRT] Tactic: 3456719996792527006 Time: 0.123776 [03/25/2022-13:24:45] [V] [TRT] sections.2.4.conv2.module.weight + QuantizeLinear_653_quantize_scale_node + Conv_657 + Relu_659 Set 
Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r3s3_epifadd Tactic: 3512426920013359699 [03/25/2022-13:24:45] [V] [TRT] Tactic: 3512426920013359699 Time: 0.12736 [03/25/2022-13:24:45] [V] [TRT] sections.2.4.conv2.module.weight + QuantizeLinear_653_quantize_scale_node + Conv_657 + Relu_659 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x128_stage3_warpsize4x2x1_g1_sptensor16x8x64 Tactic: 3651043333819148268 [03/25/2022-13:24:45] [V] [TRT] Tactic: 3651043333819148268 Time: 0.07808 [03/25/2022-13:24:45] [V] [TRT] sections.2.4.conv2.module.weight + QuantizeLinear_653_quantize_scale_node + Conv_657 + Relu_659 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x128_ldg16_relu_medium_nt_v1 Tactic: 3899284354987683408 [03/25/2022-13:24:45] [V] [TRT] Tactic: 3899284354987683408 Time: 0.175616 [03/25/2022-13:24:45] [V] [TRT] sections.2.4.conv2.module.weight + QuantizeLinear_653_quantize_scale_node + Conv_657 + Relu_659 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage6_warpsize2x2x1_g1_tensor16x8x32 Tactic: 4042202769383439184 [03/25/2022-13:24:45] [V] [TRT] Tactic: 4042202769383439184 Time: 0.106112 [03/25/2022-13:24:45] [V] [TRT] sections.2.4.conv2.module.weight + QuantizeLinear_653_quantize_scale_node + Conv_657 + Relu_659 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_large_nt_v1 Tactic: 4182625619810185112 [03/25/2022-13:24:45] [V] [TRT] Tactic: 4182625619810185112 Time: 0.188672 [03/25/2022-13:24:45] [V] [TRT] sections.2.4.conv2.module.weight + QuantizeLinear_653_quantize_scale_node + Conv_657 + Relu_659 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r3s3 Tactic: 4214794893922618058 [03/25/2022-13:24:45] [V] [TRT] Tactic: 4214794893922618058 Time: 0.16704 [03/25/2022-13:24:45] [V] [TRT] sections.2.4.conv2.module.weight + QuantizeLinear_653_quantize_scale_node + Conv_657 + Relu_659 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x32x64_stage4_warpsize4x1x1_g1_tensor16x8x32_epifadd Tactic: 4259547356717612415 [03/25/2022-13:24:45] [V] [TRT] Tactic: 4259547356717612415 Time: 0.133888 [03/25/2022-13:24:45] [V] [TRT] sections.2.4.conv2.module.weight + QuantizeLinear_653_quantize_scale_node + Conv_657 + Relu_659 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r3s3 Tactic: 4384868749799132354 [03/25/2022-13:24:45] [V] [TRT] Tactic: 4384868749799132354 Time: 0.233344 [03/25/2022-13:24:45] [V] [TRT] sections.2.4.conv2.module.weight + QuantizeLinear_653_quantize_scale_node + Conv_657 + Relu_659 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x128_stage3_warpsize2x2x1_g1_sptensor16x8x64 Tactic: 4414594337986714263 [03/25/2022-13:24:45] [V] [TRT] Tactic: 4414594337986714263 Time: 0.077824 [03/25/2022-13:24:45] [V] [TRT] sections.2.4.conv2.module.weight + QuantizeLinear_653_quantize_scale_node + Conv_657 + Relu_659 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_small_nt_v1 
Tactic: 4717285412741024953 [03/25/2022-13:24:45] [V] [TRT] Tactic: 4717285412741024953 Time: 0.172032 [03/25/2022-13:24:45] [V] [TRT] sections.2.4.conv2.module.weight + QuantizeLinear_653_quantize_scale_node + Conv_657 + Relu_659 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage4_warpsize2x4x1_g1_tensor16x8x32 Tactic: 4734519122557206480 [03/25/2022-13:24:45] [V] [TRT] Tactic: 4734519122557206480 Time: 0.09728 [03/25/2022-13:24:45] [V] [TRT] sections.2.4.conv2.module.weight + QuantizeLinear_653_quantize_scale_node + Conv_657 + Relu_659 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32 Tactic: 4922297020351187339 [03/25/2022-13:24:45] [V] [TRT] Tactic: 4922297020351187339 Time: 0.142208 [03/25/2022-13:24:45] [V] [TRT] sections.2.4.conv2.module.weight + QuantizeLinear_653_quantize_scale_node + Conv_657 + Relu_659 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage1_warpsize2x4x1_g1_tensor8x8x16_t1r3s3 Tactic: 4931167631624420067 [03/25/2022-13:24:45] [V] [TRT] Tactic: 4931167631624420067 Time: 0.168704 [03/25/2022-13:24:45] [V] [TRT] sections.2.4.conv2.module.weight + QuantizeLinear_653_quantize_scale_node + Conv_657 + Relu_659 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage4_warpsize2x4x1_g1_tensor16x8x32_epifadd Tactic: 5121596860264626879 [03/25/2022-13:24:45] [V] [TRT] Tactic: 5121596860264626879 Time: 0.093952 [03/25/2022-13:24:45] [V] [TRT] sections.2.4.conv2.module.weight + QuantizeLinear_653_quantize_scale_node + Conv_657 + Relu_659 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_t1r3s3 Tactic: 5136656982162849059 [03/25/2022-13:24:45] [V] [TRT] Tactic: 5136656982162849059 Time: 0.201984 [03/25/2022-13:24:45] [V] [TRT] sections.2.4.conv2.module.weight + QuantizeLinear_653_quantize_scale_node + Conv_657 + Relu_659 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage6_warpsize2x2x1_g1_tensor16x8x32_epifadd Tactic: 5158259316594207439 [03/25/2022-13:24:45] [V] [TRT] Tactic: 5158259316594207439 Time: 0.105216 [03/25/2022-13:24:45] [V] [TRT] sections.2.4.conv2.module.weight + QuantizeLinear_653_quantize_scale_node + Conv_657 + Relu_659 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32_t1r3s3 Tactic: 5189825015507701541 [03/25/2022-13:24:45] [V] [TRT] Tactic: 5189825015507701541 Time: 0.30208 [03/25/2022-13:24:45] [V] [TRT] sections.2.4.conv2.module.weight + QuantizeLinear_653_quantize_scale_node + Conv_657 + Relu_659 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x32x64_stage4_warpsize2x1x1_g1_tensor16x8x32 Tactic: 5424417905073460656 [03/25/2022-13:24:45] [V] [TRT] Tactic: 5424417905073460656 Time: 0.140928 [03/25/2022-13:24:45] [V] [TRT] sections.2.4.conv2.module.weight + QuantizeLinear_653_quantize_scale_node + Conv_657 + Relu_659 Set Tactic Name: 
sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x256x64_stage3_warpsize2x4x1_g1_tensor16x8x32 Tactic: 5442043907221427810 [03/25/2022-13:24:45] [V] [TRT] Tactic: 5442043907221427810 Time: 0.110336 [03/25/2022-13:24:45] [V] [TRT] sections.2.4.conv2.module.weight + QuantizeLinear_653_quantize_scale_node + Conv_657 + Relu_659 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x256x64_stage3_warpsize2x4x1_g1_tensor16x8x32_t1r3s3 Tactic: 5544365258913999384 [03/25/2022-13:24:45] [V] [TRT] Tactic: 5544365258913999384 Time: 0.107648 [03/25/2022-13:24:45] [V] [TRT] sections.2.4.conv2.module.weight + QuantizeLinear_653_quantize_scale_node + Conv_657 + Relu_659 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage1_warpsize2x1x1_g1_tensor8x8x16_t1r3s3 Tactic: 5641967928706599451 [03/25/2022-13:24:45] [V] [TRT] Tactic: 5641967928706599451 Time: 0.270336 [03/25/2022-13:24:45] [V] [TRT] sections.2.4.conv2.module.weight + QuantizeLinear_653_quantize_scale_node + Conv_657 + Relu_659 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x2x1_g1_tensor16x8x32_t1r3s3 Tactic: 5721595115357140131 [03/25/2022-13:24:45] [V] [TRT] Tactic: 5721595115357140131 Time: 0.134144 [03/25/2022-13:24:45] [V] [TRT] sections.2.4.conv2.module.weight + QuantizeLinear_653_quantize_scale_node + Conv_657 + Relu_659 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage4_warpsize4x1x1_g1_tensor16x8x32_t1r3s3_epifadd Tactic: 5966973378912044513 [03/25/2022-13:24:45] [V] [TRT] Tactic: 5966973378912044513 Time: 0.093184 [03/25/2022-13:24:45] [V] [TRT] sections.2.4.conv2.module.weight + QuantizeLinear_653_quantize_scale_node + Conv_657 + Relu_659 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x32x64_stage4_warpsize4x1x1_g1_tensor16x8x32_t1r3s3 Tactic: 6004789655466615912 [03/25/2022-13:24:45] [V] [TRT] Tactic: 6004789655466615912 Time: 0.130944 [03/25/2022-13:24:45] [V] [TRT] sections.2.4.conv2.module.weight + QuantizeLinear_653_quantize_scale_node + Conv_657 + Relu_659 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage4_warpsize2x4x1_g1_tensor16x8x32_t1r3s3 Tactic: 6146901278630392829 [03/25/2022-13:24:45] [V] [TRT] Tactic: 6146901278630392829 Time: 0.097152 [03/25/2022-13:24:45] [V] [TRT] sections.2.4.conv2.module.weight + QuantizeLinear_653_quantize_scale_node + Conv_657 + Relu_659 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage1_warpsize2x1x1_g1_tensor8x8x16 Tactic: 6394572396369862482 [03/25/2022-13:24:45] [V] [TRT] Tactic: 6394572396369862482 Time: 0.272256 [03/25/2022-13:24:45] [V] [TRT] sections.2.4.conv2.module.weight + QuantizeLinear_653_quantize_scale_node + Conv_657 + Relu_659 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage4_warpsize4x2x1_g1_tensor16x8x32_epifadd Tactic: 6434020722187266170 [03/25/2022-13:24:45] [V] [TRT] Tactic: 6434020722187266170 Time: 
0.092416 [03/25/2022-13:24:45] [V] [TRT] sections.2.4.conv2.module.weight + QuantizeLinear_653_quantize_scale_node + Conv_657 + Relu_659 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32 Tactic: 6781129591847482048 [03/25/2022-13:24:45] [V] [TRT] Tactic: 6781129591847482048 Time: 0.110464 [03/25/2022-13:24:45] [V] [TRT] sections.2.4.conv2.module.weight + QuantizeLinear_653_quantize_scale_node + Conv_657 + Relu_659 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x96x64_stage4_warpsize4x2x1_g1_tensor16x8x32_t1r3s3 Tactic: 6984451771200230840 [03/25/2022-13:24:45] [V] [TRT] Tactic: 6984451771200230840 Time: 0.141568 [03/25/2022-13:24:45] [V] [TRT] sections.2.4.conv2.module.weight + QuantizeLinear_653_quantize_scale_node + Conv_657 + Relu_659 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r3s3 Tactic: 7048234086361926570 [03/25/2022-13:24:45] [V] [TRT] Tactic: 7048234086361926570 Time: 0.17984 [03/25/2022-13:24:45] [V] [TRT] sections.2.4.conv2.module.weight + QuantizeLinear_653_quantize_scale_node + Conv_657 + Relu_659 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x128x64_stage3_warpsize4x2x1_g1_tensor16x8x32 Tactic: 7077570591813340966 [03/25/2022-13:24:45] [V] [TRT] Tactic: 7077570591813340966 Time: 0.106624 [03/25/2022-13:24:45] [V] [TRT] sections.2.4.conv2.module.weight + QuantizeLinear_653_quantize_scale_node + Conv_657 + Relu_659 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r3s3_epifadd Tactic: 7191893591576074000 [03/25/2022-13:24:45] [V] [TRT] Tactic: 7191893591576074000 Time: 0.169856 [03/25/2022-13:24:45] [V] [TRT] sections.2.4.conv2.module.weight + QuantizeLinear_653_quantize_scale_node + Conv_657 + Relu_659 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x4x1_g1_tensor16x8x32_t1r3s3 Tactic: 7429976449747682901 [03/25/2022-13:24:45] [V] [TRT] Tactic: 7429976449747682901 Time: 0.125696 [03/25/2022-13:24:45] [V] [TRT] sections.2.4.conv2.module.weight + QuantizeLinear_653_quantize_scale_node + Conv_657 + Relu_659 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r3s3_epifadd Tactic: 7438984192263206338 [03/25/2022-13:24:45] [V] [TRT] Tactic: 7438984192263206338 Time: 0.103808 [03/25/2022-13:24:45] [V] [TRT] sections.2.4.conv2.module.weight + QuantizeLinear_653_quantize_scale_node + Conv_657 + Relu_659 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32_epifadd Tactic: 7504901284678552178 [03/25/2022-13:24:45] [V] [TRT] Tactic: 7504901284678552178 Time: 0.092416 [03/25/2022-13:24:45] [V] [TRT] sections.2.4.conv2.module.weight + QuantizeLinear_653_quantize_scale_node + Conv_657 + Relu_659 Set Tactic Name: 
sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r3s3 Tactic: 8096257414008860171 [03/25/2022-13:24:45] [V] [TRT] Tactic: 8096257414008860171 Time: 0.107264 [03/25/2022-13:24:45] [V] [TRT] sections.2.4.conv2.module.weight + QuantizeLinear_653_quantize_scale_node + Conv_657 + Relu_659 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x64x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r3s3 Tactic: 8128112048355596715 [03/25/2022-13:24:45] [V] [TRT] Tactic: 8128112048355596715 Time: 0.107136 [03/25/2022-13:24:45] [V] [TRT] sections.2.4.conv2.module.weight + QuantizeLinear_653_quantize_scale_node + Conv_657 + Relu_659 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_indexed_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32_epifadd Tactic: 8751622450593766232 [03/25/2022-13:24:45] [V] [TRT] Tactic: 8751622450593766232 Time: 0.099712 [03/25/2022-13:24:45] [V] [TRT] sections.2.4.conv2.module.weight + QuantizeLinear_653_quantize_scale_node + Conv_657 + Relu_659 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_indexed_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32 Tactic: 9064458886956700976 [03/25/2022-13:24:45] [V] [TRT] Tactic: 9064458886956700976 Time: 0.101504 [03/25/2022-13:24:45] [V] [TRT] sections.2.4.conv2.module.weight + QuantizeLinear_653_quantize_scale_node + Conv_657 + Relu_659 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_t1r3s3_epifadd Tactic: 9143438935315839085 [03/25/2022-13:24:45] [V] [TRT] Tactic: 9143438935315839085 Time: 0.1664 [03/25/2022-13:24:45] [V] [TRT] sections.2.4.conv2.module.weight + QuantizeLinear_653_quantize_scale_node + Conv_657 + Relu_659 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage4_warpsize4x2x1_g1_tensor16x8x32 Tactic: -9165697322068360861 [03/25/2022-13:24:45] [V] [TRT] Tactic: -9165697322068360861 Time: 0.096128 [03/25/2022-13:24:45] [V] [TRT] sections.2.4.conv2.module.weight + QuantizeLinear_653_quantize_scale_node + Conv_657 + Relu_659 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_singleBuffer_small_nt_v1 Tactic: -9118785798277698619 [03/25/2022-13:24:45] [V] [TRT] Tactic: -9118785798277698619 Time: 0.166912 [03/25/2022-13:24:45] [V] [TRT] sections.2.4.conv2.module.weight + QuantizeLinear_653_quantize_scale_node + Conv_657 + Relu_659 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16 Tactic: -9108166971364503411 [03/25/2022-13:24:45] [V] [TRT] Tactic: -9108166971364503411 Time: 0.182144 [03/25/2022-13:24:45] [V] [TRT] sections.2.4.conv2.module.weight + QuantizeLinear_653_quantize_scale_node + Conv_657 + Relu_659 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage1_warpsize4x2x1_g1_tensor8x8x16 Tactic: -8861822316054763526 [03/25/2022-13:24:45] [V] [TRT] Tactic: -8861822316054763526 Time: 0.169728 [03/25/2022-13:24:45] [V] [TRT] sections.2.4.conv2.module.weight + 
QuantizeLinear_653_quantize_scale_node + Conv_657 + Relu_659 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x128_stage3_warpsize1x1x1_g1_sptensor16x8x64 Tactic: -8791277710877987710 [03/25/2022-13:24:45] [V] [TRT] Tactic: -8791277710877987710 Time: 0.125312 [03/25/2022-13:24:45] [V] [TRT] sections.2.4.conv2.module.weight + QuantizeLinear_653_quantize_scale_node + Conv_657 + Relu_659 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x128x64_stage3_warpsize2x2x1_g1_tensor16x8x32 Tactic: -8691377209893505057 [03/25/2022-13:24:45] [V] [TRT] Tactic: -8691377209893505057 Time: 0.092928 [03/25/2022-13:24:45] [V] [TRT] sections.2.4.conv2.module.weight + QuantizeLinear_653_quantize_scale_node + Conv_657 + Relu_659 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32 Tactic: -8520292213102999339 [03/25/2022-13:24:45] [V] [TRT] Tactic: -8520292213102999339 Time: 0.141952 [03/25/2022-13:24:45] [V] [TRT] sections.2.4.conv2.module.weight + QuantizeLinear_653_quantize_scale_node + Conv_657 + Relu_659 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage1_warpsize4x1x1_g1_tensor8x8x16_t1r3s3 Tactic: -8475551154769412306 [03/25/2022-13:24:45] [V] [TRT] Tactic: -8475551154769412306 Time: 0.172032 [03/25/2022-13:24:45] [V] [TRT] sections.2.4.conv2.module.weight + QuantizeLinear_653_quantize_scale_node + Conv_657 + Relu_659 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x192x64_stage3_warpsize4x2x1_g1_tensor16x8x32_t1r3s3 Tactic: -8417388128970254446 [03/25/2022-13:24:45] [V] [TRT] Tactic: -8417388128970254446 Time: 0.142336 [03/25/2022-13:24:45] [V] [TRT] sections.2.4.conv2.module.weight + QuantizeLinear_653_quantize_scale_node + Conv_657 + Relu_659 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32 Tactic: -8263994888336646547 [03/25/2022-13:24:45] [V] [TRT] Tactic: -8263994888336646547 Time: 0.093184 [03/25/2022-13:24:45] [V] [TRT] sections.2.4.conv2.module.weight + QuantizeLinear_653_quantize_scale_node + Conv_657 + Relu_659 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32 Tactic: -8205948405243401049 [03/25/2022-13:24:45] [V] [TRT] Tactic: -8205948405243401049 Time: 0.174208 [03/25/2022-13:24:45] [V] [TRT] sections.2.4.conv2.module.weight + QuantizeLinear_653_quantize_scale_node + Conv_657 + Relu_659 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r3s3_epifadd Tactic: -7992068592656168418 [03/25/2022-13:24:45] [V] [TRT] Tactic: -7992068592656168418 Time: 0.107136 [03/25/2022-13:24:45] [V] [TRT] sections.2.4.conv2.module.weight + QuantizeLinear_653_quantize_scale_node + Conv_657 + Relu_659 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x32x64_stage4_warpsize2x1x1_g1_tensor16x8x32_t1r3s3 Tactic: -7898477046581738867 
[03/25/2022-13:24:45] [V] [TRT] Tactic: -7898477046581738867 Time: 0.134784 [03/25/2022-13:24:45] [V] [TRT] sections.2.4.conv2.module.weight + QuantizeLinear_653_quantize_scale_node + Conv_657 + Relu_659 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage4_warpsize4x1x1_g1_tensor16x8x32_epifadd Tactic: -7842775553137511386 [03/25/2022-13:24:45] [V] [TRT] Tactic: -7842775553137511386 Time: 0.094208 [03/25/2022-13:24:45] [V] [TRT] sections.2.4.conv2.module.weight + QuantizeLinear_653_quantize_scale_node + Conv_657 + Relu_659 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32 Tactic: -7683887278997527517 [03/25/2022-13:24:45] [V] [TRT] Tactic: -7683887278997527517 Time: 0.15872 [03/25/2022-13:24:45] [V] [TRT] sections.2.4.conv2.module.weight + QuantizeLinear_653_quantize_scale_node + Conv_657 + Relu_659 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x64x64_stage3_warpsize2x2x1_g1_tensor16x8x32 Tactic: -7381370635708568663 [03/25/2022-13:24:45] [V] [TRT] Tactic: -7381370635708568663 Time: 0.108544 [03/25/2022-13:24:45] [V] [TRT] sections.2.4.conv2.module.weight + QuantizeLinear_653_quantize_scale_node + Conv_657 + Relu_659 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r3s3 Tactic: -7129320389887881029 [03/25/2022-13:24:45] [V] [TRT] Tactic: -7129320389887881029 Time: 0.127872 [03/25/2022-13:24:45] [V] [TRT] sections.2.4.conv2.module.weight + QuantizeLinear_653_quantize_scale_node + Conv_657 + Relu_659 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r3s3 Tactic: -6959995514028471820 [03/25/2022-13:24:45] [V] [TRT] Tactic: -6959995514028471820 Time: 0.150016 [03/25/2022-13:24:45] [V] [TRT] sections.2.4.conv2.module.weight + QuantizeLinear_653_quantize_scale_node + Conv_657 + Relu_659 Set Tactic Name: ampere_int8_i8816cudnn_int8_128x128_ldg16_relu_small_nt_v1 Tactic: -6400348606759295499 [03/25/2022-13:24:45] [V] [TRT] Tactic: -6400348606759295499 Time: 0.16896 [03/25/2022-13:24:45] [V] [TRT] sections.2.4.conv2.module.weight + QuantizeLinear_653_quantize_scale_node + Conv_657 + Relu_659 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r3s3 Tactic: -6371781333659293809 [03/25/2022-13:24:45] [V] [TRT] Tactic: -6371781333659293809 Time: 0.178944 [03/25/2022-13:24:45] [V] [TRT] sections.2.4.conv2.module.weight + QuantizeLinear_653_quantize_scale_node + Conv_657 + Relu_659 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x96x64_stage4_warpsize4x2x1_g1_tensor16x8x32 Tactic: -6256128573036943404 [03/25/2022-13:24:45] [V] [TRT] Tactic: -6256128573036943404 Time: 0.141056 [03/25/2022-13:24:45] [V] [TRT] sections.2.4.conv2.module.weight + QuantizeLinear_653_quantize_scale_node + Conv_657 + Relu_659 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x128_ldg16_relu_small_nt_v1 Tactic: -5980889159865208399 [03/25/2022-13:24:45] [V] [TRT] Tactic: -5980889159865208399 
Time: 0.172032 [03/25/2022-13:24:45] [V] [TRT] sections.2.4.conv2.module.weight + QuantizeLinear_653_quantize_scale_node + Conv_657 + Relu_659 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_singleBuffer_medium_nt_v1 Tactic: -5766140806760372989 [03/25/2022-13:24:45] [V] [TRT] Tactic: -5766140806760372989 Time: 0.180096 [03/25/2022-13:24:45] [V] [TRT] sections.2.4.conv2.module.weight + QuantizeLinear_653_quantize_scale_node + Conv_657 + Relu_659 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32_t1r3s3_epifadd Tactic: -5709079507616090666 [03/25/2022-13:24:45] [V] [TRT] Tactic: -5709079507616090666 Time: 0.092544 [03/25/2022-13:24:45] [V] [TRT] sections.2.4.conv2.module.weight + QuantizeLinear_653_quantize_scale_node + Conv_657 + Relu_659 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage4_warpsize2x4x1_g1_tensor16x8x32_t1r3s3_epifadd Tactic: -5698636014239116282 [03/25/2022-13:24:45] [V] [TRT] Tactic: -5698636014239116282 Time: 0.0928 [03/25/2022-13:24:45] [V] [TRT] sections.2.4.conv2.module.weight + QuantizeLinear_653_quantize_scale_node + Conv_657 + Relu_659 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage1_warpsize4x1x1_g1_tensor8x8x16 Tactic: -5180570335464125033 [03/25/2022-13:24:45] [V] [TRT] Tactic: -5180570335464125033 Time: 0.177152 [03/25/2022-13:24:45] [V] [TRT] sections.2.4.conv2.module.weight + QuantizeLinear_653_quantize_scale_node + Conv_657 + Relu_659 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r3s3 Tactic: -4933563390723451692 [03/25/2022-13:24:45] [V] [TRT] Tactic: -4933563390723451692 Time: 0.128384 [03/25/2022-13:24:45] [V] [TRT] sections.2.4.conv2.module.weight + QuantizeLinear_653_quantize_scale_node + Conv_657 + Relu_659 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_medium_nt_v1 Tactic: -4516822589357530549 [03/25/2022-13:24:45] [V] [TRT] Tactic: -4516822589357530549 Time: 0.180992 [03/25/2022-13:24:45] [V] [TRT] sections.2.4.conv2.module.weight + QuantizeLinear_653_quantize_scale_node + Conv_657 + Relu_659 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x16x128_stage3_warpsize1x1x1_g1_sptensor16x8x64 Tactic: -4232916483289779353 [03/25/2022-13:24:45] [V] [TRT] Tactic: -4232916483289779353 Time: 0.201344 [03/25/2022-13:24:45] [V] [TRT] sections.2.4.conv2.module.weight + QuantizeLinear_653_quantize_scale_node + Conv_657 + Relu_659 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x128_stage3_warpsize1x4x1_g1_sptensor16x8x64 Tactic: -3460842194336717186 [03/25/2022-13:24:45] [V] [TRT] Tactic: -3460842194336717186 Time: 0.099456 [03/25/2022-13:24:45] [V] [TRT] sections.2.4.conv2.module.weight + QuantizeLinear_653_quantize_scale_node + Conv_657 + Relu_659 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage4_warpsize4x2x1_g1_tensor16x8x32_t1r3s3_epifadd Tactic: -3413217501222406256 [03/25/2022-13:24:45] [V] [TRT] Tactic: -3413217501222406256 Time: 0.090752 
[03/25/2022-13:24:45] [V] [TRT] sections.2.4.conv2.module.weight + QuantizeLinear_653_quantize_scale_node + Conv_657 + Relu_659 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32_t1r3s3 Tactic: -3280888557222886418 [03/25/2022-13:24:45] [V] [TRT] Tactic: -3280888557222886418 Time: 0.113024 [03/25/2022-13:24:45] [V] [TRT] sections.2.4.conv2.module.weight + QuantizeLinear_653_quantize_scale_node + Conv_657 + Relu_659 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r3s3 Tactic: -3238475748440751107 [03/25/2022-13:24:45] [V] [TRT] Tactic: -3238475748440751107 Time: 0.10368 [03/25/2022-13:24:45] [V] [TRT] sections.2.4.conv2.module.weight + QuantizeLinear_653_quantize_scale_node + Conv_657 + Relu_659 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage4_warpsize4x1x1_g1_tensor16x8x32_t1r3s3 Tactic: -3182884991006484042 [03/25/2022-13:24:45] [V] [TRT] Tactic: -3182884991006484042 Time: 0.093696 [03/25/2022-13:24:45] [V] [TRT] sections.2.4.conv2.module.weight + QuantizeLinear_653_quantize_scale_node + Conv_657 + Relu_659 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r3s3 Tactic: -3173468756112541306 [03/25/2022-13:24:45] [V] [TRT] Tactic: -3173468756112541306 Time: 0.170368 [03/25/2022-13:24:45] [V] [TRT] sections.2.4.conv2.module.weight + QuantizeLinear_653_quantize_scale_node + Conv_657 + Relu_659 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x128_ldg16_relu_large_nt_v1 Tactic: -2917455979290586480 [03/25/2022-13:24:45] [V] [TRT] Tactic: -2917455979290586480 Time: 0.173184 [03/25/2022-13:24:45] [V] [TRT] sections.2.4.conv2.module.weight + QuantizeLinear_653_quantize_scale_node + Conv_657 + Relu_659 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x128x64_stage3_warpsize4x2x1_g1_tensor16x8x32_t1r3s3 Tactic: -2741641298163591508 [03/25/2022-13:24:45] [V] [TRT] Tactic: -2741641298163591508 Time: 0.106112 [03/25/2022-13:24:45] [V] [TRT] sections.2.4.conv2.module.weight + QuantizeLinear_653_quantize_scale_node + Conv_657 + Relu_659 Set Tactic Name: ampere_int8_i8816cudnn_int8_128x128_ldg16_relu_large_nt_v1 Tactic: -2571022005763160364 [03/25/2022-13:24:45] [V] [TRT] Tactic: -2571022005763160364 Time: 0.179328 [03/25/2022-13:24:45] [V] [TRT] sections.2.4.conv2.module.weight + QuantizeLinear_653_quantize_scale_node + Conv_657 + Relu_659 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage1_warpsize2x2x1_g1_tensor8x8x16 Tactic: -2499089240293650188 [03/25/2022-13:24:45] [V] [TRT] Tactic: -2499089240293650188 Time: 0.175616 [03/25/2022-13:24:45] [V] [TRT] sections.2.4.conv2.module.weight + QuantizeLinear_653_quantize_scale_node + Conv_657 + Relu_659 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16 Tactic: -2328318099174473157 [03/25/2022-13:24:45] [V] [TRT] Tactic: -2328318099174473157 Time: 0.183296 [03/25/2022-13:24:45] [V] [TRT] 
sections.2.4.conv2.module.weight + QuantizeLinear_653_quantize_scale_node + Conv_657 + Relu_659 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_epifadd Tactic: -2083778562631872334 [03/25/2022-13:24:45] [V] [TRT] Tactic: -2083778562631872334 Time: 0.10944 [03/25/2022-13:24:45] [V] [TRT] sections.2.4.conv2.module.weight + QuantizeLinear_653_quantize_scale_node + Conv_657 + Relu_659 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x4x1_g1_tensor16x8x32 Tactic: -2054375205435666404 [03/25/2022-13:24:45] [V] [TRT] Tactic: -2054375205435666404 Time: 0.127488 [03/25/2022-13:24:45] [V] [TRT] sections.2.4.conv2.module.weight + QuantizeLinear_653_quantize_scale_node + Conv_657 + Relu_659 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32_t1r3s3 Tactic: -1546787387293556842 [03/25/2022-13:24:45] [V] [TRT] Tactic: -1546787387293556842 Time: 0.092928 [03/25/2022-13:24:45] [V] [TRT] sections.2.4.conv2.module.weight + QuantizeLinear_653_quantize_scale_node + Conv_657 + Relu_659 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x32x64_stage4_warpsize4x1x1_g1_tensor16x8x32 Tactic: -1498626619443284096 [03/25/2022-13:24:45] [V] [TRT] Tactic: -1498626619443284096 Time: 0.134016 [03/25/2022-13:24:45] [V] [TRT] sections.2.4.conv2.module.weight + QuantizeLinear_653_quantize_scale_node + Conv_657 + Relu_659 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r3s3 Tactic: -1471245223605064669 [03/25/2022-13:24:45] [V] [TRT] Tactic: -1471245223605064669 Time: 0.15488 [03/25/2022-13:24:45] [V] [TRT] sections.2.4.conv2.module.weight + QuantizeLinear_653_quantize_scale_node + Conv_657 + Relu_659 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32 Tactic: -1283580231568512025 [03/25/2022-13:24:45] [V] [TRT] Tactic: -1283580231568512025 Time: 0.205696 [03/25/2022-13:24:45] [V] [TRT] sections.2.4.conv2.module.weight + QuantizeLinear_653_quantize_scale_node + Conv_657 + Relu_659 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x64x64_stage3_warpsize4x1x1_g1_tensor16x8x32_t1r3s3 Tactic: -1224421172675151280 [03/25/2022-13:24:45] [V] [TRT] Tactic: -1224421172675151280 Time: 0.091136 [03/25/2022-13:24:45] [V] [TRT] sections.2.4.conv2.module.weight + QuantizeLinear_653_quantize_scale_node + Conv_657 + Relu_659 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_epifadd Tactic: -1173968681844185579 [03/25/2022-13:24:45] [V] [TRT] Tactic: -1173968681844185579 Time: 0.207104 [03/25/2022-13:24:45] [V] [TRT] sections.2.4.conv2.module.weight + QuantizeLinear_653_quantize_scale_node + Conv_657 + Relu_659 Set Tactic Name: 
sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x128x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r3s3 Tactic: -921247911551089037 [03/25/2022-13:24:45] [V] [TRT] Tactic: -921247911551089037 Time: 0.092416 [03/25/2022-13:24:45] [V] [TRT] sections.2.4.conv2.module.weight + QuantizeLinear_653_quantize_scale_node + Conv_657 + Relu_659 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32 Tactic: -762222380308749469 [03/25/2022-13:24:45] [V] [TRT] Tactic: -762222380308749469 Time: 0.12928 [03/25/2022-13:24:45] [V] [TRT] sections.2.4.conv2.module.weight + QuantizeLinear_653_quantize_scale_node + Conv_657 + Relu_659 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_epifadd Tactic: -556794153877490941 [03/25/2022-13:24:45] [V] [TRT] Tactic: -556794153877490941 Time: 0.130816 [03/25/2022-13:24:45] [V] [TRT] sections.2.4.conv2.module.weight + QuantizeLinear_653_quantize_scale_node + Conv_657 + Relu_659 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage4_warpsize4x2x1_g1_tensor16x8x32_t1r3s3 Tactic: -516725800067794372 [03/25/2022-13:24:45] [V] [TRT] Tactic: -516725800067794372 Time: 0.09408 [03/25/2022-13:24:45] [V] [TRT] sections.2.4.conv2.module.weight + QuantizeLinear_653_quantize_scale_node + Conv_657 + Relu_659 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_singleBuffer_large_nt_v1 Tactic: -428104331444385564 [03/25/2022-13:24:45] [V] [TRT] Tactic: -428104331444385564 Time: 0.183424 [03/25/2022-13:24:45] [V] [TRT] sections.2.4.conv2.module.weight + QuantizeLinear_653_quantize_scale_node + Conv_657 + Relu_659 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage1_warpsize2x1x1_g1_tensor8x8x16 Tactic: -366411318217594794 [03/25/2022-13:24:45] [V] [TRT] Tactic: -366411318217594794 Time: 0.205952 [03/25/2022-13:24:45] [V] [TRT] sections.2.4.conv2.module.weight + QuantizeLinear_653_quantize_scale_node + Conv_657 + Relu_659 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage1_warpsize2x4x1_g1_tensor8x8x16 Tactic: -351548418071036983 [03/25/2022-13:24:45] [V] [TRT] Tactic: -351548418071036983 Time: 0.169984 [03/25/2022-13:24:45] [V] [TRT] Fastest Tactic: 4414594337986714263 Time: 0.077824 [03/25/2022-13:24:45] [V] [TRT] >>>>>>>>>>>>>>> Chose Runner Type: CaskConvolution Tactic: 4414594337986714263 [03/25/2022-13:24:45] [V] [TRT] =============== Computing costs for [03/25/2022-13:24:45] [V] [TRT] *************** Autotuning format combination: Int8(12544,196:4,14,1), Int8(50176,196:4,14,1) -> Int8(50176,196:4,14,1) *************** [03/25/2022-13:24:45] [V] [TRT] *************** Autotuning format combination: Int8(12544,196:4,14,1), Int8(6272,196:32,14,1) -> Int8(6272,196:32,14,1) *************** [03/25/2022-13:24:45] [V] [TRT] *************** Autotuning format combination: Int8(1568,196:32,14,1), Int8(6272,196:32,14,1) -> Int8(6272,196:32,14,1) *************** [03/25/2022-13:24:46] [V] [TRT] =============== Computing costs for [03/25/2022-13:24:46] [V] [TRT] *************** Autotuning format combination: 
Int8(50176,196:4,14,1) -> Int8(12544,196:4,14,1) *************** [03/25/2022-13:24:46] [V] [TRT] *************** Autotuning format combination: Int8(50176,196:4,14,1) -> Int8(1568,196:32,14,1) *************** [03/25/2022-13:24:46] [V] [TRT] *************** Autotuning format combination: Int8(6272,196:32,14,1) -> Int8(1568,196:32,14,1) *************** [03/25/2022-13:24:46] [V] [TRT] Skip timing cache hit (epiFadd mismatch) for node: sections.2.5.conv1.module.weight + QuantizeLinear_690_quantize_scale_node + Conv_694 + Relu_696, LayerImpl: CaskConvolution, tactic: 857001784974286465 [03/25/2022-13:24:46] [V] [TRT] --------------- Timing Runner: sections.2.5.conv1.module.weight + QuantizeLinear_690_quantize_scale_node + Conv_694 + Relu_696 (CudaGroupConvolution) [03/25/2022-13:24:46] [V] [TRT] CudaGroupConvolution has no valid tactics for this config, skipping [03/25/2022-13:24:46] [V] [TRT] --------------- Timing Runner: sections.2.5.conv1.module.weight + QuantizeLinear_690_quantize_scale_node + Conv_694 + Relu_696 (CudaDepthwiseConvolution) [03/25/2022-13:24:46] [V] [TRT] CudaDepthwiseConvolution has no valid tactics for this config, skipping [03/25/2022-13:24:46] [V] [TRT] --------------- Timing Runner: sections.2.5.conv1.module.weight + QuantizeLinear_690_quantize_scale_node + Conv_694 + Relu_696 (FusedConvActConvolution) [03/25/2022-13:24:46] [V] [TRT] FusedConvActConvolution has no valid tactics for this config, skipping [03/25/2022-13:24:46] [V] [TRT] --------------- Timing Runner: sections.2.5.conv1.module.weight + QuantizeLinear_690_quantize_scale_node + Conv_694 + Relu_696 (CaskConvolution) [03/25/2022-13:24:46] [V] [TRT] sections.2.5.conv1.module.weight + QuantizeLinear_690_quantize_scale_node + Conv_694 + Relu_696 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage1_warpsize4x1x1_g1_tensor8x8x16_t1r1s1 Tactic: 68468667201176803 [03/25/2022-13:24:46] [V] [TRT] Tactic: 68468667201176803 Time: 0.089984 [03/25/2022-13:24:46] [V] [TRT] sections.2.5.conv1.module.weight + QuantizeLinear_690_quantize_scale_node + Conv_694 + Relu_696 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x192x64_stage3_warpsize4x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 125145153013230687 [03/25/2022-13:24:46] [V] [TRT] Tactic: 125145153013230687 Time: 0.081152 [03/25/2022-13:24:46] [V] [TRT] sections.2.5.conv1.module.weight + QuantizeLinear_690_quantize_scale_node + Conv_694 + Relu_696 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x128_ldg16_relu_interior_nt_v1 Tactic: 434957160407688216 [03/25/2022-13:24:46] [V] [TRT] Tactic: 434957160407688216 Time: 0.093184 [03/25/2022-13:24:46] [V] [TRT] sections.2.5.conv1.module.weight + QuantizeLinear_690_quantize_scale_node + Conv_694 + Relu_696 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage4_warpsize4x2x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: 805889586762897346 [03/25/2022-13:24:46] [V] [TRT] Tactic: 805889586762897346 Time: 0.055424 [03/25/2022-13:24:46] [V] [TRT] sections.2.5.conv1.module.weight + QuantizeLinear_690_quantize_scale_node + Conv_694 + Relu_696 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x128_stage3_warpsize4x2x1_g1_sptensor16x8x64_t1r1s1 Tactic: 857001784974286465 [03/25/2022-13:24:46] [V] [TRT] Tactic: 
857001784974286465 Time: 0.04736 [03/25/2022-13:24:46] [V] [TRT] sections.2.5.conv1.module.weight + QuantizeLinear_690_quantize_scale_node + Conv_694 + Relu_696 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 1214130898909872671 [03/25/2022-13:24:46] [V] [TRT] Tactic: 1214130898909872671 Time: 0.110464 [03/25/2022-13:24:46] [V] [TRT] sections.2.5.conv1.module.weight + QuantizeLinear_690_quantize_scale_node + Conv_694 + Relu_696 Set Tactic Name: sm80_xmma_conv_fprop_smallk_i8i8_i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_warptilesize16x32_stage5_warpsize1x4x2_r1s1_u1v1_hw0_c1024_scalebias_relu Tactic: 1263683011321748626 [03/25/2022-13:24:46] [V] [TRT] Tactic: 1263683011321748626 Time: 0.047872 [03/25/2022-13:24:46] [V] [TRT] sections.2.5.conv1.module.weight + QuantizeLinear_690_quantize_scale_node + Conv_694 + Relu_696 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage4_warpsize4x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 1278425129871930205 [03/25/2022-13:24:46] [V] [TRT] Tactic: 1278425129871930205 Time: 0.053888 [03/25/2022-13:24:46] [V] [TRT] sections.2.5.conv1.module.weight + QuantizeLinear_690_quantize_scale_node + Conv_694 + Relu_696 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 1583811548148740665 [03/25/2022-13:24:46] [V] [TRT] Tactic: 1583811548148740665 Time: 0.082176 [03/25/2022-13:24:46] [V] [TRT] sections.2.5.conv1.module.weight + QuantizeLinear_690_quantize_scale_node + Conv_694 + Relu_696 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 1701344857577810806 [03/25/2022-13:24:46] [V] [TRT] Tactic: 1701344857577810806 Time: 0.071936 [03/25/2022-13:24:46] [V] [TRT] sections.2.5.conv1.module.weight + QuantizeLinear_690_quantize_scale_node + Conv_694 + Relu_696 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x32x64_stage4_warpsize2x1x1_g1_tensor16x8x32_t1r1s1 Tactic: 1797231177354918208 [03/25/2022-13:24:46] [V] [TRT] Tactic: 1797231177354918208 Time: 0.098176 [03/25/2022-13:24:46] [V] [TRT] sections.2.5.conv1.module.weight + QuantizeLinear_690_quantize_scale_node + Conv_694 + Relu_696 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x192x64_stage3_warpsize4x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 2004812516525036381 [03/25/2022-13:24:46] [V] [TRT] Tactic: 2004812516525036381 Time: 0.073472 [03/25/2022-13:24:46] [V] [TRT] sections.2.5.conv1.module.weight + QuantizeLinear_690_quantize_scale_node + Conv_694 + Relu_696 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x128x64_stage3_warpsize4x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 2030033463723799063 [03/25/2022-13:24:46] [V] [TRT] Tactic: 2030033463723799063 Time: 0.058496 [03/25/2022-13:24:46] [V] [TRT] sections.2.5.conv1.module.weight + QuantizeLinear_690_quantize_scale_node + Conv_694 + Relu_696 Set Tactic Name: 
ampere_int8_i8816cudnn_int8_128x128_ldg16_relu_interior_nt_v1 Tactic: 2346437292116182513 [03/25/2022-13:24:46] [V] [TRT] Tactic: 2346437292116182513 Time: 0.09152 [03/25/2022-13:24:46] [V] [TRT] sections.2.5.conv1.module.weight + QuantizeLinear_690_quantize_scale_node + Conv_694 + Relu_696 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 2376898825218218566 [03/25/2022-13:24:46] [V] [TRT] Tactic: 2376898825218218566 Time: 0.05376 [03/25/2022-13:24:46] [V] [TRT] sections.2.5.conv1.module.weight + QuantizeLinear_690_quantize_scale_node + Conv_694 + Relu_696 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_singleBuffer_interior_nt_v1 Tactic: 2522133112320625287 [03/25/2022-13:24:46] [V] [TRT] Tactic: 2522133112320625287 Time: 0.089344 [03/25/2022-13:24:46] [V] [TRT] sections.2.5.conv1.module.weight + QuantizeLinear_690_quantize_scale_node + Conv_694 + Relu_696 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage4_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 2548171972648455240 [03/25/2022-13:24:46] [V] [TRT] Tactic: 2548171972648455240 Time: 0.057472 [03/25/2022-13:24:46] [V] [TRT] sections.2.5.conv1.module.weight + QuantizeLinear_690_quantize_scale_node + Conv_694 + Relu_696 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage1_warpsize2x1x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: 2548946449357458230 [03/25/2022-13:24:46] [V] [TRT] Tactic: 2548946449357458230 Time: 0.107136 [03/25/2022-13:24:46] [V] [TRT] sections.2.5.conv1.module.weight + QuantizeLinear_690_quantize_scale_node + Conv_694 + Relu_696 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x128_stage3_warpsize1x1x1_g1_sptensor16x8x64_simple_t1r1s1_no_preds Tactic: 2570666021825229009 [03/25/2022-13:24:46] [V] [TRT] Tactic: 2570666021825229009 Time: 0.093568 [03/25/2022-13:24:46] [V] [TRT] sections.2.5.conv1.module.weight + QuantizeLinear_690_quantize_scale_node + Conv_694 + Relu_696 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x16x128_stage3_warpsize1x1x1_g1_sptensor16x8x64_simple_t1r1s1 Tactic: 2678520742286844763 [03/25/2022-13:24:46] [V] [TRT] Tactic: 2678520742286844763 Time: 0.137344 [03/25/2022-13:24:46] [V] [TRT] sections.2.5.conv1.module.weight + QuantizeLinear_690_quantize_scale_node + Conv_694 + Relu_696 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 2756291002030759362 [03/25/2022-13:24:46] [V] [TRT] Tactic: 2756291002030759362 Time: 0.070912 [03/25/2022-13:24:46] [V] [TRT] sections.2.5.conv1.module.weight + QuantizeLinear_690_quantize_scale_node + Conv_694 + Relu_696 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x128x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 2972948223367788520 [03/25/2022-13:24:46] [V] [TRT] Tactic: 2972948223367788520 Time: 0.058752 [03/25/2022-13:24:46] [V] [TRT] sections.2.5.conv1.module.weight + 
QuantizeLinear_690_quantize_scale_node + Conv_694 + Relu_696 Set Tactic Name: ampere_int8_i8816cudnn_int8_128x128_ldg16_relu_medium_nt_v1 Tactic: 2985940154541537814 [03/25/2022-13:24:46] [V] [TRT] Tactic: 2985940154541537814 Time: 0.091136 [03/25/2022-13:24:46] [V] [TRT] sections.2.5.conv1.module.weight + QuantizeLinear_690_quantize_scale_node + Conv_694 + Relu_696 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x96x64_stage5_warpsize2x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 3043273137345374664 [03/25/2022-13:24:46] [V] [TRT] Tactic: 3043273137345374664 Time: 0.110848 [03/25/2022-13:24:46] [V] [TRT] sections.2.5.conv1.module.weight + QuantizeLinear_690_quantize_scale_node + Conv_694 + Relu_696 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 3221677093659484230 [03/25/2022-13:24:46] [V] [TRT] Tactic: 3221677093659484230 Time: 0.089088 [03/25/2022-13:24:46] [V] [TRT] sections.2.5.conv1.module.weight + QuantizeLinear_690_quantize_scale_node + Conv_694 + Relu_696 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r1s1 Tactic: 3242897809704328258 [03/25/2022-13:24:46] [V] [TRT] Tactic: 3242897809704328258 Time: 0.094208 [03/25/2022-13:24:46] [V] [TRT] sections.2.5.conv1.module.weight + QuantizeLinear_690_quantize_scale_node + Conv_694 + Relu_696 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: 3312456766204252694 [03/25/2022-13:24:46] [V] [TRT] Tactic: 3312456766204252694 Time: 0.11904 [03/25/2022-13:24:46] [V] [TRT] sections.2.5.conv1.module.weight + QuantizeLinear_690_quantize_scale_node + Conv_694 + Relu_696 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 3538565962642681625 [03/25/2022-13:24:46] [V] [TRT] Tactic: 3538565962642681625 Time: 0.080896 [03/25/2022-13:24:46] [V] [TRT] sections.2.5.conv1.module.weight + QuantizeLinear_690_quantize_scale_node + Conv_694 + Relu_696 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage1_warpsize2x2x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: 3541919052468401776 [03/25/2022-13:24:46] [V] [TRT] Tactic: 3541919052468401776 Time: 0.085248 [03/25/2022-13:24:46] [V] [TRT] sections.2.5.conv1.module.weight + QuantizeLinear_690_quantize_scale_node + Conv_694 + Relu_696 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 3593397928177382100 [03/25/2022-13:24:46] [V] [TRT] Tactic: 3593397928177382100 Time: 0.111104 [03/25/2022-13:24:46] [V] [TRT] sections.2.5.conv1.module.weight + QuantizeLinear_690_quantize_scale_node + Conv_694 + Relu_696 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x64x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 3670282018109435863 [03/25/2022-13:24:46] [V] [TRT] Tactic: 
3670282018109435863 Time: 0.068096 [03/25/2022-13:24:46] [V] [TRT] sections.2.5.conv1.module.weight + QuantizeLinear_690_quantize_scale_node + Conv_694 + Relu_696 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 3671413346254027573 [03/25/2022-13:24:46] [V] [TRT] Tactic: 3671413346254027573 Time: 0.070656 [03/25/2022-13:24:46] [V] [TRT] sections.2.5.conv1.module.weight + QuantizeLinear_690_quantize_scale_node + Conv_694 + Relu_696 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x128_ldg16_relu_medium_nt_v1 Tactic: 3899284354987683408 [03/25/2022-13:24:46] [V] [TRT] Tactic: 3899284354987683408 Time: 0.092416 [03/25/2022-13:24:46] [V] [TRT] sections.2.5.conv1.module.weight + QuantizeLinear_690_quantize_scale_node + Conv_694 + Relu_696 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: 3927509214678622419 [03/25/2022-13:24:46] [V] [TRT] Tactic: 3927509214678622419 Time: 0.092672 [03/25/2022-13:24:46] [V] [TRT] sections.2.5.conv1.module.weight + QuantizeLinear_690_quantize_scale_node + Conv_694 + Relu_696 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 4112572034735311841 [03/25/2022-13:24:46] [V] [TRT] Tactic: 4112572034735311841 Time: 0.139904 [03/25/2022-13:24:46] [V] [TRT] sections.2.5.conv1.module.weight + QuantizeLinear_690_quantize_scale_node + Conv_694 + Relu_696 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 4239974928951431644 [03/25/2022-13:24:46] [V] [TRT] Tactic: 4239974928951431644 Time: 0.084736 [03/25/2022-13:24:46] [V] [TRT] sections.2.5.conv1.module.weight + QuantizeLinear_690_quantize_scale_node + Conv_694 + Relu_696 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage6_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 4610760414797216079 [03/25/2022-13:24:46] [V] [TRT] Tactic: 4610760414797216079 Time: 0.072448 [03/25/2022-13:24:46] [V] [TRT] sections.2.5.conv1.module.weight + QuantizeLinear_690_quantize_scale_node + Conv_694 + Relu_696 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_small_nt_v1 Tactic: 4717285412741024953 [03/25/2022-13:24:46] [V] [TRT] Tactic: 4717285412741024953 Time: 0.091008 [03/25/2022-13:24:46] [V] [TRT] sections.2.5.conv1.module.weight + QuantizeLinear_690_quantize_scale_node + Conv_694 + Relu_696 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 4796956614760326119 [03/25/2022-13:24:46] [V] [TRT] Tactic: 4796956614760326119 Time: 0.064384 [03/25/2022-13:24:46] [V] [TRT] sections.2.5.conv1.module.weight + QuantizeLinear_690_quantize_scale_node + Conv_694 + Relu_696 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x128_stage3_warpsize2x2x1_g1_sptensor16x8x64_t1r1s1_no_preds Tactic: 4909502217677847353 [03/25/2022-13:24:46] [V] 
[TRT] Tactic: 4909502217677847353 Time: 0.049536 [03/25/2022-13:24:46] [V] [TRT] sections.2.5.conv1.module.weight + QuantizeLinear_690_quantize_scale_node + Conv_694 + Relu_696 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 4919361344804309192 [03/25/2022-13:24:46] [V] [TRT] Tactic: 4919361344804309192 Time: 0.1088 [03/25/2022-13:24:46] [V] [TRT] sections.2.5.conv1.module.weight + QuantizeLinear_690_quantize_scale_node + Conv_694 + Relu_696 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x128_stage3_warpsize1x2x1_g1_sptensor16x8x64_simple_t1r1s1 Tactic: 5043674678294309681 [03/25/2022-13:24:46] [V] [TRT] Tactic: 5043674678294309681 Time: 0.071552 [03/25/2022-13:24:46] [V] [TRT] sections.2.5.conv1.module.weight + QuantizeLinear_690_quantize_scale_node + Conv_694 + Relu_696 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_interior_nt_v1 Tactic: 5126565865931538390 [03/25/2022-13:24:46] [V] [TRT] Tactic: 5126565865931538390 Time: 0.092416 [03/25/2022-13:24:46] [V] [TRT] sections.2.5.conv1.module.weight + QuantizeLinear_690_quantize_scale_node + Conv_694 + Relu_696 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x64x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 5204702486885981735 [03/25/2022-13:24:46] [V] [TRT] Tactic: 5204702486885981735 Time: 0.06976 [03/25/2022-13:24:46] [V] [TRT] sections.2.5.conv1.module.weight + QuantizeLinear_690_quantize_scale_node + Conv_694 + Relu_696 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: 5375256703210220108 [03/25/2022-13:24:46] [V] [TRT] Tactic: 5375256703210220108 Time: 0.067072 [03/25/2022-13:24:46] [V] [TRT] sections.2.5.conv1.module.weight + QuantizeLinear_690_quantize_scale_node + Conv_694 + Relu_696 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x128_stage3_warpsize2x2x1_g1_sptensor16x8x64_simple_t1r1s1 Tactic: 5424258848951129084 [03/25/2022-13:24:46] [V] [TRT] Tactic: 5424258848951129084 Time: 0.049664 [03/25/2022-13:24:46] [V] [TRT] sections.2.5.conv1.module.weight + QuantizeLinear_690_quantize_scale_node + Conv_694 + Relu_696 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 5443897483205284103 [03/25/2022-13:24:46] [V] [TRT] Tactic: 5443897483205284103 Time: 0.07168 [03/25/2022-13:24:46] [V] [TRT] sections.2.5.conv1.module.weight + QuantizeLinear_690_quantize_scale_node + Conv_694 + Relu_696 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage4_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 5707566217891294846 [03/25/2022-13:24:46] [V] [TRT] Tactic: 5707566217891294846 Time: 0.059776 [03/25/2022-13:24:46] [V] [TRT] sections.2.5.conv1.module.weight + QuantizeLinear_690_quantize_scale_node + Conv_694 + Relu_696 Set Tactic Name: 
sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x4x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 5986622376339202983 [03/25/2022-13:24:46] [V] [TRT] Tactic: 5986622376339202983 Time: 0.08128 [03/25/2022-13:24:46] [V] [TRT] sections.2.5.conv1.module.weight + QuantizeLinear_690_quantize_scale_node + Conv_694 + Relu_696 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x128x64_stage3_warpsize4x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 6007888770437705057 [03/25/2022-13:24:46] [V] [TRT] Tactic: 6007888770437705057 Time: 0.062848 [03/25/2022-13:24:46] [V] [TRT] sections.2.5.conv1.module.weight + QuantizeLinear_690_quantize_scale_node + Conv_694 + Relu_696 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32_t1r1s1 Tactic: 6405251167055673379 [03/25/2022-13:24:46] [V] [TRT] Tactic: 6405251167055673379 Time: 0.06528 [03/25/2022-13:24:46] [V] [TRT] sections.2.5.conv1.module.weight + QuantizeLinear_690_quantize_scale_node + Conv_694 + Relu_696 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage4_warpsize4x1x1_g1_tensor16x8x32_t1r1s1 Tactic: 6433368103202497147 [03/25/2022-13:24:46] [V] [TRT] Tactic: 6433368103202497147 Time: 0.061696 [03/25/2022-13:24:46] [V] [TRT] sections.2.5.conv1.module.weight + QuantizeLinear_690_quantize_scale_node + Conv_694 + Relu_696 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 6441948709525127755 [03/25/2022-13:24:46] [V] [TRT] Tactic: 6441948709525127755 Time: 0.11264 [03/25/2022-13:24:46] [V] [TRT] sections.2.5.conv1.module.weight + QuantizeLinear_690_quantize_scale_node + Conv_694 + Relu_696 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x256x64_stage3_warpsize2x4x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 6443933097134654777 [03/25/2022-13:24:46] [V] [TRT] Tactic: 6443933097134654777 Time: 0.064384 [03/25/2022-13:24:46] [V] [TRT] sections.2.5.conv1.module.weight + QuantizeLinear_690_quantize_scale_node + Conv_694 + Relu_696 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 6457435868048963632 [03/25/2022-13:24:46] [V] [TRT] Tactic: 6457435868048963632 Time: 0.072448 [03/25/2022-13:24:46] [V] [TRT] sections.2.5.conv1.module.weight + QuantizeLinear_690_quantize_scale_node + Conv_694 + Relu_696 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x128_stage3_warpsize1x1x1_g1_sptensor16x8x64_t1r1s1 Tactic: 6510345569544721081 [03/25/2022-13:24:46] [V] [TRT] Tactic: 6510345569544721081 Time: 0.094336 [03/25/2022-13:24:46] [V] [TRT] sections.2.5.conv1.module.weight + QuantizeLinear_690_quantize_scale_node + Conv_694 + Relu_696 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 6793988781414507278 [03/25/2022-13:24:46] [V] [TRT] Tactic: 
6793988781414507278 Time: 0.055552 [03/25/2022-13:24:46] [V] [TRT] sections.2.5.conv1.module.weight + QuantizeLinear_690_quantize_scale_node + Conv_694 + Relu_696 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 6880710371738875469 [03/25/2022-13:24:46] [V] [TRT] Tactic: 6880710371738875469 Time: 0.0768 [03/25/2022-13:24:46] [V] [TRT] sections.2.5.conv1.module.weight + QuantizeLinear_690_quantize_scale_node + Conv_694 + Relu_696 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 6925201228918187099 [03/25/2022-13:24:46] [V] [TRT] Tactic: 6925201228918187099 Time: 0.056704 [03/25/2022-13:24:46] [V] [TRT] sections.2.5.conv1.module.weight + QuantizeLinear_690_quantize_scale_node + Conv_694 + Relu_696 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 6991524515605108718 [03/25/2022-13:24:46] [V] [TRT] Tactic: 6991524515605108718 Time: 0.088832 [03/25/2022-13:24:46] [V] [TRT] sections.2.5.conv1.module.weight + QuantizeLinear_690_quantize_scale_node + Conv_694 + Relu_696 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x4x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 7245509442265271220 [03/25/2022-13:24:46] [V] [TRT] Tactic: 7245509442265271220 Time: 0.08064 [03/25/2022-13:24:46] [V] [TRT] sections.2.5.conv1.module.weight + QuantizeLinear_690_quantize_scale_node + Conv_694 + Relu_696 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x64x64_stage3_warpsize4x1x1_g1_tensor16x8x32_t1r1s1 Tactic: 7318929579222925725 [03/25/2022-13:24:46] [V] [TRT] Tactic: 7318929579222925725 Time: 0.064896 [03/25/2022-13:24:46] [V] [TRT] sections.2.5.conv1.module.weight + QuantizeLinear_690_quantize_scale_node + Conv_694 + Relu_696 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: 7731430299029542276 [03/25/2022-13:24:46] [V] [TRT] Tactic: 7731430299029542276 Time: 0.055424 [03/25/2022-13:24:46] [V] [TRT] sections.2.5.conv1.module.weight + QuantizeLinear_690_quantize_scale_node + Conv_694 + Relu_696 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage4_warpsize2x4x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: 7738495016763012180 [03/25/2022-13:24:46] [V] [TRT] Tactic: 7738495016763012180 Time: 0.056064 [03/25/2022-13:24:46] [V] [TRT] sections.2.5.conv1.module.weight + QuantizeLinear_690_quantize_scale_node + Conv_694 + Relu_696 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x128_stage3_warpsize1x4x1_g1_sptensor16x8x64_t1r1s1_no_preds Tactic: 7886967395128926382 [03/25/2022-13:24:46] [V] [TRT] Tactic: 7886967395128926382 Time: 0.062208 [03/25/2022-13:24:46] [V] [TRT] sections.2.5.conv1.module.weight + QuantizeLinear_690_quantize_scale_node + Conv_694 + Relu_696 Set Tactic Name: 
sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage4_warpsize4x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 8142283985160822229 [03/25/2022-13:24:46] [V] [TRT] Tactic: 8142283985160822229 Time: 0.057216 [03/25/2022-13:24:46] [V] [TRT] sections.2.5.conv1.module.weight + QuantizeLinear_690_quantize_scale_node + Conv_694 + Relu_696 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage4_warpsize2x4x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 8173975624668590862 [03/25/2022-13:24:46] [V] [TRT] Tactic: 8173975624668590862 Time: 0.058624 [03/25/2022-13:24:46] [V] [TRT] sections.2.5.conv1.module.weight + QuantizeLinear_690_quantize_scale_node + Conv_694 + Relu_696 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage4_warpsize4x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 8234775147403903473 [03/25/2022-13:24:46] [V] [TRT] Tactic: 8234775147403903473 Time: 0.058496 [03/25/2022-13:24:46] [V] [TRT] sections.2.5.conv1.module.weight + QuantizeLinear_690_quantize_scale_node + Conv_694 + Relu_696 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x128x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 8524082966802584889 [03/25/2022-13:24:46] [V] [TRT] Tactic: 8524082966802584889 Time: 0.060672 [03/25/2022-13:24:46] [V] [TRT] sections.2.5.conv1.module.weight + QuantizeLinear_690_quantize_scale_node + Conv_694 + Relu_696 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage1_warpsize4x1x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: 8684013308930763400 [03/25/2022-13:24:46] [V] [TRT] Tactic: 8684013308930763400 Time: 0.088576 [03/25/2022-13:24:46] [V] [TRT] sections.2.5.conv1.module.weight + QuantizeLinear_690_quantize_scale_node + Conv_694 + Relu_696 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x128x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 8765382722978397630 [03/25/2022-13:24:46] [V] [TRT] Tactic: 8765382722978397630 Time: 0.061312 [03/25/2022-13:24:46] [V] [TRT] sections.2.5.conv1.module.weight + QuantizeLinear_690_quantize_scale_node + Conv_694 + Relu_696 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 8843193587782643431 [03/25/2022-13:24:46] [V] [TRT] Tactic: 8843193587782643431 Time: 0.080256 [03/25/2022-13:24:46] [V] [TRT] sections.2.5.conv1.module.weight + QuantizeLinear_690_quantize_scale_node + Conv_694 + Relu_696 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x128_stage3_warpsize1x4x1_g1_sptensor16x8x64_simple_t1r1s1 Tactic: 8883810517410230831 [03/25/2022-13:24:46] [V] [TRT] Tactic: 8883810517410230831 Time: 0.062592 [03/25/2022-13:24:46] [V] [TRT] sections.2.5.conv1.module.weight + QuantizeLinear_690_quantize_scale_node + Conv_694 + Relu_696 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 8930797211803511337 
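
The run above is timing dense implicit-GEMM kernels (tensor16x8x32) and 2:4 sparse kernels (sptensor16x8x64) side by side for the same fused Conv+ReLU node; both populations compete because the build enables INT8 and sparsity together. To reproduce this from the Python builder API rather than trtexec, a minimal sketch (error handling and workspace tuning omitted; since the graph already carries QuantizeLinear/DequantizeLinear nodes, no calibrator is wired up here):

    import tensorrt as trt

    logger = trt.Logger(trt.Logger.VERBOSE)  # VERBOSE surfaces per-tactic timings like those above
    builder = trt.Builder(logger)
    network = builder.create_network(
        1 << int(trt.NetworkDefinitionCreationFlag.EXPLICIT_BATCH))
    parser = trt.OnnxParser(network, logger)
    with open("resnet50_quant_sparse.onnx", "rb") as f:
        assert parser.parse(f.read()), parser.get_error(0)

    config = builder.create_builder_config()
    config.set_flag(trt.BuilderFlag.INT8)            # honor the Q/DQ scales in the graph
    config.set_flag(trt.BuilderFlag.SPARSE_WEIGHTS)  # admit the sparse (sptensor) tactics
    engine_bytes = builder.build_serialized_network(network, config)
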
[03/25/2022-13:24:46] [V] [TRT] Tactic: 8930797211803511337 Time: 0.082816 [03/25/2022-13:24:46] [V] [TRT] sections.2.5.conv1.module.weight + QuantizeLinear_690_quantize_scale_node + Conv_694 + Relu_696 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 8935070489925739043 [03/25/2022-13:24:46] [V] [TRT] Tactic: 8935070489925739043 Time: 0.059904 [03/25/2022-13:24:46] [V] [TRT] sections.2.5.conv1.module.weight + QuantizeLinear_690_quantize_scale_node + Conv_694 + Relu_696 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x16x128_stage3_warpsize1x1x1_g1_sptensor16x8x64_t1r1s1 Tactic: 9062173295331155069 [03/25/2022-13:24:46] [V] [TRT] Tactic: 9062173295331155069 Time: 0.137216 [03/25/2022-13:24:46] [V] [TRT] sections.2.5.conv1.module.weight + QuantizeLinear_690_quantize_scale_node + Conv_694 + Relu_696 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_singleBuffer_small_nt_v1 Tactic: -9118785798277698619 [03/25/2022-13:24:46] [V] [TRT] Tactic: -9118785798277698619 Time: 0.090624 [03/25/2022-13:24:46] [V] [TRT] sections.2.5.conv1.module.weight + QuantizeLinear_690_quantize_scale_node + Conv_694 + Relu_696 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -8985599729413291927 [03/25/2022-13:24:46] [V] [TRT] Tactic: -8985599729413291927 Time: 0.065792 [03/25/2022-13:24:46] [V] [TRT] sections.2.5.conv1.module.weight + QuantizeLinear_690_quantize_scale_node + Conv_694 + Relu_696 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x32x64_stage4_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -8972697510150675429 [03/25/2022-13:24:46] [V] [TRT] Tactic: -8972697510150675429 Time: 0.084352 [03/25/2022-13:24:46] [V] [TRT] sections.2.5.conv1.module.weight + QuantizeLinear_690_quantize_scale_node + Conv_694 + Relu_696 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r1s1 Tactic: -8943710627305202139 [03/25/2022-13:24:46] [V] [TRT] Tactic: -8943710627305202139 Time: 0.093184 [03/25/2022-13:24:46] [V] [TRT] sections.2.5.conv1.module.weight + QuantizeLinear_690_quantize_scale_node + Conv_694 + Relu_696 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: -8859846367886814331 [03/25/2022-13:24:46] [V] [TRT] Tactic: -8859846367886814331 Time: 0.098048 [03/25/2022-13:24:46] [V] [TRT] sections.2.5.conv1.module.weight + QuantizeLinear_690_quantize_scale_node + Conv_694 + Relu_696 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: -8638624340850784688 [03/25/2022-13:24:46] [V] [TRT] Tactic: -8638624340850784688 Time: 0.082944 [03/25/2022-13:24:46] [V] [TRT] sections.2.5.conv1.module.weight + QuantizeLinear_690_quantize_scale_node + Conv_694 + Relu_696 Set Tactic Name: 
sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage4_warpsize4x1x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: -8556775352640313933 [03/25/2022-13:24:46] [V] [TRT] Tactic: -8556775352640313933 Time: 0.058752 [03/25/2022-13:24:46] [V] [TRT] sections.2.5.conv1.module.weight + QuantizeLinear_690_quantize_scale_node + Conv_694 + Relu_696 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage1_warpsize2x1x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: -8382298409581540699 [03/25/2022-13:24:46] [V] [TRT] Tactic: -8382298409581540699 Time: 0.135424 [03/25/2022-13:24:46] [V] [TRT] sections.2.5.conv1.module.weight + QuantizeLinear_690_quantize_scale_node + Conv_694 + Relu_696 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage1_warpsize4x2x1_g1_tensor8x8x16_t1r1s1 Tactic: -8172318747337038866 [03/25/2022-13:24:46] [V] [TRT] Tactic: -8172318747337038866 Time: 0.089344 [03/25/2022-13:24:46] [V] [TRT] sections.2.5.conv1.module.weight + QuantizeLinear_690_quantize_scale_node + Conv_694 + Relu_696 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x256x64_stage3_warpsize2x4x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -8038164441468184723 [03/25/2022-13:24:46] [V] [TRT] Tactic: -8038164441468184723 Time: 0.06016 [03/25/2022-13:24:46] [V] [TRT] sections.2.5.conv1.module.weight + QuantizeLinear_690_quantize_scale_node + Conv_694 + Relu_696 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x96x64_stage5_warpsize2x1x1_g1_tensor16x8x32_t1r1s1 Tactic: -7844028314176826857 [03/25/2022-13:24:46] [V] [TRT] Tactic: -7844028314176826857 Time: 0.11136 [03/25/2022-13:24:46] [V] [TRT] sections.2.5.conv1.module.weight + QuantizeLinear_690_quantize_scale_node + Conv_694 + Relu_696 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x128_stage3_warpsize2x2x1_g1_sptensor16x8x64_simple_t1r1s1_no_preds Tactic: -7674507941016740570 [03/25/2022-13:24:46] [V] [TRT] Tactic: -7674507941016740570 Time: 0.04928 [03/25/2022-13:24:46] [V] [TRT] sections.2.5.conv1.module.weight + QuantizeLinear_690_quantize_scale_node + Conv_694 + Relu_696 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage4_warpsize2x4x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -7364286662638617917 [03/25/2022-13:24:46] [V] [TRT] Tactic: -7364286662638617917 Time: 0.0544 [03/25/2022-13:24:46] [V] [TRT] sections.2.5.conv1.module.weight + QuantizeLinear_690_quantize_scale_node + Conv_694 + Relu_696 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage1_warpsize2x1x1_g1_tensor8x8x16_t1r1s1 Tactic: -7361755530333096258 [03/25/2022-13:24:46] [V] [TRT] Tactic: -7361755530333096258 Time: 0.108544 [03/25/2022-13:24:46] [V] [TRT] sections.2.5.conv1.module.weight + QuantizeLinear_690_quantize_scale_node + Conv_694 + Relu_696 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x128_stage3_warpsize1x4x1_g1_sptensor16x8x64_t1r1s1 Tactic: 
-7289760022626653388 [03/25/2022-13:24:46] [V] [TRT] Tactic: -7289760022626653388 Time: 0.062592 [03/25/2022-13:24:46] [V] [TRT] sections.2.5.conv1.module.weight + QuantizeLinear_690_quantize_scale_node + Conv_694 + Relu_696 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage1_warpsize2x4x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: -7106539943789766885 [03/25/2022-13:24:46] [V] [TRT] Tactic: -7106539943789766885 Time: 0.088064 [03/25/2022-13:24:46] [V] [TRT] sections.2.5.conv1.module.weight + QuantizeLinear_690_quantize_scale_node + Conv_694 + Relu_696 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage1_warpsize4x2x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: -6969478418607271266 [03/25/2022-13:24:46] [V] [TRT] Tactic: -6969478418607271266 Time: 0.088192 [03/25/2022-13:24:46] [V] [TRT] sections.2.5.conv1.module.weight + QuantizeLinear_690_quantize_scale_node + Conv_694 + Relu_696 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -6930438165437733000 [03/25/2022-13:24:46] [V] [TRT] Tactic: -6930438165437733000 Time: 0.140032 [03/25/2022-13:24:46] [V] [TRT] sections.2.5.conv1.module.weight + QuantizeLinear_690_quantize_scale_node + Conv_694 + Relu_696 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x64x64_stage3_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -6879607992933502380 [03/25/2022-13:24:46] [V] [TRT] Tactic: -6879607992933502380 Time: 0.063872 [03/25/2022-13:24:46] [V] [TRT] sections.2.5.conv1.module.weight + QuantizeLinear_690_quantize_scale_node + Conv_694 + Relu_696 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x128_stage3_warpsize1x2x1_g1_sptensor16x8x64_simple_t1r1s1_no_preds Tactic: -6839669803644810934 [03/25/2022-13:24:46] [V] [TRT] Tactic: -6839669803644810934 Time: 0.070656 [03/25/2022-13:24:46] [V] [TRT] sections.2.5.conv1.module.weight + QuantizeLinear_690_quantize_scale_node + Conv_694 + Relu_696 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x128_stage3_warpsize1x2x1_g1_sptensor16x8x64_t1r1s1_no_preds Tactic: -6812830108414456369 [03/25/2022-13:24:46] [V] [TRT] Tactic: -6812830108414456369 Time: 0.07104 [03/25/2022-13:24:46] [V] [TRT] sections.2.5.conv1.module.weight + QuantizeLinear_690_quantize_scale_node + Conv_694 + Relu_696 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x128_stage3_warpsize4x2x1_g1_sptensor16x8x64_simple_t1r1s1_no_preds Tactic: -6779804930216439173 [03/25/2022-13:24:46] [V] [TRT] Tactic: -6779804930216439173 Time: 0.046848 [03/25/2022-13:24:46] [V] [TRT] sections.2.5.conv1.module.weight + QuantizeLinear_690_quantize_scale_node + Conv_694 + Relu_696 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: -6527178416855951297 [03/25/2022-13:24:46] [V] [TRT] Tactic: -6527178416855951297 Time: 0.110464 [03/25/2022-13:24:46] [V] [TRT] 
sections.2.5.conv1.module.weight + QuantizeLinear_690_quantize_scale_node + Conv_694 + Relu_696 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_t1r1s1 Tactic: -6510232214299595844 [03/25/2022-13:24:46] [V] [TRT] Tactic: -6510232214299595844 Time: 0.111104 [03/25/2022-13:24:46] [V] [TRT] sections.2.5.conv1.module.weight + QuantizeLinear_690_quantize_scale_node + Conv_694 + Relu_696 Set Tactic Name: ampere_int8_i8816cudnn_int8_128x128_ldg16_relu_small_nt_v1 Tactic: -6400348606759295499 [03/25/2022-13:24:46] [V] [TRT] Tactic: -6400348606759295499 Time: 0.088832 [03/25/2022-13:24:46] [V] [TRT] sections.2.5.conv1.module.weight + QuantizeLinear_690_quantize_scale_node + Conv_694 + Relu_696 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage1_warpsize2x2x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: -6346247605026339453 [03/25/2022-13:24:46] [V] [TRT] Tactic: -6346247605026339453 Time: 0.09216 [03/25/2022-13:24:46] [V] [TRT] sections.2.5.conv1.module.weight + QuantizeLinear_690_quantize_scale_node + Conv_694 + Relu_696 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x96x64_stage4_warpsize4x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -6232597026469067819 [03/25/2022-13:24:46] [V] [TRT] Tactic: -6232597026469067819 Time: 0.079488 [03/25/2022-13:24:46] [V] [TRT] sections.2.5.conv1.module.weight + QuantizeLinear_690_quantize_scale_node + Conv_694 + Relu_696 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x128_ldg16_relu_small_nt_v1 Tactic: -5980889159865208399 [03/25/2022-13:24:46] [V] [TRT] Tactic: -5980889159865208399 Time: 0.091008 [03/25/2022-13:24:46] [V] [TRT] sections.2.5.conv1.module.weight + QuantizeLinear_690_quantize_scale_node + Conv_694 + Relu_696 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_singleBuffer_medium_nt_v1 Tactic: -5766140806760372989 [03/25/2022-13:24:46] [V] [TRT] Tactic: -5766140806760372989 Time: 0.093312 [03/25/2022-13:24:46] [V] [TRT] sections.2.5.conv1.module.weight + QuantizeLinear_690_quantize_scale_node + Conv_694 + Relu_696 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r1s1 Tactic: -5697614955743334137 [03/25/2022-13:24:46] [V] [TRT] Tactic: -5697614955743334137 Time: 0.086656 [03/25/2022-13:24:46] [V] [TRT] sections.2.5.conv1.module.weight + QuantizeLinear_690_quantize_scale_node + Conv_694 + Relu_696 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -5671123121710113970 [03/25/2022-13:24:46] [V] [TRT] Tactic: -5671123121710113970 Time: 0.087296 [03/25/2022-13:24:46] [V] [TRT] sections.2.5.conv1.module.weight + QuantizeLinear_690_quantize_scale_node + Conv_694 + Relu_696 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x32x64_stage4_warpsize2x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -5615581362569252260 [03/25/2022-13:24:46] [V] [TRT] Tactic: -5615581362569252260 Time: 0.095616 [03/25/2022-13:24:46] [V] [TRT] sections.2.5.conv1.module.weight + QuantizeLinear_690_quantize_scale_node + Conv_694 + Relu_696 
Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x128x64_stage3_warpsize4x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -5562968047117507056 [03/25/2022-13:24:46] [V] [TRT] Tactic: -5562968047117507056 Time: 0.062464 [03/25/2022-13:24:46] [V] [TRT] sections.2.5.conv1.module.weight + QuantizeLinear_690_quantize_scale_node + Conv_694 + Relu_696 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -5516472881360101487 [03/25/2022-13:24:46] [V] [TRT] Tactic: -5516472881360101487 Time: 0.082176 [03/25/2022-13:24:46] [V] [TRT] sections.2.5.conv1.module.weight + QuantizeLinear_690_quantize_scale_node + Conv_694 + Relu_696 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r1s1 Tactic: -5311474420963248369 [03/25/2022-13:24:46] [V] [TRT] Tactic: -5311474420963248369 Time: 0.119808 [03/25/2022-13:24:46] [V] [TRT] sections.2.5.conv1.module.weight + QuantizeLinear_690_quantize_scale_node + Conv_694 + Relu_696 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: -5170003087447722174 [03/25/2022-13:24:46] [V] [TRT] Tactic: -5170003087447722174 Time: 0.113536 [03/25/2022-13:24:46] [V] [TRT] sections.2.5.conv1.module.weight + QuantizeLinear_690_quantize_scale_node + Conv_694 + Relu_696 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage6_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -4889586143772361690 [03/25/2022-13:24:46] [V] [TRT] Tactic: -4889586143772361690 Time: 0.070528 [03/25/2022-13:24:46] [V] [TRT] sections.2.5.conv1.module.weight + QuantizeLinear_690_quantize_scale_node + Conv_694 + Relu_696 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x128_stage3_warpsize1x4x1_g1_sptensor16x8x64_simple_t1r1s1_no_preds Tactic: -4889498558023475527 [03/25/2022-13:24:46] [V] [TRT] Tactic: -4889498558023475527 Time: 0.061824 [03/25/2022-13:24:46] [V] [TRT] sections.2.5.conv1.module.weight + QuantizeLinear_690_quantize_scale_node + Conv_694 + Relu_696 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: -4849712423393454704 [03/25/2022-13:24:46] [V] [TRT] Tactic: -4849712423393454704 Time: 0.070272 [03/25/2022-13:24:46] [V] [TRT] sections.2.5.conv1.module.weight + QuantizeLinear_690_quantize_scale_node + Conv_694 + Relu_696 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x128_stage3_warpsize4x2x1_g1_sptensor16x8x64_simple_t1r1s1 Tactic: -4681913707320020520 [03/25/2022-13:24:46] [V] [TRT] Tactic: -4681913707320020520 Time: 0.046592 [03/25/2022-13:24:46] [V] [TRT] sections.2.5.conv1.module.weight + QuantizeLinear_690_quantize_scale_node + Conv_694 + Relu_696 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_medium_nt_v1 Tactic: -4516822589357530549 [03/25/2022-13:24:46] [V] [TRT] Tactic: -4516822589357530549 Time: 
0.09472 [03/25/2022-13:24:46] [V] [TRT] sections.2.5.conv1.module.weight + QuantizeLinear_690_quantize_scale_node + Conv_694 + Relu_696 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -4455415102719506646 [03/25/2022-13:24:46] [V] [TRT] Tactic: -4455415102719506646 Time: 0.080256 [03/25/2022-13:24:46] [V] [TRT] sections.2.5.conv1.module.weight + QuantizeLinear_690_quantize_scale_node + Conv_694 + Relu_696 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32_t1r1s1 Tactic: -4425346730823666456 [03/25/2022-13:24:46] [V] [TRT] Tactic: -4425346730823666456 Time: 0.085248 [03/25/2022-13:24:46] [V] [TRT] sections.2.5.conv1.module.weight + QuantizeLinear_690_quantize_scale_node + Conv_694 + Relu_696 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage1_warpsize2x1x1_g1_tensor8x8x16_t1r1s1 Tactic: -4260476497340370474 [03/25/2022-13:24:46] [V] [TRT] Tactic: -4260476497340370474 Time: 0.136064 [03/25/2022-13:24:46] [V] [TRT] sections.2.5.conv1.module.weight + QuantizeLinear_690_quantize_scale_node + Conv_694 + Relu_696 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x2x1_g1_tensor16x8x32_t1r1s1 Tactic: -4182501876984672402 [03/25/2022-13:24:46] [V] [TRT] Tactic: -4182501876984672402 Time: 0.086912 [03/25/2022-13:24:46] [V] [TRT] sections.2.5.conv1.module.weight + QuantizeLinear_690_quantize_scale_node + Conv_694 + Relu_696 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x256x64_stage3_warpsize2x4x1_g1_tensor16x8x32_t1r1s1 Tactic: -4151617293257698859 [03/25/2022-13:24:46] [V] [TRT] Tactic: -4151617293257698859 Time: 0.064768 [03/25/2022-13:24:46] [V] [TRT] sections.2.5.conv1.module.weight + QuantizeLinear_690_quantize_scale_node + Conv_694 + Relu_696 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x128_stage3_warpsize4x2x1_g1_sptensor16x8x64_t1r1s1_no_preds Tactic: -3862908719298381451 [03/25/2022-13:24:46] [V] [TRT] Tactic: -3862908719298381451 Time: 0.04736 [03/25/2022-13:24:46] [V] [TRT] sections.2.5.conv1.module.weight + QuantizeLinear_690_quantize_scale_node + Conv_694 + Relu_696 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -3825889760337461729 [03/25/2022-13:24:46] [V] [TRT] Tactic: -3825889760337461729 Time: 0.109056 [03/25/2022-13:24:46] [V] [TRT] sections.2.5.conv1.module.weight + QuantizeLinear_690_quantize_scale_node + Conv_694 + Relu_696 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -3797022944823726673 [03/25/2022-13:24:46] [V] [TRT] Tactic: -3797022944823726673 Time: 0.07936 [03/25/2022-13:24:46] [V] [TRT] sections.2.5.conv1.module.weight + QuantizeLinear_690_quantize_scale_node + Conv_694 + Relu_696 Set Tactic Name: 
sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_t1r1s1 Tactic: -3613322253849278738 [03/25/2022-13:24:46] [V] [TRT] Tactic: -3613322253849278738 Time: 0.140544 [03/25/2022-13:24:46] [V] [TRT] sections.2.5.conv1.module.weight + QuantizeLinear_690_quantize_scale_node + Conv_694 + Relu_696 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x32x64_stage4_warpsize4x1x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: -3577322188448771475 [03/25/2022-13:24:46] [V] [TRT] Tactic: -3577322188448771475 Time: 0.086912 [03/25/2022-13:24:46] [V] [TRT] sections.2.5.conv1.module.weight + QuantizeLinear_690_quantize_scale_node + Conv_694 + Relu_696 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage1_warpsize2x4x1_g1_tensor8x8x16_t1r1s1 Tactic: -3531681826488401618 [03/25/2022-13:24:46] [V] [TRT] Tactic: -3531681826488401618 Time: 0.08896 [03/25/2022-13:24:46] [V] [TRT] sections.2.5.conv1.module.weight + QuantizeLinear_690_quantize_scale_node + Conv_694 + Relu_696 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x16x128_stage3_warpsize1x1x1_g1_sptensor16x8x64_simple_t1r1s1_no_preds Tactic: -3305554949874552860 [03/25/2022-13:24:46] [V] [TRT] Tactic: -3305554949874552860 Time: 0.136832 [03/25/2022-13:24:46] [V] [TRT] sections.2.5.conv1.module.weight + QuantizeLinear_690_quantize_scale_node + Conv_694 + Relu_696 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x192x64_stage3_warpsize4x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -3288585994448820820 [03/25/2022-13:24:46] [V] [TRT] Tactic: -3288585994448820820 Time: 0.081024 [03/25/2022-13:24:46] [V] [TRT] sections.2.5.conv1.module.weight + QuantizeLinear_690_quantize_scale_node + Conv_694 + Relu_696 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x32x64_stage4_warpsize4x1x1_g1_tensor16x8x32_t1r1s1 Tactic: -2754311112012636251 [03/25/2022-13:24:46] [V] [TRT] Tactic: -2754311112012636251 Time: 0.08896 [03/25/2022-13:24:46] [V] [TRT] sections.2.5.conv1.module.weight + QuantizeLinear_690_quantize_scale_node + Conv_694 + Relu_696 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -2432868635536396215 [03/25/2022-13:24:46] [V] [TRT] Tactic: -2432868635536396215 Time: 0.081536 [03/25/2022-13:24:46] [V] [TRT] sections.2.5.conv1.module.weight + QuantizeLinear_690_quantize_scale_node + Conv_694 + Relu_696 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x96x64_stage5_warpsize2x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -2379804152300264660 [03/25/2022-13:24:46] [V] [TRT] Tactic: -2379804152300264660 Time: 0.1088 [03/25/2022-13:24:46] [V] [TRT] sections.2.5.conv1.module.weight + QuantizeLinear_690_quantize_scale_node + Conv_694 + Relu_696 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x128_stage3_warpsize2x2x1_g1_sptensor16x8x64_t1r1s1 Tactic: -2352253835013627337 
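
Each candidate shows up as a pair of entries: "Set Tactic Name: <kernel> Tactic: <id>" announces the kernel about to run, and the matching "Tactic: <id> Time: <ms>" reports its measured time in milliseconds. With hundreds of candidates per layer, a few lines of Python make the dump easier to audit; a sketch, assuming the verbose output was saved to a file (the name trtexec_verbose.log is a placeholder):

    import re
    from collections import defaultdict

    set_re = re.compile(r"Set Tactic Name: (\S+) Tactic: (-?\d+)")
    time_re = re.compile(r"Tactic: (-?\d+) Time: ([0-9.]+)")

    text = open("trtexec_verbose.log").read()
    name_for_id = {tid: name for name, tid in set_re.findall(text)}
    times = defaultdict(list)  # tactic id -> every measured time, in ms
    for tid, t in time_re.findall(text):
        times[tid].append(float(t))

    # Print the five fastest measurements. Tactic ids recur across layers,
    # so a per-layer report would additionally key on the node name.
    for tid in sorted(times, key=lambda t: min(times[t]))[:5]:
        print(f"{min(times[tid]):.6f} ms  tactic {tid}  {name_for_id.get(tid, '?')}")
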
[03/25/2022-13:24:46] [V] [TRT] Tactic: -2352253835013627337 Time: 0.05056 [03/25/2022-13:24:46] [V] [TRT] sections.2.5.conv1.module.weight + QuantizeLinear_690_quantize_scale_node + Conv_694 + Relu_696 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x96x64_stage4_warpsize4x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -2335587136911650799 [03/25/2022-13:24:46] [V] [TRT] Tactic: -2335587136911650799 Time: 0.073472 [03/25/2022-13:24:46] [V] [TRT] sections.2.5.conv1.module.weight + QuantizeLinear_690_quantize_scale_node + Conv_694 + Relu_696 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage4_warpsize2x4x1_g1_tensor16x8x32_t1r1s1 Tactic: -2315453944962430928 [03/25/2022-13:24:47] [V] [TRT] Tactic: -2315453944962430928 Time: 0.059392 [03/25/2022-13:24:47] [V] [TRT] sections.2.5.conv1.module.weight + QuantizeLinear_690_quantize_scale_node + Conv_694 + Relu_696 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x128_stage3_warpsize1x1x1_g1_sptensor16x8x64_simple_t1r1s1 Tactic: -2238364958919154661 [03/25/2022-13:24:47] [V] [TRT] Tactic: -2238364958919154661 Time: 0.09408 [03/25/2022-13:24:47] [V] [TRT] sections.2.5.conv1.module.weight + QuantizeLinear_690_quantize_scale_node + Conv_694 + Relu_696 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -1916483171117495388 [03/25/2022-13:24:47] [V] [TRT] Tactic: -1916483171117495388 Time: 0.089088 [03/25/2022-13:24:47] [V] [TRT] sections.2.5.conv1.module.weight + QuantizeLinear_690_quantize_scale_node + Conv_694 + Relu_696 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x16x128_stage3_warpsize1x1x1_g1_sptensor16x8x64_t1r1s1_no_preds Tactic: -1740762957710554518 [03/25/2022-13:24:47] [V] [TRT] Tactic: -1740762957710554518 Time: 0.137472 [03/25/2022-13:24:47] [V] [TRT] sections.2.5.conv1.module.weight + QuantizeLinear_690_quantize_scale_node + Conv_694 + Relu_696 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x96x64_stage4_warpsize4x2x1_g1_tensor16x8x32_t1r1s1 Tactic: -1549742793039499659 [03/25/2022-13:24:47] [V] [TRT] Tactic: -1549742793039499659 Time: 0.080128 [03/25/2022-13:24:47] [V] [TRT] sections.2.5.conv1.module.weight + QuantizeLinear_690_quantize_scale_node + Conv_694 + Relu_696 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: -1499578657823798783 [03/25/2022-13:24:47] [V] [TRT] Tactic: -1499578657823798783 Time: 0.08128 [03/25/2022-13:24:47] [V] [TRT] sections.2.5.conv1.module.weight + QuantizeLinear_690_quantize_scale_node + Conv_694 + Relu_696 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r1s1 Tactic: -1494157908358500249 [03/25/2022-13:24:47] [V] [TRT] Tactic: -1494157908358500249 Time: 0.098944 [03/25/2022-13:24:47] [V] [TRT] sections.2.5.conv1.module.weight + QuantizeLinear_690_quantize_scale_node + Conv_694 + 
Relu_696 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x4x1_g1_tensor16x8x32_t1r1s1 Tactic: -1328736756812546664 [03/25/2022-13:24:47] [V] [TRT] Tactic: -1328736756812546664 Time: 0.082048 [03/25/2022-13:24:47] [V] [TRT] sections.2.5.conv1.module.weight + QuantizeLinear_690_quantize_scale_node + Conv_694 + Relu_696 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x32x64_stage4_warpsize2x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -1006589727652607355 [03/25/2022-13:24:47] [V] [TRT] Tactic: -1006589727652607355 Time: 0.097024 [03/25/2022-13:24:47] [V] [TRT] sections.2.5.conv1.module.weight + QuantizeLinear_690_quantize_scale_node + Conv_694 + Relu_696 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: -713022856474991236 [03/25/2022-13:24:47] [V] [TRT] Tactic: -713022856474991236 Time: 0.142464 [03/25/2022-13:24:47] [V] [TRT] sections.2.5.conv1.module.weight + QuantizeLinear_690_quantize_scale_node + Conv_694 + Relu_696 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x128_stage3_warpsize1x1x1_g1_sptensor16x8x64_t1r1s1_no_preds Tactic: -619668460699260222 [03/25/2022-13:24:47] [V] [TRT] Tactic: -619668460699260222 Time: 0.093696 [03/25/2022-13:24:47] [V] [TRT] sections.2.5.conv1.module.weight + QuantizeLinear_690_quantize_scale_node + Conv_694 + Relu_696 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x64x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: -405554772060757402 [03/25/2022-13:24:47] [V] [TRT] Tactic: -405554772060757402 Time: 0.070016 [03/25/2022-13:24:47] [V] [TRT] sections.2.5.conv1.module.weight + QuantizeLinear_690_quantize_scale_node + Conv_694 + Relu_696 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: -375949437730908730 [03/25/2022-13:24:47] [V] [TRT] Tactic: -375949437730908730 Time: 0.068608 [03/25/2022-13:24:47] [V] [TRT] sections.2.5.conv1.module.weight + QuantizeLinear_690_quantize_scale_node + Conv_694 + Relu_696 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x128_stage3_warpsize1x2x1_g1_sptensor16x8x64_t1r1s1 Tactic: -233227833606287806 [03/25/2022-13:24:47] [V] [TRT] Tactic: -233227833606287806 Time: 0.071808 [03/25/2022-13:24:47] [V] [TRT] sections.2.5.conv1.module.weight + QuantizeLinear_690_quantize_scale_node + Conv_694 + Relu_696 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -111878368089469751 [03/25/2022-13:24:47] [V] [TRT] Tactic: -111878368089469751 Time: 0.080384 [03/25/2022-13:24:47] [V] [TRT] sections.2.5.conv1.module.weight + QuantizeLinear_690_quantize_scale_node + Conv_694 + Relu_696 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x64x64_stage3_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 
-48936598874722005 [03/25/2022-13:24:47] [V] [TRT] Tactic: -48936598874722005 Time: 0.061824 [03/25/2022-13:24:47] [V] [TRT] sections.2.5.conv1.module.weight + QuantizeLinear_690_quantize_scale_node + Conv_694 + Relu_696 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x32x64_stage4_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -19707840769375107 [03/25/2022-13:24:47] [V] [TRT] Tactic: -19707840769375107 Time: 0.086912 [03/25/2022-13:24:47] [V] [TRT] Fastest Tactic: -4681913707320020520 Time: 0.046592 [03/25/2022-13:24:47] [V] [TRT] >>>>>>>>>>>>>>> Chose Runner Type: CaskConvolution Tactic: -4681913707320020520 [03/25/2022-13:24:47] [V] [TRT] =============== Computing costs for [03/25/2022-13:24:47] [V] [TRT] *************** Autotuning format combination: Int8(12544,196:4,14,1) -> Int8(12544,196:4,14,1) *************** [03/25/2022-13:24:47] [V] [TRT] *************** Autotuning format combination: Int8(12544,196:4,14,1) -> Int8(1568,196:32,14,1) *************** [03/25/2022-13:24:47] [V] [TRT] *************** Autotuning format combination: Int8(1568,196:32,14,1) -> Int8(1568,196:32,14,1) *************** [03/25/2022-13:24:47] [V] [TRT] Skip timing cache hit (epiFadd mismatch) for node: sections.2.5.conv2.module.weight + QuantizeLinear_705_quantize_scale_node + Conv_709 + Relu_711, LayerImpl: CaskConvolution, tactic: 4414594337986714263 [03/25/2022-13:24:47] [V] [TRT] --------------- Timing Runner: sections.2.5.conv2.module.weight + QuantizeLinear_705_quantize_scale_node + Conv_709 + Relu_711 (CudaGroupConvolution) [03/25/2022-13:24:47] [V] [TRT] CudaGroupConvolution has no valid tactics for this config, skipping [03/25/2022-13:24:47] [V] [TRT] --------------- Timing Runner: sections.2.5.conv2.module.weight + QuantizeLinear_705_quantize_scale_node + Conv_709 + Relu_711 (CudaDepthwiseConvolution) [03/25/2022-13:24:47] [V] [TRT] CudaDepthwiseConvolution has no valid tactics for this config, skipping [03/25/2022-13:24:47] [V] [TRT] --------------- Timing Runner: sections.2.5.conv2.module.weight + QuantizeLinear_705_quantize_scale_node + Conv_709 + Relu_711 (FusedConvActConvolution) [03/25/2022-13:24:47] [V] [TRT] FusedConvActConvolution has no valid tactics for this config, skipping [03/25/2022-13:24:47] [V] [TRT] --------------- Timing Runner: sections.2.5.conv2.module.weight + QuantizeLinear_705_quantize_scale_node + Conv_709 + Relu_711 (CaskConvolution) [03/25/2022-13:24:47] [V] [TRT] sections.2.5.conv2.module.weight + QuantizeLinear_705_quantize_scale_node + Conv_709 + Relu_711 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_epifadd Tactic: 177040020707947851 [03/25/2022-13:24:47] [V] [TRT] Tactic: 177040020707947851 Time: 0.188928 [03/25/2022-13:24:47] [V] [TRT] sections.2.5.conv2.module.weight + QuantizeLinear_705_quantize_scale_node + Conv_709 + Relu_711 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r3s3 Tactic: 184229963126259101 [03/25/2022-13:24:47] [V] [TRT] Tactic: 184229963126259101 Time: 0.134272 [03/25/2022-13:24:47] [V] [TRT] sections.2.5.conv2.module.weight + QuantizeLinear_705_quantize_scale_node + Conv_709 + Relu_711 Set Tactic Name: 
sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage1_warpsize4x2x1_g1_tensor8x8x16_t1r3s3 Tactic: 289888059097454627 [03/25/2022-13:24:47] [V] [TRT] Tactic: 289888059097454627 Time: 0.167808 [03/25/2022-13:24:47] [V] [TRT] sections.2.5.conv2.module.weight + QuantizeLinear_705_quantize_scale_node + Conv_709 + Relu_711 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32 Tactic: 328135613486708155 [03/25/2022-13:24:47] [V] [TRT] Tactic: 328135613486708155 Time: 0.276224 [03/25/2022-13:24:47] [V] [TRT] sections.2.5.conv2.module.weight + QuantizeLinear_705_quantize_scale_node + Conv_709 + Relu_711 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r3s3 Tactic: 680740992583869928 [03/25/2022-13:24:47] [V] [TRT] Tactic: 680740992583869928 Time: 0.173952 [03/25/2022-13:24:47] [V] [TRT] sections.2.5.conv2.module.weight + QuantizeLinear_705_quantize_scale_node + Conv_709 + Relu_711 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x192x64_stage3_warpsize4x2x1_g1_tensor16x8x32 Tactic: 1111159740952609683 [03/25/2022-13:24:47] [V] [TRT] Tactic: 1111159740952609683 Time: 0.144384 [03/25/2022-13:24:47] [V] [TRT] sections.2.5.conv2.module.weight + QuantizeLinear_705_quantize_scale_node + Conv_709 + Relu_711 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32 Tactic: 1134860903395928905 [03/25/2022-13:24:47] [V] [TRT] Tactic: 1134860903395928905 Time: 0.130816 [03/25/2022-13:24:47] [V] [TRT] sections.2.5.conv2.module.weight + QuantizeLinear_705_quantize_scale_node + Conv_709 + Relu_711 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32 Tactic: 1276591930377039442 [03/25/2022-13:24:47] [V] [TRT] Tactic: 1276591930377039442 Time: 0.139648 [03/25/2022-13:24:47] [V] [TRT] sections.2.5.conv2.module.weight + QuantizeLinear_705_quantize_scale_node + Conv_709 + Relu_711 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage1_warpsize2x1x1_g1_tensor8x8x16_t1r3s3 Tactic: 1388866374720163187 [03/25/2022-13:24:47] [V] [TRT] Tactic: 1388866374720163187 Time: 0.204288 [03/25/2022-13:24:47] [V] [TRT] sections.2.5.conv2.module.weight + QuantizeLinear_705_quantize_scale_node + Conv_709 + Relu_711 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x96x64_stage5_warpsize2x1x1_g1_tensor16x8x32 Tactic: 1399501420456320585 [03/25/2022-13:24:47] [V] [TRT] Tactic: 1399501420456320585 Time: 0.171008 [03/25/2022-13:24:47] [V] [TRT] sections.2.5.conv2.module.weight + QuantizeLinear_705_quantize_scale_node + Conv_709 + Relu_711 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_epifadd Tactic: 1550399266192842845 [03/25/2022-13:24:47] [V] [TRT] Tactic: 1550399266192842845 Time: 0.173568 [03/25/2022-13:24:47] [V] [TRT] 
sections.2.5.conv2.module.weight + QuantizeLinear_705_quantize_scale_node + Conv_709 + Relu_711 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x32x64_stage4_warpsize4x1x1_g1_tensor16x8x32_t1r3s3_epifadd Tactic: 1572887561103143487 [03/25/2022-13:24:47] [V] [TRT] Tactic: 1572887561103143487 Time: 0.130944 [03/25/2022-13:24:47] [V] [TRT] sections.2.5.conv2.module.weight + QuantizeLinear_705_quantize_scale_node + Conv_709 + Relu_711 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x96x64_stage5_warpsize2x1x1_g1_tensor16x8x32_t1r3s3 Tactic: 1853122447892949466 [03/25/2022-13:24:47] [V] [TRT] Tactic: 1853122447892949466 Time: 0.17088 [03/25/2022-13:24:47] [V] [TRT] sections.2.5.conv2.module.weight + QuantizeLinear_705_quantize_scale_node + Conv_709 + Relu_711 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage1_warpsize2x2x1_g1_tensor8x8x16 Tactic: 2133329569091732311 [03/25/2022-13:24:47] [V] [TRT] Tactic: 2133329569091732311 Time: 0.169088 [03/25/2022-13:24:47] [V] [TRT] sections.2.5.conv2.module.weight + QuantizeLinear_705_quantize_scale_node + Conv_709 + Relu_711 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage4_warpsize4x1x1_g1_tensor16x8x32 Tactic: 2325023763229477890 [03/25/2022-13:24:47] [V] [TRT] Tactic: 2325023763229477890 Time: 0.094976 [03/25/2022-13:24:47] [V] [TRT] sections.2.5.conv2.module.weight + QuantizeLinear_705_quantize_scale_node + Conv_709 + Relu_711 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16 Tactic: 2579824863892891529 [03/25/2022-13:24:47] [V] [TRT] Tactic: 2579824863892891529 Time: 0.2336 [03/25/2022-13:24:47] [V] [TRT] sections.2.5.conv2.module.weight + QuantizeLinear_705_quantize_scale_node + Conv_709 + Relu_711 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x64x64_stage3_warpsize4x1x1_g1_tensor16x8x32 Tactic: 2783960536172159663 [03/25/2022-13:24:47] [V] [TRT] Tactic: 2783960536172159663 Time: 0.092928 [03/25/2022-13:24:47] [V] [TRT] sections.2.5.conv2.module.weight + QuantizeLinear_705_quantize_scale_node + Conv_709 + Relu_711 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x2x1_g1_tensor16x8x32 Tactic: 2821711838552913693 [03/25/2022-13:24:47] [V] [TRT] Tactic: 2821711838552913693 Time: 0.137344 [03/25/2022-13:24:47] [V] [TRT] sections.2.5.conv2.module.weight + QuantizeLinear_705_quantize_scale_node + Conv_709 + Relu_711 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x128_stage3_warpsize1x2x1_g1_sptensor16x8x64 Tactic: 2945009978756227538 [03/25/2022-13:24:47] [V] [TRT] Tactic: 2945009978756227538 Time: 0.098432 [03/25/2022-13:24:47] [V] [TRT] sections.2.5.conv2.module.weight + QuantizeLinear_705_quantize_scale_node + Conv_709 + Relu_711 Set Tactic Name: ampere_int8_i8816cudnn_int8_128x128_ldg16_relu_medium_nt_v1 Tactic: 2985940154541537814 [03/25/2022-13:24:47] [V] [TRT] Tactic: 2985940154541537814 Time: 0.174336 [03/25/2022-13:24:47] 
[V] [TRT] sections.2.5.conv2.module.weight + QuantizeLinear_705_quantize_scale_node + Conv_709 + Relu_711 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_t1r3s3_epifadd Tactic: 3284282970967328046 [03/25/2022-13:24:47] [V] [TRT] Tactic: 3284282970967328046 Time: 0.2016 [03/25/2022-13:24:47] [V] [TRT] sections.2.5.conv2.module.weight + QuantizeLinear_705_quantize_scale_node + Conv_709 + Relu_711 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_t1r3s3 Tactic: 3401614690060226673 [03/25/2022-13:24:47] [V] [TRT] Tactic: 3401614690060226673 Time: 0.169728 [03/25/2022-13:24:47] [V] [TRT] sections.2.5.conv2.module.weight + QuantizeLinear_705_quantize_scale_node + Conv_709 + Relu_711 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32 Tactic: 3456719996792527006 [03/25/2022-13:24:47] [V] [TRT] Tactic: 3456719996792527006 Time: 0.123776 [03/25/2022-13:24:47] [V] [TRT] sections.2.5.conv2.module.weight + QuantizeLinear_705_quantize_scale_node + Conv_709 + Relu_711 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r3s3_epifadd Tactic: 3512426920013359699 [03/25/2022-13:24:47] [V] [TRT] Tactic: 3512426920013359699 Time: 0.127488 [03/25/2022-13:24:47] [V] [TRT] sections.2.5.conv2.module.weight + QuantizeLinear_705_quantize_scale_node + Conv_709 + Relu_711 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x128_stage3_warpsize4x2x1_g1_sptensor16x8x64 Tactic: 3651043333819148268 [03/25/2022-13:24:47] [V] [TRT] Tactic: 3651043333819148268 Time: 0.07808 [03/25/2022-13:24:47] [V] [TRT] sections.2.5.conv2.module.weight + QuantizeLinear_705_quantize_scale_node + Conv_709 + Relu_711 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x128_ldg16_relu_medium_nt_v1 Tactic: 3899284354987683408 [03/25/2022-13:24:47] [V] [TRT] Tactic: 3899284354987683408 Time: 0.175872 [03/25/2022-13:24:47] [V] [TRT] sections.2.5.conv2.module.weight + QuantizeLinear_705_quantize_scale_node + Conv_709 + Relu_711 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage6_warpsize2x2x1_g1_tensor16x8x32 Tactic: 4042202769383439184 [03/25/2022-13:24:47] [V] [TRT] Tactic: 4042202769383439184 Time: 0.105984 [03/25/2022-13:24:47] [V] [TRT] sections.2.5.conv2.module.weight + QuantizeLinear_705_quantize_scale_node + Conv_709 + Relu_711 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_large_nt_v1 Tactic: 4182625619810185112 [03/25/2022-13:24:47] [V] [TRT] Tactic: 4182625619810185112 Time: 0.188672 [03/25/2022-13:24:47] [V] [TRT] sections.2.5.conv2.module.weight + QuantizeLinear_705_quantize_scale_node + Conv_709 + Relu_711 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r3s3 Tactic: 4214794893922618058 [03/25/2022-13:24:47] [V] [TRT] Tactic: 4214794893922618058 Time: 0.16704 [03/25/2022-13:24:47] [V] [TRT] sections.2.5.conv2.module.weight + 
QuantizeLinear_705_quantize_scale_node + Conv_709 + Relu_711 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x32x64_stage4_warpsize4x1x1_g1_tensor16x8x32_epifadd Tactic: 4259547356717612415 [03/25/2022-13:24:47] [V] [TRT] Tactic: 4259547356717612415 Time: 0.134272 [03/25/2022-13:24:47] [V] [TRT] sections.2.5.conv2.module.weight + QuantizeLinear_705_quantize_scale_node + Conv_709 + Relu_711 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r3s3 Tactic: 4384868749799132354 [03/25/2022-13:24:47] [V] [TRT] Tactic: 4384868749799132354 Time: 0.233728 [03/25/2022-13:24:47] [V] [TRT] sections.2.5.conv2.module.weight + QuantizeLinear_705_quantize_scale_node + Conv_709 + Relu_711 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x128_stage3_warpsize2x2x1_g1_sptensor16x8x64 Tactic: 4414594337986714263 [03/25/2022-13:24:47] [V] [TRT] Tactic: 4414594337986714263 Time: 0.077824 [03/25/2022-13:24:47] [V] [TRT] sections.2.5.conv2.module.weight + QuantizeLinear_705_quantize_scale_node + Conv_709 + Relu_711 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_small_nt_v1 Tactic: 4717285412741024953 [03/25/2022-13:24:47] [V] [TRT] Tactic: 4717285412741024953 Time: 0.172032 [03/25/2022-13:24:47] [V] [TRT] sections.2.5.conv2.module.weight + QuantizeLinear_705_quantize_scale_node + Conv_709 + Relu_711 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage4_warpsize2x4x1_g1_tensor16x8x32 Tactic: 4734519122557206480 [03/25/2022-13:24:47] [V] [TRT] Tactic: 4734519122557206480 Time: 0.09728 [03/25/2022-13:24:47] [V] [TRT] sections.2.5.conv2.module.weight + QuantizeLinear_705_quantize_scale_node + Conv_709 + Relu_711 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32 Tactic: 4922297020351187339 [03/25/2022-13:24:47] [V] [TRT] Tactic: 4922297020351187339 Time: 0.14208 [03/25/2022-13:24:47] [V] [TRT] sections.2.5.conv2.module.weight + QuantizeLinear_705_quantize_scale_node + Conv_709 + Relu_711 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage1_warpsize2x4x1_g1_tensor8x8x16_t1r3s3 Tactic: 4931167631624420067 [03/25/2022-13:24:47] [V] [TRT] Tactic: 4931167631624420067 Time: 0.168832 [03/25/2022-13:24:47] [V] [TRT] sections.2.5.conv2.module.weight + QuantizeLinear_705_quantize_scale_node + Conv_709 + Relu_711 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage4_warpsize2x4x1_g1_tensor16x8x32_epifadd Tactic: 5121596860264626879 [03/25/2022-13:24:47] [V] [TRT] Tactic: 5121596860264626879 Time: 0.093824 [03/25/2022-13:24:47] [V] [TRT] sections.2.5.conv2.module.weight + QuantizeLinear_705_quantize_scale_node + Conv_709 + Relu_711 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_t1r3s3 Tactic: 5136656982162849059 [03/25/2022-13:24:47] [V] [TRT] Tactic: 5136656982162849059 Time: 0.201856 [03/25/2022-13:24:47] [V] [TRT] 
sections.2.5.conv2.module.weight + QuantizeLinear_705_quantize_scale_node + Conv_709 + Relu_711 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage6_warpsize2x2x1_g1_tensor16x8x32_epifadd Tactic: 5158259316594207439 [03/25/2022-13:24:47] [V] [TRT] Tactic: 5158259316594207439 Time: 0.105472 [03/25/2022-13:24:47] [V] [TRT] sections.2.5.conv2.module.weight + QuantizeLinear_705_quantize_scale_node + Conv_709 + Relu_711 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32_t1r3s3 Tactic: 5189825015507701541 [03/25/2022-13:24:47] [V] [TRT] Tactic: 5189825015507701541 Time: 0.300032 [03/25/2022-13:24:47] [V] [TRT] sections.2.5.conv2.module.weight + QuantizeLinear_705_quantize_scale_node + Conv_709 + Relu_711 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x32x64_stage4_warpsize2x1x1_g1_tensor16x8x32 Tactic: 5424417905073460656 [03/25/2022-13:24:47] [V] [TRT] Tactic: 5424417905073460656 Time: 0.141312 [03/25/2022-13:24:47] [V] [TRT] sections.2.5.conv2.module.weight + QuantizeLinear_705_quantize_scale_node + Conv_709 + Relu_711 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x256x64_stage3_warpsize2x4x1_g1_tensor16x8x32 Tactic: 5442043907221427810 [03/25/2022-13:24:47] [V] [TRT] Tactic: 5442043907221427810 Time: 0.110336 [03/25/2022-13:24:47] [V] [TRT] sections.2.5.conv2.module.weight + QuantizeLinear_705_quantize_scale_node + Conv_709 + Relu_711 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x256x64_stage3_warpsize2x4x1_g1_tensor16x8x32_t1r3s3 Tactic: 5544365258913999384 [03/25/2022-13:24:47] [V] [TRT] Tactic: 5544365258913999384 Time: 0.107904 [03/25/2022-13:24:47] [V] [TRT] sections.2.5.conv2.module.weight + QuantizeLinear_705_quantize_scale_node + Conv_709 + Relu_711 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage1_warpsize2x1x1_g1_tensor8x8x16_t1r3s3 Tactic: 5641967928706599451 [03/25/2022-13:24:47] [V] [TRT] Tactic: 5641967928706599451 Time: 0.27072 [03/25/2022-13:24:47] [V] [TRT] sections.2.5.conv2.module.weight + QuantizeLinear_705_quantize_scale_node + Conv_709 + Relu_711 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x2x1_g1_tensor16x8x32_t1r3s3 Tactic: 5721595115357140131 [03/25/2022-13:24:47] [V] [TRT] Tactic: 5721595115357140131 Time: 0.134144 [03/25/2022-13:24:47] [V] [TRT] sections.2.5.conv2.module.weight + QuantizeLinear_705_quantize_scale_node + Conv_709 + Relu_711 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage4_warpsize4x1x1_g1_tensor16x8x32_t1r3s3_epifadd Tactic: 5966973378912044513 [03/25/2022-13:24:47] [V] [TRT] Tactic: 5966973378912044513 Time: 0.09344 [03/25/2022-13:24:47] [V] [TRT] sections.2.5.conv2.module.weight + QuantizeLinear_705_quantize_scale_node + Conv_709 + Relu_711 Set Tactic Name: 
sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x32x64_stage4_warpsize4x1x1_g1_tensor16x8x32_t1r3s3 Tactic: 6004789655466615912 [03/25/2022-13:24:47] [V] [TRT] Tactic: 6004789655466615912 Time: 0.131072 [03/25/2022-13:24:47] [V] [TRT] sections.2.5.conv2.module.weight + QuantizeLinear_705_quantize_scale_node + Conv_709 + Relu_711 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage4_warpsize2x4x1_g1_tensor16x8x32_t1r3s3 Tactic: 6146901278630392829 [03/25/2022-13:24:47] [V] [TRT] Tactic: 6146901278630392829 Time: 0.097152 [03/25/2022-13:24:47] [V] [TRT] sections.2.5.conv2.module.weight + QuantizeLinear_705_quantize_scale_node + Conv_709 + Relu_711 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage1_warpsize2x1x1_g1_tensor8x8x16 Tactic: 6394572396369862482 [03/25/2022-13:24:47] [V] [TRT] Tactic: 6394572396369862482 Time: 0.272512 [03/25/2022-13:24:47] [V] [TRT] sections.2.5.conv2.module.weight + QuantizeLinear_705_quantize_scale_node + Conv_709 + Relu_711 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage4_warpsize4x2x1_g1_tensor16x8x32_epifadd Tactic: 6434020722187266170 [03/25/2022-13:24:47] [V] [TRT] Tactic: 6434020722187266170 Time: 0.09216 [03/25/2022-13:24:47] [V] [TRT] sections.2.5.conv2.module.weight + QuantizeLinear_705_quantize_scale_node + Conv_709 + Relu_711 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32 Tactic: 6781129591847482048 [03/25/2022-13:24:47] [V] [TRT] Tactic: 6781129591847482048 Time: 0.109952 [03/25/2022-13:24:47] [V] [TRT] sections.2.5.conv2.module.weight + QuantizeLinear_705_quantize_scale_node + Conv_709 + Relu_711 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x96x64_stage4_warpsize4x2x1_g1_tensor16x8x32_t1r3s3 Tactic: 6984451771200230840 [03/25/2022-13:24:47] [V] [TRT] Tactic: 6984451771200230840 Time: 0.14144 [03/25/2022-13:24:47] [V] [TRT] sections.2.5.conv2.module.weight + QuantizeLinear_705_quantize_scale_node + Conv_709 + Relu_711 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r3s3 Tactic: 7048234086361926570 [03/25/2022-13:24:47] [V] [TRT] Tactic: 7048234086361926570 Time: 0.179712 [03/25/2022-13:24:47] [V] [TRT] sections.2.5.conv2.module.weight + QuantizeLinear_705_quantize_scale_node + Conv_709 + Relu_711 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x128x64_stage3_warpsize4x2x1_g1_tensor16x8x32 Tactic: 7077570591813340966 [03/25/2022-13:24:47] [V] [TRT] Tactic: 7077570591813340966 Time: 0.106752 [03/25/2022-13:24:47] [V] [TRT] sections.2.5.conv2.module.weight + QuantizeLinear_705_quantize_scale_node + Conv_709 + Relu_711 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r3s3_epifadd Tactic: 7191893591576074000 [03/25/2022-13:24:47] [V] [TRT] Tactic: 7191893591576074000 Time: 0.17024 
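
Note which kernels are winning: for the conv1 node above, the fastest tactic (-4681913707320020520, 0.046592 ms, the sparse sptensor16x8x64 kernel with tile size 256x128x128) beat every dense tensor16x8x32 candidate shown, and in the conv2 timings so far the sparse kernels lead again (0.077824 ms against roughly 0.092 ms for the best dense entries). These tactics require the weights to be 2:4 structured-sparse, i.e. at most two nonzeros in every group of four along the reduction dimension, which is what the A100's sparse tensor cores accelerate. A PyTorch sketch of the pattern itself (illustrative only: the model behind this log was presumably pruned during training, e.g. with NVIDIA's ASP tooling, before ONNX export, and the axis grouped in fours must match the one the kernels reduce over; this sketch simply groups along the trailing axis):

    import torch

    def two_four_mask(w: torch.Tensor) -> torch.Tensor:
        """Boolean mask keeping the 2 largest-magnitude weights per group of 4."""
        groups = w.abs().reshape(-1, 4)   # requires w.numel() % 4 == 0
        keep = torch.zeros_like(groups, dtype=torch.bool)
        keep.scatter_(1, groups.topk(2, dim=1).indices, True)
        return keep.reshape(w.shape)

    w = torch.randn(512, 256)             # stand-in for a conv/FC weight tensor
    w_pruned = w * two_four_mask(w)
    assert int((w_pruned.reshape(-1, 4) != 0).sum(dim=1).max()) <= 2
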
[03/25/2022-13:24:47] [V] [TRT] sections.2.5.conv2.module.weight + QuantizeLinear_705_quantize_scale_node + Conv_709 + Relu_711 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x4x1_g1_tensor16x8x32_t1r3s3 Tactic: 7429976449747682901
[03/25/2022-13:24:47] [V] [TRT] Tactic: 7429976449747682901 Time: 0.125568
[03/25/2022-13:24:47] [V] [TRT] sections.2.5.conv2.module.weight + QuantizeLinear_705_quantize_scale_node + Conv_709 + Relu_711 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r3s3_epifadd Tactic: 7438984192263206338
[03/25/2022-13:24:47] [V] [TRT] Tactic: 7438984192263206338 Time: 0.103424
[03/25/2022-13:24:47] [V] [TRT] sections.2.5.conv2.module.weight + QuantizeLinear_705_quantize_scale_node + Conv_709 + Relu_711 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32_epifadd Tactic: 7504901284678552178
[03/25/2022-13:24:47] [V] [TRT] Tactic: 7504901284678552178 Time: 0.092544
[03/25/2022-13:24:47] [V] [TRT] sections.2.5.conv2.module.weight + QuantizeLinear_705_quantize_scale_node + Conv_709 + Relu_711 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r3s3 Tactic: 8096257414008860171
[03/25/2022-13:24:47] [V] [TRT] Tactic: 8096257414008860171 Time: 0.10752
[03/25/2022-13:24:47] [V] [TRT] sections.2.5.conv2.module.weight + QuantizeLinear_705_quantize_scale_node + Conv_709 + Relu_711 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x64x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r3s3 Tactic: 8128112048355596715
[03/25/2022-13:24:47] [V] [TRT] Tactic: 8128112048355596715 Time: 0.107008
[03/25/2022-13:24:47] [V] [TRT] sections.2.5.conv2.module.weight + QuantizeLinear_705_quantize_scale_node + Conv_709 + Relu_711 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_indexed_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32_epifadd Tactic: 8751622450593766232
[03/25/2022-13:24:47] [V] [TRT] Tactic: 8751622450593766232 Time: 0.099968
[03/25/2022-13:24:47] [V] [TRT] sections.2.5.conv2.module.weight + QuantizeLinear_705_quantize_scale_node + Conv_709 + Relu_711 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_indexed_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32 Tactic: 9064458886956700976
[03/25/2022-13:24:47] [V] [TRT] Tactic: 9064458886956700976 Time: 0.101248
[03/25/2022-13:24:47] [V] [TRT] sections.2.5.conv2.module.weight + QuantizeLinear_705_quantize_scale_node + Conv_709 + Relu_711 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_t1r3s3_epifadd Tactic: 9143438935315839085
[03/25/2022-13:24:47] [V] [TRT] Tactic: 9143438935315839085 Time: 0.167552
[03/25/2022-13:24:47] [V] [TRT] sections.2.5.conv2.module.weight + QuantizeLinear_705_quantize_scale_node + Conv_709 + Relu_711 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage4_warpsize4x2x1_g1_tensor16x8x32 Tactic: -9165697322068360861
[03/25/2022-13:24:47] [V] [TRT] Tactic: -9165697322068360861 Time: 0.095872
[03/25/2022-13:24:47] [V] [TRT] sections.2.5.conv2.module.weight + QuantizeLinear_705_quantize_scale_node + Conv_709 + Relu_711 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_singleBuffer_small_nt_v1 Tactic: -9118785798277698619
[03/25/2022-13:24:47] [V] [TRT] Tactic: -9118785798277698619 Time: 0.166784
[03/25/2022-13:24:47] [V] [TRT] sections.2.5.conv2.module.weight + QuantizeLinear_705_quantize_scale_node + Conv_709 + Relu_711 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16 Tactic: -9108166971364503411
[03/25/2022-13:24:47] [V] [TRT] Tactic: -9108166971364503411 Time: 0.182144
[03/25/2022-13:24:47] [V] [TRT] sections.2.5.conv2.module.weight + QuantizeLinear_705_quantize_scale_node + Conv_709 + Relu_711 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage1_warpsize4x2x1_g1_tensor8x8x16 Tactic: -8861822316054763526
[03/25/2022-13:24:47] [V] [TRT] Tactic: -8861822316054763526 Time: 0.169856
[03/25/2022-13:24:47] [V] [TRT] sections.2.5.conv2.module.weight + QuantizeLinear_705_quantize_scale_node + Conv_709 + Relu_711 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x128_stage3_warpsize1x1x1_g1_sptensor16x8x64 Tactic: -8791277710877987710
[03/25/2022-13:24:47] [V] [TRT] Tactic: -8791277710877987710 Time: 0.125824
[03/25/2022-13:24:47] [V] [TRT] sections.2.5.conv2.module.weight + QuantizeLinear_705_quantize_scale_node + Conv_709 + Relu_711 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x128x64_stage3_warpsize2x2x1_g1_tensor16x8x32 Tactic: -8691377209893505057
[03/25/2022-13:24:47] [V] [TRT] Tactic: -8691377209893505057 Time: 0.093184
[03/25/2022-13:24:47] [V] [TRT] sections.2.5.conv2.module.weight + QuantizeLinear_705_quantize_scale_node + Conv_709 + Relu_711 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32 Tactic: -8520292213102999339
[03/25/2022-13:24:47] [V] [TRT] Tactic: -8520292213102999339 Time: 0.142592
[03/25/2022-13:24:47] [V] [TRT] sections.2.5.conv2.module.weight + QuantizeLinear_705_quantize_scale_node + Conv_709 + Relu_711 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage1_warpsize4x1x1_g1_tensor8x8x16_t1r3s3 Tactic: -8475551154769412306
[03/25/2022-13:24:47] [V] [TRT] Tactic: -8475551154769412306 Time: 0.171776
[03/25/2022-13:24:47] [V] [TRT] sections.2.5.conv2.module.weight + QuantizeLinear_705_quantize_scale_node + Conv_709 + Relu_711 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x192x64_stage3_warpsize4x2x1_g1_tensor16x8x32_t1r3s3 Tactic: -8417388128970254446
[03/25/2022-13:24:47] [V] [TRT] Tactic: -8417388128970254446 Time: 0.14208
[03/25/2022-13:24:47] [V] [TRT] sections.2.5.conv2.module.weight + QuantizeLinear_705_quantize_scale_node + Conv_709 + Relu_711 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32 Tactic: -8263994888336646547
[03/25/2022-13:24:47] [V] [TRT] Tactic: -8263994888336646547 Time: 0.093056
[03/25/2022-13:24:47] [V] [TRT] sections.2.5.conv2.module.weight + QuantizeLinear_705_quantize_scale_node + Conv_709 + Relu_711 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32 Tactic: -8205948405243401049
[03/25/2022-13:24:47] [V] [TRT] Tactic: -8205948405243401049 Time: 0.173952
[03/25/2022-13:24:47] [V] [TRT] sections.2.5.conv2.module.weight + QuantizeLinear_705_quantize_scale_node + Conv_709 + Relu_711 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r3s3_epifadd Tactic: -7992068592656168418
[03/25/2022-13:24:47] [V] [TRT] Tactic: -7992068592656168418 Time: 0.106752
[03/25/2022-13:24:47] [V] [TRT] sections.2.5.conv2.module.weight + QuantizeLinear_705_quantize_scale_node + Conv_709 + Relu_711 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x32x64_stage4_warpsize2x1x1_g1_tensor16x8x32_t1r3s3 Tactic: -7898477046581738867
[03/25/2022-13:24:47] [V] [TRT] Tactic: -7898477046581738867 Time: 0.134784
[03/25/2022-13:24:47] [V] [TRT] sections.2.5.conv2.module.weight + QuantizeLinear_705_quantize_scale_node + Conv_709 + Relu_711 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage4_warpsize4x1x1_g1_tensor16x8x32_epifadd Tactic: -7842775553137511386
[03/25/2022-13:24:47] [V] [TRT] Tactic: -7842775553137511386 Time: 0.094336
[03/25/2022-13:24:47] [V] [TRT] sections.2.5.conv2.module.weight + QuantizeLinear_705_quantize_scale_node + Conv_709 + Relu_711 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32 Tactic: -7683887278997527517
[03/25/2022-13:24:47] [V] [TRT] Tactic: -7683887278997527517 Time: 0.158464
[03/25/2022-13:24:47] [V] [TRT] sections.2.5.conv2.module.weight + QuantizeLinear_705_quantize_scale_node + Conv_709 + Relu_711 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x64x64_stage3_warpsize2x2x1_g1_tensor16x8x32 Tactic: -7381370635708568663
[03/25/2022-13:24:47] [V] [TRT] Tactic: -7381370635708568663 Time: 0.108672
[03/25/2022-13:24:47] [V] [TRT] sections.2.5.conv2.module.weight + QuantizeLinear_705_quantize_scale_node + Conv_709 + Relu_711 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r3s3 Tactic: -7129320389887881029
[03/25/2022-13:24:47] [V] [TRT] Tactic: -7129320389887881029 Time: 0.127744
[03/25/2022-13:24:47] [V] [TRT] sections.2.5.conv2.module.weight + QuantizeLinear_705_quantize_scale_node + Conv_709 + Relu_711 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r3s3 Tactic: -6959995514028471820
[03/25/2022-13:24:47] [V] [TRT] Tactic: -6959995514028471820 Time: 0.150016
[03/25/2022-13:24:47] [V] [TRT] sections.2.5.conv2.module.weight + QuantizeLinear_705_quantize_scale_node + Conv_709 + Relu_711 Set Tactic Name: ampere_int8_i8816cudnn_int8_128x128_ldg16_relu_small_nt_v1 Tactic: -6400348606759295499
[03/25/2022-13:24:47] [V] [TRT] Tactic: -6400348606759295499 Time: 0.169344
[03/25/2022-13:24:47] [V] [TRT] sections.2.5.conv2.module.weight + QuantizeLinear_705_quantize_scale_node + Conv_709 + Relu_711 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r3s3 Tactic: -6371781333659293809
[03/25/2022-13:24:47] [V] [TRT] Tactic: -6371781333659293809 Time: 0.178944
[03/25/2022-13:24:47] [V] [TRT] sections.2.5.conv2.module.weight + QuantizeLinear_705_quantize_scale_node + Conv_709 + Relu_711 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x96x64_stage4_warpsize4x2x1_g1_tensor16x8x32 Tactic: -6256128573036943404
[03/25/2022-13:24:47] [V] [TRT] Tactic: -6256128573036943404 Time: 0.1408
[03/25/2022-13:24:47] [V] [TRT] sections.2.5.conv2.module.weight + QuantizeLinear_705_quantize_scale_node + Conv_709 + Relu_711 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x128_ldg16_relu_small_nt_v1 Tactic: -5980889159865208399
[03/25/2022-13:24:47] [V] [TRT] Tactic: -5980889159865208399 Time: 0.172032
[03/25/2022-13:24:47] [V] [TRT] sections.2.5.conv2.module.weight + QuantizeLinear_705_quantize_scale_node + Conv_709 + Relu_711 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_singleBuffer_medium_nt_v1 Tactic: -5766140806760372989
[03/25/2022-13:24:47] [V] [TRT] Tactic: -5766140806760372989 Time: 0.179968
[03/25/2022-13:24:47] [V] [TRT] sections.2.5.conv2.module.weight + QuantizeLinear_705_quantize_scale_node + Conv_709 + Relu_711 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32_t1r3s3_epifadd Tactic: -5709079507616090666
[03/25/2022-13:24:47] [V] [TRT] Tactic: -5709079507616090666 Time: 0.09216
[03/25/2022-13:24:47] [V] [TRT] sections.2.5.conv2.module.weight + QuantizeLinear_705_quantize_scale_node + Conv_709 + Relu_711 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage4_warpsize2x4x1_g1_tensor16x8x32_t1r3s3_epifadd Tactic: -5698636014239116282
[03/25/2022-13:24:47] [V] [TRT] Tactic: -5698636014239116282 Time: 0.092672
[03/25/2022-13:24:47] [V] [TRT] sections.2.5.conv2.module.weight + QuantizeLinear_705_quantize_scale_node + Conv_709 + Relu_711 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage1_warpsize4x1x1_g1_tensor8x8x16 Tactic: -5180570335464125033
[03/25/2022-13:24:47] [V] [TRT] Tactic: -5180570335464125033 Time: 0.177408
[03/25/2022-13:24:47] [V] [TRT] sections.2.5.conv2.module.weight + QuantizeLinear_705_quantize_scale_node + Conv_709 + Relu_711 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r3s3 Tactic: -4933563390723451692
[03/25/2022-13:24:47] [V] [TRT] Tactic: -4933563390723451692 Time: 0.128256
[03/25/2022-13:24:47] [V] [TRT] sections.2.5.conv2.module.weight + QuantizeLinear_705_quantize_scale_node + Conv_709 + Relu_711 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_medium_nt_v1 Tactic: -4516822589357530549
[03/25/2022-13:24:47] [V] [TRT] Tactic: -4516822589357530549 Time: 0.18112
[03/25/2022-13:24:47] [V] [TRT] sections.2.5.conv2.module.weight + QuantizeLinear_705_quantize_scale_node + Conv_709 + Relu_711 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x16x128_stage3_warpsize1x1x1_g1_sptensor16x8x64 Tactic: -4232916483289779353
[03/25/2022-13:24:47] [V] [TRT] Tactic: -4232916483289779353 Time: 0.201216
[03/25/2022-13:24:47] [V] [TRT] sections.2.5.conv2.module.weight + QuantizeLinear_705_quantize_scale_node + Conv_709 + Relu_711 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x128_stage3_warpsize1x4x1_g1_sptensor16x8x64 Tactic: -3460842194336717186
[03/25/2022-13:24:48] [V] [TRT] Tactic: -3460842194336717186 Time: 0.099328
[03/25/2022-13:24:48] [V] [TRT] sections.2.5.conv2.module.weight + QuantizeLinear_705_quantize_scale_node + Conv_709 + Relu_711 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage4_warpsize4x2x1_g1_tensor16x8x32_t1r3s3_epifadd Tactic: -3413217501222406256
[03/25/2022-13:24:48] [V] [TRT] Tactic: -3413217501222406256 Time: 0.090624
[03/25/2022-13:24:48] [V] [TRT] sections.2.5.conv2.module.weight + QuantizeLinear_705_quantize_scale_node + Conv_709 + Relu_711 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32_t1r3s3 Tactic: -3280888557222886418
[03/25/2022-13:24:48] [V] [TRT] Tactic: -3280888557222886418 Time: 0.112512
[03/25/2022-13:24:48] [V] [TRT] sections.2.5.conv2.module.weight + QuantizeLinear_705_quantize_scale_node + Conv_709 + Relu_711 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r3s3 Tactic: -3238475748440751107
[03/25/2022-13:24:48] [V] [TRT] Tactic: -3238475748440751107 Time: 0.10368
[03/25/2022-13:24:48] [V] [TRT] sections.2.5.conv2.module.weight + QuantizeLinear_705_quantize_scale_node + Conv_709 + Relu_711 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage4_warpsize4x1x1_g1_tensor16x8x32_t1r3s3 Tactic: -3182884991006484042
[03/25/2022-13:24:48] [V] [TRT] Tactic: -3182884991006484042 Time: 0.093696
[03/25/2022-13:24:48] [V] [TRT] sections.2.5.conv2.module.weight + QuantizeLinear_705_quantize_scale_node + Conv_709 + Relu_711 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r3s3 Tactic: -3173468756112541306
[03/25/2022-13:24:48] [V] [TRT] Tactic: -3173468756112541306 Time: 0.170496
[03/25/2022-13:24:48] [V] [TRT] sections.2.5.conv2.module.weight + QuantizeLinear_705_quantize_scale_node + Conv_709 + Relu_711 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x128_ldg16_relu_large_nt_v1 Tactic: -2917455979290586480
[03/25/2022-13:24:48] [V] [TRT] Tactic: -2917455979290586480 Time: 0.173312
[03/25/2022-13:24:48] [V] [TRT] sections.2.5.conv2.module.weight + QuantizeLinear_705_quantize_scale_node + Conv_709 + Relu_711 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x128x64_stage3_warpsize4x2x1_g1_tensor16x8x32_t1r3s3 Tactic: -2741641298163591508
[03/25/2022-13:24:48] [V] [TRT] Tactic: -2741641298163591508 Time: 0.105984
[03/25/2022-13:24:48] [V] [TRT] sections.2.5.conv2.module.weight + QuantizeLinear_705_quantize_scale_node + Conv_709 + Relu_711 Set Tactic Name: ampere_int8_i8816cudnn_int8_128x128_ldg16_relu_large_nt_v1 Tactic: -2571022005763160364
[03/25/2022-13:24:48] [V] [TRT] Tactic: -2571022005763160364 Time: 0.178944
[03/25/2022-13:24:48] [V] [TRT] sections.2.5.conv2.module.weight + QuantizeLinear_705_quantize_scale_node + Conv_709 + Relu_711 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage1_warpsize2x2x1_g1_tensor8x8x16 Tactic: -2499089240293650188
[03/25/2022-13:24:48] [V] [TRT] Tactic: -2499089240293650188 Time: 0.175616
[03/25/2022-13:24:48] [V] [TRT] sections.2.5.conv2.module.weight + QuantizeLinear_705_quantize_scale_node + Conv_709 + Relu_711 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16 Tactic: -2328318099174473157
[03/25/2022-13:24:48] [V] [TRT] Tactic: -2328318099174473157 Time: 0.183552
[03/25/2022-13:24:48] [V] [TRT] sections.2.5.conv2.module.weight + QuantizeLinear_705_quantize_scale_node + Conv_709 + Relu_711 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_epifadd Tactic: -2083778562631872334
[03/25/2022-13:24:48] [V] [TRT] Tactic: -2083778562631872334 Time: 0.109568
[03/25/2022-13:24:48] [V] [TRT] sections.2.5.conv2.module.weight + QuantizeLinear_705_quantize_scale_node + Conv_709 + Relu_711 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x4x1_g1_tensor16x8x32 Tactic: -2054375205435666404
[03/25/2022-13:24:48] [V] [TRT] Tactic: -2054375205435666404 Time: 0.127232
[03/25/2022-13:24:48] [V] [TRT] sections.2.5.conv2.module.weight + QuantizeLinear_705_quantize_scale_node + Conv_709 + Relu_711 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32_t1r3s3 Tactic: -1546787387293556842
[03/25/2022-13:24:48] [V] [TRT] Tactic: -1546787387293556842 Time: 0.093056
[03/25/2022-13:24:48] [V] [TRT] sections.2.5.conv2.module.weight + QuantizeLinear_705_quantize_scale_node + Conv_709 + Relu_711 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x32x64_stage4_warpsize4x1x1_g1_tensor16x8x32 Tactic: -1498626619443284096
[03/25/2022-13:24:48] [V] [TRT] Tactic: -1498626619443284096 Time: 0.133888
[03/25/2022-13:24:48] [V] [TRT] sections.2.5.conv2.module.weight + QuantizeLinear_705_quantize_scale_node + Conv_709 + Relu_711 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r3s3 Tactic: -1471245223605064669
[03/25/2022-13:24:48] [V] [TRT] Tactic: -1471245223605064669 Time: 0.15488
[03/25/2022-13:24:48] [V] [TRT] sections.2.5.conv2.module.weight + QuantizeLinear_705_quantize_scale_node + Conv_709 + Relu_711 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32 Tactic: -1283580231568512025
[03/25/2022-13:24:48] [V] [TRT] Tactic: -1283580231568512025 Time: 0.205824
[03/25/2022-13:24:48] [V] [TRT] sections.2.5.conv2.module.weight + QuantizeLinear_705_quantize_scale_node + Conv_709 + Relu_711 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x64x64_stage3_warpsize4x1x1_g1_tensor16x8x32_t1r3s3 Tactic: -1224421172675151280
[03/25/2022-13:24:48] [V] [TRT] Tactic: -1224421172675151280 Time: 0.091136
[03/25/2022-13:24:48] [V] [TRT] sections.2.5.conv2.module.weight + QuantizeLinear_705_quantize_scale_node + Conv_709 + Relu_711 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_epifadd Tactic: -1173968681844185579
[03/25/2022-13:24:48] [V] [TRT] Tactic: -1173968681844185579 Time: 0.20672
[03/25/2022-13:24:48] [V] [TRT] sections.2.5.conv2.module.weight + QuantizeLinear_705_quantize_scale_node + Conv_709 + Relu_711 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x128x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r3s3 Tactic: -921247911551089037
[03/25/2022-13:24:48] [V] [TRT] Tactic: -921247911551089037 Time: 0.092416
[03/25/2022-13:24:48] [V] [TRT] sections.2.5.conv2.module.weight + QuantizeLinear_705_quantize_scale_node + Conv_709 + Relu_711 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32 Tactic: -762222380308749469
[03/25/2022-13:24:48] [V] [TRT] Tactic: -762222380308749469 Time: 0.129408
[03/25/2022-13:24:48] [V] [TRT] sections.2.5.conv2.module.weight + QuantizeLinear_705_quantize_scale_node + Conv_709 + Relu_711 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_epifadd Tactic: -556794153877490941
[03/25/2022-13:24:48] [V] [TRT] Tactic: -556794153877490941 Time: 0.130688
[03/25/2022-13:24:48] [V] [TRT] sections.2.5.conv2.module.weight + QuantizeLinear_705_quantize_scale_node + Conv_709 + Relu_711 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage4_warpsize4x2x1_g1_tensor16x8x32_t1r3s3 Tactic: -516725800067794372
[03/25/2022-13:24:48] [V] [TRT] Tactic: -516725800067794372 Time: 0.094208
[03/25/2022-13:24:48] [V] [TRT] sections.2.5.conv2.module.weight + QuantizeLinear_705_quantize_scale_node + Conv_709 + Relu_711 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_singleBuffer_large_nt_v1 Tactic: -428104331444385564
[03/25/2022-13:24:48] [V] [TRT] Tactic: -428104331444385564 Time: 0.18304
[03/25/2022-13:24:48] [V] [TRT] sections.2.5.conv2.module.weight + QuantizeLinear_705_quantize_scale_node + Conv_709 + Relu_711 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage1_warpsize2x1x1_g1_tensor8x8x16 Tactic: -366411318217594794
[03/25/2022-13:24:48] [V] [TRT] Tactic: -366411318217594794 Time: 0.205824
[03/25/2022-13:24:48] [V] [TRT] sections.2.5.conv2.module.weight + QuantizeLinear_705_quantize_scale_node + Conv_709 + Relu_711 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage1_warpsize2x4x1_g1_tensor8x8x16 Tactic: -351548418071036983
[03/25/2022-13:24:48] [V] [TRT] Tactic: -351548418071036983 Time: 0.169984
[03/25/2022-13:24:48] [V] [TRT] Fastest Tactic: 4414594337986714263 Time: 0.077824
[03/25/2022-13:24:48] [V] [TRT] >>>>>>>>>>>>>>> Chose Runner Type: CaskConvolution Tactic: 4414594337986714263
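
This pattern repeats for every layer and format combination: the builder times each eligible tactic, reports the fastest, and commits to that runner/tactic pair. Here the winning tactic for Conv_709 + Relu_711 (4414594337986714263, Time 0.077824, plausibly milliseconds) was measured earlier in this sweep, since candidates are enumerated in tactic-ID order. Summaries of these decisions can be scraped straight out of a verbose log; the script below is a quick sketch against the line format visible here, not an official tool.

    import re
    import sys

    # Pair every "Fastest Tactic" line with the "Chose Runner Type" line that
    # follows it, yielding one (runner, tactic, time) entry per autotuned
    # format combination.
    FASTEST = re.compile(r"Fastest Tactic: (-?\d+) Time: ([\d.]+)")
    CHOSEN = re.compile(r"Chose Runner Type: (\w+) Tactic: (-?\d+)")

    def summarize(path):
        picks, last = [], None
        with open(path) as log:
            for line in log:
                if (m := FASTEST.search(line)):
                    last = (m.group(1), float(m.group(2)))
                elif (m := CHOSEN.search(line)) and last:
                    picks.append((m.group(1),) + last)
                    last = None
        return picks

    for runner, tactic, t in summarize(sys.argv[1]):
        print(f"{runner:<20} tactic {tactic:>22}  {t:.6f}")
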
[03/25/2022-13:24:48] [V] [TRT] =============== Computing costs for
[03/25/2022-13:24:48] [V] [TRT] *************** Autotuning format combination: Int8(12544,196:4,14,1), Int8(50176,196:4,14,1) -> Int8(50176,196:4,14,1) ***************
[03/25/2022-13:24:48] [V] [TRT] *************** Autotuning format combination: Int8(12544,196:4,14,1), Int8(6272,196:32,14,1) -> Int8(6272,196:32,14,1) ***************
[03/25/2022-13:24:48] [V] [TRT] *************** Autotuning format combination: Int8(1568,196:32,14,1), Int8(6272,196:32,14,1) -> Int8(6272,196:32,14,1) ***************
[03/25/2022-13:24:48] [V] [TRT] =============== Computing costs for
[03/25/2022-13:24:48] [V] [TRT] *************** Autotuning format combination: Int8(50176,196:4,14,1) -> Int8(25088,196:4,14,1) ***************
[03/25/2022-13:24:48] [V] [TRT] --------------- Timing Runner: sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 (CudaDepthwiseConvolution)
[03/25/2022-13:24:48] [V] [TRT] CudaDepthwiseConvolution has no valid tactics for this config, skipping
[03/25/2022-13:24:48] [V] [TRT] --------------- Timing Runner: sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 (FusedConvActConvolution)
[03/25/2022-13:24:48] [V] [TRT] FusedConvActConvolution has no valid tactics for this config, skipping
[03/25/2022-13:24:48] [V] [TRT] --------------- Timing Runner: sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 (CaskConvolution)
[03/25/2022-13:24:48] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_medium_nn_v1 Tactic: 175853789719975416
[03/25/2022-13:24:48] [V] [TRT] Tactic: 175853789719975416 Time: 0.537472
[03/25/2022-13:24:48] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_xregs_medium_nn_v1 Tactic: 2171150287007712632
[03/25/2022-13:24:48] [V] [TRT] Tactic: 2171150287007712632 Time: 0.539136
[03/25/2022-13:24:48] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x64_relu_small_nn_v1 Tactic: 2234457234705232274
[03/25/2022-13:24:48] [V] [TRT] Tactic: 2234457234705232274 Time: 0.493824
[03/25/2022-13:24:48] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x64_relu_medium_nn_v1 Tactic: 5834048089706882838
[03/25/2022-13:24:48] [V] [TRT] Tactic: 5834048089706882838 Time: 0.495872
[03/25/2022-13:24:48] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x128_relu_interior_nn_v1 Tactic: 6299962968199310600
[03/25/2022-13:24:48] [V] [TRT] Tactic: 6299962968199310600 Time: 0.507392
[03/25/2022-13:24:48] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_xregs_interior_nn_v1 Tactic: 6341572697076960911
[03/25/2022-13:24:48] [V] [TRT] Tactic: 6341572697076960911 Time: 0.496896
[03/25/2022-13:24:48] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_small_nn_v1 Tactic: -8626990807754934295
[03/25/2022-13:24:48] [V] [TRT] Tactic: -8626990807754934295 Time: 0.536448
[03/25/2022-13:24:48] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x64_relu_interior_nn_v1 Tactic: -8498217049614706532
[03/25/2022-13:24:48] [V] [TRT] Tactic: -8498217049614706532 Time: 0.47808
[03/25/2022-13:24:48] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_xregs_small_nn_v1 Tactic: -7303593854972602201
[03/25/2022-13:24:48] [V] [TRT] Tactic: -7303593854972602201 Time: 0.50624
[03/25/2022-13:24:48] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x128_relu_medium_nn_v1 Tactic: -6585664687867083638
[03/25/2022-13:24:48] [V] [TRT] Tactic: -6585664687867083638 Time: 0.508544
[03/25/2022-13:24:48] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_interior_nn_v1 Tactic: -3326139578711341011
[03/25/2022-13:24:48] [V] [TRT] Tactic: -3326139578711341011 Time: 0.502528
[03/25/2022-13:24:48] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x128_relu_small_nn_v1 Tactic: -683636008127039856
[03/25/2022-13:24:48] [V] [TRT] Tactic: -683636008127039856 Time: 0.509824
[03/25/2022-13:24:48] [V] [TRT] Fastest Tactic: -8498217049614706532 Time: 0.47808
[03/25/2022-13:24:48] [V] [TRT] >>>>>>>>>>>>>>> Chose Runner Type: CaskConvolution Tactic: -8498217049614706532
[03/25/2022-13:24:48] [V] [TRT] *************** Autotuning format combination: Int8(50176,196:4,14,1) -> Int8(3136,196:32,14,1) ***************
[03/25/2022-13:24:48] [V] [TRT] --------------- Timing Runner: sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 (CaskConvolution)
[03/25/2022-13:24:48] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_small_c32_nn_v1 Tactic: 1100922622480907544
[03/25/2022-13:24:48] [V] [TRT] Tactic: 1100922622480907544 Time: 0.535168
[03/25/2022-13:24:48] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x128_relu_interior_c32_nn_v1 Tactic: 2855900226702061782
[03/25/2022-13:24:48] [V] [TRT] Tactic: 2855900226702061782 Time: 0.50816
[03/25/2022-13:24:48] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x64_relu_medium_c32_nn_v1 Tactic: 3606311198834416176
[03/25/2022-13:24:48] [V] [TRT] Tactic: 3606311198834416176 Time: 0.495232
[03/25/2022-13:24:48] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x128_relu_small_c32_nn_v1 Tactic: 4325765560739862899
[03/25/2022-13:24:48] [V] [TRT] Tactic: 4325765560739862899 Time: 0.510976
[03/25/2022-13:24:48] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x64_relu_interior_c32_nn_v1 Tactic: 8803458114157674373
[03/25/2022-13:24:48] [V] [TRT] Tactic: 8803458114157674373 Time: 0.477696
[03/25/2022-13:24:48] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_interior_c32_nn_v1 Tactic: -6934773036503365000
[03/25/2022-13:24:48] [V] [TRT] Tactic: -6934773036503365000 Time: 0.5024
[03/25/2022-13:24:48] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_xregs_interior_c32_nn_v1 Tactic: -4431642509665791294
[03/25/2022-13:24:48] [V] [TRT] Tactic: -4431642509665791294 Time: 0.469888
[03/25/2022-13:24:48] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x128_relu_medium_c32_nn_v1 Tactic: -4255737803793506479
[03/25/2022-13:24:48] [V] [TRT] Tactic: -4255737803793506479 Time: 0.405248
[03/25/2022-13:24:48] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_xregs_small_c32_nn_v1 Tactic: -3958182351168863467
[03/25/2022-13:24:48] [V] [TRT] Tactic: -3958182351168863467 Time: 0.403712
[03/25/2022-13:24:48] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_xregs_medium_c32_nn_v1 Tactic: -3111968753064955248
[03/25/2022-13:24:48] [V] [TRT] Tactic: -3111968753064955248 Time: 0.42944
[03/25/2022-13:24:48] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_medium_c32_nn_v1 Tactic: -1492575840277333548
[03/25/2022-13:24:48] [V] [TRT] Tactic: -1492575840277333548 Time: 0.42944
[03/25/2022-13:24:48] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x64_relu_small_c32_nn_v1 Tactic: -868495160148524802
[03/25/2022-13:24:48] [V] [TRT] Tactic: -868495160148524802 Time: 0.39232
[03/25/2022-13:24:48] [V] [TRT] Fastest Tactic: -868495160148524802 Time: 0.39232
[03/25/2022-13:24:48] [V] [TRT] >>>>>>>>>>>>>>> Chose Runner Type: CaskConvolution Tactic: -868495160148524802
[03/25/2022-13:24:48] [V] [TRT] *************** Autotuning format combination: Int8(6272,196:32,14,1) -> Int8(3136,196:32,14,1) ***************
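
In these banners each tensor is printed as Int8(S0,S1,...) with one stride per dimension; a :4 or :32 suffix on the channel stride marks that dimension as vectorized, i.e. the channel-packed CHW4 and CHW32 interleaved INT8 layouts. Read that way, the numbers are self-consistent for these 14x14 activations, as the sketch below checks (this is our interpretation of the notation, with channel counts inferred from the ResNet-50 shapes):

    # Strides for an N x C x H x W int8 tensor whose channel dimension is
    # packed into vectors of v values (CHW4 / CHW32 style interleaved layouts).
    def strides(c, h, w, v):
        groups = (c + v - 1) // v        # number of channel vectors
        return (groups * h * w, h * w, w, 1)

    print(strides(1024, 14, 14, 4))      # (50176, 196, 14, 1) ~ Int8(50176,196:4,14,1)
    print(strides(1024, 14, 14, 32))     # (6272, 196, 14, 1)  ~ Int8(6272,196:32,14,1)
    print(strides(512, 14, 14, 32))      # (3136, 196, 14, 1)  ~ Int8(3136,196:32,14,1)

Each input/output layout pairing is priced separately because a kernel that is cheap in isolation can be outweighed by the reformatting it would force on neighboring layers.
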
[03/25/2022-13:24:48] [V] [TRT] --------------- Timing Runner: sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 (CudaGroupConvolution)
[03/25/2022-13:24:48] [V] [TRT] CudaGroupConvolution has no valid tactics for this config, skipping
[03/25/2022-13:24:48] [V] [TRT] --------------- Timing Runner: sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 (CudaDepthwiseConvolution)
[03/25/2022-13:24:48] [V] [TRT] CudaDepthwiseConvolution has no valid tactics for this config, skipping
[03/25/2022-13:24:48] [V] [TRT] --------------- Timing Runner: sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 (FusedConvActConvolution)
[03/25/2022-13:24:48] [V] [TRT] FusedConvActConvolution has no valid tactics for this config, skipping
[03/25/2022-13:24:48] [V] [TRT] --------------- Timing Runner: sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 (CaskConvolution)
[03/25/2022-13:24:48] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage1_warpsize4x1x1_g1_tensor8x8x16_t1r1s1 Tactic: 68468667201176803
[03/25/2022-13:24:48] [V] [TRT] Tactic: 68468667201176803 Time: 0.131968
[03/25/2022-13:24:48] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x192x64_stage3_warpsize4x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 125145153013230687
[03/25/2022-13:24:48] [V] [TRT] Tactic: 125145153013230687 Time: 0.09344
[03/25/2022-13:24:48] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x128_ldg16_relu_interior_nt_v1 Tactic: 434957160407688216
[03/25/2022-13:24:48] [V] [TRT] Tactic: 434957160407688216 Time: 0.140288
[03/25/2022-13:24:48] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage4_warpsize4x2x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: 805889586762897346
[03/25/2022-13:24:48] [V] [TRT] Tactic: 805889586762897346 Time: 0.078208
[03/25/2022-13:24:48] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x128_stage3_warpsize4x2x1_g1_sptensor16x8x64_t1r1s1 Tactic: 857001784974286465
[03/25/2022-13:24:48] [V] [TRT] Tactic: 857001784974286465 Time: 0.06272
[03/25/2022-13:24:48] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 1214130898909872671
[03/25/2022-13:24:48] [V] [TRT] Tactic: 1214130898909872671 Time: 0.153984
[03/25/2022-13:24:48] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: sm80_xmma_conv_fprop_smallk_i8i8_i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_warptilesize16x32_stage5_warpsize1x4x2_r1s1_u1v1_hw0_c1024_scalebias_relu Tactic: 1263683011321748626
[03/25/2022-13:24:48] [V] [TRT] Tactic: 1263683011321748626 Time: 0.059904
[03/25/2022-13:24:48] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage4_warpsize4x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 1278425129871930205
[03/25/2022-13:24:48] [V] [TRT] Tactic: 1278425129871930205 Time: 0.076032
[03/25/2022-13:24:48] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 1583811548148740665
[03/25/2022-13:24:48] [V] [TRT] Tactic: 1583811548148740665 Time: 0.093824
[03/25/2022-13:24:48] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 1701344857577810806
[03/25/2022-13:24:48] [V] [TRT] Tactic: 1701344857577810806 Time: 0.09984
[03/25/2022-13:24:48] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x32x64_stage4_warpsize2x1x1_g1_tensor16x8x32_t1r1s1 Tactic: 1797231177354918208
[03/25/2022-13:24:48] [V] [TRT] Tactic: 1797231177354918208 Time: 0.142848
[03/25/2022-13:24:48] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x192x64_stage3_warpsize4x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 2004812516525036381
[03/25/2022-13:24:48] [V] [TRT] Tactic: 2004812516525036381 Time: 0.084224
[03/25/2022-13:24:48] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x128x64_stage3_warpsize4x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 2030033463723799063
[03/25/2022-13:24:48] [V] [TRT] Tactic: 2030033463723799063 Time: 0.072832
[03/25/2022-13:24:48] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: ampere_int8_i8816cudnn_int8_128x128_ldg16_relu_interior_nt_v1 Tactic: 2346437292116182513
[03/25/2022-13:24:48] [V] [TRT] Tactic: 2346437292116182513 Time: 0.1344
[03/25/2022-13:24:48] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 2376898825218218566
[03/25/2022-13:24:48] [V] [TRT] Tactic: 2376898825218218566 Time: 0.073088
[03/25/2022-13:24:48] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_singleBuffer_interior_nt_v1 Tactic: 2522133112320625287
[03/25/2022-13:24:48] [V] [TRT] Tactic: 2522133112320625287 Time: 0.130688
[03/25/2022-13:24:48] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage4_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 2548171972648455240
[03/25/2022-13:24:48] [V] [TRT] Tactic: 2548171972648455240 Time: 0.077952
[03/25/2022-13:24:48] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage1_warpsize2x1x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: 2548946449357458230
[03/25/2022-13:24:48] [V] [TRT] Tactic: 2548946449357458230 Time: 0.159872
[03/25/2022-13:24:48] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x128_stage3_warpsize1x1x1_g1_sptensor16x8x64_simple_t1r1s1_no_preds Tactic: 2570666021825229009
[03/25/2022-13:24:48] [V] [TRT] Tactic: 2570666021825229009 Time: 0.120192
[03/25/2022-13:24:48] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x16x128_stage3_warpsize1x1x1_g1_sptensor16x8x64_simple_t1r1s1 Tactic: 2678520742286844763
[03/25/2022-13:24:48] [V] [TRT] Tactic: 2678520742286844763 Time: 0.165504
[03/25/2022-13:24:48] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 2756291002030759362
[03/25/2022-13:24:48] [V] [TRT] Tactic: 2756291002030759362 Time: 0.094592
[03/25/2022-13:24:48] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x128x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 2972948223367788520
[03/25/2022-13:24:48] [V] [TRT] Tactic: 2972948223367788520 Time: 0.07488
[03/25/2022-13:24:48] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: ampere_int8_i8816cudnn_int8_128x128_ldg16_relu_medium_nt_v1 Tactic: 2985940154541537814
[03/25/2022-13:24:48] [V] [TRT] Tactic: 2985940154541537814 Time: 0.134272
[03/25/2022-13:24:48] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x96x64_stage5_warpsize2x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 3043273137345374664
[03/25/2022-13:24:48] [V] [TRT] Tactic: 3043273137345374664 Time: 0.152448
[03/25/2022-13:24:48] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 3221677093659484230
[03/25/2022-13:24:48] [V] [TRT] Tactic: 3221677093659484230 Time: 0.12352
[03/25/2022-13:24:48] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r1s1 Tactic: 3242897809704328258
[03/25/2022-13:24:48] [V] [TRT] Tactic: 3242897809704328258 Time: 0.131456
[03/25/2022-13:24:48] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: 3312456766204252694
[03/25/2022-13:24:48] [V] [TRT] Tactic: 3312456766204252694 Time: 0.177536
[03/25/2022-13:24:48] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 3538565962642681625
[03/25/2022-13:24:48] [V] [TRT] Tactic: 3538565962642681625 Time: 0.11072
[03/25/2022-13:24:48] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage1_warpsize2x2x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: 3541919052468401776
[03/25/2022-13:24:48] [V] [TRT] Tactic: 3541919052468401776 Time: 0.1248
[03/25/2022-13:24:48] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 3593397928177382100
[03/25/2022-13:24:48] [V] [TRT] Tactic: 3593397928177382100 Time: 0.155264
[03/25/2022-13:24:48] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x64x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 3670282018109435863
[03/25/2022-13:24:48] [V] [TRT] Tactic: 3670282018109435863 Time: 0.090624
[03/25/2022-13:24:48] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 3671413346254027573
[03/25/2022-13:24:48] [V] [TRT] Tactic: 3671413346254027573 Time: 0.09856
[03/25/2022-13:24:48] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x128_ldg16_relu_medium_nt_v1 Tactic: 3899284354987683408
[03/25/2022-13:24:48] [V] [TRT] Tactic: 3899284354987683408 Time: 0.139008
[03/25/2022-13:24:48] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: 3927509214678622419
[03/25/2022-13:24:48] [V] [TRT] Tactic: 3927509214678622419 Time: 0.129408
[03/25/2022-13:24:48] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 4112572034735311841
[03/25/2022-13:24:48] [V] [TRT] Tactic: 4112572034735311841 Time: 0.203392
[03/25/2022-13:24:48] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 4239974928951431644
[03/25/2022-13:24:48] [V] [TRT] Tactic: 4239974928951431644 Time: 0.108416
[03/25/2022-13:24:48] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage6_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 4610760414797216079
[03/25/2022-13:24:48] [V] [TRT] Tactic: 4610760414797216079 Time: 0.092032
[03/25/2022-13:24:48] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_small_nt_v1 Tactic: 4717285412741024953
[03/25/2022-13:24:49] [V] [TRT] Tactic: 4717285412741024953 Time: 0.133888
[03/25/2022-13:24:49] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 4796956614760326119
[03/25/2022-13:24:49] [V] [TRT] Tactic: 4796956614760326119 Time: 0.087168
[03/25/2022-13:24:49] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x128_stage3_warpsize2x2x1_g1_sptensor16x8x64_t1r1s1_no_preds Tactic: 4909502217677847353
[03/25/2022-13:24:49] [V] [TRT] Tactic: 4909502217677847353 Time: 0.061312
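
The sptensor16x8x64 kernels interleaved in this sweep are Ampere 2:4 structured-sparsity tensor-core variants; they compete here only because the engine is being built with --sparsity=enable and this layer's weights satisfy the 2:4 pattern (the model was evidently pruned before export, per its filename). The sparse tilesize128x128x128 tactic just posted 0.061312, right in the mix with the best dense candidates (0.059904 for the smallk kernel, roughly 0.073 for the leading dense tiles). For reference, a minimal sketch of the equivalent build through the TensorRT Python API (8.2-era calls; calibration, error checks, and input shapes omitted):

    import tensorrt as trt

    logger = trt.Logger(trt.Logger.VERBOSE)
    builder = trt.Builder(logger)
    network = builder.create_network(
        1 << int(trt.NetworkDefinitionCreationFlag.EXPLICIT_BATCH))
    parser = trt.OnnxParser(network, logger)
    parser.parse_from_file("resnet50_quant_sparse.onnx")   # same model as this run

    config = builder.create_builder_config()
    config.set_flag(trt.BuilderFlag.INT8)             # Q/DQ-quantized network
    config.set_flag(trt.BuilderFlag.SPARSE_WEIGHTS)   # admit 2:4 sparse tactics
    engine_bytes = builder.build_serialized_network(network, config)
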
[03/25/2022-13:24:49] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 4919361344804309192
[03/25/2022-13:24:49] [V] [TRT] Tactic: 4919361344804309192 Time: 0.161664
[03/25/2022-13:24:49] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x128_stage3_warpsize1x2x1_g1_sptensor16x8x64_simple_t1r1s1 Tactic: 5043674678294309681
[03/25/2022-13:24:49] [V] [TRT] Tactic: 5043674678294309681 Time: 0.093056
[03/25/2022-13:24:49] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_interior_nt_v1 Tactic: 5126565865931538390
[03/25/2022-13:24:49] [V] [TRT] Tactic: 5126565865931538390 Time: 0.135424
[03/25/2022-13:24:49] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x64x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 5204702486885981735
[03/25/2022-13:24:49] [V] [TRT] Tactic: 5204702486885981735 Time: 0.092928
[03/25/2022-13:24:49] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: 5375256703210220108
[03/25/2022-13:24:49] [V] [TRT] Tactic: 5375256703210220108 Time: 0.090752
[03/25/2022-13:24:49] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x128_stage3_warpsize2x2x1_g1_sptensor16x8x64_simple_t1r1s1 Tactic: 5424258848951129084
[03/25/2022-13:24:49] [V] [TRT] Tactic: 5424258848951129084 Time: 0.061568
[03/25/2022-13:24:49] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 5443897483205284103
[03/25/2022-13:24:49] [V] [TRT] Tactic: 5443897483205284103 Time: 0.098816
[03/25/2022-13:24:49] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage4_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 5707566217891294846
[03/25/2022-13:24:49] [V] [TRT] Tactic: 5707566217891294846 Time: 0.080896
[03/25/2022-13:24:49] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x4x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 5986622376339202983
[03/25/2022-13:24:49] [V] [TRT] Tactic: 5986622376339202983 Time: 0.110592
[03/25/2022-13:24:49] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x128x64_stage3_warpsize4x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 6007888770437705057
[03/25/2022-13:24:49] [V] [TRT] Tactic: 6007888770437705057 Time: 0.078592
[03/25/2022-13:24:49] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32_t1r1s1 Tactic: 6405251167055673379
[03/25/2022-13:24:49] [V] [TRT] Tactic: 6405251167055673379 Time: 0.089088
[03/25/2022-13:24:49] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage4_warpsize4x1x1_g1_tensor16x8x32_t1r1s1 Tactic: 6433368103202497147
[03/25/2022-13:24:49] [V] [TRT] Tactic: 6433368103202497147 Time: 0.083456
[03/25/2022-13:24:49] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 6441948709525127755
[03/25/2022-13:24:49] [V] [TRT] Tactic: 6441948709525127755 Time: 0.156288
[03/25/2022-13:24:49] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x256x64_stage3_warpsize2x4x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 6443933097134654777
[03/25/2022-13:24:49] [V] [TRT] Tactic: 6443933097134654777 Time: 0.083072
[03/25/2022-13:24:49] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 6457435868048963632
[03/25/2022-13:24:49] [V] [TRT] Tactic: 6457435868048963632 Time: 0.093184
[03/25/2022-13:24:49] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x128_stage3_warpsize1x1x1_g1_sptensor16x8x64_t1r1s1 Tactic: 6510345569544721081
[03/25/2022-13:24:49] [V] [TRT] Tactic: 6510345569544721081 Time: 0.120704
[03/25/2022-13:24:49] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 6793988781414507278
[03/25/2022-13:24:49] [V] [TRT] Tactic: 6793988781414507278 Time: 0.07296
[03/25/2022-13:24:49] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 6880710371738875469
[03/25/2022-13:24:49] [V] [TRT] Tactic: 6880710371738875469 Time: 0.08832
[03/25/2022-13:24:49] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 6925201228918187099
[03/25/2022-13:24:49] [V] [TRT] Tactic: 6925201228918187099 Time: 0.075008
[03/25/2022-13:24:49] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 6991524515605108718
[03/25/2022-13:24:49] [V] [TRT] Tactic: 6991524515605108718 Time: 0.09408
[03/25/2022-13:24:49] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x4x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 7245509442265271220
[03/25/2022-13:24:49] [V] [TRT] Tactic: 7245509442265271220 Time: 0.109184
[03/25/2022-13:24:49] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x64x64_stage3_warpsize4x1x1_g1_tensor16x8x32_t1r1s1 Tactic: 7318929579222925725
[03/25/2022-13:24:49] [V] [TRT] Tactic: 7318929579222925725 Time: 0.083072
[03/25/2022-13:24:49] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: 7731430299029542276
[03/25/2022-13:24:49] [V] [TRT] Tactic: 7731430299029542276 Time: 0.075008
[03/25/2022-13:24:49] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage4_warpsize2x4x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: 7738495016763012180
[03/25/2022-13:24:49] [V] [TRT] Tactic: 7738495016763012180 Time: 0.078464
[03/25/2022-13:24:49] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x128_stage3_warpsize1x4x1_g1_sptensor16x8x64_t1r1s1_no_preds Tactic: 7886967395128926382
[03/25/2022-13:24:49] [V] [TRT] Tactic: 7886967395128926382 Time: 0.084864
[03/25/2022-13:24:49] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage4_warpsize4x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 8142283985160822229
[03/25/2022-13:24:49] [V] [TRT] Tactic: 8142283985160822229 Time: 0.081152
[03/25/2022-13:24:49] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage4_warpsize2x4x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 8173975624668590862
[03/25/2022-13:24:49] [V] [TRT] Tactic: 8173975624668590862 Time: 0.081792
[03/25/2022-13:24:49] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage4_warpsize4x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 8234775147403903473
[03/25/2022-13:24:49] [V] [TRT] Tactic: 8234775147403903473 Time: 0.0832
[03/25/2022-13:24:49] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x128x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 8524082966802584889
[03/25/2022-13:24:49] [V] [TRT] Tactic: 8524082966802584889 Time: 0.07616
[03/25/2022-13:24:49] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage1_warpsize4x1x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: 8684013308930763400
[03/25/2022-13:24:49] [V] [TRT] Tactic: 8684013308930763400 Time: 0.130176
[03/25/2022-13:24:49] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x128x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 8765382722978397630
[03/25/2022-13:24:49] [V] [TRT] Tactic: 8765382722978397630 Time: 0.076544
[03/25/2022-13:24:49] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 8843193587782643431
[03/25/2022-13:24:49] [V] [TRT] Tactic: 8843193587782643431 Time: 0.112
[03/25/2022-13:24:49] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x128_stage3_warpsize1x4x1_g1_sptensor16x8x64_simple_t1r1s1 Tactic: 8883810517410230831
[03/25/2022-13:24:49] [V] [TRT] Tactic: 8883810517410230831 Time: 0.086784
[03/25/2022-13:24:49] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 8930797211803511337
[03/25/2022-13:24:49] [V] [TRT] Tactic: 8930797211803511337 Time: 0.093184
[03/25/2022-13:24:49] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 8935070489925739043
[03/25/2022-13:24:49] [V] [TRT] Tactic: 8935070489925739043 Time: 0.081152
[03/25/2022-13:24:49] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x16x128_stage3_warpsize1x1x1_g1_sptensor16x8x64_t1r1s1 Tactic: 9062173295331155069
[03/25/2022-13:24:49] [V] [TRT] Tactic: 9062173295331155069 Time: 0.165504
[03/25/2022-13:24:49] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 +
Relu_748 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_singleBuffer_small_nt_v1 Tactic: -9118785798277698619 [03/25/2022-13:24:49] [V] [TRT] Tactic: -9118785798277698619 Time: 0.132224 [03/25/2022-13:24:49] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -8985599729413291927 [03/25/2022-13:24:49] [V] [TRT] Tactic: -8985599729413291927 Time: 0.088576 [03/25/2022-13:24:49] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x32x64_stage4_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -8972697510150675429 [03/25/2022-13:24:49] [V] [TRT] Tactic: -8972697510150675429 Time: 0.119808 [03/25/2022-13:24:49] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r1s1 Tactic: -8943710627305202139 [03/25/2022-13:24:49] [V] [TRT] Tactic: -8943710627305202139 Time: 0.130176 [03/25/2022-13:24:49] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: -8859846367886814331 [03/25/2022-13:24:49] [V] [TRT] Tactic: -8859846367886814331 Time: 0.145664 [03/25/2022-13:24:49] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: -8638624340850784688 [03/25/2022-13:24:49] [V] [TRT] Tactic: -8638624340850784688 Time: 0.11648 [03/25/2022-13:24:49] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage4_warpsize4x1x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: -8556775352640313933 [03/25/2022-13:24:49] [V] [TRT] Tactic: -8556775352640313933 Time: 0.080128 [03/25/2022-13:24:49] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage1_warpsize2x1x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: -8382298409581540699 [03/25/2022-13:24:49] [V] [TRT] Tactic: -8382298409581540699 Time: 0.205312 [03/25/2022-13:24:49] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage1_warpsize4x2x1_g1_tensor8x8x16_t1r1s1 Tactic: -8172318747337038866 [03/25/2022-13:24:49] [V] [TRT] Tactic: -8172318747337038866 Time: 
0.133376 [03/25/2022-13:24:49] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x256x64_stage3_warpsize2x4x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -8038164441468184723 [03/25/2022-13:24:49] [V] [TRT] Tactic: -8038164441468184723 Time: 0.07744 [03/25/2022-13:24:49] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x96x64_stage5_warpsize2x1x1_g1_tensor16x8x32_t1r1s1 Tactic: -7844028314176826857 [03/25/2022-13:24:49] [V] [TRT] Tactic: -7844028314176826857 Time: 0.153984 [03/25/2022-13:24:49] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x128_stage3_warpsize2x2x1_g1_sptensor16x8x64_simple_t1r1s1_no_preds Tactic: -7674507941016740570 [03/25/2022-13:24:49] [V] [TRT] Tactic: -7674507941016740570 Time: 0.06144 [03/25/2022-13:24:49] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage4_warpsize2x4x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -7364286662638617917 [03/25/2022-13:24:49] [V] [TRT] Tactic: -7364286662638617917 Time: 0.076032 [03/25/2022-13:24:49] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage1_warpsize2x1x1_g1_tensor8x8x16_t1r1s1 Tactic: -7361755530333096258 [03/25/2022-13:24:49] [V] [TRT] Tactic: -7361755530333096258 Time: 0.161536 [03/25/2022-13:24:49] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x128_stage3_warpsize1x4x1_g1_sptensor16x8x64_t1r1s1 Tactic: -7289760022626653388 [03/25/2022-13:24:49] [V] [TRT] Tactic: -7289760022626653388 Time: 0.086656 [03/25/2022-13:24:49] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage1_warpsize2x4x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: -7106539943789766885 [03/25/2022-13:24:49] [V] [TRT] Tactic: -7106539943789766885 Time: 0.131328 [03/25/2022-13:24:49] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage1_warpsize4x2x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: -6969478418607271266 [03/25/2022-13:24:49] [V] [TRT] Tactic: -6969478418607271266 Time: 0.131712 [03/25/2022-13:24:49] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: 
sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -6930438165437733000 [03/25/2022-13:24:49] [V] [TRT] Tactic: -6930438165437733000 Time: 0.204032 [03/25/2022-13:24:49] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x64x64_stage3_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -6879607992933502380 [03/25/2022-13:24:49] [V] [TRT] Tactic: -6879607992933502380 Time: 0.08128 [03/25/2022-13:24:49] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x128_stage3_warpsize1x2x1_g1_sptensor16x8x64_simple_t1r1s1_no_preds Tactic: -6839669803644810934 [03/25/2022-13:24:49] [V] [TRT] Tactic: -6839669803644810934 Time: 0.09216 [03/25/2022-13:24:49] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x128_stage3_warpsize1x2x1_g1_sptensor16x8x64_t1r1s1_no_preds Tactic: -6812830108414456369 [03/25/2022-13:24:49] [V] [TRT] Tactic: -6812830108414456369 Time: 0.09216 [03/25/2022-13:24:49] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x128_stage3_warpsize4x2x1_g1_sptensor16x8x64_simple_t1r1s1_no_preds Tactic: -6779804930216439173 [03/25/2022-13:24:49] [V] [TRT] Tactic: -6779804930216439173 Time: 0.061696 [03/25/2022-13:24:49] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: -6527178416855951297 [03/25/2022-13:24:49] [V] [TRT] Tactic: -6527178416855951297 Time: 0.16448 [03/25/2022-13:24:49] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_t1r1s1 Tactic: -6510232214299595844 [03/25/2022-13:24:49] [V] [TRT] Tactic: -6510232214299595844 Time: 0.163584 [03/25/2022-13:24:49] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: ampere_int8_i8816cudnn_int8_128x128_ldg16_relu_small_nt_v1 Tactic: -6400348606759295499 [03/25/2022-13:24:49] [V] [TRT] Tactic: -6400348606759295499 Time: 0.12928 [03/25/2022-13:24:49] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage1_warpsize2x2x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: -6346247605026339453 [03/25/2022-13:24:49] [V] [TRT] Tactic: -6346247605026339453 Time: 0.12928 
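Every candidate follows the same two-line pattern: a "Set Tactic Name: ... Tactic: <id>" entry announces the kernel, the following "Tactic: <id> Time: <ms>" entry reports its measured time, and at the end of the block the builder picks the fastest. To compare candidates offline, those timings can be scraped from a saved log. A minimal sketch, assuming the verbose output above was redirected to a file named trtexec.log (hypothetical path); note it aggregates the best time seen per tactic id anywhere in the log, so it conflates layers if run over a full build:

import re

TIMING = re.compile(r"Tactic: (-?\d+) Time: ([0-9.]+)")

def fastest_tactics(path="trtexec.log"):
    best = {}  # tactic id -> best observed time (ms)
    with open(path) as f:
        for line in f:
            for tactic, ms in TIMING.findall(line):
                t = float(ms)
                if tactic not in best or t < best[tactic]:
                    best[tactic] = t
    return sorted(best.items(), key=lambda kv: kv[1])

if __name__ == "__main__":
    for tactic, ms in fastest_tactics()[:10]:
        print(f"{tactic:>22}  {ms:.6f} ms")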
[03/25/2022-13:24:49] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x96x64_stage4_warpsize4x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -6232597026469067819 [03/25/2022-13:24:49] [V] [TRT] Tactic: -6232597026469067819 Time: 0.116096 [03/25/2022-13:24:49] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x128_ldg16_relu_small_nt_v1 Tactic: -5980889159865208399 [03/25/2022-13:24:49] [V] [TRT] Tactic: -5980889159865208399 Time: 0.136448 [03/25/2022-13:24:49] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_singleBuffer_medium_nt_v1 Tactic: -5766140806760372989 [03/25/2022-13:24:49] [V] [TRT] Tactic: -5766140806760372989 Time: 0.13632 [03/25/2022-13:24:49] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r1s1 Tactic: -5697614955743334137 [03/25/2022-13:24:49] [V] [TRT] Tactic: -5697614955743334137 Time: 0.126336 [03/25/2022-13:24:49] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -5671123121710113970 [03/25/2022-13:24:49] [V] [TRT] Tactic: -5671123121710113970 Time: 0.10944 [03/25/2022-13:24:49] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x32x64_stage4_warpsize2x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -5615581362569252260 [03/25/2022-13:24:49] [V] [TRT] Tactic: -5615581362569252260 Time: 0.140288 [03/25/2022-13:24:49] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x128x64_stage3_warpsize4x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -5562968047117507056 [03/25/2022-13:24:49] [V] [TRT] Tactic: -5562968047117507056 Time: 0.07808 [03/25/2022-13:24:49] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -5516472881360101487 [03/25/2022-13:24:49] [V] [TRT] Tactic: -5516472881360101487 Time: 0.115712 [03/25/2022-13:24:49] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r1s1 Tactic: -5311474420963248369 [03/25/2022-13:24:49] [V] [TRT] Tactic: -5311474420963248369 
Time: 0.178688 [03/25/2022-13:24:49] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: -5170003087447722174 [03/25/2022-13:24:49] [V] [TRT] Tactic: -5170003087447722174 Time: 0.1568 [03/25/2022-13:24:49] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage6_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -4889586143772361690 [03/25/2022-13:24:49] [V] [TRT] Tactic: -4889586143772361690 Time: 0.090112 [03/25/2022-13:24:49] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x128_stage3_warpsize1x4x1_g1_sptensor16x8x64_simple_t1r1s1_no_preds Tactic: -4889498558023475527 [03/25/2022-13:24:49] [V] [TRT] Tactic: -4889498558023475527 Time: 0.085376 [03/25/2022-13:24:49] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: -4849712423393454704 [03/25/2022-13:24:49] [V] [TRT] Tactic: -4849712423393454704 Time: 0.089344 [03/25/2022-13:24:49] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x128_stage3_warpsize4x2x1_g1_sptensor16x8x64_simple_t1r1s1 Tactic: -4681913707320020520 [03/25/2022-13:24:49] [V] [TRT] Tactic: -4681913707320020520 Time: 0.061952 [03/25/2022-13:24:49] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_medium_nt_v1 Tactic: -4516822589357530549 [03/25/2022-13:24:49] [V] [TRT] Tactic: -4516822589357530549 Time: 0.139776 [03/25/2022-13:24:49] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -4455415102719506646 [03/25/2022-13:24:49] [V] [TRT] Tactic: -4455415102719506646 Time: 0.111488 [03/25/2022-13:24:49] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32_t1r1s1 Tactic: -4425346730823666456 [03/25/2022-13:24:49] [V] [TRT] Tactic: -4425346730823666456 Time: 0.11648 [03/25/2022-13:24:49] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: 
sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage1_warpsize2x1x1_g1_tensor8x8x16_t1r1s1 Tactic: -4260476497340370474 [03/25/2022-13:24:49] [V] [TRT] Tactic: -4260476497340370474 Time: 0.20608 [03/25/2022-13:24:49] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x2x1_g1_tensor16x8x32_t1r1s1 Tactic: -4182501876984672402 [03/25/2022-13:24:49] [V] [TRT] Tactic: -4182501876984672402 Time: 0.109952 [03/25/2022-13:24:49] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x256x64_stage3_warpsize2x4x1_g1_tensor16x8x32_t1r1s1 Tactic: -4151617293257698859 [03/25/2022-13:24:49] [V] [TRT] Tactic: -4151617293257698859 Time: 0.083584 [03/25/2022-13:24:49] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x128_stage3_warpsize4x2x1_g1_sptensor16x8x64_t1r1s1_no_preds Tactic: -3862908719298381451 [03/25/2022-13:24:49] [V] [TRT] Tactic: -3862908719298381451 Time: 0.062592 [03/25/2022-13:24:49] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -3825889760337461729 [03/25/2022-13:24:49] [V] [TRT] Tactic: -3825889760337461729 Time: 0.16192 [03/25/2022-13:24:49] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -3797022944823726673 [03/25/2022-13:24:49] [V] [TRT] Tactic: -3797022944823726673 Time: 0.108288 [03/25/2022-13:24:49] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_t1r1s1 Tactic: -3613322253849278738 [03/25/2022-13:24:49] [V] [TRT] Tactic: -3613322253849278738 Time: 0.206208 [03/25/2022-13:24:49] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x32x64_stage4_warpsize4x1x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: -3577322188448771475 [03/25/2022-13:24:49] [V] [TRT] Tactic: -3577322188448771475 Time: 0.124544 [03/25/2022-13:24:49] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage1_warpsize2x4x1_g1_tensor8x8x16_t1r1s1 Tactic: -3531681826488401618 
[03/25/2022-13:24:49] [V] [TRT] Tactic: -3531681826488401618 Time: 0.132352 [03/25/2022-13:24:49] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x16x128_stage3_warpsize1x1x1_g1_sptensor16x8x64_simple_t1r1s1_no_preds Tactic: -3305554949874552860 [03/25/2022-13:24:49] [V] [TRT] Tactic: -3305554949874552860 Time: 0.164224 [03/25/2022-13:24:49] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x192x64_stage3_warpsize4x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -3288585994448820820 [03/25/2022-13:24:49] [V] [TRT] Tactic: -3288585994448820820 Time: 0.092544 [03/25/2022-13:24:49] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x32x64_stage4_warpsize4x1x1_g1_tensor16x8x32_t1r1s1 Tactic: -2754311112012636251 [03/25/2022-13:24:49] [V] [TRT] Tactic: -2754311112012636251 Time: 0.124416 [03/25/2022-13:24:49] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -2432868635536396215 [03/25/2022-13:24:49] [V] [TRT] Tactic: -2432868635536396215 Time: 0.093696 [03/25/2022-13:24:49] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x96x64_stage5_warpsize2x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -2379804152300264660 [03/25/2022-13:24:49] [V] [TRT] Tactic: -2379804152300264660 Time: 0.151296 [03/25/2022-13:24:49] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x128_stage3_warpsize2x2x1_g1_sptensor16x8x64_t1r1s1 Tactic: -2352253835013627337 [03/25/2022-13:24:49] [V] [TRT] Tactic: -2352253835013627337 Time: 0.062208 [03/25/2022-13:24:49] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x96x64_stage4_warpsize4x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -2335587136911650799 [03/25/2022-13:24:49] [V] [TRT] Tactic: -2335587136911650799 Time: 0.106752 [03/25/2022-13:24:49] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage4_warpsize2x4x1_g1_tensor16x8x32_t1r1s1 Tactic: -2315453944962430928 [03/25/2022-13:24:49] [V] [TRT] Tactic: -2315453944962430928 Time: 0.0832 [03/25/2022-13:24:49] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node 
+ Conv_746 + Relu_748 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x128_stage3_warpsize1x1x1_g1_sptensor16x8x64_simple_t1r1s1 Tactic: -2238364958919154661 [03/25/2022-13:24:49] [V] [TRT] Tactic: -2238364958919154661 Time: 0.120576 [03/25/2022-13:24:49] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -1916483171117495388 [03/25/2022-13:24:49] [V] [TRT] Tactic: -1916483171117495388 Time: 0.094336 [03/25/2022-13:24:49] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x16x128_stage3_warpsize1x1x1_g1_sptensor16x8x64_t1r1s1_no_preds Tactic: -1740762957710554518 [03/25/2022-13:24:49] [V] [TRT] Tactic: -1740762957710554518 Time: 0.164352 [03/25/2022-13:24:49] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x96x64_stage4_warpsize4x2x1_g1_tensor16x8x32_t1r1s1 Tactic: -1549742793039499659 [03/25/2022-13:24:49] [V] [TRT] Tactic: -1549742793039499659 Time: 0.11712 [03/25/2022-13:24:49] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: -1499578657823798783 [03/25/2022-13:24:49] [V] [TRT] Tactic: -1499578657823798783 Time: 0.110208 [03/25/2022-13:24:49] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r1s1 Tactic: -1494157908358500249 [03/25/2022-13:24:49] [V] [TRT] Tactic: -1494157908358500249 Time: 0.147328 [03/25/2022-13:24:49] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x4x1_g1_tensor16x8x32_t1r1s1 Tactic: -1328736756812546664 [03/25/2022-13:24:49] [V] [TRT] Tactic: -1328736756812546664 Time: 0.109696 [03/25/2022-13:24:49] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x32x64_stage4_warpsize2x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -1006589727652607355 [03/25/2022-13:24:49] [V] [TRT] Tactic: -1006589727652607355 Time: 0.140672 [03/25/2022-13:24:49] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_t1r1s1_epifadd 
Tactic: -713022856474991236 [03/25/2022-13:24:49] [V] [TRT] Tactic: -713022856474991236 Time: 0.205696 [03/25/2022-13:24:49] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x128_stage3_warpsize1x1x1_g1_sptensor16x8x64_t1r1s1_no_preds Tactic: -619668460699260222 [03/25/2022-13:24:49] [V] [TRT] Tactic: -619668460699260222 Time: 0.119552 [03/25/2022-13:24:49] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x64x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: -405554772060757402 [03/25/2022-13:24:49] [V] [TRT] Tactic: -405554772060757402 Time: 0.093312 [03/25/2022-13:24:49] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: -375949437730908730 [03/25/2022-13:24:49] [V] [TRT] Tactic: -375949437730908730 Time: 0.093056 [03/25/2022-13:24:49] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x128_stage3_warpsize1x2x1_g1_sptensor16x8x64_t1r1s1 Tactic: -233227833606287806 [03/25/2022-13:24:49] [V] [TRT] Tactic: -233227833606287806 Time: 0.093312 [03/25/2022-13:24:49] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -111878368089469751 [03/25/2022-13:24:49] [V] [TRT] Tactic: -111878368089469751 Time: 0.10944 [03/25/2022-13:24:49] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x64x64_stage3_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -48936598874722005 [03/25/2022-13:24:49] [V] [TRT] Tactic: -48936598874722005 Time: 0.07872 [03/25/2022-13:24:49] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x32x64_stage4_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -19707840769375107 [03/25/2022-13:24:49] [V] [TRT] Tactic: -19707840769375107 Time: 0.122112 [03/25/2022-13:24:49] [V] [TRT] Fastest Tactic: 1263683011321748626 Time: 0.059904 [03/25/2022-13:24:49] [V] [TRT] >>>>>>>>>>>>>>> Chose Runner Type: CaskConvolution Tactic: 1263683011321748626 [03/25/2022-13:24:49] [V] [TRT] =============== Computing costs for [03/25/2022-13:24:49] [V] [TRT] *************** Autotuning format combination: Int8(50176,196:4,14,1) -> Int8(25088,49:4,7,1) *************** [03/25/2022-13:24:49] [V] [TRT] --------------- Timing Runner: sections.3.0.identity.conv.module.weight + 
QuantizeLinear_786_quantize_scale_node + Conv_790 (CudaDepthwiseConvolution) [03/25/2022-13:24:49] [V] [TRT] CudaDepthwiseConvolution has no valid tactics for this config, skipping [03/25/2022-13:24:49] [V] [TRT] --------------- Timing Runner: sections.3.0.identity.conv.module.weight + QuantizeLinear_786_quantize_scale_node + Conv_790 (FusedConvActConvolution) [03/25/2022-13:24:49] [V] [TRT] FusedConvActConvolution has no valid tactics for this config, skipping [03/25/2022-13:24:49] [V] [TRT] --------------- Timing Runner: sections.3.0.identity.conv.module.weight + QuantizeLinear_786_quantize_scale_node + Conv_790 (CaskConvolution) [03/25/2022-13:24:49] [V] [TRT] sections.3.0.identity.conv.module.weight + QuantizeLinear_786_quantize_scale_node + Conv_790 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_medium_nn_v1 Tactic: 175853789719975416 [03/25/2022-13:24:49] [V] [TRT] Tactic: 175853789719975416 Time: 0.439552 [03/25/2022-13:24:50] [V] [TRT] sections.3.0.identity.conv.module.weight + QuantizeLinear_786_quantize_scale_node + Conv_790 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_xregs_medium_nn_v1 Tactic: 2171150287007712632 [03/25/2022-13:24:50] [V] [TRT] Tactic: 2171150287007712632 Time: 0.433792 [03/25/2022-13:24:50] [V] [TRT] sections.3.0.identity.conv.module.weight + QuantizeLinear_786_quantize_scale_node + Conv_790 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x64_relu_small_nn_v1 Tactic: 2234457234705232274 [03/25/2022-13:24:50] [V] [TRT] Tactic: 2234457234705232274 Time: 0.395392 [03/25/2022-13:24:50] [V] [TRT] sections.3.0.identity.conv.module.weight + QuantizeLinear_786_quantize_scale_node + Conv_790 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x64_relu_medium_nn_v1 Tactic: 5834048089706882838 [03/25/2022-13:24:50] [V] [TRT] Tactic: 5834048089706882838 Time: 0.396928 [03/25/2022-13:24:50] [V] [TRT] sections.3.0.identity.conv.module.weight + QuantizeLinear_786_quantize_scale_node + Conv_790 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x128_relu_interior_nn_v1 Tactic: 6299962968199310600 [03/25/2022-13:24:50] [V] [TRT] Tactic: 6299962968199310600 Time: 0.404096 [03/25/2022-13:24:50] [V] [TRT] sections.3.0.identity.conv.module.weight + QuantizeLinear_786_quantize_scale_node + Conv_790 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_xregs_interior_nn_v1 Tactic: 6341572697076960911 [03/25/2022-13:24:50] [V] [TRT] Tactic: 6341572697076960911 Time: 0.416128 [03/25/2022-13:24:50] [V] [TRT] sections.3.0.identity.conv.module.weight + QuantizeLinear_786_quantize_scale_node + Conv_790 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_small_nn_v1 Tactic: -8626990807754934295 [03/25/2022-13:24:50] [V] [TRT] Tactic: -8626990807754934295 Time: 0.438528 [03/25/2022-13:24:50] [V] [TRT] sections.3.0.identity.conv.module.weight + QuantizeLinear_786_quantize_scale_node + Conv_790 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x64_relu_interior_nn_v1 Tactic: -8498217049614706532 [03/25/2022-13:24:50] [V] [TRT] Tactic: -8498217049614706532 Time: 0.382976 [03/25/2022-13:24:50] [V] [TRT] sections.3.0.identity.conv.module.weight + QuantizeLinear_786_quantize_scale_node + Conv_790 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_xregs_small_nn_v1 Tactic: -7303593854972602201 [03/25/2022-13:24:50] [V] [TRT] Tactic: -7303593854972602201 Time: 0.422016 [03/25/2022-13:24:50] [V] [TRT] sections.3.0.identity.conv.module.weight + QuantizeLinear_786_quantize_scale_node + Conv_790 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x128_relu_medium_nn_v1 Tactic: 
-6585664687867083638 [03/25/2022-13:24:50] [V] [TRT] Tactic: -6585664687867083638 Time: 0.405504 [03/25/2022-13:24:50] [V] [TRT] sections.3.0.identity.conv.module.weight + QuantizeLinear_786_quantize_scale_node + Conv_790 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_interior_nn_v1 Tactic: -3326139578711341011 [03/25/2022-13:24:50] [V] [TRT] Tactic: -3326139578711341011 Time: 0.41408 [03/25/2022-13:24:50] [V] [TRT] sections.3.0.identity.conv.module.weight + QuantizeLinear_786_quantize_scale_node + Conv_790 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x128_relu_small_nn_v1 Tactic: -683636008127039856 [03/25/2022-13:24:50] [V] [TRT] Tactic: -683636008127039856 Time: 0.405888 [03/25/2022-13:24:50] [V] [TRT] Fastest Tactic: -8498217049614706532 Time: 0.382976 [03/25/2022-13:24:50] [V] [TRT] >>>>>>>>>>>>>>> Chose Runner Type: CaskConvolution Tactic: -8498217049614706532 [03/25/2022-13:24:50] [V] [TRT] *************** Autotuning format combination: Int8(50176,196:4,14,1) -> Int8(3136,49:32,7,1) *************** [03/25/2022-13:24:50] [V] [TRT] --------------- Timing Runner: sections.3.0.identity.conv.module.weight + QuantizeLinear_786_quantize_scale_node + Conv_790 (CaskConvolution) [03/25/2022-13:24:50] [V] [TRT] sections.3.0.identity.conv.module.weight + QuantizeLinear_786_quantize_scale_node + Conv_790 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_small_c32_nn_v1 Tactic: 1100922622480907544 [03/25/2022-13:24:50] [V] [TRT] Tactic: 1100922622480907544 Time: 0.438016 [03/25/2022-13:24:50] [V] [TRT] sections.3.0.identity.conv.module.weight + QuantizeLinear_786_quantize_scale_node + Conv_790 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x128_relu_interior_c32_nn_v1 Tactic: 2855900226702061782 [03/25/2022-13:24:50] [V] [TRT] Tactic: 2855900226702061782 Time: 0.405376 [03/25/2022-13:24:50] [V] [TRT] sections.3.0.identity.conv.module.weight + QuantizeLinear_786_quantize_scale_node + Conv_790 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x64_relu_medium_c32_nn_v1 Tactic: 3606311198834416176 [03/25/2022-13:24:50] [V] [TRT] Tactic: 3606311198834416176 Time: 0.396928 [03/25/2022-13:24:50] [V] [TRT] sections.3.0.identity.conv.module.weight + QuantizeLinear_786_quantize_scale_node + Conv_790 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x128_relu_small_c32_nn_v1 Tactic: 4325765560739862899 [03/25/2022-13:24:50] [V] [TRT] Tactic: 4325765560739862899 Time: 0.40704 [03/25/2022-13:24:50] [V] [TRT] sections.3.0.identity.conv.module.weight + QuantizeLinear_786_quantize_scale_node + Conv_790 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x64_relu_interior_c32_nn_v1 Tactic: 8803458114157674373 [03/25/2022-13:24:50] [V] [TRT] Tactic: 8803458114157674373 Time: 0.382848 [03/25/2022-13:24:50] [V] [TRT] sections.3.0.identity.conv.module.weight + QuantizeLinear_786_quantize_scale_node + Conv_790 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_interior_c32_nn_v1 Tactic: -6934773036503365000 [03/25/2022-13:24:50] [V] [TRT] Tactic: -6934773036503365000 Time: 0.413696 [03/25/2022-13:24:50] [V] [TRT] sections.3.0.identity.conv.module.weight + QuantizeLinear_786_quantize_scale_node + Conv_790 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_xregs_interior_c32_nn_v1 Tactic: -4431642509665791294 [03/25/2022-13:24:50] [V] [TRT] Tactic: -4431642509665791294 Time: 0.403584 [03/25/2022-13:24:50] [V] [TRT] sections.3.0.identity.conv.module.weight + QuantizeLinear_786_quantize_scale_node + Conv_790 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x128_relu_medium_c32_nn_v1 Tactic: 
-4255737803793506479 [03/25/2022-13:24:50] [V] [TRT] Tactic: -4255737803793506479 Time: 0.406272 [03/25/2022-13:24:50] [V] [TRT] sections.3.0.identity.conv.module.weight + QuantizeLinear_786_quantize_scale_node + Conv_790 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_xregs_small_c32_nn_v1 Tactic: -3958182351168863467 [03/25/2022-13:24:50] [V] [TRT] Tactic: -3958182351168863467 Time: 0.421632 [03/25/2022-13:24:50] [V] [TRT] sections.3.0.identity.conv.module.weight + QuantizeLinear_786_quantize_scale_node + Conv_790 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_xregs_medium_c32_nn_v1 Tactic: -3111968753064955248 [03/25/2022-13:24:50] [V] [TRT] Tactic: -3111968753064955248 Time: 0.4448 [03/25/2022-13:24:50] [V] [TRT] sections.3.0.identity.conv.module.weight + QuantizeLinear_786_quantize_scale_node + Conv_790 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_medium_c32_nn_v1 Tactic: -1492575840277333548 [03/25/2022-13:24:50] [V] [TRT] Tactic: -1492575840277333548 Time: 0.47232 [03/25/2022-13:24:50] [V] [TRT] sections.3.0.identity.conv.module.weight + QuantizeLinear_786_quantize_scale_node + Conv_790 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x64_relu_small_c32_nn_v1 Tactic: -868495160148524802 [03/25/2022-13:24:50] [V] [TRT] Tactic: -868495160148524802 Time: 0.427008 [03/25/2022-13:24:50] [V] [TRT] Fastest Tactic: 8803458114157674373 Time: 0.382848 [03/25/2022-13:24:50] [V] [TRT] >>>>>>>>>>>>>>> Chose Runner Type: CaskConvolution Tactic: 8803458114157674373 [03/25/2022-13:24:50] [V] [TRT] *************** Autotuning format combination: Int8(6272,196:32,14,1) -> Int8(3136,49:32,7,1) *************** [03/25/2022-13:24:50] [V] [TRT] --------------- Timing Runner: sections.3.0.identity.conv.module.weight + QuantizeLinear_786_quantize_scale_node + Conv_790 (CudaGroupConvolution) [03/25/2022-13:24:50] [V] [TRT] CudaGroupConvolution has no valid tactics for this config, skipping [03/25/2022-13:24:50] [V] [TRT] --------------- Timing Runner: sections.3.0.identity.conv.module.weight + QuantizeLinear_786_quantize_scale_node + Conv_790 (CudaDepthwiseConvolution) [03/25/2022-13:24:50] [V] [TRT] CudaDepthwiseConvolution has no valid tactics for this config, skipping [03/25/2022-13:24:50] [V] [TRT] --------------- Timing Runner: sections.3.0.identity.conv.module.weight + QuantizeLinear_786_quantize_scale_node + Conv_790 (FusedConvActConvolution) [03/25/2022-13:24:50] [V] [TRT] FusedConvActConvolution has no valid tactics for this config, skipping [03/25/2022-13:24:50] [V] [TRT] --------------- Timing Runner: sections.3.0.identity.conv.module.weight + QuantizeLinear_786_quantize_scale_node + Conv_790 (CaskConvolution) [03/25/2022-13:24:50] [V] [TRT] sections.3.0.identity.conv.module.weight + QuantizeLinear_786_quantize_scale_node + Conv_790 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage1_warpsize4x1x1_g1_tensor8x8x16_t1r1s1 Tactic: 68468667201176803 [03/25/2022-13:24:50] [V] [TRT] Tactic: 68468667201176803 Time: 0.157568 [03/25/2022-13:24:50] [V] [TRT] sections.3.0.identity.conv.module.weight + QuantizeLinear_786_quantize_scale_node + Conv_790 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x192x64_stage3_warpsize4x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 125145153013230687 [03/25/2022-13:24:50] [V] [TRT] Tactic: 125145153013230687 Time: 0.095872 [03/25/2022-13:24:50] [V] [TRT] 
sections.3.0.identity.conv.module.weight + QuantizeLinear_786_quantize_scale_node + Conv_790 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_epifadd Tactic: 177040020707947851 [03/25/2022-13:24:50] [V] [TRT] Tactic: 177040020707947851 Time: 0.246528 [03/25/2022-13:24:50] [V] [TRT] sections.3.0.identity.conv.module.weight + QuantizeLinear_786_quantize_scale_node + Conv_790 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32 Tactic: 328135613486708155 [03/25/2022-13:24:50] [V] [TRT] Tactic: 328135613486708155 Time: 0.281728 [03/25/2022-13:24:50] [V] [TRT] sections.3.0.identity.conv.module.weight + QuantizeLinear_786_quantize_scale_node + Conv_790 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x128_ldg16_relu_interior_nt_v1 Tactic: 434957160407688216 [03/25/2022-13:24:50] [V] [TRT] Tactic: 434957160407688216 Time: 0.16512 [03/25/2022-13:24:50] [V] [TRT] sections.3.0.identity.conv.module.weight + QuantizeLinear_786_quantize_scale_node + Conv_790 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage4_warpsize4x2x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: 805889586762897346 [03/25/2022-13:24:50] [V] [TRT] Tactic: 805889586762897346 Time: 0.091904 [03/25/2022-13:24:50] [V] [TRT] sections.3.0.identity.conv.module.weight + QuantizeLinear_786_quantize_scale_node + Conv_790 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x128_stage3_warpsize4x2x1_g1_sptensor16x8x64_t1r1s1 Tactic: 857001784974286465 [03/25/2022-13:24:50] [V] [TRT] Tactic: 857001784974286465 Time: 0.075776 [03/25/2022-13:24:50] [V] [TRT] sections.3.0.identity.conv.module.weight + QuantizeLinear_786_quantize_scale_node + Conv_790 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x192x64_stage3_warpsize4x2x1_g1_tensor16x8x32 Tactic: 1111159740952609683 [03/25/2022-13:24:50] [V] [TRT] Tactic: 1111159740952609683 Time: 0.102784 [03/25/2022-13:24:50] [V] [TRT] sections.3.0.identity.conv.module.weight + QuantizeLinear_786_quantize_scale_node + Conv_790 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32 Tactic: 1134860903395928905 [03/25/2022-13:24:50] [V] [TRT] Tactic: 1134860903395928905 Time: 0.11712 [03/25/2022-13:24:50] [V] [TRT] sections.3.0.identity.conv.module.weight + QuantizeLinear_786_quantize_scale_node + Conv_790 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32 Tactic: 1276591930377039442 [03/25/2022-13:24:50] [V] [TRT] Tactic: 1276591930377039442 Time: 0.134784 [03/25/2022-13:24:50] [V] [TRT] sections.3.0.identity.conv.module.weight + QuantizeLinear_786_quantize_scale_node + Conv_790 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x96x64_stage5_warpsize2x1x1_g1_tensor16x8x32 Tactic: 1399501420456320585 [03/25/2022-13:24:50] [V] [TRT] Tactic: 1399501420456320585 Time: 0.1536 [03/25/2022-13:24:50] [V] [TRT] 
sections.3.0.identity.conv.module.weight + QuantizeLinear_786_quantize_scale_node + Conv_790 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_epifadd Tactic: 1550399266192842845 [03/25/2022-13:24:50] [V] [TRT] Tactic: 1550399266192842845 Time: 0.179584 [03/25/2022-13:24:50] [V] [TRT] sections.3.0.identity.conv.module.weight + QuantizeLinear_786_quantize_scale_node + Conv_790 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 1583811548148740665 [03/25/2022-13:24:50] [V] [TRT] Tactic: 1583811548148740665 Time: 0.099968 [03/25/2022-13:24:50] [V] [TRT] sections.3.0.identity.conv.module.weight + QuantizeLinear_786_quantize_scale_node + Conv_790 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 1701344857577810806 [03/25/2022-13:24:50] [V] [TRT] Tactic: 1701344857577810806 Time: 0.114304 [03/25/2022-13:24:50] [V] [TRT] sections.3.0.identity.conv.module.weight + QuantizeLinear_786_quantize_scale_node + Conv_790 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x32x64_stage4_warpsize2x1x1_g1_tensor16x8x32_t1r1s1 Tactic: 1797231177354918208 [03/25/2022-13:24:50] [V] [TRT] Tactic: 1797231177354918208 Time: 0.1984 [03/25/2022-13:24:50] [V] [TRT] sections.3.0.identity.conv.module.weight + QuantizeLinear_786_quantize_scale_node + Conv_790 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage1_warpsize2x2x1_g1_tensor8x8x16 Tactic: 2133329569091732311 [03/25/2022-13:24:50] [V] [TRT] Tactic: 2133329569091732311 Time: 0.157184 [03/25/2022-13:24:50] [V] [TRT] sections.3.0.identity.conv.module.weight + QuantizeLinear_786_quantize_scale_node + Conv_790 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x128_stage3_warpsize4x2x1_g1_sptensor16x8x64_no_preds Tactic: 2186058294798640800 [03/25/2022-13:24:50] [V] [TRT] Tactic: 2186058294798640800 Time: 0.081024 [03/25/2022-13:24:50] [V] [TRT] sections.3.0.identity.conv.module.weight + QuantizeLinear_786_quantize_scale_node + Conv_790 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage4_warpsize4x1x1_g1_tensor16x8x32 Tactic: 2325023763229477890 [03/25/2022-13:24:50] [V] [TRT] Tactic: 2325023763229477890 Time: 0.129408 [03/25/2022-13:24:50] [V] [TRT] sections.3.0.identity.conv.module.weight + QuantizeLinear_786_quantize_scale_node + Conv_790 Set Tactic Name: ampere_int8_i8816cudnn_int8_128x128_ldg16_relu_interior_nt_v1 Tactic: 2346437292116182513 [03/25/2022-13:24:50] [V] [TRT] Tactic: 2346437292116182513 Time: 0.161408 [03/25/2022-13:24:50] [V] [TRT] sections.3.0.identity.conv.module.weight + QuantizeLinear_786_quantize_scale_node + Conv_790 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x128_stage3_warpsize2x2x1_g1_sptensor16x8x64_no_preds Tactic: 2434539343777234419 [03/25/2022-13:24:50] [V] [TRT] Tactic: 2434539343777234419 Time: 0.088832 
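Note how the sm80_xmma_fprop_sparse_conv_* (sptensor16x8x64) tactics, which --sparsity=enable makes eligible, post the lowest times among the candidates shown for the Conv_746 block above (roughly 0.061 ms versus about 0.073 ms for the best dense tactic shown). For reference, a roughly equivalent builder configuration through the TensorRT Python API, assuming the same QAT ONNX model as in the command line above (the model carries Q/DQ nodes, so no calibrator is attached); this is a sketch of the relevant flags, not a complete runner:

import tensorrt as trt

logger = trt.Logger(trt.Logger.VERBOSE)
builder = trt.Builder(logger)
network = builder.create_network(
    1 << int(trt.NetworkDefinitionCreationFlag.EXPLICIT_BATCH))
parser = trt.OnnxParser(network, logger)
with open("resnet50_quant_sparse.onnx", "rb") as f:
    if not parser.parse(f.read()):
        raise RuntimeError(parser.get_error(0))

config = builder.create_builder_config()
config.set_flag(trt.BuilderFlag.INT8)            # matches --int8
config.set_flag(trt.BuilderFlag.SPARSE_WEIGHTS)  # matches --sparsity=enable
config.max_workspace_size = 16 << 20             # 16 MiB, as in the log
engine_bytes = builder.build_serialized_network(network, config)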
[03/25/2022-13:24:50] [V] [TRT] sections.3.0.identity.conv.module.weight + QuantizeLinear_786_quantize_scale_node + Conv_790 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_singleBuffer_interior_nt_v1 Tactic: 2522133112320625287 [03/25/2022-13:24:50] [V] [TRT] Tactic: 2522133112320625287 Time: 0.161536 [03/25/2022-13:24:50] [V] [TRT] sections.3.0.identity.conv.module.weight + QuantizeLinear_786_quantize_scale_node + Conv_790 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16 Tactic: 2579824863892891529 [03/25/2022-13:24:50] [V] [TRT] Tactic: 2579824863892891529 Time: 0.235136 [03/25/2022-13:24:50] [V] [TRT] sections.3.0.identity.conv.module.weight + QuantizeLinear_786_quantize_scale_node + Conv_790 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x64x64_stage3_warpsize4x1x1_g1_tensor16x8x32 Tactic: 2783960536172159663 [03/25/2022-13:24:50] [V] [TRT] Tactic: 2783960536172159663 Time: 0.114048 [03/25/2022-13:24:50] [V] [TRT] sections.3.0.identity.conv.module.weight + QuantizeLinear_786_quantize_scale_node + Conv_790 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x2x1_g1_tensor16x8x32 Tactic: 2821711838552913693 [03/25/2022-13:24:50] [V] [TRT] Tactic: 2821711838552913693 Time: 0.111616 [03/25/2022-13:24:50] [V] [TRT] sections.3.0.identity.conv.module.weight + QuantizeLinear_786_quantize_scale_node + Conv_790 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x128_stage3_warpsize1x2x1_g1_sptensor16x8x64 Tactic: 2945009978756227538 [03/25/2022-13:24:50] [V] [TRT] Tactic: 2945009978756227538 Time: 0.121216 [03/25/2022-13:24:50] [V] [TRT] sections.3.0.identity.conv.module.weight + QuantizeLinear_786_quantize_scale_node + Conv_790 Set Tactic Name: ampere_int8_i8816cudnn_int8_128x128_ldg16_relu_medium_nt_v1 Tactic: 2985940154541537814 [03/25/2022-13:24:50] [V] [TRT] Tactic: 2985940154541537814 Time: 0.159872 [03/25/2022-13:24:50] [V] [TRT] sections.3.0.identity.conv.module.weight + QuantizeLinear_786_quantize_scale_node + Conv_790 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r1s1 Tactic: 3242897809704328258 [03/25/2022-13:24:50] [V] [TRT] Tactic: 3242897809704328258 Time: 0.163328 [03/25/2022-13:24:50] [V] [TRT] sections.3.0.identity.conv.module.weight + QuantizeLinear_786_quantize_scale_node + Conv_790 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32 Tactic: 3456719996792527006 [03/25/2022-13:24:50] [V] [TRT] Tactic: 3456719996792527006 Time: 0.120192 [03/25/2022-13:24:50] [V] [TRT] sections.3.0.identity.conv.module.weight + QuantizeLinear_786_quantize_scale_node + Conv_790 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 3538565962642681625 [03/25/2022-13:24:50] [V] [TRT] Tactic: 3538565962642681625 Time: 0.129792 [03/25/2022-13:24:50] [V] [TRT] sections.3.0.identity.conv.module.weight + 
QuantizeLinear_786_quantize_scale_node + Conv_790 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x128_stage3_warpsize4x2x1_g1_sptensor16x8x64 Tactic: 3651043333819148268 [03/25/2022-13:24:50] [V] [TRT] Tactic: 3651043333819148268 Time: 0.098816 [03/25/2022-13:24:50] [V] [TRT] sections.3.0.identity.conv.module.weight + QuantizeLinear_786_quantize_scale_node + Conv_790 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x128_stage3_warpsize1x4x1_g1_sptensor16x8x64_no_preds Tactic: 3866129666720518662 [03/25/2022-13:24:50] [V] [TRT] Tactic: 3866129666720518662 Time: 0.1216 [03/25/2022-13:24:50] [V] [TRT] sections.3.0.identity.conv.module.weight + QuantizeLinear_786_quantize_scale_node + Conv_790 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x128_ldg16_relu_medium_nt_v1 Tactic: 3899284354987683408 [03/25/2022-13:24:50] [V] [TRT] Tactic: 3899284354987683408 Time: 0.16832 [03/25/2022-13:24:50] [V] [TRT] sections.3.0.identity.conv.module.weight + QuantizeLinear_786_quantize_scale_node + Conv_790 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage6_warpsize2x2x1_g1_tensor16x8x32 Tactic: 4042202769383439184 [03/25/2022-13:24:50] [V] [TRT] Tactic: 4042202769383439184 Time: 0.115712 [03/25/2022-13:24:50] [V] [TRT] sections.3.0.identity.conv.module.weight + QuantizeLinear_786_quantize_scale_node + Conv_790 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x32x64_stage4_warpsize4x1x1_g1_tensor16x8x32_epifadd Tactic: 4259547356717612415 [03/25/2022-13:24:51] [V] [TRT] Tactic: 4259547356717612415 Time: 0.198912 [03/25/2022-13:24:51] [V] [TRT] sections.3.0.identity.conv.module.weight + QuantizeLinear_786_quantize_scale_node + Conv_790 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x128_stage3_warpsize2x2x1_g1_sptensor16x8x64 Tactic: 4414594337986714263 [03/25/2022-13:24:51] [V] [TRT] Tactic: 4414594337986714263 Time: 0.104192 [03/25/2022-13:24:51] [V] [TRT] sections.3.0.identity.conv.module.weight + QuantizeLinear_786_quantize_scale_node + Conv_790 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_small_nt_v1 Tactic: 4717285412741024953 [03/25/2022-13:24:51] [V] [TRT] Tactic: 4717285412741024953 Time: 0.166144 [03/25/2022-13:24:51] [V] [TRT] sections.3.0.identity.conv.module.weight + QuantizeLinear_786_quantize_scale_node + Conv_790 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage4_warpsize2x4x1_g1_tensor16x8x32 Tactic: 4734519122557206480 [03/25/2022-13:24:51] [V] [TRT] Tactic: 4734519122557206480 Time: 0.104704 [03/25/2022-13:24:51] [V] [TRT] sections.3.0.identity.conv.module.weight + QuantizeLinear_786_quantize_scale_node + Conv_790 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x128_stage3_warpsize2x2x1_g1_sptensor16x8x64_t1r1s1_no_preds Tactic: 4909502217677847353 [03/25/2022-13:24:51] [V] [TRT] Tactic: 4909502217677847353 Time: 0.084096 [03/25/2022-13:24:51] [V] [TRT] sections.3.0.identity.conv.module.weight + QuantizeLinear_786_quantize_scale_node + Conv_790 Set Tactic Name: 
sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32 Tactic: 4922297020351187339 [03/25/2022-13:24:51] [V] [TRT] Tactic: 4922297020351187339 Time: 0.107264 [03/25/2022-13:24:51] [V] [TRT] sections.3.0.identity.conv.module.weight + QuantizeLinear_786_quantize_scale_node + Conv_790 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage4_warpsize2x4x1_g1_tensor16x8x32_epifadd Tactic: 5121596860264626879 [03/25/2022-13:24:51] [V] [TRT] Tactic: 5121596860264626879 Time: 0.098432 [03/25/2022-13:24:51] [V] [TRT] sections.3.0.identity.conv.module.weight + QuantizeLinear_786_quantize_scale_node + Conv_790 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_interior_nt_v1 Tactic: 5126565865931538390 [03/25/2022-13:24:51] [V] [TRT] Tactic: 5126565865931538390 Time: 0.172544 [03/25/2022-13:24:51] [V] [TRT] sections.3.0.identity.conv.module.weight + QuantizeLinear_786_quantize_scale_node + Conv_790 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage6_warpsize2x2x1_g1_tensor16x8x32_epifadd Tactic: 5158259316594207439 [03/25/2022-13:24:51] [V] [TRT] Tactic: 5158259316594207439 Time: 0.1184 [03/25/2022-13:24:51] [V] [TRT] sections.3.0.identity.conv.module.weight + QuantizeLinear_786_quantize_scale_node + Conv_790 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: 5375256703210220108 [03/25/2022-13:24:51] [V] [TRT] Tactic: 5375256703210220108 Time: 0.124032 [03/25/2022-13:24:51] [V] [TRT] sections.3.0.identity.conv.module.weight + QuantizeLinear_786_quantize_scale_node + Conv_790 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x16x128_stage3_warpsize1x1x1_g1_sptensor16x8x64_no_preds Tactic: 5380489069875971144 [03/25/2022-13:24:51] [V] [TRT] Tactic: 5380489069875971144 Time: 0.20224 [03/25/2022-13:24:51] [V] [TRT] sections.3.0.identity.conv.module.weight + QuantizeLinear_786_quantize_scale_node + Conv_790 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x32x64_stage4_warpsize2x1x1_g1_tensor16x8x32 Tactic: 5424417905073460656 [03/25/2022-13:24:51] [V] [TRT] Tactic: 5424417905073460656 Time: 0.185344 [03/25/2022-13:24:51] [V] [TRT] sections.3.0.identity.conv.module.weight + QuantizeLinear_786_quantize_scale_node + Conv_790 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x256x64_stage3_warpsize2x4x1_g1_tensor16x8x32 Tactic: 5442043907221427810 [03/25/2022-13:24:51] [V] [TRT] Tactic: 5442043907221427810 Time: 0.104704 [03/25/2022-13:24:51] [V] [TRT] sections.3.0.identity.conv.module.weight + QuantizeLinear_786_quantize_scale_node + Conv_790 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x128_stage3_warpsize1x1x1_g1_sptensor16x8x64_no_preds Tactic: 5698083265414543143 [03/25/2022-13:24:51] [V] [TRT] Tactic: 5698083265414543143 Time: 0.158208 [03/25/2022-13:24:51] [V] [TRT] sections.3.0.identity.conv.module.weight + QuantizeLinear_786_quantize_scale_node + Conv_790 Set 
Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x128x64_stage3_warpsize4x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 6007888770437705057 [03/25/2022-13:24:51] [V] [TRT] Tactic: 6007888770437705057 Time: 0.099712 [03/25/2022-13:24:51] [V] [TRT] sections.3.0.identity.conv.module.weight + QuantizeLinear_786_quantize_scale_node + Conv_790 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage1_warpsize2x1x1_g1_tensor8x8x16 Tactic: 6394572396369862482 [03/25/2022-13:24:51] [V] [TRT] Tactic: 6394572396369862482 Time: 0.288896 [03/25/2022-13:24:51] [V] [TRT] sections.3.0.identity.conv.module.weight + QuantizeLinear_786_quantize_scale_node + Conv_790 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32_t1r1s1 Tactic: 6405251167055673379 [03/25/2022-13:24:51] [V] [TRT] Tactic: 6405251167055673379 Time: 0.110848 [03/25/2022-13:24:51] [V] [TRT] sections.3.0.identity.conv.module.weight + QuantizeLinear_786_quantize_scale_node + Conv_790 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage4_warpsize4x1x1_g1_tensor16x8x32_t1r1s1 Tactic: 6433368103202497147 [03/25/2022-13:24:51] [V] [TRT] Tactic: 6433368103202497147 Time: 0.115584 [03/25/2022-13:24:51] [V] [TRT] sections.3.0.identity.conv.module.weight + QuantizeLinear_786_quantize_scale_node + Conv_790 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage4_warpsize4x2x1_g1_tensor16x8x32_epifadd Tactic: 6434020722187266170 [03/25/2022-13:24:51] [V] [TRT] Tactic: 6434020722187266170 Time: 0.118784 [03/25/2022-13:24:51] [V] [TRT] sections.3.0.identity.conv.module.weight + QuantizeLinear_786_quantize_scale_node + Conv_790 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 6441948709525127755 [03/25/2022-13:24:51] [V] [TRT] Tactic: 6441948709525127755 Time: 0.1696 [03/25/2022-13:24:51] [V] [TRT] sections.3.0.identity.conv.module.weight + QuantizeLinear_786_quantize_scale_node + Conv_790 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 6457435868048963632 [03/25/2022-13:24:51] [V] [TRT] Tactic: 6457435868048963632 Time: 0.104704 [03/25/2022-13:24:51] [V] [TRT] sections.3.0.identity.conv.module.weight + QuantizeLinear_786_quantize_scale_node + Conv_790 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x128_stage3_warpsize1x1x1_g1_sptensor16x8x64_t1r1s1 Tactic: 6510345569544721081 [03/25/2022-13:24:51] [V] [TRT] Tactic: 6510345569544721081 Time: 0.157696 [03/25/2022-13:24:51] [V] [TRT] sections.3.0.identity.conv.module.weight + QuantizeLinear_786_quantize_scale_node + Conv_790 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32 Tactic: 6781129591847482048 [03/25/2022-13:24:51] [V] [TRT] Tactic: 6781129591847482048 Time: 0.147328 
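
The kernel names being timed here encode the launch configuration: tilesizeMxNxK is the threadblock tile, stageN the number of pipeline stages, warpsizeAxBxC the warp layout, gN the group count, and tensorMxNxK or sptensorMxNxK the dense or 2:4-sparse Tensor Core MMA shape; suffixes such as t1r1s1 mark kernels specialized for 1x1 filters. Below is a small illustrative parser for these fields; the field meanings are inferred from the naming convention, not from any documented TensorRT API:

    import re

    # Inferred from the kernel naming scheme seen in the verbose log; illustrative only.
    TACTIC_RE = re.compile(
        r"tilesize(?P<tile>\d+x\d+x\d+)"
        r"_stage(?P<stages>\d+)"
        r"_warpsize(?P<warps>\d+x\d+x\d+)"
        r"_g(?P<groups>\d+)"
        r"_(?P<sp>sp)?tensor(?P<mma>\d+x\d+x\d+)"
    )

    def describe(name: str) -> dict:
        m = TACTIC_RE.search(name)
        if m is None:
            return {"raw": name}
        fields = m.groupdict()
        fields["sparse"] = fields.pop("sp") == "sp"  # 'sptensor' = 2:4 sparse Tensor Cores
        return fields

    print(describe(
        "sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32"
        "kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x128_stage3_warpsize2x2x1"
        "_g1_sptensor16x8x64_t1r1s1"
    ))
    # {'tile': '128x128x128', 'stages': '3', 'warps': '2x2x1', 'groups': '1',
    #  'mma': '16x8x64', 'sparse': True}
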
[03/25/2022-13:24:51] [V] [TRT] sections.3.0.identity.conv.module.weight + QuantizeLinear_786_quantize_scale_node + Conv_790 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 6925201228918187099 [03/25/2022-13:24:51] [V] [TRT] Tactic: 6925201228918187099 Time: 0.093312 [03/25/2022-13:24:51] [V] [TRT] sections.3.0.identity.conv.module.weight + QuantizeLinear_786_quantize_scale_node + Conv_790 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 6991524515605108718 [03/25/2022-13:24:51] [V] [TRT] Tactic: 6991524515605108718 Time: 0.110848 [03/25/2022-13:24:51] [V] [TRT] sections.3.0.identity.conv.module.weight + QuantizeLinear_786_quantize_scale_node + Conv_790 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x128x64_stage3_warpsize4x2x1_g1_tensor16x8x32 Tactic: 7077570591813340966 [03/25/2022-13:24:51] [V] [TRT] Tactic: 7077570591813340966 Time: 0.11648 [03/25/2022-13:24:51] [V] [TRT] sections.3.0.identity.conv.module.weight + QuantizeLinear_786_quantize_scale_node + Conv_790 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x64x64_stage3_warpsize4x1x1_g1_tensor16x8x32_t1r1s1 Tactic: 7318929579222925725 [03/25/2022-13:24:51] [V] [TRT] Tactic: 7318929579222925725 Time: 0.123136 [03/25/2022-13:24:51] [V] [TRT] sections.3.0.identity.conv.module.weight + QuantizeLinear_786_quantize_scale_node + Conv_790 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32_epifadd Tactic: 7504901284678552178 [03/25/2022-13:24:51] [V] [TRT] Tactic: 7504901284678552178 Time: 0.104448 [03/25/2022-13:24:51] [V] [TRT] sections.3.0.identity.conv.module.weight + QuantizeLinear_786_quantize_scale_node + Conv_790 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: 7731430299029542276 [03/25/2022-13:24:51] [V] [TRT] Tactic: 7731430299029542276 Time: 0.093184 [03/25/2022-13:24:51] [V] [TRT] sections.3.0.identity.conv.module.weight + QuantizeLinear_786_quantize_scale_node + Conv_790 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage4_warpsize2x4x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: 7738495016763012180 [03/25/2022-13:24:51] [V] [TRT] Tactic: 7738495016763012180 Time: 0.096768 [03/25/2022-13:24:51] [V] [TRT] sections.3.0.identity.conv.module.weight + QuantizeLinear_786_quantize_scale_node + Conv_790 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x128_stage3_warpsize1x4x1_g1_sptensor16x8x64_t1r1s1_no_preds Tactic: 7886967395128926382 [03/25/2022-13:24:51] [V] [TRT] Tactic: 7886967395128926382 Time: 0.123008 [03/25/2022-13:24:51] [V] [TRT] sections.3.0.identity.conv.module.weight + QuantizeLinear_786_quantize_scale_node + Conv_790 Set Tactic Name: 
sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage4_warpsize4x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 8234775147403903473 [03/25/2022-13:24:51] [V] [TRT] Tactic: 8234775147403903473 Time: 0.105728 [03/25/2022-13:24:51] [V] [TRT] sections.3.0.identity.conv.module.weight + QuantizeLinear_786_quantize_scale_node + Conv_790 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_indexed_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32_epifadd Tactic: 8751622450593766232 [03/25/2022-13:24:51] [V] [TRT] Tactic: 8751622450593766232 Time: 0.114048 [03/25/2022-13:24:51] [V] [TRT] sections.3.0.identity.conv.module.weight + QuantizeLinear_786_quantize_scale_node + Conv_790 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x128x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 8765382722978397630 [03/25/2022-13:24:51] [V] [TRT] Tactic: 8765382722978397630 Time: 0.09472 [03/25/2022-13:24:51] [V] [TRT] sections.3.0.identity.conv.module.weight + QuantizeLinear_786_quantize_scale_node + Conv_790 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x16x128_stage3_warpsize1x1x1_g1_sptensor16x8x64_t1r1s1 Tactic: 9062173295331155069 [03/25/2022-13:24:51] [V] [TRT] Tactic: 9062173295331155069 Time: 0.202496 [03/25/2022-13:24:51] [V] [TRT] sections.3.0.identity.conv.module.weight + QuantizeLinear_786_quantize_scale_node + Conv_790 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_indexed_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32 Tactic: 9064458886956700976 [03/25/2022-13:24:51] [V] [TRT] Tactic: 9064458886956700976 Time: 0.114176 [03/25/2022-13:24:51] [V] [TRT] sections.3.0.identity.conv.module.weight + QuantizeLinear_786_quantize_scale_node + Conv_790 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage4_warpsize4x2x1_g1_tensor16x8x32 Tactic: -9165697322068360861 [03/25/2022-13:24:51] [V] [TRT] Tactic: -9165697322068360861 Time: 0.127104 [03/25/2022-13:24:51] [V] [TRT] sections.3.0.identity.conv.module.weight + QuantizeLinear_786_quantize_scale_node + Conv_790 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_singleBuffer_small_nt_v1 Tactic: -9118785798277698619 [03/25/2022-13:24:51] [V] [TRT] Tactic: -9118785798277698619 Time: 0.169344 [03/25/2022-13:24:51] [V] [TRT] sections.3.0.identity.conv.module.weight + QuantizeLinear_786_quantize_scale_node + Conv_790 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16 Tactic: -9108166971364503411 [03/25/2022-13:24:51] [V] [TRT] Tactic: -9108166971364503411 Time: 0.190208 [03/25/2022-13:24:51] [V] [TRT] sections.3.0.identity.conv.module.weight + QuantizeLinear_786_quantize_scale_node + Conv_790 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r1s1 Tactic: -8943710627305202139 [03/25/2022-13:24:51] [V] [TRT] Tactic: -8943710627305202139 Time: 0.163328 [03/25/2022-13:24:51] [V] [TRT] sections.3.0.identity.conv.module.weight + 
QuantizeLinear_786_quantize_scale_node + Conv_790 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage1_warpsize4x2x1_g1_tensor8x8x16 Tactic: -8861822316054763526 [03/25/2022-13:24:51] [V] [TRT] Tactic: -8861822316054763526 Time: 0.170112 [03/25/2022-13:24:51] [V] [TRT] sections.3.0.identity.conv.module.weight + QuantizeLinear_786_quantize_scale_node + Conv_790 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x128_stage3_warpsize1x1x1_g1_sptensor16x8x64 Tactic: -8791277710877987710 [03/25/2022-13:24:51] [V] [TRT] Tactic: -8791277710877987710 Time: 0.149376 [03/25/2022-13:24:51] [V] [TRT] sections.3.0.identity.conv.module.weight + QuantizeLinear_786_quantize_scale_node + Conv_790 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x128x64_stage3_warpsize2x2x1_g1_tensor16x8x32 Tactic: -8691377209893505057 [03/25/2022-13:24:51] [V] [TRT] Tactic: -8691377209893505057 Time: 0.097152 [03/25/2022-13:24:51] [V] [TRT] sections.3.0.identity.conv.module.weight + QuantizeLinear_786_quantize_scale_node + Conv_790 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: -8638624340850784688 [03/25/2022-13:24:51] [V] [TRT] Tactic: -8638624340850784688 Time: 0.137216 [03/25/2022-13:24:51] [V] [TRT] sections.3.0.identity.conv.module.weight + QuantizeLinear_786_quantize_scale_node + Conv_790 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage4_warpsize4x1x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: -8556775352640313933 [03/25/2022-13:24:51] [V] [TRT] Tactic: -8556775352640313933 Time: 0.115968 [03/25/2022-13:24:51] [V] [TRT] sections.3.0.identity.conv.module.weight + QuantizeLinear_786_quantize_scale_node + Conv_790 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32 Tactic: -8520292213102999339 [03/25/2022-13:24:51] [V] [TRT] Tactic: -8520292213102999339 Time: 0.125824 [03/25/2022-13:24:51] [V] [TRT] sections.3.0.identity.conv.module.weight + QuantizeLinear_786_quantize_scale_node + Conv_790 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32 Tactic: -8263994888336646547 [03/25/2022-13:24:51] [V] [TRT] Tactic: -8263994888336646547 Time: 0.105472 [03/25/2022-13:24:51] [V] [TRT] sections.3.0.identity.conv.module.weight + QuantizeLinear_786_quantize_scale_node + Conv_790 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32 Tactic: -8205948405243401049 [03/25/2022-13:24:51] [V] [TRT] Tactic: -8205948405243401049 Time: 0.194304 [03/25/2022-13:24:51] [V] [TRT] sections.3.0.identity.conv.module.weight + QuantizeLinear_786_quantize_scale_node + Conv_790 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage1_warpsize4x2x1_g1_tensor8x8x16_t1r1s1 Tactic: -8172318747337038866 [03/25/2022-13:24:51] [V] [TRT] Tactic: 
-8172318747337038866 Time: 0.167808 [03/25/2022-13:24:51] [V] [TRT] sections.3.0.identity.conv.module.weight + QuantizeLinear_786_quantize_scale_node + Conv_790 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x96x64_stage5_warpsize2x1x1_g1_tensor16x8x32_t1r1s1 Tactic: -7844028314176826857 [03/25/2022-13:24:51] [V] [TRT] Tactic: -7844028314176826857 Time: 0.15808 [03/25/2022-13:24:51] [V] [TRT] sections.3.0.identity.conv.module.weight + QuantizeLinear_786_quantize_scale_node + Conv_790 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage4_warpsize4x1x1_g1_tensor16x8x32_epifadd Tactic: -7842775553137511386 [03/25/2022-13:24:51] [V] [TRT] Tactic: -7842775553137511386 Time: 0.132736 [03/25/2022-13:24:51] [V] [TRT] sections.3.0.identity.conv.module.weight + QuantizeLinear_786_quantize_scale_node + Conv_790 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32 Tactic: -7683887278997527517 [03/25/2022-13:24:51] [V] [TRT] Tactic: -7683887278997527517 Time: 0.225024 [03/25/2022-13:24:51] [V] [TRT] sections.3.0.identity.conv.module.weight + QuantizeLinear_786_quantize_scale_node + Conv_790 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x64x64_stage3_warpsize2x2x1_g1_tensor16x8x32 Tactic: -7381370635708568663 [03/25/2022-13:24:51] [V] [TRT] Tactic: -7381370635708568663 Time: 0.133504 [03/25/2022-13:24:51] [V] [TRT] sections.3.0.identity.conv.module.weight + QuantizeLinear_786_quantize_scale_node + Conv_790 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage1_warpsize2x1x1_g1_tensor8x8x16_t1r1s1 Tactic: -7361755530333096258 [03/25/2022-13:24:51] [V] [TRT] Tactic: -7361755530333096258 Time: 0.22272 [03/25/2022-13:24:51] [V] [TRT] sections.3.0.identity.conv.module.weight + QuantizeLinear_786_quantize_scale_node + Conv_790 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x128_stage3_warpsize1x4x1_g1_sptensor16x8x64_t1r1s1 Tactic: -7289760022626653388 [03/25/2022-13:24:51] [V] [TRT] Tactic: -7289760022626653388 Time: 0.1248 [03/25/2022-13:24:51] [V] [TRT] sections.3.0.identity.conv.module.weight + QuantizeLinear_786_quantize_scale_node + Conv_790 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x128_stage3_warpsize1x2x1_g1_sptensor16x8x64_t1r1s1_no_preds Tactic: -6812830108414456369 [03/25/2022-13:24:51] [V] [TRT] Tactic: -6812830108414456369 Time: 0.13312 [03/25/2022-13:24:51] [V] [TRT] sections.3.0.identity.conv.module.weight + QuantizeLinear_786_quantize_scale_node + Conv_790 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: -6527178416855951297 [03/25/2022-13:24:51] [V] [TRT] Tactic: -6527178416855951297 Time: 0.2144 [03/25/2022-13:24:51] [V] [TRT] sections.3.0.identity.conv.module.weight + QuantizeLinear_786_quantize_scale_node + Conv_790 Set Tactic Name: 
sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_t1r1s1 Tactic: -6510232214299595844 [03/25/2022-13:24:51] [V] [TRT] Tactic: -6510232214299595844 Time: 0.214656 [03/25/2022-13:24:51] [V] [TRT] sections.3.0.identity.conv.module.weight + QuantizeLinear_786_quantize_scale_node + Conv_790 Set Tactic Name: ampere_int8_i8816cudnn_int8_128x128_ldg16_relu_small_nt_v1 Tactic: -6400348606759295499 [03/25/2022-13:24:51] [V] [TRT] Tactic: -6400348606759295499 Time: 0.160896 [03/25/2022-13:24:51] [V] [TRT] sections.3.0.identity.conv.module.weight + QuantizeLinear_786_quantize_scale_node + Conv_790 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x96x64_stage4_warpsize4x2x1_g1_tensor16x8x32 Tactic: -6256128573036943404 [03/25/2022-13:24:51] [V] [TRT] Tactic: -6256128573036943404 Time: 0.158848 [03/25/2022-13:24:51] [V] [TRT] sections.3.0.identity.conv.module.weight + QuantizeLinear_786_quantize_scale_node + Conv_790 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x128_ldg16_relu_small_nt_v1 Tactic: -5980889159865208399 [03/25/2022-13:24:51] [V] [TRT] Tactic: -5980889159865208399 Time: 0.172032 [03/25/2022-13:24:51] [V] [TRT] sections.3.0.identity.conv.module.weight + QuantizeLinear_786_quantize_scale_node + Conv_790 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_singleBuffer_medium_nt_v1 Tactic: -5766140806760372989 [03/25/2022-13:24:51] [V] [TRT] Tactic: -5766140806760372989 Time: 0.173184 [03/25/2022-13:24:51] [V] [TRT] sections.3.0.identity.conv.module.weight + QuantizeLinear_786_quantize_scale_node + Conv_790 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r1s1 Tactic: -5697614955743334137 [03/25/2022-13:24:51] [V] [TRT] Tactic: -5697614955743334137 Time: 0.159104 [03/25/2022-13:24:51] [V] [TRT] sections.3.0.identity.conv.module.weight + QuantizeLinear_786_quantize_scale_node + Conv_790 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r1s1 Tactic: -5311474420963248369 [03/25/2022-13:24:51] [V] [TRT] Tactic: -5311474420963248369 Time: 0.235264 [03/25/2022-13:24:51] [V] [TRT] sections.3.0.identity.conv.module.weight + QuantizeLinear_786_quantize_scale_node + Conv_790 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage1_warpsize4x1x1_g1_tensor8x8x16 Tactic: -5180570335464125033 [03/25/2022-13:24:51] [V] [TRT] Tactic: -5180570335464125033 Time: 0.180736 [03/25/2022-13:24:51] [V] [TRT] sections.3.0.identity.conv.module.weight + QuantizeLinear_786_quantize_scale_node + Conv_790 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: -5170003087447722174 [03/25/2022-13:24:51] [V] [TRT] Tactic: -5170003087447722174 Time: 0.171904 [03/25/2022-13:24:51] [V] [TRT] sections.3.0.identity.conv.module.weight + QuantizeLinear_786_quantize_scale_node + Conv_790 Set Tactic Name: 
sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: -4849712423393454704 [03/25/2022-13:24:51] [V] [TRT] Tactic: -4849712423393454704 Time: 0.104192 [03/25/2022-13:24:51] [V] [TRT] sections.3.0.identity.conv.module.weight + QuantizeLinear_786_quantize_scale_node + Conv_790 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_medium_nt_v1 Tactic: -4516822589357530549 [03/25/2022-13:24:51] [V] [TRT] Tactic: -4516822589357530549 Time: 0.177152 [03/25/2022-13:24:51] [V] [TRT] sections.3.0.identity.conv.module.weight + QuantizeLinear_786_quantize_scale_node + Conv_790 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32_t1r1s1 Tactic: -4425346730823666456 [03/25/2022-13:24:51] [V] [TRT] Tactic: -4425346730823666456 Time: 0.144896 [03/25/2022-13:24:51] [V] [TRT] sections.3.0.identity.conv.module.weight + QuantizeLinear_786_quantize_scale_node + Conv_790 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage1_warpsize2x1x1_g1_tensor8x8x16_t1r1s1 Tactic: -4260476497340370474 [03/25/2022-13:24:51] [V] [TRT] Tactic: -4260476497340370474 Time: 0.2848 [03/25/2022-13:24:51] [V] [TRT] sections.3.0.identity.conv.module.weight + QuantizeLinear_786_quantize_scale_node + Conv_790 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x16x128_stage3_warpsize1x1x1_g1_sptensor16x8x64 Tactic: -4232916483289779353 [03/25/2022-13:24:51] [V] [TRT] Tactic: -4232916483289779353 Time: 0.185728 [03/25/2022-13:24:51] [V] [TRT] sections.3.0.identity.conv.module.weight + QuantizeLinear_786_quantize_scale_node + Conv_790 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x2x1_g1_tensor16x8x32_t1r1s1 Tactic: -4182501876984672402 [03/25/2022-13:24:51] [V] [TRT] Tactic: -4182501876984672402 Time: 0.120448 [03/25/2022-13:24:51] [V] [TRT] sections.3.0.identity.conv.module.weight + QuantizeLinear_786_quantize_scale_node + Conv_790 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x256x64_stage3_warpsize2x4x1_g1_tensor16x8x32_t1r1s1 Tactic: -4151617293257698859 [03/25/2022-13:24:51] [V] [TRT] Tactic: -4151617293257698859 Time: 0.09984 [03/25/2022-13:24:51] [V] [TRT] sections.3.0.identity.conv.module.weight + QuantizeLinear_786_quantize_scale_node + Conv_790 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x128_stage3_warpsize4x2x1_g1_sptensor16x8x64_t1r1s1_no_preds Tactic: -3862908719298381451 [03/25/2022-13:24:51] [V] [TRT] Tactic: -3862908719298381451 Time: 0.080896 [03/25/2022-13:24:51] [V] [TRT] sections.3.0.identity.conv.module.weight + QuantizeLinear_786_quantize_scale_node + Conv_790 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_t1r1s1 Tactic: -3613322253849278738 [03/25/2022-13:24:51] [V] [TRT] Tactic: -3613322253849278738 Time: 0.238848 [03/25/2022-13:24:51] [V] [TRT] sections.3.0.identity.conv.module.weight + 
QuantizeLinear_786_quantize_scale_node + Conv_790 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x32x64_stage4_warpsize4x1x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: -3577322188448771475 [03/25/2022-13:24:51] [V] [TRT] Tactic: -3577322188448771475 Time: 0.1952 [03/25/2022-13:24:51] [V] [TRT] sections.3.0.identity.conv.module.weight + QuantizeLinear_786_quantize_scale_node + Conv_790 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage1_warpsize2x4x1_g1_tensor8x8x16_t1r1s1 Tactic: -3531681826488401618 [03/25/2022-13:24:51] [V] [TRT] Tactic: -3531681826488401618 Time: 0.167168 [03/25/2022-13:24:51] [V] [TRT] sections.3.0.identity.conv.module.weight + QuantizeLinear_786_quantize_scale_node + Conv_790 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x128_stage3_warpsize1x4x1_g1_sptensor16x8x64 Tactic: -3460842194336717186 [03/25/2022-13:24:51] [V] [TRT] Tactic: -3460842194336717186 Time: 0.143232 [03/25/2022-13:24:51] [V] [TRT] sections.3.0.identity.conv.module.weight + QuantizeLinear_786_quantize_scale_node + Conv_790 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x32x64_stage4_warpsize4x1x1_g1_tensor16x8x32_t1r1s1 Tactic: -2754311112012636251 [03/25/2022-13:24:51] [V] [TRT] Tactic: -2754311112012636251 Time: 0.195584 [03/25/2022-13:24:51] [V] [TRT] sections.3.0.identity.conv.module.weight + QuantizeLinear_786_quantize_scale_node + Conv_790 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage1_warpsize2x2x1_g1_tensor8x8x16 Tactic: -2499089240293650188 [03/25/2022-13:24:51] [V] [TRT] Tactic: -2499089240293650188 Time: 0.163456 [03/25/2022-13:24:51] [V] [TRT] sections.3.0.identity.conv.module.weight + QuantizeLinear_786_quantize_scale_node + Conv_790 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x128_stage3_warpsize2x2x1_g1_sptensor16x8x64_t1r1s1 Tactic: -2352253835013627337 [03/25/2022-13:24:51] [V] [TRT] Tactic: -2352253835013627337 Time: 0.08448 [03/25/2022-13:24:51] [V] [TRT] sections.3.0.identity.conv.module.weight + QuantizeLinear_786_quantize_scale_node + Conv_790 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16 Tactic: -2328318099174473157 [03/25/2022-13:24:51] [V] [TRT] Tactic: -2328318099174473157 Time: 0.20736 [03/25/2022-13:24:51] [V] [TRT] sections.3.0.identity.conv.module.weight + QuantizeLinear_786_quantize_scale_node + Conv_790 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage4_warpsize2x4x1_g1_tensor16x8x32_t1r1s1 Tactic: -2315453944962430928 [03/25/2022-13:24:51] [V] [TRT] Tactic: -2315453944962430928 Time: 0.103168 [03/25/2022-13:24:51] [V] [TRT] sections.3.0.identity.conv.module.weight + QuantizeLinear_786_quantize_scale_node + Conv_790 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_epifadd Tactic: -2083778562631872334 
[03/25/2022-13:24:51] [V] [TRT] Tactic: -2083778562631872334 Time: 0.145536 [03/25/2022-13:24:51] [V] [TRT] sections.3.0.identity.conv.module.weight + QuantizeLinear_786_quantize_scale_node + Conv_790 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x4x1_g1_tensor16x8x32 Tactic: -2054375205435666404 [03/25/2022-13:24:51] [V] [TRT] Tactic: -2054375205435666404 Time: 0.133504 [03/25/2022-13:24:51] [V] [TRT] sections.3.0.identity.conv.module.weight + QuantizeLinear_786_quantize_scale_node + Conv_790 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x16x128_stage3_warpsize1x1x1_g1_sptensor16x8x64_t1r1s1_no_preds Tactic: -1740762957710554518 [03/25/2022-13:24:51] [V] [TRT] Tactic: -1740762957710554518 Time: 0.20352 [03/25/2022-13:24:51] [V] [TRT] sections.3.0.identity.conv.module.weight + QuantizeLinear_786_quantize_scale_node + Conv_790 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x96x64_stage4_warpsize4x2x1_g1_tensor16x8x32_t1r1s1 Tactic: -1549742793039499659 [03/25/2022-13:24:51] [V] [TRT] Tactic: -1549742793039499659 Time: 0.13632 [03/25/2022-13:24:51] [V] [TRT] sections.3.0.identity.conv.module.weight + QuantizeLinear_786_quantize_scale_node + Conv_790 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: -1499578657823798783 [03/25/2022-13:24:51] [V] [TRT] Tactic: -1499578657823798783 Time: 0.134784 [03/25/2022-13:24:51] [V] [TRT] sections.3.0.identity.conv.module.weight + QuantizeLinear_786_quantize_scale_node + Conv_790 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x32x64_stage4_warpsize4x1x1_g1_tensor16x8x32 Tactic: -1498626619443284096 [03/25/2022-13:24:51] [V] [TRT] Tactic: -1498626619443284096 Time: 0.207104 [03/25/2022-13:24:51] [V] [TRT] sections.3.0.identity.conv.module.weight + QuantizeLinear_786_quantize_scale_node + Conv_790 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r1s1 Tactic: -1494157908358500249 [03/25/2022-13:24:51] [V] [TRT] Tactic: -1494157908358500249 Time: 0.187392 [03/25/2022-13:24:51] [V] [TRT] sections.3.0.identity.conv.module.weight + QuantizeLinear_786_quantize_scale_node + Conv_790 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x128_stage3_warpsize1x2x1_g1_sptensor16x8x64_no_preds Tactic: -1465330458665632513 [03/25/2022-13:24:51] [V] [TRT] Tactic: -1465330458665632513 Time: 0.134016 [03/25/2022-13:24:51] [V] [TRT] sections.3.0.identity.conv.module.weight + QuantizeLinear_786_quantize_scale_node + Conv_790 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x4x1_g1_tensor16x8x32_t1r1s1 Tactic: -1328736756812546664 [03/25/2022-13:24:51] [V] [TRT] Tactic: -1328736756812546664 Time: 0.12672 [03/25/2022-13:24:51] [V] [TRT] sections.3.0.identity.conv.module.weight + QuantizeLinear_786_quantize_scale_node + Conv_790 Set Tactic Name: 
sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32 Tactic: -1283580231568512025 [03/25/2022-13:24:51] [V] [TRT] Tactic: -1283580231568512025 Time: 0.264832 [03/25/2022-13:24:51] [V] [TRT] sections.3.0.identity.conv.module.weight + QuantizeLinear_786_quantize_scale_node + Conv_790 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_epifadd Tactic: -1173968681844185579 [03/25/2022-13:24:51] [V] [TRT] Tactic: -1173968681844185579 Time: 0.26496 [03/25/2022-13:24:52] [V] [TRT] sections.3.0.identity.conv.module.weight + QuantizeLinear_786_quantize_scale_node + Conv_790 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32 Tactic: -762222380308749469 [03/25/2022-13:24:52] [V] [TRT] Tactic: -762222380308749469 Time: 0.156928 [03/25/2022-13:24:52] [V] [TRT] sections.3.0.identity.conv.module.weight + QuantizeLinear_786_quantize_scale_node + Conv_790 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: -713022856474991236 [03/25/2022-13:24:52] [V] [TRT] Tactic: -713022856474991236 Time: 0.239104 [03/25/2022-13:24:52] [V] [TRT] sections.3.0.identity.conv.module.weight + QuantizeLinear_786_quantize_scale_node + Conv_790 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x128_stage3_warpsize1x1x1_g1_sptensor16x8x64_t1r1s1_no_preds Tactic: -619668460699260222 [03/25/2022-13:24:52] [V] [TRT] Tactic: -619668460699260222 Time: 0.15872 [03/25/2022-13:24:52] [V] [TRT] sections.3.0.identity.conv.module.weight + QuantizeLinear_786_quantize_scale_node + Conv_790 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_epifadd Tactic: -556794153877490941 [03/25/2022-13:24:52] [V] [TRT] Tactic: -556794153877490941 Time: 0.156032 [03/25/2022-13:24:52] [V] [TRT] sections.3.0.identity.conv.module.weight + QuantizeLinear_786_quantize_scale_node + Conv_790 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x64x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: -405554772060757402 [03/25/2022-13:24:52] [V] [TRT] Tactic: -405554772060757402 Time: 0.129152 [03/25/2022-13:24:52] [V] [TRT] sections.3.0.identity.conv.module.weight + QuantizeLinear_786_quantize_scale_node + Conv_790 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: -375949437730908730 [03/25/2022-13:24:52] [V] [TRT] Tactic: -375949437730908730 Time: 0.126976 [03/25/2022-13:24:52] [V] [TRT] sections.3.0.identity.conv.module.weight + QuantizeLinear_786_quantize_scale_node + Conv_790 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage1_warpsize2x1x1_g1_tensor8x8x16 Tactic: -366411318217594794 [03/25/2022-13:24:52] [V] [TRT] Tactic: -366411318217594794 Time: 0.22656 [03/25/2022-13:24:52] 
[V] [TRT] sections.3.0.identity.conv.module.weight + QuantizeLinear_786_quantize_scale_node + Conv_790 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage1_warpsize2x4x1_g1_tensor8x8x16 Tactic: -351548418071036983 [03/25/2022-13:24:52] [V] [TRT] Tactic: -351548418071036983 Time: 0.168192 [03/25/2022-13:24:52] [V] [TRT] sections.3.0.identity.conv.module.weight + QuantizeLinear_786_quantize_scale_node + Conv_790 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x128_stage3_warpsize1x2x1_g1_sptensor16x8x64_t1r1s1 Tactic: -233227833606287806 [03/25/2022-13:24:52] [V] [TRT] Tactic: -233227833606287806 Time: 0.133632 [03/25/2022-13:24:52] [V] [TRT] Fastest Tactic: 857001784974286465 Time: 0.075776 [03/25/2022-13:24:52] [V] [TRT] >>>>>>>>>>>>>>> Chose Runner Type: CaskConvolution Tactic: 857001784974286465 [03/25/2022-13:24:52] [V] [TRT] =============== Computing costs for [03/25/2022-13:24:52] [V] [TRT] *************** Autotuning format combination: Int8(25088,196:4,14,1) -> Int8(6272,49:4,7,1) *************** [03/25/2022-13:24:52] [V] [TRT] --------------- Timing Runner: sections.3.0.conv2.module.weight + QuantizeLinear_757_quantize_scale_node + Conv_761 + Relu_763 (CudaDepthwiseConvolution) [03/25/2022-13:24:52] [V] [TRT] CudaDepthwiseConvolution has no valid tactics for this config, skipping [03/25/2022-13:24:52] [V] [TRT] --------------- Timing Runner: sections.3.0.conv2.module.weight + QuantizeLinear_757_quantize_scale_node + Conv_761 + Relu_763 (FusedConvActConvolution) [03/25/2022-13:24:52] [V] [TRT] FusedConvActConvolution has no valid tactics for this config, skipping [03/25/2022-13:24:52] [V] [TRT] --------------- Timing Runner: sections.3.0.conv2.module.weight + QuantizeLinear_757_quantize_scale_node + Conv_761 + Relu_763 (CaskConvolution) [03/25/2022-13:24:52] [V] [TRT] sections.3.0.conv2.module.weight + QuantizeLinear_757_quantize_scale_node + Conv_761 + Relu_763 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_medium_nn_v1 Tactic: 175853789719975416 [03/25/2022-13:24:52] [V] [TRT] Tactic: 175853789719975416 Time: 0.663168 [03/25/2022-13:24:52] [V] [TRT] sections.3.0.conv2.module.weight + QuantizeLinear_757_quantize_scale_node + Conv_761 + Relu_763 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_xregs_medium_nn_v1 Tactic: 2171150287007712632 [03/25/2022-13:24:52] [V] [TRT] Tactic: 2171150287007712632 Time: 0.688128 [03/25/2022-13:24:52] [V] [TRT] sections.3.0.conv2.module.weight + QuantizeLinear_757_quantize_scale_node + Conv_761 + Relu_763 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x64_relu_small_nn_v1 Tactic: 2234457234705232274 [03/25/2022-13:24:52] [V] [TRT] Tactic: 2234457234705232274 Time: 0.591104 [03/25/2022-13:24:52] [V] [TRT] sections.3.0.conv2.module.weight + QuantizeLinear_757_quantize_scale_node + Conv_761 + Relu_763 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x64_relu_medium_nn_v1 Tactic: 5834048089706882838 [03/25/2022-13:24:52] [V] [TRT] Tactic: 5834048089706882838 Time: 0.614656 [03/25/2022-13:24:52] [V] [TRT] sections.3.0.conv2.module.weight + QuantizeLinear_757_quantize_scale_node + Conv_761 + Relu_763 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_small_nn_v1 Tactic: -8626990807754934295 [03/25/2022-13:24:52] [V] [TRT] Tactic: -8626990807754934295 Time: 0.650496 [03/25/2022-13:24:52] [V] [TRT] sections.3.0.conv2.module.weight + 
QuantizeLinear_757_quantize_scale_node + Conv_761 + Relu_763 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_xregs_small_nn_v1 Tactic: -7303593854972602201 [03/25/2022-13:24:52] [V] [TRT] Tactic: -7303593854972602201 Time: 0.6176 [03/25/2022-13:24:52] [V] [TRT] sections.3.0.conv2.module.weight + QuantizeLinear_757_quantize_scale_node + Conv_761 + Relu_763 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x128_relu_medium_nn_v1 Tactic: -6585664687867083638 [03/25/2022-13:24:52] [V] [TRT] Tactic: -6585664687867083638 Time: 0.573568 [03/25/2022-13:24:52] [V] [TRT] sections.3.0.conv2.module.weight + QuantizeLinear_757_quantize_scale_node + Conv_761 + Relu_763 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x64_relu_xregs_large_nn_v1 Tactic: -3730012925709297561 [03/25/2022-13:24:52] [V] [TRT] Tactic: -3730012925709297561 Time: 0.591104 [03/25/2022-13:24:52] [V] [TRT] sections.3.0.conv2.module.weight + QuantizeLinear_757_quantize_scale_node + Conv_761 + Relu_763 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x128_relu_xregs_large_nn_v1 Tactic: -2277259417488004546 [03/25/2022-13:24:52] [V] [TRT] Tactic: -2277259417488004546 Time: 0.597248 [03/25/2022-13:24:52] [V] [TRT] sections.3.0.conv2.module.weight + QuantizeLinear_757_quantize_scale_node + Conv_761 + Relu_763 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x128_relu_small_nn_v1 Tactic: -683636008127039856 [03/25/2022-13:24:52] [V] [TRT] Tactic: -683636008127039856 Time: 0.567296 [03/25/2022-13:24:52] [V] [TRT] Fastest Tactic: -683636008127039856 Time: 0.567296 [03/25/2022-13:24:52] [V] [TRT] >>>>>>>>>>>>>>> Chose Runner Type: CaskConvolution Tactic: -683636008127039856 [03/25/2022-13:24:52] [V] [TRT] *************** Autotuning format combination: Int8(25088,196:4,14,1) -> Int8(784,49:32,7,1) *************** [03/25/2022-13:24:52] [V] [TRT] --------------- Timing Runner: sections.3.0.conv2.module.weight + QuantizeLinear_757_quantize_scale_node + Conv_761 + Relu_763 (CaskConvolution) [03/25/2022-13:24:52] [V] [TRT] sections.3.0.conv2.module.weight + QuantizeLinear_757_quantize_scale_node + Conv_761 + Relu_763 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x64_relu_xregs_large_c32_nn_v1 Tactic: 984309058095623735 [03/25/2022-13:24:52] [V] [TRT] Tactic: 984309058095623735 Time: 0.592 [03/25/2022-13:24:52] [V] [TRT] sections.3.0.conv2.module.weight + QuantizeLinear_757_quantize_scale_node + Conv_761 + Relu_763 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_small_c32_nn_v1 Tactic: 1100922622480907544 [03/25/2022-13:24:52] [V] [TRT] Tactic: 1100922622480907544 Time: 0.650752 [03/25/2022-13:24:52] [V] [TRT] sections.3.0.conv2.module.weight + QuantizeLinear_757_quantize_scale_node + Conv_761 + Relu_763 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x128_relu_xregs_large_c32_nn_v1 Tactic: 3238312825609165543 [03/25/2022-13:24:52] [V] [TRT] Tactic: 3238312825609165543 Time: 0.596864 [03/25/2022-13:24:52] [V] [TRT] sections.3.0.conv2.module.weight + QuantizeLinear_757_quantize_scale_node + Conv_761 + Relu_763 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x64_relu_medium_c32_nn_v1 Tactic: 3606311198834416176 [03/25/2022-13:24:52] [V] [TRT] Tactic: 3606311198834416176 Time: 0.614272 [03/25/2022-13:24:52] [V] [TRT] sections.3.0.conv2.module.weight + QuantizeLinear_757_quantize_scale_node + Conv_761 + Relu_763 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x128_relu_small_c32_nn_v1 Tactic: 4325765560739862899 [03/25/2022-13:24:52] [V] [TRT] Tactic: 4325765560739862899 Time: 0.567168 [03/25/2022-13:24:52] [V] [TRT] 
sections.3.0.conv2.module.weight + QuantizeLinear_757_quantize_scale_node + Conv_761 + Relu_763 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x128_relu_medium_c32_nn_v1 Tactic: -4255737803793506479 [03/25/2022-13:24:52] [V] [TRT] Tactic: -4255737803793506479 Time: 0.573952 [03/25/2022-13:24:52] [V] [TRT] sections.3.0.conv2.module.weight + QuantizeLinear_757_quantize_scale_node + Conv_761 + Relu_763 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_xregs_small_c32_nn_v1 Tactic: -3958182351168863467 [03/25/2022-13:24:52] [V] [TRT] Tactic: -3958182351168863467 Time: 0.616832 [03/25/2022-13:24:52] [V] [TRT] sections.3.0.conv2.module.weight + QuantizeLinear_757_quantize_scale_node + Conv_761 + Relu_763 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_xregs_medium_c32_nn_v1 Tactic: -3111968753064955248 [03/25/2022-13:24:52] [V] [TRT] Tactic: -3111968753064955248 Time: 0.68736 [03/25/2022-13:24:52] [V] [TRT] sections.3.0.conv2.module.weight + QuantizeLinear_757_quantize_scale_node + Conv_761 + Relu_763 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_medium_c32_nn_v1 Tactic: -1492575840277333548 [03/25/2022-13:24:52] [V] [TRT] Tactic: -1492575840277333548 Time: 0.663936 [03/25/2022-13:24:52] [V] [TRT] sections.3.0.conv2.module.weight + QuantizeLinear_757_quantize_scale_node + Conv_761 + Relu_763 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x64_relu_small_c32_nn_v1 Tactic: -868495160148524802 [03/25/2022-13:24:52] [V] [TRT] Tactic: -868495160148524802 Time: 0.591232 [03/25/2022-13:24:52] [V] [TRT] Fastest Tactic: 4325765560739862899 Time: 0.567168 [03/25/2022-13:24:52] [V] [TRT] >>>>>>>>>>>>>>> Chose Runner Type: CaskConvolution Tactic: 4325765560739862899 [03/25/2022-13:24:52] [V] [TRT] *************** Autotuning format combination: Int8(3136,196:32,14,1) -> Int8(784,49:32,7,1) *************** [03/25/2022-13:24:52] [V] [TRT] --------------- Timing Runner: sections.3.0.conv2.module.weight + QuantizeLinear_757_quantize_scale_node + Conv_761 + Relu_763 (CudaGroupConvolution) [03/25/2022-13:24:52] [V] [TRT] CudaGroupConvolution has no valid tactics for this config, skipping [03/25/2022-13:24:52] [V] [TRT] --------------- Timing Runner: sections.3.0.conv2.module.weight + QuantizeLinear_757_quantize_scale_node + Conv_761 + Relu_763 (CudaDepthwiseConvolution) [03/25/2022-13:24:52] [V] [TRT] CudaDepthwiseConvolution has no valid tactics for this config, skipping [03/25/2022-13:24:52] [V] [TRT] --------------- Timing Runner: sections.3.0.conv2.module.weight + QuantizeLinear_757_quantize_scale_node + Conv_761 + Relu_763 (FusedConvActConvolution) [03/25/2022-13:24:52] [V] [TRT] FusedConvActConvolution has no valid tactics for this config, skipping [03/25/2022-13:24:52] [V] [TRT] --------------- Timing Runner: sections.3.0.conv2.module.weight + QuantizeLinear_757_quantize_scale_node + Conv_761 + Relu_763 (CaskConvolution) [03/25/2022-13:24:52] [V] [TRT] sections.3.0.conv2.module.weight + QuantizeLinear_757_quantize_scale_node + Conv_761 + Relu_763 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_epifadd Tactic: 177040020707947851 [03/25/2022-13:24:53] [V] [TRT] Tactic: 177040020707947851 Time: 0.222592 [03/25/2022-13:24:53] [V] [TRT] sections.3.0.conv2.module.weight + QuantizeLinear_757_quantize_scale_node + Conv_761 + Relu_763 Set Tactic Name: 
sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r3s3 Tactic: 184229963126259101 [03/25/2022-13:24:53] [V] [TRT] Tactic: 184229963126259101 Time: 0.14592 [03/25/2022-13:24:53] [V] [TRT] sections.3.0.conv2.module.weight + QuantizeLinear_757_quantize_scale_node + Conv_761 + Relu_763 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage1_warpsize4x2x1_g1_tensor8x8x16_t1r3s3 Tactic: 289888059097454627 [03/25/2022-13:24:53] [V] [TRT] Tactic: 289888059097454627 Time: 0.16128 [03/25/2022-13:24:53] [V] [TRT] sections.3.0.conv2.module.weight + QuantizeLinear_757_quantize_scale_node + Conv_761 + Relu_763 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32 Tactic: 328135613486708155 [03/25/2022-13:24:53] [V] [TRT] Tactic: 328135613486708155 Time: 0.342528 [03/25/2022-13:24:53] [V] [TRT] sections.3.0.conv2.module.weight + QuantizeLinear_757_quantize_scale_node + Conv_761 + Relu_763 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r3s3 Tactic: 680740992583869928 [03/25/2022-13:24:53] [V] [TRT] Tactic: 680740992583869928 Time: 0.179712 [03/25/2022-13:24:53] [V] [TRT] sections.3.0.conv2.module.weight + QuantizeLinear_757_quantize_scale_node + Conv_761 + Relu_763 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x192x64_stage3_warpsize4x2x1_g1_tensor16x8x32 Tactic: 1111159740952609683 [03/25/2022-13:24:53] [V] [TRT] Tactic: 1111159740952609683 Time: 0.140288 [03/25/2022-13:24:53] [V] [TRT] sections.3.0.conv2.module.weight + QuantizeLinear_757_quantize_scale_node + Conv_761 + Relu_763 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32 Tactic: 1134860903395928905 [03/25/2022-13:24:53] [V] [TRT] Tactic: 1134860903395928905 Time: 0.130688 [03/25/2022-13:24:53] [V] [TRT] sections.3.0.conv2.module.weight + QuantizeLinear_757_quantize_scale_node + Conv_761 + Relu_763 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32 Tactic: 1276591930377039442 [03/25/2022-13:24:53] [V] [TRT] Tactic: 1276591930377039442 Time: 0.148352 [03/25/2022-13:24:53] [V] [TRT] sections.3.0.conv2.module.weight + QuantizeLinear_757_quantize_scale_node + Conv_761 + Relu_763 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage1_warpsize2x1x1_g1_tensor8x8x16_t1r3s3 Tactic: 1388866374720163187 [03/25/2022-13:24:53] [V] [TRT] Tactic: 1388866374720163187 Time: 0.243456 [03/25/2022-13:24:53] [V] [TRT] sections.3.0.conv2.module.weight + QuantizeLinear_757_quantize_scale_node + Conv_761 + Relu_763 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x96x64_stage5_warpsize2x1x1_g1_tensor16x8x32 Tactic: 1399501420456320585 [03/25/2022-13:24:53] [V] [TRT] Tactic: 1399501420456320585 Time: 0.180096 [03/25/2022-13:24:53] [V] [TRT] 
sections.3.0.conv2.module.weight + QuantizeLinear_757_quantize_scale_node + Conv_761 + Relu_763 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_epifadd Tactic: 1550399266192842845 [03/25/2022-13:24:53] [V] [TRT] Tactic: 1550399266192842845 Time: 0.18176 [03/25/2022-13:24:53] [V] [TRT] sections.3.0.conv2.module.weight + QuantizeLinear_757_quantize_scale_node + Conv_761 + Relu_763 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x32x64_stage4_warpsize4x1x1_g1_tensor16x8x32_t1r3s3_epifadd Tactic: 1572887561103143487 [03/25/2022-13:24:53] [V] [TRT] Tactic: 1572887561103143487 Time: 0.1824 [03/25/2022-13:24:53] [V] [TRT] sections.3.0.conv2.module.weight + QuantizeLinear_757_quantize_scale_node + Conv_761 + Relu_763 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x96x64_stage5_warpsize2x1x1_g1_tensor16x8x32_t1r3s3 Tactic: 1853122447892949466 [03/25/2022-13:24:53] [V] [TRT] Tactic: 1853122447892949466 Time: 0.178816 [03/25/2022-13:24:53] [V] [TRT] sections.3.0.conv2.module.weight + QuantizeLinear_757_quantize_scale_node + Conv_761 + Relu_763 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage1_warpsize2x2x1_g1_tensor8x8x16 Tactic: 2133329569091732311 [03/25/2022-13:24:53] [V] [TRT] Tactic: 2133329569091732311 Time: 0.168576 [03/25/2022-13:24:53] [V] [TRT] sections.3.0.conv2.module.weight + QuantizeLinear_757_quantize_scale_node + Conv_761 + Relu_763 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage4_warpsize4x1x1_g1_tensor16x8x32 Tactic: 2325023763229477890 [03/25/2022-13:24:53] [V] [TRT] Tactic: 2325023763229477890 Time: 0.11136 [03/25/2022-13:24:53] [V] [TRT] sections.3.0.conv2.module.weight + QuantizeLinear_757_quantize_scale_node + Conv_761 + Relu_763 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16 Tactic: 2579824863892891529 [03/25/2022-13:24:53] [V] [TRT] Tactic: 2579824863892891529 Time: 0.249216 [03/25/2022-13:24:53] [V] [TRT] sections.3.0.conv2.module.weight + QuantizeLinear_757_quantize_scale_node + Conv_761 + Relu_763 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x64x64_stage3_warpsize4x1x1_g1_tensor16x8x32 Tactic: 2783960536172159663 [03/25/2022-13:24:53] [V] [TRT] Tactic: 2783960536172159663 Time: 0.121472 [03/25/2022-13:24:53] [V] [TRT] sections.3.0.conv2.module.weight + QuantizeLinear_757_quantize_scale_node + Conv_761 + Relu_763 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x2x1_g1_tensor16x8x32 Tactic: 2821711838552913693 [03/25/2022-13:24:53] [V] [TRT] Tactic: 2821711838552913693 Time: 0.154112 [03/25/2022-13:24:53] [V] [TRT] sections.3.0.conv2.module.weight + QuantizeLinear_757_quantize_scale_node + Conv_761 + Relu_763 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x128_stage3_warpsize1x2x1_g1_sptensor16x8x64 Tactic: 
2945009978756227538 [03/25/2022-13:24:53] [V] [TRT] Tactic: 2945009978756227538 Time: 0.112768 [03/25/2022-13:24:53] [V] [TRT] sections.3.0.conv2.module.weight + QuantizeLinear_757_quantize_scale_node + Conv_761 + Relu_763 Set Tactic Name: ampere_int8_i8816cudnn_int8_128x128_ldg16_relu_medium_nt_v1 Tactic: 2985940154541537814 [03/25/2022-13:24:53] [V] [TRT] Tactic: 2985940154541537814 Time: 0.173056 [03/25/2022-13:24:53] [V] [TRT] sections.3.0.conv2.module.weight + QuantizeLinear_757_quantize_scale_node + Conv_761 + Relu_763 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_t1r3s3_epifadd Tactic: 3284282970967328046 [03/25/2022-13:24:53] [V] [TRT] Tactic: 3284282970967328046 Time: 0.236416 [03/25/2022-13:24:53] [V] [TRT] sections.3.0.conv2.module.weight + QuantizeLinear_757_quantize_scale_node + Conv_761 + Relu_763 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_t1r3s3 Tactic: 3401614690060226673 [03/25/2022-13:24:53] [V] [TRT] Tactic: 3401614690060226673 Time: 0.207872 [03/25/2022-13:24:53] [V] [TRT] sections.3.0.conv2.module.weight + QuantizeLinear_757_quantize_scale_node + Conv_761 + Relu_763 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32 Tactic: 3456719996792527006 [03/25/2022-13:24:53] [V] [TRT] Tactic: 3456719996792527006 Time: 0.154624 [03/25/2022-13:24:53] [V] [TRT] sections.3.0.conv2.module.weight + QuantizeLinear_757_quantize_scale_node + Conv_761 + Relu_763 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r3s3_epifadd Tactic: 3512426920013359699 [03/25/2022-13:24:53] [V] [TRT] Tactic: 3512426920013359699 Time: 0.144384 [03/25/2022-13:24:53] [V] [TRT] sections.3.0.conv2.module.weight + QuantizeLinear_757_quantize_scale_node + Conv_761 + Relu_763 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x128_stage3_warpsize4x2x1_g1_sptensor16x8x64 Tactic: 3651043333819148268 [03/25/2022-13:24:53] [V] [TRT] Tactic: 3651043333819148268 Time: 0.075264 [03/25/2022-13:24:53] [V] [TRT] sections.3.0.conv2.module.weight + QuantizeLinear_757_quantize_scale_node + Conv_761 + Relu_763 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x128_ldg16_relu_medium_nt_v1 Tactic: 3899284354987683408 [03/25/2022-13:24:53] [V] [TRT] Tactic: 3899284354987683408 Time: 0.169344 [03/25/2022-13:24:53] [V] [TRT] sections.3.0.conv2.module.weight + QuantizeLinear_757_quantize_scale_node + Conv_761 + Relu_763 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage6_warpsize2x2x1_g1_tensor16x8x32 Tactic: 4042202769383439184 [03/25/2022-13:24:53] [V] [TRT] Tactic: 4042202769383439184 Time: 0.10688 [03/25/2022-13:24:53] [V] [TRT] sections.3.0.conv2.module.weight + QuantizeLinear_757_quantize_scale_node + Conv_761 + Relu_763 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_large_nt_v1 Tactic: 4182625619810185112 [03/25/2022-13:24:53] [V] [TRT] Tactic: 4182625619810185112 Time: 0.187264 [03/25/2022-13:24:53] [V] [TRT] sections.3.0.conv2.module.weight + 
QuantizeLinear_757_quantize_scale_node + Conv_761 + Relu_763 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r3s3 Tactic: 4214794893922618058 [03/25/2022-13:24:53] [V] [TRT] Tactic: 4214794893922618058 Time: 0.166912 [03/25/2022-13:24:53] [V] [TRT] sections.3.0.conv2.module.weight + QuantizeLinear_757_quantize_scale_node + Conv_761 + Relu_763 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x32x64_stage4_warpsize4x1x1_g1_tensor16x8x32_epifadd Tactic: 4259547356717612415 [03/25/2022-13:24:53] [V] [TRT] Tactic: 4259547356717612415 Time: 0.18368 [03/25/2022-13:24:53] [V] [TRT] sections.3.0.conv2.module.weight + QuantizeLinear_757_quantize_scale_node + Conv_761 + Relu_763 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r3s3 Tactic: 4384868749799132354 [03/25/2022-13:24:53] [V] [TRT] Tactic: 4384868749799132354 Time: 0.24896 [03/25/2022-13:24:53] [V] [TRT] sections.3.0.conv2.module.weight + QuantizeLinear_757_quantize_scale_node + Conv_761 + Relu_763 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x128_stage3_warpsize2x2x1_g1_sptensor16x8x64 Tactic: 4414594337986714263 [03/25/2022-13:24:53] [V] [TRT] Tactic: 4414594337986714263 Time: 0.082944 [03/25/2022-13:24:53] [V] [TRT] sections.3.0.conv2.module.weight + QuantizeLinear_757_quantize_scale_node + Conv_761 + Relu_763 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_small_nt_v1 Tactic: 4717285412741024953 [03/25/2022-13:24:53] [V] [TRT] Tactic: 4717285412741024953 Time: 0.17344 [03/25/2022-13:24:53] [V] [TRT] sections.3.0.conv2.module.weight + QuantizeLinear_757_quantize_scale_node + Conv_761 + Relu_763 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage4_warpsize2x4x1_g1_tensor16x8x32 Tactic: 4734519122557206480 [03/25/2022-13:24:53] [V] [TRT] Tactic: 4734519122557206480 Time: 0.089088 [03/25/2022-13:24:53] [V] [TRT] sections.3.0.conv2.module.weight + QuantizeLinear_757_quantize_scale_node + Conv_761 + Relu_763 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32 Tactic: 4922297020351187339 [03/25/2022-13:24:53] [V] [TRT] Tactic: 4922297020351187339 Time: 0.114432 [03/25/2022-13:24:53] [V] [TRT] sections.3.0.conv2.module.weight + QuantizeLinear_757_quantize_scale_node + Conv_761 + Relu_763 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage1_warpsize2x4x1_g1_tensor8x8x16_t1r3s3 Tactic: 4931167631624420067 [03/25/2022-13:24:53] [V] [TRT] Tactic: 4931167631624420067 Time: 0.16256 [03/25/2022-13:24:53] [V] [TRT] sections.3.0.conv2.module.weight + QuantizeLinear_757_quantize_scale_node + Conv_761 + Relu_763 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage4_warpsize2x4x1_g1_tensor16x8x32_epifadd Tactic: 5121596860264626879 [03/25/2022-13:24:53] [V] [TRT] Tactic: 5121596860264626879 Time: 0.08704 [03/25/2022-13:24:53] [V] [TRT] 
sections.3.0.conv2.module.weight + QuantizeLinear_757_quantize_scale_node + Conv_761 + Relu_763 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_t1r3s3 Tactic: 5136656982162849059 [03/25/2022-13:24:53] [V] [TRT] Tactic: 5136656982162849059 Time: 0.237056 [03/25/2022-13:24:53] [V] [TRT] sections.3.0.conv2.module.weight + QuantizeLinear_757_quantize_scale_node + Conv_761 + Relu_763 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage6_warpsize2x2x1_g1_tensor16x8x32_epifadd Tactic: 5158259316594207439 [03/25/2022-13:24:53] [V] [TRT] Tactic: 5158259316594207439 Time: 0.106752 [03/25/2022-13:24:53] [V] [TRT] sections.3.0.conv2.module.weight + QuantizeLinear_757_quantize_scale_node + Conv_761 + Relu_763 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32_t1r3s3 Tactic: 5189825015507701541 [03/25/2022-13:24:53] [V] [TRT] Tactic: 5189825015507701541 Time: 0.362752 [03/25/2022-13:24:53] [V] [TRT] sections.3.0.conv2.module.weight + QuantizeLinear_757_quantize_scale_node + Conv_761 + Relu_763 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x32x64_stage4_warpsize2x1x1_g1_tensor16x8x32 Tactic: 5424417905073460656 [03/25/2022-13:24:53] [V] [TRT] Tactic: 5424417905073460656 Time: 0.20352 [03/25/2022-13:24:53] [V] [TRT] sections.3.0.conv2.module.weight + QuantizeLinear_757_quantize_scale_node + Conv_761 + Relu_763 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x256x64_stage3_warpsize2x4x1_g1_tensor16x8x32 Tactic: 5442043907221427810 [03/25/2022-13:24:53] [V] [TRT] Tactic: 5442043907221427810 Time: 0.135808 [03/25/2022-13:24:53] [V] [TRT] sections.3.0.conv2.module.weight + QuantizeLinear_757_quantize_scale_node + Conv_761 + Relu_763 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x256x64_stage3_warpsize2x4x1_g1_tensor16x8x32_t1r3s3 Tactic: 5544365258913999384 [03/25/2022-13:24:53] [V] [TRT] Tactic: 5544365258913999384 Time: 0.133888 [03/25/2022-13:24:53] [V] [TRT] sections.3.0.conv2.module.weight + QuantizeLinear_757_quantize_scale_node + Conv_761 + Relu_763 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage1_warpsize2x1x1_g1_tensor8x8x16_t1r3s3 Tactic: 5641967928706599451 [03/25/2022-13:24:53] [V] [TRT] Tactic: 5641967928706599451 Time: 0.302592 [03/25/2022-13:24:53] [V] [TRT] sections.3.0.conv2.module.weight + QuantizeLinear_757_quantize_scale_node + Conv_761 + Relu_763 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x2x1_g1_tensor16x8x32_t1r3s3 Tactic: 5721595115357140131 [03/25/2022-13:24:53] [V] [TRT] Tactic: 5721595115357140131 Time: 0.141184 [03/25/2022-13:24:53] [V] [TRT] sections.3.0.conv2.module.weight + QuantizeLinear_757_quantize_scale_node + Conv_761 + Relu_763 Set Tactic Name: 
sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage4_warpsize4x1x1_g1_tensor16x8x32_t1r3s3_epifadd Tactic: 5966973378912044513 [03/25/2022-13:24:53] [V] [TRT] Tactic: 5966973378912044513 Time: 0.10944 [03/25/2022-13:24:53] [V] [TRT] sections.3.0.conv2.module.weight + QuantizeLinear_757_quantize_scale_node + Conv_761 + Relu_763 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x32x64_stage4_warpsize4x1x1_g1_tensor16x8x32_t1r3s3 Tactic: 6004789655466615912 [03/25/2022-13:24:53] [V] [TRT] Tactic: 6004789655466615912 Time: 0.182656 [03/25/2022-13:24:53] [V] [TRT] sections.3.0.conv2.module.weight + QuantizeLinear_757_quantize_scale_node + Conv_761 + Relu_763 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage4_warpsize2x4x1_g1_tensor16x8x32_t1r3s3 Tactic: 6146901278630392829 [03/25/2022-13:24:53] [V] [TRT] Tactic: 6146901278630392829 Time: 0.089088 [03/25/2022-13:24:53] [V] [TRT] sections.3.0.conv2.module.weight + QuantizeLinear_757_quantize_scale_node + Conv_761 + Relu_763 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage1_warpsize2x1x1_g1_tensor8x8x16 Tactic: 6394572396369862482 [03/25/2022-13:24:53] [V] [TRT] Tactic: 6394572396369862482 Time: 0.30336 [03/25/2022-13:24:53] [V] [TRT] sections.3.0.conv2.module.weight + QuantizeLinear_757_quantize_scale_node + Conv_761 + Relu_763 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage4_warpsize4x2x1_g1_tensor16x8x32_epifadd Tactic: 6434020722187266170 [03/25/2022-13:24:53] [V] [TRT] Tactic: 6434020722187266170 Time: 0.088576 [03/25/2022-13:24:53] [V] [TRT] sections.3.0.conv2.module.weight + QuantizeLinear_757_quantize_scale_node + Conv_761 + Relu_763 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32 Tactic: 6781129591847482048 [03/25/2022-13:24:53] [V] [TRT] Tactic: 6781129591847482048 Time: 0.125312 [03/25/2022-13:24:53] [V] [TRT] sections.3.0.conv2.module.weight + QuantizeLinear_757_quantize_scale_node + Conv_761 + Relu_763 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x96x64_stage4_warpsize4x2x1_g1_tensor16x8x32_t1r3s3 Tactic: 6984451771200230840 [03/25/2022-13:24:53] [V] [TRT] Tactic: 6984451771200230840 Time: 0.136576 [03/25/2022-13:24:53] [V] [TRT] sections.3.0.conv2.module.weight + QuantizeLinear_757_quantize_scale_node + Conv_761 + Relu_763 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r3s3 Tactic: 7048234086361926570 [03/25/2022-13:24:53] [V] [TRT] Tactic: 7048234086361926570 Time: 0.1952 [03/25/2022-13:24:53] [V] [TRT] sections.3.0.conv2.module.weight + QuantizeLinear_757_quantize_scale_node + Conv_761 + Relu_763 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x128x64_stage3_warpsize4x2x1_g1_tensor16x8x32 Tactic: 7077570591813340966 [03/25/2022-13:24:53] [V] [TRT] Tactic: 7077570591813340966 Time: 0.134016 
[03/25/2022-13:24:53] [V] [TRT] sections.3.0.conv2.module.weight + QuantizeLinear_757_quantize_scale_node + Conv_761 + Relu_763 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r3s3_epifadd Tactic: 7191893591576074000 [03/25/2022-13:24:53] [V] [TRT] Tactic: 7191893591576074000 Time: 0.180608 [03/25/2022-13:24:53] [V] [TRT] sections.3.0.conv2.module.weight + QuantizeLinear_757_quantize_scale_node + Conv_761 + Relu_763 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x4x1_g1_tensor16x8x32_t1r3s3 Tactic: 7429976449747682901 [03/25/2022-13:24:53] [V] [TRT] Tactic: 7429976449747682901 Time: 0.123648 [03/25/2022-13:24:53] [V] [TRT] sections.3.0.conv2.module.weight + QuantizeLinear_757_quantize_scale_node + Conv_761 + Relu_763 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r3s3_epifadd Tactic: 7438984192263206338 [03/25/2022-13:24:53] [V] [TRT] Tactic: 7438984192263206338 Time: 0.106752 [03/25/2022-13:24:53] [V] [TRT] sections.3.0.conv2.module.weight + QuantizeLinear_757_quantize_scale_node + Conv_761 + Relu_763 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32_epifadd Tactic: 7504901284678552178 [03/25/2022-13:24:53] [V] [TRT] Tactic: 7504901284678552178 Time: 0.09088 [03/25/2022-13:24:53] [V] [TRT] sections.3.0.conv2.module.weight + QuantizeLinear_757_quantize_scale_node + Conv_761 + Relu_763 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r3s3 Tactic: 8096257414008860171 [03/25/2022-13:24:53] [V] [TRT] Tactic: 8096257414008860171 Time: 0.124672 [03/25/2022-13:24:53] [V] [TRT] sections.3.0.conv2.module.weight + QuantizeLinear_757_quantize_scale_node + Conv_761 + Relu_763 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x64x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r3s3 Tactic: 8128112048355596715 [03/25/2022-13:24:53] [V] [TRT] Tactic: 8128112048355596715 Time: 0.119552 [03/25/2022-13:24:53] [V] [TRT] sections.3.0.conv2.module.weight + QuantizeLinear_757_quantize_scale_node + Conv_761 + Relu_763 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_indexed_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32_epifadd Tactic: 8751622450593766232 [03/25/2022-13:24:53] [V] [TRT] Tactic: 8751622450593766232 Time: 0.100736 [03/25/2022-13:24:53] [V] [TRT] sections.3.0.conv2.module.weight + QuantizeLinear_757_quantize_scale_node + Conv_761 + Relu_763 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_indexed_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32 Tactic: 9064458886956700976 [03/25/2022-13:24:53] [V] [TRT] Tactic: 9064458886956700976 Time: 0.102528 [03/25/2022-13:24:53] [V] [TRT] sections.3.0.conv2.module.weight + QuantizeLinear_757_quantize_scale_node + Conv_761 + Relu_763 Set Tactic Name: 
sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_t1r3s3_epifadd Tactic: 9143438935315839085 [03/25/2022-13:24:53] [V] [TRT] Tactic: 9143438935315839085 Time: 0.207744 [03/25/2022-13:24:53] [V] [TRT] sections.3.0.conv2.module.weight + QuantizeLinear_757_quantize_scale_node + Conv_761 + Relu_763 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage4_warpsize4x2x1_g1_tensor16x8x32 Tactic: -9165697322068360861 [03/25/2022-13:24:53] [V] [TRT] Tactic: -9165697322068360861 Time: 0.089856 [03/25/2022-13:24:53] [V] [TRT] sections.3.0.conv2.module.weight + QuantizeLinear_757_quantize_scale_node + Conv_761 + Relu_763 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_singleBuffer_small_nt_v1 Tactic: -9118785798277698619 [03/25/2022-13:24:53] [V] [TRT] Tactic: -9118785798277698619 Time: 0.168832 [03/25/2022-13:24:53] [V] [TRT] sections.3.0.conv2.module.weight + QuantizeLinear_757_quantize_scale_node + Conv_761 + Relu_763 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16 Tactic: -9108166971364503411 [03/25/2022-13:24:53] [V] [TRT] Tactic: -9108166971364503411 Time: 0.196864 [03/25/2022-13:24:53] [V] [TRT] sections.3.0.conv2.module.weight + QuantizeLinear_757_quantize_scale_node + Conv_761 + Relu_763 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage1_warpsize4x2x1_g1_tensor8x8x16 Tactic: -8861822316054763526 [03/25/2022-13:24:53] [V] [TRT] Tactic: -8861822316054763526 Time: 0.162432 [03/25/2022-13:24:53] [V] [TRT] sections.3.0.conv2.module.weight + QuantizeLinear_757_quantize_scale_node + Conv_761 + Relu_763 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x128_stage3_warpsize1x1x1_g1_sptensor16x8x64 Tactic: -8791277710877987710 [03/25/2022-13:24:53] [V] [TRT] Tactic: -8791277710877987710 Time: 0.129408 [03/25/2022-13:24:53] [V] [TRT] sections.3.0.conv2.module.weight + QuantizeLinear_757_quantize_scale_node + Conv_761 + Relu_763 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x128x64_stage3_warpsize2x2x1_g1_tensor16x8x32 Tactic: -8691377209893505057 [03/25/2022-13:24:53] [V] [TRT] Tactic: -8691377209893505057 Time: 0.103296 [03/25/2022-13:24:53] [V] [TRT] sections.3.0.conv2.module.weight + QuantizeLinear_757_quantize_scale_node + Conv_761 + Relu_763 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32 Tactic: -8520292213102999339 [03/25/2022-13:24:53] [V] [TRT] Tactic: -8520292213102999339 Time: 0.115968 [03/25/2022-13:24:53] [V] [TRT] sections.3.0.conv2.module.weight + QuantizeLinear_757_quantize_scale_node + Conv_761 + Relu_763 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage1_warpsize4x1x1_g1_tensor8x8x16_t1r3s3 Tactic: -8475551154769412306 [03/25/2022-13:24:53] [V] [TRT] Tactic: -8475551154769412306 Time: 0.170496 [03/25/2022-13:24:53] [V] [TRT] sections.3.0.conv2.module.weight + QuantizeLinear_757_quantize_scale_node 
+ Conv_761 + Relu_763 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x192x64_stage3_warpsize4x2x1_g1_tensor16x8x32_t1r3s3 Tactic: -8417388128970254446 [03/25/2022-13:24:53] [V] [TRT] Tactic: -8417388128970254446 Time: 0.139008 [03/25/2022-13:24:53] [V] [TRT] sections.3.0.conv2.module.weight + QuantizeLinear_757_quantize_scale_node + Conv_761 + Relu_763 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32 Tactic: -8263994888336646547 [03/25/2022-13:24:53] [V] [TRT] Tactic: -8263994888336646547 Time: 0.091392 [03/25/2022-13:24:53] [V] [TRT] sections.3.0.conv2.module.weight + QuantizeLinear_757_quantize_scale_node + Conv_761 + Relu_763 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32 Tactic: -8205948405243401049 [03/25/2022-13:24:53] [V] [TRT] Tactic: -8205948405243401049 Time: 0.182144 [03/25/2022-13:24:53] [V] [TRT] sections.3.0.conv2.module.weight + QuantizeLinear_757_quantize_scale_node + Conv_761 + Relu_763 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r3s3_epifadd Tactic: -7992068592656168418 [03/25/2022-13:24:53] [V] [TRT] Tactic: -7992068592656168418 Time: 0.124544 [03/25/2022-13:24:53] [V] [TRT] sections.3.0.conv2.module.weight + QuantizeLinear_757_quantize_scale_node + Conv_761 + Relu_763 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x32x64_stage4_warpsize2x1x1_g1_tensor16x8x32_t1r3s3 Tactic: -7898477046581738867 [03/25/2022-13:24:53] [V] [TRT] Tactic: -7898477046581738867 Time: 0.181632 [03/25/2022-13:24:53] [V] [TRT] sections.3.0.conv2.module.weight + QuantizeLinear_757_quantize_scale_node + Conv_761 + Relu_763 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage4_warpsize4x1x1_g1_tensor16x8x32_epifadd Tactic: -7842775553137511386 [03/25/2022-13:24:53] [V] [TRT] Tactic: -7842775553137511386 Time: 0.11072 [03/25/2022-13:24:53] [V] [TRT] sections.3.0.conv2.module.weight + QuantizeLinear_757_quantize_scale_node + Conv_761 + Relu_763 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32 Tactic: -7683887278997527517 [03/25/2022-13:24:53] [V] [TRT] Tactic: -7683887278997527517 Time: 0.204544 [03/25/2022-13:24:53] [V] [TRT] sections.3.0.conv2.module.weight + QuantizeLinear_757_quantize_scale_node + Conv_761 + Relu_763 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x64x64_stage3_warpsize2x2x1_g1_tensor16x8x32 Tactic: -7381370635708568663 [03/25/2022-13:24:53] [V] [TRT] Tactic: -7381370635708568663 Time: 0.120448 [03/25/2022-13:24:53] [V] [TRT] sections.3.0.conv2.module.weight + QuantizeLinear_757_quantize_scale_node + Conv_761 + Relu_763 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r3s3 Tactic: -7129320389887881029 [03/25/2022-13:24:53] [V] [TRT] 
Tactic: -7129320389887881029 Time: 0.126848 [03/25/2022-13:24:53] [V] [TRT] sections.3.0.conv2.module.weight + QuantizeLinear_757_quantize_scale_node + Conv_761 + Relu_763 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r3s3 Tactic: -6959995514028471820 [03/25/2022-13:24:53] [V] [TRT] Tactic: -6959995514028471820 Time: 0.12288 [03/25/2022-13:24:53] [V] [TRT] sections.3.0.conv2.module.weight + QuantizeLinear_757_quantize_scale_node + Conv_761 + Relu_763 Set Tactic Name: ampere_int8_i8816cudnn_int8_128x128_ldg16_relu_small_nt_v1 Tactic: -6400348606759295499 [03/25/2022-13:24:53] [V] [TRT] Tactic: -6400348606759295499 Time: 0.16832 [03/25/2022-13:24:53] [V] [TRT] sections.3.0.conv2.module.weight + QuantizeLinear_757_quantize_scale_node + Conv_761 + Relu_763 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r3s3 Tactic: -6371781333659293809 [03/25/2022-13:24:53] [V] [TRT] Tactic: -6371781333659293809 Time: 0.176256 [03/25/2022-13:24:53] [V] [TRT] sections.3.0.conv2.module.weight + QuantizeLinear_757_quantize_scale_node + Conv_761 + Relu_763 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x96x64_stage4_warpsize4x2x1_g1_tensor16x8x32 Tactic: -6256128573036943404 [03/25/2022-13:24:54] [V] [TRT] Tactic: -6256128573036943404 Time: 0.135808 [03/25/2022-13:24:54] [V] [TRT] sections.3.0.conv2.module.weight + QuantizeLinear_757_quantize_scale_node + Conv_761 + Relu_763 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x128_ldg16_relu_small_nt_v1 Tactic: -5980889159865208399 [03/25/2022-13:24:54] [V] [TRT] Tactic: -5980889159865208399 Time: 0.164736 [03/25/2022-13:24:54] [V] [TRT] sections.3.0.conv2.module.weight + QuantizeLinear_757_quantize_scale_node + Conv_761 + Relu_763 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_singleBuffer_medium_nt_v1 Tactic: -5766140806760372989 [03/25/2022-13:24:54] [V] [TRT] Tactic: -5766140806760372989 Time: 0.178944 [03/25/2022-13:24:54] [V] [TRT] sections.3.0.conv2.module.weight + QuantizeLinear_757_quantize_scale_node + Conv_761 + Relu_763 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32_t1r3s3_epifadd Tactic: -5709079507616090666 [03/25/2022-13:24:54] [V] [TRT] Tactic: -5709079507616090666 Time: 0.091008 [03/25/2022-13:24:54] [V] [TRT] sections.3.0.conv2.module.weight + QuantizeLinear_757_quantize_scale_node + Conv_761 + Relu_763 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage4_warpsize2x4x1_g1_tensor16x8x32_t1r3s3_epifadd Tactic: -5698636014239116282 [03/25/2022-13:24:54] [V] [TRT] Tactic: -5698636014239116282 Time: 0.08704 [03/25/2022-13:24:54] [V] [TRT] sections.3.0.conv2.module.weight + QuantizeLinear_757_quantize_scale_node + Conv_761 + Relu_763 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage1_warpsize4x1x1_g1_tensor8x8x16 Tactic: -5180570335464125033 [03/25/2022-13:24:54] [V] [TRT] Tactic: -5180570335464125033 Time: 0.17728 [03/25/2022-13:24:54] [V] [TRT] sections.3.0.conv2.module.weight + 
QuantizeLinear_757_quantize_scale_node + Conv_761 + Relu_763 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r3s3 Tactic: -4933563390723451692 [03/25/2022-13:24:54] [V] [TRT] Tactic: -4933563390723451692 Time: 0.145152 [03/25/2022-13:24:54] [V] [TRT] sections.3.0.conv2.module.weight + QuantizeLinear_757_quantize_scale_node + Conv_761 + Relu_763 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_medium_nt_v1 Tactic: -4516822589357530549 [03/25/2022-13:24:54] [V] [TRT] Tactic: -4516822589357530549 Time: 0.181248 [03/25/2022-13:24:54] [V] [TRT] sections.3.0.conv2.module.weight + QuantizeLinear_757_quantize_scale_node + Conv_761 + Relu_763 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x16x128_stage3_warpsize1x1x1_g1_sptensor16x8x64 Tactic: -4232916483289779353 [03/25/2022-13:24:54] [V] [TRT] Tactic: -4232916483289779353 Time: 0.198144 [03/25/2022-13:24:54] [V] [TRT] sections.3.0.conv2.module.weight + QuantizeLinear_757_quantize_scale_node + Conv_761 + Relu_763 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x128_stage3_warpsize1x4x1_g1_sptensor16x8x64 Tactic: -3460842194336717186 [03/25/2022-13:24:54] [V] [TRT] Tactic: -3460842194336717186 Time: 0.115456 [03/25/2022-13:24:54] [V] [TRT] sections.3.0.conv2.module.weight + QuantizeLinear_757_quantize_scale_node + Conv_761 + Relu_763 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage4_warpsize4x2x1_g1_tensor16x8x32_t1r3s3_epifadd Tactic: -3413217501222406256 [03/25/2022-13:24:54] [V] [TRT] Tactic: -3413217501222406256 Time: 0.087296 [03/25/2022-13:24:54] [V] [TRT] sections.3.0.conv2.module.weight + QuantizeLinear_757_quantize_scale_node + Conv_761 + Relu_763 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32_t1r3s3 Tactic: -3280888557222886418 [03/25/2022-13:24:54] [V] [TRT] Tactic: -3280888557222886418 Time: 0.11968 [03/25/2022-13:24:54] [V] [TRT] sections.3.0.conv2.module.weight + QuantizeLinear_757_quantize_scale_node + Conv_761 + Relu_763 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r3s3 Tactic: -3238475748440751107 [03/25/2022-13:24:54] [V] [TRT] Tactic: -3238475748440751107 Time: 0.105856 [03/25/2022-13:24:54] [V] [TRT] sections.3.0.conv2.module.weight + QuantizeLinear_757_quantize_scale_node + Conv_761 + Relu_763 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage4_warpsize4x1x1_g1_tensor16x8x32_t1r3s3 Tactic: -3182884991006484042 [03/25/2022-13:24:54] [V] [TRT] Tactic: -3182884991006484042 Time: 0.11072 [03/25/2022-13:24:54] [V] [TRT] sections.3.0.conv2.module.weight + QuantizeLinear_757_quantize_scale_node + Conv_761 + Relu_763 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r3s3 Tactic: -3173468756112541306 [03/25/2022-13:24:54] [V] [TRT] Tactic: -3173468756112541306 Time: 0.180736 
[03/25/2022-13:24:54] [V] [TRT] sections.3.0.conv2.module.weight + QuantizeLinear_757_quantize_scale_node + Conv_761 + Relu_763 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x128_ldg16_relu_large_nt_v1 Tactic: -2917455979290586480 [03/25/2022-13:24:54] [V] [TRT] Tactic: -2917455979290586480 Time: 0.166528 [03/25/2022-13:24:54] [V] [TRT] sections.3.0.conv2.module.weight + QuantizeLinear_757_quantize_scale_node + Conv_761 + Relu_763 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x128x64_stage3_warpsize4x2x1_g1_tensor16x8x32_t1r3s3 Tactic: -2741641298163591508 [03/25/2022-13:24:54] [V] [TRT] Tactic: -2741641298163591508 Time: 0.131072 [03/25/2022-13:24:54] [V] [TRT] sections.3.0.conv2.module.weight + QuantizeLinear_757_quantize_scale_node + Conv_761 + Relu_763 Set Tactic Name: ampere_int8_i8816cudnn_int8_128x128_ldg16_relu_large_nt_v1 Tactic: -2571022005763160364 [03/25/2022-13:24:54] [V] [TRT] Tactic: -2571022005763160364 Time: 0.17856 [03/25/2022-13:24:54] [V] [TRT] sections.3.0.conv2.module.weight + QuantizeLinear_757_quantize_scale_node + Conv_761 + Relu_763 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage1_warpsize2x2x1_g1_tensor8x8x16 Tactic: -2499089240293650188 [03/25/2022-13:24:54] [V] [TRT] Tactic: -2499089240293650188 Time: 0.180992 [03/25/2022-13:24:54] [V] [TRT] sections.3.0.conv2.module.weight + QuantizeLinear_757_quantize_scale_node + Conv_761 + Relu_763 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16 Tactic: -2328318099174473157 [03/25/2022-13:24:54] [V] [TRT] Tactic: -2328318099174473157 Time: 0.180224 [03/25/2022-13:24:54] [V] [TRT] sections.3.0.conv2.module.weight + QuantizeLinear_757_quantize_scale_node + Conv_761 + Relu_763 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_epifadd Tactic: -2083778562631872334 [03/25/2022-13:24:54] [V] [TRT] Tactic: -2083778562631872334 Time: 0.124928 [03/25/2022-13:24:54] [V] [TRT] sections.3.0.conv2.module.weight + QuantizeLinear_757_quantize_scale_node + Conv_761 + Relu_763 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x4x1_g1_tensor16x8x32 Tactic: -2054375205435666404 [03/25/2022-13:24:54] [V] [TRT] Tactic: -2054375205435666404 Time: 0.127488 [03/25/2022-13:24:54] [V] [TRT] sections.3.0.conv2.module.weight + QuantizeLinear_757_quantize_scale_node + Conv_761 + Relu_763 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32_t1r3s3 Tactic: -1546787387293556842 [03/25/2022-13:24:54] [V] [TRT] Tactic: -1546787387293556842 Time: 0.091136 [03/25/2022-13:24:54] [V] [TRT] sections.3.0.conv2.module.weight + QuantizeLinear_757_quantize_scale_node + Conv_761 + Relu_763 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x32x64_stage4_warpsize4x1x1_g1_tensor16x8x32 Tactic: -1498626619443284096 [03/25/2022-13:24:54] [V] [TRT] Tactic: -1498626619443284096 Time: 0.18432 [03/25/2022-13:24:54] [V] [TRT] sections.3.0.conv2.module.weight + 
QuantizeLinear_757_quantize_scale_node + Conv_761 + Relu_763 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r3s3 Tactic: -1471245223605064669 [03/25/2022-13:24:54] [V] [TRT] Tactic: -1471245223605064669 Time: 0.10496 [03/25/2022-13:24:54] [V] [TRT] sections.3.0.conv2.module.weight + QuantizeLinear_757_quantize_scale_node + Conv_761 + Relu_763 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32 Tactic: -1283580231568512025 [03/25/2022-13:24:54] [V] [TRT] Tactic: -1283580231568512025 Time: 0.239232 [03/25/2022-13:24:54] [V] [TRT] sections.3.0.conv2.module.weight + QuantizeLinear_757_quantize_scale_node + Conv_761 + Relu_763 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x64x64_stage3_warpsize4x1x1_g1_tensor16x8x32_t1r3s3 Tactic: -1224421172675151280 [03/25/2022-13:24:54] [V] [TRT] Tactic: -1224421172675151280 Time: 0.119808 [03/25/2022-13:24:54] [V] [TRT] sections.3.0.conv2.module.weight + QuantizeLinear_757_quantize_scale_node + Conv_761 + Relu_763 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_epifadd Tactic: -1173968681844185579 [03/25/2022-13:24:54] [V] [TRT] Tactic: -1173968681844185579 Time: 0.238592 [03/25/2022-13:24:54] [V] [TRT] sections.3.0.conv2.module.weight + QuantizeLinear_757_quantize_scale_node + Conv_761 + Relu_763 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x128x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r3s3 Tactic: -921247911551089037 [03/25/2022-13:24:54] [V] [TRT] Tactic: -921247911551089037 Time: 0.103296 [03/25/2022-13:24:54] [V] [TRT] sections.3.0.conv2.module.weight + QuantizeLinear_757_quantize_scale_node + Conv_761 + Relu_763 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32 Tactic: -762222380308749469 [03/25/2022-13:24:54] [V] [TRT] Tactic: -762222380308749469 Time: 0.145152 [03/25/2022-13:24:54] [V] [TRT] sections.3.0.conv2.module.weight + QuantizeLinear_757_quantize_scale_node + Conv_761 + Relu_763 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_epifadd Tactic: -556794153877490941 [03/25/2022-13:24:54] [V] [TRT] Tactic: -556794153877490941 Time: 0.145664 [03/25/2022-13:24:54] [V] [TRT] sections.3.0.conv2.module.weight + QuantizeLinear_757_quantize_scale_node + Conv_761 + Relu_763 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage4_warpsize4x2x1_g1_tensor16x8x32_t1r3s3 Tactic: -516725800067794372 [03/25/2022-13:24:54] [V] [TRT] Tactic: -516725800067794372 Time: 0.089088 [03/25/2022-13:24:54] [V] [TRT] sections.3.0.conv2.module.weight + QuantizeLinear_757_quantize_scale_node + Conv_761 + Relu_763 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_singleBuffer_large_nt_v1 Tactic: -428104331444385564 [03/25/2022-13:24:54] [V] [TRT] Tactic: -428104331444385564 Time: 0.182272 
[03/25/2022-13:24:54] [V] [TRT] sections.3.0.conv2.module.weight + QuantizeLinear_757_quantize_scale_node + Conv_761 + Relu_763 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage1_warpsize2x1x1_g1_tensor8x8x16 Tactic: -366411318217594794 [03/25/2022-13:24:54] [V] [TRT] Tactic: -366411318217594794 Time: 0.24448 [03/25/2022-13:24:54] [V] [TRT] sections.3.0.conv2.module.weight + QuantizeLinear_757_quantize_scale_node + Conv_761 + Relu_763 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage1_warpsize2x4x1_g1_tensor8x8x16 Tactic: -351548418071036983 [03/25/2022-13:24:54] [V] [TRT] Tactic: -351548418071036983 Time: 0.162816 [03/25/2022-13:24:54] [V] [TRT] Fastest Tactic: 3651043333819148268 Time: 0.075264 [03/25/2022-13:24:54] [V] [TRT] >>>>>>>>>>>>>>> Chose Runner Type: CaskConvolution Tactic: 3651043333819148268
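The winning tactic above deserves a note: 3651043333819148268 is one of the sm80_xmma_fprop_sparse_conv_* kernels (sptensor16x8x64), i.e. a 2:4 structured-sparsity INT8 tensor-core tactic. At 0.075264 ms it beats the fastest dense candidate in this sweep, a 128x256x64 tile at 0.08704 ms, by roughly 13%. Sparse tactics are only ever benchmarked when the build asks for them; a minimal Python sketch of a builder configuration that admits both the INT8 and the sparse tactic pools, assuming an ONNX model that already carries QuantizeLinear/DequantizeLinear nodes (the file name and variable names are placeholders, not taken from this log):

    import tensorrt as trt

    logger = trt.Logger(trt.Logger.VERBOSE)   # same verbosity level as this log
    builder = trt.Builder(logger)
    network = builder.create_network(
        1 << int(trt.NetworkDefinitionCreationFlag.EXPLICIT_BATCH))
    parser = trt.OnnxParser(network, logger)
    with open("model_quant_sparse.onnx", "rb") as f:   # placeholder path
        assert parser.parse(f.read()), parser.get_error(0)

    config = builder.create_builder_config()
    config.set_flag(trt.BuilderFlag.INT8)             # admit INT8 tactics; the Q/DQ nodes
                                                      # carry the scales, so no calibrator
    config.set_flag(trt.BuilderFlag.SPARSE_WEIGHTS)   # admit the sparse tensor-core tactics
    engine_bytes = builder.build_serialized_network(network, config)

Without BuilderFlag.SPARSE_WEIGHTS the sparse kernels never enter the timing table, and this layer would have fallen back to the 0.08704 ms dense tactic.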
[03/25/2022-13:24:54] [V] [TRT] =============== Computing costs for [03/25/2022-13:24:54] [V] [TRT] *************** Autotuning format combination: Int8(6272,49:4,7,1), Int8(25088,49:4,7,1) -> Int8(25088,49:4,7,1) *************** [03/25/2022-13:24:54] [V] [TRT] --------------- Timing Runner: sections.3.0.conv3.module.weight + QuantizeLinear_772_quantize_scale_node + Conv_776 + Add_798 + Relu_799 (CudaDepthwiseConvolution) [03/25/2022-13:24:54] [V] [TRT] CudaDepthwiseConvolution has no valid tactics for this config, skipping [03/25/2022-13:24:54] [V] [TRT] --------------- Timing Runner: sections.3.0.conv3.module.weight + QuantizeLinear_772_quantize_scale_node + Conv_776 + Add_798 + Relu_799 (FusedConvActConvolution) [03/25/2022-13:24:54] [V] [TRT] FusedConvActConvolution has no valid tactics for this config, skipping [03/25/2022-13:24:54] [V] [TRT] --------------- Timing Runner: sections.3.0.conv3.module.weight + QuantizeLinear_772_quantize_scale_node + Conv_776 + Add_798 + Relu_799 (CaskConvolution) [03/25/2022-13:24:54] [V] [TRT] sections.3.0.conv3.module.weight + QuantizeLinear_772_quantize_scale_node + Conv_776 + Add_798 + Relu_799 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_medium_nn_v1 Tactic: 175853789719975416 [03/25/2022-13:24:54] [V] [TRT] Tactic: 175853789719975416 Time: 0.285312 [03/25/2022-13:24:54] [V] [TRT] sections.3.0.conv3.module.weight + QuantizeLinear_772_quantize_scale_node + Conv_776 + Add_798 + Relu_799 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_xregs_medium_nn_v1 Tactic: 2171150287007712632 [03/25/2022-13:24:54] [V] [TRT] Tactic: 2171150287007712632 Time: 0.291328 [03/25/2022-13:24:54] [V] [TRT] sections.3.0.conv3.module.weight + QuantizeLinear_772_quantize_scale_node + Conv_776 + Add_798 + Relu_799 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x64_relu_small_nn_v1 Tactic: 2234457234705232274 [03/25/2022-13:24:54] [V] [TRT] Tactic: 2234457234705232274 Time: 0.266496 [03/25/2022-13:24:54] [V] [TRT] sections.3.0.conv3.module.weight + QuantizeLinear_772_quantize_scale_node + Conv_776 + Add_798 + Relu_799 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x64_relu_medium_nn_v1 Tactic: 5834048089706882838 [03/25/2022-13:24:54] [V] [TRT] Tactic: 5834048089706882838 Time: 0.267648 [03/25/2022-13:24:54] [V] [TRT] sections.3.0.conv3.module.weight + QuantizeLinear_772_quantize_scale_node + Conv_776 + Add_798 + Relu_799 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x128_relu_interior_nn_v1 Tactic: 6299962968199310600 [03/25/2022-13:24:54] [V] [TRT] Tactic: 6299962968199310600 Time: 0.274048 [03/25/2022-13:24:54] [V] [TRT] sections.3.0.conv3.module.weight + QuantizeLinear_772_quantize_scale_node + Conv_776 + Add_798 + Relu_799 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_xregs_interior_nn_v1 Tactic: 6341572697076960911 [03/25/2022-13:24:54] [V] [TRT] Tactic: 6341572697076960911 Time: 0.277888 [03/25/2022-13:24:54] [V] [TRT] sections.3.0.conv3.module.weight + QuantizeLinear_772_quantize_scale_node + Conv_776 + Add_798 + Relu_799 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_small_nn_v1 Tactic: -8626990807754934295 [03/25/2022-13:24:54] [V] [TRT] Tactic: -8626990807754934295 Time: 0.283648 [03/25/2022-13:24:54] [V] [TRT] sections.3.0.conv3.module.weight + QuantizeLinear_772_quantize_scale_node + Conv_776 + Add_798 + Relu_799 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x64_relu_interior_nn_v1 Tactic: -8498217049614706532 [03/25/2022-13:24:54] [V] [TRT] Tactic: -8498217049614706532 Time: 0.257536 [03/25/2022-13:24:54] [V] [TRT] sections.3.0.conv3.module.weight + QuantizeLinear_772_quantize_scale_node + Conv_776 + Add_798 + Relu_799 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_xregs_small_nn_v1 Tactic: -7303593854972602201 [03/25/2022-13:24:54] [V] [TRT] Tactic: -7303593854972602201 Time: 0.27648 [03/25/2022-13:24:54] [V] [TRT] sections.3.0.conv3.module.weight + QuantizeLinear_772_quantize_scale_node + Conv_776 + Add_798 + Relu_799 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x128_relu_medium_nn_v1 Tactic: -6585664687867083638 [03/25/2022-13:24:54] [V] [TRT] Tactic: -6585664687867083638 Time: 0.275712 [03/25/2022-13:24:54] [V] [TRT] sections.3.0.conv3.module.weight + QuantizeLinear_772_quantize_scale_node + Conv_776 + Add_798 + Relu_799 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_interior_nn_v1 Tactic: -3326139578711341011 [03/25/2022-13:24:54] [V] [TRT] Tactic: -3326139578711341011 Time: 0.268032 [03/25/2022-13:24:54] [V] [TRT] sections.3.0.conv3.module.weight + QuantizeLinear_772_quantize_scale_node + Conv_776 + Add_798 + Relu_799 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x128_relu_small_nn_v1 Tactic: -683636008127039856 [03/25/2022-13:24:54] [V] [TRT] Tactic: -683636008127039856 Time: 0.276096 [03/25/2022-13:24:54] [V] [TRT] Fastest Tactic: -8498217049614706532 Time: 0.257536 [03/25/2022-13:24:54] [V] [TRT] >>>>>>>>>>>>>>> Chose Runner Type: CaskConvolution Tactic: -8498217049614706532
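This block shows the autotuner's full per-layout routine: runners that cannot implement the layer in this format are skipped ("has no valid tactics for this config"), every candidate kernel of the remaining runners is timed, and the minimum is kept as "Fastest Tactic". When sifting through a saved verbose log, it can help to recompute those minima offline; a small sketch (the helper and the regex are mine, and assume the "Tactic: <id> Time: <t>" measurement records seen above):

    import re

    # Measurement records look like "Tactic: -8498217049614706532 Time: 0.257536".
    # "Set Tactic Name: ... Tactic: <id>" announcements have no "Time:" field and
    # are skipped; the "Fastest Tactic: ..." summary re-matches the same pair,
    # which is harmless.
    _MEASURE = re.compile(r"Tactic: (-?\d+) Time: ([0-9.]+)")

    def fastest_tactic(block: str):
        """Return (tactic_id, time) of the fastest measurement in a log block."""
        times = {int(t): float(ms) for t, ms in _MEASURE.findall(block)}
        best = min(times, key=times.get)
        return best, times[best]

Applied to the sweep above this reproduces the printed result, (-8498217049614706532, 0.257536). All of these measurements are redone on every build; the timing cache (IBuilderConfig.set_timing_cache in the Python API) exists to amortize that cost across builds.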
[03/25/2022-13:24:54] [V] [TRT] *************** Autotuning format combination: Int8(6272,49:4,7,1), Int8(3136,49:32,7,1) -> Int8(3136,49:32,7,1) *************** [03/25/2022-13:24:54] [V] [TRT] --------------- Timing Runner: sections.3.0.conv3.module.weight + QuantizeLinear_772_quantize_scale_node + Conv_776 + Add_798 + Relu_799 (CaskConvolution) [03/25/2022-13:24:54] [V] [TRT] sections.3.0.conv3.module.weight + QuantizeLinear_772_quantize_scale_node + Conv_776 + Add_798 + Relu_799 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_small_c32_nn_v1 Tactic: 1100922622480907544 [03/25/2022-13:24:54] [V] [TRT] Tactic: 1100922622480907544 Time: 0.280704 [03/25/2022-13:24:54] [V] [TRT] sections.3.0.conv3.module.weight + QuantizeLinear_772_quantize_scale_node + Conv_776 + Add_798 + Relu_799 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x128_relu_interior_c32_nn_v1 Tactic: 2855900226702061782 [03/25/2022-13:24:54] [V] [TRT] Tactic: 2855900226702061782 Time: 0.273408 [03/25/2022-13:24:54] [V] [TRT] sections.3.0.conv3.module.weight + QuantizeLinear_772_quantize_scale_node + Conv_776 + Add_798 + Relu_799 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x64_relu_medium_c32_nn_v1 Tactic: 3606311198834416176 [03/25/2022-13:24:54] [V] [TRT] Tactic: 3606311198834416176 Time: 0.26624 [03/25/2022-13:24:54] [V] [TRT] sections.3.0.conv3.module.weight + QuantizeLinear_772_quantize_scale_node + Conv_776 + Add_798 + Relu_799 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x128_relu_small_c32_nn_v1 Tactic: 4325765560739862899 [03/25/2022-13:24:54] [V] [TRT] Tactic: 4325765560739862899 Time: 0.2752 [03/25/2022-13:24:54] [V] [TRT] sections.3.0.conv3.module.weight + QuantizeLinear_772_quantize_scale_node + Conv_776 + Add_798 + Relu_799 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x64_relu_interior_c32_nn_v1 Tactic: 8803458114157674373 [03/25/2022-13:24:54] [V] [TRT] Tactic: 8803458114157674373 Time: 0.256512 [03/25/2022-13:24:54] [V] [TRT] sections.3.0.conv3.module.weight + QuantizeLinear_772_quantize_scale_node + Conv_776 + Add_798 + Relu_799 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_interior_c32_nn_v1 Tactic: -6934773036503365000 [03/25/2022-13:24:54] [V] [TRT] Tactic: -6934773036503365000 Time: 0.264576 [03/25/2022-13:24:54] [V] [TRT] sections.3.0.conv3.module.weight + QuantizeLinear_772_quantize_scale_node + Conv_776 + Add_798 + Relu_799 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_xregs_interior_c32_nn_v1 Tactic: -4431642509665791294 [03/25/2022-13:24:54] [V] [TRT] Tactic: -4431642509665791294 Time: 0.276224 [03/25/2022-13:24:54] [V] [TRT] sections.3.0.conv3.module.weight + QuantizeLinear_772_quantize_scale_node + Conv_776 + Add_798 + Relu_799 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x128_relu_medium_c32_nn_v1 Tactic: -4255737803793506479 [03/25/2022-13:24:54] [V] [TRT] Tactic: -4255737803793506479 Time: 0.275072 [03/25/2022-13:24:54] [V] [TRT] sections.3.0.conv3.module.weight + QuantizeLinear_772_quantize_scale_node + Conv_776 + Add_798 + Relu_799 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_xregs_small_c32_nn_v1 Tactic: -3958182351168863467 [03/25/2022-13:24:54] [V] [TRT] Tactic: -3958182351168863467 Time: 0.273664 [03/25/2022-13:24:54] [V] [TRT] sections.3.0.conv3.module.weight + QuantizeLinear_772_quantize_scale_node + Conv_776 + Add_798 + Relu_799 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_xregs_medium_c32_nn_v1 Tactic: -3111968753064955248 [03/25/2022-13:24:54] [V] [TRT] Tactic: -3111968753064955248 Time: 0.289024 [03/25/2022-13:24:54] [V] [TRT] sections.3.0.conv3.module.weight + QuantizeLinear_772_quantize_scale_node + Conv_776 + Add_798 + Relu_799 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_medium_c32_nn_v1 Tactic: -1492575840277333548 [03/25/2022-13:24:54] [V] [TRT] Tactic: -1492575840277333548 Time: 0.281728 [03/25/2022-13:24:54] [V] [TRT] sections.3.0.conv3.module.weight + QuantizeLinear_772_quantize_scale_node + Conv_776 + Add_798 + Relu_799 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x64_relu_small_c32_nn_v1 Tactic: -868495160148524802 [03/25/2022-13:24:54] [V] [TRT] Tactic: -868495160148524802 Time: 0.265216 [03/25/2022-13:24:54] [V] [TRT] Fastest Tactic: 8803458114157674373 Time: 0.256512 [03/25/2022-13:24:54] [V] [TRT] >>>>>>>>>>>>>>> Chose Runner Type: CaskConvolution Tactic: 8803458114157674373
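The headers of these blocks encode layouts, not just element types. In Int8(6272,49:4,7,1) the ":4" tags the channel dimension as vectorized by 4 (a CHW4-style layout: 512 channels -> 128 vectors of 4), while Int8(784,49:32,7,1) and Int8(3136,49:32,7,1) are the CHW32 equivalents, so the same fused convolution is priced once per candidate input/output layout pairing, and the builder later weighs any reformat costs between layers as well. Internal layouts stay the autotuner's business, but network I/O tensors can be pinned to one of these vectorized formats through ITensor.allowed_formats; a sketch under that assumption (the function name is mine, and pinning I/O to CHW32 also presumes the tensors are INT8):

    import tensorrt as trt

    def pin_io_to_chw32(network: trt.INetworkDefinition) -> None:
        """Restrict every network input/output to the CHW32 vectorized layout."""
        tensors = [network.get_input(i) for i in range(network.num_inputs)]
        tensors += [network.get_output(i) for i in range(network.num_outputs)]
        for t in tensors:
            t.allowed_formats = 1 << int(trt.TensorFormat.CHW32)

Pinning formats can save a reformat at the network boundary when the adjacent layer already prefers the same layout, at the price of shrinking the set of eligible tactics.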
[03/25/2022-13:24:54] [V] [TRT] *************** Autotuning format combination: Int8(784,49:32,7,1), Int8(3136,49:32,7,1) -> Int8(3136,49:32,7,1) *************** [03/25/2022-13:24:54] [V] [TRT] --------------- Timing Runner: sections.3.0.conv3.module.weight + QuantizeLinear_772_quantize_scale_node + Conv_776 + Add_798 + Relu_799 (CudaGroupConvolution) [03/25/2022-13:24:54] [V] [TRT] CudaGroupConvolution has no valid tactics for this config, skipping [03/25/2022-13:24:54] [V] [TRT] --------------- Timing Runner: sections.3.0.conv3.module.weight + QuantizeLinear_772_quantize_scale_node + Conv_776 + Add_798 + Relu_799 (CudaDepthwiseConvolution) [03/25/2022-13:24:54] [V] [TRT] CudaDepthwiseConvolution has no valid tactics for this config, skipping [03/25/2022-13:24:54] [V] [TRT] --------------- Timing Runner: sections.3.0.conv3.module.weight + QuantizeLinear_772_quantize_scale_node + Conv_776 + Add_798 + Relu_799 (FusedConvActConvolution) [03/25/2022-13:24:54] [V] [TRT] FusedConvActConvolution has no valid tactics for this config, skipping [03/25/2022-13:24:54] [V] [TRT] --------------- Timing Runner: sections.3.0.conv3.module.weight + QuantizeLinear_772_quantize_scale_node + Conv_776 + Add_798 + Relu_799 (CaskConvolution) [03/25/2022-13:24:54] [V] [TRT] sections.3.0.conv3.module.weight + QuantizeLinear_772_quantize_scale_node + Conv_776 + Add_798 + Relu_799 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage1_warpsize4x1x1_g1_tensor8x8x16_t1r1s1 Tactic: 68468667201176803 [03/25/2022-13:24:54] [V] [TRT] Tactic: 68468667201176803 Time: 0.108416 [03/25/2022-13:24:54] [V] [TRT] sections.3.0.conv3.module.weight + QuantizeLinear_772_quantize_scale_node + Conv_776 + Add_798 + Relu_799 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x192x64_stage3_warpsize4x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 125145153013230687 [03/25/2022-13:24:54] [V] [TRT] Tactic: 125145153013230687 Time: 0.083968 [03/25/2022-13:24:54] [V] [TRT] sections.3.0.conv3.module.weight + QuantizeLinear_772_quantize_scale_node + Conv_776 + Add_798 + Relu_799 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x128_ldg16_relu_interior_nt_v1 Tactic: 434957160407688216 [03/25/2022-13:24:54] [V] [TRT] Tactic: 434957160407688216 Time: 0.115328 [03/25/2022-13:24:54] [V] [TRT] sections.3.0.conv3.module.weight + QuantizeLinear_772_quantize_scale_node + Conv_776 + Add_798 + Relu_799 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x128_stage3_warpsize4x2x1_g1_sptensor16x8x64_t1r1s1 Tactic: 857001784974286465 [03/25/2022-13:24:54] [V] [TRT] Tactic: 857001784974286465 Time: 0.080128 [03/25/2022-13:24:54] [V] [TRT] sections.3.0.conv3.module.weight + QuantizeLinear_772_quantize_scale_node + Conv_776 + Add_798 + Relu_799 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 1583811548148740665 [03/25/2022-13:24:54] [V] [TRT] Tactic: 1583811548148740665 Time: 0.084352 [03/25/2022-13:24:54] [V] [TRT] sections.3.0.conv3.module.weight + QuantizeLinear_772_quantize_scale_node + Conv_776 + Add_798 + Relu_799 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 1701344857577810806 [03/25/2022-13:24:54] [V] [TRT] Tactic: 1701344857577810806 Time: 0.092544 [03/25/2022-13:24:54] [V] [TRT] sections.3.0.conv3.module.weight + QuantizeLinear_772_quantize_scale_node + Conv_776 + Add_798 + Relu_799 Set Tactic Name:
sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x32x64_stage4_warpsize2x1x1_g1_tensor16x8x32_t1r1s1 Tactic: 1797231177354918208 [03/25/2022-13:24:54] [V] [TRT] Tactic: 1797231177354918208 Time: 0.1312 [03/25/2022-13:24:54] [V] [TRT] sections.3.0.conv3.module.weight + QuantizeLinear_772_quantize_scale_node + Conv_776 + Add_798 + Relu_799 Set Tactic Name: ampere_int8_i8816cudnn_int8_128x128_ldg16_relu_interior_nt_v1 Tactic: 2346437292116182513 [03/25/2022-13:24:54] [V] [TRT] Tactic: 2346437292116182513 Time: 0.102272 [03/25/2022-13:24:54] [V] [TRT] sections.3.0.conv3.module.weight + QuantizeLinear_772_quantize_scale_node + Conv_776 + Add_798 + Relu_799 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_singleBuffer_interior_nt_v1 Tactic: 2522133112320625287 [03/25/2022-13:24:54] [V] [TRT] Tactic: 2522133112320625287 Time: 0.100864 [03/25/2022-13:24:54] [V] [TRT] sections.3.0.conv3.module.weight + QuantizeLinear_772_quantize_scale_node + Conv_776 + Add_798 + Relu_799 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage1_warpsize2x1x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: 2548946449357458230 [03/25/2022-13:24:54] [V] [TRT] Tactic: 2548946449357458230 Time: 0.12224 [03/25/2022-13:24:54] [V] [TRT] sections.3.0.conv3.module.weight + QuantizeLinear_772_quantize_scale_node + Conv_776 + Add_798 + Relu_799 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x128_stage3_warpsize1x1x1_g1_sptensor16x8x64_simple_t1r1s1_no_preds Tactic: 2570666021825229009 [03/25/2022-13:24:54] [V] [TRT] Tactic: 2570666021825229009 Time: 0.113664 [03/25/2022-13:24:54] [V] [TRT] sections.3.0.conv3.module.weight + QuantizeLinear_772_quantize_scale_node + Conv_776 + Add_798 + Relu_799 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x16x128_stage3_warpsize1x1x1_g1_sptensor16x8x64_simple_t1r1s1 Tactic: 2678520742286844763 [03/25/2022-13:24:54] [V] [TRT] Tactic: 2678520742286844763 Time: 0.133376 [03/25/2022-13:24:54] [V] [TRT] sections.3.0.conv3.module.weight + QuantizeLinear_772_quantize_scale_node + Conv_776 + Add_798 + Relu_799 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 2756291002030759362 [03/25/2022-13:24:54] [V] [TRT] Tactic: 2756291002030759362 Time: 0.098048 [03/25/2022-13:24:54] [V] [TRT] sections.3.0.conv3.module.weight + QuantizeLinear_772_quantize_scale_node + Conv_776 + Add_798 + Relu_799 Set Tactic Name: ampere_int8_i8816cudnn_int8_128x128_ldg16_relu_medium_nt_v1 Tactic: 2985940154541537814 [03/25/2022-13:24:55] [V] [TRT] Tactic: 2985940154541537814 Time: 0.10304 [03/25/2022-13:24:55] [V] [TRT] sections.3.0.conv3.module.weight + QuantizeLinear_772_quantize_scale_node + Conv_776 + Add_798 + Relu_799 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x96x64_stage5_warpsize2x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 3043273137345374664 [03/25/2022-13:24:55] [V] [TRT] Tactic: 3043273137345374664 Time: 0.10816 [03/25/2022-13:24:55] [V] [TRT] sections.3.0.conv3.module.weight + QuantizeLinear_772_quantize_scale_node + Conv_776 + Add_798 + Relu_799 Set Tactic Name: 
sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r1s1 Tactic: 3242897809704328258 [03/25/2022-13:24:55] [V] [TRT] Tactic: 3242897809704328258 Time: 0.10816 [03/25/2022-13:24:55] [V] [TRT] sections.3.0.conv3.module.weight + QuantizeLinear_772_quantize_scale_node + Conv_776 + Add_798 + Relu_799 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: 3312456766204252694 [03/25/2022-13:24:55] [V] [TRT] Tactic: 3312456766204252694 Time: 0.131328 [03/25/2022-13:24:55] [V] [TRT] sections.3.0.conv3.module.weight + QuantizeLinear_772_quantize_scale_node + Conv_776 + Add_798 + Relu_799 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 3538565962642681625 [03/25/2022-13:24:55] [V] [TRT] Tactic: 3538565962642681625 Time: 0.096512 [03/25/2022-13:24:55] [V] [TRT] sections.3.0.conv3.module.weight + QuantizeLinear_772_quantize_scale_node + Conv_776 + Add_798 + Relu_799 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage1_warpsize2x2x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: 3541919052468401776 [03/25/2022-13:24:55] [V] [TRT] Tactic: 3541919052468401776 Time: 0.100864 [03/25/2022-13:24:55] [V] [TRT] sections.3.0.conv3.module.weight + QuantizeLinear_772_quantize_scale_node + Conv_776 + Add_798 + Relu_799 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 3593397928177382100 [03/25/2022-13:24:55] [V] [TRT] Tactic: 3593397928177382100 Time: 0.116736 [03/25/2022-13:24:55] [V] [TRT] sections.3.0.conv3.module.weight + QuantizeLinear_772_quantize_scale_node + Conv_776 + Add_798 + Relu_799 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x128_ldg16_relu_medium_nt_v1 Tactic: 3899284354987683408 [03/25/2022-13:24:55] [V] [TRT] Tactic: 3899284354987683408 Time: 0.11648 [03/25/2022-13:24:55] [V] [TRT] sections.3.0.conv3.module.weight + QuantizeLinear_772_quantize_scale_node + Conv_776 + Add_798 + Relu_799 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: 3927509214678622419 [03/25/2022-13:24:55] [V] [TRT] Tactic: 3927509214678622419 Time: 0.105344 [03/25/2022-13:24:55] [V] [TRT] sections.3.0.conv3.module.weight + QuantizeLinear_772_quantize_scale_node + Conv_776 + Add_798 + Relu_799 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 4112572034735311841 [03/25/2022-13:24:55] [V] [TRT] Tactic: 4112572034735311841 Time: 0.156672 [03/25/2022-13:24:55] [V] [TRT] sections.3.0.conv3.module.weight + QuantizeLinear_772_quantize_scale_node + Conv_776 + Add_798 + Relu_799 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage6_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 4610760414797216079 [03/25/2022-13:24:55] [V] [TRT] Tactic: 
4610760414797216079 Time: 0.083968 [03/25/2022-13:24:55] [V] [TRT] sections.3.0.conv3.module.weight + QuantizeLinear_772_quantize_scale_node + Conv_776 + Add_798 + Relu_799 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_small_nt_v1 Tactic: 4717285412741024953 [03/25/2022-13:24:55] [V] [TRT] Tactic: 4717285412741024953 Time: 0.106112 [03/25/2022-13:24:55] [V] [TRT] sections.3.0.conv3.module.weight + QuantizeLinear_772_quantize_scale_node + Conv_776 + Add_798 + Relu_799 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 4796956614760326119 [03/25/2022-13:24:55] [V] [TRT] Tactic: 4796956614760326119 Time: 0.078208 [03/25/2022-13:24:55] [V] [TRT] sections.3.0.conv3.module.weight + QuantizeLinear_772_quantize_scale_node + Conv_776 + Add_798 + Relu_799 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x128_stage3_warpsize2x2x1_g1_sptensor16x8x64_t1r1s1_no_preds Tactic: 4909502217677847353 [03/25/2022-13:24:55] [V] [TRT] Tactic: 4909502217677847353 Time: 0.074496 [03/25/2022-13:24:55] [V] [TRT] sections.3.0.conv3.module.weight + QuantizeLinear_772_quantize_scale_node + Conv_776 + Add_798 + Relu_799 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x128_stage3_warpsize1x2x1_g1_sptensor16x8x64_simple_t1r1s1 Tactic: 5043674678294309681 [03/25/2022-13:24:55] [V] [TRT] Tactic: 5043674678294309681 Time: 0.088832 [03/25/2022-13:24:55] [V] [TRT] sections.3.0.conv3.module.weight + QuantizeLinear_772_quantize_scale_node + Conv_776 + Add_798 + Relu_799 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_interior_nt_v1 Tactic: 5126565865931538390 [03/25/2022-13:24:55] [V] [TRT] Tactic: 5126565865931538390 Time: 0.104576 [03/25/2022-13:24:55] [V] [TRT] sections.3.0.conv3.module.weight + QuantizeLinear_772_quantize_scale_node + Conv_776 + Add_798 + Relu_799 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x64x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 5204702486885981735 [03/25/2022-13:24:55] [V] [TRT] Tactic: 5204702486885981735 Time: 0.085632 [03/25/2022-13:24:55] [V] [TRT] sections.3.0.conv3.module.weight + QuantizeLinear_772_quantize_scale_node + Conv_776 + Add_798 + Relu_799 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x128_stage3_warpsize2x2x1_g1_sptensor16x8x64_simple_t1r1s1 Tactic: 5424258848951129084 [03/25/2022-13:24:55] [V] [TRT] Tactic: 5424258848951129084 Time: 0.075392 [03/25/2022-13:24:55] [V] [TRT] sections.3.0.conv3.module.weight + QuantizeLinear_772_quantize_scale_node + Conv_776 + Add_798 + Relu_799 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 5443897483205284103 [03/25/2022-13:24:55] [V] [TRT] Tactic: 5443897483205284103 Time: 0.09024 [03/25/2022-13:24:55] [V] [TRT] sections.3.0.conv3.module.weight + QuantizeLinear_772_quantize_scale_node + Conv_776 + Add_798 + Relu_799 Set Tactic Name: 
sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage4_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 5707566217891294846 [03/25/2022-13:24:55] [V] [TRT] Tactic: 5707566217891294846 Time: 0.076928 [03/25/2022-13:24:55] [V] [TRT] sections.3.0.conv3.module.weight + QuantizeLinear_772_quantize_scale_node + Conv_776 + Add_798 + Relu_799 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x4x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 5986622376339202983 [03/25/2022-13:24:55] [V] [TRT] Tactic: 5986622376339202983 Time: 0.091008 [03/25/2022-13:24:55] [V] [TRT] sections.3.0.conv3.module.weight + QuantizeLinear_772_quantize_scale_node + Conv_776 + Add_798 + Relu_799 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x128x64_stage3_warpsize4x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 6007888770437705057 [03/25/2022-13:24:55] [V] [TRT] Tactic: 6007888770437705057 Time: 0.079104 [03/25/2022-13:24:55] [V] [TRT] sections.3.0.conv3.module.weight + QuantizeLinear_772_quantize_scale_node + Conv_776 + Add_798 + Relu_799 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32_t1r1s1 Tactic: 6405251167055673379 [03/25/2022-13:24:55] [V] [TRT] Tactic: 6405251167055673379 Time: 0.079488 [03/25/2022-13:24:55] [V] [TRT] sections.3.0.conv3.module.weight + QuantizeLinear_772_quantize_scale_node + Conv_776 + Add_798 + Relu_799 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage4_warpsize4x1x1_g1_tensor16x8x32_t1r1s1 Tactic: 6433368103202497147 [03/25/2022-13:24:55] [V] [TRT] Tactic: 6433368103202497147 Time: 0.079616 [03/25/2022-13:24:55] [V] [TRT] sections.3.0.conv3.module.weight + QuantizeLinear_772_quantize_scale_node + Conv_776 + Add_798 + Relu_799 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 6441948709525127755 [03/25/2022-13:24:55] [V] [TRT] Tactic: 6441948709525127755 Time: 0.121472 [03/25/2022-13:24:55] [V] [TRT] sections.3.0.conv3.module.weight + QuantizeLinear_772_quantize_scale_node + Conv_776 + Add_798 + Relu_799 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x256x64_stage3_warpsize2x4x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 6443933097134654777 [03/25/2022-13:24:55] [V] [TRT] Tactic: 6443933097134654777 Time: 0.080256 [03/25/2022-13:24:55] [V] [TRT] sections.3.0.conv3.module.weight + QuantizeLinear_772_quantize_scale_node + Conv_776 + Add_798 + Relu_799 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 6457435868048963632 [03/25/2022-13:24:55] [V] [TRT] Tactic: 6457435868048963632 Time: 0.08512 [03/25/2022-13:24:55] [V] [TRT] sections.3.0.conv3.module.weight + QuantizeLinear_772_quantize_scale_node + Conv_776 + Add_798 + Relu_799 Set Tactic Name: 
sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x128_stage3_warpsize1x1x1_g1_sptensor16x8x64_t1r1s1 Tactic: 6510345569544721081 [03/25/2022-13:24:55] [V] [TRT] Tactic: 6510345569544721081 Time: 0.115328 [03/25/2022-13:24:55] [V] [TRT] sections.3.0.conv3.module.weight + QuantizeLinear_772_quantize_scale_node + Conv_776 + Add_798 + Relu_799 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 6793988781414507278 [03/25/2022-13:24:55] [V] [TRT] Tactic: 6793988781414507278 Time: 0.073472 [03/25/2022-13:24:55] [V] [TRT] sections.3.0.conv3.module.weight + QuantizeLinear_772_quantize_scale_node + Conv_776 + Add_798 + Relu_799 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 6925201228918187099 [03/25/2022-13:24:55] [V] [TRT] Tactic: 6925201228918187099 Time: 0.075392 [03/25/2022-13:24:55] [V] [TRT] sections.3.0.conv3.module.weight + QuantizeLinear_772_quantize_scale_node + Conv_776 + Add_798 + Relu_799 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 6991524515605108718 [03/25/2022-13:24:55] [V] [TRT] Tactic: 6991524515605108718 Time: 0.098304 [03/25/2022-13:24:55] [V] [TRT] sections.3.0.conv3.module.weight + QuantizeLinear_772_quantize_scale_node + Conv_776 + Add_798 + Relu_799 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x64x64_stage3_warpsize4x1x1_g1_tensor16x8x32_t1r1s1 Tactic: 7318929579222925725 [03/25/2022-13:24:55] [V] [TRT] Tactic: 7318929579222925725 Time: 0.079872 [03/25/2022-13:24:55] [V] [TRT] sections.3.0.conv3.module.weight + QuantizeLinear_772_quantize_scale_node + Conv_776 + Add_798 + Relu_799 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x128_stage3_warpsize1x4x1_g1_sptensor16x8x64_t1r1s1_no_preds Tactic: 7886967395128926382 [03/25/2022-13:24:55] [V] [TRT] Tactic: 7886967395128926382 Time: 0.084608 [03/25/2022-13:24:55] [V] [TRT] sections.3.0.conv3.module.weight + QuantizeLinear_772_quantize_scale_node + Conv_776 + Add_798 + Relu_799 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage4_warpsize4x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 8142283985160822229 [03/25/2022-13:24:55] [V] [TRT] Tactic: 8142283985160822229 Time: 0.083328 [03/25/2022-13:24:55] [V] [TRT] sections.3.0.conv3.module.weight + QuantizeLinear_772_quantize_scale_node + Conv_776 + Add_798 + Relu_799 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage4_warpsize2x4x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 8173975624668590862 [03/25/2022-13:24:55] [V] [TRT] Tactic: 8173975624668590862 Time: 0.083968 [03/25/2022-13:24:55] [V] [TRT] sections.3.0.conv3.module.weight + QuantizeLinear_772_quantize_scale_node + Conv_776 + Add_798 + Relu_799 Set Tactic Name: 
sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage4_warpsize4x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 8234775147403903473 [03/25/2022-13:24:55] [V] [TRT] Tactic: 8234775147403903473 Time: 0.085888 [03/25/2022-13:24:55] [V] [TRT] sections.3.0.conv3.module.weight + QuantizeLinear_772_quantize_scale_node + Conv_776 + Add_798 + Relu_799 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x128x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 8524082966802584889 [03/25/2022-13:24:55] [V] [TRT] Tactic: 8524082966802584889 Time: 0.072704 [03/25/2022-13:24:55] [V] [TRT] sections.3.0.conv3.module.weight + QuantizeLinear_772_quantize_scale_node + Conv_776 + Add_798 + Relu_799 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage1_warpsize4x1x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: 8684013308930763400 [03/25/2022-13:24:55] [V] [TRT] Tactic: 8684013308930763400 Time: 0.1056 [03/25/2022-13:24:55] [V] [TRT] sections.3.0.conv3.module.weight + QuantizeLinear_772_quantize_scale_node + Conv_776 + Add_798 + Relu_799 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x128x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 8765382722978397630 [03/25/2022-13:24:55] [V] [TRT] Tactic: 8765382722978397630 Time: 0.073216 [03/25/2022-13:24:55] [V] [TRT] sections.3.0.conv3.module.weight + QuantizeLinear_772_quantize_scale_node + Conv_776 + Add_798 + Relu_799 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x128_stage3_warpsize1x4x1_g1_sptensor16x8x64_simple_t1r1s1 Tactic: 8883810517410230831 [03/25/2022-13:24:55] [V] [TRT] Tactic: 8883810517410230831 Time: 0.086272 [03/25/2022-13:24:55] [V] [TRT] sections.3.0.conv3.module.weight + QuantizeLinear_772_quantize_scale_node + Conv_776 + Add_798 + Relu_799 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x16x128_stage3_warpsize1x1x1_g1_sptensor16x8x64_t1r1s1 Tactic: 9062173295331155069 [03/25/2022-13:24:55] [V] [TRT] Tactic: 9062173295331155069 Time: 0.134144 [03/25/2022-13:24:55] [V] [TRT] sections.3.0.conv3.module.weight + QuantizeLinear_772_quantize_scale_node + Conv_776 + Add_798 + Relu_799 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_singleBuffer_small_nt_v1 Tactic: -9118785798277698619 [03/25/2022-13:24:55] [V] [TRT] Tactic: -9118785798277698619 Time: 0.102528 [03/25/2022-13:24:55] [V] [TRT] sections.3.0.conv3.module.weight + QuantizeLinear_772_quantize_scale_node + Conv_776 + Add_798 + Relu_799 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r1s1 Tactic: -8943710627305202139 [03/25/2022-13:24:55] [V] [TRT] Tactic: -8943710627305202139 Time: 0.100352 [03/25/2022-13:24:55] [V] [TRT] sections.3.0.conv3.module.weight + QuantizeLinear_772_quantize_scale_node + Conv_776 + Add_798 + Relu_799 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: -8859846367886814331 [03/25/2022-13:24:55] [V] [TRT] Tactic: 
-8859846367886814331 Time: 0.112 [03/25/2022-13:24:55] [V] [TRT] sections.3.0.conv3.module.weight + QuantizeLinear_772_quantize_scale_node + Conv_776 + Add_798 + Relu_799 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: -8638624340850784688 [03/25/2022-13:24:55] [V] [TRT] Tactic: -8638624340850784688 Time: 0.102912 [03/25/2022-13:24:55] [V] [TRT] sections.3.0.conv3.module.weight + QuantizeLinear_772_quantize_scale_node + Conv_776 + Add_798 + Relu_799 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage1_warpsize2x1x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: -8382298409581540699 [03/25/2022-13:24:55] [V] [TRT] Tactic: -8382298409581540699 Time: 0.149376 [03/25/2022-13:24:55] [V] [TRT] sections.3.0.conv3.module.weight + QuantizeLinear_772_quantize_scale_node + Conv_776 + Add_798 + Relu_799 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage1_warpsize4x2x1_g1_tensor8x8x16_t1r1s1 Tactic: -8172318747337038866 [03/25/2022-13:24:55] [V] [TRT] Tactic: -8172318747337038866 Time: 0.116864 [03/25/2022-13:24:55] [V] [TRT] sections.3.0.conv3.module.weight + QuantizeLinear_772_quantize_scale_node + Conv_776 + Add_798 + Relu_799 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x96x64_stage5_warpsize2x1x1_g1_tensor16x8x32_t1r1s1 Tactic: -7844028314176826857 [03/25/2022-13:24:55] [V] [TRT] Tactic: -7844028314176826857 Time: 0.110848 [03/25/2022-13:24:55] [V] [TRT] sections.3.0.conv3.module.weight + QuantizeLinear_772_quantize_scale_node + Conv_776 + Add_798 + Relu_799 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x128_stage3_warpsize2x2x1_g1_sptensor16x8x64_simple_t1r1s1_no_preds Tactic: -7674507941016740570 [03/25/2022-13:24:55] [V] [TRT] Tactic: -7674507941016740570 Time: 0.074368 [03/25/2022-13:24:55] [V] [TRT] sections.3.0.conv3.module.weight + QuantizeLinear_772_quantize_scale_node + Conv_776 + Add_798 + Relu_799 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage1_warpsize2x1x1_g1_tensor8x8x16_t1r1s1 Tactic: -7361755530333096258 [03/25/2022-13:24:55] [V] [TRT] Tactic: -7361755530333096258 Time: 0.124544 [03/25/2022-13:24:55] [V] [TRT] sections.3.0.conv3.module.weight + QuantizeLinear_772_quantize_scale_node + Conv_776 + Add_798 + Relu_799 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x128_stage3_warpsize1x4x1_g1_sptensor16x8x64_t1r1s1 Tactic: -7289760022626653388 [03/25/2022-13:24:55] [V] [TRT] Tactic: -7289760022626653388 Time: 0.086272 [03/25/2022-13:24:55] [V] [TRT] sections.3.0.conv3.module.weight + QuantizeLinear_772_quantize_scale_node + Conv_776 + Add_798 + Relu_799 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage1_warpsize2x4x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: -7106539943789766885 [03/25/2022-13:24:55] [V] [TRT] Tactic: -7106539943789766885 Time: 0.115072 [03/25/2022-13:24:55] [V] [TRT] sections.3.0.conv3.module.weight + 
QuantizeLinear_772_quantize_scale_node + Conv_776 + Add_798 + Relu_799 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage1_warpsize4x2x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: -6969478418607271266 [03/25/2022-13:24:55] [V] [TRT] Tactic: -6969478418607271266 Time: 0.114304 [03/25/2022-13:24:55] [V] [TRT] sections.3.0.conv3.module.weight + QuantizeLinear_772_quantize_scale_node + Conv_776 + Add_798 + Relu_799 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x64x64_stage3_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -6879607992933502380 [03/25/2022-13:24:55] [V] [TRT] Tactic: -6879607992933502380 Time: 0.077184 [03/25/2022-13:24:55] [V] [TRT] sections.3.0.conv3.module.weight + QuantizeLinear_772_quantize_scale_node + Conv_776 + Add_798 + Relu_799 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x128_stage3_warpsize1x2x1_g1_sptensor16x8x64_simple_t1r1s1_no_preds Tactic: -6839669803644810934 [03/25/2022-13:24:55] [V] [TRT] Tactic: -6839669803644810934 Time: 0.087552 [03/25/2022-13:24:55] [V] [TRT] sections.3.0.conv3.module.weight + QuantizeLinear_772_quantize_scale_node + Conv_776 + Add_798 + Relu_799 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x128_stage3_warpsize1x2x1_g1_sptensor16x8x64_t1r1s1_no_preds Tactic: -6812830108414456369 [03/25/2022-13:24:55] [V] [TRT] Tactic: -6812830108414456369 Time: 0.087936 [03/25/2022-13:24:55] [V] [TRT] sections.3.0.conv3.module.weight + QuantizeLinear_772_quantize_scale_node + Conv_776 + Add_798 + Relu_799 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x128_stage3_warpsize4x2x1_g1_sptensor16x8x64_simple_t1r1s1_no_preds Tactic: -6779804930216439173 [03/25/2022-13:24:55] [V] [TRT] Tactic: -6779804930216439173 Time: 0.079488 [03/25/2022-13:24:55] [V] [TRT] sections.3.0.conv3.module.weight + QuantizeLinear_772_quantize_scale_node + Conv_776 + Add_798 + Relu_799 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_t1r1s1 Tactic: -6510232214299595844 [03/25/2022-13:24:55] [V] [TRT] Tactic: -6510232214299595844 Time: 0.14144 [03/25/2022-13:24:55] [V] [TRT] sections.3.0.conv3.module.weight + QuantizeLinear_772_quantize_scale_node + Conv_776 + Add_798 + Relu_799 Set Tactic Name: ampere_int8_i8816cudnn_int8_128x128_ldg16_relu_small_nt_v1 Tactic: -6400348606759295499 [03/25/2022-13:24:55] [V] [TRT] Tactic: -6400348606759295499 Time: 0.0992 [03/25/2022-13:24:55] [V] [TRT] sections.3.0.conv3.module.weight + QuantizeLinear_772_quantize_scale_node + Conv_776 + Add_798 + Relu_799 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage1_warpsize2x2x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: -6346247605026339453 [03/25/2022-13:24:55] [V] [TRT] Tactic: -6346247605026339453 Time: 0.099456 [03/25/2022-13:24:55] [V] [TRT] sections.3.0.conv3.module.weight + QuantizeLinear_772_quantize_scale_node + Conv_776 + Add_798 + Relu_799 Set Tactic Name: 
sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x96x64_stage4_warpsize4x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -6232597026469067819 [03/25/2022-13:24:55] [V] [TRT] Tactic: -6232597026469067819 Time: 0.10112 [03/25/2022-13:24:55] [V] [TRT] sections.3.0.conv3.module.weight + QuantizeLinear_772_quantize_scale_node + Conv_776 + Add_798 + Relu_799 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x128_ldg16_relu_small_nt_v1 Tactic: -5980889159865208399 [03/25/2022-13:24:55] [V] [TRT] Tactic: -5980889159865208399 Time: 0.1152 [03/25/2022-13:24:55] [V] [TRT] sections.3.0.conv3.module.weight + QuantizeLinear_772_quantize_scale_node + Conv_776 + Add_798 + Relu_799 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_singleBuffer_medium_nt_v1 Tactic: -5766140806760372989 [03/25/2022-13:24:55] [V] [TRT] Tactic: -5766140806760372989 Time: 0.105216 [03/25/2022-13:24:55] [V] [TRT] sections.3.0.conv3.module.weight + QuantizeLinear_772_quantize_scale_node + Conv_776 + Add_798 + Relu_799 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r1s1 Tactic: -5697614955743334137 [03/25/2022-13:24:55] [V] [TRT] Tactic: -5697614955743334137 Time: 0.10304 [03/25/2022-13:24:55] [V] [TRT] sections.3.0.conv3.module.weight + QuantizeLinear_772_quantize_scale_node + Conv_776 + Add_798 + Relu_799 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -5671123121710113970 [03/25/2022-13:24:55] [V] [TRT] Tactic: -5671123121710113970 Time: 0.086656 [03/25/2022-13:24:55] [V] [TRT] sections.3.0.conv3.module.weight + QuantizeLinear_772_quantize_scale_node + Conv_776 + Add_798 + Relu_799 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x128x64_stage3_warpsize4x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -5562968047117507056 [03/25/2022-13:24:55] [V] [TRT] Tactic: -5562968047117507056 Time: 0.078976 [03/25/2022-13:24:55] [V] [TRT] sections.3.0.conv3.module.weight + QuantizeLinear_772_quantize_scale_node + Conv_776 + Add_798 + Relu_799 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -5516472881360101487 [03/25/2022-13:24:55] [V] [TRT] Tactic: -5516472881360101487 Time: 0.101248 [03/25/2022-13:24:55] [V] [TRT] sections.3.0.conv3.module.weight + QuantizeLinear_772_quantize_scale_node + Conv_776 + Add_798 + Relu_799 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r1s1 Tactic: -5311474420963248369 [03/25/2022-13:24:55] [V] [TRT] Tactic: -5311474420963248369 Time: 0.132352 [03/25/2022-13:24:55] [V] [TRT] sections.3.0.conv3.module.weight + QuantizeLinear_772_quantize_scale_node + Conv_776 + Add_798 + Relu_799 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x128_stage3_warpsize1x4x1_g1_sptensor16x8x64_simple_t1r1s1_no_preds Tactic: -4889498558023475527 [03/25/2022-13:24:55] [V] [TRT] Tactic: -4889498558023475527 Time: 0.084736 [03/25/2022-13:24:55] [V] [TRT] 
sections.3.0.conv3.module.weight + QuantizeLinear_772_quantize_scale_node + Conv_776 + Add_798 + Relu_799 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x128_stage3_warpsize4x2x1_g1_sptensor16x8x64_simple_t1r1s1 Tactic: -4681913707320020520 [03/25/2022-13:24:55] [V] [TRT] Tactic: -4681913707320020520 Time: 0.079488 [03/25/2022-13:24:55] [V] [TRT] sections.3.0.conv3.module.weight + QuantizeLinear_772_quantize_scale_node + Conv_776 + Add_798 + Relu_799 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_medium_nt_v1 Tactic: -4516822589357530549 [03/25/2022-13:24:55] [V] [TRT] Tactic: -4516822589357530549 Time: 0.109056 [03/25/2022-13:24:55] [V] [TRT] sections.3.0.conv3.module.weight + QuantizeLinear_772_quantize_scale_node + Conv_776 + Add_798 + Relu_799 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -4455415102719506646 [03/25/2022-13:24:55] [V] [TRT] Tactic: -4455415102719506646 Time: 0.093696 [03/25/2022-13:24:55] [V] [TRT] sections.3.0.conv3.module.weight + QuantizeLinear_772_quantize_scale_node + Conv_776 + Add_798 + Relu_799 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32_t1r1s1 Tactic: -4425346730823666456 [03/25/2022-13:24:55] [V] [TRT] Tactic: -4425346730823666456 Time: 0.121984 [03/25/2022-13:24:55] [V] [TRT] sections.3.0.conv3.module.weight + QuantizeLinear_772_quantize_scale_node + Conv_776 + Add_798 + Relu_799 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage1_warpsize2x1x1_g1_tensor8x8x16_t1r1s1 Tactic: -4260476497340370474 [03/25/2022-13:24:55] [V] [TRT] Tactic: -4260476497340370474 Time: 0.150656 [03/25/2022-13:24:55] [V] [TRT] sections.3.0.conv3.module.weight + QuantizeLinear_772_quantize_scale_node + Conv_776 + Add_798 + Relu_799 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x2x1_g1_tensor16x8x32_t1r1s1 Tactic: -4182501876984672402 [03/25/2022-13:24:55] [V] [TRT] Tactic: -4182501876984672402 Time: 0.08768 [03/25/2022-13:24:55] [V] [TRT] sections.3.0.conv3.module.weight + QuantizeLinear_772_quantize_scale_node + Conv_776 + Add_798 + Relu_799 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x256x64_stage3_warpsize2x4x1_g1_tensor16x8x32_t1r1s1 Tactic: -4151617293257698859 [03/25/2022-13:24:55] [V] [TRT] Tactic: -4151617293257698859 Time: 0.080768 [03/25/2022-13:24:55] [V] [TRT] sections.3.0.conv3.module.weight + QuantizeLinear_772_quantize_scale_node + Conv_776 + Add_798 + Relu_799 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x128_stage3_warpsize4x2x1_g1_sptensor16x8x64_t1r1s1_no_preds Tactic: -3862908719298381451 [03/25/2022-13:24:55] [V] [TRT] Tactic: -3862908719298381451 Time: 0.079104 [03/25/2022-13:24:55] [V] [TRT] sections.3.0.conv3.module.weight + QuantizeLinear_772_quantize_scale_node + Conv_776 + Add_798 + Relu_799 Set Tactic Name: 
sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -3825889760337461729 [03/25/2022-13:24:55] [V] [TRT] Tactic: -3825889760337461729 Time: 0.13696 [03/25/2022-13:24:55] [V] [TRT] sections.3.0.conv3.module.weight + QuantizeLinear_772_quantize_scale_node + Conv_776 + Add_798 + Relu_799 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_t1r1s1 Tactic: -3613322253849278738 [03/25/2022-13:24:55] [V] [TRT] Tactic: -3613322253849278738 Time: 0.160512 [03/25/2022-13:24:55] [V] [TRT] sections.3.0.conv3.module.weight + QuantizeLinear_772_quantize_scale_node + Conv_776 + Add_798 + Relu_799 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage1_warpsize2x4x1_g1_tensor8x8x16_t1r1s1 Tactic: -3531681826488401618 [03/25/2022-13:24:55] [V] [TRT] Tactic: -3531681826488401618 Time: 0.116992 [03/25/2022-13:24:55] [V] [TRT] sections.3.0.conv3.module.weight + QuantizeLinear_772_quantize_scale_node + Conv_776 + Add_798 + Relu_799 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x16x128_stage3_warpsize1x1x1_g1_sptensor16x8x64_simple_t1r1s1_no_preds Tactic: -3305554949874552860 [03/25/2022-13:24:55] [V] [TRT] Tactic: -3305554949874552860 Time: 0.132096 [03/25/2022-13:24:55] [V] [TRT] sections.3.0.conv3.module.weight + QuantizeLinear_772_quantize_scale_node + Conv_776 + Add_798 + Relu_799 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x192x64_stage3_warpsize4x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -3288585994448820820 [03/25/2022-13:24:55] [V] [TRT] Tactic: -3288585994448820820 Time: 0.082944 [03/25/2022-13:24:55] [V] [TRT] sections.3.0.conv3.module.weight + QuantizeLinear_772_quantize_scale_node + Conv_776 + Add_798 + Relu_799 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x32x64_stage4_warpsize4x1x1_g1_tensor16x8x32_t1r1s1 Tactic: -2754311112012636251 [03/25/2022-13:24:55] [V] [TRT] Tactic: -2754311112012636251 Time: 0.116608 [03/25/2022-13:24:55] [V] [TRT] sections.3.0.conv3.module.weight + QuantizeLinear_772_quantize_scale_node + Conv_776 + Add_798 + Relu_799 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -2432868635536396215 [03/25/2022-13:24:55] [V] [TRT] Tactic: -2432868635536396215 Time: 0.083584 [03/25/2022-13:24:55] [V] [TRT] sections.3.0.conv3.module.weight + QuantizeLinear_772_quantize_scale_node + Conv_776 + Add_798 + Relu_799 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x128_stage3_warpsize2x2x1_g1_sptensor16x8x64_t1r1s1 Tactic: -2352253835013627337 [03/25/2022-13:24:55] [V] [TRT] Tactic: -2352253835013627337 Time: 0.075264 [03/25/2022-13:24:55] [V] [TRT] sections.3.0.conv3.module.weight + QuantizeLinear_772_quantize_scale_node + Conv_776 + Add_798 + Relu_799 Set Tactic Name: 
sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage4_warpsize2x4x1_g1_tensor16x8x32_t1r1s1 Tactic: -2315453944962430928 [03/25/2022-13:24:55] [V] [TRT] Tactic: -2315453944962430928 Time: 0.085888 [03/25/2022-13:24:55] [V] [TRT] sections.3.0.conv3.module.weight + QuantizeLinear_772_quantize_scale_node + Conv_776 + Add_798 + Relu_799 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x128_stage3_warpsize1x1x1_g1_sptensor16x8x64_simple_t1r1s1 Tactic: -2238364958919154661 [03/25/2022-13:24:55] [V] [TRT] Tactic: -2238364958919154661 Time: 0.114048 [03/25/2022-13:24:55] [V] [TRT] sections.3.0.conv3.module.weight + QuantizeLinear_772_quantize_scale_node + Conv_776 + Add_798 + Relu_799 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -1916483171117495388 [03/25/2022-13:24:55] [V] [TRT] Tactic: -1916483171117495388 Time: 0.097408 [03/25/2022-13:24:55] [V] [TRT] sections.3.0.conv3.module.weight + QuantizeLinear_772_quantize_scale_node + Conv_776 + Add_798 + Relu_799 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x16x128_stage3_warpsize1x1x1_g1_sptensor16x8x64_t1r1s1_no_preds Tactic: -1740762957710554518 [03/25/2022-13:24:55] [V] [TRT] Tactic: -1740762957710554518 Time: 0.131968 [03/25/2022-13:24:55] [V] [TRT] sections.3.0.conv3.module.weight + QuantizeLinear_772_quantize_scale_node + Conv_776 + Add_798 + Relu_799 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x96x64_stage4_warpsize4x2x1_g1_tensor16x8x32_t1r1s1 Tactic: -1549742793039499659 [03/25/2022-13:24:55] [V] [TRT] Tactic: -1549742793039499659 Time: 0.102784 [03/25/2022-13:24:55] [V] [TRT] sections.3.0.conv3.module.weight + QuantizeLinear_772_quantize_scale_node + Conv_776 + Add_798 + Relu_799 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r1s1 Tactic: -1494157908358500249 [03/25/2022-13:24:55] [V] [TRT] Tactic: -1494157908358500249 Time: 0.114176 [03/25/2022-13:24:55] [V] [TRT] sections.3.0.conv3.module.weight + QuantizeLinear_772_quantize_scale_node + Conv_776 + Add_798 + Relu_799 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x4x1_g1_tensor16x8x32_t1r1s1 Tactic: -1328736756812546664 [03/25/2022-13:24:55] [V] [TRT] Tactic: -1328736756812546664 Time: 0.091648 [03/25/2022-13:24:55] [V] [TRT] sections.3.0.conv3.module.weight + QuantizeLinear_772_quantize_scale_node + Conv_776 + Add_798 + Relu_799 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x32x64_stage4_warpsize2x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -1006589727652607355 [03/25/2022-13:24:55] [V] [TRT] Tactic: -1006589727652607355 Time: 0.126336 [03/25/2022-13:24:55] [V] [TRT] sections.3.0.conv3.module.weight + QuantizeLinear_772_quantize_scale_node + Conv_776 + Add_798 + Relu_799 Set Tactic Name: 
sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x128_stage3_warpsize1x1x1_g1_sptensor16x8x64_t1r1s1_no_preds Tactic: -619668460699260222
[03/25/2022-13:24:55] [V] [TRT] Tactic: -619668460699260222 Time: 0.11328
[03/25/2022-13:24:55] [V] [TRT] sections.3.0.conv3.module.weight + QuantizeLinear_772_quantize_scale_node + Conv_776 + Add_798 + Relu_799 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x64x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: -405554772060757402
[03/25/2022-13:24:55] [V] [TRT] Tactic: -405554772060757402 Time: 0.08704
[03/25/2022-13:24:55] [V] [TRT] sections.3.0.conv3.module.weight + QuantizeLinear_772_quantize_scale_node + Conv_776 + Add_798 + Relu_799 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: -375949437730908730
[03/25/2022-13:24:55] [V] [TRT] Tactic: -375949437730908730 Time: 0.096384
[03/25/2022-13:24:55] [V] [TRT] sections.3.0.conv3.module.weight + QuantizeLinear_772_quantize_scale_node + Conv_776 + Add_798 + Relu_799 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x128_stage3_warpsize1x2x1_g1_sptensor16x8x64_t1r1s1 Tactic: -233227833606287806
[03/25/2022-13:24:55] [V] [TRT] Tactic: -233227833606287806 Time: 0.088832
[03/25/2022-13:24:55] [V] [TRT] sections.3.0.conv3.module.weight + QuantizeLinear_772_quantize_scale_node + Conv_776 + Add_798 + Relu_799 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -111878368089469751
[03/25/2022-13:24:55] [V] [TRT] Tactic: -111878368089469751 Time: 0.118912
[03/25/2022-13:24:55] [V] [TRT] sections.3.0.conv3.module.weight + QuantizeLinear_772_quantize_scale_node + Conv_776 + Add_798 + Relu_799 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x32x64_stage4_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -19707840769375107
[03/25/2022-13:24:55] [V] [TRT] Tactic: -19707840769375107 Time: 0.111104
[03/25/2022-13:24:55] [V] [TRT] Fastest Tactic: 8524082966802584889 Time: 0.072704
[03/25/2022-13:24:55] [V] [TRT] >>>>>>>>>>>>>>> Chose Runner Type: CaskConvolution Tactic: 8524082966802584889
[03/25/2022-13:24:55] [V] [TRT] =============== Computing costs for
[03/25/2022-13:24:55] [V] [TRT] *************** Autotuning format combination: Int8(25088,49:4,7,1) -> Int8(6272,49:4,7,1) ***************
[03/25/2022-13:24:55] [V] [TRT] --------------- Timing Runner: sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 (CudaDepthwiseConvolution)
[03/25/2022-13:24:55] [V] [TRT] CudaDepthwiseConvolution has no valid tactics for this config, skipping
[03/25/2022-13:24:55] [V] [TRT] --------------- Timing Runner: sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 (FusedConvActConvolution)
[03/25/2022-13:24:55] [V] [TRT] FusedConvActConvolution has no valid tactics for this config, skipping
[03/25/2022-13:24:55] [V] [TRT] --------------- Timing Runner: sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 (CaskConvolution)
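The records above close one complete autotuning pass: for the fused node sections.3.0.conv3.module.weight + QuantizeLinear_772_quantize_scale_node + Conv_776 + Add_798 + Relu_799, the builder timed dense implicit-GEMM, cuDNN, and 2:4 structured-sparse (sm80_xmma_fprop_sparse_conv_*) kernels and kept the fastest, tactic 8524082966802584889 (a dense tilesize96x128x64 implicit-GEMM kernel) at 0.072704 ms. Note that even with --sparsity=enable, the best sparse candidate in this pass (0.074368 ms) loses by under 2 µs, so this particular layer stays dense. The same pattern repeats below for sections.3.1.conv1 (Conv_812 + Relu_814) over each candidate format combination. A full-verbosity build emits thousands of such records, so a small script helps pull out just the per-layer decisions. The sketch below is an illustration, not part of trtexec: it pairs each "Timing Runner" block with the "Fastest Tactic" record that closes it, and the default log path is a hypothetical placeholder for output captured with `trtexec ... --verbose 2>&1 | tee trtexec_verbose.log`.

    import re
    import sys

    # Match the three record types seen in the log above. Scanning the whole
    # text with finditer works whether the log keeps one record per line or
    # has been flowed into long paragraphs.
    RECORD = re.compile(
        r"Timing Runner: (?P<layer>.+?) \((?P<runner>\w+)\)"
        r"|Fastest Tactic: (?P<fastest>-?\d+) Time: (?P<ftime>[0-9.]+)"
        r"|Tactic: (?P<tactic>-?\d+) Time: (?P<time>[0-9.]+)"
    )

    def summarize(text: str) -> None:
        layer, runner, measured = None, None, 0
        for m in RECORD.finditer(text):
            if m.group("layer"):
                # A new timing-runner block starts; reset the tactic counter.
                layer, runner, measured = m.group("layer"), m.group("runner"), 0
            elif m.group("fastest"):
                # "Fastest Tactic" closes the most recent timing-runner block;
                # times are in milliseconds.
                print(f"[{runner}] {layer}")
                print(f"  {measured} tactics timed, fastest "
                      f"{m.group('fastest')} at {m.group('ftime')} ms")
            elif m.group("tactic"):
                measured += 1  # one measured "Tactic: <id> Time: <t>" record

    if __name__ == "__main__":
        path = sys.argv[1] if len(sys.argv) > 1 else "trtexec_verbose.log"
        with open(path, "r", encoding="utf-8", errors="replace") as f:
            summarize(f.read())

Run over a capture of this build, the pass above would print as a single summary line reporting its tactic count and fastest 8524082966802584889 at 0.072704 ms; the name-announcement records ("Set Tactic Name: ... Tactic: <id>") and the "Chose Runner Type" records are deliberately not counted, since only "Tactic: <id> Time: <t>" pairs are actual measurements.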
[03/25/2022-13:24:55] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_medium_nn_v1 Tactic: 175853789719975416 [03/25/2022-13:24:55] [V] [TRT] Tactic: 175853789719975416 Time: 0.301184 [03/25/2022-13:24:55] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_xregs_medium_nn_v1 Tactic: 2171150287007712632 [03/25/2022-13:24:55] [V] [TRT] Tactic: 2171150287007712632 Time: 0.302976 [03/25/2022-13:24:55] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x64_relu_small_nn_v1 Tactic: 2234457234705232274 [03/25/2022-13:24:55] [V] [TRT] Tactic: 2234457234705232274 Time: 0.272256 [03/25/2022-13:24:55] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x64_relu_medium_nn_v1 Tactic: 5834048089706882838 [03/25/2022-13:24:55] [V] [TRT] Tactic: 5834048089706882838 Time: 0.274944 [03/25/2022-13:24:55] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x128_relu_interior_nn_v1 Tactic: 6299962968199310600 [03/25/2022-13:24:55] [V] [TRT] Tactic: 6299962968199310600 Time: 0.259968 [03/25/2022-13:24:55] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_xregs_interior_nn_v1 Tactic: 6341572697076960911 [03/25/2022-13:24:55] [V] [TRT] Tactic: 6341572697076960911 Time: 0.275712 [03/25/2022-13:24:55] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_small_nn_v1 Tactic: -8626990807754934295 [03/25/2022-13:24:55] [V] [TRT] Tactic: -8626990807754934295 Time: 0.299136 [03/25/2022-13:24:55] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x64_relu_interior_nn_v1 Tactic: -8498217049614706532 [03/25/2022-13:24:55] [V] [TRT] Tactic: -8498217049614706532 Time: 0.263168 [03/25/2022-13:24:55] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_xregs_small_nn_v1 Tactic: -7303593854972602201 [03/25/2022-13:24:55] [V] [TRT] Tactic: -7303593854972602201 Time: 0.285056 [03/25/2022-13:24:55] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x128_relu_medium_nn_v1 Tactic: -6585664687867083638 [03/25/2022-13:24:55] [V] [TRT] Tactic: -6585664687867083638 Time: 0.260992 [03/25/2022-13:24:55] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_interior_nn_v1 Tactic: -3326139578711341011 [03/25/2022-13:24:55] [V] [TRT] Tactic: -3326139578711341011 Time: 0.2752 [03/25/2022-13:24:55] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x128_relu_small_nn_v1 Tactic: 
-683636008127039856 [03/25/2022-13:24:56] [V] [TRT] Tactic: -683636008127039856 Time: 0.261376 [03/25/2022-13:24:56] [V] [TRT] Fastest Tactic: 6299962968199310600 Time: 0.259968 [03/25/2022-13:24:56] [V] [TRT] >>>>>>>>>>>>>>> Chose Runner Type: CaskConvolution Tactic: 6299962968199310600 [03/25/2022-13:24:56] [V] [TRT] *************** Autotuning format combination: Int8(25088,49:4,7,1) -> Int8(784,49:32,7,1) *************** [03/25/2022-13:24:56] [V] [TRT] --------------- Timing Runner: sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 (CaskConvolution) [03/25/2022-13:24:56] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_small_c32_nn_v1 Tactic: 1100922622480907544 [03/25/2022-13:24:56] [V] [TRT] Tactic: 1100922622480907544 Time: 0.299264 [03/25/2022-13:24:56] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x128_relu_interior_c32_nn_v1 Tactic: 2855900226702061782 [03/25/2022-13:24:56] [V] [TRT] Tactic: 2855900226702061782 Time: 0.259968 [03/25/2022-13:24:56] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x64_relu_medium_c32_nn_v1 Tactic: 3606311198834416176 [03/25/2022-13:24:56] [V] [TRT] Tactic: 3606311198834416176 Time: 0.274944 [03/25/2022-13:24:56] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x128_relu_small_c32_nn_v1 Tactic: 4325765560739862899 [03/25/2022-13:24:56] [V] [TRT] Tactic: 4325765560739862899 Time: 0.261376 [03/25/2022-13:24:56] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x64_relu_interior_c32_nn_v1 Tactic: 8803458114157674373 [03/25/2022-13:24:56] [V] [TRT] Tactic: 8803458114157674373 Time: 0.262912 [03/25/2022-13:24:56] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_interior_c32_nn_v1 Tactic: -6934773036503365000 [03/25/2022-13:24:56] [V] [TRT] Tactic: -6934773036503365000 Time: 0.275072 [03/25/2022-13:24:56] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_xregs_interior_c32_nn_v1 Tactic: -4431642509665791294 [03/25/2022-13:24:56] [V] [TRT] Tactic: -4431642509665791294 Time: 0.2752 [03/25/2022-13:24:56] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x128_relu_medium_c32_nn_v1 Tactic: -4255737803793506479 [03/25/2022-13:24:56] [V] [TRT] Tactic: -4255737803793506479 Time: 0.261248 [03/25/2022-13:24:56] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_xregs_small_c32_nn_v1 Tactic: -3958182351168863467 [03/25/2022-13:24:56] [V] [TRT] Tactic: -3958182351168863467 Time: 0.2848 [03/25/2022-13:24:56] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: 
ampere_int8x4_icudnn_int8x4_128x32_relu_xregs_medium_c32_nn_v1 Tactic: -3111968753064955248 [03/25/2022-13:24:56] [V] [TRT] Tactic: -3111968753064955248 Time: 0.302464 [03/25/2022-13:24:56] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_medium_c32_nn_v1 Tactic: -1492575840277333548 [03/25/2022-13:24:56] [V] [TRT] Tactic: -1492575840277333548 Time: 0.300928 [03/25/2022-13:24:56] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x64_relu_small_c32_nn_v1 Tactic: -868495160148524802 [03/25/2022-13:24:56] [V] [TRT] Tactic: -868495160148524802 Time: 0.272 [03/25/2022-13:24:56] [V] [TRT] Fastest Tactic: 2855900226702061782 Time: 0.259968 [03/25/2022-13:24:56] [V] [TRT] >>>>>>>>>>>>>>> Chose Runner Type: CaskConvolution Tactic: 2855900226702061782 [03/25/2022-13:24:56] [V] [TRT] *************** Autotuning format combination: Int8(3136,49:32,7,1) -> Int8(784,49:32,7,1) *************** [03/25/2022-13:24:56] [V] [TRT] --------------- Timing Runner: sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 (CudaGroupConvolution) [03/25/2022-13:24:56] [V] [TRT] CudaGroupConvolution has no valid tactics for this config, skipping [03/25/2022-13:24:56] [V] [TRT] --------------- Timing Runner: sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 (CudaDepthwiseConvolution) [03/25/2022-13:24:56] [V] [TRT] CudaDepthwiseConvolution has no valid tactics for this config, skipping [03/25/2022-13:24:56] [V] [TRT] --------------- Timing Runner: sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 (FusedConvActConvolution) [03/25/2022-13:24:56] [V] [TRT] FusedConvActConvolution has no valid tactics for this config, skipping [03/25/2022-13:24:56] [V] [TRT] --------------- Timing Runner: sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 (CaskConvolution) [03/25/2022-13:24:56] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage1_warpsize4x1x1_g1_tensor8x8x16_t1r1s1 Tactic: 68468667201176803 [03/25/2022-13:24:56] [V] [TRT] Tactic: 68468667201176803 Time: 0.084864 [03/25/2022-13:24:56] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x192x64_stage3_warpsize4x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 125145153013230687 [03/25/2022-13:24:56] [V] [TRT] Tactic: 125145153013230687 Time: 0.068352 [03/25/2022-13:24:56] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x128_ldg16_relu_interior_nt_v1 Tactic: 434957160407688216 [03/25/2022-13:24:56] [V] [TRT] Tactic: 434957160407688216 Time: 0.082816 [03/25/2022-13:24:56] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: 
sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage4_warpsize4x2x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: 805889586762897346 [03/25/2022-13:24:56] [V] [TRT] Tactic: 805889586762897346 Time: 0.048 [03/25/2022-13:24:56] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x128_stage3_warpsize4x2x1_g1_sptensor16x8x64_t1r1s1 Tactic: 857001784974286465 [03/25/2022-13:24:56] [V] [TRT] Tactic: 857001784974286465 Time: 0.036736 [03/25/2022-13:24:56] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 1214130898909872671 [03/25/2022-13:24:56] [V] [TRT] Tactic: 1214130898909872671 Time: 0.09728 [03/25/2022-13:24:56] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage4_warpsize4x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 1278425129871930205 [03/25/2022-13:24:56] [V] [TRT] Tactic: 1278425129871930205 Time: 0.046976 [03/25/2022-13:24:56] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 1583811548148740665 [03/25/2022-13:24:56] [V] [TRT] Tactic: 1583811548148740665 Time: 0.074368 [03/25/2022-13:24:56] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 1701344857577810806 [03/25/2022-13:24:56] [V] [TRT] Tactic: 1701344857577810806 Time: 0.06592 [03/25/2022-13:24:56] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x32x64_stage4_warpsize2x1x1_g1_tensor16x8x32_t1r1s1 Tactic: 1797231177354918208 [03/25/2022-13:24:56] [V] [TRT] Tactic: 1797231177354918208 Time: 0.097408 [03/25/2022-13:24:56] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x192x64_stage3_warpsize4x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 2004812516525036381 [03/25/2022-13:24:56] [V] [TRT] Tactic: 2004812516525036381 Time: 0.064512 [03/25/2022-13:24:56] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x128x64_stage3_warpsize4x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 2030033463723799063 
[03/25/2022-13:24:56] [V] [TRT] Tactic: 2030033463723799063 Time: 0.064
[03/25/2022-13:24:56] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: ampere_int8_i8816cudnn_int8_128x128_ldg16_relu_interior_nt_v1 Tactic: 2346437292116182513
[03/25/2022-13:24:56] [V] [TRT] Tactic: 2346437292116182513 Time: 0.084992
[03/25/2022-13:24:56] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 2376898825218218566
[03/25/2022-13:24:56] [V] [TRT] Tactic: 2376898825218218566 Time: 0.04864
[03/25/2022-13:24:56] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_singleBuffer_interior_nt_v1 Tactic: 2522133112320625287
[03/25/2022-13:24:56] [V] [TRT] Tactic: 2522133112320625287 Time: 0.084736
[03/25/2022-13:24:56] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage4_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 2548171972648455240
[03/25/2022-13:24:56] [V] [TRT] Tactic: 2548171972648455240 Time: 0.05056
[03/25/2022-13:24:56] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage1_warpsize2x1x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: 2548946449357458230
[03/25/2022-13:24:56] [V] [TRT] Tactic: 2548946449357458230 Time: 0.107904
[03/25/2022-13:24:56] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x128_stage3_warpsize1x1x1_g1_sptensor16x8x64_simple_t1r1s1_no_preds Tactic: 2570666021825229009
[03/25/2022-13:24:56] [V] [TRT] Tactic: 2570666021825229009 Time: 0.075392
[03/25/2022-13:24:56] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x16x128_stage3_warpsize1x1x1_g1_sptensor16x8x64_simple_t1r1s1 Tactic: 2678520742286844763
[03/25/2022-13:24:56] [V] [TRT] Tactic: 2678520742286844763 Time: 0.102528
[03/25/2022-13:24:56] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 2756291002030759362
[03/25/2022-13:24:56] [V] [TRT] Tactic: 2756291002030759362 Time: 0.059904
[03/25/2022-13:24:56] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x128x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 2972948223367788520
[03/25/2022-13:24:56] [V] [TRT] Tactic: 2972948223367788520 Time: 0.05504
[03/25/2022-13:24:56] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: ampere_int8_i8816cudnn_int8_128x128_ldg16_relu_medium_nt_v1 Tactic: 2985940154541537814
[03/25/2022-13:24:56] [V] [TRT] Tactic: 2985940154541537814 Time: 0.086016
[03/25/2022-13:24:56] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x96x64_stage5_warpsize2x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 3043273137345374664
[03/25/2022-13:24:56] [V] [TRT] Tactic: 3043273137345374664 Time: 0.1024
[03/25/2022-13:24:56] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 3221677093659484230
[03/25/2022-13:24:56] [V] [TRT] Tactic: 3221677093659484230 Time: 0.083456
[03/25/2022-13:24:56] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r1s1 Tactic: 3242897809704328258
[03/25/2022-13:24:56] [V] [TRT] Tactic: 3242897809704328258 Time: 0.091264
[03/25/2022-13:24:56] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: 3312456766204252694
[03/25/2022-13:24:56] [V] [TRT] Tactic: 3312456766204252694 Time: 0.116352
[03/25/2022-13:24:56] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 3538565962642681625
[03/25/2022-13:24:56] [V] [TRT] Tactic: 3538565962642681625 Time: 0.072576
[03/25/2022-13:24:56] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage1_warpsize2x2x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: 3541919052468401776
[03/25/2022-13:24:56] [V] [TRT] Tactic: 3541919052468401776 Time: 0.081536
[03/25/2022-13:24:56] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 3593397928177382100
[03/25/2022-13:24:56] [V] [TRT] Tactic: 3593397928177382100 Time: 0.097792
[03/25/2022-13:24:56] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x64x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 3670282018109435863
[03/25/2022-13:24:56] [V] [TRT] Tactic: 3670282018109435863 Time: 0.0672
[03/25/2022-13:24:56] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 3671413346254027573
[03/25/2022-13:24:56] [V] [TRT] Tactic: 3671413346254027573 Time: 0.064768
[03/25/2022-13:24:56] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x128_ldg16_relu_medium_nt_v1 Tactic: 3899284354987683408
[03/25/2022-13:24:56] [V] [TRT] Tactic: 3899284354987683408 Time: 0.083968
[03/25/2022-13:24:56] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: 3927509214678622419
[03/25/2022-13:24:56] [V] [TRT] Tactic: 3927509214678622419 Time: 0.09024
[03/25/2022-13:24:56] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 4112572034735311841
[03/25/2022-13:24:56] [V] [TRT] Tactic: 4112572034735311841 Time: 0.119808
[03/25/2022-13:24:56] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 4239974928951431644
[03/25/2022-13:24:56] [V] [TRT] Tactic: 4239974928951431644 Time: 0.092672
[03/25/2022-13:24:56] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage6_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 4610760414797216079
[03/25/2022-13:24:56] [V] [TRT] Tactic: 4610760414797216079 Time: 0.062464
[03/25/2022-13:24:56] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_small_nt_v1 Tactic: 4717285412741024953
[03/25/2022-13:24:56] [V] [TRT] Tactic: 4717285412741024953 Time: 0.085504
[03/25/2022-13:24:56] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 4796956614760326119
[03/25/2022-13:24:56] [V] [TRT] Tactic: 4796956614760326119 Time: 0.06592
[03/25/2022-13:24:56] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x128_stage3_warpsize2x2x1_g1_sptensor16x8x64_t1r1s1_no_preds Tactic: 4909502217677847353
[03/25/2022-13:24:56] [V] [TRT] Tactic: 4909502217677847353 Time: 0.039808
[03/25/2022-13:24:56] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 4919361344804309192
[03/25/2022-13:24:56] [V] [TRT] Tactic: 4919361344804309192 Time: 0.098304
[03/25/2022-13:24:56] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x128_stage3_warpsize1x2x1_g1_sptensor16x8x64_simple_t1r1s1 Tactic: 5043674678294309681
[03/25/2022-13:24:56] [V] [TRT] Tactic: 5043674678294309681 Time: 0.061568
[03/25/2022-13:24:56] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_interior_nt_v1 Tactic: 5126565865931538390
[03/25/2022-13:24:56] [V] [TRT] Tactic: 5126565865931538390 Time: 0.086912
[03/25/2022-13:24:56] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x64x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 5204702486885981735
[03/25/2022-13:24:56] [V] [TRT] Tactic: 5204702486885981735 Time: 0.068352
[03/25/2022-13:24:56] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: 5375256703210220108
[03/25/2022-13:24:56] [V] [TRT] Tactic: 5375256703210220108 Time: 0.05888
[03/25/2022-13:24:56] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x128_stage3_warpsize2x2x1_g1_sptensor16x8x64_simple_t1r1s1 Tactic: 5424258848951129084
[03/25/2022-13:24:56] [V] [TRT] Tactic: 5424258848951129084 Time: 0.039808
[03/25/2022-13:24:56] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 5443897483205284103
[03/25/2022-13:24:56] [V] [TRT] Tactic: 5443897483205284103 Time: 0.065536
[03/25/2022-13:24:56] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage4_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 5707566217891294846
[03/25/2022-13:24:56] [V] [TRT] Tactic: 5707566217891294846 Time: 0.051456
[03/25/2022-13:24:56] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x4x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 5986622376339202983
[03/25/2022-13:24:56] [V] [TRT] Tactic: 5986622376339202983 Time: 0.083328
[03/25/2022-13:24:56] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x128x64_stage3_warpsize4x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 6007888770437705057
[03/25/2022-13:24:56] [V] [TRT] Tactic: 6007888770437705057 Time: 0.06656
[03/25/2022-13:24:56] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32_t1r1s1 Tactic: 6405251167055673379
[03/25/2022-13:24:56] [V] [TRT] Tactic: 6405251167055673379 Time: 0.06656
[03/25/2022-13:24:56] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage4_warpsize4x1x1_g1_tensor16x8x32_t1r1s1 Tactic: 6433368103202497147
[03/25/2022-13:24:56] [V] [TRT] Tactic: 6433368103202497147 Time: 0.052608
[03/25/2022-13:24:56] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 6441948709525127755
[03/25/2022-13:24:56] [V] [TRT] Tactic: 6441948709525127755 Time: 0.098048
[03/25/2022-13:24:56] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x256x64_stage3_warpsize2x4x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 6443933097134654777
[03/25/2022-13:24:56] [V] [TRT] Tactic: 6443933097134654777 Time: 0.068608
[03/25/2022-13:24:56] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 6457435868048963632
[03/25/2022-13:24:56] [V] [TRT] Tactic: 6457435868048963632 Time: 0.06272
[03/25/2022-13:24:56] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x128_stage3_warpsize1x1x1_g1_sptensor16x8x64_t1r1s1 Tactic: 6510345569544721081
[03/25/2022-13:24:56] [V] [TRT] Tactic: 6510345569544721081 Time: 0.076288
[03/25/2022-13:24:56] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 6793988781414507278
[03/25/2022-13:24:56] [V] [TRT] Tactic: 6793988781414507278 Time: 0.049408
[03/25/2022-13:24:56] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 6880710371738875469
[03/25/2022-13:24:56] [V] [TRT] Tactic: 6880710371738875469 Time: 0.070656
[03/25/2022-13:24:56] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 6925201228918187099
[03/25/2022-13:24:56] [V] [TRT] Tactic: 6925201228918187099 Time: 0.050048
[03/25/2022-13:24:56] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 6991524515605108718
[03/25/2022-13:24:56] [V] [TRT] Tactic: 6991524515605108718 Time: 0.062336
[03/25/2022-13:24:56] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x4x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 7245509442265271220
[03/25/2022-13:24:56] [V] [TRT] Tactic: 7245509442265271220 Time: 0.081664
[03/25/2022-13:24:56] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x64x64_stage3_warpsize4x1x1_g1_tensor16x8x32_t1r1s1 Tactic: 7318929579222925725
[03/25/2022-13:24:56] [V] [TRT] Tactic: 7318929579222925725 Time: 0.066944
[03/25/2022-13:24:56] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: 7731430299029542276
[03/25/2022-13:24:56] [V] [TRT] Tactic: 7731430299029542276 Time: 0.04928
[03/25/2022-13:24:56] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage4_warpsize2x4x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: 7738495016763012180
[03/25/2022-13:24:56] [V] [TRT] Tactic: 7738495016763012180 Time: 0.049024
[03/25/2022-13:24:56] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x128_stage3_warpsize1x4x1_g1_sptensor16x8x64_t1r1s1_no_preds Tactic: 7886967395128926382
[03/25/2022-13:24:56] [V] [TRT] Tactic: 7886967395128926382 Time: 0.052352
[03/25/2022-13:24:56] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage4_warpsize4x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 8142283985160822229
[03/25/2022-13:24:56] [V] [TRT] Tactic: 8142283985160822229 Time: 0.049024
[03/25/2022-13:24:56] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage4_warpsize2x4x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 8173975624668590862
[03/25/2022-13:24:56] [V] [TRT] Tactic: 8173975624668590862 Time: 0.049792
[03/25/2022-13:24:56] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage4_warpsize4x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 8234775147403903473
[03/25/2022-13:24:56] [V] [TRT] Tactic: 8234775147403903473 Time: 0.04992
[03/25/2022-13:24:56] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x128x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 8524082966802584889
[03/25/2022-13:24:56] [V] [TRT] Tactic: 8524082966802584889 Time: 0.05568
[03/25/2022-13:24:56] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage1_warpsize4x1x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: 8684013308930763400
[03/25/2022-13:24:56] [V] [TRT] Tactic: 8684013308930763400 Time: 0.08384
[03/25/2022-13:24:56] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x128x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 8765382722978397630
[03/25/2022-13:24:56] [V] [TRT] Tactic: 8765382722978397630 Time: 0.055808
[03/25/2022-13:24:56] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 8843193587782643431
[03/25/2022-13:24:56] [V] [TRT] Tactic: 8843193587782643431 Time: 0.072064
[03/25/2022-13:24:56] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x128_stage3_warpsize1x4x1_g1_sptensor16x8x64_simple_t1r1s1 Tactic: 8883810517410230831
[03/25/2022-13:24:56] [V] [TRT] Tactic: 8883810517410230831 Time: 0.052992
[03/25/2022-13:24:56] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 8930797211803511337
[03/25/2022-13:24:56] [V] [TRT] Tactic: 8930797211803511337 Time: 0.06144
[03/25/2022-13:24:56] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 8935070489925739043
[03/25/2022-13:24:56] [V] [TRT] Tactic: 8935070489925739043 Time: 0.0544
[03/25/2022-13:24:56] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x16x128_stage3_warpsize1x1x1_g1_sptensor16x8x64_t1r1s1 Tactic: 9062173295331155069
[03/25/2022-13:24:56] [V] [TRT] Tactic: 9062173295331155069 Time: 0.102528
[03/25/2022-13:24:56] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_singleBuffer_small_nt_v1 Tactic: -9118785798277698619
[03/25/2022-13:24:56] [V] [TRT] Tactic: -9118785798277698619 Time: 0.085504
[03/25/2022-13:24:56] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -8985599729413291927
[03/25/2022-13:24:56] [V] [TRT] Tactic: -8985599729413291927 Time: 0.057856
[03/25/2022-13:24:56] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x32x64_stage4_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -8972697510150675429
[03/25/2022-13:24:56] [V] [TRT] Tactic: -8972697510150675429 Time: 0.074624
[03/25/2022-13:24:56] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r1s1 Tactic: -8943710627305202139
[03/25/2022-13:24:56] [V] [TRT] Tactic: -8943710627305202139 Time: 0.089984
[03/25/2022-13:24:56] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: -8859846367886814331
[03/25/2022-13:24:56] [V] [TRT] Tactic: -8859846367886814331 Time: 0.093184
[03/25/2022-13:24:56] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: -8638624340850784688
[03/25/2022-13:24:56] [V] [TRT] Tactic: -8638624340850784688 Time: 0.073472
[03/25/2022-13:24:56] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage4_warpsize4x1x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: -8556775352640313933
[03/25/2022-13:24:56] [V] [TRT] Tactic: -8556775352640313933 Time: 0.051456
[03/25/2022-13:24:56] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage1_warpsize2x1x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: -8382298409581540699
[03/25/2022-13:24:56] [V] [TRT] Tactic: -8382298409581540699 Time: 0.132864
[03/25/2022-13:24:56] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage1_warpsize4x2x1_g1_tensor8x8x16_t1r1s1 Tactic: -8172318747337038866
[03/25/2022-13:24:56] [V] [TRT] Tactic: -8172318747337038866 Time: 0.081408
[03/25/2022-13:24:56] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x256x64_stage3_warpsize2x4x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -8038164441468184723
[03/25/2022-13:24:56] [V] [TRT] Tactic: -8038164441468184723 Time: 0.065664
[03/25/2022-13:24:56] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x96x64_stage5_warpsize2x1x1_g1_tensor16x8x32_t1r1s1 Tactic: -7844028314176826857
[03/25/2022-13:24:56] [V] [TRT] Tactic: -7844028314176826857 Time: 0.103168
[03/25/2022-13:24:56] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x128_stage3_warpsize2x2x1_g1_sptensor16x8x64_simple_t1r1s1_no_preds Tactic: -7674507941016740570
[03/25/2022-13:24:57] [V] [TRT] Tactic: -7674507941016740570 Time: 0.039808
[03/25/2022-13:24:57] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage4_warpsize2x4x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -7364286662638617917
[03/25/2022-13:24:57] [V] [TRT] Tactic: -7364286662638617917 Time: 0.048256
[03/25/2022-13:24:57] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage1_warpsize2x1x1_g1_tensor8x8x16_t1r1s1 Tactic: -7361755530333096258
[03/25/2022-13:24:57] [V] [TRT] Tactic: -7361755530333096258 Time: 0.108544
[03/25/2022-13:24:57] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x128_stage3_warpsize1x4x1_g1_sptensor16x8x64_t1r1s1 Tactic: -7289760022626653388
[03/25/2022-13:24:57] [V] [TRT] Tactic: -7289760022626653388 Time: 0.053248
[03/25/2022-13:24:57] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage1_warpsize2x4x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: -7106539943789766885
[03/25/2022-13:24:57] [V] [TRT] Tactic: -7106539943789766885 Time: 0.081536
[03/25/2022-13:24:57] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage1_warpsize4x2x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: -6969478418607271266
[03/25/2022-13:24:57] [V] [TRT] Tactic: -6969478418607271266 Time: 0.080896
[03/25/2022-13:24:57] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -6930438165437733000
[03/25/2022-13:24:57] [V] [TRT] Tactic: -6930438165437733000 Time: 0.119808
[03/25/2022-13:24:57] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x64x64_stage3_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -6879607992933502380
[03/25/2022-13:24:57] [V] [TRT] Tactic: -6879607992933502380 Time: 0.065792
[03/25/2022-13:24:57] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x128_stage3_warpsize1x2x1_g1_sptensor16x8x64_simple_t1r1s1_no_preds Tactic: -6839669803644810934
[03/25/2022-13:24:57] [V] [TRT] Tactic: -6839669803644810934 Time: 0.060672
[03/25/2022-13:24:57] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x128_stage3_warpsize1x2x1_g1_sptensor16x8x64_t1r1s1_no_preds Tactic: -6812830108414456369
[03/25/2022-13:24:57] [V] [TRT] Tactic: -6812830108414456369 Time: 0.061056
[03/25/2022-13:24:57] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x128_stage3_warpsize4x2x1_g1_sptensor16x8x64_simple_t1r1s1_no_preds Tactic: -6779804930216439173
[03/25/2022-13:24:57] [V] [TRT] Tactic: -6779804930216439173 Time: 0.03648
[03/25/2022-13:24:57] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: -6527178416855951297
[03/25/2022-13:24:57] [V] [TRT] Tactic: -6527178416855951297 Time: 0.099328
[03/25/2022-13:24:57] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_t1r1s1 Tactic: -6510232214299595844
[03/25/2022-13:24:57] [V] [TRT] Tactic: -6510232214299595844 Time: 0.099328
[03/25/2022-13:24:57] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: ampere_int8_i8816cudnn_int8_128x128_ldg16_relu_small_nt_v1 Tactic: -6400348606759295499
[03/25/2022-13:24:57] [V] [TRT] Tactic: -6400348606759295499 Time: 0.084608
[03/25/2022-13:24:57] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage1_warpsize2x2x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: -6346247605026339453
[03/25/2022-13:24:57] [V] [TRT] Tactic: -6346247605026339453 Time: 0.089472
[03/25/2022-13:24:57] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x96x64_stage4_warpsize4x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -6232597026469067819
[03/25/2022-13:24:57] [V] [TRT] Tactic: -6232597026469067819 Time: 0.067584
[03/25/2022-13:24:57] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x128_ldg16_relu_small_nt_v1 Tactic: -5980889159865208399
[03/25/2022-13:24:57] [V] [TRT] Tactic: -5980889159865208399 Time: 0.082944
[03/25/2022-13:24:57] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_singleBuffer_medium_nt_v1 Tactic: -5766140806760372989
[03/25/2022-13:24:57] [V] [TRT] Tactic: -5766140806760372989 Time: 0.087296
[03/25/2022-13:24:57] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r1s1 Tactic: -5697614955743334137
[03/25/2022-13:24:57] [V] [TRT] Tactic: -5697614955743334137 Time: 0.08192
[03/25/2022-13:24:57] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -5671123121710113970
[03/25/2022-13:24:57] [V] [TRT] Tactic: -5671123121710113970 Time: 0.095616
[03/25/2022-13:24:57] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x32x64_stage4_warpsize2x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -5615581362569252260
[03/25/2022-13:24:57] [V] [TRT] Tactic: -5615581362569252260 Time: 0.096256
[03/25/2022-13:24:57] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x128x64_stage3_warpsize4x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -5562968047117507056
[03/25/2022-13:24:57] [V] [TRT] Tactic: -5562968047117507056 Time: 0.066432
[03/25/2022-13:24:57] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -5516472881360101487
[03/25/2022-13:24:57] [V] [TRT] Tactic: -5516472881360101487 Time: 0.073088
[03/25/2022-13:24:57] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r1s1 Tactic: -5311474420963248369
[03/25/2022-13:24:57] [V] [TRT] Tactic: -5311474420963248369 Time: 0.117248
[03/25/2022-13:24:57] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: -5170003087447722174
[03/25/2022-13:24:57] [V] [TRT] Tactic: -5170003087447722174 Time: 0.097536
[03/25/2022-13:24:57] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage6_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -4889586143772361690
[03/25/2022-13:24:57] [V] [TRT] Tactic: -4889586143772361690 Time: 0.060544
[03/25/2022-13:24:57] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x128_stage3_warpsize1x4x1_g1_sptensor16x8x64_simple_t1r1s1_no_preds Tactic: -4889498558023475527
[03/25/2022-13:24:57] [V] [TRT] Tactic: -4889498558023475527 Time: 0.052352
[03/25/2022-13:24:57] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: -4849712423393454704
[03/25/2022-13:24:57] [V] [TRT] Tactic: -4849712423393454704 Time: 0.061312
[03/25/2022-13:24:57] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x128_stage3_warpsize4x2x1_g1_sptensor16x8x64_simple_t1r1s1 Tactic: -4681913707320020520
[03/25/2022-13:24:57] [V] [TRT] Tactic: -4681913707320020520 Time: 0.036736
[03/25/2022-13:24:57] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_medium_nt_v1 Tactic: -4516822589357530549
[03/25/2022-13:24:57] [V] [TRT] Tactic: -4516822589357530549 Time: 0.08768
[03/25/2022-13:24:57] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -4455415102719506646
[03/25/2022-13:24:57] [V] [TRT] Tactic: -4455415102719506646 Time: 0.072704
[03/25/2022-13:24:57] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32_t1r1s1 Tactic: -4425346730823666456
[03/25/2022-13:24:57] [V] [TRT] Tactic: -4425346730823666456 Time: 0.083712
[03/25/2022-13:24:57] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage1_warpsize2x1x1_g1_tensor8x8x16_t1r1s1 Tactic: -4260476497340370474
[03/25/2022-13:24:57] [V] [TRT] Tactic: -4260476497340370474 Time: 0.133248
[03/25/2022-13:24:57] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x2x1_g1_tensor16x8x32_t1r1s1 Tactic: -4182501876984672402
[03/25/2022-13:24:57] [V] [TRT] Tactic: -4182501876984672402 Time: 0.093568
[03/25/2022-13:24:57] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x256x64_stage3_warpsize2x4x1_g1_tensor16x8x32_t1r1s1 Tactic: -4151617293257698859
[03/25/2022-13:24:57] [V] [TRT] Tactic: -4151617293257698859 Time: 0.06848
[03/25/2022-13:24:57] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x128_stage3_warpsize4x2x1_g1_sptensor16x8x64_t1r1s1_no_preds Tactic: -3862908719298381451
[03/25/2022-13:24:57] [V] [TRT] Tactic: -3862908719298381451 Time: 0.036864
[03/25/2022-13:24:57] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -3825889760337461729
[03/25/2022-13:24:57] [V] [TRT] Tactic: -3825889760337461729 Time: 0.09856
[03/25/2022-13:24:57] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -3797022944823726673
[03/25/2022-13:24:57] [V] [TRT] Tactic: -3797022944823726673 Time: 0.07232
[03/25/2022-13:24:57] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_t1r1s1 Tactic: -3613322253849278738
[03/25/2022-13:24:57] [V] [TRT] Tactic: -3613322253849278738 Time: 0.120192
[03/25/2022-13:24:57] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x32x64_stage4_warpsize4x1x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: -3577322188448771475
[03/25/2022-13:24:57] [V] [TRT] Tactic: -3577322188448771475 Time: 0.075776
[03/25/2022-13:24:57] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage1_warpsize2x4x1_g1_tensor8x8x16_t1r1s1 Tactic: -3531681826488401618
[03/25/2022-13:24:57] [V] [TRT] Tactic: -3531681826488401618 Time: 0.081792
[03/25/2022-13:24:57] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x16x128_stage3_warpsize1x1x1_g1_sptensor16x8x64_simple_t1r1s1_no_preds Tactic: -3305554949874552860
[03/25/2022-13:24:57] [V] [TRT] Tactic: -3305554949874552860 Time: 0.10176
[03/25/2022-13:24:57] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x192x64_stage3_warpsize4x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -3288585994448820820
[03/25/2022-13:24:57] [V] [TRT] Tactic: -3288585994448820820 Time: 0.06784
[03/25/2022-13:24:57] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x32x64_stage4_warpsize4x1x1_g1_tensor16x8x32_t1r1s1 Tactic: -2754311112012636251
[03/25/2022-13:24:57] [V] [TRT] Tactic: -2754311112012636251 Time: 0.076416
[03/25/2022-13:24:57] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -2432868635536396215
[03/25/2022-13:24:57] [V] [TRT] Tactic: -2432868635536396215 Time: 0.074496
[03/25/2022-13:24:57] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x96x64_stage5_warpsize2x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -2379804152300264660
[03/25/2022-13:24:57] [V] [TRT] Tactic: -2379804152300264660 Time: 0.101888
[03/25/2022-13:24:57] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x128_stage3_warpsize2x2x1_g1_sptensor16x8x64_t1r1s1 Tactic: -2352253835013627337
[03/25/2022-13:24:57] [V] [TRT] Tactic: -2352253835013627337 Time: 0.039808
[03/25/2022-13:24:57] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x96x64_stage4_warpsize4x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -2335587136911650799
[03/25/2022-13:24:57] [V] [TRT] Tactic: -2335587136911650799 Time: 0.065024
[03/25/2022-13:24:57] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage4_warpsize2x4x1_g1_tensor16x8x32_t1r1s1 Tactic: -2315453944962430928
[03/25/2022-13:24:57] [V] [TRT] Tactic: -2315453944962430928 Time: 0.050176
[03/25/2022-13:24:57] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x128_stage3_warpsize1x1x1_g1_sptensor16x8x64_simple_t1r1s1 Tactic: -2238364958919154661
[03/25/2022-13:24:57] [V] [TRT] Tactic: -2238364958919154661 Time: 0.075904
[03/25/2022-13:24:57] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -1916483171117495388
[03/25/2022-13:24:57] [V] [TRT] Tactic: -1916483171117495388 Time: 0.062336
[03/25/2022-13:24:57] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x16x128_stage3_warpsize1x1x1_g1_sptensor16x8x64_t1r1s1_no_preds Tactic: -1740762957710554518
[03/25/2022-13:24:57] [V] [TRT] Tactic: -1740762957710554518 Time: 0.1024
[03/25/2022-13:24:57] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x96x64_stage4_warpsize4x2x1_g1_tensor16x8x32_t1r1s1 Tactic: -1549742793039499659
[03/25/2022-13:24:57] [V] [TRT] Tactic: -1549742793039499659 Time: 0.06848
[03/25/2022-13:24:57] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: -1499578657823798783
[03/25/2022-13:24:57] [V] [TRT] Tactic: -1499578657823798783 Time: 0.073088
[03/25/2022-13:24:57] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r1s1 Tactic: -1494157908358500249
[03/25/2022-13:24:57] [V] [TRT] Tactic: -1494157908358500249 Time: 0.093952
[03/25/2022-13:24:57] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x4x1_g1_tensor16x8x32_t1r1s1 Tactic: -1328736756812546664
[03/25/2022-13:24:57] [V] [TRT] Tactic: -1328736756812546664 Time: 0.083456
[03/25/2022-13:24:57] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x32x64_stage4_warpsize2x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -1006589727652607355
[03/25/2022-13:24:57] [V] [TRT] Tactic: -1006589727652607355 Time: 0.095616
[03/25/2022-13:24:57] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: -713022856474991236
[03/25/2022-13:24:57] [V] [TRT] Tactic: -713022856474991236 Time: 0.11968
[03/25/2022-13:24:57] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x128_stage3_warpsize1x1x1_g1_sptensor16x8x64_t1r1s1_no_preds Tactic: -619668460699260222
[03/25/2022-13:24:57] [V] [TRT] Tactic: -619668460699260222 Time: 0.07552
[03/25/2022-13:24:57] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x64x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: -405554772060757402
[03/25/2022-13:24:57] [V] [TRT] Tactic: -405554772060757402 Time: 0.068736
[03/25/2022-13:24:57] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: -375949437730908730
[03/25/2022-13:24:57] [V] [TRT] Tactic: -375949437730908730 Time: 0.05888
[03/25/2022-13:24:57] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x128_stage3_warpsize1x2x1_g1_sptensor16x8x64_t1r1s1 Tactic: -233227833606287806
[03/25/2022-13:24:57] [V] [TRT] Tactic: -233227833606287806 Time: 0.061312
[03/25/2022-13:24:57] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -111878368089469751
[03/25/2022-13:24:57] [V] [TRT] Tactic: -111878368089469751 Time: 0.080768
[03/25/2022-13:24:57] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x64x64_stage3_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -48936598874722005
[03/25/2022-13:24:57] [V] [TRT] Tactic: -48936598874722005 Time: 0.058368
[03/25/2022-13:24:57] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x32x64_stage4_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -19707840769375107
[03/25/2022-13:24:57] [V] [TRT] Tactic: -19707840769375107 Time: 0.075008
[03/25/2022-13:24:57] [V] [TRT] Fastest Tactic: -6779804930216439173 Time: 0.03648
[03/25/2022-13:24:57] [V] [TRT] >>>>>>>>>>>>>>> Chose Runner Type: CaskConvolution Tactic: -6779804930216439173
[03/25/2022-13:24:57] [V] [TRT] =============== Computing costs for
[03/25/2022-13:24:57] [V] [TRT] *************** Autotuning format combination: Int8(6272,49:4,7,1) -> Int8(6272,49:4,7,1) ***************
[03/25/2022-13:24:57] [V] [TRT] --------------- Timing Runner: sections.3.1.conv2.module.weight + QuantizeLinear_823_quantize_scale_node + Conv_827 + Relu_829 (CudaDepthwiseConvolution)
[03/25/2022-13:24:57] [V] [TRT] CudaDepthwiseConvolution has no valid tactics for this config, skipping
[03/25/2022-13:24:57] [V] [TRT] --------------- Timing Runner: sections.3.1.conv2.module.weight + QuantizeLinear_823_quantize_scale_node + Conv_827 + Relu_829 (FusedConvActConvolution)
[03/25/2022-13:24:57] [V] [TRT] FusedConvActConvolution has no valid tactics for this config, skipping
[03/25/2022-13:24:57] [V] [TRT] --------------- Timing Runner: sections.3.1.conv2.module.weight + QuantizeLinear_823_quantize_scale_node + Conv_827 + Relu_829 (CaskConvolution)
[03/25/2022-13:24:57] [V] [TRT] sections.3.1.conv2.module.weight + QuantizeLinear_823_quantize_scale_node + Conv_827 + Relu_829 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_medium_nn_v1 Tactic: 175853789719975416
[03/25/2022-13:24:57] [V] [TRT] Tactic: 175853789719975416 Time: 0.700288
[03/25/2022-13:24:57] [V] [TRT] sections.3.1.conv2.module.weight + QuantizeLinear_823_quantize_scale_node + Conv_827 + Relu_829 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_xregs_medium_nn_v1 Tactic: 2171150287007712632
[03/25/2022-13:24:57] [V] [TRT] Tactic: 2171150287007712632 Time: 0.843264
[03/25/2022-13:24:57] [V] [TRT] sections.3.1.conv2.module.weight + QuantizeLinear_823_quantize_scale_node + Conv_827 + Relu_829 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x64_relu_small_nn_v1 Tactic: 2234457234705232274
[03/25/2022-13:24:57] [V] [TRT] Tactic: 2234457234705232274 Time: 0.591616
[03/25/2022-13:24:57] [V] [TRT] sections.3.1.conv2.module.weight + QuantizeLinear_823_quantize_scale_node + Conv_827 + Relu_829 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x64_relu_medium_nn_v1 Tactic: 5834048089706882838
[03/25/2022-13:24:57] [V] [TRT] Tactic: 5834048089706882838 Time: 0.647808
[03/25/2022-13:24:57] [V] [TRT] sections.3.1.conv2.module.weight + QuantizeLinear_823_quantize_scale_node + Conv_827 + Relu_829 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_small_nn_v1 Tactic: -8626990807754934295
[03/25/2022-13:24:57] [V] [TRT] Tactic: -8626990807754934295 Time: 0.647936
[03/25/2022-13:24:57] [V] [TRT] sections.3.1.conv2.module.weight + QuantizeLinear_823_quantize_scale_node + Conv_827 + Relu_829 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_xregs_small_nn_v1 Tactic: -7303593854972602201
[03/25/2022-13:24:57] [V] [TRT] Tactic: -7303593854972602201 Time: 0.617088
[03/25/2022-13:24:57] [V] [TRT] sections.3.1.conv2.module.weight + QuantizeLinear_823_quantize_scale_node + Conv_827 + Relu_829 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x128_relu_medium_nn_v1 Tactic: -6585664687867083638
[03/25/2022-13:24:57] [V] [TRT] Tactic: -6585664687867083638 Time: 0.586752
[03/25/2022-13:24:57] [V] [TRT] sections.3.1.conv2.module.weight + QuantizeLinear_823_quantize_scale_node + Conv_827 + Relu_829 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x64_relu_xregs_large_nn_v1 Tactic: -3730012925709297561
[03/25/2022-13:24:57] [V] [TRT] Tactic: -3730012925709297561 Time: 0.62016
[03/25/2022-13:24:57] [V] [TRT] sections.3.1.conv2.module.weight + QuantizeLinear_823_quantize_scale_node + Conv_827 + Relu_829 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x128_relu_xregs_large_nn_v1 Tactic: -2277259417488004546
[03/25/2022-13:24:57] [V] [TRT] Tactic: -2277259417488004546 Time: 0.573824
[03/25/2022-13:24:57] [V] [TRT] sections.3.1.conv2.module.weight + QuantizeLinear_823_quantize_scale_node + Conv_827 + Relu_829 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x128_relu_small_nn_v1 Tactic: -683636008127039856
[03/25/2022-13:24:57] [V] [TRT] Tactic: -683636008127039856 Time: 0.5248
[03/25/2022-13:24:57] [V] [TRT] Fastest Tactic: -683636008127039856 Time: 0.5248
[03/25/2022-13:24:57] [V] [TRT] >>>>>>>>>>>>>>> Chose Runner Type: CaskConvolution Tactic: -683636008127039856
[03/25/2022-13:24:57] [V] [TRT] *************** Autotuning format combination: Int8(6272,49:4,7,1) -> Int8(784,49:32,7,1) ***************
[03/25/2022-13:24:57] [V] [TRT] --------------- Timing Runner: sections.3.1.conv2.module.weight + QuantizeLinear_823_quantize_scale_node + Conv_827 + Relu_829 (CaskConvolution)
[03/25/2022-13:24:57] [V] [TRT] sections.3.1.conv2.module.weight + QuantizeLinear_823_quantize_scale_node + Conv_827 + Relu_829 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x64_relu_xregs_large_c32_nn_v1 Tactic: 984309058095623735
[03/25/2022-13:24:57] [V] [TRT] Tactic: 984309058095623735 Time: 0.574848
[03/25/2022-13:24:57] [V] [TRT] sections.3.1.conv2.module.weight + QuantizeLinear_823_quantize_scale_node + Conv_827 + Relu_829 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_small_c32_nn_v1 Tactic: 1100922622480907544
[03/25/2022-13:24:58] [V] [TRT] Tactic: 1100922622480907544 Time: 0.598016
[03/25/2022-13:24:58] [V] [TRT] sections.3.1.conv2.module.weight + QuantizeLinear_823_quantize_scale_node + Conv_827 + Relu_829 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x128_relu_xregs_large_c32_nn_v1 Tactic: 3238312825609165543
[03/25/2022-13:24:58] [V] [TRT] Tactic: 3238312825609165543 Time: 0.573568
[03/25/2022-13:24:58] [V] [TRT] sections.3.1.conv2.module.weight + QuantizeLinear_823_quantize_scale_node + Conv_827 + Relu_829 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x64_relu_medium_c32_nn_v1 Tactic: 3606311198834416176
[03/25/2022-13:24:58] [V] [TRT] Tactic: 3606311198834416176 Time: 0.59904
[03/25/2022-13:24:58] [V] [TRT] sections.3.1.conv2.module.weight + QuantizeLinear_823_quantize_scale_node + Conv_827 + Relu_829 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x128_relu_small_c32_nn_v1 Tactic: 4325765560739862899
[03/25/2022-13:24:58] [V] [TRT] Tactic: 4325765560739862899 Time: 0.525184
[03/25/2022-13:24:58] [V] [TRT] sections.3.1.conv2.module.weight + QuantizeLinear_823_quantize_scale_node + Conv_827 + Relu_829 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x128_relu_medium_c32_nn_v1 Tactic: -4255737803793506479
[03/25/2022-13:24:58] [V] [TRT] Tactic: -4255737803793506479 Time: 0.543488
[03/25/2022-13:24:58] [V] [TRT] sections.3.1.conv2.module.weight + QuantizeLinear_823_quantize_scale_node + Conv_827 + Relu_829 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_xregs_small_c32_nn_v1 Tactic: -3958182351168863467
[03/25/2022-13:24:58] [V] [TRT] Tactic: -3958182351168863467 Time: 0.570752
[03/25/2022-13:24:58] [V] [TRT] sections.3.1.conv2.module.weight + QuantizeLinear_823_quantize_scale_node + Conv_827 + Relu_829 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_xregs_medium_c32_nn_v1 Tactic: -3111968753064955248
[03/25/2022-13:24:58] [V] [TRT] Tactic: -3111968753064955248 Time: 0.77504
[03/25/2022-13:24:58] [V] [TRT] sections.3.1.conv2.module.weight + QuantizeLinear_823_quantize_scale_node + Conv_827 + Relu_829 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x32_relu_medium_c32_nn_v1 Tactic: -1492575840277333548
[03/25/2022-13:24:58] [V] [TRT] Tactic: -1492575840277333548 Time: 0.649856
[03/25/2022-13:24:58] [V] [TRT] sections.3.1.conv2.module.weight + QuantizeLinear_823_quantize_scale_node + Conv_827 + Relu_829 Set Tactic Name: ampere_int8x4_icudnn_int8x4_128x64_relu_small_c32_nn_v1 Tactic: -868495160148524802
[03/25/2022-13:24:58] [V] [TRT] Tactic: -868495160148524802 Time: 0.5472
[03/25/2022-13:24:58] [V] [TRT] Fastest Tactic: 4325765560739862899 Time: 0.525184
[03/25/2022-13:24:58] [V] [TRT] >>>>>>>>>>>>>>> Chose Runner Type: CaskConvolution Tactic: 4325765560739862899
[03/25/2022-13:24:58] [V] [TRT] *************** Autotuning format combination: Int8(784,49:32,7,1) -> Int8(784,49:32,7,1) ***************
[03/25/2022-13:24:58] [V] [TRT] --------------- Timing Runner: sections.3.1.conv2.module.weight + QuantizeLinear_823_quantize_scale_node + Conv_827 + Relu_829 (CudaGroupConvolution)
[03/25/2022-13:24:58] [V] [TRT] CudaGroupConvolution has no valid tactics for this config, skipping
[03/25/2022-13:24:58] [V] [TRT] --------------- Timing Runner: sections.3.1.conv2.module.weight + QuantizeLinear_823_quantize_scale_node + Conv_827 + Relu_829 (CudaDepthwiseConvolution)
[03/25/2022-13:24:58] [V] [TRT] CudaDepthwiseConvolution has no valid tactics for this config, skipping
[03/25/2022-13:24:58] [V] [TRT] --------------- Timing Runner: sections.3.1.conv2.module.weight + QuantizeLinear_823_quantize_scale_node + Conv_827 + Relu_829 (FusedConvActConvolution)
[03/25/2022-13:24:58] [V] [TRT] FusedConvActConvolution has no valid tactics for this config, skipping
[03/25/2022-13:24:58] [V] [TRT] --------------- Timing Runner: sections.3.1.conv2.module.weight + QuantizeLinear_823_quantize_scale_node + Conv_827 + Relu_829 (CaskConvolution)
[03/25/2022-13:24:58] [V] [TRT] sections.3.1.conv2.module.weight + QuantizeLinear_823_quantize_scale_node + Conv_827 + Relu_829 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_epifadd Tactic: 177040020707947851
[03/25/2022-13:24:58] [V] [TRT] Tactic: 177040020707947851 Time: 0.166912
[03/25/2022-13:24:58] [V] [TRT] sections.3.1.conv2.module.weight + QuantizeLinear_823_quantize_scale_node + Conv_827 + Relu_829 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r3s3 Tactic: 184229963126259101
[03/25/2022-13:24:58] [V] [TRT] Tactic: 184229963126259101 Time: 0.126208
[03/25/2022-13:24:58] [V] [TRT] sections.3.1.conv2.module.weight + QuantizeLinear_823_quantize_scale_node + Conv_827 + Relu_829 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage1_warpsize4x2x1_g1_tensor8x8x16_t1r3s3 Tactic: 289888059097454627
[03/25/2022-13:24:58] [V] [TRT] Tactic: 289888059097454627 Time: 0.149632
[03/25/2022-13:24:58] [V] [TRT] sections.3.1.conv2.module.weight + QuantizeLinear_823_quantize_scale_node + Conv_827 + Relu_829 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32 Tactic: 328135613486708155
[03/25/2022-13:24:58] [V] [TRT] Tactic: 328135613486708155 Time: 0.264192
[03/25/2022-13:24:58] [V] [TRT] sections.3.1.conv2.module.weight + QuantizeLinear_823_quantize_scale_node + Conv_827 + Relu_829 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r3s3 Tactic: 680740992583869928
[03/25/2022-13:24:58] [V] [TRT] Tactic: 680740992583869928 Time: 0.164608
[03/25/2022-13:24:58] [V] [TRT] sections.3.1.conv2.module.weight + QuantizeLinear_823_quantize_scale_node + Conv_827 + Relu_829 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x192x64_stage3_warpsize4x2x1_g1_tensor16x8x32 Tactic: 1111159740952609683
[03/25/2022-13:24:58] [V] [TRT] Tactic: 1111159740952609683 Time: 0.12352
[03/25/2022-13:24:58] [V] [TRT] sections.3.1.conv2.module.weight + QuantizeLinear_823_quantize_scale_node + Conv_827 + Relu_829 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32 Tactic: 1134860903395928905
[03/25/2022-13:24:58] [V] [TRT] Tactic: 1134860903395928905 Time: 0.116224
[03/25/2022-13:24:58] [V] [TRT] sections.3.1.conv2.module.weight + QuantizeLinear_823_quantize_scale_node + Conv_827 + Relu_829 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32 Tactic: 1276591930377039442
[03/25/2022-13:24:58] [V] [TRT] Tactic: 1276591930377039442 Time: 0.128384
[03/25/2022-13:24:58] [V] [TRT] sections.3.1.conv2.module.weight + QuantizeLinear_823_quantize_scale_node + Conv_827 + Relu_829 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage1_warpsize2x1x1_g1_tensor8x8x16_t1r3s3 Tactic: 1388866374720163187
[03/25/2022-13:24:58] [V] [TRT] Tactic: 1388866374720163187 Time: 0.198016
[03/25/2022-13:24:58] [V] [TRT] sections.3.1.conv2.module.weight + QuantizeLinear_823_quantize_scale_node + Conv_827 + Relu_829 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x96x64_stage5_warpsize2x1x1_g1_tensor16x8x32 Tactic: 1399501420456320585
[03/25/2022-13:24:58] [V] [TRT] Tactic: 1399501420456320585 Time: 0.15552
[03/25/2022-13:24:58] [V] [TRT] sections.3.1.conv2.module.weight + QuantizeLinear_823_quantize_scale_node + Conv_827 + Relu_829 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_epifadd Tactic: 1550399266192842845
[03/25/2022-13:24:58] [V] [TRT] Tactic: 1550399266192842845 Time: 0.15424
[03/25/2022-13:24:58] [V] [TRT] sections.3.1.conv2.module.weight + QuantizeLinear_823_quantize_scale_node + Conv_827 + Relu_829 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x32x64_stage4_warpsize4x1x1_g1_tensor16x8x32_t1r3s3_epifadd Tactic: 1572887561103143487
[03/25/2022-13:24:58] [V] [TRT] Tactic: 1572887561103143487 Time: 0.121728
[03/25/2022-13:24:58] [V] [TRT] sections.3.1.conv2.module.weight + QuantizeLinear_823_quantize_scale_node + Conv_827 + Relu_829 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x96x64_stage5_warpsize2x1x1_g1_tensor16x8x32_t1r3s3 Tactic: 1853122447892949466
[03/25/2022-13:24:58] [V] [TRT] Tactic: 1853122447892949466 Time: 0.155776
[03/25/2022-13:24:58] [V] [TRT] sections.3.1.conv2.module.weight + QuantizeLinear_823_quantize_scale_node + Conv_827 + Relu_829 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage1_warpsize2x2x1_g1_tensor8x8x16 Tactic: 2133329569091732311
[03/25/2022-13:24:58] [V] [TRT] Tactic: 2133329569091732311 Time: 0.153472
[03/25/2022-13:24:58] [V] [TRT] sections.3.1.conv2.module.weight + QuantizeLinear_823_quantize_scale_node + Conv_827 + Relu_829 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage4_warpsize4x1x1_g1_tensor16x8x32 Tactic: 2325023763229477890
[03/25/2022-13:24:58] [V] [TRT] Tactic: 2325023763229477890 Time: 0.083968
[03/25/2022-13:24:58] [V] [TRT] sections.3.1.conv2.module.weight + QuantizeLinear_823_quantize_scale_node + Conv_827 + Relu_829 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16 Tactic: 2579824863892891529
[03/25/2022-13:24:58] [V] [TRT] Tactic: 2579824863892891529 Time: 0.216576
[03/25/2022-13:24:58] [V] [TRT] sections.3.1.conv2.module.weight + QuantizeLinear_823_quantize_scale_node + Conv_827 + Relu_829 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x64x64_stage3_warpsize4x1x1_g1_tensor16x8x32 Tactic: 2783960536172159663
[03/25/2022-13:24:58] [V] [TRT] Tactic: 2783960536172159663 Time: 0.09728
[03/25/2022-13:24:58] [V] [TRT] sections.3.1.conv2.module.weight + QuantizeLinear_823_quantize_scale_node + Conv_827 + Relu_829 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x2x1_g1_tensor16x8x32 Tactic: 2821711838552913693
[03/25/2022-13:24:58] [V] [TRT] Tactic: 2821711838552913693 Time: 0.136192
[03/25/2022-13:24:58] [V] [TRT]
sections.3.1.conv2.module.weight + QuantizeLinear_823_quantize_scale_node + Conv_827 + Relu_829 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x128_stage3_warpsize1x2x1_g1_sptensor16x8x64 Tactic: 2945009978756227538 [03/25/2022-13:24:58] [V] [TRT] Tactic: 2945009978756227538 Time: 0.086912 [03/25/2022-13:24:58] [V] [TRT] sections.3.1.conv2.module.weight + QuantizeLinear_823_quantize_scale_node + Conv_827 + Relu_829 Set Tactic Name: ampere_int8_i8816cudnn_int8_128x128_ldg16_relu_medium_nt_v1 Tactic: 2985940154541537814 [03/25/2022-13:24:58] [V] [TRT] Tactic: 2985940154541537814 Time: 0.164608 [03/25/2022-13:24:58] [V] [TRT] sections.3.1.conv2.module.weight + QuantizeLinear_823_quantize_scale_node + Conv_827 + Relu_829 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_t1r3s3_epifadd Tactic: 3284282970967328046 [03/25/2022-13:24:58] [V] [TRT] Tactic: 3284282970967328046 Time: 0.182912 [03/25/2022-13:24:58] [V] [TRT] sections.3.1.conv2.module.weight + QuantizeLinear_823_quantize_scale_node + Conv_827 + Relu_829 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_t1r3s3 Tactic: 3401614690060226673 [03/25/2022-13:24:58] [V] [TRT] Tactic: 3401614690060226673 Time: 0.150656 [03/25/2022-13:24:58] [V] [TRT] sections.3.1.conv2.module.weight + QuantizeLinear_823_quantize_scale_node + Conv_827 + Relu_829 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32 Tactic: 3456719996792527006 [03/25/2022-13:24:58] [V] [TRT] Tactic: 3456719996792527006 Time: 0.137472 [03/25/2022-13:24:58] [V] [TRT] sections.3.1.conv2.module.weight + QuantizeLinear_823_quantize_scale_node + Conv_827 + Relu_829 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r3s3_epifadd Tactic: 3512426920013359699 [03/25/2022-13:24:58] [V] [TRT] Tactic: 3512426920013359699 Time: 0.119296 [03/25/2022-13:24:58] [V] [TRT] sections.3.1.conv2.module.weight + QuantizeLinear_823_quantize_scale_node + Conv_827 + Relu_829 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x128_stage3_warpsize4x2x1_g1_sptensor16x8x64 Tactic: 3651043333819148268 [03/25/2022-13:24:58] [V] [TRT] Tactic: 3651043333819148268 Time: 0.065792 [03/25/2022-13:24:58] [V] [TRT] sections.3.1.conv2.module.weight + QuantizeLinear_823_quantize_scale_node + Conv_827 + Relu_829 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x128_ldg16_relu_medium_nt_v1 Tactic: 3899284354987683408 [03/25/2022-13:24:58] [V] [TRT] Tactic: 3899284354987683408 Time: 0.156928 [03/25/2022-13:24:58] [V] [TRT] sections.3.1.conv2.module.weight + QuantizeLinear_823_quantize_scale_node + Conv_827 + Relu_829 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage6_warpsize2x2x1_g1_tensor16x8x32 Tactic: 4042202769383439184 [03/25/2022-13:24:58] [V] [TRT] Tactic: 4042202769383439184 Time: 0.092032 [03/25/2022-13:24:58] [V] [TRT] sections.3.1.conv2.module.weight + 
QuantizeLinear_823_quantize_scale_node + Conv_827 + Relu_829 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_large_nt_v1 Tactic: 4182625619810185112 [03/25/2022-13:24:58] [V] [TRT] Tactic: 4182625619810185112 Time: 0.18112 [03/25/2022-13:24:58] [V] [TRT] sections.3.1.conv2.module.weight + QuantizeLinear_823_quantize_scale_node + Conv_827 + Relu_829 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r3s3 Tactic: 4214794893922618058 [03/25/2022-13:24:58] [V] [TRT] Tactic: 4214794893922618058 Time: 0.152576 [03/25/2022-13:24:58] [V] [TRT] sections.3.1.conv2.module.weight + QuantizeLinear_823_quantize_scale_node + Conv_827 + Relu_829 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x32x64_stage4_warpsize4x1x1_g1_tensor16x8x32_epifadd Tactic: 4259547356717612415 [03/25/2022-13:24:58] [V] [TRT] Tactic: 4259547356717612415 Time: 0.123648 [03/25/2022-13:24:58] [V] [TRT] sections.3.1.conv2.module.weight + QuantizeLinear_823_quantize_scale_node + Conv_827 + Relu_829 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r3s3 Tactic: 4384868749799132354 [03/25/2022-13:24:58] [V] [TRT] Tactic: 4384868749799132354 Time: 0.216448 [03/25/2022-13:24:58] [V] [TRT] sections.3.1.conv2.module.weight + QuantizeLinear_823_quantize_scale_node + Conv_827 + Relu_829 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x128_stage3_warpsize2x2x1_g1_sptensor16x8x64 Tactic: 4414594337986714263 [03/25/2022-13:24:58] [V] [TRT] Tactic: 4414594337986714263 Time: 0.06784 [03/25/2022-13:24:58] [V] [TRT] sections.3.1.conv2.module.weight + QuantizeLinear_823_quantize_scale_node + Conv_827 + Relu_829 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_small_nt_v1 Tactic: 4717285412741024953 [03/25/2022-13:24:58] [V] [TRT] Tactic: 4717285412741024953 Time: 0.157952 [03/25/2022-13:24:58] [V] [TRT] sections.3.1.conv2.module.weight + QuantizeLinear_823_quantize_scale_node + Conv_827 + Relu_829 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage4_warpsize2x4x1_g1_tensor16x8x32 Tactic: 4734519122557206480 [03/25/2022-13:24:58] [V] [TRT] Tactic: 4734519122557206480 Time: 0.081024 [03/25/2022-13:24:58] [V] [TRT] sections.3.1.conv2.module.weight + QuantizeLinear_823_quantize_scale_node + Conv_827 + Relu_829 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32 Tactic: 4922297020351187339 [03/25/2022-13:24:58] [V] [TRT] Tactic: 4922297020351187339 Time: 0.10176 [03/25/2022-13:24:58] [V] [TRT] sections.3.1.conv2.module.weight + QuantizeLinear_823_quantize_scale_node + Conv_827 + Relu_829 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage1_warpsize2x4x1_g1_tensor8x8x16_t1r3s3 Tactic: 4931167631624420067 [03/25/2022-13:24:58] [V] [TRT] Tactic: 4931167631624420067 Time: 0.1504 [03/25/2022-13:24:58] [V] [TRT] sections.3.1.conv2.module.weight + QuantizeLinear_823_quantize_scale_node + Conv_827 + Relu_829 Set Tactic Name: 
sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage4_warpsize2x4x1_g1_tensor16x8x32_epifadd Tactic: 5121596860264626879 [03/25/2022-13:24:58] [V] [TRT] Tactic: 5121596860264626879 Time: 0.079872 [03/25/2022-13:24:58] [V] [TRT] sections.3.1.conv2.module.weight + QuantizeLinear_823_quantize_scale_node + Conv_827 + Relu_829 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_t1r3s3 Tactic: 5136656982162849059 [03/25/2022-13:24:58] [V] [TRT] Tactic: 5136656982162849059 Time: 0.183424 [03/25/2022-13:24:58] [V] [TRT] sections.3.1.conv2.module.weight + QuantizeLinear_823_quantize_scale_node + Conv_827 + Relu_829 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage6_warpsize2x2x1_g1_tensor16x8x32_epifadd Tactic: 5158259316594207439 [03/25/2022-13:24:58] [V] [TRT] Tactic: 5158259316594207439 Time: 0.092416 [03/25/2022-13:24:58] [V] [TRT] sections.3.1.conv2.module.weight + QuantizeLinear_823_quantize_scale_node + Conv_827 + Relu_829 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32_t1r3s3 Tactic: 5189825015507701541 [03/25/2022-13:24:58] [V] [TRT] Tactic: 5189825015507701541 Time: 0.290176 [03/25/2022-13:24:58] [V] [TRT] sections.3.1.conv2.module.weight + QuantizeLinear_823_quantize_scale_node + Conv_827 + Relu_829 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x32x64_stage4_warpsize2x1x1_g1_tensor16x8x32 Tactic: 5424417905073460656 [03/25/2022-13:24:58] [V] [TRT] Tactic: 5424417905073460656 Time: 0.136704 [03/25/2022-13:24:58] [V] [TRT] sections.3.1.conv2.module.weight + QuantizeLinear_823_quantize_scale_node + Conv_827 + Relu_829 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x256x64_stage3_warpsize2x4x1_g1_tensor16x8x32 Tactic: 5442043907221427810 [03/25/2022-13:24:58] [V] [TRT] Tactic: 5442043907221427810 Time: 0.119808 [03/25/2022-13:24:58] [V] [TRT] sections.3.1.conv2.module.weight + QuantizeLinear_823_quantize_scale_node + Conv_827 + Relu_829 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x256x64_stage3_warpsize2x4x1_g1_tensor16x8x32_t1r3s3 Tactic: 5544365258913999384 [03/25/2022-13:24:58] [V] [TRT] Tactic: 5544365258913999384 Time: 0.118016 [03/25/2022-13:24:58] [V] [TRT] sections.3.1.conv2.module.weight + QuantizeLinear_823_quantize_scale_node + Conv_827 + Relu_829 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage1_warpsize2x1x1_g1_tensor8x8x16_t1r3s3 Tactic: 5641967928706599451 [03/25/2022-13:24:58] [V] [TRT] Tactic: 5641967928706599451 Time: 0.250496 [03/25/2022-13:24:58] [V] [TRT] sections.3.1.conv2.module.weight + QuantizeLinear_823_quantize_scale_node + Conv_827 + Relu_829 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x2x1_g1_tensor16x8x32_t1r3s3 Tactic: 5721595115357140131 [03/25/2022-13:24:58] [V] [TRT] Tactic: 5721595115357140131 Time: 0.127872 
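Every autotuning sweep in this log has the same shape: a "Set Tactic Name: <kernel> Tactic: <id>" record announces a candidate kernel, the matching "Tactic: <id> Time: <ms>" record gives its measured runtime, and the sweep closes with "Fastest Tactic" and "Chose Runner Type" records naming the winner. A minimal sketch for pulling those measurements out of a captured log; the filename trtexec.log and the report format are illustrative assumptions, and it assumes one log record per line as trtexec normally emits:

    import re
    from collections import defaultdict

    # "Tactic: <id> Time: <ms>" measurement records; the "Fastest Tactic"
    # summary line matches too, which is harmless since only the minimum
    # is reported below.
    pair_re = re.compile(r"Tactic: (-?\d+) Time: ([0-9.]+)")
    # "--------------- Timing Runner: <layer> (<runner>)" header records.
    runner_re = re.compile(r"Timing Runner: (.+?) \((\w+)\)")

    timings = defaultdict(list)            # (layer, runner) -> [(ms, tactic)]
    current = ("<setup>", "<none>")

    with open("trtexec.log") as f:
        for line in f:
            m = runner_re.search(line)
            if m:
                current = (m.group(1), m.group(2))
            for tactic, ms in pair_re.findall(line):
                timings[current].append((float(ms), int(tactic)))

    for (layer, runner), pairs in sorted(timings.items()):
        ms, tactic = min(pairs)            # fastest measured candidate
        print(f"{ms:9.6f} ms  tactic {tactic:>22}  {runner}  {layer[:60]}")

Run against a log like this one, the minimum reported per timing block should agree with the "Fastest Tactic" lines the builder prints itself.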
[03/25/2022-13:24:58] [V] [TRT] sections.3.1.conv2.module.weight + QuantizeLinear_823_quantize_scale_node + Conv_827 + Relu_829 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage4_warpsize4x1x1_g1_tensor16x8x32_t1r3s3_epifadd Tactic: 5966973378912044513 [03/25/2022-13:24:58] [V] [TRT] Tactic: 5966973378912044513 Time: 0.083968 [03/25/2022-13:24:58] [V] [TRT] sections.3.1.conv2.module.weight + QuantizeLinear_823_quantize_scale_node + Conv_827 + Relu_829 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x32x64_stage4_warpsize4x1x1_g1_tensor16x8x32_t1r3s3 Tactic: 6004789655466615912 [03/25/2022-13:24:58] [V] [TRT] Tactic: 6004789655466615912 Time: 0.121856 [03/25/2022-13:24:58] [V] [TRT] sections.3.1.conv2.module.weight + QuantizeLinear_823_quantize_scale_node + Conv_827 + Relu_829 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage4_warpsize2x4x1_g1_tensor16x8x32_t1r3s3 Tactic: 6146901278630392829 [03/25/2022-13:24:58] [V] [TRT] Tactic: 6146901278630392829 Time: 0.081152 [03/25/2022-13:24:58] [V] [TRT] sections.3.1.conv2.module.weight + QuantizeLinear_823_quantize_scale_node + Conv_827 + Relu_829 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage1_warpsize2x1x1_g1_tensor8x8x16 Tactic: 6394572396369862482 [03/25/2022-13:24:58] [V] [TRT] Tactic: 6394572396369862482 Time: 0.251648 [03/25/2022-13:24:58] [V] [TRT] sections.3.1.conv2.module.weight + QuantizeLinear_823_quantize_scale_node + Conv_827 + Relu_829 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage4_warpsize4x2x1_g1_tensor16x8x32_epifadd Tactic: 6434020722187266170 [03/25/2022-13:24:58] [V] [TRT] Tactic: 6434020722187266170 Time: 0.079872 [03/25/2022-13:24:58] [V] [TRT] sections.3.1.conv2.module.weight + QuantizeLinear_823_quantize_scale_node + Conv_827 + Relu_829 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32 Tactic: 6781129591847482048 [03/25/2022-13:24:59] [V] [TRT] Tactic: 6781129591847482048 Time: 0.09792 [03/25/2022-13:24:59] [V] [TRT] sections.3.1.conv2.module.weight + QuantizeLinear_823_quantize_scale_node + Conv_827 + Relu_829 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x96x64_stage4_warpsize4x2x1_g1_tensor16x8x32_t1r3s3 Tactic: 6984451771200230840 [03/25/2022-13:24:59] [V] [TRT] Tactic: 6984451771200230840 Time: 0.119936 [03/25/2022-13:24:59] [V] [TRT] sections.3.1.conv2.module.weight + QuantizeLinear_823_quantize_scale_node + Conv_827 + Relu_829 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r3s3 Tactic: 7048234086361926570 [03/25/2022-13:24:59] [V] [TRT] Tactic: 7048234086361926570 Time: 0.174336 [03/25/2022-13:24:59] [V] [TRT] sections.3.1.conv2.module.weight + QuantizeLinear_823_quantize_scale_node + Conv_827 + Relu_829 Set Tactic Name: 
sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x128x64_stage3_warpsize4x2x1_g1_tensor16x8x32 Tactic: 7077570591813340966 [03/25/2022-13:24:59] [V] [TRT] Tactic: 7077570591813340966 Time: 0.116864 [03/25/2022-13:24:59] [V] [TRT] sections.3.1.conv2.module.weight + QuantizeLinear_823_quantize_scale_node + Conv_827 + Relu_829 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r3s3_epifadd Tactic: 7191893591576074000 [03/25/2022-13:24:59] [V] [TRT] Tactic: 7191893591576074000 Time: 0.152832 [03/25/2022-13:24:59] [V] [TRT] sections.3.1.conv2.module.weight + QuantizeLinear_823_quantize_scale_node + Conv_827 + Relu_829 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x4x1_g1_tensor16x8x32_t1r3s3 Tactic: 7429976449747682901 [03/25/2022-13:24:59] [V] [TRT] Tactic: 7429976449747682901 Time: 0.10688 [03/25/2022-13:24:59] [V] [TRT] sections.3.1.conv2.module.weight + QuantizeLinear_823_quantize_scale_node + Conv_827 + Relu_829 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r3s3_epifadd Tactic: 7438984192263206338 [03/25/2022-13:24:59] [V] [TRT] Tactic: 7438984192263206338 Time: 0.091648 [03/25/2022-13:24:59] [V] [TRT] sections.3.1.conv2.module.weight + QuantizeLinear_823_quantize_scale_node + Conv_827 + Relu_829 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32_epifadd Tactic: 7504901284678552178 [03/25/2022-13:24:59] [V] [TRT] Tactic: 7504901284678552178 Time: 0.081792 [03/25/2022-13:24:59] [V] [TRT] sections.3.1.conv2.module.weight + QuantizeLinear_823_quantize_scale_node + Conv_827 + Relu_829 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r3s3 Tactic: 8096257414008860171 [03/25/2022-13:24:59] [V] [TRT] Tactic: 8096257414008860171 Time: 0.096256 [03/25/2022-13:24:59] [V] [TRT] sections.3.1.conv2.module.weight + QuantizeLinear_823_quantize_scale_node + Conv_827 + Relu_829 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x64x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r3s3 Tactic: 8128112048355596715 [03/25/2022-13:24:59] [V] [TRT] Tactic: 8128112048355596715 Time: 0.097408 [03/25/2022-13:24:59] [V] [TRT] sections.3.1.conv2.module.weight + QuantizeLinear_823_quantize_scale_node + Conv_827 + Relu_829 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_indexed_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32_epifadd Tactic: 8751622450593766232 [03/25/2022-13:24:59] [V] [TRT] Tactic: 8751622450593766232 Time: 0.089472 [03/25/2022-13:24:59] [V] [TRT] sections.3.1.conv2.module.weight + QuantizeLinear_823_quantize_scale_node + Conv_827 + Relu_829 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_indexed_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32 Tactic: 9064458886956700976 [03/25/2022-13:24:59] [V] [TRT] Tactic: 
9064458886956700976 Time: 0.089856 [03/25/2022-13:24:59] [V] [TRT] sections.3.1.conv2.module.weight + QuantizeLinear_823_quantize_scale_node + Conv_827 + Relu_829 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_t1r3s3_epifadd Tactic: 9143438935315839085 [03/25/2022-13:24:59] [V] [TRT] Tactic: 9143438935315839085 Time: 0.151168 [03/25/2022-13:24:59] [V] [TRT] sections.3.1.conv2.module.weight + QuantizeLinear_823_quantize_scale_node + Conv_827 + Relu_829 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage4_warpsize4x2x1_g1_tensor16x8x32 Tactic: -9165697322068360861 [03/25/2022-13:24:59] [V] [TRT] Tactic: -9165697322068360861 Time: 0.080896 [03/25/2022-13:24:59] [V] [TRT] sections.3.1.conv2.module.weight + QuantizeLinear_823_quantize_scale_node + Conv_827 + Relu_829 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_singleBuffer_small_nt_v1 Tactic: -9118785798277698619 [03/25/2022-13:24:59] [V] [TRT] Tactic: -9118785798277698619 Time: 0.153344 [03/25/2022-13:24:59] [V] [TRT] sections.3.1.conv2.module.weight + QuantizeLinear_823_quantize_scale_node + Conv_827 + Relu_829 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16 Tactic: -9108166971364503411 [03/25/2022-13:24:59] [V] [TRT] Tactic: -9108166971364503411 Time: 0.176128 [03/25/2022-13:24:59] [V] [TRT] sections.3.1.conv2.module.weight + QuantizeLinear_823_quantize_scale_node + Conv_827 + Relu_829 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage1_warpsize4x2x1_g1_tensor8x8x16 Tactic: -8861822316054763526 [03/25/2022-13:24:59] [V] [TRT] Tactic: -8861822316054763526 Time: 0.150272 [03/25/2022-13:24:59] [V] [TRT] sections.3.1.conv2.module.weight + QuantizeLinear_823_quantize_scale_node + Conv_827 + Relu_829 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x128_stage3_warpsize1x1x1_g1_sptensor16x8x64 Tactic: -8791277710877987710 [03/25/2022-13:24:59] [V] [TRT] Tactic: -8791277710877987710 Time: 0.10752 [03/25/2022-13:24:59] [V] [TRT] sections.3.1.conv2.module.weight + QuantizeLinear_823_quantize_scale_node + Conv_827 + Relu_829 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x128x64_stage3_warpsize2x2x1_g1_tensor16x8x32 Tactic: -8691377209893505057 [03/25/2022-13:24:59] [V] [TRT] Tactic: -8691377209893505057 Time: 0.09088 [03/25/2022-13:24:59] [V] [TRT] sections.3.1.conv2.module.weight + QuantizeLinear_823_quantize_scale_node + Conv_827 + Relu_829 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32 Tactic: -8520292213102999339 [03/25/2022-13:24:59] [V] [TRT] Tactic: -8520292213102999339 Time: 0.104832 [03/25/2022-13:24:59] [V] [TRT] sections.3.1.conv2.module.weight + QuantizeLinear_823_quantize_scale_node + Conv_827 + Relu_829 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage1_warpsize4x1x1_g1_tensor8x8x16_t1r3s3 Tactic: -8475551154769412306 
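The kernels named sm80_xmma_fprop_sparse_conv_..._sptensor16x8x64 in this sweep are Ampere 2:4 structured-sparsity tactics; they only enter the competition because the build was run with --sparsity=enable, and for this layer one of them wins outright (0.065792 ms for the 256x128x128 sptensor tile versus roughly 0.079 ms for the best dense tile). For reference, a rough TensorRT 8.x Python builder-API equivalent of those trtexec flags; this is a sketch under the assumption that the ONNX input shape is static, not the exact code path trtexec takes internally:

    import tensorrt as trt

    logger = trt.Logger(trt.Logger.VERBOSE)
    builder = trt.Builder(logger)
    network = builder.create_network(
        1 << int(trt.NetworkDefinitionCreationFlag.EXPLICIT_BATCH))
    parser = trt.OnnxParser(network, logger)

    with open("resnet50_quant_sparse.onnx", "rb") as f:
        if not parser.parse(f.read()):
            raise RuntimeError(parser.get_error(0))

    config = builder.create_builder_config()
    config.set_flag(trt.BuilderFlag.INT8)            # Q/DQ nodes carry the scales
    config.set_flag(trt.BuilderFlag.SPARSE_WEIGHTS)  # allow 2:4 sparse tactics
    # If the ONNX model had a dynamic batch dimension, an optimization
    # profile for input 128x3x224x224 would also be required (omitted here).

    engine_bytes = builder.build_serialized_network(network, config)
    with open("resnet50_quant_sparse.engine", "wb") as f:
        f.write(engine_bytes)

SPARSE_WEIGHTS only permits the sparse tactics to compete; as the sweep above shows, each one is still timed against the dense kernels and is chosen only when it actually measures fastest.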
[03/25/2022-13:24:59] [V] [TRT] Tactic: -8475551154769412306 Time: 0.157184 [03/25/2022-13:24:59] [V] [TRT] sections.3.1.conv2.module.weight + QuantizeLinear_823_quantize_scale_node + Conv_827 + Relu_829 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x192x64_stage3_warpsize4x2x1_g1_tensor16x8x32_t1r3s3 Tactic: -8417388128970254446 [03/25/2022-13:24:59] [V] [TRT] Tactic: -8417388128970254446 Time: 0.12224 [03/25/2022-13:24:59] [V] [TRT] sections.3.1.conv2.module.weight + QuantizeLinear_823_quantize_scale_node + Conv_827 + Relu_829 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32 Tactic: -8263994888336646547 [03/25/2022-13:24:59] [V] [TRT] Tactic: -8263994888336646547 Time: 0.081664 [03/25/2022-13:24:59] [V] [TRT] sections.3.1.conv2.module.weight + QuantizeLinear_823_quantize_scale_node + Conv_827 + Relu_829 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32 Tactic: -8205948405243401049 [03/25/2022-13:24:59] [V] [TRT] Tactic: -8205948405243401049 Time: 0.154624 [03/25/2022-13:24:59] [V] [TRT] sections.3.1.conv2.module.weight + QuantizeLinear_823_quantize_scale_node + Conv_827 + Relu_829 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r3s3_epifadd Tactic: -7992068592656168418 [03/25/2022-13:24:59] [V] [TRT] Tactic: -7992068592656168418 Time: 0.096128 [03/25/2022-13:24:59] [V] [TRT] sections.3.1.conv2.module.weight + QuantizeLinear_823_quantize_scale_node + Conv_827 + Relu_829 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x32x64_stage4_warpsize2x1x1_g1_tensor16x8x32_t1r3s3 Tactic: -7898477046581738867 [03/25/2022-13:24:59] [V] [TRT] Tactic: -7898477046581738867 Time: 0.134272 [03/25/2022-13:24:59] [V] [TRT] sections.3.1.conv2.module.weight + QuantizeLinear_823_quantize_scale_node + Conv_827 + Relu_829 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage4_warpsize4x1x1_g1_tensor16x8x32_epifadd Tactic: -7842775553137511386 [03/25/2022-13:24:59] [V] [TRT] Tactic: -7842775553137511386 Time: 0.082944 [03/25/2022-13:24:59] [V] [TRT] sections.3.1.conv2.module.weight + QuantizeLinear_823_quantize_scale_node + Conv_827 + Relu_829 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32 Tactic: -7683887278997527517 [03/25/2022-13:24:59] [V] [TRT] Tactic: -7683887278997527517 Time: 0.149376 [03/25/2022-13:24:59] [V] [TRT] sections.3.1.conv2.module.weight + QuantizeLinear_823_quantize_scale_node + Conv_827 + Relu_829 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x64x64_stage3_warpsize2x2x1_g1_tensor16x8x32 Tactic: -7381370635708568663 [03/25/2022-13:24:59] [V] [TRT] Tactic: -7381370635708568663 Time: 0.098432 [03/25/2022-13:24:59] [V] [TRT] sections.3.1.conv2.module.weight + QuantizeLinear_823_quantize_scale_node + Conv_827 + Relu_829 Set Tactic Name: 
sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r3s3 Tactic: -7129320389887881029 [03/25/2022-13:24:59] [V] [TRT] Tactic: -7129320389887881029 Time: 0.113536 [03/25/2022-13:24:59] [V] [TRT] sections.3.1.conv2.module.weight + QuantizeLinear_823_quantize_scale_node + Conv_827 + Relu_829 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r3s3 Tactic: -6959995514028471820 [03/25/2022-13:24:59] [V] [TRT] Tactic: -6959995514028471820 Time: 0.129664 [03/25/2022-13:24:59] [V] [TRT] sections.3.1.conv2.module.weight + QuantizeLinear_823_quantize_scale_node + Conv_827 + Relu_829 Set Tactic Name: ampere_int8_i8816cudnn_int8_128x128_ldg16_relu_small_nt_v1 Tactic: -6400348606759295499 [03/25/2022-13:24:59] [V] [TRT] Tactic: -6400348606759295499 Time: 0.155776 [03/25/2022-13:24:59] [V] [TRT] sections.3.1.conv2.module.weight + QuantizeLinear_823_quantize_scale_node + Conv_827 + Relu_829 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r3s3 Tactic: -6371781333659293809 [03/25/2022-13:24:59] [V] [TRT] Tactic: -6371781333659293809 Time: 0.160768 [03/25/2022-13:24:59] [V] [TRT] sections.3.1.conv2.module.weight + QuantizeLinear_823_quantize_scale_node + Conv_827 + Relu_829 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x96x64_stage4_warpsize4x2x1_g1_tensor16x8x32 Tactic: -6256128573036943404 [03/25/2022-13:24:59] [V] [TRT] Tactic: -6256128573036943404 Time: 0.118784 [03/25/2022-13:24:59] [V] [TRT] sections.3.1.conv2.module.weight + QuantizeLinear_823_quantize_scale_node + Conv_827 + Relu_829 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x128_ldg16_relu_small_nt_v1 Tactic: -5980889159865208399 [03/25/2022-13:24:59] [V] [TRT] Tactic: -5980889159865208399 Time: 0.15296 [03/25/2022-13:24:59] [V] [TRT] sections.3.1.conv2.module.weight + QuantizeLinear_823_quantize_scale_node + Conv_827 + Relu_829 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_singleBuffer_medium_nt_v1 Tactic: -5766140806760372989 [03/25/2022-13:24:59] [V] [TRT] Tactic: -5766140806760372989 Time: 0.176256 [03/25/2022-13:24:59] [V] [TRT] sections.3.1.conv2.module.weight + QuantizeLinear_823_quantize_scale_node + Conv_827 + Relu_829 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32_t1r3s3_epifadd Tactic: -5709079507616090666 [03/25/2022-13:24:59] [V] [TRT] Tactic: -5709079507616090666 Time: 0.080896 [03/25/2022-13:24:59] [V] [TRT] sections.3.1.conv2.module.weight + QuantizeLinear_823_quantize_scale_node + Conv_827 + Relu_829 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage4_warpsize2x4x1_g1_tensor16x8x32_t1r3s3_epifadd Tactic: -5698636014239116282 [03/25/2022-13:24:59] [V] [TRT] Tactic: -5698636014239116282 Time: 0.079872 [03/25/2022-13:24:59] [V] [TRT] sections.3.1.conv2.module.weight + QuantizeLinear_823_quantize_scale_node + Conv_827 + Relu_829 Set Tactic Name: 
sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage1_warpsize4x1x1_g1_tensor8x8x16 Tactic: -5180570335464125033 [03/25/2022-13:24:59] [V] [TRT] Tactic: -5180570335464125033 Time: 0.162432 [03/25/2022-13:24:59] [V] [TRT] sections.3.1.conv2.module.weight + QuantizeLinear_823_quantize_scale_node + Conv_827 + Relu_829 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r3s3 Tactic: -4933563390723451692 [03/25/2022-13:24:59] [V] [TRT] Tactic: -4933563390723451692 Time: 0.120192 [03/25/2022-13:24:59] [V] [TRT] sections.3.1.conv2.module.weight + QuantizeLinear_823_quantize_scale_node + Conv_827 + Relu_829 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_medium_nt_v1 Tactic: -4516822589357530549 [03/25/2022-13:24:59] [V] [TRT] Tactic: -4516822589357530549 Time: 0.17152 [03/25/2022-13:24:59] [V] [TRT] sections.3.1.conv2.module.weight + QuantizeLinear_823_quantize_scale_node + Conv_827 + Relu_829 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x16x128_stage3_warpsize1x1x1_g1_sptensor16x8x64 Tactic: -4232916483289779353 [03/25/2022-13:24:59] [V] [TRT] Tactic: -4232916483289779353 Time: 0.172416 [03/25/2022-13:24:59] [V] [TRT] sections.3.1.conv2.module.weight + QuantizeLinear_823_quantize_scale_node + Conv_827 + Relu_829 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x128_stage3_warpsize1x4x1_g1_sptensor16x8x64 Tactic: -3460842194336717186 [03/25/2022-13:24:59] [V] [TRT] Tactic: -3460842194336717186 Time: 0.088448 [03/25/2022-13:24:59] [V] [TRT] sections.3.1.conv2.module.weight + QuantizeLinear_823_quantize_scale_node + Conv_827 + Relu_829 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage4_warpsize4x2x1_g1_tensor16x8x32_t1r3s3_epifadd Tactic: -3413217501222406256 [03/25/2022-13:24:59] [V] [TRT] Tactic: -3413217501222406256 Time: 0.078848 [03/25/2022-13:24:59] [V] [TRT] sections.3.1.conv2.module.weight + QuantizeLinear_823_quantize_scale_node + Conv_827 + Relu_829 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32_t1r3s3 Tactic: -3280888557222886418 [03/25/2022-13:24:59] [V] [TRT] Tactic: -3280888557222886418 Time: 0.103552 [03/25/2022-13:24:59] [V] [TRT] sections.3.1.conv2.module.weight + QuantizeLinear_823_quantize_scale_node + Conv_827 + Relu_829 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r3s3 Tactic: -3238475748440751107 [03/25/2022-13:24:59] [V] [TRT] Tactic: -3238475748440751107 Time: 0.091648 [03/25/2022-13:24:59] [V] [TRT] sections.3.1.conv2.module.weight + QuantizeLinear_823_quantize_scale_node + Conv_827 + Relu_829 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage4_warpsize4x1x1_g1_tensor16x8x32_t1r3s3 Tactic: -3182884991006484042 [03/25/2022-13:24:59] [V] [TRT] Tactic: -3182884991006484042 Time: 0.083968 [03/25/2022-13:24:59] [V] [TRT] sections.3.1.conv2.module.weight + 
QuantizeLinear_823_quantize_scale_node + Conv_827 + Relu_829 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r3s3 Tactic: -3173468756112541306 [03/25/2022-13:24:59] [V] [TRT] Tactic: -3173468756112541306 Time: 0.153216 [03/25/2022-13:24:59] [V] [TRT] sections.3.1.conv2.module.weight + QuantizeLinear_823_quantize_scale_node + Conv_827 + Relu_829 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x128_ldg16_relu_large_nt_v1 Tactic: -2917455979290586480 [03/25/2022-13:24:59] [V] [TRT] Tactic: -2917455979290586480 Time: 0.157184 [03/25/2022-13:24:59] [V] [TRT] sections.3.1.conv2.module.weight + QuantizeLinear_823_quantize_scale_node + Conv_827 + Relu_829 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x128x64_stage3_warpsize4x2x1_g1_tensor16x8x32_t1r3s3 Tactic: -2741641298163591508 [03/25/2022-13:24:59] [V] [TRT] Tactic: -2741641298163591508 Time: 0.116608 [03/25/2022-13:24:59] [V] [TRT] sections.3.1.conv2.module.weight + QuantizeLinear_823_quantize_scale_node + Conv_827 + Relu_829 Set Tactic Name: ampere_int8_i8816cudnn_int8_128x128_ldg16_relu_large_nt_v1 Tactic: -2571022005763160364 [03/25/2022-13:24:59] [V] [TRT] Tactic: -2571022005763160364 Time: 0.168192 [03/25/2022-13:24:59] [V] [TRT] sections.3.1.conv2.module.weight + QuantizeLinear_823_quantize_scale_node + Conv_827 + Relu_829 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage1_warpsize2x2x1_g1_tensor8x8x16 Tactic: -2499089240293650188 [03/25/2022-13:24:59] [V] [TRT] Tactic: -2499089240293650188 Time: 0.16512 [03/25/2022-13:24:59] [V] [TRT] sections.3.1.conv2.module.weight + QuantizeLinear_823_quantize_scale_node + Conv_827 + Relu_829 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16 Tactic: -2328318099174473157 [03/25/2022-13:24:59] [V] [TRT] Tactic: -2328318099174473157 Time: 0.16704 [03/25/2022-13:24:59] [V] [TRT] sections.3.1.conv2.module.weight + QuantizeLinear_823_quantize_scale_node + Conv_827 + Relu_829 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_epifadd Tactic: -2083778562631872334 [03/25/2022-13:24:59] [V] [TRT] Tactic: -2083778562631872334 Time: 0.09728 [03/25/2022-13:24:59] [V] [TRT] sections.3.1.conv2.module.weight + QuantizeLinear_823_quantize_scale_node + Conv_827 + Relu_829 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x4x1_g1_tensor16x8x32 Tactic: -2054375205435666404 [03/25/2022-13:24:59] [V] [TRT] Tactic: -2054375205435666404 Time: 0.114048 [03/25/2022-13:24:59] [V] [TRT] sections.3.1.conv2.module.weight + QuantizeLinear_823_quantize_scale_node + Conv_827 + Relu_829 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32_t1r3s3 Tactic: -1546787387293556842 [03/25/2022-13:24:59] [V] [TRT] Tactic: -1546787387293556842 Time: 0.082048 [03/25/2022-13:24:59] [V] [TRT] sections.3.1.conv2.module.weight + QuantizeLinear_823_quantize_scale_node + Conv_827 + Relu_829 
Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x32x64_stage4_warpsize4x1x1_g1_tensor16x8x32 Tactic: -1498626619443284096 [03/25/2022-13:24:59] [V] [TRT] Tactic: -1498626619443284096 Time: 0.123776 [03/25/2022-13:24:59] [V] [TRT] sections.3.1.conv2.module.weight + QuantizeLinear_823_quantize_scale_node + Conv_827 + Relu_829 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r3s3 Tactic: -1471245223605064669 [03/25/2022-13:24:59] [V] [TRT] Tactic: -1471245223605064669 Time: 0.094464 [03/25/2022-13:24:59] [V] [TRT] sections.3.1.conv2.module.weight + QuantizeLinear_823_quantize_scale_node + Conv_827 + Relu_829 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32 Tactic: -1283580231568512025 [03/25/2022-13:24:59] [V] [TRT] Tactic: -1283580231568512025 Time: 0.185216 [03/25/2022-13:24:59] [V] [TRT] sections.3.1.conv2.module.weight + QuantizeLinear_823_quantize_scale_node + Conv_827 + Relu_829 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x64x64_stage3_warpsize4x1x1_g1_tensor16x8x32_t1r3s3 Tactic: -1224421172675151280 [03/25/2022-13:24:59] [V] [TRT] Tactic: -1224421172675151280 Time: 0.096256 [03/25/2022-13:24:59] [V] [TRT] sections.3.1.conv2.module.weight + QuantizeLinear_823_quantize_scale_node + Conv_827 + Relu_829 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_epifadd Tactic: -1173968681844185579 [03/25/2022-13:24:59] [V] [TRT] Tactic: -1173968681844185579 Time: 0.185344 [03/25/2022-13:24:59] [V] [TRT] sections.3.1.conv2.module.weight + QuantizeLinear_823_quantize_scale_node + Conv_827 + Relu_829 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x128x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r3s3 Tactic: -921247911551089037 [03/25/2022-13:24:59] [V] [TRT] Tactic: -921247911551089037 Time: 0.091648 [03/25/2022-13:24:59] [V] [TRT] sections.3.1.conv2.module.weight + QuantizeLinear_823_quantize_scale_node + Conv_827 + Relu_829 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32 Tactic: -762222380308749469 [03/25/2022-13:24:59] [V] [TRT] Tactic: -762222380308749469 Time: 0.130048 [03/25/2022-13:24:59] [V] [TRT] sections.3.1.conv2.module.weight + QuantizeLinear_823_quantize_scale_node + Conv_827 + Relu_829 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_epifadd Tactic: -556794153877490941 [03/25/2022-13:24:59] [V] [TRT] Tactic: -556794153877490941 Time: 0.130816 [03/25/2022-13:24:59] [V] [TRT] sections.3.1.conv2.module.weight + QuantizeLinear_823_quantize_scale_node + Conv_827 + Relu_829 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage4_warpsize4x2x1_g1_tensor16x8x32_t1r3s3 Tactic: -516725800067794372 [03/25/2022-13:24:59] [V] [TRT] Tactic: -516725800067794372 
Time: 0.086912 [03/25/2022-13:24:59] [V] [TRT] sections.3.1.conv2.module.weight + QuantizeLinear_823_quantize_scale_node + Conv_827 + Relu_829 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_singleBuffer_large_nt_v1 Tactic: -428104331444385564 [03/25/2022-13:24:59] [V] [TRT] Tactic: -428104331444385564 Time: 0.190336 [03/25/2022-13:24:59] [V] [TRT] sections.3.1.conv2.module.weight + QuantizeLinear_823_quantize_scale_node + Conv_827 + Relu_829 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage1_warpsize2x1x1_g1_tensor8x8x16 Tactic: -366411318217594794 [03/25/2022-13:24:59] [V] [TRT] Tactic: -366411318217594794 Time: 0.215296 [03/25/2022-13:24:59] [V] [TRT] sections.3.1.conv2.module.weight + QuantizeLinear_823_quantize_scale_node + Conv_827 + Relu_829 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage1_warpsize2x4x1_g1_tensor8x8x16 Tactic: -351548418071036983 [03/25/2022-13:24:59] [V] [TRT] Tactic: -351548418071036983 Time: 0.162688 [03/25/2022-13:24:59] [V] [TRT] Fastest Tactic: 3651043333819148268 Time: 0.065792 [03/25/2022-13:24:59] [V] [TRT] >>>>>>>>>>>>>>> Chose Runner Type: CaskConvolution Tactic: 3651043333819148268 [03/25/2022-13:24:59] [V] [TRT] =============== Computing costs for [03/25/2022-13:24:59] [V] [TRT] *************** Autotuning format combination: Int8(6272,49:4,7,1), Int8(25088,49:4,7,1) -> Int8(25088,49:4,7,1) *************** [03/25/2022-13:24:59] [V] [TRT] *************** Autotuning format combination: Int8(6272,49:4,7,1), Int8(3136,49:32,7,1) -> Int8(3136,49:32,7,1) *************** [03/25/2022-13:24:59] [V] [TRT] *************** Autotuning format combination: Int8(784,49:32,7,1), Int8(3136,49:32,7,1) -> Int8(3136,49:32,7,1) *************** [03/25/2022-13:24:59] [V] [TRT] =============== Computing costs for [03/25/2022-13:24:59] [V] [TRT] *************** Autotuning format combination: Int8(100352,49,7,1) -> Float(100352,49,7,1) *************** [03/25/2022-13:24:59] [V] [TRT] --------------- Timing Runner: DequantizeLinear_901_dequantize_scale_node (Scale) [03/25/2022-13:24:59] [V] [TRT] Tactic: 0 Time: 0.06656 [03/25/2022-13:24:59] [V] [TRT] Fastest Tactic: 0 Time: 0.06656 [03/25/2022-13:24:59] [V] [TRT] >>>>>>>>>>>>>>> Chose Runner Type: Scale Tactic: 0 [03/25/2022-13:24:59] [V] [TRT] *************** Autotuning format combination: Int8(25088,49:4,7,1) -> Float(100352,49,7,1) *************** [03/25/2022-13:24:59] [V] [TRT] --------------- Timing Runner: DequantizeLinear_901_dequantize_scale_node (Scale) [03/25/2022-13:24:59] [V] [TRT] Tactic: 0 Time: 0.11648 [03/25/2022-13:24:59] [V] [TRT] Fastest Tactic: 0 Time: 0.11648 [03/25/2022-13:24:59] [V] [TRT] >>>>>>>>>>>>>>> Chose Runner Type: Scale Tactic: 0 [03/25/2022-13:24:59] [V] [TRT] *************** Autotuning format combination: Int8(3136,49:32,7,1) -> Float(100352,49,7,1) *************** [03/25/2022-13:24:59] [V] [TRT] --------------- Timing Runner: DequantizeLinear_901_dequantize_scale_node (Scale) [03/25/2022-13:24:59] [V] [TRT] Tactic: 0 Time: 0.10368 [03/25/2022-13:24:59] [V] [TRT] Fastest Tactic: 0 Time: 0.10368 [03/25/2022-13:24:59] [V] [TRT] >>>>>>>>>>>>>>> Chose Runner Type: Scale Tactic: 0 [03/25/2022-13:24:59] [V] [TRT] =============== Computing costs for [03/25/2022-13:24:59] [V] [TRT] *************** Autotuning format combination: Int8(25088,49:4,7,1) -> Int8(6272,49:4,7,1) *************** [03/25/2022-13:24:59] 
[V] [TRT] *************** Autotuning format combination: Int8(25088,49:4,7,1) -> Int8(784,49:32,7,1) *************** [03/25/2022-13:24:59] [V] [TRT] *************** Autotuning format combination: Int8(3136,49:32,7,1) -> Int8(784,49:32,7,1) *************** [03/25/2022-13:24:59] [V] [TRT] Skip timing cache hit (epiFadd mismatch) for node: sections.3.2.conv1.module.weight + QuantizeLinear_860_quantize_scale_node + Conv_864 + Relu_866, LayerImpl: CaskConvolution, tactic: -6779804930216439173 [03/25/2022-13:25:00] [V] [TRT] --------------- Timing Runner: sections.3.2.conv1.module.weight + QuantizeLinear_860_quantize_scale_node + Conv_864 + Relu_866 (CudaGroupConvolution) [03/25/2022-13:25:00] [V] [TRT] CudaGroupConvolution has no valid tactics for this config, skipping [03/25/2022-13:25:00] [V] [TRT] --------------- Timing Runner: sections.3.2.conv1.module.weight + QuantizeLinear_860_quantize_scale_node + Conv_864 + Relu_866 (CudaDepthwiseConvolution) [03/25/2022-13:25:00] [V] [TRT] CudaDepthwiseConvolution has no valid tactics for this config, skipping [03/25/2022-13:25:00] [V] [TRT] --------------- Timing Runner: sections.3.2.conv1.module.weight + QuantizeLinear_860_quantize_scale_node + Conv_864 + Relu_866 (FusedConvActConvolution) [03/25/2022-13:25:00] [V] [TRT] FusedConvActConvolution has no valid tactics for this config, skipping [03/25/2022-13:25:00] [V] [TRT] --------------- Timing Runner: sections.3.2.conv1.module.weight + QuantizeLinear_860_quantize_scale_node + Conv_864 + Relu_866 (CaskConvolution) [03/25/2022-13:25:00] [V] [TRT] sections.3.2.conv1.module.weight + QuantizeLinear_860_quantize_scale_node + Conv_864 + Relu_866 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage1_warpsize4x1x1_g1_tensor8x8x16_t1r1s1 Tactic: 68468667201176803 [03/25/2022-13:25:00] [V] [TRT] Tactic: 68468667201176803 Time: 0.084992 [03/25/2022-13:25:00] [V] [TRT] sections.3.2.conv1.module.weight + QuantizeLinear_860_quantize_scale_node + Conv_864 + Relu_866 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x192x64_stage3_warpsize4x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 125145153013230687 [03/25/2022-13:25:00] [V] [TRT] Tactic: 125145153013230687 Time: 0.06848 [03/25/2022-13:25:00] [V] [TRT] sections.3.2.conv1.module.weight + QuantizeLinear_860_quantize_scale_node + Conv_864 + Relu_866 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x128_ldg16_relu_interior_nt_v1 Tactic: 434957160407688216 [03/25/2022-13:25:00] [V] [TRT] Tactic: 434957160407688216 Time: 0.082944 [03/25/2022-13:25:00] [V] [TRT] sections.3.2.conv1.module.weight + QuantizeLinear_860_quantize_scale_node + Conv_864 + Relu_866 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage4_warpsize4x2x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: 805889586762897346 [03/25/2022-13:25:00] [V] [TRT] Tactic: 805889586762897346 Time: 0.048128 [03/25/2022-13:25:00] [V] [TRT] sections.3.2.conv1.module.weight + QuantizeLinear_860_quantize_scale_node + Conv_864 + Relu_866 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x128_stage3_warpsize4x2x1_g1_sptensor16x8x64_t1r1s1 Tactic: 857001784974286465 [03/25/2022-13:25:00] [V] [TRT] Tactic: 857001784974286465 Time: 0.036608 [03/25/2022-13:25:00] [V] [TRT] sections.3.2.conv1.module.weight + 
QuantizeLinear_860_quantize_scale_node + Conv_864 + Relu_866 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 1214130898909872671 [03/25/2022-13:25:00] [V] [TRT] Tactic: 1214130898909872671 Time: 0.097664 [03/25/2022-13:25:00] [V] [TRT] sections.3.2.conv1.module.weight + QuantizeLinear_860_quantize_scale_node + Conv_864 + Relu_866 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage4_warpsize4x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 1278425129871930205 [03/25/2022-13:25:00] [V] [TRT] Tactic: 1278425129871930205 Time: 0.047232 [03/25/2022-13:25:00] [V] [TRT] sections.3.2.conv1.module.weight + QuantizeLinear_860_quantize_scale_node + Conv_864 + Relu_866 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 1583811548148740665 [03/25/2022-13:25:00] [V] [TRT] Tactic: 1583811548148740665 Time: 0.07424 [03/25/2022-13:25:00] [V] [TRT] sections.3.2.conv1.module.weight + QuantizeLinear_860_quantize_scale_node + Conv_864 + Relu_866 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 1701344857577810806 [03/25/2022-13:25:00] [V] [TRT] Tactic: 1701344857577810806 Time: 0.065792 [03/25/2022-13:25:00] [V] [TRT] sections.3.2.conv1.module.weight + QuantizeLinear_860_quantize_scale_node + Conv_864 + Relu_866 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x32x64_stage4_warpsize2x1x1_g1_tensor16x8x32_t1r1s1 Tactic: 1797231177354918208 [03/25/2022-13:25:00] [V] [TRT] Tactic: 1797231177354918208 Time: 0.097408 [03/25/2022-13:25:00] [V] [TRT] sections.3.2.conv1.module.weight + QuantizeLinear_860_quantize_scale_node + Conv_864 + Relu_866 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x192x64_stage3_warpsize4x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 2004812516525036381 [03/25/2022-13:25:00] [V] [TRT] Tactic: 2004812516525036381 Time: 0.06464 [03/25/2022-13:25:00] [V] [TRT] sections.3.2.conv1.module.weight + QuantizeLinear_860_quantize_scale_node + Conv_864 + Relu_866 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x128x64_stage3_warpsize4x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 2030033463723799063 [03/25/2022-13:25:00] [V] [TRT] Tactic: 2030033463723799063 Time: 0.063744 [03/25/2022-13:25:00] [V] [TRT] sections.3.2.conv1.module.weight + QuantizeLinear_860_quantize_scale_node + Conv_864 + Relu_866 Set Tactic Name: ampere_int8_i8816cudnn_int8_128x128_ldg16_relu_interior_nt_v1 Tactic: 2346437292116182513 [03/25/2022-13:25:00] [V] [TRT] Tactic: 2346437292116182513 Time: 0.084992 [03/25/2022-13:25:00] [V] [TRT] sections.3.2.conv1.module.weight + QuantizeLinear_860_quantize_scale_node + Conv_864 + Relu_866 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 2376898825218218566 
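The "Skip timing cache hit (epiFadd mismatch)" record above means a cached measurement existed for this node but could not be reused, so sections.3.2.conv1 is re-profiled from scratch even though similarly shaped blocks were just timed. Persisting the timing cache across builds (trtexec: --timingCacheFile) avoids repeating these sweeps when nothing relevant has changed. A sketch of the corresponding TensorRT 8.x Python API, with timing.cache as a hypothetical filename:

    import os
    import tensorrt as trt

    CACHE_PATH = "timing.cache"  # hypothetical filename

    def attach_timing_cache(config):
        # Load a previously saved cache if present; an empty blob starts fresh.
        blob = b""
        if os.path.exists(CACHE_PATH):
            with open(CACHE_PATH, "rb") as f:
                blob = f.read()
        cache = config.create_timing_cache(blob)
        config.set_timing_cache(cache, ignore_mismatch=False)

    def save_timing_cache(config):
        # Serialize the (possibly updated) cache after a successful build.
        with open(CACHE_PATH, "wb") as f:
            f.write(memoryview(config.get_timing_cache().serialize()))

With ignore_mismatch=False the builder rejects cache entries recorded on a different device or TensorRT version, analogous to the mismatch skip logged above.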
[03/25/2022-13:25:00] [V] [TRT] Tactic: 2376898825218218566 Time: 0.048512
[03/25/2022-13:25:00] [V] [TRT] sections.3.2.conv1.module.weight + QuantizeLinear_860_quantize_scale_node + Conv_864 + Relu_866 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_singleBuffer_interior_nt_v1 Tactic: 2522133112320625287
[03/25/2022-13:25:00] [V] [TRT] Tactic: 2522133112320625287 Time: 0.084352
[03/25/2022-13:25:00] [V] [TRT] sections.3.2.conv1.module.weight + QuantizeLinear_860_quantize_scale_node + Conv_864 + Relu_866 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage4_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 2548171972648455240
[03/25/2022-13:25:00] [V] [TRT] Tactic: 2548171972648455240 Time: 0.05056
[03/25/2022-13:25:00] [V] [TRT] sections.3.2.conv1.module.weight + QuantizeLinear_860_quantize_scale_node + Conv_864 + Relu_866 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage1_warpsize2x1x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: 2548946449357458230
[03/25/2022-13:25:00] [V] [TRT] Tactic: 2548946449357458230 Time: 0.10752
[03/25/2022-13:25:00] [V] [TRT] sections.3.2.conv1.module.weight + QuantizeLinear_860_quantize_scale_node + Conv_864 + Relu_866 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x128_stage3_warpsize1x1x1_g1_sptensor16x8x64_simple_t1r1s1_no_preds Tactic: 2570666021825229009
[03/25/2022-13:25:00] [V] [TRT] Tactic: 2570666021825229009 Time: 0.075648
[03/25/2022-13:25:00] [V] [TRT] sections.3.2.conv1.module.weight + QuantizeLinear_860_quantize_scale_node + Conv_864 + Relu_866 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x16x128_stage3_warpsize1x1x1_g1_sptensor16x8x64_simple_t1r1s1 Tactic: 2678520742286844763
[03/25/2022-13:25:00] [V] [TRT] Tactic: 2678520742286844763 Time: 0.102144
[03/25/2022-13:25:00] [V] [TRT] sections.3.2.conv1.module.weight + QuantizeLinear_860_quantize_scale_node + Conv_864 + Relu_866 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 2756291002030759362
[03/25/2022-13:25:00] [V] [TRT] Tactic: 2756291002030759362 Time: 0.059776
[03/25/2022-13:25:00] [V] [TRT] sections.3.2.conv1.module.weight + QuantizeLinear_860_quantize_scale_node + Conv_864 + Relu_866 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x128x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 2972948223367788520
[03/25/2022-13:25:00] [V] [TRT] Tactic: 2972948223367788520 Time: 0.055552
[03/25/2022-13:25:00] [V] [TRT] sections.3.2.conv1.module.weight + QuantizeLinear_860_quantize_scale_node + Conv_864 + Relu_866 Set Tactic Name: ampere_int8_i8816cudnn_int8_128x128_ldg16_relu_medium_nt_v1 Tactic: 2985940154541537814
[03/25/2022-13:25:00] [V] [TRT] Tactic: 2985940154541537814 Time: 0.08576
[03/25/2022-13:25:00] [V] [TRT] sections.3.2.conv1.module.weight + QuantizeLinear_860_quantize_scale_node + Conv_864 + Relu_866 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x96x64_stage5_warpsize2x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 3043273137345374664
[03/25/2022-13:25:00] [V] [TRT] Tactic: 3043273137345374664 Time: 0.102784
[03/25/2022-13:25:00] [V] [TRT] sections.3.2.conv1.module.weight + QuantizeLinear_860_quantize_scale_node + Conv_864 + Relu_866 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 3221677093659484230
[03/25/2022-13:25:00] [V] [TRT] Tactic: 3221677093659484230 Time: 0.083328
[03/25/2022-13:25:00] [V] [TRT] sections.3.2.conv1.module.weight + QuantizeLinear_860_quantize_scale_node + Conv_864 + Relu_866 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r1s1 Tactic: 3242897809704328258
[03/25/2022-13:25:00] [V] [TRT] Tactic: 3242897809704328258 Time: 0.091392
[03/25/2022-13:25:00] [V] [TRT] sections.3.2.conv1.module.weight + QuantizeLinear_860_quantize_scale_node + Conv_864 + Relu_866 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: 3312456766204252694
[03/25/2022-13:25:00] [V] [TRT] Tactic: 3312456766204252694 Time: 0.11584
[03/25/2022-13:25:00] [V] [TRT] sections.3.2.conv1.module.weight + QuantizeLinear_860_quantize_scale_node + Conv_864 + Relu_866 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 3538565962642681625
[03/25/2022-13:25:00] [V] [TRT] Tactic: 3538565962642681625 Time: 0.073344
[03/25/2022-13:25:00] [V] [TRT] sections.3.2.conv1.module.weight + QuantizeLinear_860_quantize_scale_node + Conv_864 + Relu_866 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage1_warpsize2x2x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: 3541919052468401776
[03/25/2022-13:25:00] [V] [TRT] Tactic: 3541919052468401776 Time: 0.081536
[03/25/2022-13:25:00] [V] [TRT] sections.3.2.conv1.module.weight + QuantizeLinear_860_quantize_scale_node + Conv_864 + Relu_866 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 3593397928177382100
[03/25/2022-13:25:00] [V] [TRT] Tactic: 3593397928177382100 Time: 0.097536
[03/25/2022-13:25:00] [V] [TRT] sections.3.2.conv1.module.weight + QuantizeLinear_860_quantize_scale_node + Conv_864 + Relu_866 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x64x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 3670282018109435863
[03/25/2022-13:25:00] [V] [TRT] Tactic: 3670282018109435863 Time: 0.066432
[03/25/2022-13:25:00] [V] [TRT] sections.3.2.conv1.module.weight + QuantizeLinear_860_quantize_scale_node + Conv_864 + Relu_866 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 3671413346254027573
[03/25/2022-13:25:00] [V] [TRT] Tactic: 3671413346254027573 Time: 0.064896
[03/25/2022-13:25:00] [V] [TRT] sections.3.2.conv1.module.weight + QuantizeLinear_860_quantize_scale_node + Conv_864 + Relu_866 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x128_ldg16_relu_medium_nt_v1 Tactic: 3899284354987683408
[03/25/2022-13:25:00] [V] [TRT] Tactic: 3899284354987683408 Time: 0.083968
[03/25/2022-13:25:00] [V] [TRT] sections.3.2.conv1.module.weight + QuantizeLinear_860_quantize_scale_node + Conv_864 + Relu_866 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: 3927509214678622419
[03/25/2022-13:25:00] [V] [TRT] Tactic: 3927509214678622419 Time: 0.090112
[03/25/2022-13:25:00] [V] [TRT] sections.3.2.conv1.module.weight + QuantizeLinear_860_quantize_scale_node + Conv_864 + Relu_866 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 4112572034735311841
[03/25/2022-13:25:00] [V] [TRT] Tactic: 4112572034735311841 Time: 0.120064
[03/25/2022-13:25:00] [V] [TRT] sections.3.2.conv1.module.weight + QuantizeLinear_860_quantize_scale_node + Conv_864 + Relu_866 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 4239974928951431644
[03/25/2022-13:25:00] [V] [TRT] Tactic: 4239974928951431644 Time: 0.09216
[03/25/2022-13:25:00] [V] [TRT] sections.3.2.conv1.module.weight + QuantizeLinear_860_quantize_scale_node + Conv_864 + Relu_866 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage6_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 4610760414797216079
[03/25/2022-13:25:00] [V] [TRT] Tactic: 4610760414797216079 Time: 0.06272
[03/25/2022-13:25:00] [V] [TRT] sections.3.2.conv1.module.weight + QuantizeLinear_860_quantize_scale_node + Conv_864 + Relu_866 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_small_nt_v1 Tactic: 4717285412741024953
[03/25/2022-13:25:00] [V] [TRT] Tactic: 4717285412741024953 Time: 0.085888
[03/25/2022-13:25:00] [V] [TRT] sections.3.2.conv1.module.weight + QuantizeLinear_860_quantize_scale_node + Conv_864 + Relu_866 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 4796956614760326119
[03/25/2022-13:25:00] [V] [TRT] Tactic: 4796956614760326119 Time: 0.065792
[03/25/2022-13:25:00] [V] [TRT] sections.3.2.conv1.module.weight + QuantizeLinear_860_quantize_scale_node + Conv_864 + Relu_866 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x128_stage3_warpsize2x2x1_g1_sptensor16x8x64_t1r1s1_no_preds Tactic: 4909502217677847353
[03/25/2022-13:25:00] [V] [TRT] Tactic: 4909502217677847353 Time: 0.040192
[03/25/2022-13:25:00] [V] [TRT] sections.3.2.conv1.module.weight + QuantizeLinear_860_quantize_scale_node + Conv_864 + Relu_866 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 4919361344804309192
[03/25/2022-13:25:00] [V] [TRT] Tactic: 4919361344804309192 Time: 0.098304
[03/25/2022-13:25:00] [V] [TRT] sections.3.2.conv1.module.weight + QuantizeLinear_860_quantize_scale_node + Conv_864 + Relu_866 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x128_stage3_warpsize1x2x1_g1_sptensor16x8x64_simple_t1r1s1 Tactic: 5043674678294309681
[03/25/2022-13:25:00] [V] [TRT] Tactic: 5043674678294309681 Time: 0.061184
[03/25/2022-13:25:00] [V] [TRT] sections.3.2.conv1.module.weight + QuantizeLinear_860_quantize_scale_node + Conv_864 + Relu_866 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_interior_nt_v1 Tactic: 5126565865931538390
[03/25/2022-13:25:00] [V] [TRT] Tactic: 5126565865931538390 Time: 0.08704
[03/25/2022-13:25:00] [V] [TRT] sections.3.2.conv1.module.weight + QuantizeLinear_860_quantize_scale_node + Conv_864 + Relu_866 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x64x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 5204702486885981735
[03/25/2022-13:25:00] [V] [TRT] Tactic: 5204702486885981735 Time: 0.068224
[03/25/2022-13:25:00] [V] [TRT] sections.3.2.conv1.module.weight + QuantizeLinear_860_quantize_scale_node + Conv_864 + Relu_866 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: 5375256703210220108
[03/25/2022-13:25:00] [V] [TRT] Tactic: 5375256703210220108 Time: 0.058624
[03/25/2022-13:25:00] [V] [TRT] sections.3.2.conv1.module.weight + QuantizeLinear_860_quantize_scale_node + Conv_864 + Relu_866 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x128_stage3_warpsize2x2x1_g1_sptensor16x8x64_simple_t1r1s1 Tactic: 5424258848951129084
[03/25/2022-13:25:00] [V] [TRT] Tactic: 5424258848951129084 Time: 0.039936
[03/25/2022-13:25:00] [V] [TRT] sections.3.2.conv1.module.weight + QuantizeLinear_860_quantize_scale_node + Conv_864 + Relu_866 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 5443897483205284103
[03/25/2022-13:25:00] [V] [TRT] Tactic: 5443897483205284103 Time: 0.06528
[03/25/2022-13:25:00] [V] [TRT] sections.3.2.conv1.module.weight + QuantizeLinear_860_quantize_scale_node + Conv_864 + Relu_866 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage4_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 5707566217891294846
[03/25/2022-13:25:00] [V] [TRT] Tactic: 5707566217891294846 Time: 0.051584
[03/25/2022-13:25:00] [V] [TRT] sections.3.2.conv1.module.weight + QuantizeLinear_860_quantize_scale_node + Conv_864 + Relu_866 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x4x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 5986622376339202983
[03/25/2022-13:25:00] [V] [TRT] Tactic: 5986622376339202983 Time: 0.083072
[03/25/2022-13:25:00] [V] [TRT] sections.3.2.conv1.module.weight + QuantizeLinear_860_quantize_scale_node + Conv_864 + Relu_866 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x128x64_stage3_warpsize4x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 6007888770437705057
[03/25/2022-13:25:00] [V] [TRT] Tactic: 6007888770437705057 Time: 0.06656
[03/25/2022-13:25:00] [V] [TRT] sections.3.2.conv1.module.weight + QuantizeLinear_860_quantize_scale_node + Conv_864 + Relu_866 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32_t1r1s1 Tactic: 6405251167055673379
[03/25/2022-13:25:00] [V] [TRT] Tactic: 6405251167055673379 Time: 0.066432
[03/25/2022-13:25:00] [V] [TRT] sections.3.2.conv1.module.weight + QuantizeLinear_860_quantize_scale_node + Conv_864 + Relu_866 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage4_warpsize4x1x1_g1_tensor16x8x32_t1r1s1 Tactic: 6433368103202497147
[03/25/2022-13:25:00] [V] [TRT] Tactic: 6433368103202497147 Time: 0.05248
[03/25/2022-13:25:00] [V] [TRT] sections.3.2.conv1.module.weight + QuantizeLinear_860_quantize_scale_node + Conv_864 + Relu_866 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 6441948709525127755
[03/25/2022-13:25:00] [V] [TRT] Tactic: 6441948709525127755 Time: 0.097664
[03/25/2022-13:25:00] [V] [TRT] sections.3.2.conv1.module.weight + QuantizeLinear_860_quantize_scale_node + Conv_864 + Relu_866 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x256x64_stage3_warpsize2x4x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 6443933097134654777
[03/25/2022-13:25:00] [V] [TRT] Tactic: 6443933097134654777 Time: 0.068352
[03/25/2022-13:25:00] [V] [TRT] sections.3.2.conv1.module.weight + QuantizeLinear_860_quantize_scale_node + Conv_864 + Relu_866 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 6457435868048963632
[03/25/2022-13:25:00] [V] [TRT] Tactic: 6457435868048963632 Time: 0.062848
[03/25/2022-13:25:00] [V] [TRT] sections.3.2.conv1.module.weight + QuantizeLinear_860_quantize_scale_node + Conv_864 + Relu_866 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x128_stage3_warpsize1x1x1_g1_sptensor16x8x64_t1r1s1 Tactic: 6510345569544721081
[03/25/2022-13:25:00] [V] [TRT] Tactic: 6510345569544721081 Time: 0.07616
[03/25/2022-13:25:00] [V] [TRT] sections.3.2.conv1.module.weight + QuantizeLinear_860_quantize_scale_node + Conv_864 + Relu_866 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 6793988781414507278
[03/25/2022-13:25:00] [V] [TRT] Tactic: 6793988781414507278 Time: 0.049664
[03/25/2022-13:25:00] [V] [TRT] sections.3.2.conv1.module.weight + QuantizeLinear_860_quantize_scale_node + Conv_864 + Relu_866 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 6880710371738875469
[03/25/2022-13:25:00] [V] [TRT] Tactic: 6880710371738875469 Time: 0.070912
[03/25/2022-13:25:00] [V] [TRT] sections.3.2.conv1.module.weight + QuantizeLinear_860_quantize_scale_node + Conv_864 + Relu_866 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 6925201228918187099
[03/25/2022-13:25:00] [V] [TRT] Tactic: 6925201228918187099 Time: 0.04992
[03/25/2022-13:25:00] [V] [TRT] sections.3.2.conv1.module.weight + QuantizeLinear_860_quantize_scale_node + Conv_864 + Relu_866 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 6991524515605108718
[03/25/2022-13:25:00] [V] [TRT] Tactic: 6991524515605108718 Time: 0.062208
[03/25/2022-13:25:00] [V] [TRT] sections.3.2.conv1.module.weight + QuantizeLinear_860_quantize_scale_node + Conv_864 + Relu_866 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x4x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 7245509442265271220
[03/25/2022-13:25:00] [V] [TRT] Tactic: 7245509442265271220 Time: 0.081792
[03/25/2022-13:25:00] [V] [TRT] sections.3.2.conv1.module.weight + QuantizeLinear_860_quantize_scale_node + Conv_864 + Relu_866 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x64x64_stage3_warpsize4x1x1_g1_tensor16x8x32_t1r1s1 Tactic: 7318929579222925725
[03/25/2022-13:25:00] [V] [TRT] Tactic: 7318929579222925725 Time: 0.066816
[03/25/2022-13:25:00] [V] [TRT] sections.3.2.conv1.module.weight + QuantizeLinear_860_quantize_scale_node + Conv_864 + Relu_866 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: 7731430299029542276
[03/25/2022-13:25:00] [V] [TRT] Tactic: 7731430299029542276 Time: 0.04928
[03/25/2022-13:25:00] [V] [TRT] sections.3.2.conv1.module.weight + QuantizeLinear_860_quantize_scale_node + Conv_864 + Relu_866 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage4_warpsize2x4x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: 7738495016763012180
[03/25/2022-13:25:00] [V] [TRT] Tactic: 7738495016763012180 Time: 0.049152
[03/25/2022-13:25:00] [V] [TRT] sections.3.2.conv1.module.weight + QuantizeLinear_860_quantize_scale_node + Conv_864 + Relu_866 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x128_stage3_warpsize1x4x1_g1_sptensor16x8x64_t1r1s1_no_preds Tactic: 7886967395128926382
[03/25/2022-13:25:00] [V] [TRT] Tactic: 7886967395128926382 Time: 0.052352
[03/25/2022-13:25:00] [V] [TRT] sections.3.2.conv1.module.weight + QuantizeLinear_860_quantize_scale_node + Conv_864 + Relu_866 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage4_warpsize4x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 8142283985160822229
[03/25/2022-13:25:00] [V] [TRT] Tactic: 8142283985160822229 Time: 0.049152
[03/25/2022-13:25:00] [V] [TRT] sections.3.2.conv1.module.weight + QuantizeLinear_860_quantize_scale_node + Conv_864 + Relu_866 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage4_warpsize2x4x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 8173975624668590862
[03/25/2022-13:25:00] [V] [TRT] Tactic: 8173975624668590862 Time: 0.049408
[03/25/2022-13:25:00] [V] [TRT] sections.3.2.conv1.module.weight + QuantizeLinear_860_quantize_scale_node + Conv_864 + Relu_866 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage4_warpsize4x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 8234775147403903473
[03/25/2022-13:25:00] [V] [TRT] Tactic: 8234775147403903473 Time: 0.050048
[03/25/2022-13:25:00] [V] [TRT] sections.3.2.conv1.module.weight + QuantizeLinear_860_quantize_scale_node + Conv_864 + Relu_866 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x128x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 8524082966802584889
[03/25/2022-13:25:00] [V] [TRT] Tactic: 8524082966802584889 Time: 0.055296
[03/25/2022-13:25:00] [V] [TRT] sections.3.2.conv1.module.weight + QuantizeLinear_860_quantize_scale_node + Conv_864 + Relu_866 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage1_warpsize4x1x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: 8684013308930763400
[03/25/2022-13:25:00] [V] [TRT] Tactic: 8684013308930763400 Time: 0.083968
[03/25/2022-13:25:00] [V] [TRT] sections.3.2.conv1.module.weight + QuantizeLinear_860_quantize_scale_node + Conv_864 + Relu_866 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x128x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 8765382722978397630
[03/25/2022-13:25:00] [V] [TRT] Tactic: 8765382722978397630 Time: 0.055552
[03/25/2022-13:25:00] [V] [TRT] sections.3.2.conv1.module.weight + QuantizeLinear_860_quantize_scale_node + Conv_864 + Relu_866 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 8843193587782643431
[03/25/2022-13:25:00] [V] [TRT] Tactic: 8843193587782643431 Time: 0.071936
[03/25/2022-13:25:00] [V] [TRT] sections.3.2.conv1.module.weight + QuantizeLinear_860_quantize_scale_node + Conv_864 + Relu_866 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x128_stage3_warpsize1x4x1_g1_sptensor16x8x64_simple_t1r1s1 Tactic: 8883810517410230831
[03/25/2022-13:25:00] [V] [TRT] Tactic: 8883810517410230831 Time: 0.052864
[03/25/2022-13:25:00] [V] [TRT] sections.3.2.conv1.module.weight + QuantizeLinear_860_quantize_scale_node + Conv_864 + Relu_866 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 8930797211803511337
[03/25/2022-13:25:00] [V] [TRT] Tactic: 8930797211803511337 Time: 0.061568
[03/25/2022-13:25:00] [V] [TRT] sections.3.2.conv1.module.weight + QuantizeLinear_860_quantize_scale_node + Conv_864 + Relu_866 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 8935070489925739043
[03/25/2022-13:25:00] [V] [TRT] Tactic: 8935070489925739043 Time: 0.054272
[03/25/2022-13:25:00] [V] [TRT] sections.3.2.conv1.module.weight + QuantizeLinear_860_quantize_scale_node + Conv_864 + Relu_866 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x16x128_stage3_warpsize1x1x1_g1_sptensor16x8x64_t1r1s1 Tactic: 9062173295331155069
[03/25/2022-13:25:00] [V] [TRT] Tactic: 9062173295331155069 Time: 0.102656
[03/25/2022-13:25:00] [V] [TRT] sections.3.2.conv1.module.weight + QuantizeLinear_860_quantize_scale_node + Conv_864 + Relu_866 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_singleBuffer_small_nt_v1 Tactic: -9118785798277698619
[03/25/2022-13:25:00] [V] [TRT] Tactic: -9118785798277698619 Time: 0.085632
[03/25/2022-13:25:00] [V] [TRT] sections.3.2.conv1.module.weight + QuantizeLinear_860_quantize_scale_node + Conv_864 + Relu_866 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -8985599729413291927
[03/25/2022-13:25:00] [V] [TRT] Tactic: -8985599729413291927 Time: 0.058112
[03/25/2022-13:25:00] [V] [TRT] sections.3.2.conv1.module.weight + QuantizeLinear_860_quantize_scale_node + Conv_864 + Relu_866 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x32x64_stage4_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -8972697510150675429
[03/25/2022-13:25:00] [V] [TRT] Tactic: -8972697510150675429 Time: 0.074624
[03/25/2022-13:25:00] [V] [TRT] sections.3.2.conv1.module.weight + QuantizeLinear_860_quantize_scale_node + Conv_864 + Relu_866 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r1s1 Tactic: -8943710627305202139
[03/25/2022-13:25:00] [V] [TRT] Tactic: -8943710627305202139 Time: 0.089728
[03/25/2022-13:25:00] [V] [TRT] sections.3.2.conv1.module.weight + QuantizeLinear_860_quantize_scale_node + Conv_864 + Relu_866 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: -8859846367886814331
[03/25/2022-13:25:00] [V] [TRT] Tactic: -8859846367886814331 Time: 0.093184
[03/25/2022-13:25:00] [V] [TRT] sections.3.2.conv1.module.weight + QuantizeLinear_860_quantize_scale_node + Conv_864 + Relu_866 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: -8638624340850784688
[03/25/2022-13:25:00] [V] [TRT] Tactic: -8638624340850784688 Time: 0.073856
[03/25/2022-13:25:00] [V] [TRT] sections.3.2.conv1.module.weight + QuantizeLinear_860_quantize_scale_node + Conv_864 + Relu_866 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage4_warpsize4x1x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: -8556775352640313933
[03/25/2022-13:25:00] [V] [TRT] Tactic: -8556775352640313933 Time: 0.051072
[03/25/2022-13:25:00] [V] [TRT] sections.3.2.conv1.module.weight + QuantizeLinear_860_quantize_scale_node + Conv_864 + Relu_866 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage1_warpsize2x1x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: -8382298409581540699
[03/25/2022-13:25:00] [V] [TRT] Tactic: -8382298409581540699 Time: 0.132864
[03/25/2022-13:25:00] [V] [TRT] sections.3.2.conv1.module.weight + QuantizeLinear_860_quantize_scale_node + Conv_864 + Relu_866 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage1_warpsize4x2x1_g1_tensor8x8x16_t1r1s1 Tactic: -8172318747337038866
[03/25/2022-13:25:00] [V] [TRT] Tactic: -8172318747337038866 Time: 0.081408
[03/25/2022-13:25:00] [V] [TRT] sections.3.2.conv1.module.weight + QuantizeLinear_860_quantize_scale_node + Conv_864 + Relu_866 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x256x64_stage3_warpsize2x4x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -8038164441468184723
[03/25/2022-13:25:00] [V] [TRT] Tactic: -8038164441468184723 Time: 0.066048
[03/25/2022-13:25:00] [V] [TRT] sections.3.2.conv1.module.weight + QuantizeLinear_860_quantize_scale_node + Conv_864 + Relu_866 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x96x64_stage5_warpsize2x1x1_g1_tensor16x8x32_t1r1s1 Tactic: -7844028314176826857
[03/25/2022-13:25:00] [V] [TRT] Tactic: -7844028314176826857 Time: 0.104064
[03/25/2022-13:25:00] [V] [TRT] sections.3.2.conv1.module.weight + QuantizeLinear_860_quantize_scale_node + Conv_864 + Relu_866 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x128_stage3_warpsize2x2x1_g1_sptensor16x8x64_simple_t1r1s1_no_preds Tactic: -7674507941016740570
[03/25/2022-13:25:00] [V] [TRT] Tactic: -7674507941016740570 Time: 0.039808
[03/25/2022-13:25:00] [V] [TRT] sections.3.2.conv1.module.weight + QuantizeLinear_860_quantize_scale_node + Conv_864 + Relu_866 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage4_warpsize2x4x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -7364286662638617917
[03/25/2022-13:25:00] [V] [TRT] Tactic: -7364286662638617917 Time: 0.048256
[03/25/2022-13:25:00] [V] [TRT] sections.3.2.conv1.module.weight + QuantizeLinear_860_quantize_scale_node + Conv_864 + Relu_866 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage1_warpsize2x1x1_g1_tensor8x8x16_t1r1s1 Tactic: -7361755530333096258
[03/25/2022-13:25:00] [V] [TRT] Tactic: -7361755530333096258 Time: 0.108672
[03/25/2022-13:25:00] [V] [TRT] sections.3.2.conv1.module.weight + QuantizeLinear_860_quantize_scale_node + Conv_864 + Relu_866 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x128_stage3_warpsize1x4x1_g1_sptensor16x8x64_t1r1s1 Tactic: -7289760022626653388
[03/25/2022-13:25:00] [V] [TRT] Tactic: -7289760022626653388 Time: 0.05312
[03/25/2022-13:25:00] [V] [TRT] sections.3.2.conv1.module.weight + QuantizeLinear_860_quantize_scale_node + Conv_864 + Relu_866 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage1_warpsize2x4x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: -7106539943789766885
[03/25/2022-13:25:00] [V] [TRT] Tactic: -7106539943789766885 Time: 0.081664
[03/25/2022-13:25:00] [V] [TRT] sections.3.2.conv1.module.weight + QuantizeLinear_860_quantize_scale_node + Conv_864 + Relu_866 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage1_warpsize4x2x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: -6969478418607271266
[03/25/2022-13:25:00] [V] [TRT] Tactic: -6969478418607271266 Time: 0.08064
[03/25/2022-13:25:00] [V] [TRT] sections.3.2.conv1.module.weight + QuantizeLinear_860_quantize_scale_node + Conv_864 + Relu_866 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -6930438165437733000
[03/25/2022-13:25:00] [V] [TRT] Tactic: -6930438165437733000 Time: 0.119424
[03/25/2022-13:25:00] [V] [TRT] sections.3.2.conv1.module.weight + QuantizeLinear_860_quantize_scale_node + Conv_864 + Relu_866 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x64x64_stage3_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -6879607992933502380
[03/25/2022-13:25:00] [V] [TRT] Tactic: -6879607992933502380 Time: 0.066048
[03/25/2022-13:25:00] [V] [TRT] sections.3.2.conv1.module.weight + QuantizeLinear_860_quantize_scale_node + Conv_864 + Relu_866 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x128_stage3_warpsize1x2x1_g1_sptensor16x8x64_simple_t1r1s1_no_preds Tactic: -6839669803644810934
[03/25/2022-13:25:00] [V] [TRT] Tactic: -6839669803644810934 Time: 0.060672
[03/25/2022-13:25:00] [V] [TRT] sections.3.2.conv1.module.weight + QuantizeLinear_860_quantize_scale_node + Conv_864 + Relu_866 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x128_stage3_warpsize1x2x1_g1_sptensor16x8x64_t1r1s1_no_preds Tactic: -6812830108414456369
[03/25/2022-13:25:00] [V] [TRT] Tactic: -6812830108414456369 Time: 0.060928
[03/25/2022-13:25:00] [V] [TRT] sections.3.2.conv1.module.weight + QuantizeLinear_860_quantize_scale_node + Conv_864 + Relu_866 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x128_stage3_warpsize4x2x1_g1_sptensor16x8x64_simple_t1r1s1_no_preds Tactic: -6779804930216439173
[03/25/2022-13:25:00] [V] [TRT] Tactic: -6779804930216439173 Time: 0.036352
[03/25/2022-13:25:00] [V] [TRT] sections.3.2.conv1.module.weight + QuantizeLinear_860_quantize_scale_node + Conv_864 + Relu_866 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: -6527178416855951297
[03/25/2022-13:25:00] [V] [TRT] Tactic: -6527178416855951297 Time: 0.099712
[03/25/2022-13:25:00] [V] [TRT] sections.3.2.conv1.module.weight + QuantizeLinear_860_quantize_scale_node + Conv_864 + Relu_866 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_t1r1s1 Tactic: -6510232214299595844
[03/25/2022-13:25:00] [V] [TRT] Tactic: -6510232214299595844 Time: 0.099712
[03/25/2022-13:25:00] [V] [TRT] sections.3.2.conv1.module.weight + QuantizeLinear_860_quantize_scale_node + Conv_864 + Relu_866 Set Tactic Name: ampere_int8_i8816cudnn_int8_128x128_ldg16_relu_small_nt_v1 Tactic: -6400348606759295499
[03/25/2022-13:25:00] [V] [TRT] Tactic: -6400348606759295499 Time: 0.084608
[03/25/2022-13:25:00] [V] [TRT] sections.3.2.conv1.module.weight + QuantizeLinear_860_quantize_scale_node + Conv_864 + Relu_866 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage1_warpsize2x2x1_g1_tensor8x8x16_simple_t1r1s1 Tactic: -6346247605026339453
[03/25/2022-13:25:00] [V] [TRT] Tactic: -6346247605026339453 Time: 0.089216
[03/25/2022-13:25:00] [V] [TRT] sections.3.2.conv1.module.weight + QuantizeLinear_860_quantize_scale_node + Conv_864 + Relu_866 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x96x64_stage4_warpsize4x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -6232597026469067819
[03/25/2022-13:25:00] [V] [TRT] Tactic: -6232597026469067819 Time: 0.06784
[03/25/2022-13:25:00] [V] [TRT] sections.3.2.conv1.module.weight + QuantizeLinear_860_quantize_scale_node + Conv_864 + Relu_866 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x128_ldg16_relu_small_nt_v1 Tactic: -5980889159865208399
[03/25/2022-13:25:00] [V] [TRT] Tactic: -5980889159865208399 Time: 0.082816
[03/25/2022-13:25:00] [V] [TRT] sections.3.2.conv1.module.weight + QuantizeLinear_860_quantize_scale_node + Conv_864 + Relu_866 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_singleBuffer_medium_nt_v1 Tactic: -5766140806760372989
[03/25/2022-13:25:00] [V] [TRT] Tactic: -5766140806760372989 Time: 0.087296
[03/25/2022-13:25:00] [V] [TRT] sections.3.2.conv1.module.weight + QuantizeLinear_860_quantize_scale_node + Conv_864 + Relu_866 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r1s1 Tactic: -5697614955743334137
[03/25/2022-13:25:00] [V] [TRT] Tactic: -5697614955743334137 Time: 0.081792
[03/25/2022-13:25:00] [V] [TRT] sections.3.2.conv1.module.weight + QuantizeLinear_860_quantize_scale_node + Conv_864 + Relu_866 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -5671123121710113970
[03/25/2022-13:25:00] [V] [TRT] Tactic: -5671123121710113970 Time: 0.094976
[03/25/2022-13:25:00] [V] [TRT] sections.3.2.conv1.module.weight + QuantizeLinear_860_quantize_scale_node + Conv_864 + Relu_866 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x32x64_stage4_warpsize2x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -5615581362569252260
[03/25/2022-13:25:00] [V] [TRT] Tactic: -5615581362569252260 Time: 0.096512
[03/25/2022-13:25:00] [V] [TRT] sections.3.2.conv1.module.weight + QuantizeLinear_860_quantize_scale_node + Conv_864 + Relu_866 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x128x64_stage3_warpsize4x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -5562968047117507056
[03/25/2022-13:25:00] [V] [TRT] Tactic: -5562968047117507056 Time: 0.06656
[03/25/2022-13:25:00] [V] [TRT] sections.3.2.conv1.module.weight + QuantizeLinear_860_quantize_scale_node + Conv_864 + Relu_866 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -5516472881360101487
[03/25/2022-13:25:00] [V] [TRT] Tactic: -5516472881360101487 Time: 0.073472
[03/25/2022-13:25:00] [V] [TRT] sections.3.2.conv1.module.weight + QuantizeLinear_860_quantize_scale_node + Conv_864 + Relu_866 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r1s1 Tactic: -5311474420963248369
[03/25/2022-13:25:00] [V] [TRT] Tactic: -5311474420963248369 Time: 0.116992
[03/25/2022-13:25:00] [V] [TRT] sections.3.2.conv1.module.weight + QuantizeLinear_860_quantize_scale_node + Conv_864 + Relu_866 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: -5170003087447722174
[03/25/2022-13:25:00] [V] [TRT] Tactic: -5170003087447722174 Time: 0.097536
[03/25/2022-13:25:00] [V] [TRT] sections.3.2.conv1.module.weight + QuantizeLinear_860_quantize_scale_node + Conv_864 + Relu_866 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage6_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -4889586143772361690
[03/25/2022-13:25:01] [V] [TRT] Tactic: -4889586143772361690 Time: 0.060928
[03/25/2022-13:25:01] [V] [TRT] sections.3.2.conv1.module.weight + QuantizeLinear_860_quantize_scale_node + Conv_864 + Relu_866 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x128_stage3_warpsize1x4x1_g1_sptensor16x8x64_simple_t1r1s1_no_preds Tactic: -4889498558023475527
[03/25/2022-13:25:01] [V] [TRT] Tactic: -4889498558023475527 Time: 0.052352
[03/25/2022-13:25:01] [V] [TRT] sections.3.2.conv1.module.weight + QuantizeLinear_860_quantize_scale_node + Conv_864 + Relu_866 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: -4849712423393454704
[03/25/2022-13:25:01] [V] [TRT] Tactic: -4849712423393454704 Time: 0.061184
[03/25/2022-13:25:01] [V] [TRT] sections.3.2.conv1.module.weight + QuantizeLinear_860_quantize_scale_node + Conv_864 + Relu_866 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x128_stage3_warpsize4x2x1_g1_sptensor16x8x64_simple_t1r1s1 Tactic: -4681913707320020520
[03/25/2022-13:25:01] [V] [TRT] Tactic: -4681913707320020520 Time: 0.036736
[03/25/2022-13:25:01] [V] [TRT] sections.3.2.conv1.module.weight + QuantizeLinear_860_quantize_scale_node + Conv_864 + Relu_866 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_medium_nt_v1 Tactic: -4516822589357530549
[03/25/2022-13:25:01] [V] [TRT] Tactic: -4516822589357530549 Time: 0.087936
[03/25/2022-13:25:01] [V] [TRT] sections.3.2.conv1.module.weight + QuantizeLinear_860_quantize_scale_node + Conv_864 + Relu_866 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -4455415102719506646
[03/25/2022-13:25:01] [V] [TRT] Tactic: -4455415102719506646 Time: 0.072448
[03/25/2022-13:25:01] [V] [TRT] sections.3.2.conv1.module.weight + QuantizeLinear_860_quantize_scale_node + Conv_864 + Relu_866 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32_t1r1s1 Tactic: -4425346730823666456
[03/25/2022-13:25:01] [V] [TRT] Tactic: -4425346730823666456 Time: 0.083968
[03/25/2022-13:25:01] [V] [TRT] sections.3.2.conv1.module.weight + QuantizeLinear_860_quantize_scale_node + Conv_864 + Relu_866 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage1_warpsize2x1x1_g1_tensor8x8x16_t1r1s1 Tactic: -4260476497340370474
[03/25/2022-13:25:01] [V] [TRT] Tactic: -4260476497340370474 Time: 0.133632
[03/25/2022-13:25:01] [V] [TRT] sections.3.2.conv1.module.weight + QuantizeLinear_860_quantize_scale_node + Conv_864 + Relu_866 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x2x1_g1_tensor16x8x32_t1r1s1 Tactic: -4182501876984672402
[03/25/2022-13:25:01] [V] [TRT] Tactic: -4182501876984672402 Time: 0.093824
[03/25/2022-13:25:01] [V] [TRT] sections.3.2.conv1.module.weight + QuantizeLinear_860_quantize_scale_node + Conv_864 + Relu_866 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x256x64_stage3_warpsize2x4x1_g1_tensor16x8x32_t1r1s1 Tactic: -4151617293257698859
[03/25/2022-13:25:01] [V] [TRT] Tactic: -4151617293257698859 Time: 0.06848
[03/25/2022-13:25:01] [V] [TRT] sections.3.2.conv1.module.weight + QuantizeLinear_860_quantize_scale_node + Conv_864 + Relu_866 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x128_stage3_warpsize4x2x1_g1_sptensor16x8x64_t1r1s1_no_preds Tactic: -3862908719298381451
[03/25/2022-13:25:01] [V] [TRT] Tactic: -3862908719298381451 Time: 0.036992
[03/25/2022-13:25:01] [V] [TRT] sections.3.2.conv1.module.weight + QuantizeLinear_860_quantize_scale_node + Conv_864 + Relu_866 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -3825889760337461729
[03/25/2022-13:25:01] [V] [TRT] Tactic: -3825889760337461729 Time: 0.098176
[03/25/2022-13:25:01] [V] [TRT] sections.3.2.conv1.module.weight + QuantizeLinear_860_quantize_scale_node + Conv_864 + Relu_866 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -3797022944823726673
[03/25/2022-13:25:01] [V] [TRT] Tactic: -3797022944823726673 Time: 0.072192
[03/25/2022-13:25:01] [V] [TRT] sections.3.2.conv1.module.weight + QuantizeLinear_860_quantize_scale_node + Conv_864 + Relu_866 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_t1r1s1 Tactic: -3613322253849278738
[03/25/2022-13:25:01] [V] [TRT] Tactic: -3613322253849278738 Time: 0.120064
[03/25/2022-13:25:01] [V] [TRT] sections.3.2.conv1.module.weight + QuantizeLinear_860_quantize_scale_node + Conv_864 + Relu_866 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x32x64_stage4_warpsize4x1x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: -3577322188448771475
[03/25/2022-13:25:01] [V] [TRT] Tactic: -3577322188448771475 Time: 0.075648
[03/25/2022-13:25:01] [V] [TRT] sections.3.2.conv1.module.weight + QuantizeLinear_860_quantize_scale_node + Conv_864 + Relu_866 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage1_warpsize2x4x1_g1_tensor8x8x16_t1r1s1 Tactic: -3531681826488401618
[03/25/2022-13:25:01] [V] [TRT] Tactic: -3531681826488401618 Time: 0.08192
[03/25/2022-13:25:01] [V] [TRT] sections.3.2.conv1.module.weight + QuantizeLinear_860_quantize_scale_node + Conv_864 + Relu_866 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x16x128_stage3_warpsize1x1x1_g1_sptensor16x8x64_simple_t1r1s1_no_preds Tactic: -3305554949874552860
[03/25/2022-13:25:01] [V] [TRT] Tactic: -3305554949874552860 Time: 0.101888
[03/25/2022-13:25:01] [V] [TRT] sections.3.2.conv1.module.weight + QuantizeLinear_860_quantize_scale_node + Conv_864 + Relu_866 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x192x64_stage3_warpsize4x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -3288585994448820820
[03/25/2022-13:25:01] [V] [TRT] Tactic: -3288585994448820820 Time: 0.067968
[03/25/2022-13:25:01] [V] [TRT] sections.3.2.conv1.module.weight + QuantizeLinear_860_quantize_scale_node + Conv_864 + Relu_866 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x32x64_stage4_warpsize4x1x1_g1_tensor16x8x32_t1r1s1 Tactic: -2754311112012636251
[03/25/2022-13:25:01] [V] [TRT] Tactic: -2754311112012636251 Time: 0.076672
[03/25/2022-13:25:01] [V] [TRT] sections.3.2.conv1.module.weight + QuantizeLinear_860_quantize_scale_node + Conv_864 + Relu_866 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -2432868635536396215
[03/25/2022-13:25:01] [V] [TRT] Tactic: -2432868635536396215 Time: 0.074496
[03/25/2022-13:25:01] [V] [TRT] sections.3.2.conv1.module.weight + QuantizeLinear_860_quantize_scale_node + Conv_864 + Relu_866 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x96x64_stage5_warpsize2x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -2379804152300264660
[03/25/2022-13:25:01] [V] [TRT] Tactic: -2379804152300264660 Time: 0.102016
[03/25/2022-13:25:01] [V] [TRT] sections.3.2.conv1.module.weight + QuantizeLinear_860_quantize_scale_node + Conv_864 + Relu_866 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x128_stage3_warpsize2x2x1_g1_sptensor16x8x64_t1r1s1 Tactic: -2352253835013627337
[03/25/2022-13:25:01] [V] [TRT] Tactic: -2352253835013627337 Time: 0.039808
[03/25/2022-13:25:01] [V] [TRT] sections.3.2.conv1.module.weight + QuantizeLinear_860_quantize_scale_node + Conv_864 + Relu_866 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x96x64_stage4_warpsize4x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -2335587136911650799
[03/25/2022-13:25:01] [V] [TRT] Tactic: -2335587136911650799 Time: 0.065024
[03/25/2022-13:25:01] [V] [TRT] sections.3.2.conv1.module.weight + QuantizeLinear_860_quantize_scale_node + Conv_864 + Relu_866 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage4_warpsize2x4x1_g1_tensor16x8x32_t1r1s1 Tactic: -2315453944962430928
[03/25/2022-13:25:01] [V] [TRT] Tactic: -2315453944962430928 Time: 0.050176
[03/25/2022-13:25:01] [V] [TRT] sections.3.2.conv1.module.weight + QuantizeLinear_860_quantize_scale_node + Conv_864 + Relu_866 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x128_stage3_warpsize1x1x1_g1_sptensor16x8x64_simple_t1r1s1 Tactic: -2238364958919154661
[03/25/2022-13:25:01] [V] [TRT] Tactic: -2238364958919154661 Time: 0.076032
[03/25/2022-13:25:01] [V] [TRT] sections.3.2.conv1.module.weight + QuantizeLinear_860_quantize_scale_node + Conv_864 + Relu_866 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -1916483171117495388
[03/25/2022-13:25:01] [V] [TRT] Tactic: -1916483171117495388 Time: 0.06208
[03/25/2022-13:25:01] [V] [TRT] sections.3.2.conv1.module.weight + QuantizeLinear_860_quantize_scale_node + Conv_864 + Relu_866 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x16x128_stage3_warpsize1x1x1_g1_sptensor16x8x64_t1r1s1_no_preds Tactic: -1740762957710554518
[03/25/2022-13:25:01] [V] [TRT] Tactic: -1740762957710554518 Time: 0.101888
[03/25/2022-13:25:01] [V] [TRT] sections.3.2.conv1.module.weight + QuantizeLinear_860_quantize_scale_node + Conv_864 + Relu_866 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x96x64_stage4_warpsize4x2x1_g1_tensor16x8x32_t1r1s1 Tactic: -1549742793039499659
[03/25/2022-13:25:01] [V] [TRT] Tactic: -1549742793039499659 Time: 0.068352
[03/25/2022-13:25:01] [V] [TRT] sections.3.2.conv1.module.weight + QuantizeLinear_860_quantize_scale_node + Conv_864 + Relu_866 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: -1499578657823798783
[03/25/2022-13:25:01] [V] [TRT] Tactic: -1499578657823798783 Time: 0.073344
[03/25/2022-13:25:01] [V] [TRT] sections.3.2.conv1.module.weight + QuantizeLinear_860_quantize_scale_node + Conv_864 + Relu_866 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r1s1 Tactic: -1494157908358500249
[03/25/2022-13:25:01] [V] [TRT] Tactic: -1494157908358500249 Time: 0.093824
[03/25/2022-13:25:01] [V] [TRT] sections.3.2.conv1.module.weight + QuantizeLinear_860_quantize_scale_node + Conv_864 + Relu_866 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x4x1_g1_tensor16x8x32_t1r1s1 Tactic: -1328736756812546664
[03/25/2022-13:25:01] [V] [TRT] Tactic: -1328736756812546664 Time: 0.083072
[03/25/2022-13:25:01] [V] [TRT] sections.3.2.conv1.module.weight + QuantizeLinear_860_quantize_scale_node + Conv_864 + Relu_866 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x32x64_stage4_warpsize2x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -1006589727652607355
[03/25/2022-13:25:01] [V] [TRT] Tactic: -1006589727652607355 Time: 0.095744
[03/25/2022-13:25:01] [V] [TRT] sections.3.2.conv1.module.weight + QuantizeLinear_860_quantize_scale_node + Conv_864 + Relu_866 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: -713022856474991236
[03/25/2022-13:25:01] [V] [TRT] Tactic: -713022856474991236 Time: 0.119936
[03/25/2022-13:25:01] [V] [TRT] sections.3.2.conv1.module.weight + QuantizeLinear_860_quantize_scale_node + Conv_864 + Relu_866 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x128_stage3_warpsize1x1x1_g1_sptensor16x8x64_t1r1s1_no_preds Tactic: -619668460699260222
[03/25/2022-13:25:01] [V] [TRT] Tactic: -619668460699260222 Time: 0.075648
[03/25/2022-13:25:01] [V] [TRT] sections.3.2.conv1.module.weight + QuantizeLinear_860_quantize_scale_node + Conv_864 + Relu_866 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x64x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: -405554772060757402
[03/25/2022-13:25:01] [V] [TRT] Tactic: -405554772060757402 Time: 0.06912
[03/25/2022-13:25:01] [V] [TRT] sections.3.2.conv1.module.weight + QuantizeLinear_860_quantize_scale_node + Conv_864 + Relu_866 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: -375949437730908730
[03/25/2022-13:25:01] [V] [TRT] Tactic: -375949437730908730 Time: 0.059136
[03/25/2022-13:25:01] [V] [TRT] sections.3.2.conv1.module.weight + QuantizeLinear_860_quantize_scale_node + Conv_864 + Relu_866 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x128_stage3_warpsize1x2x1_g1_sptensor16x8x64_t1r1s1 Tactic: -233227833606287806
[03/25/2022-13:25:01] [V] [TRT] Tactic: -233227833606287806 Time: 0.061312
[03/25/2022-13:25:01] [V] [TRT] sections.3.2.conv1.module.weight + QuantizeLinear_860_quantize_scale_node + Conv_864 + Relu_866 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -111878368089469751
[03/25/2022-13:25:01] [V] [TRT] Tactic: -111878368089469751 Time: 0.080896
[03/25/2022-13:25:01] [V] [TRT] sections.3.2.conv1.module.weight + QuantizeLinear_860_quantize_scale_node + Conv_864 + Relu_866 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x64x64_stage3_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -48936598874722005
[03/25/2022-13:25:01] [V] [TRT] Tactic: -48936598874722005 Time: 0.058368
[03/25/2022-13:25:01] [V] [TRT] sections.3.2.conv1.module.weight + QuantizeLinear_860_quantize_scale_node + Conv_864 + Relu_866 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x32x64_stage4_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -19707840769375107
[03/25/2022-13:25:01] [V] [TRT] Tactic: -19707840769375107 Time: 0.075136
[03/25/2022-13:25:01] [V] [TRT] Fastest Tactic: -6779804930216439173 Time: 0.036352
[03/25/2022-13:25:01] [V] [TRT] >>>>>>>>>>>>>>> Chose Runner Type: CaskConvolution Tactic: -6779804930216439173
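The block above is the pattern that repeats for every layer in the network: for each format combination the builder times every candidate kernel ("tactic"), printing one "Set Tactic Name ... Tactic: <id>" line and one "Tactic: <id> Time: <ms>" line per candidate, then keeps the fastest. For the Conv_864 fusion the winner is the 2:4 sparse IMMA kernel (sptensor16x8x64, tactic -6779804930216439173) at 0.036352 ms, roughly 25% faster than the best dense tactic here (0.048256 ms), which is the payoff of building with --sparsity=enable on this A100. A minimal Python sketch for digesting such a log offline follows; it is illustrative only, not part of trtexec, and its regular expressions simply match the two-line format visible in this dump:

import re
import sys

# Each measurement appears as two consecutive verbose-log lines:
#   "[TRT] <layer> Set Tactic Name: <kernel> Tactic: <id>"
#   "[TRT] Tactic: <id> Time: <milliseconds>"
SET_RE = re.compile(r"\[TRT\] (.+?) Set Tactic Name: (\S+) Tactic: (-?\d+)")
TIME_RE = re.compile(r"\[TRT\] Tactic: (-?\d+) Time: ([0-9.]+)")

def fastest_tactics(log_path):
    """Map each layer to its fastest (kernel, tactic id, time in ms)."""
    pending = {}  # tactic id -> (layer, kernel), waiting for its Time line
    best = {}
    with open(log_path) as log:
        for line in log:
            set_match = SET_RE.search(line)
            if set_match:
                layer, kernel, tactic = set_match.groups()
                pending[tactic] = (layer, kernel)
                continue
            time_match = TIME_RE.search(line)
            if time_match and time_match.group(1) in pending:
                layer, kernel = pending.pop(time_match.group(1))
                ms = float(time_match.group(2))
                if layer not in best or ms < best[layer][2]:
                    best[layer] = (kernel, time_match.group(1), ms)
    return best

if __name__ == "__main__":
    # Usage: python fastest_tactics.py trtexec_verbose.log
    for layer, (kernel, tactic, ms) in fastest_tactics(sys.argv[1]).items():
        print(f"{ms:.6f} ms  tactic {tactic}  {kernel}  ({layer})")

Run against this build log, the sketch should report the sparse tilesize256x128x128 kernel as the fastest choice for the Conv_864 fusion, matching the "Fastest Tactic" line above. The log now continues with the same autotuning procedure for the next layer: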
[03/25/2022-13:25:01] [V] [TRT] =============== Computing costs for
[03/25/2022-13:25:01] [V] [TRT] *************** Autotuning format combination: Int8(6272,49:4,7,1) -> Int8(6272,49:4,7,1) ***************
[03/25/2022-13:25:01] [V] [TRT] *************** Autotuning format combination: Int8(6272,49:4,7,1) -> Int8(784,49:32,7,1) ***************
[03/25/2022-13:25:01] [V] [TRT] *************** Autotuning format combination: Int8(784,49:32,7,1) -> Int8(784,49:32,7,1) ***************
[03/25/2022-13:25:01] [V] [TRT] Skip timing cache hit (epiFadd mismatch) for node: sections.3.2.conv2.module.weight + QuantizeLinear_875_quantize_scale_node + Conv_879 + Relu_881, LayerImpl: CaskConvolution, tactic: 3651043333819148268
[03/25/2022-13:25:01] [V] [TRT] --------------- Timing Runner: sections.3.2.conv2.module.weight + QuantizeLinear_875_quantize_scale_node + Conv_879 + Relu_881 (CudaGroupConvolution)
[03/25/2022-13:25:01] [V] [TRT] CudaGroupConvolution has no valid tactics for this config, skipping
[03/25/2022-13:25:01] [V] [TRT] --------------- Timing Runner: sections.3.2.conv2.module.weight + QuantizeLinear_875_quantize_scale_node + Conv_879 + Relu_881 (CudaDepthwiseConvolution)
[03/25/2022-13:25:01] [V] [TRT] CudaDepthwiseConvolution has no valid tactics for this config, skipping
[03/25/2022-13:25:01] [V] [TRT] --------------- Timing Runner: sections.3.2.conv2.module.weight + QuantizeLinear_875_quantize_scale_node + Conv_879 + Relu_881 (FusedConvActConvolution)
[03/25/2022-13:25:01] [V] [TRT] FusedConvActConvolution has no valid tactics for this config, skipping
[03/25/2022-13:25:01] [V] [TRT] --------------- Timing Runner: sections.3.2.conv2.module.weight + QuantizeLinear_875_quantize_scale_node + Conv_879 + Relu_881 (CaskConvolution)
[03/25/2022-13:25:01] [V] [TRT] sections.3.2.conv2.module.weight + QuantizeLinear_875_quantize_scale_node + Conv_879 + Relu_881 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_epifadd Tactic: 177040020707947851
[03/25/2022-13:25:01] [V] [TRT] Tactic: 177040020707947851 Time: 0.180224
[03/25/2022-13:25:01] [V] [TRT] sections.3.2.conv2.module.weight + QuantizeLinear_875_quantize_scale_node + Conv_879 + Relu_881 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r3s3 Tactic: 184229963126259101
[03/25/2022-13:25:01] [V] [TRT] Tactic: 184229963126259101 Time: 0.13632
[03/25/2022-13:25:01] [V] [TRT] sections.3.2.conv2.module.weight + QuantizeLinear_875_quantize_scale_node + Conv_879 + Relu_881 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage1_warpsize4x2x1_g1_tensor8x8x16_t1r3s3 Tactic: 289888059097454627
[03/25/2022-13:25:01] [V] [TRT] Tactic: 289888059097454627 Time: 0.160768
[03/25/2022-13:25:01] [V] [TRT] sections.3.2.conv2.module.weight + QuantizeLinear_875_quantize_scale_node + Conv_879 + Relu_881 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32 Tactic: 328135613486708155
[03/25/2022-13:25:01] [V] [TRT] Tactic: 328135613486708155 Time: 0.283264
[03/25/2022-13:25:01] [V] [TRT] sections.3.2.conv2.module.weight + QuantizeLinear_875_quantize_scale_node + Conv_879 + Relu_881 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r3s3 Tactic: 680740992583869928
[03/25/2022-13:25:01] [V] [TRT] Tactic: 680740992583869928 Time: 0.17792
[03/25/2022-13:25:01] [V] [TRT] sections.3.2.conv2.module.weight + QuantizeLinear_875_quantize_scale_node + Conv_879 + Relu_881 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x192x64_stage3_warpsize4x2x1_g1_tensor16x8x32 Tactic: 1111159740952609683
[03/25/2022-13:25:01] [V] [TRT] Tactic: 1111159740952609683 Time: 0.13312
[03/25/2022-13:25:01] [V] [TRT] sections.3.2.conv2.module.weight + QuantizeLinear_875_quantize_scale_node + Conv_879 + Relu_881 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32 Tactic: 1134860903395928905
[03/25/2022-13:25:01] [V] [TRT] Tactic: 1134860903395928905 Time: 0.12544
[03/25/2022-13:25:01] [V] [TRT] sections.3.2.conv2.module.weight + QuantizeLinear_875_quantize_scale_node + Conv_879 + Relu_881 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32 Tactic: 1276591930377039442
[03/25/2022-13:25:01] [V] [TRT] Tactic: 1276591930377039442 Time: 0.138496
[03/25/2022-13:25:01] [V] [TRT] sections.3.2.conv2.module.weight + QuantizeLinear_875_quantize_scale_node + Conv_879 + Relu_881 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage1_warpsize2x1x1_g1_tensor8x8x16_t1r3s3 Tactic: 1388866374720163187
[03/25/2022-13:25:01] [V] [TRT] Tactic: 1388866374720163187 Time: 0.213248
[03/25/2022-13:25:01] [V] [TRT] sections.3.2.conv2.module.weight + QuantizeLinear_875_quantize_scale_node + Conv_879 + Relu_881 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x96x64_stage5_warpsize2x1x1_g1_tensor16x8x32 Tactic: 1399501420456320585
[03/25/2022-13:25:01] [V] [TRT] Tactic: 1399501420456320585 Time: 0.168192
[03/25/2022-13:25:01] [V] [TRT] sections.3.2.conv2.module.weight + QuantizeLinear_875_quantize_scale_node + Conv_879 + Relu_881 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_epifadd Tactic: 1550399266192842845
[03/25/2022-13:25:01] [V] [TRT] Tactic: 1550399266192842845 Time: 0.166656
[03/25/2022-13:25:01] [V] [TRT] sections.3.2.conv2.module.weight + QuantizeLinear_875_quantize_scale_node + Conv_879 + Relu_881 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x32x64_stage4_warpsize4x1x1_g1_tensor16x8x32_t1r3s3_epifadd Tactic: 1572887561103143487
[03/25/2022-13:25:01] [V] [TRT] Tactic: 1572887561103143487 Time: 0.1312
[03/25/2022-13:25:01] [V] [TRT] sections.3.2.conv2.module.weight + QuantizeLinear_875_quantize_scale_node + Conv_879 + Relu_881 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x96x64_stage5_warpsize2x1x1_g1_tensor16x8x32_t1r3s3 Tactic: 1853122447892949466
[03/25/2022-13:25:01] [V] [TRT] Tactic: 1853122447892949466 Time: 0.16704
[03/25/2022-13:25:01] [V] [TRT] sections.3.2.conv2.module.weight + QuantizeLinear_875_quantize_scale_node + Conv_879 + Relu_881 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage1_warpsize2x2x1_g1_tensor8x8x16 Tactic: 2133329569091732311
[03/25/2022-13:25:01] [V] [TRT] Tactic: 2133329569091732311 Time: 0.16576
[03/25/2022-13:25:01] [V] [TRT] sections.3.2.conv2.module.weight + QuantizeLinear_875_quantize_scale_node + Conv_879 + Relu_881 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage4_warpsize4x1x1_g1_tensor16x8x32 Tactic: 2325023763229477890
[03/25/2022-13:25:01] [V] [TRT] Tactic: 2325023763229477890 Time: 0.090368
[03/25/2022-13:25:01] [V] [TRT] sections.3.2.conv2.module.weight + QuantizeLinear_875_quantize_scale_node + Conv_879 + Relu_881 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16 Tactic: 2579824863892891529
[03/25/2022-13:25:01] [V] [TRT] Tactic: 2579824863892891529 Time: 0.233728
[03/25/2022-13:25:01] [V] [TRT] sections.3.2.conv2.module.weight + QuantizeLinear_875_quantize_scale_node + Conv_879 + Relu_881 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x64x64_stage3_warpsize4x1x1_g1_tensor16x8x32 Tactic: 2783960536172159663
[03/25/2022-13:25:01] [V] [TRT] Tactic: 2783960536172159663 Time: 0.105088
[03/25/2022-13:25:01] [V] [TRT] sections.3.2.conv2.module.weight + QuantizeLinear_875_quantize_scale_node + Conv_879 + Relu_881 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x2x1_g1_tensor16x8x32 Tactic: 2821711838552913693
[03/25/2022-13:25:01] [V] [TRT] Tactic: 2821711838552913693 Time: 0.147968
[03/25/2022-13:25:01] [V] [TRT] sections.3.2.conv2.module.weight + QuantizeLinear_875_quantize_scale_node + Conv_879 + Relu_881 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x128_stage3_warpsize1x2x1_g1_sptensor16x8x64 Tactic: 2945009978756227538
[03/25/2022-13:25:01] [V] [TRT] Tactic: 2945009978756227538 Time: 0.093696
[03/25/2022-13:25:01] [V] [TRT] sections.3.2.conv2.module.weight + QuantizeLinear_875_quantize_scale_node + Conv_879 + Relu_881 Set Tactic Name: ampere_int8_i8816cudnn_int8_128x128_ldg16_relu_medium_nt_v1 Tactic: 2985940154541537814
[03/25/2022-13:25:01] [V] [TRT] Tactic: 2985940154541537814 Time: 0.177792
[03/25/2022-13:25:01] [V] [TRT] sections.3.2.conv2.module.weight + QuantizeLinear_875_quantize_scale_node + Conv_879 + Relu_881 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_t1r3s3_epifadd Tactic: 3284282970967328046
[03/25/2022-13:25:02] [V] [TRT] Tactic: 3284282970967328046 Time: 0.197376
[03/25/2022-13:25:02] [V] [TRT] sections.3.2.conv2.module.weight + QuantizeLinear_875_quantize_scale_node + Conv_879 + Relu_881 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_t1r3s3 Tactic: 3401614690060226673
[03/25/2022-13:25:02] [V] [TRT] Tactic: 3401614690060226673 Time: 0.162688
[03/25/2022-13:25:02] [V] [TRT] sections.3.2.conv2.module.weight + QuantizeLinear_875_quantize_scale_node + Conv_879 + Relu_881 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32 Tactic: 3456719996792527006
[03/25/2022-13:25:02] [V] [TRT] Tactic: 3456719996792527006 Time: 0.148224
[03/25/2022-13:25:02] [V] [TRT] sections.3.2.conv2.module.weight + QuantizeLinear_875_quantize_scale_node + Conv_879 + Relu_881 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r3s3_epifadd Tactic: 3512426920013359699
[03/25/2022-13:25:02] [V] [TRT] Tactic: 3512426920013359699 Time: 0.128896
[03/25/2022-13:25:02] [V] [TRT] sections.3.2.conv2.module.weight + QuantizeLinear_875_quantize_scale_node + Conv_879 + Relu_881 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x128_stage3_warpsize4x2x1_g1_sptensor16x8x64 Tactic: 3651043333819148268
[03/25/2022-13:25:02] [V] [TRT] Tactic: 3651043333819148268 Time: 0.070784
[03/25/2022-13:25:02] [V] [TRT] sections.3.2.conv2.module.weight + QuantizeLinear_875_quantize_scale_node + Conv_879 + Relu_881 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x128_ldg16_relu_medium_nt_v1 Tactic: 3899284354987683408
[03/25/2022-13:25:02] [V] [TRT] Tactic: 3899284354987683408 Time: 0.169984
[03/25/2022-13:25:02] [V] [TRT] sections.3.2.conv2.module.weight + QuantizeLinear_875_quantize_scale_node + Conv_879 + Relu_881 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage6_warpsize2x2x1_g1_tensor16x8x32 Tactic: 4042202769383439184
[03/25/2022-13:25:02] [V] [TRT] Tactic: 4042202769383439184 Time: 0.099072
[03/25/2022-13:25:02] [V] [TRT] sections.3.2.conv2.module.weight + QuantizeLinear_875_quantize_scale_node + Conv_879 + Relu_881 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_large_nt_v1 Tactic: 4182625619810185112
[03/25/2022-13:25:02] [V] [TRT] Tactic: 4182625619810185112 Time: 0.195584
[03/25/2022-13:25:02] [V] [TRT] sections.3.2.conv2.module.weight + QuantizeLinear_875_quantize_scale_node + Conv_879 + Relu_881 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r3s3 Tactic: 4214794893922618058
[03/25/2022-13:25:02] [V] [TRT] Tactic: 4214794893922618058 Time: 0.164736
[03/25/2022-13:25:02] [V] [TRT] sections.3.2.conv2.module.weight + QuantizeLinear_875_quantize_scale_node + Conv_879 + Relu_881 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x32x64_stage4_warpsize4x1x1_g1_tensor16x8x32_epifadd Tactic: 4259547356717612415
[03/25/2022-13:25:02] [V] [TRT] Tactic: 4259547356717612415 Time: 0.133376
[03/25/2022-13:25:02] [V] [TRT] sections.3.2.conv2.module.weight + QuantizeLinear_875_quantize_scale_node + Conv_879 + Relu_881 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r3s3 Tactic: 4384868749799132354
[03/25/2022-13:25:02] [V] [TRT] Tactic: 4384868749799132354 Time:
0.233216 [03/25/2022-13:25:02] [V] [TRT] sections.3.2.conv2.module.weight + QuantizeLinear_875_quantize_scale_node + Conv_879 + Relu_881 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x128_stage3_warpsize2x2x1_g1_sptensor16x8x64 Tactic: 4414594337986714263 [03/25/2022-13:25:02] [V] [TRT] Tactic: 4414594337986714263 Time: 0.073088 [03/25/2022-13:25:02] [V] [TRT] sections.3.2.conv2.module.weight + QuantizeLinear_875_quantize_scale_node + Conv_879 + Relu_881 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_small_nt_v1 Tactic: 4717285412741024953 [03/25/2022-13:25:02] [V] [TRT] Tactic: 4717285412741024953 Time: 0.17088 [03/25/2022-13:25:02] [V] [TRT] sections.3.2.conv2.module.weight + QuantizeLinear_875_quantize_scale_node + Conv_879 + Relu_881 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage4_warpsize2x4x1_g1_tensor16x8x32 Tactic: 4734519122557206480 [03/25/2022-13:25:02] [V] [TRT] Tactic: 4734519122557206480 Time: 0.087424 [03/25/2022-13:25:02] [V] [TRT] sections.3.2.conv2.module.weight + QuantizeLinear_875_quantize_scale_node + Conv_879 + Relu_881 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32 Tactic: 4922297020351187339 [03/25/2022-13:25:02] [V] [TRT] Tactic: 4922297020351187339 Time: 0.10944 [03/25/2022-13:25:02] [V] [TRT] sections.3.2.conv2.module.weight + QuantizeLinear_875_quantize_scale_node + Conv_879 + Relu_881 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage1_warpsize2x4x1_g1_tensor8x8x16_t1r3s3 Tactic: 4931167631624420067 [03/25/2022-13:25:02] [V] [TRT] Tactic: 4931167631624420067 Time: 0.162304 [03/25/2022-13:25:02] [V] [TRT] sections.3.2.conv2.module.weight + QuantizeLinear_875_quantize_scale_node + Conv_879 + Relu_881 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage4_warpsize2x4x1_g1_tensor16x8x32_epifadd Tactic: 5121596860264626879 [03/25/2022-13:25:02] [V] [TRT] Tactic: 5121596860264626879 Time: 0.086016 [03/25/2022-13:25:02] [V] [TRT] sections.3.2.conv2.module.weight + QuantizeLinear_875_quantize_scale_node + Conv_879 + Relu_881 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_t1r3s3 Tactic: 5136656982162849059 [03/25/2022-13:25:02] [V] [TRT] Tactic: 5136656982162849059 Time: 0.19776 [03/25/2022-13:25:02] [V] [TRT] sections.3.2.conv2.module.weight + QuantizeLinear_875_quantize_scale_node + Conv_879 + Relu_881 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage6_warpsize2x2x1_g1_tensor16x8x32_epifadd Tactic: 5158259316594207439 [03/25/2022-13:25:02] [V] [TRT] Tactic: 5158259316594207439 Time: 0.099328 [03/25/2022-13:25:02] [V] [TRT] sections.3.2.conv2.module.weight + QuantizeLinear_875_quantize_scale_node + Conv_879 + Relu_881 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32_t1r3s3 Tactic: 5189825015507701541 [03/25/2022-13:25:02] [V] [TRT] 
Tactic: 5189825015507701541 Time: 0.312192 [03/25/2022-13:25:02] [V] [TRT] sections.3.2.conv2.module.weight + QuantizeLinear_875_quantize_scale_node + Conv_879 + Relu_881 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x32x64_stage4_warpsize2x1x1_g1_tensor16x8x32 Tactic: 5424417905073460656 [03/25/2022-13:25:02] [V] [TRT] Tactic: 5424417905073460656 Time: 0.147968 [03/25/2022-13:25:02] [V] [TRT] sections.3.2.conv2.module.weight + QuantizeLinear_875_quantize_scale_node + Conv_879 + Relu_881 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x256x64_stage3_warpsize2x4x1_g1_tensor16x8x32 Tactic: 5442043907221427810 [03/25/2022-13:25:02] [V] [TRT] Tactic: 5442043907221427810 Time: 0.129152 [03/25/2022-13:25:02] [V] [TRT] sections.3.2.conv2.module.weight + QuantizeLinear_875_quantize_scale_node + Conv_879 + Relu_881 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x256x64_stage3_warpsize2x4x1_g1_tensor16x8x32_t1r3s3 Tactic: 5544365258913999384 [03/25/2022-13:25:02] [V] [TRT] Tactic: 5544365258913999384 Time: 0.127104 [03/25/2022-13:25:02] [V] [TRT] sections.3.2.conv2.module.weight + QuantizeLinear_875_quantize_scale_node + Conv_879 + Relu_881 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage1_warpsize2x1x1_g1_tensor8x8x16_t1r3s3 Tactic: 5641967928706599451 [03/25/2022-13:25:02] [V] [TRT] Tactic: 5641967928706599451 Time: 0.269952 [03/25/2022-13:25:02] [V] [TRT] sections.3.2.conv2.module.weight + QuantizeLinear_875_quantize_scale_node + Conv_879 + Relu_881 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x2x1_g1_tensor16x8x32_t1r3s3 Tactic: 5721595115357140131 [03/25/2022-13:25:02] [V] [TRT] Tactic: 5721595115357140131 Time: 0.133888 [03/25/2022-13:25:02] [V] [TRT] sections.3.2.conv2.module.weight + QuantizeLinear_875_quantize_scale_node + Conv_879 + Relu_881 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage4_warpsize4x1x1_g1_tensor16x8x32_t1r3s3_epifadd Tactic: 5966973378912044513 [03/25/2022-13:25:02] [V] [TRT] Tactic: 5966973378912044513 Time: 0.09024 [03/25/2022-13:25:02] [V] [TRT] sections.3.2.conv2.module.weight + QuantizeLinear_875_quantize_scale_node + Conv_879 + Relu_881 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x32x64_stage4_warpsize4x1x1_g1_tensor16x8x32_t1r3s3 Tactic: 6004789655466615912 [03/25/2022-13:25:02] [V] [TRT] Tactic: 6004789655466615912 Time: 0.131456 [03/25/2022-13:25:02] [V] [TRT] sections.3.2.conv2.module.weight + QuantizeLinear_875_quantize_scale_node + Conv_879 + Relu_881 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage4_warpsize2x4x1_g1_tensor16x8x32_t1r3s3 Tactic: 6146901278630392829 [03/25/2022-13:25:02] [V] [TRT] Tactic: 6146901278630392829 Time: 0.087936 [03/25/2022-13:25:02] [V] [TRT] sections.3.2.conv2.module.weight + QuantizeLinear_875_quantize_scale_node + Conv_879 + Relu_881 Set Tactic Name: 
sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage1_warpsize2x1x1_g1_tensor8x8x16 Tactic: 6394572396369862482 [03/25/2022-13:25:02] [V] [TRT] Tactic: 6394572396369862482 Time: 0.270848 [03/25/2022-13:25:02] [V] [TRT] sections.3.2.conv2.module.weight + QuantizeLinear_875_quantize_scale_node + Conv_879 + Relu_881 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage4_warpsize4x2x1_g1_tensor16x8x32_epifadd Tactic: 6434020722187266170 [03/25/2022-13:25:02] [V] [TRT] Tactic: 6434020722187266170 Time: 0.086016 [03/25/2022-13:25:02] [V] [TRT] sections.3.2.conv2.module.weight + QuantizeLinear_875_quantize_scale_node + Conv_879 + Relu_881 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32 Tactic: 6781129591847482048 [03/25/2022-13:25:02] [V] [TRT] Tactic: 6781129591847482048 Time: 0.1056 [03/25/2022-13:25:02] [V] [TRT] sections.3.2.conv2.module.weight + QuantizeLinear_875_quantize_scale_node + Conv_879 + Relu_881 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x96x64_stage4_warpsize4x2x1_g1_tensor16x8x32_t1r3s3 Tactic: 6984451771200230840 [03/25/2022-13:25:02] [V] [TRT] Tactic: 6984451771200230840 Time: 0.12992 [03/25/2022-13:25:02] [V] [TRT] sections.3.2.conv2.module.weight + QuantizeLinear_875_quantize_scale_node + Conv_879 + Relu_881 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r3s3 Tactic: 7048234086361926570 [03/25/2022-13:25:02] [V] [TRT] Tactic: 7048234086361926570 Time: 0.188416 [03/25/2022-13:25:02] [V] [TRT] sections.3.2.conv2.module.weight + QuantizeLinear_875_quantize_scale_node + Conv_879 + Relu_881 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x128x64_stage3_warpsize4x2x1_g1_tensor16x8x32 Tactic: 7077570591813340966 [03/25/2022-13:25:02] [V] [TRT] Tactic: 7077570591813340966 Time: 0.126208 [03/25/2022-13:25:02] [V] [TRT] sections.3.2.conv2.module.weight + QuantizeLinear_875_quantize_scale_node + Conv_879 + Relu_881 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r3s3_epifadd Tactic: 7191893591576074000 [03/25/2022-13:25:02] [V] [TRT] Tactic: 7191893591576074000 Time: 0.164992 [03/25/2022-13:25:02] [V] [TRT] sections.3.2.conv2.module.weight + QuantizeLinear_875_quantize_scale_node + Conv_879 + Relu_881 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x4x1_g1_tensor16x8x32_t1r3s3 Tactic: 7429976449747682901 [03/25/2022-13:25:02] [V] [TRT] Tactic: 7429976449747682901 Time: 0.113664 [03/25/2022-13:25:02] [V] [TRT] sections.3.2.conv2.module.weight + QuantizeLinear_875_quantize_scale_node + Conv_879 + Relu_881 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r3s3_epifadd Tactic: 7438984192263206338 [03/25/2022-13:25:02] [V] [TRT] Tactic: 7438984192263206338 Time: 0.098432 
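
Each "Set Tactic Name" / "Time" pair above is the builder timing one candidate kernel for the fused node sections.3.2.conv2.module.weight + QuantizeLinear_875_quantize_scale_node + Conv_879 + Relu_881; names containing "sptensor" are Ampere structured-sparsity kernels competing against the dense "tensor16x8x32" and "tensor8x8x16" ones. For reference, a minimal sketch of the builder-API setup that corresponds to this build (TensorRT 8.2 Python bindings assumed, error handling and any optimization profile omitted; the ONNX file name is the one this log was produced from):

    import tensorrt as trt

    logger = trt.Logger(trt.Logger.VERBOSE)   # VERBOSE is what emits these tactic lines
    builder = trt.Builder(logger)
    network = builder.create_network(
        1 << int(trt.NetworkDefinitionCreationFlag.EXPLICIT_BATCH))
    parser = trt.OnnxParser(network, logger)
    with open("resnet50_quant_sparse.onnx", "rb") as f:
        parser.parse(f.read())                # Q/DQ nodes carry the INT8 scales

    config = builder.create_builder_config()
    config.set_flag(trt.BuilderFlag.INT8)             # Precision: FP32+INT8
    config.set_flag(trt.BuilderFlag.SPARSE_WEIGHTS)   # Sparsity: Enabled
    config.max_workspace_size = 16 << 20              # 16 MiB, as in this build
    engine_bytes = builder.build_serialized_network(network, config)
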
[03/25/2022-13:25:02] [V] [TRT] sections.3.2.conv2.module.weight + QuantizeLinear_875_quantize_scale_node + Conv_879 + Relu_881 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32_epifadd Tactic: 7504901284678552178 [03/25/2022-13:25:02] [V] [TRT] Tactic: 7504901284678552178 Time: 0.087936 [03/25/2022-13:25:02] [V] [TRT] sections.3.2.conv2.module.weight + QuantizeLinear_875_quantize_scale_node + Conv_879 + Relu_881 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r3s3 Tactic: 8096257414008860171 [03/25/2022-13:25:02] [V] [TRT] Tactic: 8096257414008860171 Time: 0.103936 [03/25/2022-13:25:02] [V] [TRT] sections.3.2.conv2.module.weight + QuantizeLinear_875_quantize_scale_node + Conv_879 + Relu_881 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x64x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r3s3 Tactic: 8128112048355596715 [03/25/2022-13:25:02] [V] [TRT] Tactic: 8128112048355596715 Time: 0.10496 [03/25/2022-13:25:02] [V] [TRT] sections.3.2.conv2.module.weight + QuantizeLinear_875_quantize_scale_node + Conv_879 + Relu_881 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_indexed_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32_epifadd Tactic: 8751622450593766232 [03/25/2022-13:25:02] [V] [TRT] Tactic: 8751622450593766232 Time: 0.096256 [03/25/2022-13:25:02] [V] [TRT] sections.3.2.conv2.module.weight + QuantizeLinear_875_quantize_scale_node + Conv_879 + Relu_881 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_indexed_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32 Tactic: 9064458886956700976 [03/25/2022-13:25:02] [V] [TRT] Tactic: 9064458886956700976 Time: 0.09664 [03/25/2022-13:25:02] [V] [TRT] sections.3.2.conv2.module.weight + QuantizeLinear_875_quantize_scale_node + Conv_879 + Relu_881 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_t1r3s3_epifadd Tactic: 9143438935315839085 [03/25/2022-13:25:02] [V] [TRT] Tactic: 9143438935315839085 Time: 0.162816 [03/25/2022-13:25:02] [V] [TRT] sections.3.2.conv2.module.weight + QuantizeLinear_875_quantize_scale_node + Conv_879 + Relu_881 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage4_warpsize4x2x1_g1_tensor16x8x32 Tactic: -9165697322068360861 [03/25/2022-13:25:02] [V] [TRT] Tactic: -9165697322068360861 Time: 0.087936 [03/25/2022-13:25:02] [V] [TRT] sections.3.2.conv2.module.weight + QuantizeLinear_875_quantize_scale_node + Conv_879 + Relu_881 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_singleBuffer_small_nt_v1 Tactic: -9118785798277698619 [03/25/2022-13:25:02] [V] [TRT] Tactic: -9118785798277698619 Time: 0.165504 [03/25/2022-13:25:02] [V] [TRT] sections.3.2.conv2.module.weight + QuantizeLinear_875_quantize_scale_node + Conv_879 + Relu_881 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16 Tactic: -9108166971364503411 
[03/25/2022-13:25:02] [V] [TRT] Tactic: -9108166971364503411 Time: 0.190464 [03/25/2022-13:25:02] [V] [TRT] sections.3.2.conv2.module.weight + QuantizeLinear_875_quantize_scale_node + Conv_879 + Relu_881 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage1_warpsize4x2x1_g1_tensor8x8x16 Tactic: -8861822316054763526 [03/25/2022-13:25:02] [V] [TRT] Tactic: -8861822316054763526 Time: 0.162304 [03/25/2022-13:25:02] [V] [TRT] sections.3.2.conv2.module.weight + QuantizeLinear_875_quantize_scale_node + Conv_879 + Relu_881 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x128_stage3_warpsize1x1x1_g1_sptensor16x8x64 Tactic: -8791277710877987710 [03/25/2022-13:25:02] [V] [TRT] Tactic: -8791277710877987710 Time: 0.11584 [03/25/2022-13:25:02] [V] [TRT] sections.3.2.conv2.module.weight + QuantizeLinear_875_quantize_scale_node + Conv_879 + Relu_881 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x128x64_stage3_warpsize2x2x1_g1_tensor16x8x32 Tactic: -8691377209893505057 [03/25/2022-13:25:02] [V] [TRT] Tactic: -8691377209893505057 Time: 0.098304 [03/25/2022-13:25:02] [V] [TRT] sections.3.2.conv2.module.weight + QuantizeLinear_875_quantize_scale_node + Conv_879 + Relu_881 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32 Tactic: -8520292213102999339 [03/25/2022-13:25:02] [V] [TRT] Tactic: -8520292213102999339 Time: 0.112768 [03/25/2022-13:25:02] [V] [TRT] sections.3.2.conv2.module.weight + QuantizeLinear_875_quantize_scale_node + Conv_879 + Relu_881 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage1_warpsize4x1x1_g1_tensor8x8x16_t1r3s3 Tactic: -8475551154769412306 [03/25/2022-13:25:02] [V] [TRT] Tactic: -8475551154769412306 Time: 0.169856 [03/25/2022-13:25:02] [V] [TRT] sections.3.2.conv2.module.weight + QuantizeLinear_875_quantize_scale_node + Conv_879 + Relu_881 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x192x64_stage3_warpsize4x2x1_g1_tensor16x8x32_t1r3s3 Tactic: -8417388128970254446 [03/25/2022-13:25:02] [V] [TRT] Tactic: -8417388128970254446 Time: 0.132096 [03/25/2022-13:25:02] [V] [TRT] sections.3.2.conv2.module.weight + QuantizeLinear_875_quantize_scale_node + Conv_879 + Relu_881 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32 Tactic: -8263994888336646547 [03/25/2022-13:25:02] [V] [TRT] Tactic: -8263994888336646547 Time: 0.088064 [03/25/2022-13:25:02] [V] [TRT] sections.3.2.conv2.module.weight + QuantizeLinear_875_quantize_scale_node + Conv_879 + Relu_881 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32 Tactic: -8205948405243401049 [03/25/2022-13:25:02] [V] [TRT] Tactic: -8205948405243401049 Time: 0.166912 [03/25/2022-13:25:02] [V] [TRT] sections.3.2.conv2.module.weight + QuantizeLinear_875_quantize_scale_node + Conv_879 + Relu_881 Set Tactic Name: 
sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r3s3_epifadd Tactic: -7992068592656168418 [03/25/2022-13:25:02] [V] [TRT] Tactic: -7992068592656168418 Time: 0.103552 [03/25/2022-13:25:02] [V] [TRT] sections.3.2.conv2.module.weight + QuantizeLinear_875_quantize_scale_node + Conv_879 + Relu_881 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x32x64_stage4_warpsize2x1x1_g1_tensor16x8x32_t1r3s3 Tactic: -7898477046581738867 [03/25/2022-13:25:02] [V] [TRT] Tactic: -7898477046581738867 Time: 0.14528 [03/25/2022-13:25:02] [V] [TRT] sections.3.2.conv2.module.weight + QuantizeLinear_875_quantize_scale_node + Conv_879 + Relu_881 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage4_warpsize4x1x1_g1_tensor16x8x32_epifadd Tactic: -7842775553137511386 [03/25/2022-13:25:02] [V] [TRT] Tactic: -7842775553137511386 Time: 0.0896 [03/25/2022-13:25:02] [V] [TRT] sections.3.2.conv2.module.weight + QuantizeLinear_875_quantize_scale_node + Conv_879 + Relu_881 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32 Tactic: -7683887278997527517 [03/25/2022-13:25:02] [V] [TRT] Tactic: -7683887278997527517 Time: 0.161152 [03/25/2022-13:25:02] [V] [TRT] sections.3.2.conv2.module.weight + QuantizeLinear_875_quantize_scale_node + Conv_879 + Relu_881 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x64x64_stage3_warpsize2x2x1_g1_tensor16x8x32 Tactic: -7381370635708568663 [03/25/2022-13:25:02] [V] [TRT] Tactic: -7381370635708568663 Time: 0.106368 [03/25/2022-13:25:02] [V] [TRT] sections.3.2.conv2.module.weight + QuantizeLinear_875_quantize_scale_node + Conv_879 + Relu_881 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x96x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r3s3 Tactic: -7129320389887881029 [03/25/2022-13:25:02] [V] [TRT] Tactic: -7129320389887881029 Time: 0.122496 [03/25/2022-13:25:02] [V] [TRT] sections.3.2.conv2.module.weight + QuantizeLinear_875_quantize_scale_node + Conv_879 + Relu_881 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r3s3 Tactic: -6959995514028471820 [03/25/2022-13:25:02] [V] [TRT] Tactic: -6959995514028471820 Time: 0.139904 [03/25/2022-13:25:02] [V] [TRT] sections.3.2.conv2.module.weight + QuantizeLinear_875_quantize_scale_node + Conv_879 + Relu_881 Set Tactic Name: ampere_int8_i8816cudnn_int8_128x128_ldg16_relu_small_nt_v1 Tactic: -6400348606759295499 [03/25/2022-13:25:02] [V] [TRT] Tactic: -6400348606759295499 Time: 0.167936 [03/25/2022-13:25:02] [V] [TRT] sections.3.2.conv2.module.weight + QuantizeLinear_875_quantize_scale_node + Conv_879 + Relu_881 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16_t1r3s3 Tactic: -6371781333659293809 [03/25/2022-13:25:02] [V] [TRT] Tactic: -6371781333659293809 Time: 0.173568 [03/25/2022-13:25:02] [V] [TRT] sections.3.2.conv2.module.weight + 
QuantizeLinear_875_quantize_scale_node + Conv_879 + Relu_881 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x96x64_stage4_warpsize4x2x1_g1_tensor16x8x32 Tactic: -6256128573036943404 [03/25/2022-13:25:02] [V] [TRT] Tactic: -6256128573036943404 Time: 0.127872 [03/25/2022-13:25:02] [V] [TRT] sections.3.2.conv2.module.weight + QuantizeLinear_875_quantize_scale_node + Conv_879 + Relu_881 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x128_ldg16_relu_small_nt_v1 Tactic: -5980889159865208399 [03/25/2022-13:25:02] [V] [TRT] Tactic: -5980889159865208399 Time: 0.164864 [03/25/2022-13:25:02] [V] [TRT] sections.3.2.conv2.module.weight + QuantizeLinear_875_quantize_scale_node + Conv_879 + Relu_881 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_singleBuffer_medium_nt_v1 Tactic: -5766140806760372989 [03/25/2022-13:25:02] [V] [TRT] Tactic: -5766140806760372989 Time: 0.190592 [03/25/2022-13:25:02] [V] [TRT] sections.3.2.conv2.module.weight + QuantizeLinear_875_quantize_scale_node + Conv_879 + Relu_881 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32_t1r3s3_epifadd Tactic: -5709079507616090666 [03/25/2022-13:25:02] [V] [TRT] Tactic: -5709079507616090666 Time: 0.087424 [03/25/2022-13:25:02] [V] [TRT] sections.3.2.conv2.module.weight + QuantizeLinear_875_quantize_scale_node + Conv_879 + Relu_881 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage4_warpsize2x4x1_g1_tensor16x8x32_t1r3s3_epifadd Tactic: -5698636014239116282 [03/25/2022-13:25:02] [V] [TRT] Tactic: -5698636014239116282 Time: 0.086016 [03/25/2022-13:25:02] [V] [TRT] sections.3.2.conv2.module.weight + QuantizeLinear_875_quantize_scale_node + Conv_879 + Relu_881 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage1_warpsize4x1x1_g1_tensor8x8x16 Tactic: -5180570335464125033 [03/25/2022-13:25:02] [V] [TRT] Tactic: -5180570335464125033 Time: 0.175232 [03/25/2022-13:25:02] [V] [TRT] sections.3.2.conv2.module.weight + QuantizeLinear_875_quantize_scale_node + Conv_879 + Relu_881 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r3s3 Tactic: -4933563390723451692 [03/25/2022-13:25:02] [V] [TRT] Tactic: -4933563390723451692 Time: 0.129792 [03/25/2022-13:25:02] [V] [TRT] sections.3.2.conv2.module.weight + QuantizeLinear_875_quantize_scale_node + Conv_879 + Relu_881 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_medium_nt_v1 Tactic: -4516822589357530549 [03/25/2022-13:25:02] [V] [TRT] Tactic: -4516822589357530549 Time: 0.1856 [03/25/2022-13:25:02] [V] [TRT] sections.3.2.conv2.module.weight + QuantizeLinear_875_quantize_scale_node + Conv_879 + Relu_881 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x16x128_stage3_warpsize1x1x1_g1_sptensor16x8x64 Tactic: -4232916483289779353 [03/25/2022-13:25:02] [V] [TRT] Tactic: -4232916483289779353 Time: 0.183936 [03/25/2022-13:25:02] [V] [TRT] sections.3.2.conv2.module.weight + QuantizeLinear_875_quantize_scale_node + Conv_879 + Relu_881 Set Tactic Name: 
sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x128_stage3_warpsize1x4x1_g1_sptensor16x8x64 Tactic: -3460842194336717186 [03/25/2022-13:25:02] [V] [TRT] Tactic: -3460842194336717186 Time: 0.095232 [03/25/2022-13:25:02] [V] [TRT] sections.3.2.conv2.module.weight + QuantizeLinear_875_quantize_scale_node + Conv_879 + Relu_881 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage4_warpsize4x2x1_g1_tensor16x8x32_t1r3s3_epifadd Tactic: -3413217501222406256 [03/25/2022-13:25:02] [V] [TRT] Tactic: -3413217501222406256 Time: 0.084992 [03/25/2022-13:25:02] [V] [TRT] sections.3.2.conv2.module.weight + QuantizeLinear_875_quantize_scale_node + Conv_879 + Relu_881 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x96x64_stage3_warpsize4x1x1_g1_tensor16x8x32_t1r3s3 Tactic: -3280888557222886418 [03/25/2022-13:25:02] [V] [TRT] Tactic: -3280888557222886418 Time: 0.112 [03/25/2022-13:25:02] [V] [TRT] sections.3.2.conv2.module.weight + QuantizeLinear_875_quantize_scale_node + Conv_879 + Relu_881 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r3s3 Tactic: -3238475748440751107 [03/25/2022-13:25:02] [V] [TRT] Tactic: -3238475748440751107 Time: 0.098944 [03/25/2022-13:25:02] [V] [TRT] sections.3.2.conv2.module.weight + QuantizeLinear_875_quantize_scale_node + Conv_879 + Relu_881 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage4_warpsize4x1x1_g1_tensor16x8x32_t1r3s3 Tactic: -3182884991006484042 [03/25/2022-13:25:02] [V] [TRT] Tactic: -3182884991006484042 Time: 0.090496 [03/25/2022-13:25:02] [V] [TRT] sections.3.2.conv2.module.weight + QuantizeLinear_875_quantize_scale_node + Conv_879 + Relu_881 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r3s3 Tactic: -3173468756112541306 [03/25/2022-13:25:02] [V] [TRT] Tactic: -3173468756112541306 Time: 0.165504 [03/25/2022-13:25:02] [V] [TRT] sections.3.2.conv2.module.weight + QuantizeLinear_875_quantize_scale_node + Conv_879 + Relu_881 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x128_ldg16_relu_large_nt_v1 Tactic: -2917455979290586480 [03/25/2022-13:25:02] [V] [TRT] Tactic: -2917455979290586480 Time: 0.169728 [03/25/2022-13:25:02] [V] [TRT] sections.3.2.conv2.module.weight + QuantizeLinear_875_quantize_scale_node + Conv_879 + Relu_881 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x128x64_stage3_warpsize4x2x1_g1_tensor16x8x32_t1r3s3 Tactic: -2741641298163591508 [03/25/2022-13:25:02] [V] [TRT] Tactic: -2741641298163591508 Time: 0.125824 [03/25/2022-13:25:02] [V] [TRT] sections.3.2.conv2.module.weight + QuantizeLinear_875_quantize_scale_node + Conv_879 + Relu_881 Set Tactic Name: ampere_int8_i8816cudnn_int8_128x128_ldg16_relu_large_nt_v1 Tactic: -2571022005763160364 [03/25/2022-13:25:02] [V] [TRT] Tactic: -2571022005763160364 Time: 0.181888 [03/25/2022-13:25:02] [V] [TRT] sections.3.2.conv2.module.weight + QuantizeLinear_875_quantize_scale_node + Conv_879 + Relu_881 Set Tactic Name: 
sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage1_warpsize2x2x1_g1_tensor8x8x16 Tactic: -2499089240293650188 [03/25/2022-13:25:02] [V] [TRT] Tactic: -2499089240293650188 Time: 0.178304 [03/25/2022-13:25:02] [V] [TRT] sections.3.2.conv2.module.weight + QuantizeLinear_875_quantize_scale_node + Conv_879 + Relu_881 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage1_warpsize2x2x1_g1_tensor8x8x16 Tactic: -2328318099174473157 [03/25/2022-13:25:02] [V] [TRT] Tactic: -2328318099174473157 Time: 0.180736 [03/25/2022-13:25:02] [V] [TRT] sections.3.2.conv2.module.weight + QuantizeLinear_875_quantize_scale_node + Conv_879 + Relu_881 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_epifadd Tactic: -2083778562631872334 [03/25/2022-13:25:02] [V] [TRT] Tactic: -2083778562631872334 Time: 0.104832 [03/25/2022-13:25:02] [V] [TRT] sections.3.2.conv2.module.weight + QuantizeLinear_875_quantize_scale_node + Conv_879 + Relu_881 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize48x128x64_stage3_warpsize1x4x1_g1_tensor16x8x32 Tactic: -2054375205435666404 [03/25/2022-13:25:02] [V] [TRT] Tactic: -2054375205435666404 Time: 0.121088 [03/25/2022-13:25:02] [V] [TRT] sections.3.2.conv2.module.weight + QuantizeLinear_875_quantize_scale_node + Conv_879 + Relu_881 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32_t1r3s3 Tactic: -1546787387293556842 [03/25/2022-13:25:02] [V] [TRT] Tactic: -1546787387293556842 Time: 0.08832 [03/25/2022-13:25:02] [V] [TRT] sections.3.2.conv2.module.weight + QuantizeLinear_875_quantize_scale_node + Conv_879 + Relu_881 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x32x64_stage4_warpsize4x1x1_g1_tensor16x8x32 Tactic: -1498626619443284096 [03/25/2022-13:25:02] [V] [TRT] Tactic: -1498626619443284096 Time: 0.133888 [03/25/2022-13:25:02] [V] [TRT] sections.3.2.conv2.module.weight + QuantizeLinear_875_quantize_scale_node + Conv_879 + Relu_881 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x192x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r3s3 Tactic: -1471245223605064669 [03/25/2022-13:25:02] [V] [TRT] Tactic: -1471245223605064669 Time: 0.102016 [03/25/2022-13:25:02] [V] [TRT] sections.3.2.conv2.module.weight + QuantizeLinear_875_quantize_scale_node + Conv_879 + Relu_881 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32 Tactic: -1283580231568512025 [03/25/2022-13:25:03] [V] [TRT] Tactic: -1283580231568512025 Time: 0.200448 [03/25/2022-13:25:03] [V] [TRT] sections.3.2.conv2.module.weight + QuantizeLinear_875_quantize_scale_node + Conv_879 + Relu_881 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize192x64x64_stage3_warpsize4x1x1_g1_tensor16x8x32_t1r3s3 Tactic: -1224421172675151280 [03/25/2022-13:25:03] [V] [TRT] Tactic: -1224421172675151280 Time: 0.10432 
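
Note the spread in this sweep: the structured-sparse kernels ("sparse_conv ... sptensor16x8x64" in the name) are the ones exploiting the 2:4 weight pattern, and the Fastest Tactic line just below settles on the sparse tilesize256x128x128 kernel at 0.070784 ms, about 1.2x faster than the best dense candidate (0.084992 ms). The 2:4 constraint itself is simple: at most two nonzeros in every group of four consecutive weights along the dimension the hardware compresses. An illustrative check (NumPy assumed; a 2-D matrix with the compressed dimension last, its length a multiple of 4):

    import numpy as np

    def is_2to4_sparse(w: np.ndarray) -> bool:
        # Groups of 4 along the last axis; row-major reshape keeps groups
        # inside a row because the last-axis length is a multiple of 4.
        groups = w.reshape(-1, 4)
        return bool((np.count_nonzero(groups, axis=1) <= 2).all())
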
[03/25/2022-13:25:03] [V] [TRT] sections.3.2.conv2.module.weight + QuantizeLinear_875_quantize_scale_node + Conv_879 + Relu_881 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_epifadd Tactic: -1173968681844185579 [03/25/2022-13:25:03] [V] [TRT] Tactic: -1173968681844185579 Time: 0.200064 [03/25/2022-13:25:03] [V] [TRT] sections.3.2.conv2.module.weight + QuantizeLinear_875_quantize_scale_node + Conv_879 + Relu_881 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x128x64_stage3_warpsize2x2x1_g1_tensor16x8x32_t1r3s3 Tactic: -921247911551089037 [03/25/2022-13:25:03] [V] [TRT] Tactic: -921247911551089037 Time: 0.099072 [03/25/2022-13:25:03] [V] [TRT] sections.3.2.conv2.module.weight + QuantizeLinear_875_quantize_scale_node + Conv_879 + Relu_881 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32 Tactic: -762222380308749469 [03/25/2022-13:25:03] [V] [TRT] Tactic: -762222380308749469 Time: 0.129792 [03/25/2022-13:25:03] [V] [TRT] sections.3.2.conv2.module.weight + QuantizeLinear_875_quantize_scale_node + Conv_879 + Relu_881 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_epifadd Tactic: -556794153877490941 [03/25/2022-13:25:03] [V] [TRT] Tactic: -556794153877490941 Time: 0.131072 [03/25/2022-13:25:03] [V] [TRT] sections.3.2.conv2.module.weight + QuantizeLinear_875_quantize_scale_node + Conv_879 + Relu_881 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage4_warpsize4x2x1_g1_tensor16x8x32_t1r3s3 Tactic: -516725800067794372 [03/25/2022-13:25:03] [V] [TRT] Tactic: -516725800067794372 Time: 0.08704 [03/25/2022-13:25:03] [V] [TRT] sections.3.2.conv2.module.weight + QuantizeLinear_875_quantize_scale_node + Conv_879 + Relu_881 Set Tactic Name: ampere_int8_i8816cudnn_int8_256x64_ldg16_relu_singleBuffer_large_nt_v1 Tactic: -428104331444385564 [03/25/2022-13:25:03] [V] [TRT] Tactic: -428104331444385564 Time: 0.19072 [03/25/2022-13:25:03] [V] [TRT] sections.3.2.conv2.module.weight + QuantizeLinear_875_quantize_scale_node + Conv_879 + Relu_881 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage1_warpsize2x1x1_g1_tensor8x8x16 Tactic: -366411318217594794 [03/25/2022-13:25:03] [V] [TRT] Tactic: -366411318217594794 Time: 0.215552 [03/25/2022-13:25:03] [V] [TRT] sections.3.2.conv2.module.weight + QuantizeLinear_875_quantize_scale_node + Conv_879 + Relu_881 Set Tactic Name: sm75_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage1_warpsize2x4x1_g1_tensor8x8x16 Tactic: -351548418071036983 [03/25/2022-13:25:03] [V] [TRT] Tactic: -351548418071036983 Time: 0.162688 [03/25/2022-13:25:03] [V] [TRT] Fastest Tactic: 3651043333819148268 Time: 0.070784 [03/25/2022-13:25:03] [V] [TRT] >>>>>>>>>>>>>>> Chose Runner Type: CaskConvolution Tactic: 3651043333819148268 [03/25/2022-13:25:03] [V] [TRT] =============== Computing costs for [03/25/2022-13:25:03] [V] [TRT] *************** Autotuning format combination: Int8(6272,49:4,7,1), 
Float(100352,49,7,1) -> Float(100352,49,7,1) *************** [03/25/2022-13:25:03] [V] [TRT] --------------- Timing Runner: sections.3.2.conv3.module.weight + QuantizeLinear_890_quantize_scale_node + Conv_894 + Add_902 + Relu_903 (CudaDepthwiseConvolution) [03/25/2022-13:25:03] [V] [TRT] CudaDepthwiseConvolution has no valid tactics for this config, skipping [03/25/2022-13:25:03] [V] [TRT] --------------- Timing Runner: sections.3.2.conv3.module.weight + QuantizeLinear_890_quantize_scale_node + Conv_894 + Add_902 + Relu_903 (CaskConvolution) [03/25/2022-13:25:03] [V] [TRT] sections.3.2.conv3.module.weight + QuantizeLinear_890_quantize_scale_node + Conv_894 + Add_902 + Relu_903 Set Tactic Name: ampere_fp32_icudnn_int8x4_128x128_relu_small_nn_v1 Tactic: 1508480131241957639 [03/25/2022-13:25:03] [V] [TRT] Tactic: 1508480131241957639 Time: 0.284672 [03/25/2022-13:25:03] [V] [TRT] sections.3.2.conv3.module.weight + QuantizeLinear_890_quantize_scale_node + Conv_894 + Add_902 + Relu_903 Set Tactic Name: ampere_fp32_icudnn_int8x4_128x128_relu_interior_nn_v1 Tactic: 2141154648944475104 [03/25/2022-13:25:03] [V] [TRT] Tactic: 2141154648944475104 Time: 0.28352 [03/25/2022-13:25:03] [V] [TRT] sections.3.2.conv3.module.weight + QuantizeLinear_890_quantize_scale_node + Conv_894 + Add_902 + Relu_903 Set Tactic Name: ampere_fp32_icudnn_int8x4_128x128_relu_medium_nn_v1 Tactic: 3239257003214966313 [03/25/2022-13:25:03] [V] [TRT] Tactic: 3239257003214966313 Time: 0.28544 [03/25/2022-13:25:03] [V] [TRT] sections.3.2.conv3.module.weight + QuantizeLinear_890_quantize_scale_node + Conv_894 + Add_902 + Relu_903 Set Tactic Name: ampere_fp32_icudnn_int8x4_128x32_relu_xregs_small_nn_v1 Tactic: 5592640619112287921 [03/25/2022-13:25:03] [V] [TRT] Tactic: 5592640619112287921 Time: 0.306688 [03/25/2022-13:25:03] [V] [TRT] sections.3.2.conv3.module.weight + QuantizeLinear_890_quantize_scale_node + Conv_894 + Add_902 + Relu_903 Set Tactic Name: ampere_fp32_icudnn_int8x4_128x64_relu_small_nn_v1 Tactic: 7621465827583909090 [03/25/2022-13:25:03] [V] [TRT] Tactic: 7621465827583909090 Time: 0.277248 [03/25/2022-13:25:03] [V] [TRT] sections.3.2.conv3.module.weight + QuantizeLinear_890_quantize_scale_node + Conv_894 + Add_902 + Relu_903 Set Tactic Name: ampere_fp32_icudnn_int8x4_128x32_relu_interior_nn_v1 Tactic: -6580271968881459581 [03/25/2022-13:25:03] [V] [TRT] Tactic: -6580271968881459581 Time: 0.299648 [03/25/2022-13:25:03] [V] [TRT] sections.3.2.conv3.module.weight + QuantizeLinear_890_quantize_scale_node + Conv_894 + Add_902 + Relu_903 Set Tactic Name: ampere_fp32_icudnn_int8x4_128x64_relu_medium_nn_v1 Tactic: -5576936487443445631 [03/25/2022-13:25:03] [V] [TRT] Tactic: -5576936487443445631 Time: 0.278272 [03/25/2022-13:25:03] [V] [TRT] sections.3.2.conv3.module.weight + QuantizeLinear_890_quantize_scale_node + Conv_894 + Add_902 + Relu_903 Set Tactic Name: ampere_fp32_icudnn_int8x4_128x32_relu_xregs_interior_nn_v1 Tactic: -4443833619060044580 [03/25/2022-13:25:03] [V] [TRT] Tactic: -4443833619060044580 Time: 0.302976 [03/25/2022-13:25:03] [V] [TRT] sections.3.2.conv3.module.weight + QuantizeLinear_890_quantize_scale_node + Conv_894 + Add_902 + Relu_903 Set Tactic Name: ampere_fp32_icudnn_int8x4_128x32_relu_medium_nn_v1 Tactic: -2297737319934264721 [03/25/2022-13:25:03] [V] [TRT] Tactic: -2297737319934264721 Time: 0.31104 [03/25/2022-13:25:03] [V] [TRT] sections.3.2.conv3.module.weight + QuantizeLinear_890_quantize_scale_node + Conv_894 + Add_902 + Relu_903 Set Tactic Name: 
ampere_fp32_icudnn_int8x4_128x32_relu_xregs_medium_nn_v1 Tactic: -1425085658556684465 [03/25/2022-13:25:03] [V] [TRT] Tactic: -1425085658556684465 Time: 0.319104 [03/25/2022-13:25:03] [V] [TRT] sections.3.2.conv3.module.weight + QuantizeLinear_890_quantize_scale_node + Conv_894 + Add_902 + Relu_903 Set Tactic Name: ampere_fp32_icudnn_int8x4_128x32_relu_small_nn_v1 Tactic: -108011214168778087 [03/25/2022-13:25:03] [V] [TRT] Tactic: -108011214168778087 Time: 0.308352 [03/25/2022-13:25:03] [V] [TRT] sections.3.2.conv3.module.weight + QuantizeLinear_890_quantize_scale_node + Conv_894 + Add_902 + Relu_903 Set Tactic Name: ampere_fp32_icudnn_int8x4_128x64_relu_interior_nn_v1 Tactic: -42427192380281294 [03/25/2022-13:25:03] [V] [TRT] Tactic: -42427192380281294 Time: 0.268672 [03/25/2022-13:25:03] [V] [TRT] Fastest Tactic: -42427192380281294 Time: 0.268672 [03/25/2022-13:25:03] [V] [TRT] >>>>>>>>>>>>>>> Chose Runner Type: CaskConvolution Tactic: -42427192380281294 [03/25/2022-13:25:03] [V] [TRT] *************** Autotuning format combination: Int8(784,49:32,7,1), Float(3136,49:32,7,1) -> Float(3136,49:32,7,1) *************** [03/25/2022-13:25:03] [V] [TRT] --------------- Timing Runner: sections.3.2.conv3.module.weight + QuantizeLinear_890_quantize_scale_node + Conv_894 + Add_902 + Relu_903 (CaskConvolution) [03/25/2022-13:25:03] [V] [TRT] sections.3.2.conv3.module.weight + QuantizeLinear_890_quantize_scale_node + Conv_894 + Add_902 + Relu_903 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8f32_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage4_warpsize4x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 2185926646498217601 [03/25/2022-13:25:03] [V] [TRT] Tactic: 2185926646498217601 Time: 0.107648 [03/25/2022-13:25:03] [V] [TRT] sections.3.2.conv3.module.weight + QuantizeLinear_890_quantize_scale_node + Conv_894 + Add_902 + Relu_903 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8f32_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: 2818014835119698671 [03/25/2022-13:25:03] [V] [TRT] Tactic: 2818014835119698671 Time: 0.109824 [03/25/2022-13:25:03] [V] [TRT] sections.3.2.conv3.module.weight + QuantizeLinear_890_quantize_scale_node + Conv_894 + Add_902 + Relu_903 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8f32_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_t1r1s1 Tactic: 3721599319722771137 [03/25/2022-13:25:03] [V] [TRT] Tactic: 3721599319722771137 Time: 0.152704 [03/25/2022-13:25:03] [V] [TRT] sections.3.2.conv3.module.weight + QuantizeLinear_890_quantize_scale_node + Conv_894 + Add_902 + Relu_903 Set Tactic Name: ampere_fp32_i8816cudnn_int8_256x64_ldg16_relu_singleBuffer_interior_nt_v1 Tactic: 4178917718361232468 [03/25/2022-13:25:03] [V] [TRT] Tactic: 4178917718361232468 Time: 0.13696 [03/25/2022-13:25:03] [V] [TRT] sections.3.2.conv3.module.weight + QuantizeLinear_890_quantize_scale_node + Conv_894 + Add_902 + Relu_903 Set Tactic Name: ampere_fp32_i8816cudnn_int8_256x128_ldg16_relu_medium_nt_v1 Tactic: 5012796702462679112 [03/25/2022-13:25:03] [V] [TRT] Tactic: 5012796702462679112 Time: 0.163456 [03/25/2022-13:25:03] [V] [TRT] sections.3.2.conv3.module.weight + QuantizeLinear_890_quantize_scale_node + Conv_894 + Add_902 + Relu_903 Set Tactic Name: ampere_fp32_i8816cudnn_int8_128x128_ldg16_relu_medium_nt_v1 Tactic: 6556170942941957134 [03/25/2022-13:25:03] [V] [TRT] Tactic: 6556170942941957134 Time: 
0.14144 [03/25/2022-13:25:03] [V] [TRT] sections.3.2.conv3.module.weight + QuantizeLinear_890_quantize_scale_node + Conv_894 + Add_902 + Relu_903 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8f32_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_t1r1s1 Tactic: 6618077155362058131 [03/25/2022-13:25:03] [V] [TRT] Tactic: 6618077155362058131 Time: 0.174208 [03/25/2022-13:25:03] [V] [TRT] sections.3.2.conv3.module.weight + QuantizeLinear_890_quantize_scale_node + Conv_894 + Add_902 + Relu_903 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8f32_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage4_warpsize2x4x1_g1_tensor16x8x32_t1r1s1 Tactic: 6969462133921577484 [03/25/2022-13:25:03] [V] [TRT] Tactic: 6969462133921577484 Time: 0.12992 [03/25/2022-13:25:03] [V] [TRT] sections.3.2.conv3.module.weight + QuantizeLinear_890_quantize_scale_node + Conv_894 + Add_902 + Relu_903 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8f32_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 7348718297764976805 [03/25/2022-13:25:03] [V] [TRT] Tactic: 7348718297764976805 Time: 0.13568 [03/25/2022-13:25:03] [V] [TRT] sections.3.2.conv3.module.weight + QuantizeLinear_890_quantize_scale_node + Conv_894 + Add_902 + Relu_903 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8f32_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 7869897696365535632 [03/25/2022-13:25:03] [V] [TRT] Tactic: 7869897696365535632 Time: 0.172288 [03/25/2022-13:25:03] [V] [TRT] sections.3.2.conv3.module.weight + QuantizeLinear_890_quantize_scale_node + Conv_894 + Add_902 + Relu_903 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8f32_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage6_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 7970271430402365872 [03/25/2022-13:25:03] [V] [TRT] Tactic: 7970271430402365872 Time: 0.114176 [03/25/2022-13:25:03] [V] [TRT] sections.3.2.conv3.module.weight + QuantizeLinear_890_quantize_scale_node + Conv_894 + Add_902 + Relu_903 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8f32_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 8584126525867982141 [03/25/2022-13:25:03] [V] [TRT] Tactic: 8584126525867982141 Time: 0.108288 [03/25/2022-13:25:03] [V] [TRT] sections.3.2.conv3.module.weight + QuantizeLinear_890_quantize_scale_node + Conv_894 + Add_902 + Relu_903 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8f32_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: -8912999970161746151 [03/25/2022-13:25:03] [V] [TRT] Tactic: -8912999970161746151 Time: 0.11456 [03/25/2022-13:25:03] [V] [TRT] sections.3.2.conv3.module.weight + QuantizeLinear_890_quantize_scale_node + Conv_894 + Add_902 + Relu_903 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8f32_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage4_warpsize4x1x1_g1_tensor16x8x32_t1r1s1 Tactic: -8893439100868426414 [03/25/2022-13:25:03] [V] [TRT] Tactic: -8893439100868426414 Time: 0.109056 [03/25/2022-13:25:03] [V] [TRT] sections.3.2.conv3.module.weight + QuantizeLinear_890_quantize_scale_node + Conv_894 + Add_902 + 
Relu_903 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8f32_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage4_warpsize4x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -8069335661823714580 [03/25/2022-13:25:03] [V] [TRT] Tactic: -8069335661823714580 Time: 0.13056 [03/25/2022-13:25:03] [V] [TRT] sections.3.2.conv3.module.weight + QuantizeLinear_890_quantize_scale_node + Conv_894 + Add_902 + Relu_903 Set Tactic Name: ampere_fp32_i8816cudnn_int8_128x128_ldg16_relu_small_nt_v1 Tactic: -7988637803896331454 [03/25/2022-13:25:03] [V] [TRT] Tactic: -7988637803896331454 Time: 0.138368 [03/25/2022-13:25:03] [V] [TRT] sections.3.2.conv3.module.weight + QuantizeLinear_890_quantize_scale_node + Conv_894 + Add_902 + Relu_903 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8f32_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x256x64_stage4_warpsize2x4x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -7957004747664952265 [03/25/2022-13:25:03] [V] [TRT] Tactic: -7957004747664952265 Time: 0.127872 [03/25/2022-13:25:03] [V] [TRT] sections.3.2.conv3.module.weight + QuantizeLinear_890_quantize_scale_node + Conv_894 + Add_902 + Relu_903 Set Tactic Name: ampere_fp32_i8816cudnn_int8_256x128_ldg16_relu_interior_nt_v1 Tactic: -7904635102498369361 [03/25/2022-13:25:03] [V] [TRT] Tactic: -7904635102498369361 Time: 0.163456 [03/25/2022-13:25:03] [V] [TRT] sections.3.2.conv3.module.weight + QuantizeLinear_890_quantize_scale_node + Conv_894 + Add_902 + Relu_903 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8f32_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -7633993163187901093 [03/25/2022-13:25:03] [V] [TRT] Tactic: -7633993163187901093 Time: 0.116096 [03/25/2022-13:25:03] [V] [TRT] sections.3.2.conv3.module.weight + QuantizeLinear_890_quantize_scale_node + Conv_894 + Add_902 + Relu_903 Set Tactic Name: ampere_fp32_i8816cudnn_int8_256x64_ldg16_relu_singleBuffer_small_nt_v1 Tactic: -7606074703023778034 [03/25/2022-13:25:03] [V] [TRT] Tactic: -7606074703023778034 Time: 0.138368 [03/25/2022-13:25:03] [V] [TRT] sections.3.2.conv3.module.weight + QuantizeLinear_890_quantize_scale_node + Conv_894 + Add_902 + Relu_903 Set Tactic Name: ampere_fp32_i8816cudnn_int8_256x128_ldg16_relu_small_nt_v1 Tactic: -7282232519526877434 [03/25/2022-13:25:03] [V] [TRT] Tactic: -7282232519526877434 Time: 0.162048 [03/25/2022-13:25:03] [V] [TRT] sections.3.2.conv3.module.weight + QuantizeLinear_890_quantize_scale_node + Conv_894 + Add_902 + Relu_903 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8f32_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x32x64_stage6_warpsize2x1x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -7007036200777870226 [03/25/2022-13:25:03] [V] [TRT] Tactic: -7007036200777870226 Time: 0.15104 [03/25/2022-13:25:03] [V] [TRT] sections.3.2.conv3.module.weight + QuantizeLinear_890_quantize_scale_node + Conv_894 + Add_902 + Relu_903 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8f32_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: -6406011580107094428 [03/25/2022-13:25:03] [V] [TRT] Tactic: -6406011580107094428 Time: 0.115712 [03/25/2022-13:25:03] [V] [TRT] sections.3.2.conv3.module.weight + QuantizeLinear_890_quantize_scale_node + Conv_894 + Add_902 + Relu_903 Set Tactic Name: ampere_fp32_i8816cudnn_int8_256x64_ldg16_relu_singleBuffer_medium_nt_v1 Tactic: 
-5603587790314027122 [03/25/2022-13:25:03] [V] [TRT] Tactic: -5603587790314027122 Time: 0.140928 [03/25/2022-13:25:03] [V] [TRT] sections.3.2.conv3.module.weight + QuantizeLinear_890_quantize_scale_node + Conv_894 + Add_902 + Relu_903 Set Tactic Name: ampere_fp32_i8816cudnn_int8_128x128_ldg16_relu_interior_nt_v1 Tactic: -5416590980288859834 [03/25/2022-13:25:03] [V] [TRT] Tactic: -5416590980288859834 Time: 0.144 [03/25/2022-13:25:03] [V] [TRT] sections.3.2.conv3.module.weight + QuantizeLinear_890_quantize_scale_node + Conv_894 + Add_902 + Relu_903 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8f32_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: -4704615512436071369 [03/25/2022-13:25:03] [V] [TRT] Tactic: -4704615512436071369 Time: 0.117632 [03/25/2022-13:25:03] [V] [TRT] sections.3.2.conv3.module.weight + QuantizeLinear_890_quantize_scale_node + Conv_894 + Add_902 + Relu_903 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8f32_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x64_stage4_warpsize4x2x1_g1_tensor16x8x32_t1r1s1 Tactic: -3665201838779845683 [03/25/2022-13:25:03] [V] [TRT] Tactic: -3665201838779845683 Time: 0.131328 [03/25/2022-13:25:03] [V] [TRT] sections.3.2.conv3.module.weight + QuantizeLinear_890_quantize_scale_node + Conv_894 + Add_902 + Relu_903 Set Tactic Name: ampere_fp32_i8816cudnn_int8_256x64_ldg16_relu_interior_nt_v1 Tactic: -3644377136375731441 [03/25/2022-13:25:03] [V] [TRT] Tactic: -3644377136375731441 Time: 0.142464 [03/25/2022-13:25:03] [V] [TRT] sections.3.2.conv3.module.weight + QuantizeLinear_890_quantize_scale_node + Conv_894 + Add_902 + Relu_903 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8f32_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize32x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: -3502495740607894730 [03/25/2022-13:25:03] [V] [TRT] Tactic: -3502495740607894730 Time: 0.138752 [03/25/2022-13:25:03] [V] [TRT] sections.3.2.conv3.module.weight + QuantizeLinear_890_quantize_scale_node + Conv_894 + Add_902 + Relu_903 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8f32_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32_t1r1s1 Tactic: -2342404147487779225 [03/25/2022-13:25:03] [V] [TRT] Tactic: -2342404147487779225 Time: 0.117888 [03/25/2022-13:25:03] [V] [TRT] sections.3.2.conv3.module.weight + QuantizeLinear_890_quantize_scale_node + Conv_894 + Add_902 + Relu_903 Set Tactic Name: ampere_fp32_i8816cudnn_int8_256x64_ldg16_relu_small_nt_v1 Tactic: -1610768292520086910 [03/25/2022-13:25:03] [V] [TRT] Tactic: -1610768292520086910 Time: 0.144896 [03/25/2022-13:25:03] [V] [TRT] sections.3.2.conv3.module.weight + QuantizeLinear_890_quantize_scale_node + Conv_894 + Add_902 + Relu_903 Set Tactic Name: ampere_fp32_i8816cudnn_int8_256x64_ldg16_relu_medium_nt_v1 Tactic: -621838502160440068 [03/25/2022-13:25:03] [V] [TRT] Tactic: -621838502160440068 Time: 0.147584 [03/25/2022-13:25:03] [V] [TRT] Fastest Tactic: 2185926646498217601 Time: 0.107648 [03/25/2022-13:25:03] [V] [TRT] >>>>>>>>>>>>>>> Chose Runner Type: CaskConvolution Tactic: 2185926646498217601 [03/25/2022-13:25:03] [V] [TRT] =============== Computing costs for [03/25/2022-13:25:03] [V] [TRT] *************** Autotuning format combination: Float(100352,49,7,1) -> Float(2048,1,1,1) *************** [03/25/2022-13:25:03] [V] [TRT] --------------- Timing Runner: 
GlobalAveragePool_904 (TiledPooling) [03/25/2022-13:25:03] [V] [TRT] Tactic: 8192257 Time: 0.256896 [03/25/2022-13:25:03] [V] [TRT] Tactic: 8257793 Time: 0.147712 [03/25/2022-13:25:03] [V] [TRT] Tactic: 8323329 Time: 0.142464 [03/25/2022-13:25:03] [V] [TRT] Tactic: 8388865 Time: 0.139904 [03/25/2022-13:25:03] [V] [TRT] Tactic: 8454401 Time: 0.13952 [03/25/2022-13:25:03] [V] [TRT] Tactic: 8519937 Time: 0.139136 [03/25/2022-13:25:03] [V] [TRT] Tactic: 8585473 Time: 0.13888 [03/25/2022-13:25:03] [V] [TRT] Tactic: 8651009 Time: 0.13888 [03/25/2022-13:25:03] [V] [TRT] Fastest Tactic: 8651009 Time: 0.13888 [03/25/2022-13:25:03] [V] [TRT] --------------- Timing Runner: GlobalAveragePool_904 (CudnnPooling) [03/25/2022-13:25:03] [V] [TRT] Tactic: -1 Time: 0.059776 [03/25/2022-13:25:03] [V] [TRT] Fastest Tactic: -1 Time: 0.059776 [03/25/2022-13:25:03] [V] [TRT] >>>>>>>>>>>>>>> Chose Runner Type: CudnnPooling Tactic: -1 [03/25/2022-13:25:03] [V] [TRT] =============== Computing costs for [03/25/2022-13:25:03] [V] [TRT] *************** Autotuning format combination: Float(2048,1,1,1) -> Float(1000,1,1,1) *************** [03/25/2022-13:25:03] [V] [TRT] --------------- Timing Runner: Gemm_911 (CudaDepthwiseConvolution) [03/25/2022-13:25:03] [V] [TRT] CudaDepthwiseConvolution has no valid tactics for this config, skipping [03/25/2022-13:25:03] [V] [TRT] --------------- Timing Runner: Gemm_911 (FusedConvActConvolution) [03/25/2022-13:25:03] [V] [TRT] FusedConvActConvolution has no valid tactics for this config, skipping [03/25/2022-13:25:04] [V] [TRT] --------------- Timing Runner: Gemm_911 (CudnnConvolution) [03/25/2022-13:25:04] [V] [TRT] Tactic: 0 Time: 0.288768 [03/25/2022-13:25:04] [V] [TRT] Tactic: 1 Time: 0.195968 [03/25/2022-13:25:04] [V] [TRT] Tactic: 2 Time: 0.263168 [03/25/2022-13:25:04] [V] [TRT] Tactic: 4 skipped. Scratch requested: 5330763776, available: 16777216 [03/25/2022-13:25:04] [V] [TRT] Tactic: 5 skipped. Scratch requested: 331587584, available: 16777216 [03/25/2022-13:25:04] [V] [TRT] Tactic: 56 Time: 0.28864 [03/25/2022-13:25:04] [V] [TRT] Tactic: 57 Time: 0.195584 [03/25/2022-13:25:04] [V] [TRT] Tactic: 58 Time: 0.262912 [03/25/2022-13:25:04] [V] [TRT] Tactic: 60 skipped. Scratch requested: 5330763776, available: 16777216 [03/25/2022-13:25:04] [V] [TRT] Tactic: 61 skipped. Scratch requested: 331587584, available: 16777216 [03/25/2022-13:25:04] [V] [TRT] Tactic: 112 Time: 0.288384 [03/25/2022-13:25:04] [V] [TRT] Tactic: 113 Time: 0.288384 [03/25/2022-13:25:04] [V] [TRT] Tactic: 114 Time: 0.262912 [03/25/2022-13:25:04] [V] [TRT] Tactic: 116 skipped. Scratch requested: 5330763776, available: 16777216 [03/25/2022-13:25:04] [V] [TRT] Tactic: 117 skipped. Scratch requested: 331587584, available: 16777216 [03/25/2022-13:25:04] [I] [TRT] Some tactics do not have sufficient workspace memory to run. Increasing workspace size may increase performance, please check verbose output. 
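
The cuDNN tactics skipped above (4, 5, 60, 61, 116, 117) requested roughly 316 MiB and 5 GiB of scratch, but only 16777216 bytes (16 MiB) of workspace were available, so they were never timed. As a minimal sketch of how to lift that cap — assuming the TensorRT 8.2 Python bindings and the same resnet50_quant_sparse.onnx; the engine output path is illustrative — a rebuild with a larger workspace lets the 331587584-byte tactics compete (the 5 GiB one would still be skipped at 512 MiB):

    import tensorrt as trt

    TRT_LOGGER = trt.Logger(trt.Logger.VERBOSE)

    builder = trt.Builder(TRT_LOGGER)
    network = builder.create_network(
        1 << int(trt.NetworkDefinitionCreationFlag.EXPLICIT_BATCH))
    parser = trt.OnnxParser(network, TRT_LOGGER)
    with open("resnet50_quant_sparse.onnx", "rb") as f:
        if not parser.parse(f.read()):
            raise RuntimeError(parser.get_error(0))

    config = builder.create_builder_config()
    # 512 MiB covers the 331587584-byte scratch request logged above;
    # max_workspace_size is the TensorRT 8.2 API (deprecated in later releases).
    config.max_workspace_size = 512 << 20
    config.set_flag(trt.BuilderFlag.INT8)            # Q/DQ network: scales come from the ONNX graph
    config.set_flag(trt.BuilderFlag.SPARSE_WEIGHTS)  # allow 2:4 structured-sparse kernels
    engine_bytes = builder.build_serialized_network(network, config)
    with open("resnet50_quant_sparse.engine", "wb") as f:  # illustrative path
        f.write(engine_bytes)

With trtexec on this TensorRT version, the same effect comes from adding --workspace=512 (the value is in MiB) to the build command.
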
[03/25/2022-13:25:04] [V] [TRT] Fastest Tactic: 57 Time: 0.195584 [03/25/2022-13:25:04] [V] [TRT] Setting workspace to 331587584 enables more tactics for profiling [03/25/2022-13:25:04] [V] [TRT] --------------- Timing Runner: Gemm_911 (CublasConvolution) [03/25/2022-13:25:04] [V] [TRT] Tactic: 0 Time: 0.071296 [03/25/2022-13:25:04] [V] [TRT] Tactic: 1 Time: 0.066304 [03/25/2022-13:25:04] [V] [TRT] Tactic: 2 Time: 0.033792 [03/25/2022-13:25:04] [V] [TRT] Tactic: 3 Time: 0.031104 [03/25/2022-13:25:04] [V] [TRT] Fastest Tactic: 3 Time: 0.031104 [03/25/2022-13:25:05] [V] [TRT] --------------- Timing Runner: Gemm_911 (CaskConvolution) [03/25/2022-13:25:05] [V] [TRT] Gemm_911 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_f32f32_tf32f32_f32_nchwkrsc_nchw_tilesize64x128x32_stage5_warpsize2x2x1_g1_tensor16x8x8_t1r1s1_aligna4_alignc4 Tactic: 621442388677115936 [03/25/2022-13:25:05] [V] [TRT] Tactic: 621442388677115936 Time: 0.163456 [03/25/2022-13:25:05] [V] [TRT] Gemm_911 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_f32f32_tf32f32_f32_nchwkrsc_nchw_tilesize64x32x64_stage5_warpsize2x2x1_g1_tensor16x8x8_simple_t1r1s1_aligna4_alignc4 Tactic: 697756878388249475 [03/25/2022-13:25:05] [V] [TRT] Tactic: 697756878388249475 Time: 0.128896 [03/25/2022-13:25:05] [V] [TRT] Gemm_911 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_f32f32_tf32f32_f32_nchwkrsc_nchw_tilesize128x64x32_stage5_warpsize2x2x1_g1_tensor16x8x8_simple_t1r1s1_aligna4_alignc4 Tactic: 1159272950022995759 [03/25/2022-13:25:05] [V] [TRT] Tactic: 1159272950022995759 Time: 0.33664 [03/25/2022-13:25:05] [V] [TRT] Gemm_911 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_f32f32_tf32f32_f32_nchwkrsc_nchw_tilesize64x32x64_stage5_warpsize2x2x1_g1_tensor16x8x8_t1r1s1_aligna4_alignc4 Tactic: 1327459216264546184 [03/25/2022-13:25:05] [V] [TRT] Tactic: 1327459216264546184 Time: 0.131968 [03/25/2022-13:25:05] [V] [TRT] Gemm_911 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_f32f32_tf32f32_f32_nchwkrsc_nchw_tilesize128x256x32_stage3_warpsize2x4x1_g1_tensor16x8x8_simple_t1r1s1_aligna4_alignc4 Tactic: 2310520233542099555 [03/25/2022-13:25:05] [V] [TRT] Tactic: 2310520233542099555 Time: 0.4704 [03/25/2022-13:25:05] [V] [TRT] Gemm_911 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_f32f32_tf32f32_f32_nchwkrsc_nchw_tilesize128x64x32_stage5_warpsize2x2x1_g1_tensor16x8x8_t1r1s1_aligna4_alignc4 Tactic: 4032904638566464623 [03/25/2022-13:25:05] [V] [TRT] Tactic: 4032904638566464623 Time: 0.338176 [03/25/2022-13:25:05] [V] [TRT] Gemm_911 Set Tactic Name: ampere_scudnn_128x64_relu_small_nn_v1 Tactic: 4549827808004681195 [03/25/2022-13:25:05] [V] [TRT] Tactic: 4549827808004681195 Time: 0.466688 [03/25/2022-13:25:05] [V] [TRT] Gemm_911 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_f32f32_tf32f32_f32_nchwkrsc_nchw_tilesize128x64x64_stage3_warpsize2x2x1_g1_tensor16x8x8_t1r1s1_aligna4_alignc4 Tactic: 4918658762935651592 [03/25/2022-13:25:05] [V] [TRT] Tactic: 4918658762935651592 Time: 0.333184 [03/25/2022-13:25:05] [V] [TRT] Gemm_911 Set Tactic Name: ampere_scudnn_128x128_relu_small_nn_v1 Tactic: 5779835512569528575 [03/25/2022-13:25:05] [V] [TRT] Tactic: 5779835512569528575 Time: 0.600704 [03/25/2022-13:25:05] [V] [TRT] Gemm_911 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_f32f32_tf32f32_f32_nchwkrsc_nchw_tilesize256x128x32_stage3_warpsize4x2x1_g1_tensor16x8x8_t1r1s1_aligna4_alignc4 Tactic: 5837905844602864231 [03/25/2022-13:25:05] [V] [TRT] Tactic: 5837905844602864231 Time: 0.64768 [03/25/2022-13:25:05] [V] [TRT] Gemm_911 Set Tactic Name: 
sm80_xmma_fprop_implicit_gemm_f32f32_tf32f32_f32_nchwkrsc_nchw_tilesize128x128x32_stage4_warpsize2x2x1_g1_tensor16x8x8_simple_t1r1s1_aligna4_alignc4 Tactic: 6166883504066133838 [03/25/2022-13:25:05] [V] [TRT] Tactic: 6166883504066133838 Time: 0.43072 [03/25/2022-13:25:05] [V] [TRT] Gemm_911 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_f32f32_tf32f32_f32_nchwkrsc_nchw_tilesize256x64x32_stage3_warpsize4x2x1_g1_tensor16x8x8_t1r1s1_aligna4_alignc4 Tactic: 6210049212073459059 [03/25/2022-13:25:05] [V] [TRT] Tactic: 6210049212073459059 Time: 0.58944 [03/25/2022-13:25:05] [V] [TRT] Gemm_911 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_f32f32_tf32f32_f32_nchwkrsc_nchw_tilesize128x64x64_stage3_warpsize2x2x1_g1_tensor16x8x8_simple_t1r1s1_aligna4_alignc4 Tactic: 6500866402607985231 [03/25/2022-13:25:05] [V] [TRT] Tactic: 6500866402607985231 Time: 0.328448 [03/25/2022-13:25:05] [V] [TRT] Gemm_911 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_f32f32_tf32f32_f32_nchwkrsc_nchw_tilesize256x64x32_stage3_warpsize4x2x1_g1_tensor16x8x8_simple_t1r1s1_aligna4_alignc4 Tactic: 6723161876874263939 [03/25/2022-13:25:05] [V] [TRT] Tactic: 6723161876874263939 Time: 0.58752 [03/25/2022-13:25:05] [V] [TRT] Gemm_911 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_f32f32_tf32f32_f32_nchwkrsc_nchw_tilesize64x128x32_stage5_warpsize2x2x1_g1_tensor16x8x8_simple_t1r1s1_aligna4_alignc4 Tactic: 6868013466259746979 [03/25/2022-13:25:05] [V] [TRT] Tactic: 6868013466259746979 Time: 0.161792 [03/25/2022-13:25:05] [V] [TRT] Gemm_911 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_f32f32_tf32f32_f32_nchwkrsc_nchw_tilesize64x64x64_stage4_warpsize2x2x1_g1_tensor16x8x8_simple_t1r1s1_aligna4_alignc4 Tactic: 6901214267543238617 [03/25/2022-13:25:05] [V] [TRT] Tactic: 6901214267543238617 Time: 0.138368 [03/25/2022-13:25:05] [V] [TRT] Gemm_911 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_f32f32_tf32f32_f32_nchwkrsc_nchw_tilesize128x128x32_stage4_warpsize2x2x1_g1_tensor16x8x8_t1r1s1_aligna4_alignc4 Tactic: 8048119769507926928 [03/25/2022-13:25:05] [V] [TRT] Tactic: 8048119769507926928 Time: 0.414848 [03/25/2022-13:25:05] [V] [TRT] Gemm_911 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_f32f32_tf32f32_f32_nchwkrsc_nchw_tilesize128x128x16_stage4_warpsize2x2x1_g1_tensor16x8x8_t1r1s1_aligna4_alignc4 Tactic: 9151672657204310840 [03/25/2022-13:25:05] [V] [TRT] Tactic: 9151672657204310840 Time: 0.368384 [03/25/2022-13:25:05] [V] [TRT] Gemm_911 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_f32f32_tf32f32_f32_nchwkrsc_nchw_tilesize256x128x32_stage3_warpsize4x2x1_g1_tensor16x8x8_simple_t1r1s1_aligna4_alignc4 Tactic: -8877086111929938764 [03/25/2022-13:25:05] [V] [TRT] Tactic: -8877086111929938764 Time: 0.617728 [03/25/2022-13:25:05] [V] [TRT] Gemm_911 Set Tactic Name: ampere_scudnn_128x32_relu_interior_nn_v1 Tactic: -7491730084094677098 [03/25/2022-13:25:05] [V] [TRT] Tactic: -7491730084094677098 Time: 0.52224 [03/25/2022-13:25:05] [V] [TRT] Gemm_911 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_f32f32_tf32f32_f32_nchwkrsc_nchw_tilesize128x128x16_stage4_warpsize2x2x1_g1_tensor16x8x8_simple_t1r1s1_aligna4_alignc4 Tactic: -6622064180404051845 [03/25/2022-13:25:05] [V] [TRT] Tactic: -6622064180404051845 Time: 0.370304 [03/25/2022-13:25:05] [V] [TRT] Gemm_911 Set Tactic Name: ampere_scudnn_128x32_relu_small_nn_v1 Tactic: -6313876406580483184 [03/25/2022-13:25:05] [V] [TRT] Tactic: -6313876406580483184 Time: 0.510976 [03/25/2022-13:25:05] [V] [TRT] Gemm_911 Set Tactic Name: ampere_scudnn_128x128_relu_interior_nn_v1 Tactic: -6273689210331812572 
[03/25/2022-13:25:05] [V] [TRT] Tactic: -6273689210331812572 Time: 0.589696 [03/25/2022-13:25:05] [V] [TRT] Gemm_911 Set Tactic Name: ampere_scudnn_128x64_relu_interior_nn_v1 Tactic: -4337126844824617177 [03/25/2022-13:25:05] [V] [TRT] Tactic: -4337126844824617177 Time: 0.459264 [03/25/2022-13:25:05] [V] [TRT] Gemm_911 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_f32f32_tf32f32_f32_nchwkrsc_nchw_tilesize128x256x32_stage3_warpsize2x4x1_g1_tensor16x8x8_t1r1s1_aligna4_alignc4 Tactic: -2777237991111865351 [03/25/2022-13:25:05] [V] [TRT] Tactic: -2777237991111865351 Time: 0.474496 [03/25/2022-13:25:05] [V] [TRT] Gemm_911 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_f32f32_tf32f32_f32_nchwkrsc_nchw_tilesize64x64x64_stage4_warpsize2x2x1_g1_tensor16x8x8_t1r1s1_aligna4_alignc4 Tactic: -1349585930542925704 [03/25/2022-13:25:05] [V] [TRT] Tactic: -1349585930542925704 Time: 0.139776 [03/25/2022-13:25:05] [V] [TRT] Gemm_911 Set Tactic Name: ampere_scudnn_128x128_relu_medium_nn_v1 Tactic: -1123676555321336786 [03/25/2022-13:25:05] [V] [TRT] Tactic: -1123676555321336786 Time: 0.592896 [03/25/2022-13:25:05] [V] [TRT] Gemm_911 Set Tactic Name: ampere_scudnn_128x64_relu_medium_nn_v1 Tactic: -701551393537224327 [03/25/2022-13:25:05] [V] [TRT] Tactic: -701551393537224327 Time: 0.471936 [03/25/2022-13:25:05] [V] [TRT] Fastest Tactic: 697756878388249475 Time: 0.128896 [03/25/2022-13:25:05] [V] [TRT] >>>>>>>>>>>>>>> Chose Runner Type: CublasConvolution Tactic: 3 [03/25/2022-13:25:05] [V] [TRT] *************** Autotuning format combination: Float(2048,1,2048,2048) -> Float(1000,1,1000,1000) *************** [03/25/2022-13:25:05] [V] [TRT] --------------- Timing Runner: Gemm_911 (CudnnConvolution) [03/25/2022-13:25:05] [V] [TRT] CudnnConvolution has no valid tactics for this config, skipping [03/25/2022-13:25:05] [V] [TRT] --------------- Timing Runner: Gemm_911 (CublasConvolution) [03/25/2022-13:25:05] [V] [TRT] CublasConvolution has no valid tactics for this config, skipping [03/25/2022-13:25:05] [V] [TRT] --------------- Timing Runner: Gemm_911 (CaskConvolution) [03/25/2022-13:25:05] [V] [TRT] Gemm_911 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_f32f32_f32f32_f32_nhwckrsc_nhwc_tilesize256x128x8_stage3_warpsize2x4x1_g1_ffma_t1r1s1 Tactic: 676988335020687107 [03/25/2022-13:25:05] [V] [TRT] Tactic: 676988335020687107 Time: 0.996864 [03/25/2022-13:25:05] [V] [TRT] Gemm_911 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_f32f32_f32f32_f32_nhwckrsc_nhwc_tilesize128x256x8_stage3_warpsize2x4x1_g1_ffma_t1r1s1 Tactic: 1149579359391877453 [03/25/2022-13:25:05] [V] [TRT] Tactic: 1149579359391877453 Time: 0.993792 [03/25/2022-13:25:05] [V] [TRT] Gemm_911 Set Tactic Name: ampere_scudnn_128x128_relu_exp_interior_nhwc_tn_v1 Tactic: 1663866669559596164 [03/25/2022-13:25:05] [V] [TRT] Tactic: 1663866669559596164 Time: 0.508288 [03/25/2022-13:25:05] [V] [TRT] Gemm_911 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_f32f32_f32f32_f32_nhwckrsc_nhwc_tilesize64x64x8_stage3_warpsize1x4x1_g1_ffma_t1r1s1 Tactic: 1995961315573863697 [03/25/2022-13:25:05] [V] [TRT] Tactic: 1995961315573863697 Time: 0.15424 [03/25/2022-13:25:05] [V] [TRT] Gemm_911 Set Tactic Name: ampere_scudnn_128x64_sliced1x2_ldg4_relu_exp_small_nhwc_tn_v1 Tactic: 2860655430572478466 [03/25/2022-13:25:05] [V] [TRT] Tactic: 2860655430572478466 Time: 0.265856 [03/25/2022-13:25:05] [V] [TRT] Gemm_911 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_f32f32_f32f32_f32_nhwckrsc_nhwc_tilesize256x64x8_stage3_warpsize1x4x1_g1_ffma_simple_t1r1s1 Tactic: 4232768147062126270 
[03/25/2022-13:25:05] [V] [TRT] Tactic: 4232768147062126270 Time: 0.637696 [03/25/2022-13:25:05] [V] [TRT] Gemm_911 Set Tactic Name: ampere_scudnn_128x32_sliced1x4_ldg4_relu_exp_medium_nhwc_tn_v1 Tactic: 4474630279712975759 [03/25/2022-13:25:05] [V] [TRT] Tactic: 4474630279712975759 Time: 0.140416 [03/25/2022-13:25:05] [V] [TRT] Gemm_911 Set Tactic Name: ampere_scudnn_128x32_sliced1x4_ldg4_relu_exp_small_nhwc_tn_v1 Tactic: 4479823862704990365 [03/25/2022-13:25:05] [V] [TRT] Tactic: 4479823862704990365 Time: 0.139648 [03/25/2022-13:25:05] [V] [TRT] Gemm_911 Set Tactic Name: ampere_scudnn_128x64_sliced1x2_ldg4_relu_exp_medium_nhwc_tn_v1 Tactic: 4696204239951173149 [03/25/2022-13:25:05] [V] [TRT] Tactic: 4696204239951173149 Time: 0.265088 [03/25/2022-13:25:05] [V] [TRT] Gemm_911 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_f32f32_f32f32_f32_nhwckrsc_nhwc_tilesize128x64x8_stage3_warpsize1x4x1_g1_ffma_t1r1s1 Tactic: 5061046663754203417 [03/25/2022-13:25:05] [V] [TRT] Tactic: 5061046663754203417 Time: 0.284544 [03/25/2022-13:25:05] [V] [TRT] Gemm_911 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_f32f32_f32f32_f32_nhwckrsc_nhwc_tilesize256x64x8_stage3_warpsize1x4x1_g1_ffma_t1r1s1 Tactic: 5660369513040054181 [03/25/2022-13:25:05] [V] [TRT] Tactic: 5660369513040054181 Time: 0.638208 [03/25/2022-13:25:05] [V] [TRT] Gemm_911 Set Tactic Name: ampere_scudnn_128x128_relu_exp_small_nhwc_tn_v1 Tactic: 5778138195697110003 [03/25/2022-13:25:05] [V] [TRT] Tactic: 5778138195697110003 Time: 0.512768 [03/25/2022-13:25:05] [V] [TRT] Gemm_911 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_f32f32_f32f32_f32_nhwckrsc_nhwc_tilesize64x256x8_stage3_warpsize1x4x1_g1_ffma_t1r1s1 Tactic: 6002893715742835901 [03/25/2022-13:25:05] [V] [TRT] Tactic: 6002893715742835901 Time: 0.5568 [03/25/2022-13:25:05] [V] [TRT] Gemm_911 Set Tactic Name: ampere_scudnn_128x128_ldg4_relu_exp_medium_nhwc_tn_v1 Tactic: 8918020581761223752 [03/25/2022-13:25:05] [V] [TRT] Tactic: 8918020581761223752 Time: 0.394624 [03/25/2022-13:25:05] [V] [TRT] Gemm_911 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_f32f32_f32f32_f32_nhwckrsc_nhwc_tilesize256x128x8_stage3_warpsize2x4x1_g1_ffma_simple_t1r1s1 Tactic: 9016055318246906759 [03/25/2022-13:25:05] [V] [TRT] Tactic: 9016055318246906759 Time: 0.771072 [03/25/2022-13:25:05] [V] [TRT] Gemm_911 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_f32f32_f32f32_f32_nhwckrsc_nhwc_tilesize64x128x8_stage3_warpsize1x4x1_g1_ffma_t1r1s1 Tactic: -7609160790790750215 [03/25/2022-13:25:05] [V] [TRT] Tactic: -7609160790790750215 Time: 0.216704 [03/25/2022-13:25:05] [V] [TRT] Gemm_911 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_f32f32_f32f32_f32_nhwckrsc_nhwc_tilesize64x128x8_stage3_warpsize1x4x1_g1_ffma_simple_t1r1s1 Tactic: -7054781547842146201 [03/25/2022-13:25:05] [V] [TRT] Tactic: -7054781547842146201 Time: 0.215936 [03/25/2022-13:25:05] [V] [TRT] Gemm_911 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_f32f32_f32f32_f32_nhwckrsc_nhwc_tilesize64x64x8_stage3_warpsize1x4x1_g1_ffma_simple_t1r1s1 Tactic: -6773414409150198858 [03/25/2022-13:25:05] [V] [TRT] Tactic: -6773414409150198858 Time: 0.119552 [03/25/2022-13:25:06] [V] [TRT] Gemm_911 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_f32f32_f32f32_f32_nhwckrsc_nhwc_tilesize128x64x8_stage3_warpsize1x4x1_g1_ffma_simple_t1r1s1 Tactic: -5980517219165853661 [03/25/2022-13:25:06] [V] [TRT] Tactic: -5980517219165853661 Time: 0.221568 [03/25/2022-13:25:06] [V] [TRT] Gemm_911 Set Tactic Name: 
sm80_xmma_fprop_implicit_gemm_f32f32_f32f32_f32_nhwckrsc_nhwc_tilesize64x256x8_stage3_warpsize1x4x1_g1_ffma_simple_t1r1s1 Tactic: -5910172158931405628 [03/25/2022-13:25:06] [V] [TRT] Tactic: -5910172158931405628 Time: 0.419072 [03/25/2022-13:25:06] [V] [TRT] Gemm_911 Set Tactic Name: ampere_scudnn_128x64_sliced1x2_ldg4_relu_exp_interior_nhwc_tn_v1 Tactic: -5905193483742532701 [03/25/2022-13:25:06] [V] [TRT] Tactic: -5905193483742532701 Time: 0.199936 [03/25/2022-13:25:06] [V] [TRT] Gemm_911 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_f32f32_f32f32_f32_nhwckrsc_nhwc_tilesize128x256x8_stage3_warpsize2x4x1_g1_ffma_simple_t1r1s1 Tactic: -4196636767445012021 [03/25/2022-13:25:06] [V] [TRT] Tactic: -4196636767445012021 Time: 0.773376 [03/25/2022-13:25:06] [V] [TRT] Gemm_911 Set Tactic Name: ampere_scudnn_128x32_sliced1x4_ldg4_relu_exp_interior_nhwc_tn_v1 Tactic: -4035591156787122265 [03/25/2022-13:25:06] [V] [TRT] Tactic: -4035591156787122265 Time: 0.10944 [03/25/2022-13:25:06] [V] [TRT] Gemm_911 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_f32f32_f32f32_f32_nhwckrsc_nhwc_tilesize128x128x8_stage3_warpsize2x4x1_g1_ffma_t1r1s1 Tactic: -3829074795144908279 [03/25/2022-13:25:06] [V] [TRT] Tactic: -3829074795144908279 Time: 0.388864 [03/25/2022-13:25:06] [V] [TRT] Gemm_911 Set Tactic Name: ampere_scudnn_128x128_relu_exp_medium_nhwc_tn_v1 Tactic: -2809379259463049391 [03/25/2022-13:25:06] [V] [TRT] Tactic: -2809379259463049391 Time: 0.396288 [03/25/2022-13:25:06] [V] [TRT] Gemm_911 Set Tactic Name: ampere_scudnn_128x128_ldg4_relu_exp_interior_nhwc_tn_v1 Tactic: -1985235291706575900 [03/25/2022-13:25:06] [V] [TRT] Tactic: -1985235291706575900 Time: 0.38656 [03/25/2022-13:25:06] [V] [TRT] Gemm_911 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_f32f32_f32f32_f32_nhwckrsc_nhwc_tilesize128x128x8_stage3_warpsize2x4x1_g1_ffma_simple_t1r1s1 Tactic: -711510282315844248 [03/25/2022-13:25:06] [V] [TRT] Tactic: -711510282315844248 Time: 0.388096 [03/25/2022-13:25:06] [V] [TRT] Gemm_911 Set Tactic Name: ampere_scudnn_128x128_ldg4_relu_exp_small_nhwc_tn_v1 Tactic: -504296718212024303 [03/25/2022-13:25:06] [V] [TRT] Tactic: -504296718212024303 Time: 0.396672 [03/25/2022-13:25:06] [V] [TRT] Fastest Tactic: -4035591156787122265 Time: 0.10944 [03/25/2022-13:25:06] [V] [TRT] >>>>>>>>>>>>>>> Chose Runner Type: CaskConvolution Tactic: -4035591156787122265 [03/25/2022-13:25:06] [V] [TRT] *************** Autotuning format combination: Float(512,1:4,512,512) -> Float(250,1:4,250,250) *************** [03/25/2022-13:25:06] [V] [TRT] --------------- Timing Runner: Gemm_911 (CudnnConvolution) [03/25/2022-13:25:06] [V] [TRT] CudnnConvolution has no valid tactics for this config, skipping [03/25/2022-13:25:06] [V] [TRT] --------------- Timing Runner: Gemm_911 (CublasConvolution) [03/25/2022-13:25:06] [V] [TRT] CublasConvolution has no valid tactics for this config, skipping [03/25/2022-13:25:06] [V] [TRT] --------------- Timing Runner: Gemm_911 (CaskConvolution) [03/25/2022-13:25:06] [V] [TRT] Gemm_911 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_f32f32_f32f32_f32_nhwckrsc_nhwc_tilesize256x128x8_stage3_warpsize2x4x1_g1_ffma_t1r1s1 Tactic: 676988335020687107 [03/25/2022-13:25:06] [V] [TRT] Tactic: 676988335020687107 Time: 0.7744 [03/25/2022-13:25:06] [V] [TRT] Gemm_911 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_f32f32_f32f32_f32_nhwckrsc_nhwc_tilesize128x256x8_stage3_warpsize2x4x1_g1_ffma_t1r1s1 Tactic: 1149579359391877453 [03/25/2022-13:25:06] [V] [TRT] Tactic: 1149579359391877453 Time: 0.771712 [03/25/2022-13:25:06] 
[V] [TRT] Gemm_911 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_f32f32_tf32f32_f32_nhwckrsc_nhwc_tilesize128x128x16_stage4_warpsize2x2x1_g1_tensor16x8x8_t1r1s1 Tactic: 1373022415249282411 [03/25/2022-13:25:06] [V] [TRT] Tactic: 1373022415249282411 Time: 0.083072 [03/25/2022-13:25:06] [V] [TRT] Gemm_911 Set Tactic Name: ampere_scudnn_128x128_relu_exp_interior_nhwc_tn_v1 Tactic: 1663866669559596164 [03/25/2022-13:25:06] [V] [TRT] Tactic: 1663866669559596164 Time: 0.395392 [03/25/2022-13:25:06] [V] [TRT] Gemm_911 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_f32f32_f32f32_f32_nhwckrsc_nhwc_tilesize64x64x8_stage3_warpsize1x4x1_g1_ffma_t1r1s1 Tactic: 1995961315573863697 [03/25/2022-13:25:06] [V] [TRT] Tactic: 1995961315573863697 Time: 0.120192 [03/25/2022-13:25:06] [V] [TRT] Gemm_911 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_f32f32_tf32f32_f32_nhwckrsc_nhwc_tilesize256x64x32_stage3_warpsize4x2x1_g1_tensor16x8x8_t1r1s1 Tactic: 2536540085349545921 [03/25/2022-13:25:06] [V] [TRT] Tactic: 2536540085349545921 Time: 0.073088 [03/25/2022-13:25:06] [V] [TRT] Gemm_911 Set Tactic Name: ampere_scudnn_128x64_sliced1x2_ldg4_relu_exp_small_nhwc_tn_v1 Tactic: 2860655430572478466 [03/25/2022-13:25:06] [V] [TRT] Tactic: 2860655430572478466 Time: 0.20608 [03/25/2022-13:25:06] [V] [TRT] Gemm_911 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_f32f32_tf32f32_f32_nhwckrsc_nhwc_tilesize128x64x32_stage5_warpsize2x2x1_g1_tensor16x8x8_simple_t1r1s1 Tactic: 3119404851233737725 [03/25/2022-13:25:06] [V] [TRT] Tactic: 3119404851233737725 Time: 0.042496 [03/25/2022-13:25:06] [V] [TRT] Gemm_911 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_f32f32_f32f32_f32_nhwckrsc_nhwc_tilesize256x64x8_stage3_warpsize1x4x1_g1_ffma_simple_t1r1s1 Tactic: 4232768147062126270 [03/25/2022-13:25:06] [V] [TRT] Tactic: 4232768147062126270 Time: 0.494976 [03/25/2022-13:25:06] [V] [TRT] Gemm_911 Set Tactic Name: ampere_scudnn_128x32_sliced1x4_ldg4_relu_exp_medium_nhwc_tn_v1 Tactic: 4474630279712975759 [03/25/2022-13:25:06] [V] [TRT] Tactic: 4474630279712975759 Time: 0.109568 [03/25/2022-13:25:06] [V] [TRT] Gemm_911 Set Tactic Name: ampere_scudnn_128x32_sliced1x4_ldg4_relu_exp_small_nhwc_tn_v1 Tactic: 4479823862704990365 [03/25/2022-13:25:06] [V] [TRT] Tactic: 4479823862704990365 Time: 0.108672 [03/25/2022-13:25:06] [V] [TRT] Gemm_911 Set Tactic Name: ampere_scudnn_128x64_sliced1x2_ldg4_relu_exp_medium_nhwc_tn_v1 Tactic: 4696204239951173149 [03/25/2022-13:25:06] [V] [TRT] Tactic: 4696204239951173149 Time: 0.20608 [03/25/2022-13:25:06] [V] [TRT] Gemm_911 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_f32f32_f32f32_f32_nhwckrsc_nhwc_tilesize128x64x8_stage3_warpsize1x4x1_g1_ffma_t1r1s1 Tactic: 5061046663754203417 [03/25/2022-13:25:06] [V] [TRT] Tactic: 5061046663754203417 Time: 0.221184 [03/25/2022-13:25:06] [V] [TRT] Gemm_911 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_f32f32_tf32f32_f32_nhwckrsc_nhwc_tilesize64x64x64_stage4_warpsize2x2x1_g1_tensor16x8x8_simple_t1r1s1 Tactic: 5108195418077987197 [03/25/2022-13:25:06] [V] [TRT] Tactic: 5108195418077987197 Time: 0.03136 [03/25/2022-13:25:06] [V] [TRT] Gemm_911 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_f32f32_tf32f32_f32_nhwckrsc_nhwc_tilesize128x64x64_stage3_warpsize2x2x1_g1_tensor16x8x8_simple_t1r1s1 Tactic: 5380043773350002416 [03/25/2022-13:25:06] [V] [TRT] Tactic: 5380043773350002416 Time: 0.043008 [03/25/2022-13:25:06] [V] [TRT] Gemm_911 Set Tactic Name: 
sm80_xmma_fprop_implicit_gemm_f32f32_tf32f32_f32_nhwckrsc_nhwc_tilesize256x64x32_stage3_warpsize4x2x1_g1_tensor16x8x8_simple_t1r1s1 Tactic: 5610788582700027969 [03/25/2022-13:25:06] [V] [TRT] Tactic: 5610788582700027969 Time: 0.071424 [03/25/2022-13:25:06] [V] [TRT] Gemm_911 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_f32f32_f32f32_f32_nhwckrsc_nhwc_tilesize256x64x8_stage3_warpsize1x4x1_g1_ffma_t1r1s1 Tactic: 5660369513040054181 [03/25/2022-13:25:06] [V] [TRT] Tactic: 5660369513040054181 Time: 0.49472 [03/25/2022-13:25:06] [V] [TRT] Gemm_911 Set Tactic Name: ampere_scudnn_128x128_relu_exp_small_nhwc_tn_v1 Tactic: 5778138195697110003 [03/25/2022-13:25:06] [V] [TRT] Tactic: 5778138195697110003 Time: 0.39808 [03/25/2022-13:25:06] [V] [TRT] Gemm_911 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_f32f32_f32f32_f32_nhwckrsc_nhwc_tilesize64x256x8_stage3_warpsize1x4x1_g1_ffma_t1r1s1 Tactic: 6002893715742835901 [03/25/2022-13:25:06] [V] [TRT] Tactic: 6002893715742835901 Time: 0.431104 [03/25/2022-13:25:06] [V] [TRT] Gemm_911 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_f32f32_tf32f32_f32_nhwckrsc_nhwc_tilesize128x128x32_stage4_warpsize2x2x1_g1_tensor16x8x8_t1r1s1 Tactic: 6031243482516294286 [03/25/2022-13:25:06] [V] [TRT] Tactic: 6031243482516294286 Time: 0.07168 [03/25/2022-13:25:06] [V] [TRT] Gemm_911 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_f32f32_tf32f32_f32_nhwckrsc_nhwc_tilesize256x128x32_stage3_warpsize4x2x1_g1_tensor16x8x8_simple_t1r1s1 Tactic: 6419443404913215692 [03/25/2022-13:25:06] [V] [TRT] Tactic: 6419443404913215692 Time: 0.110592 [03/25/2022-13:25:06] [V] [TRT] Gemm_911 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_f32f32_tf32f32_f32_nhwckrsc_nhwc_tilesize128x256x32_stage3_warpsize2x4x1_g1_tensor16x8x8_simple_t1r1s1 Tactic: 6571171993204510025 [03/25/2022-13:25:06] [V] [TRT] Tactic: 6571171993204510025 Time: 0.111488 [03/25/2022-13:25:06] [V] [TRT] Gemm_911 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_f32f32_tf32f32_f32_nhwckrsc_nhwc_tilesize64x128x32_stage5_warpsize2x2x1_g1_tensor16x8x8_t1r1s1 Tactic: 6832381412128253616 [03/25/2022-13:25:06] [V] [TRT] Tactic: 6832381412128253616 Time: 0.043264 [03/25/2022-13:25:06] [V] [TRT] Gemm_911 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_f32f32_tf32f32_f32_nhwckrsc_nhwc_tilesize128x64x64_stage3_warpsize2x2x1_g1_tensor16x8x8_t1r1s1 Tactic: 7058864664858220889 [03/25/2022-13:25:06] [V] [TRT] Tactic: 7058864664858220889 Time: 0.044032 [03/25/2022-13:25:06] [V] [TRT] Gemm_911 Set Tactic Name: ampere_scudnn_128x128_ldg4_relu_exp_medium_nhwc_tn_v1 Tactic: 8918020581761223752 [03/25/2022-13:25:06] [V] [TRT] Tactic: 8918020581761223752 Time: 0.39424 [03/25/2022-13:25:06] [V] [TRT] Gemm_911 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_f32f32_f32f32_f32_nhwckrsc_nhwc_tilesize256x128x8_stage3_warpsize2x4x1_g1_ffma_simple_t1r1s1 Tactic: 9016055318246906759 [03/25/2022-13:25:06] [V] [TRT] Tactic: 9016055318246906759 Time: 0.770944 [03/25/2022-13:25:06] [V] [TRT] Gemm_911 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_f32f32_tf32f32_f32_nhwckrsc_nhwc_tilesize64x32x64_stage5_warpsize2x2x1_g1_tensor16x8x8_simple_t1r1s1 Tactic: -8853476220903838035 [03/25/2022-13:25:06] [V] [TRT] Tactic: -8853476220903838035 Time: 0.0256 [03/25/2022-13:25:06] [V] [TRT] Gemm_911 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_f32f32_f32f32_f32_nhwckrsc_nhwc_tilesize64x128x8_stage3_warpsize1x4x1_g1_ffma_t1r1s1 Tactic: -7609160790790750215 [03/25/2022-13:25:06] [V] [TRT] Tactic: -7609160790790750215 Time: 0.215936 [03/25/2022-13:25:06] [V] [TRT] Gemm_911 Set 
Tactic Name: sm80_xmma_fprop_implicit_gemm_f32f32_tf32f32_f32_nhwckrsc_nhwc_tilesize128x128x16_stage4_warpsize2x2x1_g1_tensor16x8x8_simple_t1r1s1 Tactic: -7067026478815706014 [03/25/2022-13:25:06] [V] [TRT] Tactic: -7067026478815706014 Time: 0.083584 [03/25/2022-13:25:06] [V] [TRT] Gemm_911 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_f32f32_f32f32_f32_nhwckrsc_nhwc_tilesize64x128x8_stage3_warpsize1x4x1_g1_ffma_simple_t1r1s1 Tactic: -7054781547842146201 [03/25/2022-13:25:06] [V] [TRT] Tactic: -7054781547842146201 Time: 0.216064 [03/25/2022-13:25:06] [V] [TRT] Gemm_911 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_f32f32_f32f32_f32_nhwckrsc_nhwc_tilesize64x64x8_stage3_warpsize1x4x1_g1_ffma_simple_t1r1s1 Tactic: -6773414409150198858 [03/25/2022-13:25:06] [V] [TRT] Tactic: -6773414409150198858 Time: 0.119296 [03/25/2022-13:25:06] [V] [TRT] Gemm_911 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_f32f32_tf32f32_f32_nhwckrsc_nhwc_tilesize64x64x64_stage4_warpsize2x2x1_g1_tensor16x8x8_t1r1s1 Tactic: -6407735577805887428 [03/25/2022-13:25:06] [V] [TRT] Tactic: -6407735577805887428 Time: 0.032768 [03/25/2022-13:25:06] [V] [TRT] Gemm_911 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_f32f32_tf32f32_f32_nhwckrsc_nhwc_tilesize256x128x32_stage3_warpsize4x2x1_g1_tensor16x8x8_t1r1s1 Tactic: -6073218138311523634 [03/25/2022-13:25:06] [V] [TRT] Tactic: -6073218138311523634 Time: 0.110976 [03/25/2022-13:25:06] [V] [TRT] Gemm_911 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_f32f32_f32f32_f32_nhwckrsc_nhwc_tilesize128x64x8_stage3_warpsize1x4x1_g1_ffma_simple_t1r1s1 Tactic: -5980517219165853661 [03/25/2022-13:25:06] [V] [TRT] Tactic: -5980517219165853661 Time: 0.221824 [03/25/2022-13:25:06] [V] [TRT] Gemm_911 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_f32f32_f32f32_f32_nhwckrsc_nhwc_tilesize64x256x8_stage3_warpsize1x4x1_g1_ffma_simple_t1r1s1 Tactic: -5910172158931405628 [03/25/2022-13:25:06] [V] [TRT] Tactic: -5910172158931405628 Time: 0.419584 [03/25/2022-13:25:06] [V] [TRT] Gemm_911 Set Tactic Name: ampere_scudnn_128x64_sliced1x2_ldg4_relu_exp_interior_nhwc_tn_v1 Tactic: -5905193483742532701 [03/25/2022-13:25:06] [V] [TRT] Tactic: -5905193483742532701 Time: 0.200448 [03/25/2022-13:25:06] [V] [TRT] Gemm_911 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_f32f32_tf32f32_f32_nhwckrsc_nhwc_tilesize64x128x32_stage5_warpsize2x2x1_g1_tensor16x8x8_simple_t1r1s1 Tactic: -5033173285090644307 [03/25/2022-13:25:06] [V] [TRT] Tactic: -5033173285090644307 Time: 0.041856 [03/25/2022-13:25:06] [V] [TRT] Gemm_911 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_f32f32_tf32f32_f32_nhwckrsc_nhwc_tilesize128x128x32_stage4_warpsize2x2x1_g1_tensor16x8x8_simple_t1r1s1 Tactic: -5010594849215755610 [03/25/2022-13:25:06] [V] [TRT] Tactic: -5010594849215755610 Time: 0.071424 [03/25/2022-13:25:06] [V] [TRT] Gemm_911 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_f32f32_f32f32_f32_nhwckrsc_nhwc_tilesize128x256x8_stage3_warpsize2x4x1_g1_ffma_simple_t1r1s1 Tactic: -4196636767445012021 [03/25/2022-13:25:06] [V] [TRT] Tactic: -4196636767445012021 Time: 0.773888 [03/25/2022-13:25:06] [V] [TRT] Gemm_911 Set Tactic Name: ampere_scudnn_128x32_sliced1x4_ldg4_relu_exp_interior_nhwc_tn_v1 Tactic: -4035591156787122265 [03/25/2022-13:25:06] [V] [TRT] Tactic: -4035591156787122265 Time: 0.109056 [03/25/2022-13:25:06] [V] [TRT] Gemm_911 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_f32f32_f32f32_f32_nhwckrsc_nhwc_tilesize128x128x8_stage3_warpsize2x4x1_g1_ffma_t1r1s1 Tactic: -3829074795144908279 [03/25/2022-13:25:06] [V] [TRT] Tactic: 
-3829074795144908279 Time: 0.388864 [03/25/2022-13:25:06] [V] [TRT] Gemm_911 Set Tactic Name: ampere_scudnn_128x128_relu_exp_medium_nhwc_tn_v1 Tactic: -2809379259463049391 [03/25/2022-13:25:06] [V] [TRT] Tactic: -2809379259463049391 Time: 0.396032 [03/25/2022-13:25:06] [V] [TRT] Gemm_911 Set Tactic Name: ampere_scudnn_128x128_ldg4_relu_exp_interior_nhwc_tn_v1 Tactic: -1985235291706575900 [03/25/2022-13:25:06] [V] [TRT] Tactic: -1985235291706575900 Time: 0.386944 [03/25/2022-13:25:06] [V] [TRT] Gemm_911 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_f32f32_tf32f32_f32_nhwckrsc_nhwc_tilesize128x256x32_stage3_warpsize2x4x1_g1_tensor16x8x8_t1r1s1 Tactic: -1450865838092804082 [03/25/2022-13:25:06] [V] [TRT] Tactic: -1450865838092804082 Time: 0.111104 [03/25/2022-13:25:06] [V] [TRT] Gemm_911 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_f32f32_tf32f32_f32_nhwckrsc_nhwc_tilesize64x32x64_stage5_warpsize2x2x1_g1_tensor16x8x8_t1r1s1 Tactic: -1366318165940453381 [03/25/2022-13:25:06] [V] [TRT] Tactic: -1366318165940453381 Time: 0.025344 [03/25/2022-13:25:06] [V] [TRT] Gemm_911 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_f32f32_tf32f32_f32_nhwckrsc_nhwc_tilesize128x64x32_stage5_warpsize2x2x1_g1_tensor16x8x8_t1r1s1 Tactic: -1138876066247138089 [03/25/2022-13:25:06] [V] [TRT] Tactic: -1138876066247138089 Time: 0.042368 [03/25/2022-13:25:06] [V] [TRT] Gemm_911 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_f32f32_f32f32_f32_nhwckrsc_nhwc_tilesize128x128x8_stage3_warpsize2x4x1_g1_ffma_simple_t1r1s1 Tactic: -711510282315844248 [03/25/2022-13:25:06] [V] [TRT] Tactic: -711510282315844248 Time: 0.387968 [03/25/2022-13:25:06] [V] [TRT] Gemm_911 Set Tactic Name: ampere_scudnn_128x128_ldg4_relu_exp_small_nhwc_tn_v1 Tactic: -504296718212024303 [03/25/2022-13:25:06] [V] [TRT] Tactic: -504296718212024303 Time: 0.396928 [03/25/2022-13:25:06] [V] [TRT] Fastest Tactic: -1366318165940453381 Time: 0.025344 [03/25/2022-13:25:06] [V] [TRT] >>>>>>>>>>>>>>> Chose Runner Type: CaskConvolution Tactic: -1366318165940453381 [03/25/2022-13:25:06] [V] [TRT] =============== Computing costs for [03/25/2022-13:25:06] [V] [TRT] *************** Autotuning format combination: Float(1000,1,1,1) -> Float(1000,1) *************** [03/25/2022-13:25:06] [V] [TRT] --------------- Timing Runner: (Unnamed Layer* 975) [Shuffle] (Shuffle) [03/25/2022-13:25:06] [V] [TRT] Tactic: 0 Time: 0.008064 [03/25/2022-13:25:06] [V] [TRT] Tactic: 1 Time: 0.014336 [03/25/2022-13:25:06] [V] [TRT] Fastest Tactic: 0 Time: 0.008064 [03/25/2022-13:25:06] [V] [TRT] >>>>>>>>>>>>>>> Chose Runner Type: Shuffle Tactic: 0 [03/25/2022-13:25:06] [V] [TRT] =============== Computing costs for [03/25/2022-13:25:06] [V] [TRT] *************** Autotuning format combination: Float(1000,1) -> Float(1000,1) *************** [03/25/2022-13:25:06] [V] [TRT] --------------- Timing Runner: Softmax_912 (CudaSoftMax) [03/25/2022-13:25:06] [V] [TRT] Tactic: 1002 Time: 0.01024 [03/25/2022-13:25:06] [V] [TRT] Tactic: 1001 Time: 0.009216 [03/25/2022-13:25:06] [V] [TRT] Fastest Tactic: 1001 Time: 0.009216 [03/25/2022-13:25:06] [V] [TRT] >>>>>>>>>>>>>>> Chose Runner Type: CudaSoftMax Tactic: 1001 [03/25/2022-13:25:06] [V] [TRT] Adding reformat layer: Reformatted Input Tensor 0 to sections.3.2.conv3.module.weight + QuantizeLinear_890_quantize_scale_node + Conv_894 + Add_902 + Relu_903 (2059) from Int8(784,49:32,7,1) to Int8(6272,49:4,7,1) [03/25/2022-13:25:06] [V] [TRT] Adding reformat layer: Reformatted Input Tensor 0 to Gemm_911 (2079) from Float(2048,1,1,1) to Float(512,1:4,512,512) 
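
Each "Autotuning format combination" block above times the same node under a different tensor layout (linear FP32, NHWC, channel-vectorized 1:4 and 1:32), and the reformat layers being added here are the glue the builder inserts when the winning layouts of adjacent layers disagree — Gemm_911's fastest combination used the Float(512,1:4,512,512) input, hence the reformat from Float(2048,1,1,1). A quick way to audit these decisions, and to confirm that structured-sparse kernels were actually picked in the final per-layer tactic dump that follows "Detected 1 inputs and 2 output network tensors" below, is to scan a saved copy of this verbose log. This is a plain-Python sketch; the build.log filename is an assumption (redirect the trtexec output there first):

    import re
    import sys

    # Path to a saved copy of the trtexec --verbose output (assumed name).
    log_path = sys.argv[1] if len(sys.argv) > 1 else "build.log"
    with open(log_path) as f:
        log = f.read()

    # Final runner/tactic decision per autotuned node.
    for runner, tactic in re.findall(r"Chose Runner Type: (\S+) Tactic: (-?\d+)", log):
        print(f"chose {runner} tactic {tactic}")

    # Layout-conversion layers inserted between incompatible neighbors.
    n_reformat = len(re.findall(r"Adding reformat layer:", log))
    print(f"{n_reformat} reformat layers added")

    # Kernel names containing "sparse" indicate 2:4 structured-sparse tactics;
    # the authoritative per-layer list is the Set Tactic Name dump emitted
    # after "Detected 1 inputs and 2 output network tensors".
    for name in sorted(set(re.findall(r"Set Tactic Name: (\S*sparse\S*)", log))):
        print("sparse kernel:", name)

On this log, the matches are the sm80_xmma_fprop_sparse_conv_* kernels assigned to many of the conv1/conv2 fusions in sections 1-3 below, while every conv3 fusion stays on a dense implicit-GEMM kernel.
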
[03/25/2022-13:25:06] [V] [TRT] Adding reformat layer: Reformatted Input Tensor 0 to (Unnamed Layer* 975) [Shuffle] ((Unnamed Layer* 971) [Fully Connected]_output) from Float(250,1:4,250,250) to Float(1000,1,1,1) [03/25/2022-13:25:06] [V] [TRT] Formats and tactics selection completed in 59.1276 seconds. [03/25/2022-13:25:06] [V] [TRT] After reformat layers: 63 layers [03/25/2022-13:25:06] [V] [TRT] Pre-optimized block assignment. [03/25/2022-13:25:06] [V] [TRT] Block size 19267584 [03/25/2022-13:25:06] [V] [TRT] Block size 102760448 [03/25/2022-13:25:06] [V] [TRT] Block size 25690112 [03/25/2022-13:25:06] [V] [TRT] Block size 102760448 [03/25/2022-13:25:06] [V] [TRT] Block size 25690112 [03/25/2022-13:25:06] [V] [TRT] Block size 25690112 [03/25/2022-13:25:06] [V] [TRT] Block size 102760448 [03/25/2022-13:25:06] [V] [TRT] Block size 25690112 [03/25/2022-13:25:06] [V] [TRT] Block size 25690112 [03/25/2022-13:25:06] [V] [TRT] Block size 102760448 [03/25/2022-13:25:06] [V] [TRT] Block size 25690112 [03/25/2022-13:25:06] [V] [TRT] Block size 25690112 [03/25/2022-13:25:06] [V] [TRT] Block size 102760448 [03/25/2022-13:25:06] [V] [TRT] Block size 51380224 [03/25/2022-13:25:06] [V] [TRT] Block size 51380224 [03/25/2022-13:25:06] [V] [TRT] Block size 12845056 [03/25/2022-13:25:06] [V] [TRT] Block size 51380224 [03/25/2022-13:25:06] [V] [TRT] Block size 12845056 [03/25/2022-13:25:06] [V] [TRT] Block size 12845056 [03/25/2022-13:25:06] [V] [TRT] Block size 51380224 [03/25/2022-13:25:06] [V] [TRT] Block size 12845056 [03/25/2022-13:25:06] [V] [TRT] Block size 12845056 [03/25/2022-13:25:06] [V] [TRT] Block size 51380224 [03/25/2022-13:25:06] [V] [TRT] Block size 12845056 [03/25/2022-13:25:06] [V] [TRT] Block size 12845056 [03/25/2022-13:25:06] [V] [TRT] Block size 51380224 [03/25/2022-13:25:06] [V] [TRT] Block size 25690112 [03/25/2022-13:25:06] [V] [TRT] Block size 25690112 [03/25/2022-13:25:06] [V] [TRT] Block size 6422528 [03/25/2022-13:25:06] [V] [TRT] Block size 25690112 [03/25/2022-13:25:06] [V] [TRT] Block size 6422528 [03/25/2022-13:25:06] [V] [TRT] Block size 6422528 [03/25/2022-13:25:06] [V] [TRT] Block size 25690112 [03/25/2022-13:25:06] [V] [TRT] Block size 6422528 [03/25/2022-13:25:06] [V] [TRT] Block size 6422528 [03/25/2022-13:25:06] [V] [TRT] Block size 25690112 [03/25/2022-13:25:06] [V] [TRT] Block size 6422528 [03/25/2022-13:25:06] [V] [TRT] Block size 6422528 [03/25/2022-13:25:06] [V] [TRT] Block size 25690112 [03/25/2022-13:25:06] [V] [TRT] Block size 6422528 [03/25/2022-13:25:06] [V] [TRT] Block size 6422528 [03/25/2022-13:25:06] [V] [TRT] Block size 25690112 [03/25/2022-13:25:06] [V] [TRT] Block size 6422528 [03/25/2022-13:25:06] [V] [TRT] Block size 6422528 [03/25/2022-13:25:06] [V] [TRT] Block size 25690112 [03/25/2022-13:25:06] [V] [TRT] Block size 12845056 [03/25/2022-13:25:06] [V] [TRT] Block size 12845056 [03/25/2022-13:25:06] [V] [TRT] Block size 3211264 [03/25/2022-13:25:06] [V] [TRT] Block size 12845056 [03/25/2022-13:25:06] [V] [TRT] Block size 3211264 [03/25/2022-13:25:06] [V] [TRT] Block size 3211264 [03/25/2022-13:25:06] [V] [TRT] Block size 12845056 [03/25/2022-13:25:06] [V] [TRT] Block size 51380224 [03/25/2022-13:25:06] [V] [TRT] Block size 3211264 [03/25/2022-13:25:06] [V] [TRT] Block size 3211264 [03/25/2022-13:25:06] [V] [TRT] Block size 51380224 [03/25/2022-13:25:06] [V] [TRT] Block size 1048576 [03/25/2022-13:25:06] [V] [TRT] Block size 1 [03/25/2022-13:25:06] [V] [TRT] Block size 3211264 [03/25/2022-13:25:06] [V] [TRT] Block size 1 [03/25/2022-13:25:06] [V] 
[TRT] Block size 1 [03/25/2022-13:25:06] [V] [TRT] Block size 16777216 [03/25/2022-13:25:06] [V] [TRT] Total Activation Memory: 1578500099 [03/25/2022-13:25:06] [I] [TRT] Detected 1 inputs and 2 output network tensors. [03/25/2022-13:25:06] [V] [TRT] input.conv.module.weight + QuantizeLinear_8_quantize_scale_node + Conv_12 + Relu_14 Set Tactic Name: volta_first_layer_filter7x7_imma_fwd Tactic: -5510956450195747703 [03/25/2022-13:25:06] [V] [TRT] sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x64x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 3670282018109435863 [03/25/2022-13:25:06] [V] [TRT] sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node + Conv_72 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x128x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 2972948223367788520 [03/25/2022-13:25:06] [V] [TRT] sections.0.0.conv2.module.weight + QuantizeLinear_39_quantize_scale_node + Conv_43 + Relu_45 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage4_warpsize4x1x1_g1_tensor16x8x32_t1r3s3 Tactic: -3182884991006484042 [03/25/2022-13:25:06] [V] [TRT] sections.0.0.conv3.module.weight + QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 2376898825218218566 [03/25/2022-13:25:06] [V] [TRT] sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize64x128x128_stage3_warpsize1x4x1_g1_sptensor16x8x64_simple_t1r1s1_no_preds Tactic: -4889498558023475527 [03/25/2022-13:25:06] [V] [TRT] sections.0.1.conv2.module.weight + QuantizeLinear_105_quantize_scale_node + Conv_109 + Relu_111 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage4_warpsize4x1x1_g1_tensor16x8x32_t1r3s3_epifadd Tactic: 5966973378912044513 [03/25/2022-13:25:06] [V] [TRT] sections.0.1.conv3.module.weight + QuantizeLinear_120_quantize_scale_node + Conv_124 + Add_132 + Relu_133 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 2376898825218218566 [03/25/2022-13:25:06] [V] [TRT] sections.0.2.conv1.module.weight + QuantizeLinear_142_quantize_scale_node + Conv_146 + Relu_148 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x64x64_stage6_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: -8985599729413291927 [03/25/2022-13:25:06] [V] [TRT] sections.0.2.conv2.module.weight + QuantizeLinear_157_quantize_scale_node + Conv_161 + Relu_163 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x64x64_stage4_warpsize4x1x1_g1_tensor16x8x32_t1r3s3_epifadd Tactic: 
5966973378912044513 [03/25/2022-13:25:06] [V] [TRT] sections.0.2.conv3.module.weight + QuantizeLinear_172_quantize_scale_node + Conv_176 + Add_184 + Relu_185 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 2376898825218218566 [03/25/2022-13:25:06] [V] [TRT] sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Set Tactic Name: sm80_xmma_conv_fprop_smallk_i8i8_i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_warptilesize32x32_stage4_warpsize2x4x1_r1s1_u1v1_hw0_c256_scalebias_relu Tactic: -6620675299995493092 [03/25/2022-13:25:06] [V] [TRT] sections.1.0.identity.conv.module.weight + QuantizeLinear_238_quantize_scale_node + Conv_242 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32_t1r1s1_epifadd Tactic: 7731430299029542276 [03/25/2022-13:25:06] [V] [TRT] sections.1.0.conv2.module.weight + QuantizeLinear_209_quantize_scale_node + Conv_213 + Relu_215 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32_t1r3s3_epifadd Tactic: -5709079507616090666 [03/25/2022-13:25:06] [V] [TRT] sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x128x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 2972948223367788520 [03/25/2022-13:25:06] [V] [TRT] sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Set Tactic Name: sm80_xmma_conv_fprop_smallk_i8i8_i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_warptilesize16x32_stage4_warpsize2x4x1_r1s1_u1v1_hw0_c512_scalebias_relu Tactic: 1913026264725750683 [03/25/2022-13:25:06] [V] [TRT] sections.1.1.conv2.module.weight + QuantizeLinear_275_quantize_scale_node + Conv_279 + Relu_281 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x128_stage3_warpsize2x2x1_g1_sptensor16x8x64 Tactic: 4414594337986714263 [03/25/2022-13:25:06] [V] [TRT] sections.1.1.conv3.module.weight + QuantizeLinear_290_quantize_scale_node + Conv_294 + Add_302 + Relu_303 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x128x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 2972948223367788520 [03/25/2022-13:25:06] [V] [TRT] sections.1.2.conv1.module.weight + QuantizeLinear_312_quantize_scale_node + Conv_316 + Relu_318 Set Tactic Name: sm80_xmma_conv_fprop_smallk_i8i8_i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_warptilesize16x32_stage4_warpsize2x4x1_r1s1_u1v1_hw0_c512_scalebias_relu Tactic: 1913026264725750683 [03/25/2022-13:25:06] [V] [TRT] sections.1.2.conv2.module.weight + QuantizeLinear_327_quantize_scale_node + Conv_331 + Relu_333 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x128_stage3_warpsize2x2x1_g1_sptensor16x8x64 Tactic: 4414594337986714263 [03/25/2022-13:25:06] [V] [TRT] sections.1.2.conv3.module.weight + 
QuantizeLinear_342_quantize_scale_node + Conv_346 + Add_354 + Relu_355 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x128x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 2972948223367788520 [03/25/2022-13:25:06] [V] [TRT] sections.1.3.conv1.module.weight + QuantizeLinear_364_quantize_scale_node + Conv_368 + Relu_370 Set Tactic Name: sm80_xmma_conv_fprop_smallk_i8i8_i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_warptilesize16x32_stage4_warpsize2x4x1_r1s1_u1v1_hw0_c512_scalebias_relu Tactic: 1913026264725750683 [03/25/2022-13:25:06] [V] [TRT] sections.1.3.conv2.module.weight + QuantizeLinear_379_quantize_scale_node + Conv_383 + Relu_385 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x128_stage3_warpsize2x2x1_g1_sptensor16x8x64 Tactic: 4414594337986714263 [03/25/2022-13:25:06] [V] [TRT] sections.1.3.conv3.module.weight + QuantizeLinear_394_quantize_scale_node + Conv_398 + Add_406 + Relu_407 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x128x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 2972948223367788520 [03/25/2022-13:25:06] [V] [TRT] sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Set Tactic Name: sm80_xmma_conv_fprop_smallk_i8i8_i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_warptilesize16x32_stage4_warpsize2x4x1_r1s1_u1v1_hw0_c512_scalebias_relu Tactic: 1913026264725750683 [03/25/2022-13:25:06] [V] [TRT] sections.2.0.identity.conv.module.weight + QuantizeLinear_460_quantize_scale_node + Conv_464 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x128_stage3_warpsize2x2x1_g1_sptensor16x8x64_t1r1s1_no_preds Tactic: 4909502217677847353 [03/25/2022-13:25:06] [V] [TRT] sections.2.0.conv2.module.weight + QuantizeLinear_431_quantize_scale_node + Conv_435 + Relu_437 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x128_stage3_warpsize4x2x1_g1_sptensor16x8x64 Tactic: 3651043333819148268 [03/25/2022-13:25:06] [V] [TRT] sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 2376898825218218566 [03/25/2022-13:25:06] [V] [TRT] sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x128_stage3_warpsize4x2x1_g1_sptensor16x8x64_simple_t1r1s1_no_preds Tactic: -6779804930216439173 [03/25/2022-13:25:06] [V] [TRT] sections.2.1.conv2.module.weight + QuantizeLinear_497_quantize_scale_node + Conv_501 + Relu_503 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x128_stage3_warpsize2x2x1_g1_sptensor16x8x64 Tactic: 4414594337986714263 [03/25/2022-13:25:06] [V] [TRT] sections.2.1.conv3.module.weight + QuantizeLinear_512_quantize_scale_node + Conv_516 + Add_524 + Relu_525 Set Tactic Name: 
sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 2376898825218218566 [03/25/2022-13:25:06] [V] [TRT] sections.2.2.conv1.module.weight + QuantizeLinear_534_quantize_scale_node + Conv_538 + Relu_540 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x128_stage3_warpsize4x2x1_g1_sptensor16x8x64_t1r1s1 Tactic: 857001784974286465 [03/25/2022-13:25:06] [V] [TRT] sections.2.2.conv2.module.weight + QuantizeLinear_549_quantize_scale_node + Conv_553 + Relu_555 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x128_stage3_warpsize2x2x1_g1_sptensor16x8x64 Tactic: 4414594337986714263 [03/25/2022-13:25:06] [V] [TRT] sections.2.2.conv3.module.weight + QuantizeLinear_564_quantize_scale_node + Conv_568 + Add_576 + Relu_577 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 2376898825218218566 [03/25/2022-13:25:06] [V] [TRT] sections.2.3.conv1.module.weight + QuantizeLinear_586_quantize_scale_node + Conv_590 + Relu_592 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x128_stage3_warpsize4x2x1_g1_sptensor16x8x64_simple_t1r1s1 Tactic: -4681913707320020520 [03/25/2022-13:25:06] [V] [TRT] sections.2.3.conv2.module.weight + QuantizeLinear_601_quantize_scale_node + Conv_605 + Relu_607 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x128_stage3_warpsize2x2x1_g1_sptensor16x8x64 Tactic: 4414594337986714263 [03/25/2022-13:25:06] [V] [TRT] sections.2.3.conv3.module.weight + QuantizeLinear_616_quantize_scale_node + Conv_620 + Add_628 + Relu_629 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 2376898825218218566 [03/25/2022-13:25:06] [V] [TRT] sections.2.4.conv1.module.weight + QuantizeLinear_638_quantize_scale_node + Conv_642 + Relu_644 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x128_stage3_warpsize4x2x1_g1_sptensor16x8x64_simple_t1r1s1_no_preds Tactic: -6779804930216439173 [03/25/2022-13:25:06] [V] [TRT] sections.2.4.conv2.module.weight + QuantizeLinear_653_quantize_scale_node + Conv_657 + Relu_659 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x128_stage3_warpsize2x2x1_g1_sptensor16x8x64 Tactic: 4414594337986714263 [03/25/2022-13:25:06] [V] [TRT] sections.2.4.conv3.module.weight + QuantizeLinear_668_quantize_scale_node + Conv_672 + Add_680 + Relu_681 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 2376898825218218566 [03/25/2022-13:25:06] [V] [TRT] sections.2.5.conv1.module.weight + QuantizeLinear_690_quantize_scale_node + Conv_694 + Relu_696 Set Tactic Name: 
sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x128_stage3_warpsize4x2x1_g1_sptensor16x8x64_simple_t1r1s1 Tactic: -4681913707320020520 [03/25/2022-13:25:06] [V] [TRT] sections.2.5.conv2.module.weight + QuantizeLinear_705_quantize_scale_node + Conv_709 + Relu_711 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x128_stage3_warpsize2x2x1_g1_sptensor16x8x64 Tactic: 4414594337986714263 [03/25/2022-13:25:06] [V] [TRT] sections.2.5.conv3.module.weight + QuantizeLinear_720_quantize_scale_node + Conv_724 + Add_732 + Relu_733 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize128x128x64_stage4_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1_epifadd Tactic: 2376898825218218566 [03/25/2022-13:25:07] [V] [TRT] sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Set Tactic Name: sm80_xmma_conv_fprop_smallk_i8i8_i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_warptilesize16x32_stage5_warpsize1x4x2_r1s1_u1v1_hw0_c1024_scalebias_relu Tactic: 1263683011321748626 [03/25/2022-13:25:07] [V] [TRT] sections.3.0.identity.conv.module.weight + QuantizeLinear_786_quantize_scale_node + Conv_790 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x128_stage3_warpsize4x2x1_g1_sptensor16x8x64_t1r1s1 Tactic: 857001784974286465 [03/25/2022-13:25:07] [V] [TRT] sections.3.0.conv2.module.weight + QuantizeLinear_757_quantize_scale_node + Conv_761 + Relu_763 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x128_stage3_warpsize4x2x1_g1_sptensor16x8x64 Tactic: 3651043333819148268 [03/25/2022-13:25:07] [V] [TRT] sections.3.0.conv3.module.weight + QuantizeLinear_772_quantize_scale_node + Conv_776 + Add_798 + Relu_799 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x128x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 8524082966802584889 [03/25/2022-13:25:07] [V] [TRT] sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x128_stage3_warpsize4x2x1_g1_sptensor16x8x64_simple_t1r1s1_no_preds Tactic: -6779804930216439173 [03/25/2022-13:25:07] [V] [TRT] sections.3.1.conv2.module.weight + QuantizeLinear_823_quantize_scale_node + Conv_827 + Relu_829 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x128_stage3_warpsize4x2x1_g1_sptensor16x8x64 Tactic: 3651043333819148268 [03/25/2022-13:25:07] [V] [TRT] sections.3.1.conv3.module.weight + QuantizeLinear_838_quantize_scale_node + Conv_842 + Add_850 + Relu_851 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize96x128x64_stage3_warpsize2x2x1_g1_tensor16x8x32_simple_t1r1s1 Tactic: 8524082966802584889 [03/25/2022-13:25:07] [V] [TRT] sections.3.2.conv1.module.weight + QuantizeLinear_860_quantize_scale_node + Conv_864 + Relu_866 Set Tactic Name: 
sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x128_stage3_warpsize4x2x1_g1_sptensor16x8x64_simple_t1r1s1_no_preds Tactic: -6779804930216439173
[03/25/2022-13:25:07] [V] [TRT] sections.3.2.conv2.module.weight + QuantizeLinear_875_quantize_scale_node + Conv_879 + Relu_881 Set Tactic Name: sm80_xmma_fprop_sparse_conv_interleaved_i8i8_i8i32_f32_nchw_vect_c_32kcrs_vect_c_32_nchw_vect_c_32_tilesize256x128x128_stage3_warpsize4x2x1_g1_sptensor16x8x64 Tactic: 3651043333819148268
[03/25/2022-13:25:07] [V] [TRT] sections.3.2.conv3.module.weight + QuantizeLinear_890_quantize_scale_node + Conv_894 + Add_902 + Relu_903 Set Tactic Name: ampere_fp32_icudnn_int8x4_128x64_relu_interior_nn_v1 Tactic: -42427192380281294
[03/25/2022-13:25:07] [V] [TRT] Gemm_911 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_f32f32_tf32f32_f32_nhwckrsc_nhwc_tilesize64x32x64_stage5_warpsize2x2x1_g1_tensor16x8x8_t1r1s1 Tactic: -1366318165940453381
[03/25/2022-13:25:07] [V] [TRT] Layer: QuantizeLinear_2_quantize_scale_node Host Persistent: 0 Device Persistent: 0 Scratch Memory: 0
[03/25/2022-13:25:07] [V] [TRT] Layer: input.conv.module.weight + QuantizeLinear_8_quantize_scale_node + Conv_12 + Relu_14 Host Persistent: 256 Device Persistent: 12800 Scratch Memory: 0
[03/25/2022-13:25:07] [V] [TRT] Layer: MaxPool_15 Host Persistent: 0 Device Persistent: 0 Scratch Memory: 0
[03/25/2022-13:25:07] [V] [TRT] Layer: sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30 Host Persistent: 2048 Device Persistent: 5120 Scratch Memory: 0
[03/25/2022-13:25:07] [V] [TRT] Layer: sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node + Conv_72 Host Persistent: 2048 Device Persistent: 19456 Scratch Memory: 0
[03/25/2022-13:25:07] [V] [TRT] Layer: sections.0.0.conv2.module.weight + QuantizeLinear_39_quantize_scale_node + Conv_43 + Relu_45 Host Persistent: 2048 Device Persistent: 37888 Scratch Memory: 0
[03/25/2022-13:25:07] [V] [TRT] Layer: sections.0.0.conv3.module.weight + QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81 Host Persistent: 2048 Device Persistent: 19456 Scratch Memory: 0
[03/25/2022-13:25:07] [V] [TRT] Layer: sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96 Host Persistent: 2400 Device Persistent: 11264 Scratch Memory: 0
[03/25/2022-13:25:07] [V] [TRT] Layer: sections.0.1.conv2.module.weight + QuantizeLinear_105_quantize_scale_node + Conv_109 + Relu_111 Host Persistent: 2048 Device Persistent: 37888 Scratch Memory: 0
[03/25/2022-13:25:07] [V] [TRT] Layer: sections.0.1.conv3.module.weight + QuantizeLinear_120_quantize_scale_node + Conv_124 + Add_132 + Relu_133 Host Persistent: 2048 Device Persistent: 19456 Scratch Memory: 0
[03/25/2022-13:25:07] [V] [TRT] Layer: sections.0.2.conv1.module.weight + QuantizeLinear_142_quantize_scale_node + Conv_146 + Relu_148 Host Persistent: 2048 Device Persistent: 17408 Scratch Memory: 0
[03/25/2022-13:25:07] [V] [TRT] Layer: sections.0.2.conv2.module.weight + QuantizeLinear_157_quantize_scale_node + Conv_161 + Relu_163 Host Persistent: 2048 Device Persistent: 37888 Scratch Memory: 0
[03/25/2022-13:25:07] [V] [TRT] Layer: sections.0.2.conv3.module.weight + QuantizeLinear_172_quantize_scale_node + Conv_176 + Add_184 + Relu_185 Host Persistent: 2048 Device Persistent: 19456 Scratch Memory: 0
[03/25/2022-13:25:07] [V] [TRT] Layer: sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200 Host Persistent: 1152 Device Persistent: 68608 Scratch Memory: 0
[03/25/2022-13:25:07] [V] [TRT] Layer: sections.1.0.identity.conv.module.weight + QuantizeLinear_238_quantize_scale_node + Conv_242 Host Persistent: 2048 Device Persistent: 137216 Scratch Memory: 0
[03/25/2022-13:25:07] [V] [TRT] Layer: sections.1.0.conv2.module.weight + QuantizeLinear_209_quantize_scale_node + Conv_213 + Relu_215 Host Persistent: 2048 Device Persistent: 148992 Scratch Memory: 0
[03/25/2022-13:25:07] [V] [TRT] Layer: sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251 Host Persistent: 2048 Device Persistent: 71680 Scratch Memory: 0
[03/25/2022-13:25:07] [V] [TRT] Layer: sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266 Host Persistent: 1152 Device Persistent: 134144 Scratch Memory: 0
[03/25/2022-13:25:07] [V] [TRT] Layer: sections.1.1.conv2.module.weight + QuantizeLinear_275_quantize_scale_node + Conv_279 + Relu_281 Host Persistent: 2400 Device Persistent: 241152 Scratch Memory: 0
[03/25/2022-13:25:07] [V] [TRT] Layer: sections.1.1.conv3.module.weight + QuantizeLinear_290_quantize_scale_node + Conv_294 + Add_302 + Relu_303 Host Persistent: 2048 Device Persistent: 71680 Scratch Memory: 0
[03/25/2022-13:25:07] [V] [TRT] Layer: sections.1.2.conv1.module.weight + QuantizeLinear_312_quantize_scale_node + Conv_316 + Relu_318 Host Persistent: 1152 Device Persistent: 134144 Scratch Memory: 0
[03/25/2022-13:25:07] [V] [TRT] Layer: sections.1.2.conv2.module.weight + QuantizeLinear_327_quantize_scale_node + Conv_331 + Relu_333 Host Persistent: 2400 Device Persistent: 241152 Scratch Memory: 0
[03/25/2022-13:25:07] [V] [TRT] Layer: sections.1.2.conv3.module.weight + QuantizeLinear_342_quantize_scale_node + Conv_346 + Add_354 + Relu_355 Host Persistent: 2048 Device Persistent: 71680 Scratch Memory: 0
[03/25/2022-13:25:07] [V] [TRT] Layer: sections.1.3.conv1.module.weight + QuantizeLinear_364_quantize_scale_node + Conv_368 + Relu_370 Host Persistent: 1152 Device Persistent: 134144 Scratch Memory: 0
[03/25/2022-13:25:07] [V] [TRT] Layer: sections.1.3.conv2.module.weight + QuantizeLinear_379_quantize_scale_node + Conv_383 + Relu_385 Host Persistent: 2400 Device Persistent: 241152 Scratch Memory: 0
[03/25/2022-13:25:07] [V] [TRT] Layer: sections.1.3.conv3.module.weight + QuantizeLinear_394_quantize_scale_node + Conv_398 + Add_406 + Relu_407 Host Persistent: 2048 Device Persistent: 71680 Scratch Memory: 0
[03/25/2022-13:25:07] [V] [TRT] Layer: sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422 Host Persistent: 1152 Device Persistent: 268288 Scratch Memory: 0
[03/25/2022-13:25:07] [V] [TRT] Layer: sections.2.0.identity.conv.module.weight + QuantizeLinear_460_quantize_scale_node + Conv_464 Host Persistent: 2400 Device Persistent: 339968 Scratch Memory: 0
[03/25/2022-13:25:07] [V] [TRT] Layer: sections.2.0.conv2.module.weight + QuantizeLinear_431_quantize_scale_node + Conv_435 + Relu_437 Host Persistent: 2400 Device Persistent: 961536 Scratch Memory: 0
[03/25/2022-13:25:07] [V] [TRT] Layer: sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473 Host Persistent: 2048 Device Persistent: 274432 Scratch Memory: 0
[03/25/2022-13:25:07] [V] [TRT] Layer: sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488 Host Persistent: 2400 Device Persistent: 166912 Scratch Memory: 0
[03/25/2022-13:25:07] [V] [TRT] Layer: sections.2.1.conv2.module.weight + QuantizeLinear_497_quantize_scale_node + Conv_501 + Relu_503 Host Persistent: 2400 Device Persistent: 961536 Scratch Memory: 0
[03/25/2022-13:25:07] [V] [TRT] Layer: sections.2.1.conv3.module.weight + QuantizeLinear_512_quantize_scale_node + Conv_516 + Add_524 + Relu_525 Host Persistent: 2048 Device Persistent: 274432 Scratch Memory: 0
[03/25/2022-13:25:07] [V] [TRT] Layer: sections.2.2.conv1.module.weight + QuantizeLinear_534_quantize_scale_node + Conv_538 + Relu_540 Host Persistent: 2400 Device Persistent: 166912 Scratch Memory: 0
[03/25/2022-13:25:07] [V] [TRT] Layer: sections.2.2.conv2.module.weight + QuantizeLinear_549_quantize_scale_node + Conv_553 + Relu_555 Host Persistent: 2400 Device Persistent: 961536 Scratch Memory: 0
[03/25/2022-13:25:07] [V] [TRT] Layer: sections.2.2.conv3.module.weight + QuantizeLinear_564_quantize_scale_node + Conv_568 + Add_576 + Relu_577 Host Persistent: 2048 Device Persistent: 274432 Scratch Memory: 0
[03/25/2022-13:25:07] [V] [TRT] Layer: sections.2.3.conv1.module.weight + QuantizeLinear_586_quantize_scale_node + Conv_590 + Relu_592 Host Persistent: 2400 Device Persistent: 166912 Scratch Memory: 0
[03/25/2022-13:25:07] [V] [TRT] Layer: sections.2.3.conv2.module.weight + QuantizeLinear_601_quantize_scale_node + Conv_605 + Relu_607 Host Persistent: 2400 Device Persistent: 961536 Scratch Memory: 0
[03/25/2022-13:25:07] [V] [TRT] Layer: sections.2.3.conv3.module.weight + QuantizeLinear_616_quantize_scale_node + Conv_620 + Add_628 + Relu_629 Host Persistent: 2048 Device Persistent: 274432 Scratch Memory: 0
[03/25/2022-13:25:07] [V] [TRT] Layer: sections.2.4.conv1.module.weight + QuantizeLinear_638_quantize_scale_node + Conv_642 + Relu_644 Host Persistent: 2400 Device Persistent: 166912 Scratch Memory: 0
[03/25/2022-13:25:07] [V] [TRT] Layer: sections.2.4.conv2.module.weight + QuantizeLinear_653_quantize_scale_node + Conv_657 + Relu_659 Host Persistent: 2400 Device Persistent: 961536 Scratch Memory: 0
[03/25/2022-13:25:07] [V] [TRT] Layer: sections.2.4.conv3.module.weight + QuantizeLinear_668_quantize_scale_node + Conv_672 + Add_680 + Relu_681 Host Persistent: 2048 Device Persistent: 274432 Scratch Memory: 0
[03/25/2022-13:25:07] [V] [TRT] Layer: sections.2.5.conv1.module.weight + QuantizeLinear_690_quantize_scale_node + Conv_694 + Relu_696 Host Persistent: 2400 Device Persistent: 166912 Scratch Memory: 0
[03/25/2022-13:25:07] [V] [TRT] Layer: sections.2.5.conv2.module.weight + QuantizeLinear_705_quantize_scale_node + Conv_709 + Relu_711 Host Persistent: 2400 Device Persistent: 961536 Scratch Memory: 0
[03/25/2022-13:25:07] [V] [TRT] Layer: sections.2.5.conv3.module.weight + QuantizeLinear_720_quantize_scale_node + Conv_724 + Add_732 + Relu_733 Host Persistent: 2048 Device Persistent: 274432 Scratch Memory: 0
[03/25/2022-13:25:07] [V] [TRT] Layer: sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748 Host Persistent: 1152 Device Persistent: 1060864 Scratch Memory: 0
[03/25/2022-13:25:07] [V] [TRT] Layer: sections.3.0.identity.conv.module.weight + QuantizeLinear_786_quantize_scale_node + Conv_790 Host Persistent: 2400 Device Persistent: 1335296 Scratch Memory: 0
[03/25/2022-13:25:07] [V] [TRT] Layer: sections.3.0.conv2.module.weight + QuantizeLinear_757_quantize_scale_node + Conv_761 + Relu_763 Host Persistent: 2400 Device Persistent: 3840000 Scratch Memory: 0
[03/25/2022-13:25:07] [V] [TRT] Layer: sections.3.0.conv3.module.weight + QuantizeLinear_772_quantize_scale_node + Conv_776 + Add_798 + Relu_799 Host Persistent: 2048 Device Persistent: 1073152 Scratch Memory: 0
[03/25/2022-13:25:07] [V] [TRT] Layer: sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814 Host Persistent: 2400 Device Persistent: 661504 Scratch Memory: 0
[03/25/2022-13:25:07] [V] [TRT] Layer: sections.3.1.conv2.module.weight + QuantizeLinear_823_quantize_scale_node + Conv_827 + Relu_829 Host Persistent: 2400 Device Persistent: 3840000 Scratch Memory: 0
[03/25/2022-13:25:07] [V] [TRT] Layer: sections.3.1.conv3.module.weight + QuantizeLinear_838_quantize_scale_node + Conv_842 + Add_850 + Relu_851 Host Persistent: 2048 Device Persistent: 1073152 Scratch Memory: 0
[03/25/2022-13:25:07] [V] [TRT] Layer: DequantizeLinear_901_dequantize_scale_node Host Persistent: 0 Device Persistent: 0 Scratch Memory: 0
[03/25/2022-13:25:07] [V] [TRT] Layer: sections.3.2.conv1.module.weight + QuantizeLinear_860_quantize_scale_node + Conv_864 + Relu_866 Host Persistent: 2400 Device Persistent: 661504 Scratch Memory: 0
[03/25/2022-13:25:07] [V] [TRT] Layer: sections.3.2.conv2.module.weight + QuantizeLinear_875_quantize_scale_node + Conv_879 + Relu_881 Host Persistent: 2400 Device Persistent: 3840000 Scratch Memory: 0
[03/25/2022-13:25:07] [V] [TRT] Layer: Reformatting CopyNode for Input Tensor 0 to sections.3.2.conv3.module.weight + QuantizeLinear_890_quantize_scale_node + Conv_894 + Add_902 + Relu_903 Host Persistent: 0 Device Persistent: 0 Scratch Memory: 0
[03/25/2022-13:25:07] [V] [TRT] Layer: sections.3.2.conv3.module.weight + QuantizeLinear_890_quantize_scale_node + Conv_894 + Add_902 + Relu_903 Host Persistent: 3200 Device Persistent: 1090048 Scratch Memory: 0
[03/25/2022-13:25:07] [V] [TRT] Layer: GlobalAveragePool_904 Host Persistent: 48 Device Persistent: 0 Scratch Memory: 0
[03/25/2022-13:25:07] [V] [TRT] Layer: Reformatting CopyNode for Input Tensor 0 to Gemm_911 Host Persistent: 0 Device Persistent: 0 Scratch Memory: 0
[03/25/2022-13:25:07] [V] [TRT] Layer: Gemm_911 Host Persistent: 2400 Device Persistent: 8204288 Scratch Memory: 0
[03/25/2022-13:25:07] [V] [TRT] Layer: Reformatting CopyNode for Input Tensor 0 to (Unnamed Layer* 975) [Shuffle] Host Persistent: 0 Device Persistent: 0 Scratch Memory: 0
[03/25/2022-13:25:07] [V] [TRT] Layer: (Unnamed Layer* 975) [Shuffle] Host Persistent: 0 Device Persistent: 0 Scratch Memory: 0
[03/25/2022-13:25:07] [V] [TRT] Layer: Softmax_912 Host Persistent: 0 Device Persistent: 0 Scratch Memory: 0
[03/25/2022-13:25:07] [I] [TRT] Total Host Persistent Memory: 112720
[03/25/2022-13:25:07] [I] [TRT] Total Device Persistent Memory: 37543936
[03/25/2022-13:25:07] [I] [TRT] Total Scratch Memory: 0
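The three totals just reported can be re-derived from the per-layer lines above. A minimal sketch in Python, assuming the verbose log has been captured to a string; the regex and helper name are ours, not part of trtexec:

```python
import re

# Illustrative only: sum the per-layer figures trtexec prints above
# ("Host Persistent: ... Device Persistent: ... Scratch Memory: ...").
PATTERN = re.compile(
    r"Layer: (?P<name>.+?) Host Persistent: (?P<host>\d+) "
    r"Device Persistent: (?P<device>\d+) Scratch Memory: (?P<scratch>\d+)"
)

def sum_layer_memory(log_text: str):
    host = device = scratch = 0
    for m in PATTERN.finditer(log_text):
        host += int(m.group("host"))
        device += int(m.group("device"))
        scratch += int(m.group("scratch"))
    return host, device, scratch

# For the log above this should reproduce the reported totals:
# Total Host Persistent Memory: 112720,
# Total Device Persistent Memory: 37543936, Total Scratch Memory: 0.
```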
[03/25/2022-13:25:07] [I] [TRT] [MemUsageStats] Peak memory usage of TRT CPU/GPU memory allocators: CPU 120 MiB, GPU 369 MiB
[03/25/2022-13:25:07] [I] [TRT] [BlockAssignment] Algorithm ShiftNTopDown took 1.71574ms to assign 4 blocks to 61 nodes requiring 269746176 bytes.
[03/25/2022-13:25:07] [V] [TRT] Optimized block assignment.
[03/25/2022-13:25:07] [V] [TRT] Block size 102760448
[03/25/2022-13:25:07] [V] [TRT] Block size 102760448
[03/25/2022-13:25:07] [V] [TRT] Block size 51380224
[03/25/2022-13:25:07] [V] [TRT] Block size 12845056
[03/25/2022-13:25:07] [I] [TRT] Total Activation Memory: 269746176
[03/25/2022-13:25:07] [I] [TRT] (Sparsity) Layers eligible for sparse math: sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30, sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node + Conv_72, sections.0.0.conv2.module.weight + QuantizeLinear_39_quantize_scale_node + Conv_43 + Relu_45, sections.0.0.conv3.module.weight + QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81, sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96, sections.0.1.conv2.module.weight + QuantizeLinear_105_quantize_scale_node + Conv_109 + Relu_111, sections.0.1.conv3.module.weight + QuantizeLinear_120_quantize_scale_node + Conv_124 + Add_132 + Relu_133, sections.0.2.conv1.module.weight + QuantizeLinear_142_quantize_scale_node + Conv_146 + Relu_148, sections.0.2.conv2.module.weight + QuantizeLinear_157_quantize_scale_node + Conv_161 + Relu_163, sections.0.2.conv3.module.weight + QuantizeLinear_172_quantize_scale_node + Conv_176 + Add_184 + Relu_185, sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200, sections.1.0.identity.conv.module.weight + QuantizeLinear_238_quantize_scale_node + Conv_242, sections.1.0.conv2.module.weight + QuantizeLinear_209_quantize_scale_node + Conv_213 + Relu_215, sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251, sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266, sections.1.1.conv2.module.weight + QuantizeLinear_275_quantize_scale_node + Conv_279 + Relu_281, sections.1.1.conv3.module.weight + QuantizeLinear_290_quantize_scale_node + Conv_294 + Add_302 + Relu_303, sections.1.2.conv1.module.weight + QuantizeLinear_312_quantize_scale_node + Conv_316 + Relu_318, sections.1.2.conv2.module.weight + QuantizeLinear_327_quantize_scale_node + Conv_331 + Relu_333, sections.1.2.conv3.module.weight + QuantizeLinear_342_quantize_scale_node + Conv_346 + Add_354 + Relu_355, sections.1.3.conv1.module.weight + QuantizeLinear_364_quantize_scale_node + Conv_368 + Relu_370, sections.1.3.conv2.module.weight + QuantizeLinear_379_quantize_scale_node + Conv_383 + Relu_385, sections.1.3.conv3.module.weight + QuantizeLinear_394_quantize_scale_node + Conv_398 + Add_406 + Relu_407, sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422, sections.2.0.identity.conv.module.weight + QuantizeLinear_460_quantize_scale_node + Conv_464, sections.2.0.conv2.module.weight + QuantizeLinear_431_quantize_scale_node + Conv_435 + Relu_437, sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473, sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488, sections.2.1.conv2.module.weight + QuantizeLinear_497_quantize_scale_node + Conv_501 + Relu_503, sections.2.1.conv3.module.weight + QuantizeLinear_512_quantize_scale_node + Conv_516 + Add_524 + Relu_525, sections.2.2.conv1.module.weight + QuantizeLinear_534_quantize_scale_node + Conv_538 + Relu_540, sections.2.2.conv2.module.weight + QuantizeLinear_549_quantize_scale_node + Conv_553 + Relu_555, sections.2.2.conv3.module.weight + QuantizeLinear_564_quantize_scale_node + Conv_568 + Add_576 + Relu_577, sections.2.3.conv1.module.weight + QuantizeLinear_586_quantize_scale_node + Conv_590 + Relu_592, sections.2.3.conv2.module.weight + QuantizeLinear_601_quantize_scale_node + Conv_605 + Relu_607, sections.2.3.conv3.module.weight + QuantizeLinear_616_quantize_scale_node + Conv_620 + Add_628 + Relu_629, sections.2.4.conv1.module.weight + QuantizeLinear_638_quantize_scale_node + Conv_642 + Relu_644, sections.2.4.conv2.module.weight + QuantizeLinear_653_quantize_scale_node + Conv_657 + Relu_659, sections.2.4.conv3.module.weight + QuantizeLinear_668_quantize_scale_node + Conv_672 + Add_680 + Relu_681, sections.2.5.conv1.module.weight + QuantizeLinear_690_quantize_scale_node + Conv_694 + Relu_696, sections.2.5.conv2.module.weight + QuantizeLinear_705_quantize_scale_node + Conv_709 + Relu_711, sections.2.5.conv3.module.weight + QuantizeLinear_720_quantize_scale_node + Conv_724 + Add_732 + Relu_733, sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748, sections.3.0.identity.conv.module.weight + QuantizeLinear_786_quantize_scale_node + Conv_790, sections.3.0.conv2.module.weight + QuantizeLinear_757_quantize_scale_node + Conv_761 + Relu_763, sections.3.0.conv3.module.weight + QuantizeLinear_772_quantize_scale_node + Conv_776 + Add_798 + Relu_799, sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814, sections.3.1.conv2.module.weight + QuantizeLinear_823_quantize_scale_node + Conv_827 + Relu_829, sections.3.1.conv3.module.weight + QuantizeLinear_838_quantize_scale_node + Conv_842 + Add_850 + Relu_851, sections.3.2.conv1.module.weight + QuantizeLinear_860_quantize_scale_node + Conv_864 + Relu_866, sections.3.2.conv2.module.weight + QuantizeLinear_875_quantize_scale_node + Conv_879 + Relu_881, sections.3.2.conv3.module.weight + QuantizeLinear_890_quantize_scale_node + Conv_894 + Add_902 + Relu_903, Gemm_911
[03/25/2022-13:25:07] [I] [TRT] (Sparsity) TRT inference plan picked sparse implementation for layers: sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96, sections.1.1.conv2.module.weight + QuantizeLinear_275_quantize_scale_node + Conv_279 + Relu_281, sections.1.2.conv2.module.weight + QuantizeLinear_327_quantize_scale_node + Conv_331 + Relu_333, sections.1.3.conv2.module.weight + QuantizeLinear_379_quantize_scale_node + Conv_383 + Relu_385, sections.2.0.identity.conv.module.weight + QuantizeLinear_460_quantize_scale_node + Conv_464, sections.2.0.conv2.module.weight + QuantizeLinear_431_quantize_scale_node + Conv_435 + Relu_437, sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488, sections.2.1.conv2.module.weight + QuantizeLinear_497_quantize_scale_node + Conv_501 + Relu_503, sections.2.2.conv1.module.weight + QuantizeLinear_534_quantize_scale_node + Conv_538 + Relu_540, sections.2.2.conv2.module.weight + QuantizeLinear_549_quantize_scale_node + Conv_553 + Relu_555, sections.2.3.conv1.module.weight + QuantizeLinear_586_quantize_scale_node + Conv_590 + Relu_592, sections.2.3.conv2.module.weight + QuantizeLinear_601_quantize_scale_node + Conv_605 + Relu_607, sections.2.4.conv1.module.weight + QuantizeLinear_638_quantize_scale_node + Conv_642 + Relu_644, sections.2.4.conv2.module.weight + QuantizeLinear_653_quantize_scale_node + Conv_657 + Relu_659, sections.2.5.conv1.module.weight + QuantizeLinear_690_quantize_scale_node + Conv_694 + Relu_696, sections.2.5.conv2.module.weight + QuantizeLinear_705_quantize_scale_node + Conv_709 + Relu_711, sections.3.0.identity.conv.module.weight + QuantizeLinear_786_quantize_scale_node + Conv_790, sections.3.0.conv2.module.weight + QuantizeLinear_757_quantize_scale_node + Conv_761 + Relu_763, sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814, sections.3.1.conv2.module.weight + QuantizeLinear_823_quantize_scale_node + Conv_827 + Relu_829, sections.3.2.conv1.module.weight + QuantizeLinear_860_quantize_scale_node + Conv_864 + Relu_866, sections.3.2.conv2.module.weight + QuantizeLinear_875_quantize_scale_node + Conv_879 + Relu_881
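The two (Sparsity) lines above are the payoff of `--sparsity=enable`: every convolution and the GEMM are eligible, but TensorRT keeps a sparse implementation only where a timed sparse tactic (the `sptensor16x8x64` kernels seen earlier) beats the best dense one, which is why the "picked" list is a subset of the "eligible" list. The equivalent build through the Python API looks roughly like the following; a minimal sketch, assuming TensorRT 8.x and the same ONNX file named on the command line, with error handling omitted:

```python
import tensorrt as trt

logger = trt.Logger(trt.Logger.VERBOSE)
builder = trt.Builder(logger)
network = builder.create_network(
    1 << int(trt.NetworkDefinitionCreationFlag.EXPLICIT_BATCH))
parser = trt.OnnxParser(network, logger)
with open("resnet50_quant_sparse.onnx", "rb") as f:
    assert parser.parse(f.read())  # parse the quantized, 2:4-pruned model

config = builder.create_builder_config()
config.set_flag(trt.BuilderFlag.INT8)            # corresponds to --int8
config.set_flag(trt.BuilderFlag.SPARSE_WEIGHTS)  # corresponds to --sparsity=enable
# If the ONNX input were dynamic, an optimization profile for
# input 128x3x224x224 would also be needed (as --shapes provides).
engine_bytes = builder.build_serialized_network(network, config)
```

Because the model already carries QuantizeLinear/DequantizeLinear nodes, no INT8 calibrator should be needed here; the Q/DQ scales drive the quantization.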
[03/25/2022-13:25:07] [V] [TRT] Using cublasLt as a tactic source
[03/25/2022-13:25:07] [I] [TRT] [MemUsageChange] Init cuBLAS/cuBLASLt: CPU +0, GPU +8, now: CPU 2675, GPU 1686 (MiB)
[03/25/2022-13:25:07] [V] [TRT] Using cuDNN as a tactic source
[03/25/2022-13:25:07] [I] [TRT] [MemUsageChange] Init cuDNN: CPU +0, GPU +10, now: CPU 2675, GPU 1696 (MiB)
[03/25/2022-13:25:07] [V] [TRT] Engine generation completed in 60.5942 seconds.
[03/25/2022-13:25:07] [V] [TRT] Deleting timing cache: 140 entries, 223 hits
[03/25/2022-13:25:07] [V] [TRT] Engine Layer Information:
Layer(Scale): QuantizeLinear_2_quantize_scale_node, Tactic: 0, input[Float(128,3,224,224)] -> 1177[Int8(128,3,224,224)]
Layer(CaskConvolution): input.conv.module.weight + QuantizeLinear_8_quantize_scale_node + Conv_12 + Relu_14, Tactic: -5510956450195747703, 1177[Int8(128,3,224,224)] -> 1190[Int8(128,64,112,112)]
Layer(CudaPooling): MaxPool_15, Tactic: -4, 1190[Int8(128,64,112,112)] -> 1193[Int8(128,64,56,56)]
Layer(CaskConvolution): sections.0.0.conv1.module.weight + QuantizeLinear_24_quantize_scale_node + Conv_28 + Relu_30, Tactic: 3670282018109435863, 1193[Int8(128,64,56,56)] -> 1208[Int8(128,64,56,56)]
Layer(CaskConvolution): sections.0.0.identity.conv.module.weight + QuantizeLinear_68_quantize_scale_node + Conv_72, Tactic: 2972948223367788520, 1193[Int8(128,64,56,56)] -> 1251[Int8(128,256,56,56)]
Layer(CaskConvolution): sections.0.0.conv2.module.weight + QuantizeLinear_39_quantize_scale_node + Conv_43 + Relu_45, Tactic: -3182884991006484042, 1208[Int8(128,64,56,56)] -> 1223[Int8(128,64,56,56)]
Layer(CaskConvolution): sections.0.0.conv3.module.weight + QuantizeLinear_54_quantize_scale_node + Conv_58 + Add_80 + Relu_81, Tactic: 2376898825218218566, 1223[Int8(128,64,56,56)], 1251[Int8(128,256,56,56)] -> 1259[Int8(128,256,56,56)]
Layer(CaskConvolution): sections.0.1.conv1.module.weight + QuantizeLinear_90_quantize_scale_node + Conv_94 + Relu_96, Tactic: -4889498558023475527, 1259[Int8(128,256,56,56)] -> 1274[Int8(128,64,56,56)]
Layer(CaskConvolution): sections.0.1.conv2.module.weight + QuantizeLinear_105_quantize_scale_node + Conv_109 + Relu_111, Tactic: 5966973378912044513, 1274[Int8(128,64,56,56)] -> 1289[Int8(128,64,56,56)]
Layer(CaskConvolution): sections.0.1.conv3.module.weight + QuantizeLinear_120_quantize_scale_node + Conv_124 + Add_132 + Relu_133, Tactic: 2376898825218218566, 1289[Int8(128,64,56,56)], 1259[Int8(128,256,56,56)] -> 1311[Int8(128,256,56,56)]
Layer(CaskConvolution): sections.0.2.conv1.module.weight + QuantizeLinear_142_quantize_scale_node + Conv_146 + Relu_148, Tactic: -8985599729413291927, 1311[Int8(128,256,56,56)] -> 1326[Int8(128,64,56,56)]
Layer(CaskConvolution): sections.0.2.conv2.module.weight + QuantizeLinear_157_quantize_scale_node + Conv_161 + Relu_163, Tactic: 5966973378912044513, 1326[Int8(128,64,56,56)] -> 1341[Int8(128,64,56,56)]
Layer(CaskConvolution): sections.0.2.conv3.module.weight + QuantizeLinear_172_quantize_scale_node + Conv_176 + Add_184 + Relu_185, Tactic: 2376898825218218566, 1341[Int8(128,64,56,56)], 1311[Int8(128,256,56,56)] -> 1363[Int8(128,256,56,56)]
Layer(CaskConvolution): sections.1.0.conv1.module.weight + QuantizeLinear_194_quantize_scale_node + Conv_198 + Relu_200, Tactic: -6620675299995493092, 1363[Int8(128,256,56,56)] -> 1378[Int8(128,128,56,56)]
Layer(CaskConvolution): sections.1.0.identity.conv.module.weight + QuantizeLinear_238_quantize_scale_node + Conv_242, Tactic: 7731430299029542276, 1363[Int8(128,256,56,56)] -> 1421[Int8(128,512,28,28)]
Layer(CaskConvolution): sections.1.0.conv2.module.weight + QuantizeLinear_209_quantize_scale_node + Conv_213 + Relu_215, Tactic: -5709079507616090666, 1378[Int8(128,128,56,56)] -> 1393[Int8(128,128,28,28)]
Layer(CaskConvolution): sections.1.0.conv3.module.weight + QuantizeLinear_224_quantize_scale_node + Conv_228 + Add_250 + Relu_251, Tactic: 2972948223367788520, 1393[Int8(128,128,28,28)], 1421[Int8(128,512,28,28)] -> 1429[Int8(128,512,28,28)]
Layer(CaskConvolution): sections.1.1.conv1.module.weight + QuantizeLinear_260_quantize_scale_node + Conv_264 + Relu_266, Tactic: 1913026264725750683, 1429[Int8(128,512,28,28)] -> 1444[Int8(128,128,28,28)]
Layer(CaskConvolution): sections.1.1.conv2.module.weight + QuantizeLinear_275_quantize_scale_node + Conv_279 + Relu_281, Tactic: 4414594337986714263, 1444[Int8(128,128,28,28)] -> 1459[Int8(128,128,28,28)]
Layer(CaskConvolution): sections.1.1.conv3.module.weight + QuantizeLinear_290_quantize_scale_node + Conv_294 + Add_302 + Relu_303, Tactic: 2972948223367788520, 1459[Int8(128,128,28,28)], 1429[Int8(128,512,28,28)] -> 1481[Int8(128,512,28,28)]
Layer(CaskConvolution): sections.1.2.conv1.module.weight + QuantizeLinear_312_quantize_scale_node + Conv_316 + Relu_318, Tactic: 1913026264725750683, 1481[Int8(128,512,28,28)] -> 1496[Int8(128,128,28,28)]
Layer(CaskConvolution): sections.1.2.conv2.module.weight + QuantizeLinear_327_quantize_scale_node + Conv_331 + Relu_333, Tactic: 4414594337986714263, 1496[Int8(128,128,28,28)] -> 1511[Int8(128,128,28,28)]
Layer(CaskConvolution): sections.1.2.conv3.module.weight + QuantizeLinear_342_quantize_scale_node + Conv_346 + Add_354 + Relu_355, Tactic: 2972948223367788520, 1511[Int8(128,128,28,28)], 1481[Int8(128,512,28,28)] -> 1533[Int8(128,512,28,28)]
Layer(CaskConvolution): sections.1.3.conv1.module.weight + QuantizeLinear_364_quantize_scale_node + Conv_368 + Relu_370, Tactic: 1913026264725750683, 1533[Int8(128,512,28,28)] -> 1548[Int8(128,128,28,28)]
Layer(CaskConvolution): sections.1.3.conv2.module.weight + QuantizeLinear_379_quantize_scale_node + Conv_383 + Relu_385, Tactic: 4414594337986714263, 1548[Int8(128,128,28,28)] -> 1563[Int8(128,128,28,28)]
Layer(CaskConvolution): sections.1.3.conv3.module.weight + QuantizeLinear_394_quantize_scale_node + Conv_398 + Add_406 + Relu_407, Tactic: 2972948223367788520, 1563[Int8(128,128,28,28)], 1533[Int8(128,512,28,28)] -> 1585[Int8(128,512,28,28)]
Layer(CaskConvolution): sections.2.0.conv1.module.weight + QuantizeLinear_416_quantize_scale_node + Conv_420 + Relu_422, Tactic: 1913026264725750683, 1585[Int8(128,512,28,28)] -> 1600[Int8(128,256,28,28)]
Layer(CaskConvolution): sections.2.0.identity.conv.module.weight + QuantizeLinear_460_quantize_scale_node + Conv_464, Tactic: 4909502217677847353, 1585[Int8(128,512,28,28)] -> 1643[Int8(128,1024,14,14)]
Layer(CaskConvolution): sections.2.0.conv2.module.weight + QuantizeLinear_431_quantize_scale_node + Conv_435 + Relu_437, Tactic: 3651043333819148268, 1600[Int8(128,256,28,28)] -> 1615[Int8(128,256,14,14)]
Layer(CaskConvolution): sections.2.0.conv3.module.weight + QuantizeLinear_446_quantize_scale_node + Conv_450 + Add_472 + Relu_473, Tactic: 2376898825218218566, 1615[Int8(128,256,14,14)], 1643[Int8(128,1024,14,14)] -> 1651[Int8(128,1024,14,14)]
Layer(CaskConvolution): sections.2.1.conv1.module.weight + QuantizeLinear_482_quantize_scale_node + Conv_486 + Relu_488, Tactic: -6779804930216439173, 1651[Int8(128,1024,14,14)] -> 1666[Int8(128,256,14,14)]
Layer(CaskConvolution): sections.2.1.conv2.module.weight + QuantizeLinear_497_quantize_scale_node + Conv_501 + Relu_503, Tactic: 4414594337986714263, 1666[Int8(128,256,14,14)] -> 1681[Int8(128,256,14,14)]
Layer(CaskConvolution): sections.2.1.conv3.module.weight + QuantizeLinear_512_quantize_scale_node + Conv_516 + Add_524 + Relu_525, Tactic: 2376898825218218566, 1681[Int8(128,256,14,14)], 1651[Int8(128,1024,14,14)] -> 1703[Int8(128,1024,14,14)]
Layer(CaskConvolution): sections.2.2.conv1.module.weight + QuantizeLinear_534_quantize_scale_node + Conv_538 + Relu_540, Tactic: 857001784974286465, 1703[Int8(128,1024,14,14)] -> 1718[Int8(128,256,14,14)]
Layer(CaskConvolution): sections.2.2.conv2.module.weight + QuantizeLinear_549_quantize_scale_node + Conv_553 + Relu_555, Tactic: 4414594337986714263, 1718[Int8(128,256,14,14)] -> 1733[Int8(128,256,14,14)]
Layer(CaskConvolution): sections.2.2.conv3.module.weight + QuantizeLinear_564_quantize_scale_node + Conv_568 + Add_576 + Relu_577, Tactic: 2376898825218218566, 1733[Int8(128,256,14,14)], 1703[Int8(128,1024,14,14)] -> 1755[Int8(128,1024,14,14)]
Layer(CaskConvolution): sections.2.3.conv1.module.weight + QuantizeLinear_586_quantize_scale_node + Conv_590 + Relu_592, Tactic: -4681913707320020520, 1755[Int8(128,1024,14,14)] -> 1770[Int8(128,256,14,14)]
Layer(CaskConvolution): sections.2.3.conv2.module.weight + QuantizeLinear_601_quantize_scale_node + Conv_605 + Relu_607, Tactic: 4414594337986714263, 1770[Int8(128,256,14,14)] -> 1785[Int8(128,256,14,14)]
Layer(CaskConvolution): sections.2.3.conv3.module.weight + QuantizeLinear_616_quantize_scale_node + Conv_620 + Add_628 + Relu_629, Tactic: 2376898825218218566, 1785[Int8(128,256,14,14)], 1755[Int8(128,1024,14,14)] -> 1807[Int8(128,1024,14,14)]
Layer(CaskConvolution): sections.2.4.conv1.module.weight + QuantizeLinear_638_quantize_scale_node + Conv_642 + Relu_644, Tactic: -6779804930216439173, 1807[Int8(128,1024,14,14)] -> 1822[Int8(128,256,14,14)]
Layer(CaskConvolution): sections.2.4.conv2.module.weight + QuantizeLinear_653_quantize_scale_node + Conv_657 + Relu_659, Tactic: 4414594337986714263, 1822[Int8(128,256,14,14)] -> 1837[Int8(128,256,14,14)]
Layer(CaskConvolution): sections.2.4.conv3.module.weight + QuantizeLinear_668_quantize_scale_node + Conv_672 + Add_680 + Relu_681, Tactic: 2376898825218218566, 1837[Int8(128,256,14,14)], 1807[Int8(128,1024,14,14)] -> 1859[Int8(128,1024,14,14)]
Layer(CaskConvolution): sections.2.5.conv1.module.weight + QuantizeLinear_690_quantize_scale_node + Conv_694 + Relu_696, Tactic: -4681913707320020520, 1859[Int8(128,1024,14,14)] -> 1874[Int8(128,256,14,14)]
Layer(CaskConvolution): sections.2.5.conv2.module.weight + QuantizeLinear_705_quantize_scale_node + Conv_709 + Relu_711, Tactic: 4414594337986714263, 1874[Int8(128,256,14,14)] -> 1889[Int8(128,256,14,14)]
Layer(CaskConvolution): sections.2.5.conv3.module.weight + QuantizeLinear_720_quantize_scale_node + Conv_724 + Add_732 + Relu_733, Tactic: 2376898825218218566, 1889[Int8(128,256,14,14)], 1859[Int8(128,1024,14,14)] -> 1911[Int8(128,1024,14,14)]
Layer(CaskConvolution): sections.3.0.conv1.module.weight + QuantizeLinear_742_quantize_scale_node + Conv_746 + Relu_748, Tactic: 1263683011321748626, 1911[Int8(128,1024,14,14)] -> 1926[Int8(128,512,14,14)]
Layer(CaskConvolution): sections.3.0.identity.conv.module.weight + QuantizeLinear_786_quantize_scale_node + Conv_790, Tactic: 857001784974286465, 1911[Int8(128,1024,14,14)] -> 1969[Int8(128,2048,7,7)]
Layer(CaskConvolution): sections.3.0.conv2.module.weight + QuantizeLinear_757_quantize_scale_node + Conv_761 + Relu_763, Tactic: 3651043333819148268, 1926[Int8(128,512,14,14)] -> 1941[Int8(128,512,7,7)]
Layer(CaskConvolution): sections.3.0.conv3.module.weight + QuantizeLinear_772_quantize_scale_node + Conv_776 + Add_798 + Relu_799, Tactic: 8524082966802584889, 1941[Int8(128,512,7,7)], 1969[Int8(128,2048,7,7)] -> 1977[Int8(128,2048,7,7)]
Layer(CaskConvolution): sections.3.1.conv1.module.weight + QuantizeLinear_808_quantize_scale_node + Conv_812 + Relu_814, Tactic: -6779804930216439173, 1977[Int8(128,2048,7,7)] -> 1992[Int8(128,512,7,7)]
Layer(CaskConvolution): sections.3.1.conv2.module.weight + QuantizeLinear_823_quantize_scale_node + Conv_827 + Relu_829, Tactic: 3651043333819148268, 1992[Int8(128,512,7,7)] -> 2007[Int8(128,512,7,7)]
Layer(CaskConvolution): sections.3.1.conv3.module.weight + QuantizeLinear_838_quantize_scale_node + Conv_842 + Add_850 + Relu_851, Tactic: 8524082966802584889, 2007[Int8(128,512,7,7)], 1977[Int8(128,2048,7,7)] -> 2029[Int8(128,2048,7,7)]
Layer(Scale): DequantizeLinear_901_dequantize_scale_node, Tactic: 0, 2029[Int8(128,2048,7,7)] -> 2076[Float(128,2048,7,7)]
Layer(CaskConvolution): sections.3.2.conv1.module.weight + QuantizeLinear_860_quantize_scale_node + Conv_864 + Relu_866, Tactic: -6779804930216439173, 2029[Int8(128,2048,7,7)] -> 2044[Int8(128,512,7,7)]
Layer(CaskConvolution): sections.3.2.conv2.module.weight + QuantizeLinear_875_quantize_scale_node + Conv_879 + Relu_881, Tactic: 3651043333819148268, 2044[Int8(128,512,7,7)] -> 2059[Int8(128,512,7,7)]
Layer(Reformat): Reformatting CopyNode for Input Tensor 0 to sections.3.2.conv3.module.weight + QuantizeLinear_890_quantize_scale_node + Conv_894 + Add_902 + Relu_903, Tactic: 0, 2059[Int8(128,512,7,7)] -> Reformatted Input Tensor 0 to sections.3.2.conv3.module.weight + QuantizeLinear_890_quantize_scale_node + Conv_894 + Add_902 + Relu_903[Int8(128,512,7,7)]
Layer(CaskConvolution): sections.3.2.conv3.module.weight + QuantizeLinear_890_quantize_scale_node + Conv_894 + Add_902 + Relu_903, Tactic: -42427192380281294, Reformatted Input Tensor 0 to sections.3.2.conv3.module.weight + QuantizeLinear_890_quantize_scale_node + Conv_894 + Add_902 + Relu_903[Int8(128,512,7,7)], 2076[Float(128,2048,7,7)] -> 2078[Float(128,2048,7,7)]
Layer(CudnnPooling): GlobalAveragePool_904, Tactic: -1, 2078[Float(128,2048,7,7)] -> 2079[Float(128,2048,1,1)]
Layer(NoOp): Reformatting CopyNode for Input Tensor 0 to Gemm_911, Tactic: 0, 2079[Float(128,2048,1,1)] -> Reformatted Input Tensor 0 to Gemm_911[Float(128,2048,1,1)]
Layer(CaskConvolution): Gemm_911, Tactic: -1366318165940453381, Reformatted Input Tensor 0 to Gemm_911[Float(128,2048,1,1)] -> (Unnamed Layer* 971) [Fully Connected]_output[Float(128,1000,1,1)]
Layer(NoOp): Reformatting CopyNode for Input Tensor 0 to (Unnamed Layer* 975) [Shuffle], Tactic: 0, (Unnamed Layer* 971) [Fully Connected]_output[Float(128,1000,1,1)] -> Reformatted Input Tensor 0 to (Unnamed Layer* 975) [Shuffle][Float(128,1000,1,1)]
Layer(NoOp): (Unnamed Layer* 975) [Shuffle], Tactic: 0, Reformatted Input Tensor 0 to (Unnamed Layer* 975) [Shuffle][Float(128,1000,1,1)] -> output_0[Float(128,1000)]
Layer(CudaSoftMax): Softmax_912, Tactic: 1001, output_0[Float(128,1000)] -> output_1[Float(128,1000)]
[03/25/2022-13:25:07] [I] [TRT] [MemUsageChange] TensorRT-managed allocation in building engine: CPU +22, GPU +36, now: CPU 22, GPU 36 (MiB)
[03/25/2022-13:25:07] [I] [TRT] [MemUsageChange] Init CUDA: CPU +0, GPU +0, now: CPU 2597, GPU 1626 (MiB)
[03/25/2022-13:25:07] [I] [TRT] Loaded engine size: 35 MiB
[03/25/2022-13:25:07] [V] [TRT] Using cublasLt as a tactic source
[03/25/2022-13:25:07] [I] [TRT] [MemUsageChange] Init cuBLAS/cuBLASLt: CPU +0, GPU +10, now: CPU 2598, GPU 1672 (MiB)
[03/25/2022-13:25:07] [V] [TRT] Using cuDNN as a tactic source
[03/25/2022-13:25:07] [I] [TRT] [MemUsageChange] Init cuDNN: CPU +0, GPU +8, now: CPU 2598, GPU 1680 (MiB)
[03/25/2022-13:25:07] [V] [TRT] Deserialization required 10493 microseconds.
[03/25/2022-13:25:07] [I] [TRT] [MemUsageChange] TensorRT-managed allocation in engine deserialization: CPU +0, GPU +35, now: CPU 0, GPU 35 (MiB)
[03/25/2022-13:25:07] [I] Engine built in 65.7738 sec.
[03/25/2022-13:25:07] [V] [TRT] Using cublasLt as a tactic source
[03/25/2022-13:25:07] [I] [TRT] [MemUsageChange] Init cuBLAS/cuBLASLt: CPU +0, GPU +10, now: CPU 2246, GPU 1550 (MiB)
[03/25/2022-13:25:07] [V] [TRT] Using cuDNN as a tactic source
[03/25/2022-13:25:07] [I] [TRT] [MemUsageChange] Init cuDNN: CPU +0, GPU +8, now: CPU 2246, GPU 1558 (MiB)
[03/25/2022-13:25:07] [V] [TRT] Total per-runner device persistent memory is 37543936
[03/25/2022-13:25:07] [V] [TRT] Total per-runner host persistent memory is 112720
[03/25/2022-13:25:07] [V] [TRT] Allocated activation device memory of size 269746176
[03/25/2022-13:25:07] [I] [TRT] [MemUsageChange] TensorRT-managed allocation in IExecutionContext creation: CPU +0, GPU +293, now: CPU 0, GPU 328 (MiB)
[03/25/2022-13:25:07] [I] Using random values for input input
[03/25/2022-13:25:07] [I] Created input binding for input with dimensions 128x3x224x224
[03/25/2022-13:25:07] [I] Using random values for output output_0
[03/25/2022-13:25:07] [I] Created output binding for output_0 with dimensions 128x1000
[03/25/2022-13:25:07] [I] Using random values for output output_1
[03/25/2022-13:25:07] [I] Created output binding for output_1 with dimensions 128x1000
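The deserialization and IExecutionContext-creation steps reported above map onto a few runtime API calls. A minimal sketch, assuming TensorRT 8.x and that `engine_bytes` holds the serialized engine from the build sketch earlier:

```python
import tensorrt as trt

logger = trt.Logger(trt.Logger.INFO)
runtime = trt.Runtime(logger)

# "Deserialization required 10493 microseconds."
engine = runtime.deserialize_cuda_engine(engine_bytes)

# "TensorRT-managed allocation in IExecutionContext creation":
# creating the context allocates the per-runner activation memory.
context = engine.create_execution_context()
```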
[03/25/2022-13:25:07] [I] Starting inference
[03/25/2022-13:25:10] [I] Warmup completed 39 queries over 200 ms
[03/25/2022-13:25:10] [I] Timing trace has 603 queries over 3.01718 s
[03/25/2022-13:25:10] [I]
[03/25/2022-13:25:10] [I] === Trace details ===
[03/25/2022-13:25:10] [I] Trace averages of 10 runs:
[03/25/2022-13:25:10] [I] Average on 10 runs - GPU latency: 4.99599 ms - Host latency: 8.54441 ms (end to end 9.89321 ms, enqueue 0.50862 ms)
[03/25/2022-13:25:10] [I] Average on 10 runs - GPU latency: 4.99651 ms - Host latency: 8.53962 ms (end to end 9.89465 ms, enqueue 0.508864 ms)
[03/25/2022-13:25:10] [I] Average on 10 runs - GPU latency: 4.99476 ms - Host latency: 8.53515 ms (end to end 9.8897 ms, enqueue 0.507349 ms)
[03/25/2022-13:25:10] [I] Average on 10 runs - GPU latency: 4.99384 ms - Host latency: 8.53003 ms (end to end 9.88975 ms, enqueue 0.506564 ms)
[03/25/2022-13:25:10] [I] Average on 10 runs - GPU latency: 4.9964 ms - Host latency: 8.53435 ms (end to end 9.89483 ms, enqueue 0.506635 ms)
[03/25/2022-13:25:10] [I] Average on 10 runs - GPU latency: 4.99589 ms - Host latency: 8.53274 ms (end to end 9.89087 ms, enqueue 0.505823 ms)
[03/25/2022-13:25:10] [I] Average on 10 runs - GPU latency: 4.99539 ms - Host latency: 8.53224 ms (end to end 9.89895 ms, enqueue 0.505899 ms)
[03/25/2022-13:25:10] [I] Average on 10 runs - GPU latency: 4.99659 ms - Host latency: 8.53253 ms (end to end 9.89116 ms, enqueue 0.505634 ms)
[03/25/2022-13:25:10] [I] Average on 10 runs - GPU latency: 4.99568 ms - Host latency: 8.53276 ms (end to end 9.89506 ms, enqueue 0.506189 ms)
[03/25/2022-13:25:10] [I] Average on 10 runs - GPU latency: 4.98043 ms - Host latency: 8.52013 ms (end to end 9.69406 ms, enqueue 0.506671 ms)
[03/25/2022-13:25:10] [I] Average on 10 runs - GPU latency: 4.98965 ms - Host latency: 8.52471 ms (end to end 9.88299 ms, enqueue 0.505542 ms)
[03/25/2022-13:25:10] [I] Average on 10 runs - GPU latency: 4.99241 ms - Host latency: 8.52835 ms (end to end 9.88481 ms, enqueue 0.506567 ms)
[03/25/2022-13:25:10] [I] Average on 10 runs - GPU latency: 4.99261 ms - Host latency: 8.52747 ms (end to end 9.88978 ms, enqueue 0.507336 ms)
[03/25/2022-13:25:10] [I] Average on 10 runs - GPU latency: 4.99077 ms - Host latency: 8.52187 ms (end to end 9.88174 ms, enqueue 0.506757 ms)
[03/25/2022-13:25:10] [I] Average on 10 runs - GPU latency: 4.98871 ms - Host latency: 8.52133 ms (end to end 9.8745 ms, enqueue 0.507391 ms)
[03/25/2022-13:25:10] [I] Average on 10 runs - GPU latency: 4.99025 ms - Host latency: 8.52318 ms (end to end 9.88165 ms, enqueue 0.510168 ms)
[03/25/2022-13:25:10] [I] Average on 10 runs - GPU latency: 4.99014 ms - Host latency: 8.52007 ms (end to end 9.88351 ms, enqueue 0.511816 ms)
[03/25/2022-13:25:10] [I] Average on 10 runs - GPU latency: 4.99077 ms - Host latency: 8.52142 ms (end to end 9.88372 ms, enqueue 0.509509 ms)
[03/25/2022-13:25:10] [I] Average on 10 runs - GPU latency: 4.99149 ms - Host latency: 8.52084 ms (end to end 9.88619 ms, enqueue 0.511353 ms)
[03/25/2022-13:25:10] [I] Average on 10 runs - GPU latency: 4.99117 ms - Host latency: 8.52127 ms (end to end 9.88827 ms, enqueue 0.510608 ms)
[03/25/2022-13:25:10] [I] Average on 10 runs - GPU latency: 4.98882 ms - Host latency: 8.51803 ms (end to end 9.87947 ms, enqueue 0.511645 ms)
[03/25/2022-13:25:10] [I] Average on 10 runs - GPU latency: 4.99036 ms - Host latency: 8.51786 ms (end to end 9.88508 ms, enqueue 0.513525 ms)
[03/25/2022-13:25:10] [I] Average on 10 runs - GPU latency: 4.99127 ms - Host latency: 8.51688 ms (end to end 9.88485 ms, enqueue 0.51156 ms)
[03/25/2022-13:25:10] [I] Average on 10 runs - GPU latency: 4.99344 ms - Host latency: 8.5229 ms (end to end 9.89105 ms, enqueue 0.509961 ms)
[03/25/2022-13:25:10] [I] Average on 10 runs - GPU latency: 4.98871 ms - Host latency: 8.51608 ms (end to end 9.87886 ms, enqueue 0.510522 ms)
[03/25/2022-13:25:10] [I] Average on 10 runs - GPU latency: 4.98915 ms - Host latency: 8.51379 ms (end to end 9.87803 ms, enqueue 0.50719 ms)
[03/25/2022-13:25:10] [I] Average on 10 runs - GPU latency: 4.98829 ms - Host latency: 8.51334 ms (end to end 9.87832 ms, enqueue 0.505945 ms)
[03/25/2022-13:25:10] [I] Average on 10 runs - GPU latency: 4.99181 ms - Host latency: 8.51388 ms (end to end 9.8845 ms, enqueue 0.506323 ms)
[03/25/2022-13:25:10] [I] Average on 10 runs - GPU latency: 4.99108 ms - Host latency: 8.51571 ms (end to end 9.88364 ms, enqueue 0.507459 ms)
[03/25/2022-13:25:10] [I] Average on 10 runs - GPU latency: 4.98998 ms - Host latency: 8.51357 ms (end to end 9.88237 ms, enqueue 0.508008 ms)
[03/25/2022-13:25:10] [I] Average on 10 runs - GPU latency: 4.99095 ms - Host latency: 8.51321 ms (end to end 9.88406 ms, enqueue 0.507239 ms)
[03/25/2022-13:25:10] [I] Average on 10 runs - GPU latency: 4.99149 ms - Host latency: 8.51046 ms (end to end 9.87555 ms, enqueue 0.525098 ms)
[03/25/2022-13:25:10] [I] Average on 10 runs - GPU latency: 4.9917 ms - Host latency: 8.50521 ms (end to end 9.87777 ms, enqueue 0.507727 ms)
[03/25/2022-13:25:10] [I] Average on 10 runs - GPU latency: 4.99299 ms - Host latency: 8.50419 ms (end to end 9.88187 ms, enqueue 0.507422 ms)
[03/25/2022-13:25:10] [I] Average on 10 runs - GPU latency: 4.98978 ms - Host latency: 8.5021 ms (end to end 9.87726 ms, enqueue 0.509082 ms)
[03/25/2022-13:25:10] [I] Average on 10 runs - GPU latency: 4.99077 ms - Host latency: 8.50105 ms (end to end 9.87877 ms, enqueue 0.508362 ms)
[03/25/2022-13:25:10] [I] Average on 10 runs - GPU latency: 4.99272 ms - Host latency: 8.50148 ms (end to end 9.87813 ms, enqueue 0.50979 ms)
[03/25/2022-13:25:10] [I] Average on 10 runs - GPU latency: 4.99033 ms - Host latency: 8.49838 ms (end to end 9.87742 ms, enqueue 0.508276 ms)
[03/25/2022-13:25:10] [I] Average on 10 runs - GPU latency: 4.99038 ms - Host latency: 8.49517 ms (end to end 9.87991 ms, enqueue 0.509106 ms)
[03/25/2022-13:25:10] [I] Average on 10 runs - GPU latency: 4.99211 ms - Host latency: 8.49998 ms (end to end 9.87651 ms, enqueue 0.507837 ms)
[03/25/2022-13:25:10] [I] Average on 10 runs - GPU latency: 4.99333 ms - Host latency: 8.49719 ms (end to end 9.88193 ms, enqueue 0.508252 ms)
[03/25/2022-13:25:10] [I] Average on 10 runs - GPU latency: 4.99028 ms - Host latency: 8.49246 ms (end to end 9.87468 ms, enqueue 0.50625 ms)
[03/25/2022-13:25:10] [I] Average on 10 runs - GPU latency: 4.99263 ms - Host latency: 8.49297 ms (end to end 9.88105 ms, enqueue 0.50852 ms)
[03/25/2022-13:25:10] [I] Average on 10 runs - GPU latency: 4.99167 ms - Host latency: 8.49185 ms (end to end 9.87876 ms, enqueue 0.505151 ms)
[03/25/2022-13:25:10] [I] Average on 10 runs - GPU latency: 4.99224 ms - Host latency: 8.49302 ms (end to end 9.88115 ms, enqueue 0.504761 ms)
[03/25/2022-13:25:10] [I] Average on 10 runs - GPU latency: 4.99141 ms - Host latency: 8.4876 ms (end to end 9.87986 ms, enqueue 0.50398 ms)
[03/25/2022-13:25:10] [I] Average on 10 runs - GPU latency: 4.99077 ms - Host latency: 8.48804 ms (end to end 9.87815 ms, enqueue 0.503906 ms)
[03/25/2022-13:25:10] [I] Average on 10 runs - GPU latency: 4.99104 ms - Host latency: 8.49133 ms (end to end 9.87771 ms, enqueue 0.506519 ms)
[03/25/2022-13:25:10] [I] Average on 10 runs - GPU latency: 4.99209 ms - Host latency: 8.49246 ms (end to end 9.88083 ms, enqueue 0.505225 ms)
[03/25/2022-13:25:10] [I] Average on 10 runs - GPU latency: 4.98997 ms - Host latency: 8.48721 ms (end to end 9.87688 ms, enqueue 0.505347 ms)
[03/25/2022-13:25:10] [I] Average on 10 runs - GPU latency: 4.98994 ms - Host latency: 8.48914 ms (end to end 9.87561 ms, enqueue 0.505444 ms)
[03/25/2022-13:25:10] [I] Average on 10 runs - GPU latency: 4.99028 ms - Host latency: 8.4896 ms (end to end 9.87507 ms, enqueue 0.506836 ms)
[03/25/2022-13:25:10] [I] Average on 10 runs - GPU latency: 4.99104 ms - Host latency: 8.48748 ms (end to end 9.8792 ms, enqueue 0.507251 ms)
[03/25/2022-13:25:10] [I] Average on 10 runs - GPU latency: 4.99048 ms - Host latency: 8.48643 ms (end to end 9.8771 ms, enqueue 0.510059 ms)
[03/25/2022-13:25:10] [I] Average on 10 runs - GPU latency: 4.99172 ms - Host latency: 8.48801 ms (end to end 9.88142 ms, enqueue 0.507861 ms)
[03/25/2022-13:25:10] [I] Average on 10 runs - GPU latency: 4.9895 ms - Host latency: 8.48499 ms (end to end 9.87529 ms, enqueue 0.510645 ms)
[03/25/2022-13:25:10] [I] Average on 10 runs - GPU latency: 4.99065 ms - Host latency: 8.48665 ms (end to end 9.88091 ms, enqueue 0.507935 ms)
[03/25/2022-13:25:10] [I] Average on 10 runs - GPU latency: 4.9886 ms - Host latency: 8.48208 ms (end to end 9.87534 ms, enqueue 0.510645 ms)
[03/25/2022-13:25:10] [I] Average on 10 runs - GPU latency: 4.99209 ms - Host latency: 8.48638 ms (end to end 9.88716 ms, enqueue 0.510156 ms)
[03/25/2022-13:25:10] [I] Average on 10 runs - GPU latency: 4.99119 ms - Host latency: 8.48047 ms (end to end 9.88513 ms, enqueue 0.511401 ms)
[03/25/2022-13:25:10] [I]
[03/25/2022-13:25:10] [I] === Performance summary ===
[03/25/2022-13:25:10] [I] Throughput: 199.856 qps
[03/25/2022-13:25:10] [I] Latency: min = 8.47217 ms, max = 8.56494 ms, mean = 8.50889 ms, median = 8.50952 ms, percentile(99%) = 8.54593 ms
[03/25/2022-13:25:10] [I] End-to-End Host Latency: min = 8.56702 ms, max = 9.97339 ms, mean = 9.87959 ms, median = 9.8822 ms, percentile(99%) = 9.91235 ms
[03/25/2022-13:25:10] [I] Enqueue Time: min = 0.501221 ms, max = 0.654175 ms, mean = 0.508246 ms, median = 0.506714 ms, percentile(99%) = 0.524292 ms
[03/25/2022-13:25:10] [I] H2D Latency: min = 3.40381 ms, max = 3.47464 ms, mean = 3.43683 ms, median = 3.43945 ms, percentile(99%) = 3.46896 ms
[03/25/2022-13:25:10] [I] GPU Compute Time: min = 4.9613 ms, max = 5.08203 ms, mean = 4.9916 ms, median = 4.99097 ms, percentile(99%) = 5.00122 ms
[03/25/2022-13:25:10] [I] D2H Latency: min = 0.0773926 ms, max = 0.106506 ms, mean = 0.0804736 ms, median = 0.0803223 ms, percentile(99%) = 0.0825195 ms
[03/25/2022-13:25:10] [I] Total Host Walltime: 3.01718 s
[03/25/2022-13:25:10] [I] Total GPU Compute Time: 3.00993 s
[03/25/2022-13:25:10] [I] Explanations of the performance metrics are printed in the verbose logs.
[03/25/2022-13:25:10] [V]
[03/25/2022-13:25:10] [V] === Explanations of the performance metrics ===
[03/25/2022-13:25:10] [V] Total Host Walltime: the host walltime from when the first query (after warmups) is enqueued to when the last query is completed.
[03/25/2022-13:25:10] [V] GPU Compute Time: the GPU latency to execute the kernels for a query.
[03/25/2022-13:25:10] [V] Total GPU Compute Time: the summation of the GPU Compute Time of all the queries. If this is significantly shorter than Total Host Walltime, the GPU may be under-utilized because of host-side overheads or data transfers.
[03/25/2022-13:25:10] [V] Throughput: the observed throughput computed by dividing the number of queries by the Total Host Walltime. If this is significantly lower than the reciprocal of GPU Compute Time, the GPU may be under-utilized because of host-side overheads or data transfers.
[03/25/2022-13:25:10] [V] Enqueue Time: the host latency to enqueue a query. If this is longer than GPU Compute Time, the GPU may be under-utilized.
[03/25/2022-13:25:10] [V] H2D Latency: the latency for host-to-device data transfers for input tensors of a single query.
[03/25/2022-13:25:10] [V] D2H Latency: the latency for device-to-host data transfers for output tensors of a single query.
[03/25/2022-13:25:10] [V] Latency: the summation of H2D Latency, GPU Compute Time, and D2H Latency. This is the latency to infer a single query.
[03/25/2022-13:25:10] [V] End-to-End Host Latency: the duration from when the H2D of a query is called to when the D2H of the same query is completed, which includes the latency to wait for the completion of the previous query. This is the latency of a query if multiple queries are enqueued consecutively.
[03/25/2022-13:25:10] [I] &&&& PASSED TensorRT.trtexec [TensorRT v8203] # trtexec --onnx=resnet50_quant_sparse.onnx --int8 --sparsity=enable --shapes=input:128x3x224x224 --verbose
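As a closing sanity check, the metric definitions above can be verified arithmetically against the performance summary; all values below are taken directly from this log:

```python
# Mean values from the performance summary, in milliseconds.
h2d, gpu, d2h = 3.43683, 4.9916, 0.0804736

# Latency = H2D Latency + GPU Compute Time + D2H Latency
print(h2d + gpu + d2h)  # ~8.5089 ms, matching the reported mean of 8.50889 ms

# Throughput = number of queries / Total Host Walltime
print(603 / 3.01718)    # ~199.86 qps, matching the reported 199.856 qps
```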