PyTorch/ONNX Model Involving Very Large Images: Myelin error: autotuning: CUDA error 2 allocating 0-byte buffer: out of memory

Description

I am trying to use trtexec to convert an unconventional ONNX model to a TensorRT engine. The input is a stack of 9 very large (61 MP) grayscale images, which we treat as 9 channels for application-specific reasons, so the network sees a 9 x 6380 x 9568 tensor. This model (and much larger versions of it) runs fine in PyTorch on the GPU, but the TensorRT build cannot seem to secure enough memory: trtexec fails with the Myelin autotuning error in the title.
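
For reference, the export is along the lines of the sketch below. The module here is a hypothetical stand-in (the real network is in the attached ONNX file); the input name im_stack, the 1x9x6380x9568 shape, the dynamic batch axis, opset 13, and the max_frame_out / argmax_frame_out output names match what the verbose log further down reports.

import torch
import torch.nn as nn

class MaxNetStub(nn.Module):
    # Hypothetical stand-in for the real model: one 9x9 depthwise conv
    # (matching the (9, 1, 9, 9) Conv weights seen in the log) followed by
    # a channel-wise max/argmax, just so the export call is runnable.
    def __init__(self):
        super().__init__()
        self.conv = nn.Conv2d(9, 9, kernel_size=9, padding=4, groups=9, bias=False)

    def forward(self, x):
        y = self.conv(x)
        max_frame_out, indices = y.max(dim=1)
        return max_frame_out, indices.int()

model = MaxNetStub().eval()

# A single 9-channel 6380x9568 FP32 frame stack:
# 9 * 6380 * 9568 * 4 bytes ~= 2.2 GB per activation-sized tensor.
dummy = torch.zeros(1, 9, 6380, 9568)

torch.onnx.export(
    model,
    dummy,
    "maxnet_stub.onnx",
    opset_version=13,
    input_names=["im_stack"],
    output_names=["max_frame_out", "argmax_frame_out"],
    dynamic_axes={"im_stack": {0: "batch"}},
)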

Environment

TensorRT Version: 8.4.3.1
GPU Type: GeForce RTX 3090
Nvidia Driver Version: 470.141.03
CUDA Version: 11.7.1
CUDNN Version: 8.4.1
Operating System + Version: Ubuntu 20.04
Python Version (if applicable): 3.8.13
PyTorch Version (if applicable): 1.12.1
Baremetal or Container (if container which image + tag): Baremetal

Relevant Files

This is the ONNX model file:
maxnet_2x4.onnx (28.6 KB)

Steps To Reproduce

  • Exact steps to run the repro (adjust paths for your system); a shape-pinned variant of the command is sketched just below it.
 ./trtexec --onnx=/home/c.mourning/maxnet_2x4.onnx --saveEngine=/home/c.mourning/maxnet_2x4.engine --verbose
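The batch dimension is dynamic, so trtexec later warns that no shapes were provided and overrides im_stack to 1x9x6380x9568. For completeness, a variant that pins the input shape explicitly and raises the workspace pool would look roughly like the following (the flag values are illustrative; this is not the command that produced the log below):
 ./trtexec --onnx=/home/c.mourning/maxnet_2x4.onnx --saveEngine=/home/c.mourning/maxnet_2x4.engine --verbose --minShapes=im_stack:1x9x6380x9568 --optShapes=im_stack:1x9x6380x9568 --maxShapes=im_stack:1x9x6380x9568 --memPoolSize=workspace:20480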
  • Full verbose log of the build and the errors encountered:
&&&& RUNNING TensorRT.trtexec [TensorRT v8403] # ./trtexec --onnx=/home/c.mourning/maxnet_2x4.onnx --saveEngine=/home/c.mourning/maxnet_2x4.engine --verbose
[09/27/2022-11:27:50] [I] === Model Options ===
[09/27/2022-11:27:50] [I] Format: ONNX
[09/27/2022-11:27:50] [I] Model: /home/c.mourning/maxnet_2x4.onnx
[09/27/2022-11:27:50] [I] Output:
[09/27/2022-11:27:50] [I] === Build Options ===
[09/27/2022-11:27:50] [I] Max batch: explicit batch
[09/27/2022-11:27:50] [I] Memory Pools: workspace: default, dlaSRAM: default, dlaLocalDRAM: default, dlaGlobalDRAM: default
[09/27/2022-11:27:50] [I] minTiming: 1
[09/27/2022-11:27:50] [I] avgTiming: 8
[09/27/2022-11:27:50] [I] Precision: FP32
[09/27/2022-11:27:50] [I] LayerPrecisions: 
[09/27/2022-11:27:50] [I] Calibration: 
[09/27/2022-11:27:50] [I] Refit: Disabled
[09/27/2022-11:27:50] [I] Sparsity: Disabled
[09/27/2022-11:27:50] [I] Safe mode: Disabled
[09/27/2022-11:27:50] [I] DirectIO mode: Disabled
[09/27/2022-11:27:50] [I] Restricted mode: Disabled
[09/27/2022-11:27:50] [I] Build only: Disabled
[09/27/2022-11:27:50] [I] Save engine: /home/c.mourning/maxnet_2x4.engine
[09/27/2022-11:27:50] [I] Load engine: 
[09/27/2022-11:27:50] [I] Profiling verbosity: 0
[09/27/2022-11:27:50] [I] Tactic sources: Using default tactic sources
[09/27/2022-11:27:50] [I] timingCacheMode: local
[09/27/2022-11:27:50] [I] timingCacheFile: 
[09/27/2022-11:27:50] [I] Input(s)s format: fp32:CHW
[09/27/2022-11:27:50] [I] Output(s)s format: fp32:CHW
[09/27/2022-11:27:50] [I] Input build shapes: model
[09/27/2022-11:27:50] [I] Input calibration shapes: model
[09/27/2022-11:27:50] [I] === System Options ===
[09/27/2022-11:27:50] [I] Device: 0
[09/27/2022-11:27:50] [I] DLACore: 
[09/27/2022-11:27:50] [I] Plugins:
[09/27/2022-11:27:50] [I] === Inference Options ===
[09/27/2022-11:27:50] [I] Batch: Explicit
[09/27/2022-11:27:50] [I] Input inference shapes: model
[09/27/2022-11:27:50] [I] Iterations: 10
[09/27/2022-11:27:50] [I] Duration: 3s (+ 200ms warm up)
[09/27/2022-11:27:50] [I] Sleep time: 0ms
[09/27/2022-11:27:50] [I] Idle time: 0ms
[09/27/2022-11:27:50] [I] Streams: 1
[09/27/2022-11:27:50] [I] ExposeDMA: Disabled
[09/27/2022-11:27:50] [I] Data transfers: Enabled
[09/27/2022-11:27:50] [I] Spin-wait: Disabled
[09/27/2022-11:27:50] [I] Multithreading: Disabled
[09/27/2022-11:27:50] [I] CUDA Graph: Disabled
[09/27/2022-11:27:50] [I] Separate profiling: Disabled
[09/27/2022-11:27:50] [I] Time Deserialize: Disabled
[09/27/2022-11:27:50] [I] Time Refit: Disabled
[09/27/2022-11:27:50] [I] Inputs:
[09/27/2022-11:27:50] [I] === Reporting Options ===
[09/27/2022-11:27:50] [I] Verbose: Enabled
[09/27/2022-11:27:50] [I] Averages: 10 inferences
[09/27/2022-11:27:50] [I] Percentile: 99
[09/27/2022-11:27:50] [I] Dump refittable layers:Disabled
[09/27/2022-11:27:50] [I] Dump output: Disabled
[09/27/2022-11:27:50] [I] Profile: Disabled
[09/27/2022-11:27:50] [I] Export timing to JSON file: 
[09/27/2022-11:27:50] [I] Export output to JSON file: 
[09/27/2022-11:27:50] [I] Export profile to JSON file: 
[09/27/2022-11:27:50] [I] 
[09/27/2022-11:27:51] [I] === Device Information ===
[09/27/2022-11:27:51] [I] Selected Device: NVIDIA GeForce RTX 3090
[09/27/2022-11:27:51] [I] Compute Capability: 8.6
[09/27/2022-11:27:51] [I] SMs: 82
[09/27/2022-11:27:51] [I] Compute Clock Rate: 1.695 GHz
[09/27/2022-11:27:51] [I] Device Global Memory: 24258 MiB
[09/27/2022-11:27:51] [I] Shared Memory per SM: 100 KiB
[09/27/2022-11:27:51] [I] Memory Bus Width: 384 bits (ECC disabled)
[09/27/2022-11:27:51] [I] Memory Clock Rate: 9.751 GHz
[09/27/2022-11:27:51] [I] 
[09/27/2022-11:27:51] [I] TensorRT version: 8.4.3
[09/27/2022-11:27:51] [V] [TRT] Registered plugin creator - ::GridAnchor_TRT version 1
[09/27/2022-11:27:51] [V] [TRT] Registered plugin creator - ::GridAnchorRect_TRT version 1
[09/27/2022-11:27:51] [V] [TRT] Registered plugin creator - ::NMS_TRT version 1
[09/27/2022-11:27:51] [V] [TRT] Registered plugin creator - ::Reorg_TRT version 1
[09/27/2022-11:27:51] [V] [TRT] Registered plugin creator - ::Region_TRT version 1
[09/27/2022-11:27:51] [V] [TRT] Registered plugin creator - ::Clip_TRT version 1
[09/27/2022-11:27:51] [V] [TRT] Registered plugin creator - ::LReLU_TRT version 1
[09/27/2022-11:27:51] [V] [TRT] Registered plugin creator - ::PriorBox_TRT version 1
[09/27/2022-11:27:51] [V] [TRT] Registered plugin creator - ::Normalize_TRT version 1
[09/27/2022-11:27:51] [V] [TRT] Registered plugin creator - ::ScatterND version 1
[09/27/2022-11:27:51] [V] [TRT] Registered plugin creator - ::RPROI_TRT version 1
[09/27/2022-11:27:51] [V] [TRT] Registered plugin creator - ::BatchedNMS_TRT version 1
[09/27/2022-11:27:51] [V] [TRT] Registered plugin creator - ::BatchedNMSDynamic_TRT version 1
[09/27/2022-11:27:51] [V] [TRT] Registered plugin creator - ::BatchTilePlugin_TRT version 1
[09/27/2022-11:27:51] [V] [TRT] Registered plugin creator - ::FlattenConcat_TRT version 1
[09/27/2022-11:27:51] [V] [TRT] Registered plugin creator - ::CropAndResize version 1
[09/27/2022-11:27:51] [V] [TRT] Registered plugin creator - ::CropAndResizeDynamic version 1
[09/27/2022-11:27:51] [V] [TRT] Registered plugin creator - ::DetectionLayer_TRT version 1
[09/27/2022-11:27:51] [V] [TRT] Registered plugin creator - ::EfficientNMS_TRT version 1
[09/27/2022-11:27:51] [V] [TRT] Registered plugin creator - ::EfficientNMS_ONNX_TRT version 1
[09/27/2022-11:27:51] [V] [TRT] Registered plugin creator - ::EfficientNMS_Explicit_TF_TRT version 1
[09/27/2022-11:27:51] [V] [TRT] Registered plugin creator - ::EfficientNMS_Implicit_TF_TRT version 1
[09/27/2022-11:27:51] [V] [TRT] Registered plugin creator - ::ProposalDynamic version 1
[09/27/2022-11:27:51] [V] [TRT] Registered plugin creator - ::Proposal version 1
[09/27/2022-11:27:51] [V] [TRT] Registered plugin creator - ::ProposalLayer_TRT version 1
[09/27/2022-11:27:51] [V] [TRT] Registered plugin creator - ::PyramidROIAlign_TRT version 1
[09/27/2022-11:27:51] [V] [TRT] Registered plugin creator - ::ResizeNearest_TRT version 1
[09/27/2022-11:27:51] [V] [TRT] Registered plugin creator - ::Split version 1
[09/27/2022-11:27:51] [V] [TRT] Registered plugin creator - ::SpecialSlice_TRT version 1
[09/27/2022-11:27:51] [V] [TRT] Registered plugin creator - ::InstanceNormalization_TRT version 1
[09/27/2022-11:27:51] [V] [TRT] Registered plugin creator - ::InstanceNormalization_TRT version 2
[09/27/2022-11:27:51] [V] [TRT] Registered plugin creator - ::CoordConvAC version 1
[09/27/2022-11:27:51] [V] [TRT] Registered plugin creator - ::DecodeBbox3DPlugin version 1
[09/27/2022-11:27:51] [V] [TRT] Registered plugin creator - ::GenerateDetection_TRT version 1
[09/27/2022-11:27:51] [V] [TRT] Registered plugin creator - ::MultilevelCropAndResize_TRT version 1
[09/27/2022-11:27:51] [V] [TRT] Registered plugin creator - ::MultilevelProposeROI_TRT version 1
[09/27/2022-11:27:51] [V] [TRT] Registered plugin creator - ::NMSDynamic_TRT version 1
[09/27/2022-11:27:51] [V] [TRT] Registered plugin creator - ::PillarScatterPlugin version 1
[09/27/2022-11:27:51] [V] [TRT] Registered plugin creator - ::VoxelGeneratorPlugin version 1
[09/27/2022-11:27:51] [V] [TRT] Registered plugin creator - ::MultiscaleDeformableAttnPlugin_TRT version 1
[09/27/2022-11:27:51] [I] [TRT] [MemUsageChange] Init CUDA: CPU +328, GPU +0, now: CPU 336, GPU 433 (MiB)
[09/27/2022-11:27:52] [I] [TRT] [MemUsageChange] Init builder kernel library: CPU +327, GPU +104, now: CPU 682, GPU 537 (MiB)
[09/27/2022-11:27:52] [I] Start parsing network model
[09/27/2022-11:27:52] [I] [TRT] ----------------------------------------------------------------
[09/27/2022-11:27:52] [I] [TRT] Input filename:   /home/c.mourning/maxnet_2x4.onnx
[09/27/2022-11:27:52] [I] [TRT] ONNX IR version:  0.0.7
[09/27/2022-11:27:52] [I] [TRT] Opset version:    13
[09/27/2022-11:27:52] [I] [TRT] Producer name:    pytorch
[09/27/2022-11:27:52] [I] [TRT] Producer version: 1.12.1
[09/27/2022-11:27:52] [I] [TRT] Domain:           
[09/27/2022-11:27:52] [I] [TRT] Model version:    0
[09/27/2022-11:27:52] [I] [TRT] Doc string:       
[09/27/2022-11:27:52] [I] [TRT] ----------------------------------------------------------------
[09/27/2022-11:27:52] [V] [TRT] Plugin creator already registered - ::GridAnchor_TRT version 1
[09/27/2022-11:27:52] [V] [TRT] Plugin creator already registered - ::GridAnchorRect_TRT version 1
[09/27/2022-11:27:52] [V] [TRT] Plugin creator already registered - ::NMS_TRT version 1
[09/27/2022-11:27:52] [V] [TRT] Plugin creator already registered - ::Reorg_TRT version 1
[09/27/2022-11:27:52] [V] [TRT] Plugin creator already registered - ::Region_TRT version 1
[09/27/2022-11:27:52] [V] [TRT] Plugin creator already registered - ::Clip_TRT version 1
[09/27/2022-11:27:52] [V] [TRT] Plugin creator already registered - ::LReLU_TRT version 1
[09/27/2022-11:27:52] [V] [TRT] Plugin creator already registered - ::PriorBox_TRT version 1
[09/27/2022-11:27:52] [V] [TRT] Plugin creator already registered - ::Normalize_TRT version 1
[09/27/2022-11:27:52] [V] [TRT] Plugin creator already registered - ::ScatterND version 1
[09/27/2022-11:27:52] [V] [TRT] Plugin creator already registered - ::RPROI_TRT version 1
[09/27/2022-11:27:52] [V] [TRT] Plugin creator already registered - ::BatchedNMS_TRT version 1
[09/27/2022-11:27:52] [V] [TRT] Plugin creator already registered - ::BatchedNMSDynamic_TRT version 1
[09/27/2022-11:27:52] [V] [TRT] Plugin creator already registered - ::BatchTilePlugin_TRT version 1
[09/27/2022-11:27:52] [V] [TRT] Plugin creator already registered - ::FlattenConcat_TRT version 1
[09/27/2022-11:27:52] [V] [TRT] Plugin creator already registered - ::CropAndResize version 1
[09/27/2022-11:27:52] [V] [TRT] Plugin creator already registered - ::CropAndResizeDynamic version 1
[09/27/2022-11:27:52] [V] [TRT] Plugin creator already registered - ::DetectionLayer_TRT version 1
[09/27/2022-11:27:52] [V] [TRT] Plugin creator already registered - ::EfficientNMS_TRT version 1
[09/27/2022-11:27:52] [V] [TRT] Plugin creator already registered - ::EfficientNMS_ONNX_TRT version 1
[09/27/2022-11:27:52] [V] [TRT] Plugin creator already registered - ::EfficientNMS_Explicit_TF_TRT version 1
[09/27/2022-11:27:52] [V] [TRT] Plugin creator already registered - ::EfficientNMS_Implicit_TF_TRT version 1
[09/27/2022-11:27:52] [V] [TRT] Plugin creator already registered - ::ProposalDynamic version 1
[09/27/2022-11:27:52] [V] [TRT] Plugin creator already registered - ::Proposal version 1
[09/27/2022-11:27:52] [V] [TRT] Plugin creator already registered - ::ProposalLayer_TRT version 1
[09/27/2022-11:27:52] [V] [TRT] Plugin creator already registered - ::PyramidROIAlign_TRT version 1
[09/27/2022-11:27:52] [V] [TRT] Plugin creator already registered - ::ResizeNearest_TRT version 1
[09/27/2022-11:27:52] [V] [TRT] Plugin creator already registered - ::Split version 1
[09/27/2022-11:27:52] [V] [TRT] Plugin creator already registered - ::SpecialSlice_TRT version 1
[09/27/2022-11:27:52] [V] [TRT] Plugin creator already registered - ::InstanceNormalization_TRT version 1
[09/27/2022-11:27:52] [V] [TRT] Plugin creator already registered - ::InstanceNormalization_TRT version 2
[09/27/2022-11:27:52] [V] [TRT] Plugin creator already registered - ::CoordConvAC version 1
[09/27/2022-11:27:52] [V] [TRT] Plugin creator already registered - ::DecodeBbox3DPlugin version 1
[09/27/2022-11:27:52] [V] [TRT] Plugin creator already registered - ::GenerateDetection_TRT version 1
[09/27/2022-11:27:52] [V] [TRT] Plugin creator already registered - ::MultilevelCropAndResize_TRT version 1
[09/27/2022-11:27:52] [V] [TRT] Plugin creator already registered - ::MultilevelProposeROI_TRT version 1
[09/27/2022-11:27:52] [V] [TRT] Plugin creator already registered - ::NMSDynamic_TRT version 1
[09/27/2022-11:27:52] [V] [TRT] Plugin creator already registered - ::PillarScatterPlugin version 1
[09/27/2022-11:27:52] [V] [TRT] Plugin creator already registered - ::VoxelGeneratorPlugin version 1
[09/27/2022-11:27:52] [V] [TRT] Plugin creator already registered - ::MultiscaleDeformableAttnPlugin_TRT version 1
[09/27/2022-11:27:52] [V] [TRT] Adding network input: im_stack with dtype: float32, dimensions: (-1, 9, 6380, 9568)
[09/27/2022-11:27:52] [V] [TRT] Registering tensor: im_stack for ONNX tensor: im_stack
[09/27/2022-11:27:52] [V] [TRT] Importing initializer: blocks.0.0.Conv.weight
[09/27/2022-11:27:52] [V] [TRT] Importing initializer: blocks.0.1.Conv.weight
[09/27/2022-11:27:52] [V] [TRT] Importing initializer: blocks.0.2.Conv.weight
[09/27/2022-11:27:52] [V] [TRT] Importing initializer: blocks.0.3.Conv.weight
[09/27/2022-11:27:52] [V] [TRT] Importing initializer: blocks.1.0.Conv.weight
[09/27/2022-11:27:52] [V] [TRT] Importing initializer: blocks.1.1.Conv.weight
[09/27/2022-11:27:52] [V] [TRT] Importing initializer: blocks.1.2.Conv.weight
[09/27/2022-11:27:52] [V] [TRT] Importing initializer: blocks.1.3.Conv.weight
[09/27/2022-11:27:52] [V] [TRT] Parsing node: Conv_0 [Conv]
[09/27/2022-11:27:52] [V] [TRT] Searching for input: im_stack
[09/27/2022-11:27:52] [V] [TRT] Searching for input: blocks.0.0.Conv.weight
[09/27/2022-11:27:52] [V] [TRT] Conv_0 [Conv] inputs: [im_stack -> (-1, 9, 6380, 9568)[FLOAT]], [blocks.0.0.Conv.weight -> (9, 1, 9, 9)[FLOAT]], 
[09/27/2022-11:27:52] [V] [TRT] Convolution input dimensions: (-1, 9, 6380, 9568)
[09/27/2022-11:27:52] [V] [TRT] Registering layer: Conv_0 for ONNX node: Conv_0
[09/27/2022-11:27:52] [V] [TRT] Using kernel: (9, 9), strides: (1, 1), prepadding: (4, 4), postpadding: (4, 4), dilations: (1, 1), numOutputs: 9
[09/27/2022-11:27:52] [V] [TRT] Convolution output dimensions: (-1, 9, 6380, 9568)
[09/27/2022-11:27:52] [V] [TRT] Registering tensor: input for ONNX tensor: input
[09/27/2022-11:27:52] [V] [TRT] Conv_0 [Conv] outputs: [input -> (-1, 9, 6380, 9568)[FLOAT]], 
[09/27/2022-11:27:52] [V] [TRT] Parsing node: Constant_1 [Constant]
[09/27/2022-11:27:52] [V] [TRT] Constant_1 [Constant] inputs: 
[09/27/2022-11:27:52] [W] [TRT] onnx2trt_utils.cpp:369: Your ONNX model has been generated with INT64 weights, while TensorRT does not natively support INT64. Attempting to cast down to INT32.
[09/27/2022-11:27:52] [V] [TRT] Constant_1 [Constant] outputs: [onnx::ReduceSum_10 -> (1)[INT32]], 
[09/27/2022-11:27:52] [V] [TRT] Parsing node: ReduceSum_2 [ReduceSum]
[09/27/2022-11:27:52] [V] [TRT] Searching for input: input
[09/27/2022-11:27:52] [V] [TRT] Searching for input: onnx::ReduceSum_10
[09/27/2022-11:27:52] [V] [TRT] ReduceSum_2 [ReduceSum] inputs: [input -> (-1, 9, 6380, 9568)[FLOAT]], [onnx::ReduceSum_10 -> (1)[INT32]], 
[09/27/2022-11:27:52] [V] [TRT] Registering layer: ReduceSum_2 for ONNX node: ReduceSum_2
[09/27/2022-11:27:52] [V] [TRT] Registering tensor: onnx::Shape_11 for ONNX tensor: onnx::Shape_11
[09/27/2022-11:27:52] [V] [TRT] ReduceSum_2 [ReduceSum] outputs: [onnx::Shape_11 -> (-1, 6380, 9568)[FLOAT]], 
[09/27/2022-11:27:52] [V] [TRT] Parsing node: Shape_3 [Shape]
[09/27/2022-11:27:52] [V] [TRT] Searching for input: onnx::Shape_11
[09/27/2022-11:27:52] [V] [TRT] Shape_3 [Shape] inputs: [onnx::Shape_11 -> (-1, 6380, 9568)[FLOAT]], 
[09/27/2022-11:27:52] [V] [TRT] Registering layer: Shape_3 for ONNX node: Shape_3
[09/27/2022-11:27:52] [V] [TRT] Registering tensor: onnx::ConstantOfShape_12 for ONNX tensor: onnx::ConstantOfShape_12
[09/27/2022-11:27:52] [V] [TRT] Shape_3 [Shape] outputs: [onnx::ConstantOfShape_12 -> (3)[INT32]], 
[09/27/2022-11:27:52] [V] [TRT] Parsing node: ConstantOfShape_4 [ConstantOfShape]
[09/27/2022-11:27:52] [V] [TRT] Searching for input: onnx::ConstantOfShape_12
[09/27/2022-11:27:52] [V] [TRT] ConstantOfShape_4 [ConstantOfShape] inputs: [onnx::ConstantOfShape_12 -> (3)[INT32]], 
[09/27/2022-11:27:52] [V] [TRT] Registering layer: ConstantOfShape_4 for ONNX node: ConstantOfShape_4
[09/27/2022-11:27:52] [V] [TRT] Registering tensor: onnx::Cast_13 for ONNX tensor: onnx::Cast_13
[09/27/2022-11:27:52] [V] [TRT] ConstantOfShape_4 [ConstantOfShape] outputs: [onnx::Cast_13 -> (-1, 6380, 9568)[FLOAT]], 
[09/27/2022-11:27:52] [V] [TRT] Parsing node: Cast_5 [Cast]
[09/27/2022-11:27:52] [V] [TRT] Searching for input: onnx::Cast_13
[09/27/2022-11:27:52] [V] [TRT] Cast_5 [Cast] inputs: [onnx::Cast_13 -> (-1, 6380, 9568)[FLOAT]], 
[09/27/2022-11:27:52] [V] [TRT] Casting to type: int32
[09/27/2022-11:27:52] [V] [TRT] Registering layer: Cast_5 for ONNX node: Cast_5
[09/27/2022-11:27:52] [V] [TRT] Registering tensor: onnx::Where_14 for ONNX tensor: onnx::Where_14
[09/27/2022-11:27:52] [V] [TRT] Cast_5 [Cast] outputs: [onnx::Where_14 -> (-1, 6380, 9568)[INT32]], 
[09/27/2022-11:27:52] [V] [TRT] Parsing node: Conv_6 [Conv]
[09/27/2022-11:27:52] [V] [TRT] Searching for input: input
[09/27/2022-11:27:52] [V] [TRT] Searching for input: blocks.0.1.Conv.weight
[09/27/2022-11:27:52] [V] [TRT] Conv_6 [Conv] inputs: [input -> (-1, 9, 6380, 9568)[FLOAT]], [blocks.0.1.Conv.weight -> (9, 1, 9, 9)[FLOAT]], 
[09/27/2022-11:27:52] [V] [TRT] Convolution input dimensions: (-1, 9, 6380, 9568)
[09/27/2022-11:27:52] [V] [TRT] Registering layer: Conv_6 for ONNX node: Conv_6
[09/27/2022-11:27:52] [V] [TRT] Using kernel: (9, 9), strides: (1, 1), prepadding: (4, 4), postpadding: (4, 4), dilations: (1, 1), numOutputs: 9
[09/27/2022-11:27:52] [V] [TRT] Convolution output dimensions: (-1, 9, 6380, 9568)
[09/27/2022-11:27:52] [V] [TRT] Registering tensor: input.4 for ONNX tensor: input.4
[09/27/2022-11:27:52] [V] [TRT] Conv_6 [Conv] outputs: [input.4 -> (-1, 9, 6380, 9568)[FLOAT]], 
[09/27/2022-11:27:52] [V] [TRT] Parsing node: Constant_7 [Constant]
[09/27/2022-11:27:52] [V] [TRT] Constant_7 [Constant] inputs: 
[09/27/2022-11:27:52] [V] [TRT] Constant_7 [Constant] outputs: [onnx::ReduceSum_16 -> (1)[INT32]], 
[09/27/2022-11:27:52] [V] [TRT] Parsing node: ReduceSum_8 [ReduceSum]
[09/27/2022-11:27:52] [V] [TRT] Searching for input: input.4
[09/27/2022-11:27:52] [V] [TRT] Searching for input: onnx::ReduceSum_16
[09/27/2022-11:27:52] [V] [TRT] ReduceSum_8 [ReduceSum] inputs: [input.4 -> (-1, 9, 6380, 9568)[FLOAT]], [onnx::ReduceSum_16 -> (1)[INT32]], 
[09/27/2022-11:27:52] [V] [TRT] Registering layer: ReduceSum_8 for ONNX node: ReduceSum_8
[09/27/2022-11:27:52] [V] [TRT] Registering tensor: onnx::Max_17 for ONNX tensor: onnx::Max_17
[09/27/2022-11:27:52] [V] [TRT] ReduceSum_8 [ReduceSum] outputs: [onnx::Max_17 -> (-1, 6380, 9568)[FLOAT]], 
[09/27/2022-11:27:52] [V] [TRT] Parsing node: Max_9 [Max]
[09/27/2022-11:27:52] [V] [TRT] Searching for input: onnx::Max_17
[09/27/2022-11:27:52] [V] [TRT] Searching for input: onnx::Shape_11
[09/27/2022-11:27:52] [V] [TRT] Max_9 [Max] inputs: [onnx::Max_17 -> (-1, 6380, 9568)[FLOAT]], [onnx::Shape_11 -> (-1, 6380, 9568)[FLOAT]], 
[09/27/2022-11:27:52] [V] [TRT] Registering layer: Max_9 for ONNX node: Max_9
[09/27/2022-11:27:52] [V] [TRT] Registering tensor: onnx::Equal_18 for ONNX tensor: onnx::Equal_18
[09/27/2022-11:27:52] [V] [TRT] Max_9 [Max] outputs: [onnx::Equal_18 -> (-1, 6380, 9568)[FLOAT]], 
[09/27/2022-11:27:52] [V] [TRT] Parsing node: Equal_10 [Equal]
[09/27/2022-11:27:52] [V] [TRT] Searching for input: onnx::Equal_18
[09/27/2022-11:27:52] [V] [TRT] Searching for input: onnx::Shape_11
[09/27/2022-11:27:52] [V] [TRT] Equal_10 [Equal] inputs: [onnx::Equal_18 -> (-1, 6380, 9568)[FLOAT]], [onnx::Shape_11 -> (-1, 6380, 9568)[FLOAT]], 
[09/27/2022-11:27:52] [V] [TRT] Registering layer: Equal_10 for ONNX node: Equal_10
[09/27/2022-11:27:52] [V] [TRT] Registering tensor: onnx::Not_19 for ONNX tensor: onnx::Not_19
[09/27/2022-11:27:52] [V] [TRT] Equal_10 [Equal] outputs: [onnx::Not_19 -> (-1, 6380, 9568)[BOOL]], 
[09/27/2022-11:27:52] [V] [TRT] Parsing node: Not_11 [Not]
[09/27/2022-11:27:52] [V] [TRT] Searching for input: onnx::Not_19
[09/27/2022-11:27:52] [V] [TRT] Not_11 [Not] inputs: [onnx::Not_19 -> (-1, 6380, 9568)[BOOL]], 
[09/27/2022-11:27:52] [V] [TRT] Registering layer: Not_11 for ONNX node: Not_11
[09/27/2022-11:27:52] [V] [TRT] Registering tensor: onnx::Cast_20 for ONNX tensor: onnx::Cast_20
[09/27/2022-11:27:52] [V] [TRT] Not_11 [Not] outputs: [onnx::Cast_20 -> (-1, 6380, 9568)[BOOL]], 
[09/27/2022-11:27:52] [V] [TRT] Parsing node: Cast_12 [Cast]
[09/27/2022-11:27:52] [V] [TRT] Searching for input: onnx::Cast_20
[09/27/2022-11:27:52] [V] [TRT] Cast_12 [Cast] inputs: [onnx::Cast_20 -> (-1, 6380, 9568)[BOOL]], 
[09/27/2022-11:27:52] [V] [TRT] Casting to type: bool
[09/27/2022-11:27:52] [V] [TRT] Registering layer: Cast_12 for ONNX node: Cast_12
[09/27/2022-11:27:52] [V] [TRT] Registering tensor: onnx::Where_21 for ONNX tensor: onnx::Where_21
[09/27/2022-11:27:52] [V] [TRT] Cast_12 [Cast] outputs: [onnx::Where_21 -> (-1, 6380, 9568)[BOOL]], 
[09/27/2022-11:27:52] [V] [TRT] Parsing node: Constant_13 [Constant]
[09/27/2022-11:27:52] [V] [TRT] Constant_13 [Constant] inputs: 
[09/27/2022-11:27:52] [V] [TRT] Constant_13 [Constant] outputs: [onnx::Where_22 -> ()[INT32]], 
[09/27/2022-11:27:52] [V] [TRT] Parsing node: Where_14 [Where]
[09/27/2022-11:27:52] [V] [TRT] Searching for input: onnx::Where_21
[09/27/2022-11:27:52] [V] [TRT] Searching for input: onnx::Where_22
[09/27/2022-11:27:52] [V] [TRT] Searching for input: onnx::Where_14
[09/27/2022-11:27:52] [V] [TRT] Where_14 [Where] inputs: [onnx::Where_21 -> (-1, 6380, 9568)[BOOL]], [onnx::Where_22 -> ()[INT32]], [onnx::Where_14 -> (-1, 6380, 9568)[INT32]], 
[09/27/2022-11:27:52] [V] [TRT] Registering layer: onnx::Where_22 for ONNX node: onnx::Where_22
[09/27/2022-11:27:52] [V] [TRT] Registering layer: Where_14 for ONNX node: Where_14
[09/27/2022-11:27:52] [V] [TRT] Registering tensor: onnx::Where_23 for ONNX tensor: onnx::Where_23
[09/27/2022-11:27:52] [V] [TRT] Where_14 [Where] outputs: [onnx::Where_23 -> (-1, 6380, 9568)[INT32]], 
[09/27/2022-11:27:52] [V] [TRT] Parsing node: Conv_15 [Conv]
[09/27/2022-11:27:52] [V] [TRT] Searching for input: input.4
[09/27/2022-11:27:52] [V] [TRT] Searching for input: blocks.0.2.Conv.weight
[09/27/2022-11:27:52] [V] [TRT] Conv_15 [Conv] inputs: [input.4 -> (-1, 9, 6380, 9568)[FLOAT]], [blocks.0.2.Conv.weight -> (9, 1, 9, 9)[FLOAT]], 
[09/27/2022-11:27:52] [V] [TRT] Convolution input dimensions: (-1, 9, 6380, 9568)
[09/27/2022-11:27:52] [V] [TRT] Registering layer: Conv_15 for ONNX node: Conv_15
[09/27/2022-11:27:52] [V] [TRT] Using kernel: (9, 9), strides: (1, 1), prepadding: (4, 4), postpadding: (4, 4), dilations: (1, 1), numOutputs: 9
[09/27/2022-11:27:52] [V] [TRT] Convolution output dimensions: (-1, 9, 6380, 9568)
[09/27/2022-11:27:52] [V] [TRT] Registering tensor: input.8 for ONNX tensor: input.8
[09/27/2022-11:27:52] [V] [TRT] Conv_15 [Conv] outputs: [input.8 -> (-1, 9, 6380, 9568)[FLOAT]], 
[09/27/2022-11:27:52] [V] [TRT] Parsing node: Constant_16 [Constant]
[09/27/2022-11:27:52] [V] [TRT] Constant_16 [Constant] inputs: 
[09/27/2022-11:27:52] [V] [TRT] Constant_16 [Constant] outputs: [onnx::ReduceSum_25 -> (1)[INT32]], 
[09/27/2022-11:27:52] [V] [TRT] Parsing node: ReduceSum_17 [ReduceSum]
[09/27/2022-11:27:52] [V] [TRT] Searching for input: input.8
[09/27/2022-11:27:52] [V] [TRT] Searching for input: onnx::ReduceSum_25
[09/27/2022-11:27:52] [V] [TRT] ReduceSum_17 [ReduceSum] inputs: [input.8 -> (-1, 9, 6380, 9568)[FLOAT]], [onnx::ReduceSum_25 -> (1)[INT32]], 
[09/27/2022-11:27:52] [V] [TRT] Registering layer: ReduceSum_17 for ONNX node: ReduceSum_17
[09/27/2022-11:27:52] [V] [TRT] Registering tensor: onnx::Max_26 for ONNX tensor: onnx::Max_26
[09/27/2022-11:27:52] [V] [TRT] ReduceSum_17 [ReduceSum] outputs: [onnx::Max_26 -> (-1, 6380, 9568)[FLOAT]], 
[09/27/2022-11:27:52] [V] [TRT] Parsing node: Max_18 [Max]
[09/27/2022-11:27:52] [V] [TRT] Searching for input: onnx::Max_26
[09/27/2022-11:27:52] [V] [TRT] Searching for input: onnx::Equal_18
[09/27/2022-11:27:52] [V] [TRT] Max_18 [Max] inputs: [onnx::Max_26 -> (-1, 6380, 9568)[FLOAT]], [onnx::Equal_18 -> (-1, 6380, 9568)[FLOAT]], 
[09/27/2022-11:27:52] [V] [TRT] Registering layer: Max_18 for ONNX node: Max_18
[09/27/2022-11:27:52] [V] [TRT] Registering tensor: onnx::Equal_27 for ONNX tensor: onnx::Equal_27
[09/27/2022-11:27:52] [V] [TRT] Max_18 [Max] outputs: [onnx::Equal_27 -> (-1, 6380, 9568)[FLOAT]], 
[09/27/2022-11:27:52] [V] [TRT] Parsing node: Equal_19 [Equal]
[09/27/2022-11:27:52] [V] [TRT] Searching for input: onnx::Equal_27
[09/27/2022-11:27:52] [V] [TRT] Searching for input: onnx::Equal_18
[09/27/2022-11:27:52] [V] [TRT] Equal_19 [Equal] inputs: [onnx::Equal_27 -> (-1, 6380, 9568)[FLOAT]], [onnx::Equal_18 -> (-1, 6380, 9568)[FLOAT]], 
[09/27/2022-11:27:52] [V] [TRT] Registering layer: Equal_19 for ONNX node: Equal_19
[09/27/2022-11:27:52] [V] [TRT] Registering tensor: onnx::Not_28 for ONNX tensor: onnx::Not_28
[09/27/2022-11:27:52] [V] [TRT] Equal_19 [Equal] outputs: [onnx::Not_28 -> (-1, 6380, 9568)[BOOL]], 
[09/27/2022-11:27:52] [V] [TRT] Parsing node: Not_20 [Not]
[09/27/2022-11:27:52] [V] [TRT] Searching for input: onnx::Not_28
[09/27/2022-11:27:52] [V] [TRT] Not_20 [Not] inputs: [onnx::Not_28 -> (-1, 6380, 9568)[BOOL]], 
[09/27/2022-11:27:52] [V] [TRT] Registering layer: Not_20 for ONNX node: Not_20
[09/27/2022-11:27:52] [V] [TRT] Registering tensor: onnx::Cast_29 for ONNX tensor: onnx::Cast_29
[09/27/2022-11:27:52] [V] [TRT] Not_20 [Not] outputs: [onnx::Cast_29 -> (-1, 6380, 9568)[BOOL]], 
[09/27/2022-11:27:52] [V] [TRT] Parsing node: Cast_21 [Cast]
[09/27/2022-11:27:52] [V] [TRT] Searching for input: onnx::Cast_29
[09/27/2022-11:27:52] [V] [TRT] Cast_21 [Cast] inputs: [onnx::Cast_29 -> (-1, 6380, 9568)[BOOL]], 
[09/27/2022-11:27:52] [V] [TRT] Casting to type: bool
[09/27/2022-11:27:52] [V] [TRT] Registering layer: Cast_21 for ONNX node: Cast_21
[09/27/2022-11:27:52] [V] [TRT] Registering tensor: onnx::Where_30 for ONNX tensor: onnx::Where_30
[09/27/2022-11:27:52] [V] [TRT] Cast_21 [Cast] outputs: [onnx::Where_30 -> (-1, 6380, 9568)[BOOL]], 
[09/27/2022-11:27:52] [V] [TRT] Parsing node: Constant_22 [Constant]
[09/27/2022-11:27:52] [V] [TRT] Constant_22 [Constant] inputs: 
[09/27/2022-11:27:52] [V] [TRT] Constant_22 [Constant] outputs: [onnx::Where_31 -> ()[INT32]], 
[09/27/2022-11:27:52] [V] [TRT] Parsing node: Where_23 [Where]
[09/27/2022-11:27:52] [V] [TRT] Searching for input: onnx::Where_30
[09/27/2022-11:27:52] [V] [TRT] Searching for input: onnx::Where_31
[09/27/2022-11:27:52] [V] [TRT] Searching for input: onnx::Where_23
[09/27/2022-11:27:52] [V] [TRT] Where_23 [Where] inputs: [onnx::Where_30 -> (-1, 6380, 9568)[BOOL]], [onnx::Where_31 -> ()[INT32]], [onnx::Where_23 -> (-1, 6380, 9568)[INT32]], 
[09/27/2022-11:27:52] [V] [TRT] Registering layer: onnx::Where_31 for ONNX node: onnx::Where_31
[09/27/2022-11:27:52] [V] [TRT] Registering layer: Where_23 for ONNX node: Where_23
[09/27/2022-11:27:52] [V] [TRT] Registering tensor: onnx::Where_32 for ONNX tensor: onnx::Where_32
[09/27/2022-11:27:52] [V] [TRT] Where_23 [Where] outputs: [onnx::Where_32 -> (-1, 6380, 9568)[INT32]], 
[09/27/2022-11:27:52] [V] [TRT] Parsing node: Conv_24 [Conv]
[09/27/2022-11:27:52] [V] [TRT] Searching for input: input.8
[09/27/2022-11:27:52] [V] [TRT] Searching for input: blocks.0.3.Conv.weight
[09/27/2022-11:27:52] [V] [TRT] Conv_24 [Conv] inputs: [input.8 -> (-1, 9, 6380, 9568)[FLOAT]], [blocks.0.3.Conv.weight -> (9, 1, 9, 9)[FLOAT]], 
[09/27/2022-11:27:52] [V] [TRT] Convolution input dimensions: (-1, 9, 6380, 9568)
[09/27/2022-11:27:52] [V] [TRT] Registering layer: Conv_24 for ONNX node: Conv_24
[09/27/2022-11:27:52] [V] [TRT] Using kernel: (9, 9), strides: (1, 1), prepadding: (4, 4), postpadding: (4, 4), dilations: (1, 1), numOutputs: 9
[09/27/2022-11:27:52] [V] [TRT] Convolution output dimensions: (-1, 9, 6380, 9568)
[09/27/2022-11:27:52] [V] [TRT] Registering tensor: onnx::ReduceSum_33 for ONNX tensor: onnx::ReduceSum_33
[09/27/2022-11:27:52] [V] [TRT] Conv_24 [Conv] outputs: [onnx::ReduceSum_33 -> (-1, 9, 6380, 9568)[FLOAT]], 
[09/27/2022-11:27:52] [V] [TRT] Parsing node: Constant_25 [Constant]
[09/27/2022-11:27:52] [V] [TRT] Constant_25 [Constant] inputs: 
[09/27/2022-11:27:52] [V] [TRT] Constant_25 [Constant] outputs: [onnx::ReduceSum_34 -> (1)[INT32]], 
[09/27/2022-11:27:52] [V] [TRT] Parsing node: ReduceSum_26 [ReduceSum]
[09/27/2022-11:27:52] [V] [TRT] Searching for input: onnx::ReduceSum_33
[09/27/2022-11:27:52] [V] [TRT] Searching for input: onnx::ReduceSum_34
[09/27/2022-11:27:52] [V] [TRT] ReduceSum_26 [ReduceSum] inputs: [onnx::ReduceSum_33 -> (-1, 9, 6380, 9568)[FLOAT]], [onnx::ReduceSum_34 -> (1)[INT32]], 
[09/27/2022-11:27:52] [V] [TRT] Registering layer: ReduceSum_26 for ONNX node: ReduceSum_26
[09/27/2022-11:27:52] [V] [TRT] Registering tensor: onnx::Max_35 for ONNX tensor: onnx::Max_35
[09/27/2022-11:27:52] [V] [TRT] ReduceSum_26 [ReduceSum] outputs: [onnx::Max_35 -> (-1, 6380, 9568)[FLOAT]], 
[09/27/2022-11:27:52] [V] [TRT] Parsing node: Max_27 [Max]
[09/27/2022-11:27:52] [V] [TRT] Searching for input: onnx::Max_35
[09/27/2022-11:27:52] [V] [TRT] Searching for input: onnx::Equal_27
[09/27/2022-11:27:52] [V] [TRT] Max_27 [Max] inputs: [onnx::Max_35 -> (-1, 6380, 9568)[FLOAT]], [onnx::Equal_27 -> (-1, 6380, 9568)[FLOAT]], 
[09/27/2022-11:27:52] [V] [TRT] Registering layer: Max_27 for ONNX node: Max_27
[09/27/2022-11:27:52] [V] [TRT] Registering tensor: onnx::Equal_36 for ONNX tensor: onnx::Equal_36
[09/27/2022-11:27:52] [V] [TRT] Max_27 [Max] outputs: [onnx::Equal_36 -> (-1, 6380, 9568)[FLOAT]], 
[09/27/2022-11:27:52] [V] [TRT] Parsing node: Equal_28 [Equal]
[09/27/2022-11:27:52] [V] [TRT] Searching for input: onnx::Equal_36
[09/27/2022-11:27:52] [V] [TRT] Searching for input: onnx::Equal_27
[09/27/2022-11:27:52] [V] [TRT] Equal_28 [Equal] inputs: [onnx::Equal_36 -> (-1, 6380, 9568)[FLOAT]], [onnx::Equal_27 -> (-1, 6380, 9568)[FLOAT]], 
[09/27/2022-11:27:52] [V] [TRT] Registering layer: Equal_28 for ONNX node: Equal_28
[09/27/2022-11:27:52] [V] [TRT] Registering tensor: onnx::Not_37 for ONNX tensor: onnx::Not_37
[09/27/2022-11:27:52] [V] [TRT] Equal_28 [Equal] outputs: [onnx::Not_37 -> (-1, 6380, 9568)[BOOL]], 
[09/27/2022-11:27:52] [V] [TRT] Parsing node: Not_29 [Not]
[09/27/2022-11:27:52] [V] [TRT] Searching for input: onnx::Not_37
[09/27/2022-11:27:52] [V] [TRT] Not_29 [Not] inputs: [onnx::Not_37 -> (-1, 6380, 9568)[BOOL]], 
[09/27/2022-11:27:52] [V] [TRT] Registering layer: Not_29 for ONNX node: Not_29
[09/27/2022-11:27:52] [V] [TRT] Registering tensor: onnx::Cast_38 for ONNX tensor: onnx::Cast_38
[09/27/2022-11:27:52] [V] [TRT] Not_29 [Not] outputs: [onnx::Cast_38 -> (-1, 6380, 9568)[BOOL]], 
[09/27/2022-11:27:52] [V] [TRT] Parsing node: Cast_30 [Cast]
[09/27/2022-11:27:52] [V] [TRT] Searching for input: onnx::Cast_38
[09/27/2022-11:27:52] [V] [TRT] Cast_30 [Cast] inputs: [onnx::Cast_38 -> (-1, 6380, 9568)[BOOL]], 
[09/27/2022-11:27:52] [V] [TRT] Casting to type: bool
[09/27/2022-11:27:52] [V] [TRT] Registering layer: Cast_30 for ONNX node: Cast_30
[09/27/2022-11:27:52] [V] [TRT] Registering tensor: onnx::Where_39 for ONNX tensor: onnx::Where_39
[09/27/2022-11:27:52] [V] [TRT] Cast_30 [Cast] outputs: [onnx::Where_39 -> (-1, 6380, 9568)[BOOL]], 
[09/27/2022-11:27:52] [V] [TRT] Parsing node: Constant_31 [Constant]
[09/27/2022-11:27:52] [V] [TRT] Constant_31 [Constant] inputs: 
[09/27/2022-11:27:52] [V] [TRT] Constant_31 [Constant] outputs: [onnx::Where_40 -> ()[INT32]], 
[09/27/2022-11:27:52] [V] [TRT] Parsing node: Where_32 [Where]
[09/27/2022-11:27:52] [V] [TRT] Searching for input: onnx::Where_39
[09/27/2022-11:27:52] [V] [TRT] Searching for input: onnx::Where_40
[09/27/2022-11:27:52] [V] [TRT] Searching for input: onnx::Where_32
[09/27/2022-11:27:52] [V] [TRT] Where_32 [Where] inputs: [onnx::Where_39 -> (-1, 6380, 9568)[BOOL]], [onnx::Where_40 -> ()[INT32]], [onnx::Where_32 -> (-1, 6380, 9568)[INT32]], 
[09/27/2022-11:27:52] [V] [TRT] Registering layer: onnx::Where_40 for ONNX node: onnx::Where_40
[09/27/2022-11:27:52] [V] [TRT] Registering layer: Where_32 for ONNX node: Where_32
[09/27/2022-11:27:52] [V] [TRT] Registering tensor: onnx::Where_41 for ONNX tensor: onnx::Where_41
[09/27/2022-11:27:52] [V] [TRT] Where_32 [Where] outputs: [onnx::Where_41 -> (-1, 6380, 9568)[INT32]], 
[09/27/2022-11:27:52] [V] [TRT] Parsing node: Conv_33 [Conv]
[09/27/2022-11:27:52] [V] [TRT] Searching for input: im_stack
[09/27/2022-11:27:52] [V] [TRT] Searching for input: blocks.1.0.Conv.weight
[09/27/2022-11:27:52] [V] [TRT] Conv_33 [Conv] inputs: [im_stack -> (-1, 9, 6380, 9568)[FLOAT]], [blocks.1.0.Conv.weight -> (9, 1, 9, 9)[FLOAT]], 
[09/27/2022-11:27:52] [V] [TRT] Convolution input dimensions: (-1, 9, 6380, 9568)
[09/27/2022-11:27:52] [V] [TRT] Registering layer: Conv_33 for ONNX node: Conv_33
[09/27/2022-11:27:52] [V] [TRT] Using kernel: (9, 9), strides: (1, 1), prepadding: (4, 4), postpadding: (4, 4), dilations: (1, 1), numOutputs: 9
[09/27/2022-11:27:52] [V] [TRT] Convolution output dimensions: (-1, 9, 6380, 9568)
[09/27/2022-11:27:52] [V] [TRT] Registering tensor: input.12 for ONNX tensor: input.12
[09/27/2022-11:27:52] [V] [TRT] Conv_33 [Conv] outputs: [input.12 -> (-1, 9, 6380, 9568)[FLOAT]], 
[09/27/2022-11:27:52] [V] [TRT] Parsing node: Constant_34 [Constant]
[09/27/2022-11:27:52] [V] [TRT] Constant_34 [Constant] inputs: 
[09/27/2022-11:27:52] [V] [TRT] Constant_34 [Constant] outputs: [onnx::ReduceSum_43 -> (1)[INT32]], 
[09/27/2022-11:27:52] [V] [TRT] Parsing node: ReduceSum_35 [ReduceSum]
[09/27/2022-11:27:52] [V] [TRT] Searching for input: input.12
[09/27/2022-11:27:52] [V] [TRT] Searching for input: onnx::ReduceSum_43
[09/27/2022-11:27:52] [V] [TRT] ReduceSum_35 [ReduceSum] inputs: [input.12 -> (-1, 9, 6380, 9568)[FLOAT]], [onnx::ReduceSum_43 -> (1)[INT32]], 
[09/27/2022-11:27:52] [V] [TRT] Registering layer: ReduceSum_35 for ONNX node: ReduceSum_35
[09/27/2022-11:27:52] [V] [TRT] Registering tensor: onnx::Max_44 for ONNX tensor: onnx::Max_44
[09/27/2022-11:27:52] [V] [TRT] ReduceSum_35 [ReduceSum] outputs: [onnx::Max_44 -> (-1, 6380, 9568)[FLOAT]], 
[09/27/2022-11:27:52] [V] [TRT] Parsing node: Max_36 [Max]
[09/27/2022-11:27:52] [V] [TRT] Searching for input: onnx::Max_44
[09/27/2022-11:27:52] [V] [TRT] Searching for input: onnx::Equal_36
[09/27/2022-11:27:52] [V] [TRT] Max_36 [Max] inputs: [onnx::Max_44 -> (-1, 6380, 9568)[FLOAT]], [onnx::Equal_36 -> (-1, 6380, 9568)[FLOAT]], 
[09/27/2022-11:27:52] [V] [TRT] Registering layer: Max_36 for ONNX node: Max_36
[09/27/2022-11:27:52] [V] [TRT] Registering tensor: onnx::Equal_45 for ONNX tensor: onnx::Equal_45
[09/27/2022-11:27:52] [V] [TRT] Max_36 [Max] outputs: [onnx::Equal_45 -> (-1, 6380, 9568)[FLOAT]], 
[09/27/2022-11:27:52] [V] [TRT] Parsing node: Equal_37 [Equal]
[09/27/2022-11:27:52] [V] [TRT] Searching for input: onnx::Equal_45
[09/27/2022-11:27:52] [V] [TRT] Searching for input: onnx::Equal_36
[09/27/2022-11:27:52] [V] [TRT] Equal_37 [Equal] inputs: [onnx::Equal_45 -> (-1, 6380, 9568)[FLOAT]], [onnx::Equal_36 -> (-1, 6380, 9568)[FLOAT]], 
[09/27/2022-11:27:52] [V] [TRT] Registering layer: Equal_37 for ONNX node: Equal_37
[09/27/2022-11:27:52] [V] [TRT] Registering tensor: onnx::Not_46 for ONNX tensor: onnx::Not_46
[09/27/2022-11:27:52] [V] [TRT] Equal_37 [Equal] outputs: [onnx::Not_46 -> (-1, 6380, 9568)[BOOL]], 
[09/27/2022-11:27:52] [V] [TRT] Parsing node: Not_38 [Not]
[09/27/2022-11:27:52] [V] [TRT] Searching for input: onnx::Not_46
[09/27/2022-11:27:52] [V] [TRT] Not_38 [Not] inputs: [onnx::Not_46 -> (-1, 6380, 9568)[BOOL]], 
[09/27/2022-11:27:52] [V] [TRT] Registering layer: Not_38 for ONNX node: Not_38
[09/27/2022-11:27:52] [V] [TRT] Registering tensor: onnx::Cast_47 for ONNX tensor: onnx::Cast_47
[09/27/2022-11:27:52] [V] [TRT] Not_38 [Not] outputs: [onnx::Cast_47 -> (-1, 6380, 9568)[BOOL]], 
[09/27/2022-11:27:52] [V] [TRT] Parsing node: Cast_39 [Cast]
[09/27/2022-11:27:52] [V] [TRT] Searching for input: onnx::Cast_47
[09/27/2022-11:27:52] [V] [TRT] Cast_39 [Cast] inputs: [onnx::Cast_47 -> (-1, 6380, 9568)[BOOL]], 
[09/27/2022-11:27:52] [V] [TRT] Casting to type: bool
[09/27/2022-11:27:52] [V] [TRT] Registering layer: Cast_39 for ONNX node: Cast_39
[09/27/2022-11:27:52] [V] [TRT] Registering tensor: onnx::Where_48 for ONNX tensor: onnx::Where_48
[09/27/2022-11:27:52] [V] [TRT] Cast_39 [Cast] outputs: [onnx::Where_48 -> (-1, 6380, 9568)[BOOL]], 
[09/27/2022-11:27:52] [V] [TRT] Parsing node: Constant_40 [Constant]
[09/27/2022-11:27:52] [V] [TRT] Constant_40 [Constant] inputs: 
[09/27/2022-11:27:52] [V] [TRT] Constant_40 [Constant] outputs: [onnx::Where_49 -> ()[INT32]], 
[09/27/2022-11:27:52] [V] [TRT] Parsing node: Where_41 [Where]
[09/27/2022-11:27:52] [V] [TRT] Searching for input: onnx::Where_48
[09/27/2022-11:27:52] [V] [TRT] Searching for input: onnx::Where_49
[09/27/2022-11:27:52] [V] [TRT] Searching for input: onnx::Where_41
[09/27/2022-11:27:52] [V] [TRT] Where_41 [Where] inputs: [onnx::Where_48 -> (-1, 6380, 9568)[BOOL]], [onnx::Where_49 -> ()[INT32]], [onnx::Where_41 -> (-1, 6380, 9568)[INT32]], 
[09/27/2022-11:27:52] [V] [TRT] Registering layer: onnx::Where_49 for ONNX node: onnx::Where_49
[09/27/2022-11:27:52] [V] [TRT] Registering layer: Where_41 for ONNX node: Where_41
[09/27/2022-11:27:52] [V] [TRT] Registering tensor: onnx::Where_50 for ONNX tensor: onnx::Where_50
[09/27/2022-11:27:52] [V] [TRT] Where_41 [Where] outputs: [onnx::Where_50 -> (-1, 6380, 9568)[INT32]], 
[09/27/2022-11:27:52] [V] [TRT] Parsing node: Conv_42 [Conv]
[09/27/2022-11:27:52] [V] [TRT] Searching for input: input.12
[09/27/2022-11:27:52] [V] [TRT] Searching for input: blocks.1.1.Conv.weight
[09/27/2022-11:27:52] [V] [TRT] Conv_42 [Conv] inputs: [input.12 -> (-1, 9, 6380, 9568)[FLOAT]], [blocks.1.1.Conv.weight -> (9, 1, 9, 9)[FLOAT]], 
[09/27/2022-11:27:52] [V] [TRT] Convolution input dimensions: (-1, 9, 6380, 9568)
[09/27/2022-11:27:52] [V] [TRT] Registering layer: Conv_42 for ONNX node: Conv_42
[09/27/2022-11:27:52] [V] [TRT] Using kernel: (9, 9), strides: (1, 1), prepadding: (4, 4), postpadding: (4, 4), dilations: (1, 1), numOutputs: 9
[09/27/2022-11:27:52] [V] [TRT] Convolution output dimensions: (-1, 9, 6380, 9568)
[09/27/2022-11:27:52] [V] [TRT] Registering tensor: input.16 for ONNX tensor: input.16
[09/27/2022-11:27:52] [V] [TRT] Conv_42 [Conv] outputs: [input.16 -> (-1, 9, 6380, 9568)[FLOAT]], 
[09/27/2022-11:27:52] [V] [TRT] Parsing node: Constant_43 [Constant]
[09/27/2022-11:27:52] [V] [TRT] Constant_43 [Constant] inputs: 
[09/27/2022-11:27:52] [V] [TRT] Constant_43 [Constant] outputs: [onnx::ReduceSum_52 -> (1)[INT32]], 
[09/27/2022-11:27:52] [V] [TRT] Parsing node: ReduceSum_44 [ReduceSum]
[09/27/2022-11:27:52] [V] [TRT] Searching for input: input.16
[09/27/2022-11:27:52] [V] [TRT] Searching for input: onnx::ReduceSum_52
[09/27/2022-11:27:52] [V] [TRT] ReduceSum_44 [ReduceSum] inputs: [input.16 -> (-1, 9, 6380, 9568)[FLOAT]], [onnx::ReduceSum_52 -> (1)[INT32]], 
[09/27/2022-11:27:52] [V] [TRT] Registering layer: ReduceSum_44 for ONNX node: ReduceSum_44
[09/27/2022-11:27:52] [V] [TRT] Registering tensor: onnx::Max_53 for ONNX tensor: onnx::Max_53
[09/27/2022-11:27:52] [V] [TRT] ReduceSum_44 [ReduceSum] outputs: [onnx::Max_53 -> (-1, 6380, 9568)[FLOAT]], 
[09/27/2022-11:27:52] [V] [TRT] Parsing node: Max_45 [Max]
[09/27/2022-11:27:52] [V] [TRT] Searching for input: onnx::Max_53
[09/27/2022-11:27:52] [V] [TRT] Searching for input: onnx::Equal_45
[09/27/2022-11:27:52] [V] [TRT] Max_45 [Max] inputs: [onnx::Max_53 -> (-1, 6380, 9568)[FLOAT]], [onnx::Equal_45 -> (-1, 6380, 9568)[FLOAT]], 
[09/27/2022-11:27:52] [V] [TRT] Registering layer: Max_45 for ONNX node: Max_45
[09/27/2022-11:27:52] [V] [TRT] Registering tensor: onnx::Equal_54 for ONNX tensor: onnx::Equal_54
[09/27/2022-11:27:52] [V] [TRT] Max_45 [Max] outputs: [onnx::Equal_54 -> (-1, 6380, 9568)[FLOAT]], 
[09/27/2022-11:27:52] [V] [TRT] Parsing node: Equal_46 [Equal]
[09/27/2022-11:27:52] [V] [TRT] Searching for input: onnx::Equal_54
[09/27/2022-11:27:52] [V] [TRT] Searching for input: onnx::Equal_45
[09/27/2022-11:27:52] [V] [TRT] Equal_46 [Equal] inputs: [onnx::Equal_54 -> (-1, 6380, 9568)[FLOAT]], [onnx::Equal_45 -> (-1, 6380, 9568)[FLOAT]], 
[09/27/2022-11:27:52] [V] [TRT] Registering layer: Equal_46 for ONNX node: Equal_46
[09/27/2022-11:27:52] [V] [TRT] Registering tensor: onnx::Not_55 for ONNX tensor: onnx::Not_55
[09/27/2022-11:27:52] [V] [TRT] Equal_46 [Equal] outputs: [onnx::Not_55 -> (-1, 6380, 9568)[BOOL]], 
[09/27/2022-11:27:52] [V] [TRT] Parsing node: Not_47 [Not]
[09/27/2022-11:27:52] [V] [TRT] Searching for input: onnx::Not_55
[09/27/2022-11:27:52] [V] [TRT] Not_47 [Not] inputs: [onnx::Not_55 -> (-1, 6380, 9568)[BOOL]], 
[09/27/2022-11:27:52] [V] [TRT] Registering layer: Not_47 for ONNX node: Not_47
[09/27/2022-11:27:52] [V] [TRT] Registering tensor: onnx::Cast_56 for ONNX tensor: onnx::Cast_56
[09/27/2022-11:27:52] [V] [TRT] Not_47 [Not] outputs: [onnx::Cast_56 -> (-1, 6380, 9568)[BOOL]], 
[09/27/2022-11:27:52] [V] [TRT] Parsing node: Cast_48 [Cast]
[09/27/2022-11:27:52] [V] [TRT] Searching for input: onnx::Cast_56
[09/27/2022-11:27:52] [V] [TRT] Cast_48 [Cast] inputs: [onnx::Cast_56 -> (-1, 6380, 9568)[BOOL]], 
[09/27/2022-11:27:52] [V] [TRT] Casting to type: bool
[09/27/2022-11:27:52] [V] [TRT] Registering layer: Cast_48 for ONNX node: Cast_48
[09/27/2022-11:27:52] [V] [TRT] Registering tensor: onnx::Where_57 for ONNX tensor: onnx::Where_57
[09/27/2022-11:27:52] [V] [TRT] Cast_48 [Cast] outputs: [onnx::Where_57 -> (-1, 6380, 9568)[BOOL]], 
[09/27/2022-11:27:52] [V] [TRT] Parsing node: Constant_49 [Constant]
[09/27/2022-11:27:52] [V] [TRT] Constant_49 [Constant] inputs: 
[09/27/2022-11:27:52] [V] [TRT] Constant_49 [Constant] outputs: [onnx::Where_58 -> ()[INT32]], 
[09/27/2022-11:27:52] [V] [TRT] Parsing node: Where_50 [Where]
[09/27/2022-11:27:52] [V] [TRT] Searching for input: onnx::Where_57
[09/27/2022-11:27:52] [V] [TRT] Searching for input: onnx::Where_58
[09/27/2022-11:27:52] [V] [TRT] Searching for input: onnx::Where_50
[09/27/2022-11:27:52] [V] [TRT] Where_50 [Where] inputs: [onnx::Where_57 -> (-1, 6380, 9568)[BOOL]], [onnx::Where_58 -> ()[INT32]], [onnx::Where_50 -> (-1, 6380, 9568)[INT32]], 
[09/27/2022-11:27:52] [V] [TRT] Registering layer: onnx::Where_58 for ONNX node: onnx::Where_58
[09/27/2022-11:27:52] [V] [TRT] Registering layer: Where_50 for ONNX node: Where_50
[09/27/2022-11:27:52] [V] [TRT] Registering tensor: onnx::Where_59 for ONNX tensor: onnx::Where_59
[09/27/2022-11:27:52] [V] [TRT] Where_50 [Where] outputs: [onnx::Where_59 -> (-1, 6380, 9568)[INT32]], 
[09/27/2022-11:27:52] [V] [TRT] Parsing node: Conv_51 [Conv]
[09/27/2022-11:27:52] [V] [TRT] Searching for input: input.16
[09/27/2022-11:27:52] [V] [TRT] Searching for input: blocks.1.2.Conv.weight
[09/27/2022-11:27:52] [V] [TRT] Conv_51 [Conv] inputs: [input.16 -> (-1, 9, 6380, 9568)[FLOAT]], [blocks.1.2.Conv.weight -> (9, 1, 9, 9)[FLOAT]], 
[09/27/2022-11:27:52] [V] [TRT] Convolution input dimensions: (-1, 9, 6380, 9568)
[09/27/2022-11:27:52] [V] [TRT] Registering layer: Conv_51 for ONNX node: Conv_51
[09/27/2022-11:27:52] [V] [TRT] Using kernel: (9, 9), strides: (1, 1), prepadding: (4, 4), postpadding: (4, 4), dilations: (1, 1), numOutputs: 9
[09/27/2022-11:27:52] [V] [TRT] Convolution output dimensions: (-1, 9, 6380, 9568)
[09/27/2022-11:27:52] [V] [TRT] Registering tensor: input.20 for ONNX tensor: input.20
[09/27/2022-11:27:52] [V] [TRT] Conv_51 [Conv] outputs: [input.20 -> (-1, 9, 6380, 9568)[FLOAT]], 
[09/27/2022-11:27:52] [V] [TRT] Parsing node: Constant_52 [Constant]
[09/27/2022-11:27:52] [V] [TRT] Constant_52 [Constant] inputs: 
[09/27/2022-11:27:52] [V] [TRT] Constant_52 [Constant] outputs: [onnx::ReduceSum_61 -> (1)[INT32]], 
[09/27/2022-11:27:52] [V] [TRT] Parsing node: ReduceSum_53 [ReduceSum]
[09/27/2022-11:27:52] [V] [TRT] Searching for input: input.20
[09/27/2022-11:27:52] [V] [TRT] Searching for input: onnx::ReduceSum_61
[09/27/2022-11:27:52] [V] [TRT] ReduceSum_53 [ReduceSum] inputs: [input.20 -> (-1, 9, 6380, 9568)[FLOAT]], [onnx::ReduceSum_61 -> (1)[INT32]], 
[09/27/2022-11:27:52] [V] [TRT] Registering layer: ReduceSum_53 for ONNX node: ReduceSum_53
[09/27/2022-11:27:52] [V] [TRT] Registering tensor: onnx::Max_62 for ONNX tensor: onnx::Max_62
[09/27/2022-11:27:52] [V] [TRT] ReduceSum_53 [ReduceSum] outputs: [onnx::Max_62 -> (-1, 6380, 9568)[FLOAT]], 
[09/27/2022-11:27:52] [V] [TRT] Parsing node: Max_54 [Max]
[09/27/2022-11:27:52] [V] [TRT] Searching for input: onnx::Max_62
[09/27/2022-11:27:52] [V] [TRT] Searching for input: onnx::Equal_54
[09/27/2022-11:27:52] [V] [TRT] Max_54 [Max] inputs: [onnx::Max_62 -> (-1, 6380, 9568)[FLOAT]], [onnx::Equal_54 -> (-1, 6380, 9568)[FLOAT]], 
[09/27/2022-11:27:52] [V] [TRT] Registering layer: Max_54 for ONNX node: Max_54
[09/27/2022-11:27:52] [V] [TRT] Registering tensor: onnx::Equal_63 for ONNX tensor: onnx::Equal_63
[09/27/2022-11:27:52] [V] [TRT] Max_54 [Max] outputs: [onnx::Equal_63 -> (-1, 6380, 9568)[FLOAT]], 
[09/27/2022-11:27:52] [V] [TRT] Parsing node: Equal_55 [Equal]
[09/27/2022-11:27:52] [V] [TRT] Searching for input: onnx::Equal_63
[09/27/2022-11:27:52] [V] [TRT] Searching for input: onnx::Equal_54
[09/27/2022-11:27:52] [V] [TRT] Equal_55 [Equal] inputs: [onnx::Equal_63 -> (-1, 6380, 9568)[FLOAT]], [onnx::Equal_54 -> (-1, 6380, 9568)[FLOAT]], 
[09/27/2022-11:27:52] [V] [TRT] Registering layer: Equal_55 for ONNX node: Equal_55
[09/27/2022-11:27:52] [V] [TRT] Registering tensor: onnx::Not_64 for ONNX tensor: onnx::Not_64
[09/27/2022-11:27:52] [V] [TRT] Equal_55 [Equal] outputs: [onnx::Not_64 -> (-1, 6380, 9568)[BOOL]], 
[09/27/2022-11:27:52] [V] [TRT] Parsing node: Not_56 [Not]
[09/27/2022-11:27:52] [V] [TRT] Searching for input: onnx::Not_64
[09/27/2022-11:27:52] [V] [TRT] Not_56 [Not] inputs: [onnx::Not_64 -> (-1, 6380, 9568)[BOOL]], 
[09/27/2022-11:27:52] [V] [TRT] Registering layer: Not_56 for ONNX node: Not_56
[09/27/2022-11:27:52] [V] [TRT] Registering tensor: onnx::Cast_65 for ONNX tensor: onnx::Cast_65
[09/27/2022-11:27:52] [V] [TRT] Not_56 [Not] outputs: [onnx::Cast_65 -> (-1, 6380, 9568)[BOOL]], 
[09/27/2022-11:27:52] [V] [TRT] Parsing node: Cast_57 [Cast]
[09/27/2022-11:27:52] [V] [TRT] Searching for input: onnx::Cast_65
[09/27/2022-11:27:52] [V] [TRT] Cast_57 [Cast] inputs: [onnx::Cast_65 -> (-1, 6380, 9568)[BOOL]], 
[09/27/2022-11:27:52] [V] [TRT] Casting to type: bool
[09/27/2022-11:27:52] [V] [TRT] Registering layer: Cast_57 for ONNX node: Cast_57
[09/27/2022-11:27:52] [V] [TRT] Registering tensor: onnx::Where_66 for ONNX tensor: onnx::Where_66
[09/27/2022-11:27:52] [V] [TRT] Cast_57 [Cast] outputs: [onnx::Where_66 -> (-1, 6380, 9568)[BOOL]], 
[09/27/2022-11:27:52] [V] [TRT] Parsing node: Constant_58 [Constant]
[09/27/2022-11:27:52] [V] [TRT] Constant_58 [Constant] inputs: 
[09/27/2022-11:27:52] [V] [TRT] Constant_58 [Constant] outputs: [onnx::Where_67 -> ()[INT32]], 
[09/27/2022-11:27:52] [V] [TRT] Parsing node: Where_59 [Where]
[09/27/2022-11:27:52] [V] [TRT] Searching for input: onnx::Where_66
[09/27/2022-11:27:52] [V] [TRT] Searching for input: onnx::Where_67
[09/27/2022-11:27:52] [V] [TRT] Searching for input: onnx::Where_59
[09/27/2022-11:27:52] [V] [TRT] Where_59 [Where] inputs: [onnx::Where_66 -> (-1, 6380, 9568)[BOOL]], [onnx::Where_67 -> ()[INT32]], [onnx::Where_59 -> (-1, 6380, 9568)[INT32]], 
[09/27/2022-11:27:52] [V] [TRT] Registering layer: onnx::Where_67 for ONNX node: onnx::Where_67
[09/27/2022-11:27:52] [V] [TRT] Registering layer: Where_59 for ONNX node: Where_59
[09/27/2022-11:27:52] [V] [TRT] Registering tensor: onnx::Where_68 for ONNX tensor: onnx::Where_68
[09/27/2022-11:27:52] [V] [TRT] Where_59 [Where] outputs: [onnx::Where_68 -> (-1, 6380, 9568)[INT32]], 
[09/27/2022-11:27:52] [V] [TRT] Parsing node: Conv_60 [Conv]
[09/27/2022-11:27:52] [V] [TRT] Searching for input: input.20
[09/27/2022-11:27:52] [V] [TRT] Searching for input: blocks.1.3.Conv.weight
[09/27/2022-11:27:52] [V] [TRT] Conv_60 [Conv] inputs: [input.20 -> (-1, 9, 6380, 9568)[FLOAT]], [blocks.1.3.Conv.weight -> (9, 1, 9, 9)[FLOAT]], 
[09/27/2022-11:27:52] [V] [TRT] Convolution input dimensions: (-1, 9, 6380, 9568)
[09/27/2022-11:27:52] [V] [TRT] Registering layer: Conv_60 for ONNX node: Conv_60
[09/27/2022-11:27:52] [V] [TRT] Using kernel: (9, 9), strides: (1, 1), prepadding: (4, 4), postpadding: (4, 4), dilations: (1, 1), numOutputs: 9
[09/27/2022-11:27:52] [V] [TRT] Convolution output dimensions: (-1, 9, 6380, 9568)
[09/27/2022-11:27:52] [V] [TRT] Registering tensor: onnx::ReduceSum_69 for ONNX tensor: onnx::ReduceSum_69
[09/27/2022-11:27:52] [V] [TRT] Conv_60 [Conv] outputs: [onnx::ReduceSum_69 -> (-1, 9, 6380, 9568)[FLOAT]], 
[09/27/2022-11:27:52] [V] [TRT] Parsing node: Constant_61 [Constant]
[09/27/2022-11:27:52] [V] [TRT] Constant_61 [Constant] inputs: 
[09/27/2022-11:27:52] [V] [TRT] Constant_61 [Constant] outputs: [onnx::ReduceSum_70 -> (1)[INT32]], 
[09/27/2022-11:27:52] [V] [TRT] Parsing node: ReduceSum_62 [ReduceSum]
[09/27/2022-11:27:52] [V] [TRT] Searching for input: onnx::ReduceSum_69
[09/27/2022-11:27:52] [V] [TRT] Searching for input: onnx::ReduceSum_70
[09/27/2022-11:27:52] [V] [TRT] ReduceSum_62 [ReduceSum] inputs: [onnx::ReduceSum_69 -> (-1, 9, 6380, 9568)[FLOAT]], [onnx::ReduceSum_70 -> (1)[INT32]], 
[09/27/2022-11:27:52] [V] [TRT] Registering layer: ReduceSum_62 for ONNX node: ReduceSum_62
[09/27/2022-11:27:52] [V] [TRT] Registering tensor: onnx::Max_71 for ONNX tensor: onnx::Max_71
[09/27/2022-11:27:52] [V] [TRT] ReduceSum_62 [ReduceSum] outputs: [onnx::Max_71 -> (-1, 6380, 9568)[FLOAT]], 
[09/27/2022-11:27:52] [V] [TRT] Parsing node: Max_63 [Max]
[09/27/2022-11:27:52] [V] [TRT] Searching for input: onnx::Max_71
[09/27/2022-11:27:52] [V] [TRT] Searching for input: onnx::Equal_63
[09/27/2022-11:27:52] [V] [TRT] Max_63 [Max] inputs: [onnx::Max_71 -> (-1, 6380, 9568)[FLOAT]], [onnx::Equal_63 -> (-1, 6380, 9568)[FLOAT]], 
[09/27/2022-11:27:52] [V] [TRT] Registering layer: Max_63 for ONNX node: Max_63
[09/27/2022-11:27:52] [V] [TRT] Registering tensor: max_frame_out_9 for ONNX tensor: max_frame_out
[09/27/2022-11:27:52] [V] [TRT] Max_63 [Max] outputs: [max_frame_out -> (-1, 6380, 9568)[FLOAT]], 
[09/27/2022-11:27:52] [V] [TRT] Parsing node: Equal_64 [Equal]
[09/27/2022-11:27:52] [V] [TRT] Searching for input: max_frame_out
[09/27/2022-11:27:52] [V] [TRT] Searching for input: onnx::Equal_63
[09/27/2022-11:27:52] [V] [TRT] Equal_64 [Equal] inputs: [max_frame_out -> (-1, 6380, 9568)[FLOAT]], [onnx::Equal_63 -> (-1, 6380, 9568)[FLOAT]], 
[09/27/2022-11:27:52] [V] [TRT] Registering layer: Equal_64 for ONNX node: Equal_64
[09/27/2022-11:27:52] [V] [TRT] Registering tensor: onnx::Not_73 for ONNX tensor: onnx::Not_73
[09/27/2022-11:27:52] [V] [TRT] Equal_64 [Equal] outputs: [onnx::Not_73 -> (-1, 6380, 9568)[BOOL]], 
[09/27/2022-11:27:52] [V] [TRT] Parsing node: Not_65 [Not]
[09/27/2022-11:27:52] [V] [TRT] Searching for input: onnx::Not_73
[09/27/2022-11:27:52] [V] [TRT] Not_65 [Not] inputs: [onnx::Not_73 -> (-1, 6380, 9568)[BOOL]], 
[09/27/2022-11:27:52] [V] [TRT] Registering layer: Not_65 for ONNX node: Not_65
[09/27/2022-11:27:52] [V] [TRT] Registering tensor: onnx::Cast_74 for ONNX tensor: onnx::Cast_74
[09/27/2022-11:27:52] [V] [TRT] Not_65 [Not] outputs: [onnx::Cast_74 -> (-1, 6380, 9568)[BOOL]], 
[09/27/2022-11:27:52] [V] [TRT] Parsing node: Cast_66 [Cast]
[09/27/2022-11:27:52] [V] [TRT] Searching for input: onnx::Cast_74
[09/27/2022-11:27:52] [V] [TRT] Cast_66 [Cast] inputs: [onnx::Cast_74 -> (-1, 6380, 9568)[BOOL]], 
[09/27/2022-11:27:52] [V] [TRT] Casting to type: bool
[09/27/2022-11:27:52] [V] [TRT] Registering layer: Cast_66 for ONNX node: Cast_66
[09/27/2022-11:27:52] [V] [TRT] Registering tensor: onnx::Where_75 for ONNX tensor: onnx::Where_75
[09/27/2022-11:27:52] [V] [TRT] Cast_66 [Cast] outputs: [onnx::Where_75 -> (-1, 6380, 9568)[BOOL]], 
[09/27/2022-11:27:52] [V] [TRT] Parsing node: Constant_67 [Constant]
[09/27/2022-11:27:52] [V] [TRT] Constant_67 [Constant] inputs: 
[09/27/2022-11:27:52] [V] [TRT] Constant_67 [Constant] outputs: [onnx::Where_76 -> ()[INT32]], 
[09/27/2022-11:27:52] [V] [TRT] Parsing node: Where_68 [Where]
[09/27/2022-11:27:52] [V] [TRT] Searching for input: onnx::Where_75
[09/27/2022-11:27:52] [V] [TRT] Searching for input: onnx::Where_76
[09/27/2022-11:27:52] [V] [TRT] Searching for input: onnx::Where_68
[09/27/2022-11:27:52] [V] [TRT] Where_68 [Where] inputs: [onnx::Where_75 -> (-1, 6380, 9568)[BOOL]], [onnx::Where_76 -> ()[INT32]], [onnx::Where_68 -> (-1, 6380, 9568)[INT32]], 
[09/27/2022-11:27:52] [V] [TRT] Registering layer: onnx::Where_76 for ONNX node: onnx::Where_76
[09/27/2022-11:27:52] [V] [TRT] Registering layer: Where_68 for ONNX node: Where_68
[09/27/2022-11:27:52] [V] [TRT] Registering tensor: argmax_frame_out_10 for ONNX tensor: argmax_frame_out
[09/27/2022-11:27:52] [V] [TRT] Where_68 [Where] outputs: [argmax_frame_out -> (-1, 6380, 9568)[INT32]], 
[09/27/2022-11:27:52] [V] [TRT] Marking max_frame_out_9 as output: max_frame_out
[09/27/2022-11:27:52] [V] [TRT] Marking argmax_frame_out_10 as output: argmax_frame_out
[09/27/2022-11:27:52] [I] Finish parsing network model
[09/27/2022-11:27:52] [W] Dynamic dimensions required for input: im_stack, but no shapes were provided. Automatically overriding shape to: 1x9x6380x9568
[09/27/2022-11:27:52] [V] [TRT] Applying generic optimizations to the graph for inference.
[09/27/2022-11:27:52] [V] [TRT] Original: 69 layers
[09/27/2022-11:27:52] [V] [TRT] After dead-layer removal: 69 layers
[09/27/2022-11:27:52] [V] [TRT] Running: ConstShuffleFusion on (Unnamed Layer* 3) [Constant]
[09/27/2022-11:27:52] [V] [TRT] ConstShuffleFusion: Fusing (Unnamed Layer* 3) [Constant] with (Unnamed Layer* 4) [Shuffle]
[09/27/2022-11:27:52] [V] [TRT] Running: ConstShuffleFusion on onnx::Where_22
[09/27/2022-11:27:52] [V] [TRT] ConstShuffleFusion: Fusing onnx::Where_22 with (Unnamed Layer* 15) [Shuffle]
[09/27/2022-11:27:52] [V] [TRT] Running: ConstShuffleFusion on onnx::Where_31
[09/27/2022-11:27:52] [V] [TRT] ConstShuffleFusion: Fusing onnx::Where_31 with (Unnamed Layer* 24) [Shuffle]
[09/27/2022-11:27:52] [V] [TRT] Running: ConstShuffleFusion on onnx::Where_40
[09/27/2022-11:27:52] [V] [TRT] ConstShuffleFusion: Fusing onnx::Where_40 with (Unnamed Layer* 33) [Shuffle]
[09/27/2022-11:27:52] [V] [TRT] Running: ConstShuffleFusion on onnx::Where_49
[09/27/2022-11:27:52] [V] [TRT] ConstShuffleFusion: Fusing onnx::Where_49 with (Unnamed Layer* 42) [Shuffle]
[09/27/2022-11:27:52] [V] [TRT] Running: ConstShuffleFusion on onnx::Where_58
[09/27/2022-11:27:52] [V] [TRT] ConstShuffleFusion: Fusing onnx::Where_58 with (Unnamed Layer* 51) [Shuffle]
[09/27/2022-11:27:52] [V] [TRT] Running: ConstShuffleFusion on onnx::Where_67
[09/27/2022-11:27:52] [V] [TRT] ConstShuffleFusion: Fusing onnx::Where_67 with (Unnamed Layer* 60) [Shuffle]
[09/27/2022-11:27:52] [V] [TRT] Running: ConstShuffleFusion on onnx::Where_76
[09/27/2022-11:27:52] [V] [TRT] ConstShuffleFusion: Fusing onnx::Where_76 with (Unnamed Layer* 69) [Shuffle]
[09/27/2022-11:27:52] [V] [TRT] After Myelin optimization: 9 layers
[09/27/2022-11:27:52] [V] [TRT] Applying ScaleNodes fusions.
[09/27/2022-11:27:52] [V] [TRT] After scale fusion: 9 layers
[09/27/2022-11:27:52] [V] [TRT] After dupe layer removal: 9 layers
[09/27/2022-11:27:52] [V] [TRT] After final dead-layer removal: 9 layers
[09/27/2022-11:27:52] [V] [TRT] After tensor merging: 9 layers
[09/27/2022-11:27:52] [V] [TRT] After vertical fusions: 9 layers
[09/27/2022-11:27:52] [V] [TRT] After dupe layer removal: 9 layers
[09/27/2022-11:27:52] [V] [TRT] After final dead-layer removal: 9 layers
[09/27/2022-11:27:52] [V] [TRT] After tensor merging: 9 layers
[09/27/2022-11:27:52] [V] [TRT] After slice removal: 9 layers
[09/27/2022-11:27:52] [V] [TRT] After concat removal: 9 layers
[09/27/2022-11:27:52] [V] [TRT] Trying to split Reshape and strided tensor
[09/27/2022-11:27:52] [V] [TRT] Graph construction and optimization completed in 0.00745657 seconds.
[09/27/2022-11:27:53] [V] [TRT] Using cublasLt as a tactic source
[09/27/2022-11:27:53] [I] [TRT] [MemUsageChange] Init cuBLAS/cuBLASLt: CPU +839, GPU +360, now: CPU 1521, GPU 897 (MiB)
[09/27/2022-11:27:53] [V] [TRT] Using cuDNN as a tactic source
[09/27/2022-11:27:53] [I] [TRT] [MemUsageChange] Init cuDNN: CPU +128, GPU +60, now: CPU 1649, GPU 957 (MiB)
[09/27/2022-11:27:53] [I] [TRT] Local timing cache in use. Profiling results in this builder pass will not be stored.
[09/27/2022-11:27:53] [V] [TRT] Constructing optimization profile number 0 [1/1].
[09/27/2022-11:27:53] [V] [TRT] Reserving memory for host IO tensors. Host: 0 bytes
[09/27/2022-11:27:53] [V] [TRT] =============== Computing reformatting costs
[09/27/2022-11:27:53] [V] [TRT] *************** Autotuning Reformat: Float(549394560,61043840,9568,1) -> Float(549394560,61043840,9568,1) ***************
[09/27/2022-11:27:53] [V] [TRT] --------------- Timing Runner: Reformatting CopyNode for Network Input im_stack (Reformat)
[09/27/2022-11:27:53] [V] [TRT] Tactic: 0x00000000000003e8 Time: 5.15131
[09/27/2022-11:27:53] [V] [TRT] Tactic: 0x00000000000003ea Time: 5.1791
[09/27/2022-11:27:53] [V] [TRT] Tactic: 0x0000000000000000 Time: 5.19768
[09/27/2022-11:27:53] [V] [TRT] Fastest Tactic: 0x00000000000003e8 Time: 5.15131
[09/27/2022-11:27:53] [V] [TRT] >>>>>>>>>>>>>>> Chose Runner Type: Reformat Tactic: 0x00000000000003e8
[09/27/2022-11:27:53] [V] [TRT] *************** Autotuning Reformat: Float(549394560,61043840,9568,1) -> Float(549394560,1,86112,9) ***************
[09/27/2022-11:27:53] [V] [TRT] --------------- Timing Runner: Reformatting CopyNode for Network Input im_stack (Reformat)
[09/27/2022-11:27:53] [V] [TRT] Tactic: 0x00000000000003e8 Time: 5.8839
[09/27/2022-11:27:53] [V] [TRT] Tactic: 0x00000000000003ea Time: 9.20693
[09/27/2022-11:27:53] [V] [TRT] Tactic: 0x0000000000000000 Time: 5.84675
[09/27/2022-11:27:53] [V] [TRT] Fastest Tactic: 0x0000000000000000 Time: 5.84675
[09/27/2022-11:27:53] [V] [TRT] >>>>>>>>>>>>>>> Chose Runner Type: Reformat Tactic: 0x0000000000000000
[09/27/2022-11:27:53] [V] [TRT] *************** Autotuning Reformat: Float(549394560,61043840,9568,1) -> Float(183131520,1:4,28704,3) ***************
[09/27/2022-11:27:53] [V] [TRT] --------------- Timing Runner: Reformatting CopyNode for Network Input im_stack (Reformat)
[09/27/2022-11:27:53] [V] [TRT] Tactic: 0x00000000000003e8 Time: 9.9938
[09/27/2022-11:27:53] [V] [TRT] Tactic: 0x00000000000003ea Time: 8.93221
[09/27/2022-11:27:54] [V] [TRT] Tactic: 0x0000000000000000 Time: 10.0437
[09/27/2022-11:27:54] [V] [TRT] Fastest Tactic: 0x00000000000003ea Time: 8.93221
[09/27/2022-11:27:54] [V] [TRT] >>>>>>>>>>>>>>> Chose Runner Type: Reformat Tactic: 0x00000000000003ea
[09/27/2022-11:27:54] [V] [TRT] =============== Computing reformatting costs
[09/27/2022-11:27:54] [V] [TRT] *************** Autotuning Reformat: Float(549394560,61043840,9568,1) -> Float(549394560,1,86112,9) ***************
[09/27/2022-11:27:54] [V] [TRT] --------------- Timing Runner: Optimizer Reformat(Reformatted im_stack -> <out>) (Reformat)
[09/27/2022-11:27:54] [V] [TRT] Tactic: 0x00000000000003e8 Time: 5.82305
[09/27/2022-11:27:54] [V] [TRT] Tactic: 0x00000000000003ea Time: 9.13218
[09/27/2022-11:27:54] [V] [TRT] Tactic: 0x0000000000000000 Time: 5.85128
[09/27/2022-11:27:54] [V] [TRT] Fastest Tactic: 0x00000000000003e8 Time: 5.82305
[09/27/2022-11:27:54] [V] [TRT] *************** Autotuning Reformat: Float(549394560,61043840,9568,1) -> Float(183131520,1:4,28704,3) ***************
[09/27/2022-11:27:54] [V] [TRT] --------------- Timing Runner: Optimizer Reformat(Reformatted im_stack -> <out>) (Reformat)
[09/27/2022-11:27:54] [V] [TRT] Tactic: 0x00000000000003e8 Time: 10.0235
[09/27/2022-11:27:54] [V] [TRT] Tactic: 0x00000000000003ea Time: 9.1667
[09/27/2022-11:27:54] [V] [TRT] Tactic: 0x0000000000000000 Time: 10.0469
[09/27/2022-11:27:54] [V] [TRT] Fastest Tactic: 0x00000000000003ea Time: 9.1667
[09/27/2022-11:27:54] [V] [TRT] *************** Autotuning Reformat: Float(549394560,1,86112,9) -> Float(549394560,61043840,9568,1) ***************
[09/27/2022-11:27:54] [V] [TRT] --------------- Timing Runner: Optimizer Reformat(Reformatted im_stack -> <out>) (Reformat)
[09/27/2022-11:27:54] [V] [TRT] Tactic: 0x00000000000003e8 Time: 22.6671
[09/27/2022-11:27:54] [V] [TRT] Tactic: 0x00000000000003ea Time: 20.9873
[09/27/2022-11:27:54] [V] [TRT] Tactic: 0x0000000000000000 Time: 22.6668
[09/27/2022-11:27:54] [V] [TRT] Fastest Tactic: 0x00000000000003ea Time: 20.9873
[09/27/2022-11:27:54] [V] [TRT] *************** Autotuning Reformat: Float(549394560,1,86112,9) -> Float(183131520,1:4,28704,3) ***************
[09/27/2022-11:27:54] [V] [TRT] --------------- Timing Runner: Optimizer Reformat(Reformatted im_stack -> <out>) (Reformat)
[09/27/2022-11:27:55] [V] [TRT] Tactic: 0x00000000000003e8 Time: 9.31196
[09/27/2022-11:27:55] [V] [TRT] Tactic: 0x00000000000003ea Time: 20.8592
[09/27/2022-11:27:55] [V] [TRT] Tactic: 0x0000000000000000 Time: 9.36492
[09/27/2022-11:27:55] [V] [TRT] Fastest Tactic: 0x00000000000003e8 Time: 9.31196
[09/27/2022-11:27:55] [V] [TRT] *************** Autotuning Reformat: Float(183131520,1:4,28704,3) -> Float(549394560,61043840,9568,1) ***************
[09/27/2022-11:27:55] [V] [TRT] --------------- Timing Runner: Optimizer Reformat(Reformatted im_stack -> <out>) (Reformat)
[09/27/2022-11:27:55] [V] [TRT] Tactic: 0x00000000000003e8 Time: 22.6651
[09/27/2022-11:27:55] [V] [TRT] Tactic: 0x00000000000003ea Time: 21.408
[09/27/2022-11:27:55] [V] [TRT] Tactic: 0x0000000000000000 Time: 22.663
[09/27/2022-11:27:55] [V] [TRT] Fastest Tactic: 0x00000000000003ea Time: 21.408
[09/27/2022-11:27:55] [V] [TRT] *************** Autotuning Reformat: Float(183131520,1:4,28704,3) -> Float(549394560,1,86112,9) ***************
[09/27/2022-11:27:55] [V] [TRT] --------------- Timing Runner: Optimizer Reformat(Reformatted im_stack -> <out>) (Reformat)
[09/27/2022-11:27:55] [V] [TRT] Tactic: 0x00000000000003e8 Time: 5.98938
[09/27/2022-11:27:56] [V] [TRT] Tactic: 0x00000000000003ea Time: 21.8466
[09/27/2022-11:27:56] [V] [TRT] Tactic: 0x0000000000000000 Time: 5.98806
[09/27/2022-11:27:56] [V] [TRT] Fastest Tactic: 0x0000000000000000 Time: 5.98806
[09/27/2022-11:27:56] [V] [TRT] =============== Computing reformatting costs
[09/27/2022-11:27:56] [V] [TRT] *************** Autotuning Reformat: Float(549394560,61043840,9568,1) -> Float(549394560,1,86112,9) ***************
[09/27/2022-11:27:56] [V] [TRT] --------------- Timing Runner: Optimizer Reformat(<in> -> input.12) (Reformat)
[09/27/2022-11:27:56] [V] [TRT] Tactic: 0x00000000000003e8 Time: 5.84046
[09/27/2022-11:27:56] [V] [TRT] Tactic: 0x00000000000003ea Time: 9.17782
[09/27/2022-11:27:56] [V] [TRT] Tactic: 0x0000000000000000 Time: 5.85172
[09/27/2022-11:27:56] [V] [TRT] Fastest Tactic: 0x00000000000003e8 Time: 5.84046
[09/27/2022-11:27:56] [V] [TRT] *************** Autotuning Reformat: Float(549394560,61043840,9568,1) -> Float(183131520,1:4,28704,3) ***************
[09/27/2022-11:27:56] [V] [TRT] --------------- Timing Runner: Optimizer Reformat(<in> -> input.12) (Reformat)
[09/27/2022-11:27:56] [V] [TRT] Tactic: 0x00000000000003e8 Time: 10.0251
[09/27/2022-11:27:56] [V] [TRT] Tactic: 0x00000000000003ea Time: 9.22258
[09/27/2022-11:27:56] [V] [TRT] Tactic: 0x0000000000000000 Time: 10.0871
[09/27/2022-11:27:56] [V] [TRT] Fastest Tactic: 0x00000000000003ea Time: 9.22258
[09/27/2022-11:27:56] [V] [TRT] *************** Autotuning Reformat: Float(549394560,1,86112,9) -> Float(549394560,61043840,9568,1) ***************
[09/27/2022-11:27:56] [V] [TRT] --------------- Timing Runner: Optimizer Reformat(<in> -> input.12) (Reformat)
[09/27/2022-11:27:56] [V] [TRT] Tactic: 0x00000000000003e8 Time: 22.6668
[09/27/2022-11:27:56] [V] [TRT] Tactic: 0x00000000000003ea Time: 21.1332
[09/27/2022-11:27:57] [V] [TRT] Tactic: 0x0000000000000000 Time: 22.6678
[09/27/2022-11:27:57] [V] [TRT] Fastest Tactic: 0x00000000000003ea Time: 21.1332
[09/27/2022-11:27:57] [V] [TRT] *************** Autotuning Reformat: Float(549394560,1,86112,9) -> Float(183131520,1:4,28704,3) ***************
[09/27/2022-11:27:57] [V] [TRT] --------------- Timing Runner: Optimizer Reformat(<in> -> input.12) (Reformat)
[09/27/2022-11:27:57] [V] [TRT] Tactic: 0x00000000000003e8 Time: 9.35073
[09/27/2022-11:27:57] [V] [TRT] Tactic: 0x00000000000003ea Time: 20.9194
[09/27/2022-11:27:57] [V] [TRT] Tactic: 0x0000000000000000 Time: 9.38876
[09/27/2022-11:27:57] [V] [TRT] Fastest Tactic: 0x00000000000003e8 Time: 9.35073
[09/27/2022-11:27:57] [V] [TRT] *************** Autotuning Reformat: Float(183131520,1:4,28704,3) -> Float(549394560,61043840,9568,1) ***************
[09/27/2022-11:27:57] [V] [TRT] --------------- Timing Runner: Optimizer Reformat(<in> -> input.12) (Reformat)
[09/27/2022-11:27:57] [V] [TRT] Tactic: 0x00000000000003e8 Time: 22.6632
[09/27/2022-11:27:57] [V] [TRT] Tactic: 0x00000000000003ea Time: 21.4521
[09/27/2022-11:27:58] [V] [TRT] Tactic: 0x0000000000000000 Time: 22.6627
[09/27/2022-11:27:58] [V] [TRT] Fastest Tactic: 0x00000000000003ea Time: 21.4521
[09/27/2022-11:27:58] [V] [TRT] *************** Autotuning Reformat: Float(183131520,1:4,28704,3) -> Float(549394560,1,86112,9) ***************
[09/27/2022-11:27:58] [V] [TRT] --------------- Timing Runner: Optimizer Reformat(<in> -> input.12) (Reformat)
[09/27/2022-11:27:58] [V] [TRT] Tactic: 0x00000000000003e8 Time: 5.98323
[09/27/2022-11:27:58] [V] [TRT] Tactic: 0x00000000000003ea Time: 21.8561
[09/27/2022-11:27:58] [V] [TRT] Tactic: 0x0000000000000000 Time: 5.98645
[09/27/2022-11:27:58] [V] [TRT] Fastest Tactic: 0x00000000000003e8 Time: 5.98323
[09/27/2022-11:27:58] [V] [TRT] =============== Computing reformatting costs
[09/27/2022-11:27:58] [V] [TRT] *************** Autotuning Reformat: Float(549394560,61043840,9568,1) -> Float(549394560,1,86112,9) ***************
[09/27/2022-11:27:58] [V] [TRT] *************** Autotuning Reformat: Float(549394560,61043840,9568,1) -> Float(183131520,1:4,28704,3) ***************
[09/27/2022-11:27:58] [V] [TRT] *************** Autotuning Reformat: Float(549394560,1,86112,9) -> Float(549394560,61043840,9568,1) ***************
[09/27/2022-11:27:58] [V] [TRT] *************** Autotuning Reformat: Float(549394560,1,86112,9) -> Float(183131520,1:4,28704,3) ***************
[09/27/2022-11:27:58] [V] [TRT] *************** Autotuning Reformat: Float(183131520,1:4,28704,3) -> Float(549394560,61043840,9568,1) ***************
[09/27/2022-11:27:58] [V] [TRT] *************** Autotuning Reformat: Float(183131520,1:4,28704,3) -> Float(549394560,1,86112,9) ***************
[09/27/2022-11:27:58] [V] [TRT] =============== Computing reformatting costs
[09/27/2022-11:27:58] [V] [TRT] *************** Autotuning Reformat: Float(549394560,61043840,9568,1) -> Float(549394560,1,86112,9) ***************
[09/27/2022-11:27:58] [V] [TRT] *************** Autotuning Reformat: Float(549394560,61043840,9568,1) -> Float(183131520,1:4,28704,3) ***************
[09/27/2022-11:27:58] [V] [TRT] *************** Autotuning Reformat: Float(549394560,1,86112,9) -> Float(549394560,61043840,9568,1) ***************
[09/27/2022-11:27:58] [V] [TRT] *************** Autotuning Reformat: Float(549394560,1,86112,9) -> Float(183131520,1:4,28704,3) ***************
[09/27/2022-11:27:58] [V] [TRT] *************** Autotuning Reformat: Float(183131520,1:4,28704,3) -> Float(549394560,61043840,9568,1) ***************
[09/27/2022-11:27:58] [V] [TRT] *************** Autotuning Reformat: Float(183131520,1:4,28704,3) -> Float(549394560,1,86112,9) ***************
[09/27/2022-11:27:58] [V] [TRT] =============== Computing reformatting costs
[09/27/2022-11:27:58] [V] [TRT] *************** Autotuning Reformat: Float(549394560,61043840,9568,1) -> Float(549394560,1,86112,9) ***************
[09/27/2022-11:27:58] [V] [TRT] *************** Autotuning Reformat: Float(549394560,61043840,9568,1) -> Float(183131520,1:4,28704,3) ***************
[09/27/2022-11:27:58] [V] [TRT] *************** Autotuning Reformat: Float(549394560,1,86112,9) -> Float(549394560,61043840,9568,1) ***************
[09/27/2022-11:27:58] [V] [TRT] *************** Autotuning Reformat: Float(549394560,1,86112,9) -> Float(183131520,1:4,28704,3) ***************
[09/27/2022-11:27:58] [V] [TRT] *************** Autotuning Reformat: Float(183131520,1:4,28704,3) -> Float(549394560,61043840,9568,1) ***************
[09/27/2022-11:27:58] [V] [TRT] *************** Autotuning Reformat: Float(183131520,1:4,28704,3) -> Float(549394560,1,86112,9) ***************
[09/27/2022-11:27:58] [V] [TRT] =============== Computing reformatting costs
[09/27/2022-11:27:58] [V] [TRT] *************** Autotuning Reformat: Float(549394560,61043840,9568,1) -> Float(549394560,1,86112,9) ***************
[09/27/2022-11:27:58] [V] [TRT] *************** Autotuning Reformat: Float(549394560,61043840,9568,1) -> Float(183131520,1:4,28704,3) ***************
[09/27/2022-11:27:58] [V] [TRT] *************** Autotuning Reformat: Float(549394560,1,86112,9) -> Float(549394560,61043840,9568,1) ***************
[09/27/2022-11:27:58] [V] [TRT] *************** Autotuning Reformat: Float(549394560,1,86112,9) -> Float(183131520,1:4,28704,3) ***************
[09/27/2022-11:27:58] [V] [TRT] *************** Autotuning Reformat: Float(183131520,1:4,28704,3) -> Float(549394560,61043840,9568,1) ***************
[09/27/2022-11:27:58] [V] [TRT] *************** Autotuning Reformat: Float(183131520,1:4,28704,3) -> Float(549394560,1,86112,9) ***************
[09/27/2022-11:27:58] [V] [TRT] =============== Computing reformatting costs
[09/27/2022-11:27:58] [V] [TRT] *************** Autotuning Reformat: Float(549394560,61043840,9568,1) -> Float(549394560,1,86112,9) ***************
[09/27/2022-11:27:58] [V] [TRT] *************** Autotuning Reformat: Float(549394560,61043840,9568,1) -> Float(183131520,1:4,28704,3) ***************
[09/27/2022-11:27:58] [V] [TRT] *************** Autotuning Reformat: Float(549394560,1,86112,9) -> Float(549394560,61043840,9568,1) ***************
[09/27/2022-11:27:58] [V] [TRT] *************** Autotuning Reformat: Float(549394560,1,86112,9) -> Float(183131520,1:4,28704,3) ***************
[09/27/2022-11:27:58] [V] [TRT] *************** Autotuning Reformat: Float(183131520,1:4,28704,3) -> Float(549394560,61043840,9568,1) ***************
[09/27/2022-11:27:58] [V] [TRT] *************** Autotuning Reformat: Float(183131520,1:4,28704,3) -> Float(549394560,1,86112,9) ***************
[09/27/2022-11:27:58] [V] [TRT] =============== Computing reformatting costs
[09/27/2022-11:27:58] [V] [TRT] *************** Autotuning Reformat: Float(549394560,61043840,9568,1) -> Float(549394560,1,86112,9) ***************
[09/27/2022-11:27:58] [V] [TRT] *************** Autotuning Reformat: Float(549394560,61043840,9568,1) -> Float(183131520,1:4,28704,3) ***************
[09/27/2022-11:27:58] [V] [TRT] *************** Autotuning Reformat: Float(549394560,1,86112,9) -> Float(549394560,61043840,9568,1) ***************
[09/27/2022-11:27:58] [V] [TRT] *************** Autotuning Reformat: Float(549394560,1,86112,9) -> Float(183131520,1:4,28704,3) ***************
[09/27/2022-11:27:58] [V] [TRT] *************** Autotuning Reformat: Float(183131520,1:4,28704,3) -> Float(549394560,61043840,9568,1) ***************
[09/27/2022-11:27:58] [V] [TRT] *************** Autotuning Reformat: Float(183131520,1:4,28704,3) -> Float(549394560,1,86112,9) ***************
[09/27/2022-11:27:58] [V] [TRT] =============== Computing reformatting costs
[09/27/2022-11:27:58] [V] [TRT] *************** Autotuning Reformat: Float(549394560,61043840,9568,1) -> Float(549394560,1,86112,9) ***************
[09/27/2022-11:27:58] [V] [TRT] *************** Autotuning Reformat: Float(549394560,61043840,9568,1) -> Float(183131520,1:4,28704,3) ***************
[09/27/2022-11:27:58] [V] [TRT] *************** Autotuning Reformat: Float(549394560,1,86112,9) -> Float(549394560,61043840,9568,1) ***************
[09/27/2022-11:27:58] [V] [TRT] *************** Autotuning Reformat: Float(549394560,1,86112,9) -> Float(183131520,1:4,28704,3) ***************
[09/27/2022-11:27:58] [V] [TRT] *************** Autotuning Reformat: Float(183131520,1:4,28704,3) -> Float(549394560,61043840,9568,1) ***************
[09/27/2022-11:27:58] [V] [TRT] *************** Autotuning Reformat: Float(183131520,1:4,28704,3) -> Float(549394560,1,86112,9) ***************
[09/27/2022-11:27:58] [V] [TRT] =============== Computing reformatting costs
[09/27/2022-11:27:58] [V] [TRT] *************** Autotuning Reformat: Float(549394560,61043840,9568,1) -> Float(549394560,1,86112,9) ***************
[09/27/2022-11:27:58] [V] [TRT] *************** Autotuning Reformat: Float(549394560,61043840,9568,1) -> Float(183131520,1:4,28704,3) ***************
[09/27/2022-11:27:58] [V] [TRT] *************** Autotuning Reformat: Float(549394560,1,86112,9) -> Float(549394560,61043840,9568,1) ***************
[09/27/2022-11:27:58] [V] [TRT] *************** Autotuning Reformat: Float(549394560,1,86112,9) -> Float(183131520,1:4,28704,3) ***************
[09/27/2022-11:27:58] [V] [TRT] *************** Autotuning Reformat: Float(183131520,1:4,28704,3) -> Float(549394560,61043840,9568,1) ***************
[09/27/2022-11:27:58] [V] [TRT] *************** Autotuning Reformat: Float(183131520,1:4,28704,3) -> Float(549394560,1,86112,9) ***************
[09/27/2022-11:27:58] [V] [TRT] =============== Computing reformatting costs
[09/27/2022-11:27:58] [V] [TRT] *************** Autotuning Reformat: Float(549394560,61043840,9568,1) -> Float(549394560,1,86112,9) ***************
[09/27/2022-11:27:58] [V] [TRT] *************** Autotuning Reformat: Float(549394560,61043840,9568,1) -> Float(183131520,1:4,28704,3) ***************
[09/27/2022-11:27:58] [V] [TRT] *************** Autotuning Reformat: Float(549394560,1,86112,9) -> Float(549394560,61043840,9568,1) ***************
[09/27/2022-11:27:58] [V] [TRT] *************** Autotuning Reformat: Float(549394560,1,86112,9) -> Float(183131520,1:4,28704,3) ***************
[09/27/2022-11:27:58] [V] [TRT] *************** Autotuning Reformat: Float(183131520,1:4,28704,3) -> Float(549394560,61043840,9568,1) ***************
[09/27/2022-11:27:58] [V] [TRT] *************** Autotuning Reformat: Float(183131520,1:4,28704,3) -> Float(549394560,1,86112,9) ***************
[09/27/2022-11:27:58] [V] [TRT] =============== Computing reformatting costs
[09/27/2022-11:27:58] [V] [TRT] *************** Autotuning Reformat: Float(549394560,61043840,9568,1) -> Float(549394560,1,86112,9) ***************
[09/27/2022-11:27:58] [V] [TRT] *************** Autotuning Reformat: Float(549394560,61043840,9568,1) -> Float(183131520,1:4,28704,3) ***************
[09/27/2022-11:27:58] [V] [TRT] *************** Autotuning Reformat: Float(549394560,1,86112,9) -> Float(549394560,61043840,9568,1) ***************
[09/27/2022-11:27:58] [V] [TRT] *************** Autotuning Reformat: Float(549394560,1,86112,9) -> Float(183131520,1:4,28704,3) ***************
[09/27/2022-11:27:58] [V] [TRT] *************** Autotuning Reformat: Float(183131520,1:4,28704,3) -> Float(549394560,61043840,9568,1) ***************
[09/27/2022-11:27:58] [V] [TRT] *************** Autotuning Reformat: Float(183131520,1:4,28704,3) -> Float(549394560,1,86112,9) ***************
[09/27/2022-11:27:58] [V] [TRT] =============== Computing reformatting costs
[09/27/2022-11:27:58] [V] [TRT] *************** Autotuning Reformat: Float(549394560,61043840,9568,1) -> Float(549394560,1,86112,9) ***************
[09/27/2022-11:27:58] [V] [TRT] *************** Autotuning Reformat: Float(549394560,61043840,9568,1) -> Float(183131520,1:4,28704,3) ***************
[09/27/2022-11:27:58] [V] [TRT] *************** Autotuning Reformat: Float(549394560,1,86112,9) -> Float(549394560,61043840,9568,1) ***************
[09/27/2022-11:27:58] [V] [TRT] *************** Autotuning Reformat: Float(549394560,1,86112,9) -> Float(183131520,1:4,28704,3) ***************
[09/27/2022-11:27:58] [V] [TRT] *************** Autotuning Reformat: Float(183131520,1:4,28704,3) -> Float(549394560,61043840,9568,1) ***************
[09/27/2022-11:27:58] [V] [TRT] *************** Autotuning Reformat: Float(183131520,1:4,28704,3) -> Float(549394560,1,86112,9) ***************
[09/27/2022-11:27:58] [V] [TRT] =============== Computing reformatting costs
[09/27/2022-11:27:58] [V] [TRT] *************** Autotuning Reformat: Float(549394560,61043840,9568,1) -> Float(549394560,1,86112,9) ***************
[09/27/2022-11:27:58] [V] [TRT] *************** Autotuning Reformat: Float(549394560,61043840,9568,1) -> Float(183131520,1:4,28704,3) ***************
[09/27/2022-11:27:58] [V] [TRT] *************** Autotuning Reformat: Float(549394560,1,86112,9) -> Float(549394560,61043840,9568,1) ***************
[09/27/2022-11:27:58] [V] [TRT] *************** Autotuning Reformat: Float(549394560,1,86112,9) -> Float(183131520,1:4,28704,3) ***************
[09/27/2022-11:27:58] [V] [TRT] *************** Autotuning Reformat: Float(183131520,1:4,28704,3) -> Float(549394560,61043840,9568,1) ***************
[09/27/2022-11:27:58] [V] [TRT] *************** Autotuning Reformat: Float(183131520,1:4,28704,3) -> Float(549394560,1,86112,9) ***************
[09/27/2022-11:27:58] [V] [TRT] =============== Computing reformatting costs
[09/27/2022-11:27:58] [V] [TRT] *************** Autotuning Reformat: Float(549394560,1,86112,9) -> Float(549394560,61043840,9568,1) ***************
[09/27/2022-11:27:58] [V] [TRT] *************** Autotuning Reformat: Float(183131520,1:4,28704,3) -> Float(549394560,61043840,9568,1) ***************
[09/27/2022-11:27:58] [V] [TRT] =============== Computing reformatting costs
[09/27/2022-11:27:58] [V] [TRT] *************** Autotuning Reformat: Float(549394560,1,86112,9) -> Float(549394560,61043840,9568,1) ***************
[09/27/2022-11:27:58] [V] [TRT] *************** Autotuning Reformat: Float(183131520,1:4,28704,3) -> Float(549394560,61043840,9568,1) ***************
[09/27/2022-11:27:58] [V] [TRT] =============== Computing reformatting costs
[09/27/2022-11:27:58] [V] [TRT] *************** Autotuning Reformat: Float(549394560,1,86112,9) -> Float(549394560,61043840,9568,1) ***************
[09/27/2022-11:27:58] [V] [TRT] *************** Autotuning Reformat: Float(183131520,1:4,28704,3) -> Float(549394560,61043840,9568,1) ***************
[09/27/2022-11:27:58] [V] [TRT] =============== Computing reformatting costs
[09/27/2022-11:27:58] [V] [TRT] *************** Autotuning Reformat: Float(549394560,1,86112,9) -> Float(549394560,61043840,9568,1) ***************
[09/27/2022-11:27:58] [V] [TRT] *************** Autotuning Reformat: Float(183131520,1:4,28704,3) -> Float(549394560,61043840,9568,1) ***************
[09/27/2022-11:27:58] [V] [TRT] =============== Computing reformatting costs
[09/27/2022-11:27:58] [V] [TRT] *************** Autotuning Reformat: Float(549394560,1,86112,9) -> Float(549394560,61043840,9568,1) ***************
[09/27/2022-11:27:58] [V] [TRT] *************** Autotuning Reformat: Float(183131520,1:4,28704,3) -> Float(549394560,61043840,9568,1) ***************
[09/27/2022-11:27:58] [V] [TRT] =============== Computing reformatting costs
[09/27/2022-11:27:58] [V] [TRT] *************** Autotuning Reformat: Float(549394560,1,86112,9) -> Float(549394560,61043840,9568,1) ***************
[09/27/2022-11:27:58] [V] [TRT] *************** Autotuning Reformat: Float(183131520,1:4,28704,3) -> Float(549394560,61043840,9568,1) ***************
[09/27/2022-11:27:58] [V] [TRT] =============== Computing reformatting costs
[09/27/2022-11:27:58] [V] [TRT] *************** Autotuning Reformat: Float(549394560,1,86112,9) -> Float(549394560,61043840,9568,1) ***************
[09/27/2022-11:27:58] [V] [TRT] *************** Autotuning Reformat: Float(183131520,1:4,28704,3) -> Float(549394560,61043840,9568,1) ***************
[09/27/2022-11:27:58] [V] [TRT] =============== Computing reformatting costs
[09/27/2022-11:27:58] [V] [TRT] *************** Autotuning Reformat: Float(549394560,1,86112,9) -> Float(549394560,61043840,9568,1) ***************
[09/27/2022-11:27:58] [V] [TRT] *************** Autotuning Reformat: Float(183131520,1:4,28704,3) -> Float(549394560,61043840,9568,1) ***************
[09/27/2022-11:27:58] [V] [TRT] =============== Computing reformatting costs
[09/27/2022-11:27:58] [V] [TRT] =============== Computing reformatting costs
[09/27/2022-11:27:58] [V] [TRT] =============== Computing costs for 
[09/27/2022-11:27:58] [V] [TRT] *************** Autotuning format combination: Float(549394560,61043840,9568,1) -> Float(549394560,61043840,9568,1) ***************
[09/27/2022-11:27:58] [V] [TRT] --------------- Timing Runner: Conv_33 (CudaDepthwiseConvolution)
[09/27/2022-11:27:58] [V] [TRT] CudaDepthwiseConvolution has no valid tactics for this config, skipping
[09/27/2022-11:27:58] [V] [TRT] --------------- Timing Runner: Conv_33 (CudnnConvolution)
[09/27/2022-11:27:59] [V] [TRT] Tactic: 0x0000000000000000 Time: 54.0036
[09/27/2022-11:28:00] [V] [TRT] Tactic: 0x0000000000000001 Time: 53.8094
[09/27/2022-11:28:00] [V] [TRT] Tactic: 0x0000000000000002 Time: 53.7037
[09/27/2022-11:28:48] [V] [TRT] Tactic: 0x0000000000000005 Time: 5962.46
[09/27/2022-11:28:48] [V] [TRT] Tactic: 0x0000000000000038 Time: 53.7623
[09/27/2022-11:28:49] [V] [TRT] Tactic: 0x0000000000000039 Time: 53.8564
[09/27/2022-11:28:49] [V] [TRT] Tactic: 0x000000000000003a Time: 54.2495
[09/27/2022-11:29:37] [V] [TRT] Tactic: 0x000000000000003d Time: 5977.33
[09/27/2022-11:29:37] [V] [TRT] Tactic: 0x0000000000000070 Time: 53.9619
[09/27/2022-11:29:38] [V] [TRT] Tactic: 0x0000000000000071 Time: 54.0219
[09/27/2022-11:29:38] [V] [TRT] Tactic: 0x0000000000000072 Time: 53.9205
[09/27/2022-11:30:26] [V] [TRT] Tactic: 0x0000000000000075 Time: 5970.37
[09/27/2022-11:30:26] [V] [TRT] Fastest Tactic: 0x0000000000000002 Time: 53.7037
[09/27/2022-11:30:26] [V] [TRT] --------------- Timing Runner: Conv_33 (CaskConvolution)
[09/27/2022-11:30:26] [V] [TRT] Conv_33 Set Tactic Name: ampere_scudnn_128x128_relu_xregs_large_nn_v1 Tactic: 0x5403ad713f811a18
[09/27/2022-11:30:33] [V] [TRT] Tactic: 0x5403ad713f811a18 Time: 818.41
[09/27/2022-11:30:33] [V] [TRT] Conv_33 Set Tactic Name: ampere_scudnn_128x64_relu_xregs_large_nn_v1 Tactic: 0x5deb29b7a8e275f7
[09/27/2022-11:30:36] [V] [TRT] Tactic: 0x5deb29b7a8e275f7 Time: 373.166
[09/27/2022-11:30:36] [V] [TRT] Conv_33 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_indexed_f32f32_f32f32_f32_nchwkcrs_nchw_tilesize64x64x8_stage3_warpsize1x4x1_g1_ffma_aligna4_alignc4 Tactic: 0xd828f024626fa982
[09/27/2022-11:30:41] [V] [TRT] Tactic: 0xd828f024626fa982 Time: 642.053
[09/27/2022-11:30:41] [V] [TRT] Fastest Tactic: 0x5deb29b7a8e275f7 Time: 373.166
[09/27/2022-11:30:41] [V] [TRT] >>>>>>>>>>>>>>> Chose Runner Type: CudnnConvolution Tactic: 0x0000000000000002
[09/27/2022-11:30:41] [V] [TRT] *************** Autotuning format combination: Float(549394560,1,86112,9) -> Float(549394560,1,86112,9) ***************
[09/27/2022-11:30:41] [V] [TRT] --------------- Timing Runner: Conv_33 (CaskConvolution)
[09/27/2022-11:30:41] [V] [TRT] Conv_33 Set Tactic Name: sm80_xmma_fprop_implicit_gemm_indexed_f32f32_f32f32_f32_nhwckrsc_nhwc_tilesize64x64x8_stage3_warpsize1x4x1_g1_ffma_aligna4_alignc4 Tactic: 0x19b688348f983aa0
[09/27/2022-11:31:03] [V] [TRT] Tactic: 0x19b688348f983aa0 Time: 2802.2
[09/27/2022-11:31:03] [V] [TRT] Fastest Tactic: 0x19b688348f983aa0 Time: 2802.2
[09/27/2022-11:31:03] [V] [TRT] >>>>>>>>>>>>>>> Chose Runner Type: CaskConvolution Tactic: 0x19b688348f983aa0
[09/27/2022-11:31:03] [V] [TRT] *************** Autotuning format combination: Float(183131520,1:4,28704,3) -> Float(183131520,1:4,28704,3) ***************
[09/27/2022-11:31:03] [V] [TRT] --------------- Timing Runner: Conv_33 (CaskConvolution)
[09/27/2022-11:31:03] [V] [TRT] CaskConvolution has no valid tactics for this config, skipping
[09/27/2022-11:31:03] [V] [TRT] =============== Computing costs for 
[09/27/2022-11:31:03] [V] [TRT] *************** Autotuning format combination: Float(549394560,61043840,9568,1) -> Float(549394560,61043840,9568,1) ***************
[09/27/2022-11:31:03] [V] [TRT] *************** Autotuning format combination: Float(549394560,1,86112,9) -> Float(549394560,1,86112,9) ***************
[09/27/2022-11:31:03] [V] [TRT] *************** Autotuning format combination: Float(183131520,1:4,28704,3) -> Float(183131520,1:4,28704,3) ***************
[09/27/2022-11:31:03] [V] [TRT] --------------- Timing Runner: Conv_42 (CaskConvolution)
[09/27/2022-11:31:03] [V] [TRT] CaskConvolution has no valid tactics for this config, skipping
[09/27/2022-11:31:03] [V] [TRT] =============== Computing costs for 
[09/27/2022-11:31:03] [V] [TRT] *************** Autotuning format combination: Float(549394560,61043840,9568,1) -> Float(549394560,61043840,9568,1) ***************
[09/27/2022-11:31:03] [V] [TRT] *************** Autotuning format combination: Float(549394560,1,86112,9) -> Float(549394560,1,86112,9) ***************
[09/27/2022-11:31:03] [V] [TRT] *************** Autotuning format combination: Float(183131520,1:4,28704,3) -> Float(183131520,1:4,28704,3) ***************
[09/27/2022-11:31:03] [V] [TRT] --------------- Timing Runner: Conv_51 (CaskConvolution)
[09/27/2022-11:31:03] [V] [TRT] CaskConvolution has no valid tactics for this config, skipping
[09/27/2022-11:31:03] [V] [TRT] =============== Computing costs for 
[09/27/2022-11:31:03] [V] [TRT] *************** Autotuning format combination: Float(549394560,61043840,9568,1) -> Float(549394560,61043840,9568,1) ***************
[09/27/2022-11:31:03] [V] [TRT] *************** Autotuning format combination: Float(549394560,1,86112,9) -> Float(549394560,1,86112,9) ***************
[09/27/2022-11:31:03] [V] [TRT] *************** Autotuning format combination: Float(183131520,1:4,28704,3) -> Float(183131520,1:4,28704,3) ***************
[09/27/2022-11:31:03] [V] [TRT] --------------- Timing Runner: Conv_60 (CaskConvolution)
[09/27/2022-11:31:03] [V] [TRT] CaskConvolution has no valid tactics for this config, skipping
[09/27/2022-11:31:03] [V] [TRT] =============== Computing costs for 
[09/27/2022-11:31:03] [V] [TRT] *************** Autotuning format combination: Float(549394560,61043840,9568,1) -> Float(549394560,61043840,9568,1) ***************
[09/27/2022-11:31:03] [V] [TRT] *************** Autotuning format combination: Float(549394560,1,86112,9) -> Float(549394560,1,86112,9) ***************
[09/27/2022-11:31:03] [V] [TRT] *************** Autotuning format combination: Float(183131520,1:4,28704,3) -> Float(183131520,1:4,28704,3) ***************
[09/27/2022-11:31:03] [V] [TRT] --------------- Timing Runner: Conv_0 (CaskConvolution)
[09/27/2022-11:31:03] [V] [TRT] CaskConvolution has no valid tactics for this config, skipping
[09/27/2022-11:31:03] [V] [TRT] =============== Computing costs for 
[09/27/2022-11:31:03] [V] [TRT] *************** Autotuning format combination: Float(549394560,61043840,9568,1) -> Float(549394560,61043840,9568,1) ***************
[09/27/2022-11:31:03] [V] [TRT] *************** Autotuning format combination: Float(549394560,1,86112,9) -> Float(549394560,1,86112,9) ***************
[09/27/2022-11:31:03] [V] [TRT] *************** Autotuning format combination: Float(183131520,1:4,28704,3) -> Float(183131520,1:4,28704,3) ***************
[09/27/2022-11:31:03] [V] [TRT] --------------- Timing Runner: Conv_6 (CaskConvolution)
[09/27/2022-11:31:03] [V] [TRT] CaskConvolution has no valid tactics for this config, skipping
[09/27/2022-11:31:03] [V] [TRT] =============== Computing costs for 
[09/27/2022-11:31:03] [V] [TRT] *************** Autotuning format combination: Float(549394560,61043840,9568,1) -> Float(549394560,61043840,9568,1) ***************
[09/27/2022-11:31:03] [V] [TRT] *************** Autotuning format combination: Float(549394560,1,86112,9) -> Float(549394560,1,86112,9) ***************
[09/27/2022-11:31:03] [V] [TRT] *************** Autotuning format combination: Float(183131520,1:4,28704,3) -> Float(183131520,1:4,28704,3) ***************
[09/27/2022-11:31:03] [V] [TRT] --------------- Timing Runner: Conv_15 (CaskConvolution)
[09/27/2022-11:31:03] [V] [TRT] CaskConvolution has no valid tactics for this config, skipping
[09/27/2022-11:31:03] [V] [TRT] =============== Computing costs for 
[09/27/2022-11:31:03] [V] [TRT] *************** Autotuning format combination: Float(549394560,61043840,9568,1) -> Float(549394560,61043840,9568,1) ***************
[09/27/2022-11:31:03] [V] [TRT] *************** Autotuning format combination: Float(549394560,1,86112,9) -> Float(549394560,1,86112,9) ***************
[09/27/2022-11:31:03] [V] [TRT] *************** Autotuning format combination: Float(183131520,1:4,28704,3) -> Float(183131520,1:4,28704,3) ***************
[09/27/2022-11:31:03] [V] [TRT] --------------- Timing Runner: Conv_24 (CaskConvolution)
[09/27/2022-11:31:03] [V] [TRT] CaskConvolution has no valid tactics for this config, skipping
[09/27/2022-11:31:03] [V] [TRT] =============== Computing costs for 
[09/27/2022-11:31:03] [V] [TRT] *************** Autotuning format combination: Float(549394560,61043840,9568,1), Float(549394560,61043840,9568,1), Float(549394560,61043840,9568,1), Float(549394560,61043840,9568,1), Float(549394560,61043840,9568,1), Float(549394560,61043840,9568,1), Float(549394560,61043840,9568,1), Float(549394560,61043840,9568,1) -> Float(61043840,9568,1), Int32(61043840,9568,1) ***************
[09/27/2022-11:31:03] [V] [TRT] --------------- Timing Runner: {ForeignNode[ReduceSum_2...Where_68]} (Myelin)
[09/27/2022-11:31:03] [W] [TRT] Skipping tactic 0x0000000000000000 due to Myelin error: autotuning: CUDA error 2 allocating 0-byte buffer: out of memory
[09/27/2022-11:31:03] [V] [TRT] Fastest Tactic: 0xd15ea5edd15ea5ed Time: inf
[09/27/2022-11:31:03] [V] [TRT] Deleting timing cache: 17 entries, served 102 hits since creation.
[09/27/2022-11:31:03] [E] Error[10]: [optimizer.cpp::computeCosts::3626] Error Code 10: Internal Error (Could not find any implementation for node {ForeignNode[ReduceSum_2...Where_68]}.)
[09/27/2022-11:31:03] [E] Error[2]: [builder.cpp::buildSerializedNetwork::636] Error Code 2: Internal Error (Assertion engine != nullptr failed. )
[09/27/2022-11:31:03] [E] Engine could not be created from network
[09/27/2022-11:31:03] [E] Building engine failed
[09/27/2022-11:31:03] [E] Failed to create engine from model or file.
[09/27/2022-11:31:03] [E] Engine set up failed
&&&& FAILED TensorRT.trtexec [TensorRT v8403] # ./trtexec --onnx=/home/c.mourning/maxnet_2x4.onnx --saveEngine=/home/c.mourning/maxnet_2x4.engine --verbose
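
For completeness, the log above shows trtexec warning that no shapes were provided for im_stack and automatically overriding the shape to 1x9x6380x9568. A retry that pins that shape explicitly and enlarges the workspace memory pool would look roughly like the command below; the flag spellings assume the trtexec that ships with TensorRT 8.4, and the pool size (8192 MiB here) is only an illustrative value.

 # hedged sketch: fix the input shape and raise the workspace pool limit
 ./trtexec --onnx=/home/c.mourning/maxnet_2x4.onnx --saveEngine=/home/c.mourning/maxnet_2x4.engine --shapes=im_stack:1x9x6380x9568 --memPoolSize=workspace:8192 --verbose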

Hi,

We could reproduce the same behavior. Please allow us some time to check the details.

Thank you.

Hi,
Please refer to the links below for custom plugin implementation and a sample:

While the IPluginV2 and IPluginV2Ext interfaces are still supported for backward compatibility with TensorRT 5.1 and 6.0.x respectively, we recommend that you write new plugins or refactor existing ones to target the IPluginV2DynamicExt or IPluginV2IOExt interfaces instead.

Thanks!
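
Once a custom plugin is compiled into a shared library, it can be loaded into trtexec with the --plugins option, for example (the library name below is only a placeholder):

 # hedged sketch: load a hypothetical custom-plugin library during the build
 ./trtexec --onnx=/home/c.mourning/maxnet_2x4.onnx --saveEngine=/home/c.mourning/maxnet_2x4.engine --plugins=./libmy_custom_plugin.so --verbose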

Sorry, I’m not sure I understand. I am not knowingly using any plugins. Are you suggesting that I implement some of the functionality of this model with a custom plugin? If so, which parts?

Hi @c.mourning,

Could you please confirm whether you’d like to run this model in FP32 or FP16?

Thank you.
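
(For reference, precision is selected at build time; with trtexec, FP16 is enabled by adding the --fp16 flag to the otherwise unchanged build command, e.g.:)

 # hedged sketch: same build command, but allowing FP16 kernels
 ./trtexec --onnx=/home/c.mourning/maxnet_2x4.onnx --saveEngine=/home/c.mourning/maxnet_2x4.engine --fp16 --verbose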