Using TensorFlow backend.
WARNING:tensorflow:Deprecation warnings have been disabled. Set TF_ENABLE_DEPRECATION_WARNINGS=1 to re-enable them.
2021-09-29 14:33:03,926 [INFO] iva.faster_rcnn.spec_loader.spec_loader: Loading experiment spec at /workspace/tlt-experiments/fasterRCNN/resnet18/specs/default_spec_resnet18_retrain_spec.txt.
__________________________________________________________________________________________________
Layer (type)                     Output Shape          Param #     Connected to
==================================================================================================
input_image (InputLayer)         (None, 3, 480, 640)   0
__________________________________________________________________________________________________
conv1 (Conv2D)                   (None, 48, 240, 320)  7056        input_image[0][0]
__________________________________________________________________________________________________
bn_conv1 (BatchNormalization)    (None, 48, 240, 320)  192         conv1[0][0]
__________________________________________________________________________________________________
activation_1 (Activation)        (None, 48, 240, 320)  0           bn_conv1[0][0]
__________________________________________________________________________________________________
block_1a_conv_1 (Conv2D)         (None, 64, 120, 160)  27648       activation_1[0][0]
__________________________________________________________________________________________________
block_1a_bn_1 (BatchNormalizati  (None, 64, 120, 160)  256         block_1a_conv_1[0][0]
__________________________________________________________________________________________________
block_1a_relu_1 (Activation)     (None, 64, 120, 160)  0           block_1a_bn_1[0][0]
__________________________________________________________________________________________________
block_1a_conv_2 (Conv2D)         (None, 64, 120, 160)  36864       block_1a_relu_1[0][0]
__________________________________________________________________________________________________
block_1a_conv_shortcut (Conv2D)  (None, 64, 120, 160)  3072        activation_1[0][0]
__________________________________________________________________________________________________
block_1a_bn_2 (BatchNormalizati  (None, 64, 120, 160)  256         block_1a_conv_2[0][0]
__________________________________________________________________________________________________
block_1a_bn_shortcut (BatchNorm  (None, 64, 120, 160)  256         block_1a_conv_shortcut[0][0]
__________________________________________________________________________________________________
add_1 (Add)                      (None, 64, 120, 160)  0           block_1a_bn_2[0][0]
                                                                   block_1a_bn_shortcut[0][0]
__________________________________________________________________________________________________
block_1a_relu (Activation)       (None, 64, 120, 160)  0           add_1[0][0]
__________________________________________________________________________________________________
block_1b_conv_1 (Conv2D)         (None, 64, 120, 160)  36864       block_1a_relu[0][0]
__________________________________________________________________________________________________
block_1b_bn_1 (BatchNormalizati  (None, 64, 120, 160)  256         block_1b_conv_1[0][0]
__________________________________________________________________________________________________
block_1b_relu_1 (Activation)     (None, 64, 120, 160)  0           block_1b_bn_1[0][0]
__________________________________________________________________________________________________
block_1b_conv_2 (Conv2D)         (None, 64, 120, 160)  36864       block_1b_relu_1[0][0]
__________________________________________________________________________________________________
block_1b_conv_shortcut (Conv2D)  (None, 64, 120, 160)  4096        block_1a_relu[0][0]
__________________________________________________________________________________________________
block_1b_bn_2 (BatchNormalizati  (None, 64, 120, 160)  256         block_1b_conv_2[0][0]
__________________________________________________________________________________________________
block_1b_bn_shortcut (BatchNorm  (None, 64, 120, 160)  256         block_1b_conv_shortcut[0][0]
__________________________________________________________________________________________________
add_2 (Add)                      (None, 64, 120, 160)  0           block_1b_bn_2[0][0]
                                                                   block_1b_bn_shortcut[0][0]
__________________________________________________________________________________________________
block_1b_relu (Activation)       (None, 64, 120, 160)  0           add_2[0][0]
__________________________________________________________________________________________________
block_2a_conv_1 (Conv2D)         (None, 128, 60, 80)   73728       block_1b_relu[0][0]
__________________________________________________________________________________________________
block_2a_bn_1 (BatchNormalizati  (None, 128, 60, 80)   512         block_2a_conv_1[0][0]
__________________________________________________________________________________________________
block_2a_relu_1 (Activation)     (None, 128, 60, 80)   0           block_2a_bn_1[0][0]
__________________________________________________________________________________________________
block_2a_conv_2 (Conv2D)         (None, 128, 60, 80)   147456      block_2a_relu_1[0][0]
__________________________________________________________________________________________________
block_2a_conv_shortcut (Conv2D)  (None, 128, 60, 80)   8192        block_1b_relu[0][0]
__________________________________________________________________________________________________
block_2a_bn_2 (BatchNormalizati  (None, 128, 60, 80)   512         block_2a_conv_2[0][0]
__________________________________________________________________________________________________
block_2a_bn_shortcut (BatchNorm  (None, 128, 60, 80)   512         block_2a_conv_shortcut[0][0]
__________________________________________________________________________________________________
add_3 (Add)                      (None, 128, 60, 80)   0           block_2a_bn_2[0][0]
                                                                   block_2a_bn_shortcut[0][0]
__________________________________________________________________________________________________
block_2a_relu (Activation)       (None, 128, 60, 80)   0           add_3[0][0]
__________________________________________________________________________________________________
block_2b_conv_1 (Conv2D)         (None, 128, 60, 80)   147456      block_2a_relu[0][0]
__________________________________________________________________________________________________
block_2b_bn_1 (BatchNormalizati  (None, 128, 60, 80)   512         block_2b_conv_1[0][0]
__________________________________________________________________________________________________
block_2b_relu_1 (Activation)     (None, 128, 60, 80)   0           block_2b_bn_1[0][0]
__________________________________________________________________________________________________
block_2b_conv_2 (Conv2D)         (None, 128, 60, 80)   147456      block_2b_relu_1[0][0]
__________________________________________________________________________________________________
block_2b_conv_shortcut (Conv2D)  (None, 128, 60, 80)   16384       block_2a_relu[0][0]
__________________________________________________________________________________________________
block_2b_bn_2 (BatchNormalizati  (None, 128, 60, 80)   512         block_2b_conv_2[0][0]
__________________________________________________________________________________________________
block_2b_bn_shortcut (BatchNorm  (None, 128, 60, 80)   512         block_2b_conv_shortcut[0][0]
__________________________________________________________________________________________________
add_4 (Add)                      (None, 128, 60, 80)   0           block_2b_bn_2[0][0]
                                                                   block_2b_bn_shortcut[0][0]
__________________________________________________________________________________________________
block_2b_relu (Activation)       (None, 128, 60, 80)   0           add_4[0][0]
__________________________________________________________________________________________________
block_3a_conv_1 (Conv2D)         (None, 256, 30, 40)   294912      block_2b_relu[0][0]
__________________________________________________________________________________________________
block_3a_bn_1 (BatchNormalizati  (None, 256, 30, 40)   1024        block_3a_conv_1[0][0]
__________________________________________________________________________________________________
block_3a_relu_1 (Activation)     (None, 256, 30, 40)   0           block_3a_bn_1[0][0]
__________________________________________________________________________________________________
block_3a_conv_2 (Conv2D)         (None, 256, 30, 40)   589824      block_3a_relu_1[0][0]
__________________________________________________________________________________________________
block_3a_conv_shortcut (Conv2D)  (None, 256, 30, 40)   32768       block_2b_relu[0][0]
__________________________________________________________________________________________________
block_3a_bn_2 (BatchNormalizati  (None, 256, 30, 40)   1024        block_3a_conv_2[0][0]
__________________________________________________________________________________________________
block_3a_bn_shortcut (BatchNorm  (None, 256, 30, 40)   1024        block_3a_conv_shortcut[0][0]
__________________________________________________________________________________________________
add_5 (Add)                      (None, 256, 30, 40)   0           block_3a_bn_2[0][0]
                                                                   block_3a_bn_shortcut[0][0]
__________________________________________________________________________________________________
block_3a_relu (Activation)       (None, 256, 30, 40)   0           add_5[0][0]
__________________________________________________________________________________________________
block_3b_conv_1 (Conv2D)         (None, 256, 30, 40)   589824      block_3a_relu[0][0]
__________________________________________________________________________________________________
block_3b_bn_1 (BatchNormalizati  (None, 256, 30, 40)   1024        block_3b_conv_1[0][0]
__________________________________________________________________________________________________
block_3b_relu_1 (Activation)     (None, 256, 30, 40)   0           block_3b_bn_1[0][0]
__________________________________________________________________________________________________
block_3b_conv_2 (Conv2D)         (None, 256, 30, 40)   589824      block_3b_relu_1[0][0]
__________________________________________________________________________________________________
block_3b_conv_shortcut (Conv2D)  (None, 256, 30, 40)   65536       block_3a_relu[0][0]
__________________________________________________________________________________________________
block_3b_bn_2 (BatchNormalizati  (None, 256, 30, 40)   1024        block_3b_conv_2[0][0]
__________________________________________________________________________________________________
block_3b_bn_shortcut (BatchNorm  (None, 256, 30, 40)   1024        block_3b_conv_shortcut[0][0]
__________________________________________________________________________________________________
add_6 (Add)                      (None, 256, 30, 40)   0           block_3b_bn_2[0][0]
                                                                   block_3b_bn_shortcut[0][0]
__________________________________________________________________________________________________
block_3b_relu (Activation)       (None, 256, 30, 40)   0           add_6[0][0]
__________________________________________________________________________________________________
rpn_conv1 (Conv2D)               (None, 512, 30, 40)   1180160     block_3b_relu[0][0]
__________________________________________________________________________________________________
rpn_out_class (Conv2D)           (None, 9, 30, 40)     4617        rpn_conv1[0][0]
__________________________________________________________________________________________________
rpn_out_regress (Conv2D)         (None, 36, 30, 40)    18468       rpn_conv1[0][0]
__________________________________________________________________________________________________
proposal_1 (Proposal)            (None, 300, 4)        0           rpn_out_class[0][0]
                                                                   rpn_out_regress[0][0]
                                                                   input_image[0][0]
__________________________________________________________________________________________________
crop_and_resize_1 (CropAndResiz  (None, 300, 256, 7,   0           block_3b_relu[0][0]
                                                                   proposal_1[0][0]
                                                                   input_image[0][0]
__________________________________________________________________________________________________
time_distributed_1 (TimeDistrib  (None, 300, 512, 7,   1179648     crop_and_resize_1[0][0]
__________________________________________________________________________________________________
time_distributed_2 (TimeDistrib  (None, 300, 512, 7,   2048        time_distributed_1[0][0]
__________________________________________________________________________________________________
block_4a_relu_1 (Activation)     (None, 300, 512, 7,   0           time_distributed_2[0][0]
__________________________________________________________________________________________________
time_distributed_3 (TimeDistrib  (None, 300, 512, 7,   2359296     block_4a_relu_1[0][0]
__________________________________________________________________________________________________
time_distributed_5 (TimeDistrib  (None, 300, 512, 7,   131072      crop_and_resize_1[0][0]
__________________________________________________________________________________________________
time_distributed_4 (TimeDistrib  (None, 300, 512, 7,   2048        time_distributed_3[0][0]
__________________________________________________________________________________________________
time_distributed_6 (TimeDistrib  (None, 300, 512, 7,   2048        time_distributed_5[0][0]
__________________________________________________________________________________________________
add_7 (Add)                      (None, 300, 512, 7,   0           time_distributed_4[0][0]
                                                                   time_distributed_6[0][0]
__________________________________________________________________________________________________
block_4a_relu (Activation)       (None, 300, 512, 7,   0           add_7[0][0]
__________________________________________________________________________________________________
time_distributed_7 (TimeDistrib  (None, 300, 512, 7,   2359296     block_4a_relu[0][0]
__________________________________________________________________________________________________
time_distributed_8 (TimeDistrib  (None, 300, 512, 7,   2048        time_distributed_7[0][0]
__________________________________________________________________________________________________
block_4b_relu_1 (Activation)     (None, 300, 512, 7,   0           time_distributed_8[0][0]
__________________________________________________________________________________________________
time_distributed_9 (TimeDistrib  (None, 300, 512, 7,   2359296     block_4b_relu_1[0][0]
__________________________________________________________________________________________________
time_distributed_11 (TimeDistri  (None, 300, 512, 7,   262144      block_4a_relu[0][0]
__________________________________________________________________________________________________
time_distributed_10 (TimeDistri  (None, 300, 512, 7,   2048        time_distributed_9[0][0]
__________________________________________________________________________________________________
time_distributed_12 (TimeDistri  (None, 300, 512, 7,   2048        time_distributed_11[0][0]
__________________________________________________________________________________________________
add_8 (Add)                      (None, 300, 512, 7,   0           time_distributed_10[0][0]
                                                                   time_distributed_12[0][0]
__________________________________________________________________________________________________
block_4b_relu (Activation)       (None, 300, 512, 7,   0           add_8[0][0]
__________________________________________________________________________________________________
time_distributed_13 (TimeDistri  (None, 300, 512, 1,   0           block_4b_relu[0][0]
__________________________________________________________________________________________________
time_distributed_flatten (TimeD  (None, 300, 512)      0           time_distributed_13[0][0]
__________________________________________________________________________________________________
dense_class_td (TimeDistributed  (None, 300, 5)        2565        time_distributed_flatten[0][0]
__________________________________________________________________________________________________
dense_regress_td (TimeDistribut  (None, 300, 16)       8208        time_distributed_flatten[0][0]
__________________________________________________________________________________________________
nms_inputs_1 (NmsInputs)         [(None, None, 1, 1),  0           proposal_1[0][0]
                                                                   dense_class_td[0][0]
                                                                   dense_regress_td[0][0]
==================================================================================================
Total params: 12,743,826
Trainable params: 12,579,746
Non-trainable params: 164,080
__________________________________________________________________________________________________
NOTE: UFF has been tested with TensorFlow 1.14.0.
WARNING: The version of TensorFlow installed on this system is not guaranteed to work with UFF.
Warning: No conversion function registered for layer: NMS_TRT yet.
Converting NMS as custom op: NMS_TRT
DEBUG: convert reshape to flatten node
Warning: No conversion function registered for layer: CropAndResize yet.
Converting roi_pooling_conv_1/CropAndResize_new as custom op: CropAndResize
Warning: No conversion function registered for layer: Proposal yet.
Converting proposal as custom op: Proposal
DEBUG [/usr/local/lib/python3.6/dist-packages/uff/converters/tensorflow/converter.py:96] Marking ['NMS'] as outputs
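The figures in the summary above can be sanity-checked by hand. A minimal plain-Python sketch follows; note that the kernel sizes are inferred from the parameter counts (7x7 for conv1, 3x3 for the block convolutions, 1x1 for shortcuts and the RPN heads), since the summary itself does not list them, and the narrow channel widths (e.g. conv1's 48 filters versus a stock ResNet-18's 64) are taken directly from the log, plausibly the result of pruning before this retrain step:

```python
# Recompute a few "Param #" entries from the summary.
# Conv2D params = in_ch * out_ch * kH * kW (+ out_ch if biased);
# BatchNormalization params = 4 * channels (gamma, beta, moving mean, moving var).

def conv2d_params(in_ch, out_ch, k, bias=False):
    return in_ch * out_ch * k * k + (out_ch if bias else 0)

def batchnorm_params(channels):
    return 4 * channels

# conv1: 3 -> 48 channels, 7x7 kernel, no bias (BN follows)
assert conv2d_params(3, 48, 7) == 7056
# bn_conv1: 48 channels
assert batchnorm_params(48) == 192
# block_1a_conv_1: 48 -> 64 channels, 3x3
assert conv2d_params(48, 64, 3) == 27648
# block_1a_conv_shortcut: 48 -> 64 channels, 1x1 projection
assert conv2d_params(48, 64, 1) == 3072
# rpn_conv1: 256 -> 512 channels, 3x3, with bias
assert conv2d_params(256, 512, 3, bias=True) == 1180160
# rpn_out_class: 512 -> 9 channels, 1x1, with bias (one score per anchor)
assert conv2d_params(512, 9, 1, bias=True) == 4617
# rpn_out_regress: 512 -> 36 channels, 1x1, with bias (4 box deltas per anchor)
assert conv2d_params(512, 36, 1, bias=True) == 18468

# The 640x480 input is halved at each stride-2 stage, matching the
# Output Shape column: 640x480 -> 320x240 -> 160x120 -> 80x60 -> 40x30.
w, h = 640, 480
for expected in [(320, 240), (160, 120), (80, 60), (40, 30)]:
    w, h = w // 2, h // 2
    assert (w, h) == expected

print("all checked summary figures reproduce")
```

This also explains the RPN head widths: 9 anchors per feature-map cell give 9 classification channels and 9 * 4 = 36 regression channels on the 40x30 grid.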