Loading pretrained model...
Create EncryptCheckpointSaverHook.
================================= Start training cycle 01 =================================
*********************** Building model graph... ***********************
[ROI OPs] Using Batched NMS... Scope: MLP/multilevel_propose_rois/level_2/
[ROI OPs] Using Batched NMS... Scope: MLP/multilevel_propose_rois/level_3/
[ROI OPs] Using Batched NMS... Scope: MLP/multilevel_propose_rois/level_4/
[ROI OPs] Using Batched NMS... Scope: MLP/multilevel_propose_rois/level_5/
[ROI OPs] Using Batched NMS... Scope: MLP/multilevel_propose_rois/level_6/
[Training Compute Statistics] 544.2 GFLOPS/image
Checkpoint is missing variable [l2/kernel]
Checkpoint is missing variable [l2/bias]
Checkpoint is missing variable [l3/kernel]
Checkpoint is missing variable [l3/bias]
Checkpoint is missing variable [l4/kernel]
Checkpoint is missing variable [l4/bias]
Checkpoint is missing variable [l5/kernel]
Checkpoint is missing variable [l5/bias]
Checkpoint is missing variable [post_hoc_d2/kernel]
Checkpoint is missing variable [post_hoc_d2/bias]
Checkpoint is missing variable [post_hoc_d3/kernel]
Checkpoint is missing variable [post_hoc_d3/bias]
Checkpoint is missing variable [post_hoc_d4/kernel]
Checkpoint is missing variable [post_hoc_d4/bias]
Checkpoint is missing variable [post_hoc_d5/kernel]
Checkpoint is missing variable [post_hoc_d5/bias]
Checkpoint is missing variable [rpn/kernel]
Checkpoint is missing variable [rpn/bias]
Checkpoint is missing variable [rpn-class/kernel]
Checkpoint is missing variable [rpn-class/bias]
Checkpoint is missing variable [rpn-box/kernel]
Checkpoint is missing variable [rpn-box/bias]
Checkpoint is missing variable [fc6/kernel]
Checkpoint is missing variable [fc6/bias]
Checkpoint is missing variable [fc7/kernel]
Checkpoint is missing variable [fc7/bias]
Checkpoint is missing variable [class-predict/kernel]
Checkpoint is missing variable [class-predict/bias]
Checkpoint is missing variable [box-predict/kernel]
Checkpoint is missing variable [box-predict/bias]
Checkpoint is missing variable [mask-conv-l0/kernel]
Checkpoint is missing variable [mask-conv-l0/bias]
Checkpoint is missing variable [mask-conv-l1/kernel]
Checkpoint is missing variable [mask-conv-l1/bias]
Checkpoint is missing variable [mask-conv-l2/kernel]
Checkpoint is missing variable [mask-conv-l2/bias]
Checkpoint is missing variable [mask-conv-l3/kernel]
Checkpoint is missing variable [mask-conv-l3/bias]
Checkpoint is missing variable [conv5-mask/kernel]
Checkpoint is missing variable [conv5-mask/bias]
Checkpoint is missing variable [mask_fcn_logits/kernel]
Checkpoint is missing variable [mask_fcn_logits/bias]
============================ GIT REPOSITORY ============================
BRANCH NAME: 
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
============================ MODEL STATISTICS ===========================
# Model Weights: 28,752,563
# Trainable Weights: 44,169,267
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
============================ TRAINABLE VARIABLES ========================
[#0001] conv1/kernel:0 => (7, 7, 3, 64)
[#0002] bn_conv1/gamma:0 => (64,)
[#0003] bn_conv1/beta:0 => (64,)
[#0004] block_1a_conv_1/kernel:0 => (1, 1, 64, 64)
[#0005] block_1a_bn_1/gamma:0 => (64,)
[#0006] block_1a_bn_1/beta:0 => (64,)
[#0007] block_1a_conv_2/kernel:0 => (3, 3, 64, 64)
[#0008] block_1a_bn_2/gamma:0 => (64,)
[#0009] block_1a_bn_2/beta:0 => (64,)
[#0010] block_1a_conv_3/kernel:0 => (1, 1, 64, 256)
[#0011] block_1a_bn_3/gamma:0 => (256,)
[#0012] block_1a_bn_3/beta:0 => (256,)
[#0013] block_1a_conv_shortcut/kernel:0 => (1, 1, 64, 256)
[#0014] block_1a_bn_shortcut/gamma:0 => (256,)
[#0015] block_1a_bn_shortcut/beta:0 => (256,)
[#0016] block_1b_conv_1/kernel:0 => (1, 1, 256, 64)
[#0017] block_1b_bn_1/gamma:0 => (64,)
[#0018] block_1b_bn_1/beta:0 => (64,)
[#0019] block_1b_conv_2/kernel:0 => (3, 3, 64, 64)
[#0020] block_1b_bn_2/gamma:0 => (64,)
[#0021] block_1b_bn_2/beta:0 => (64,)
[#0022] block_1b_conv_3/kernel:0 => (1, 1, 64, 256)
[#0023] block_1b_bn_3/gamma:0 => (256,)
[#0024] block_1b_bn_3/beta:0 => (256,)
[#0025] block_1c_conv_1/kernel:0 => (1, 1, 256, 64)
[#0026] block_1c_bn_1/gamma:0 => (64,)
[#0027] block_1c_bn_1/beta:0 => (64,)
[#0028] block_1c_conv_2/kernel:0 => (3, 3, 64, 64)
[#0029] block_1c_bn_2/gamma:0 => (64,)
[#0030] block_1c_bn_2/beta:0 => (64,)
[#0031] block_1c_conv_3/kernel:0 => (1, 1, 64, 256)
[#0032] block_1c_bn_3/gamma:0 => (256,)
[#0033] block_1c_bn_3/beta:0 => (256,)
[#0034] block_2a_conv_1/kernel:0 => (1, 1, 256, 128)
[#0035] block_2a_bn_1/gamma:0 => (128,)
[#0036] block_2a_bn_1/beta:0 => (128,)
[#0037] block_2a_conv_2/kernel:0 => (3, 3, 128, 128)
[#0038] block_2a_bn_2/gamma:0 => (128,)
[#0039] block_2a_bn_2/beta:0 => (128,)
[#0040] block_2a_conv_3/kernel:0 => (1, 1, 128, 512)
[#0041] block_2a_bn_3/gamma:0 => (512,)
[#0042] block_2a_bn_3/beta:0 => (512,)
[#0043] block_2a_conv_shortcut/kernel:0 => (1, 1, 256, 512)
[#0044] block_2a_bn_shortcut/gamma:0 => (512,)
[#0045] block_2a_bn_shortcut/beta:0 => (512,)
[#0046] block_2b_conv_1/kernel:0 => (1, 1, 512, 128)
[#0047] block_2b_bn_1/gamma:0 => (128,)
[#0048] block_2b_bn_1/beta:0 => (128,)
[#0049] block_2b_conv_2/kernel:0 => (3, 3, 128, 128)
[#0050] block_2b_bn_2/gamma:0 => (128,)
[#0051] block_2b_bn_2/beta:0 => (128,)
[#0052] block_2b_conv_3/kernel:0 => (1, 1, 128, 512)
[#0053] block_2b_bn_3/gamma:0 => (512,)
[#0054] block_2b_bn_3/beta:0 => (512,)
[#0055] block_2c_conv_1/kernel:0 => (1, 1, 512, 128)
[#0056] block_2c_bn_1/gamma:0 => (128,)
[#0057] block_2c_bn_1/beta:0 => (128,)
[#0058] block_2c_conv_2/kernel:0 => (3, 3, 128, 128)
[#0059] block_2c_bn_2/gamma:0 => (128,)
[#0060] block_2c_bn_2/beta:0 => (128,)
[#0061] block_2c_conv_3/kernel:0 => (1, 1, 128, 512)
[#0062] block_2c_bn_3/gamma:0 => (512,)
[#0063] block_2c_bn_3/beta:0 => (512,)
[#0064] block_2d_conv_1/kernel:0 => (1, 1, 512, 128)
[#0065] block_2d_bn_1/gamma:0 => (128,)
[#0066] block_2d_bn_1/beta:0 => (128,)
[#0067] block_2d_conv_2/kernel:0 => (3, 3, 128, 128)
[#0068] block_2d_bn_2/gamma:0 => (128,)
[#0069] block_2d_bn_2/beta:0 => (128,)
[#0070] block_2d_conv_3/kernel:0 => (1, 1, 128, 512)
[#0071] block_2d_bn_3/gamma:0 => (512,)
[#0072] block_2d_bn_3/beta:0 => (512,)
[#0073] block_3a_conv_1/kernel:0 => (1, 1, 512, 256)
[#0074] block_3a_bn_1/gamma:0 => (256,)
[#0075] block_3a_bn_1/beta:0 => (256,)
[#0076] block_3a_conv_2/kernel:0 => (3, 3, 256, 256)
[#0077] block_3a_bn_2/gamma:0 => (256,)
[#0078] block_3a_bn_2/beta:0 => (256,)
[#0079] block_3a_conv_3/kernel:0 => (1, 1, 256, 1024)
[#0080] block_3a_bn_3/gamma:0 => (1024,)
[#0081] block_3a_bn_3/beta:0 => (1024,)
[#0082] block_3a_conv_shortcut/kernel:0 => (1, 1, 512, 1024)
[#0083] block_3a_bn_shortcut/gamma:0 => (1024,)
[#0084] block_3a_bn_shortcut/beta:0 => (1024,)
[#0085] block_3b_conv_1/kernel:0 => (1, 1, 1024, 256)
[#0086] block_3b_bn_1/gamma:0 => (256,)
[#0087] block_3b_bn_1/beta:0 => (256,)
[#0088] block_3b_conv_2/kernel:0 => (3, 3, 256, 256)
[#0089] block_3b_bn_2/gamma:0 => (256,)
[#0090] block_3b_bn_2/beta:0 => (256,)
[#0091] block_3b_conv_3/kernel:0 => (1, 1, 256, 1024)
[#0092] block_3b_bn_3/gamma:0 => (1024,)
[#0093] block_3b_bn_3/beta:0 => (1024,)
[#0094] block_3c_conv_1/kernel:0 => (1, 1, 1024, 256)
[#0095] block_3c_bn_1/gamma:0 => (256,)
[#0096] block_3c_bn_1/beta:0 => (256,)
[#0097] block_3c_conv_2/kernel:0 => (3, 3, 256, 256)
[#0098] block_3c_bn_2/gamma:0 => (256,)
[#0099] block_3c_bn_2/beta:0 => (256,)
[#0100] block_3c_conv_3/kernel:0 => (1, 1, 256, 1024)
[#0101] block_3c_bn_3/gamma:0 => (1024,)
[#0102] block_3c_bn_3/beta:0 => (1024,)
[#0103] block_3d_conv_1/kernel:0 => (1, 1, 1024, 256)
[#0104] block_3d_bn_1/gamma:0 => (256,)
[#0105] block_3d_bn_1/beta:0 => (256,)
[#0106] block_3d_conv_2/kernel:0 => (3, 3, 256, 256)
[#0107] block_3d_bn_2/gamma:0 => (256,)
[#0108] block_3d_bn_2/beta:0 => (256,)
[#0109] block_3d_conv_3/kernel:0 => (1, 1, 256, 1024)
[#0110] block_3d_bn_3/gamma:0 => (1024,)
[#0111] block_3d_bn_3/beta:0 => (1024,)
[#0112] block_3e_conv_1/kernel:0 => (1, 1, 1024, 256)
[#0113] block_3e_bn_1/gamma:0 => (256,)
[#0114] block_3e_bn_1/beta:0 => (256,)
[#0115] block_3e_conv_2/kernel:0 => (3, 3, 256, 256)
[#0116] block_3e_bn_2/gamma:0 => (256,)
[#0117] block_3e_bn_2/beta:0 => (256,)
[#0118] block_3e_conv_3/kernel:0 => (1, 1, 256, 1024)
[#0119] block_3e_bn_3/gamma:0 => (1024,)
[#0120] block_3e_bn_3/beta:0 => (1024,)
[#0121] block_3f_conv_1/kernel:0 => (1, 1, 1024, 256)
[#0122] block_3f_bn_1/gamma:0 => (256,)
[#0123] block_3f_bn_1/beta:0 => (256,)
[#0124] block_3f_conv_2/kernel:0 => (3, 3, 256, 256)
[#0125] block_3f_bn_2/gamma:0 => (256,)
[#0126] block_3f_bn_2/beta:0 => (256,)
[#0127] block_3f_conv_3/kernel:0 => (1, 1, 256, 1024)
[#0128] block_3f_bn_3/gamma:0 => (1024,)
[#0129] block_3f_bn_3/beta:0 => (1024,)
[#0130] block_4a_conv_1/kernel:0 => (1, 1, 1024, 512)
[#0131] block_4a_bn_1/gamma:0 => (512,)
[#0132] block_4a_bn_1/beta:0 => (512,)
[#0133] block_4a_conv_2/kernel:0 => (3, 3, 512, 512)
[#0134] block_4a_bn_2/gamma:0 => (512,)
[#0135] block_4a_bn_2/beta:0 => (512,)
[#0136] block_4a_conv_3/kernel:0 => (1, 1, 512, 2048)
[#0137] block_4a_bn_3/gamma:0 => (2048,)
[#0138] block_4a_bn_3/beta:0 => (2048,)
[#0139] block_4a_conv_shortcut/kernel:0 => (1, 1, 1024, 2048)
[#0140] block_4a_bn_shortcut/gamma:0 => (2048,)
[#0141] block_4a_bn_shortcut/beta:0 => (2048,)
[#0142] block_4b_conv_1/kernel:0 => (1, 1, 2048, 512)
[#0143] block_4b_bn_1/gamma:0 => (512,)
[#0144] block_4b_bn_1/beta:0 => (512,)
[#0145] block_4b_conv_2/kernel:0 => (3, 3, 512, 512)
[#0146] block_4b_bn_2/gamma:0 => (512,)
[#0147] block_4b_bn_2/beta:0 => (512,)
[#0148] block_4b_conv_3/kernel:0 => (1, 1, 512, 2048)
[#0149] block_4b_bn_3/gamma:0 => (2048,)
[#0150] block_4b_bn_3/beta:0 => (2048,)
[#0151] block_4c_conv_1/kernel:0 => (1, 1, 2048, 512)
[#0152] block_4c_bn_1/gamma:0 => (512,)
[#0153] block_4c_bn_1/beta:0 => (512,)
[#0154] block_4c_conv_2/kernel:0 => (3, 3, 512, 512)
[#0155] block_4c_bn_2/gamma:0 => (512,)
[#0156] block_4c_bn_2/beta:0 => (512,)
[#0157] block_4c_conv_3/kernel:0 => (1, 1, 512, 2048)
[#0158] block_4c_bn_3/gamma:0 => (2048,)
[#0159] block_4c_bn_3/beta:0 => (2048,)
[#0160] l2/kernel:0 => (1, 1, 256, 256)
[#0161] l2/bias:0 => (256,)
[#0162] l3/kernel:0 => (1, 1, 512, 256)
[#0163] l3/bias:0 => (256,)
[#0164] l4/kernel:0 => (1, 1, 1024, 256)
[#0165] l4/bias:0 => (256,)
[#0166] l5/kernel:0 => (1, 1, 2048, 256)
[#0167] l5/bias:0 => (256,)
[#0168] post_hoc_d2/kernel:0 => (3, 3, 256, 256)
[#0169] post_hoc_d2/bias:0 => (256,)
[#0170] post_hoc_d3/kernel:0 => (3, 3, 256, 256)
[#0171] post_hoc_d3/bias:0 => (256,)
[#0172] post_hoc_d4/kernel:0 => (3, 3, 256, 256)
[#0173] post_hoc_d4/bias:0 => (256,)
[#0174] post_hoc_d5/kernel:0 => (3, 3, 256, 256)
[#0175] post_hoc_d5/bias:0 => (256,)
[#0176] rpn/kernel:0 => (3, 3, 256, 256)
[#0177] rpn/bias:0 => (256,)
[#0178] rpn-class/kernel:0 => (1, 1, 256, 3)
[#0179] rpn-class/bias:0 => (3,)
[#0180] rpn-box/kernel:0 => (1, 1, 256, 12)
[#0181] rpn-box/bias:0 => (12,)
[#0182] fc6/kernel:0 => (12544, 1024)
[#0183] fc6/bias:0 => (1024,)
[#0184] fc7/kernel:0 => (1024, 1024)
[#0185] fc7/bias:0 => (1024,)
[#0186] class-predict/kernel:0 => (1024, 38)
[#0187] class-predict/bias:0 => (38,)
[#0188] box-predict/kernel:0 => (1024, 152)
[#0189] box-predict/bias:0 => (152,)
[#0190] mask-conv-l0/kernel:0 => (3, 3, 256, 256)
[#0191] mask-conv-l0/bias:0 => (256,)
[#0192] mask-conv-l1/kernel:0 => (3, 3, 256, 256)
[#0193] mask-conv-l1/bias:0 => (256,)
[#0194] mask-conv-l2/kernel:0 => (3, 3, 256, 256)
[#0195] mask-conv-l2/bias:0 => (256,)
[#0196] mask-conv-l3/kernel:0 => (3, 3, 256, 256)
[#0197] mask-conv-l3/bias:0 => (256,)
[#0198] conv5-mask/kernel:0 => (2, 2, 256, 256)
[#0199] conv5-mask/bias:0 => (256,)
[#0200] mask_fcn_logits/kernel:0 => (1, 1, 256, 38)
[#0201] mask_fcn_logits/bias:0 => (38,)
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
# =============================================
# Start Training
# %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
Pretrained weights loaded with success...
Saving checkpoints for 0 into /workspace/tao-experiments/mask_rcnn/experiment_dir_unpruned/model.step-0.tlt.
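The parameter count of each variable in the TRAINABLE VARIABLES list above is simply the product of its printed shape dimensions. A minimal Python sketch of that check, using a handful of (name, shape) pairs copied verbatim from the log (the `shapes` dict is just an illustrative subset, not part of the log output):

```python
from math import prod

# A few (name, shape) pairs copied from the TRAINABLE VARIABLES list above.
shapes = {
    "conv1/kernel:0": (7, 7, 3, 64),
    "fc6/kernel:0": (12544, 1024),
    "class-predict/kernel:0": (1024, 38),
    "mask_fcn_logits/kernel:0": (1, 1, 256, 38),
}

# Each variable contributes prod(shape) weights to the model total.
counts = {name: prod(shape) for name, shape in shapes.items()}

for name, n in counts.items():
    print(f"{name}: {n:,} parameters")
# conv1/kernel:0 has 7 * 7 * 3 * 64 = 9,408 parameters, and so on.
```

Summing `prod(shape)` over all 201 listed variables would reproduce the total reported under MODEL STATISTICS.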