NVIDIA Tegra X2

Hello,

I am new to DIGITS and the TX2. I am following the tutorial at https://github.com/dusty-nv/jetson-inference

While creating a model, the training job fails. The output shown in DIGITS stops at the following point:

Memory required for data: 3268934784
Creating layer bbox_loss
Creating Layer bbox_loss
bbox_loss <- bboxes-obj-masked-norm
bbox_loss <- bbox-obj-label-norm
bbox_loss -> loss_bbox
Setting up bbox_loss
Top shape: (1)
with loss weight 2
Memory required for data: 3268934788
Creating layer coverage_loss
Creating Layer coverage_loss
coverage_loss <- coverage_coverage/sig_0_split_0
coverage_loss <- coverage-label_slice-label_4_split_0
coverage_loss -> loss_coverage
Setting up coverage_loss
Top shape: (1)
with loss weight 1
Memory required for data: 3268934792
Creating layer cluster

The job information shown in the left-hand panel is:

Job Directory: /home/nvidia/DIGITS/digits/jobs/20180816-161051-e67a
Disk Size: 0 B
Network (train/val): train_val.prototxt
Network (deploy): deploy.prototxt
Network (original): original.prototxt
Solver: solver.prototxt
Raw caffe output: caffe_output.log
Pretrained Model: /home/nvidia/bvlc_googlenet.caffemodel.4
Visualizations: Tensorboard

The error reported in the DIGITS server log is:

2018-08-16 16:10:53 [20180816-161051-e67a] [INFO ] Task subprocess args: "/home/nvidia/Caffe/caffe/build/tools/caffe train --solver=/home/nvidia/DIGITS/digits/jobs/20180816-161051-e67a/solver.prototxt --gpu=0 --weights=/home/nvidia/bvlc_googlenet.caffemodel.4"
2018-08-16 16:11:00 [20180816-161051-e67a] [ERROR] Train Caffe Model task failed with error code 1
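
In case it helps, I assume I can re-run the same command by hand (copied from the subprocess args above) so the full caffe console output is visible instead of just the exit code:

# re-run the DIGITS task command directly; all paths are taken verbatim from the log line above
/home/nvidia/Caffe/caffe/build/tools/caffe train \
    --solver=/home/nvidia/DIGITS/digits/jobs/20180816-161051-e67a/solver.prototxt \
    --gpu=0 \
    --weights=/home/nvidia/bvlc_googlenet.caffemodel.4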

I have no idea how to free up memory, since I have more than 2 GB available in the job directory.
Please help me. Thanks in advance.
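
For reference, this is how I am checking disk space versus memory on the TX2 (tegrastats is the JetPack monitoring utility; its exact location can vary between JetPack releases, so this is only a sketch):

df -h /home/nvidia/DIGITS/digits/jobs   # free disk space where the job directory lives
free -m                                 # free system RAM, which the GPU shares on the Tegra X2
sudo tegrastats                         # live RAM/GPU usage while the job runs (Ctrl+C to stop)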

The caffe_output.log is as follows:

I0816 16:10:53.665153 9341 upgrade_proto.cpp:1044] Attempting to upgrade input file specified using deprecated 'solver_type' field (enum): /home/nvidia/DIGITS/digits/jobs/20180816-161051-e67a/solver.prototxt
I0816 16:10:53.665993 9341 upgrade_proto.cpp:1051] Successfully upgraded file specified using deprecated 'solver_type' field (enum) to 'type' field (string).
W0816 16:10:53.666019 9341 upgrade_proto.cpp:1053] Note that future Caffe releases will only support 'type' field (string) for a solver's type.
I0816 16:10:53.803117 9341 caffe.cpp:197] Using GPUs 0
I0816 16:10:53.803254 9341 caffe.cpp:202] GPU 0: NVIDIA Tegra X2
I0816 16:10:56.484235 9341 solver.cpp:48] Initializing solver from parameters:
test_iter: 26
test_interval: 50
base_lr: 0.01
display: 6
max_iter: 2500
lr_policy: "exp"
gamma: 0.99795038
momentum: 0.9
weight_decay: 0.0001
snapshot: 50
snapshot_prefix: "snapshot"
solver_mode: GPU
device_id: 0
net: "train_val.prototxt"
type: "Adam"
I0816 16:10:56.485126 9341 solver.cpp:91] Creating training net from net file: train_val.prototxt
I0816 16:10:56.492863 9341 net.cpp:323] The NetState phase (0) differed from the phase (1) specified by a rule in layer val_data
I0816 16:10:56.492923 9341 net.cpp:323] The NetState phase (0) differed from the phase (1) specified by a rule in layer val_label
I0816 16:10:56.492951 9341 net.cpp:323] The NetState phase (0) differed from the phase (1) specified by a rule in layer val_transform
I0816 16:10:56.493413 9341 net.cpp:323] The NetState phase (0) differed from the phase (1) specified by a rule in layer cluster
I0816 16:10:56.493438 9341 net.cpp:323] The NetState phase (0) differed from the phase (1) specified by a rule in layer cluster_gt
I0816 16:10:56.493453 9341 net.cpp:323] The NetState phase (0) differed from the phase (1) specified by a rule in layer score
I0816 16:10:56.493468 9341 net.cpp:323] The NetState phase (0) differed from the phase (1) specified by a rule in layer mAP
I0816 16:10:56.493548 9341 net.cpp:52] Initializing net from parameters:
state {
phase: TRAIN
}
layer {
name: “train_data”
type: “Data”
top: “data”
include {
phase: TRAIN
}
data_param {
source: “/home/nvidia/DIGITS/digits/jobs/20180816-144514-175b/train_db/features”
batch_size: 10
backend: LMDB
}
}
layer {
name: “train_label”
type: “Data”
top: “label”
include {
phase: TRAIN
}
data_param {
source: “/home/nvidia/DIGITS/digits/jobs/20180816-144514-175b/train_db/labels”
batch_size: 10
backend: LMDB
}
}
layer {
name: “train_transform”
type: “DetectNetTransformation”
bottom: “data”
bottom: “label”
top: “transformed_data”
top: “transformed_label”
include {
phase: TRAIN
}
transform_param {
mean_value: 127
}
detectnet_groundtruth_param {
stride: 16
scale_cvg: 0.4
gridbox_type: GRIDBOX_MIN
min_cvg_len: 20
coverage_type: RECTANGULAR
image_size_x: 640
image_size_y: 640
obj_norm: true
crop_bboxes: false
object_class {
src: 1
dst: 0
}
}
detectnet_augmentation_param {
crop_prob: 1
shift_x: 32
shift_y: 32
scale_prob: 0.4
scale_min: 0.8
scale_max: 1.2
flip_prob: 0.5
rotation_prob: 0
max_rotate_degree: 5
hue_rotation_prob: 0.8
hue_rotation: 30
desaturation_prob: 0.8
desaturation_max: 0.8
}
}
layer {
name: “slice-label”
type: “Slice”
bottom: “transformed_label”
top: “foreground-label”
top: “bbox-label”
top: “size-label”
top: “obj-label”
top: “coverage-label”
slice_param {
slice_dim: 1
slice_point: 1
slice_point: 5
slice_point: 7
slice_point: 8
}
}
layer {
name: “coverage-block”
type: “Concat”
bottom: “foreground-label”
bottom: “foreground-label”
bottom: “foreground-label”
bottom: “foreground-label”
top: “coverage-block”
concat_param {
concat_dim: 1
}
}
layer {
name: “size-block”
type: “Concat”
bottom: “size-label”
bottom: “size-label”
top: “size-block”
concat_param {
concat_dim: 1
}
}
layer {
name: “obj-block”
type: “Concat”
bottom: “obj-label”
bottom: “obj-label”
bottom: “obj-label”
bottom: “obj-label”
top: “obj-block”
concat_param {
concat_dim: 1
}
}
layer {
name: “bb-label-norm”
type: “Eltwise”
bottom: “bbox-label”
bottom: “size-block”
top: “bbox-label-norm”
eltwise_param {
operation: PROD
}
}
layer {
name: “bb-obj-norm”
type: “Eltwise”
bottom: “bbox-label-norm”
bottom: “obj-block”
top: “bbox-obj-label-norm”
eltwise_param {
operation: PROD
}
}
layer {
name: “conv1/7x7_s2”
type: “Convolution”
bottom: “transformed_data”
top: “conv1/7x7_s2”
param {
lr_mult: 1
decay_mult: 1
}
param {
lr_mult: 2
decay_mult: 0
}
convolution_param {
num_output: 64
pad: 3
kernel_size: 7
stride: 2
weight_filler {
type: “xavier”
std: 0.1
}
bias_filler {
type: “constant”
value: 0.2
}
}
}
layer {
name: “conv1/relu_7x7”
type: “ReLU”
bottom: “conv1/7x7_s2”
top: “conv1/7x7_s2”
}
layer {
name: “pool1/3x3_s2”
type: “Pooling”
bottom: “conv1/7x7_s2”
top: “pool1/3x3_s2”
pooling_param {
pool: MAX
kernel_size: 3
stride: 2
}
}
layer {
name: “pool1/norm1”
type: “LRN”
bottom: “pool1/3x3_s2”
top: “pool1/norm1”
lrn_param {
local_size: 5
alpha: 0.0001
beta: 0.75
}
}
layer {
name: “conv2/3x3_reduce”
type: “Convolution”
bottom: “pool1/norm1”
top: “conv2/3x3_reduce”
param {
lr_mult: 1
decay_mult: 1
}
param {
lr_mult: 2
decay_mult: 0
}
convolution_param {
num_output: 64
kernel_size: 1
weight_filler {
type: “xavier”
std: 0.1
}
bias_filler {
type: “constant”
value: 0.2
}
}
}
layer {
name: “conv2/relu_3x3_reduce”
type: “ReLU”
bottom: “conv2/3x3_reduce”
top: “conv2/3x3_reduce”
}
layer {
name: “conv2/3x3”
type: “Convolution”
bottom: “conv2/3x3_reduce”
top: “conv2/3x3”
param {
lr_mult: 1
decay_mult: 1
}
param {
lr_mult: 2
decay_mult: 0
}
convolution_param {
num_output: 192
pad: 1
kernel_size: 3
weight_filler {
type: “xavier”
std: 0.03
}
bias_filler {
type: “constant”
value: 0.2
}
}
}
layer {
name: “conv2/relu_3x3”
type: “ReLU”
bottom: “conv2/3x3”
top: “conv2/3x3”
}
layer {
name: “conv2/norm2”
type: “LRN”
bottom: “conv2/3x3”
top: “conv2/norm2”
lrn_param {
local_size: 5
alpha: 0.0001
beta: 0.75
}
}
layer {
name: “pool2/3x3_s2”
type: “Pooling”
bottom: “conv2/norm2”
top: “pool2/3x3_s2”
pooling_param {
pool: MAX
kernel_size: 3
stride: 2
}
}
layer {
name: “inception_3a/1x1”
type: “Convolution”
bottom: “pool2/3x3_s2”
top: “inception_3a/1x1”
param {
lr_mult: 1
decay_mult: 1
}
param {
lr_mult: 2
decay_mult: 0
}
convolution_param {
num_output: 64
kernel_size: 1
weight_filler {
type: “xavier”
std: 0.03
}
bias_filler {
type: “constant”
value: 0.2
}
}
}
layer {
name: “inception_3a/relu_1x1”
type: “ReLU”
bottom: “inception_3a/1x1”
top: “inception_3a/1x1”
}
layer {
name: “inception_3a/3x3_reduce”
type: “Convolution”
bottom: “pool2/3x3_s2”
top: “inception_3a/3x3_reduce”
param {
lr_mult: 1
decay_mult: 1
}
param {
lr_mult: 2
decay_mult: 0
}
convolution_param {
num_output: 96
kernel_size: 1
weight_filler {
type: “xavier”
std: 0.09
}
bias_filler {
type: “constant”
value: 0.2
}
}
}
layer {
name: “inception_3a/relu_3x3_reduce”
type: “ReLU”
bottom: “inception_3a/3x3_reduce”
top: “inception_3a/3x3_reduce”
}
layer {
name: “inception_3a/3x3”
type: “Convolution”
bottom: “inception_3a/3x3_reduce”
top: “inception_3a/3x3”
param {
lr_mult: 1
decay_mult: 1
}
param {
lr_mult: 2
decay_mult: 0
}
convolution_param {
num_output: 128
pad: 1
kernel_size: 3
weight_filler {
type: “xavier”
std: 0.03
}
bias_filler {
type: “constant”
value: 0.2
}
}
}
layer {
name: “inception_3a/relu_3x3”
type: “ReLU”
bottom: “inception_3a/3x3”
top: “inception_3a/3x3”
}
layer {
name: “inception_3a/5x5_reduce”
type: “Convolution”
bottom: “pool2/3x3_s2”
top: “inception_3a/5x5_reduce”
param {
lr_mult: 1
decay_mult: 1
}
param {
lr_mult: 2
decay_mult: 0
}
convolution_param {
num_output: 16
kernel_size: 1
weight_filler {
type: “xavier”
std: 0.2
}
bias_filler {
type: “constant”
value: 0.2
}
}
}
layer {
name: “inception_3a/relu_5x5_reduce”
type: “ReLU”
bottom: “inception_3a/5x5_reduce”
top: “inception_3a/5x5_reduce”
}
layer {
name: “inception_3a/5x5”
type: “Convolution”
bottom: “inception_3a/5x5_reduce”
top: “inception_3a/5x5”
param {
lr_mult: 1
decay_mult: 1
}
param {
lr_mult: 2
decay_mult: 0
}
convolution_param {
num_output: 32
pad: 2
kernel_size: 5
weight_filler {
type: “xavier”
std: 0.03
}
bias_filler {
type: “constant”
value: 0.2
}
}
}
layer {
name: “inception_3a/relu_5x5”
type: “ReLU”
bottom: “inception_3a/5x5”
top: “inception_3a/5x5”
}
layer {
name: “inception_3a/pool”
type: “Pooling”
bottom: “pool2/3x3_s2”
top: “inception_3a/pool”
pooling_param {
pool: MAX
kernel_size: 3
stride: 1
pad: 1
}
}
layer {
name: “inception_3a/pool_proj”
type: “Convolution”
bottom: “inception_3a/pool”
top: “inception_3a/pool_proj”
param {
lr_mult: 1
decay_mult: 1
}
param {
lr_mult: 2
decay_mult: 0
}
convolution_param {
num_output: 32
kernel_size: 1
weight_filler {
type: “xavier”
std: 0.1
}
bias_filler {
type: “constant”
value: 0.2
}
}
}
layer {
name: “inception_3a/relu_pool_proj”
type: “ReLU”
bottom: “inception_3a/pool_proj”
top: “inception_3a/pool_proj”
}
layer {
name: “inception_3a/output”
type: “Concat”
bottom: “inception_3a/1x1”
bottom: “inception_3a/3x3”
bottom: “inception_3a/5x5”
bottom: “inception_3a/pool_proj”
top: “inception_3a/output”
}
layer {
name: “inception_3b/1x1”
type: “Convolution”
bottom: “inception_3a/output”
top: “inception_3b/1x1”
param {
lr_mult: 1
decay_mult: 1
}
param {
lr_mult: 2
decay_mult: 0
}
convolution_param {
num_output: 128
kernel_size: 1
weight_filler {
type: “xavier”
std: 0.03
}
bias_filler {
type: “constant”
value: 0.2
}
}
}
layer {
name: “inception_3b/relu_1x1”
type: “ReLU”
bottom: “inception_3b/1x1”
top: “inception_3b/1x1”
}
layer {
name: “inception_3b/3x3_reduce”
type: “Convolution”
bottom: “inception_3a/output”
top: “inception_3b/3x3_reduce”
param {
lr_mult: 1
decay_mult: 1
}
param {
lr_mult: 2
decay_mult: 0
}
convolution_param {
num_output: 128
kernel_size: 1
weight_filler {
type: “xavier”
std: 0.09
}
bias_filler {
type: “constant”
value: 0.2
}
}
}
layer {
name: “inception_3b/relu_3x3_reduce”
type: “ReLU”
bottom: “inception_3b/3x3_reduce”
top: “inception_3b/3x3_reduce”
}
layer {
name: “inception_3b/3x3”
type: “Convolution”
bottom: “inception_3b/3x3_reduce”
top: “inception_3b/3x3”
param {
lr_mult: 1
decay_mult: 1
}
param {
lr_mult: 2
decay_mult: 0
}
convolution_param {
num_output: 192
pad: 1
kernel_size: 3
weight_filler {
type: “xavier”
std: 0.03
}
bias_filler {
type: “constant”
value: 0.2
}
}
}
layer {
name: “inception_3b/relu_3x3”
type: “ReLU”
bottom: “inception_3b/3x3”
top: “inception_3b/3x3”
}
layer {
name: “inception_3b/5x5_reduce”
type: “Convolution”
bottom: “inception_3a/output”
top: “inception_3b/5x5_reduce”
param {
lr_mult: 1
decay_mult: 1
}
param {
lr_mult: 2
decay_mult: 0
}
convolution_param {
num_output: 32
kernel_size: 1
weight_filler {
type: “xavier”
std: 0.2
}
bias_filler {
type: “constant”
value: 0.2
}
}
}
layer {
name: “inception_3b/relu_5x5_reduce”
type: “ReLU”
bottom: “inception_3b/5x5_reduce”
top: “inception_3b/5x5_reduce”
}
layer {
name: “inception_3b/5x5”
type: “Convolution”
bottom: “inception_3b/5x5_reduce”
top: “inception_3b/5x5”
param {
lr_mult: 1
decay_mult: 1
}
param {
lr_mult: 2
decay_mult: 0
}
convolution_param {
num_output: 96
pad: 2
kernel_size: 5
weight_filler {
type: “xavier”
std: 0.03
}
bias_filler {
type: “constant”
value: 0.2
}
}
}
layer {
name: “inception_3b/relu_5x5”
type: “ReLU”
bottom: “inception_3b/5x5”
top: “inception_3b/5x5”
}
layer {
name: “inception_3b/pool”
type: “Pooling”
bottom: “inception_3a/output”
top: “inception_3b/pool”
pooling_param {
pool: MAX
kernel_size: 3
stride: 1
pad: 1
}
}
layer {
name: “inception_3b/pool_proj”
type: “Convolution”
bottom: “inception_3b/pool”
top: “inception_3b/pool_proj”
param {
lr_mult: 1
decay_mult: 1
}
param {
lr_mult: 2
decay_mult: 0
}
convolution_param {
num_output: 64
kernel_size: 1
weight_filler {
type: “xavier”
std: 0.1
}
bias_filler {
type: “constant”
value: 0.2
}
}
}
layer {
name: “inception_3b/relu_pool_proj”
type: “ReLU”
bottom: “inception_3b/pool_proj”
top: “inception_3b/pool_proj”
}
layer {
name: “inception_3b/output”
type: “Concat”
bottom: “inception_3b/1x1”
bottom: “inception_3b/3x3”
bottom: “inception_3b/5x5”
bottom: “inception_3b/pool_proj”
top: “inception_3b/output”
}
layer {
name: “pool3/3x3_s2”
type: “Pooling”
bottom: “inception_3b/output”
top: “pool3/3x3_s2”
pooling_param {
pool: MAX
kernel_size: 3
stride: 2
}
}
layer {
name: “inception_4a/1x1”
type: “Convolution”
bottom: “pool3/3x3_s2”
top: “inception_4a/1x1”
param {
lr_mult: 1
decay_mult: 1
}
param {
lr_mult: 2
decay_mult: 0
}
convolution_param {
num_output: 192
kernel_size: 1
weight_filler {
type: “xavier”
std: 0.03
}
bias_filler {
type: “constant”
value: 0.2
}
}
}
layer {
name: “inception_4a/relu_1x1”
type: “ReLU”
bottom: “inception_4a/1x1”
top: “inception_4a/1x1”
}
layer {
name: “inception_4a/3x3_reduce”
type: “Convolution”
bottom: “pool3/3x3_s2”
top: “inception_4a/3x3_reduce”
param {
lr_mult: 1
decay_mult: 1
}
param {
lr_mult: 2
decay_mult: 0
}
convolution_param {
num_output: 96
kernel_size: 1
weight_filler {
type: “xavier”
std: 0.09
}
bias_filler {
type: “constant”
value: 0.2
}
}
}
layer {
name: “inception_4a/relu_3x3_reduce”
type: “ReLU”
bottom: “inception_4a/3x3_reduce”
top: “inception_4a/3x3_reduce”
}
layer {
name: “inception_4a/3x3”
type: “Convolution”
bottom: “inception_4a/3x3_reduce”
top: “inception_4a/3x3”
param {
lr_mult: 1
decay_mult: 1
}
param {
lr_mult: 2
decay_mult: 0
}
convolution_param {
num_output: 208
pad: 1
kernel_size: 3
weight_filler {
type: “xavier”
std: 0.03
}
bias_filler {
type: “constant”
value: 0.2
}
}
}
layer {
name: “inception_4a/relu_3x3”
type: “ReLU”
bottom: “inception_4a/3x3”
top: “inception_4a/3x3”
}
layer {
name: “inception_4a/5x5_reduce”
type: “Convolution”
bottom: “pool3/3x3_s2”
top: “inception_4a/5x5_reduce”
param {
lr_mult: 1
decay_mult: 1
}
param {
lr_mult: 2
decay_mult: 0
}
convolution_param {
num_output: 16
kernel_size: 1
weight_filler {
type: “xavier”
std: 0.2
}
bias_filler {
type: “constant”
value: 0.2
}
}
}
layer {
name: “inception_4a/relu_5x5_reduce”
type: “ReLU”
bottom: “inception_4a/5x5_reduce”
top: “inception_4a/5x5_reduce”
}
layer {
name: “inception_4a/5x5”
type: “Convolution”
bottom: “inception_4a/5x5_reduce”
top: “inception_4a/5x5”
param {
lr_mult: 1
decay_mult: 1
}
param {
lr_mult: 2
decay_mult: 0
}
convolution_param {
num_output: 48
pad: 2
kernel_size: 5
weight_filler {
type: “xavier”
std: 0.03
}
bias_filler {
type: “constant”
value: 0.2
}
}
}
layer {
name: “inception_4a/relu_5x5”
type: “ReLU”
bottom: “inception_4a/5x5”
top: “inception_4a/5x5”
}
layer {
name: “inception_4a/pool”
type: “Pooling”
bottom: “pool3/3x3_s2”
top: “inception_4a/pool”
pooling_param {
pool: MAX
kernel_size: 3
stride: 1
pad: 1
}
}
layer {
name: “inception_4a/pool_proj”
type: “Convolution”
bottom: “inception_4a/pool”
top: “inception_4a/pool_proj”
param {
lr_mult: 1
decay_mult: 1
}
param {
lr_mult: 2
decay_mult: 0
}
convolution_param {
num_output: 64
kernel_size: 1
weight_filler {
type: “xavier”
std: 0.1
}
bias_filler {
type: “constant”
value: 0.2
}
}
}
layer {
name: “inception_4a/relu_pool_proj”
type: “ReLU”
bottom: “inception_4a/pool_proj”
top: “inception_4a/pool_proj”
}
layer {
name: “inception_4a/output”
type: “Concat”
bottom: “inception_4a/1x1”
bottom: “inception_4a/3x3”
bottom: “inception_4a/5x5”
bottom: “inception_4a/pool_proj”
top: “inception_4a/output”
}
layer {
name: “inception_4b/1x1”
type: “Convolution”
bottom: “inception_4a/output”
top: “inception_4b/1x1”
param {
lr_mult: 1
decay_mult: 1
}
param {
lr_mult: 2
decay_mult: 0
}
convolution_param {
num_output: 160
kernel_size: 1
weight_filler {
type: “xavier”
std: 0.03
}
bias_filler {
type: “constant”
value: 0.2
}
}
}
layer {
name: “inception_4b/relu_1x1”
type: “ReLU”
bottom: “inception_4b/1x1”
top: “inception_4b/1x1”
}
layer {
name: “inception_4b/3x3_reduce”
type: “Convolution”
bottom: “inception_4a/output”
top: “inception_4b/3x3_reduce”
param {
lr_mult: 1
decay_mult: 1
}
param {
lr_mult: 2
decay_mult: 0
}
convolution_param {
num_output: 112
kernel_size: 1
weight_filler {
type: “xavier”
std: 0.09
}
bias_filler {
type: “constant”
value: 0.2
}
}
}
layer {
name: “inception_4b/relu_3x3_reduce”
type: “ReLU”
bottom: “inception_4b/3x3_reduce”
top: “inception_4b/3x3_reduce”
}
layer {
name: “inception_4b/3x3”
type: “Convolution”
bottom: “inception_4b/3x3_reduce”
top: “inception_4b/3x3”
param {
lr_mult: 1
decay_mult: 1
}
param {
lr_mult: 2
decay_mult: 0
}
convolution_param {
num_output: 224
pad: 1
kernel_size: 3
weight_filler {
type: “xavier”
std: 0.03
}
bias_filler {
type: “constant”
value: 0.2
}
}
}
layer {
name: “inception_4b/relu_3x3”
type: “ReLU”
bottom: “inception_4b/3x3”
top: “inception_4b/3x3”
}
layer {
name: “inception_4b/5x5_reduce”
type: “Convolution”
bottom: “inception_4a/output”
top: “inception_4b/5x5_reduce”
param {
lr_mult: 1
decay_mult: 1
}
param {
lr_mult: 2
decay_mult: 0
}
convolution_param {
num_output: 24
kernel_size: 1
weight_filler {
type: “xavier”
std: 0.2
}
bias_filler {
type: “constant”
value: 0.2
}
}
}
layer {
name: “inception_4b/relu_5x5_reduce”
type: “ReLU”
bottom: “inception_4b/5x5_reduce”
top: “inception_4b/5x5_reduce”
}
layer {
name: “inception_4b/5x5”
type: “Convolution”
bottom: “inception_4b/5x5_reduce”
top: “inception_4b/5x5”
param {
lr_mult: 1
decay_mult: 1
}
param {
lr_mult: 2
decay_mult: 0
}
convolution_param {
num_output: 64
pad: 2
kernel_size: 5
weight_filler {
type: “xavier”
std: 0.03
}
bias_filler {
type: “constant”
value: 0.2
}
}
}
layer {
name: “inception_4b/relu_5x5”
type: “ReLU”
bottom: “inception_4b/5x5”
top: “inception_4b/5x5”
}
layer {
name: “inception_4b/pool”
type: “Pooling”
bottom: “inception_4a/output”
top: “inception_4b/pool”
pooling_param {
pool: MAX
kernel_size: 3
stride: 1
pad: 1
}
}
layer {
name: “inception_4b/pool_proj”
type: “Convolution”
bottom: “inception_4b/pool”
top: “inception_4b/pool_proj”
param {
lr_mult: 1
decay_mult: 1
}
param {
lr_mult: 2
decay_mult: 0
}
convolution_param {
num_output: 64
kernel_size: 1
weight_filler {
type: “xavier”
std: 0.1
}
bias_filler {
type: “constant”
value: 0.2
}
}
}
layer {
name: “inception_4b/relu_pool_proj”
type: “ReLU”
bottom: “inception_4b/pool_proj”
top: “inception_4b/pool_proj”
}
layer {
name: “inception_4b/output”
type: “Concat”
bottom: “inception_4b/1x1”
bottom: “inception_4b/3x3”
bottom: “inception_4b/5x5”
bottom: “inception_4b/pool_proj”
top: “inception_4b/output”
}
layer {
name: “inception_4c/1x1”
type: “Convolution”
bottom: “inception_4b/output”
top: “inception_4c/1x1”
param {
lr_mult: 1
decay_mult: 1
}
param {
lr_mult: 2
decay_mult: 0
}
convolution_param {
num_output: 128
kernel_size: 1
weight_filler {
type: “xavier”
std: 0.03
}
bias_filler {
type: “constant”
value: 0.2
}
}
}
layer {
name: “inception_4c/relu_1x1”
type: “ReLU”
bottom: “inception_4c/1x1”
top: “inception_4c/1x1”
}
layer {
name: “inception_4c/3x3_reduce”
type: “Convolution”
bottom: “inception_4b/output”
top: “inception_4c/3x3_reduce”
param {
lr_mult: 1
decay_mult: 1
}
param {
lr_mult: 2
decay_mult: 0
}
convolution_param {
num_output: 128
kernel_size: 1
weight_filler {
type: “xavier”
std: 0.09
}
bias_filler {
type: “constant”
value: 0.2
}
}
}
layer {
name: “inception_4c/relu_3x3_reduce”
type: “ReLU”
bottom: “inception_4c/3x3_reduce”
top: “inception_4c/3x3_reduce”
}
layer {
name: “inception_4c/3x3”
type: “Convolution”
bottom: “inception_4c/3x3_reduce”
top: “inception_4c/3x3”
param {
lr_mult: 1
decay_mult: 1
}
param {
lr_mult: 2
decay_mult: 0
}
convolution_param {
num_output: 256
pad: 1
kernel_size: 3
weight_filler {
type: “xavier”
std: 0.03
}
bias_filler {
type: “constant”
value: 0.2
}
}
}
layer {
name: “inception_4c/relu_3x3”
type: “ReLU”
bottom: “inception_4c/3x3”
top: “inception_4c/3x3”
}
layer {
name: “inception_4c/5x5_reduce”
type: “Convolution”
bottom: “inception_4b/output”
top: “inception_4c/5x5_reduce”
param {
lr_mult: 1
decay_mult: 1
}
param {
lr_mult: 2
decay_mult: 0
}
convolution_param {
num_output: 24
kernel_size: 1
weight_filler {
type: “xavier”
std: 0.2
}
bias_filler {
type: “constant”
value: 0.2
}
}
}
layer {
name: “inception_4c/relu_5x5_reduce”
type: “ReLU”
bottom: “inception_4c/5x5_reduce”
top: “inception_4c/5x5_reduce”
}
layer {
name: “inception_4c/5x5”
type: “Convolution”
bottom: “inception_4c/5x5_reduce”
top: “inception_4c/5x5”
param {
lr_mult: 1
decay_mult: 1
}
param {
lr_mult: 2
decay_mult: 0
}
convolution_param {
num_output: 64
pad: 2
kernel_size: 5
weight_filler {
type: “xavier”
std: 0.03
}
bias_filler {
type: “constant”
value: 0.2
}
}
}
layer {
name: “inception_4c/relu_5x5”
type: “ReLU”
bottom: “inception_4c/5x5”
top: “inception_4c/5x5”
}
layer {
name: “inception_4c/pool”
type: “Pooling”
bottom: “inception_4b/output”
top: “inception_4c/pool”
pooling_param {
pool: MAX
kernel_size: 3
stride: 1
pad: 1
}
}
layer {
name: “inception_4c/pool_proj”
type: “Convolution”
bottom: “inception_4c/pool”
top: “inception_4c/pool_proj”
param {
lr_mult: 1
decay_mult: 1
}
param {
lr_mult: 2
decay_mult: 0
}
convolution_param {
num_output: 64
kernel_size: 1
weight_filler {
type: “xavier”
std: 0.1
}
bias_filler {
type: “constant”
value: 0.2
}
}
}
layer {
name: “inception_4c/relu_pool_proj”
type: “ReLU”
bottom: “inception_4c/pool_proj”
top: “inception_4c/pool_proj”
}
layer {
name: “inception_4c/output”
type: “Concat”
bottom: “inception_4c/1x1”
bottom: “inception_4c/3x3”
bottom: “inception_4c/5x5”
bottom: “inception_4c/pool_proj”
top: “inception_4c/output”
}
layer {
name: “inception_4d/1x1”
type: “Convolution”
bottom: “inception_4c/output”
top: “inception_4d/1x1”
param {
lr_mult: 1
decay_mult: 1
}
param {
lr_mult: 2
decay_mult: 0
}
convolution_param {
num_output: 112
kernel_size: 1
weight_filler {
type: “xavier”
std: 0.1
}
bias_filler {
type: “constant”
value: 0.2
}
}
}
layer {
name: “inception_4d/relu_1x1”
type: “ReLU”
bottom: “inception_4d/1x1”
top: “inception_4d/1x1”
}
layer {
name: “inception_4d/3x3_reduce”
type: “Convolution”
bottom: “inception_4c/output”
top: “inception_4d/3x3_reduce”
param {
lr_mult: 1
decay_mult: 1
}
param {
lr_mult: 2
decay_mult: 0
}
convolution_param {
num_output: 144
kernel_size: 1
weight_filler {
type: “xavier”
std: 0.1
}
bias_filler {
type: “constant”
value: 0.2
}
}
}
layer {
name: “inception_4d/relu_3x3_reduce”
type: “ReLU”
bottom: “inception_4d/3x3_reduce”
top: “inception_4d/3x3_reduce”
}
layer {
name: “inception_4d/3x3”
type: “Convolution”
bottom: “inception_4d/3x3_reduce”
top: “inception_4d/3x3”
param {
lr_mult: 1
decay_mult: 1
}
param {
lr_mult: 2
decay_mult: 0
}
convolution_param {
num_output: 288
pad: 1
kernel_size: 3
weight_filler {
type: “xavier”
std: 0.1
}
bias_filler {
type: “constant”
value: 0.2
}
}
}
layer {
name: “inception_4d/relu_3x3”
type: “ReLU”
bottom: “inception_4d/3x3”
top: “inception_4d/3x3”
}
layer {
name: “inception_4d/5x5_reduce”
type: “Convolution”
bottom: “inception_4c/output”
top: “inception_4d/5x5_reduce”
param {
lr_mult: 1
decay_mult: 1
}
param {
lr_mult: 2
decay_mult: 0
}
convolution_param {
num_output: 32
kernel_size: 1
weight_filler {
type: “xavier”
std: 0.1
}
bias_filler {
type: “constant”
value: 0.2
}
}
}
layer {
name: “inception_4d/relu_5x5_reduce”
type: “ReLU”
bottom: “inception_4d/5x5_reduce”
top: “inception_4d/5x5_reduce”
}
layer {
name: “inception_4d/5x5”
type: “Convolution”
bottom: “inception_4d/5x5_reduce”
top: “inception_4d/5x5”
param {
lr_mult: 1
decay_mult: 1
}
param {
lr_mult: 2
decay_mult: 0
}
convolution_param {
num_output: 64
pad: 2
kernel_size: 5
weight_filler {
type: “xavier”
std: 0.1
}
bias_filler {
type: “constant”
value: 0.2
}
}
}
layer {
name: “inception_4d/relu_5x5”
type: “ReLU”
bottom: “inception_4d/5x5”
top: “inception_4d/5x5”
}
layer {
name: “inception_4d/pool”
type: “Pooling”
bottom: “inception_4c/output”
top: “inception_4d/pool”
pooling_param {
pool: MAX
kernel_size: 3
stride: 1
pad: 1
}
}
layer {
name: “inception_4d/pool_proj”
type: “Convolution”
bottom: “inception_4d/pool”
top: “inception_4d/pool_proj”
param {
lr_mult: 1
decay_mult: 1
}
param {
lr_mult: 2
decay_mult: 0
}
convolution_param {
num_output: 64
kernel_size: 1
weight_filler {
type: “xavier”
std: 0.1
}
bias_filler {
type: “constant”
value: 0.2
}
}
}
layer {
name: “inception_4d/relu_pool_proj”
type: “ReLU”
bottom: “inception_4d/pool_proj”
top: “inception_4d/pool_proj”
}
layer {
name: “inception_4d/output”
type: “Concat”
bottom: “inception_4d/1x1”
bottom: “inception_4d/3x3”
bottom: “inception_4d/5x5”
bottom: “inception_4d/pool_proj”
top: “inception_4d/output”
}
layer {
name: “inception_4e/1x1”
type: “Convolution”
bottom: “inception_4d/output”
top: “inception_4e/1x1”
param {
lr_mult: 1
decay_mult: 1
}
param {
lr_mult: 2
decay_mult: 0
}
convolution_param {
num_output: 256
kernel_size: 1
weight_filler {
type: “xavier”
std: 0.03
}
bias_filler {
type: “constant”
value: 0.2
}
}
}
layer {
name: “inception_4e/relu_1x1”
type: “ReLU”
bottom: “inception_4e/1x1”
top: “inception_4e/1x1”
}
layer {
name: “inception_4e/3x3_reduce”
type: “Convolution”
bottom: “inception_4d/output”
top: “inception_4e/3x3_reduce”
param {
lr_mult: 1
decay_mult: 1
}
param {
lr_mult: 2
decay_mult: 0
}
convolution_param {
num_output: 160
kernel_size: 1
weight_filler {
type: “xavier”
std: 0.09
}
bias_filler {
type: “constant”
value: 0.2
}
}
}
layer {
name: “inception_4e/relu_3x3_reduce”
type: “ReLU”
bottom: “inception_4e/3x3_reduce”
top: “inception_4e/3x3_reduce”
}
layer {
name: “inception_4e/3x3”
type: “Convolution”
bottom: “inception_4e/3x3_reduce”
top: “inception_4e/3x3”
param {
lr_mult: 1
decay_mult: 1
}
param {
lr_mult: 2
decay_mult: 0
}
convolution_param {
num_output: 320
pad: 1
kernel_size: 3
weight_filler {
type: “xavier”
std: 0.03
}
bias_filler {
type: “constant”
value: 0.2
}
}
}
layer {
name: “inception_4e/relu_3x3”
type: “ReLU”
bottom: “inception_4e/3x3”
top: “inception_4e/3x3”
}
layer {
name: “inception_4e/5x5_reduce”
type: “Convolution”
bottom: “inception_4d/output”
top: “inception_4e/5x5_reduce”
param {
lr_mult: 1
decay_mult: 1
}
param {
lr_mult: 2
decay_mult: 0
}
convolution_param {
num_output: 32
kernel_size: 1
weight_filler {
type: “xavier”
std: 0.2
}
bias_filler {
type: “constant”
value: 0.2
}
}
}
layer {
name: “inception_4e/relu_5x5_reduce”
type: “ReLU”
bottom: “inception_4e/5x5_reduce”
top: “inception_4e/5x5_reduce”
}
layer {
name: “inception_4e/5x5”
type: “Convolution”
bottom: “inception_4e/5x5_reduce”
top: “inception_4e/5x5”
param {
lr_mult: 1
decay_mult: 1
}
param {
lr_mult: 2
decay_mult: 0
}
convolution_param {
num_output: 128
pad: 2
kernel_size: 5
weight_filler {
type: “xavier”
std: 0.03
}
bias_filler {
type: “constant”
value: 0.2
}
}
}
layer {
name: “inception_4e/relu_5x5”
type: “ReLU”
bottom: “inception_4e/5x5”
top: “inception_4e/5x5”
}
layer {
name: “inception_4e/pool”
type: “Pooling”
bottom: “inception_4d/output”
top: “inception_4e/pool”
pooling_param {
pool: MAX
kernel_size: 3
stride: 1
pad: 1
}
}
layer {
name: “inception_4e/pool_proj”
type: “Convolution”
bottom: “inception_4e/pool”
top: “inception_4e/pool_proj”
param {
lr_mult: 1
decay_mult: 1
}
param {
lr_mult: 2
decay_mult: 0
}
convolution_param {
num_output: 128
kernel_size: 1
weight_filler {
type: “xavier”
std: 0.1
}
bias_filler {
type: “constant”
value: 0.2
}
}
}
layer {
name: “inception_4e/relu_pool_proj”
type: “ReLU”
bottom: “inception_4e/pool_proj”
top: “inception_4e/pool_proj”
}
layer {
I0816 16:10:56.502926 9341 layer_factory.hpp:77] Creating layer train_data
I0816 16:10:56.504567 9341 net.cpp:94] Creating Layer train_data
I0816 16:10:56.504627 9341 net.cpp:409] train_data -> data
I0816 16:10:56.506983 9348 db_lmdb.cpp:35] Opened lmdb /home/nvidia/DIGITS/digits/jobs/20180816-144514-175b/train_db/features
I0816 16:10:56.531487 9341 data_layer.cpp:78] ReshapePrefetch 10, 3, 640, 640
I0816 16:10:56.531692 9341 data_layer.cpp:83] output data size: 10,3,640,640
I0816 16:10:56.816025 9341 net.cpp:144] Setting up train_data
I0816 16:10:56.816148 9341 net.cpp:151] Top shape: 10 3 640 640 (12288000)
I0816 16:10:56.816201 9341 net.cpp:159] Memory required for data: 49152000
I0816 16:10:56.816262 9341 layer_factory.hpp:77] Creating layer train_label
I0816 16:10:56.817281 9341 net.cpp:94] Creating Layer train_label
I0816 16:10:56.817339 9341 net.cpp:409] train_label -> label
I0816 16:10:56.823161 9357 db_lmdb.cpp:35] Opened lmdb /home/nvidia/DIGITS/digits/jobs/20180816-144514-175b/train_db/labels
I0816 16:10:56.823688 9341 data_layer.cpp:78] ReshapePrefetch 10, 1, 52, 16
I0816 16:10:56.823926 9341 data_layer.cpp:83] output data size: 10,1,52,16
I0816 16:10:56.826773 9341 net.cpp:144] Setting up train_label
I0816 16:10:56.826876 9341 net.cpp:151] Top shape: 10 1 52 16 (8320)
I0816 16:10:56.826917 9341 net.cpp:159] Memory required for data: 49185280
I0816 16:10:56.826946 9341 layer_factory.hpp:77] Creating layer train_transform
I0816 16:10:56.827100 9341 net.cpp:94] Creating Layer train_transform
I0816 16:10:56.827136 9341 net.cpp:435] train_transform <- data
I0816 16:10:56.827177 9341 net.cpp:435] train_transform <- label
I0816 16:10:56.827226 9341 net.cpp:409] train_transform -> transformed_data
I0816 16:10:56.827304 9341 net.cpp:409] train_transform -> transformed_label
I0816 16:10:56.832087 9341 net.cpp:144] Setting up train_transform
I0816 16:10:56.832208 9341 net.cpp:151] Top shape: 10 3 640 640 (12288000)
I0816 16:10:56.832250 9341 net.cpp:151] Top shape: 10 9 40 40 (144000)
I0816 16:10:56.832278 9341 net.cpp:159] Memory required for data: 98913280
I0816 16:10:56.832303 9341 layer_factory.hpp:77] Creating layer slice-label
I0816 16:10:56.832346 9341 net.cpp:94] Creating Layer slice-label
I0816 16:10:56.832370 9341 net.cpp:435] slice-label <- transformed_label
I0816 16:10:56.832404 9341 net.cpp:409] slice-label -> foreground-label
I0816 16:10:56.832561 9341 net.cpp:409] slice-label -> bbox-label
I0816 16:10:56.832656 9341 net.cpp:409] slice-label -> size-label
I0816 16:10:56.832834 9341 net.cpp:409] slice-label -> obj-label
I0816 16:10:56.832973 9341 net.cpp:409] slice-label -> coverage-label
I0816 16:10:56.835383 9341 net.cpp:144] Setting up slice-label
I0816 16:10:56.835424 9341 net.cpp:151] Top shape: 10 1 40 40 (16000)
I0816 16:10:56.835461 9341 net.cpp:151] Top shape: 10 4 40 40 (64000)
I0816 16:10:56.835489 9341 net.cpp:151] Top shape: 10 2 40 40 (32000)
I0816 16:10:56.835515 9341 net.cpp:151] Top shape: 10 1 40 40 (16000)
I0816 16:10:56.835541 9341 net.cpp:151] Top shape: 10 1 40 40 (16000)
I0816 16:10:56.835566 9341 net.cpp:159] Memory required for data: 99489280
I0816 16:10:56.835589 9341 layer_factory.hpp:77] Creating layer foreground-label_slice-label_0_split
I0816 16:10:56.835633 9341 net.cpp:94] Creating Layer foreground-label_slice-label_0_split
I0816 16:10:56.835655 9341 net.cpp:435] foreground-label_slice-label_0_split <- foreground-label
I0816 16:10:56.835690 9341 net.cpp:409] foreground-label_slice-label_0_split -> foreground-label_slice-label_0_split_0
I0816 16:10:56.835736 9341 net.cpp:409] foreground-label_slice-label_0_split -> foreground-label_slice-label_0_split_1
I0816 16:10:56.835774 9341 net.cpp:409] foreground-label_slice-label_0_split -> foreground-label_slice-label_0_split_2
I0816 16:10:56.835811 9341 net.cpp:409] foreground-label_slice-label_0_split -> foreground-label_slice-label_0_split_3
I0816 16:10:56.836383 9341 net.cpp:144] Setting up foreground-label_slice-label_0_split
I0816 16:10:56.836416 9341 net.cpp:151] Top shape: 10 1 40 40 (16000)
I0816 16:10:56.836447 9341 net.cpp:151] Top shape: 10 1 40 40 (16000)
I0816 16:10:56.836473 9341 net.cpp:151] Top shape: 10 1 40 40 (16000)
I0816 16:10:56.836498 9341 net.cpp:151] Top shape: 10 1 40 40 (16000)
I0816 16:10:56.836563 9341 net.cpp:159] Memory required for data: 99745280
I0816 16:10:56.836589 9341 layer_factory.hpp:77] Creating layer size-label_slice-label_2_split
I0816 16:10:56.836621 9341 net.cpp:94] Creating Layer size-label_slice-label_2_split
I0816 16:10:56.836695 9341 net.cpp:435] size-label_slice-label_2_split <- size-label
I0816 16:10:56.836729 9341 net.cpp:409] size-label_slice-label_2_split -> size-label_slice-label_2_split_0
I0816 16:10:56.836768 9341 net.cpp:409] size-label_slice-label_2_split -> size-label_slice-label_2_split_1
I0816 16:10:56.840402 9341 net.cpp:144] Setting up size-label_slice-label_2_split
I0816 16:10:56.840435 9341 net.cpp:151] Top shape: 10 2 40 40 (32000)
I0816 16:10:56.840466 9341 net.cpp:151] Top shape: 10 2 40 40 (32000)
I0816 16:10:56.840636 9341 net.cpp:159] Memory required for data: 100001280
I0816 16:10:56.840662 9341 layer_factory.hpp:77] Creating layer obj-label_slice-label_3_split
I0816 16:10:56.840695 9341 net.cpp:94] Creating Layer obj-label_slice-label_3_split
I0816 16:10:56.840718 9341 net.cpp:435] obj-label_slice-label_3_split <- obj-label
I0816 16:10:56.840749 9341 net.cpp:409] obj-label_slice-label_3_split -> obj-label_slice-label_3_split_0
I0816 16:10:56.840790 9341 net.cpp:409] obj-label_slice-label_3_split -> obj-label_slice-label_3_split_1
I0816 16:10:56.840844 9341 net.cpp:409] obj-label_slice-label_3_split -> obj-label_slice-label_3_split_2
I0816 16:10:56.840883 9341 net.cpp:409] obj-label_slice-label_3_split -> obj-label_slice-label_3_split_3
I0816 16:10:56.846215 9341 net.cpp:144] Setting up obj-label_slice-label_3_split
I0816 16:10:56.846266 9341 net.cpp:151] Top shape: 10 1 40 40 (16000)
I0816 16:10:56.846293 9341 net.cpp:151] Top shape: 10 1 40 40 (16000)
I0816 16:10:56.846312 9341 net.cpp:151] Top shape: 10 1 40 40 (16000)
I0816 16:10:56.846326 9341 net.cpp:151] Top shape: 10 1 40 40 (16000)
I0816 16:10:56.846341 9341 net.cpp:159] Memory required for data: 100257280
I0816 16:10:56.846359 9341 layer_factory.hpp:77] Creating layer coverage-block
I0816 16:10:56.846397 9341 net.cpp:94] Creating Layer coverage-block
I0816 16:10:56.846424 9341 net.cpp:435] coverage-block <- foreground-label_slice-label_0_split_0
I0816 16:10:56.846451 9341 net.cpp:435] coverage-block <- foreground-label_slice-label_0_split_1
I0816 16:10:56.846467 9341 net.cpp:435] coverage-block <- foreground-label_slice-label_0_split_2
I0816 16:10:56.846483 9341 net.cpp:435] coverage-block <- foreground-label_slice-label_0_split_3
I0816 16:10:56.846504 9341 net.cpp:409] coverage-block -> coverage-block
I0816 16:10:56.846693 9341 net.cpp:144] Setting up coverage-block
I0816 16:10:56.846722 9341 net.cpp:151] Top shape: 10 4 40 40 (64000)
I0816 16:10:56.846740 9341 net.cpp:159] Memory required for data: 100513280
I0816 16:10:56.846755 9341 layer_factory.hpp:77] Creating layer size-block
I0816 16:10:56.846791 9341 net.cpp:94] Creating Layer size-block
I0816 16:10:56.846809 9341 net.cpp:435] size-block <- size-label_slice-label_2_split_0
I0816 16:10:56.846832 9341 net.cpp:435] size-block <- size-label_slice-label_2_split_1
I0816 16:10:56.846863 9341 net.cpp:409] size-block -> size-block
I0816 16:10:56.848335 9341 net.cpp:144] Setting up size-block
I0816 16:10:56.848417 9341 net.cpp:151] Top shape: 10 4 40 40 (64000)
I0816 16:10:56.848444 9341 net.cpp:159] Memory required for data: 100769280
I0816 16:10:56.848459 9341 layer_factory.hpp:77] Creating layer size-block_size-block_0_split
I0816 16:10:56.848670 9341 net.cpp:94] Creating Layer size-block_size-block_0_split
I0816 16:10:56.848737 9341 net.cpp:435] size-block_size-block_0_split <- size-block
I0816 16:10:56.848793 9341 net.cpp:409] size-block_size-block_0_split -> size-block_size-block_0_split_0
I0816 16:10:56.848932 9341 net.cpp:409] size-block_size-block_0_split -> size-block_size-block_0_split_1
I0816 16:10:56.853925 9341 net.cpp:144] Setting up size-block_size-block_0_split
I0816 16:10:56.854033 9341 net.cpp:151] Top shape: 10 4 40 40 (64000)
I0816 16:10:56.854252 9341 net.cpp:151] Top shape: 10 4 40 40 (64000)
I0816 16:10:56.854326 9341 net.cpp:159] Memory required for data: 101281280
I0816 16:10:56.854385 9341 layer_factory.hpp:77] Creating layer obj-block
I0816 16:10:56.854488 9341 net.cpp:94] Creating Layer obj-block
I0816 16:10:56.854549 9341 net.cpp:435] obj-block <- obj-label_slice-label_3_split_0
I0816 16:10:56.854638 9341 net.cpp:435] obj-block <- obj-label_slice-label_3_split_1
I0816 16:10:56.854710 9341 net.cpp:435] obj-block <- obj-label_slice-label_3_split_2
I0816 16:10:56.854915 9341 net.cpp:435] obj-block <- obj-label_slice-label_3_split_3
I0816 16:10:56.855101 9341 net.cpp:409] obj-block -> obj-block
I0816 16:10:56.855409 9341 net.cpp:144] Setting up obj-block
I0816 16:10:56.855479 9341 net.cpp:151] Top shape: 10 4 40 40 (64000)
I0816 16:10:56.855737 9341 net.cpp:159] Memory required for data: 101537280
I0816 16:10:56.855803 9341 layer_factory.hpp:77] Creating layer obj-block_obj-block_0_split
I0816 16:10:56.855952 9341 net.cpp:94] Creating Layer obj-block_obj-block_0_split
I0816 16:10:56.856010 9341 net.cpp:435] obj-block_obj-block_0_split <- obj-block
I0816 16:10:56.856094 9341 net.cpp:409] obj-block_obj-block_0_split -> obj-block_obj-block_0_split_0
I0816 16:10:56.856202 9341 net.cpp:409] obj-block_obj-block_0_split -> obj-block_obj-block_0_split_1
I0816 16:10:56.856828 9341 net.cpp:144] Setting up obj-block_obj-block_0_split
I0816 16:10:56.856904 9341 net.cpp:151] Top shape: 10 4 40 40 (64000)
I0816 16:10:56.857013 9341 net.cpp:151] Top shape: 10 4 40 40 (64000)
I0816 16:10:56.857044 9341 net.cpp:159] Memory required for data: 102049280
I0816 16:10:56.857069 9341 layer_factory.hpp:77] Creating layer bb-label-norm
I0816 16:10:56.857108 9341 net.cpp:94] Creating Layer bb-label-norm
I0816 16:10:56.857133 9341 net.cpp:435] bb-label-norm <- bbox-label
I0816 16:10:56.857163 9341 net.cpp:435] bb-label-norm <- size-block_size-block_0_split_0
I0816 16:10:56.857198 9341 net.cpp:409] bb-label-norm -> bbox-label-norm
I0816 16:10:56.857336 9341 net.cpp:144] Setting up bb-label-norm
I0816 16:10:56.857365 9341 net.cpp:151] Top shape: 10 4 40 40 (64000)
I0816 16:10:56.857398 9341 net.cpp:159] Memory required for data: 102305280
I0816 16:10:56.857422 9341 layer_factory.hpp:77] Creating layer bb-obj-norm
I0816 16:10:56.857456 9341 net.cpp:94] Creating Layer bb-obj-norm
I0816 16:10:56.857481 9341 net.cpp:435] bb-obj-norm <- bbox-label-norm
I0816 16:10:56.857508 9341 net.cpp:435] bb-obj-norm <- obj-block_obj-block_0_split_0
I0816 16:10:56.857543 9341 net.cpp:409] bb-obj-norm -> bbox-obj-label-norm
I0816 16:10:56.857657 9341 net.cpp:144] Setting up bb-obj-norm
I0816 16:10:56.857686 9341 net.cpp:151] Top shape: 10 4 40 40 (64000)
I0816 16:10:56.857717 9341 net.cpp:159] Memory required for data: 102561280
I0816 16:10:56.857739 9341 layer_factory.hpp:77] Creating layer conv1/7x7_s2
I0816 16:10:56.857823 9341 net.cpp:94] Creating Layer conv1/7x7_s2
I0816 16:10:56.857851 9341 net.cpp:435] conv1/7x7_s2 <- transformed_data
I0816 16:10:56.857954 9341 net.cpp:409] conv1/7x7_s2 -> conv1/7x7_s2
I0816 16:10:56.863085 9341 net.cpp:144] Setting up conv1/7x7_s2
I0816 16:10:56.863144 9341 net.cpp:151] Top shape: 10 64 320 320 (65536000)
I0816 16:10:56.863179 9341 net.cpp:159] Memory required for data: 364705280
I0816 16:10:56.863306 9341 layer_factory.hpp:77] Creating layer conv1/relu_7x7
I0816 16:10:56.863345 9341 net.cpp:94] Creating Layer conv1/relu_7x7
I0816 16:10:56.863366 9341 net.cpp:435] conv1/relu_7x7 <- conv1/7x7_s2
I0816 16:10:56.863394 9341 net.cpp:396] conv1/relu_7x7 -> conv1/7x7_s2 (in-place)
I0816 16:10:56.863456 9341 net.cpp:144] Setting up conv1/relu_7x7
I0816 16:10:56.863473 9341 net.cpp:151] Top shape: 10 64 320 320 (65536000)
I0816 16:10:56.863497 9341 net.cpp:159] Memory required for data: 626849280
I0816 16:10:56.863513 9341 layer_factory.hpp:77] Creating layer pool1/3x3_s2
I0816 16:10:56.863555 9341 net.cpp:94] Creating Layer pool1/3x3_s2
I0816 16:10:56.863572 9341 net.cpp:435] pool1/3x3_s2 <- conv1/7x7_s2
I0816 16:10:56.863597 9341 net.cpp:409] pool1/3x3_s2 -> pool1/3x3_s2
I0816 16:10:56.863816 9341 net.cpp:144] Setting up pool1/3x3_s2
I0816 16:10:56.863838 9341 net.cpp:151] Top shape: 10 64 160 160 (16384000)
I0816 16:10:56.863863 9341 net.cpp:159] Memory required for data: 692385280
I0816 16:10:56.863880 9341 layer_factory.hpp:77] Creating layer pool1/norm1
I0816 16:10:56.863914 9341 net.cpp:94] Creating Layer pool1/norm1
I0816 16:10:56.863931 9341 net.cpp:435] pool1/norm1 <- pool1/3x3_s2
I0816 16:10:56.863955 9341 net.cpp:409] pool1/norm1 -> pool1/norm1
I0816 16:10:56.864141 9341 net.cpp:144] Setting up pool1/norm1
I0816 16:10:56.864166 9341 net.cpp:151] Top shape: 10 64 160 160 (16384000)
I0816 16:10:56.864189 9341 net.cpp:159] Memory required for data: 757921280
I0816 16:10:56.864207 9341 layer_factory.hpp:77] Creating layer conv2/3x3_reduce
I0816 16:10:56.864399 9341 net.cpp:94] Creating Layer conv2/3x3_reduce
I0816 16:10:56.864421 9341 net.cpp:435] conv2/3x3_reduce <- pool1/norm1
I0816 16:10:56.864452 9341 net.cpp:409] conv2/3x3_reduce -> conv2/3x3_reduce
I0816 16:10:56.866444 9341 net.cpp:144] Setting up conv2/3x3_reduce
I0816 16:10:56.866485 9341 net.cpp:151] Top shape: 10 64 160 160 (16384000)
I0816 16:10:56.866627 9341 net.cpp:159] Memory required for data: 823457280
I0816 16:10:56.866734 9341 layer_factory.hpp:77] Creating layer conv2/relu_3x3_reduce
I0816 16:10:56.866765 9341 net.cpp:94] Creating Layer conv2/relu_3x3_reduce
I0816 16:10:56.866782 9341 net.cpp:435] conv2/relu_3x3_reduce <- conv2/3x3_reduce
I0816 16:10:56.866848 9341 net.cpp:396] conv2/relu_3x3_reduce -> conv2/3x3_reduce (in-place)
I0816 16:10:56.866885 9341 net.cpp:144] Setting up conv2/relu_3x3_reduce
I0816 16:10:56.866899 9341 net.cpp:151] Top shape: 10 64 160 160 (16384000)
I0816 16:10:56.866920 9341 net.cpp:159] Memory required for data: 888993280
I0816 16:10:56.866935 9341 layer_factory.hpp:77] Creating layer conv2/3x3
I0816 16:10:56.866971 9341 net.cpp:94] Creating Layer conv2/3x3
I0816 16:10:56.866986 9341 net.cpp:435] conv2/3x3 <- conv2/3x3_reduce
I0816 16:10:56.867012 9341 net.cpp:409] conv2/3x3 -> conv2/3x3
I0816 16:10:56.879534 9341 net.cpp:144] Setting up conv2/3x3
I0816 16:10:56.879587 9341 net.cpp:151] Top shape: 10 192 160 160 (49152000)
I0816 16:10:56.879657 9341 net.cpp:159] Memory required for data: 1085601280
I0816 16:10:56.879704 9341 layer_factory.hpp:77] Creating layer conv2/relu_3x3
I0816 16:10:56.879735 9341 net.cpp:94] Creating Layer conv2/relu_3x3
I0816 16:10:56.879751 9341 net.cpp:435] conv2/relu_3x3 <- conv2/3x3
I0816 16:10:56.879776 9341 net.cpp:396] conv2/relu_3x3 -> conv2/3x3 (in-place)
I0816 16:10:56.879811 9341 net.cpp:144] Setting up conv2/relu_3x3
I0816 16:10:56.879823 9341 net.cpp:151] Top shape: 10 192 160 160 (49152000)
I0816 16:10:56.879842 9341 net.cpp:159] Memory required for data: 1282209280
I0816 16:10:56.879853 9341 layer_factory.hpp:77] Creating layer conv2/norm2
I0816 16:10:56.879878 9341 net.cpp:94] Creating Layer conv2/norm2
I0816 16:10:56.879889 9341 net.cpp:435] conv2/norm2 <- conv2/3x3
I0816 16:10:56.879909 9341 net.cpp:409] conv2/norm2 -> conv2/norm2
I0816 16:10:56.880100 9341 net.cpp:144] Setting up conv2/norm2
I0816 16:10:56.880117 9341 net.cpp:151] Top shape: 10 192 160 160 (49152000)
I0816 16:10:56.880136 9341 net.cpp:159] Memory required for data: 1478817280
I0816 16:10:56.880149 9341 layer_factory.hpp:77] Creating layer pool2/3x3_s2
I0816 16:10:56.880170 9341 net.cpp:94] Creating Layer pool2/3x3_s2
I0816 16:10:56.880183 9341 net.cpp:435] pool2/3x3_s2 <- conv2/norm2
I0816 16:10:56.880203 9341 net.cpp:409] pool2/3x3_s2 -> pool2/3x3_s2
I0816 16:10:56.880318 9341 net.cpp:144] Setting up pool2/3x3_s2
I0816 16:10:56.880336 9341 net.cpp:151] Top shape: 10 192 80 80 (12288000)
I0816 16:10:56.880355 9341 net.cpp:159] Memory required for data: 1527969280
I0816 16:10:56.880367 9341 layer_factory.hpp:77] Creating layer pool2/3x3_s2_pool2/3x3_s2_0_split
I0816 16:10:56.880429 9341 net.cpp:94] Creating Layer pool2/3x3_s2_pool2/3x3_s2_0_split
I0816 16:10:56.880443 9341 net.cpp:435] pool2/3x3_s2_pool2/3x3_s2_0_split <- pool2/3x3_s2
I0816 16:10:56.880465 9341 net.cpp:409] pool2/3x3_s2_pool2/3x3_s2_0_split -> pool2/3x3_s2_pool2/3x3_s2_0_split_0
I0816 16:10:56.880496 9341 net.cpp:409] pool2/3x3_s2_pool2/3x3_s2_0_split -> pool2/3x3_s2_pool2/3x3_s2_0_split_1
I0816 16:10:56.880571 9341 net.cpp:409] pool2/3x3_s2_pool2/3x3_s2_0_split -> pool2/3x3_s2_pool2/3x3_s2_0_split_2
I0816 16:10:56.880599 9341 net.cpp:409] pool2/3x3_s2_pool2/3x3_s2_0_split -> pool2/3x3_s2_pool2/3x3_s2_0_split_3
I0816 16:10:56.880800 9341 net.cpp:144] Setting up pool2/3x3_s2_pool2/3x3_s2_0_split
I0816 16:10:56.880818 9341 net.cpp:151] Top shape: 10 192 80 80 (12288000)
I0816 16:10:56.880838 9341 net.cpp:151] Top shape: 10 192 80 80 (12288000)
I0816 16:10:56.880854 9341 net.cpp:151] Top shape: 10 192 80 80 (12288000)
I0816 16:10:56.880923 9341 net.cpp:151] Top shape: 10 192 80 80 (12288000)
I0816 16:10:56.880941 9341 net.cpp:159] Memory required for data: 1724577280
I0816 16:10:56.880955 9341 layer_factory.hpp:77] Creating layer inception_3a/1x1
I0816 16:10:56.880995 9341 net.cpp:94] Creating Layer inception_3a/1x1
I0816 16:10:56.881008 9341 net.cpp:435] inception_3a/1x1 <- pool2/3x3_s2_pool2/3x3_s2_0_split_0
I0816 16:10:56.881036 9341 net.cpp:409] inception_3a/1x1 -> inception_3a/1x1
I0816 16:10:56.882165 9341 net.cpp:144] Setting up inception_3a/1x1
I0816 16:10:56.882191 9341 net.cpp:151] Top shape: 10 64 80 80 (4096000)
I0816 16:10:56.882212 9341 net.cpp:159] Memory required for data: 1740961280
I0816 16:10:56.882238 9341 layer_factory.hpp:77] Creating layer inception_3a/relu_1x1
I0816 16:10:56.882261 9341 net.cpp:94] Creating Layer inception_3a/relu_1x1
I0816 16:10:56.882275 9341 net.cpp:435] inception_3a/relu_1x1 <- inception_3a/1x1
I0816 16:10:56.882295 9341 net.cpp:396] inception_3a/relu_1x1 -> inception_3a/1x1 (in-place)
I0816 16:10:56.882323 9341 net.cpp:144] Setting up inception_3a/relu_1x1
I0816 16:10:56.882334 9341 net.cpp:151] Top shape: 10 64 80 80 (4096000)
I0816 16:10:56.882351 9341 net.cpp:159] Memory required for data: 1757345280
I0816 16:10:56.882364 9341 layer_factory.hpp:77] Creating layer inception_3a/3x3_reduce
I0816 16:10:56.882395 9341 net.cpp:94] Creating Layer inception_3a/3x3_reduce
I0816 16:10:56.882408 9341 net.cpp:435] inception_3a/3x3_reduce <- pool2/3x3_s2_pool2/3x3_s2_0_split_1
I0816 16:10:56.882439 9341 net.cpp:409] inception_3a/3x3_reduce -> inception_3a/3x3_reduce
I0816 16:10:56.883682 9341 net.cpp:144] Setting up inception_3a/3x3_reduce
I0816 16:10:56.883709 9341 net.cpp:151] Top shape: 10 96 80 80 (6144000)
I0816 16:10:56.883771 9341 net.cpp:159] Memory required for data: 1781921280
I0816 16:10:56.883810 9341 layer_factory.hpp:77] Creating layer inception_3a/relu_3x3_reduce
I0816 16:10:56.883833 9341 net.cpp:94] Creating Layer inception_3a/relu_3x3_reduce
I0816 16:10:56.883848 9341 net.cpp:435] inception_3a/relu_3x3_reduce <- inception_3a/3x3_reduce
I0816 16:10:56.883868 9341 net.cpp:396] inception_3a/relu_3x3_reduce -> inception_3a/3x3_reduce (in-place)
I0816 16:10:56.883898 9341 net.cpp:144] Setting up inception_3a/relu_3x3_reduce
I0816 16:10:56.883910 9341 net.cpp:151] Top shape: 10 96 80 80 (6144000)
I0816 16:10:56.883927 9341 net.cpp:159] Memory required for data: 1806497280
I0816 16:10:56.883939 9341 layer_factory.hpp:77] Creating layer inception_3a/3x3
I0816 16:10:56.883970 9341 net.cpp:94] Creating Layer inception_3a/3x3
I0816 16:10:56.883983 9341 net.cpp:435] inception_3a/3x3 <- inception_3a/3x3_reduce
I0816 16:10:56.884006 9341 net.cpp:409] inception_3a/3x3 -> inception_3a/3x3
I0816 16:10:56.893452 9341 net.cpp:144] Setting up inception_3a/3x3
I0816 16:10:56.893647 9341 net.cpp:151] Top shape: 10 128 80 80 (8192000)
I0816 16:10:56.893687 9341 net.cpp:159] Memory required for data: 1839265280
I0816 16:10:56.893760 9341 layer_factory.hpp:77] Creating layer inception_3a/relu_3x3
I0816 16:10:56.893838 9341 net.cpp:94] Creating Layer inception_3a/relu_3x3
I0816 16:10:56.893859 9341 net.cpp:435] inception_3a/relu_3x3 <- inception_3a/3x3
I0816 16:10:56.893884 9341 net.cpp:396] inception_3a/relu_3x3 -> inception_3a/3x3 (in-place)
I0816 16:10:56.893949 9341 net.cpp:144] Setting up inception_3a/relu_3x3
I0816 16:10:56.893966 9341 net.cpp:151] Top shape: 10 128 80 80 (8192000)
I0816 16:10:56.893985 9341 net.cpp:159] Memory required for data: 1872033280
I0816 16:10:56.893998 9341 layer_factory.hpp:77] Creating layer inception_3a/5x5_reduce
I0816 16:10:56.894037 9341 net.cpp:94] Creating Layer inception_3a/5x5_reduce
I0816 16:10:56.894052 9341 net.cpp:435] inception_3a/5x5_reduce <- pool2/3x3_s2_pool2/3x3_s2_0_split_2
I0816 16:10:56.894075 9341 net.cpp:409] inception_3a/5x5_reduce -> inception_3a/5x5_reduce
I0816 16:10:56.895063 9341 net.cpp:144] Setting up inception_3a/5x5_reduce
I0816 16:10:56.895102 9341 net.cpp:151] Top shape: 10 16 80 80 (1024000)
I0816 16:10:56.895200 9341 net.cpp:159] Memory required for data: 1876129280
I0816 16:10:56.895269 9341 layer_factory.hpp:77] Creating layer inception_3a/relu_5x5_reduce
I0816 16:10:56.895334 9341 net.cpp:94] Creating Layer inception_3a/relu_5x5_reduce
I0816 16:10:56.895351 9341 net.cpp:435] inception_3a/relu_5x5_reduce <- inception_3a/5x5_reduce
I0816 16:10:56.895373 9341 net.cpp:396] inception_3a/relu_5x5_reduce -> inception_3a/5x5_reduce (in-place)
I0816 16:10:56.895409 9341 net.cpp:144] Setting up inception_3a/relu_5x5_reduce
I0816 16:10:56.895422 9341 net.cpp:151] Top shape: 10 16 80 80 (1024000)
I0816 16:10:56.895440 9341 net.cpp:159] Memory required for data: 1880225280
I0816 16:10:56.895453 9341 layer_factory.hpp:77] Creating layer inception_3a/5x5
I0816 16:10:56.895484 9341 net.cpp:94] Creating Layer inception_3a/5x5
I0816 16:10:56.895498 9341 net.cpp:435] inception_3a/5x5 <- inception_3a/5x5_reduce
I0816 16:10:56.895522 9341 net.cpp:409] inception_3a/5x5 -> inception_3a/5x5
I0816 16:10:56.896688 9341 net.cpp:144] Setting up inception_3a/5x5
I0816 16:10:56.896718 9341 net.cpp:151] Top shape: 10 32 80 80 (2048000)
I0816 16:10:56.896739 9341 net.cpp:159] Memory required for data: 1888417280
I0816 16:10:56.896765 9341 layer_factory.hpp:77] Creating layer inception_3a/relu_5x5
I0816 16:10:56.896790 9341 net.cpp:94] Creating Layer inception_3a/relu_5x5
I0816 16:10:56.896805 9341 net.cpp:435] inception_3a/relu_5x5 <- inception_3a/5x5
I0816 16:10:56.896824 9341 net.cpp:396] inception_3a/relu_5x5 -> inception_3a/5x5 (in-place)
I0816 16:10:56.896854 9341 net.cpp:144] Setting up inception_3a/relu_5x5
I0816 16:10:56.896867 9341 net.cpp:151] Top shape: 10 32 80 80 (2048000)
I0816 16:10:56.896883 9341 net.cpp:159] Memory required for data: 1896609280
I0816 16:10:56.896895 9341 layer_factory.hpp:77] Creating layer inception_3a/pool
I0816 16:10:56.896930 9341 net.cpp:94] Creating Layer inception_3a/pool
I0816 16:10:56.896945 9341 net.cpp:435] inception_3a/pool <- pool2/3x3_s2_pool2/3x3_s2_0_split_3
I0816 16:10:56.896966 9341 net.cpp:409] inception_3a/pool -> inception_3a/pool
I0816 16:10:56.897084 9341 net.cpp:144] Setting up inception_3a/pool
I0816 16:10:56.897100 9341 net.cpp:151] Top shape: 10 192 80 80 (12288000)
I0816 16:10:56.897119 9341 net.cpp:159] Memory required for data: 1945761280
I0816 16:10:56.897132 9341 layer_factory.hpp:77] Creating layer inception_3a/pool_proj
I0816 16:10:56.897163 9341 net.cpp:94] Creating Layer inception_3a/pool_proj
I0816 16:10:56.897177 9341 net.cpp:435] inception_3a/pool_proj <- inception_3a/pool
I0816 16:10:56.897200 9341 net.cpp:409] inception_3a/pool_proj -> inception_3a/pool_proj
I0816 16:10:56.898097 9341 net.cpp:144] Setting up inception_3a/pool_proj
I0816 16:10:56.898120 9341 net.cpp:151] Top shape: 10 32 80 80 (2048000)
I0816 16:10:56.898140 9341 net.cpp:159] Memory required for data: 1953953280
I0816 16:10:56.898180 9341 layer_factory.hpp:77] Creating layer inception_3a/relu_pool_proj
I0816 16:10:56.898203 9341 net.cpp:94] Creating Layer inception_3a/relu_pool_proj
I0816 16:10:56.898219 9341 net.cpp:435] inception_3a/relu_pool_proj <- inception_3a/pool_proj
I0816 16:10:56.898238 9341 net.cpp:396] inception_3a/relu_pool_proj -> inception_3a/pool_proj (in-place)
I0816 16:10:56.898278 9341 net.cpp:144] Setting up inception_3a/relu_pool_proj
I0816 16:10:56.898291 9341 net.cpp:151] Top shape: 10 32 80 80 (2048000)
I0816 16:10:56.898309 9341 net.cpp:159] Memory required for data: 1962145280
I0816 16:10:56.898321 9341 layer_factory.hpp:77] Creating layer inception_3a/output
I0816 16:10:56.898344 9341 net.cpp:94] Creating Layer inception_3a/output
I0816 16:10:56.898357 9341 net.cpp:435] inception_3a/output <- inception_3a/1x1
I0816 16:10:56.898373 9341 net.cpp:435] inception_3a/output <- inception_3a/3x3
I0816 16:10:56.898389 9341 net.cpp:435] inception_3a/output <- inception_3a/5x5
I0816 16:10:56.898404 9341 net.cpp:435] inception_3a/output <- inception_3a/pool_proj
I0816 16:10:56.898422 9341 net.cpp:409] inception_3a/output -> inception_3a/output
I0816 16:10:56.898564 9341 net.cpp:144] Setting up inception_3a/output
I0816 16:10:56.898581 9341 net.cpp:151] Top shape: 10 256 80 80 (16384000)
I0816 16:10:56.898600 9341 net.cpp:159] Memory required for data: 2027681280
I0816 16:10:56.898613 9341 layer_factory.hpp:77] Creating layer inception_3a/output_inception_3a/output_0_split
I0816 16:10:56.900241 9341 net.cpp:94] Creating Layer inception_3a/output_inception_3a/output_0_split
I0816 16:10:56.900282 9341 net.cpp:435] inception_3a/output_inception_3a/output_0_split <- inception_3a/output
I0816 16:10:56.900310 9341 net.cpp:409] inception_3a/output_inception_3a/output_0_split -> inception_3a/output_inception_3a/output_0_split_0
I0816 16:10:56.900390 9341 net.cpp:409] inception_3a/output_inception_3a/output_0_split -> inception_3a/output_inception_3a/output_0_split_1
I0816 16:10:56.900420 9341 net.cpp:409] inception_3a/output_inception_3a/output_0_split -> inception_3a/output_inception_3a/output_0_split_2
I0816 16:10:56.900447 9341 net.cpp:409] inception_3a/output_inception_3a/output_0_split -> inception_3a/output_inception_3a/output_0_split_3
I0816 16:10:56.912916 9341 net.cpp:144] Setting up inception_3a/output_inception_3a/output_0_split
I0816 16:10:56.912973 9341 net.cpp:151] Top shape: 10 256 80 80 (16384000)
I0816 16:10:56.913003 9341 net.cpp:151] Top shape: 10 256 80 80 (16384000)
I0816 16:10:56.913061 9341 net.cpp:151] Top shape: 10 256 80 80 (16384000)
I0816 16:10:56.913079 9341 net.cpp:151] Top shape: 10 256 80 80 (16384000)
I0816 16:10:56.913095 9341 net.cpp:159] Memory required for data: 2289825280
I0816 16:10:56.913113 9341 layer_factory.hpp:77] Creating layer inception_3b/1x1
I0816 16:10:56.913168 9341 net.cpp:94] Creating Layer inception_3b/1x1
I0816 16:10:56.913188 9341 net.cpp:435] inception_3b/1x1 <- inception_3a/output_inception_3a/output_0_split_0
I0816 16:10:56.913221 9341 net.cpp:409] inception_3b/1x1 -> inception_3b/1x1
I0816 16:10:56.915447 9341 net.cpp:144] Setting up inception_3b/1x1
I0816 16:10:56.915484 9341 net.cpp:151] Top shape: 10 128 80 80 (8192000)
I0816 16:10:56.915511 9341 net.cpp:159] Memory required for data: 2322593280
I0816 16:10:56.915582 9341 layer_factory.hpp:77] Creating layer inception_3b/relu_1x1
I0816 16:10:56.915612 9341 net.cpp:94] Creating Layer inception_3b/relu_1x1
I0816 16:10:56.915629 9341 net.cpp:435] inception_3b/relu_1x1 <- inception_3b/1x1
I0816 16:10:56.915652 9341 net.cpp:396] inception_3b/relu_1x1 -> inception_3b/1x1 (in-place)
I0816 16:10:56.915683 9341 net.cpp:144] Setting up inception_3b/relu_1x1
I0816 16:10:56.915695 9341 net.cpp:151] Top shape: 10 128 80 80 (8192000)
I0816 16:10:56.915712 9341 net.cpp:159] Memory required for data: 2355361280
I0816 16:10:56.915725 9341 layer_factory.hpp:77] Creating layer inception_3b/3x3_reduce
I0816 16:10:56.915758 9341 net.cpp:94] Creating Layer inception_3b/3x3_reduce
I0816 16:10:56.915773 9341 net.cpp:435] inception_3b/3x3_reduce <- inception_3a/output_inception_3a/output_0_split_1
I0816 16:10:56.915797 9341 net.cpp:409] inception_3b/3x3_reduce -> inception_3b/3x3_reduce
I0816 16:10:56.918097 9341 net.cpp:144] Setting up inception_3b/3x3_reduce
I0816 16:10:56.918159 9341 net.cpp:151] Top shape: 10 128 80 80 (8192000)
I0816 16:10:56.918195 9341 net.cpp:159] Memory required for data: 2388129280
I0816 16:10:56.918370 9341 layer_factory.hpp:77] Creating layer inception_3b/relu_3x3_reduce
I0816 16:10:56.918413 9341 net.cpp:94] Creating Layer inception_3b/relu_3x3_reduce
I0816 16:10:56.918431 9341 net.cpp:435] inception_3b/relu_3x3_reduce <- inception_3b/3x3_reduce
I0816 16:10:56.918455 9341 net.cpp:396] inception_3b/relu_3x3_reduce -> inception_3b/3x3_reduce (in-place)
I0816 16:10:56.918550 9341 net.cpp:144] Setting up inception_3b/relu_3x3_reduce
I0816 16:10:56.918566 9341 net.cpp:151] Top shape: 10 128 80 80 (8192000)
I0816 16:10:56.918586 9341 net.cpp:159] Memory required for data: 2420897280
I0816 16:10:56.918599 9341 layer_factory.hpp:77] Creating layer inception_3b/3x3
I0816 16:10:56.918709 9341 net.cpp:94] Creating Layer inception_3b/3x3
I0816 16:10:56.918725 9341 net.cpp:435] inception_3b/3x3 <- inception_3b/3x3_reduce
I0816 16:10:56.918750 9341 net.cpp:409] inception_3b/3x3 -> inception_3b/3x3
I0816 16:10:56.933670 9341 net.cpp:144] Setting up