Yes, I want to concatenate multiple tensors along the channel dimension in fp16.
All the Concat layers in the prototxt I linked perform the operation on axis=1, which in Caffe is the channel axis (the data format is NCHW), so even if the parameter is ignored the result should still be correct.
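As a quick sanity check of the semantics (a minimal numpy sketch, not the actual layer implementation; the shapes here are made up for illustration), concatenating fp16 NCHW tensors on axis=1 just stacks the channel dimensions:

```python
import numpy as np

# Two NCHW feature maps in fp16 (hypothetical shapes).
a = np.zeros((1, 3, 8, 8), dtype=np.float16)
b = np.zeros((1, 5, 8, 8), dtype=np.float16)

# axis=1 is the channel axis in NCHW layout, matching a
# Caffe Concat layer with concat_param { axis: 1 }.
out = np.concatenate([a, b], axis=1)
print(out.shape)  # (1, 8, 8, 8) -- channels add up: 3 + 5
print(out.dtype)  # float16 -- dtype is preserved
```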