Does TLT flip images when training a classifier?

Hello,

I am training a TLT Resnet18 classifier on TLT 2.0.

I have a dataset containing shapes that point in a specific direction, and the class name is the direction itself (e.g. an object facing right is labelled "Right", etc.).

Does the training process automatically do some flipping augmentation? If so, is there any way to turn it off?

Thanks

Do you mean you are training with the TLT classification network?
Can you share your training spec?

Hey,
Thanks for your reply.
Yes, I am training a TLT classification network.

model_config {
  arch: "resnet",
  n_layers: 10

  use_bias: False
  use_batch_norm: True
  all_projections: False
  freeze_blocks: 0
  freeze_blocks: 1
  freeze_blocks: 2
  freeze_blocks: 3
  # image size should be "3,H,W", where H,W >= 16
  input_image_size: "3,224,224"

}

train_config {
  train_dataset_path: "/data/train"
  val_dataset_path:   "/data/valid"

  pretrained_model_path: "/workspace/experiment/pretrained/tlt_pretrained_classification_vresnet10/resnet_10.hdf5"
  optimizer: "sgd"
  batch_size_per_gpu: 256
  n_epochs: 60
  n_workers: 16

  # regularizer
  reg_config {
    type: "L2"
    scope: "Conv2D,Dense"
    weight_decay: 0.00005
  }

  # learning_rate
  lr_config {
    scheduler: "step"
    learning_rate: 0.0005
    step_size: 6
    gamma: 0.3
  }
}

The problem I am facing is that when I separate the objects into mirrored classes, say ‘Left’ and ‘Right’, my accuracy is stuck at 50%, but when I merge each pair of opposite classes (e.g. left and right become one class, top and bottom become one class, etc.), the accuracy goes very high.
Based on this observation, I want to verify whether any flipping augmentation is happening during training, because my data is direction-specific and flipping would affect the accuracy, as sketched below.
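
To make the concern concrete, here is a minimal sketch (purely illustrative; the arrow pattern and class names are made up, not taken from my actual dataset) of why a random horizontal flip would collapse a Left/Right split to chance accuracy:

import numpy as np

# Hypothetical "Right"-class sample: a tiny arrow whose label is defined by its direction.
right_arrow = np.zeros((5, 5), dtype=np.uint8)
right_arrow[2, :4] = 1   # shaft
right_arrow[1, 3] = 1    # upper half of the arrowhead
right_arrow[3, 3] = 1    # lower half of the arrowhead

# A "Left"-class sample is simply the mirror image.
left_arrow = np.fliplr(right_arrow)

# A random horizontal flip (the augmentation in question) maps one class onto the other,
# so roughly half of the "Right" images the network sees are pixel-identical to "Left" ones.
flipped_right = np.fliplr(right_arrow)
print(np.array_equal(flipped_right, left_arrow))  # True -> the labels become ~50% noise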

Thanks.

Yes, horizontal_flip is enabled by default during training.
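
For context, the effect is comparable to enabling horizontal flips in a standard Keras image generator. This is only a sketch of equivalent behaviour under that assumption, not the actual TLT internals; the path and sizes are copied from the spec above for illustration.

from keras.preprocessing.image import ImageDataGenerator

# Sketch of equivalent behaviour, not the TLT implementation itself:
# each training image has a 50% chance of being mirrored left-to-right.
datagen = ImageDataGenerator(horizontal_flip=True)

train_iter = datagen.flow_from_directory(
    "/data/train",            # same layout as train_dataset_path in the spec
    target_size=(224, 224),   # matches input_image_size "3,224,224"
    batch_size=256,           # matches batch_size_per_gpu
)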

I see, thanks.
By any chance is there a way to disable it?

Currently there is not. The internal team will work on it.

That would be great.
Thank you very much :)