For ReIdentificationNet, a pre-trained model is available in the NGC catalog as trainable_v1.1: resnet50_market1501_aicity156.tlt. However, I can't find much information about the pre-training process.
The blog post says the training data "includes a combination of NVIDIA proprietary datasets along with Open Images V5."
However, the NGC page only mentions Market-1501 + synthetic IDs. So which dataset was the model actually pre-trained on: the combination of NVIDIA proprietary datasets with Open Images V5, or Market-1501 + synthetic data?
Also, the blog mentions fine-tuning on 4,470 real IDs; does that mean the model evaluated there is different from deployable_v1.2 on NGC?
Therefore, we should have 41712/128 ≈ 326 batches per epoch, so I am wondering why TAO shows 570.
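As a sanity check, here is that arithmetic in code (a minimal sketch; 41712 and 128 are the dataset size and batch_size from my config above, and keeping the final partial batch is an assumption about the dataloader):

```python
import math

num_train_images = 41712  # size of my train split
batch_size = 128          # batch_size from the training spec

# Train-loader batches per epoch, keeping the final partial batch.
train_batches = math.ceil(num_train_images / batch_size)
print(train_batches)  # 326
```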
And why is each epoch split into two lines? I thought this was due to num_workers, but even with num_workers set to 1 the output still shows split epochs.
The logs below seem to show something different. If the 570 above were the total of train + val batches, here we should see 25/43 and 18/43; instead the bar stops at 95%, and the denominator looks like the number of train (or val) batches alone, not train + val.
Why does it differ from one training run to another?
Epoch 0: 95%|███████████████████████████████████████████████████████████████████████████████████████████████████████████▊ | 21/22 [00:05<00:00, 3.79it/s, loss=5.37, v_num=0]Train and Val metrics generated.
Epoch 0: 95%|███████████████████████████████████████████████████████▎ | 21/22 [00:05<00:00, 3.76it/s, loss=5.37, v_num=0, train_loss=5.510, base_lr=3.81e-5, train_acc_1=0.0134]Training loop in progress
Epoch 1: 95%|███████████████████████████████████████████████████████▎ | 21/22 [00:03<00:00, 5.34it/s, loss=5.13, v_num=0, train_loss=5.510, base_lr=3.81e-5, train_acc_1=0.0134]Train and Val metrics generated.
Epoch 1: 95%|███████████████████████████████████████████████████████▎ | 21/22 [00:03<00:00, 5.30it/s, loss=5.13, v_num=0, train_loss=5.130, base_lr=3.81e-5, train_acc_1=0.0201]Training loop in progress
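For what it's worth, the v_num field in these bars suggests the TAO trainer is built on PyTorch Lightning (an assumption on my part; everything below is a standalone sketch, not TAO code). Lightning's epoch progress bar uses train batches + validation batches as the denominator, which would explain both the pause at 95% (21 of 21 train batches done, 1 val batch still pending out of 22 total) and why the total differs between runs with different train/val splits. A minimal example that reproduces a 21/22 bar:

```python
# Minimal PyTorch Lightning sketch (hypothetical setup, not the TAO re-ID model).
# 672 train samples / 32 = 21 train batches, plus 1 val batch -> bar shows n/22.
import torch
import torch.nn.functional as F
import pytorch_lightning as pl
from torch.utils.data import DataLoader, TensorDataset

class Toy(pl.LightningModule):
    def __init__(self):
        super().__init__()
        self.layer = torch.nn.Linear(4, 2)

    def training_step(self, batch, batch_idx):
        x, y = batch
        return F.cross_entropy(self.layer(x), y)

    def validation_step(self, batch, batch_idx):
        x, y = batch
        self.log("val_loss", F.cross_entropy(self.layer(x), y))

    def configure_optimizers(self):
        return torch.optim.SGD(self.parameters(), lr=0.1)

train = DataLoader(TensorDataset(torch.randn(672, 4), torch.randint(0, 2, (672,))),
                   batch_size=32)
val = DataLoader(TensorDataset(torch.randn(32, 4), torch.randint(0, 2, (32,))),
                 batch_size=32)

pl.Trainer(max_epochs=2).fit(Toy(), train, val)
```

If that assumption holds, the doubled lines per epoch would just be the tqdm bar being re-rendered each time a status message ("Training loop in progress" / "Train and Val metrics generated") is printed.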
There has been no update from you for a while, so we assume this is no longer an issue and are closing this topic. If you need further support, please open a new one. Thanks.