The next TLT release will support retraining from checkpoints.
In your case, if you have a .tlt model that has already been trained for 10 epochs, you can set it as the pre-trained model and trigger retraining.
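For reference, this is a sketch of what that could look like in the training spec, assuming a DetectNet_v2-style network (the path and surrounding fields are placeholders; the exact field names can differ by network type):

```
model_config {
  # Point this at the .tlt model saved at the 10th epoch (path is a placeholder)
  pretrained_model_file: "/workspace/experiments/model_epoch_010.tlt"
  # ... other model_config fields unchanged ...
}
```

Then rerun the usual `tlt-train` command with this spec to continue training from those weights.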
Unfortunately, the current 1.0.1 version has a problem loading a pre-trained model during retraining: Retraining with pretrained tlt models - #27 by Morganh
It will also be fixed in the next release.