You can use the trained AutoML model as a pretrained model. It is a .tlt model that can be fine-tuned with the corresponding network in TAO.
It is not related to BYOM.
After saving the best model obtained from AutoML, you can plug the model and spec file into the end-to-end notebook and then prune and optimize the model for inference.
Figure 3. End-to-end workflow from AutoML training to model optimization
To plug the model into the new notebook, copy the train job ID from the AutoML notebook. The AutoML train job ID is printed when you run the training job.
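For illustration, here is a minimal sketch of that hand-off, assuming the REST-style notebook pattern that the endpoints quoted later in this thread come from. The /model/{model_id}/job endpoint, the request payload, and the response shape are assumptions patterned after the TAO API notebooks; base_url, model_id, and headers stand in for the usual notebook setup variables. Verify each of these against your notebook version.

```python
# Sketch only: endpoint path, payload, and response shape are assumptions
# patterned after the TAO REST API notebooks; check your notebook version.
import json
import requests

# Placeholders for values normally defined in the notebook's setup cells.
base_url = "http://<api-host>:<port>/api/v1/user/<user_id>"  # placeholder
model_id = "<model_id from the AutoML notebook>"             # placeholder
headers = {"Authorization": "Bearer <token>"}                # placeholder

# In the AutoML notebook: submit the training job and note the job ID it prints.
endpoint = f"{base_url}/model/{model_id}/job"                # assumed job-submission endpoint
data = json.dumps({"job": None, "actions": ["train"]})       # assumed payload layout
response = requests.post(endpoint, data=data, headers=headers)
automl_train_job_id = response.json()[0]                     # assumed: first returned job ID
print(automl_train_job_id)                                   # copy this value

# In the end-to-end notebook: reuse the copied ID as the parent job so downstream
# actions (prune, retrain, export) start from the AutoML-trained model.
parent_job_id = automl_train_job_id
```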
I see.
Excuse me. Can I modify the freeze_blocks parameter during fine-tuning if the architecture of my trained AutoML model is ResNet?
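Not an authoritative answer, but as a sketch of what that could look like when the specs are fetched as a JSON dict: the key path model_config.freeze_blocks and the block indices below are assumptions to confirm against the schema your network actually returns (using the same placeholder notebook variables as in the sketch above).

```python
# Hypothetical sketch: the exact key path for freeze_blocks depends on the network's
# spec schema; verify it against the schema returned by the specs endpoint first.
import requests

endpoint = f"{base_url}/model/{model_id}/specs/train/schema"    # specs-schema endpoint, as quoted below
response = requests.get(endpoint, headers=headers)
specs = response.json().get("default", {})                      # assumed: schema response carries default specs
specs.setdefault("model_config", {})["freeze_blocks"] = [0, 1]  # example: freeze the first two ResNet blocks
```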
Excuse me. Which API command should I use to fine-tune the trained model?
Is it endpoint = f"{base_url}/model/{model_id}/specs/train/schema" or endpoint = f"{base_url}/model/{model_id}/specs/retrain/schema"?
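As a sketch, both schemas can be pulled and compared with the same notebook variables as above; which one applies depends on whether the action being run is train or retrain, which this thread does not settle.

```python
# Sketch: fetch both spec schemas quoted above and compare their top-level keys.
# base_url, model_id, and headers are the same placeholder notebook variables as before.
import requests

for action in ("train", "retrain"):
    endpoint = f"{base_url}/model/{model_id}/specs/{action}/schema"
    response = requests.get(endpoint, headers=headers)
    print(action, sorted(response.json().keys()))  # inspect which parameters each schema exposes
```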
There has been no update from you for a while, so we assume this is no longer an issue and are closing this topic. If you need further support, please open a new one. Thanks.