I’m using a MobileNetV2-based .tltb model and trying to generate a pruned .etlt model.
When I run the prune command on it, I get the following error.
Attaching my command here:
!tao classification_tf1 prune -bm /workspace/tao-experiments/classification_tf1/byom_voc/output-tltb-coco-norm-pytorch/output-tltb-coco-norm-pytorch.tltb
-m /workspace/tao-experiments/classification_tf1/byom_voc/output-byom-mobv2-pytorch-trained-noprune/weights/byom_001.tlt
-o /workspace/tao-experiments/classification_tf1/byom_voc/output-byom-mobv2-pytorch-trained-prune/output-tltb-coco-norm-pruned.etlt
-k nvidia_tlt
-pth 0.5
--results_dir /workspace/tao-experiments/classification_tf1/byom_voc/output-byom-mobv2-pytorch-trained-prune/
These are the steps I have done:
- Generated the .tltb from the ONNX model using tao_byom.
- Retrained the .tltb model to generate a .tlt model checkpoint.
- Ran the pruning command shown above.
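For reference, the full sequence I followed can be sketched roughly as below. This is a simplified outline, not my exact commands: the spec file name (`spec.cfg`) and model/output names here are placeholders, and the `tao_byom` flags are what I believe the BYOM converter expects; please correct me if the flags differ in TAO 4.0.0.

```shell
# 1. Convert the ONNX model to .tltb with the BYOM converter
#    (-m ONNX model, -r results dir, -n model name, -k encryption key; placeholder paths)
tao_byom -m mobilenet_pretr_v2_person_ep0.onnx \
         -r output-tltb-coco-norm-pytorch \
         -n output-tltb-coco-norm-pytorch \
         -k nvidia_tlt

# 2. Retrain the .tltb model to produce a .tlt checkpoint
#    (spec.cfg is a placeholder for the classification experiment spec)
tao classification_tf1 train -e spec.cfg \
    -r output-byom-mobv2-pytorch-trained-noprune \
    -k nvidia_tlt

# 3. Prune: this is the step that fails (same command as above)
tao classification_tf1 prune -bm output-tltb-coco-norm-pytorch.tltb \
    -m output-byom-mobv2-pytorch-trained-noprune/weights/byom_001.tlt \
    -o output-tltb-coco-norm-pruned.etlt \
    -k nvidia_tlt -pth 0.5
```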
Attaching dataset sample:
test_person224.zip (4.2 MB)
ONNX model:
mobilenet_pretr_v2_person_ep0.onnx (8.4 MB)
.tltb model that I generated:
output-tltb-coco-norm-pytorch.tltb (7.9 MB)
• Hardware: RTX 3090
• Network Type: MobileNetV2
• TLT Version: nvidia-tao 4.0.0