Please provide the following information when requesting support.
• Hardware (A6000)
• Network Type (bpnet)
• TLT Version (Please run “tlt info --verbose” and share “docker_tag” here)
format_version: 2.0
toolkit_version: 4.0.1
published_date: 03/06/2023
• Training spec file (If you have one, please share here)
I'm using the default bpnet_train_m1_coco.yaml provided with the examples, with no modifications.
• How to reproduce the issue ? (This is for errors. Please share the command line and the detailed log here.)
Hello, I’m going through the examples in the tao-toolkit, and am stuck on the training step for bpnet:
!tao bpnet train -e $SPECS_DIR/bpnet_train_m1_coco.yaml \
                 -r $USER_EXPERIMENT_DIR/models/exp_m1_unpruned \
                 -k $KEY \
                 --gpus $NUM_GPUS
It errors out with:
Traceback (most recent call last):
  File "</usr/local/lib/python3.6/dist-packages/driveix/bpnet/scripts/train.py>", line 3, in <module>
  File "", line 221, in
  File "", line 150, in main
  File "", line 167, in deserialize_maglev_object
  File "", line 432, in wrapper
  File "", line 100, in __init__
AssertionError: Please specify inference spec path in the config file.
Telemetry data couldn’t be sent, but the command ran successfully.
[WARNING]: <urlopen error [Errno -2] Name or service not known>
Execution status: FAIL
2023-04-19 17:03:14,336 [INFO] tlt.components.docker_handler.docker_handler: Stopping container.
There is an infer_spec.yaml, but I can't find any documentation on how to include it. I tried specifying it with --inference_spec. Any help would be greatly appreciated.
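In case it helps, my best guess (based only on the assertion text; the key name and path below are assumptions on my part, not something I found in the documentation) is that the training spec itself is supposed to contain a pointer to the inference spec, something like:

    # hypothetical addition to bpnet_train_m1_coco.yaml; key name and path are guesses
    inference_spec: /workspace/examples/bpnet/specs/infer_spec.yaml

If that is the right direction, could someone confirm the exact key name and where it belongs in the spec file?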