Please provide the following information when requesting support.
• Hardware (GTX 1060 6GB)
• Network Type (TrafficCamNet)
• TLT Version (Please run “tlt info --verbose” and share “docker_tag” here)
• Training spec file: detectnet_v2_inference_kitti_tlt.txt (2.3 KB)
• How to reproduce the issue? My pipeline: my.ipynb (15.9 KB)
I took the pre-trained TrafficCamNet model and want to run inference using:
!tao detectnet_v2 inference -e $SPECS_DIR/detectnet_v2_inference_kitti_tlt.txt \
-o $USER_EXPERIMENT_DIR/tlt_infer_testing \
-i $DATA_DOWNLOAD_DIR/data \
-k $KEY
I have the following directory structure:
• TrafficCamNet is LOCAL_PROJECT_DIR
• TrafficCamNet/data is LOCAL_DATA_DIR
• TrafficCamNet/specs is LOCAL_SPECS_DIR
• TrafficCamNet/tlt_trafficcamnet_vunpruned_v1.0 is the local path of the downloaded pre-trained TrafficCamNet model
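In case the path mapping matters: the local directories are exposed to the container through the usual ~/.tao_mounts.json layout. The paths below are illustrative placeholders (not my exact ones), just to show how LOCAL_PROJECT_DIR relates to the $USER_EXPERIMENT_DIR / $DATA_DOWNLOAD_DIR paths the command uses inside the container:

```json
{
    "Mounts": [
        {
            "source": "/home/user/TrafficCamNet",
            "destination": "/workspace/tao-experiments"
        },
        {
            "source": "/home/user/TrafficCamNet/specs",
            "destination": "/workspace/tao-experiments/specs"
        }
    ]
}
```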
And I get this error:
/usr/local/lib/python3.6/dist-packages/keras/engine/saving.py:292: UserWarning: No training configuration found in save file: the model was *not* compiled. Compile it manually.
warnings.warn('No training configuration found in save file: '
_________________________________________________________________
Layer (type) Output Shape Param #
=================================================================
input_1 (InputLayer) (None, 3, 384, 1248) 0
_________________________________________________________________
model_1 (Model) multiple 11558548
=================================================================
Total params: 11,558,548
Trainable params: 11,546,900
Non-trainable params: 11,648
_________________________________________________________________
2022-01-25 16:40:18,236 [INFO] __main__: Initialized model
2022-01-25 16:40:18,239 [INFO] __main__: Commencing inference
0%| | 0/94 [00:00<?, ?it/s]
Traceback (most recent call last):
File "/opt/tlt/.cache/dazel/_dazel_tlt/75913d2aee35770fa76c4a63d877f3aa/execroot/ai_infra/bazel-out/k8-fastbuild/bin/magnet/packages/iva/build_wheel.runfiles/ai_infra/iva/detectnet_v2/scripts/inference.py", line 210, in <module>
File "/opt/tlt/.cache/dazel/_dazel_tlt/75913d2aee35770fa76c4a63d877f3aa/execroot/ai_infra/bazel-out/k8-fastbuild/bin/magnet/packages/iva/build_wheel.runfiles/ai_infra/iva/detectnet_v2/scripts/inference.py", line 206, in main
File "/opt/tlt/.cache/dazel/_dazel_tlt/75913d2aee35770fa76c4a63d877f3aa/execroot/ai_infra/bazel-out/k8-fastbuild/bin/magnet/packages/iva/build_wheel.runfiles/ai_infra/iva/detectnet_v2/scripts/inference.py", line 159, in inference_wrapper_batch
File "/opt/tlt/.cache/dazel/_dazel_tlt/75913d2aee35770fa76c4a63d877f3aa/execroot/ai_infra/bazel-out/k8-fastbuild/bin/magnet/packages/iva/build_wheel.runfiles/ai_infra/iva/detectnet_v2/postprocessor/bbox_handler.py", line 245, in bbox_preprocessing
File "/opt/tlt/.cache/dazel/_dazel_tlt/75913d2aee35770fa76c4a63d877f3aa/execroot/ai_infra/bazel-out/k8-fastbuild/bin/magnet/packages/iva/build_wheel.runfiles/ai_infra/iva/detectnet_v2/postprocessor/bbox_handler.py", line 271, in abs_bbox_converter
File "/usr/local/lib/python3.6/dist-packages/addict/addict.py", line 64, in __getitem__
if name not in self:
TypeError: unhashable type: 'slice'
2022-01-25 18:40:24,460 [INFO] tlt.components.docker_handler.docker_handler: Stopping container.
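For what it's worth, the final TypeError itself is easy to reproduce in isolation: addict's Dict does `if name not in self:` inside __getitem__, and a dict membership test hashes the key, so indexing with a slice blows up. A minimal sketch (this AttrDict is a hypothetical stand-in, not the actual TAO code):

```python
# Simplified stand-in (an assumption, not the real addict/TAO code) for the
# check at addict/addict.py line 64 in the traceback. Dict membership tests
# hash the key; slice objects are unhashable on the Python 3.6 inside the
# container (slices only became hashable in Python 3.12), so indexing a
# dict-like object with a slice raises exactly this TypeError.

class AttrDict(dict):
    """Sketch of addict.Dict.__getitem__ behaviour."""
    def __getitem__(self, name):
        if name not in self:  # hashes `name`; a slice fails right here
            return AttrDict()
        return super().__getitem__(name)

frames = AttrDict(a=1, b=2)

err = None
try:
    frames[0:2]  # slicing something that is a Dict, not a list/array
except TypeError as e:
    err = e
print(err)  # on Python < 3.12: unhashable type: 'slice'
```

So bbox_handler.py seems to be slicing an addict Dict where it expects a list or array of frames, which makes me suspect my spec file or my input data layout produced the wrong structure — but that is only my guess.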
I think I made a lot of mistakes. I would be grateful for any help.