AttributeError: 'QDQ' object has no attribute 'data_format'

Hello there,
I am trying to train the resnet10 model on my own data using the Transfer Learning Toolkit. In the "Visualize Inference" section, it's giving the following error:
Traceback (most recent call last):
File "/usr/local/bin/tlt-infer", line 8, in
sys.exit(main())
File "/home/vpraveen/.cache/dazel/_dazel_vpraveen/715c8bafe7816f3bb6f309cd506049bb/execroot/ai_infra/bazel-out/k8-py3-fastbuild/bin/magnet/packages/iva/build_wheel.runfiles/ai_infra/iva/common/magnet_infer.py", line 54, in main
File "/home/vpraveen/.cache/dazel/_dazel_vpraveen/715c8bafe7816f3bb6f309cd506049bb/execroot/ai_infra/bazel-out/k8-py3-fastbuild/bin/magnet/packages/iva/build_wheel.runfiles/ai_infra/iva/detectnet_v2/scripts/inference.py", line 194, in main
File "/home/vpraveen/.cache/dazel/_dazel_vpraveen/715c8bafe7816f3bb6f309cd506049bb/execroot/ai_infra/bazel-out/k8-py3-fastbuild/bin/magnet/packages/iva/build_wheel.runfiles/ai_infra/iva/detectnet_v2/scripts/inference.py", line 117, in inference_wrapper_batch
File "/home/vpraveen/.cache/dazel/_dazel_vpraveen/715c8bafe7816f3bb6f309cd506049bb/execroot/ai_infra/bazel-out/k8-py3-fastbuild/bin/magnet/packages/iva/build_wheel.runfiles/ai_infra/iva/detectnet_v2/inferencer/tlt_inferencer.py", line 102, in network_init
AttributeError: 'QDQ' object has no attribute 'data_format'
Kindly guide me through it.
Thanks

Please share your detectnet_v2_inference_kitti_tlt.txt.
Did you run it with a Jupyter notebook? If yes, could you please save your Jupyter notebook as an HTML file and attach it here?

Thanks for your reply @Morganh.
jupyter notebook
detectnet_v2_inference_kitti_tlt

The file at https://drive.google.com/file/d/1zRgokTuLqWra9EYIWgKNydnPNEEmhSKv/view?usp=sharing is only 30 KB. I cannot see more details inside it. Is it broken? Could you check it again?

detectnet_v2.html

Could you try to run tlt-infer with fewer images? For example, you can copy several images to a new folder and run inference against it, along the lines of the sketch below. This experiment is to narrow down the issue.
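Something like this (all paths below are only illustrative; keep the same spec file and key you used before, and double-check the flags against your own notebook cell):

# Copy a handful of test images into a separate folder (example paths)
mkdir -p /workspace/transfer_learning/detectnet_v2/test_samples_small
cp /workspace/transfer_learning/detectnet_v2/test_samples/000000*.png \
   /workspace/transfer_learning/detectnet_v2/test_samples_small/

# Point tlt-infer at the smaller folder, reusing the same spec and key as before
tlt-infer detectnet_v2 \
    -e /workspace/transfer_learning/detectnet_v2/specs/detectnet_v2_inference_kitti_tlt.txt \
    -i /workspace/transfer_learning/detectnet_v2/test_samples_small \
    -o /workspace/transfer_learning/detectnet_v2/tlt_infer_testing_small \
    -k $KEY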

Now there are only 5 images in the folder and it's giving the same error:

Traceback (most recent call last):
File "/usr/local/bin/tlt-infer", line 8, in
sys.exit(main())
File "/home/vpraveen/.cache/dazel/_dazel_vpraveen/715c8bafe7816f3bb6f309cd506049bb/execroot/ai_infra/bazel-out/k8-py3-fastbuild/bin/magnet/packages/iva/build_wheel.runfiles/ai_infra/iva/common/magnet_infer.py", line 54, in main
File "/home/vpraveen/.cache/dazel/_dazel_vpraveen/715c8bafe7816f3bb6f309cd506049bb/execroot/ai_infra/bazel-out/k8-py3-fastbuild/bin/magnet/packages/iva/build_wheel.runfiles/ai_infra/iva/detectnet_v2/scripts/inference.py", line 194, in main
File "/home/vpraveen/.cache/dazel/_dazel_vpraveen/715c8bafe7816f3bb6f309cd506049bb/execroot/ai_infra/bazel-out/k8-py3-fastbuild/bin/magnet/packages/iva/build_wheel.runfiles/ai_infra/iva/detectnet_v2/scripts/inference.py", line 117, in inference_wrapper_batch
File "/home/vpraveen/.cache/dazel/_dazel_vpraveen/715c8bafe7816f3bb6f309cd506049bb/execroot/ai_infra/bazel-out/k8-py3-fastbuild/bin/magnet/packages/iva/build_wheel.runfiles/ai_infra/iva/detectnet_v2/inferencer/tlt_inferencer.py", line 102, in network_init
AttributeError: 'QDQ' object has no attribute 'data_format'

OK, it seems there is an issue when running tlt-infer against a QAT .tlt model. I will sync with the internal team.

I can confirm that this error does not occur when running tlt-infer against a non-QAT .tlt model.

I am not sure, but I think I'm working with the non-QAT .tlt model. Still, it's giving the error.

You were running tlt-infer with "/workspace/transfer_learning/detectnet_v2/experiment_dir_retrain/weights/resnet10_detector_pruned.tlt".
It was generated in section 6. You can check whether QAT is enabled inside the retraining spec, as shown below.
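One quick way to check (the spec path below is only an example; grep whichever spec file you passed to tlt-train in section 6):

# Look for the QAT flag inside the retraining spec
grep -n "enable_qat" /workspace/transfer_learning/detectnet_v2/specs/detectnet_v2_retrain_resnet18_kitti.txt
# A QAT-enabled spec contains a line like the following in its training_config block:
#   enable_qat: true
# If the flag is absent or set to false, the resulting .tlt model is a non-QAT model.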

I haven’t explicitly set enable_qat = true. In section 6 it is written:

Note: DetectNet_v2 now supports Quantization Aware Training to help with optimizing the model. By default, the training in the cell below doesn't run the model with QAT enabled. For information on training a model with QAT, please refer to the cells under section 11.
So I am assuming I am working with a non-QAT .tlt model.

OK, thanks for the info. The internal team will check the issue.
I will update you when there is more info.

@Morganh Thanks a lot for your help.

To unblock your inference, you can still generate an .etlt model or a TensorRT engine and run inference with DeepStream (DS).
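As a rough sketch (the model path, output path, and key are placeholders; adapt them to your experiment directories):

# Export the retrained .tlt model to an .etlt file
tlt-export detectnet_v2 \
    -m /workspace/transfer_learning/detectnet_v2/experiment_dir_retrain/weights/resnet10_detector_pruned.tlt \
    -o /workspace/transfer_learning/detectnet_v2/experiment_dir_final/resnet10_detector.etlt \
    -k $KEY \
    --data_type fp32
# The .etlt (or a TensorRT engine built from it, e.g. with tlt-converter) can then be
# referenced from a DeepStream nvinfer config to run inference outside of tlt-infer.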
