Detectnet Custom Model Inference - config file error

1 - I am following "Object Detection with DetectNetv2" in the Isaac SDK 2020.2 documentation.
[object-detection-with-detect-net]

2 - I am now at the "Inference on custom models" step, but have run into an issue when passing the config file for the custom model.

3 - I am trying to run the command below with the new custom-model config file, as directed in the "Inference on custom models" section.

bazel run packages/detect_net/apps:detect_net_inference_app -- --mode sim

4 - When I add --config packages/detect_net/apps/detect_net_custom.config.json to the above command, I get the following error.


Traceback (most recent call last):
  File "/home/x/.cache/bazel/_bazel_x/5c596d5d74171eca74d35356441b45c0/execroot/com_nvidia_isaac_sdk/bazel-out/k8-opt/bin/packages/detect_net/apps/detect_net_inference_app.runfiles/com_nvidia_isaac_sdk/packages/detect_net/apps/detect_net_inference_app.py", line 181, in <module>
    main(args)
  File "/home/x/.cache/bazel/_bazel_x/5c596d5d74171eca74d35356441b45c0/execroot/com_nvidia_isaac_sdk/bazel-out/k8-opt/bin/packages/detect_net/apps/detect_net_inference_app.runfiles/com_nvidia_isaac_sdk/packages/detect_net/apps/detect_net_inference_app.py", line 102, in main
    app.load(args.config)
  File "/home/x/.cache/bazel/_bazel_x/5c596d5d74171eca74d35356441b45c0/execroot/com_nvidia_isaac_sdk/bazel-out/k8-opt/bin/packages/detect_net/apps/detect_net_inference_app.runfiles/com_nvidia_isaac_sdk/packages/pyalice/Application.py", line 486, in load
    raise ValueError('"{}" is not a valid file for subgraph.'.format(filename))
ValueError: "packages/detect_net/apps/detect_net_custom.config.json" is not a valid file for subgraph.


5 - When I instead run it with --detection_model and --etlt_password (providing the appropriate model and password), it works, but the detected object is of course labeled "dolly" (the custom object itself is detected correctly).

6 - Below is the custom model config file (identical to the dolly config except for the custom-model settings):


{
  "config": {
    "detect_net_inference.tensor_r_t_inference": {
      "isaac.ml.TensorRTInference": {
        "model_file_path": "/home/user/models/resnet18_detector_custom.etlt",
        "etlt_password": "mypassword",
        "force_engine_update": false
      }
    },
    "detect_net_inference.detection_decoder": {
      "isaac.detect_net.DetectNetDecoder": {
        "labels": ["CustomModelLabel"],
        "non_maximum_suppression_threshold": 0.3,
        "confidence_threshold": 0.35
      }
    }
  }
}
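Before pointing at the config, it can help to confirm that the file both parses as JSON and is actually visible from the directory the app runs in (`bazel run` executes from the runfiles tree, not from the source checkout, so a relative path that resolves in your shell may not resolve for the app). A small illustrative check — the helper name and messages are my own, not part of the Isaac SDK:

```python
import json
import os


def check_config(path):
    """Diagnose why a --config file might be rejected.

    Illustrative only: a relative path is resolved against the app's
    working directory (the runfiles tree under bazel run), so a file
    that exists in the source checkout may still be "not found" here.
    """
    if not os.path.isfile(path):
        return "not found (check the working directory or use an absolute path)"
    try:
        with open(path) as f:
            json.load(f)
    except json.JSONDecodeError as e:
        return "invalid JSON: {}".format(e)
    return "ok"
```

Running this from the same working directory as the app shows whether the relative path resolves at all, independent of whether the JSON itself is well formed.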

Thank you for your help.

The issue here is likely that the config file is specified with a relative path but is not included in the Bazel target packaged with detect_net_inference_app, so the application cannot find it in its runfiles at runtime. As a quick check, try providing an absolute path so the application can find it now, just to see if that unblocks you. Then you can add it as a "data" dependency of the target so it is packaged with the app and can be referenced with a relative path again.
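For reference, the "data" dependency would look roughly like this in packages/detect_net/apps/BUILD. This is only a sketch: the exact rule name (I am assuming an isaac_py_app macro here) and the other attributes in your BUILD file may differ, so merge the data entry into the existing target rather than replacing it.

```python
isaac_py_app(
    name = "detect_net_inference_app",
    srcs = ["detect_net_inference_app.py"],
    data = [
        "detect_net_inference.config.json",
        # Add the custom config so it lands in the app's runfiles:
        "detect_net_custom.config.json",
    ],
    # ...other attributes unchanged
)
```

After rebuilding, the file is staged into the runfiles tree and the original relative --config path should resolve.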

Thanks. Will try later.