FPENET dimensions export error

• Network Type: FPENET
• TAO Version: 4.0.1-tf1.15.5 / 5.0.0-tf1.15.5
• Training spec: experiment_spec.yaml (2.8 KB)

Hi, I am fine-tuning the pretrained FPENET model from NGC on a custom dataset with 10 keypoints and an input size of 160x160. Training works fine, but when I try to export the model to a .etlt file, I get an error. It is similar to the one in "Fpenet custom landmarks export error" - but even after adding the --input_dims flag as suggested in that thread, the export still fails.

When I run:

fpenet export -m $USER_EXPERIMENT_DIR/models/exp1/model.tlt -k $KEY --input_dims 1,160,160

I am getting the following output:

Traceback (most recent call last):
  File "</usr/local/lib/python3.6/dist-packages/driveix/fpenet/scripts/export.py>", line 3, in <module>
  File "<frozen driveix.fpenet.scripts.export>", line 251, in <module>
  File "<frozen driveix.fpenet.scripts.export>", line 247, in main
  File "<frozen driveix.fpenet.scripts.export>", line 239, in run_export
  File "<frozen driveix.fpenet.exporter.fpenet_exporter>", line 199, in export
  File "<frozen driveix.fpenet.exporter.fpenet_exporter>", line 124, in export_to_etlt
  File "<frozen driveix.common.utilities.tlt_utils>", line 461, in change_model_batch_size
  File "/usr/local/lib/python3.6/dist-packages/keras/engine/saving.py", line 492, in model_from_json
    return deserialize(config, custom_objects=custom_objects)
  File "/usr/local/lib/python3.6/dist-packages/keras/layers/__init__.py", line 55, in deserialize
    printable_module_name='layer')
  File "/usr/local/lib/python3.6/dist-packages/keras/utils/generic_utils.py", line 145, in deserialize_keras_object
    list(custom_objects.items())))
  File "/usr/local/lib/python3.6/dist-packages/keras/engine/network.py", line 1032, in from_config
    process_node(layer, node_data)
  File "/usr/local/lib/python3.6/dist-packages/keras/engine/network.py", line 991, in process_node
    layer(unpack_singleton(input_tensors), **kwargs)
  File "/usr/local/lib/python3.6/dist-packages/keras/engine/base_layer.py", line 431, in __call__
    self.build(unpack_singleton(input_shapes))
  File "<frozen driveix.fpenet.models.custom.softargmax>", line 78, in build
AssertionError
/usr/local/lib/python3.6/dist-packages/requests/__init__.py:91: RequestsDependencyWarning: urllib3 (1.26.5) or chardet (3.0.4) doesn't match a supported version!
  RequestsDependencyWarning)
Telemetry data couldn't be sent, but the command ran successfully.
[WARNING]: __init__() missing 4 required positional arguments: 'code', 'msg', 'hdrs', and 'fp'
Execution status: FAIL

When I train the model with an 80x80 input instead, the export works fine.

I also tried training and exporting the network with the 5.0.0-tf1.15.5 container. There I get a similar error when exporting the 160x160 network:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/nvidia_tao_tf1/cv/fpenet/scripts/export.py", line 318, in <module>
    main()
  File "/usr/local/lib/python3.8/dist-packages/nvidia_tao_tf1/cv/fpenet/scripts/export.py", line 314, in main
    raise e
  File "/usr/local/lib/python3.8/dist-packages/nvidia_tao_tf1/cv/fpenet/scripts/export.py", line 298, in main
    run_export(args)
  File "/usr/local/lib/python3.8/dist-packages/nvidia_tao_tf1/cv/fpenet/scripts/export.py", line 275, in run_export
    exporter.export(input_dims,
  File "/usr/local/lib/python3.8/dist-packages/nvidia_tao_tf1/cv/fpenet/exporter/fpenet_exporter.py", line 215, in export
    _, in_tensor_name, out_tensor_names = self.generate_exported_model(
  File "/usr/local/lib/python3.8/dist-packages/nvidia_tao_tf1/cv/fpenet/exporter/fpenet_exporter.py", line 140, in generate_exported_model
    new_model = change_model_batch_size(model,
  File "/usr/local/lib/python3.8/dist-packages/nvidia_tao_tf1/cv/common/utilities/tlt_utils.py", line 474, in change_model_batch_size
    new_model = keras.models.model_from_json(model.to_json(), custom_objects=custom_objects)
  File "/usr/local/lib/python3.8/dist-packages/keras/engine/saving.py", line 492, in model_from_json
    return deserialize(config, custom_objects=custom_objects)
  File "/usr/local/lib/python3.8/dist-packages/keras/layers/__init__.py", line 52, in deserialize
    return deserialize_keras_object(config,
  File "/usr/local/lib/python3.8/dist-packages/keras/utils/generic_utils.py", line 142, in deserialize_keras_object
    return cls.from_config(
  File "/usr/local/lib/python3.8/dist-packages/keras/engine/network.py", line 1032, in from_config
    process_node(layer, node_data)
  File "/usr/local/lib/python3.8/dist-packages/keras/engine/network.py", line 991, in process_node
    layer(unpack_singleton(input_tensors), **kwargs)
  File "/usr/local/lib/python3.8/dist-packages/keras/engine/base_layer.py", line 431, in __call__
    self.build(unpack_singleton(input_shapes))
  File "/usr/local/lib/python3.8/dist-packages/nvidia_tao_tf1/cv/fpenet/models/custom/softargmax.py", line 92, in build
    assert tuple(self._input_shape[1:]) == tuple(input_shape[1:])
AssertionError
Execution status: FAIL

I also printed the arguments of the assertion during the export, in
"/usr/local/lib/python3.8/dist-packages/nvidia_tao_tf1/cv/fpenet/models/custom/softargmax.py":
90	        print(tuple(input_shape[:])) 
91	        print(tuple(self._input_shape[:]))
92	        assert tuple(self._input_shape[1:]) == tuple(input_shape[1:])

The output is

(32, 10, 160, 160)
(32, 10, 160, 160)

(None, 10, 80, 80)
(32, 10, 160, 160)
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/nvidia_tao_tf1/cv/fpenet/scripts/export.py", line 318, in <module>
    main()
...

It seems that on the first call the assertion holds, but on the second call it fails: the rebuilt model passes an input shape of (None, 10, 80, 80) to the layer, while the layer's stored shape is (32, 10, 160, 160).
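For reference, the failing check boils down to a tuple comparison that ignores only the batch dimension. A minimal sketch of that check (the function name is mine, not the TAO source) reproduces the two calls above:

```python
# Hypothetical sketch of the shape check in Softargmax.build(), not the
# actual TAO source: the batch dimension (index 0) is ignored, all other
# dimensions must match the shape stored on the layer.
def shapes_compatible(stored_shape, incoming_shape):
    return tuple(stored_shape[1:]) == tuple(incoming_shape[1:])

# First build call: shapes agree, the assertion passes.
print(shapes_compatible((32, 10, 160, 160), (32, 10, 160, 160)))  # True
# Second call during model_from_json: the rebuilt graph still carries
# an 80x80 shape, so the assertion fails.
print(shapes_compatible((32, 10, 160, 160), (None, 10, 80, 80)))  # False
```

This matches the printed output: the spatial dimensions disagree (80x80 vs. 160x160), which is why only the non-default input size trips the assertion.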

I hope you can provide a workaround for exporting the network at sizes larger than 80x80.
Thanks!

Could you modify https://github.com/NVIDIA/tao_tensorflow1_backend/blob/d73dcfcc2191a91ae80616b7ffed7f55ac6692ef/nvidia_tao_tf1/cv/fpenet/exporter/fpenet_exporter.py#L139 and replace the file inside the docker?
Log in to the docker container and find fpenet_exporter.py.
Then replace it with the modified version.
I suggest backing up the original fpenet_exporter.py first.
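For anyone following along, the backup-then-replace step can be sketched like this. The container path mirrors the 5.0.0 traceback above; the demo runs in a temp directory so the commands are safe to try outside the container:

```shell
# Sketch of the backup-then-replace pattern. Inside the container the real
# file lives at:
#   /usr/local/lib/python3.8/dist-packages/nvidia_tao_tf1/cv/fpenet/exporter/fpenet_exporter.py
# Here we use placeholder files in a temp directory to illustrate the steps.
workdir=$(mktemp -d)
echo "original exporter" > "$workdir/fpenet_exporter.py"
echo "patched exporter"  > "$workdir/fpenet_exporter.patched.py"

# 1. Back up the original before touching it.
cp "$workdir/fpenet_exporter.py" "$workdir/fpenet_exporter.py.bak"
# 2. Swap in the modified version.
cp "$workdir/fpenet_exporter.patched.py" "$workdir/fpenet_exporter.py"

cat "$workdir/fpenet_exporter.py"      # now the patched file
cat "$workdir/fpenet_exporter.py.bak"  # untouched backup
```

With a running container you would do the edit via `docker exec -it <container> bash` (or copy a patched file in with `docker cp`); the container name and mount layout depend on how you launched TAO.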

Your proposed modification fixed the issue.
Thanks!

This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.