AIAA - Bring Your Own Inference

Hello,

According to the “Bring Your Own Inference” documentation for Clara Train AIAA (Bring your own Inference — Clara Train SDK v3.1 documentation), we have to place custom_inference.py in the lib folder.
I have created a lib folder in my workspace (alongside other folders like data, mmar, logs, transforms, sessions, etc.) and placed the custom_inference.py file there. I also changed the config_aiaa.json file of the liver model (which is in the mmars folder). The models get uploaded to the AIAA server, but when I run inference I get an error. The error in aiaa.log is “Request for unknown model: ‘custom_inference’ is not found”. Could you please tell me why the file is not found?

Hello,

What are you writing in your config?
Is it something like custom_inference.[YourClassName]?

You might need to restart the server for it to pick up files from that path.
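
For example, if your file is lib/custom_inference.py and the class inside it is CustomInference, the inference section of config_aiaa.json should reference custom_inference.CustomInference. A minimal sketch of that section, shown here as the parsed dict the server logs (the comments are my reading of it, not official docs):

# "inference" section of config_aiaa.json, as the AIAA server parses and logs it.
# "name" is <module in lib/>.<class name>; both parts must match the file name
# and the class name exactly.
inference_section = {
    "inference": {
        "name": "custom_inference.CustomInference",  # lib/custom_inference.py -> class CustomInference
        "args": {},  # keyword arguments passed to the class constructor
    }
}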

Hello,
Thank you for your reply.

After restarting the server so it picks up that path, I am now unable even to upload models. I get the error below:

[INFO] (nvmidl.apps.aas.www.api.api_admin) - Model: clara_ct_seg_liver_and_tumor_no_amp
[INFO] (nvmidl.apps.aas.www.api.api_admin) - Reading Config from Input Files
[INFO] (nvmidl.apps.aas.www.api.api_admin) - Input Model Config:
{
    'version': '2',
    'type': 'segmentation',
    'labels': ['liver', 'liver tumor'],
    'description': 'A pre-trained model for volumetric (3D) segmentation of the liver and lesion in portal venous phase CT image',
    'pre_transforms': [
        {'name': 'LoadNifti', 'args': {'fields': 'image'}},
        {'name': 'ConvertToChannelsFirst', 'args': {'fields': 'image'}},
        {'name': 'ScaleByResolution', 'args': {'fields': 'image', 'target_resolution': [1.0, 1.0, 1.0]}},
        {'name': 'ScaleIntensityRange', 'args': {'fields': 'image', 'a_min': -21, 'a_max': 189, 'b_min': 0.0, 'b_max': 1.0, 'clip': True}}
    ],
    'inference': {'name': 'custom_inference.CustomInference', 'args': {}},
    'post_transforms': [
        {'name': 'ArgmaxAcrossChannels', 'args': {'fields': 'model'}},
        {'name': 'FetchExtremePoints', 'args': {'image_field': 'image', 'label_field': 'model', 'points': 'points'}},
        {'name': 'CopyProperties', 'args': {'fields': ['model'], 'from_field': 'image', 'properties': ['affine']}},
        {'name': 'RestoreOriginalShape', 'args': {'field': 'model', 'src_field': 'image', 'is_label': True}}
    ],
    'writer': {'name': 'WriteNifti', 'args': {'field': 'model', 'dtype': 'uint8'}}
}

[INFO] (nvmidl.apps.aas.www.api.api_admin) - Saved archive file as: /workspace/downloads/0/model.trt.pb
[INFO] (nvmidl.apps.aas.actions.model_import) - Model: clara_ct_seg_liver_and_tumor_no_amp
[INFO] (nvmidl.apps.aas.actions.model_import) - Unpack Archive: /workspace/downloads/0/model.trt.pb
[INFO] (nvmidl.apps.aas.actions.model_import) - Model Dir: /workspace/downloads/0/clara_ct_seg_liver_and_tumor_no_amp
[INFO] (nvmidl.apps.aas.actions.model_import) - ++ Model Format: TRT
[INFO] (nvmidl.apps.aas.actions.model_import) - ++ Model Path: /workspace/downloads/0/clara_ct_seg_liver_and_tumor_no_amp/model.trt.pb
[INFO] (nvmidl.apps.aas.actions.model_import_trtis) - Model file: /workspace/downloads/0/clara_ct_seg_liver_and_tumor_no_amp/model.graphdef

[ERROR] (nvmidl.apps.aas.actions.model_import_trtis) - Gen Triton model failed
Traceback (most recent call last):
File "apps/aas/actions/model_import_trtis.py", line 103, in save_model
File "apps/aas/actions/model_import_trtis.py", line 178, in _import_model_v2
AttributeError: 'NoneType' object has no attribute 'get'

[ERROR] (nvmidl.apps.aas.www.api.api_admin) - (5, 'Failed to export model to TRTIS')
Traceback (most recent call last):
File "apps/aas/actions/model_import_trtis.py", line 103, in save_model
File "apps/aas/actions/model_import_trtis.py", line 178, in _import_model_v2
AttributeError: 'NoneType' object has no attribute 'get'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
File "/usr/local/lib/python3.6/dist-packages/flask/app.py", line 1950, in full_dispatch_request
rv = self.dispatch_request()
File "/usr/local/lib/python3.6/dist-packages/flask/app.py", line 1936, in dispatch_request
return self.view_functions[rule.endpoint](**req.view_args)
File "apps/aas/www/api/api_admin.py", line 201, in admin_model_load
File "apps/aas/actions/model_import.py", line 80, in model_import_local
File "apps/aas/actions/model_import.py", line 163, in _import_model
File "apps/aas/actions/model_import_trtis.py", line 122, in save_model
nvmidl.apps.aas.utils.aiaa_exception.AIAAException: (5, 'Failed to export model to TRTIS')

=========================================================================================================
My “CustomInference” class is the same as in the example; the only thing I have changed is the network_config, as shown below:

network_config = {
    "name": "SegAhnet",
    "args": {
        "input_channels": 1,
        "output_channels": 3
    }
}
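
For completeness, here is a skeleton of what my custom_inference.py looks like. Apart from network_config it mirrors the docs example, and the inference() signature below is only paraphrased from that example, so treat it as approximate:

# Skeleton of lib/custom_inference.py. Everything except network_config
# follows the "Bring your own Inference" docs example; the inference()
# signature is paraphrased from that example, not copied.

class CustomInference:
    def __init__(self):
        # SegAhnet with 1 input channel (CT) and 3 output channels
        # (background, liver, liver tumor).
        self.network_config = {
            "name": "SegAhnet",
            "args": {
                "input_channels": 1,
                "output_channels": 3,
            },
        }

    def inference(self, name, data, config, triton_config):
        # Build the network from self.network_config, run it on `data`,
        # and return the result, as in the docs example.
        raise NotImplementedError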

Hello,
Any help regarding this will be appreciated. Thank you.

Hello,

If you are running Clara Train v3.1, you should follow: Bring your own Inference — Clara Train SDK v3.1 documentation.
Note that this example does NOT use Triton/TRTIS as the backend.
I notice you are using the pre-trained models from NGC; I would suggest simply loading that model without writing a custom inference, as sketched below.
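
As a sketch of what I mean (the admin endpoint and payload follow the usual AIAA pattern for pulling models from NGC, but the host/port and the exact NGC path here are assumptions; adjust them for your setup):

# Load the NGC pre-trained liver/tumor model directly into AIAA via the
# admin API, without any custom inference. Assumes the server listens on
# localhost:5000 and the model is published under nvidia/med on NGC.
import requests

model = "clara_ct_seg_liver_and_tumor_no_amp"
resp = requests.put(
    f"http://127.0.0.1:5000/admin/model/{model}",
    json={"path": f"nvidia/med/{model}", "version": "1"},
)
print(resp.status_code, resp.text)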

If you are running the 4.0 container, you should check: AI-Assisted Annotation — Clara Train SDK v4.0 documentation.
The list of pre-trained models can be found here: NVIDIA NGC