Validation metrics per class

I am trying to retrieve the validation metrics per class as outlined.
I can see in my run that the training metrics get printed. The problem happens during validation. I used the code exactly as given and modified it only to include the extra classes I have. The JSON file is correct. The error I get is (after removing the leading paths of the JSON file):
Config error in config_train_64C_ce_unet.json: Error parsing config_train_64C_ce_unet.json in JSON element validate.metrics.#1: Cannot find component class “MetricAverageFromArrayDice”
Am I doing something wrong or is this a bug?
I am using
Any help appreciated.
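As a quick first check when a config fails to parse, it can help to load the file with plain Python and list the metric component names it declares. A minimal sketch; the validate.metrics fragment is inlined as an example string here, but in practice you would json.load() your own config file:

```python
import json

# A validate.metrics fragment inlined for illustration; the component
# name mirrors the one from the error message in this thread.
config_text = """
{
  "validate": {
    "metrics": [
      {"name": "MetricAverageFromArrayDice",
       "args": {"name": "mean_dice", "applied_key": "model"}}
    ]
  }
}
"""

config = json.loads(config_text)  # raises json.JSONDecodeError on bad syntax
metric_names = [m["name"] for m in config["validate"]["metrics"]]
print(metric_names)
```

If the file parses cleanly but training still fails, the problem is the component class name itself rather than the JSON syntax.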

This FAQ was from an old version (V1), so as in
you should simply change MetricAverageFromArrayDice to ComputeAverageDice

Also, this FAQ was focused on showing each dice per label, so you realistically don't need this section:

    {
        "name": "MetricAverageFromArrayDice",
        "args": {
            "name": "mean_dice",
            "applied_key": "model"
        }
    }
You simply have all the info you need, and the average dice of all labels is the first one:

{"name": "MetricAverage", "args": {"name": "val_dice", "field": "dice"}},

I am sorry to bring this up again. I have a couple of remarks.

  1. This means that the documentation needs to be updated for MetricAverageFromArrayDice, as you suggested above.
  2. As you suggested, I removed it and replaced it. Sadly, I am now getting an error: "in JSON element validate.metrics.#1: Cannot find component class “MetricAverage”".
    Can you please help by providing a working example? It looks like the documentation is somewhat broken, or I am not able to comprehend it.
    Best regards,

I am not sure why you get an error, but this config should work for you: clara-train-examples/trn_base.json at master · NVIDIA/clara-train-examples · GitHub

You can then change the aux part to

    "aux_ops": [
        {
            "name": "DiceMaskedOutput",
            "args": {
                "is_onehot_targets": false,
                "skip_background": false,
                "is_independent_predictions": true,
                "tags": ["dice_all", "d00", "d01"],
                "do_summary": true,
                "do_print": false
            }
        }
    ]
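The "d00", "d01" tags are one per label; with more classes (the config in this thread had 64) the list can be generated rather than typed out. A minimal sketch; the tag naming is just the convention used in this thread, not something the framework requires:

```python
# Generate per-label dice tags ("d00", "d01", ...) plus the overall
# "dice_all" tag, following the naming convention used above.
num_classes = 4  # example value; adjust to your number of labels
tags = ["dice_all"] + [f"d{i:02d}" for i in range(num_classes)]
print(tags)
```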

then have the metric as

  "validate": {
    "metrics": [
      {"name": "ComputeAverage", "args": {"is_key_metric": true, "name": "mean_dice_all", "field": "dice_all"}},
      {"name": "ComputeAverage", "args": {"name": "val_d00", "field": "d00"}},
      {"name": "ComputeAverage", "args": {"name": "val_d01", "field": "d01"}}
    ]
  }
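With many labels, the per-class ComputeAverage entries can likewise be generated instead of written by hand. A sketch assuming the same d00/d01 tag convention as above:

```python
import json

# Build one ComputeAverage metric per per-label tag, plus a key metric
# for the overall "dice_all" field, matching the layout shown above.
tags = ["d00", "d01"]  # extend for more classes, e.g. up to "d63"
metrics = [{"name": "ComputeAverage",
            "args": {"is_key_metric": True, "name": "mean_dice_all", "field": "dice_all"}}]
metrics += [{"name": "ComputeAverage", "args": {"name": f"val_{t}", "field": t}}
            for t in tags]
print(json.dumps({"validate": {"metrics": metrics}}, indent=2))
```

The printed JSON can then be pasted into the validate section of the training config.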

Thanks, this worked. I am really not sure why, because earlier I was also following the same instructions; it might be some goof-up on my side, though I can't figure out what.
Thank you,