Distributed training support

Hello, does Clara Train support distributed training? I know MONAI supports it, and since Clara Train v4.0 is based on MONAI I assume it can do distributed training too, but I did not find any confirmation in the Clara Train docs.

Hi
Thanks for your interest in the Clara Train SDK.
You are correct: Clara Train v4 supports all features in MONAI. We have examples in the getting-started notebooks that show multi-GPU training; multi-node examples are still a work in progress.
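For reference, here is a minimal sketch of what multi-GPU training looks like at the PyTorch level, since MONAI-based distributed training builds on `torch.distributed` and `DistributedDataParallel`. The toy model, dataset, and hyperparameters below are placeholders for illustration only, not the actual code from the getting-started notebooks.

```python
# Minimal multi-GPU training sketch with DistributedDataParallel (DDP).
# Launch with one process per GPU, e.g.: torchrun --nproc_per_node=2 train_ddp.py
import os
import torch
import torch.distributed as dist
from torch.nn.parallel import DistributedDataParallel as DDP
from torch.utils.data import TensorDataset, DataLoader
from torch.utils.data.distributed import DistributedSampler


def main():
    # torchrun sets RANK, LOCAL_RANK, and WORLD_SIZE for each process.
    dist.init_process_group(backend="nccl")
    local_rank = int(os.environ["LOCAL_RANK"])
    torch.cuda.set_device(local_rank)

    # Toy dataset standing in for a real imaging pipeline.
    images = torch.randn(64, 1, 32, 32)
    labels = torch.randint(0, 2, (64,))
    dataset = TensorDataset(images, labels)
    sampler = DistributedSampler(dataset)  # shards the data across ranks
    loader = DataLoader(dataset, batch_size=8, sampler=sampler)

    # Toy classifier standing in for a real network.
    model = torch.nn.Sequential(
        torch.nn.Conv2d(1, 8, 3, padding=1),
        torch.nn.ReLU(),
        torch.nn.AdaptiveAvgPool2d(1),
        torch.nn.Flatten(),
        torch.nn.Linear(8, 2),
    ).cuda(local_rank)
    model = DDP(model, device_ids=[local_rank])  # synchronizes gradients across GPUs

    optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
    loss_fn = torch.nn.CrossEntropyLoss()

    for epoch in range(2):
        sampler.set_epoch(epoch)  # reshuffle shards each epoch
        for x, y in loader:
            x, y = x.cuda(local_rank), y.cuda(local_rank)
            optimizer.zero_grad()
            loss = loss_fn(model(x), y)
            loss.backward()
            optimizer.step()

    dist.destroy_process_group()


if __name__ == "__main__":
    main()
```

On older PyTorch versions the same script can be launched with `python -m torch.distributed.launch` instead of `torchrun`. The getting-started notebooks cover how this is wired up inside Clara Train.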

Hope this helps