Speeding Up Semantic Segmentation Using MATLAB Container from NVIDIA NGC

Originally published at: https://developer.nvidia.com/blog/speeding-up-semantic-segmentation-matlab-nvidia-ngc/

Gone are the days of using a single GPU to train a deep learning model. With computationally intensive algorithms such as semantic segmentation, a single GPU can take days to optimize a model. But multi-GPU hardware is expensive, you say. Not any longer; NVIDIA multi-GPU hardware on cloud instances such as the AWS P3 allows you to pay…