Deep Learning Hyperparameter Optimization with Competing Objectives

Originally published at: Deep Learning Hyperparameter Optimization with Competing Objectives | NVIDIA Technical Blog

In this post we’ll show how to use SigOpt’s Bayesian optimization platform to jointly optimize competing objectives in deep learning pipelines on NVIDIA GPUs more than ten times faster than traditional approaches like random search.

[Figure: screenshot of the SigOpt web dashboard, where users track the progress of their machine learning model optimization.]

Measuring how well a model…

Comparing an intelligent algorithm against random search seems unfair.
Could you show a comparison of your algorithm against other multi-objective optimization algorithms, such as MO-CMA-ES and NSGA-II?
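For context on what such a comparison would measure: multi-objective optimizers (whether NSGA-II, MO-CMA-ES, or a Bayesian method) are typically scored by the quality of the Pareto front of non-dominated trade-offs they find for a given evaluation budget. A minimal sketch of that scoring idea, using a random-search baseline with two hypothetical toy objectives (error vs. cost, not the blog's actual pipeline):

```python
import random

def pareto_front(points):
    """Keep the points not dominated by any other point,
    minimizing every objective simultaneously."""
    return [
        p for p in points
        if not any(
            q != p and all(qi <= pi for qi, pi in zip(q, p))
            for q in points
        )
    ]

# Random-search baseline: sample a single toy hyperparameter and
# evaluate two competing objectives (hypothetical example only).
random.seed(0)
trials = []
for _ in range(50):
    lr = random.uniform(0.001, 1.0)   # toy "learning rate"
    error = (lr - 0.1) ** 2           # minimized near lr = 0.1
    cost = lr                         # smaller lr -> cheaper (toy model)
    trials.append((round(error, 4), round(cost, 4)))

front = pareto_front(trials)
```

Two methods can then be compared by how close their fronts get to the true trade-off curve under the same trial budget, which is where a ten-fold sample-efficiency claim would show up.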