Bias Variance Decompositions using XGBoost

Originally published at: Bias Variance Decompositions using XGBoost | NVIDIA Technical Blog

This blog dives into a theoretical machine learning concept called the bias-variance decomposition. This decomposition is a method that examines the expected generalization error for a given learning algorithm and a given data source. It helps us answer questions like: How can I achieve higher accuracy with my model without overfitting? Why are my…
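As a rough sketch of the idea, the expected squared error of a learner on a fresh test point decomposes as bias² + variance + irreducible noise. The snippet below estimates each term by simulation, using a polynomial fit as a simple stand-in learner (the true function `f`, the noise level, and the polynomial degree are all assumptions chosen for illustration, not anything from the original post, which uses XGBoost):

```python
import numpy as np

rng = np.random.default_rng(0)

def f(x):
    # Assumed "true" target function for the simulation
    return np.sin(2 * np.pi * x)

noise_sd = 0.3                       # std. dev. of label noise
x_test = np.linspace(0, 1, 50)       # fixed evaluation points
n_train, n_datasets, degree = 30, 500, 3

# Train the learner on many independent datasets drawn from the same source
preds = np.empty((n_datasets, x_test.size))
for i in range(n_datasets):
    x = rng.uniform(0, 1, n_train)
    y = f(x) + rng.normal(0, noise_sd, n_train)
    coefs = np.polyfit(x, y, degree)          # stand-in learner
    preds[i] = np.polyval(coefs, x_test)

mean_pred = preds.mean(axis=0)
bias_sq = ((mean_pred - f(x_test)) ** 2).mean()   # squared bias, averaged over x
variance = preds.var(axis=0).mean()               # variance of the learner
noise = noise_sd ** 2                             # irreducible error

# Directly estimate the expected squared error on noisy test labels
y_test = f(x_test) + rng.normal(0, noise_sd, (n_datasets, x_test.size))
expected_mse = ((preds - y_test) ** 2).mean()

print(f"bias^2 + variance + noise = {bias_sq + variance + noise:.4f}")
print(f"expected MSE              = {expected_mse:.4f}")
```

The two printed quantities should agree up to simulation error, which is the content of the decomposition: a high-variance learner can be improved by regularization or averaging, while a high-bias learner needs more capacity.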