Hyperparameter tuning: the process of finding the combination of hyperparameters that maximizes model performance
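The simplest version of this search is random sampling of configurations. Below is a minimal pure-Python sketch; the `train` function and its hyperparameters (`lr`, `layers`) are hypothetical stand-ins for real model training:

```python
import random

def train(config):
    """Hypothetical objective: pretend this trains a model and returns a
    validation score (higher is better). Peaks at lr=0.01, layers=3."""
    return -(config["lr"] - 0.01) ** 2 - (config["layers"] - 3) ** 2

def random_search(n_trials, seed=0):
    """Sample n_trials random configs and keep the best-scoring one."""
    rng = random.Random(seed)
    best_config, best_score = None, float("-inf")
    for _ in range(n_trials):
        config = {"lr": rng.uniform(1e-4, 1e-1), "layers": rng.randint(1, 8)}
        score = train(config)
        if score > best_score:
            best_config, best_score = config, score
    return best_config, best_score

best, score = random_search(200)
```

Libraries like Optuna and Ray Tune replace this naive loop with smarter samplers and early stopping, but the interface (a config in, a score out) is the same idea.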
Optuna
Ray Tune
Ray Tune is a Python library for experiment execution and hyperparameter tuning at any scale. You can tune your favorite machine learning framework (PyTorch, XGBoost, Scikit-Learn, TensorFlow, Keras, and more) by running state-of-the-art algorithms such as Population Based Training (PBT) and HyperBand/ASHA.
Two Benefits
- They maximize model performance: e.g., DeepMind uses PBT to achieve superhuman performance on StarCraft; Waymo uses PBT to enable self-driving cars.
- They minimize training costs: HyperBand and ASHA converge to high-quality configurations in half the time taken by previous approaches; population-based data augmentation algorithms cut costs by orders of magnitude.
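The PBT idea referenced above can be sketched in a few lines of pure Python: train a population in parallel, then periodically have the worst members copy (exploit) the best members' weights and hyperparameters and perturb (explore) them. The `step` function below is a hypothetical toy training step, not DeepMind's or Ray's implementation:

```python
import random

def step(config, weight):
    """Hypothetical training step: move `weight` toward a target of 1.0 at a
    rate set by the hyperparameter `lr`; return new weight and score."""
    weight += config["lr"] * (1.0 - weight)
    return weight, -abs(1.0 - weight)  # score: higher is better

def pbt(pop_size=8, steps=20, seed=0):
    rng = random.Random(seed)
    population = [{"config": {"lr": rng.uniform(0.0, 0.5)}, "weight": 0.0}
                  for _ in range(pop_size)]
    for _ in range(steps):
        scored = []
        for member in population:
            member["weight"], score = step(member["config"], member["weight"])
            scored.append((score, member))
        scored.sort(key=lambda s: s[0])  # worst first
        # Exploit + explore: bottom quartile copies weights and config from
        # the top quartile, then perturbs the copied hyperparameter.
        q = max(1, pop_size // 4)
        for (_, loser), (_, winner) in zip(scored[:q], scored[-q:]):
            loser["weight"] = winner["weight"]
            loser["config"]["lr"] = winner["config"]["lr"] * rng.choice([0.8, 1.2])
    return max(population, key=lambda m: -abs(1.0 - m["weight"]))

best = pbt()
```

The key difference from plain search: hyperparameters change *during* training rather than being fixed per trial, so one run can follow a schedule no single static config could.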
cf) ASHA: a popular early-stopping algorithm
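ASHA is an asynchronous variant of successive halving, which is the core of HyperBand. The synchronous version is easy to sketch: evaluate all configs on a small budget, keep the best 1/eta fraction, multiply the budget by eta, and repeat. The `evaluate` function here is a hypothetical stand-in for partially training a model:

```python
def successive_halving(configs, budget_per_rung=1, eta=2):
    """Toy synchronous successive halving (ASHA drops the synchronization
    barrier between rungs; the promotion rule is the same)."""
    def evaluate(config, budget):
        # Hypothetical loss: converges toward |config - 0.7| as budget grows,
        # so the config nearest the (made-up) optimum 0.7 eventually wins.
        return abs(config - 0.7) + 1.0 / (1 + budget)

    budget = budget_per_rung
    survivors = list(configs)
    while len(survivors) > 1:
        losses = {c: evaluate(c, budget) for c in survivors}
        survivors.sort(key=lambda c: losses[c])          # best first
        survivors = survivors[: max(1, len(survivors) // eta)]  # keep top 1/eta
        budget *= eta                                    # survivors train longer
    return survivors[0]

best = successive_halving([0.1, 0.3, 0.5, 0.9, 0.7])  # → 0.7
```

The saving comes from spending almost all of the budget on configurations that survived cheap early rungs, instead of training every candidate to completion.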
In brief, Ray Tune scales your training from a single machine to a large distributed cluster without changing your code.
ref) https://neptune.ai/blog/hyperparameter-tuning-in-python-complete-guide