Hyperparameter optimization: Foundations, algorithms, best practices, and open challenges
WIREs Data Mining and Knowledge Discovery (IF 7.8), Pub Date: 2023-01-16, DOI: 10.1002/widm.1484
Bernd Bischl, Martin Binder, Michel Lang, Tobias Pielok, Jakob Richter, Stefan Coors, Janek Thomas, Theresa Ullmann, Marc Becker, Anne-Laure Boulesteix, Difan Deng, Marius Lindauer

Most machine learning algorithms are configured by a set of hyperparameters whose values must be carefully chosen and which often considerably impact performance. To avoid a time-consuming and irreproducible manual trial-and-error search for well-performing hyperparameter configurations, various automatic hyperparameter optimization (HPO) methods, for example those based on resampling error estimation for supervised machine learning, can be employed. After introducing HPO from a general perspective, this paper reviews important HPO methods, from simple techniques such as grid or random search to more advanced methods like evolution strategies, Bayesian optimization, Hyperband, and racing. It gives practical recommendations regarding important choices to be made when conducting HPO, including the HPO algorithms themselves, performance evaluation, how to combine HPO with machine learning pipelines, runtime improvements, and parallelization.
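To make the setting concrete, below is a minimal sketch (an editorial illustration, not code from the paper) of one of the simplest HPO methods the abstract mentions: random search over an SVM's hyperparameters, with 5-fold cross-validation serving as the resampling-based error estimate, using scikit-learn. The search space, evaluation budget, and dataset are arbitrary choices for demonstration.

# Minimal random-search HPO sketch (illustrative; not from the paper).
# C and gamma are sampled log-uniformly, a common choice for SVMs, and
# each sampled configuration is scored by 5-fold cross-validation
# (resampling error estimation).
from scipy.stats import loguniform
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import RandomizedSearchCV
from sklearn.svm import SVC

X, y = load_breast_cancer(return_X_y=True)

param_distributions = {
    "C": loguniform(1e-3, 1e3),      # regularization strength
    "gamma": loguniform(1e-4, 1e1),  # RBF kernel width
}

search = RandomizedSearchCV(
    SVC(),
    param_distributions,
    n_iter=50,       # evaluation budget: 50 sampled configurations
    cv=5,            # 5-fold cross-validation per configuration
    n_jobs=-1,       # parallelize evaluations across cores
    random_state=0,  # make the search reproducible
)
search.fit(X, y)
print(search.best_params_)
print(search.best_score_)

The n_jobs=-1 setting is a small instance of the parallelization the abstract refers to: random search parallelizes trivially because the sampled configurations are evaluated independently of one another.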

Last updated: 2023-01-16