Improving Automated Hyperparameter Optimization with Case-Based Reasoning

Maximilian Hoffmann, Ralph Bergmann

In: Mark T. Keane, Nirmalie Wiratunga (Eds.): Case-Based Reasoning Research and Development. International Conference on Case-Based Reasoning (ICCBR). Springer, Cham, 2022.


The hyperparameter configuration of machine learning models has a great influence on their performance. These hyperparameters are often set either manually, based on the experience of an expert, or by an automated Hyperparameter Optimization (HPO) method. However, integrating experiential knowledge into HPO methods is challenging. Therefore, we propose the approach HypOCBR (Hyperparameter Optimization with Case-Based Reasoning), which uses Case-Based Reasoning (CBR) to improve the optimization of hyperparameters. HypOCBR is used as an addition to HPO methods and builds up a case base of sampled hyperparameter vectors together with their loss values. Given a query vector, the case base is used to retrieve similar hyperparameter vectors and to decide whether to proceed with trialing this query or to abort and sample another vector. The experimental evaluation investigates the suitability of HypOCBR for two deep learning setups of varying complexity. It shows its potential to improve the optimization results, especially in complex scenarios with limited optimization time.
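The retrieve-then-decide loop described above can be sketched in a few lines. This is a minimal illustration, not the paper's implementation: the Euclidean similarity measure, the k-nearest-neighbor abort rule, and names such as should_trial and loss_threshold are assumptions made here for the example.

```python
import math
import random


def similarity(a, b):
    # Assumed similarity: inverse Euclidean distance between
    # two (numeric) hyperparameter vectors.
    return 1.0 / (1.0 + math.dist(a, b))


def should_trial(case_base, query, k=3, loss_threshold=0.5):
    """Decide whether a sampled vector is worth a full trial.

    Retrieves the k cases most similar to the query and aborts
    if their mean loss exceeds the threshold (hypothetical rule)."""
    if len(case_base) < k:
        return True  # too little experience yet; always trial
    neighbors = sorted(case_base,
                       key=lambda case: similarity(case[0], query),
                       reverse=True)[:k]
    mean_loss = sum(loss for _, loss in neighbors) / k
    return mean_loss <= loss_threshold


def optimize(objective, sample, budget=50):
    # Case base of (hyperparameter vector, loss) pairs, built up
    # as trials complete; aborted queries cost no objective call.
    case_base = []
    best = (None, float("inf"))
    for _ in range(budget):
        query = sample()
        if not should_trial(case_base, query):
            continue  # abort: similar past vectors performed poorly
        loss = objective(query)
        case_base.append((query, loss))
        if loss < best[1]:
            best = (query, loss)
    return best
```

In a real setup, `objective` would train and validate a model, so each avoided trial saves a full training run; the sketch only conveys where the CBR retrieval step plugs into an HPO sampling loop.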


Deutsches Forschungszentrum für Künstliche Intelligenz (German Research Center for Artificial Intelligence)