
Publication

HANF: Hyperparameter And Neural Architecture Search in Federated Learning

Jonas Seng; Pooja Prasad; Devendra Singh Dhami; Kristian Kersting
In: Computing Research Repository (CoRR), Vol. abs/2206.12342, Pages 0-10, arXiv, 2022.

Abstract

Deep neural architectures have a profound impact on the performance achieved in many of today's AI tasks, yet their design still relies heavily on human prior knowledge and experience. Neural architecture search (NAS) together with hyperparameter optimization (HO) helps to reduce this dependence. However, state-of-the-art NAS and HO rapidly become infeasible as increasing amounts of data are stored in a distributed fashion, since centralizing such data typically violates data privacy regulations such as GDPR and CCPA. As a remedy, we introduce FEATHERS - FEderated ArchiTecture and HypERparameter Search, a method that not only jointly optimizes neural architectures and optimization-related hyperparameters in distributed data settings, but also adheres to data privacy through the use of differential privacy (DP). We show that FEATHERS efficiently optimizes architectural and optimization-related hyperparameters alike, while demonstrating convergence on classification tasks at no detriment to model performance when complying with privacy constraints.
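To make the idea concrete, below is a minimal, hypothetical Python sketch of the kind of protocol the abstract describes: clients jointly optimize model weights and relaxed architecture parameters (a DARTS-style softmax over candidate operations), and the server aggregates clipped client updates with Gaussian noise, in the spirit of DP-FedAvg. The toy objective, all variable names, and the noise calibration are illustrative assumptions, not the authors' implementation.

import numpy as np

rng = np.random.default_rng(0)

NUM_CLIENTS = 4
ROUNDS = 5
CLIP_NORM = 1.0   # per-client update clipping bound (DP, assumed)
NOISE_STD = 0.1   # server-side Gaussian noise scale (DP, assumed)
LR = 0.5

# Shared state: model weights w and architecture-mixing logits alpha
# (a DARTS-style relaxation over two candidate operations).
w = np.zeros(3)
alpha = np.zeros(2)

# Each client holds a private optimum for w; the loss is a toy quadratic.
client_targets = [rng.normal(size=3) for _ in range(NUM_CLIENTS)]

def local_update(w, alpha, target):
    """One local gradient step on a toy quadratic loss.

    softmax(alpha) mixes two candidate 'operations' (identity and a
    0.5-scaling of w), mimicking a relaxed architecture choice.
    """
    mix = np.exp(alpha) / np.exp(alpha).sum()
    pred = mix[0] * w + mix[1] * 0.5 * w
    resid = pred - target
    grad_w = resid * (mix[0] + 0.5 * mix[1])
    # Gradient w.r.t. alpha, chained through the softmax Jacobian.
    g = np.array([resid @ w, resid @ (0.5 * w)])
    grad_alpha = g * mix - (g @ mix) * mix
    return -LR * grad_w, -LR * grad_alpha

for rnd in range(ROUNDS):
    w_updates, a_updates = [], []
    for t in client_targets:
        dw, da = local_update(w, alpha, t)
        # Clip the joint update to bound each client's contribution (DP).
        upd = np.concatenate([dw, da])
        upd *= min(1.0, CLIP_NORM / (np.linalg.norm(upd) + 1e-12))
        w_updates.append(upd[:3])
        a_updates.append(upd[3:])
    # Server: average clipped updates and add Gaussian noise before
    # applying them to the shared model and architecture parameters.
    w += np.mean(w_updates, axis=0) + rng.normal(0, NOISE_STD, 3) / NUM_CLIENTS
    alpha += np.mean(a_updates, axis=0) + rng.normal(0, NOISE_STD, 2) / NUM_CLIENTS
    print(f"round {rnd}: operation mix = {np.exp(alpha) / np.exp(alpha).sum()}")

In this toy setup the printed operation mix drifts toward the candidate operation that best fits the clients' data, while clipping and noise keep each client's influence bounded; the actual method in the paper optimizes real network architectures and optimization hyperparameters under a formal DP guarantee.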

Further Links