Amirmohammad Farzaneh

Research Associate

NeurIPS 2025 Tutorial Announcement


From Tuning to Guarantees: Statistically Valid Hyperparameter Selection


October 30, 2025

I’m excited to share that I will be presenting a 3-hour tutorial at NeurIPS 2025 on how to move from empirical tuning to statistically guaranteed hyperparameter selection.

Modern ML pipelines often rely on grid search, random search, or Bayesian optimization to find the best hyperparameters, but these methods provide no guarantees that the chosen configuration will remain reliable once deployed.

In this tutorial, I will present a rigorous yet practical framework that treats hyperparameter selection as a statistical testing problem, providing formal guarantees on the reliability of the selected configurations. Participants will learn how to:
  • Construct valid p-values or e-values for each configuration
  • Apply multiple hypothesis testing to control risk with finite-sample guarantees (a minimal sketch of these first two steps follows this list)
  • Extend these ideas to alternative risk measures, multi-objective optimization, side-information priors, and adaptive testing
  • Incorporate these tools into real-world workflows, from LLM tuning to healthcare, finance, and autonomous systems
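To give a flavor of the approach, here is a minimal, hypothetical sketch of the first two ideas: a binomial-tail p-value for each candidate configuration and a Bonferroni-corrected multiple test over all candidates. The function names, loss values, and configuration labels below are illustrative assumptions for exposition, not code from the tutorial.

```python
# Hypothetical sketch of hyperparameter selection via multiple hypothesis testing.
# For each configuration we test H0: risk(config) > alpha using a binomial tail
# p-value computed from binary losses on a held-out calibration set, then apply
# a Bonferroni correction so that every returned configuration satisfies
# risk <= alpha simultaneously with probability at least 1 - delta.

import numpy as np
from scipy.stats import binom


def binomial_p_value(losses: np.ndarray, alpha: float) -> float:
    """P-value for H0: E[loss] > alpha, given binary (0/1) losses."""
    n = len(losses)
    k = int(losses.sum())  # number of errors observed on the calibration set
    # Probability of observing k or fewer errors if the true risk were alpha.
    return float(binom.cdf(k, n, alpha))


def select_configs(loss_table: dict, alpha: float = 0.1, delta: float = 0.05):
    """Return configurations whose risk is <= alpha with probability >= 1 - delta.

    loss_table maps each configuration name to its array of binary losses on a
    calibration set that was not used for training.
    """
    p_values = {cfg: binomial_p_value(losses, alpha)
                for cfg, losses in loss_table.items()}
    threshold = delta / len(p_values)  # Bonferroni correction over all candidates
    return [cfg for cfg, p in p_values.items() if p <= threshold]


# Toy usage with simulated calibration losses (illustration only).
rng = np.random.default_rng(0)
table = {
    "lr=1e-3": rng.binomial(1, 0.05, size=500),  # true risk around 5%
    "lr=1e-2": rng.binomial(1, 0.12, size=500),  # true risk around 12%
}
print(select_configs(table, alpha=0.1, delta=0.05))
```

Under these assumptions, every configuration the procedure returns meets the target risk level with probability at least 1 - delta, no matter how many candidates were screened; the tutorial covers sharper tests (including e-values), other risk measures, and adaptive variants beyond this simple Bonferroni baseline.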
Learn more at https://lnkd.in/e6KwFS3K, and join us at NeurIPS if the topic resonates with your work. Looking forward to meeting researchers, practitioners, and students interested in reliable AI, statistical guarantees, and robust hyperparameter selection.
