# SMAC 2.0 Direct
```python
smac = HPOFacade(scenario, train_model, overwrite=True)
incumbent = smac.optimize()  # parallelism: set n_workers (e.g. 4) on the Scenario, not here
```

## Common Pitfalls

| Pitfall | Fix |
|---------|-----|
| SMAC gets stuck in one region | Increase acquisition-function exploration (e.g., switch from `EI` to a more explorative acquisition such as `LCB`) |
| Too slow for large spaces | Use multi-fidelity or lower `n_trials` |
| Conditional parameters not handled | Use ConfigSpace conditions (e.g., `EqualsCondition`) – see the ConfigSpace docs |
| Reproducibility issues | Set `seed` in `Scenario` |
| Memory blowup | Limit the runhistory size |

## Comparison vs Other Tuners (TL;DR)

| Tool | Best for |
|------|----------|
| SMAC 2.0 | Conditional spaces, multi-objective, moderate cost |
| Optuna | Simpler spaces, TPE + CMA-ES, good defaults |
| Hyperopt | Quick TPE experiments, older codebases |
| BayesianOptimization | Low-dimensional (<20) continuous spaces |
| Grid/Random | Debugging, cheap functions |

## Final Tip

Start with `HPOFacade` – it hides most complexity. Only drop down to the lower-level facades (`BlackBoxFacade`, `AlgorithmConfigurationFacade`; called `SMAC4BB`/`SMAC4AC` in the 1.x API) if you need full control (e.g., a custom surrogate).
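For the "Grid/Random" row in the comparison: a random-search baseline needs no framework at all and is useful for debugging cheap objectives. A minimal pure-Python sketch, with a hypothetical toy objective standing in for `train_model`:

```python
import random

# Hypothetical cheap objective over a 2-parameter space (stand-in for train_model):
# loss is minimized near lr = 0.01 with few layers.
def objective(lr, n_layers):
    return (lr - 0.01) ** 2 + 0.001 * n_layers

def random_search(n_trials, seed=42):
    rng = random.Random(seed)  # fixed seed for reproducibility
    best_cfg, best_loss = None, float("inf")
    for _ in range(n_trials):
        cfg = {
            "lr": 10 ** rng.uniform(-4, -1),   # log-uniform learning rate in [1e-4, 1e-1]
            "n_layers": rng.randint(1, 8),
        }
        loss = objective(cfg["lr"], cfg["n_layers"])
        if loss < best_loss:
            best_cfg, best_loss = cfg, loss
    return best_cfg, best_loss

best_cfg, best_loss = random_search(200)
```

For genuinely cheap functions this is often a strong sanity check before reaching for Bayesian optimization.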
## References

- Docs: https://automl.github.io/SMAC3/main/
- Paper: Lindauer et al. (2022), "SMAC3: A Versatile Bayesian Optimization Package for Hyperparameter Optimization", JMLR.
For multi-fidelity optimization, the target function accepts a `budget`, and the budget bounds go on the `Scenario` (budgets are honored by `MultiFidelityFacade`, not by `HPOFacade`):

```python
def train_model(config, seed: int = 0, budget: float = 1.0):
    # budget = fraction of epochs: train for int(budget * max_epochs) epochs
    ...
    return val_loss

scenario = Scenario(cs, n_trials=100, min_budget=0.1, max_budget=1.0)
smac = MultiFidelityFacade(scenario, train_model, overwrite=True)
incumbent = smac.optimize()
```
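The budget schedule that `min_budget`/`max_budget` enables can be illustrated with a plain successive-halving loop, the allocation strategy underlying multi-fidelity methods. This is a pure-Python sketch with a hypothetical `evaluate` function; SMAC handles the scheduling internally:

```python
def successive_halving(configs, evaluate, min_budget=0.1, max_budget=1.0, eta=2):
    """Evaluate all configs at the smallest budget, keep the best 1/eta,
    multiply the budget by eta, and repeat until max_budget is reached."""
    budget = min_budget
    survivors = list(configs)
    while True:
        # Rank survivors by loss at the current budget (lower is better).
        scored = sorted(survivors, key=lambda c: evaluate(c, budget))
        if budget >= max_budget or len(scored) == 1:
            return scored[0]
        survivors = scored[: max(1, len(scored) // eta)]  # prune the worst configs
        budget = min(budget * eta, max_budget)            # promote survivors

# Hypothetical objective: loss shrinks with budget, scaled by config quality.
def evaluate(config, budget):
    return config["quality"] / budget

configs = [{"id": i, "quality": q} for i, q in enumerate([3.0, 1.0, 2.0, 0.5])]
best = successive_halving(configs, evaluate)
# best is the config with quality 0.5, found after pruning at low budgets
```

The key saving: poor configurations are discarded after cheap low-budget evaluations, so only promising ones ever see the full budget.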