
Optuna & Ray Tune

Hyperparameter search at scale — Bayesian, distributed

What it is

Optuna for clean Bayesian search inside one process; Ray Tune for distributed trials across a cluster. Both support Hyperband and ASHA-style successive halving for early stopping; population-based training is Ray Tune territory, and Ray Tune can also drive Optuna as its search algorithm when you want both.

How Vaaani uses it

  • Squeezing the last 3% accuracy from a tuned model
  • Distributed trials across spot instances
  • Multi-objective optimization (accuracy + latency)
  • Pruning bad trials early to save compute budget
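What "multi-objective" buys you, concretely: with two competing metrics there is no single best trial, only a Pareto front of trade-offs. A minimal pure-Python sketch of Pareto-front selection (the accuracy/latency numbers are made up for illustration):

```python
# Each trial: (accuracy, latency_ms). Higher accuracy is better, lower latency is better.
trials = [
    (0.91, 120.0),
    (0.89, 40.0),
    (0.93, 300.0),
    (0.88, 45.0),   # dominated by (0.89, 40.0): worse on both objectives
    (0.90, 60.0),
]

def dominates(a, b):
    """True if a is at least as good as b on both objectives and strictly better on one."""
    at_least_as_good = a[0] >= b[0] and a[1] <= b[1]
    strictly_better = a[0] > b[0] or a[1] < b[1]
    return at_least_as_good and strictly_better

# Pareto front: trials that no other trial dominates
front = [t for t in trials if not any(dominates(o, t) for o in trials if o is not t)]
print(sorted(front))  # every non-dominated accuracy/latency trade-off
```

Optuna returns exactly this set as `study.best_trials` when a study is created with multiple objective directions.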

Why it makes the cut

Manual hyperparameter tuning burns budget on guesswork. Optuna's pruners kill unpromising trials early, often a severalfold reduction in total compute versus grid search, and the study runs unattended while you go to lunch.
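The idea behind median pruning is simple enough to show in plain Python: at a checkpoint, any trial scoring below the median of its peers gets cut. A toy sketch (illustrative numbers, not Optuna's actual implementation):

```python
from statistics import median

# Validation accuracy per epoch for five trials (made-up learning curves)
curves = {
    "t1": [0.60, 0.70, 0.78],
    "t2": [0.40, 0.45, 0.48],   # clearly lagging: worth pruning early
    "t3": [0.58, 0.69, 0.77],
    "t4": [0.35, 0.41, 0.44],   # also lagging
    "t5": [0.62, 0.71, 0.80],
}

def median_prune(curves, step):
    """Prune any trial whose score at `step` falls below the median at that step."""
    scores = {name: curve[step] for name, curve in curves.items()}
    cutoff = median(scores.values())
    return {name for name, score in scores.items() if score < cutoff}

pruned = median_prune(curves, step=0)  # decide after the first epoch
print(pruned)
```

Cutting the two laggards after one epoch saves their remaining epochs of compute, which is where the budget savings come from.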

Sample code

import optuna

def objective(trial):
    # Sample the learning rate on a log scale
    # (suggest_loguniform is deprecated; use suggest_float with log=True)
    lr = trial.suggest_float("lr", 1e-5, 1e-1, log=True)
    n  = trial.suggest_int("n_estimators", 50, 500)
    return train_eval(lr, n)  # your training/eval routine, returning the metric

study = optuna.create_study(direction="maximize")
study.optimize(objective, n_trials=100)

Related in the Vaaani stack

Have a project that needs Optuna?

30-min discovery call. You describe the busywork; I map it to an AI worker and a budget.