Reusable genetic algorithm framework for hyperparameter tuning across scikit-learn models using tournament selection, crossover/mutation, elitism, and constraint-aware handling of invalid configurations.
A general-purpose genetic algorithm (GA) hyperparameter tuning framework implemented in Python. It evolves a population of candidate hyperparameter dictionaries over generations using tournament selection, uniform crossover, mutation, and elitism. Fitness is computed via 5-fold cross-validation with a configurable scoring metric and parallelism (n_jobs). A constraint-aware subclass for Logistic Regression enforces compatible solver/penalty combinations.
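The evolutionary loop described above can be sketched as follows. This is a minimal illustration, not the framework's actual code: the search space, population size, and mutation rate are assumptions, and the toy fitness function stands in for the 5-fold cross-validation score (in the real framework it would be something like cross_val_score(model(**params), X, y, cv=5, scoring=..., n_jobs=...).mean()).

```python
import random

# Hypothetical search space: each key maps to a list of candidate values.
SEARCH_SPACE = {
    "C": [0.01, 0.1, 1.0, 10.0],
    "max_iter": [100, 200, 500],
}

def random_individual(space, rng):
    return {k: rng.choice(vals) for k, vals in space.items()}

def tournament_select(pop, fits, rng, k=3):
    # Sample k candidates at random and return the fittest of them.
    best = max(rng.sample(range(len(pop)), k), key=lambda i: fits[i])
    return pop[best]

def uniform_crossover(a, b, rng):
    # Each gene is inherited from either parent with equal probability.
    return {key: (a[key] if rng.random() < 0.5 else b[key]) for key in a}

def mutate(ind, space, rng, rate=0.2):
    # Resample each gene from the search space with probability `rate`.
    return {k: (rng.choice(space[k]) if rng.random() < rate else v)
            for k, v in ind.items()}

def ga_tune(fitness, space, pop_size=20, generations=10, n_elite=2, seed=0):
    rng = random.Random(seed)
    pop = [random_individual(space, rng) for _ in range(pop_size)]
    for _ in range(generations):
        fits = [fitness(ind) for ind in pop]
        ranked = sorted(zip(fits, pop), key=lambda t: t[0], reverse=True)
        # Elitism: the best individuals survive unchanged.
        next_pop = [ind for _, ind in ranked[:n_elite]]
        while len(next_pop) < pop_size:
            p1 = tournament_select(pop, fits, rng)
            p2 = tournament_select(pop, fits, rng)
            next_pop.append(mutate(uniform_crossover(p1, p2, rng), space, rng))
        pop = next_pop
    fits = [fitness(ind) for ind in pop]
    return max(zip(fits, pop), key=lambda t: t[0])

# Toy fitness standing in for a cross-validated score: peaks at
# C=1.0, max_iter=200, so the GA should converge toward that region.
def toy_fitness(params):
    return -abs(params["C"] - 1.0) - abs(params["max_iter"] - 200) / 1000

best_fit, best_params = ga_tune(toy_fitness, SEARCH_SPACE)
print(best_fit, best_params)
```

Because the fitness function is just a callable taking a hyperparameter dict, the same loop is reusable across model families: only the search space and the model constructor change.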
The goal is to explore GA as a practical alternative to grid search when exhaustive search is too expensive, while keeping the tuner reusable across model families and robust to incompatible hyperparameter dependencies.
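Robustness to incompatible hyperparameter dependencies can be handled by repairing invalid candidates before evaluation, as the Logistic Regression subclass does for solver/penalty pairs. A possible sketch of such a repair step is below; the VALID_PENALTIES table reflects common scikit-learn solver/penalty compatibility but is an illustrative subset, and the repair function itself is hypothetical, not the framework's API.

```python
import random

# Valid penalties per LogisticRegression solver (illustrative subset;
# consult the scikit-learn docs for the authoritative matrix).
VALID_PENALTIES = {
    "lbfgs": {"l2", None},
    "liblinear": {"l1", "l2"},
    "saga": {"l1", "l2", "elasticnet", None},
}

def repair(params, rng=None):
    """Return a copy of params in which an incompatible solver/penalty
    pair is fixed by resampling a penalty valid for that solver."""
    rng = rng or random.Random(0)
    fixed = dict(params)
    allowed = VALID_PENALTIES[fixed["solver"]]
    if fixed["penalty"] not in allowed:
        # Sort for determinism before sampling a replacement penalty.
        fixed["penalty"] = rng.choice(sorted(allowed, key=str))
    return fixed

# 'lbfgs' does not support an l1 penalty, so the pair gets repaired.
print(repair({"solver": "lbfgs", "penalty": "l1", "C": 1.0}))
```

Repairing (rather than discarding) invalid offspring keeps the population size stable and avoids wasting cross-validation budget on configurations that would raise errors at fit time.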