

finetune
Additional functions for model tuning
The finetune package extends the tidymodels tune package with additional hyperparameter optimization methods for machine learning models. It provides two main approaches: simulated annealing for iterative search and racing methods for efficient grid search.
Simulated annealing is an iterative search that explores the parameter space one candidate at a time, accepting better configurations and occasionally worse ones in order to escape local optima. Racing methods begin by evaluating every candidate in a grid on a small number of resamples, then use statistical testing (an ANOVA model or a win/loss tournament) to eliminate poor performers early and concentrate the remaining computation on promising candidates. This speeds up grid search by avoiding full evaluation of parameter combinations that are unlikely to perform well.
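
The two approaches above correspond to `tune_sim_anneal()` and the racing functions `tune_race_anova()` / `tune_race_win_loss()`. As a rough sketch (the model, formula, and resampling setup here are illustrative, not from the package documentation):

```r
library(tidymodels)
library(finetune)

set.seed(123)
# Illustrative setup: 5-fold cross-validation on a built-in data set
folds <- vfold_cv(mtcars, v = 5)

rf_spec <- rand_forest(mtry = tune(), min_n = tune()) |>
  set_engine("ranger") |>
  set_mode("regression")

wf <- workflow() |>
  add_formula(mpg ~ .) |>
  add_model(rf_spec)

# Racing: evaluate a grid, dropping poor candidates via ANOVA testing
race_res <- tune_race_anova(wf, resamples = folds, grid = 10)

# Simulated annealing: iterative search over the parameter space
sa_res <- tune_sim_anneal(wf, resamples = folds, iter = 10)

show_best(race_res, metric = "rmse")
```

Results from either function can be passed to the usual tune helpers such as `show_best()` and `select_best()`.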







