Benchmarking Discrete Optimization Heuristics
This thesis involves three topics: benchmarking discrete optimization algorithms, empirical analyses of evolutionary computation, and automatic algorithm configuration.
- Ye, F.
- 01 June 2022
- Thesis in Leiden Repository
The objective is benchmarking evolutionary algorithms (EAs) on discrete optimization problems to support the selection and design of better optimizers. In practice, we start by building the IOHprofiler benchmark software, which supports testing algorithms on a wide range of problems and allows us to perform and visualize statistical analyses of algorithm performance.

While performing numerous benchmark studies, we examine the impact of mutation rate and population size on EAs, and we investigate how crossover and mutation interplay with each other as well as how population size affects genetic algorithms (GAs). Moreover, we analyze a smooth way of interpolating between local and non-local search by proposing a new normalized bit mutation.

We apply Irace, MIP-EGO, and MIES to configure the GA for two performance measures: the expected running time (ERT) and the area under the ECDF curve (AUC). Our results suggest that even when one is interested in ERT, it might be preferable to tune for AUC. We also observe that tuning for ERT is much more sensitive to the budget allocated to the target algorithms.

Finally, we leverage our benchmark data of static algorithms to study dynamic algorithm selection.
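To illustrate the interpolation idea, the following is a minimal sketch of a normalized bit mutation operator: the mutation strength is drawn from a normal distribution instead of the binomial distribution underlying standard bit mutation, so that shrinking the variance recovers local search while matching the binomial's mean and variance approximates non-local (global) mutation. The function name and interface are illustrative, not taken from IOHprofiler.

```python
import random

def normalized_bit_mutation(x, mu, sigma, rng=random):
    """Return a copy of bit string x with ell distinct bits flipped,
    where ell ~ N(mu, sigma^2), rounded and clipped to [1, len(x)].

    With sigma -> 0 this behaves like local search (always flipping
    round(mu) bits); choosing mu = n*p and sigma^2 = n*p*(1-p) mimics
    the Bin(n, p) step-size distribution of standard bit mutation.
    """
    n = len(x)
    ell = int(round(rng.gauss(mu, sigma)))
    ell = max(1, min(n, ell))              # clip to a feasible strength
    y = list(x)
    for i in rng.sample(range(n), ell):    # flip ell distinct positions
        y[i] ^= 1
    return y
```

Varying `sigma` between these two extremes gives the smooth interpolation between local and non-local search that the abstract refers to.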
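For context on the first tuning target, the expected running time (ERT) can be estimated from a set of independent runs as the total number of function evaluations spent, divided by the number of runs that reached the target. A minimal sketch with an illustrative interface (not part of any library named here):

```python
def expected_running_time(hitting_times, budget):
    """Estimate the ERT from independent runs.

    hitting_times: for each run, the evaluation count at which it first
    hit the target, or None if it exhausted the budget without success.
    Unsuccessful runs contribute the full budget to the total cost.
    Returns float('inf') when no run succeeded.
    """
    successes = sum(1 for t in hitting_times if t is not None)
    total = sum(t if t is not None else budget for t in hitting_times)
    return total / successes if successes else float('inf')
```

Because unsuccessful runs contribute the full budget, an ERT estimate shifts substantially when the budget changes, which is consistent with the observation that tuning for ERT is sensitive to the budget allocated to the target algorithms.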