Tuning metaheuristics by sequential optimisation of regression models

Áthila R. Trindade, Felipe Campelo*

*Corresponding author for this work

Research output: Contribution to journal › Article (Academic Journal) › peer-review

14 Citations (Scopus)

Abstract

Tuning parameters is an important step in the application of metaheuristics to specific problem classes. In this work we present a tuning framework based on the sequential optimisation of perturbed regression models. Besides providing algorithm configurations with good expected performance, the proposed methodology can also provide insights into the relevance of each parameter and their interactions, as well as models of expected algorithm performance for a given problem class, conditional on the parameter values. A number of test cases are presented, including the use of a simulation model in which the true optimal parameters of a hypothetical algorithm are known, as well as typical tuning scenarios for different problem classes. Comparative analyses are presented against Iterated Racing, SMAC, and ParamILS. The results suggest that the proposed approach returns high-quality solutions in terms of mean performance of the algorithms equipped with the resulting configurations, with the advantage of providing additional information on the relevance and effect of each parameter on the expected performance.
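As a rough illustration of the general idea described in the abstract (not the authors' implementation), the sketch below shows a sequential model-based tuning loop in which a regression model of algorithm performance is perturbed, here via bootstrap resampling of the observations, and then optimised to propose the next candidate configuration. The function `algorithm_performance`, the quadratic model, and all numerical settings are hypothetical stand-ins chosen for brevity.

```python
# Illustrative sketch of sequential optimisation of perturbed regression
# models for parameter tuning. Everything here is a simplified assumption:
# a single parameter in [0, 1], a quadratic surrogate, and a synthetic
# noisy performance function standing in for running a metaheuristic.
import numpy as np

rng = np.random.default_rng(42)

def algorithm_performance(x):
    """Hypothetical noisy performance (lower is better) of an algorithm
    with one parameter x in [0, 1]; the true optimum lies near x = 0.3."""
    return (x - 0.3) ** 2 + 0.05 * rng.normal()

def fit_quadratic(X, y):
    """Ordinary least-squares fit of y ~ 1 + x + x^2; returns coefficients."""
    A = np.column_stack([np.ones_like(X), X, X ** 2])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coef

def propose_next(X, y, grid):
    """Perturb the regression model by refitting it on a bootstrap resample
    of the observations, then minimise the perturbed model over a grid of
    candidate parameter values."""
    idx = rng.integers(0, len(X), size=len(X))
    c = fit_quadratic(X[idx], y[idx])
    preds = c[0] + c[1] * grid + c[2] * grid ** 2
    return grid[np.argmin(preds)]

# Initial design: a few parameter values sampled uniformly in [0, 1].
X = rng.uniform(0.0, 1.0, size=8)
y = np.array([algorithm_performance(x) for x in X])
grid = np.linspace(0.0, 1.0, 201)

# Sequential loop: perturb the model, optimise it, evaluate the proposed
# configuration, and augment the data set.
for _ in range(20):
    x_next = propose_next(X, y, grid)
    X = np.append(X, x_next)
    y = np.append(y, algorithm_performance(x_next))

best = X[np.argmin(y)]
print(f"Best configuration found: x = {best:.3f}")
```

The bootstrap perturbation injects variability into the surrogate so that successive proposals are not trapped at a single model optimum; the fitted coefficients themselves also hint at parameter relevance, in the spirit of the insights the paper reports.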

Original language: English
Article number: 105829
Journal: Applied Soft Computing Journal
Volume: 85
DOIs
Publication status: Published - Dec 2019

Bibliographical note

Publisher Copyright:
© 2019 Elsevier B.V.

Keywords

  • Metaheuristics
  • Parameter tuning
  • Regression modelling
