Iterated local search
Iterated Local Search (ILS) is a term in applied mathematics and computer science for a modification of local search or hill climbing methods for solving discrete optimization problems.
Local search methods can get stuck in a local minimum, where no improving neighbors are available.
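As an illustration, a basic greedy descent can be written as the following sketch; the neighbors and cost arguments stand for a problem-specific neighbourhood and objective function, and the interface is an assumption of this sketch rather than a fixed convention:

```python
def local_search(solution, neighbors, cost):
    # Greedy descent: keep moving to the best improving neighbour.
    while True:
        improving = [s for s in neighbors(solution) if cost(s) < cost(solution)]
        if not improving:
            # No improving neighbour exists: the search is stuck in a local minimum.
            return solution
        solution = min(improving, key=cost)
```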
A simple modification consists of iterating calls to the local search routine, each time starting from a different initial configuration. This is called repeated local search, and it implies that the knowledge obtained during previous local search phases is not used. Learning, by contrast, means that the previous history, for example the memory of previously found local minima, is mined to produce better and better starting points for local search.
The implicit assumption is that local minima are clustered: when minimizing a function, good local minima are easier to find when the search starts from a local minimum with a low value than when it starts from a random point. The only caveat is to avoid confinement in a given basin of attraction: the kick that transforms a local minimizer into the starting point for the next run has to be appropriately strong, but not so strong that the method reverts to a memory-less random restart.
Iterated Local Search is based on building a sequence of locally optimal solutions by:
- perturbing the current local minimum;
- applying local search starting from the modified solution, as sketched below.
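A minimal sketch of this loop, assuming a one-argument local_search routine (for instance the greedy descent above with the neighbourhood and objective fixed in advance), a problem-specific perturb function and a cost function; the accept-only-if-better rule is an illustrative choice rather than part of the definition:

```python
def iterated_local_search(initial_solution, local_search, perturb, cost, iterations=100):
    # Start from the first local optimum found by plain local search.
    best = local_search(initial_solution)
    for _ in range(iterations):
        # Kick the current local optimum out of its attraction basin...
        candidate = perturb(best)
        # ...and descend again to a (possibly different) local optimum.
        candidate = local_search(candidate)
        # Illustrative acceptance rule: keep the new solution only if it improves.
        if cost(candidate) < cost(best):
            best = candidate
    return best
```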
Perturbation Algorithm
Finding a good perturbation algorithm for ILS is not an easy task. The main aim is to avoid getting stuck in the same local minimum, and to ensure this property the undo operation is forbidden. Even so, the strength of the perturbation has to be tuned carefully, since there are two kinds of bad perturbation:
- too weak: the search falls back into the same local minimum;
- too strong: the perturbation amounts to a random restart.
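What counts as "appropriately strong" is problem-dependent. For permutation-encoded problems such as the travelling salesman problem, a commonly cited choice is the double-bridge move, which cuts the tour into four segments and reconnects them in a different order. The following sketch assumes the tour is represented as a Python list of city indices:

```python
import random

def double_bridge(tour):
    # Pick three cut points so that all four segments are non-empty.
    n = len(tour)
    a, b, c = sorted(random.sample(range(1, n), 3))
    # Reconnect the segments in the order first-third-second-fourth.
    # A typical 2-opt local search cannot undo this in a single step,
    # yet most of the original tour structure is preserved.
    return tour[:a] + tour[b:c] + tour[a:b] + tour[c:]
```

A move of this kind is stronger than a single local-search step, so the search can leave the current basin of attraction, but it is far from a full random restart because most of the solution is kept.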
Benchmark Perturbation