Local optimisation – Hill Climbing
We did a very small amount of this in ADS; it was a guest lecture where the guy talked about genetic algorithms.
The problem with searching using hill climbing is that sometimes you get stuck in a rut (a 'local optimum'): you've found a solution that's better than all of its neighbours, but there's an even better solution elsewhere that you can't reach because the search has been too greedy.
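A minimal Python sketch of the trap (the landscape, function names, and neighbourhood here are made up for illustration, not from the lecture):

```python
def hill_climb(start, neighbours, score):
    """Greedy local search: repeatedly move to the best neighbour,
    stopping as soon as no neighbour improves on the current solution."""
    current = start
    while True:
        best = max(neighbours(current), key=score)
        if score(best) <= score(current):
            return current  # stuck: this is a local optimum
        current = best

# A bumpy 1-D landscape: the global maximum (9) sits at index 7, but a
# smaller peak (3) at index 2 traps the greedy search starting from 0.
landscape = [1, 2, 3, 2, 1, 0, 5, 9, 5]
adjacent = lambda i: [j for j in (i - 1, i + 1) if 0 <= j < len(landscape)]
print(hill_climb(0, adjacent, lambda i: landscape[i]))  # → 2, not 7
```

The search climbs 1 → 2 → 3 and stops, never seeing the 9 on the far side of the valley.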
The solution is 'simulated annealing'. This allows you to move to a less optimal solution in order to eventually find an even better one. You do this by accepting 'non-improving moves' probabilistically, based on two things: how much worse the move is (the worse a move is, the less likely it is to be accepted) and a temperature parameter.
The temperature parameter basically stops you wandering too far: it starts really high, so virtually any move is accepted, and gradually gets 'cooler' until ultimately only improving (hill-climbing) moves are accepted.
There’s a pseudocode algorithm on the slides.
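As a rough Python sketch of the idea (the acceptance rule exp(delta / T) is the standard Metropolis criterion; the parameter names, values, and geometric cooling schedule are my illustrative assumptions, and the slides' pseudocode may differ in detail):

```python
import math
import random

def simulated_annealing(start, neighbour, score,
                        t_start=10.0, t_end=0.01, cooling=0.98):
    """Simulated annealing for a maximisation problem. Non-improving
    moves are accepted with probability exp(delta / T): the worse the
    move (more negative delta) and the cooler the temperature T, the
    less likely it is to be accepted."""
    current = best = start
    t = t_start
    while t > t_end:
        candidate = neighbour(current)
        delta = score(candidate) - score(current)
        # Improving moves are always accepted; worse ones only sometimes.
        if delta > 0 or random.random() < math.exp(delta / t):
            current = candidate
        if score(current) > score(best):
            best = current  # remember the best solution seen so far
        t *= cooling  # geometric cooling schedule (one common choice)
    return best

# A bumpy 1-D landscape: greedy hill climbing from index 0 would get
# trapped on the small peak at index 2, but annealing can step downhill
# through the valley and usually reaches the global maximum at index 7.
landscape = [1, 2, 3, 2, 1, 0, 5, 9, 5]
step = lambda i: max(0, min(len(landscape) - 1, i + random.choice((-1, 1))))
random.seed(42)
print(simulated_annealing(0, step, lambda i: landscape[i]))
```

Because acceptance is random, individual runs can still end up on the small peak; in practice you'd often take the best result over several runs.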
It gets very very complicated from here on. Good luck.