
In principle, Metropolis-Hastings algorithms can also be used to find the set of global maxima of a given function. The button below opens a separate browser window containing a demonstration of this. The window is resizable, but you may need to adjust its dimensions depending on your system configuration.


The principle of simulated annealing

Suppose we wish to find the set of global maxima of a given function. Conventionally, we could proceed by differentiating the function and setting the derivative to zero. This yields the set of local extrema and inflection points, which we could then analyse further to isolate the maxima. The classical (non-random) numerical methods that perform this task are often impractical when the function is complicated.

The simulated annealing approach consists of treating the function to be maximized as a probability density, and running a Markov chain with this density as its equilibrium distribution. By itself, this does not yet give the maxima. While the chain runs, the density is gradually raised to higher and higher powers (this is called lowering the temperature). In the limit as the temperature goes to zero, the successive densities converge to a collection of point masses located on the set of global maxima of the function. Thus, when the temperature is very low, the corresponding Markov chain spends most of its time near the global maxima.
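The scheme above can be sketched in a few lines. The following is a minimal illustration, not the applet's implementation: the two-mode test function, the random-walk proposal, and the stepwise cooling schedule are all arbitrary choices made for the example.

```python
import math
import random

def f(x):
    # Illustrative multimodal function: global maximum at x = 2,
    # smaller local maximum at x = -2.
    return math.exp(-(x - 2) ** 2) + 0.8 * math.exp(-(x + 2) ** 2)

def simulated_annealing(steps=20000, seed=1):
    rng = random.Random(seed)
    x = rng.uniform(-5.0, 5.0)            # arbitrary starting point
    for k in range(steps):
        T = 1.0 / (1 + k // 100)          # slow stepwise cooling: T = 1, 1/2, 1/3, ...
        y = x + rng.gauss(0.0, 0.5)       # symmetric random-walk proposal
        r = f(y) / f(x)                   # Metropolis ratio for the target f^(1/T)
        if r >= 1.0 or rng.random() < r ** (1.0 / T):
            x = y                         # accept the move
    return x
```

As the temperature decreases, the target density f^(1/T) concentrates ever more sharply around the maxima, so the chain's final position is a natural estimate of a maximizer.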

You can try out some of the standard Metropolis-Hastings algorithms from the Metropolis-Hastings applet on simulated annealing. But beware of choosing cooling schedules that are too fast, for then the algorithm can easily get "stuck" at a local maximum.
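The "stuck" behaviour is easy to reproduce outside the applet. In the sketch below (again with an illustrative two-mode function, not the applet's), a geometric schedule such as T = 0.9^k freezes the chain within a few dozen steps, so a chain started near the local maximum at x = -2 typically never finds the global maximum at x = 2; a schedule decreasing as slowly as the one in the previous sketch gives it a much better chance.

```python
import math
import random

def f(x):
    # Same kind of example: global maximum at x = 2, local maximum at x = -2.
    return math.exp(-(x - 2) ** 2) + 0.8 * math.exp(-(x + 2) ** 2)

def anneal(schedule, steps=2000, x0=-2.0, seed=3):
    # Metropolis random walk targeting f^(1/T), with T given by `schedule`.
    rng = random.Random(seed)
    x = x0
    for k in range(steps):
        T = schedule(k)
        y = x + rng.gauss(0.0, 0.5)
        r = f(y) / f(x)
        if r >= 1.0 or rng.random() < r ** (1.0 / T):
            x = y
    return x

# T = 0.9**k cools so fast that the chain freezes almost immediately;
# started at the local maximum, it usually remains trapped there.
```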


When the temperature is very close to zero, numerical errors are likely to become important in this implementation, and the Metropolis-Hastings algorithms may lose sight of the target density completely, treating it as identically zero. Errors of this type can be minimized by more careful(!) programming.
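One standard precaution, sketched below under the assumption that the target is known up to a positive constant, is to work with the logarithm of the target density: the acceptance test then never evaluates quantities like exp(-large), which would underflow to zero at low temperatures.

```python
import math
import random

def log_f(x):
    # Log of an illustrative Gaussian-shaped target with maximum at x = 2.
    # Unlike exp(-(x - 2)**2), this never underflows to "density = 0".
    return -(x - 2.0) ** 2

def metropolis_step(x, T, rng):
    # One Metropolis step for the target f^(1/T), done entirely on the
    # log scale: accept y with probability min(1, (f(y)/f(x))**(1/T)).
    y = x + rng.gauss(0.0, 0.5)
    log_ratio = (log_f(y) - log_f(x)) / T
    if log_ratio >= 0.0 or math.log(rng.random()) < log_ratio:
        return y
    return x
```

On the log scale the temperature simply divides the log-density, so even very small values of T pose no underflow problem in the acceptance test itself.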