This paper presents a comparison of three global optimization techniques. Global optimization methods are used when a problem has several local minima. Conventional methods converge to only one of these local minima, and in some cases that solution is not the global optimum. Global optimization methods are probabilistic routines: they will sometimes accept a solution worse than the current one, and this gives them the ability to climb out of local minimum valleys. Conventional methods such as the Generalized Reduced Gradient (GRG) do not have this ability.
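The "sometimes accept a worse solution" idea can be sketched with a Metropolis-style acceptance rule, a common choice in such methods (the paper's exact rule is not reproduced here, so the temperature parameter and the exponential form are assumptions):

```python
import math
import random

def accept(delta, temperature, rng=random.random):
    """Decide whether to accept a candidate design.

    delta is the change in the objective (candidate minus current).
    An improving move (delta <= 0) is always taken; a worsening move
    is taken with probability exp(-delta / temperature), which is what
    lets the search climb out of a local minimum's valley.
    """
    if delta <= 0:
        return True
    return rng() < math.exp(-delta / temperature)
```

At high temperature almost any move is accepted and the search roams freely; as the temperature drops, worsening moves become increasingly unlikely and the search settles into a minimum.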
Three different methods are examined. The first is a Simulated Annealing algorithm, which applies the natural phenomenon of the annealing of solids to optimization problems. The second is Genetic Algorithms, which simulate the natural process of evolution using the idea of "survival of the fittest"; this method works with a population of designs rather than a single design. The last method uses orthogonal arrays to find a suitable starting point for the GRG algorithm. All three methods are shown to converge on the global minimum; however, some are more efficient than others. A simple model problem with a known answer is used to illustrate each method.
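To make the Simulated Annealing method concrete, here is a minimal sketch on a toy two-well objective with a known global minimum. The objective, step distribution, and cooling schedule are all illustrative assumptions, not the paper's actual model problem or parameters:

```python
import math
import random

def f(x):
    # Toy objective standing in for the paper's model problem:
    # a local minimum near x = +1 and the global minimum near x = -1.
    return (x * x - 1.0) ** 2 + 0.2 * x

def simulated_annealing(start, temp=2.0, cooling=0.995, steps=5000, seed=0):
    rng = random.Random(seed)
    x, fx = start, f(start)
    best_x, best_f = x, fx
    for _ in range(steps):
        candidate = x + rng.gauss(0.0, 0.5)   # perturb the current design
        fc = f(candidate)
        delta = fc - fx
        # Always accept improvements; accept worse designs with
        # probability exp(-delta / temp) so the search can escape
        # the local minimum's valley.
        if delta <= 0 or rng.random() < math.exp(-delta / temp):
            x, fx = candidate, fc
            if fx < best_f:
                best_x, best_f = x, fx
        temp *= cooling                        # gradually "cool" the system
    return best_x, best_f
```

Started from the local minimum at x = +1, the high initial temperature lets the search cross the barrier at x = 0 and find the deeper well near x = -1, which a pure descent method like GRG could not do from that starting point.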