An Overview of the Optimization Techniques

The Optimization adapter offers a variety of optimization techniques.

See Also
About the Optimization Techniques
Configuring the Optimization Techniques

Adaptive DOE Technique

The Adaptive DOE optimization runs a series of space-filling DOEs, where each successive DOE is recentered and shrunk around a projection of the best point from the previous iteration. The technique positions points in the design space by maximizing the minimum distance between any two points. Executing a series of such space-filling DOEs, each built around a projection of the new optimal point and levels, can be an effective optimization strategy. Adaptive DOE optimization scales better for parallel execution than the other optimization techniques, and, because the points are spread out, the method is insensitive to the noisy outputs that are common in finite element analysis simulations.
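The recenter-and-shrink loop can be sketched in Python. Everything below (the function names, the greedy maximin sampler, the shrink factor of 0.5) is an illustrative assumption, not the adapter's actual implementation:

```python
import numpy as np

def maximin_sample(lower, upper, n_points, n_candidates=200, rng=None):
    """Greedy maximin design: each new point maximizes its minimum
    distance to the points already chosen (a simple space-filling rule)."""
    rng = rng if rng is not None else np.random.default_rng(0)
    points = [rng.uniform(lower, upper)]
    for _ in range(n_points - 1):
        cands = rng.uniform(lower, upper, size=(n_candidates, len(lower)))
        dists = np.linalg.norm(
            cands[:, None, :] - np.array(points)[None, :, :], axis=2
        ).min(axis=1)
        points.append(cands[np.argmax(dists)])
    return np.array(points)

def adaptive_doe(f, lower, upper, n_iter=10, n_points=16, shrink=0.5):
    """Run a space-filling DOE, keep the best point, then shrink the
    search box around it and repeat."""
    lower, upper = np.asarray(lower, float), np.asarray(upper, float)
    best_x, best_f = None, float("inf")
    for _ in range(n_iter):
        X = maximin_sample(lower, upper, n_points)
        vals = np.array([f(x) for x in X])  # independent runs: parallel-friendly
        i = int(np.argmin(vals))
        if vals[i] < best_f:
            best_x, best_f = X[i], float(vals[i])
        # recenter and shrink the box around the current best point
        half = shrink * (upper - lower) / 2.0
        lower = np.maximum(lower, best_x - half)
        upper = np.minimum(upper, best_x + half)
    return best_x, best_f
```

In practice the points of each DOE would be evaluated in parallel, which is where the technique's scaling advantage comes from.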

Adaptive Simulated Annealing Technique

The Adaptive Simulated Annealing (ASA) algorithm is well suited for solving highly nonlinear problems with short running analysis codes, when finding the global optimum is more important than a quick improvement of the design.
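The core accept/reject rule of simulated annealing can be illustrated with a generic Python sketch. This is a plain annealing loop with a simple geometric cooling schedule, not the adaptive schedule of the ASA implementation itself:

```python
import math, random

def simulated_annealing(f, x0, step=1.0, t0=1.0, cooling=0.99,
                        n_iter=3000, seed=1):
    """Generic simulated annealing sketch: accept worse points with
    probability exp(-delta/T) so the search can escape local minima
    while T is high, then settle into a minimum as T cools."""
    rng = random.Random(seed)
    x, fx = list(x0), f(x0)
    best_x, best_f = x[:], fx
    t = t0
    for _ in range(n_iter):
        cand = [xi + step * rng.gauss(0, 1) for xi in x]
        fc = f(cand)
        if fc < fx or rng.random() < math.exp(-(fc - fx) / max(t, 1e-12)):
            x, fx = cand, fc
            if fx < best_f:
                best_x, best_f = x[:], fx
        t *= cooling
        step *= cooling  # shrink moves as the temperature drops
    return best_x, best_f
```

Because every step needs a fresh function evaluation and the walk is inherently serial, the method fits best with short-running analysis codes, as noted above.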

Archive-Based Micro Genetic Algorithm (AMGA)

The Archive-based Micro Genetic Algorithm (AMGA) is an evolutionary optimization algorithm and relies on genetic variation operators for creating new solutions. The generation scheme deployed in AMGA can be classified as generational since, during a particular iteration (generation), only solutions created before that iteration take part in the selection process. The algorithm, however, generates a very small number of new solutions (it can work with just two solutions per iteration) at every iteration. Therefore, it can also be classified as an almost steady-state genetic algorithm.

Downhill Simplex Technique

The Downhill Simplex technique is a geometrically intuitive algorithm. A simplex is defined as a body in n dimensions consisting of n+1 vertices. Specifying the location of each vertex fully defines the simplex. In two dimensions the simplex is a triangle. In three dimensions it is a tetrahedron. As the algorithm proceeds, the simplex makes its way downward toward the location of the minimum through a series of steps.
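SciPy's Nelder-Mead method implements the same downhill simplex algorithm and can illustrate the behavior; the Rosenbrock test function below is an illustrative choice, not part of the adapter:

```python
import numpy as np
from scipy.optimize import minimize

# Rosenbrock function: a banana-shaped valley with its minimum at (1, 1).
def rosenbrock(x):
    return 100.0 * (x[1] - x[0] ** 2) ** 2 + (1.0 - x[0]) ** 2

# 'Nelder-Mead' is the downhill simplex algorithm: in two dimensions a
# triangle of n+1 = 3 vertices reflects, expands, and contracts its way
# down the valley without using any derivatives.
result = minimize(rosenbrock, x0=[-1.2, 1.0], method="Nelder-Mead",
                  options={"xatol": 1e-8, "fatol": 1e-8, "maxiter": 2000})
```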

Evolutionary Optimization Algorithm (Evol)

The Evolutionary Optimization Algorithm (Evol) is an evolution strategy that mutates designs by adding a normally distributed random value to each design variable. The mutation strength (standard deviation of the normal distribution) is self-adaptive and changes during the optimization loop. The algorithm has been calibrated to efficiently solve design problems with low numbers of variables and with some noise in the design space.
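A minimal (1+1) evolution strategy in Python illustrates self-adaptive mutation strength. The 1/5th-success adaptation rule used here is an illustrative assumption; Evol's actual adaptation scheme is internal to the product:

```python
import random

def one_plus_one_es(f, x0, sigma=0.5, n_iter=500, seed=2):
    """(1+1) evolution strategy sketch: mutate every variable with a
    normally distributed perturbation, keep the child if it is no worse,
    and adapt the mutation strength sigma from the success history."""
    rng = random.Random(seed)
    x, fx = list(x0), f(x0)
    for _ in range(n_iter):
        child = [xi + sigma * rng.gauss(0, 1) for xi in x]
        fc = f(child)
        if fc <= fx:
            x, fx = child, fc
            sigma *= 1.22   # success: widen the mutation strength
        else:
            sigma *= 0.95   # failure: narrow it (~1/5th success rule)
    return x, fx
```

The self-adaptation is what lets the method cope with some noise: the step size grows or shrinks based on observed successes rather than on gradients.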

Hooke-Jeeves Technique

The Hooke-Jeeves technique begins with a starting guess and searches for a local minimum. It does not require the objective function to be continuous. Because the algorithm does not use derivatives, the function does not need to be differentiable. In addition, the technique allows you to determine the number of function evaluations required for the greatest probability of convergence.
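The exploratory and pattern moves can be sketched in Python. The starting step size, shrink factor, and termination rule below are illustrative assumptions:

```python
def hooke_jeeves(f, x0, step=0.5, shrink=0.5, tol=1e-6):
    """Hooke-Jeeves pattern search sketch: exploratory moves along each
    coordinate axis, then a pattern move that doubles the improving
    direction. No derivatives are used, so f need not be smooth."""
    def explore(base, fbase, step):
        x, fx = list(base), fbase
        for i in range(len(x)):
            for delta in (step, -step):
                trial = x[:]
                trial[i] += delta
                ft = f(trial)
                if ft < fx:
                    x, fx = trial, ft
                    break
        return x, fx

    x, fx = list(x0), f(x0)
    while step > tol:
        nx, nfx = explore(x, fx, step)
        if nfx < fx:
            # pattern move: jump further along the successful direction
            pattern = [2 * n - o for n, o in zip(nx, x)]
            fp = f(pattern)
            x, fx = (pattern, fp) if fp < nfx else (nx, nfx)
        else:
            step *= shrink  # no improvement: refine the mesh
    return x, fx
```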

Mixed-Integer Sequential Quadratic Programming (MISQP) Technique

Mixed-Integer Sequential Quadratic Programming (MISQP) is a trust region–based method for solving problems that include integer and other discrete variables.

Similar to other sequential quadratic programming methods, MISQP assumes that the objective function and constraints are continuously differentiable.

In addition, MISQP assumes that the objective and constraint functions are smooth with respect to the integer variables. Unlike other mixed-integer methods, MISQP does not relax the integer variables. MISQP uses a branch-and-bound procedure for solving each of the successive mixed-integer quadratic programs (MIQP). MISQP guarantees convergence for convex problems and produces good results for nonconvex problems.

Modified Method of Feasible Directions (MMFD) Technique

The Modified Method of Feasible Directions (MMFD) is a direct numeric optimization technique used to solve constrained optimization problems.

Multifunction Optimization System Tool (MOST) Technique

The Multifunction Optimization System Tool (MOST) technique first solves the given design problem as if it were a purely continuous problem, using sequential quadratic programming to locate an initial peak. If all design variables are real, optimization stops here. Otherwise, for each integer or discrete variable, the technique branches out to the nearest points that satisfy that variable's value limits.

Multi-Island Genetic Algorithm Technique

In the Multi-Island Genetic Algorithm, as with other genetic algorithms, each design point is perceived as an individual with a certain fitness value, based on the value of the objective function and constraint penalty. An individual with a better value of objective function and penalty has a higher fitness value.
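The fitness assignment described above can be sketched as a penalized objective; the penalty weight and sign convention below are illustrative assumptions:

```python
def fitness(objective_value, constraint_violations, penalty_weight=1000.0):
    """Penalized-objective fitness sketch for a minimization GA: a lower
    objective and smaller constraint violation yield a higher fitness.
    Each entry of constraint_violations is g(x), infeasible when g > 0."""
    penalty = penalty_weight * sum(max(0.0, g) for g in constraint_violations)
    return -(objective_value + penalty)  # negate: the GA maximizes fitness
```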

Multi-Objective Particle Swarm Technique

Particle swarm optimization is a population-based search procedure where individuals (called particles) continuously change position (called state) within the search area. In other words, these particles "fly" around in the design space looking for the best position.
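A generic global-best particle swarm sketch in Python illustrates the update rule; the inertia and attraction coefficients below are common textbook values, not the adapter's settings:

```python
import numpy as np

def pso(f, lower, upper, n_particles=30, n_iter=200,
        w=0.7, c1=1.5, c2=1.5, seed=3):
    """Global-best PSO sketch: each particle's velocity blends inertia,
    attraction to its own best position, and attraction to the swarm's
    best position, so the swarm 'flies' toward promising regions."""
    rng = np.random.default_rng(seed)
    lower, upper = np.asarray(lower, float), np.asarray(upper, float)
    x = rng.uniform(lower, upper, (n_particles, len(lower)))
    v = np.zeros_like(x)
    pbest, pbest_f = x.copy(), np.array([f(p) for p in x])
    g = pbest[np.argmin(pbest_f)].copy()
    gf = float(pbest_f.min())
    for _ in range(n_iter):
        r1, r2 = rng.random((2, n_particles, len(lower)))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = np.clip(x + v, lower, upper)       # keep particles in bounds
        vals = np.array([f(p) for p in x])
        improved = vals < pbest_f
        pbest[improved], pbest_f[improved] = x[improved], vals[improved]
        if pbest_f.min() < gf:
            gf = float(pbest_f.min())
            g = pbest[np.argmin(pbest_f)].copy()
    return g, gf
```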

Neighborhood Cultivation Genetic Algorithm (NCGA) Technique

In the Neighborhood Cultivation Genetic Algorithm (NCGA) technique, each objective parameter is treated separately. Standard genetic operations of mutation and crossover are performed on the designs. The crossover process is based on the “neighborhood cultivation” mechanism, whereby crossover is performed mostly between individuals whose values are close with respect to one of the objectives. By the end of the optimization run, a Pareto set is constructed in which each design has the “best” combination of objective values, and improving one objective is impossible without sacrificing one or more of the others.
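Extracting the Pareto set from a pool of evaluated designs can be sketched in a few lines of Python (minimization is assumed for every objective):

```python
def pareto_front(points):
    """Return the non-dominated subset of a list of objective vectors
    (minimization): a point is discarded only if some other point is at
    least as good in every objective and strictly better in one."""
    def dominates(a, b):
        return (all(ai <= bi for ai, bi in zip(a, b))
                and any(ai < bi for ai, bi in zip(a, b)))
    return [p for p in points if not any(dominates(q, p) for q in points)]
```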

Pointer-2 Technique

The goal of the Pointer-2 technique is to make optimization more accessible to nonexpert users without sacrificing performance. The Pointer-2 technique consists of a complementary set of optimization techniques. Each technique succeeds or fails for different topography features, and a hybrid combination of these methods allows a Pointer-2 optimization to solve a broad range of design problems. The Pointer-2 strategy can control one technique at a time or all techniques at once. As the optimization proceeds, the strategy determines which techniques are most successful, as well as suitable values for their internal control parameters (for example, step sizes, numbers of iterations, and numbers of restarts).

Pointer Technique

The Pointer-2 technique is superior to the Pointer technique, and you should always use Pointer-2 over Pointer. The original Pointer technique is retained only for compatibility with older models.

Sequential Quadratic Programming (NLPQL) Technique

NLPQL is a special implementation of a Sequential Quadratic Programming (SQP) method. Proceeding from a quadratic approximation of the Lagrangian function and a linearization of constraints, a quadratic programming subproblem is formulated and solved. Depending on the number of compute nodes, objective and constraint functions can be evaluated simultaneously at predetermined test points along the search direction. The parallel line search is performed with respect to an augmented Lagrangian merit function.
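SciPy's SLSQP method, a related SQP implementation, can illustrate the approach on a small constrained problem; the problem itself is an illustrative assumption, not part of the adapter:

```python
from scipy.optimize import minimize

# Constrained problem: minimize x^2 + y^2 subject to x + y >= 1.
# The optimum lies on the constraint boundary at (0.5, 0.5).
objective = lambda x: x[0] ** 2 + x[1] ** 2
constraint = {"type": "ineq", "fun": lambda x: x[0] + x[1] - 1.0}  # g(x) >= 0

# SLSQP, like NLPQL, builds a quadratic approximation of the Lagrangian,
# linearizes the constraints, and solves the resulting QP subproblem at
# each iteration to obtain a search direction.
result = minimize(objective, x0=[2.0, 0.0], method="SLSQP",
                  constraints=[constraint])
```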

Stress Ratio Technique

The Stress Ratio Optimizer is a fully stressed design (FSD) method commonly used in structural optimization.
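For a statically determinate structure, where member forces do not change with sizing, the stress-ratio update can be sketched in Python. The single allowable stress and the fixed-force assumption below are illustrative simplifications:

```python
def fully_stressed_resize(areas, stresses, allowable, n_iter=20):
    """Fully stressed design (FSD) sketch: with fixed member forces,
    stress = force / area, so scaling each area by stress/allowable
    drives every member toward its allowable stress."""
    forces = [a * s for a, s in zip(areas, stresses)]  # constant member forces
    for _ in range(n_iter):
        stresses = [fi / ai for fi, ai in zip(forces, areas)]
        areas = [ai * si / allowable for ai, si in zip(areas, stresses)]
    return areas, [fi / ai for fi, ai in zip(forces, areas)]
```

For a determinate structure the update converges in one pass; for indeterminate structures the forces would have to be recomputed by a structural analysis at each iteration.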