Optimization, Design of Experiments Tool and Constraint Satisfaction

This topic provides you with information concerning Optimization, Design of Experiments Tool and Constraint Satisfaction.

Optimization

This section provides you with more information about optimization.

If your problem has many variables (more than 4) as free parameters:

  • Do not forget to apply ranges to the free parameters.
  • Begin with the gradient algorithm before trying the simulated annealing algorithm.

Gradient

The gradient algorithm behaves better with squared formulations (this is especially relevant for target values). Hence the following problem:

Given: volume = x*y*z.
Find x, y and z such that volume = 1000.

is better solved if the following formula is given to the optimizer: objective = (volume - 1000)²
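As an illustration (Python is used here only as a neutral notation; this is a minimal sketch, not the product's solver), gradient descent on the squared objective converges smoothly because the gradient shrinks as the target is approached. The starting point, step size, and iteration count are arbitrary choices for this example.

```python
# Sketch: gradient descent on the squared objective (volume - 1000)**2,
# with volume = x*y*z. All tuning constants are illustrative assumptions.

def volume(x, y, z):
    return x * y * z

def grad(x, y, z, target=1000.0):
    # Analytic gradient of (volume - target)**2 via the chain rule.
    r = volume(x, y, z) - target
    return 2 * r * y * z, 2 * r * x * z, 2 * r * x * y

x = y = z = 8.0      # arbitrary starting point
step = 1e-5          # fixed step size chosen for this example
for _ in range(200):
    gx, gy, gz = grad(x, y, z)
    x, y, z = x - step * gx, y - step * gy, z - step * gz

print(round(volume(x, y, z), 1))  # converges towards 1000
```

Because the square vanishes smoothly at the target, the update steps shrink automatically near the solution, which is exactly the behavior a gradient method needs.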

Chaining algorithms

In most cases the properties of the functions used inside the optimization problem are unknown. In this case, it is recommended to use the global search algorithm (Simulated Annealing). However, as this algorithm can take a long time to converge (especially when there are many free parameters), it can be helpful to run a local search (gradient) for a few iterations before switching to the global search (Simulated Annealing). Finally, when the global search has converged and the results must be refined, reduce the ranges around the solution found and restart a slow local search.
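The chaining strategy above can be sketched in Python (a toy stand-in, not the product's algorithms): a few cheap gradient iterations from the initial point, a simulated annealing pass over the full ranges, then a slow local search to refine. The test function and every tuning constant are illustrative assumptions.

```python
import math
import random

random.seed(0)

def f(x):
    # Toy multimodal function: many local minima, global minimum at x = 0.
    return x * x - 10.0 * math.cos(2.0 * math.pi * x) + 10.0

def local_search(x, iters, step):
    # Crude fixed-step gradient descent using a finite-difference slope.
    h = 1e-6
    for _ in range(iters):
        slope = (f(x + h) - f(x - h)) / (2.0 * h)
        x -= step * slope
    return x

def simulated_annealing(x, lo, hi, iters=2000):
    # Basic annealing: random moves, worse moves accepted while "hot".
    best = x
    for i in range(iters):
        t = 10.0 * (1.0 - i / iters) + 1e-9
        cand = min(hi, max(lo, x + random.gauss(0.0, 0.5)))
        if f(cand) < f(x) or random.random() < math.exp((f(x) - f(cand)) / t):
            x = cand
            best = x if f(x) < f(best) else best
    return best

x = local_search(4.3, iters=20, step=1e-3)   # 1. a few fast local iterations
x = simulated_annealing(x, -5.12, 5.12)      # 2. global search over the range
x = local_search(x, iters=500, step=1e-4)    # 3. slow local refinement
```

Step 3 mirrors the recommendation above: once the global search has settled, a narrow, slow local search polishes the result.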

Several constraints

  • Some optimization problems contain a large number of constraints relative to the number of free parameters. In this case the problem can be over-constrained, i.e. there is no feasible region (no set of free parameter values for which all constraints are satisfied). The Global Search (Simulated Annealing) helps to reduce the constraint violations even in this latter case. However, it does not guarantee that the feasible region will be reached even if it exists.
  • The evolution of the distances to satisfaction (which can be displayed with the graphs) is useful for identifying constraints that are difficult to satisfy. It is sometimes better to deactivate all other constraints in order to identify a potential zone of satisfaction for these constraints only.
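Reading the distances to satisfaction can be sketched as follows (the constraints and values are made up for illustration): each constraint is written in the form g ≤ 0, its distance to satisfaction is max(0, g), and the largest distance flags the hardest constraint.

```python
# Hypothetical constraints, each written in the form g(p) <= 0.
constraints = {
    "x + y <= 10": lambda p: p["x"] + p["y"] - 10.0,
    "x*y >= 30":   lambda p: 30.0 - p["x"] * p["y"],
    "x - y <= 1":  lambda p: p["x"] - p["y"] - 1.0,
}

def distances_to_satisfaction(point):
    # A satisfied constraint is at distance 0; a violated one at distance g.
    return {name: max(0.0, g(point)) for name, g in constraints.items()}

d = distances_to_satisfaction({"x": 6.0, "y": 3.0})
hardest = max(d, key=d.get)
print(hardest, d[hardest])  # the constraint currently farthest from satisfied
```

Tracking these distances over the iterations is what the graphs display; a distance that stays large identifies a constraint worth isolating.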

Recommendations

Always use well-constrained sketches when they are involved in an optimization. Under-constrained sketches can lead to wrong solutions or collapsed geometries. It is also recommended to limit the ranges of the free parameters to reasonable values.

Constraint Satisfaction

This section provides you with more information about constraint satisfaction.

Each output variable is bounded by two values, an upper bound and a lower bound. These bounds are computed by the constraint satisfaction solver.

Given two parameters of type real, x and y, create the constraint satisfaction:

x² + y² < 25

The following parameters are displayed in the Formula editor, indicating the lower and the upper bounds (in this example, -5 and 5 for both x and y).
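The bounds for this example can be checked numerically. The sketch below (an illustration, not the product's interval solver) samples candidate points, keeps the feasible ones, and recovers the projection of the feasible disk onto x, which is the interval (-5, 5).

```python
import random

random.seed(1)

# Sample candidate (x, y) points and keep those satisfying x**2 + y**2 < 25.
feasible_x = [
    x
    for x, y in ((random.uniform(-10, 10), random.uniform(-10, 10))
                 for _ in range(100_000))
    if x * x + y * y < 25.0
]

# The empirical bounds of x approach the exact ones, -5 and 5.
print(min(feasible_x), max(feasible_x))
```

The same projection argument gives the bounds of y; by symmetry of the constraint they are also -5 and 5.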

Additional Information

This section provides you with additional information.

  • Only one objective parameter, f(x), can be optimized at a time.
  • Continuity of parameter values is required: multiple-value (discrete) parameters cannot be used as free parameters.
  • Using a parameter which is constrained by a relation as a free parameter may lead to unpredictable results.
  • For the gradient-based algorithm (local search), set the number of updates without improvement to at least three times the number of free parameters.
  • If you want to modify free parameters driven by an Equivalent Dimension feature, select the Value parameter located below the Equivalent Dimension feature. Ranges and steps cannot be applied directly to the parameters making up the Equivalent Dimension; only the Value parameter located below the Equivalent Dimension feature accepts them. If you apply ranges and/or steps to the Value parameter, the parameters making up the equivalent dimension inherit the same ranges and/or steps.
  • For gradient-based algorithms (except when a derivative provider is available), the gradients are computed by finite differences. This is one cause of imprecision. A second cause of imprecision comes from the measures provided by the various apps. The precision of the optimization results depends on these two causes.
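The finite-difference imprecision can be seen on a toy function (a sketch; the central-difference scheme and step sizes here are assumptions, not the product's internals): the truncation error shrinks as the step h decreases, but below some h floating-point rounding starts to dominate.

```python
def f(x):
    return x ** 3

def fd_slope(x, h):
    # Central finite difference: truncation error is O(h**2), while
    # floating-point rounding error grows roughly like 1/h.
    return (f(x + h) - f(x - h)) / (2.0 * h)

exact = 12.0  # f'(2) = 3 * 2**2
for h in (1e-1, 1e-3, 1e-6, 1e-12):
    print(h, abs(fd_slope(2.0, h) - exact))
```

Running this shows the error first falling with h and then rising again for very small h, which is why finite-difference gradients carry an irreducible imprecision.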