Optimization Search Methods

The search method refers to the approach the optimization algorithm takes to locate a new design point that has a lower objective function value or is more feasible than the current design point.

MotionSolve uses the SciPy library, specifically the scipy.optimize.minimize function, to find a solution. This function supports several search methods.
Note: Most of our testing has been performed with the Sequential Least Squares Programming (SLSQP) algorithm; a minimal sketch of a direct SLSQP call is shown below.
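For orientation, here is a minimal sketch of calling scipy.optimize.minimize directly with SLSQP, assuming a placeholder quadratic objective, one inequality constraint, and simple bounds; it illustrates the underlying SciPy interface rather than the MotionSolve optimization API.

import numpy as np
from scipy.optimize import minimize

def objective(x):
    # Placeholder quadratic objective; unconstrained minimum at (1, 2).
    return (x[0] - 1.0) ** 2 + (x[1] - 2.0) ** 2

# SLSQP accepts simple bounds plus equality ("eq") and inequality ("ineq")
# constraints; inequality constraints are expressed as g(x) >= 0.
bounds = [(0.0, 5.0), (0.0, 5.0)]
constraints = [{"type": "ineq", "fun": lambda x: x[0] + x[1] - 4.0}]

result = minimize(objective, x0=np.array([4.0, 4.0]), method="SLSQP",
                  bounds=bounds, constraints=constraints)
print(result.x, result.fun)  # optimum (1.5, 2.5) on the active constraint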
The different algorithms supported by scipy.optimize.minimize are shown below:
Table 1. Search methods supported by scipy.optimize.minimize. Each entry gives the algorithm name, its problem type (constrained or unconstrained), and a description.
L-BFGS-B (Constrained Method)
L-BFGS-B (Limited-memory BFGS with Bounds) is an optimization algorithm in the family of quasi-Newton methods that approximates the Broyden–Fletcher–Goldfarb–Shanno (BFGS) algorithm using a limited amount of computer memory. It is applicable only for simple bound-constrained optimization, that is, for problems where the only constraints are of the form $b_L \le b \le b_U$ (see the bound-constrained sketch after the table).
TNC (Constrained Method)
TNC uses a truncated Newton algorithm to minimize a function with variables subject to bounds. It is applicable only for simple bound-constrained optimization, that is, for problems where the only constraints are of the form $b_L \le b \le b_U$. This algorithm uses gradient information; it is also called Newton Conjugate-Gradient.
COBYLA (Constrained Method)
COBYLA uses the Constrained Optimization BY Linear Approximation method. The algorithm is based on linear approximations to the objective function and each constraint.
SLSQP (Constrained Method)
SLSQP uses Sequential Least SQuares Programming to minimize a function of several variables with any combination of bounds, equality, and inequality constraints.
Nelder-Mead (Unconstrained Method)
Nelder-Mead uses the Simplex algorithm. This algorithm is robust in many applications. However, if the numerical computation of derivatives can be trusted, algorithms that use first and/or second derivative information might be preferred for their generally better performance (see the derivative-free sketch after the table).
Powell (Unconstrained Method)
Powell is a modification of Powell's method, a conjugate direction method. It performs sequential one-dimensional minimizations along each vector of the direction set, which is updated at each iteration of the main minimization loop. The function need not be differentiable, and no derivatives are taken.
CG (Unconstrained Method)
CG uses a nonlinear conjugate gradient algorithm by Polak and Ribière. This is a variant of the Fletcher-Reeves method. Only the first derivatives are used.
BFGS (Unconstrained Method)
BFGS uses the quasi-Newton method of Broyden, Fletcher, Goldfarb, and Shanno. It uses the first derivatives only. BFGS has proven good performance even for non-smooth optimizations. This method also returns an approximation of the Hessian inverse.
Newton-CG (Unconstrained Method)
Newton-CG uses a Newton-CG algorithm (also known as the truncated Newton method). It uses a CG method to compute the search direction. See also the TNC method for a box-constrained minimization with a similar algorithm.
dogleg (Unconstrained Method)
dogleg uses the dog-leg trust-region algorithm for unconstrained minimization. This algorithm requires the gradient and Hessian; furthermore, the Hessian is required to be positive definite.
trust-ncg (Unconstrained Method)
trust-ncg uses the Newton conjugate gradient trust-region algorithm for unconstrained minimization. This algorithm requires the gradient and either the Hessian or a function that computes the product of the Hessian with a given vector (see the trust-region sketch after the table).
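As referenced in the L-BFGS-B row, the sketch below shows a bound-constrained run where the only constraints are of the form $b_L \le b \le b_U$; TNC is invoked the same way with method="TNC". The objective is an illustrative placeholder.

import numpy as np
from scipy.optimize import minimize

def objective(x):
    # Placeholder objective whose unconstrained minimum, x = 3,
    # lies outside the bounds below.
    return float(np.sum((x - 3.0) ** 2))

bounds = [(0.0, 2.0)] * 3  # b_L = 0, b_U = 2 for every design variable
result = minimize(objective, x0=np.zeros(3), method="L-BFGS-B", bounds=bounds)
print(result.x)  # clipped to the upper bound, approximately [2., 2., 2.]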
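The Nelder-Mead and Powell rows note that no derivatives are required. The sketch below minimizes a non-differentiable placeholder objective with Nelder-Mead; Powell is selected the same way with method="Powell".

import numpy as np
from scipy.optimize import minimize

def objective(x):
    # Non-differentiable placeholder objective; its minimum is at (1, -2).
    return abs(x[0] - 1.0) + abs(x[1] + 2.0)

result = minimize(objective, x0=np.array([5.0, 5.0]), method="Nelder-Mead")
print(result.x)  # approximately [1., -2.]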
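Finally, the trust-region sketch referenced in the trust-ncg row supplies the gradient and Hessian of SciPy's built-in Rosenbrock test function. dogleg is invoked the same way but additionally requires the Hessian to be positive definite.

import numpy as np
from scipy.optimize import minimize, rosen, rosen_der, rosen_hess

# rosen, rosen_der, and rosen_hess are SciPy's Rosenbrock test function
# and its analytic gradient and Hessian.
result = minimize(rosen, x0=np.array([0.5, 0.5]), method="trust-ncg",
                  jac=rosen_der, hess=rosen_hess)
print(result.x)  # approximately [1., 1.], the Rosenbrock minimum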