Broyden–Fletcher–Goldfarb–Shanno algorithm
The BFGS method belongs to the family of quasi-Newton methods, a class of hill-climbing optimization techniques that seek a stationary point of a (preferably twice continuously differentiable) function. For such problems, a necessary condition for optimality is that the gradient be zero. Neither Newton's method nor the BFGS method is guaranteed to converge unless the function has a quadratic Taylor expansion near an optimum. However, BFGS can have acceptable performance even for non-smooth optimization instances.[2]
In quasi-Newton methods, the Hessian matrix of second derivatives is not computed. Instead, the Hessian matrix is approximated using updates specified by gradient evaluations (or approximate gradient evaluations). Quasi-Newton methods generalize the secant method, using it to find a root of the first derivative in multidimensional problems. In multi-dimensional problems, the secant equation does not specify a unique solution, and quasi-Newton methods differ in how they constrain the solution. The BFGS method is one of the most popular members of this class.[3] Also in common use is L-BFGS, a limited-memory version of BFGS that is particularly suited to problems with very large numbers of variables (e.g., >1000). The L-BFGS-B variant handles simple box constraints.[4]
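For concreteness, the secant equation mentioned above requires the updated approximation $B_{k+1}$ to map the most recent step into the corresponding change in the gradient:

$$B_{k+1}\,(\mathbf{x}_{k+1} - \mathbf{x}_k) = \nabla f(\mathbf{x}_{k+1}) - \nabla f(\mathbf{x}_k).$$

For $n > 1$ variables this gives only $n$ equations for the $n(n+1)/2$ free entries of a symmetric $B_{k+1}$, which is why an additional criterion is needed to single out one update.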
Rationale
The search direction $p_k$ at stage $k$ is given by the solution of the analogue of the Newton equation:

$$B_k p_k = -\nabla f(\mathbf{x}_k),$$

where $B_k$ is an approximation to the Hessian matrix at $\mathbf{x}_k$, which is updated iteratively at each stage, and $\nabla f(\mathbf{x}_k)$ is the gradient of the function evaluated at $\mathbf{x}_k$. A line search in the direction $p_k$ is then used to find the next point $\mathbf{x}_{k+1}$.
Algorithm
From an initial guess $\mathbf{x}_0$ and an initial approximate Hessian $B_0$ (often the identity), the following steps are repeated until $\mathbf{x}_k$ converges to the solution:

1. Obtain a direction $p_k$ by solving $B_k p_k = -\nabla f(\mathbf{x}_k)$.
2. Perform a one-dimensional optimization (line search) to find an acceptable stepsize $\alpha_k$ in the direction found in the first step, so $\alpha_k = \arg\min_{\alpha} f(\mathbf{x}_k + \alpha p_k)$.
3. Set $s_k = \alpha_k p_k$ and update $\mathbf{x}_{k+1} = \mathbf{x}_k + s_k$.
4. Compute $y_k = \nabla f(\mathbf{x}_{k+1}) - \nabla f(\mathbf{x}_k)$.
5. Update the Hessian approximation: $B_{k+1} = B_k + \dfrac{y_k y_k^{\mathsf T}}{y_k^{\mathsf T} s_k} - \dfrac{B_k s_k s_k^{\mathsf T} B_k}{s_k^{\mathsf T} B_k s_k}$.
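A minimal NumPy sketch of these five steps follows (the names are illustrative, not any particular library's API). It substitutes a backtracking Armijo line search for the exact one-dimensional minimization of step 2, and it skips the rank-two update whenever the curvature condition $y_k^{\mathsf T} s_k > 0$ fails, since the update would otherwise destroy positive definiteness:

```python
import numpy as np

def bfgs(f, grad, x0, tol=1e-6, max_iter=200):
    """Didactic BFGS: maintains the Hessian approximation B_k directly."""
    x = np.asarray(x0, dtype=float)
    B = np.eye(x.size)                 # B_0: initial Hessian approximation
    g = grad(x)
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:    # stationary point (to tolerance)
            break
        # Step 1: solve B_k p_k = -grad f(x_k) for the search direction.
        p = np.linalg.solve(B, -g)
        # Step 2: backtracking (Armijo) line search for the stepsize.
        alpha, c, rho = 1.0, 1e-4, 0.5
        while f(x + alpha * p) > f(x) + c * alpha * g.dot(p):
            alpha *= rho
        # Step 3: s_k = alpha_k p_k and x_{k+1} = x_k + s_k.
        s = alpha * p
        x_new = x + s
        # Step 4: y_k = grad f(x_{k+1}) - grad f(x_k).
        g_new = grad(x_new)
        y = g_new - g
        # Step 5: rank-two BFGS update, guarded by the curvature condition.
        ys = y.dot(s)
        if ys > 1e-10:
            Bs = B.dot(s)
            B = B + np.outer(y, y) / ys - np.outer(Bs, Bs) / s.dot(Bs)
        x, g = x_new, g_new
    return x

# Usage: the Rosenbrock function, a standard unconstrained test problem.
rosen = lambda z: (1 - z[0])**2 + 100 * (z[1] - z[0]**2)**2
rosen_grad = lambda z: np.array([
    -2 * (1 - z[0]) - 400 * z[0] * (z[1] - z[0]**2),
    200 * (z[1] - z[0]**2),
])
print(bfgs(rosen, rosen_grad, [-1.2, 1.0]))  # converges near [1.0, 1.0]
```

Production implementations usually update the inverse $H_k = B_k^{-1}$ (or a Cholesky factor of $B_k$) instead of solving a linear system at every iteration; the sketch omits this for clarity.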
In statistical estimation problems (such as maximum likelihood or Bayesian inference), credible intervals or confidence intervals for the solution can be estimated from the inverse of the final Hessian matrix. However, these quantities are technically defined by the true Hessian matrix, and the BFGS approximation may not converge to the true Hessian matrix.[10]
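As a toy illustration of this use (the model, data, and numbers here are invented for the example), SciPy's minimize with method="BFGS" exposes the final inverse-Hessian approximation as hess_inv, whose diagonal is commonly read as approximate variances, subject to the caveat above:

```python
import numpy as np
from scipy.optimize import minimize

# Negative log-likelihood (up to a constant) of i.i.d. N(mu, 1) data.
rng = np.random.default_rng(0)
data = rng.normal(loc=2.0, scale=1.0, size=500)
nll = lambda theta: 0.5 * np.sum((data - theta[0]) ** 2)

res = minimize(nll, x0=[0.0], method="BFGS")
se = np.sqrt(np.diag(res.hess_inv))          # approximate standard errors
print(res.x, "+/-", 1.96 * se)               # rough 95% interval for mu
```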
Notable implementations
The large-scale nonlinear optimization software Artelys Knitro implements, among others, both the BFGS and L-BFGS algorithms.
The GSL implements BFGS as gsl_multimin_fdfminimizer_vector_bfgs2.[11]
In the MATLAB Optimization Toolbox, the fminunc function[12] uses BFGS with cubic line search when the problem size is set to "medium scale."[13]
In R, the BFGS algorithm (and the L-BFGS-B version that allows box constraints) is implemented as an option of the base function optim().[14]
In SciPy, the scipy.optimize.fmin_bfgs function implements BFGS.[15] It is also possible to obtain essentially the same behavior from the L-BFGS routines by setting the memory parameter (the number of stored correction pairs) to a very large number, as sketched below.
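A short usage sketch of these SciPy routines, using SciPy's built-in Rosenbrock test function and gradient (rosen and rosen_der):

```python
from scipy.optimize import fmin_bfgs, fmin_l_bfgs_b, rosen, rosen_der

x0 = [-1.2, 1.0]

# Plain BFGS.
xopt = fmin_bfgs(rosen, x0, fprime=rosen_der)

# L-BFGS-B with a very large memory parameter m behaves much like full BFGS.
xopt_l, fval, info = fmin_l_bfgs_b(rosen, x0, fprime=rosen_der, m=1000)

print(xopt, xopt_l)  # both close to [1.0, 1.0]
```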
See also
BHHH algorithm
Davidon–Fletcher–Powell formula
Gradient descent
L-BFGS
Levenberg–Marquardt algorithm
Nelder–Mead method
Pattern search (optimization)
Quasi-Newton methods
Symmetric rank-one