Optimization.BFGS Function

Minimizes a function of several variables using the quasi-Newton optimization algorithm.
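As background (these are the standard quasi-Newton update formulas, not text taken from this library's documentation): both update schemes maintain an approximation H of the inverse Hessian, which is also why the routine can return it through the IHess parameter. Writing s = x(k+1) - x(k), y = grad f(x(k+1)) - grad f(x(k)), rho = 1/(y'*s) and using ' for transposition, the BFGS update is

H(k+1) = (I - rho*s*y') * H(k) * (I - rho*y*s') + rho*s*s'

while the DFP update (selected with the DFPAlgo parameter) is

H(k+1) = H(k) + s*s'/(y'*s) - (H(k)*y*y'*H(k)) / (y'*H(k)*y)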

Pascal
function BFGS(Fun: TRealFunction; Grad: TGrad; var Pars: Array of double; Const Consts: array of double; Const ObjConst: Array of TObject; out FMin: double; const IHess: TMtx; out StopReason: TOptStopReason; const FloatPrecision: TMtxFloatPrecision; DFPAlgo: Boolean = false; SoftLineSearch: boolean = true; MaxIter: Integer = 500; Tol: double = 1.0E-8; GradTol: double = 1.0E-8; const Verbose: TStrings = nil): Integer; overload;
Parameters 
Description 
Fun 
Real function (must be of TRealFunction type) to be minimized. 
Grad 
The gradient procedure (must be of TGrad type), used for calculating the gradient of Fun. 
Pars 
Stores the initial estimates for the parameters (the starting point of the search). After the call returns, it holds the computed parameter values (the position of the minimum). 
Consts 
Additional constant parameters passed to Fun (can be, and usually is, nil). 
ObjConst 
Additional constant object parameters passed to Fun (can be, and usually is, nil). 
FMin 
Returns function value at minimum. 
IHess 
Returns inverse Hessian matrix. 
StopReason 
Returns reason why minimum search stopped (see TOptStopReason). 
FloatPrecision 
Specifies the floating point precision to be used by the routine. 
DFPAlgo 
If True, the BFGS procedure will use the Davidon-Fletcher-Powell (DFP) Hessian update scheme. If False, the BFGS procedure will use the Broyden-Fletcher-Goldfarb-Shanno (BFGS) Hessian update scheme. 
SoftLineSearch 
If True, the BFGS internal line search algorithm will use the soft line search method. Set SoftLineSearch to True if you are using a numerical approximation of the gradient. If SoftLineSearch is False, the BFGS internal line search algorithm will use the exact line search method. Set SoftLineSearch to False if you are using the exact (analytical) gradient. 
MaxIter 
Maximum allowed number of minimum search iterations. 
Tol 
Desired tolerance for Pars (the position of the minimum). 
GradTol 
Minimum allowed C-norm of the gradient. 
Verbose 
If assigned, stores the value of Fun evaluated at each iteration step. Optionally, you can also pass a TOptControl object to the Verbose parameter. This allows the optimization procedure to be interrupted from another thread and optionally also allows logging and iteration count monitoring (see the sketch below). 

Return value

The number of iterations required to reach the solution (the minimum) within the given tolerance.
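For illustration, here is a minimal sketch (not part of the original example) of a call that overrides the optional parameters and uses the Verbose logging described above. It assumes the Banana and GradBanana routines from the example further below; the procedure name SketchWithOptions is hypothetical.

uses Classes, MtxVec, Math387, Optimization;

procedure SketchWithOptions;
var Iters: integer;
    FMin: double;
    Pars: array [0..1] of double;
    IHess: TMtx;
    StopReason: TOptStopReason;
    Log: TStringList;
begin
  Pars[0] := 0;  // initial estimate for x1
  Pars[1] := 0;  // initial estimate for x2
  IHess := TMtx.Create;
  Log := TStringList.Create;  // receives Fun evaluated at each iteration
  try
    Iters := BFGS(Banana, GradBanana, Pars, [], [], FMin, IHess, StopReason, mvDouble,
                  True,     // DFPAlgo: use the DFP Hessian update scheme
                  False,    // SoftLineSearch: exact line search (analytical gradient supplied)
                  200,      // MaxIter
                  1.0E-10,  // Tol
                  1.0E-10,  // GradTol
                  Log);     // Verbose
    // Log now holds one entry per iteration; StopReason reports why the search stopped
  finally
    Log.Free;
    IHess.Free;
  end;
end;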

Problem: Find the minimum of the "Banana" function by using the BFGS-DFP method. 

Solution: The Banana function is defined by the following equation:

f(x1, x2) = 100*(x2 - x1^2)^2 + (1 - x1)^2

The BFGS method also requires the gradient of the function. The gradient of the Banana function is:

df/dx1 = -400*(x2 - x1^2)*x1 - 2*(1 - x1)
df/dx2 = 200*(x2 - x1^2)

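A quick check of these formulas: at x1 = 1, x2 = 1 the gradient components are -400*(1 - 1)*1 - 2*(1 - 1) = 0 and 200*(1 - 1) = 0, and f(1,1) = 0. Since f is a sum of squares, 0 is its smallest possible value, so (1,1) is the global minimum that the routine should locate from the initial estimate (0,0).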
uses MtxVec, Math387, Optimization, MtxIntDiff;

function Banana(const Pars: TVec; const Consts: TVec; const OConsts: Array of TObject): double;
begin
  Banana := 100*Sqr(Pars[1] - Sqr(Pars[0])) + Sqr(1 - Pars[0]);
end;

procedure GradBanana(Fun: TRealFunction; const Pars: TVec; const Consts: TVec;
                     const ObjConsts: Array of TObject; const Grad: TVec);
begin
  Grad[0] := -400*(Pars[1] - Sqr(Pars[0]))*Pars[0] - 2*(1 - Pars[0]);
  Grad[1] := 200*(Pars[1] - Sqr(Pars[0]));
end;

procedure Example;
var Iters: integer;
    FMin: double;
    Pars: Array [0..1] of double;
    IHess: TMtx;
    StopReason: TOptStopReason;
begin
  // initial estimates for x1 and x2
  Pars[0] := 0;
  Pars[1] := 0;
  IHess := TMtx.Create;
  try
    // DFPAlgo = True selects the Davidon-Fletcher-Powell update scheme
    Iters := BFGS(Banana, GradBanana, Pars, [], [], FMin, IHess, StopReason, mvDouble, True);
    // stops when Iters > 500 (default MaxIter) or the tolerance falls below 1e-8 (default Tol)
    // Returns Pars = [1,1] and FMin = 0, meaning x1 = 1, x2 = 1 and the minimum value is 0
  finally
    IHess.Free;
  end;
end;
#include "MtxExpr.hpp" #include "Math387.hpp" #include "Optimization.hpp" #include "MtxIntDiff.hpp" // Objective function double __fastcall Banana(TVec* const Parameters, TVec* const Constants, System::TObject* const * ObjConst, const int ObjConst_Size) { double* Pars = Parameters->PValues1D(0); return 100.0*IntPower(Pars[1]-IntPower(Pars[0],2),2)+IntPower(1.0-Pars[0],2); } // Analytical gradient of the objective function void __fastcall GradBanana(TRealFunction Fun, TVec* const Parameters, TVec* const Consts, System::TObject* const * ObjConst, const int PConsts_Size, Mtxvec::TVec* const Grad) { double* Pars = Parameters->PValues1D(0); Grad->Values[0] = -400*(Pars[1]-IntPower(Pars[0],2))*Pars[0] - 2*(1-Pars[0]); Grad->Values[1] = 200*(Pars[1]-IntPower(Pars[0],2)); } void __fastcall Example(); { double Pars[2]; double fmin; TOptStopReason StopReason; // initial estimates for x1 and x2 Pars[0] = 0; Pars[1] = 0; int iters = BFGS(Banana,GradBanana,Pars,1,NULL,-1,NULL,-1,fmin,iHess, StopReason,mvDouble, false,true,1000,1.0e-8,1.0e-8,NULL); // stop if Iters >1000 or Tolerance < 1e-8 }
Examples on GitHub
Copyright (c) 1999-2025 by Dew Research. All rights reserved.