fmincon: diagnostics of the Hessian approximation used

foboo on 16 Jan 2018
Edited: foboo on 16 Jan 2018
I am using fmincon to solve an optimization problem where I only supply an analytical gradient for the objective. For the Hessian approximation I would like to try which option works best. However, I don't understand the diagnostics printed by fmincon; please see below.
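For reference, the objective is written in the usual two-output form that fmincon expects when the objective gradient is supplied: function value plus analytical gradient. A minimal sketch with a hypothetical quadratic objective (not my actual mhObjective) would be:
function [f, g] = quadObjective(x, H, c)
% Hypothetical quadratic objective: f(x) = 0.5*x'*H*x + c'*x.
% The analytical gradient is returned as a second output, as required
% when 'GradObj'/'SpecifyObjectiveGradient' is enabled.
f = 0.5*(x.'*H*x) + c.'*x;
if nargout > 1
    g = H*x + c;   % analytical gradient
end
end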
I am using the following call to set up the options for fmincon, where I set BFGS as the Hessian approximation method:
options = optimoptions(@fmincon, 'GradObj','on','DerivativeCheck','off','MaxFunEvals',1e4,'Display','iter-detailed','algorithm','interior-point', ...
'SpecifyConstraintGradient', true ,'FinDiffType','central','MaxIterations', 5e2, 'HessianApproximation','bfgs' ,'Diagnostics', 'on' )
The options seem to have been accepted by optimoptions, as it prints:
options =
fmincon options:
Options used by current Algorithm ('interior-point'):
(Other available algorithms: 'active-set', 'sqp', 'sqp-legacy', 'trust-region-reflective')
Set properties:
Algorithm: 'interior-point'
CheckGradients: 0
Display: 'iter-detailed'
FiniteDifferenceType: 'central'
HessianApproximation: 'bfgs'
MaxFunctionEvaluations: 10000
MaxIterations: 500
SpecifyConstraintGradient: 1
SpecifyObjectiveGradient: 1
Default properties:
ConstraintTolerance: 1.0000e-06
FiniteDifferenceStepSize: 'eps^(1/3)'
HessianFcn: []
HessianMultiplyFcn: []
HonorBounds: 1
ObjectiveLimit: -1.0000e+20
OptimalityTolerance: 1.0000e-06
OutputFcn: []
PlotFcn: []
ScaleProblem: 0
StepTolerance: 1.0000e-10
SubproblemAlgorithm: 'factorization'
TypicalX: 'ones(numberOfVariables,1)'
UseParallel: 0
Options not used by current Algorithm ('interior-point')
Default properties:
FunctionTolerance: 1.0000e-06
However, when running fmincon with the settings above, the diagnostics say that "finite-differencing (or Quasi-Newton)" is used for the Hessian. What exactly does that mean? Is it using BFGS or not? Why is Quasi-Newton only mentioned in brackets?
The core of the question is actually that the performance (speed and accuracy) of fmincon is much lower than that of the Knitro trial version (same configuration), and I'd like to figure out why.
Diagnostic Information
Number of variables: 1200
Functions
Objective and gradient: @(x)mhObjective(obj,x)
Hessian: finite-differencing (or Quasi-Newton)
Constraints
Nonlinear constraints: do not exist
Number of linear inequality constraints: 0
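To compare the approximation options myself, I have been timing runs along the lines of the sketch below (fun, x0, lb and ub are hypothetical placeholders for my actual 1200-variable problem, which has no nonlinear constraints):
% Reuse the options object above, switching only the Hessian approximation
opts_bfgs  = optimoptions(options, 'HessianApproximation', 'bfgs');
opts_lbfgs = optimoptions(options, 'HessianApproximation', 'lbfgs');
tic
[x1, f1, flag1, out1] = fmincon(fun, x0, [],[],[],[], lb, ub, [], opts_bfgs);
t1 = toc;
tic
[x2, f2, flag2, out2] = fmincon(fun, x0, [],[],[],[], lb, ub, [], opts_lbfgs);
t2 = toc;
fprintf('bfgs : %d iterations, %.1f s, f = %g\n', out1.iterations, t1, f1);
fprintf('lbfgs: %d iterations, %.1f s, f = %g\n', out2.iterations, t2, f2);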

Answers (0)
