First-Order Optimality Measure

Devin on 8 Feb 2019
Edited: Matt J on 8 Feb 2019
Greetings,
Currently, I am working on a nonlinear constrained optimization problem and am using the MultiStart function to search for possible local minima.
Here is a simple example of the output I got.
[Attached screenshot Capture.PNG: list of MultiStart local solutions showing exit flags of 2 and their first-order optimality measures]
Q1: Although these solutions in the list have exit flag 2, the first-order optimality measure is nonzero. The necessary condition for a local minimum is that this measure be zero (my problem is a constrained optimization, so I believe the relevant necessary condition is the KKT form: https://www.mathworks.com/help/optim/ug/first-order-optimality-measure.html). Are these solutions local minima?
Q2: I notice that the first-order optimality measures of most solutions are very large; only one solution's is relatively small, at 23.32. Does this mean the optimization problem is very nonsmooth? For a nonsmooth, nonlinear, constrained optimization problem, which algorithm should be adopted for efficiency?
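For reference, here is a minimal sketch of the kind of setup I am using; the objective, constraint, bounds, and number of start points below are just placeholders, not my actual problem:

% Minimal sketch of a MultiStart + fmincon setup (placeholder problem)
rng default                                        % reproducible start points
obj   = @(x) (x(1)-1)^2 + 100*(x(2)-x(1)^2)^2;     % placeholder nonlinear objective
nlcon = @(x) deal(x(1)^2 + x(2)^2 - 4, []);        % placeholder inequality c(x) <= 0, no equalities

problem = createOptimProblem('fmincon', ...
    'objective', obj, ...
    'x0', [0.5; 0.5], ...
    'lb', [-5; -5], 'ub', [5; 5], ...
    'nonlcon', nlcon);

ms = MultiStart;
[xbest, fbest, exitflag, output, solutions] = run(ms, problem, 50);   % 50 start points

% List each local solution's exit flag and first-order optimality measure
for k = 1:numel(solutions)
    fprintf('fval = %g, exitflag = %d, first-order optimality = %g\n', ...
        solutions(k).Fval, solutions(k).Exitflag, solutions(k).Output.firstorderopt);
end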
Thank you very much!

Answers (1)

Matt J on 8 Feb 2019
Edited: Matt J on 8 Feb 2019
Q1: Although these solutions in the list have exit flag 2, the first-order optimality measure is nonzero.
It is an iterative solver. It tries to converge to a local minimum, but typically will not land on one exactly. The solver has various stopping criteria that it uses to decide when it is close enough to a local minimum. In your case, exitflag = 2 means the algorithm stopped because it found it was taking very small steps, which can be an even better indication of proximity to a minimum than the absolute size of the gradient (or first-order optimality measure).
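As a rough illustration of how to see which criterion fired, assuming fmincon is the local solver (the objective, bounds, and tolerance values below are just a stand-in):

% Sketch: inspect why fmincon stopped (stand-in problem)
opts = optimoptions('fmincon', 'StepTolerance', 1e-10, 'OptimalityTolerance', 1e-6);
[x, fval, exitflag, output] = fmincon(@(x) 1e9*x.^2/2, 1, ...
    [], [], [], [], -10, 10, [], opts);

exitflag               % 2 means the change in x fell below StepTolerance
output.stepsize        % size of the final step
output.firstorderopt   % first-order optimality measure at the returned point
output.message         % full text of the stopping message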
One reason small steps may occur, even though the gradient is large in absolute numbers, is that the gradient is nevertheless small in relation to the local curvature. Many/most solvers in MATLAB use curvature-weighted gradient steps a la Newton's method. So, for example, the single-variable function f(x) = 10^9*x^2/2 may appear at x = 0.00001 to have a large derivative (10^4), but the derivative is actually quite small when compared to the curvature (10^9), so curvature-weighted algorithms will take small steps. And that is appropriate here, since x = 0.00001 is quite close to the minimum at x = 0, assuming you consider x = 1 to be far away.
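Putting numbers on that example (just the arithmetic, not solver output):

% Raw gradient vs. curvature-weighted (Newton) step for f(x) = 10^9*x^2/2
x    = 1e-5;
grad = 1e9 * x;       % f'(x)  = 1e4   -- large in absolute terms
curv = 1e9;           % f''(x) = 1e9   -- very high curvature
step = grad / curv;   % Newton step = 1e-5 -- tiny, so the solver stops
fprintf('gradient = %g, Newton step = %g\n', grad, step);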
Q2: Most first-order optimality measures of the solutions are very large and only one solution's is small, at 23.32. Does it mean that the optimization problem is very nonsmooth?
It could mean many things, and not showing us your actual code leaves a lot of room for guessing. For example, continuing from the example above, it may mean that solutions with both very high and very low curvature were found.
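As a quick sketch of that possibility, two stopping points the same distance from their respective minimizers can report very different first-order optimality measures if the local curvatures differ:

% Same distance from the minimizer, very different first-order optimality
dx = 1e-5;               % distance from each local minimizer at the stopping point
gradHigh = 1e9 * dx;     % high-curvature basin (f'' = 1e9): first-order opt ~ 1e4
gradLow  = 1e3 * dx;     % low-curvature basin  (f'' = 1e3): first-order opt ~ 1e-2
fprintf('high-curvature: %g, low-curvature: %g\n', gradHigh, gradLow);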

Release: R2018a
