Is it possible to impose custom conditions for an early surrogate reset?

Hello. I’m using the surrogate optimization algorithm for a problem with an expensive objective function. An issue I’ve encountered with my problem is that the optimizer sometimes stalls in the adaptive phase, spending too many function evaluations on points that are close to an unpromising incumbent. I realize that this behavior is governed in part by the MinSampleDistance option. However, I’m hesitant to raise this threshold because it’s not among the options that can be altered later upon restarting from a checkpoint. Ideally, I’d like to add state-dependent conditions tailored to my problem that can trigger an early surrogate reset. Is there a way to do this with a custom output function?

Accepted Answer

Rakesh Kumar
Rakesh Kumar on 31 Mar 2021
Hi James,
The surrogateopt reset criterion depends on the "expected reduction of the objective function value," assuming the incumbent is feasible. This is the criterion that appears to be interfering with the behavior you expect.
How is the "improvement" calculated?
delta_fval = abs(f(x_incumbent) - f(x_trial))
Assuming the function value decreased, is delta_fval large enough to keep going?
delta_fval > options.ObjectiveImprovementThreshold * max(1, abs(f(x_incumbent)))
If this check is TRUE, the solver considers itself to be improving and keeps going. If the check fails for options.MaxStallIterations consecutive iterations, the solver resets.
The options "ObjectiveImprovementThreshold" and "MaxStallIterations" are hidden but settable.
What can you do now? Three things (do one or more):
  1. The default value of ObjectiveImprovementThreshold is 1e-6. You can lower this value just like any other option. Because the option is hidden, its value will not be displayed.
  2. The default value of MaxStallIterations is 10*numberOfVariables. You can lower this too. It is also hidden, so its value will not be displayed.
  3. The other factor is the scale of the objective function value. If f(x_incumbent) is very large, the required improvement is the product of that value and ObjectiveImprovementThreshold. Rescaling your objective function is an alternative.
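A minimal sketch of points 1 and 2 above. The hidden option names come from this answer; they are set by dot assignment on the options object (the problem size, bounds, and objective here are placeholders):

```matlab
% Tighten the reset criteria via the hidden options described above.
% Hidden options do not appear when the options object is displayed,
% but they can still be assigned.
nvars = 5;                                  % example problem size
opts = optimoptions('surrogateopt', 'MaxFunctionEvaluations', 500);
opts.ObjectiveImprovementThreshold = 1e-8;  % default 1e-6; smaller = stricter improvement test
opts.MaxStallIterations = 5*nvars;          % default 10*nvars; smaller = earlier reset
% Placeholder objective and bounds, just to show the call:
[x, fval] = surrogateopt(@(x) sum(x.^2), -5*ones(1,nvars), 5*ones(1,nvars), opts);
```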
The other reset criterion triggers when the solver cannot create enough distinct points, and that is where MinSampleDistance is used. I agree that you don't want to change this as a first choice. This criterion can suffer from poor scaling of the design variables x, where one variable with a much larger scale dominates the distance. Check whether your design variables are poorly scaled.
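If the design variables are poorly scaled, one workaround is to let surrogateopt search in normalized units and map back to physical units inside the objective. A sketch, assuming physical bounds lb/ub and a user-supplied objective (here the hypothetical name myExpensiveObjective):

```matlab
% Normalize design variables to [0,1] so no single variable dominates
% the distance used by the MinSampleDistance check.
lb = [0 1e3];  ub = [1 1e6];                % poorly scaled example bounds
unscale = @(z) lb + z.*(ub - lb);           % map normalized z back to physical x
objScaled = @(z) myExpensiveObjective(unscale(z));  % your expensive objective
opts = optimoptions('surrogateopt', 'MaxFunctionEvaluations', 200);
[zbest, fbest] = surrogateopt(objScaled, zeros(1,2), ones(1,2), opts);
xbest = unscale(zbest);                     % solution in original units
```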
Just like MinSampleDistance, these two options are not state-dependent.
The "expected reduction" criterion cannot be too loose, or solution accuracy suffers. Please keep in mind that the default values of these options are chosen to ensure the accuracy of the solution. I understand there is currently no easy way to control this criterion; I will work on making it a settable option in the future.
Thanks,
Rakesh
  3 Comments
Rakesh Kumar
Rakesh Kumar on 2 Apr 2021
Hi James,
Let me know whether these options helped you make the solver stop early; that will help us understand the usage. Also, if you can share the application with us, that would be a bonus.
You are right that these options cannot be set when resuming from a checkpoint. However, they can be changed in the saved checkpoint file, under CheckPointData.SurrogateSolverData.options.
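A sketch of editing the checkpoint file before resuming. It assumes a checkpoint written via the CheckpointFile option named 'checkfile.mat'; the field path follows this answer, and the internal structure of the checkpoint file may vary by release:

```matlab
% Modify hidden reset options inside a saved surrogateopt checkpoint.
S = load('checkfile.mat');                  % assumed to contain CheckPointData
S.CheckPointData.SurrogateSolverData.options.ObjectiveImprovementThreshold = 1e-8;
S.CheckPointData.SurrogateSolverData.options.MaxStallIterations = 20;
save('checkfile.mat', '-struct', 'S');      % write the modified checkpoint back
[x, fval] = surrogateopt('checkfile.mat');  % resume from the edited checkpoint
```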
Thanks,
Rakesh
James C.
James C. on 4 Apr 2021
Edited: James C. on 4 Apr 2021
Thank you, Rakesh! Will do! As for my application, it’s a simulated method of moments estimation for a finance/econ problem. I’m still experimenting with a handful of solvers (simulated annealing, patternsearch, globalsearch). So far, I’ve had the most luck with globalsearch, but it’s still a bit early to say. Each function evaluation takes between 20 and 30 seconds, so if the surrogate approach works out it could be a nice time-saver.


More Answers (0)

Release

R2020a
