Nonlinear optimization using MATLAB built-in functions
Hello everybody,
I have a question regarding optimization algorithms. I have used some of them so far for system identification. At the moment I am trying to use an optimizer to find the optimal inputs to a system that I am simulating. The output of the system depends on N input parameters, I have a target value for that output, and I define my error function as the difference between the target and the output obtained with the current combination of parameters at the current iteration. After every new simulated point I compute the gradient from the new values and use it to choose the next step, so essentially a gradient descent optimization. Normally one could do this in MATLAB with the built-in optimizer functions, but they seem to require knowledge of the function to optimize and of its gradient... Is there any way to use those functions without having the whole regression matrices and vectors in advance, or do I need to develop those iterative solutions myself?
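For concreteness, here is a rough sketch of the kind of loop I am running now (simulate_system, the target value, the step size and the iteration count are just placeholders, not my real code):

% Rough sketch of my current hand-rolled steepest descent.
% simulate_system is a placeholder for my black-box simulation: it takes
% the parameter vector and returns the scalar output of the system.
x        = [1; 1; 1];     % initial guess for the input parameters (placeholder)
y_target = 10;            % target output value (placeholder)
step     = 0.01;          % fixed step size
h        = 1e-6;          % finite-difference perturbation

for iter = 1:200
    err = (simulate_system(x) - y_target)^2;        % squared error at current point
    g   = zeros(size(x));
    for k = 1:numel(x)                              % forward-difference gradient
        xk    = x;
        xk(k) = xk(k) + h;
        g(k)  = ((simulate_system(xk) - y_target)^2 - err) / h;
    end
    x = x - step * g;                               % steepest-descent update
end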
Accepted Answer
John D'Errico on 1 Oct 2015
Edited: 1 Oct 2015
No. The tools in MATLAB absolutely do NOT require knowledge of the gradient!
In general, the optimization tools in MATLAB require nothing more sophisticated than a "black box": a function that you can pass parameters into and that returns an objective value.
As it is, you say you are computing the gradient anyway, so why would that be a problem? The fact that you are using what is essentially steepest descent suggests you would be far better off with a more intelligent tool.
Since I do not know what problem you are solving, nor how many parameters are involved in the estimation, I cannot suggest an optimizer. fminsearch is a simple choice if you have no constraints and roughly six or fewer parameters. Larger problems would normally call for one of the routines in the Optimization Toolbox, or you can always look to the File Exchange if you lack that toolbox.
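For instance, if the simulation is wrapped in a function that takes the parameter vector and returns the system output, fminsearch only needs a handle to an objective built on top of it. A minimal sketch (simulate_system and the target value are placeholders):

% fminsearch only needs a black-box objective handle, no gradient.
% simulate_system is a placeholder for the user's simulation: it takes
% the parameter vector and returns the scalar output of the system.
y_target = 10;                                        % placeholder target value
objfun   = @(x) (simulate_system(x) - y_target)^2;    % squared error objective
x0       = [1; 1; 1];                                 % initial guess for the parameters
[x_opt, fval] = fminsearch(objfun, x0);               % derivative-free search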
3 Comments
John D'Errico on 2 Oct 2015
With only 3 parameters, as I said, fminsearch will be entirely adequate. It will be better than steepest descent, which is generally a poor choice for optimization.
Your objective function for fminsearch would be something like (y_target - f(X))^2, which is minimized when f(X) hits the target value. Of course, you could still use tools from the Optimization Toolbox.
In any case you will encounter one serious problem: there will be infinitely many solutions, so depending on your starting point you will find a different one.
You are essentially trying to solve one equation in three unknowns, and the result will in general be an infinite set of solutions. Think of it as a surface in the 3-dimensional parameter space, or, if you prefer, a 2-manifold embedded in that space.
For example, what is the solution to the problem
f(x,y,z) = x^2 + y^2 + z^2 = 2 ?
One can describe the solution locus as the surface of a sphere. If you used an optimizer to solve the problem as you intend to, it would find a point on the surface of that sphere, but which point depends entirely on where you start the solver.
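To see this, one can minimize the squared residual of the sphere equation above from a few different starting points; each run lands on a different point of the sphere. A small illustrative sketch:

% Illustration: infinitely many minimizers, and the one found depends on the start.
objfun = @(p) (p(1)^2 + p(2)^2 + p(3)^2 - 2)^2;   % zero anywhere on the sphere of radius sqrt(2)
starts = [1 0 0; 0 2 0; -1 -1 1];                 % three different starting points
for k = 1:size(starts, 1)
    p = fminsearch(objfun, starts(k, :));
    fprintf('start [%g %g %g]  ->  solution [%.4f %.4f %.4f], radius %.4f\n', ...
            starts(k, :), p, norm(p));
end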