Optimize model parameters by minimizing the sum of squared errors

I have a simple regression model:
y_model = b0 + b1*x1 + b2*x2 + normrnd(0,x3) + x4
I know the values of b0, b1 and b2, and I have some experimentally observed data, y_obs (stored in a vector). The sum of squared errors is calculated as:
f = sum((y_obs - y_model).^2)
Problem statement: I want to minimize the objective function, f, to obtain the values of x1, x2, x3 and x4. I know the upper and lower bounds of x1, x2, x3 and x4. I do not have additional MATLAB toolboxes, so I am trying 'fmincon'. There might be a better method or function to solve this problem.
I wrote the code below. The program ends with a higher f value than the f value of the initial guess, but MATLAB reports:
Local minimum possible. Constraints satisfied.
Can anyone please help me identify the problem? Am I missing a constraint?
Here is my code:
clear;close;clc;
format long;
% I want to calibrate alpha, beta, sigma and delta, given the sum of squares of errors is minimum
% Experimentally observed data
y_obs =[2708.7, 2937.8, 2934.9, 2877.8, 2823.1];
% The prediction model
% y_model = (b0+b1*X(1)+ b2*X(2))
% Coefficients of the prediction model
b0 = 2670.5; b1 = -576.677; b2 = 138.65;
% Ranges of model parameters
X1 = [0.5;0.75]; X2 = [3.5;4.5]; X3=[80;110]; X4 =[-50;40];
% Bounds of model parameters
lb=[X1(1),X2(1),X3(1),X4(1)];
ub=[X1(2),X2(2),X3(2),X4(2)];
% Error term (normally distributed with mean 0 and std dev. X3)
% assumed_error ~ N(0, X(3))
% X4 ~ U(U_lb,U_ub)
% y_observed = (b0+b1*X(1)+ b2*X(2))+ normrnd(0,X(3))+ X(4);
% Sum of squares of errors
f=@(X) sum((y_obs- (b0+ b1*X(1)+ b2*X(2)+ normrnd(0,X(3))+ X(4))).^2); % Objective function to be minimized
% Initial guess
x0 = (lb + ub)/2;
% Show initial objective
disp(['Initial Objective: ' num2str(f(x0))]);
% Linear constraints
A = []; b = []; Aeq = []; beq = [];
options = optimoptions('fmincon','Display','iter','Algorithm','sqp');
% Optimize with fmincon (pass the options so the SQP settings are actually used)
[X] = fmincon(f,x0,A,b,Aeq,beq,lb,ub,[],options);
% Display final objective
disp(['Final Objective: ' num2str(f(X))]);
% Print solution
disp('Solution')
disp(['X1 = ' num2str(X(1))]);
disp(['X2 = ' num2str(X(2))]);
disp(['X3 = ' num2str(X(3))]);
disp(['X4 = ' num2str(X(4))]);

Answers (1)

Matt J on 22 Nov 2020
It does not make sense that there are randomization operations like normrnd in your objective function. How can a function be "minimized" if its definition changes randomly every time you invoke it?
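For example (just a rough sketch, not tested, reusing the variables from your code): draw a standard normal sample once, outside the objective, and scale it by X(3) inside. Then f is deterministic for a single optimization run:
% Sketch only: freeze the randomness for one run by drawing the noise outside f.
% For a fixed draw e ~ N(0,1), X(3)*e plays the role of normrnd(0, X(3)).
e = randn;                                % one standard normal draw, held fixed for this run
f = @(X) sum((y_obs - (b0 + b1*X(1) + b2*X(2) + X(3)*e + X(4))).^2);
[X, fval] = fmincon(f, x0, A, b, Aeq, beq, lb, ub, [], options);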
  2 Comments
Rudraprasad Bhattacharyya on 27 Nov 2020
I do not expect a unique optimized value, as I intentionally want to incorporate some uncertainty. I want to run this code for multiple samples, similar to a Monte-Carlo simulation, and then obtain the minimum value across all the samples.
Matt J on 28 Nov 2020
You must run a separate optimization for each randomized sample.
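A rough sketch of that loop, reusing the variables from the question (nSamples, Xall, fall, e and Xbest are names introduced here, and X(3)*e with a fixed draw e stands in for normrnd(0,X(3)) so each sample stays fixed during its own optimization):
% Sketch: one fmincon run per random draw, then pick the best sample.
nSamples = 100;                           % number of Monte-Carlo samples (arbitrary choice)
Xall = zeros(nSamples, 4);                % optimized [X1 X2 X3 X4] for each sample
fall = zeros(nSamples, 1);                % final objective value for each sample
for k = 1:nSamples
    e = randn;                            % noise draw, held fixed for this sample
    fk = @(X) sum((y_obs - (b0 + b1*X(1) + b2*X(2) + X(3)*e + X(4))).^2);
    [Xall(k,:), fall(k)] = fmincon(fk, x0, A, b, Aeq, beq, lb, ub, [], options);
end
[fbest, kbest] = min(fall);               % minimum objective over all samples
Xbest = Xall(kbest, :);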

