How do I evaluate the error for a fitted non-linear curve

Hi,
I have an experimental spectrum that contains 3 overlapping peaks. I deconvoluted this spectrum with 3 Lorentzians using 'fit', as seen in the code. I'm quite happy with the fit, but I don't understand how I can estimate how good the fit is. Using the 'fit' function I can get the R^2 goodness of fit, but I found that it's only meaningful for linear-regression-type fitting, not for a spectrum like mine.
Moreover, after I fit the data I integrate over the 3 Lorentzians using 'trapz' in order to get their areas. I need to estimate the error of these integrations. Assuming I can get some form of goodness of fit for my fitted spectrum, how would I translate it into an error for the integration?
I hope I managed to explain my problem.
Here is the fittype function I use in my code:
ft = fittype(@(A1,A2,A3,s1,s2,s3,x) ...
    A1*s1^2./(s1^2 + (x-pos1).^2) + ...
    A2*s2^2./(s2^2 + (x-pos2).^2) + ...
    A3*s3^2./(s3^2 + (x-pos3).^2), ...
    'independent', 'x', 'dependent', 'y');
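For context, a minimal sketch of the rest of the workflow described above (the data vectors x and y, the peak positions pos1–pos3, and the start values are assumptions, not part of the original post):
% Assumed names: x, y are the measured abscissa and spectrum; pos1, pos2, pos3
% are the peak positions already defined in the workspace.
[cf, gof] = fit(x(:), y(:), ft, 'StartPoint', [1 1 1 0.1 0.1 0.1]);  % placeholder start values
disp(gof)   % includes the R^2 ('rsquare') value mentioned above

% Area of each fitted Lorentzian by numerical integration (same idea for peaks 2 and 3)
L1 = cf.A1*cf.s1^2 ./ (cf.s1^2 + (x - pos1).^2);
area1 = trapz(x, L1);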

Answers (2)

Mathieu NOE on 25 Aug 2022
Hello,
Using both the experimental and the fitted data you can compute R^2 this way:
function Rsquared = my_Rsquared_coeff(data, data_fit)
% R^2 (coefficient of determination) computation
% The total sum of squares
sum_of_squares = sum((data - mean(data)).^2);
% The sum of squares of residuals, also called the residual sum of squares
sum_of_squares_of_residuals = sum((data - data_fit).^2);
% The coefficient of determination is then
Rsquared = 1 - sum_of_squares_of_residuals/sum_of_squares;
end
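A quick usage sketch (the cfit object cf and the data vectors x, y are assumptions based on the question's workflow):
% Evaluate the fitted model at the measured abscissa and compare to the data
y_fit = feval(cf, x);                    % equivalently cf(x)
R2 = my_Rsquared_coeff(y(:), y_fit(:));
fprintf('R^2 = %.4f\n', R2);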
  3 Comments
Mathieu NOE on 25 Aug 2022
Maybe some useful info here:
When fitting data, evaluating the goodness of fit (GoF) is almost never a trivial task. Even in the linear case there are some issues, as can be read in the following paper:
In the non-linear case the problem becomes much more complicated and, of course, is not free of issues either (see, for example, the following paper regarding the use of R-squared):



Ben Mercer on 25 Aug 2022
Hi yuval, I'm not sure I'm entirely clear on what you're asking, but to my understanding R^2 is not specific to linear regression; the criterion for using it is that the model must be a least-squares solution, otherwise you can end up with meaningless (sometimes negative) R^2 values.
If an R^2 value is not output from whatever fitting function you are using, you can obtain it by evaluating your fitted model at your input data points and then computing R^2 = 1 - SS_res/SS_tot, where SS_res = sum((y - y_fit).^2) is the residual sum of squares and SS_tot = sum((y - mean(y)).^2) is the total sum of squares.
The R^2 tells you how much of the variation in the data is captured by your model, as a fraction of the total variation. You can also calculate the RMS error between the data and the model, but this gives a result in the physical units of whatever you are measuring, and it is not a very specific measure of how well your model captures patterns in the data.
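A brief sketch of both quantities in MATLAB (the vectors y and y_fit, the measured and model-evaluated values, are assumptions):
residuals = y(:) - y_fit(:);
SS_res = sum(residuals.^2);
SS_tot = sum((y(:) - mean(y(:))).^2);
R2   = 1 - SS_res/SS_tot;        % fraction of variance explained (dimensionless)
RMSE = sqrt(mean(residuals.^2)); % root-mean-square error, in the units of y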

Release

R2020a
