Curve fitting: What does LAR do?
Hello, when conducting a nonlinear least squares regression, one can select 'LAR' (= least absolute residuals) under 'fitoptions'. This is a bit confusing because, as far as I know, LAR by definition minimizes the absolute residuals, not the squared ones. So if 'LAR' is selected, the regression is not actually based on nonlinear least squares, right?
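For reference, the option I mean is set roughly like this (a minimal sketch with placeholder data and a placeholder library model, assuming the Curve Fitting Toolbox syntax):

% Sketch: selecting least absolute residuals (LAR) via fitoptions
% ('exp1' and the fake data below are just placeholders)
opts  = fitoptions('Method', 'NonlinearLeastSquares', 'Robust', 'LAR');
xdata = (1:10)';
ydata = 2*exp(0.3*xdata) + randn(10, 1);
f = fit(xdata, ydata, 'exp1', opts)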
Greetings Vince
1 Comment
Tamir Suliman
on 1 Dec 2016
I think this has been answered in
https://www.mathworks.com/matlabcentral/answers/81143-fit-and-polyfit-lar-vs-least-squares
Accepted Answer
John D'Errico
on 1 Dec 2016
Edited: John D'Errico
on 1 Dec 2016
I don't understand what is confusing. Minimizing the sum of the absolute values of the residuals is clearly NOT least squares, i.e., it does not minimize the sum of squared residuals. If it were, there would be no point in offering it as an alternative.
Least squares puts more weight on the large residuals. For example, suppose you have two cases, based on two different sets of parameters. We need to consider only two points.
Case 1: Resid1 = 1, Resid2 = 1
Case 2: Resid1 = 0, Resid2 = 2
For a least squares solver, the two cases are very different. Case 1 would be preferred, since its sum of squares is 2, versus 4 for Case 2. So whatever set of parameters generated Case 1 will be chosen as the better fit.
In a LAR scheme (also called MAD, for minimum absolute deviations), the sums of the absolute residuals are identical: each is 2. Neither solution would be preferred over the other.
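A quick sanity check in MATLAB, using just the two residual pairs from above:

r1 = [1 1];        % Case 1 residuals
r2 = [0 2];        % Case 2 residuals
sum(r1.^2)         % 2 -> least squares prefers Case 1
sum(r2.^2)         % 4
sum(abs(r1))       % 2 -> identical under LAR
sum(abs(r2))       % 2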
One reason one might use such a scheme is that it is less sensitive to outliers in the data, whereas least squares tends to get jerked around by outliers quite easily.
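Here is a rough illustration of that outlier behavior (a sketch only, assuming the Curve Fitting Toolbox; the exponential model, start points, and injected outlier are made up for the example):

% Fit noisy exponential data containing one large outlier, with and
% without the 'LAR' robust option, then compare the two curves.
x = (0:0.1:2)';
y = 3*exp(1.5*x) + 0.2*randn(size(x));
y(10) = y(10) + 30;                      % inject a single outlier

ft = fittype('a*exp(b*x)');
optLSQ = fitoptions('Method', 'NonlinearLeastSquares', 'StartPoint', [1 1]);
optLAR = fitoptions('Method', 'NonlinearLeastSquares', 'StartPoint', [1 1], 'Robust', 'LAR');

fLSQ = fit(x, y, ft, optLSQ);            % ordinary nonlinear least squares
fLAR = fit(x, y, ft, optLAR);            % least absolute residuals

xf = linspace(0, 2, 200)';
plot(x, y, 'ko');  hold on
plot(xf, fLSQ(xf), 'b-')                 % pulled toward the outlier
plot(xf, fLAR(xf), 'g-')                 % much less affected
hold off
legend('data', 'least squares', 'LAR', 'Location', 'northwest')

The least squares curve is typically dragged toward the outlier, while the LAR curve stays closer to the bulk of the data.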