Fitting Gaussian to a curve with multiple peaks

357 views (last 30 days)
MINA
MINA on 21 Jul 2018
Commented: Ian-Andreas on 26 Mar 2024 at 14:26
I have a curve with a few peaks (the figure is attached here). I want to fit a few Gaussians to it. How can I do that?
  1 Comment
dpb
dpb on 21 Jul 2018
There are pre-built models in the Curve Fitting Toolbox if you have it...


Answers (3)

Image Analyst
Image Analyst on 12 Oct 2020
Attached is a demo showing how to fit any specified number of Gaussians to noisy data. Here is an example where I created a signal from 6 component Gaussians by summing them and then adding noise to the summed curve. The input data is the dashed line (the uppermost curve), and the Gaussians it thought would sum to fit it best are shown as the solid colored curves below the dashed curve.
Here is the main part of the program:
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
% HEAVY LIFTING DONE RIGHT HERE:
% Run optimization
[parameter, fval, flag, output] = fminsearch(@(lambda)(fitgauss(lambda, tFit, y)), startingGuesses, options);
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
The rest is just setup/initialization and plotting code.
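The fitgauss() objective itself is defined in the attached demo. As a minimal sketch of what such an objective can look like (this is not the attached code; the lambda layout and the names here are assumptions), it returns the sum of squared errors between the data and a sum of Gaussians, with fminsearch varying the centers and widths while the amplitudes are solved by linear least squares:
% Minimal sketch of a sum-of-squared-error objective for fminsearch (not the attached demo's code).
% lambda = [c1 w1 c2 w2 ...] holds the center and width of each Gaussian;
% the amplitudes are solved by linear least squares for those centers/widths.
function sse = fitgauss(lambda, t, y)
    numGaussians = length(lambda) / 2;
    A = zeros(length(t), numGaussians);
    for j = 1 : numGaussians
        A(:, j) = exp(-((t(:) - lambda(2*j - 1)) / lambda(2*j)) .^ 2); % one Gaussian per column
    end
    amplitudes = A \ y(:);                   % best-fit amplitudes for these centers and widths
    sse = norm(A * amplitudes - y(:)) ^ 2;   % the value that fminsearch minimizes
end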
  10 Comments
Image Analyst
Image Analyst on 19 Feb 2024
@Ian-Andreas I'm not sure. What I would try is to change the number of Gaussians fitted until you get a situation where there are no negative-pointing Gaussians.
Ian-Andreas
Ian-Andreas on 26 Mar 2024 at 14:26
@Image Analyst Cheers, increasing the number of Gaussians appears to have solved it!



Image Analyst
Image Analyst on 21 Jul 2018
Gaussian Mixture Models
Gaussian mixture models (GMM) are composed of k multivariate normal density components, where k is a positive integer. Each component has a d-dimensional mean (d is a positive integer), d-by-d covariance matrix, and a mixing proportion. Mixing proportion j determines the proportion of the population composed by component j, j = 1,...,k.
You can fit a GMM using the Statistics and Machine Learning Toolbox™ function fitgmdist by specifying k and by supplying X, an n-by-d matrix of data. The columns of X correspond to predictors, features, or attributes, and the rows correspond to observations or examples. By default, fitgmdist fits full covariance matrices that are different among components (or unshared).
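For example, here is a minimal fitgmdist sketch on made-up 1-D data with two components (k = 2). Note that fitgmdist fits a density to raw observations, not to a y-versus-x curve:
% Minimal fitgmdist example on made-up 1-D data (d = 1, k = 2).
% Requires the Statistics and Machine Learning Toolbox.
rng(1);                                                 % reproducible samples
X = [normrnd(10, 3, 500, 1); normrnd(50, 9, 500, 1)];   % n-by-d data matrix of raw samples
k = 2;                                                  % number of components
gm = fitgmdist(X, k);                                   % fit the mixture model
gm.mu                                                   % display estimated component means
gm.ComponentProportion                                  % display estimated mixing proportions
% Overlay the fitted density on a histogram of the samples.
histogram(X, 'Normalization', 'pdf');
hold on;
xGrid = linspace(min(X), max(X), 500)';
plot(xGrid, pdf(gm, xGrid), 'r-', 'LineWidth', 2);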
  13 Comments
Alex Sha
Alex Sha on 23 Nov 2019
Hi MINA, refer to the multi-peak fit results below; they look close to perfect:
[attached image: eq.png]
Root of Mean Square Error (RMSE): 0.000758421176625326
Sum of Squared Residual: 0.00575202681153745
Correlation Coef. (R): 0.99999631975421
R-Square: 0.999992639521965
Adjusted R-Square: 0.999992638049428
Determination Coef. (DC): 0.999992638461578
Chi-Square: 0.0026447203781342
F-Statistic: 58916561.8092515
Parameter Final Estimate
---------- --------------
y01 3.05300628204444
a1 -16.9598254082349
w1 -33.1216274304912
xc1 -199.237682046936
y02 -5.01995553438715
a2 125.991592415662
w2 68.1014557034446
xc2 -38.1447385704029
y03 -21670.4124112499
a3 -862.748820039932
w3 -150.593954759166
xc3 -158.633069857899
y04 2.38145165713579
a4 71.7219094008726
w4 56.7351551616516
xc4 21.6923380713498
y05 -23.3097952825874
a5 -148.854281788758
w5 -73.9490093077971
xc5 220.960836388443
y06 21689.0835365582
a6 -1739.80097056373
w6 -275.612137225476
xc6 104.20899470954
[attached image: c235.jpg]
Rik
Rik on 12 Oct 2020
Comment posted as flag by qi lu:
Hi, this demo seems perfect. Could you please tell me how to fit such an equation? I used 'fit' in the Curve Fitting Toolbox, and the fitting result is horrible.



Image Analyst
Image Analyst on 24 Jul 2018
Edited: Image Analyst on 25 Nov 2020
One way to do it is to set up the sum of two Gaussians with an offset and a linear ramp. Then you can use fitnlm, with your best guesses as to the parameters. See code:
% Uses fitnlm() to fit a non-linear model (sum of two gaussians on a ramp) through noisy data.
% Requires the Statistics and Machine Learning Toolbox, which is where fitnlm() is contained.
% Initialization steps.
clc; % Clear the command window.
close all; % Close all figures (except those of imtool.)
clear; % Erase all existing variables. Or clearvars if you want.
workspace; % Make sure the workspace panel is showing.
format long g;
format compact;
fontSize = 20;
% Create 400 X coordinates from 0 to 80.
X = linspace(0, 80, 400);
mu1 = 10; % Mean, center of Gaussian.
sigma1 = 3; % Standard deviation.
mu2 = 50; % Mean, center of Gaussian.
sigma2 = 9; % Standard deviation.
% Define function that the X values obey.
a = 6 % Arbitrary sample values I picked.
b = 35
c = 25
m = 0.1
Y = a + m * X + b * exp(-(X - mu1) .^ 2 / sigma1)+ c * exp(-(X - mu2) .^ 2 / sigma2); % Get a vector. No noise in this Y yet.
% Add noise to Y.
Y = Y + 1.2 * randn(1, length(Y));
% Now we have noisy training data that we can send to fitnlm().
% Plot the noisy initial data.
plot(X, Y, 'b.', 'LineWidth', 2, 'MarkerSize', 15);
grid on;
% Convert X and Y into a table, which is the form fitnlm() likes the input data to be in.
tbl = table(X', Y');
% Define the model as Y = b1 + b2*X + b3*exp(-(X-b4)^2/b5) + b6*exp(-(X-b7)^2/b8):
% an offset, a linear ramp, and two Gaussians.
% Note how the "x" of modelfun relates to the table: fitnlm() takes the last table
% variable (Y) as the response, so modelfun receives only the predictor column, and x(:, 1) is X.
modelfun = @(b,x) b(1) + b(2) * x + b(3) * exp(-(x(:, 1) - b(4)).^2/b(5)) + b(6) * exp(-(x(:, 1) - b(7)).^2/b(8));
beta0 = [6, 0.1, 35, 10, 3, 25, 50, 9]; % Guess values to start with. Just make your best guess.
% Now the next line is where the actual model computation is done.
mdl = fitnlm(tbl, modelfun, beta0);
% Now the model creation is done and the coefficients have been determined.
% YAY!!!!
% Extract the coefficient values from the the model object.
% The actual coefficients are in the "Estimate" column of the "Coefficients" table that's part of the model.
coefficients = mdl.Coefficients{:, 'Estimate'}
% Let's do a fit, but let's get more points on the fit, beyond just the widely spaced training points,
% so that we'll get a much smoother curve.
X = linspace(min(X), max(X), 1920); % Let's use 1920 points, which spans an HDTV screen at about one sample per pixel.
% Create smoothed/regressed data using the model:
yFitted = coefficients(1) + coefficients(2) * X + coefficients(3) * exp(-(X - coefficients(4)).^2 / coefficients(5)) + coefficients(6) * exp(-(X - coefficients(7)).^2 / coefficients(8));
% Now we're done and we can plot the smooth model as a red line going through the noisy blue markers.
hold on;
plot(X, yFitted, 'r-', 'LineWidth', 2);
grid on;
title('Two-Gaussian Regression with fitnlm()', 'FontSize', fontSize);
xlabel('X', 'FontSize', fontSize);
ylabel('Y', 'FontSize', fontSize);
legendHandle = legend('Noisy Y', 'Fitted Y', 'Location', 'northeast');
legendHandle.FontSize = 25;
% Set up figure properties:
% Enlarge figure to full screen.
set(gcf, 'Units', 'Normalized', 'OuterPosition', [0 0 1 1]);
% Get rid of tool bar and pulldown menus that are along top of figure.
% set(gcf, 'Toolbar', 'none', 'Menu', 'none');
% Give a name to the title bar.
set(gcf, 'Name', 'Demo by ImageAnalyst', 'NumberTitle', 'Off')
  6 Comments
Image Analyst
Image Analyst on 27 Feb 2021
I haven't done it in 2-D. You'd have to experiment around with it and try to adapt it to 2-D. You can do that as well as I can.
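For anyone who wants to try, a rough, untested sketch of adapting the same fitnlm() idea to a single 2-D Gaussian peak with an offset on gridded data (all values here are made up) might look like this:
% Rough sketch (not tested on any particular data set): one 2-D Gaussian peak
% with an offset, fitted over an (x, y) grid using fitnlm().
[xg, yg] = meshgrid(linspace(0, 40, 60));
Z = 5 + 30 * exp(-((xg - 15).^2 / 20 + (yg - 25).^2 / 50));  % synthetic surface
Z = Z + 0.5 * randn(size(Z));                                % add noise
tbl = table(xg(:), yg(:), Z(:));        % predictors x, y; response z (last column)
% b = [offset, amplitude, xCenter, xWidth, yCenter, yWidth]
modelfun = @(b, in) b(1) + b(2) * exp(-((in(:, 1) - b(3)).^2 / b(4) + (in(:, 2) - b(5)).^2 / b(6)));
beta0 = [4, 25, 14, 15, 26, 40];        % rough starting guesses
mdl = fitnlm(tbl, modelfun, beta0);
coefficients = mdl.Coefficients{:, 'Estimate'}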
S0852306
S0852306 on 23 Jul 2023
@Nan Tao, if your real goal is to get the partial derivatives from the data and the data are quite smooth (no noise), then simple numerical differentiation (finite differences) should work fine, for example with MATLAB's built-in gradient(), as sketched below.
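A minimal gradient() sketch (the grid and surface here are made up; replace Z with your own gridded data):
% Finite-difference sketch with gradient(); the grid and surface here are synthetic.
[xg, yg] = meshgrid(linspace(-2, 2, 50));
Z = exp(-(xg .^ 2 + yg .^ 2));            % smooth example surface
hx = xg(1, 2) - xg(1, 1);                 % grid spacing in x
hy = yg(2, 1) - yg(1, 1);                 % grid spacing in y
[dZdx, dZdy] = gradient(Z, hx, hy);       % central differences in the interior
surf(xg, yg, dZdx); title('\partial z / \partial x');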
If you really want to fit a model from your data and then compute the partial derivatives of the fitted model, you may try this: nonlinear surface fitting. Here is the result I got; the fitted model matches the data you provided quite well.
mean square error: 6.6370e-13
mean absolute error: 5.4212e-07
[attached figure: partial derivative estimation]
Code: (fitting)
clear; clc; close all;
load('I0_Gaus_Gradient_Fit_0226.mat');
Z=I0_Gaus_Fit_0226;
% To run this script, download the toolbox at File Exchange: https://tinyurl.com/wre9r5uk
% Flatten the grid into a 2-by-N input matrix X and a 1-by-N label vector Y.
count = 0;
for i = 1:size(Z, 1)
    for j = 1:size(Z, 2)
        count = count + 1;
        X(1, count) = i;
        X(2, count) = j;
        Y(count) = Z(i, j);
    end
end
%% set up model
NN.InputAutoScaling='on';
NN.LabelAutoScaling='on';
InSize=2; OutSize=1;
LayerStruct=[InSize,5,5,5,OutSize];
NN=Initialization(LayerStruct,NN);
%% set up optimizer
option.MaxIteration=500;
NN=OptimizationSolver(X,Y,NN,option);
Compute partial derivatives & visualization:
%% visualization
R=FittingReport(X,Y,NN);
P=NN.Evaluate(X);
PMatrix=reshape(P,size(Z,1),size(Z,2));
s=surf(PMatrix);
s.EdgeColor='none';
title(' fitted surface, $$ z=f(x,y) $$,','interpreter','latex')
%% partial derivate estimation
D=NN.Derivate(X);
dx=reshape(D(1,:),size(Z));
dy=reshape(D(2,:),size(Z));
figure
sdx=surf(dx); title('$$ \frac{\partial z}{\partial x} $$','interpreter','latex'); sdx.EdgeColor='none';
figure
sdy=surf(dy); title('$$ \frac{\partial z}{\partial y}$$','interpreter','latex'); sdy.EdgeColor='none';

