File Exchange


Support Vector Regression

version 1.0.0.0 (34.5 KB) by Bhartendu
On-line support vector regression (using Gaussian kernel)

35 Downloads

Updated 22 May 2017

View License

On-line regression: on-line learning algorithms are not restricted to classification problems. The update rule of the kernel adatron algorithm also suggests a general methodology for creating on-line versions of such optimisations, since its first update is equivalent to

α_i ← α_i + ∂W(α)/∂α_i,

making it a simple gradient ascent algorithm augmented with corrections to ensure that the additional constraints are satisfied. The same approach can, for example, be applied to the linear ε-insensitive loss version of the support vector regression algorithm.
One of the advantages of the Support Vector Machine, and of Support Vector Regression as part of it, is that it avoids the difficulties of working with linear functions in a high-dimensional feature space: the optimisation problem is transformed into a dual convex quadratic programme. In the regression case, the loss function penalizes only errors that are greater than a threshold ε. Such loss functions usually lead to a sparse representation of the decision rule, giving significant algorithmic and representational advantages.
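As a minimal sketch (not the submission's exact code), the learned regressor is evaluated as a kernel expansion over the training points, f(x) = sum_i alpha_i K(x, x_i) + b; the names x_train, alpha, b and sigma below are illustrative:

% Gaussian (RBF) kernel between every training point and one query point
rbf = @(X, z, sigma) exp(-sum(bsxfun(@minus, X, z).^2, 2) / (2*sigma^2));

% Tiny illustrative training set and (assumed) trained parameters
x_train = [0 0; 1 1; 2 2];     % 3 training points, 2 features
alpha   = [0.5; -0.2; 0.7];    % assumed dual coefficients from training
b       = 0.1;                 % assumed bias term
sigma   = 1;                   % kernel width

% Evaluate f(xq) = sum_i alpha_i * K(xq, x_i) + b at one query point
xq = [1.5 1.5];
fq = alpha' * rbf(x_train, xq, sigma) + b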

Reference:
Kernel Methods for Pattern Analysis byJohn Shawe-Taylor & Nello Cristianini
http://kernelsvm.tripod.com/

Cite As

Bhartendu (2020). Support Vector Regression (https://www.mathworks.com/matlabcentral/fileexchange/63060-support-vector-regression), MATLAB Central File Exchange. Retrieved .

Comments and Ratings (47)

Hi Bhartendu,
You did a great job. However, there are some difficulties that many of us have been facing.

Suppose the training set has dimension 125x5, so alpha will be 125x1, and a new validation/test set has dimension 20x5. It will then not be possible to run Step 2 as you have suggested, because N is 125 and alpha has 125 values while the validation/test set has only 20 samples.

Please provide a feasible solution by updating Step 2. How do we calculate alpha for the new test/validation set? Moreover, what is the use of the "w" and "b" that are calculated in your code? Please note that the validation/test set's output SHOULD NOT BE USED IN THE CALCULATION, as it is unknown and can only be used to calculate errors after prediction.

Step 1. zscore normalization mentioned below:
[solar_features,mu,sigma]=zscore(csvread('GaussianData.csv')); % load the data, then normalize
x_validation=(x_validation-mu)./sigma;

Step 2. To get Predicted_values on x_validation: (after applying normalization to x_validation)
for j=1:N
fx1(j,:)=alpha(j)*kernel(x_validation,x_validation(j,:),'g')';
end
Predicted_values=sum(fx1)';
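A hedged sketch of one way such predictions could be formed: the kernel is evaluated between each validation point and the N training points, so alpha (length N) lines up with the training set. Here x_train, x_validation, alpha and b are illustrative names, and kernel refers to the submission's kernel function:

M = size(x_validation, 1);          % number of validation points
Predicted_values = zeros(M, 1);     % preallocate
for i = 1:M
    % kernel values between the i-th validation point and all N training points
    k = kernel(x_train, x_validation(i,:), 'g');
    Predicted_values(i) = alpha(:)' * k(:);   % add b here if a bias term is used
end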

Ali sameer

Thank you very much.
I need the reference or the original paper.

Aisha Sa'ad

Hello!
I have a problem running this code with the linear kernel: it shows NaN for b, w and the MSE, while the graph only shows the actual values along zero horizontally. Can you also please explain why your data has three dimensions (x-y-z)? What method did you use to obtain the third column of the data (z)? Thank you.

shadi aosati

To test the model on a new data set, the Alpha was generated for only the training set.
How would you get the alpha matrix for a new set without retraining?

Bhartendu

To people asking how to predict:
please check the example tab, Support_Vector_Regression.mlx

Hey, can you help me by sharing how to predict for new data?

Maysara Ghaith

Thanks for your interesting code!
How can we denormalize the predicted data?
Your code just shows the normalized prediction data.
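For reference, a minimal sketch of undoing a z-score normalization on predicted targets (the names y, fx_norm, mu_y and sigma_y are illustrative, not from the submission):

y = randn(100,1)*5 + 20;               % example training targets
[y_norm, mu_y, sigma_y] = zscore(y);   % normalize and keep the statistics
fx_norm = y_norm;                      % stand-in for predictions on the normalized scale
fx = fx_norm*sigma_y + mu_y;           % predictions back on the original scale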

Almost Artsy

Can I predict the next values using this code?
For example, if I have 100 training samples, can I predict samples 101-105? Please help me.
Thank you.

Aminah Hina

I don't want to normalize my data, as I want the predicted values in their true form.
My problem is that I have training data in a [41x11] matrix (where the first 10 columns are features and the 11th column is the response) and test data in a [6x10] matrix (features only).
When I have to test my data, how should I run this part of your code?

% Predicted values

for j=1:N
fx1(j,:)=alpha(j)*kernel(x,x(j,:),'g')';
end
fx=sum(fx1)';
disp('[Actual Values Predicted Values]')
disp([y(1:10) ,fx(1:10)])
% Mean Square error (Gaussian Kernel)
mse=norm(y-fx)^2/N

Aminah Hina

The code is not working properly for k=l and k=p, i.e. the linear and polynomial SVR: it fails to calculate the weights, giving NaN values. Pasting the values displayed in the command window:
N =39
itr =1
Total number of iteration 1
w =

NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN

b =
NaN
[Actual Values Predicted Values]
134 NaN
123 NaN
115 NaN
101 NaN
80 NaN
111 NaN
93 NaN
95 NaN
87 NaN
129 NaN
mse =

NaN

Please suggest what parameters need to be changed.

crixus

May I know how accurate this code is? I want to learn SVM by referring to the online lecture code.

Bhartendu

Mr Vadlan: It's a regression problem; we do not calculate accuracy for regression problems. An R2 score can be calculated, but even that is not called "accuracy". Also, you asked for prediction code, which I have already given in the code and the example.
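A minimal sketch of how such an R2 score could be computed from actual and predicted values (the vectors y and fx here are illustrative examples, not the submission's data):

y  = [134; 123; 115; 101; 80];        % actual values (example)
fx = [130; 120; 118;  99; 85];        % predicted values (example)
SS_res = sum((y - fx).^2);            % residual sum of squares
SS_tot = sum((y - mean(y)).^2);       % total sum of squares
R2 = 1 - SS_res/SS_tot                % coefficient of determination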

Vadlan Narendra

Oh, and can you please add the code to predict on new data? Thank you.

Vadlan Narendra

Hi Bhartendu, can your code show the accuracy of this method?

Bhartendu

Mr Franck, check the dimensions; maybe the previous release requires exact dimensions when handling matrices. It works for me in newer builds of MATLAB (2017b or later).

nene franck

Error in SupportVectorRegression (line 23)
-alpha'*kernel(x,x(i,:),'g')';

Toktam Babaei

Thanks for sharing your code, but can it be used for multi-output regression too?

Bhartendu

Tian Lan:
Make sure to normalize the data and the method will work. If the sizes (number of attributes) are not the same, then a small modification will be required, especially in choosing the kernel. If the data seems to be linear, then use the linear kernel, and likewise for other cases.

Tian Lan

Hi, Bhartendu,
If the data is non-Gaussian distributed, can this method work?
Which kernel should I choose?
I am new to SVM, thank you for your time.

Bhartendu

Step 1. zscore normalization mentioned below:

[solar_features,mu,sigma]=zscore(csvread('GaussianData.csv')); % load the data, then normalize
x_validation=(x_validation-mu)./sigma;

Step 2. To get Predicted_values on x_validation: (after applying normalization to x_validation)

for j=1:N
fx1(j,:)=alpha(j)*kernel(x_validation,x_validation(j,:),'g')';
end
Predicted_values=sum(fx1)';

Chathura Wanigasekara

Hi,

Can you please explain to me how to test a validation data set with the trained SVR model.

Thank you.

Yulyanto Lim

How do I make the general function f(x) for prediction, sir?

Bhartendu

Lawrence Soon:
No, it's just fundamental SVR.

Lawrence Soon

Hi Bhartendu, I am new to SVR, can this library support the epsilon-SVR?

vahid izadi

D W

Sorry, I cannot use the code, since alpha has MM rows but x_test has M rows (MM ~= M).

D W

Hi Bhartendu, thank you for your code. I am just wondering how to get the predicted values for a test set (M*N) based on a training set (MM*N) (note: M ~= MM). I cannot use the code, since alpha has MM rows but x_test has M rows:
for j=1:M
fx1(j,:)=alpha(j)*kernel(x_test,x_test(j,:),'g')';
end

Bhartendu

Hi D W
Please ask, I will be more than happy to answer you.

D W

Hi Bhartendu, can you answer me? Thank you very much!

D W

N is the number of training data samples, so after training alpha has N rows. But if the test data has M (M ~= N) samples, the following code does not work, because alpha has N rows, not M rows. Thank you, Bhartendu!
for j=1:M
fx1(j,:)=alpha(j)*kernel(x_test,x_test(j,:),'g')';
end

Bhartendu

D W
To get Predicted_values on the Test_set (after applying normalization to the Test_set):

for j=1:N
fx1(j,:)=alpha(j)*kernel(x_test,x_test(j,:),'g')';
end

Predicted_values=sum(fx1)';

D W

After training, how to get predicted value on test set?

Bhartendu

Tord Bjørnevaagen
Maybe the zscore normalization has not been executed properly; try something like the following:

[solar_features,mu,sigma]=zscore(solar_features);
newData=[-2.69,979.8,1.08,320803,2.64,863];
newData=(newData-mu)./sigma;

Tord Bjørnevaagen

I may have misunderstood something, but when I train the SVM on a training set, the result usually performs well on that set, yet it often performs horribly on a blind set it did not have access to during training. Are there any tweaks to change this?

Bhartendu

Tord Bjørnevaagen
I will update considering your suggestions shortly.
Regards

Tord Bjørnevaagen

Brilliant work!

I have, however, a few suggestions for improvements.

In kernel.m, you write "length(x)". If x has more samples (vertical dimension) than parameters (horizontal dimension), length will return the number of samples; if not, it will return the number of parameters. I suggest you replace this with size(x,1) if you want the number of samples, and size(x,2) for the number of parameters, to avoid crashes. (I assume you want the former.)

Also, I suggest adding the line "fx1 = nan(numSamples);" just after the line "% Predicted values", for the sake of speed and readability.
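A small sketch of how those two suggestions might look in the prediction loop (x, alpha and kernel refer to the submission's variables and kernel.m; the loop body itself is unchanged):

numSamples = size(x, 1);              % number of samples, robust even when x is wide
fx1 = nan(numSamples);                % preallocate before the prediction loop
for j = 1:numSamples
    fx1(j,:) = alpha(j)*kernel(x, x(j,:), 'g')';
end
fx = sum(fx1)';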

earth science learner

Chin Chou

Bhartendu

Damo Nair, try the following:

clear w;
w=alpha'*x

DUSHYANTH S R

sagar kumar dash

Damo Nair

The size of 'alpha' is 200 x 1 & the size of 'x' is 200 x 2.

Bhartendu

The code is generalised, so this kind of error is unfortunate. Please tell me the sizes of 'alpha' and 'x' at the moment you get this error.

Damo Nair

When I run your demo SupportVectorRegression on MATLAB R2011b, after 1000 iterations it gives me the following error:

w=sum(alpha.*x)
Error using .*
Matrix dimensions must agree.

Chang hsiung

MATLAB Release Compatibility
Created with R2016a
Compatible with any release
Platform Compatibility
Windows macOS Linux
