MATLAB Answers

Is it important to normalise the input to a neural network before training?

Newman
Newman on 15 Jul 2016
Edited: Greg Heath on 27 Jul 2016
I have a feature matrix of size 10000x400 (400 samples), and the target matrix is 40x400 (40 classes). The input feature vector for each sample has 10,000 rows with values like 0 123 212 242 123 45, etc. So I want to ask: should I normalise all the elements using the standard formula
element = (element - mean of its column) / standard deviation of that column?
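For reference, a minimal MATLAB sketch of that standardization (stand-in data; `X` plays the role of the real 10000x400 feature matrix, and the statistics here are taken along dimension 2, i.e. per feature across the 400 sample columns):

```matlab
% Minimal sketch -- X is a stand-in for the real 10000x400 feature matrix.
X = rand(10000, 400) * 255;

% Standardize each feature (row) across the 400 sample columns:
mu    = mean(X, 2);
sigma = std(X, 0, 2);
sigma(sigma == 0) = 1;               % guard against constant features
Xstd  = (X - mu) ./ sigma;           % implicit expansion (R2016b and later)
% Older releases: Xstd = bsxfun(@rdivide, bsxfun(@minus, X, mu), sigma);
```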

Accepted Answer

Greg Heath
Greg Heath on 16 Jul 2016
Edited: Greg Heath on 27 Jul 2016
1. Delete and/or modify numerical outliers. Standardization of data to
zero-mean/unit-variance is the most effective way to do this.
2. Keep the ranges of all input and target vector components comparable to help
understand their relative importance.
3. Consider biases to be weights that act on unit vector components.
4. Keep the initial scalar products of weights and vectors within the linear regions
of the sigmoids to avoid algebraic stagnation in the asymptotic regions.
5. Data scaling to [-1 1] is the MATLAB default. Standardization and no scaling are the
alternatives. Since you already have unscaled and standardized data, you have a
variety of choices. My choice is to use the standardized data but accept the
[-1 1] default.
Why? ... because it is the easiest to code and understand.
Hope this helps.
Thank you for formally accepting my answer
Greg
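That choice (standardized data fed through the default [-1 1] scaling) might look like the sketch below; sizes and data are stand-ins, not the asker's files:

```matlab
% Sketch only -- stand-in data with the question's shapes.
x = zscore(rand(10, 400), 0, 2);        % standardized inputs (features x samples)
t = full(ind2vec(randi(40, 1, 400)));   % 40-class one-hot targets
net = patternnet(20);                   % mapminmax [-1 1] is the default input processFcn
net = train(net, x, t);
y   = net(x);                           % class scores for the training samples
```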
  2 Comments
Greg Heath
Greg Heath on 17 Jul 2016
1. My bad. zero-mean/unit-variance is STANDARDIZATION.
MATLAB:
help zscore
doc zscore
NNTOOLBOX:
help mapstd
doc mapstd
2. Defaults do not have to be accepted. They are what
the algorithm uses if an alternative is not specified.
3. Typically, feature vectors are combined to create
feature matrices so that inputs and outputs are
matrices. If you decide to not accept the MAPMINMAX
default, you can use
a. MAPSTD
b. '' % No normalization
Hope this helps.
Greg
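If you do decide against the MAPMINMAX default, the switch is a one-line property change; a sketch (assuming a patternnet and the Neural Network Toolbox process-function names above):

```matlab
% Sketch: choosing an input normalization other than the mapminmax default.
net = patternnet(20);
net.inputs{1}.processFcns = {'removeconstantrows', 'mapstd'};  % standardize instead
% net.inputs{1}.processFcns = {};                              % no normalization at all
```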


More Answers (1)

Walter Roberson
Walter Roberson on 15 Jul 2016
Edited: Walter Roberson on 15 Jul 2016
Algebraically it is not important -- as long as you adjust your transfer functions appropriately. In practice, with floating point round-off and limited range, there could be some effects, which could be anywhere from minor to major, depending on your transfer functions.
Normalizing makes it a lot easier to compare the effects of different parameters. If A varies twice as much as B, is that because A is more important in determining the correlation, or is it because the range of A is higher and maybe A is actually less important? When you normalize then you do not have to think as much about how to interpret the results.
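A quick way to see that point: after standardization the raw scale of a feature drops out entirely, so two features that differ only by a scale factor become identical (a sketch with made-up values echoing the question):

```matlab
% Sketch: the raw scale of a feature drops out after standardization.
A = [0 123 212 242 123 45];   % values like those in the question
B = A / 1000;                 % same pattern, 1000x smaller range
zscore(A) - zscore(B)         % (near-)zeros: identical once standardized
```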
