How to transfer neural network weights to my own code

I'd like to use Matlab to train my neural network, then use the weights Matlab calculates in my own program, written in Go. It seems like this should be pretty straightforward, but I have not been able to get the same results.
My application is regression with 21 numeric inputs, all with different ranges, and a single answer that will range between about 25 and 100. I have lots of training data, over 8000 records. I should be able to get good results and, indeed, if I stay completely in Matlab I get very high R scores.
This has been my approach so far:
  1. Linearly scale all of my inputs and target values to between 0 and 1 and import to Matlab
  2. Use the Matlab Neural Network gui to set up and train a network, mostly using defaults
  3. Run further tests inside Matlab with very good results
  4. Write the network's weights and biases to .csv files using net.IW, net.LW, and net.b
  5. Apply these weights and biases to my own network written in Go.
My own network operates the way I understand it should, but I'm new to this, so it's possible I'm missing something important. Using the weights and biases from Matlab, I get outputs that range from around -50 to +200, which is not what I'm expecting. The results do seem to respond to the inputs in the correct direction; I just get answers from a much wider range than I'm expecting.
My network operates like so:
  1. Each hidden-layer node sums up each input times the weight for that input-node pair from IW
  2. Each hidden-layer node adds its bias to the sum
  3. Each hidden-layer node "activates" with f(sum) = tanh(sum)
  4. The output node sums up all of the hidden-layer outputs times the appropriate weight from LW
  5. The output node adds its bias
  6. The output node activates with f(sum) = sum
  7. Multiply the output by the scale value (100)
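The steps above can be sketched in Go. This is a minimal sketch, not the asker's actual code: the function and variable names are illustrative, the weight matrices are assumed to be loaded from the exported .csv files (net.IW{1,1} as hidden-by-input, net.LW{2,1} as a vector over hidden nodes), and the toy weights in main are made up for demonstration.

```go
package main

import (
	"fmt"
	"math"
)

// forward runs one pass through a single-hidden-layer network using
// weights exported from MATLAB: iw is net.IW{1,1} (hidden x inputs),
// lw is net.LW{2,1} (one weight per hidden node), b1 and b2 are the
// hidden and output biases.
func forward(iw [][]float64, b1 []float64, lw []float64, b2 float64, x []float64) float64 {
	hidden := make([]float64, len(iw))
	for i, row := range iw {
		sum := b1[i]
		for j, w := range row {
			sum += w * x[j] // step 1: weighted sum of inputs
		}
		hidden[i] = math.Tanh(sum) // steps 2-3: add bias, tanh activation
	}
	out := b2
	for i, h := range hidden {
		out += lw[i] * h // steps 4-5: weighted sum of hidden outputs, plus bias
	}
	return out // step 6: linear (identity) output activation
}

func main() {
	// Toy weights for a 2-input, 2-hidden-node network (illustrative only).
	iw := [][]float64{{0.5, -0.3}, {0.1, 0.8}}
	b1 := []float64{0.0, 0.1}
	lw := []float64{1.0, -1.0}
	fmt.Println(forward(iw, b1, lw, 0.2, []float64{0.4, 0.6}))
}
```

Step 7 (rescaling by 100) would then be applied to the returned value.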
I understand that I can get better results by normalizing my data and/or using different activation functions, but my concern right now isn't achieving the best performance. All I want is to get my network behaving the same as Matlab's. There's something I'm missing. Maybe Matlab does its own normalization on the training data.
Has anyone been able to do this successfully?

Accepted Answer

Joshua Leasure
Joshua Leasure on 18 Feb 2017
It turns out my suspicions were correct. MATLAB does do pre-processing. But it's possible to disable it. I followed this guide:
https://stackoverflow.com/questions/15526112/export-a-neural-network-trained-with-matlab-in-other-programming-languages/15537848#15537848
And everything is working as expected now.
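For anyone following along: the fix in the linked answer boils down to replicating MATLAB's mapminmax pre/post-processing, which by default maps each input (and target) from its training-set [min, max] onto [-1, 1]. A sketch in Go, assuming the default [-1, 1] range; the function names here are illustrative, and the min/max values would come from the network's stored process settings in MATLAB:

```go
package main

import "fmt"

// mapMinMax mirrors MATLAB's mapminmax: linearly map x from
// [xmin, xmax] onto [-1, 1] (MATLAB's default output range).
func mapMinMax(x, xmin, xmax float64) float64 {
	return 2*(x-xmin)/(xmax-xmin) - 1
}

// reverseMapMinMax undoes the mapping for the network's output,
// as mapminmax('reverse', ...) does for target values.
func reverseMapMinMax(y, tmin, tmax float64) float64 {
	return (y+1)/2*(tmax-tmin) + tmin
}

func main() {
	// Illustrative values: an input known to range over [0, 50],
	// and the question's target range of [25, 100].
	fmt.Println(mapMinMax(25, 0, 50))         // midpoint maps to 0
	fmt.Println(reverseMapMinMax(0, 25, 100)) // 0 maps back to 62.5
}
```

Apply mapMinMax to each raw input before the forward pass and reverseMapMinMax to the network output afterward, instead of the ad-hoc [0, 1] scaling and multiply-by-100 steps.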

More Answers (1)

Greg Heath
Greg Heath on 18 Feb 2017
MATLAB's default normalization range is [ -1 1 ].
Hope this helps.
Thank you for formally accepting my answer
Greg
P.S. Whenever you do something new you should first test it on the MATLAB example data in
the help and doc documentation
