Neural network accuracy improves on retraining without weight reinitialisation
Frederick Turner on 28 Jan 2018
Commented: Frederick Turner on 29 Jan 2018
Apologies, as a similar question has been asked before, but it was never resolved. I am creating a neural network for a regression problem using nftool/nntool. On the first training run the network sometimes performs quite poorly, but regression accuracy then increases with subsequent training runs (pretty much with each successive run), even though the weights have not been reinitialised. Why does this happen (answers in terms of the error surface and backpropagation would be illustrative, though I don't need that much detail)? When the weights are not reinitialised, does each training run in MATLAB somehow 'build on' the previous one?
Thanks
Accepted Answer
Greg Heath on 29 Jan 2018
A net that already has trained weights will continue training from those weights rather than from a fresh random initialisation, which is why accuracy keeps improving across successive runs.
If you wish to reinitialize to get an alternate design, use the function configure at the top of the loop, as in the sketch below.
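For illustration, here is a minimal sketch (not from the original posts) using a fitnet regression on made-up data; the variable names, data, and the trial loop are assumptions, not the poster's actual setup:

x = rand(2, 200);                     % hypothetical inputs: 2 features, 200 samples
t = sin(x(1,:)) + 0.1*randn(1, 200);  % hypothetical noisy targets

net = fitnet(10);                     % same kind of fitting net nftool creates
[net, tr] = train(net, x, t);         % run 1: starts from random initial weights
[net, tr] = train(net, x, t);         % run 2: continues from the weights left by run 1

% To compare independent designs instead, reinitialize at the top of the loop:
bestPerf = Inf;
for k = 1:5
    net = configure(net, x, t);       % sizes the net to the data AND reinitializes weights/biases
    [net, tr] = train(net, x, t);
    perf = perform(net, t, net(x));   % default performance measure (MSE)
    if perf < bestPerf
        bestPerf = perf;
        bestNet  = net;               % keep the best of the independent designs
    end
end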
I have posted zillions of examples in both the NEWSGROUP (now only available in comp.soft-sys.matlab) and in ANSWERS.
Try searching
greg configure
Hope this helps.
Thank you for formally accepting my answer
Greg