This challenge is to return the weight deltas dWH and dWP, given X, WH, WP, and EPY, using ReLU on the hidden layer and Softmax on the output layer. The test cases accumulate dWP and dWH to train neural nets for a Counter, a Subtractor, and a Mux. The test cases have four output cases; ReLU performs well on multiple output cases.
[dWP,dWH]=Neural_Back_Propagation_ReLU(X,WH,WP,EPY)
The MATLAB LaTeX code for producing the back-propagation chart is included in the template.
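The expected Cody solution is a MATLAB function with the signature above; as a sketch of the underlying math, here is a NumPy version under assumed conventions (columns of X are samples, WH maps input to hidden, WP maps hidden to output, EPY holds the one-hot expected outputs, and the deltas are the cross-entropy gradients). The variable names mirror the problem statement; the shapes and the loss convention are assumptions, not taken from the test suite.

```python
import numpy as np

def relu(z):
    # Element-wise rectified linear unit
    return np.maximum(0.0, z)

def softmax(z):
    # Column-wise softmax, shifted by the column max for numerical stability
    e = np.exp(z - z.max(axis=0, keepdims=True))
    return e / e.sum(axis=0, keepdims=True)

def neural_back_propagation_relu(X, WH, WP, EPY):
    """One backprop pass; returns (dWP, dWH) for cross-entropy loss."""
    # Forward pass
    ZH = WH @ X           # hidden pre-activation
    H = relu(ZH)          # hidden activation
    Y = softmax(WP @ H)   # output probabilities

    # Backward pass: softmax + cross-entropy gives dZP = Y - EPY
    dZP = Y - EPY
    dWP = dZP @ H.T

    # Propagate through WP; ReLU derivative is 1 where ZH > 0, else 0
    dZH = (WP.T @ dZP) * (ZH > 0)
    dWH = dZH @ X.T
    return dWP, dWH
```

The softmax-plus-cross-entropy pairing is what makes the output-layer gradient collapse to `Y - EPY`; a test harness that accumulates these deltas over epochs implements plain batch gradient descent.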