Submitted by Emotional-Fox-4285 t3_yoauod in deeplearning
Emotional-Fox-4285 OP t1_ivjbz4s wrote
Reply to comment by HowdThatGoIn in "In my deep NN with 3 layers, in the second iteration of GD the activations of Layer 1 and Layer 2 are all 0 because of ReLU, since all of their inputs are smaller than 0, and Layer 3 outputs values with a very large floating-point magnitude, the opposite of the first forward propagation. Is this how it should work?" by Emotional-Fox-4285
I've sent you the link to my notebook...
I am a beginner, so I lack the knowledge to figure this out on my own.
I would be grateful if you could take a look at my notebook, and please feel free to suggest any changes.
https://drive.google.com/file/d/1S5s5d6x0iwFOYk9SimiZt2U_6dLNierP/view?usp=sharing
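For anyone reading along: the symptom described in the title (ReLU layers going all-zero after a single gradient-descent update, while the last layer blows up) is the classic "dying ReLU" pattern, usually triggered by a learning rate or weight scale that is too large. The sketch below is not taken from the notebook; it is a minimal NumPy illustration with made-up names (relu, W, b) of how one oversized update can drive every pre-activation negative, so the layer outputs zeros and its gradient vanishes:

    # Minimal sketch (illustrative only, not the OP's notebook code):
    # if an update pushes every pre-activation z = W @ X + b below zero,
    # ReLU outputs all zeros and its gradient is zero, so the layer stops learning.
    import numpy as np

    def relu(z):
        return np.maximum(0, z)

    rng = np.random.default_rng(0)
    X = rng.standard_normal((4, 10))          # 4 features, 10 samples
    W = rng.standard_normal((3, 4)) * 0.01    # small init: some units start active
    b = np.zeros((3, 1))

    A = relu(W @ X + b)
    print("fraction of active units after init:", np.mean(A > 0))

    # Simulate one overly large update (e.g. exploding gradient / big learning rate)
    # that drives all pre-activations negative:
    b -= 100.0
    A = relu(W @ X + b)
    print("fraction of active units after bad update:", np.mean(A > 0))  # 0.0
    # With A == 0 everywhere, the ReLU derivative is 0, so no gradient
    # flows back through this layer on the next iteration.

If that matches what is happening in the notebook, the usual remedies are a smaller learning rate, normalized inputs, He initialization, or a leaky ReLU.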