Discussing Feed Forward & Back Propagation in Neural Networks


Day 3

[Figure: Labeled Neural Network]

Feed Forward

# Initialize weights from a zero-mean normal with variance `constant`
weights = np.random.normal(0, constant**0.5, size=(matrix_rows, matrix_cols))
output = activation_function(np.dot(weights, input))
error = target - output
error_term = error * output_grad            # output_grad: activation derivative at the output
weight_step += error_term * input           # accumulate the gradient over the batch
weights += learning_rate * (weight_step / n_records)
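Putting the lines above together, here is a minimal runnable sketch of the feed-forward pass and gradient-descent update for a single sigmoid unit. The data, the number of epochs, and the learning rate are made-up illustrations; the variable names follow the snippet.

```python
import numpy as np

np.random.seed(42)  # seeded only so the run is reproducible

def activation_function(x):
    return 1 / (1 + np.exp(-x))  # sigmoid

# Toy data: three records, two features each
features = np.array([[0.1, 0.3], [0.4, 0.8], [0.9, 0.5]])
targets = np.array([0.0, 1.0, 1.0])
n_records, n_features = features.shape
learning_rate = 0.5

# Initialize weights from a zero-mean normal, variance 1/n_features
weights = np.random.normal(0, n_features**-0.5, size=n_features)

for epoch in range(1000):
    weight_step = np.zeros(n_features)
    for x, target in zip(features, targets):
        output = activation_function(np.dot(weights, x))  # feed forward
        error = target - output
        output_grad = output * (1 - output)               # sigmoid derivative
        error_term = error * output_grad
        weight_step += error_term * x                     # accumulate over the batch
    weights += learning_rate * weight_step / n_records    # averaged update
```

After training, the prediction for the first record (target 0) should sit below the predictions for the other two (targets 1).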

Back Propagation

hidden_layer_input = np.dot(inputs, weights_ih)
hidden_layer_output = activation_function(hidden_layer_input)   # e.g. sigmoid
output_layer_input = np.dot(hidden_layer_output, weights_ho)
final_output = output_layer_input                               # linear (identity) output unit
error = target - final_output
output_error_term = error                                       # identity output: activation derivative is 1
hidden_error_term = np.dot(output_error_term, weights_ho) * hidden_layer_output * (1 - hidden_layer_output)
delta_wih = learning_rate * hidden_error_term * inputs[:, None]
delta_who = learning_rate * output_error_term * hidden_layer_output
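The steps above can be assembled into one complete backpropagation step for a tiny network: a sigmoid hidden layer feeding a single linear output unit. The 3-2-1 layer sizes, the input values, and the learning rate are illustrative assumptions.

```python
import numpy as np

np.random.seed(1)  # seeded only so the run is reproducible

def sigmoid(x):
    return 1 / (1 + np.exp(-x))

inputs = np.array([0.5, -0.2, 0.1])
target = 0.6
learning_rate = 0.5

weights_ih = np.random.normal(0, 3**-0.5, size=(3, 2))  # input -> hidden
weights_ho = np.random.normal(0, 2**-0.5, size=2)       # hidden -> output

# Forward pass
hidden_layer_input = np.dot(inputs, weights_ih)
hidden_layer_output = sigmoid(hidden_layer_input)
final_output = np.dot(hidden_layer_output, weights_ho)  # linear output unit

# Backward pass
error = target - final_output
output_error_term = error                               # identity output: derivative is 1
hidden_error_term = (output_error_term * weights_ho
                     * hidden_layer_output * (1 - hidden_layer_output))

# Weight updates (inputs[:, None] broadcasts to the (3, 2) weight shape)
weights_ih += learning_rate * hidden_error_term * inputs[:, None]
weights_ho += learning_rate * output_error_term * hidden_layer_output
```

Running the forward pass again after this single update should bring the output closer to the target than it was before.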

Side Notes

# Weights from an x-unit layer to a y-unit layer, drawn from N(0, 0.1^2)
weight_x_to_y = np.random.normal(0, scale=0.1, size=(x, y))
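To make the matrix orientation concrete: with this convention, row i holds the weights leaving unit i of layer x, and column j holds the weights entering unit j of layer y, so a plain dot product propagates activations forward. The 3-and-2 layer sizes below are illustrative.

```python
import numpy as np

np.random.seed(0)  # seeded only so the run is reproducible

# A weight matrix from a 3-unit layer x to a 2-unit layer y
n_x, n_y = 3, 2
weight_x_to_y = np.random.normal(0, scale=0.1, size=(n_x, n_y))

x_activations = np.array([0.2, 0.7, -0.4])
y_inputs = np.dot(x_activations, weight_x_to_y)  # one input per unit of layer y
```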
[Figure: Neural Network Weight Representation]
[Figure: Neural Network Weights Matrix]

