Diving Deeper into Neural Networks
Day 2
Hello everyone! This is the second post in my journey of completing the Deep Learning Nanodegree. Most of today was spent learning how Neural Networks work.
Neural Networks are just multi-layer perceptrons.
Errors
The third function here is the Softmax function. It plays the same role as the Sigmoid function, but Sigmoid has a drawback: it only works when there are two possible outcomes. The Softmax function, on the other hand, is used when we have to deal with three or more classes. It does the same job, converting the raw score of each possible event into a probability, which makes the outputs easier to use later in the model.

But a problem arises. Suppose you opt for the naive approach of dividing each score by the sum of all the scores to get probabilities. This works only as long as every score is positive; if some of the inputs are negative, it can produce negative or undefined "probabilities". To solve this, Softmax first applies the exponential function to each score, which turns every value into a positive number, and only then normalizes.
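To make that concrete, here is a minimal sketch of Softmax in Python (NumPy assumed; the function name `softmax` and the example scores are my own):

```python
import numpy as np

def softmax(scores):
    """Convert raw scores into probabilities.

    Exponentiating first makes every value positive, so the
    normalization works even when some scores are negative.
    """
    exp_scores = np.exp(scores)              # always positive
    return exp_scores / np.sum(exp_scores)   # sums to 1

scores = [-1.0, 1.0, 2.0]
print(softmax(scores))  # ~[0.035, 0.259, 0.705]

# The naive approach fails here: sum(scores) == 2.0, so
# -1.0 / 2.0 would give a "probability" of -0.5.
```

Notice that the output always sums to 1, so it can be read directly as a probability distribution over the classes.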
Cross Entropy
Now, after this, we need to evaluate the final probabilities, those of the output layer. For this, Cross Entropy is used. Cross Entropy is the sum of all the negative logarithms…
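To illustrate the idea of summing negative logarithms, here is a minimal sketch in Python, assuming binary labels (the function name `cross_entropy` and the example numbers are my own):

```python
import numpy as np

def cross_entropy(y, p):
    """Sum of the negative logarithms of the probabilities the
    model assigned to what actually happened (binary labels).
    """
    y = np.asarray(y, dtype=float)  # 1 = event happened, 0 = it didn't
    p = np.asarray(p, dtype=float)  # model's predicted probability of the event
    return -np.sum(y * np.log(p) + (1 - y) * np.log(1 - p))

print(cross_entropy([1, 0, 1], [0.8, 0.1, 0.7]))  # ~0.69, confident and correct
print(cross_entropy([1, 0, 1], [0.2, 0.9, 0.3]))  # ~5.12, confident and wrong
```

The lower the Cross Entropy, the better the model's probabilities match reality, which is what makes it useful as an error measure.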