
Tackling common Neural Network problems

Danyal Jamil
6 min read · Apr 30, 2020


Hello everyone! This is my fourth post since I started my Nanodegree, and I have successfully completed the second module, building a Neural Network from scratch, but more on that later. First, let's discuss today's learnings.

Deep NN

Day 4

Today's main topic was Overfitting vs. Underfitting. Overfitting is when a model fits a given training set well but fails to generalize to unseen test data. Underfitting happens when a model is too simple to capture the complexity of a data set. The sad part is that there's actually no straightforward recipe for the best model, one that avoids both of these phenomena.

To give an example of Underfitting: it's like trying to kill a bear with mosquito spray, see? The solution isn't powerful enough. On the other hand, Overfitting is when you try to kill a mosquito with a bazooka. See what happened there? You used a complex solution for a rather simple problem.
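To see these two failure modes in numbers rather than analogies, here is a minimal sketch; the noisy sine data, the polynomial degrees, and the scikit-learn pipeline are my own assumptions for illustration, not something from the course.

```python
# A minimal sketch of spotting underfitting vs. overfitting by comparing
# train and test scores. Assumes scikit-learn is installed; the noisy
# sine data and the polynomial degrees are made up for illustration.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures

rng = np.random.RandomState(0)
X = rng.uniform(0, 1, size=(100, 1))
y = np.sin(2 * np.pi * X).ravel() + rng.normal(scale=0.2, size=100)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

for degree in (1, 4, 15):  # too simple, about right, too complex
    model = make_pipeline(PolynomialFeatures(degree), LinearRegression())
    model.fit(X_train, y_train)
    # Degree 1 scores poorly on both sets (underfitting);
    # degree 15 scores well on train but drops on test (overfitting).
    print(degree, model.score(X_train, y_train), model.score(X_test, y_test))
```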

Regularization

Now, this is a technique used to avoid Overfitting. What we do in this process is change the error function: we add either the sum of the absolute values of the weights times a constant (L1 regularization) or the sum of the squared values of the weights times a constant (L2 regularization). The constant is called Lambda.
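To make the two penalties concrete, here is a minimal sketch assuming a mean-squared-error base loss; the regularized_loss function, its weights, and the Lambda value are hypothetical names chosen for illustration.

```python
# A minimal numpy sketch of how regularization changes the error function.
# The MSE base loss, the weight vector, and lam (Lambda) are assumptions
# for illustration only.
import numpy as np

def regularized_loss(y_true, y_pred, weights, lam=0.01, kind="l2"):
    mse = np.mean((y_true - y_pred) ** 2)           # original error
    if kind == "l1":
        penalty = lam * np.sum(np.abs(weights))     # Lambda * sum of |w|
    else:
        penalty = lam * np.sum(weights ** 2)        # Lambda * sum of w^2
    return mse + penalty

# Larger weights get punished, which nudges the model toward simpler fits.
w = np.array([0.5, -2.0, 3.0])
print(regularized_loss(np.ones(3), np.zeros(3), w, kind="l1"))
print(regularized_loss(np.ones(3), np.zeros(3), w, kind="l2"))
```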
