So, 3 months ago, I woke up at 3 pm, earlier than usual, thinking about which movie to watch. I opened Facebook and saw people posting that Codecademy was offering free Pro memberships because of Covid-19. I was kinda amazed and applied right away.
Now, after reading many reviews and articles, I decided that I’d study Web Development on the platform. From there, I could either enroll in and complete individual courses or take on the whole Career Path. Not gonna lie, the Career Path seemed like a ton of work at the start, but I eventually made up my mind to go for it rather than manually choosing what to study. Also, I feel like it’s easier when someone does the tough job for you, i.e. the curriculum-making, etc. …
This article is a continuation of a series that focuses on a basic understanding of the building blocks of Deep Learning. Here are some of the previous articles, in case you need to catch up:
We have studied Gradient Descent in the past, and we know how quickly it converges to a minimum. But plain GD often runs into trouble when the loss surface has local minima. For example, see this graph:
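The graph itself isn’t included in this preview, but the failure mode is easy to reproduce. Here is a minimal sketch (plain Python, with a toy double-well function of my own choosing, not from the article) of vanilla gradient descent settling into a local minimum:

```python
def f(x):
    # Double-well function: global minimum near x = -1.48,
    # local minimum near x = 1.35
    return x**4 - 4 * x**2 + x

def grad_f(x):
    return 4 * x**3 - 8 * x + 1

x = 2.0    # start on the right-hand slope
lr = 0.01  # learning rate
for _ in range(1000):
    x -= lr * grad_f(x)

print(f"converged to x = {x:.2f}, f(x) = {f(x):.2f}")
# -> sticks at the local minimum near x = 1.35 (f ~ -2.6),
#    even though the global minimum near x = -1.48 has f ~ -5.4
```

The gradient is zero at the local minimum too, so plain GD has no way of knowing that a deeper valley exists further to the left.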
This article is a continuation of a series that focuses on a basic understanding of the building blocks of Deep Learning. Here are some of the previous articles, in case you need to catch up:
Machine Learning is not just classifying whether an image contains a dog or not. Nor is it just for predicting house prices in Boston. When you realize just how many applications this vast field really has, it’s stunning!
Back to the point: a model isn’t restricted to predicting just one probability for a given input. For example, what if you want to check whether an image contains a dog and a cat? You see? This is where multi-class classification comes in. …
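The article’s own code sits behind the link, but the standard way to get one probability per class out of a network is a softmax output layer. Here’s a minimal NumPy sketch (the class names and scores are my own illustration):

```python
import numpy as np

def softmax(logits):
    # Subtract the max for numerical stability, then normalize
    # so the scores form a probability distribution over classes.
    z = logits - np.max(logits)
    exp_z = np.exp(z)
    return exp_z / exp_z.sum()

# Hypothetical raw scores from a network's final layer
# for the classes [dog, cat, bird]
logits = np.array([2.0, 1.0, 0.1])
for name, p in zip(["dog", "cat", "bird"], softmax(logits)):
    print(f"{name}: {p:.3f}")
# dog: 0.659
# cat: 0.242
# bird: 0.099
```

Note that softmax assumes the classes are mutually exclusive; if an image can contain both a dog and a cat at once, the usual alternative is an independent sigmoid per class (multi-label classification).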
This article is a continuation of a series that focuses on a basic understanding of the building blocks of Deep Learning. Here are some of the previous articles, in case you need to catch up:
In this article, we’ll talk about another widely used technique that people don’t pay much heed to. I’ll try to cover as much of it as I can.
This article is a continuation of a series that focuses on a basic understanding of the building blocks of Deep Learning. Here are some of the previous articles, in case you need to catch up:
Deep Learning is the branch of Artificial Intelligence where we let the model learn features on its own in order to arrive at a result. …
This article is a continuation of a series that focuses on a basic understanding of the building blocks of Deep Learning. Here are some of the previous articles, in case you need to catch up:
Deep Learning is a field that almost rules the list of most worked-on industries, and the reason, as I see it, is the potential it holds for the near future. I have said numerous times that in 10 years we won’t have people driving taxis or manning cash counters; all of that work will be done by machines. …
This article is a continuation of a previous series of articles. Here are links to those stories, if you’d like to catch up.
Now, who doesn’t like Deep Learning? I suppose you do, or why else would you be reading this article? But Deep Learning is still just getting started, and there is A LOT left to discover in the field.
Despite the rapid advancements and studies being done, there is still a ton of stuff that we need to unveil. …
In the last post, we covered Exponentially Weighted Averages! Now, to help the model generalize better, we’ll discuss L2 regularization. Take a peek at my last post here:
‘Regularization is any modification we make to a learning algorithm that is intended to reduce its generalization error but not its training error.’ ~ Ian Goodfellow
Now, there are two main types of regularization, L1 and L2. Sure, there are others too, but these are the ones we generally think of when we talk about L-norm regularizations. …
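The details are in the full post, but the core of L2 regularization fits in a few lines: add λ·‖w‖² to the loss, which adds an extra 2λw term to the gradient, so every update also shrinks the weights slightly (“weight decay”). A minimal NumPy sketch of a single update step, with made-up numbers:

```python
import numpy as np

np.random.seed(0)
w = np.random.randn(5)          # model weights
grad_loss = np.random.randn(5)  # gradient of the unregularized loss w.r.t. w

lam = 0.01  # regularization strength (lambda)
lr = 0.1    # learning rate

# L2 penalty: the loss becomes  loss + lam * np.sum(w ** 2),
# so its gradient gains the extra term  2 * lam * w.
grad = grad_loss + 2 * lam * w

# One gradient-descent step; the penalty term keeps
# pulling the weights toward zero (weight decay).
w -= lr * grad
```

Because the penalty grows quadratically, large weights are punished far more than small ones, which nudges the model toward many small weights instead of a few huge ones and tends to reduce overfitting.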
We all, ML engineers and data scientists, love data. Whenever we hear that we are getting more data to use, it sounds like heaven, but
Not everything is as it seems.
“What is the drawback here?” ~ you might ask. Well, we have our little CPUs, and some of the lucky ones among us have GPUs, but even then, computing power is not skyrocketing; it has a limit. The main drawback I can think of is, obviously, training time that gets far too long.
Let me explain this the way Andrew Ng explains it. Suppose you have weather data for London, and you want to be able to predict the weather on different days. The data might look something…
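The preview cuts off here, but this London-weather setup is the classic example Andrew Ng uses to introduce exponentially weighted averages, the topic mentioned earlier in this series. Here’s a minimal sketch with made-up temperature data, assuming that’s where the example is headed:

```python
import numpy as np

np.random.seed(1)
# Made-up daily temperatures for London (deg C): a seasonal wave plus noise
temps = 12 + 6 * np.sin(np.linspace(0, 3 * np.pi, 365)) + np.random.randn(365)

beta = 0.9  # averages over roughly the last 1 / (1 - beta) = 10 days
v = 0.0
smoothed = []
for t, theta in enumerate(temps, start=1):
    # v_t = beta * v_{t-1} + (1 - beta) * theta_t
    v = beta * v + (1 - beta) * theta
    # Bias correction keeps the early estimates from starting near zero
    smoothed.append(v / (1 - beta ** t))
```

A larger beta means a longer memory and a smoother curve; a smaller beta tracks the raw data more closely but is noisier.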
We all, ML engineers and data scientists, love data. Whenever we hear that we are getting more data to use, it sounds like heaven, but
Not everything is as it seems.
“What is the drawback here?” ~ you might ask. Well, we have our little CPUs, and some of the lucky ones among us have GPUs, but even then, computing power is not skyrocketing; it has a limit. The main drawback I can think of is, obviously, training time that gets far too long.
Say you have a batch of 5,000,000 training examples. That alone makes a vector of size 5,000,000, and just think of the efficiency of performing any mathematical operation on a vector that big. So what do we do here? Ever heard of the saying “baby steps”? This is essentially that. …
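The preview ends there, but the “baby steps” idea is mini-batch gradient descent: split the huge training set into small chunks and take one update step per chunk, instead of waiting for a pass over all 5,000,000 examples. A minimal sketch (the sizes and names are my own, shrunk down for illustration):

```python
import numpy as np

def minibatches(X, y, batch_size=64, seed=0):
    # Shuffle once per epoch, then yield consecutive slices.
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(X))
    for start in range(0, len(X), batch_size):
        batch = idx[start:start + batch_size]
        yield X[batch], y[batch]

# Made-up data: 10,000 examples stand in for the 5,000,000
X = np.random.randn(10_000, 20)
y = np.random.randn(10_000)

for X_batch, y_batch in minibatches(X, y, batch_size=64):
    pass  # compute gradients on this small batch and update the weights
```

Each step is cheap, so the weights get updated thousands of times per epoch instead of once, which usually converges much faster in wall-clock time.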