Gradient Descent: 🗻Going downhill Pt 1

How Computers 💻 learn.

OneHotCoder

🌄Gradient Descent

Think of Gradient Descent as trying to climb down 📉 a hill blindfolded: at each step you figure out which direction ➡️ to move in, until you reach the bottom of the hill.

The gradient gives us the direction of steepest ascent ⏫ (i.e. towards the top of the hill), so we step in the direction opposite to the gradient.
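Here is a minimal sketch of that idea in Python, using a one-dimensional "hill" f(x) = x² (the function, starting point, and step size are all illustrative assumptions):

```python
# A toy one-dimensional "hill": f(x) = x**2, whose slope at x is 2*x.
# Starting point and learning rate are made up for illustration.

def f(x):
    return x ** 2                  # height of the hill at position x

def gradient(x):
    return 2 * x                   # derivative of f: direction of steepest ascent

x = 5.0                            # start somewhere up the hill
learning_rate = 0.1                # size of each blindfolded step

for _ in range(25):
    x -= learning_rate * gradient(x)   # step OPPOSITE to the gradient

print(x, f(x))                     # x ends up very close to 0, the bottom of the hill
```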

💭How does Gradient Descent work?

First, we define the loss function J(θ).

dJ(θ)/dθ gives us the gradient of that function.

Then, at each step, we use Gradient Descent to reduce this loss, until we reach the minimum of the function.
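As a concrete (made-up) example, take J(θ) = (θ − 3)², so dJ(θ)/dθ = 2(θ − 3). Starting from θ = 0 with step size α = 0.25, the first update gives θ = 0 − 0.25 · 2(0 − 3) = 1.5, the next gives θ = 2.25, and each step shrinks as θ approaches the minimum at θ = 3.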

The loss function is given by the formula below, where:

  • h(x⁽ⁱ⁾) → predicted value for the i-th training example
  • y⁽ⁱ⁾ → true value for the i-th training example
  • J(θ) → loss function
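Assuming the usual mean squared error over m training examples (the standard loss these symbols suggest), the formula reads:

J(θ) = (1/2m) · Σᵢ₌₁ᵐ ( h(x⁽ⁱ⁾) − y⁽ⁱ⁾ )²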

We minimize 📉 this loss function using gradient descent as follows:
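Assuming the standard update rule with a learning rate α (the symbol α is my labelling, not something named above):

θ := θ − α · dJ(θ)/dθ

In words: every step moves θ a little against the gradient. Here is a runnable sketch for the one-feature linear case h(x) = θ₀ + θ₁·x; the toy data, learning rate, and iteration count are made up for illustration:

```python
# Gradient descent for simple linear regression with the MSE loss above.
# Toy data, learning rate, and iteration count are illustrative assumptions.

xs = [0.0, 1.0, 2.0, 3.0, 4.0]      # inputs; data roughly follows y = 2x + 1
ys = [1.1, 2.9, 5.2, 6.8, 9.1]      # noisy targets
m = len(xs)

theta0, theta1 = 0.0, 0.0           # parameters of h(x) = theta0 + theta1 * x
alpha = 0.05                        # learning rate

for _ in range(2000):
    # Partial derivatives of J(θ) = (1/2m) Σ (h(x⁽ⁱ⁾) − y⁽ⁱ⁾)²
    grad0 = sum(theta0 + theta1 * x - y for x, y in zip(xs, ys)) / m
    grad1 = sum((theta0 + theta1 * x - y) * x for x, y in zip(xs, ys)) / m

    # Step each parameter opposite to its gradient
    theta0 -= alpha * grad0
    theta1 -= alpha * grad1

print(theta0, theta1)               # ≈ 1 and 2, the intercept and slope of the data
```

After 2000 steps the printed values land close to the intercept 1 and slope 2 that generated the toy data.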

Check Out: Gradient Descent 🗻: Going Downhill pt 2
