11 Comments
Sani Nassif

Small typo in figure. c^2 = a^2 + b^2, not the square root of ...

Tivadar Danka

Thanks, you are correct!

Rocco Jarman

Outstanding resource. Thanks!

Vladimir Shitov

Thank you, that’s a wonderful article! Small typo:

> Take a small step in the direction of the gradient to arrive at the point x₁. (The step size is called the learning rate.)

The steps should be taken in the opposite direction :)

Tivadar Danka

Thanks, I'll fix this ASAP!

Vladimir Shitov

It’s actually pretty fun to follow the gradient once and see how the loss diverges.
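For anyone curious, here is a minimal sketch (not from the article) of this experiment on the toy loss f(x) = x², whose gradient is f'(x) = 2x: stepping against the gradient shrinks the loss, while stepping along it makes the loss blow up.

```python
def step(x, lr, sign):
    """One update x <- x + sign * lr * f'(x) for f(x) = x^2.

    sign = -1 is gradient descent; sign = +1 follows the gradient.
    """
    return x + sign * lr * 2 * x

x_descent = x_ascent = 1.0
for _ in range(20):
    x_descent = step(x_descent, lr=0.1, sign=-1)  # shrinks by 0.8 each step
    x_ascent = step(x_ascent, lr=0.1, sign=+1)    # grows by 1.2 each step

print(f"descent loss: {x_descent**2:.3e}")  # near zero
print(f"ascent loss:  {x_ascent**2:.3e}")   # diverging
```

After 20 steps the descent iterate has loss below 10⁻³, while the ascent iterate's loss has already passed 1000.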

Saurabh Dalvi

Much needed, thanks!

bluematrix

Great work.

Just another hint: in the diagram explaining vector addition, you may want to place the tail of vector y at the tip of vector x, to visualize that they add up to x+y.

Kovats William

Is there a typo in your formula:

<ax+y,z> = a<x,z> + <x,y> = <y,z> ?

I would have thought <ax+y,z>= a<x,z> + <y,z>

but I might just not be understanding something.
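You're right, that looks like a typo in the article. A quick numerical check (my own sketch, using the standard dot product on R³) confirms linearity in the first argument, <ax+y, z> = a<x, z> + <y, z>:

```python
def dot(u, v):
    """Standard dot product on R^n."""
    return sum(ui * vi for ui, vi in zip(u, v))

a = 3.0
x, y, z = [1.0, 2.0, 3.0], [4.0, 5.0, 6.0], [7.0, 8.0, 9.0]

# Left-hand side: <a*x + y, z>
ax_plus_y = [a * xi + yi for xi, yi in zip(x, y)]
lhs = dot(ax_plus_y, z)

# Right-hand side: a*<x, z> + <y, z>
rhs = a * dot(x, z) + dot(y, z)

print(lhs == rhs)  # True
```

Both sides evaluate to 272.0 for these vectors, as the linearity property requires.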

sciencetalks

No amount of words can describe how grateful I am for you to have compiled a stunning resource such as this one. Thank you very much for all of your time and effort! Will definitely be sharing with friends. Can't wait to read more, subscribed. :))

Dr. U V

Looks like a great primer for a machine learning novice like me. My forte is number theory and developing math puzzles and games.
