Just another hint: in the diagram explaining vector addition, you may want to connect the base of vector y to the tip of vector x, to visualize that they add up to x+y.
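For anyone who wants to plot this themselves, here is a minimal matplotlib sketch of the tip-to-tail construction; the vectors x and y are arbitrary examples, not taken from the article's figure:

```python
import numpy as np
import matplotlib.pyplot as plt

# Arbitrary example vectors; any pair works for the illustration.
x = np.array([2.0, 1.0])
y = np.array([1.0, 2.0])

fig, ax = plt.subplots()
# Draw x from the origin.
ax.annotate("", xy=x, xytext=(0, 0), arrowprops=dict(arrowstyle="->", color="C0"))
# Draw y tip-to-tail: its base sits at the tip of x.
ax.annotate("", xy=x + y, xytext=x, arrowprops=dict(arrowstyle="->", color="C1"))
# Draw the sum x + y from the origin; it ends exactly where y ends.
ax.annotate("", xy=x + y, xytext=(0, 0), arrowprops=dict(arrowstyle="->", color="C2"))
ax.set_xlim(0, 4)
ax.set_ylim(0, 4)
ax.set_aspect("equal")
plt.show()
```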
No amount of words can describe how grateful I am that you have compiled a stunning resource such as this one. Thank you very much for all of your time and effort! Will definitely be sharing with friends. Can't wait to read more, subscribed. :))
Small typo in the figure: it should be c^2 = a^2 + b^2, not the square root of ...
Thanks, you are correct!
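For concreteness, a quick check with the classic 3-4-5 right triangle (a minimal sketch in Python, not code from the article):

```python
import math

a, b = 3.0, 4.0
c = math.hypot(a, b)        # hypotenuse: c = sqrt(a^2 + b^2) = 5.0
print(c**2 == a**2 + b**2)  # True: c^2 equals a^2 + b^2
print(c == a**2 + b**2)     # False: c itself is the square root, not the sum
```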
Outstanding resource. Thanks!
Thank you, that’s a wonderful article! Small typo:
> Take a small step in the direction of the gradient to arrive at the point x₁. (The step size is called the learning rate.)
The steps should be taken in the opposite direction :)
Thanks, I'll fix this ASAP!
It’s actually pretty fun to follow the gradient once and see how the loss diverges.
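A minimal sketch of both points on f(x) = x² (the function, starting point, and learning rate here are arbitrary choices, not from the article): stepping against the gradient converges, stepping along it diverges.

```python
def grad(x):
    """Gradient of f(x) = x**2."""
    return 2 * x

lr = 0.1                 # learning rate (step size), chosen arbitrarily
x_down, x_up = 5.0, 5.0  # same starting point for both runs
for _ in range(20):
    x_down -= lr * grad(x_down)  # opposite the gradient: shrinks toward the minimum at 0
    x_up   += lr * grad(x_up)    # along the gradient: blows up

print(x_down)  # ~0.058, converging
print(x_up)    # ~191.7, diverging
```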
Much needed, thanks!
Great work.
Is there a typo in your formula:
<ax+y,z> = a<x,z> + <x,y> = <y,z> ?
I would have thought <ax+y,z> = a<x,z> + <y,z>,
but I might just not be understanding something.
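For what it's worth, a quick numerical sanity check of the corrected identity (a minimal sketch with arbitrary test vectors):

```python
import numpy as np

rng = np.random.default_rng(0)
a = 3.0
x, y, z = rng.normal(size=(3, 4))  # three arbitrary test vectors in R^4

lhs = np.dot(a * x + y, z)             # <ax + y, z>
rhs = a * np.dot(x, z) + np.dot(y, z)  # a<x, z> + <y, z>
print(np.isclose(lhs, rhs))            # True
```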
Looks like a great primer for a novice in machine learning like me. My forte is number theory and developing math puzzles and games.