The Palindrome

Understanding k-Nearest Neighbors

Show me your neighbors, I’ll tell you who you are

Tivadar Danka
and
Levi
May 09, 2024

How long does it take to train the k-nearest neighbor model?

Zero seconds. 

It sounds crazy, but kNN is lazy as hell. This model does no training at all!
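That laziness is easy to see in code. Here is a minimal sketch (my illustration, not from the post; `LazyKNN` is a hypothetical name) where `fit` does nothing but memorize the data, and all the work happens at prediction time:

```python
import numpy as np

class LazyKNN:
    """A minimal kNN sketch: 'training' just memorizes the data."""

    def __init__(self, k=3):
        self.k = k

    def fit(self, X, y):
        # No optimization, no loss function: store the data and return.
        self.X = np.asarray(X, dtype=float)
        self.y = np.asarray(y)
        return self

    def predict(self, x):
        # All the work happens here: compute distances to every stored
        # point, pick the k nearest, and take a majority vote.
        dists = np.linalg.norm(self.X - np.asarray(x, dtype=float), axis=1)
        nearest = np.argsort(dists)[: self.k]
        labels, counts = np.unique(self.y[nearest], return_counts=True)
        return labels[np.argmax(counts)]

model = LazyKNN(k=3).fit([[0, 0], [1, 0], [5, 5], [6, 5]], ["a", "a", "b", "b"])
```

Calling `model.predict([0.5, 0.2])` returns `"a"`: the query sits next to the first cluster, so the vote among its three nearest neighbors goes to that label.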

The family of supervised learning algorithms is broad, but kNN is an outlier. It is a relatively simple model, so it is a good starting point for beginners. There's also some beautiful math behind it: kNN shows the importance of distance metrics (an otherwise abstract concept) in real-life machine learning scenarios.
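To make the metric's role concrete, here is a toy sketch (my illustration, not from the post) where swapping Euclidean for Manhattan distance changes which point counts as the nearest neighbor:

```python
import numpy as np

def euclidean(p, q):
    # Straight-line (L2) distance.
    return float(np.sqrt(np.sum((np.asarray(p) - np.asarray(q)) ** 2)))

def manhattan(p, q):
    # Grid (L1) distance: sum of coordinate-wise absolute differences.
    return float(np.sum(np.abs(np.asarray(p) - np.asarray(q))))

origin = [0.0, 0.0]
a, b = [3.0, 0.0], [2.0, 2.0]

# Under Euclidean distance, b is nearer to the origin (2.83 < 3.0);
# under Manhattan distance, a is nearer (3.0 < 4.0).
```

Since kNN's entire output is determined by "who is nearest", the choice of metric can change the model's predictions, not just its internals.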

On the surface, kNN looks basic, but it powers advanced (and hyped-up) techniques such as vector similarity search! Whenever you search with an image on Google, there's (probably) some version of kNN behind it.
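Vector similarity search is, at its core, the same idea: store a pile of vectors, then rank them by similarity to a query. Here is a brute-force sketch (a hypothetical `cosine_search` helper for illustration; production systems use approximate indexes, not a full scan):

```python
import numpy as np

def cosine_search(query, database, k=2):
    """Rank stored vectors by cosine similarity to the query and
    return the indices of the top-k matches."""
    db = np.asarray(database, dtype=float)
    q = np.asarray(query, dtype=float)
    # Cosine similarity: dot product divided by the product of norms.
    sims = db @ q / (np.linalg.norm(db, axis=1) * np.linalg.norm(q))
    # Negate so argsort yields descending similarity.
    return np.argsort(-sims)[:k]
```

For example, with `database = [[1.0, 0.1], [0.0, 1.0], [0.9, 0.05], [-1.0, 0.0]]` and `query = [1.0, 0.0]`, the two best matches are the vectors pointing almost exactly along the query's direction.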

By the end of this post, you’ll deeply understand how kNN and other lazy algorithms work and why distance is one of the most essential concepts in data science and machine learning.

Let’s dive into it!

A guest post by
Levi