How to implement a decision tree

Beating neural networks one if-else condition at a time, part 2

Tivadar Danka
Nov 08, 2024


Last time, we stopped the decision tree train right when we finally understood how to train one.

Now, we pick up right where we left off. In today’s post, we’ll build our very own implementation of decision trees!

Note: this post is a direct continuation of the How to grow a decision tree post, so read that first to understand this one!


Scoring a split

Before diving into actually growing a decision tree, let's take a step back and examine a split in detail. Each split is determined by a feature and a threshold, for example:

left_leaf = X[:, 0] < 0
right_leaf = ~left_leaf   # equivalent to right_leaf = X[:, 0] >= 0

Each split creates two leaves in our decision tree. In our case, these are represented by the boolean arrays left_leaf and right_leaf.
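
To see these masks in action, here is a minimal, self-contained sketch; the toy arrays X and y below are assumptions for illustration, not data from the post:

import numpy as np

# Toy data, assumed for illustration: four samples, two features
X = np.array([[-1.0, 2.0], [0.5, 1.0], [-0.3, 0.0], [2.0, -1.0]])
y = np.array([0, 1, 0, 1])

left_leaf = X[:, 0] < 0
right_leaf = ~left_leaf

# Boolean masks select the samples (and labels) falling into each leaf
X_left, y_left = X[left_leaf], y[left_leaf]       # samples with feature 0 < 0
X_right, y_right = X[right_leaf], y[right_leaf]   # samples with feature 0 >= 0

Indexing with a boolean mask keeps exactly the rows where the mask is True, which is all a split needs to do.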

These boolean arrays tell us whether the first feature (represented by X[:, 0]) is larger or smaller than …
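
Since this section is about scoring a split, here is a sketch of one standard criterion: the size-weighted Gini impurity of the two leaves. Gini impurity is a common choice, though not necessarily the exact criterion this post goes on to use:

import numpy as np

def gini(y):
    # Gini impurity: 1 minus the sum of squared class proportions
    if len(y) == 0:
        return 0.0
    _, counts = np.unique(y, return_counts=True)
    p = counts / counts.sum()
    return 1.0 - np.sum(p ** 2)

def score_split(y, left_leaf):
    # Size-weighted Gini impurity of the two leaves; lower is better
    y_left, y_right = y[left_leaf], y[~left_leaf]
    n = len(y)
    return (len(y_left) * gini(y_left) + len(y_right) * gini(y_right)) / n

# e.g. with the toy data above: score_split(y, X[:, 0] < 0)

A pure leaf (all labels identical) has Gini impurity 0, so a perfect split scores 0; weighting by leaf size prevents a tiny pure leaf from dominating the score.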
