9 Comments
Ameer Saleem:

Was great to collaborate on this with you, Tivadar 😁 Hope you all enjoy it!

Alberto Gonzalez:

Awesome post. The quality of guest authors in this newsletter is top notch.

Wendlyn Alter:

Great intuitive explanations! I'm an ignoramus, but what I learned from this is (a) it takes a lot to really surprise us (per that logarithmic scale) and (b) entropy is a much different concept than the simplistic idea I always had about it. Do I understand correctly here that it has more to do with the breadth of the spectrum of potentials for a future state, rather than the disorder of a current state? I.e., just about anything could happen?

dishant ghai:

Very intuitively written. Thanks!

Vedant Gosavi:

Fascinating

Robert C Culwell:

Thank you from an olde dawg.

Here's to learning knew trix! 🦴🍻

Thomas Feeney:

From one teacher to another, this is excellent.

Randy Boring:

Takes me back to 1984 learning this from David Huffman himself at UC Santa Cruz!

Joe:

One small addition that might help readers connect the math more directly to the idea: the reason we multiply p_i by log2(p_i) is that entropy is really the weighted average of the information content, which is written as -log2(p_i). The p_i acts as the weight: it tells us how often each outcome happens, so H = -Σ p_i · log2(p_i). In other words, entropy measures the average surprise across all outcomes, not just the information from one event.
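
Not from the post itself, but a minimal Python sketch of the idea in this comment, assuming a generic discrete distribution given as a plain list of probabilities:

```python
import math

def entropy(probabilities):
    """Shannon entropy in bits: the probability-weighted average of surprise.

    Each outcome contributes its surprise -log2(p), weighted by how often
    it occurs (p itself). Outcomes with p == 0 contribute nothing.
    """
    return sum(-p * math.log2(p) for p in probabilities if p > 0)

# A fair coin carries 1 bit of average surprise; a heavily biased coin far less.
print(entropy([0.5, 0.5]))  # 1.0
print(entropy([0.9, 0.1]))  # ~0.469
```
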
