10 Comments
Ameer Saleem

It was great to collaborate on this with you, Tivadar 😁 hope you all enjoy!

Alberto Gonzalez

Awesome post. The quality of guest authors in this newsletter is top notch.

Wendlyn Alter

Great intuitive explanations! I'm an ignoramus, but what I learned from this is (a) it takes a lot to really surprise us (per that logarithmic scale) and (b) entropy is a much different concept than the simplistic idea I always had about it. Do I understand correctly here that it has more to do with the breadth of the spectrum of potentials for a future state, rather than the disorder of a current state? I.e., just about anything could happen?

dishant ghai

Very intuitively written. Thanks!

Vedant Gosavi

Fascinating

Robert C Culwell

Thank you from an olde dawg.

Here's to learning knew trix! 🦴🍻

Thomas Feeney

From one teacher to another, this is excellent

Sqth

This is amazing, I've never read this sort of translation of intuition to math before. It's actually mind-blowing how well that corresponds... it would be very interesting to try that out with my own intuitions, starting with the shape of a graph and building top-down like that.

Personally, I was convinced the only way to take intuition to math was rigorously stating theorems and proving them, but I suppose you could work backwards and then try to prove or disprove.

Randy Boring

Takes me back to 1984 learning this from David Huffman himself at UC Santa Cruz!

Joe

One small addition that might help readers connect the math more directly to the idea: the reason we multiply p_i by log2(p_i) is that entropy is really the weighted average of the information content, which is written as -log2(p_i). The p_i acts as the weight, showing how often each outcome happens. In other words, entropy measures the average surprise across all outcomes, not just the information from one event.
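
A minimal sketch of that weighted-average view in Python (not from the original post; the example probabilities are made up for illustration):

```python
import math

def entropy(probs):
    """Shannon entropy in bits: the p_i-weighted average of the
    information content -log2(p_i) of each outcome."""
    # Skip zero-probability outcomes, since 0 * log2(0) is taken as 0.
    return sum(-p * math.log2(p) for p in probs if p > 0)

# A heavily biased coin rarely surprises us, so its entropy is low.
print(entropy([0.9, 0.1]))  # ~0.469 bits
# A fair coin is maximally uncertain for two outcomes.
print(entropy([0.5, 0.5]))  # 1.0 bit
```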