Entropy
- Richard Lusk

- Aug 22, 2020
- 1 min read
There is something so captivating about Claude Shannon. About information theory. About how, when you think of humans, data, information, everything really is about passing information from one thing to another. How is a story uttered over a kitchen table different from a message packet delivered over the internet? Or from light itself crossing parsecs from its origin across the universe?
What's the difference between information and energy? What is the difference between a beam of light, a signal, and a poem? Maybe there's an answer; I honestly don't know.
I keep coming back to entropy as a concept. Aren't we just electrons passing information to each other in our correspondence? When we misunderstand each other, isn't that a function of noisy channels?
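
A loose sketch of that "noisy channel" framing (the function names and the 2% error rate are mine, purely illustrative): a sentence goes in as bits, each bit flips with some small probability, and what comes out is mostly the same words with a few characters "misheard."

```python
import random

def to_bits(text: str) -> list[int]:
    """Encode a string as a flat list of bits (8 bits per byte, MSB first)."""
    return [(byte >> i) & 1 for byte in text.encode("utf-8") for i in range(7, -1, -1)]

def from_bits(bits: list[int]) -> str:
    """Decode a flat list of bits back into a string, replacing invalid bytes."""
    data = bytes(
        sum(bit << (7 - i) for i, bit in enumerate(bits[n:n + 8]))
        for n in range(0, len(bits), 8)
    )
    return data.decode("utf-8", errors="replace")

def noisy_channel(bits: list[int], flip_prob: float) -> list[int]:
    """Binary symmetric channel: each bit flips independently with probability flip_prob."""
    return [bit ^ (random.random() < flip_prob) for bit in bits]

story = "a story uttered over a kitchen table"
garbled = from_bits(noisy_channel(to_bits(story), flip_prob=0.02))
print(garbled)  # mostly intact, a few characters corrupted by the channel
```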
Why is it that information theory pulled its ideas from Boltzmann and Gibbs? We need to get to the bottom of this; maybe Landauer knows.
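
Part of the answer is that the formulas are nearly the same thing (this is the standard textbook comparison, not a quote from either man): Shannon's entropy of a message source and Gibbs's entropy of a statistical ensemble differ only by a constant and a choice of logarithm base.

```latex
H(X) = -\sum_i p_i \log_2 p_i \qquad \text{(Shannon, bits per symbol)}

S = -k_B \sum_i p_i \ln p_i \qquad \text{(Gibbs, joules per kelvin)}
```

And Landauer ties them together more literally: erasing one bit of information dissipates at least $k_B T \ln 2$ of heat.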

