“As to entropy, I think it would actually be a good measure of password complexity, but unfortunately there's no way to compute it directly. We would need a password database comparable in size (or preferably much larger than) the entire password space in order to be able to do that. Since we can't possibly have that (there are not that many passwords in the world), we can't compute the entropy - we can only try to estimate it in various ways (likely poor)”
- Claude Shannon was a smart dude.
- No seriously, he was amazing; he literally wrote the founding paper on modern code-breaking techniques.
- Shannon entropy is a very powerful tool for measuring information entropy and information leakage.
- Another way of describing Shannon entropy is that it attempts to quantify how much information is unknown about a random variable.
- It’s been used effectively for many different tasks, from proving the one-time pad secure to estimating the limits of data compression.
- Despite the similar sounding names, information entropy and guessing entropy are not the same thing.
- Yes, I’m actually saying that knowing how random a variable is doesn’t tell you how likely someone is to guess it within N guesses (with the exception of the boundary cases: when the variable is always known – aka the coin always lands heads – or when the variable is uniformly distributed – aka a perfectly fair coin flip).
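The point above can be made concrete with a small sketch (my own illustration, not from the original post): two distributions can have exactly the same Shannon entropy while an attacker's odds of guessing them differ. Here a fair 4-sided die and a skewed 5-outcome distribution both come out to 2 bits, yet the skewed one falls to a first guess half the time.

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p))."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def expected_guesses(probs):
    """Average number of guesses for an optimal attacker who
    tries outcomes in descending order of probability."""
    ordered = sorted(probs, reverse=True)
    return sum(i * p for i, p in enumerate(ordered, start=1))

uniform = [0.25] * 4           # a perfectly fair 4-sided die
skewed = [0.5] + [0.125] * 4   # one popular outcome plus 4 rare ones

# Both distributions carry exactly 2.0 bits of Shannon entropy...
print(shannon_entropy(uniform))   # 2.0
print(shannon_entropy(skewed))    # 2.0

# ...but they are not equally guessable:
print(expected_guesses(uniform))  # 2.5 guesses on average
print(expected_guesses(skewed))   # 2.25 guesses, and a 50% first-guess hit rate
```

Outside the uniform case, entropy only bounds guessability; it doesn't determine it, which is exactly why it makes a poor stand-in for password strength against a guessing attacker.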