Comments on Reusable Security: CCS Paper Part #2: Password Entropy

JPGoldberg (2013-04-28):

If there is no recommended notation for the far more useful concept of how many guesses it takes to reach some given probability of attacker success, I'd like to suggest

    C(X, k) = log_2(2 * G(0.5, X, k))

where X is a probability distribution over {x_1, ..., x_m}, k is a key (or password) with k = x_i, and G(p, X, k) is the number of guesses needed for probability p of finding k in distribution X.

C is for "crack time", and the factor of 2 inside the logarithm is picked so that C(X, k) will equal H(X) when X is uniformly distributed.

Some things in there are arbitrary and not particularly principled. But if nobody has something better, then I will use that when talking about this.
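A minimal sketch of how such a "guesses to a given success probability" metric could be computed (function names are mine, and the log_2(2·G) normalization is the one that makes C match Shannon entropy for uniform distributions, per the comment's stated goal):

```python
import math

def guesses_to_probability(dist, p=0.5):
    """G(p, X): minimal number of guesses, made in decreasing order of
    probability, for the attacker's success probability to reach p."""
    total = 0.0
    for guesses, prob in enumerate(sorted(dist, reverse=True), start=1):
        total += prob
        if total >= p:
            return guesses
    raise ValueError("p exceeds total probability mass")

def crack_metric(dist, p=0.5):
    """C(X) = log2(2 * G(p, X)); equals Shannon entropy when X is uniform."""
    return math.log2(2 * guesses_to_probability(dist, p))

def shannon_entropy(dist):
    return -sum(q * math.log2(q) for q in dist if q > 0)

uniform = [1 / 16] * 16
print(crack_metric(uniform), shannon_entropy(uniform))   # 4.0 4.0

# A skewed distribution: half the users pick the same password.
skewed = [0.5] + [0.5 / 15] * 15
print(crack_metric(skewed))               # 1.0 -- one guess reaches 50% success
print(round(shannon_entropy(skewed), 2))  # 2.95 -- Shannon entropy looks far safer
```

The skewed case is exactly why a guess-count metric is "far more useful" here: Shannon entropy averages over the whole distribution, while the attacker only cares about the head.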
"Ask a crypto guy, (or gal), if the...Ohh wait<br /><br />"Ask a crypto guy, (or gal), if the Shannon entropy of a message encrypted with a truly random and properly applied one time pad is equal to the size of the key."<br /><br />Is asking "if the entropy of the <b>plain text</b> is the same as the entropy of the key."<br /><br />I thought it was asking "if the entropy of the <b>cipher text</b> is the same as the entropy of the key."<br /><br />Thanks for be so patient and sorry for wasting your time.Sc00bzhttps://www.blogger.com/profile/07569236555003905833noreply@blogger.comtag:blogger.com,1999:blog-496451536493805371.post-64384651214151044222010-12-14T21:33:00.409-08:002010-12-14T21:33:00.409-08:00Hey, sorry it took me so long to get back to you S...Hey, sorry it took me so long to get back to you Sc00bz. Once again though I'm going to have to disagree with you, though on a technicality. What the Shannon entropy measures is the amount of UNKNOWN INFORMATION. I capitalized that since that's the part you need to focus on. Since the data being transmitted is the message, here are the following use cases:<br /><br />If the message, encrypted_message and key are unknown, then the entropy is that of the message. That's because you have a base understanding of what the message might say. Aka, Matt really likes attacking at noon, so if Matt sends a message it will probably say "attack at noon" with a certain probability. Even if the attacker isn't able to intercept any messages, they still will know I like to sleep in, and probably won't attack in the morning ;)<br /><br />If the encrypted message + key is revealed to the attacker, the entropy is 0. Aka, the attacker can decode everything, so there is no unknown info.<br /><br />If the message is revealed, the entropy is 0. 
See the above: the attacker knows all they need to know.

With a one-time pad, if the encrypted message but NOT the key is revealed, the entropy is equal to that of the original message. By that I mean the attacker doesn't suddenly get dumber by intercepting the encrypted message, so the entropy cannot be worse than if the attacker had not intercepted it. At the same time, the entropy of the encrypted message cannot be more than the entropy of the unknown transmitted message.

Long story short: the entropy of a hidden value can only decrease with the addition of new information. Therefore, even with perfect encryption, the entropy of the message can never be higher than the entropy of the original message.

As I said, information entropy is weird ;)

Sc00bz (2010-11-19):

I was just thinking about this, and I think I understand how we got different answers.
I'm assuming neither the encrypted message nor the key is given. I'm pretty sure that once a message is given, its Shannon entropy is zero, because there is only one message.

If the encrypted message is given: the Shannon entropy of the encrypted message is zero.

If the key is given: the Shannon entropy of the encrypted message is the same as the Shannon entropy of the message.

Given neither: the Shannon entropy of the encrypted message is the same as that of the key, because that's how truly random and properly applied one-time pads work.

Matt Weir (2010-11-02 02:56):

I just made a small (but significant) edit in my description of how to calculate entropy: I added the word "independent" when talking about how entropy values are additive. That is, the final entropy of a one-time-pad-encrypted message is not 1.0 + (entropy of the message), since those two values do not influence the encrypted message independently (they are XORed together). By this I mean the attacker knows the final ciphertext, and they know the probability distribution of the original message. If they can guess the original message, they can XOR it with the ciphertext and obtain the key. So while the key might have an entropy of 1 per bit, the plaintext message leaks information about it. As I said, entropy is weird ;) Another side note: this is also why one-time pads suffer so much when their key generation is not completely random.
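The "guess the plaintext, XOR out the key" observation above is easy to demonstrate, using the thread's own "attack at noon" message as the plaintext (a minimal sketch):

```python
import os

message = b"attack at noon"
pad = os.urandom(len(message))                    # truly random one-time pad
ciphertext = bytes(m ^ k for m, k in zip(message, pad))

# An attacker who correctly guesses the plaintext recovers the entire pad,
# which is why pad and message are not independent given the ciphertext.
guess = b"attack at noon"
recovered_pad = bytes(c ^ g for c, g in zip(ciphertext, guess))
print(recovered_pad == pad)   # True
```

Nothing here breaks the one-time pad's security; it just shows why the key's entropy and the message's entropy don't simply add once the ciphertext is known.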
Matt Weir (2010-11-02 02:47):

Entropy is weird. This example trips up just about everyone (including me) when they first see it, which is why I included it. The thing to remember is that entropy measures how much information is unknown about the final message. Here's one way to look at the problem:

Let's say you know that 70% of my messages read "Attack at dawn", because I'm an early riser (cough), but 30% of my messages read "Attack at noon", due to some nights I like to spend at the bar. This means that if you intercept one of my messages, you would still be able to guess the contents correctly 70% of the time. The one-time pad doesn't add any entropy, but neither does it reduce the entropy of the final message (aka leak any additional information). This is why it's considered a perfectly secure cipher; any other crypto algorithm has the potential of leaking additional information.

Sc00bz (2010-11-01):

"Ask a crypto guy, (or gal), if the Shannon entropy of a message encrypted with a truly random and properly applied one time pad is equal to the size of the key. If they say “yes”, point and laugh at them. The entropy is equal to that of the original message, silly!"

I'm confused, because when I think about this I come up with this basic scenario: my message and my one-time pad are one bit long, and my message is always 1.

My message's Shannon entropy is zero [100% for 1].
My one-time pad's Shannon entropy is one [50% for 0, 50% for 1].
My encrypted message's Shannon entropy is one [50% for 1, 50% for 0].

What am I doing wrong?
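Both the 70/30 example and the one-bit scenario can be checked numerically. The sketch below (my own framing) computes, for a one-bit one-time pad C = M XOR K with uniform K, the message entropy H(M), the ciphertext entropy H(C), and the conditional entropy H(M|C): the ciphertext itself always looks like a full key's worth of randomness, but intercepting it leaves the attacker's uncertainty about the message unchanged, H(M|C) = H(M).

```python
import math
from collections import defaultdict

def H(probs):
    """Shannon entropy in bits of a list of probabilities."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def otp_entropies(msg_dist):
    """Return (H(M), H(C), H(M|C)) for a one-bit pad: C = M ^ K, K uniform."""
    joint = defaultdict(float)                  # P(M = m, C = m ^ k)
    for m, pm in msg_dist.items():
        for k in (0, 1):
            joint[(m, m ^ k)] += pm * 0.5
    cipher = defaultdict(float)                 # marginal P(C = c)
    for (m, c), p in joint.items():
        cipher[c] += p
    # H(M | C) = sum over c of P(c) * H(M | C = c)
    h_m_given_c = sum(pc * H([joint[m, c] / pc for m in msg_dist])
                      for c, pc in cipher.items())
    return H(msg_dist.values()), H(cipher.values()), h_m_given_c

# Matt's example: 70% "attack at dawn" (0), 30% "attack at noon" (1).
print(otp_entropies({0: 0.7, 1: 0.3}))   # H(M) ~0.881, H(C) = 1.0, H(M|C) ~0.881

# Sc00bz's example: the message is always 1.
print(otp_entropies({1: 1.0}))           # H(M) = 0, H(C) = 1.0, H(M|C) = 0
```

So the one-bit scenario is computed correctly; the resolution (as the 2011 comment at the top notes) is that the original claim is about the entropy of the message, not of the ciphertext.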