How does the password Tr0ub4dor&3 have ~28 bits of entropy?


I recently saw this:

[image: the xkcd "Password Strength" comic]

I can’t figure out how he computes 28 bits of entropy for a password like
“Tr0ub4dor&3” — 28 bits seems like very few…


Method 1

He’s modeling the password as the output of a randomized algorithm similar to this one:

  1. Pick one word uniformly at random out of a dictionary with 65,536 (= 16 bits) words. (We assume the dictionary is known to the attacker.)
  2. Flip a coin (= 1 bit); if heads, flip the capitalization of the first letter of the word.
  3. For each vowel in the word, flip a coin; if it lands heads, substitute the vowel with its “common substitution”. Munroe is simplifying here by assuming that words in the dictionary typically have three vowels (so we get ~ 3 bits total).
  4. Pick a numeral (~ 3 bits) and a punctuation symbol (~ 4 bits) at random. Flip a coin (= 1 bit); if heads, append the numeral to the password first and the symbol second; if tails, append them in the other order.

The entropy is a function of the random choices made in the algorithm; you calculate it by identifying what random choices the algorithm makes, how many alternatives are available for each random choice, and the relative likelihood of the alternatives. I’ve annotated the numbers in the steps above, and if you add them up you get about 28 bits total.
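The tally of the steps above can be sketched in Python. The entropy of each independent uniform choice is log2 of the number of alternatives; the dictionary size, the three-vowel assumption, and the rounded numeral/symbol counts (8 and 16 rather than the true 10 and however many symbols) are taken from or implied by the steps, not exact figures:

```python
import math

# Number of equally likely alternatives for each independent choice
# in the model above. Counts are rounded to powers of two, matching
# the approximate bit values annotated in the steps.
choices = {
    "word from a 65,536-word dictionary": 65536,  # 16 bits
    "capitalize first letter?": 2,                # 1 bit
    "substitute vowel 1?": 2,                     # ~3 bits total,
    "substitute vowel 2?": 2,                     # assuming three
    "substitute vowel 3?": 2,                     # vowels per word
    "numeral (rounded down to 8 options)": 8,     # ~3 bits
    "punctuation symbol (~16 options)": 16,       # ~4 bits
    "numeral/symbol order": 2,                    # 1 bit
}

entropy = sum(math.log2(n) for n in choices.values())
print(entropy)  # 28.0
```

With the rounded counts the choices add up to exactly 28 bits, which is where the comic’s estimate comes from.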

You can see that Munroe’s procedure isn’t hard science by any means, but it’s not an unreasonable estimate either. He’s practicing the art of the quick-and-dirty estimate, which he very often demonstrates in his work—not necessarily getting the right number, but forming a quick idea of its approximate magnitude.

Method 2

Each small square is a bit of entropy that’s being accounted for.

  • 16 bits for the word alone
  • 1 for the first letter: caps or not?
  • 1 each for the substitutions O → 0 and A → 4
  • 4 for using a symbol that’s not that common
  • 3 for using a number
  • 1 for the unknown order of symbol + number or number + symbol.

There is some reasoning behind these numbers. For example, when a password policy requires capital letters, almost everybody capitalizes the first letter, so you don’t get much more than a single bit of entropy out of it.
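The square-counting above can be totalled the same way; note that this list sums to 27, within a bit of the comic’s ~28 (the two methods round the vowel-substitution and numeral counts slightly differently):

```python
# Bits of entropy per small square, as itemized above.
bits = {
    "dictionary word": 16,
    "first-letter capitalization": 1,
    "O -> 0 substitution": 1,
    "A -> 4 substitution": 1,
    "uncommon symbol": 4,
    "numeral": 3,
    "symbol/number order": 1,
}

total = sum(bits.values())
print(total)  # 27
```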


The answers above are licensed under CC BY-SA 2.5, CC BY-SA 3.0 and CC BY-SA 4.0.
