What is password entropy?
Password entropy is a measurement of how unpredictable a password is — specifically, how many guesses an attacker would need to crack it through brute force. It is expressed in bits, and each additional bit doubles the number of guesses required.
The concept comes from information theory, specifically Claude Shannon's 1948 paper "A Mathematical Theory of Communication." Shannon entropy measures the minimum number of bits needed to represent a piece of information — in the password context, how much information content (unpredictability) a password contains.
A password with 40 bits of entropy requires up to 2⁴⁰ guesses to crack by brute force — roughly 1.1 trillion guesses. With 80 bits, that becomes 2⁸⁰, or about 1.2 × 10²⁴ guesses. The difference is not linear — it is exponential. This is why security professionals care so much about entropy.
How to calculate entropy
Password entropy depends on two variables: the length of the password and the size of the pool of possible characters at each position.
The formula is: H = L × log₂(N)
Where H is entropy in bits, L is password length, and N is the size of the character pool.
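In code, the formula is a one-liner. A minimal sketch (the function name is an assumption for this example, not from any particular library):

```javascript
// H = L * log2(N): entropy in bits for a password of `length` characters,
// each drawn uniformly at random from a pool of `poolSize` characters.
function passwordEntropy(length, poolSize) {
  return length * Math.log2(poolSize);
}

console.log(passwordEntropy(8, 62).toFixed(1));  // "47.6" bits: 8-char alphanumeric
console.log(passwordEntropy(12, 94).toFixed(1)); // "78.7" bits: 12-char full ASCII
                                                 // (78.6 if log2(94) is rounded to 6.55)
```

Note this only holds when each character is chosen uniformly at random; it overstates the entropy of any human-chosen password.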
Common character pool sizes:
| Character set | Pool size (N) | log₂(N) |
|---|---|---|
| Digits only (0–9) | 10 | 3.32 bits/char |
| Lowercase letters (a–z) | 26 | 4.70 bits/char |
| Lower + uppercase | 52 | 5.70 bits/char |
| Alphanumeric (lower + upper + digits) | 62 | 5.95 bits/char |
| Full printable ASCII (including symbols) | 94 | 6.55 bits/char |
| Full ASCII + Unicode extensions | 100+ | 6.64+ bits/char |
Worked examples:
- 8-character alphanumeric password: 8 × log₂(62) = 8 × 5.95 = 47.6 bits
- 12-character full ASCII password: 12 × log₂(94) = 12 × 6.55 = 78.6 bits
- 20-character full ASCII password: 20 × log₂(94) = 20 × 6.55 = 131 bits
This is the theoretical maximum entropy — the actual effective entropy depends on how randomly the password was generated. A human-chosen "random" password almost always has far lower effective entropy because humans are predictable.
Entropy vs. perceived strength
This is where most password strength meters mislead users. A typical strength meter checks for uppercase, lowercase, numbers, and symbols and reports "strong" when all four are present. But a password like P@ssw0rd1 has high perceived strength (4 character types, 9 characters) and very low actual strength — it appears on every cracked password list.
Why? Because entropy measures statistical unpredictability, not just character diversity. When a human chooses a password by substituting @ for a and 0 for o, they are following an extremely predictable pattern. Modern cracking tools handle these substitutions trivially — they are baked into dictionary attack rule sets.
True entropy requires true randomness. A password generator using a cryptographically secure pseudorandom number generator (CSPRNG) — like crypto.getRandomValues() in the browser, which PassGeni uses — produces passwords with entropy close to the theoretical maximum. A human choosing a "random" password typically achieves 10–30% of the theoretical maximum entropy for the same length and character set.
Entropy and crack time
Crack time estimates depend on the attacker's hardware and the hashing algorithm used to store the password. Modern GPU-accelerated cracking rigs can test billions to trillions of guesses per second against weakly hashed passwords (MD5, SHA-1). Against modern slow hashes (bcrypt, Argon2id), the rate drops dramatically.
| Entropy | Guesses required | Time at 10B/sec (MD5) | Time at 10K/sec (bcrypt) |
|---|---|---|---|
| 28 bits | ~268 million | < 1 second | ~7 hours |
| 40 bits | ~1.1 trillion | ~110 seconds | ~3.5 years |
| 56 bits | ~72 quadrillion | ~83 days | ~230,000 years |
| 80 bits | ~1.2 × 10²⁴ | Billions of years | Heat death of universe |
| 128 bits | ~3.4 × 10³⁸ | Impossible | Impossible |
Key takeaway: against a slow hash (which any serious application uses), 40+ bits of entropy is sufficient for most purposes. Against an offline attack on leaked MD5 hashes — the scenario after a major breach — you want 60+ bits. For long-term secrets, 80+ bits provides a meaningful safety margin against near-future computational improvements.
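The figures above can be sanity-checked by converting entropy and guess rate into time directly. A quick sketch (the guess rates are illustrative assumptions, matching the table's two scenarios):

```javascript
// Worst-case brute-force time: 2^bits guesses at a given guess rate.
function crackSeconds(bits, guessesPerSecond) {
  return 2 ** bits / guessesPerSecond;
}

const SECONDS_PER_YEAR = 3.156e7;
console.log(crackSeconds(40, 1e10));                   // ≈ 110 seconds (fast hash, GPU rig)
console.log(crackSeconds(40, 1e4) / SECONDS_PER_YEAR); // ≈ 3.5 years (slow hash)
```

The same 40-bit password moves from trivially crackable to impractical purely because of the hashing algorithm, which is why the storage side matters as much as the password itself.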
Why pool size matters
Expanding the character pool is the most efficient way to increase entropy per character. Compare:
- Lowercase only: 4.70 bits per character
- Full ASCII (lowercase + uppercase + digits + 32 symbols): 6.55 bits per character
- Difference: 1.85 bits per character
Over a 12-character password, that difference is 22 bits — equivalent to adding roughly 4–5 characters to a lowercase-only password (22 ÷ 4.70 ≈ 4.7). This is why including symbols is genuinely valuable, even as NIST warns against requiring them (because requirements lead to predictable substitutions).
The practical recommendation: use a password generator that draws from a large pool. PassGeni's default pool (lowercase + uppercase + digits + symbols) gives 6.55 bits per character. At 18 characters (the default length), that is approximately 118 bits — sufficient against any foreseeable attack.
Length vs. complexity: the numbers
The NIST guidance to prefer length over complexity is mathematically correct, but the two are not mutually exclusive. The data:
| Configuration | Entropy | Verdict |
|---|---|---|
| 8 chars, full ASCII (complex) | 52 bits | Marginal |
| 12 chars, lowercase only (long) | 56 bits | Marginal |
| 12 chars, full ASCII | 79 bits | Good |
| 16 chars, full ASCII | 105 bits | Strong |
| 20 chars, full ASCII | 131 bits | Post-quantum safe |
| 5-word passphrase (EFF list, 7776 words) | 65 bits | Good for memorability |
| 6-word passphrase (EFF list) | 78 bits | Strong |
The ideal is both: long and drawn from a large pool. A randomly generated 16-character password using the full ASCII printable set achieves 105 bits — better than any human-chosen passphrase, and immune to the pattern-based attacks that defeat complexity rules.
Practical entropy targets
What entropy should you target for different use cases? Following the crack-time analysis above:
- 40+ bits: accounts protected by rate limiting and a slow hash (bcrypt, Argon2id).
- 60+ bits: anything that could face an offline attack on leaked fast hashes (MD5, SHA-1) after a breach.
- 80+ bits: long-term secrets that must withstand near-future computational improvements.
Common entropy myths
Myth: "My password is strong because it has uppercase, numbers, and symbols." Character diversity increases the pool size only if the characters are chosen unpredictably; a predictable pattern gets none of the benefit. P@ssw0rd! has all four types and effectively zero real entropy because it appears in every cracking dictionary.
Myth: "I just need to avoid dictionary words." Modern cracking tools combine dictionary attacks with rule-based transforms, making this insufficient at short lengths. Below 12 characters, even non-dictionary passwords can be cracked quickly against weakly hashed stores.
Myth: "Entropy is the same thing as strength." Entropy measures theoretical unpredictability of random selection. Real-world password strength also depends on: the hashing algorithm used to store it, whether it appears in breach databases, the attacker's specific tooling, and rate limiting on the authentication system. A 40-bit randomly generated password stored with Argon2id is practically much stronger than an 80-bit password stored as unsalted MD5.
Myth: "Passphrases always have more entropy than passwords." A randomly chosen 4-word passphrase from the EFF list has about 51 bits of entropy — less than a 9-character random full-ASCII password. Passphrases win on memorability, not raw entropy at equal length.
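These passphrase figures follow from the same formula, with the word list playing the role of the character pool. A small sketch (the function name is an assumption for this example):

```javascript
// Entropy of a randomly chosen passphrase: words * log2(listSize).
// 7776 is the size of the EFF long word list (diceware: 6^5 entries).
function passphraseEntropy(words, listSize = 7776) {
  return words * Math.log2(listSize);
}

console.log(passphraseEntropy(4).toFixed(1)); // "51.7" bits; compare ≈ 59 bits
                                              // for 9 random full-ASCII characters
```

Each random word contributes about 12.9 bits, so adding one word buys roughly what two extra random full-ASCII characters would.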
PassGeni generates passwords using crypto.getRandomValues() directly in your browser.