Posts tagged ‘crypto’

Together with most of the internet, we tested IPv6 on World IPv6 day last week. I won’t go into detail on what IPv6 is and why it’s important. Although IPv6 has been tested intensively in isolated networks, this was the first time it was tested on such a large scale. Technically, participants just had to add AAAA-records for their websites to DNS. That small change has a huge effect: since most browsers prefer IPv6 AAAA-records over IPv4 A-records, every IPv6-connected user suddenly connects over IPv6 instead of IPv4.
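To see what this looks like from the client side, you can ask the system resolver which addresses it returns and in what order. A minimal sketch in Python (the hostname is just a placeholder, not a site from this post):

import socket

HOST = "example.com"  # placeholder hostname

# getaddrinfo() returns every address family the host publishes; on a
# dual-stack machine IPv6 results are normally sorted first (RFC 6724-style
# address selection), which is why publishing an AAAA-record immediately
# shifts IPv6-capable visitors onto IPv6.
for family, _type, _proto, _canon, sockaddr in socket.getaddrinfo(
        HOST, 80, proto=socket.IPPROTO_TCP):
    label = "AAAA (IPv6)" if family == socket.AF_INET6 else "A (IPv4)"
    print(label, sockaddr[0])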

For the most part, this major changeover happened without so much as a hitch. In fact, if I hadn’t known it was World IPv6 day, I wouldn’t have noticed anything. But I’m not a normal web user, so I did notice some issues.

Continue reading ‘World IPv6 day – lessons learned’ »

Some people seem to be obsessed with long keys for cryptographic purposes. While a longer key does increase its strength, it also decreases performance, and beyond a certain point adding extra bits just isn’t worth it. Bruce Schneier did the calculations in his book Applied Cryptography; I added the conversions to SI units. (I’m quoting without permission, under the “review/criticism” and “research/study” exceptions. If the copyright owner does not agree, please contact me.)

One of the consequences of the second law of thermodynamics is that a certain amount of energy is necessary to represent information. To record a single bit by changing the state of a system requires an amount of energy no less than kT, where T is the absolute temperature of the system and k is the Boltzmann constant. (Stick with me; the physics lesson is almost over.)

Given that k = 1.38×10^-16 erg/°Kelvin [1.38×10^-23 J/K], and that the ambient temperature of the universe is 3.2°Kelvin, an ideal computer running at 3.2°K would consume 4.4×10^-16 ergs [4.41×10^-23 J] every time it set or cleared a bit. To run a computer any colder than the cosmic background radiation would require extra energy to run a heat pump.

Now, the annual energy output of our sun is about 1.21×10^41 ergs [1.21×10^34 J]. This is enough to power about 2.7×10^56 single bit changes on our ideal computer; enough state changes to put a 187-bit counter through all its values. If we built a Dyson sphere around the sun and captured all its energy for 32 years, without any loss, we could power a computer to count up to 2^192. Of course, it wouldn’t have the energy left over to perform any useful calculations with this counter.

But that’s just one star, and a measly one at that. A typical supernova releases something like 10^51 ergs [10^44 J]. (About a hundred times as much energy would be released in the form of neutrinos, but let them go for now.) If all of this energy could be channeled into a single orgy of computation, a 219-bit counter could be cycled through all of its states.

These numbers have nothing to do with the technology of the devices; they are the maximums that thermodynamics will allow. And they strongly imply that brute-force attacks against 256-bit keys will be infeasible until computers are built from something other than matter and occupy something other than space.

The calculations are slightly off; still, they give a very good indication of how far brute-forcing can ever go.
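For what it’s worth, the arithmetic is easy to redo in SI units. A quick back-of-the-envelope check in Python, using only the figures quoted above:

import math

k = 1.380649e-23           # Boltzmann constant, J/K
T = 3.2                    # background temperature of the universe, K
e_bit = k * T              # minimum energy per bit change, about 4.4e-23 J

sun_year = 1.21e34         # annual energy output of the sun, J
supernova = 1e44           # energy released by a typical supernova, J

print(math.log2(sun_year / e_bit))        # ~187.5 -> a 187-bit counter
print(math.log2(32 * sun_year / e_bit))   # ~192.5 -> count to 2^192 in 32 years
print(math.log2(supernova / e_bit))       # ~220.4 -> a 219-bit counter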

When you cycle through all possibilities by incrementing a counter, the number of bit changes is higher than the number of values you visit. To count up to N, you need N flips of bit 0 (the least significant bit), N/2 flips of bit 1, N/4 flips of bit 2, and so on. Some clever mathematicians proved that 1 + 1/2 + 1/4 + 1/8 + … = 2, so you need about 2N bit flips in total. A 187-bit counter hence requires 2 × (2^187 − 1) bit flips, roughly 3.9×10^56.

By iterating in a smarter order, you can reduce the number of flips to about half of that. Computing that smarter order may, however, cost more energy than it saves…
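The post doesn’t name the smart iteration order, but a Gray code is the obvious candidate (my assumption, not something the original states): each successive value differs from the previous one in exactly one bit, so visiting all 2^n values costs 2^n − 1 flips instead of roughly 2 × 2^n. A small sketch comparing the two orderings:

def gray(i):
    # i-th value of the reflected-binary Gray code
    return i ^ (i >> 1)

def flips_binary(n_bits):
    # bit flips needed to count 0, 1, 2, ... through all n-bit values
    return sum(bin(i ^ (i - 1)).count("1") for i in range(1, 2 ** n_bits))

def flips_gray(n_bits):
    # the same values visited in Gray-code order: one flip per step
    return sum(bin(gray(i) ^ gray(i - 1)).count("1") for i in range(1, 2 ** n_bits))

print(flips_binary(10))  # 2036 -- close to 2 * 2**10
print(flips_gray(10))    # 1023 -- exactly 2**10 - 1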

xkcd.com has a spot-on comic on the practical side of security.