The Significance of Key Length
In a 1998 article in the industry literature, a writer claimed that 56-bit keys did not provide as adequate protection for DES in 1998 as they did in 1975 because computers were 1000 times faster in 1998 than in 1975. Therefore, the writer went on, we needed 56,000-bit keys in 1998 instead of 56-bit keys to provide adequate protection. The conclusion was then drawn that because 56,000-bit keys are infeasible (true), we should accept the fact that we have to live with weak cryptography (false!). The major error here is that the writer did not take into account that the number of possible key values doubles whenever a single bit is added to the key length; thus, a 57-bit key has twice as many values as a 56-bit key (because 2^{57} is two times 2^{56}). In fact, a 66-bit key would have 1024 times more values than a 56-bit key.
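The arithmetic behind that error can be checked with a short script (Python here purely for illustration): a 1000-fold speedup in attacker hardware is offset by only 10 extra key bits, not by a 1000-fold longer key.

```python
import math

def bits_to_offset_speedup(speedup: float) -> int:
    """Extra key bits needed so a brute-force search takes at least as
    long as before, given an attacker speedup factor (each added bit
    doubles the key space)."""
    return math.ceil(math.log2(speedup))

# Computers ~1000x faster: only 10 extra bits are needed.
extra = bits_to_offset_speedup(1000)
print(extra)       # 10
print(2 ** extra)  # 1024, matching the 66-bit vs. 56-bit comparison above
```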
But this does bring up the issue: what is the significance of key length as it affects the level of protection?
In cryptography, size does matter. The larger the key, the harder it is to crack a block of encrypted data. The reason that large keys offer more protection is almost obvious; computers have made it easier to attack ciphertext by using brute force methods rather than by attacking the mathematics (which are generally well-known anyway). With a brute force attack, the attacker merely generates every possible key and applies it to the ciphertext. Any resulting plaintext that makes sense offers a candidate for a legitimate key. This was the basis, of course, of the EFF's attack on DES.
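A toy sketch of the brute-force approach, using a trivial single-byte XOR cipher (an illustrative stand-in for DES, chosen so the entire key space of 2^8 = 256 keys can be searched instantly; the "makes sense" test here is simply that the plaintext is printable ASCII):

```python
def xor_encrypt(plaintext: bytes, key: int) -> bytes:
    """Toy cipher: XOR every byte with a single-byte key."""
    return bytes(b ^ key for b in plaintext)

def brute_force(ciphertext: bytes):
    """Try every possible key; keep candidates whose resulting
    plaintext 'makes sense' (here: all printable ASCII)."""
    candidates = []
    for key in range(256):
        guess = xor_encrypt(ciphertext, key)  # XOR is its own inverse
        if all(32 <= b < 127 for b in guess):
            candidates.append((key, guess))
    return candidates

ct = xor_encrypt(b"attack at dawn", 0x5A)
for key, pt in brute_force(ct):
    print(hex(key), pt)  # the true key 0x5a appears among the candidates
```

A real attack works the same way, only over a vastly larger key space and with a better "makes sense" filter.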
Until the mid-1990s or so, brute force attacks were beyond the capabilities of computers that were within the budget of the attacker community. By that time, however, significant compute power was typically available and accessible. General-purpose computers such as PCs were already being used for brute force attacks. For serious attackers with money to spend, such as some large companies or governments, Field Programmable Gate Array (FPGA) or Application-Specific Integrated Circuit (ASIC) technology offered the ability to build specialized chips that could provide even faster and cheaper solutions than a PC. (As an example, the AT&T Optimized Reconfigurable Cell Array (ORCA) FPGA chip cost about $200 and could test 30 million DES keys per second, while a $10 ASIC chip could test 200 million DES keys per second; compare that to a PC, which might be able to test 40,000 keys per second.) Distributed attacks, harnessing the power of between tens and tens of thousands of powerful CPUs, are now commonly employed to try to brute-force crypto keys.
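Those throughput figures translate directly into search times. A quick back-of-the-envelope calculation (using the rates quoted above; note that a real attack finds the key after searching half the space, on average):

```python
# Time to exhaust the full 56-bit DES key space at the search rates
# quoted in the text above.
KEYSPACE = 2 ** 56
SECONDS_PER_YEAR = 365 * 24 * 3600

for name, keys_per_sec in [("PC", 40_000),
                           ("ORCA FPGA", 30_000_000),
                           ("ASIC", 200_000_000)]:
    years = KEYSPACE / keys_per_sec / SECONDS_PER_YEAR
    print(f"{name}: about {years:,.0f} years for the full key space")
```

The single-FPGA figure (about 76 years for the full space, hence about 38 years on average) matches the "$400 FPGA" row of the table below.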
The table below — from a 1995 article discussing both why exporting 40-bit keys was, in essence, no crypto at all and why DES' days were numbered — shows what DES key sizes were needed to protect data from attackers with different time and financial resources. This information was not merely academic; one of the basic tenets of any security system is to have an idea of what you are protecting and from whom you are protecting it! The table clearly shows that a 40-bit key was essentially worthless against even the most unsophisticated attacker. On the other hand, 56-bit keys were fairly strong unless you might be subject to some pretty serious corporate or government espionage. But note that even 56-bit keys were clearly on the decline in their value and that the times in the table were worst cases.
| Type of Attacker | Budget | Tool | Time and Cost Per Key Recovered (40 bits) | Time and Cost Per Key Recovered (56 bits) | Key Length Needed For Protection In Late 1995 |
|---|---|---|---|---|---|
| Pedestrian Hacker | Tiny | Scavenged computer time | 1 week | Infeasible | 45 |
| Pedestrian Hacker | $400 | FPGA | 5 hours ($0.08) | 38 years ($5,000) | 50 |
| Small Business | $10,000 | FPGA | 12 minutes ($0.08) | 18 months ($5,000) | 55 |
| Corporate Department | $300K | FPGA | 24 seconds ($0.08) | 19 days ($5,000) | 60 |
| Corporate Department | $300K | ASIC | 0.18 seconds ($0.001) | 3 hours ($38) | 60 |
| Big Company | $10M | FPGA | 7 seconds ($0.08) | 13 hours ($5,000) | 70 |
| Big Company | $10M | ASIC | 0.005 seconds ($0.001) | 6 minutes ($38) | 70 |
| Intelligence Agency | $300M | ASIC | 0.0002 seconds ($0.001) | 12 seconds ($38) | 75 |
So, how big is big enough? DES, invented in 1975, was still in use at the turn of the century, nearly 25 years later. If we take that to be a design criterion (i.e., a 20-plus year lifetime) and we believe Moore's Law ("computing power doubles every 18 months"), then a key size extension of 14 bits (i.e., a factor of more than 16,000) should be adequate. The 1975 DES proposal suggested 56-bit keys; by 1995, a 70-bit key would have been required to offer equal protection, and an 85-bit key would be necessary by 2015.
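That extrapolation can be written as a simple model (an illustrative sketch: one extra key bit per doubling of computing power, with the 18-month doubling period as the assumed parameter):

```python
import math

def equivalent_key_bits(base_bits: int, base_year: int, year: int,
                        doubling_months: float = 18.0) -> int:
    """Key length needed in `year` to match the protection that
    `base_bits` offered in `base_year`, under Moore's Law: each
    doubling of attacker compute costs one key bit."""
    doublings = (year - base_year) * 12 / doubling_months
    return base_bits + math.ceil(doublings)

print(equivalent_key_bits(56, 1975, 1995))  # 70, as stated in the text
print(equivalent_key_bits(56, 1975, 2015))  # 83 by this simple model
```

The model lands a couple of bits below the 85-bit figure quoted above for 2015; a slightly faster assumed doubling rate closes the gap.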
A 256- or 512-bit SKC key will probably suffice for some time because that length keeps us ahead of the brute force capabilities of the attackers. Note that while a large key is good, a huge key may not always be better; for example, expanding PKC keys beyond the current 2048- or 4096-bit lengths doesn't add any necessary protection at this time. Weaknesses in cryptosystems are largely based upon key management rather than weak keys.
Much of the discussion above, including the table, is based on the paper “Minimal Key Lengths for Symmetric Ciphers to Provide Adequate Commercial Security” by M. Blaze, W. Diffie, R.L. Rivest, B. Schneier, T. Shimomura, E. Thompson, and M. Wiener (1996).
The most effective large-number factoring methods today use a mathematical Number Field Sieve to find a certain number of relationships and then use a matrix operation to solve a linear equation to produce the two prime factors. The sieve step actually involves a large number of operations that can be performed in parallel; solving the linear equation, however, requires a supercomputer. Indeed, finding the solution to the RSA-140 challenge in February 1999 — factoring a 140-digit (465-bit) number — required 200 computers across the Internet about 4 weeks for the first step and a Cray computer 100 hours and 810 MB of memory to do the second step.
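The digit and bit counts quoted for such challenge numbers are related by a constant factor, since a d-digit decimal number has roughly d * log2(10) ≈ 3.32 * d bits:

```python
import math

def digits_to_bits(digits: int) -> int:
    """Approximate bit length of a decimal number with the given
    number of digits: digits * log2(10)."""
    return round(digits * math.log2(10))

print(digits_to_bits(140))  # 465, matching the RSA-140 challenge number
```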
In early 1999, Shamir (of RSA fame) described a new machine that could increase factorization speed by 2-3 orders of magnitude. Although no detailed plans were provided nor is one known to have been built, the concepts of TWINKLE (The Weizmann Institute Key Locating Engine) could result in a specialized piece of hardware that would cost about $5000 and have the processing power of 100-1000 PCs. There still appear to be many engineering details that have to be worked out before such a machine could be built. Furthermore, the hardware improves the sieve step only; the matrix operation is not optimized at all by this design, and the complexity of this step grows rapidly with key length, both in terms of processing time and memory requirements. Nevertheless, this plan conceptually puts 512-bit keys within reach of being factored. Although most PKC schemes allow keys that are 1024 bits and longer, Shamir claims that 512-bit RSA keys "protect 95% of today's E-commerce on the Internet." (See Bruce Schneier's Crypto-Gram (May 15, 1999) for more information, as well as the comments from RSA Labs.)
It is also interesting to note that while cryptography is good and strong cryptography is better, long keys may disrupt the nature of the randomness of data files. Shamir and van Someren (“Playing hide and seek with stored keys”) have noted that a new generation of viruses can be written that will find files encrypted with long keys, making them easier to find by intruders and, therefore, more prone to attack.
Finally, U.S. government policy has tightly controlled the export of crypto products since World War II. Until the mid-1990s, export outside of North America of cryptographic products using keys greater than 40 bits in length was prohibited, which made those products essentially worthless in the marketplace, particularly for electronic commerce; today, crypto products are widely available on the Internet without restriction. The U.S. Department of Commerce Bureau of Industry and Security maintains an Encryption FAQ web page with more information about the current state of encryption registration.
Without meaning to editorialize in this tutorial, it is important to note that the 40-bit key limit was, in essence, put into place by policy makers who believed that only the U.S. knew how to build strong crypto algorithms, ignoring the work ongoing in Australia, Canada, Israel, South Africa, the U.K., and other locations in the 1990s. But there is still a prevailing attitude, apparently, that U.S. crypto algorithms are the only strong ones around; consider Bruce Schneier's blog post from June 2016 titled "CIA Director John Brennan Pretends Foreign Cryptography Doesn't Exist." Cryptography is a decidedly international game today; note the many countries mentioned above as having developed various algorithms, not the least of which is the fact that NIST's Advanced Encryption Standard employs an algorithm submitted by cryptographers from Belgium. For more evidence, see Schneier's Worldwide Encryption Products Survey (February 2016).
On a related topic, public key crypto schemes can be used for several purposes, including key exchange, digital signatures, authentication, and more. In those PKC systems used for SKC key exchange, the PKC key lengths are chosen so as to be resistant to some selected level of attack. The length of the secret keys exchanged via that system has to have at least the same level of attack resistance. Thus, the three parameters of such a system — system strength, secret key strength, and public key strength — must be matched. This topic is explored in more detail in Determining Strengths For Public Keys Used For Exchanging Symmetric Keys (RFC 3766).
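The matching requirement can be illustrated with a small lookup in the spirit of RFC 3766 and NIST SP 800-57. The equivalence figures below are commonly cited estimates, not values taken from this article:

```python
# Approximate strength equivalence: symmetric key bits -> RSA/DH
# modulus bits offering comparable resistance to attack.
# (Commonly cited estimates; an assumption for illustration.)
EQUIVALENT_STRENGTH = {
    80: 1024,
    112: 2048,
    128: 3072,
    192: 7680,
    256: 15360,
}

def rsa_bits_for(symmetric_bits: int) -> int:
    """Smallest tabulated RSA modulus size at least as strong as the
    given symmetric key size."""
    for sym, rsa in sorted(EQUIVALENT_STRENGTH.items()):
        if sym >= symmetric_bits:
            return rsa
    raise ValueError("beyond tabulated strengths")

print(rsa_bits_for(128))  # 3072: exchanging a 128-bit SKC key via a
                          # weaker RSA modulus would leave the PKC side
                          # as the system's weakest link
```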
Source: http://www.garykessler.net/library/crypto.html#keylen