Evolution of an Algorithm: From DES to DESX

By Alex Sutay

For as long as we’ve been sharing information, we’ve been trying to obscure it. Secret codes, text ciphers, and more have been used for centuries to disguise our messages so that anyone can read them, but only the intended recipient can truly understand them. Encryption is just the modern implementation of this desire for obscurity.

One of the earliest codified public encryption algorithms was the Data Encryption Standard, or DES. DES was developed by IBM under the name Lucifer. It is a symmetric-key block cipher, meaning it encrypts data one fixed-size chunk (block) at a time and the same key both encrypts and decrypts the data. While it was being developed, IBM consulted with the NSA, and the resulting edited version of Lucifer was presented to the National Bureau of Standards, which adopted it in November of 1976 under the name we know it by today: DES.
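
To make “symmetric block cipher” concrete, here’s a minimal sketch using pycryptodome (the same library used for the timing data below). The key and message are made up for illustration; the point is that one 8-byte key both encrypts and decrypts a single 64-bit block:

```python
from Crypto.Cipher import DES
from Crypto.Random import get_random_bytes

key = get_random_bytes(8)       # 64-bit key; 8 of those bits are parity, so 56 effective
plaintext = b"8 bytes!"         # DES operates on 64-bit (8-byte) blocks

ciphertext = DES.new(key, DES.MODE_ECB).encrypt(plaintext)
recovered = DES.new(key, DES.MODE_ECB).decrypt(ciphertext)   # the same key decrypts
assert recovered == plaintext
```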

Quick note on the graphs: all of them use the same data, collected by a Python script that uses pycryptodome for the encryption. This first one has a different y-axis scale. Also, the x-axis scale is ×10^9, as noted in the bottom right corner; the maximum file size tested was 1 GB.
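
The actual script is linked at the bottom of the post; purely as an illustration of the approach (not the author’s code), a timing loop over pycryptodome ciphers might look something like this:

```python
import time
from Crypto.Cipher import DES, AES
from Crypto.Random import get_random_bytes

def time_encrypt(cipher, size):
    """Return how long it takes to encrypt `size` bytes of random data."""
    data = get_random_bytes(size)
    start = time.perf_counter()
    cipher.encrypt(data)
    return time.perf_counter() - start

# Sizes chosen as multiples of both block sizes (8 bytes for DES, 16 for AES)
for size in (10**6, 10**7, 10**8):
    des_time = time_encrypt(DES.new(get_random_bytes(8), DES.MODE_ECB), size)
    aes_time = time_encrypt(AES.new(get_random_bytes(16), AES.MODE_ECB), size)
    print(f"{size:>12} bytes  DES: {des_time:.3f}s  AES: {aes_time:.3f}s")
```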

The NSA’s involvement in the creation of DES actually includes a couple of interesting controversies. The biggest change the NSA suggested, and the one most people opposed, was the key size. They proposed a 64-bit key with 8 bits used for parity, meaning it had an effective key size of only 56 bits. Although that was far too many for commercially available computers to brute force at the time, it was still relatively small, and it was easy to foresee that it would be cracked in the near future. Some even suspect that the NSA already had the capability to brute force it and pushed so hard for the smaller key size to allow themselves to break the encryption whenever they desired.

Another controversial change was the NSA’s suggestion for the S-Boxes. S-Boxes, short for substitution boxes, are used in the algorithm to further obscure the data and are essential to resisting certain kinds of cryptanalysis, the art of breaking encryption. The NSA proposed their own S-Boxes, but the reasoning behind their choices was kept confidential. This led people to believe they had built in their own back door that would allow them to break the encryption quickly and easily, no brute force necessary. In fact, in 1990, Biham and Shamir published a new type of cryptanalysis, called differential cryptanalysis, that exploits weak S-Boxes. However, the S-Boxes proposed by the NSA were actually resistant to differential cryptanalysis! This suggests that the NSA already knew about this type of attack and kept it confidential while helping to make DES resistant to it.

The next iteration of DES was 3DES, or Triple DES, developed in 1995. As discussed in the previous section, the primary weakness of DES was the limited key size. To fix this, it was suggested that you run the encryption algorithm multiple times, so that the data would effectively be encrypted with a much larger key space. If the original key was effectively 56 bits, then surely running it twice would give you an effective key space of 112 bits. Unfortunately, that’s not how it turned out. Simply encrypting twice is vulnerable to an attack known as the meet-in-the-middle attack: an attacker with a known plaintext/ciphertext pair can encrypt the plaintext under every possible first key, decrypt the ciphertext under every possible second key, and look for a match in the middle, which costs only about twice the work of a single-DES brute force. In other words, running DES twice with 2 different keys increases your effective key space from 56 bits to roughly 57 bits. This is… underwhelming, to say the least. It’s not worth doing twice the computation for only 1 extra bit of security. Thankfully, there is a solution: run the algorithm 3 times. Even after accounting for meet-in-the-middle attacks used as part of a larger attack, the effective key space is 112 bits. This is a significant increase! It comes at a hefty cost, 3 times the computational power per encryption and decryption, but a 112-bit effective key is far, far more secure. It’s important to remember that each extra bit of effective key space doubles the work of a brute-force attack, so the gain in security is exponential.
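
You don’t have to chain the DES calls yourself; pycryptodome exposes Triple DES directly as DES3, which takes all three 64-bit keys as one 24-byte key. A minimal sketch (adjust_key_parity just fixes the parity bits and rejects degenerate keys that collapse back to single DES):

```python
from Crypto.Cipher import DES3
from Crypto.Random import get_random_bytes

# Keep drawing random bytes until we get a valid 3-key Triple DES key
while True:
    try:
        key = DES3.adjust_key_parity(get_random_bytes(24))  # three 64-bit keys
        break
    except ValueError:
        pass  # rejected a degenerate key (e.g. two of the three sub-keys matched)

block = b"16 bytes of data"     # still a 64-bit block cipher, so lengths must be multiples of 8
ciphertext = DES3.new(key, DES3.MODE_ECB).encrypt(block)
assert DES3.new(key, DES3.MODE_ECB).decrypt(ciphertext) == block
```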

So the problem we have is: how do we increase the key space of DES without an operation that takes a lot of computing power, like running the whole algorithm again? The solution comes from what is known as key whitening, a technique developed by Ron Rivest in May 1984. Essentially, you create 2 extra 64-bit keys, the same size as the blocks used in DES. Encrypting a block then becomes: a bitwise XOR against the first extra key, standard DES encryption of the block, and one last XOR between the output and the second extra key. Decryption follows the same steps in reverse: XOR against the second key, standard DES decryption, then XOR against the first key. Since XOR is a cheap operation, this adds just two simple steps per block, which makes DESX very similar in speed to DES. In exchange, we add 128 bits to the key space (64 bits for each extra key); together with the original 56-bit DES key, that brings us to a total of 184 bits, a significant improvement over 56 with only a small increase in time.
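
Here’s a rough sketch of that whitening construction built on top of pycryptodome’s single DES, just to show the structure; the helper names are mine, not part of any standard API:

```python
from Crypto.Cipher import DES
from Crypto.Random import get_random_bytes

def xor_block(a, b):
    """Bitwise XOR of two equal-length byte strings."""
    return bytes(x ^ y for x, y in zip(a, b))

def desx_encrypt_block(block, k1, des_key, k2):
    """Whiten with k1, DES-encrypt, then whiten with k2."""
    des = DES.new(des_key, DES.MODE_ECB)
    return xor_block(des.encrypt(xor_block(block, k1)), k2)

def desx_decrypt_block(block, k1, des_key, k2):
    """Undo the steps in reverse: un-whiten with k2, DES-decrypt, un-whiten with k1."""
    des = DES.new(des_key, DES.MODE_ECB)
    return xor_block(des.decrypt(xor_block(block, k2)), k1)

k1, k2 = get_random_bytes(8), get_random_bytes(8)   # the two 64-bit whitening keys
des_key = get_random_bytes(8)                        # the usual DES key (56 effective bits)

block = b"8 bytes!"
assert desx_decrypt_block(desx_encrypt_block(block, k1, des_key, k2), k1, des_key, k2) == block
```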

The last piece of DES’s history is its slow death, starting in 2001 and continuing to this day. In 2001, the National Institute of Standards and Technology established the Advanced Encryption Standard, or AES. AES was built from the ground up and is totally different from DES and all of its iterations, with the only common factor being that they’re both symmetric encryption schemes. AES was specifically designed to be efficient in both hardware and software. Because of this, it’s significantly faster at all of its key sizes, which include 128-bit, 192-bit, and 256-bit keys. This means that AES is both faster and more secure than DES, 3DES, and DESX, so these days pretty much everyone has adopted AES as the de facto standard for symmetric encryption.
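
For comparison, using AES through pycryptodome looks almost identical to DES, just with a 16-byte block and a larger key (16, 24, or 32 bytes for the three key sizes). A quick sketch with a 256-bit key:

```python
from Crypto.Cipher import AES
from Crypto.Random import get_random_bytes

key = get_random_bytes(32)        # 256-bit key; use 16 or 24 bytes for 128/192-bit
block = b"sixteen byte msg"       # AES works on 128-bit (16-byte) blocks

ciphertext = AES.new(key, AES.MODE_ECB).encrypt(block)
assert AES.new(key, AES.MODE_ECB).decrypt(ciphertext) == block
```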

Note: since it looks like AES takes 0 seconds, that’s just how much faster it is than DES. Below is a graph of the same data with only AES and a different y-axis, so you can see that AES does scale with more data, but even at 1 gigabyte (the maximum file size tested here) it still takes less than a second.

Script: https://github.com/alex-sutay/encrypt_time
