Data Cryptography

© Mercury Communications Ltd - March 1993

2007 network writings:
My TechnologyInside blog


Cryptography is seen as the only effective way of ensuring security and privacy in data communications. The market move from mainframes to distributed, network-based PCs and workstations is the prime motivator for incorporating encryption as standard in many common desktop tools. Many see the lack of such security as a major inhibitor to moving critical information from mainframes to desktop machines, since ubiquitous communication without encryption creates real risks. Access authentication, in the form of passwords or PINs, is becoming increasingly important. Security through encryption is also seen as a key issue in digital mobile communications, especially GSM, PCN and the DECT cordless standard. The encryption of PIN codes and financial data held on smartcards is also of key importance in the fight against fraud.

Encryption Bewilderment

There are many proprietary encryption algorithms in use in the commercial world. However, wide-scale use of encryption in the public transmission domain depends on the availability of solid, unclassified standards to enable interoperability. The word 'unclassified' is used in the context of government limitations on the use and export of encryption tools.

Until 1991, the U.S. digital signature encryption standard recommended by the National Institute of Standards and Technology (NIST) was RSA (the derivation of this acronym is described later). This was the preferred de facto international standard in use by many commercial companies, and it was widely recognised that royalty payments would be waived to provide a commercial incentive. Companies that adopted RSA included Microsoft, Sun, Lotus, Novell, DEC, Motorola, NT, Exxon, Citicorp, Boeing, GE and Apple. Standards that involve RSA include X.500 for digital directories, Etebac-5 for French banking, AS2805.6.5.3, the Australian digital signature standard, and RFC 1114 for Internet privacy-enhanced mail.

However, on August 30th 1991, NIST changed its mind and proposed an entirely new algorithm for public review. This new standard was called the Digital Signature Standard (DSS). No reasons were given for the change, and many in industry were very disappointed with the announcement. DSS was developed by the U.S. National Security Agency (NSA), the rough equivalent of the UK's GCHQ, and was based on a public-key algorithm published by Taher ElGamal in 1985. The NSA not only makes codes, but breaks them as well, allowing the agency to intercept international telecommunications traffic. This dual role has led many to suspect publicly that life would be made a little easier for the NSA if it could crack the DSS code. In 1992 several researchers claimed to have found 'trap doors' in DSS, but nothing has been proved so far.

Encryption Algorithms

There are two groups of encryption algorithms that are important to the computer and telecommunications fraternities: secret-key cryptography (SKC) and public-key cryptography (PKC).

SKC is the traditional method of encrypting data and uses a single secret key known only to the sender and receiver of the data. The standard in this area is the Data Encryption Standard (DES). PKC is more concerned with signature verification: a procedure that can verify the source of data and also that the data has not been tampered with in transit. The two standards here are RSA and DSS. Let's look at secret-key cryptography first.

Traditional Secret-Key Encryption (DES)

Unlike public-key encryption, secret-key encryption is computationally efficient and is the favourite approach for bulk data encryption.

DES is the most common encryption scheme in use by governments and international industries, and it was the first cryptographic algorithm openly developed by the U.S. government. In 1975, IBM responded to a call from the US government with a proposal, based on its Lucifer system, in which a single 56-bit 'secret' key is used both to encode and to decode. This key is usually presented as a 64-bit word, with parity check bits filling the unused positions. This proposal formed the basis of DES.
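As a minimal sketch of how a single shared key both encodes and decodes, the lines below use the Python pycryptodome library (an assumption for illustration; the briefing itself names no implementation):

    # Single-key (symmetric) DES: the same 8-byte key word (56 secret
    # bits plus 8 parity bits) encodes and decodes. Assumes pycryptodome.
    from Crypto.Cipher import DES

    key = b"8bytekey"                  # 64-bit key word, shared in secret
    plaintext = b"ATTACK AT DAWN__"    # DES processes 8-byte blocks

    ciphertext = DES.new(key, DES.MODE_ECB).encrypt(plaintext)
    recovered = DES.new(key, DES.MODE_ECB).decrypt(ciphertext)
    assert recovered == plaintext      # sender and receiver hold one secret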

The NSA, together with some of the world's best code breakers, evaluated DES positively, but did recommend some changes to the original IBM proposal. As far as is publicly known, DES has never been compromised, and most users still have sufficient faith in the algorithm to use it widely.

DES is a very fast algorithm and is capable of encrypting data at more than 100 million bits per second. Given a machine that could try one key every 1 µs, it would take 1760 years to try all the combinations of the key. It has been reaffirmed twice as a standard but was up for renewal in January 1993.

All cryptography has a natural life span, and advances in computer technology will undoubtedly reduce the value of DES in the future. Using massively parallel computers, it would be possible to try many keys in parallel and considerably reduce the time taken to crack a code. Realising this, the NSA no longer recommends the use of DES on classified data, and on this basis it stopped certifying DES-based products in 1988. DES is seen to be at the end of its natural lifetime because of its very short 56-bit key.

The DES method requires one private key to be sent to the recipient via a path different from the data: usually by diplomatic bag in government circles, or hand-carried in commercial operations. The weakness of this approach, of course, is that it is always possible for the key to fall into the wrong hands. A more widely applicable encryption system is now in use around the world: public-key encryption.

Public Key Encryption

Throughout history, spies, diplomats, and the military have used the same basic principle for their encryption activities. Julius Caesar, for example, seems to have used a simple substitution algorithm in which each letter is moved three places to the right in the alphabet, so that VENI VIDI VICI would be transformed to YHQL YLGL YLFL. To unscramble the message, the enciphering scheme is simply reversed by moving each letter three places to the left. It is a symmetrical process, with enciphering and deciphering taking about the same time. In this case the key is the shift value, 3, which is kept secret.
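A few lines of Python (a modern sketch for illustration, not part of the original briefing) reproduce Caesar's substitution:

    # Caesar's substitution cipher for upper-case text: shift each letter
    # three places right to encipher, three places left to decipher.
    def caesar(text, shift):
        return "".join(
            chr((ord(c) - ord("A") + shift) % 26 + ord("A")) if c.isalpha() else c
            for c in text
        )

    assert caesar("VENI VIDI VICI", 3) == "YHQL YLGL YLFL"    # encipher
    assert caesar("YHQL YLGL YLFL", -3) == "VENI VIDI VICI"   # decipher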

Since Roman times there have been many developments of the secret-key technique, but in essence it has remained the same. Until, that is, the arrival of asymmetric public-key cryptography (PKC) in 1976.

In that year, Whitfield Diffie and Martin Hellman of Stanford University published their paper "New Directions in Cryptography". This paper presented an entirely new type of cryptography which, rather than employing a single key to encrypt and decrypt messages, used a pair of corresponding keys. One key encrypts; the other decrypts. Only one of the keys needs to remain secret; the other may be given to an individual recipient or, more importantly, may be freely distributed, via a bulletin board for example. This is possible because it is computationally impractical to derive the private key from the public one.

The implications of this technique are far-reaching. For the first time, cryptography became applicable to many commercial transactions, such as electronic banking, private electronic mail, and inter-network data traffic. PKC has enabled or directly influenced two major areas of cryptography: electronically signed and sealed data, known as digital signatures; and the electronic distribution of secret keys.

Digital Signatures

A digital signature means that an author can 'seal' transmitted information with their own signature, in the form of their private key. The information can then be sent to anyone holding the corresponding public key. Using this technique, the recipient can determine not only that the person claiming to have sent the message really did send it, but also that the data has not been tampered with while in transit over the network.

To illustrate how this process works, assume that a long message consisting of '1's and '0's is split up into blocks of 64 bits. When the first block arrives it is held in memory. As the second block arrives, each bit of the first block is matched to the bit in the same position in the second block. If the two bits are the same, a '0' is recorded; if they are different, a '1' is recorded. This produces a new 64-bit block which replaces the first block. The process continues with each new block received, generating a new 64-bit block that is dependent on all previously received blocks. The end result is a 64-bit code called a 'digital signature'. Even changing a single bit in the data would change the resulting signature.
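In Python, this bit-matching rule is simply a running exclusive-OR (XOR) of the 64-bit blocks, sketched below:

    # Fold a message into a single 64-bit code: matching bits give '0',
    # differing bits give '1' (a running bitwise XOR of the blocks).
    def xor_signature(data: bytes) -> int:
        digest = 0
        for i in range(0, len(data), 8):                       # 64-bit blocks
            digest ^= int.from_bytes(data[i:i+8].ljust(8, b"\x00"), "big")
        return digest

    sig = xor_signature(b"a message of any length goes here")
    # Flipping any single input bit changes the resulting code.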

This signature is then encrypted using the private PKC key and sent to the recipient. This prevents forgeries by allowing anyone holding the public key to verify the source. As the data itself is not encrypted, anyone who intercepts it can read it. As a result, techniques based on this approach have been easier to export from the USA because they do not stop US national intelligence from monitoring international data traffic. This approach forms the basis for smartcard verification technology.
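A toy sketch of that sealing step, using the small RSA numbers from the worked example later in this briefing (the 64-bit code is reduced modulo N purely so that it fits the tiny key; real keys are far larger):

    # 'Seal' the 64-bit code with the private key D; anyone holding the
    # public key (E, N) can verify it. Toy numbers from the RSA example.
    N, E, D = 3337, 79, 1019

    data = b"the transmitted data"
    digest = 0
    for i in range(0, len(data), 8):                   # running XOR, as above
        digest ^= int.from_bytes(data[i:i+8].ljust(8, b"\x00"), "big")

    signature = pow(digest % N, D, N)                  # seal with private key
    assert pow(signature, E, N) == digest % N          # verify with public key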

Distributing Keys

Traditional secret codes are only as good as the method used to distribute the keys, which has usually involved couriers. But with the advent of PKC, the public switched telephone network (PSTN) can be used instead.

For example, the Internet, the large academic and research network, uses a PKC algorithm to distribute electronically the secret keys used with DES-encrypted data. As explained earlier, DES is still used because of its great speed. Consider two people, Jim and Jeff, who want to establish an interactive DES session. This works as follows (a code sketch follows the steps).

1 Jim puts his public PKC key on, say, a bulletin board so that it is available to Jeff.

2 Jeff then uses Jim's public key to encrypt a random DES key he has created.

3 Jim then decrypts Jeff's secret key using his private PKC key.

4 Now that both parties have the secret key, secret DES communication can start.
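A sketch of the exchange in Python, assuming the pycryptodome library, with RSA plus OAEP padding standing in for the generic PKC step:

    # Steps 1-4: Jeff encrypts a random DES session key with Jim's public
    # PKC key; only Jim's private key can recover it. Assumes pycryptodome.
    from Crypto.PublicKey import RSA
    from Crypto.Cipher import PKCS1_OAEP, DES
    from Crypto.Random import get_random_bytes

    jim = RSA.generate(2048)                   # Jim's PKC key pair
    public_key = jim.publickey()               # step 1: put on bulletin board

    des_key = get_random_bytes(8)              # Jeff's random DES key
    wrapped = PKCS1_OAEP.new(public_key).encrypt(des_key)    # step 2

    unwrapped = PKCS1_OAEP.new(jim).decrypt(wrapped)         # step 3
    assert unwrapped == des_key                # step 4: DES session can start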

RSA Algorithms

RSA is the most popular of the PKC algorithms. It was published in 1978 by Ronald Rivest, Adi Shamir, and Leonard Adleman of the Massachusetts Institute of Technology, and the name derives from the initials of the three inventors' surnames. In 1982 they formed RSA Data Security to commercialise their developments through product development and licensing. The RSA patent does not expire until the year 2000.

Many public-key algorithms have been proposed over the years, but RSA is by far the easiest to understand and implement, and it has certainly been the most popular. It should be remembered that with all these algorithms, analysis does not actually prove, or disprove, the integrity of the encryption. What it can do, however, is provide a high confidence level, underpinned by theory.

RSA is based on the difficulty of factoring large numbers. The public and private keys are functions of a pair of very large prime numbers, each of 100 to 200 digits. The algorithm calculates both keys from these prime numbers, and determining one key from the other is said to be equivalent to factoring the product of the two primes. This process could take thousands of years running on today's computers. In fact, after 10 years nobody has admitted ('admitted' is a key word in this context!) to breaking an RSA code.

Beware, A Very Technical Section!

It is not possible within the constraints of this briefing to go into the details of how the RSA algorithm works, but it is possible to show the principles through a simple example.

A prime number generator is used to produce two large prime numbers, P and Q. Let's assume that P = 47 and Q = 71 (in real life the numbers would be much, much bigger). These two numbers are multiplied together to derive N: N = P x Q, in this case N = 3337. The public encryption key, E, must have no factors in common with (P-1) x (Q-1), i.e. 46 x 70 = 3220. E could therefore be randomly chosen as 79.

The private key, D, is derived as the multiplicative inverse of E: D = 79^-1 mod 3220, giving D = 1019. In practice, E and N form the public key, D is the private key, and P and Q are discarded or kept secret.

To encrypt the data string "DR DOBBS", first write it as ASCII codes, M = 68 82 32 68 79 66 66 83, then break the digits into six three-digit blocks Mi, where M1 = 688, M2 = 232, and so on. The first block is encrypted using the formula 688^79 mod 3337 = 1570. A similar calculation on the subsequent blocks generates the encrypted message C = 1570 2756 2091 2276 2423 158.

To decrypt this message, a similar set of calculations is performed using the private key, D. For example, to decrypt the first block: 1570^1019 mod 3337 = 688.
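The whole worked example fits in a few lines of Python (assuming Python 3.8 or later, whose pow() computes modular inverses directly):

    # The worked example above, end to end. pow(E, -1, phi) needs
    # Python 3.8+ to compute the modular inverse.
    P, Q = 47, 71
    N = P * Q                       # 3337
    phi = (P - 1) * (Q - 1)         # 3220
    E = 79                          # public key: no factors in common with phi
    D = pow(E, -1, phi)             # private key: 1019

    blocks = [688, 232, 687, 966, 668, 3]       # "DR DOBBS" as ASCII digits
    cipher = [pow(m, E, N) for m in blocks]     # encrypt with the public key
    assert cipher == [1570, 2756, 2091, 2276, 2423, 158]
    assert [pow(c, D, N) for c in cipher] == blocks   # decrypt with D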

The security of the RSA code depends wholly on the difficulty of factoring large numbers. Calculations indicate that using the best factoring algorithm available, and assuming computer performance one million times better than today's, it would still take 4,000 years to crack the code. There is always the possibility that a new factoring technique could be developed that would reduce this time. But, as mathematicians have been working on this problem since Greek times, this is unlikely to be achieved - or is it?
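The toy key above also shows why key size is everything: simple trial division recovers P and Q from N = 3337 instantly, whereas no known method does the same for a 100-to-200-digit N. A sketch:

    # Why small keys fail: trial division factors the toy modulus at once.
    # For a genuine 100-200 digit N, no known algorithm comes close.
    def factor(n: int):
        f = 2
        while f * f <= n:
            if n % f == 0:
                return f, n // f       # found the two primes
            f += 1
        return n, 1                    # n itself is prime

    assert factor(3337) == (47, 71)    # the private key falls out immediately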

The New DSS Algorithm

As discussed earlier, in 1991 NIST replaced the de facto industry standard, RSA, with the Digital Signature Standard (DSS). This new algorithm is focused on electronically verifying the integrity of data and its source, rather than on the encryption of data. The DSS algorithm computes and verifies a short code that is appended to the bulk data file; in practice it works in a similar manner to that described earlier. However, it has come under considerable criticism from the computer community. Some of the public comments are:

  • DSS is not compatible with the work of standards bodies such as ISO and CCITT, which have already accepted RSA. Adopting DSS will create dual standards, holding back the market and creating confusion.
  • Several organisations have claimed DSS infringes existing patents, and actions are currently pending.
  • DSS is too slow, especially for verification, and could cause problems when used on a PC.
  • There is no provision for encryption of data and secret key exchange.
  • It is also claimed that the fixed key length of 512 bits is too short. Indeed, there shouldn't be a fixed length key at all.

Public opinion, as expressed in Communications of the ACM, seems to be unanimous in recommending that NIST withdraw the DSS proposal on the basis of the above weaknesses. However, NIST refuses to do so. It may seem incredible, but the US and UK governments treat the licensing of mass-market encryption software for export in the same way they treat munitions exports. US export controls are driven by the historic premise that cryptography should be the preserve of national security interests, which works against what is happening in commercial markets worldwide. For example, DES is a world standard, yet US manufacturers cannot sell products using DES to non-US companies. Similarly, many non-US companies can offer global, corporate-wide DES- and RSA-based encryption solutions, while US companies are limited to offering US-only solutions.

Even though it is possible to take a shrink-wrapped PC encryption program out of the US in a suitcase, or to ship it overseas via a modem, criminal penalties include fines of up to $1 million and ten years in prison. This is creating serious problems for US software companies that wish to embed encryption in their products, and it is impeding the move from centralised to distributed computing.

It is also interesting to note that one way to immunise desktop machines against virus attacks is for software houses to attach digital signatures to their packages. The signature then acts as a tamper-detection seal that allows users to check the integrity of the software each time they load it into memory.

Round up

Cryptography is a key technology for the 1990s telecommunications and computer industries. It will be used in such areas as: encryption of smartcard data; verification of the source and content of electronic transactions and credit cards; prevention of eavesdropping on cellular telephones; and inbuilt protection against viruses. Unclear worldwide standards and guidelines will undoubtedly lead to high costs, confusion, and delays in the widespread adoption of encryption.

Because of US export rules and the confusion caused by the imposition of an 'unwanted' standard in the form of DSS, encryption seems to be in a state of disarray. This could not have come at a worse time for the computer and telecommunications industries, when the need for embedded encryption is at an all-time high. An example of this confusion is the current demand by GCHQ that the GSM encryption algorithm A5 be downgraded to a weaker version, called A5X, that will allow it to monitor GSM traffic. The justification is the monitoring of traffic from drug barons, the criminal fraternity, and the IRA.

The whole issue of data security and encryption will become of major consequence in coming years. The computer fraternity have had to address these issues in depth already. Telecommunications is next.

Credit: I would like to acknowledge Dr. Dobb's Journal for their excellent articles on the subject of cryptography.
