
One important and practical application of complexity theory is cryptography. It has become one of the main tools for privacy, trust, electronic payments, and other forms of security. No longer just a military tool, its advantages should be used to the fullest extent. This paper discusses basic terminology and the main methods of cryptography.

Cryptography is the science of scrambling text so that no one can read it except the intended recipient. The art of breaking ciphers without the proper key is called cryptanalysis. Cryptography deals with secure messaging, digital signatures, authentication, and other similar applications. Cryptology is the branch of mathematics that studies the foundations of cryptographic methods. The process of transforming plaintext into a form that is meaningless to anyone who might intercept it is called encryption.

The process of decoding the message is called decryption. Encryption and decryption are carried out with an encryption algorithm, a decryption algorithm, and a secret or private key. The sender uses the encryption algorithm to encode the message, and the receiver uses the decryption algorithm and the key to decode it. A third party intercepting the encoded message holds worthless data unless it can figure out the decryption algorithm and obtain the key. To keep the key safely out of the hands of third parties, it must never be sent with the encoded message.

The study of cryptography dates back thousands of years to the hieroglyphs of early Egyptian civilization. Cryptography has been used by such figures as Julius Caesar, Charlemagne, Louis XIV, and Mary Queen of Scots. One of the simplest cryptographic algorithms ever used was the Caesar cipher, which Julius Caesar used to send messages to his generals. It consisted of replacing each letter with the letter three places further down the alphabet. For example, Caesar would become Fdhvdu.
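The Caesar shift can be sketched in a few lines of Python (a minimal illustration, not any historical method):

```python
def caesar(text: str, shift: int = 3) -> str:
    """Shift each letter `shift` places down the alphabet, wrapping around."""
    out = []
    for ch in text:
        if ch.isalpha():
            base = ord('A') if ch.isupper() else ord('a')
            out.append(chr(base + (ord(ch) - base + shift) % 26))
        else:
            out.append(ch)  # leave spaces and punctuation untouched
    return ''.join(out)

print(caesar("Caesar"))  # -> Fdhvdu
```

Decryption is simply the same function with the shift negated: `caesar(ciphertext, -3)`.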

A generalization of this algorithm is the shift cipher, which allows the letters to be shifted by any amount from 1 to 25; the size of the shift is the key. ROT13, a shift of exactly 13, is a well-known special case. A step beyond the shift cipher is monoalphabetic substitution. In this algorithm, each letter corresponds to another letter, but in no particular order. For example, a = m, s = p, z = d, and so on for all 26 letters. This made the algorithm harder to break, but also made for large keys that couldn't be memorized, because they consisted of 26 pairs of letters.
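A monoalphabetic substitution can be sketched as follows; the shuffled alphabet plays the role of the 26-pair key (names like `make_key` are illustrative, not standard):

```python
import random
import string

def make_key(seed: int = 42) -> dict:
    """Build a random letter-for-letter substitution table (the 26-pair key)."""
    letters = list(string.ascii_lowercase)
    shuffled = letters[:]
    random.Random(seed).shuffle(shuffled)
    return dict(zip(letters, shuffled))

def substitute(text: str, key: dict) -> str:
    # Letters not in the table (spaces, punctuation) pass through unchanged
    return ''.join(key.get(ch, ch) for ch in text.lower())

key = make_key()
inverse = {v: k for k, v in key.items()}          # the decryption table
ciphertext = substitute("attack at dawn", key)
assert substitute(ciphertext, inverse) == "attack at dawn"
```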

Blaise de Vigenere developed a polyalphabetic substitution known as the Vigenere cipher. Instead of using one fixed substitution, it shifts each letter of the message by an amount determined by the corresponding letter of a repeating keyword, so the same plaintext letter can encrypt to different ciphertext letters. A table (the tabula recta) is used in conjunction with the key. The table is fairly large, but the key is small enough to be memorized, and the table is useless without it. This cipher wasn't totally safe, but no general method of breaking it was published until the nineteenth century.
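The keyword-driven shifting can be sketched like this (a toy illustration; the tabula recta lookup reduces to modular addition of letter positions):

```python
def vigenere(text: str, key: str, decrypt: bool = False) -> str:
    """Shift each letter by the alphabet position of the matching key letter."""
    out, ki = [], 0
    for ch in text:
        if ch.isalpha():
            shift = ord(key[ki % len(key)].lower()) - ord('a')
            if decrypt:
                shift = -shift
            base = ord('A') if ch.isupper() else ord('a')
            out.append(chr(base + (ord(ch) - base + shift) % 26))
            ki += 1          # only advance the key on letters
        else:
            out.append(ch)
    return ''.join(out)

print(vigenere("attackatdawn", "lemon"))  # -> lxfopvefrnhr
```

Note how the two a's in "attack" encrypt to different letters (l and o), which is what defeats simple frequency analysis.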

Sir Francis Bacon's biliteral cipher employed arrangements of the letters a and b in five-letter combinations, each representing a letter of the alphabet. This illustrates the principle that a code employing only two different signs can be used to transmit any data.
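Bacon's scheme is essentially five-bit binary with a and b as the digits. A sketch of the simple 26-letter variant (Bacon's original merged I/J and U/V into a 24-letter alphabet):

```python
def bacon_encode(text: str) -> str:
    """Encode each letter as a five-character group of a's and b's."""
    groups = []
    for ch in text.lower():
        if ch.isalpha():
            n = ord(ch) - ord('a')        # a=0, b=1, ..., z=25
            bits = format(n, '05b')       # five binary digits
            groups.append(bits.replace('0', 'a').replace('1', 'b'))
    return ' '.join(groups)

print(bacon_encode("hi"))  # -> aabbb abaaa
```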

Modern cryptography is concerned with four chief objectives:

1. Confidentiality (the information cannot be understood by anyone other than the intended recipient).

2. Integrity (the information cannot be altered during storage or transit without the alteration being detected).

3. Non-repudiation (the creator cannot later deny having created and transmitted the information).

4. Authentication (the sender and receiver can confirm each other's identity and the origin or destination of the information).

Advances in technology such as Internet e-commerce, communications satellites, and television signals have generated a serious need for more versatile and secure systems and methods of encryption. With the widespread use of the Internet for both private and commercial business, information security touches every aspect of our lives, yet the average Internet user doesn't realize that even their e-mail can be intercepted and read. Possibly the greatest barrier to computer security is the human factor: the average user is not a programmer, and most encryption software is not user friendly.

Prior to 1977, the algorithms in use were symmetric: the same key was used to encrypt and decrypt. The United States adopted the Data Encryption Standard (DES), a symmetric algorithm, as a federal standard in 1977. A pair of American cryptographers, Whitfield Diffie and Martin Hellman, had introduced the asymmetric, or public-key, approach in 1976. Asymmetric algorithms have real advantages: the encryption key can be published without revealing the decryption key, and far fewer unique keys are needed, since each party needs only one key pair rather than a separate secret key for every correspondent.

Also in 1977, while at the Massachusetts Institute of Technology, Ronald Rivest, Adi Shamir, and Leonard Adleman invented a new asymmetric technique, RSA, which remains the most commonly used technique today. Later, in 1991, Phil Zimmermann released PGP (Pretty Good Privacy), a system closely based on RSA that had to be reworked to make it legally distributable. It was distributed on the World Wide Web as freeware, which has made it one of the most widespread cryptographic systems in use.

There are two classes of key-based encryption algorithms: symmetric (secret-key) and asymmetric (public-key). Symmetric algorithms use the same key for encryption and decryption; asymmetric algorithms use different keys for the two operations.

The symmetric algorithms can be divided into stream ciphers and block ciphers. A stream cipher encrypts the plaintext a single bit (or byte) at a time, while a block cipher takes a fixed-size block, typically 64 or 128 bits, and encrypts it as a single unit.
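The stream idea can be illustrated with a toy cipher that XORs the plaintext against a keystream derived from the key (a deliberately insecure sketch; real stream ciphers generate their keystreams very differently):

```python
import hashlib

def keystream(key: bytes, length: int) -> bytes:
    """Toy keystream: hash the key with a running counter (NOT secure)."""
    out, counter = b'', 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(8, 'big')).digest()
        counter += 1
    return out[:length]

def xor_stream(key: bytes, data: bytes) -> bytes:
    return bytes(a ^ b for a, b in zip(data, keystream(key, len(data))))

msg = b"stream ciphers work bit by bit"
ct = xor_stream(b"secret", msg)
assert xor_stream(b"secret", ct) == msg   # XOR is its own inverse
```

Because encryption and decryption are the same XOR operation, a single function handles both directions.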

The asymmetric algorithms (public-key cryptography) permit the encryption key to be public which allows anyone to encrypt with the key. However, only the proper recipient (the one who knows the decryption key) can decrypt the message. The decryption key is called a private or secret key.

Generally speaking, symmetric algorithms execute more quickly on a computer than asymmetric ones. The two types can actually be used together: a public key is used to encrypt a randomly generated session key, and that random key is used to encrypt the actual message with a symmetric algorithm. This is sometimes called hybrid encryption. The most commonly used symmetric cipher is DES and the most commonly used asymmetric cipher is RSA.
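A hybrid scheme can be sketched end to end with a textbook-sized RSA key wrapping a random session key (all numbers here are toy values for illustration; real systems use moduli of 2048 bits or more and vetted symmetric ciphers):

```python
import hashlib
import secrets

# Textbook toy RSA key: n = 61 * 53 = 3233, with (e * d) % lcm(60, 52) == 1
n, e, d = 3233, 17, 2753

def xor_with(key_int: int, data: bytes) -> bytes:
    """Toy symmetric cipher: XOR against a hash-derived keystream."""
    ks, counter = b'', 0
    while len(ks) < len(data):
        ks += hashlib.sha256(str((key_int, counter)).encode()).digest()
        counter += 1
    return bytes(a ^ b for a, b in zip(data, ks))

# Sender: wrap a random session key with the public key, encrypt the message
session_key = secrets.randbelow(n)
wrapped = pow(session_key, e, n)
ciphertext = xor_with(session_key, b"the actual message")

# Receiver: unwrap the session key with the private key, then decrypt
recovered = pow(wrapped, d, n)
assert xor_with(recovered, ciphertext) == b"the actual message"
```

Only the short session key pays the cost of the slow asymmetric operation; the bulk of the message is handled by the fast symmetric cipher.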

E-Commerce Encryption

For Web-based transactions, the connection to the host should be in a secure mode using Secure Sockets Layer (SSL) or Secure Electronic Transaction (SET). This encrypts everything sent back and forth between the browser and the server, mainly to protect against credit card theft. Most computers come with a pre-installed browser that supports one of these standards. For example, Microsoft Internet Explorer uses SSL and displays an unbroken key or padlock in the lower corner of the screen so that the user knows they are on a secure site.

A digital signature is a small amount of data created using a secret key. A corresponding public key can be used to verify that the signature really was generated with that private key. The algorithm used to generate the signature must ensure that only signatures produced with the secret key verify as valid.

Digital signatures can include such things as timestamps to testify that a document existed at the stated time. The signature can also be used to certify that a public key belongs to a particular person. This is done by signing the combination of the key and the information about its owner by a trusted key (the trusted key is owned by a third party).

The reason for trusting the third-party key may be that it was signed by yet another trusted key. At some point, a key must be the root of the trust hierarchy and must simply be accepted as trustworthy. In a centralized key infrastructure, such as a government's, there are a limited number of roots in the trust network. In a distributed infrastructure there doesn't have to be a universally accepted root; each party may have different trusted roots. This is the web-of-trust concept used in PGP.

A digital signature for an arbitrary document can be created by computing a message digest from the document, and then concatenating it with information about the signer, timestamp, etc. The resulting string is then encrypted using the private key of the signer. The resulting encrypted block of bits is the signature. It may then be distributed together with information about the public key that was used to sign it.

To then verify that signature, the recipient first determines whether it trusts that the key belongs to the person it is supposed to belong to (by using the web of trust or other trusted key method). Once the recipient has verified the key, the signature is decrypted using the public key of the person. If the signature decrypts properly, and the information matches that of the message, the signature is accepted as valid. The most widely used algorithm for making and verifying digital signatures is RSA.
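The sign-then-verify round trip can be sketched with a textbook-sized RSA key (toy values; real signature schemes add padding such as RSASSA-PSS, and a 3233 modulus is trivially breakable):

```python
import hashlib

n, e, d = 3233, 17, 2753   # textbook toy RSA key (n = 61 * 53)

def digest(msg: bytes) -> int:
    """Message digest reduced mod n (toy; far too small to be secure)."""
    return int.from_bytes(hashlib.sha256(msg).digest(), 'big') % n

def sign(msg: bytes) -> int:
    # "Encrypt" the digest with the private key
    return pow(digest(msg), d, n)

def verify(msg: bytes, sig: int) -> bool:
    # Recover the digest with the public key and compare
    return pow(sig, e, n) == digest(msg)

sig = sign(b"I owe you $10")
assert verify(b"I owe you $10", sig)
assert not verify(b"I owe you $10", (sig + 1) % n)   # a tampered signature fails
```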

When shopping on the Web, many merchants require that a buyer register a user name and password before making a purchase. This is the same type of method used for an Automated Teller Machine (ATM) code.

A good cryptographic system should be designed to be as difficult to break as possible. Anything available that can be used to circumvent security must be made explicit, documented, and brought to the attention of the end users. Whether it's due to poor training, distractions, or forgetfulness, end users are generally the weakest link in the whole cryptographic system.

Theoretically speaking, any cryptographic method with a key can be broken by trying all possible keys in sequence. The computing power required for such a brute-force search grows exponentially with the length of the key: the longer the key, the longer it takes to break. A 32-bit key takes 2^32 (about 10^9) steps, which can be done easily on a typical home computer. With an efficient algorithm, a 40-bit key would take approximately one week on a modern home computer.
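The exponential growth is easy to tabulate. Assuming a hypothetical rate of 10^9 keys tried per second (an illustrative figure, not a benchmark):

```python
RATE = 10 ** 9   # assumed keys per second (illustrative only)

for bits in (32, 40, 56, 64, 80, 128):
    keys = 2 ** bits
    years = keys / RATE / (3600 * 24 * 365)
    print(f"{bits:3d}-bit key: {keys:.2e} keys, ~{years:.2e} years")
```

Each extra bit doubles the work, which is why 128-bit keys stay out of brute-force reach even at vastly higher guessing rates.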

A 56-bit key (such as the one used with DES) takes considerably more effort, but a large number of home computers working in a distributed effort can break it in just a few months. Most 64-bit keys are breakable by governments, organized criminals, and major companies within a few years. 80-bit keys appear to be adequate for a few more years, and 128-bit keys should remain unbreakable by brute force for the foreseeable future.

The key lengths used in public-key cryptography are generally much longer than those used in symmetric ciphers, because of the extra mathematical structure available to the cryptanalyst. The problem there is not guessing the right key but deriving the matching secret key from the public key. The RSA algorithm is based on large prime numbers. Testing whether a very large number is prime is relatively easy, but if it is not prime, finding the numbers it is a product of can be extremely hard. So if we take two very large primes and multiply them, we get an even larger number, a semiprime, whose two factors are practically impossible to recover.
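Key generation from two primes can be sketched with textbook-sized numbers (toy primes; real keys use primes of hundreds of digits, found with probabilistic tests such as Miller-Rabin):

```python
p, q = 61, 53                  # toy primes; real ones are hundreds of digits
n = p * q                      # public modulus, a semiprime (3233)
phi = (p - 1) * (q - 1)        # Euler's totient of n
e = 17                         # public exponent, coprime to phi
d = pow(e, -1, phi)            # private exponent (modular inverse, Python 3.8+)

m = 65                         # a message encoded as an integer < n
c = pow(m, e, n)               # encrypt with the public key
assert pow(c, d, n) == m       # decrypt with the private key
print(n, d)                    # -> 3233 2753
```

The security rests entirely on an attacker's inability to recover p and q from n; with numbers this small, trial division breaks it instantly.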

The complexity of the RSA cryptosystem is such that a 256-bit key is easily factored on a home computer, and a 512-bit key can be broken by a university research group within a few months. Keys with 1024 bits or more should be safe for now, unless some major cryptanalytic advances are made against RSA. There is some speculation that 2048-bit keys will be secure for decades.
