The advent of electronic communications broadened the uses of encryption and introduced new needs that fell under the study of cryptography. Banks, for instance, needed not only to communicate in privacy but also to protect the integrity of their messages and to assure their authenticity: guarantees that a message was sent by its true source and was not modified in transit.
Cryptography is now a deep and fascinating field, essentially about the maintenance, release, and transformation of information, often in counter-intuitive ways. It draws on coding theory, the theory of computation, and the theory of information and noise, as well as on security. Branches of mathematics previously thought to have no worldly application, such as number theory, are now cornerstones of a growing Internet economy.
Codes and ciphers
Tremendous effort is often made to discover the meaning of secret messages. This is particularly true during times of war. To break a code or cipher is to discover a method by which it can be undone without access to the original codebook, enciphering system, or secret key. While codes might seem particularly secure, since the assignment of number or symbol to meaning is arbitrary, a body of messages can often be examined for patterns which eventually indicate the meaning of the codes.
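To give a concrete flavor of such pattern analysis (an illustration added here, not part of the original account), the first step is often simple frequency counting: in a large enough body of messages, frequent code groups or cipher symbols betray frequent plaintext elements. A minimal sketch in Python:

    # Count symbol frequencies in intercepted text; in a simple substitution
    # cipher the most frequent symbols likely stand for common letters such
    # as E or T in English plaintext.
    from collections import Counter

    def frequency_profile(intercepted: str):
        """Cipher symbols ordered from most to least frequent."""
        counts = Counter(ch for ch in intercepted.upper() if ch.isalpha())
        return counts.most_common()

    # Matching this profile against the known letter frequencies of the
    # suspected language is the starting point of a break.
    print(frequency_profile("XLII FT XLII GSVIV XLII"))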
Ciphers are more portable and are increasingly used in a computerized world. Most classical ciphers, those used up to and including World War II, can now be broken. A famous example is the Enigma machine, which in its various forms secured German communications during the war. It was broken first by Polish mathematicians, who reconstructed the machine from some initial guesses as to its workings, and the work continued in England with the mathematician Alan Turing and the construction of Bombes, electromechanical code-breaking machines.
Modern ciphers are much more resistant to attack. The most widely used has been DES, which for over twenty-five years protected civilian communications but has now been rendered insecure by very fast computers, which can exhaustively try every possible key, looking for reasonable-looking text among the trial decryptions. A new standard, AES, also known as Rijndael, was adopted in 2001 to replace DES.
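The exhaustive attack just mentioned is easy to sketch. The toy cipher below (a single-byte XOR, a stand-in of my choosing rather than DES itself) has only 256 keys; the attack simply tries each key and keeps decryptions that look like text. Against DES the same loop would have to run over all 2^56 keys, which fast hardware can now do.

    # Exhaustive key search: try every key, keep plausible-looking decryptions.
    # A one-byte XOR "cipher" stands in for DES; the key space is the point.
    def toy_decrypt(ciphertext: bytes, key: int) -> bytes:
        return bytes(b ^ key for b in ciphertext)

    def looks_reasonable(text: bytes) -> bool:
        return all(32 <= b < 127 for b in text)   # crude test: printable ASCII

    def brute_force(ciphertext: bytes):
        for key in range(256):                    # DES would need 2**56 trials
            guess = toy_decrypt(ciphertext, key)
            if looks_reasonable(guess):
                yield key, guess

    ciphertext = toy_decrypt(b"attack at dawn", 0x5A)
    for key, guess in brute_force(ciphertext):
        print(key, guess)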
A new class of public-key ciphers was proposed in 1976 and first implemented in 1977 as the famous RSA crypto-system. These use two keys instead of one: a public key to lock the message and a private key to unlock it. The advantage of a public-key system is that two parties need not first meet in person to share a secret key. Without prior introduction, a message can be privately sent using the recipient's well-known, published public key. Only the recipient can unmask the message, since only the recipient knows the private key.
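The arithmetic behind RSA fits in a few lines. The sketch below uses deliberately tiny primes so the numbers stay readable; real RSA uses primes hundreds of digits long, and this toy version omits the padding a real implementation requires.

    # Toy RSA: the public key (n, e) locks a message; the private exponent d,
    # derived from the secret primes p and q, unlocks it.
    p, q = 61, 53                  # secret primes (far too small for real use)
    n = p * q                      # public modulus: 3233
    phi = (p - 1) * (q - 1)        # Euler's totient of n
    e = 17                         # public exponent, coprime to phi
    d = pow(e, -1, phi)            # private exponent: e*d = 1 (mod phi)

    message = 65
    ciphertext = pow(message, e, n)    # anyone can lock with the public key
    recovered = pow(ciphertext, d, n)  # only the holder of d can unlock
    assert recovered == message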
While early cryptographers tended to be linguists, or men of letters of some sort, mathematicians became involved around the turn of the last century, and cryptography is now a fully mathematical discipline. At first, crypto-systems were judged secure only by trial and error: a system was secure until someone happened to break it, and nobody knew when or if that would happen, or by whom. Currently there is a movement toward provable security, and the theory has evolved to a point where a crypto-system can be shown secure under certain hypotheses.
The state of the art is, however, that no one knows whether the hypotheses are true. Statements are made such as: RSA is secure assuming that there is no way to efficiently factor a large integer. That is, having chosen two large prime numbers p and q, and given to the world their product n=pq, it is an assumption that an impractically large amount of work is required to derive from n the numbers p and q. If this is true, RSA is secure; if it is not, RSA is not secure. We do not now know how to factor large integers efficiently (except, in principle, by quantum computation), but we also do not know whether someday someone will find a way to do so.
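The naive attack on this assumption, trial division, shows why key sizes matter: it must test divisors up to the square root of n, and that bound grows roughly tenfold with every two additional digits of n. A sketch (the toy modulus from above factors instantly; a several-hundred-digit modulus would not):

    # Trial division: the naive way to recover p and q from n = p*q.
    # Work grows with sqrt(n), so hundreds of digits put it out of reach.
    import math

    def factor(n: int):
        for candidate in range(2, math.isqrt(n) + 1):
            if n % candidate == 0:
                return candidate, n // candidate
        raise ValueError("n is prime")

    print(factor(3233))   # (53, 61) -- instant here, hopeless for large n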
The theory of computation, and in particular the theory of complexity, can make all these statements precise. These theories depend upon the models of computation put forward by Alan Turing, the same Alan Turing who figured in the cracking of the German Enigma encipherment. Cryptography is deeply interwoven with our knowledge of information and computation.
Signatures and authentication codes
As electronic communications developed and our understanding of computation deepened, important new applications for cryptography appeared. One was the ability to apply to a message a transformation that acted as a signature. Only the holder of a secret key could correctly apply the transformation, but anyone could verify, without knowledge of the secret key, that the transformation had been correctly applied.
Hence if a message is accompanied by the transform's result, the owner of the secret key must have been involved. Presumably the owner of the secret had read the message and agreed with its contents before performing the transformation. This is the essence of a signature. It has recently become law that such signatures are binding, and they are used to sign various sorts of documents. Prior to commercial use, they were used by sensing devices placed in foreign countries to monitor compliance with nuclear test-ban treaties. The signature attested that the statements attributed to the sensing device were indeed made by the device, and not fabricated to forge a facade of compliance.
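A sketch of such a signing transformation, reusing the toy RSA numbers from above (and, like them, far too small for real use): the key holder transforms a hash of the message with the private exponent, and anyone can check the result with the public key.

    # Toy RSA signature: only the holder of d can produce it; anyone holding
    # the public key (n, e) can verify it. Insecure toy parameters.
    import hashlib

    n, e, d = 3233, 17, 2753       # the toy key pair computed earlier

    def sign(message: bytes) -> int:
        digest = int.from_bytes(hashlib.sha256(message).digest(), "big") % n
        return pow(digest, d, n)           # requires the secret exponent d

    def verify(message: bytes, signature: int) -> bool:
        digest = int.from_bytes(hashlib.sha256(message).digest(), "big") % n
        return pow(signature, e, n) == digest   # needs only the public key

    sig = sign(b"I agree to these terms.")
    print(verify(b"I agree to these terms.", sig))   # True
    print(verify(b"I agree to other terms.", sig))   # almost surely False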
The signature generally also seals the message, since the message cannot be changed without invalidating the signature. Sometimes the public verifiability of the signature is not needed, only an assurance that the message cannot be changed without the owner or one of his friends noticing the change. Such schemes are called authentication codes.
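Python's standard library ships such an authentication code, HMAC. A minimal sketch: sender and receiver share a secret key, the sender attaches a tag computed from the key and the message, and the receiver recomputes it; any change to the message changes the tag.

    # An authentication code in practice: HMAC over a shared secret key.
    # Without the key, an outsider cannot forge a tag for an altered message.
    import hashlib
    import hmac

    key = b"shared secret key"
    message = b"transfer 100 dollars to account 42"

    tag = hmac.new(key, message, hashlib.sha256).hexdigest()

    # The receiver recomputes the tag and compares in constant time.
    expected = hmac.new(key, message, hashlib.sha256).hexdigest()
    assert hmac.compare_digest(tag, expected)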
Chance, randomness and pseudo-randomness
As was said above, security is generally only relative to certain assumptions. Even granting these assumptions, security is a probability, never a certainty. A secret can be guessed; a signature can be guessed; an authentication code can be guessed. The modern theory of cryptography is therefore always an exercise in probabilities, and the aim is to reduce the chance that a key or signature can be guessed to less than, say, one in a billion. Guessing a 128-bit key at random, for example, succeeds with probability one in 2^128, vastly smaller than one in a billion.
The definition of randomness is a philosophically complex issue. Probability gives us a working definition of randomness, and its correctness is borne out in real-life applications that depend on a proper understanding of it: casinos, insurance, statistical mechanics. The goal of a cipher is to render the text stream into an apparently random stream of data: no information about the message should be computable by looking at the encrypted text. The encrypted text is not really random, however; it need only "look" random to someone who lacks the secret key and has reasonably restricted computational power. This is the concept of pseudo-randomness: the data is not random but nevertheless cannot be distinguished from truly random data by computations of reasonably restricted power.
In the modern theory of cryptography, encryption and pseudo-randomness are shown to be equivalent. If it is possible to generate a stream of data which is not random but is nevertheless indistinguishable from random by suitably restricted computational means, then it is possible to encrypt securely. Conversely, if it is possible to encrypt securely, then this encryption can serve as the basis for generating a pseudo-random data stream.
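One direction of this equivalence is simple to sketch: given a pseudo-random stream, encryption is just the XOR of that stream with the message. In the sketch below, SHA-256 applied to the key and a counter stands in for a proper pseudo-random generator; this is an illustration of the idea, not a vetted cipher.

    # Encryption from pseudo-randomness: XOR the message with a keystream
    # that "looks random" to anyone lacking the key. SHA-256 over (key,
    # counter) stands in for a pseudo-random generator here.
    import hashlib

    def keystream(key: bytes, length: int) -> bytes:
        out = b""
        counter = 0
        while len(out) < length:
            out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
            counter += 1
        return out[:length]

    def xor_cipher(key: bytes, data: bytes) -> bytes:
        return bytes(a ^ b for a, b in zip(data, keystream(key, len(data))))

    c = xor_cipher(b"secret key", b"attack at dawn")
    assert xor_cipher(b"secret key", c) == b"attack at dawn"  # XOR self-inverts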
Burton Rosenberg
24 Aug 2004