Wednesday, January 27, 2010

Lecture 3: Entropy (Jan. 27)

Today's lecture explained how entropy can be used to judge whether you have a good encryption algorithm. DES was not covered today due to lack of time and will probably be taught next lecture. Shannon (sorry, I can't give you more information about him) came up with a way to mathematically describe the amount of information carried by a communication channel, its bandwidth, etc. This is what we know as entropy: the amount of information present. Shown below is Shannon's model.

H(X) = -Σ p(x) log₂ p(x)

This equation tells you how many bits are needed to describe all the possible messages. For example, if there is only one possible signal, the entropy is 0, meaning the outcome is already known and conveys no information. If there are 1024 equally likely signals, the entropy is 10, meaning 10 bits can describe all possible messages. So the main goal in encryption is to increase the entropy of the message, thereby increasing its complexity.
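To make the 1024-signal example concrete, here is a minimal Python sketch (my own illustration, not from the lecture) that computes Shannon entropy from a list of probabilities; the function name shannon_entropy is just a label I chose:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: H = sum over x of p(x) * log2(1/p(x))."""
    return sum(p * math.log2(1 / p) for p in probs if p > 0)

# One possible signal -> entropy 0: the only signal carries no information.
print(shannon_entropy([1.0]))              # 0.0

# 1024 equally likely signals -> entropy 10: 10 bits cover every message.
print(shannon_entropy([1 / 1024] * 1024))  # 10.0
```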

Dr. Gunes also explained some characteristics of good ciphers. The main characteristics are: the amount of secrecy should match what you actually need, the keys and the enciphering algorithm should be simple, the process should be simple to carry out, errors should not propagate, and the enciphered text should be the same size as or smaller than the original.

The last thing that was talked about is the concept of confusion and diffusion. Confusion means that there is no easy relation between the plaintext and the ciphertext: if you changed only one letter in the plaintext, you would get an entirely different ciphertext, with many or all of the letters changed. Diffusion means that the information in the plaintext should be spread all over the ciphertext, so that someone would need access to most of the ciphertext in order to infer anything about the algorithm.
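As a rough illustration of this avalanche behavior (my own example, not something shown in lecture), the sketch below hashes two messages that differ in a single character and counts how many output bits differ. SHA-256 is a hash function rather than a cipher, but it is designed with the same confusion and diffusion goals:

```python
import hashlib

def bit_diff(a: bytes, b: bytes) -> int:
    """Count the number of differing bits between two equal-length byte strings."""
    return sum(bin(x ^ y).count("1") for x, y in zip(a, b))

# Two plaintexts that differ in just one character.
h1 = hashlib.sha256(b"attack at dawn").digest()
h2 = hashlib.sha256(b"attack at dawm").digest()

# Roughly half of the 256 output bits flip, so the one-character
# change is spread across the entire output.
print(bit_diff(h1, h2), "of 256 bits differ")
```

Running this typically shows on the order of 128 of the 256 output bits changing, even though the inputs differ in only one letter.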

This is a brief summary of what was covered in lecture today.
