
Information Theory

- Bit rate is the number of bits per second.
- Baud rate is the number of signal units per second. Baud rate is less than or equal to the bit rate.

Example
An analog signal carries 4 bits in each signal unit. If 1000 signal units are sent per second, find the baud rate and the bit rate.

Solution
Baud rate = 1000 bauds per second (baud/s)
Bit rate = 1000 x 4 = 4000 bps
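Both quantities follow from the relationship bit rate = baud rate x bits per signal unit. A minimal sketch of the calculation (the function names are mine, not from the text):

```python
def bit_rate(baud_rate, bits_per_signal_unit):
    """Bit rate (bps) = baud rate (signal units per second) x bits per signal unit."""
    return baud_rate * bits_per_signal_unit

def baud_rate(bit_rate, bits_per_signal_unit):
    """Baud rate (baud/s) = bit rate (bps) / bits per signal unit."""
    return bit_rate / bits_per_signal_unit

print(bit_rate(1000, 4))   # 4000 bps: 1000 signal units/s carrying 4 bits each
print(baud_rate(3000, 6))  # 500.0 baud/s: a 3000 bps signal carrying 6 bits per unit
```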

Example
The bit rate of a signal is 3000 bps. If each signal unit carries 6 bits, what is the baud rate?

Solution Baud rate = 3000 / 6 = 500 baud/s

Example
Find the minimum bandwidth for an ASK signal transmitting at 2000 bps. The transmission mode is half-duplex.

Solution
In ASK system the baud rate and bit rate are the same. The baud rate is therefore 2000. An ASK signal requires a minimum bandwidth equal to its baud rate. Therefore, the minimum bandwidth is 2000 Hz.

Example
Given a bandwidth of 5000 Hz for an ASK system signal, what are the baud rate and bit rate?

Solution
In ASK the baud rate is the same as the bandwidth, which means the baud rate is 5000. But because the baud rate and the bit rate are also the same for ASK, the bit rate is 5000 bps.
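Both ASK examples rely on the same facts: for ASK, the baud rate equals the bit rate, and the minimum bandwidth equals the baud rate. A small sketch of both directions (the function names are mine):

```python
def ask_min_bandwidth(bit_rate_bps):
    """For ASK the baud rate equals the bit rate, and the minimum bandwidth (Hz) equals the baud rate."""
    return bit_rate_bps

def ask_rates_from_bandwidth(bandwidth_hz):
    """For ASK the baud rate equals the bandwidth, and the bit rate equals the baud rate."""
    return bandwidth_hz, bandwidth_hz  # (baud rate in baud/s, bit rate in bps)

print(ask_min_bandwidth(2000))         # 2000 Hz for a 2000 bps ASK signal
print(ask_rates_from_bandwidth(5000))  # (5000, 5000): 5000 baud/s and 5000 bps
```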

Quadrature amplitude modulation (QAM) is a combination of ASK and PSK so that a maximum contrast between each signal unit (bit, dibit, tribit, and so on) is achieved.

Bit and baud rate comparison


Modulation        Units     Bits/Baud   Baud rate   Bit Rate
ASK, FSK, 2-PSK   Bit       1           N           N
4-PSK, 4-QAM      Dibit     2           N           2N
8-PSK, 8-QAM      Tribit    3           N           3N
16-QAM            Quadbit   4           N           4N
32-QAM            Pentabit  5           N           5N
64-QAM            Hexabit   6           N           6N
128-QAM           Septabit  7           N           7N
256-QAM           Octabit   8           N           8N
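The table reduces to a single rule, bit rate = baud rate x bits per baud. A sketch of a lookup based on it (the dictionary and function are mine, written only to mirror the table above):

```python
# Bits carried per signal unit (baud), as in the comparison table above.
BITS_PER_BAUD = {
    "ASK": 1, "FSK": 1, "2-PSK": 1,
    "4-PSK": 2, "4-QAM": 2,
    "8-PSK": 3, "8-QAM": 3,
    "16-QAM": 4, "32-QAM": 5, "64-QAM": 6,
    "128-QAM": 7, "256-QAM": 8,
}

def bit_rate_for(modulation, baud_rate_n):
    """Bit rate = N (baud rate) x bits per baud for the given modulation."""
    return baud_rate_n * BITS_PER_BAUD[modulation]

print(bit_rate_for("16-QAM", 1000))  # 4000 bps when N = 1000 baud/s
```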

Information
What does the word information mean? There is no exact definition; however:
- Information carries new, specific knowledge, which is definitely new for its recipient.
- Information is always carried by some specific carrier in different forms (letters, digits, specific symbols, sequences of digits, letters, and symbols, etc.).
- Information is meaningful only if the recipient is able to interpret it.

Information, when materialized, becomes a message. Information is always about something (the size of a parameter, the occurrence of an event, etc.). Viewed in this manner, information does not have to be accurate; it may be a truth or a lie. Even a disruptive noise used to inhibit the flow of communication and create misunderstanding would, in this view, be a form of information. Generally speaking, however, as the amount of information in the received message increases, the message becomes more accurate.

Information
- More information means less predictability
- Less information means more predictability

Example
1. The sun will rise tomorrow.
2. The FEU basketball team will win against the Ateneo basketball team the next time they play.
Question: Which sentence contains the most information?

Information Theory
How can we measure the amount of information? How can we ensure the correctness of information? What should we do if information gets corrupted by errors? How much memory does it require to store information?

Basic answers to these questions, which formed a solid background for modern information theory, were given by the great American mathematician, electrical engineer, and computer scientist Claude E. Shannon in his paper "A Mathematical Theory of Communication," published in The Bell System Technical Journal in October 1948.

Claude Elwood Shannon (1916-2001)

The father of information theory
The father of practical digital circuit design theory
Bell Laboratories (1941-1972), MIT (1956-2001)

Information Content
What is the information content of any message? Shannon's answer is: the information content of a message consists simply of the number of 1s and 0s it takes to transmit it.


Information Content
Hence, the elementary unit of information is a binary unit: a bit, which can be either 1 or 0; true or false; yes or no; black or white; etc. One of the basic postulates of information theory is that information can be treated like a measurable physical quantity, such as density or mass.

Information Content
Suppose you flip a coin one million times and write down the sequence of results. If you want to communicate this sequence to another person, how many bits will it take? If it's a fair coin, the two possible outcomes, heads and tails, occur with equal probability. Therefore each flip requires 1 bit of information to transmit. To send the entire sequence will require one million bits.


Information Content
Suppose the coin is biased so that heads occur only 1/4 of the time and tails occur 3/4 of the time. Then the entire sequence can be sent in about 811,300 bits, on average. This would seem to imply that each flip of the coin requires just 0.8113 bits to transmit. How can you transmit a coin flip in less than one bit, when the only language available is that of zeros and ones?
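The 0.8113 figure is the entropy of the biased coin; a quick check of it (a minimal sketch, with the probabilities taken from the example above):

```python
from math import log2

def binary_entropy(p):
    """Average bits per flip for a coin with P(heads) = p (Shannon entropy)."""
    return -(p * log2(p) + (1 - p) * log2(1 - p))

h = binary_entropy(0.25)
print(round(h, 4))           # 0.8113 bits per flip
print(round(h * 1_000_000))  # about 811,278 bits -- the ~811,300 quoted above
```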

Obviously, you can't. But if the goal is to transmit an entire sequence of flips, and the distribution is biased in some way, then you can use your knowledge of the distribution to select a more efficient code. Another way to look at it is: a sequence of biased coin flips contains less "information" than a sequence of unbiased flips, so it should take fewer bits to transmit.
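One way to see how knowledge of the distribution buys efficiency is to encode flips in blocks. The sketch below uses a toy prefix code for pairs of flips (chosen only for illustration; it is not Shannon's construction) and already averages fewer than one channel bit per flip:

```python
from itertools import product

# Toy prefix code for PAIRS of flips of the biased coin (P(H) = 1/4, P(T) = 3/4):
# common outcomes get short codewords, rare outcomes get long ones.
code = {"TT": "0", "TH": "10", "HT": "110", "HH": "111"}
prob = {"H": 0.25, "T": 0.75}

expected_bits_per_pair = sum(
    prob[a] * prob[b] * len(code[a + b]) for a, b in product("HT", repeat=2)
)
print(expected_bits_per_pair / 2)  # 0.84375 bits per flip -- already below 1
```

Longer blocks and better codes push the average closer to the 0.8113-bit entropy limit.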

Information Content
Information theory regards information as only those symbols that are uncertain to the receiver. For years, people have sent telegraph messages leaving out non-essential words such as "a" and "the." In the same vein, predictable symbols can be left out, as in the sentence, "only information essential to understanding must be transmitted." Shannon made clear that uncertainty is the very commodity of communication.

Information Content
Suppose we transmit a long sequence of one million bits corresponding to the first example. What should we do if some errors occur during this transmission? And if the length of the sequence to be transmitted or stored is even larger than 1 million bits, say 1 billion bits, what should we do then?


Two main questions of Information Theory


What to do if information gets corrupted by errors? How much memory does it require to store data? Both questions were asked, and to a large degree answered, by Claude Shannon in his seminal 1948 article: use error correction and data compression.

Shannon's basic principles of Information Theory


Shannon's theory told engineers how much information could be transmitted over the channels of an ideal system. He also spelled out mathematically the principles of data compression, which recognize, as the end of this sentence demonstrates, that "only information essential to understanding must be transmitted." He also showed how we could transmit information over noisy channels at error rates we can control.

Why is Information Theory Important?


Thanks in large measure to Shannon's insights, digital systems have come to dominate the world of communications and information processing.
- Modems
- Satellite communications
- Data storage
- Deep space communications
- Wireless technology


Information Theory
- Quantification of information

Channels
A channel is used to get information across:
Source -> 0,1,1,0,0,1,1 -> binary channel -> Receiver

Many systems act like channels. Some obvious ones: phone lines, Ethernet cables. Less obvious ones: the air when speaking, TV screen when watching, paper when writing an article, etc. All these are physical devices and hence prone to errors.

Noisy Channels
A noiseless binary channel transmits bits without error (0 -> 0, 1 -> 1).

A noisy, symmetric binary channel applies a bit-flip 0 <-> 1 with probability p and transmits the bit correctly with probability 1 - p.

What should we do if we have a noisy channel and we want to send information across it reliably?
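A binary symmetric channel is easy to simulate. The sketch below (the function name and the flip probability p = 0.1 are illustrative choices, not from the text) shows that roughly 10% of the bits arrive corrupted:

```python
import random

def binary_symmetric_channel(bits, p, rng):
    """Flip each bit with probability p; pass it through unchanged otherwise."""
    return [b ^ 1 if rng.random() < p else b for b in bits]

rng = random.Random(0)
message = [rng.randint(0, 1) for _ in range(10_000)]
received = binary_symmetric_channel(message, p=0.1, rng=rng)
errors = sum(m != r for m, r in zip(message, received))
print(errors / len(message))  # close to 0.1 -- about 10% of the bits are flipped
```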

Error Correction pre-Shannon


Primitive error correction: instead of sending 0 and 1, send 000 and 111, repeating each bit three times. The receiver takes the majority of the received bit values as the intended value of the sender.
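A minimal sketch of this repetition scheme over a noisy channel (the helper names and the p = 0.1 error probability are mine):

```python
import random
from collections import Counter

def encode_repetition(bits, n=3):
    """Repeat every bit n times: 0 -> 000, 1 -> 111 for n = 3."""
    return [b for b in bits for _ in range(n)]

def decode_repetition(bits, n=3):
    """Take the majority vote over each block of n received bits."""
    return [Counter(bits[i:i + n]).most_common(1)[0][0]
            for i in range(0, len(bits), n)]

rng = random.Random(1)
flip = lambda b, p: b ^ 1 if rng.random() < p else b  # binary symmetric channel, per bit

message = [rng.randint(0, 1) for _ in range(10_000)]
received = [flip(b, 0.1) for b in encode_repetition(message)]
decoded = decode_repetition(received)
print(sum(m != d for m, d in zip(message, decoded)) / len(message))
# roughly 0.028 = 3*p^2*(1-p) + p^3, much lower than the raw 10% error rate
```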


Channel Rate
When correcting errors, we have to be mindful of the rate: the number of information bits carried per transmitted bit (in the previous example the rate was 1/3). If we want to send data with arbitrarily small error in this way, we need arbitrarily low rates r, which is costly.


Error Correction by Shannon


Shannon's basic observations:
- Correcting single bits is very wasteful and inefficient.
- Instead, we should correct blocks of bits.


A model for a Communication System


Communication systems are of a statistical nature. That is, the performance of the system can never be described in a deterministic sense; it is always given in statistical terms. A source is a device that selects and transmits sequences of symbols from a given alphabet. Each selection is made at random, although it may be governed by some statistical rule.

A model for a Communication System


The channel transmits the incoming symbols to the receiver. The performance of the channel is also based on laws of chance. If the source transmits a symbol A with probability P{A}, and the channel lets the symbol A through with probability P{A|A}, then the probability of transmitting A and receiving A is P{A}P{A|A}.
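A worked instance with made-up numbers (not from the text): if the source emits A half the time and the channel delivers it intact 90% of the time, the joint probability is simply the product:

```python
p_A = 0.5          # P{A}: probability the source transmits A
p_A_given_A = 0.9  # P{A|A}: probability the channel delivers A correctly
print(p_A * p_A_given_A)  # 0.45 -- probability of transmitting A and receiving A
```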

A model for a Communication System


The channel is generally lossy: a part of the transmitted content does not reach its destination or it reaches the destination in a distorted form.


A model for a Communication System


A very important task is the minimization of the loss and the optimum recovery of the original content when it is corrupted by the effect of noise. A method that is used to improve the efficiency of the channel is called encoding. An encoded message is less sensitive to noise.


A model for a Communication System


Decoding is employed to transform the encoded messages into the original form, which is acceptable to the receiver.

Encoding: F: I -> F(I)
Decoding: F^-1: F(I) -> I
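A minimal sketch of such an encode/decode pair (here F appends an even-parity bit; this particular scheme is my illustration, not one prescribed by the text):

```python
def encode(bits):
    """F: append an even-parity bit so that single-bit errors become detectable."""
    return bits + [sum(bits) % 2]

def decode(coded):
    """F^-1: check the parity bit, then recover the original message."""
    *message, parity = coded
    if sum(message) % 2 != parity:
        raise ValueError("parity error: the message was corrupted in the channel")
    return message

original = [1, 0, 1, 1, 0]
assert decode(encode(original)) == original  # F^-1(F(I)) = I when no errors occur
```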


A Quantitative Measure of Information


Suppose we have to select some equipment from a catalog which lists n distinct models {x_1, x_2, ..., x_n}. The desired amount of information I(x_k) associated with the selection of a particular model x_k must be a function of the probability of choosing x_k:

I(x_k) = f(P{x_k})

A Quantitative Measure of Information


If, for simplicity, we assume that each one of these models is selected with equal probability, then the desired amount of information is only a function of n:

I_1(x_k) = f(1/n)


A Quantitative Measure of Information


If each piece of equipment can be ordered in one of m different colors and the selection of colors is also equiprobable, then the amount of information associated with the selection of a color c_j is:

I_2(c_j) = f(P{c_j}) = f(1/m)


A Quantitative Measure of Information


The selection may be done in two ways:

Select the equipment and then the color, independently of each other:

I(x_k & c_j) = I_1(x_k) + I_1(c_j) = f(1/n) + f(1/m)

Select the equipment and its color at the same time, as one selection from mn possible choices:

I(x_k & c_j) = f(1/(mn))

A Quantitative Measure of Information


Since these amounts of information are identical, we obtain:

f(1/n) + f(1/m) = f(1/(mn))

Among several solutions of this functional equation, the most important for us is:

f(x) = -log x

Thus, when a statistical experiment has n equiprobable outcomes, the average amount of information associated with an outcome is log n.
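A quick numerical check of the additivity property and the log n measure (a sketch; n = 8 and m = 4 are arbitrary example values, and the logarithm is taken base 2 so the unit is the bit):

```python
from math import log2, isclose

f = lambda x: -log2(x)  # f(x) = -log x, base 2

n, m = 8, 4
assert isclose(f(1 / n) + f(1 / m), f(1 / (n * m)))  # f(1/n) + f(1/m) = f(1/(nm))
print(f(1 / n))  # 3.0 bits: the information in one of 8 equiprobable choices (log2 8)
```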

A Quantitative Measure of Information


The logarithmic information measure has the desirable property of additivity for independent statistical experiments. The simplest case to consider is a selection between two equiprobable events. The amount of information associated with the selection of one of two equiprobable events is -log2(1/2) = log2(2) = 1, and it provides the unit of information known as a bit.

Information Entropy
- A key measure of information
- Expressed as the average number of bits needed for storage or communication

Entropy
- Quantifies the uncertainty involved in a random variable. Ex.: a coin flip has less entropy; the roll of a die has more entropy.
- Entropy units: bits/second, bits/symbol, or unitless
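A short sketch comparing the two examples (fair coin vs. fair six-sided die) with the standard Shannon entropy formula:

```python
from math import log2

def entropy(probabilities):
    """Shannon entropy H = -sum(p * log2(p)), in bits per symbol."""
    return -sum(p * log2(p) for p in probabilities if p > 0)

print(entropy([1 / 2] * 2))  # 1.0 bit     -- fair coin flip (less entropy)
print(entropy([1 / 6] * 6))  # ~2.585 bits -- roll of a fair die (more entropy)
```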
