Capacity of an AWGN channel

1 Shannon theory

1.3 Capacity

1.3.2 Capacity of an AWGN channel

Now we study the capacity of the analog AWGN channel, assuming that we do not use any digital modulator. Then the situation is the following (see figure 1.4):

Fig.1.4: Block diagram for the AWGN channel

• An analog symbol ξ with a given probability density function f_ξ(x) is transmitted over the channel; it is assumed that the variance of ξ is σ_ξ² and that its mean μ_ξ is zero, but there are no further restrictions on f_ξ(x).

• The AWGN channel adds ν, a Gaussian random variable with variance σ_ν² and zero mean (the probability density function of ν is denoted as f_ν(x)).

• The receiver gets η = ξ + ν, a random variable with probability density function f_η(x) = f_ξ(x) ∗ f_ν(x), where ∗ stands for convolution (a minimal simulation sketch of this model follows below).
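As a quick numerical illustration of this model (a minimal sketch, not part of the original text; the variances below are arbitrary example values), one can generate a Gaussian ξ, add independent Gaussian noise ν, and verify that the variance of η = ξ + ν is close to σ_ξ² + σ_ν²:

```python
import numpy as np

rng = np.random.default_rng(0)

sigma2_xi = 2.0    # example input variance (arbitrary)
sigma2_nu = 0.5    # example noise variance (arbitrary)
n = 1_000_000      # number of simulated channel uses

xi = rng.normal(0.0, np.sqrt(sigma2_xi), n)   # zero-mean input symbols
nu = rng.normal(0.0, np.sqrt(sigma2_nu), n)   # AWGN, independent of xi
eta = xi + nu                                 # channel output

print("empirical var(eta):  ", eta.var())               # ~2.5
print("theoretical variance:", sigma2_xi + sigma2_nu)    # 2.5
```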

For the case of the analog AWGN channel, the capacity is C = max I(ξ;η) = max [h(η) − h(η|ξ)]; since h(η|ξ) depends only on the additive noise and not on the input distribution, the capacity is obtained by maximizing just h(η). But we know from section 1.1.1.2 that the maximum entropy of an analog source X with a given variance is obtained when the source has a Gaussian probability density function. In particular, we showed that, for a Gaussian source x,

β„Ž(π‘₯) =1

2log2(2πœ‹π‘’πœŽπ‘₯2)

where 𝜎π‘₯2 is the variance of π‘₯. In this case, if the source ΞΎ is Gaussian with zero mean, then also πœ‚ = πœ‰ + 𝜈 is Gaussian, being the sum of two statistically independent Gaussian random variable, and πœ‚ has mean equal to the sum of the means and variance equal to the sum of variance of πœ‰ and 𝜈. So if πœ‰ has variance

β„Ž(πœ‚) =1

2log2[2πœ‹π‘’(πœŽπœ‰2+ 𝜎𝜈2)]

and this is the maximum value of h(η) (for the given and fixed σ_ξ²).

Let us then complete the evaluation by computing the conditional entropy h(η|ξ). Given ξ = x, the output η is Gaussian with mean x and variance σ_ν², so

𝑓η|ΞΎ(𝑦|π‘₯) = 1

√2πœ‹πœŽπœˆ2𝑒π‘₯𝑝 {βˆ’(𝑦 βˆ’ π‘₯)2 2𝜎𝜈2 } β„Ž(πœ‚|πœ‰) = βˆ’ ∫ π‘“βˆž Ξ·|ΞΎ(𝑀|𝑒)

βˆ’βˆž

log2𝑓η|ΞΎ(𝑀|𝑒)𝑑𝑀 = (1

2log2[2πœ‹π‘’πœŽπœˆ2])

Overall, when the input is Gaussian, the mutual information is

I(\xi;\eta) = \frac{1}{2}\log_2\!\left[2\pi e\left(\sigma_\xi^2+\sigma_\nu^2\right)\right] - \frac{1}{2}\log_2\!\left(2\pi e\,\sigma_\nu^2\right) = \frac{1}{2}\log_2\frac{\sigma_\xi^2+\sigma_\nu^2}{\sigma_\nu^2}

But this is also the capacity of the AWGN channel:

C = \frac{1}{2}\log_2\frac{\sigma_\xi^2+\sigma_\nu^2}{\sigma_\nu^2} = \frac{1}{2}\log_2\!\left(1+\frac{\sigma_\xi^2}{\sigma_\nu^2}\right)

So, each time the AWGN channel is used, it carries at most C information bits, and C depends on the signal-to-noise ratio σ_ξ²/σ_ν²: if the noise variance decreases or the source variance increases, the capacity increases.
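As a small numerical sketch (the variances are arbitrary example values, not taken from the text), the capacity per channel use can be evaluated directly from this formula:

```python
import numpy as np

def awgn_capacity_per_use(sigma2_xi: float, sigma2_nu: float) -> float:
    """C = 1/2 * log2(1 + sigma_xi^2 / sigma_nu^2), in information bits per channel use."""
    return 0.5 * np.log2(1.0 + sigma2_xi / sigma2_nu)

# Example: SNR = 4 (about 6 dB) gives C = 0.5 * log2(5), roughly 1.16 bits per channel use.
print(awgn_capacity_per_use(sigma2_xi=2.0, sigma2_nu=0.5))
```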

Let us now consider not just the transmission of one analog symbol ξ, but of a sequence of symbols, and let us limit the problem to the case of a bandlimited channel, in particular a low-pass channel with bandwidth B. Then, only a process ξ(t) with bandwidth at most equal to B can pass through the channel without being distorted, and we can represent the information content of ξ(t) using just its samples, taken at sampling frequency 2B⁶. Then the entropy of the Gaussian source, measured per second, is

β„Ž(πœ‰(𝑑)) = 2π΅β„Ž(πœ‰) = 𝐡 log2(2πœ‹π‘’πœŽπœ‰2)

where πœŽπœ‰2 is the variance of the process; remember that, if the process is statistically and ergodic, which we will assume then πœŽπœ‰2 does not change with time and is equal to mean power π‘ƒπœ‰ of the process.

The channel output process η(t) is the sum of ξ(t) and the white Gaussian noise ν(t), which has power spectral density N_0/2. The receiver has an initial low-pass filter followed by a sampler at frequency 2B, so that the input of the detector is a sequence of samples, generated at a rate of 2B samples per second, each being the sum of a sample of ξ(t) and a noise random variable with variance

\sigma_\nu^2 = \frac{N_0}{2}\,2B = N_0 B

The entropy of η(t) = ξ(t) + ν(t), sampled at rate 2B, is

β„Ž(πœ‚(𝑑)) = 2π΅β„Ž(πœ‚) = 𝐡 log2[2πœ‹π‘’(πœŽπœ‰2+ 𝜎𝜈2)]

The conditional entropy is h(η(t)|ξ(t)) = B log₂(2πe σ_ν²), as before.

The AWGN channel capacity is then

C' = h(\eta(t)) - h(\eta(t)|\xi(t)) = B\log_2\frac{\sigma_\xi^2+\sigma_\nu^2}{\sigma_\nu^2}

We can substitute the values of the variances and obtain

C' = B\log_2\!\left(1 + \frac{P_\xi}{N_0 B}\right)

where now the unit of measure of C′ is information bits per second (rather than information bits per channel use).

⁶ According to the sampling theorem: if a signal x(t) has bandwidth B, it is possible to recover x(t) exactly from its samples, provided that the sampling frequency is larger than 2B.

In brief, comparing C and C′:

• C is the capacity per channel use:

C = \frac{1}{2}\log_2\!\left(1 + \frac{P_\xi}{N_0 B}\right) \quad \text{[information bits per channel use]}

• C′ is the capacity measured in information bits per second, obtained by using the channel 2B times per second (if we did not use the low-pass filter at the receiver, the noise power entering the sampler would be unbounded and the capacity would certainly be zero):

C' = B\log_2\!\left(1 + \frac{P_\xi}{N_0 B}\right) \quad \text{[information bits per second]}
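As a hedged numerical sketch (the bandwidth and SNR below are illustrative values, not taken from the text), C′ can be computed directly; with B = 3000 Hz and P_ξ/(N_0 B) = 1000 (30 dB) one obtains roughly 30 kbit/s, the classic order of magnitude for a telephone-line channel:

```python
import numpy as np

def awgn_capacity_bps(bandwidth_hz: float, signal_power: float, n0: float) -> float:
    """C' = B * log2(1 + P_xi / (N0 * B)), in information bits per second."""
    snr = signal_power / (n0 * bandwidth_hz)
    return bandwidth_hz * np.log2(1.0 + snr)

# Illustrative numbers: B = 3 kHz and SNR = 30 dB (linear ratio 1000).
B = 3000.0
N0 = 1.0                    # normalized noise spectral density (example value)
P = 1000.0 * N0 * B         # power chosen so that P / (N0 * B) = 1000
print(awgn_capacity_bps(B, P, N0))   # about 2.99e4 bit/s
```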

Let us see if we can relate the discrete channel capacities to the AWGN channel capacity. We can imagine that the process ξ(t) is the output of a digital modulator that generates bits (real bits "1" or "0") at rate R_b bits/s, so that the power P_ξ can be expressed as P_ξ = E_b/T_b = E_b R_b, where E_b is the energy per bit. So we have, for the AWGN channel,

C' = B\log_2\!\left(\frac{E_b R_b}{N_0 B} + 1\right) \qquad \text{or} \qquad \frac{C'}{B} = \log_2\!\left(\frac{E_b R_b}{N_0 B} + 1\right).

It is not possible to get an error probability equal to zero if the input entropy is larger than the channel capacity. At most, one bit transmitted by the digital modulator carries one information bit, so we can say that the source entropy is H(X) = R_b information bits per second; if we assume that we are working at the limit, i.e. in the best case, with H(X) = C′, we have C'/B = R_b/B, which leads to

\frac{R_b}{B} = \log_2\!\left(\frac{E_b R_b}{N_0 B} + 1\right)

which provides a relationship between the signal-to-noise ratio E_b/N_0 and the modulation (spectral) efficiency R_b/B (measured in bits/second per hertz). In particular, we can write

\frac{E_b}{N_0} = \frac{2^{R_b/B} - 1}{R_b/B}

• If R_b/B = 1, then E_b/N_0 = 1 (0 dB);

• if R_b/B → ∞, then E_b/N_0 → ∞;

• if R_b/B → 0, then E_b/N_0 → ln 2 (−1.6 dB):

\lim_{R_b/B \to 0} \frac{2^{R_b/B} - 1}{R_b/B} = \lim_{R_b/B \to 0} \frac{e^{(R_b/B)\ln 2} - 1}{R_b/B} = \lim_{R_b/B \to 0} \frac{1 + (R_b/B)\ln 2 - 1}{R_b/B} = \ln 2 \approx 0.693

The last limit is quite interesting: it states that it is possible to transmit with error probability arbitrarily close to zero as long as the signal-to-noise ratio satisfies E_b/N_0 > −1.6 dB, provided that the bandwidth B is infinite (so that R_b/B → 0). Note that E_b/N_0 = −1.6 dB corresponds to the case in which the noise variance N_0/2 is equal to 0.72 E_b, which is really very high. Another interesting consideration is that we can trade energy for bandwidth: if we increase the bandwidth, we can reduce E_b/N_0, and vice versa.
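A short numerical check of this behaviour (added here as a sketch; the spectral-efficiency values are arbitrary samples) evaluates E_b/N_0 = (2^{R_b/B} − 1)/(R_b/B) in dB and shows the approach to the −1.6 dB limit as R_b/B → 0:

```python
import numpy as np

def min_ebn0_db(spectral_efficiency: float) -> float:
    """Minimum Eb/N0 (in dB) for reliable transmission at the given Rb/B."""
    eta = spectral_efficiency
    return 10.0 * np.log10((2.0 ** eta - 1.0) / eta)

for eta in (8.0, 4.0, 2.0, 1.0, 0.5, 0.1, 0.01):
    print(f"Rb/B = {eta:5.2f}  ->  Eb/N0 >= {min_ebn0_db(eta):6.2f} dB")
# As Rb/B -> 0 the required Eb/N0 tends to 10*log10(ln 2), about -1.59 dB.
```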

Note that it is not possible to get an arbitrarily small error probability if, having fixed E_b/N_0, the spectral efficiency is higher than the value shown on the curve of Fig. 1.5; similarly, it is not possible if, having fixed the spectral efficiency R_b/B, the signal-to-noise ratio is lower than the value shown on the curve of Fig. 1.5. In principle, any transmission system specified by a pair of values (E_b/N_0, R_b/B) below the curve in Fig. 1.5 can work with error probability arbitrarily close to zero.
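Based on this bound, the sketch below (a hypothetical helper, not part of the original text) checks whether an operating point (E_b/N_0 in dB, R_b/B) lies in the achievable region, i.e. on or below the capacity curve of Fig. 1.5:

```python
import numpy as np

def is_achievable(ebn0_db: float, spectral_efficiency: float) -> bool:
    """True if the pair (Eb/N0, Rb/B) is on or below the AWGN capacity curve."""
    ebn0_linear = 10.0 ** (ebn0_db / 10.0)
    # Minimum Eb/N0 required by the Shannon bound at this spectral efficiency.
    required = (2.0 ** spectral_efficiency - 1.0) / spectral_efficiency
    return ebn0_linear >= required

print(is_achievable(ebn0_db=10.0, spectral_efficiency=2.0))   # True  (10 dB > ~1.76 dB)
print(is_achievable(ebn0_db=0.0,  spectral_efficiency=2.0))   # False (needs ~1.76 dB)
```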

Fig. 1.5: Shannon channel capacity curve: spectral efficiency R_b/B versus E_b/N_0 for the AWGN channel (channel capacity limit)

In summary, the Shannon channel capacity formula gives the theoretical tightest upper bound on the information rate that can be communicated with an arbitrarily low error rate, using an average received signal power P_ξ, through an analog communication channel subject to additive white Gaussian noise of power N_0 B:

C' = B\log_2\!\left(1 + \frac{P_\xi}{N_0 B}\right)

where

• C′ is the channel capacity in information bits per second, a theoretical upper bound on the net bit rate (information rate) excluding error-correction codes;

• B is the bandwidth of the channel in hertz (passband bandwidth in the case of a bandpass signal);

• P_ξ is the average received signal power over the bandwidth (in the case of a carrier-modulated passband transmission), measured in watts (or volts squared);

• N_0 is the one-sided power spectral density of the noise and interference, measured in watts per hertz, so that N_0 B is the average noise power over the bandwidth, measured in watts (or volts squared);

• P_ξ/(N_0 B) is the signal-to-noise ratio (SNR) of the communication signal to the noise and interference at the receiver (expressed as a linear power ratio, not in logarithmic decibels).
