
Additive white Gaussian noise


Additive white Gaussian noise (AWGN) is a channel model in which the only impairment to communication is a linear addition of wideband or white noise with a constant spectral density (expressed as watts per hertz of bandwidth) and a Gaussian distribution of amplitude. The model does not account for fading, frequency selectivity, interference, nonlinearity or dispersion. However, it produces simple and tractable mathematical models which are useful for gaining insight into the underlying behavior of a system before these other phenomena are considered.

Wideband Gaussian noise comes from many natural sources, such as the thermal vibrations of atoms in conductors (referred to as thermal noise or Johnson-Nyquist noise), shot noise, black body radiation from the earth and other warm objects, and from celestial sources such as the Sun.

The AWGN channel is a good model for many satellite and deep space communication links. It is not a good model for most terrestrial links because of multipath, terrain blocking, interference, etc. However, for terrestrial path modeling, AWGN is commonly used to simulate background noise of the channel under study, in addition to the multipath, terrain blocking, interference, ground clutter and self-interference that modern radio systems encounter in terrestrial operation.
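As a rough illustration of the model (not part of the original article), the Python sketch below adds zero-mean Gaussian noise at a chosen signal-to-noise ratio to an arbitrary input; the function name `add_awgn`, the test sine wave, and the 10 dB SNR are illustrative choices.

```python
import numpy as np

def add_awgn(signal: np.ndarray, snr_db: float, rng=None) -> np.ndarray:
    """Return `signal` plus white Gaussian noise at the requested SNR (in dB).

    The noise is zero-mean, with a flat spectrum and a Gaussian amplitude
    distribution -- the two defining properties of AWGN.
    """
    if rng is None:
        rng = np.random.default_rng()
    signal_power = np.mean(signal ** 2)
    noise_power = signal_power / (10 ** (snr_db / 10))
    noise = rng.normal(0.0, np.sqrt(noise_power), size=signal.shape)
    return signal + noise

# Example: a sine wave observed through a 10 dB AWGN channel.
t = np.linspace(0, 1, 1000, endpoint=False)
clean = np.sin(2 * np.pi * 5 * t)
noisy = add_awgn(clean, snr_db=10)
```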

Channel capacity
The AWGN channel is represented by a series of outputs $Y_i$ at discrete-time event index $i$. $Y_i$ is the sum of the input $X_i$ and the noise $Z_i$, where $Z_i$ is independent and identically distributed and drawn from a zero-mean normal distribution with variance $N$ (the noise). The $Z_i$ are further assumed not to be correlated with the $X_i$:

$$Y_i = X_i + Z_i, \qquad Z_i \sim \mathcal{N}(0, N).$$

The capacity of the channel is infinite unless the noise $N$ is nonzero and the $X_i$ are sufficiently constrained. The most common constraint on the input is the so-called "power" constraint, requiring that for a codeword $(x_1, x_2, \dots, x_k)$ transmitted through the channel, we have

$$\frac{1}{k}\sum_{i=1}^{k} x_i^2 \le P,$$

where $P$ represents the maximum channel power. Therefore, the channel capacity for the power-constrained channel is given by

$$C = \max_{f(x)\,:\,E[X^2] \le P} I(X;Y),$$

where $f(x)$ is the distribution of $X$. Expand $I(X;Y)$, writing it in terms of the differential entropy:

$$I(X;Y) = h(Y) - h(Y \mid X) = h(Y) - h(X + Z \mid X) = h(Y) - h(Z \mid X).$$

But $X$ and $Z$ are independent, therefore

$$I(X;Y) = h(Y) - h(Z).$$

Evaluating the differential entropy of a Gaussian gives

$$h(Z) = \frac{1}{2}\log\bigl(2\pi e N\bigr).$$

Because $X$ and $Z$ are independent and their sum gives $Y$,

$$E[Y^2] = E[(X+Z)^2] = E[X^2] + 2E[X]E[Z] + E[Z^2] \le P + N.$$

From this bound, we infer from a property of the differential entropy that

$$h(Y) \le \frac{1}{2}\log\bigl(2\pi e (P+N)\bigr).$$

Therefore the channel capacity is given by the highest achievable bound on the mutual information:

$$I(X;Y) \le \frac{1}{2}\log\bigl(2\pi e (P+N)\bigr) - \frac{1}{2}\log\bigl(2\pi e N\bigr),$$

where $I(X;Y)$ is maximized when $X \sim \mathcal{N}(0, P)$. Thus the channel capacity $C$ for the AWGN channel is given by

$$C = \frac{1}{2}\log\Bigl(1 + \frac{P}{N}\Bigr).$$
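As a minimal illustration (not from the article), the capacity formula can be evaluated numerically; using a base-2 logarithm gives the result in bits per channel use. The function name and example values below are arbitrary.

```python
import math

def awgn_capacity(P: float, N: float) -> float:
    """Capacity of the power-constrained AWGN channel, in bits per channel use.

    P: average transmit power constraint
    N: noise variance
    """
    return 0.5 * math.log2(1 + P / N)

# Example: a linear SNR of P/N = 10 (10 dB) gives about 1.73 bits per channel use.
print(awgn_capacity(P=10.0, N=1.0))
```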

Channel capacity and sphere packing


Suppose that we are sending messages through the channel with index ranging from $1$ to $M$, the number of distinct possible messages. If we encode the $M$ messages to $n$ bits, then we define the rate $R$ as

$$R = \frac{\log M}{n}.$$

A rate is said to be achievable if there is a sequence of codes so that the maximum probability of error tends to zero as $n$ approaches infinity. The capacity $C$ is the highest achievable rate.

Consider a codeword of length $n$ sent through the AWGN channel with noise level $N$. When received, the codeword vector variance is now $N$, and its mean is the codeword sent. The received vector is very likely to be contained in a sphere of radius $\sqrt{n(N+\varepsilon)}$ around the codeword sent. If we decode by mapping every received vector onto the codeword at the center of this sphere, then an error occurs only when the received vector is outside of this sphere, which is very unlikely.

Each codeword vector has an associated sphere of received vectors which are decoded to it, and each such sphere must map uniquely onto a codeword. Because these spheres therefore must not intersect, we are faced with the problem of sphere packing. How many distinct codewords can we pack into our $n$-dimensional codeword space? The received vectors have a maximum energy of $n(P+N)$ and therefore must occupy a sphere of radius $\sqrt{n(P+N)}$. Each codeword sphere has radius $\sqrt{nN}$. The volume of an $n$-dimensional sphere is directly proportional to $r^n$, so the maximum number of uniquely decodeable spheres that can be packed into our sphere with transmission power $P$ is

$$\frac{C_n \bigl(n(P+N)\bigr)^{n/2}}{C_n (nN)^{n/2}} = 2^{\frac{n}{2}\log\bigl(1 + \frac{P}{N}\bigr)}.$$

By this argument, the rate $R$ can be no more than $\frac{1}{2}\log\bigl(1 + \frac{P}{N}\bigr)$.
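The counting argument can be checked numerically: since the constant $C_n$ in the $n$-sphere volume cancels in the ratio, the base-2 logarithm of the number of packable spheres divided by $n$ reproduces the $\frac{1}{2}\log_2(1 + P/N)$ bound. The sketch below is only illustrative and its parameter values are arbitrary.

```python
import math

def sphere_packing_bound(n: int, P: float, N: float) -> float:
    """Log2 of the maximum number of non-overlapping decoding spheres:
    the volume ratio of an n-sphere of radius sqrt(n(P+N)) to one of
    radius sqrt(nN). The common volume constant cancels in the ratio."""
    return (n / 2) * math.log2((P + N) / N)

n, P, N = 100, 10.0, 1.0
max_codewords_log2 = sphere_packing_bound(n, P, N)
rate_bound = max_codewords_log2 / n        # bits per channel use
print(rate_bound)                          # ~1.73 = 0.5 * log2(1 + P/N)
```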

Achievability
In this section, we show achievability of the upper bound on the rate from the last section.

A codebook, known to both encoder and decoder, is generated by selecting codewords of length $n$, i.i.d. Gaussian with variance $P - \varepsilon$ and mean zero. For large $n$, the empirical variance of the codebook will be very close to the variance of its distribution, thereby avoiding violation of the power constraint probabilistically.

Received vectors are decoded to a message in the codebook which is uniquely jointly typical. If there is no such message, or if the power constraint is violated, a decoding error is declared.

Let $X^n(i)$ denote the codeword for message $i$, while $Y^n$ is, as before, the received vector. Define the following three events:

1. Event $U$: the power of the received vector is larger than $P$.
2. Event $V$: the transmitted and received codewords are not jointly typical.
3. Event $E_j$: $(X^n(j), Y^n)$ is in $A_\varepsilon^{(n)}$, the typical set, where $j \ne i$, which is to say that the incorrect codeword is jointly typical with the received vector.

An error therefore occurs if $U$, $V$, or any of the $E_j$ occur. By the law of large numbers, $P(U)$ goes to zero as $n$ approaches infinity, and by the joint Asymptotic Equipartition Property the same applies to $P(V)$. Therefore, for sufficiently large $n$, both $P(U)$ and $P(V)$ are each less than $\varepsilon$. Since $X^n(i)$ and $X^n(j)$ are independent for $i \ne j$, we also have that $X^n(j)$ and $Y^n$ are independent, and therefore, by the joint AEP, $P(E_j) = 2^{-n(I(X;Y) - 3\varepsilon)}$.

This allows us to calculate $P_e^{(n)}$, the probability of error, as follows:

$$P_e^{(n)} \le P(U) + P(V) + \sum_{j \ne i} P(E_j) \le \varepsilon + \varepsilon + 2^{nR}\, 2^{-n(I(X;Y) - 3\varepsilon)} \le 3\varepsilon$$

for sufficiently large $n$, provided that $R < I(X;Y) - 3\varepsilon$. Therefore, as $n$ approaches infinity, $P_e^{(n)}$ goes to zero, and there is a code of rate $R$ arbitrarily close to the capacity derived earlier.
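To make the random-coding idea concrete, the following Monte Carlo sketch (not from the article) draws a Gaussian codebook, sends codewords through an AWGN channel, and decodes with a minimum-distance rule as a practical stand-in for the joint-typicality decoder used in the proof. The block length, rate, power, and noise values are arbitrary illustrative choices, and at such a short block length the measured error rate is only suggestive.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters: the rate R is well below the capacity
# C = 0.5 * log2(1 + P/N) ~= 1.73 bits per channel use.
n, R, P, N = 20, 0.5, 10.0, 1.0
M = 2 ** int(n * R)                          # number of messages (1024 here)

# Random Gaussian codebook: i.i.d. zero-mean entries with variance P.
codebook = rng.normal(0.0, np.sqrt(P), size=(M, n))

errors, trials = 0, 500
for _ in range(trials):
    i = rng.integers(M)                                       # message to send
    y = codebook[i] + rng.normal(0.0, np.sqrt(N), size=n)     # AWGN channel
    # Minimum-distance decoding (stand-in for joint-typicality decoding).
    i_hat = np.argmin(np.sum((codebook - y) ** 2, axis=1))
    errors += (i_hat != i)

# With R well below capacity the error rate should be small, though n = 20
# is far from the asymptotic regime the proof relies on.
print(f"empirical error rate at R={R}: {errors / trials:.3f}")
```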

Coding theorem converse


Here we show that rates above the capacity are not achievable.

Suppose that the power constraint is satisfied for a codebook, and further suppose that the messages follow a uniform distribution. Let $W$ be the input messages and $\hat{W}$ the output messages. Thus the information flows as

$$W \longrightarrow X^{(n)}(W) \longrightarrow Y^{(n)} \longrightarrow \hat{W}.$$

Making use of Fano's inequality gives

$$H(W \mid \hat{W}) \le 1 + nR\,P_e^{(n)} = n\varepsilon_n,$$

where $\varepsilon_n \to 0$ as $P_e^{(n)} \to 0$.

Let $X_i$ be the encoded message of codeword index $i$. Then:

$$nR = H(W) = I(W; \hat{W}) + H(W \mid \hat{W}) \le I(X^{(n)}; Y^{(n)}) + n\varepsilon_n \le \sum_{i=1}^{n} I(X_i; Y_i) + n\varepsilon_n.$$

Let $P_i$ be the average power of the codeword at index $i$:

$$P_i = \frac{1}{2^{nR}} \sum_{w} x_i^2(w),$$

where the sum is over all input messages $w$. $X_i$ and $Z_i$ are independent, thus the expectation of the power of $Y_i$ is, for noise level $N$,

$$E[Y_i^2] = P_i + N.$$

And, since the Gaussian distribution maximizes differential entropy for a given variance, we have that

$$h(Y_i) \le \frac{1}{2}\log\bigl(2\pi e (P_i + N)\bigr).$$

Therefore,

$$nR \le \sum_{i=1}^{n} \bigl(h(Y_i) - h(Z_i)\bigr) + n\varepsilon_n \le \sum_{i=1}^{n} \left[\frac{1}{2}\log\bigl(2\pi e (P_i + N)\bigr) - \frac{1}{2}\log\bigl(2\pi e N\bigr)\right] + n\varepsilon_n = \sum_{i=1}^{n} \frac{1}{2}\log\Bigl(1 + \frac{P_i}{N}\Bigr) + n\varepsilon_n.$$

We may apply Jensen's inequality to $\log(1+x)$, a concave (downward) function of $x$, to get

$$\frac{1}{n}\sum_{i=1}^{n} \frac{1}{2}\log\Bigl(1 + \frac{P_i}{N}\Bigr) \le \frac{1}{2}\log\Bigl(1 + \frac{1}{nN}\sum_{i=1}^{n} P_i\Bigr).$$

Because each codeword individually satisfies the power constraint, the average also satisfies the power constraint. Therefore

$$\frac{1}{n}\sum_{i=1}^{n} P_i \le P,$$

which we may apply to simplify the inequality above and get

$$\frac{1}{2}\log\Bigl(1 + \frac{1}{nN}\sum_{i=1}^{n} P_i\Bigr) \le \frac{1}{2}\log\Bigl(1 + \frac{P}{N}\Bigr).$$

Therefore, it must be that $R \le \frac{1}{2}\log\bigl(1 + \frac{P}{N}\bigr) + \varepsilon_n$. As $P_e^{(n)} \to 0$, $\varepsilon_n \to 0$, so $R$ must be less than a value arbitrarily close to the capacity derived earlier.

Effects in time domain


In serial data communications, the AWGN mathematical model is used to model the timing error caused by random jitter (RJ). The graph to the right shows an example of timing errors associated with AWGN. The variable $\Delta t$ represents the uncertainty in the zero crossing. As the amplitude of the AWGN is increased, the signal-to-noise ratio decreases, which results in increased uncertainty $\Delta t$.

[Figure: Zero-Crossings of a Noisy Cosine]

When affected by AWGN, the average number of either positive-going or negative-going zero crossings per second at the output of a narrow bandpass filter whose input is a sine wave is

$$\text{zero crossings per second} = f_0 \sqrt{\frac{\mathrm{SNR} + 1 + \dfrac{B^2}{12 f_0^2}}{\mathrm{SNR} + 1}},$$

where $f_0$ is the center frequency of the filter, $B$ is the filter bandwidth, and $\mathrm{SNR}$ is the signal-to-noise power ratio in linear terms.
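Assuming the expression above, a small helper function (the name `zero_crossing_rate` and the example numbers are illustrative) evaluates it directly:

```python
import math

def zero_crossing_rate(f0: float, B: float, snr: float) -> float:
    """Average number of positive-going (or negative-going) zero crossings
    per second for a sine wave plus bandlimited Gaussian noise at the
    output of a narrow bandpass filter.

    f0:  filter center frequency in Hz
    B:   filter bandwidth in Hz
    snr: signal-to-noise power ratio (linear, not dB)
    """
    return f0 * math.sqrt((snr + 1 + B**2 / (12 * f0**2)) / (snr + 1))

# Example: 10 MHz carrier, 1 MHz bandwidth, 10 dB SNR.
print(zero_crossing_rate(f0=10e6, B=1e6, snr=10**(10 / 10)))
# With no signal (snr -> 0) the rate approaches sqrt(f0**2 + B**2/12);
# at high SNR it approaches f0, the carrier's own zero-crossing rate.
```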

Effects in phasor domain


In modern communication systems, bandlimited AWGN cannot be ignored. When modeling bandlimited AWGN in the phasor domain, statistical analysis reveals that the amplitudes of the real and imaginary contributions are independent variables which follow the Gaussian distribution model. When combined, the resultant phasor's magnitude is a Rayleigh-distributed random variable, while the phase is uniformly distributed from 0 to 2π.

[Figure: AWGN Contributions in the Phasor Domain]

The graph to the right shows an example of how bandlimited AWGN can affect a coherent carrier signal. The instantaneous response of the noise vector cannot be precisely predicted, however its time-averaged response can be statistically predicted. As shown in the graph, we can confidently predict that the noise phasor will reside inside the 1σ circle about 38% of the time, inside the 2σ circle about 86% of the time, and inside the 3σ circle about 98% of the time.
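As a quick sanity check (not from the article), the magnitude of a complex Gaussian noise sample can be simulated and compared against the Rayleigh closed form P(R ≤ kσ) = 1 − exp(−k²/2), which gives roughly 39%, 86%, and 99% for k = 1, 2, 3, consistent with the approximate figures quoted above. The noise standard deviation and sample count below are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(0)
sigma = 1.0            # std. dev. of each (real / imaginary) noise component
n = 1_000_000

# Bandlimited AWGN in the phasor domain: independent Gaussian real and
# imaginary parts; the resulting magnitude is Rayleigh distributed.
noise = rng.normal(0, sigma, n) + 1j * rng.normal(0, sigma, n)
r = np.abs(noise)

for k in (1, 2, 3):
    frac = np.mean(r <= k * sigma)
    # Rayleigh CDF: P(R <= k*sigma) = 1 - exp(-k^2 / 2)
    theory = 1 - np.exp(-k**2 / 2)
    print(f"inside {k}-sigma circle: {frac:.3f} (theory {theory:.3f})")
```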

References

Article Sources and Contributors


Additive white Gaussian noise Source: http://en.wikipedia.org/w/index.php?oldid=562901854 Contributors: AttoRenato, Baba802, Bluemoose, Butlerm, Canis Lupus, Closedmouth, CosineKitty, DVdm, Daniel.kho, Dauberman, Deepak.mr, Fgouget, Funandtrvl, G-Man, GyroMagician, HalfShadow, Hovig berkeley, Ivanov id, Jaakobou, Ka9q, Marshallsumter, Melcombe, Michael Hardy, Nasa-verve, Nguyenngaviet, Oli Filth, Omegatron, Rnt20, Robofish, The Anome, Timo Honkasalo, Waveguy, Wavelength, 91 anonymous edits

Image Sources, Licenses and Contributors


Image:Zero crossing.jpg Source: http://en.wikipedia.org/w/index.php?title=File:Zero_crossing.jpg License: Public Domain Contributors: Kevin McClaning
Image:Noisy Phasor.jpg Source: http://en.wikipedia.org/w/index.php?title=File:Noisy_Phasor.jpg License: Public Domain Contributors: Kevin McClaning

License
Creative Commons Attribution-Share Alike 3.0 //creativecommons.org/licenses/by-sa/3.0/
