Sampling Theorem, Flat-Top Sampling, FSK, Mutual Information
Stage: fourth
Student Signature:
Introduction to sampling
For a continuous-time signal x(t), the Fourier transform X(jω) is defined through the pair
x(t) ↔ X(jω)
The bandwidth of a signal x(t) is defined as the highest positive frequency beyond which the Fourier
transform of the signal is zero. If the bandwidth of a signal is finite, it is called a band-limited signal.
Sampling theorem
If a signal x(t) is band-limited to B Hz, then the sampling theorem states that x(t) can be reconstructed from its
samples provided the signal is sampled at a rate Fs such that
Fs ≥ 2B (2)
where Fs = 1/Ts is the sampling frequency and Ts is the sampling interval (the spacing between consecutive
samples). The lower bound Fs = 2B is called the Nyquist rate. The unit of Fs is samples per second. From (2) we see
that
Ts ≤ 1/(2B)
In words: for exact reconstruction, the sampling rate should be at least 2B samples/sec, or equivalently the sampling
interval should be at most 1/(2B) seconds.
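As a quick numerical illustration of relation (2), the following sketch computes the Nyquist rate and the maximum sampling interval for an assumed example bandwidth of B = 4 kHz (roughly telephone-quality speech; the value is illustrative, not from the text):

```python
# Nyquist rate for an assumed band-limited signal (B = 4 kHz, e.g. telephone speech)
B = 4000.0            # bandwidth in Hz (assumed example value)
Fs_min = 2 * B        # Nyquist rate: minimum sampling frequency, samples/s
Ts_max = 1 / Fs_min   # maximum allowed sampling interval, seconds

print(Fs_min)         # 8000.0 samples per second
print(Ts_max)         # 0.000125 s, i.e. 125 microseconds between samples
```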
Proof:
Consider the spectrum of the band-limited signal x(t) given in Fig. 1. The sampling process, carried out in the time
domain, is the multiplication of the continuous-time signal x(t) with a periodic impulse train of period Ts,
defined as
δ_Ts(t) = Σ_{k=−∞}^{∞} δ(t − kTs)
resulting in the sampled signal xs(t) given by
xs(t) = x(t) × Σ_{k=−∞}^{∞} δ(t − kTs) = Σ_{k=−∞}^{∞} x(kTs) δ(t − kTs) (3)
To see what happens in the frequency domain, we digress briefly to discuss the Fourier transform of
δ_Ts(t) and the multiplication property of the Fourier transform. First, consider the Fourier transform of the
periodic impulse train. Its Fourier series coefficients are easily found to be
a_r = 1/Ts
thus the Fourier series representation is given by
δ_Ts(t) = (1/Ts) Σ_{r=−∞}^{∞} e^{j(2πr/Ts)t}
Fig. 1: spectrum of the band-limited signal x(t).
Noting that the Fourier transform of e^{jω₀t} is 2πδ(ω − ω₀), taking the Fourier transform of the above equation we get
FT{δ_Ts(t)} = (2π/Ts) Σ_{r=−∞}^{∞} δ(ω − 2πr/Ts)
Fig. 2: spectrum of the sampled signal xs(t) and the rectangular reconstruction filter.
By the multiplication property, multiplication in time corresponds to convolution in frequency (scaled by 1/2π), so
Xs(jω) = (1/Ts) Σ_{r=−∞}^{∞} X(j(ω − 2πr/Ts))
i.e. the spectrum of x(t) is replicated every 2π/Ts, as sketched in Fig. 2. To get back x(t) from xs(t), we multiply
Xs(jω) by a rectangular pulse of gain Ts over the passband |ω| ≤ 2πB, as shown in Fig. 2. This process is
called low-pass filtering. Note that
sin(2πBt)/(πt) ↔ rect(ω/(4πB))
The multiplication in the frequency domain corresponds, in the time domain, to the convolution of xs(t) with
Ts sin(2πBt)/(πt):
x(t) = xs(t) * Ts sin(2πBt)/(πt) = Ts Σ_{k=−∞}^{∞} x(kTs) sin(2πB(t − kTs))/(π(t − kTs))
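The reconstruction sum can be checked numerically. The sketch below uses an assumed example signal (a 30 Hz cosine, treated as band-limited to B = 50 Hz and sampled at the Nyquist rate Fs = 2B) and truncates the infinite sum to |k| ≤ 500, so only a small truncation error remains:

```python
import numpy as np

# Numerical check of sinc reconstruction (assumed example: 30 Hz cosine,
# B = 50 Hz, Fs = 2B = 100 samples/s, sum truncated to |k| <= 500).
B = 50.0
Fs = 2 * B
Ts = 1 / Fs
k = np.arange(-500, 501)                      # finite window; ideal sum is infinite
x_k = np.cos(2 * np.pi * 30.0 * k * Ts)       # the samples x(kTs)

def reconstruct(t):
    # x(t) = Ts * sum_k x(kTs) * sin(2*pi*B*(t - k*Ts)) / (pi*(t - k*Ts));
    # with Fs = 2B this kernel is exactly np.sinc((t - k*Ts)/Ts)
    return float(np.sum(x_k * np.sinc((t - k * Ts) / Ts)))

t0 = 0.0137                                   # a point between two sample instants
print(abs(reconstruct(t0) - np.cos(2 * np.pi * 30.0 * t0)))  # small truncation error
```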
Aliasing
If the condition in eq. (2) is not satisfied, the replicated spectra in Fig. 2 overlap, and this effect is called aliasing.
Flat-top sampling
During transmission, noise is introduced at the top of the transmission pulse; it can be removed easily if the
pulse has a flat top. Here the tops of the samples are flat, i.e. they have constant amplitude. Hence it
is called flat-top sampling, or practical sampling. Flat-top sampling makes use of a sample-and-hold circuit.
Theoretically, the sampled signal can be obtained by convolving a rectangular pulse p(t) with the ideally sampled
signal yδ(t), as shown in the diagram:
y(t) = p(t) * yδ(t) ...... (1)
To get the sampled spectrum, take the Fourier transform of both sides of equation (1):
Y(ω) = F.T.[p(t) * yδ(t)]
By the convolution property,
Y(ω) = P(ω) Yδ(ω)
Here P(ω) = T Sa(ωT/2) = 2 sin(ωT/2)/ω, where T is the pulse width.
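The pulse spectrum P(ω) is not flat: it rolls off with frequency, so flat-top sampling attenuates the higher frequencies of the message (the aperture effect). A small sketch of this roll-off, using an assumed pulse width of T = 1 ms:

```python
import numpy as np

# Aperture effect of flat-top sampling: the gain of P(w) = T*Sa(wT/2)
# relative to DC falls as frequency rises (T = 1 ms is an assumed value).
T = 1e-3

def P(w):
    # np.sinc(u) = sin(pi*u)/(pi*u), so T*Sa(w*T/2) = T*np.sinc(w*T/(2*pi))
    return T * np.sinc(w * T / (2 * np.pi))

for f in (0.0, 250.0, 500.0):
    print(f, P(2 * np.pi * f) / T)   # relative gain: 1 at DC, 2/pi at 500 Hz
```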
Frequency Shift Keying
Frequency Shift Keying (FSK) is the digital modulation technique in which the frequency of the carrier
signal varies according to the changes in the digital signal. FSK is a scheme of frequency modulation.
The output of an FSK-modulated wave is high in frequency for a binary High input and low in
frequency for a binary Low input. The frequencies representing binary 1 and binary 0 are called the Mark and
Space frequencies, respectively.
The following image is the diagrammatic representation of an FSK-modulated waveform along with its
input.
To understand how this FSK-modulated wave is obtained, let us look at the working of an FSK modulator.
FSK Modulator
The FSK modulator block diagram comprises two oscillators with a clock and the input binary sequence.
Following is its block diagram.
The two oscillators, producing a higher- and a lower-frequency signal, are connected to a switch along with an
internal clock. To avoid abrupt phase discontinuities in the output waveform during the transmission of the
message, a clock is applied to both the oscillators internally. The binary input sequence is applied to the
transmitter so as to choose the frequencies according to the binary input.
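The modulator above can be sketched in a few lines. All parameter values here are assumed examples (mark 2 kHz, space 1 kHz, 16 kHz sample rate, 8 ms per bit); a shared phase accumulator plays the role of the common clock, keeping the phase continuous across bit boundaries:

```python
import numpy as np

# FSK modulator sketch: each bit selects one of two oscillator frequencies.
# Assumed example parameters (not from the text): mark = 2 kHz for binary 1,
# space = 1 kHz for binary 0, Fs = 16 kHz, 8 ms per bit.
Fs = 16000.0
bit_dur = 0.008
f_mark, f_space = 2000.0, 1000.0
spb = int(Fs * bit_dur)                       # samples per bit (128)

def fsk_modulate(bits):
    # per-sample instantaneous frequency chosen by the bit value
    freqs = np.repeat([f_mark if b else f_space for b in bits], spb)
    # common phase accumulator = the shared clock: phase stays continuous
    phase = 2 * np.pi * np.cumsum(freqs) / Fs
    return np.cos(phase)

wave = fsk_modulate([1, 0, 1, 1, 0])
print(wave.shape)                             # one block of spb samples per bit
```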
FSK Demodulator
There are different methods for demodulating an FSK wave. The main methods of FSK detection
are the asynchronous detector and the synchronous detector. The synchronous detector is coherent, while the
asynchronous detector is non-coherent.
Asynchronous FSK detector
The FSK signal is passed through two band-pass filters (BPFs), tuned to the Space and Mark frequencies.
The output of each BPF looks like an ASK signal and is given to an envelope detector. The signal in
each branch is demodulated asynchronously. The decision circuit chooses which output is more likely
and selects it from one of the two envelope detectors. It also reshapes the waveform into a rectangular one.
Synchronous FSK detector
The FSK signal input is given to two mixers with local oscillator circuits. These two are connected to two
band-pass filters. These combinations act as demodulators, and the decision circuit chooses which output is more
likely and selects it from one of the two detectors. The two signals must have a minimum frequency separation.
For both demodulators, the bandwidth of each depends on the bit rate. The synchronous
demodulator is somewhat more complex than asynchronous demodulators.
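A non-coherent detector of the kind described above can be sketched as follows: for each bit interval, correlate the received wave against both tones in phase and in quadrature (so no carrier phase knowledge is needed), and pick the tone with the larger envelope. Parameters match the assumed modulator values used earlier (mark 2 kHz, space 1 kHz, 128 samples per bit at 16 kHz); the tone spacing gives an integer number of cycles per bit, so the tones are orthogonal:

```python
import numpy as np

# Non-coherent FSK detector sketch (assumed example parameters).
Fs, spb = 16000.0, 128
f_mark, f_space = 2000.0, 1000.0
t = np.arange(spb) / Fs

def tone_energy(seg, f):
    # in-phase and quadrature correlations -> squared envelope, phase-independent
    i = np.dot(seg, np.cos(2 * np.pi * f * t))
    q = np.dot(seg, np.sin(2 * np.pi * f * t))
    return i * i + q * q

def fsk_demodulate(wave):
    out = []
    for n in range(0, len(wave), spb):
        seg = wave[n:n + spb]
        out.append(1 if tone_energy(seg, f_mark) > tone_energy(seg, f_space) else 0)
    return out

# round-trip with a simple continuous-phase modulator (same assumed parameters)
bits = [1, 0, 1, 1, 0]
freqs = np.repeat([f_mark if b else f_space for b in bits], spb)
wave = np.cos(2 * np.pi * np.cumsum(freqs) / Fs)
print(fsk_demodulate(wave))   # recovers [1, 0, 1, 1, 0]
```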
Mutual information
Mutual information is a quantity that measures a relationship between two random variables that are
sampled simultaneously. In particular, it measures how much information is communicated, on
average, in one random variable about another. Intuitively, one might ask, how much does one
random variable tell me about another? For example, suppose X represents the roll of a fair 6-sided
die, and Y represents whether the roll is even (0 if even, 1 if odd). Clearly, the value of Y tells us
something about the value of X and vice versa. That is, these variables share mutual information. On
the other hand, if X represents the roll of one fair die, and Z represents the roll of another fair die,
then X and Z share no mutual information. The roll of one die does not contain any information about
the outcome of the other die. An important theorem from information theory says that the mutual
information between two variables is 0 if and only if the two variables are statistically independent.
The formal definition of the mutual information of two random variables X and Y, whose joint
distribution is defined by P(X, Y) is given by
I(X; Y) = Σ_{x∈X} Σ_{y∈Y} p(x, y) log [ p(x, y) / ( p(x) p(y) ) ]
In this definition, P(X) and P(Y) are the marginal distributions of X and Y obtained through the
marginalization process described in the Probability Review document.
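The die example above can be computed directly from this definition (a sketch; log base 2 gives the answer in bits). X is the roll of a fair six-sided die, Y is 0 if the roll is even and 1 if odd, and Z is an independent second die:

```python
from math import log2

# Mutual information of a die roll X and its parity Y (0 if even, 1 if odd).
px = {x: 1 / 6 for x in range(1, 7)}
py = {0: 1 / 2, 1: 1 / 2}
pxy = {(x, x % 2): 1 / 6 for x in range(1, 7)}   # y is determined by x

I_xy = sum(p * log2(p / (px[x] * py[y])) for (x, y), p in pxy.items())

# For two independent dice the joint factors, every log term is log 1 = 0.
pz = {z: 1 / 6 for z in range(1, 7)}
I_xz = sum((1 / 36) * log2((1 / 36) / (px[x] * pz[z]))
           for x in range(1, 7) for z in range(1, 7))

print(I_xy)   # 1.0 bit: parity carries exactly one bit about the roll
print(I_xz)   # 0.0: independent variables share no mutual information
```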