
EE1 and ISE1 Communications I

Pier Luigi Dragotti

Lecture three
Lecture Aims

• To introduce some useful signals,

• To present analogies between vectors and signals,


– Signal comparison: correlation,
– Energy of the sum of orthogonal signals,
– Signal representation by orthogonal signal set.

Useful Signals: Unit impulse function

The unit impulse function or Dirac function is defined as

δ(t) = 0 for t ≠ 0

∫_{−∞}^{∞} δ(t) dt = 1

Multiplication of a function by an impulse:

g(t)δ(t − T) = g(T)δ(t − T)

∫_{−∞}^{∞} g(t)δ(t − T) dt = g(T).

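As a quick numerical sketch (not part of the original slides), the sifting property can be checked by replacing δ(t − T) with a narrow unit-area pulse; the test signal g(t) = cos t and the width w are arbitrary choices:

```python
import numpy as np

# Approximate delta(t - T) by a rectangular pulse of width w and
# height 1/w (unit area), centred at T, and integrate g(t) against it.
w = 1e-4                                   # pulse width: smaller -> better
T = 2.0                                    # the sampling instant
N = 1000
t = np.linspace(T - w / 2, T + w / 2, N, endpoint=False)
dt = t[1] - t[0]
g = np.cos(t)                              # an arbitrary smooth signal g(t)
delta_approx = np.full_like(t, 1.0 / w)    # height 1/w over width w
integral = np.sum(g * delta_approx) * dt   # ~ g(T)
print(integral, np.cos(T))                 # the two values agree closely
```

As w shrinks, the integral converges to g(T), which is exactly the sifting property.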
Useful Signals: Unit step function

Another useful signal is the unit step function u(t), defined by

u(t) = 1 for t ≥ 0, and u(t) = 0 for t < 0.

Observe that

∫_{−∞}^{t} δ(α) dα = 1 for t ≥ 0, and 0 for t < 0,

which is u(t). Therefore

du/dt = δ(t).

If you don't understand this proof, use your intuition! The derivative of a 'jump' is a Dirac.

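The step/impulse relation can also be seen numerically: integrating a narrow unit-area pulse (standing in for δ) from the left edge of the grid up to t produces a unit step. A sketch, with an arbitrary Gaussian pulse width:

```python
import numpy as np

# Approximate delta(t) by a narrow unit-area Gaussian, then form the
# running integral from the left end of the grid up to each t.
s = 1e-2                                  # pulse width parameter
t = np.linspace(-1.0, 1.0, 20001)
dt = t[1] - t[0]
pulse = np.exp(-t**2 / (2 * s**2))
pulse /= np.sum(pulse) * dt               # normalise to unit area
u_approx = np.cumsum(pulse) * dt          # running integral of the pulse
# Well before 0 the integral is ~0; well after 0 it is ~1: a unit step.
print(u_approx[5000], u_approx[15000])    # t ~ -0.5 and t ~ +0.5
```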
Useful Signals: Sinusoids

Consider the sinusoid


x(t) = C cos(2πf0t + θ)
f0 (measured in Hertz) is the frequency of the sinusoid and T0 = 1/f0 is the period.

Sometimes we use ω0 = 2πf0 (radians per second) instead of f0.

Important identities:

e^{±jx} = cos x ± j sin x

cos x = (1/2)[e^{jx} + e^{−jx}],   sin x = (1/2j)[e^{jx} − e^{−jx}]

cos x cos y = (1/2)[cos(x + y) + cos(x − y)]

a cos x + b sin x = C cos(x + θ),  with C = √(a² + b²) and θ = tan⁻¹(−b/a).

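The last identity is easy to confirm numerically; a minimal sketch with arbitrary a and b (the quadrant-aware arctan2 handles the sign of the angle):

```python
import numpy as np

# Verify a*cos(x) + b*sin(x) = C*cos(x + theta) with
# C = sqrt(a^2 + b^2) and theta = tan^-1(-b/a).
a, b = 3.0, 4.0
C = np.hypot(a, b)                 # sqrt(a^2 + b^2) = 5.0
theta = np.arctan2(-b, a)          # quadrant-aware arctangent of -b/a
x = np.linspace(0.0, 2 * np.pi, 1000)
lhs = a * np.cos(x) + b * np.sin(x)
rhs = C * np.cos(x + theta)
print(np.max(np.abs(lhs - rhs)))   # ~0: the identity holds everywhere
```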
Signals and Vectors

• Signals and vectors are closely related. For example,


– A vector has components,
– A signal also has components.

• Begin with some basic vector concepts,

• Apply those concepts to signals.

Inner product in vector spaces

Let x be a vector. It is specified by its magnitude (or length) |x| and by its direction.
Consider a second vector y.
We define the inner (or scalar) product of the two vectors as

hy, xi = |x||y| cos θ.

Therefore, |x|2 = hx, xi.

When hy, xi = 0, we say that y and x are orthogonal (geometrically, θ = π/2).

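In code, the vector inner product is a plain dot product; a minimal sketch (the vectors are arbitrary choices):

```python
import numpy as np

x = np.array([1.0, 0.0, 1.0])
y = np.array([0.0, 2.0, 0.0])
inner = np.dot(y, x)                          # <y, x> = |x||y| cos(theta)
print(inner)                                  # 0.0: y and x are orthogonal
print(np.dot(x, x), np.linalg.norm(x) ** 2)   # |x|^2 computed two ways
```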
Signals as vectors

The same notion of inner product can be applied for signals.


What is the useful part of this analogy?
We can use some geometrical interpretation of vectors to understand signals!
Consider two (energy) signals y(t) and x(t).
The inner product is defined by

⟨y(t), x(t)⟩ = ∫_{−∞}^{∞} y(t) x(t) dt

For complex signals

⟨y(t), x(t)⟩ = ∫_{−∞}^{∞} y(t) x*(t) dt.

Two signals are orthogonal if ⟨y(t), x(t)⟩ = 0.

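On a computer the integral becomes a sum over a fine grid. A sketch checking that sine and cosine at the same frequency are orthogonal; the 1 s period is an arbitrary choice:

```python
import numpy as np

# <y, x> = integral of y(t)x(t) dt, approximated by a Riemann sum on a
# uniform grid covering exactly one common period (endpoint excluded).
t = np.linspace(0.0, 1.0, 100000, endpoint=False)
dt = t[1] - t[0]
x = np.cos(2 * np.pi * t)            # f0 = 1 Hz
y = np.sin(2 * np.pi * t)
inner = np.sum(y * x) * dt
print(inner)                         # ~0: the two signals are orthogonal
```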
Energy of orthogonal signals

If vectors x and y are orthogonal, and if z = x + y, then

|z|² = |x|² + |y|²  (Pythagorean theorem).

If signals x(t) and y(t) are orthogonal and z(t) = x(t) + y(t), then

Ez = Ex + Ey.

Proof:

Ez = ∫_{−∞}^{∞} (x(t) + y(t))² dt
   = ∫_{−∞}^{∞} x²(t) dt + ∫_{−∞}^{∞} y²(t) dt + 2 ∫_{−∞}^{∞} x(t)y(t) dt
   = Ex + Ey + 2 ∫_{−∞}^{∞} x(t)y(t) dt
   = Ex + Ey,

since ∫_{−∞}^{∞} x(t)y(t) dt = 0.

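A numerical sketch of Ez = Ex + Ey, restricting the signals to [0, 1] so the energies are finite (the signals are arbitrary orthogonal choices, not from the slides):

```python
import numpy as np

# Energies approximated by Riemann sums on a uniform grid over [0, 1).
t = np.linspace(0.0, 1.0, 100000, endpoint=False)
dt = t[1] - t[0]
x = np.cos(2 * np.pi * t)            # orthogonal to y over [0, 1]
y = np.sin(2 * np.pi * t)
z = x + y
Ex = np.sum(x**2) * dt               # = 1/2
Ey = np.sum(y**2) * dt               # = 1/2
Ez = np.sum(z**2) * dt
print(Ez, Ex + Ey)                   # equal: the cross term vanishes
```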
Power of orthogonal signals

The same concepts of orthogonality and inner product extend to power signals.
For example, let g(t) = x(t) + y(t) = C1 cos(ω1t + θ1) + C2 cos(ω2t + θ2), with ω1 ≠ ω2. Then

Px = C1²/2,   Py = C2²/2.

The signals x(t) and y(t) are orthogonal: ⟨x(t), y(t)⟩ = 0. Therefore,

Pg = Px + Py = C1²/2 + C2²/2.

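The power result can be confirmed by time-averaging over a whole number of periods; the amplitudes and the two frequencies below are arbitrary choices:

```python
import numpy as np

# g(t) = C1 cos(w1 t) + C2 cos(w2 t), w1 != w2. Its average power over
# an integer number of periods is C1^2/2 + C2^2/2.
C1, C2 = 2.0, 3.0
t = np.linspace(0.0, 10.0, 1_000_000, endpoint=False)   # 10 s window
dt = t[1] - t[0]
g = C1 * np.cos(2 * np.pi * 1 * t) + C2 * np.cos(2 * np.pi * 2 * t)
Pg = np.sum(g**2) * dt / 10.0                           # time-averaged power
print(Pg, C1**2 / 2 + C2**2 / 2)                        # both 6.5
```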
Signal comparison: Correlation

If vectors x and y are given, we have the correlation measure

cn = cos θ = ⟨y, x⟩ / (|y||x|)

Clearly, −1 ≤ cn ≤ 1.

In the case of energy signals:

cn = (1/√(Ex Ey)) ∫_{−∞}^{∞} y(t) x(t) dt

and again −1 ≤ cn ≤ 1.

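The energy-signal formula can be sketched directly on a grid; the helper name corr_coeff below is a hypothetical choice, not from the lecture:

```python
import numpy as np

t = np.linspace(0.0, 1.0, 100000, endpoint=False)
dt = t[1] - t[0]

def corr_coeff(x, y):
    """Normalised correlation c_n = <y, x> / sqrt(Ex * Ey) on the grid t."""
    inner = np.sum(y * x) * dt
    Ex = np.sum(x**2) * dt
    Ey = np.sum(y**2) * dt
    return inner / np.sqrt(Ex * Ey)

x = np.cos(2 * np.pi * t)
print(corr_coeff(x, 3 * x))                  # -> 1.0  (aligned)
print(corr_coeff(x, -x))                     # -> -1.0 (opposite)
print(corr_coeff(x, np.sin(2 * np.pi * t)))  # -> ~0   (orthogonal)
```

Note that cn is insensitive to amplitude (the factor 3 above does not change it): it measures similarity of shape only.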
Best friends, worst enemies and complete strangers

• cn = 1. Best friends. This happens when y(t) = Kx(t) with K positive. The signals are aligned: maximum similarity.

• cn = −1. Worst enemies. This happens when y(t) = Kx(t) with K negative. The signals are again aligned, but point in opposite directions. The signals understand each other, but they do not like each other.

• cn = 0. Complete strangers. The two signals are orthogonal. We may view orthogonal signals as unrelated signals.

Correlation

Why bother poor undergraduate students with correlation?


Correlation is widely used in engineering.
For instance

• To design receivers in many communication systems

• To identify signals in radar systems

• For classification.

Correlation examples

Find the correlation coefficients between:

• x(t) = A0 cos(ω0t) and y(t) = A1 sin(ω1t).
• x(t) = A0 cos(ω0t) and y(t) = A1 cos(ω1t), with ω0 ≠ ω1.
• x(t) = A0 cos(ω0t) and y(t) = A1 cos(ω0t).
• x(t) = A0 sin(ω0t) and y(t) = A1 sin(ω1t), with ω0 ≠ ω1.
• x(t) = A0 sin(ω0t) and y(t) = A1 sin(ω0t).
• x(t) = A0 sin(ω0t) and y(t) = −A1 sin(ω0t).

Correlation examples

Find the correlation coefficients between:

• x(t) = A0 cos(ω0t) and y(t) = A1 sin(ω1t): cx,y = 0.
• x(t) = A0 cos(ω0t) and y(t) = A1 cos(ω1t), with ω0 ≠ ω1: cx,y = 0.
• x(t) = A0 cos(ω0t) and y(t) = A1 cos(ω0t): cx,y = 1.
• x(t) = A0 sin(ω0t) and y(t) = A1 sin(ω1t), with ω0 ≠ ω1: cx,y = 0.
• x(t) = A0 sin(ω0t) and y(t) = A1 sin(ω0t): cx,y = 1.
• x(t) = A0 sin(ω0t) and y(t) = −A1 sin(ω0t): cx,y = −1.

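The six answers above can be verified numerically in one pass; f0 = 1 Hz, f1 = 2 Hz and the (positive) amplitudes are arbitrary choices, integrated over one common period:

```python
import numpy as np

t = np.linspace(0.0, 1.0, 100000, endpoint=False)
dt = t[1] - t[0]
A0, A1 = 2.0, 3.0
w0, w1 = 2 * np.pi * 1, 2 * np.pi * 2       # omega_0 != omega_1

def cn(x, y):
    # Normalised correlation over the grid t.
    return np.sum(x * y) * dt / np.sqrt(np.sum(x**2) * np.sum(y**2)) / dt

cases = [
    (A0 * np.cos(w0 * t),  A1 * np.sin(w1 * t),  0.0),
    (A0 * np.cos(w0 * t),  A1 * np.cos(w1 * t),  0.0),
    (A0 * np.cos(w0 * t),  A1 * np.cos(w0 * t),  1.0),
    (A0 * np.sin(w0 * t),  A1 * np.sin(w1 * t),  0.0),
    (A0 * np.sin(w0 * t),  A1 * np.sin(w0 * t),  1.0),
    (A0 * np.sin(w0 * t), -A1 * np.sin(w0 * t), -1.0),
]
ok = all(abs(cn(x, y) - expected) < 1e-6 for x, y, expected in cases)
print(ok)                                   # True: all six answers hold
```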
Signal representation by orthogonal signal sets

• Examine a way of representing a signal as a sum of orthogonal signals.

• We know that a vector can be represented as the sum of orthogonal vectors.

• The results for signals are parallel to those for vectors.

• Review the case of vectors and extend to signals.

Orthogonal vector space

Consider a three-dimensional Cartesian vector space described by three mutually orthogonal vectors x1, x2 and x3:

⟨xm, xn⟩ = 0 for m ≠ n, and ⟨xm, xm⟩ = |xm|².

Any three-dimensional vector g can be expressed as a linear combination of those three vectors:

g = c1x1 + c2x2 + c3x3,  where ci = ⟨g, xi⟩ / |xi|².

In this case, we say that this set of vectors is complete. Such vectors are known as basis vectors.

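The coefficient formula ci = ⟨g, xi⟩/|xi|² in code, with an arbitrary orthogonal (but not unit-length) basis as an illustration:

```python
import numpy as np

# Three mutually orthogonal basis vectors (not unit length).
x1 = np.array([1.0,  1.0, 0.0])
x2 = np.array([1.0, -1.0, 0.0])
x3 = np.array([0.0,  0.0, 2.0])
basis = [x1, x2, x3]
g = np.array([3.0, 1.0, 4.0])               # an arbitrary vector

# c_i = <g, x_i> / |x_i|^2, then reconstruct g from the expansion.
c = [np.dot(g, xi) / np.dot(xi, xi) for xi in basis]
g_rec = sum(ci * xi for ci, xi in zip(c, basis))
print(c)                                    # [2.0, 1.0, 2.0]
print(g_rec)                                # [3. 1. 4.]: g is recovered
```

Dividing by |xi|² (rather than normalising the basis first) is exactly what the slide's formula does.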
Orthogonal signal space

The same notion of completeness extends to signals.

A set of mutually orthogonal signals x1(t), x2(t), ..., xN(t) is complete if it can represent any signal belonging to a certain space. For example:

g(t) ≈ c1x1(t) + c2x2(t) + ... + cN xN(t)

If the approximation error is zero for any g(t), then the set of signals x1(t), x2(t), ..., xN(t) is complete. In general, the set is complete only when N → ∞: the space is infinite-dimensional (this will become clearer in the next lecture).

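The expansion can be sketched by projecting a square wave onto the orthogonal set sin(2πnt), n = 1..N; the choice of signal and set is an arbitrary illustration. The residual energy shrinks as N grows, which is why completeness generally needs N → ∞:

```python
import numpy as np

t = np.linspace(0.0, 1.0, 100000, endpoint=False)
dt = t[1] - t[0]
g = np.sign(np.sin(2 * np.pi * t))          # square wave on [0, 1)

def residual_energy(N):
    # Project g onto x_n(t) = sin(2*pi*n*t), n = 1..N, with
    # c_n = <g, x_n> / E_n, and return the energy of the error.
    approx = np.zeros_like(t)
    for n in range(1, N + 1):
        xn = np.sin(2 * np.pi * n * t)
        cn = np.sum(g * xn) / np.sum(xn**2)
        approx += cn * xn
    return np.sum((g - approx)**2) * dt

print(residual_energy(1), residual_energy(9), residual_energy(99))
```

The printed values decrease toward zero: adding orthogonal terms never hurts, and the error vanishes only in the limit.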
Summary

• Analogies between vectors and signals

• Inner product and correlation

• Energy and Power of orthogonal signals

• Signal representation by means of orthogonal signal sets

