
Estimation and identification


Course III: Online estimation problems

Samy Labsir

Institut Polytechnique des sciences avancées

5th year SET - 2024/2025


Introduction

Context
▶ In a plethora of applications, the estimation problem is dynamic.
▶ Examples: target tracking, GNSS navigation, robotics.
▶ Question: how can we define estimation methods that take into account the temporal
evolution of the unknown parameter?

Objective
▶ Define a generic framework for the online estimation problem ⇒ Bayesian filtering.
↪ Analytical approaches: Kalman filter-based approaches.
↪ Numerical approaches: Monte Carlo filter-based approaches.


Sequential estimation: principle


At each instant k:
• a hidden parameter xk ∈ Rp exists,
• linked to a sensor measurement zk ∈ Rm.
Objective: perform a sequential estimation of xk from the set of measurements z1:k.


Bayesian filter: principle


• Statistical framework: xk and z1:k are uncertain, hence modeled as random.
↪ xk is characterized by its transition density p(xk | xk−1) ⇝ (xk) is assumed to be a Markov chain.
↪ zk is defined by its likelihood p(zk | xk).

• The Bayesian filter consists in recursively computing the posterior distribution p(xk | z1:k):

p(x_{k-1} \mid z_{1:k-1}) \Longrightarrow p(x_k \mid z_{1:k}), \qquad k \Longrightarrow k+1

• Calculation of the mean of p(xk | z1:k):

E(x_k \mid z_{1:k}) \triangleq \hat{x}_{k|k} = \int x_k \, p(x_k \mid z_{1:k}) \, dx_k \qquad (1)

• Bayesian framework: this is the optimal estimator in the sense of the MSE!


Application examples (1/2)

→ GPS/GNSS navigation:
▶ Mobile position estimation xk from pseudo-range measurements zk.
▶ Pseudo-ranges are computed between the mobile and every visible GNSS satellite.

Figure: GNSS system operating principle


Application examples (2/2)

→ Radar tracking:
▶ Position estimation xk of a target from radar measurements (range, angle and velocity).
▶ Measurements delivered by the radar receiver (range, angle of arrival).

Figure: Radar operating principle



▶ The computation is done in two steps:

Prediction step:

p(xk−1 |z1:k−1 ) =⇒ p(xk |z1:k−1 ) (2)


Correction step:

p(xk |z1:k−1 ) =⇒ p(xk |z1:k ) (3)


▶ Prediction step equation:

p(x_k \mid z_{1:k-1}) = \int_{x_{k-1} \in \mathbb{R}^p} p(x_k, x_{k-1} \mid z_{1:k-1}) \, dx_{k-1}
                      = \int_{x_{k-1} \in \mathbb{R}^p} p(x_k \mid x_{k-1}) \, p(x_{k-1} \mid z_{1:k-1}) \, dx_{k-1}

⇒ obtained with the Chapman–Kolmogorov equation.

▶ Correction step equation:

p(x_k \mid z_{1:k}) = \frac{p(z_k \mid x_k) \, p(x_k \mid z_{1:k-1})}{p(z_k \mid z_{1:k-1})}

⇒ obtained with Bayes' rule.
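To make the two steps concrete, here is a minimal sketch (not from the slides) of a grid-based Bayesian filter: the state space is discretized so that the Chapman–Kolmogorov integral and Bayes' rule become sums. The scalar random-walk dynamics, the Gaussian likelihood and all numerical values below are illustrative assumptions.

```python
import numpy as np

# Hypothetical 1-D example: state discretized on a grid, random-walk dynamics,
# noisy direct observation. Only meant to illustrate equations (2) and (3).
grid = np.linspace(-5.0, 5.0, 201)          # discretized state space
dx = grid[1] - grid[0]

def gauss(x, mean, var):
    return np.exp(-0.5 * (x - mean) ** 2 / var) / np.sqrt(2 * np.pi * var)

# Transition density p(x_k | x_{k-1}) tabulated as a matrix (row: x_{k-1}, col: x_k)
q_var, r_var = 0.1, 0.5
trans = gauss(grid[None, :], grid[:, None], q_var)
trans /= trans.sum(axis=1, keepdims=True) * dx   # normalize each row

def predict(posterior_prev):
    """Chapman-Kolmogorov: p(x_k|z_{1:k-1}) = sum_j p(x_k|x_j) p(x_j|z_{1:k-1}) dx."""
    return posterior_prev @ trans * dx

def correct(prior, z):
    """Bayes' rule: p(x_k|z_{1:k}) proportional to p(z_k|x_k) p(x_k|z_{1:k-1})."""
    posterior = gauss(z, grid, r_var) * prior
    return posterior / (posterior.sum() * dx)

# One filtering cycle from a flat initial density and a measurement z_k = 1.2
posterior = np.full_like(grid, 1.0 / (grid[-1] - grid[0]))
posterior = correct(predict(posterior), z=1.2)
print("posterior mean:", np.sum(grid * posterior) * dx)
```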


Problem
▶ The distributions p(xk | xk−1) and p(zk | xk) are generally known.
▶ BUT, without further assumptions on p(xk−1 | z1:k−1), the distributions p(xk | z1:k−1) and p(xk | z1:k) are intractable.

Solutions
1- Assume a non-parametric representation based on samples drawn from the distribution p(xk−1 | z1:k−1) → Monte-Carlo numerical approaches: particle-filter-based methods.
2- Assume a parametric distribution for p(xk−1 | z1:k−1) → analytical approaches: Kalman-filter-based methods.


Online problem: Bayesian filter
    Principle
    Application examples
Bayesian filter: numerical approach
    Monte Carlo and Bayesian filter
    Particle filter
Bayesian filter: analytical approaches
    Extended Kalman filter
    Iterated Extended Kalman filter


Monte Carlo and Bayesian filter

Why a Monte-Carlo approach?


▶ Recall the Bayesian filter equations:

p(x_k \mid z_{1:k-1}) = \int p(x_k \mid x_{k-1}) \, p(x_{k-1} \mid z_{1:k-1}) \, dx_{k-1}

p(x_k \mid z_{1:k}) \propto p(z_k \mid x_k) \, p(x_k \mid z_{1:k-1})

▶ These two equations are difficult to compute, both analytically and numerically!

▶ We propose to deal with the global posterior distribution p(x_{0:k} | z_{1:k}), given by:

p(x_{0:k} \mid z_{1:k}) \propto p(z_k \mid x_k) \, p(x_k \mid x_{k-1}) \, p(x_{0:k-1} \mid z_{1:k-1}) \qquad (4)

such that:

p(x_k \mid z_{1:k}) = \int p(x_{0:k} \mid z_{1:k}) \, dx_{0:k-1} \qquad (5)

▶ Idea: use a Monte-Carlo (MC) approximation of p(x_{0:k} | z_{1:k}) to obtain an MC approximation of p(x_k | z_{1:k}).

Monte-Carlo and Bayesian filter


Theorem: Monte-Carlo approximation of a probability density function

▶ Let p(x) be a probability distribution. Given N samples {x^{(i)}}_{i=1}^{N} drawn from p, a Monte-Carlo approximation of p is given by:

p(x) \simeq \frac{1}{N} \sum_{i=1}^{N} \delta(x - x^{(i)}) \qquad (6)

▶ On the other hand, if we sample from an importance density q(x), then an importance Monte-Carlo approximation is given by:

p(x) \simeq \frac{1}{N} \sum_{i=1}^{N} \frac{p(x^{(i)})}{q(x^{(i)})} \, \delta(x - x^{(i)}) \qquad (7)
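A minimal numerical illustration of (6) and (7), not taken from the slides: the mean of a target density p is estimated either from direct samples or from samples drawn from an importance density q, reweighted by p/q. The Gaussian choices for p and q are assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 100_000

def gauss_pdf(x, mean, std):
    return np.exp(-0.5 * ((x - mean) / std) ** 2) / (std * np.sqrt(2 * np.pi))

# Target p = N(1, 0.5^2); importance density q = N(0, 1) (illustrative choices).
# (6): direct Monte-Carlo approximation, samples drawn from p itself.
x_p = rng.normal(1.0, 0.5, N)
print("direct MC mean     :", x_p.mean())

# (7): importance-sampling approximation, samples drawn from q, weighted by p/q.
x_q = rng.normal(0.0, 1.0, N)
w = gauss_pdf(x_q, 1.0, 0.5) / gauss_pdf(x_q, 0.0, 1.0)
w /= w.sum()                      # self-normalized weights
print("importance MC mean :", np.sum(w * x_q))   # both should be close to 1.0
```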




▶ Using the previous theorem, p(x_{0:k} | z_{1:k}) can be approximated in the following way:

↪ Let {x_{0:k}^{(i)}}_{i=1}^{N} ∼ q(x_{0:k} | z_{1:k}); an approximation of p(x_{0:k} | z_{1:k}) is given by:

p(x_{0:k} \mid z_{1:k}) \simeq \sum_{i=1}^{N} \frac{p(x_{0:k}^{(i)} \mid z_{1:k})}{q(x_{0:k}^{(i)} \mid z_{1:k})} \, \delta(x_{0:k} - x_{0:k}^{(i)}) \qquad (8)

▶ Then, we deduce an approximation of p(x_k | z_{1:k}):

p(x_k \mid z_{1:k}) \simeq \int \sum_{i=1}^{N} \frac{p(x_{0:k}^{(i)} \mid z_{1:k})}{q(x_{0:k}^{(i)} \mid z_{1:k})} \, \delta(x_{0:k} - x_{0:k}^{(i)}) \, dx_{0:k-1} \qquad (9)

                     = \sum_{i=1}^{N} \frac{p(x_{0:k}^{(i)} \mid z_{1:k})}{q(x_{0:k}^{(i)} \mid z_{1:k})} \, \delta(x_k - x_k^{(i)}) \qquad (10)


▶ The ratio \frac{p(x_{0:k}^{(i)} \mid z_{1:k})}{q(x_{0:k}^{(i)} \mid z_{1:k})} corresponds to the importance weight, denoted w_k^{(i)}, and:

p(x_k \mid z_{1:k}) \simeq \sum_{i=1}^{N} w_k^{(i)} \, \delta(x_k - x_k^{(i)}) \qquad (12)

▶ From that, we can extract two weighted MC estimators of the mean and the covariance of p(x_k | z_{1:k}), given by:

\hat{x}_{k|k} \simeq \sum_{i=1}^{N} w_k^{(i)} x_k^{(i)} \qquad (13)

P_{k|k} \simeq \sum_{i=1}^{N} w_k^{(i)} \left( \hat{x}_{k|k} - x_k^{(i)} \right) \left( \hat{x}_{k|k} - x_k^{(i)} \right)^{\top} \qquad (14)
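The two estimators (13)–(14) translate directly into code. This sketch assumes particles are stored as an (Ns, p) array and that the weights are already normalized; the numerical values in the usage example are made up.

```python
import numpy as np

def weighted_moments(particles, weights):
    """Weighted MC estimators (13)-(14) of the posterior mean and covariance.

    particles: array of shape (Ns, p); weights: array of shape (Ns,), summing to 1.
    """
    x_hat = weights @ particles                              # (13)
    diff = particles - x_hat                                 # (Ns, p)
    P_hat = (weights[:, None] * diff).T @ diff               # (14)
    return x_hat, P_hat

# Tiny usage example with made-up particles and weights
parts = np.array([[0.9, 2.0], [1.1, 1.8], [1.0, 2.2]])
w = np.array([0.2, 0.3, 0.5])
print(weighted_moments(parts, w))
```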


Particle filter
Question
How to choose the importance density?

Particle filter: principle

▶ Particle filter: a Bayesian Monte-Carlo filter approximating p(x_{0:k} | z_{1:k}) by using the following factorized importance density:

q(x_{0:k} \mid z_{1:k}) = q(x_k \mid x_{0:k-1}, z_k) \, q(x_{0:k-1} \mid z_{1:k-1})

▶ A sample x_{0:k}^{(i)} from q(x_{0:k} | z_{1:k}) can thus be obtained by sampling x_k^{(i)} according to q(x_k | x_{0:k-1}, z_k).

▶ Recursion of the posterior distribution based on Bayes' rule:

p(x_{0:k} \mid z_{1:k}) = \frac{p(x_k \mid x_{k-1}) \, p(z_k \mid x_k) \, p(x_{0:k-1} \mid z_{1:k-1})}{\int p(x_k \mid x_{k-1}) \, p(z_k \mid x_k) \, p(x_{0:k-1} \mid z_{1:k-1}) \, dx_{0:k}}


▶ We make the factorized importance density appear:

p(x_{0:k} \mid z_{1:k}) = \frac{\dfrac{p(x_k \mid x_{k-1}) \, p(z_k \mid x_k) \, p(x_{0:k-1} \mid z_{1:k-1})}{q(x_k \mid x_{0:k-1}, z_k) \, q(x_{0:k-1} \mid z_{1:k-1})} \, q(x_k \mid x_{0:k-1}, z_k) \, q(x_{0:k-1} \mid z_{1:k-1})}{\displaystyle\int \dfrac{p(x_k \mid x_{k-1}) \, p(z_k \mid x_k) \, p(x_{0:k-1} \mid z_{1:k-1})}{q(x_k \mid x_{0:k-1}, z_k) \, q(x_{0:k-1} \mid z_{1:k-1})} \, q(x_k \mid x_{0:k-1}, z_k) \, q(x_{0:k-1} \mid z_{1:k-1}) \, dx_{0:k}} \qquad (15)

▶ If we have {x_k^{(i)}}_{i=1}^{N_s} samples drawn from q(x_k | x_{0:k-1}, z_k), then the MC approximation of p is:

p(x_{0:k} \mid z_{1:k}) \simeq \frac{\displaystyle\sum_{i=1}^{N_s} \dfrac{p(x_k^{(i)} \mid x_{k-1}^{(i)}) \, p(z_k \mid x_k^{(i)}) \, p(x_{0:k-1}^{(i)} \mid z_{1:k-1})}{q(x_k^{(i)} \mid x_{0:k-1}^{(i)}, z_k) \, q(x_{0:k-1}^{(i)} \mid z_{1:k-1})} \, \delta(x_{0:k} - x_{0:k}^{(i)})}{\displaystyle\sum_{i=1}^{N_s} \dfrac{p(x_k^{(i)} \mid x_{k-1}^{(i)}) \, p(z_k \mid x_k^{(i)}) \, p(x_{0:k-1}^{(i)} \mid z_{1:k-1})}{q(x_k^{(i)} \mid x_{0:k-1}^{(i)}, z_k) \, q(x_{0:k-1}^{(i)} \mid z_{1:k-1})}}

We make the importance weights from the previous instant appear:

w_{k-1}^{(i)} = \frac{p(x_{0:k-1}^{(i)} \mid z_{1:k-1})}{q(x_{0:k-1}^{(i)} \mid z_{1:k-1})} \;\Rightarrow\; \text{importance weights computed at the previous instant} \qquad (16)

Then:

p(x_{0:k} \mid z_{1:k}) \simeq \frac{\displaystyle\sum_{i=1}^{N_s} \dfrac{p(x_k^{(i)} \mid x_{k-1}^{(i)}) \, p(z_k \mid x_k^{(i)}) \, w_{k-1}^{(i)}}{q(x_k^{(i)} \mid x_{0:k-1}^{(i)}, z_k)} \, \delta(x_{0:k} - x_{0:k}^{(i)})}{\displaystyle\sum_{i=1}^{N_s} \dfrac{p(x_k^{(i)} \mid x_{k-1}^{(i)}) \, p(z_k \mid x_k^{(i)}) \, w_{k-1}^{(i)}}{q(x_k^{(i)} \mid x_{0:k-1}^{(i)}, z_k)}}

Thus:

p(x_{0:k} \mid z_{1:k}) \simeq \sum_{i=1}^{N_s} w_k^{(i)} \, \delta(x_{0:k} - x_{0:k}^{(i)})

with:

w_k^{(i)} = \frac{\dfrac{p(x_k^{(i)} \mid x_{k-1}^{(i)}) \, p(z_k \mid x_k^{(i)}) \, w_{k-1}^{(i)}}{q(x_k^{(i)} \mid x_{0:k-1}^{(i)}, z_k)}}{\displaystyle\sum_{j=1}^{N_s} \dfrac{p(x_k^{(j)} \mid x_{k-1}^{(j)}) \, p(z_k \mid x_k^{(j)}) \, w_{k-1}^{(j)}}{q(x_k^{(j)} \mid x_{0:k-1}^{(j)}, z_k)}}

▶ This equation defines a weight recursion between w_k^{(i)} and w_{k-1}^{(i)}, which allows the posterior distribution to be updated recursively.
▶ Advantage: none of the distributions needs to be Gaussian!


Particle filter: algorithm

Algorithm 1: Generic particle filter
Inputs: samples (particles) and initial weights {x_0^{(i)}}_{i=1}^{N_s}, {w_0^{(i)}}_{i=1}^{N_s}; N: number of instants
while k ≤ N do
    → Prediction step
    while i ≤ N_s do
        Sample x_k^{(i)} ∼ q(x_k | x_{0:k-1}^{(i)}, z_k)
    → Correction step
    while i ≤ N_s do
        Compute the weight w_k^{(i)} according to the weight recursion above
    → Extraction of the estimated mean of p(x_k | z_{1:k}):
        \hat{x}_{k|k} = \sum_{i=1}^{N_s} w_k^{(i)} x_k^{(i)}


Particle filter: simplification

▶ In general, it is challenging to sample from q(x_k | x_{0:k-1}, z_k).

▶ If q(x_k | x_{0:k-1}, z_k) = p(x_k | x_{k-1}), then:

w_k^{(i)} = \frac{p(z_k \mid x_k^{(i)}) \, w_{k-1}^{(i)}}{\displaystyle\sum_{j=1}^{N_s} p(z_k \mid x_k^{(j)}) \, w_{k-1}^{(j)}} \qquad (17)

▶ In this case, the particle filter is called the SIS (Sequential Importance Sampling) algorithm.


SIS algorithm

Algorithm 2: SIS algorithm: summary
Inputs: samples (particles) and initial weights {x_0^{(i)}}_{i=1}^{N_s}, {w_0^{(i)}}_{i=1}^{N_s}; N: number of instants.
while k ≤ N do
    → Prediction step
    while i ≤ N_s do
        Sample x_k^{(i)} ∼ p(x_k | x_{k-1}^{(i)})
    → Correction step
    while i ≤ N_s do
        Compute the weight w_k^{(i)} according to (17)
    → Extraction of the estimated mean of p(x_k | z_{1:k}):
        \hat{x}_{k|k} = \sum_{i=1}^{N_s} w_k^{(i)} x_k^{(i)}
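Below is a minimal sketch of Algorithm 2 for an illustrative scalar model; the random-walk dynamics, the Gaussian likelihood and all numerical values are assumptions made for the example, not part of the slides. The proposal is the transition density p(x_k | x_{k-1}), so the weights follow (17).

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative scalar model: x_k = x_{k-1} + v_k, z_k = x_k + n_k
q_std, r_std = 0.3, 0.5          # std of v_k and n_k (assumed values)
Ns, N = 500, 30                  # number of particles / time instants

# Simulate a trajectory and its measurements (for testing only)
x_true = np.cumsum(rng.normal(0.0, q_std, N))
z = x_true + rng.normal(0.0, r_std, N)

# Initialization: particles from a prior, uniform weights
particles = rng.normal(0.0, 1.0, Ns)
weights = np.full(Ns, 1.0 / Ns)
estimates = []

for k in range(N):
    # Prediction: sample x_k^(i) ~ p(x_k | x_{k-1}^(i))
    particles = particles + rng.normal(0.0, q_std, Ns)
    # Correction: w_k^(i) proportional to p(z_k | x_k^(i)) w_{k-1}^(i), then normalize (eq. 17)
    lik = np.exp(-0.5 * ((z[k] - particles) / r_std) ** 2)
    weights = weights * lik
    weights /= weights.sum()
    # Estimated mean of p(x_k | z_{1:k})
    estimates.append(np.sum(weights * particles))

print("final estimate vs truth:", estimates[-1], x_true[-1])
```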


Resampling

▶ Major problem of the SIS algorithm: weight degeneracy.
    • The variance of the weights keeps increasing with each iteration.
▶ Consequence: a single particle with a non-zero weight is obtained.
▶ Divergence.

Figure: Weight distributions at instant 1, instant 2 and instant 50.
▶ Solution:
    1. Resample the particles as follows:

        x_{0:k}^{(j)} \sim \sum_{i=1}^{N_s} w_k^{(i)} \, \delta(x_{0:k} - x_{0:k}^{(i)})

        → preserves particles of significant weight,
        → is equivalent to drawing an index j according to a categorical law.
    2. Make each resampled particle equiprobable → w_k^{(j)} = 1/N_s.
▶ Resampling is performed only when degeneracy becomes too strong → an indicator is needed.
▶ Effective number of particles:

N_{\mathrm{eff}} = \frac{1}{\sum_{i=1}^{N_s} \left( w_k^{(i)} \right)^2}

▶ This yields the SIR (Sequential Importance Resampling) algorithm (a sketch of the resampling step is given below).
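A minimal sketch of the degeneracy indicator and of the resampling step described above: N_eff is computed from the normalized weights, and resampling draws indices from the categorical distribution defined by the weights (multinomial resampling; other schemes such as systematic resampling exist but are not shown). The threshold gamma is a user-chosen value.

```python
import numpy as np

def effective_sample_size(weights):
    """N_eff = 1 / sum_i (w_k^(i))^2, computed from normalized weights."""
    return 1.0 / np.sum(weights ** 2)

def multinomial_resample(particles, weights, rng):
    """Draw Ns indices j ~ Categorical(w) and reset the weights to 1/Ns."""
    Ns = len(weights)
    idx = rng.choice(Ns, size=Ns, p=weights)
    return particles[idx], np.full(Ns, 1.0 / Ns)

# Typical use inside the filtering loop (e.g. gamma = Ns / 2):
# if effective_sample_size(weights) <= gamma:
#     particles, weights = multinomial_resample(particles, weights, rng)
```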



SIR algorithm

Algorithm 3: SIR algorithm: summary
Inputs: initial particles and weights {x_0^{(i)}}_{i=1}^{N_s}, {w_0^{(i)}}_{i=1}^{N_s}; N: number of instants.
while k ≤ N do
    → Prediction step
    while i ≤ N_s do
        Draw x_k^{(i)} ∼ p(x_k | x_{k-1}^{(i)})
    → Correction step
    while i ≤ N_s do
        Compute the weight w_k^{(i)} according to equation (17)
    → Resampling: if N_eff ≤ γ
        1. x_{0:k}^{(⋆,j)} ∼ \sum_{i=1}^{N_s} w_k^{(i)} \, \delta(x_{0:k} - x_{0:k}^{(i)}), for all j ∈ {1, ..., N_s}
        2. w_k^{(j)} = 1/N_s, for all j ∈ {1, ..., N_s}
    → Extraction of the estimated mean:
        \hat{x}_{k|k} = \sum_{i=1}^{N_s} w_k^{(i)} x_k^{(⋆,i)}
Bayesian filter: analytical approaches


▶ Recall the Bayesian filter equations:

p(x_k \mid z_{1:k-1}) = \int p(x_k \mid x_{k-1}) \, p(x_{k-1} \mid z_{1:k-1}) \, dx_{k-1}

p(x_k \mid z_{1:k}) \propto p(z_k \mid x_k) \, p(x_k \mid z_{1:k-1})

Assumptions
1- Gaussian linear assumption on p(x_k | x_{k-1}) and p(z_k | x_k):

p(x_k \mid x_{k-1}) = \mathcal{N}_{\mathbb{R}^p}(x_k;\, F_k x_{k-1},\, Q_k) \;\Leftrightarrow\; x_k = F_k x_{k-1} + v_k, \quad v_k \sim \mathcal{N}_{\mathbb{R}^p}(0, Q_k)

p(z_k \mid x_k) = \mathcal{N}_{\mathbb{R}^m}(z_k;\, H_k x_k,\, R_k) \;\Leftrightarrow\; z_k = H_k x_k + n_k, \quad n_k \sim \mathcal{N}_{\mathbb{R}^m}(0, R_k)

2- Gaussian hypothesis on the previous distribution p(x_{k-1} | z_{1:k-1}):

p(x_{k-1} \mid z_{1:k-1}) = \mathcal{N}_{\mathbb{R}^p}(x_{k-1};\, \hat{x}_{k-1|k-1},\, P_{k-1|k-1})



With these assumptions, we can obtain:

• First, that the predicted distribution is Gaussian:

p(x_k \mid z_{1:k-1}) = \mathcal{N}(x_k;\, \hat{x}_{k|k-1},\, P_{k|k-1}) \qquad (18)

with \hat{x}_{k|k-1} and P_{k|k-1} expressed as a function of \hat{x}_{k-1|k-1} and P_{k-1|k-1};

• Second, that the corrected distribution is also Gaussian:

p(x_k \mid z_{1:k}) = \mathcal{N}(x_k;\, \hat{x}_{k|k},\, P_{k|k}) \qquad (19)

with \hat{x}_{k|k} and P_{k|k} expressed as a function of \hat{x}_{k|k-1} and P_{k|k-1}.

\{\hat{x}_{k-1|k-1}, P_{k-1|k-1}\} \Rightarrow \{\hat{x}_{k|k-1}, P_{k|k-1}\}
\{\hat{x}_{k|k-1}, P_{k|k-1}\} \Rightarrow \{\hat{x}_{k|k}, P_{k|k}\}

↪ Kalman filter algorithm.


Kalman filter: architecture

Figure: Kalman filter architecture


Kalman filter equations

Assumptions:

p(x_k \mid z_{1:k-1}) = \mathcal{N}_{\mathbb{R}^p}(x_k;\, \hat{x}_{k|k-1},\, P_{k|k-1})

p(x_k \mid z_{1:k}) = \mathcal{N}_{\mathbb{R}^p}(x_k;\, \hat{x}_{k|k},\, P_{k|k})

Prediction step:

\hat{x}_{k|k-1} = F_k \hat{x}_{k-1|k-1}
P_{k|k-1} = F_k P_{k-1|k-1} F_k^{\top} + Q_k

Correction step:

\hat{x}_{k|k} = \hat{x}_{k|k-1} + K_k \left( z_k - H_k \hat{x}_{k|k-1} \right)
P_{k|k} = (I - K_k H_k) P_{k|k-1}
K_k = P_{k|k-1} H_k^{\top} \left( H_k P_{k|k-1} H_k^{\top} + R_k \right)^{-1}
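The prediction and correction equations above map one-to-one to the following sketch; the model matrices F_k, H_k, Q_k, R_k are supplied by the user, and the constant-velocity values in the usage example are illustrative assumptions.

```python
import numpy as np

def kf_predict(x, P, F, Q):
    """Prediction step: returns (x_{k|k-1}, P_{k|k-1})."""
    x_pred = F @ x
    P_pred = F @ P @ F.T + Q
    return x_pred, P_pred

def kf_correct(x_pred, P_pred, z, H, R):
    """Correction step: returns (x_{k|k}, P_{k|k})."""
    S = H @ P_pred @ H.T + R                      # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)           # Kalman gain
    x_upd = x_pred + K @ (z - H @ x_pred)
    P_upd = (np.eye(len(x_pred)) - K @ H) @ P_pred
    return x_upd, P_upd

# Usage example: 1-D constant-velocity model with position measurements
dt = 1.0
F = np.array([[1.0, dt], [0.0, 1.0]])
H = np.array([[1.0, 0.0]])
Q = 0.01 * np.eye(2)
R = np.array([[0.25]])
x, P = np.zeros(2), np.eye(2)
x, P = kf_predict(x, P, F, Q)
x, P = kf_correct(x, P, z=np.array([1.1]), H=H, R=R)
print(x)
```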


Elements of proof

Prediction step:

p(x_k \mid z_{1:k-1}) = \int_{x_{k-1} \in \mathbb{R}^p} p(x_k, x_{k-1} \mid z_{1:k-1}) \, dx_{k-1}

• p(x_k, x_{k-1} \mid z_{1:k-1}) = \underbrace{p(x_k \mid x_{k-1})}_{\mathcal{N}(x_k;\, F_k x_{k-1},\, Q_k)} \, \underbrace{p(x_{k-1} \mid z_{1:k-1})}_{\mathcal{N}_{\mathbb{R}^p}(x_{k-1};\, \hat{x}_{k-1|k-1},\, P_{k-1|k-1})} is Gaussian, and we show that:

p(x_k \mid z_{1:k-1}) = \mathcal{N}_{\mathbb{R}^p}(x_k;\, \underbrace{\hat{x}_{k|k-1}}_{F_k \hat{x}_{k-1|k-1}},\, F_k P_{k-1|k-1} F_k^{\top} + Q_k)


Correction step:

▶ Determination of the distribution p(x_k, z_k | z_{1:k-1}):

p(x_k, z_k \mid z_{1:k-1}) = \mathcal{N}_{\mathbb{R}^{p+m}}\!\left( \begin{bmatrix} x_k \\ z_k \end{bmatrix};\, \begin{bmatrix} \hat{x}_{k|k-1} \\ H_k \hat{x}_{k|k-1} \end{bmatrix},\, \begin{bmatrix} P_{k|k-1} & P_{k|k-1} H_k^{\top} \\ H_k P_{k|k-1} & H_k P_{k|k-1} H_k^{\top} + R_k \end{bmatrix} \right)

Theorem
Let a and b be two jointly Gaussian vectors on \mathbb{R}^c and \mathbb{R}^d such that:

p(a, b) = \mathcal{N}_{\mathbb{R}^{c+d}}\!\left( \begin{bmatrix} a \\ b \end{bmatrix};\, \begin{bmatrix} m_a \\ m_b \end{bmatrix},\, \begin{bmatrix} P_{aa} & P_{ab} \\ P_{ba} & P_{bb} \end{bmatrix} \right)

then:

p(a \mid b) = \mathcal{N}_{\mathbb{R}^c}(a;\, m_{a|b},\, P_{a|b})

with:

m_{a|b} = m_a + P_{ab} P_{bb}^{-1} (b - m_b) \qquad (20)
P_{a|b} = P_{aa} - P_{ab} P_{bb}^{-1} P_{ba} \qquad (21)
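The conditioning formulas (20)–(21) can be coded and checked numerically in a few lines; the block partitioning below is generic, while any test values you feed in are your own assumptions.

```python
import numpy as np

def condition_gaussian(m, P, b, c):
    """Mean/covariance of p(a | b) from the joint N([a; b]; m, P), eqs. (20)-(21).

    c: dimension of a (first block); b: observed value of the second block.
    """
    ma, mb = m[:c], m[c:]
    Paa, Pab = P[:c, :c], P[:c, c:]
    Pba, Pbb = P[c:, :c], P[c:, c:]
    Pbb_inv = np.linalg.inv(Pbb)
    m_ab = ma + Pab @ Pbb_inv @ (b - mb)          # (20)
    P_ab = Paa - Pab @ Pbb_inv @ Pba              # (21)
    return m_ab, P_ab
```

Applying this function to the joint Gaussian of (x_k, z_k) above, with b = z_k, reproduces the Kalman correction step derived on the next slide.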


Elements of proof

• We apply the theorem to:

p(x_k \mid z_k, z_{1:k-1}) \triangleq p(x_k \mid z_{1:k}) = \mathcal{N}_{\mathbb{R}^p}(x_k;\, \hat{x}_{k|k},\, P_{k|k})

with:

\hat{x}_{k|k} = \hat{x}_{k|k-1} + P_{k|k-1} H_k^{\top} \left( H_k P_{k|k-1} H_k^{\top} + R_k \right)^{-1} \left( z_k - H_k \hat{x}_{k|k-1} \right) \qquad (22)
P_{k|k} = P_{k|k-1} - P_{k|k-1} H_k^{\top} \left( H_k P_{k|k-1} H_k^{\top} + R_k \right)^{-1} H_k P_{k|k-1} \qquad (23)

• By noting K_k = P_{k|k-1} H_k^{\top} \left( H_k P_{k|k-1} H_k^{\top} + R_k \right)^{-1}, we recover the correction step equations.


Extended Kalman filter


Bayesian filter: linear approximation

▶ Classically, the sensor and dynamic models are non-linear (but remain Gaussian!):

p(x_k \mid x_{k-1}) = \mathcal{N}_{\mathbb{R}^p}(x_k;\, f_k(x_{k-1}),\, Q_k) \qquad (24)
p(z_k \mid x_k) = \mathcal{N}_{\mathbb{R}^m}(z_k;\, h_k(x_k),\, R_k) \qquad (25)

▶ It is not possible to compute exactly the moments of p(x_k | z_{1:k-1}) and p(x_k | z_{1:k}).
▶ Solution: linear approximation of the models:

f_k(x_{k-1}) \approx f_k(\hat{x}_{k-1|k-1}) + J_{f_k} \left( x_{k-1} - \hat{x}_{k-1|k-1} \right) \qquad (26)
h_k(x_k) \approx h_k(\hat{x}_{k|k-1}) + J_{h_k} \left( x_k - \hat{x}_{k|k-1} \right) \qquad (27)

J_{f_k}: Jacobian of f_k evaluated at \hat{x}_{k-1|k-1}. J_{h_k}: Jacobian of h_k evaluated at \hat{x}_{k|k-1}.
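When analytic Jacobians of f_k or h_k are not available, they can be approximated by finite differences. This helper is an assumption of the example, not part of the slides; the range/bearing function in the usage line is likewise an illustrative choice.

```python
import numpy as np

def numerical_jacobian(func, x, eps=1e-6):
    """Finite-difference Jacobian of func: R^n -> R^m, evaluated at x."""
    x = np.asarray(x, dtype=float)
    f0 = np.asarray(func(x))
    J = np.zeros((f0.size, x.size))
    for j in range(x.size):
        dx = np.zeros_like(x)
        dx[j] = eps
        J[:, j] = (np.asarray(func(x + dx)) - f0) / eps
    return J

# Example: Jacobian of h(x) = [range, bearing] of a 2-D position (illustrative)
h = lambda x: np.array([np.hypot(x[0], x[1]), np.arctan2(x[1], x[0])])
print(numerical_jacobian(h, np.array([3.0, 4.0])))
```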


1- This yields Gaussian linear approximations of the distributions p(x_k | x_{k-1}) and p(z_k | x_k):

p(x_k \mid x_{k-1}) \simeq \mathcal{N}_{\mathbb{R}^p}(x_k;\, f_k(\hat{x}_{k-1|k-1}) + J_{f_k}(x_{k-1} - \hat{x}_{k-1|k-1}),\, Q_k)

p(z_k \mid x_k) \simeq \mathcal{N}_{\mathbb{R}^m}(z_k;\, h_k(\hat{x}_{k|k-1}) + J_{h_k}(x_k - \hat{x}_{k|k-1}),\, R_k)

2- Furthermore, we assume that the distribution at the previous instant is approximated by a Gaussian distribution:

p(x_{k-1} \mid z_{1:k-1}) \simeq \mathcal{N}(x_{k-1};\, \hat{x}_{k-1|k-1},\, P_{k-1|k-1}) \qquad (28)

• From that, we obtain Gaussian approximations of the predicted and corrected distributions:

• p(x_k \mid z_{1:k-1}) \simeq \int \mathcal{N}_{\mathbb{R}^p}(x_k;\, f_k(\hat{x}_{k-1|k-1}) + J_{f_k}(x_{k-1} - \hat{x}_{k-1|k-1}),\, Q_k) \, p(x_{k-1} \mid z_{1:k-1}) \, dx_{k-1}
  \simeq \mathcal{N}_{\mathbb{R}^p}(x_k;\, \hat{x}_{k|k-1},\, P_{k|k-1})

• p(x_k \mid z_{1:k}) \simeq \mathcal{N}_{\mathbb{R}^m}(z_k;\, h_k(\hat{x}_{k|k-1}) + J_{h_k}(x_k - \hat{x}_{k|k-1}),\, R_k) \times p(x_k \mid z_{1:k-1})
  \simeq \mathcal{N}_{\mathbb{R}^p}(x_k;\, \hat{x}_{k|k},\, P_{k|k}) \qquad (29)


▶ The recursions of {\hat{x}_{k|k-1}, \hat{x}_{k|k}} and {P_{k|k-1}, P_{k|k}} define the Extended Kalman filter algorithm.

Prediction step

\hat{x}_{k|k-1} = f_k(\hat{x}_{k-1|k-1})
P_{k|k-1} = J_{f_k} P_{k-1|k-1} J_{f_k}^{\top} + Q_k

Correction step

\hat{x}_{k|k} = \hat{x}_{k|k-1} + K_k \left( z_k - h_k(\hat{x}_{k|k-1}) \right)
P_{k|k} = (I - K_k J_{h_k}) P_{k|k-1}
K_k = P_{k|k-1} J_{h_k}^{\top} \left( J_{h_k} P_{k|k-1} J_{h_k}^{\top} + R_k \right)^{-1}
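The EKF recursions above, sketched in code: f and h are user-supplied model functions, and jac_f, jac_h return their Jacobians (for instance the finite-difference helper shown earlier). This is a generic sketch, not tied to a particular application from the slides.

```python
import numpy as np

def ekf_predict(x, P, f, jac_f, Q):
    """EKF prediction: x_{k|k-1} = f(x), P_{k|k-1} = J_f P J_f^T + Q."""
    Jf = jac_f(x)
    return f(x), Jf @ P @ Jf.T + Q

def ekf_correct(x_pred, P_pred, z, h, jac_h, R):
    """EKF correction with gain K_k = P J_h^T (J_h P J_h^T + R)^{-1}."""
    Jh = jac_h(x_pred)
    S = Jh @ P_pred @ Jh.T + R
    K = P_pred @ Jh.T @ np.linalg.inv(S)
    x_upd = x_pred + K @ (z - h(x_pred))
    P_upd = (np.eye(len(x_pred)) - K @ Jh) @ P_pred
    return x_upd, P_upd
```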

Iterated Extended Kalman filter



Iterated Extended Kalman Filter (IEKF): principle
▶ IEKF: an improvement of the EKF, aimed at obtaining a more precise approximation of p(x_k | z_{1:k-1}) and p(x_k | z_{1:k}).
▶ Based on the same non-linear models:

x_k = f_k(x_{k-1}) + v_k, \quad v_k \sim \mathcal{N}_{\mathbb{R}^p}(0, Q_k) \qquad (30)
z_k = h_k(x_k) + n_k, \quad n_k \sim \mathcal{N}_{\mathbb{R}^m}(0, R_k) \qquad (31)

▶ It determines a direct Gaussian approximation of

p(x_k \mid z_{1:k-1}) \simeq \mathcal{N}_{\mathbb{R}^p}(x_k;\, \hat{x}_{k|k-1},\, P_{k|k-1}) \qquad (32)
p(x_k \mid z_{1:k}) \simeq \mathcal{N}_{\mathbb{R}^p}(x_k;\, \hat{x}_{k|k},\, P_{k|k}) \qquad (33)

in which the non-linearities of the models are preserved.


IEKF: principle
How can we derive this approximation, i.e. the means {\hat{x}_{k|k-1}, \hat{x}_{k|k}} and the covariance matrices {P_{k|k-1}, P_{k|k}}?
↪ The means are obtained by computing the mode of both distributions.
↪ The covariance matrices are obtained by a second-order linearization around the means.
↪ A Gaussian is then fitted with these new parameters.

IEKF: method
• Because of the non-linearities of the models, finding the mode of these distributions in closed form is not feasible!
• It must be computed by minimizing a non-linear least-squares criterion.
↪ Estimation problem ⇔ Optimization problem.

▶ In the same way as the EKF, the algorithm is divided into two steps: prediction and correction.


Iterated Extended Kalman filter: equations

Prediction step: p(x_k \mid z_{1:k-1}) \simeq \mathcal{N}_{\mathbb{R}^p}(x_k;\, \hat{x}_{k|k-1},\, P_{k|k-1})

• Computation of \hat{x}_{k|k-1}:

▶ We want to solve:

x_{k|k-1}^{\star} = \arg\max_{x_k} \; p(x_k \mid z_{1:k-1}) \qquad (34)
\Leftrightarrow \left( x_{k|k-1}^{\star}, x_{k-1|k-1}^{\star} \right) = \arg\max_{x_k, x_{k-1}} \; p(x_k, x_{k-1} \mid z_{1:k-1}) \qquad (35)

where:

p(x_k, x_{k-1} \mid z_{1:k-1}) \propto \exp\!\left( -\tfrac{1}{2} \|x_k - f_k(x_{k-1})\|_{Q_k}^2 \right) \exp\!\left( -\tfrac{1}{2} \|x_{k-1} - \hat{x}_{k-1|k-1}\|_{P_{k-1|k-1}}^2 \right)


▶ Consequently, the maximization problem can be rewritten as a minimization problem:

\{x_{k|k-1}^{\star}, x_{k-1|k-1}^{\star}\} = \arg\min_{x_k \in \mathbb{R}^p,\, x_{k-1} \in \mathbb{R}^p} \; -\log p(x_k, x_{k-1} \mid z_{1:k-1})
= \arg\min_{x_k \in \mathbb{R}^p,\, x_{k-1} \in \mathbb{R}^p} \; \frac{1}{2} \left( \|x_k - f_k(x_{k-1})\|_{Q_k}^2 + \|x_{k-1} - \hat{x}_{k-1|k-1}\|_{P_{k-1|k-1}}^2 \right)

The solutions of this optimization problem are trivial and are written as:

x_{k-1|k-1}^{\star} = \hat{x}_{k-1|k-1} \qquad (36)
x_{k|k-1}^{\star} = f_k(\hat{x}_{k-1|k-1}) \triangleq \hat{x}_{k|k-1} \qquad (37)


• Computation of P_{k|k-1}

We define the function J as follows:

J(x_k, x_{k-1}) = -2 \log p(x_k, x_{k-1} \mid z_{1:k-1}) = \|\phi(x_k, x_{k-1})\|_{B_k}^2 \qquad (38)
\phi(x_k, x_{k-1}) = \left[ x_k - f_k(x_{k-1}),\; x_{k-1} - \hat{x}_{k-1|k-1} \right]^{\top} \qquad (39)
B_k = \begin{bmatrix} Q_k & 0 \\ 0 & P_{k-1|k-1} \end{bmatrix} \qquad (40)

▶ First-order expansion of \phi at x_{k-1|k-1}^{\star} and x_{k|k-1}^{\star}:

p(x_k, x_{k-1} \mid z_{1:k-1}) \;\overset{\sim}{\propto}\; \exp\!\left( -\tfrac{1}{2} \| \phi(x_{k|k-1}^{\star}, x_{k-1|k-1}^{\star}) + J_{\phi} \, \delta_x \|_{B_k}^2 \right) \qquad (41)
\propto \exp\!\left( -\tfrac{1}{2} \, \delta_x^{\top} J_{\phi}^{\top} B_k^{-1} J_{\phi} \, \delta_x \right) \qquad (42)

with J_{\phi} the Jacobian matrix of \phi evaluated at (x_{k|k-1}^{\star}, x_{k-1|k-1}^{\star}) and:

\delta_x = \left[ x_k - x_{k|k-1}^{\star},\; x_{k-1} - x_{k-1|k-1}^{\star} \right] \qquad (43)


▶ Then, the covariance matrix of p(x_k, x_{k-1} | z_{1:k-1}) is:

P_{k|k-1}^{a} = \left( J_{\phi}^{\top} B_k^{-1} J_{\phi} \right)^{-1} = A_k^{-1}

▶ Computing J_{\phi}^{\top} B_k^{-1} J_{\phi} blockwise gives:

A_k = \begin{bmatrix} Q_k^{-1} & -Q_k^{-1} J_{f_k} \\ -J_{f_k}^{\top} Q_k^{-1} & J_{f_k}^{\top} Q_k^{-1} J_{f_k} + P_{k-1|k-1}^{-1} \end{bmatrix} \quad \text{and} \quad P_{k|k-1}^{a} = \begin{bmatrix} P_{k|k-1} & \cdots \\ \cdots & \cdots \end{bmatrix} \qquad (44)

and, by inverting A_k blockwise (Schur complement / matrix-inversion lemma), the upper-left block of A_k^{-1} gives:

P_{k|k-1} = J_{f_k} P_{k-1|k-1} J_{f_k}^{\top} + Q_k \qquad (45)

J_{f_k}: Jacobian matrix of f_k evaluated at x_{k-1|k-1}^{\star}.


Correction step: p(x_k \mid z_{1:k}) \simeq \mathcal{N}_{\mathbb{R}^p}(x_k;\, \hat{x}_{k|k},\, P_{k|k})

▶ At the correction step, the Gaussian assumptions together with the following update equation:

p(x_k \mid z_{1:k}) \propto \underbrace{p(z_k \mid x_k)}_{\mathcal{N}(z_k;\, h_k(x_k),\, R_k)} \, \underbrace{p(x_k \mid z_{1:k-1})}_{\mathcal{N}(x_k;\, \hat{x}_{k|k-1},\, P_{k|k-1})} \qquad (46)

allow us to obtain:

-\log p(x_k \mid z_{1:k}) = C_2 + \tfrac{1}{2} \|z_k - h_k(x_k)\|_{R_k}^2 + \tfrac{1}{2} \|x_k - \hat{x}_{k|k-1}\|_{P_{k|k-1}}^2 \qquad (47)

where C_2 ∈ \mathbb{R} is a constant.


Computation of \hat{x}_{k|k}

▶ We look for the mode of p(x_k | z_{1:k}).
▶ It is obtained by solving the following optimization problem:

\hat{x}_{k|k} = \arg\min_{x_k \in \mathbb{R}^p} \; -2 \log p(x_k \mid z_{1:k}) \qquad (48)

\hat{x}_{k|k} = \arg\min_{x_k \in \mathbb{R}^p} \; \|\phi(x_k)\|_{\Sigma_k}^2 \qquad (49)

\phi(x_k) = \left[ z_k - h_k(x_k),\; x_k - \hat{x}_{k|k-1} \right]^{\top}, \qquad \Sigma_k = \begin{bmatrix} R_k & 0 \\ 0 & P_{k|k-1} \end{bmatrix} \qquad (50)

▶ The solution is not trivial ⇒ it can be obtained with an optimization algorithm.
▶ The criterion is a non-linear least-squares one ⇒ Gauss–Newton algorithm.


▶ At each iteration l:

x^{(l+1)} = x^{(l)} - \left( (J_{\phi}^{(l)})^{\top} \Sigma_k^{-1} J_{\phi}^{(l)} \right)^{-1} (J_{\phi}^{(l)})^{\top} \Sigma_k^{-1} \, \phi(x^{(l)}) \qquad (51)

which can be rewritten as:

x^{(l+1)} = \hat{x}_{k|k-1} + K_k^{(l)} \left( z_k - \left( h_k(x^{(l)}) + J_{h_k}^{(l)} (\hat{x}_{k|k-1} - x^{(l)}) \right) \right) \qquad (52)

J_{h_k}^{(l)}: Jacobian matrix of h_k evaluated at x^{(l)}, and K_k^{(l)} is the Kalman gain computed at iteration l:

K_k^{(l)} = P_{k|k-1} (J_{h_k}^{(l)})^{\top} \left( J_{h_k}^{(l)} P_{k|k-1} (J_{h_k}^{(l)})^{\top} + R_k \right)^{-1} \qquad (53)

Computation of P_{k|k}: same method as in the prediction step!

P_{k|k} = \left( J_{\phi}^{\top} \Sigma_k^{-1} J_{\phi} \right)^{-1} \qquad (54)

J_{\phi}: Jacobian matrix of \phi evaluated at x^{(L)}, the last Gauss–Newton iterate.
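A sketch of the IEKF correction step implementing (52)–(54): the Gauss–Newton iterates start from the predicted mean, and the posterior covariance is computed from the Jacobian at the last iterate. h, jac_h and the number of iterations are user-supplied assumptions; convergence checks are omitted for brevity.

```python
import numpy as np

def iekf_correct(x_pred, P_pred, z, h, jac_h, R, n_iter=10):
    """IEKF correction step via Gauss-Newton iterations, eqs. (52)-(54)."""
    x = x_pred.copy()
    for _ in range(n_iter):
        Jh = jac_h(x)
        K = P_pred @ Jh.T @ np.linalg.inv(Jh @ P_pred @ Jh.T + R)    # (53)
        x = x_pred + K @ (z - h(x) - Jh @ (x_pred - x))              # (52)
    # (54): with phi = [z - h(x), x - x_pred] and Sigma = diag(R, P_pred),
    # J_phi^T Sigma^{-1} J_phi = J_h^T R^{-1} J_h + P_pred^{-1}.
    Jh = jac_h(x)
    P_upd = np.linalg.inv(Jh.T @ np.linalg.inv(R) @ Jh + np.linalg.inv(P_pred))
    return x, P_upd
```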



Iterated Extended Kalman filter: summary

Prediction step (same as the EKF):

\hat{x}_{k|k-1} = f_k(\hat{x}_{k-1|k-1})
P_{k|k-1} = J_{f_k} P_{k-1|k-1} J_{f_k}^{\top} + Q_k

Correction step:

\hat{x}_{k|k} = \arg\min_{x_k \in \mathbb{R}^p} \|\phi(x_k)\|_{\Sigma_k}^2 \;\Rightarrow\; \text{Gauss–Newton algorithm} \qquad (55)

\phi(x_k) = \left[ z_k - h_k(x_k),\; x_k - \hat{x}_{k|k-1} \right]^{\top}, \qquad \Sigma_k = \begin{bmatrix} R_k & 0 \\ 0 & P_{k|k-1} \end{bmatrix} \qquad (56)

P_{k|k} = \left( J_{\phi}^{\top} \Sigma_k^{-1} J_{\phi} \right)^{-1} \qquad (57)

J_{\phi}: Jacobian matrix of \phi(x_k) evaluated at the last iterate of the Gauss–Newton algorithm.


A few references

[1] Särkkä, S., Bayesian Filtering and Smoothing, Cambridge University Press, 2013.

[2] Doucet, A., de Freitas, N., Gordon, N. (eds.), Sequential Monte Carlo Methods in Practice, Springer, 2001.
