Introduction to Mobile Robotics (UvA)

Transcript

Probabilistic Robotics

Bayes Filter Implementations

Gaussian filters

• Prediction

• Correction

Bayes Filter Reminder

Prediction: $\overline{bel}(x_t) = \int p(x_t \mid u_t, x_{t-1})\, bel(x_{t-1})\, dx_{t-1}$

Correction: $bel(x_t) = \eta\, p(z_t \mid x_t)\, \overline{bel}(x_t)$
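As an illustration, a minimal sketch of this recursion for a discrete (histogram) state space, assuming NumPy and caller-supplied motion and measurement models:

```python
import numpy as np

def bayes_filter_step(belief, u, z, motion_model, measurement_model, states):
    """One prediction + correction step of the discrete Bayes filter.

    belief[i]               ~ bel(x_{t-1} = states[i])
    motion_model(x2, u, x1) ~ p(x_t = x2 | u_t = u, x_{t-1} = x1)
    measurement_model(z, x) ~ p(z_t = z | x_t = x)
    """
    # Prediction: bel_bar(x_t) = sum over x_{t-1} of p(x_t | u_t, x_{t-1}) * bel(x_{t-1})
    bel_bar = np.array([
        sum(motion_model(x2, u, x1) * belief[i] for i, x1 in enumerate(states))
        for x2 in states
    ])
    # Correction: bel(x_t) = eta * p(z_t | x_t) * bel_bar(x_t)
    bel = np.array([measurement_model(z, x) * bel_bar[j] for j, x in enumerate(states)])
    return bel / bel.sum()  # eta is the normalizer
```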

Gaussians

Univariate: $p(x) \sim N(\mu, \sigma^2)$:

$p(x) = \dfrac{1}{\sqrt{2\pi}\,\sigma}\, \exp\!\left(-\dfrac{1}{2}\,\dfrac{(x-\mu)^2}{\sigma^2}\right)$

Multivariate: $p(\mathbf{x}) \sim N(\boldsymbol{\mu}, \Sigma)$:

$p(\mathbf{x}) = \dfrac{1}{(2\pi)^{d/2}\,|\Sigma|^{1/2}}\, \exp\!\left(-\dfrac{1}{2}\,(\mathbf{x}-\boldsymbol{\mu})^T \Sigma^{-1} (\mathbf{x}-\boldsymbol{\mu})\right)$
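A quick numerical sanity check of the multivariate density formula, assuming NumPy (function name and example values are illustrative):

```python
import numpy as np

def gaussian_pdf(x, mu, sigma):
    """Evaluate the multivariate normal density N(x; mu, Sigma) as defined above."""
    x, mu = np.atleast_1d(x).astype(float), np.atleast_1d(mu).astype(float)
    sigma = np.atleast_2d(sigma).astype(float)
    d = x.shape[0]
    diff = x - mu
    norm = (2.0 * np.pi) ** (-d / 2.0) * np.linalg.det(sigma) ** (-0.5)
    return norm * np.exp(-0.5 * diff @ np.linalg.solve(sigma, diff))

# At the mean of a standard 2D Gaussian the density is 1 / (2*pi), roughly 0.159
print(gaussian_pdf([0.0, 0.0], [0.0, 0.0], np.eye(2)))
```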

Properties of Gaussians

$X \sim N(\mu, \sigma^2),\ Y = aX + b \;\Rightarrow\; Y \sim N(a\mu + b,\ a^2\sigma^2)$

$X_1 \sim N(\mu_1, \sigma_1^2),\ X_2 \sim N(\mu_2, \sigma_2^2) \;\Rightarrow\; p(X_1)\cdot p(X_2) \sim N\!\left(\dfrac{\sigma_2^2}{\sigma_1^2+\sigma_2^2}\,\mu_1 + \dfrac{\sigma_1^2}{\sigma_1^2+\sigma_2^2}\,\mu_2,\ \dfrac{1}{\sigma_1^{-2}+\sigma_2^{-2}}\right)$

Multivariate Gaussians

$X \sim N(\mu, \Sigma),\ Y = AX + B \;\Rightarrow\; Y \sim N(A\mu + B,\ A\Sigma A^T)$

$X_1 \sim N(\mu_1, \Sigma_1),\ X_2 \sim N(\mu_2, \Sigma_2) \;\Rightarrow\; p(X_1)\cdot p(X_2) \sim N\!\left(\Sigma_2(\Sigma_1+\Sigma_2)^{-1}\mu_1 + \Sigma_1(\Sigma_1+\Sigma_2)^{-1}\mu_2,\ \Sigma_1(\Sigma_1+\Sigma_2)^{-1}\Sigma_2\right)$

• We stay in the “Gaussian world” as long as we start with Gaussians and perform only linear transformations.
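The two multivariate properties can be written out directly; a minimal sketch assuming NumPy, with illustrative function names:

```python
import numpy as np

def linear_transform(mu, sigma, A, B):
    """If X ~ N(mu, Sigma) and Y = A X + B, then Y ~ N(A mu + B, A Sigma A^T)."""
    return A @ mu + B, A @ sigma @ A.T

def multiply_gaussians(mu1, sigma1, mu2, sigma2):
    """Mean and covariance of the (renormalized) product N(mu1, S1) * N(mu2, S2)."""
    inv_sum = np.linalg.inv(sigma1 + sigma2)
    mu = sigma2 @ inv_sum @ mu1 + sigma1 @ inv_sum @ mu2
    sigma = sigma1 @ inv_sum @ sigma2
    return mu, sigma
```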

Discrete Kalman Filter

Estimates the state $x$ of a discrete-time controlled process that is governed by the linear stochastic difference equation

$x_t = A_t x_{t-1} + B_t u_t + \varepsilon_t$

with a measurement

$z_t = C_t x_t + \delta_t$
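As a concrete instance of this linear form, a hypothetical 1D constant-velocity robot driven by an acceleration command could use:

```python
import numpy as np

dt = 0.1                      # time step in seconds
A = np.array([[1.0, dt],      # state x = (position, velocity)
              [0.0, 1.0]])
B = np.array([[0.5 * dt**2],  # control u = commanded acceleration
              [dt]])
C = np.array([[1.0, 0.0]])    # measurement z = position only
```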

Kalman Filter Algorithm

• Prediction
• Observation
• Matching
• Correction

The Prediction-Correction-Cycle

Prediction:

$\overline{bel}(x_t) = \begin{cases} \bar{\mu}_t = A_t\mu_{t-1} + B_t u_t \\ \bar{\Sigma}_t = A_t\Sigma_{t-1}A_t^T + R_t \end{cases}$

$\overline{bel}(x_t) = \begin{cases} \bar{\mu}_t = a_t\mu_{t-1} + b_t u_t \\ \bar{\sigma}_t^2 = a_t^2\sigma_{t-1}^2 + \sigma_{act,t}^2 \end{cases}$  (1D case)

The Prediction-Correction-Cycle

Correction:

$bel(x_t) = \begin{cases} \mu_t = \bar{\mu}_t + K_t(z_t - C_t\bar{\mu}_t) \\ \Sigma_t = (I - K_t C_t)\,\bar{\Sigma}_t \end{cases}, \qquad K_t = \bar{\Sigma}_t C_t^T (C_t\bar{\Sigma}_t C_t^T + Q_t)^{-1}$

$bel(x_t) = \begin{cases} \mu_t = \bar{\mu}_t + K_t(z_t - \bar{\mu}_t) \\ \sigma_t^2 = (1 - K_t)\,\bar{\sigma}_t^2 \end{cases}, \qquad K_t = \dfrac{\bar{\sigma}_t^2}{\bar{\sigma}_t^2 + \sigma_{obs,t}^2}$  (1D case)

The Prediction-Correction-Cycle

Prediction:

$\overline{bel}(x_t) = \begin{cases} \bar{\mu}_t = A_t\mu_{t-1} + B_t u_t \\ \bar{\Sigma}_t = A_t\Sigma_{t-1}A_t^T + R_t \end{cases}$  (1D: $\bar{\mu}_t = a_t\mu_{t-1} + b_t u_t$, $\bar{\sigma}_t^2 = a_t^2\sigma_{t-1}^2 + \sigma_{act,t}^2$)

Correction:

$bel(x_t) = \begin{cases} \mu_t = \bar{\mu}_t + K_t(z_t - C_t\bar{\mu}_t) \\ \Sigma_t = (I - K_t C_t)\,\bar{\Sigma}_t \end{cases}, \quad K_t = \bar{\Sigma}_t C_t^T (C_t\bar{\Sigma}_t C_t^T + Q_t)^{-1}$  (1D: $\mu_t = \bar{\mu}_t + K_t(z_t - \bar{\mu}_t)$, $\sigma_t^2 = (1-K_t)\bar{\sigma}_t^2$, $K_t = \bar{\sigma}_t^2/(\bar{\sigma}_t^2 + \sigma_{obs,t}^2)$)
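A compact sketch of the general-case cycle, assuming NumPy; matrix names follow the slides, the functions themselves are illustrative:

```python
import numpy as np

def kf_predict(mu, sigma, u, A, B, R):
    """Prediction: mu_bar = A mu + B u,  Sigma_bar = A Sigma A^T + R."""
    mu_bar = A @ mu + B @ u
    sigma_bar = A @ sigma @ A.T + R
    return mu_bar, sigma_bar

def kf_correct(mu_bar, sigma_bar, z, C, Q):
    """Correction with gain K = Sigma_bar C^T (C Sigma_bar C^T + Q)^-1."""
    K = sigma_bar @ C.T @ np.linalg.inv(C @ sigma_bar @ C.T + Q)
    mu = mu_bar + K @ (z - C @ mu_bar)
    sigma = (np.eye(len(mu_bar)) - K @ C) @ sigma_bar
    return mu, sigma
```

One full filter step is then `kf_correct(*kf_predict(mu, sigma, u, A, B, R), z, C, Q)`.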

Kalman Filter Summary

• Highly efficient: polynomial in measurement dimensionality $k$ and state dimensionality $n$: $O(k^{2.376} + n^2)$

• Optimal for linear Gaussian systems!

• Most robotics systems are nonlinear!

Nonlinear Dynamic Systems

• Most realistic robotic problems involve nonlinear functions:

$x_t = g(u_t, x_{t-1})$

$z_t = h(x_t)$

Linearity Assumption Revisited

Non-linear Function

EKF Linearization (1)

EKF Linearization (2)

EKF Linearization (3)

EKF Linearization: First Order Taylor Series Expansion

• Prediction:

$g(u_t, x_{t-1}) \approx g(u_t, \mu_{t-1}) + \dfrac{\partial g(u_t, \mu_{t-1})}{\partial x_{t-1}}\,(x_{t-1} - \mu_{t-1}) = g(u_t, \mu_{t-1}) + G_t\,(x_{t-1} - \mu_{t-1})$

• Correction:

$h(x_t) \approx h(\bar{\mu}_t) + \dfrac{\partial h(\bar{\mu}_t)}{\partial x_t}\,(x_t - \bar{\mu}_t) = h(\bar{\mu}_t) + H_t\,(x_t - \bar{\mu}_t)$
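An illustrative sketch of this linearization using a finite-difference Jacobian (an assumption of the sketch; the EKF equations use analytic Jacobians):

```python
import numpy as np

def numerical_jacobian(f, x, eps=1e-6):
    """Finite-difference approximation of df/dx evaluated at x."""
    x = np.asarray(x, dtype=float)
    f0 = np.atleast_1d(f(x))
    J = np.zeros((f0.size, x.size))
    for i in range(x.size):
        dx = np.zeros_like(x)
        dx[i] = eps
        J[:, i] = (np.atleast_1d(f(x + dx)) - f0) / eps
    return J

def linearize(g, mu):
    """Return the first-order model x -> g(mu) + G (x - mu) and the Jacobian G."""
    mu = np.asarray(mu, dtype=float)
    G = numerical_jacobian(g, mu)
    g_mu = np.atleast_1d(g(mu))
    return (lambda x: g_mu + G @ (np.asarray(x, dtype=float) - mu)), G
```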

EKF Algorithm

1. Extended_Kalman_filter($\mu_{t-1}, \Sigma_{t-1}, u_t, z_t$):

2. Prediction:

3. $\bar{\mu}_t = g(u_t, \mu_{t-1})$

4. $\bar{\Sigma}_t = G_t \Sigma_{t-1} G_t^T + R_t$

5. Correction:

6. $K_t = \bar{\Sigma}_t H_t^T (H_t \bar{\Sigma}_t H_t^T + Q_t)^{-1}$

7. $\mu_t = \bar{\mu}_t + K_t\,(z_t - h(\bar{\mu}_t))$

8. $\Sigma_t = (I - K_t H_t)\,\bar{\Sigma}_t$

9. Return $\mu_t, \Sigma_t$

with the Jacobians

$G_t = \dfrac{\partial g(u_t, \mu_{t-1})}{\partial x_{t-1}}, \qquad H_t = \dfrac{\partial h(\bar{\mu}_t)}{\partial x_t}$

Corresponding Kalman filter equations (linear case):

$\bar{\mu}_t = A_t \mu_{t-1} + B_t u_t$, $\quad \bar{\Sigma}_t = A_t \Sigma_{t-1} A_t^T + R_t$, $\quad K_t = \bar{\Sigma}_t C_t^T (C_t \bar{\Sigma}_t C_t^T + Q_t)^{-1}$, $\quad \mu_t = \bar{\mu}_t + K_t(z_t - C_t \bar{\mu}_t)$, $\quad \Sigma_t = (I - K_t C_t)\,\bar{\Sigma}_t$
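One EKF iteration written as a sketch, assuming the caller supplies g, h and their Jacobian functions jac_g and jac_h:

```python
import numpy as np

def ekf_step(mu, sigma, u, z, g, jac_g, h, jac_h, R, Q):
    """One Extended Kalman filter iteration (prediction + correction)."""
    # Prediction
    G = jac_g(u, mu)                 # G_t = dg(u_t, mu_{t-1}) / dx_{t-1}
    mu_bar = g(u, mu)
    sigma_bar = G @ sigma @ G.T + R
    # Correction
    H = jac_h(mu_bar)                # H_t = dh(mu_bar_t) / dx_t
    S = H @ sigma_bar @ H.T + Q
    K = sigma_bar @ H.T @ np.linalg.inv(S)
    mu_new = mu_bar + K @ (z - h(mu_bar))
    sigma_new = (np.eye(len(mu)) - K @ H) @ sigma_bar
    return mu_new, sigma_new
```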

Localization

• Given:
  • Map of the environment
  • Sequence of sensor measurements

• Wanted:
  • Estimate of the robot’s position

• Problem classes:
  • Position tracking
  • Global localization
  • Kidnapped robot problem (recovery)

“Using sensory information to locate the robot in its environment is the most fundamental problem to providing a mobile robot with autonomous capabilities.” [Cox ’91]


Landmark-based Localization

EKF_localization($\mu_{t-1}, \Sigma_{t-1}, u_t, z_t, m$):

Prediction:

$G_t = \dfrac{\partial g(u_t, \mu_{t-1})}{\partial x_{t-1}} = \begin{pmatrix} \frac{\partial x'}{\partial \mu_{t-1,x}} & \frac{\partial x'}{\partial \mu_{t-1,y}} & \frac{\partial x'}{\partial \mu_{t-1,\theta}} \\ \frac{\partial y'}{\partial \mu_{t-1,x}} & \frac{\partial y'}{\partial \mu_{t-1,y}} & \frac{\partial y'}{\partial \mu_{t-1,\theta}} \\ \frac{\partial \theta'}{\partial \mu_{t-1,x}} & \frac{\partial \theta'}{\partial \mu_{t-1,y}} & \frac{\partial \theta'}{\partial \mu_{t-1,\theta}} \end{pmatrix}$  (Jacobian of $g$ w.r.t. location)

$V_t = \dfrac{\partial g(u_t, \mu_{t-1})}{\partial u_t} = \begin{pmatrix} \frac{\partial x'}{\partial v_t} & \frac{\partial x'}{\partial \omega_t} \\ \frac{\partial y'}{\partial v_t} & \frac{\partial y'}{\partial \omega_t} \\ \frac{\partial \theta'}{\partial v_t} & \frac{\partial \theta'}{\partial \omega_t} \end{pmatrix}$  (Jacobian of $g$ w.r.t. control)

$M_t = \begin{pmatrix} (\alpha_1 |v_t| + \alpha_2 |\omega_t|)^2 & 0 \\ 0 & (\alpha_3 |v_t| + \alpha_4 |\omega_t|)^2 \end{pmatrix}$  (Motion noise)

$\bar{\mu}_t = g(u_t, \mu_{t-1})$  (Predicted mean)

$\bar{\Sigma}_t = G_t \Sigma_{t-1} G_t^T + V_t M_t V_t^T$  (Predicted covariance)
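A sketch of this prediction step for a pose (x, y, θ) under a velocity command (v, ω); the simple motion model and the finite-difference Jacobians below are assumptions of the sketch, not the course's model:

```python
import numpy as np

def motion_model(u, x, dt=1.0):
    """Assumed simple velocity motion model: u = (v, w), pose x = (x, y, theta)."""
    v, w = u
    px, py, theta = x
    return np.array([px + v * dt * np.cos(theta),
                     py + v * dt * np.sin(theta),
                     theta + w * dt])

def ekf_loc_predict(mu, sigma, u, alphas, dt=1.0, eps=1e-6):
    """Sigma_bar = G Sigma G^T + V M V^T, with G and V from finite differences."""
    mu = np.asarray(mu, dtype=float)
    v, w = u
    g0 = motion_model(u, mu, dt)
    # Jacobian of g w.r.t. the location (G_t) and w.r.t. the control (V_t)
    G = np.column_stack([(motion_model(u, mu + e, dt) - g0) / eps
                         for e in np.eye(3) * eps])
    V = np.column_stack([(motion_model(np.asarray(u) + e, mu, dt) - g0) / eps
                         for e in np.eye(2) * eps])
    # Motion noise in control space (alpha_1 .. alpha_4 as on the slide)
    M = np.diag([(alphas[0] * abs(v) + alphas[1] * abs(w)) ** 2,
                 (alphas[2] * abs(v) + alphas[3] * abs(w)) ** 2])
    mu_bar = g0
    sigma_bar = G @ sigma @ G.T + V @ M @ V.T
    return mu_bar, sigma_bar
```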

EKF_localization($\mu_{t-1}, \Sigma_{t-1}, u_t, z_t, m$):

Correction:

$\hat{z}_t = \begin{pmatrix} \sqrt{(m_x - \bar{\mu}_{t,x})^2 + (m_y - \bar{\mu}_{t,y})^2} \\ \operatorname{atan2}(m_y - \bar{\mu}_{t,y},\ m_x - \bar{\mu}_{t,x}) - \bar{\mu}_{t,\theta} \end{pmatrix}$  (Predicted measurement mean)

$H_t = \dfrac{\partial h(\bar{\mu}_t, m)}{\partial x_t} = \begin{pmatrix} \frac{\partial r_t}{\partial \bar{\mu}_{t,x}} & \frac{\partial r_t}{\partial \bar{\mu}_{t,y}} & \frac{\partial r_t}{\partial \bar{\mu}_{t,\theta}} \\ \frac{\partial \varphi_t}{\partial \bar{\mu}_{t,x}} & \frac{\partial \varphi_t}{\partial \bar{\mu}_{t,y}} & \frac{\partial \varphi_t}{\partial \bar{\mu}_{t,\theta}} \end{pmatrix}$  (Jacobian of $h$ w.r.t. location)

$Q_t = \begin{pmatrix} \sigma_r^2 & 0 \\ 0 & \sigma_\varphi^2 \end{pmatrix}$  (Measurement noise)

$S_t = H_t \bar{\Sigma}_t H_t^T + Q_t$  (Pred. measurement covariance)

$K_t = \bar{\Sigma}_t H_t^T S_t^{-1}$  (Kalman gain)

$\mu_t = \bar{\mu}_t + K_t(z_t - \hat{z}_t)$  (Updated mean)

$\Sigma_t = (I - K_t H_t)\,\bar{\Sigma}_t$  (Updated covariance)
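A sketch of this correction step; the closed-form H follows from differentiating ẑ_t, and the angle wrapping of the innovation is an extra safeguard not shown above:

```python
import numpy as np

def expected_measurement(mu_bar, landmark):
    """Predicted range and bearing of landmark m = (m_x, m_y) from pose mu_bar."""
    dx, dy = landmark[0] - mu_bar[0], landmark[1] - mu_bar[1]
    q = dx**2 + dy**2
    z_hat = np.array([np.sqrt(q), np.arctan2(dy, dx) - mu_bar[2]])
    # Jacobian of h w.r.t. the pose (x, y, theta)
    H = np.array([[-dx / np.sqrt(q), -dy / np.sqrt(q), 0.0],
                  [ dy / q,          -dx / q,         -1.0]])
    return z_hat, H

def ekf_loc_correct(mu_bar, sigma_bar, z, landmark, Q):
    z_hat, H = expected_measurement(mu_bar, landmark)
    S = H @ sigma_bar @ H.T + Q
    K = sigma_bar @ H.T @ np.linalg.inv(S)
    innovation = np.asarray(z, dtype=float) - z_hat
    innovation[1] = np.arctan2(np.sin(innovation[1]), np.cos(innovation[1]))  # wrap bearing
    mu = mu_bar + K @ innovation
    sigma = (np.eye(3) - K @ H) @ sigma_bar
    return mu, sigma
```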

EKF Prediction Step

EKF Observation Prediction Step

EKF Correction Step

Estimation Sequence (1)

Estimation Sequence (2)

Comparison to Ground Truth

EKF Summary

• Highly efficient: polynomial in measurement dimensionality $k$ and state dimensionality $n$: $O(k^{2.376} + n^2)$

• Not optimal!

• Can diverge if nonlinearities are large!

• Works surprisingly well even when all assumptions are violated!

Linearization via Unscented Transform

(figure: EKF vs. UKF)

UKF Sigma-Point Estimate (2)

(figure: EKF vs. UKF)

UKF Sigma-Point Estimate (3)

(figure: EKF vs. UKF)

Unscented Transform

Sigma points:

$\chi^0 = \mu$

$\chi^i = \mu + \left(\sqrt{(n+\lambda)\Sigma}\right)_i$  for $i = 1, \ldots, n$

$\chi^i = \mu - \left(\sqrt{(n+\lambda)\Sigma}\right)_{i-n}$  for $i = n+1, \ldots, 2n$

Weights:

$w_m^0 = \dfrac{\lambda}{n+\lambda}, \qquad w_c^0 = w_m^0 + (1 - \alpha^2 + \beta)$

$w_m^i = w_c^i = \dfrac{1}{2(n+\lambda)}$  for $i = 1, \ldots, 2n$

Pass sigma points through the nonlinear function:

$\mathcal{Y}^i = g(\chi^i)$

Recover mean and covariance:

$\mu' = \sum_{i=0}^{2n} w_m^i\, \mathcal{Y}^i, \qquad \Sigma' = \sum_{i=0}^{2n} w_c^i\, (\mathcal{Y}^i - \mu')(\mathcal{Y}^i - \mu')^T$
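A sketch of the sigma-point construction and the recovery step, assuming NumPy, the usual scaling parameters α, β, κ, and a Cholesky factor as the matrix square root:

```python
import numpy as np

def sigma_points(mu, sigma, alpha=1e-3, beta=2.0, kappa=0.0):
    """Generate the 2n+1 sigma points and their mean/covariance weights."""
    mu = np.asarray(mu, dtype=float)
    n = len(mu)
    lam = alpha**2 * (n + kappa) - n
    L = np.linalg.cholesky((n + lam) * sigma)        # one choice of matrix square root
    chi = np.vstack([mu, mu + L.T, mu - L.T])        # rows are the sigma points
    wm = np.full(2 * n + 1, 1.0 / (2 * (n + lam)))
    wc = wm.copy()
    wm[0] = lam / (n + lam)
    wc[0] = lam / (n + lam) + (1 - alpha**2 + beta)
    return chi, wm, wc

def unscented_transform(chi, wm, wc, g):
    """Pass sigma points through g, then recover mean and covariance."""
    y = np.array([g(x) for x in chi])
    mu = wm @ y
    diff = y - mu
    sigma = (wc[:, None] * diff).T @ diff
    return mu, sigma
```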

UKF_localization($\mu_{t-1}, \Sigma_{t-1}, u_t, z_t, m$):

Prediction:

$M_t = \begin{pmatrix} (\alpha_1 |v_t| + \alpha_2 |\omega_t|)^2 & 0 \\ 0 & (\alpha_3 |v_t| + \alpha_4 |\omega_t|)^2 \end{pmatrix}$  (Motion noise)

$Q_t = \begin{pmatrix} \sigma_r^2 & 0 \\ 0 & \sigma_\varphi^2 \end{pmatrix}$  (Measurement noise)

$\mu_{t-1}^a = \begin{pmatrix} \mu_{t-1}^T & 0^T & 0^T \end{pmatrix}^T$  (Augmented state mean)

$\Sigma_{t-1}^a = \begin{pmatrix} \Sigma_{t-1} & 0 & 0 \\ 0 & M_t & 0 \\ 0 & 0 & Q_t \end{pmatrix}$  (Augmented covariance)

$\chi_{t-1}^a = \begin{pmatrix} \mu_{t-1}^a & \ \mu_{t-1}^a + \gamma\sqrt{\Sigma_{t-1}^a} & \ \mu_{t-1}^a - \gamma\sqrt{\Sigma_{t-1}^a} \end{pmatrix}$  (Sigma points)

$\bar{\chi}_t^x = g(u_t + \chi_t^u,\ \chi_{t-1}^x)$  (Prediction of sigma points)

$\bar{\mu}_t = \sum_{i=0}^{2L} w_m^i\, \bar{\chi}_{i,t}^x$  (Predicted mean)

$\bar{\Sigma}_t = \sum_{i=0}^{2L} w_c^i\, (\bar{\chi}_{i,t}^x - \bar{\mu}_t)(\bar{\chi}_{i,t}^x - \bar{\mu}_t)^T$  (Predicted covariance)

UKF_localization($\mu_{t-1}, \Sigma_{t-1}, u_t, z_t, m$):

Correction:

$\bar{Z}_t = h(\bar{\chi}_t^x) + \chi_t^z$  (Measurement sigma points)

$\hat{z}_t = \sum_{i=0}^{2L} w_m^i\, \bar{Z}_{i,t}$  (Predicted measurement mean)

$S_t = \sum_{i=0}^{2L} w_c^i\, (\bar{Z}_{i,t} - \hat{z}_t)(\bar{Z}_{i,t} - \hat{z}_t)^T$  (Pred. measurement covariance)

$\Sigma_t^{x,z} = \sum_{i=0}^{2L} w_c^i\, (\bar{\chi}_{i,t}^x - \bar{\mu}_t)(\bar{Z}_{i,t} - \hat{z}_t)^T$  (Cross-covariance)

$K_t = \Sigma_t^{x,z}\, S_t^{-1}$  (Kalman gain)

$\mu_t = \bar{\mu}_t + K_t(z_t - \hat{z}_t)$  (Updated mean)

$\Sigma_t = \bar{\Sigma}_t - K_t S_t K_t^T$  (Updated covariance)
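A sketch of this correction given predicted sigma points and weights (e.g., from the unscented_transform sketch above); unlike the augmented form above, it adds Q_t to S_t directly, which is a simplifying assumption of the sketch:

```python
import numpy as np

def ukf_correct(mu_bar, sigma_bar, chi_bar, wm, wc, z, h, Q):
    """UKF correction: K_t = Sigma^{x,z} S^{-1}, Sigma_t = Sigma_bar - K S K^T."""
    chi_bar = np.asarray(chi_bar, dtype=float)
    Z = np.array([h(x) for x in chi_bar])        # measurement sigma points
    z_hat = wm @ Z                               # predicted measurement mean
    dz = Z - z_hat
    dx = chi_bar - mu_bar
    S = (wc[:, None] * dz).T @ dz + Q            # predicted measurement covariance
    sigma_xz = (wc[:, None] * dx).T @ dz         # cross-covariance
    K = sigma_xz @ np.linalg.inv(S)              # Kalman gain
    mu = mu_bar + K @ (np.asarray(z) - z_hat)    # updated mean
    sigma = sigma_bar - K @ S @ K.T              # updated covariance
    return mu, sigma
```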

EKF_localization($\mu_{t-1}, \Sigma_{t-1}, u_t, z_t, m$): Correction (the same EKF correction step shown earlier, repeated here alongside the UKF).

UKF Prediction Step

UKF Observation Prediction Step

UKF Correction Step

EKF Correction Step

Estimation Sequence

(figure: EKF vs. PF vs. UKF)

Estimation Sequence

(figure: EKF vs. UKF)

Prediction Quality

(figure: EKF vs. UKF)

UKF Summary

• Highly efficient: same asymptotic complexity as the EKF, typically only a constant factor slower in practical applications

• Better linearization than the EKF: accurate in the first two terms of the Taylor expansion (the EKF captures only the first term)

• Derivative-free: no Jacobians needed

• Still not optimal!

Kalman Filter-based System

• [Arras et al. 98]:
  • Laser range-finder and vision
  • High precision (<1 cm accuracy)

[Courtesy of Kai Arras]

Multi-hypothesis Tracking