Systems and Control Theory Lecture Notes
Laura Giarré
Lesson 9: Stochastic processes and systems
Stochastic Processes
Stochastic Systems
Ergodic Processes
State representation of a stochastic dynamical system
Stochastic Processes 1
Def: A stochastic process is a sequence of random variables with a joint pdf.
A random variable (r.v.) is a mathematical tool, based on probability theory, for describing phenomena whose manifestations are governed by random mechanisms.
The event space Ω is the set in which the random phenomenon assumes its realizations: an event is a subset of that space.
If the event space is I, the corresponding r.v. is called integer (roll of a die, extraction of a number from an urn, the number of customers, ...).
If the event space is R, the corresponding r.v. is called real (quantization error, measurement instrument error, mechanical structural vibration, ...).
Stochastic Processes 2
We need to describe not one particular signal x(t), but the whole set of signals {x(t)} produced by certain phenomena.
A stochastic process X is a set of random variables indexed by a temporal index t ∈ T:

X(t) : Ω × T → R,    X = {X(t), t ∈ T}

where Ω is the event space and T the time domain.
Mean: m_x(t) = E[x(t)] = ∫_{−∞}^{+∞} x f_X(x; t) dx
Correlation: R_x(t_1, t_2) = E[x(t_1) x(t_2)]
Covariance: C_x(t_1, t_2) = E[(x(t_1) − m_x(t_1))(x(t_2) − m_x(t_2))^T]
Variance: Var(x(t)) = R_x(t, t) − E[x(t)]²
Stochastic Processes: Stationarity
Traditional definitions:
A stochastic process is wide-sense stationary (WSS) if
m_x(t) = m = constant
R_x(t_1, t_2) = R_x(t_1 − t_2)
This may be a limiting definition; we will discuss this shortly.
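A minimal numerical sketch (with a hypothetical AR(1) coefficient a = 0.8 and zero-mean Gaussian driving noise): for a WSS process, time-average estimates of the mean and of R_x(τ) should be constant in t and depend only on the lag τ.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical WSS example: AR(1) process x(t) = a*x(t-1) + e(t),
# with e(t) zero-mean white Gaussian noise.
a, N = 0.8, 200_000
e = rng.standard_normal(N)
x = np.zeros(N)
for t in range(1, N):
    x[t] = a * x[t - 1] + e[t]

# Sample mean: approximately constant (theoretical value 0) for this process.
m_hat = x.mean()

# Sample autocorrelation R_x(tau) = E[x(t) x(t - tau)] via time averaging;
# for this AR(1) process the theory gives R_x(tau) = a^|tau| / (1 - a^2).
def R_hat(tau):
    return np.mean(x[tau:] * x[:N - tau]) if tau > 0 else np.mean(x * x)

R0_theory = 1.0 / (1.0 - a**2)
```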
Common Framework for deterministic and stochastic systems
A typical setup: a deterministic experiment (input) is combined with stochastic noise, producing a mixed output whose mean is not constant (we lose stationarity).
Quasi-stationary
Consider signals with the following assumptions:
1) E[s(t)] = m_s(t), with |m_s(t)| ≤ C ∀t
2) E[s(t)s(r)] = R_s(t, r), with |R_s(t, r)| ≤ C ∀t, r
Then, if
lim_{N→∞} (1/N) Σ_{t=1}^{N} R_s(t, t − τ) = R_s(τ) ∀τ,
s(t) is called quasi-stationary.
If s(t) is a stationary process, then it satisfies 1) and 2) trivially.
If s(t) is a deterministic signal:
1) |s(t)| ≤ C
2) lim_{N→∞} (1/N) Σ_{t=1}^{N} s(t)s(t − τ) = R_s(τ)
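For a deterministic signal, the time-average limit in 2) can be checked numerically; the cosine signal and its frequency below are hypothetical choices.

```python
import numpy as np

# Hypothetical deterministic signal s(t) = cos(w*t): it is bounded
# (condition 1), and its time-averaged correlation converges (condition 2),
# so it is quasi-stationary.
w, N = 0.3, 1_000_000
t = np.arange(1, N + 1)
s = np.cos(w * t)

def R_s_hat(tau):
    # time average (1/N) * sum_t s(t) s(t - tau)
    return np.mean(s[tau:] * s[:N - tau]) if tau > 0 else np.mean(s * s)

# The limit is R_s(tau) = 0.5 * cos(w * tau).
```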
Notations
In general: s(t) = x(t) (stochastic) + u(t) (deterministic)
Ē[·] = lim_{N→∞} (1/N) Σ_{t=1}^{N} E[·]
Quasi-stationarity (or stationarity in a weak sense):
Ē[s(t)] = m_s = constant
Ē[s(t)s(τ)] = R_s(t − τ)
R_x(0) is the variance and is constant.
For s(t) = x(t) + u(t):
Ē[s(t)] = m_x + m_u
Ē[s(t)s(t − τ)] = R_x(τ) + R_u(τ) + 2 m_x m_u
Ergodicity 1
[Figure: a stochastic process shown as a family of sample functions (realizations), together with the sample mean.]
Ergodicity 2
A process is 2nd-order ergodic if:
the mean Ē[x(t)] equals the sample mean of any realization;
the covariance Ē[x(t)x(t − τ)] equals the sample covariance of any realization.
Sample Averages ≅ Ensemble Averages
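A Monte Carlo sketch of this equivalence (all numbers below are hypothetical choices): for an ergodic AR(1) process, the time average over one long realization agrees with the ensemble average over many independent realizations.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical ergodic process: zero-mean AR(1) with coefficient a.
a, N, n_runs = 0.5, 10_000, 500
e = rng.standard_normal((n_runs, N))
x = np.zeros((n_runs, N))
for t in range(1, N):
    x[:, t] = a * x[:, t - 1] + e[:, t]

time_avg = x[0].mean()          # sample (time) average of ONE realization
ensemble_avg = x[:, -1].mean()  # ensemble average at a fixed time
# For an ergodic process both estimates converge to the true mean (0 here).
```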
A General Ergodic Process
[Block diagram: a quasi-stationary signal passes through a uniformly stable filter and is summed with a WN signal to give s(t).]
Ē[s(t)s(t − τ)] = R_s(τ) w.p. 1
(1/N) Σ_{t=1}^{N} [s(t)m(t − τ) − E[s(t)m(t − τ)]] → 0 w.p. 1
(1/N) Σ_{t=1}^{N} [s(t)v(t − τ) − E[s(t)v(t − τ)]] → 0 w.p. 1
Remark: all of our computations will depend on a given realization of a quasi-stationary process. Ergodicity will allow us to make statements about repeated experiments.
State representation of a stochastic dynamical system
x(t + 1) = A x(t) + B u(t) + w(t)
y(t) = C x(t) + D u(t) + v(t)
where w(t) is called the process disturbance and v(t) the measurement disturbance.
w(t), v(t) and x(0) are stochastic processes, and the following assumptions are often made:
w(k) ∼ WN(0, Q)
v(k) ∼ WN(0, R)
x(0) ∼ (m_0, P_0)
x(0), w(l), v(j) are uncorrelated ∀ l, j ≥ 0
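A minimal simulation sketch of this model (all matrices and covariances below are hypothetical example values, with m_0 = 0 and P_0 = 0):

```python
import numpy as np

rng = np.random.default_rng(2)

# x(t+1) = A x(t) + B u(t) + w(t),  y(t) = C x(t) + D u(t) + v(t)
# w ~ WN(0, Q), v ~ WN(0, R); all numerical values are hypothetical.
A = np.array([[0.9, 0.1],
              [0.0, 0.8]])
B = np.array([[0.0], [1.0]])
C = np.array([[1.0, 0.0]])
D = np.array([[0.0]])
Q = 0.01 * np.eye(2)   # process-disturbance covariance
R = 0.04 * np.eye(1)   # measurement-disturbance covariance

N = 200
u = np.ones((N, 1))       # hypothetical step input
x = np.zeros((N + 1, 2))  # x(0) = 0, i.e. m_0 = 0, P_0 = 0
y = np.zeros((N, 1))
for t in range(N):
    w = rng.multivariate_normal(np.zeros(2), Q)  # process disturbance
    v = rng.multivariate_normal(np.zeros(1), R)  # measurement disturbance
    x[t + 1] = A @ x[t] + B @ u[t] + w
    y[t] = C @ x[t] + D @ u[t] + v
```

For this stable A and a unit step input, the output settles near its steady-state value (5 with these example matrices) up to the disturbance-induced fluctuations.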
Uncorrelation and Independence: Normal distribution
Let f_{1,2}(x_1, x_2) be a Gaussian (normal) bivariate probability density, where the joint pdf is

f_{1,2}(x_1, x_2) = 1/(2π √det(R)) exp{ −(1/2) (x − m)^T R^{−1} (x − m) }

with x = [x_1 x_2]^T a random variable and m = [m_1 m_2]^T the vector of means.
The variance matrix R is

R = [ σ_1²       ρσ_1σ_2 ]
    [ ρσ_1σ_2   σ_2²     ]

where σ_i² is the variance of x_i and −1 < ρ < 1.
Uncorrelation and Independence: Normal distribution
We denote such a distribution by x ∼ N(m, R):

∫_{−∞}^{+∞} f_{1,2}(x_1, x_2) dx_1 = f_2(x_2)
∫_{−∞}^{+∞} f_{1,2}(x_1, x_2) dx_2 = f_1(x_1)
E[x] = m
E[(x − m)(x − m)^T] = R

If ρ = 0, then f_{1,2}(x_1, x_2) = f_1(x_1) f_2(x_2).
Hence two Gaussian random variables that are uncorrelated are also independent.
ρ is called the correlation coefficient.
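A numerical sketch of this factorization (the means, standard deviations, and evaluation point below are hypothetical):

```python
import numpy as np

# Bivariate Gaussian joint pdf and its two Gaussian marginals.
def joint_pdf(x1, x2, m, s1, s2, rho):
    R = np.array([[s1**2,         rho * s1 * s2],
                  [rho * s1 * s2, s2**2]])
    d = np.array([x1, x2]) - m
    return np.exp(-0.5 * d @ np.linalg.inv(R) @ d) / (2 * np.pi * np.sqrt(np.linalg.det(R)))

def marginal_pdf(x, mi, si):
    return np.exp(-0.5 * ((x - mi) / si) ** 2) / (np.sqrt(2 * np.pi) * si)

m = np.array([1.0, -2.0])  # hypothetical means
s1, s2 = 0.5, 2.0          # hypothetical standard deviations

# With rho = 0 the joint pdf equals the product of the marginals
# (uncorrelated Gaussians are independent); with rho != 0 it does not.
joint = joint_pdf(0.7, -1.0, m, s1, s2, rho=0.0)
product = marginal_pdf(0.7, m[0], s1) * marginal_pdf(-1.0, m[1], s2)
```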
Stationary SPs
For a vector of n stationary stochastic processes X(t) = [X_1(t) X_2(t) . . . X_n(t)]^T ∈ R^n:
Covariance function:
R_X(τ) = E[(X(t + τ) − m_X)(X(t) − m_X)^T]
Cross-covariance function of two SPs X(t) ∈ R^n and Y(t) ∈ R^m:
R_XY(τ) = E[(X(t + τ) − m_X)(Y(t) − m_Y)^T]
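A quick numerical sketch (hypothetical zero-mean scalar signals): if X(t) lags Y(t) by d samples, the estimated cross-covariance peaks at τ = d.

```python
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical zero-mean signals: X(t) = e(t), Y(t) = e(t + d), so that
# X(t + tau) and Y(t) coincide when tau = d and R_XY(tau) peaks there.
N, d = 200_000, 3
e = rng.standard_normal(N + d)
X = e[:N]
Y = e[d:d + N]

def Rxy_hat(tau):
    # time-average estimate of R_XY(tau) = E[X(t + tau) Y(t)] (means are 0)
    M = N - tau
    return np.mean(X[tau:tau + M] * Y[:M])
```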
White Process
A SP X(t) is said to be WHITE if the variables X(t_i), i ∈ Z, are all independent.
Its covariance function is given by

R_X(t, s) = R_X(t, t) if t = s,  0 if t ≠ s

If m_X(t) = m_X and R_X(t, t) = σ_X², then the white process is stationary and it is denoted by WN(m_X, P_X = σ_X² I).
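A sketch of the stationary white-noise case (the mean and variance below are hypothetical): the sample covariance is ≈ σ_X² at lag 0 and ≈ 0 at every other lag.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical stationary white noise WN(m, sigma^2): independent samples.
m, sigma, N = 1.0, 2.0, 500_000
x = rng.normal(m, sigma, N)
xc = x - x.mean()  # remove the (constant) mean

def cov_hat(tau):
    # time-average estimate of the covariance at lag tau
    return np.mean(xc[tau:] * xc[:N - tau]) if tau > 0 else np.mean(xc * xc)
```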