
CHAPTER 3. RANDOM OBJECTS

3.18. PROBLEMS

[The Chapter 3 problem statements were garbled beyond recovery in extraction; only scattered fragments survive, among them references to additive noise, chaotic maps, and problem [3.26].]

Chapter 4

Expectation and Averages

4.1

Averages

[Introductory prose garbled in extraction. This section motivates expectations via long-run averages and previews the laws of large numbers and ergodic theorems.]

[Discussion garbled in extraction; the recoverable argument is as follows.] Consider n repeated trials producing outcomes X_0, X_1, ..., X_{n-1} taking values in a discrete alphabet A, for example counting heads in repeated coin flips. The relative frequency of a value a in the first n samples is

    r_a(n) = (1/n) sum_{i=0}^{n-1} 1(X_i = a),

where 1(.) is the indicator function. The sample (arithmetic) average of the observations can then be written as

    (1/n) sum_{i=0}^{n-1} X_i = sum_{a in A} a r_a(n).

If the relative frequencies converge to the corresponding probabilities p_X(a), then the sample average converges to sum_{a in A} a p_X(a), which motivates the definition of expectation.
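The relative-frequency identity above is easy to check numerically. This is a minimal sketch (the alphabet and probabilities are illustrative, not from the text): the sample average equals the frequency-weighted sum over the alphabet exactly, and both approach the expectation for large n.

```python
import random

random.seed(0)
alphabet = [0, 1, 2]
probs = [0.5, 0.3, 0.2]
n = 100_000
samples = random.choices(alphabet, weights=probs, k=n)

# relative frequency r_a(n) of each letter a
rel_freq = {a: samples.count(a) / n for a in alphabet}

sample_average = sum(samples) / n
freq_weighted = sum(a * rel_freq[a] for a in alphabet)     # sum_a a * r_a(n)
expectation = sum(a * p for a, p in zip(alphabet, probs))  # sum_a a * p_X(a)
```

The first identity holds exactly for any realization; the match with the expectation improves as n grows.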

Making this convergence of relative frequencies and sample averages precise is the subject of the laws of large numbers and ergodic theorems developed later in this chapter.

4.2

Expectation

Given a discrete random variable X with pmf p_X, the expected value (also called the probabilistic average, mean, or ensemble average) of X is defined by

    EX = sum_{x in A} x p_X(x).

If X is a continuous random variable with pdf f_X, the expectation is defined by

    EX = integral x f_X(x) dx.

4.2.1

Examples: Expectation

[Examples [4.1] through [4.4] were garbled in extraction. They evaluate the expectation directly from the definition for standard discrete and continuous distributions.]

More generally, the kth moment of X is E[X^k]; the variance is sigma_X^2 = E[(X - EX)^2]; and the characteristic function is M_X(ju) = E[e^{juX}]. Each of these is the expectation of a function g(X) of the random variable, and each can be computed either from the distribution of the derived random variable Y = g(X) or directly from the distribution of X. That the two routes agree is the content of the fundamental theorem of expectation.

Theorem 4.1 The Fundamental Theorem of Expectation.

Let a random variable X be described by a cdf F_X, which is in turn described by either a pmf p_X or a pdf f_X. Given any measurable function g, the resulting random variable g(X) has expectation

    E[g(X)] = sum_x g(x) p_X(x)

in the discrete case, or

    E[g(X)] = integral g(x) f_X(x) dx

in the continuous case.
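The two routes of the fundamental theorem can be compared exactly for a small discrete example. This is a minimal sketch with a hypothetical pmf (the numbers are illustrative, not from the text), using exact rational arithmetic.

```python
from fractions import Fraction as F

# pmf of X (hypothetical example) and a function g
p_X = {-1: F(1, 4), 0: F(1, 4), 2: F(1, 2)}
g = lambda x: x * x

# Direct route: E[g(X)] = sum_x g(x) p_X(x) (fundamental theorem)
E_gX = sum(g(x) * p for x, p in p_X.items())

# Indirect route: derive the pmf of Y = g(X), then average y against it
p_Y = {}
for x, p in p_X.items():
    p_Y[g(x)] = p_Y.get(g(x), F(0)) + p
E_Y = sum(y * p for y, p in p_Y.items())
```

Both routes give the same number, here 9/4.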

Taking g(x) = e^{jux} gives the characteristic function,

    M_X(ju) = E[e^{juX}] = sum_x e^{jux} p_X(x)   or   integral e^{jux} f_X(x) dx.

Differentiating k times with respect to u and setting u = 0 recovers the moments:

    E[X^k] = (-j)^k (d^k/du^k) M_X(ju) evaluated at u = 0.

Replacing the imaginary argument ju by a real parameter yields the closely related moment-generating functions. [Worked examples in this span were garbled in extraction.]

4.3

Functions of Several Random Variables

Theorem 4.2 Fundamental Theorem of Expectation for Functions of Sev-
eral Random Variables

Given random variables X_0, X_1, ..., X_{k-1} described by a cdf F_{X_0,X_1,...,X_{k-1}}, and given a measurable function g of k variables, the expectation of g(X_0, ..., X_{k-1}) is

    E[g(X_0, ..., X_{k-1})] = sum_{x_0,...,x_{k-1}} g(x_0, ..., x_{k-1}) p_{X_0,...,X_{k-1}}(x_0, ..., x_{k-1})

in the discrete case, or the corresponding k-dimensional integral against the joint pdf in the continuous case.

4.4

Properties of Expectation

Property 1. If Pr(X >= 0) = 1, then EX >= 0.

Property 2. If X equals a constant r with probability 1, then EX = r.

Property 3. Expectation is linear: E[aX + bY] = a EX + b EY. The proof expands the double sum sum_{x,y} (ax + by) p_{X,Y}(x,y) and collapses each term to a marginal sum over p_X or p_Y.

Property 4. If X_n >= 0 and X_n converges upward to X, then lim_n E[X_n] = E[X] (the monotone convergence theorem).

[Proof details in this span were garbled in extraction.]

4.5

Examples: Functions of Several Random Variables

4.5.1

Correlation

The correlation of two random variables X and Y is E[XY]. For discrete random variables,

    E[XY] = sum_{x,y} x y p_{X,Y}(x, y).

If X and Y are independent, the joint pmf factors as p_X(x) p_Y(y), the double sum separates, and E[XY] = (EX)(EY).

Lemma 4.1 For any two independent random variables X and Y,

    E[g(X) h(Y)] = E[g(X)] E[h(Y)]

for all functions g and h with finite expectation.

Two random variables X and Y are said to be uncorrelated if E[XY] = (EX)(EY). Independence implies that g(X) and h(Y) are uncorrelated for all functions g and h, whereas uncorrelatedness of X and Y by itself is only a statement about their linear relationship. [Intervening examples garbled in extraction.]

Theorem 4.3 Two random variables X and Y are independent if and only if g(X) and h(Y) are uncorrelated for all functions g and h with finite expectations, that is, if (4.18) holds. More generally, random variables X_i are mutually independent if and only if for all functions g_i the random variables g_i(X_i) are uncorrelated.

Corollary 4.1 Given a sequence of mutually independent random variables X_n, define Y_n = sum_{i=0}^{n-1} X_i. Then

    M_{Y_n}(ju) = prod_{i=0}^{n-1} M_{X_i}(ju).

Proof.

    M_{Y_n}(ju) = E[e^{juY_n}] = E[exp(ju sum_{i=0}^{n-1} X_i)]
                = E[prod_{i=0}^{n-1} e^{juX_i}]
                = prod_{i=0}^{n-1} E[e^{juX_i}] = prod_{i=0}^{n-1} M_{X_i}(ju),

where the expectation of the product factors because the X_i are mutually independent.
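The factorization in Corollary 4.1 can be checked by simulation. This is a minimal sketch under illustrative assumptions (a uniform plus an independent Gaussian term, both invented for the example): the empirical characteristic function of the sum should match the product of the individual ones.

```python
import cmath
import random

random.seed(1)
u = 0.7           # evaluation point of the characteristic function
n_trials = 100_000

def char_fn(samples, u):
    # Monte Carlo estimate of M_X(ju) = E[exp(juX)]
    return sum(cmath.exp(1j * u * x) for x in samples) / len(samples)

# two independent sequences (illustrative distributions)
X = [random.uniform(-1, 1) for _ in range(n_trials)]
Y = [random.gauss(0, 1) for _ in range(n_trials)]
S = [x + y for x, y in zip(X, Y)]

M_sum = char_fn(S, u)
M_prod = char_fn(X, u) * char_fn(Y, u)
```

For dependent summands the factorization would generally fail; independence is what lets the expectation of the product split.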

4.5.2

Covariance

The covariance of X and Y is

    COV(X, Y) = E[(X - EX)(Y - EY)] = E[XY] - (EX)(EY),

so covariance measures correlation after the means (constants) are removed.

Corollary 4.2 Two random variables X and Y are uncorrelated if and only if their covariance is zero; that is, if COV(X, Y) = 0.

The second moment of X is E[X^2]; the variance is

    sigma_X^2 = COV(X, X) = E[X^2] - (EX)^2,

and its nonnegative square root sigma_X is the standard deviation. The Cauchy-Schwarz inequality bounds the correlation:

    |E[XY]| <= (E[X^2] E[Y^2])^{1/2}.

4.5.3

Covariance Matrices

For a random vector X = (X_0, ..., X_{n-1})^t, the mean vector is

    m = E[X] = (EX_0, ..., EX_{n-1})^t,

with components m_k = E[X_k], and the covariance matrix is

    K_X = E[(X - m)(X - m)^t],

with entries K_X(k, l) = E[(X_k - m_k)(X_l - m_l)]. For an n-dimensional Gaussian vector the pdf is

    f_X(x) = (2 pi)^{-n/2} (det K)^{-1/2} exp( -(1/2) (x - m)^t K^{-1} (x - m) ).
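Mean vectors and covariance matrices are straightforward to estimate from samples. A minimal sketch under illustrative assumptions (the mixing matrix A is invented for the example): constructing X = A W from iid standard normals gives a known true covariance K = A A^t to compare against.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000

# X = A W with W iid standard normal, so the true covariance is K = A A^T
A = np.array([[1.0, 0.0],
              [0.5, 2.0]])
W = rng.standard_normal((2, n))
X = A @ W

m_hat = X.mean(axis=1)   # sample mean vector (true mean is 0)
K_hat = np.cov(X)        # sample covariance matrix
K_true = A @ A.T
```

The sample covariance converges to A A^t at the usual 1/sqrt(n) rate.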

4.5.4

Multivariable Characteristic Functions

The characteristic function of a random vector X is

    M_X(ju) = E[e^{j u^t X}],

and for a Gaussian vector with mean m and covariance K it evaluates to

    M_X(ju) = exp( j u^t m - (1/2) u^t K u ).

[The intervening expansion was garbled in extraction.]

Theorem 4.4 Let X be an n-dimensional Gaussian random vector with mean m_X and covariance matrix K_X. Let Y be the new random vector formed by a linear operation on X:

    Y = HX + b,

where H is an m x n matrix and b is an m-dimensional vector. Then Y is a Gaussian random vector of dimension m with mean

    m_Y = H m_X + b

and covariance matrix

    K_Y = H K_X H^t.

Proof.

    M_Y(ju) = E[e^{j u^t Y}] = E[e^{j u^t (HX + b)}]
            = e^{j u^t b} E[e^{j (H^t u)^t X}]
            = e^{j u^t b} M_X(j H^t u)
            = exp( j u^t b + j u^t H m_X - (1/2) u^t H K_X H^t u ),

which is the characteristic function of a Gaussian vector with mean H m_X + b and covariance H K_X H^t.
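The moment transformations of Theorem 4.4 can be verified by sampling. A minimal sketch, with m_X, K_X, H, and b chosen for illustration only:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 300_000

# X Gaussian with chosen mean and covariance (illustrative numbers)
m_X = np.array([1.0, -2.0])
K_X = np.array([[2.0, 0.3],
                [0.3, 1.0]])
X = rng.multivariate_normal(m_X, K_X, size=n).T

# Y = H X + b, as in Theorem 4.4
H = np.array([[1.0, 1.0],
              [0.0, 2.0]])
b = np.array([0.5, 0.0])
Y = H @ X + b[:, None]

m_Y_pred = H @ m_X + b        # predicted mean
K_Y_pred = H @ K_X @ H.T      # predicted covariance
```

The empirical mean and covariance of Y should match the predictions to within sampling error.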

4.5.5

Example: Differential Entropy of a Gaussian Vector

The differential entropy of a random vector X = (X_0, X_1, ..., X_{n-1}) with pdf f_X is

    h(X) = - integral f_X(x) log f_X(x) dx.

Substituting the n-dimensional Gaussian pdf and using E[(X - m)^t K^{-1} (X - m)] = n yields

    h(X) = (n/2) log(2 pi e) + (1/2) log det K.

[The intermediate algebra was garbled in extraction.]

4.6

Conditional Expectation

Given discrete random variables X and Y with conditional pmf p_{Y|X}, the conditional expectation of Y given X = x is

    E[Y | X = x] = sum_{y in A_Y} y p_{Y|X}(y | x).

Viewed as a function of the random variable X, E[Y | X] is itself a random variable, and averaging it over X recovers the unconditional expectation:

    E[ E[Y | X] ] = sum_{x in A_X} p_X(x) sum_{y in A_Y} y p_{Y|X}(y | x) = sum_y y p_Y(y) = EY.

This device is called iterated expectation or nested expectation. [A worked example in this span was garbled in extraction.]

Lemma 4.2 General Iterated Expectation

Suppose the X_i are discrete random variables and that g and h are functions of these random variables. Then the overall expectation can be computed in stages, conditioning first:

    E[g] = E[ E[g | h] ].

Proof: Writing the expectation as a double sum and factoring the joint pmf,

    sum_{x,y} g p_{X,Y}(x, y) = sum_x p_X(x) sum_y g p_{Y|X}(y | x)
                              = sum_x p_X(x) E[g | X = x] = E[ E[g | X] ].
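Iterated expectation can be checked exactly on a small joint pmf. A minimal sketch with a hypothetical joint pmf (the numbers are illustrative), in exact rational arithmetic:

```python
from fractions import Fraction as F

# Joint pmf p_{X,Y}(x, y) (hypothetical example)
p_XY = {(0, 0): F(1, 8), (0, 1): F(3, 8),
        (1, 0): F(1, 4), (1, 1): F(1, 4)}

# marginal pmf of X
p_X = {}
for (x, y), p in p_XY.items():
    p_X[x] = p_X.get(x, F(0)) + p

def cond_exp_Y(x):
    # E[Y | X = x] = sum_y y p_{X,Y}(x, y) / p_X(x)
    return sum(y * p for (xx, y), p in p_XY.items() if xx == x) / p_X[x]

E_Y_direct = sum(y * p for (x, y), p in p_XY.items())
E_Y_iterated = sum(cond_exp_Y(x) * p_X[x] for x in p_X)  # E[ E[Y|X] ]
```

Both computations give 5/8 for this pmf.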

4.7

 Jointly Gaussian Vectors

Random vectors X and Y are jointly Gaussian if the stacked vector U = (X^t, Y^t)^t is Gaussian. Its mean and covariance decompose in block form as

    m_U = (m_X^t, m_Y^t)^t,

    K_U = [ K_X     K_{XY} ]
          [ K_{YX}  K_Y    ],

where K_{XY} = E[(X - m_X)(Y - m_Y)^t] = K_{YX}^t is the cross-covariance. Completing the square in the joint pdf shows that, given X = x, the conditional distribution of Y is Gaussian with mean

    E[Y | X = x] = m_Y + K_{YX} K_X^{-1} (x - m_X)

and covariance

    K_{Y|X} = K_Y - K_{YX} K_X^{-1} K_{XY},

which notably does not depend on x. [The block-matrix algebra in this span was garbled in extraction.]

4.8

Expectation as Estimation

The expectation has an interpretation as the best estimate of a random variable in the sense of minimizing the mean squared error (MSE): for any constant a,

    E[(X - a)^2] = sigma_X^2 + (EX - a)^2 >= sigma_X^2,

with equality if and only if a = EX. Thus the mean of a random variable is the minimum mean squared error estimate (MMSE) of the value of a random variable in the absence of any a priori information. When an observation is available, conditioning plays the same role.

Theorem 4.5 Given two random vectors X and Y, the minimum mean squared error estimate of Y given X is

    Yhat(X) = E[Y | X].

Proof: For any estimator Yhat(X), write Y - Yhat(X) = (Y - E[Y|X]) + (E[Y|X] - Yhat(X)). The cross term has zero expectation by iterated expectation, so

    E[|Y - Yhat(X)|^2] = E[|Y - E[Y|X]|^2] + E[|E[Y|X] - Yhat(X)|^2],

which is minimized by choosing Yhat(X) = E[Y|X]. [The remaining algebra in this span was garbled in extraction.]
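Theorem 4.5 can be illustrated with a simple signal-plus-noise model (the model Y = 2X + N is invented for the example; for it, E[Y | X] = 2X exactly). The conditional mean should beat any competing estimator in empirical MSE:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 200_000

X = rng.standard_normal(n)
N = rng.standard_normal(n)   # noise independent of X
Y = 2.0 * X + N              # so E[Y | X] = 2X

mse = lambda est: np.mean((Y - est) ** 2)

mse_cond_mean = mse(2.0 * X)       # the conditional mean estimator
mse_biased = mse(2.0 * X + 0.5)    # competing estimators for comparison
mse_wrong_gain = mse(1.5 * X)
```

The conditional mean's MSE is the residual noise variance (here 1), and every alternative does worse.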

[A worked Gaussian example of MMSE estimation in this span was garbled in extraction; it specializes Theorem 4.5 using the jointly Gaussian conditional mean of Section 4.7.]
4.9

 Implications for Linear Estimation

For jointly Gaussian vectors the conditional mean E[Y | X] is an affine function of X, so in the Gaussian case the optimal MMSE estimator is itself affine, and the resulting error covariance is

    K_Y - K_{YX} K_X^{-1} K_{XY}.

[The supporting algebra in this span was garbled in extraction.]

Theorem 4.6 Given random vectors X and Y with mean vectors m_X = E[X] and m_Y = E[Y] and covariance matrices K_Y = E[(Y - m_Y)(Y - m_Y)^t], K_X = E[(X - m_X)(X - m_X)^t], K_{XY} = E[(X - m_X)(Y - m_Y)^t], and K_{YX} = K_{XY}^t, assume that K_X is invertible (e.g., it is positive definite). Then the minimum over A, b of the mean squared error

    E[ |Y - (AX + b)|^2 ]

is achieved by

    A = K_{YX} K_X^{-1}   and   b = m_Y - A m_X,

and the resulting error covariance is K_Y - K_{YX} K_X^{-1} K_{XY}.
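The optimal affine estimator of Theorem 4.6 can be built from sample covariances. A minimal sketch under an invented linear model (the coefficient vector c and noise level are illustrative): the recovered gain matrix A should approach c.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 300_000

# jointly distributed X (2-dim) and scalar Y: Y = c^T X + noise
X = rng.standard_normal((2, n))
c = np.array([1.0, -0.5])
Y = c @ X + 0.3 * rng.standard_normal(n)

m_X, m_Y = X.mean(axis=1), Y.mean()
K_X = np.cov(X)
K_YX = np.array([np.cov(Y, X[i])[0, 1] for i in range(2)])

A = K_YX @ np.linalg.inv(K_X)   # A = K_YX K_X^{-1}
b = m_Y - A @ m_X               # b = m_Y - A m_X
Y_hat = A @ X + b
mse_opt = np.mean((Y - Y_hat) ** 2)
```

The residual MSE is the noise variance (0.09 here), matching the error-covariance formula.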

[A scalar example of the optimal affine estimator in this span was garbled in extraction.]

4.10

Correlation and Linear Estimation

A linear estimate has the form Yhat = AX; an affine estimate Yhat = AX + b includes a constant term. Subtracting the means reduces the affine problem to the linear one, so attention can be restricted to estimates

    Yhat = sum_i a_i X_i = a^t X.

Setting the derivative of the mean squared error with respect to each coefficient to zero yields the orthogonality principle: the estimation error must be uncorrelated with every observation. In matrix form, with the autocorrelation matrix R_X = E[X X^t], the optimality condition reads

    E[Y X^t] = a^t R_X.

[The coefficient-by-coefficient derivation in this span was garbled in extraction.]

Theorem 4.7 The minimum mean squared error linear predictor of the form Yhat = AX is given by any solution A of the equation

    A R_X = E[Y X^t].

If the matrix R_X is invertible, then A is uniquely given by

    A = E[Y X^t] R_X^{-1};

that is, the matrix A has rows a_i^t with a_i = R_X^{-1} E[Y_i X]. Alternatively, A^t = R_X^{-1} E[X Y^t].

Theorem 4.8 The minimum mean squared estimate of the form Yhat = AX + b is given by any solution A of the equation

    A K_X = K_{YX},

where the covariance matrix K_X is defined by K_X = E[(X - E X)(X - E X)^t], and b = m_Y - A m_X. If K_X is invertible, then

    A = K_{YX} K_X^{-1}.

The Orthogonality Principle

The optimality conditions above share a common form: the estimation error is orthogonal to every observation used in the estimate. Two random variables X and Y are called orthogonal if E[XY] = 0.

Theorem 4.9 The Orthogonality Principle:

A linear estimate Yhat = AX is optimal (in the mean squared error sense) if and only if the resulting errors are orthogonal to the observations, that is, if e = Y - Yhat, then

    E[e_k X_n] = 0 for all k, n.
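The orthogonality principle is easy to test: for coefficients satisfying the normal equations the error is (numerically) orthogonal to every observation, while any perturbed coefficients leave a visible correlation. A minimal sketch with an invented linear model:

```python
import numpy as np

rng = np.random.default_rng(4)
n = 200_000

X = rng.standard_normal((3, n))
a_true = np.array([0.8, -0.2, 0.5])
Y = a_true @ X + 0.5 * rng.standard_normal(n)

R_X = (X @ X.T) / n                 # autocorrelation matrix E[X X^t]
r_YX = (X @ Y) / n                  # E[Y X]
a_opt = np.linalg.solve(R_X, r_YX)  # coefficients from the normal equations

err_opt = Y - a_opt @ X
err_bad = Y - (a_opt + 0.1) @ X     # a perturbed, suboptimal estimate

corr_opt = np.abs(X @ err_opt / n)  # |E[e X_i]| per observation component
corr_bad = np.abs(X @ err_bad / n)
```

The optimal error's correlations with the observations vanish to machine precision; the perturbed estimate's do not.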

4.11

Correlation and Covariance Functions

For a random process {X_t}, the autocorrelation function is

    R_X(t, s) = E[X_t X_s],

and the autocovariance function (or simply covariance function) is

    K_X(t, s) = E[(X_t - E X_t)(X_s - E X_s)] = R_X(t, s) - (E X_t)(E X_s).

A process is uncorrelated if K_X(t, s) = 0 whenever t is not equal to s; that is, samples are required to be uncorrelated only if taken at different times. At t = s, K_X(t, t) is simply the variance of the sample X_t.

Gaussian Processes Revisited

For a Gaussian process, every finite collection of samples X_{t_0}, ..., X_{t_{n-1}} is jointly Gaussian, so the process distribution is completely determined by the mean function and the covariance function K_X(t, s). The quadratic form sum_i sum_l a_i a_l K_X(t_i, t_l) arises as the variance of the linear combination sum_i a_i X_{t_i}; since variances are nonnegative, covariance functions are nonnegative definite. [The detailed verification in this span was garbled in extraction.]

4.12

The Central Limit Theorem

Let {X_n} be an iid process with mean m and variance sigma^2, and form the standardized sum

    R_n = n^{-1/2} sum_{i=0}^{n-1} (X_i - m)/sigma.

Expanding the characteristic function of a single standardized term as 1 - u^2/(2n) plus terms negligible relative to 1/n, and using independence,

    M_{R_n}(ju) = ( M_{(X-m)/sigma}(ju n^{-1/2}) )^n  ->  e^{-u^2/2}   as n -> infinity,

which is the characteristic function of an N(0, 1) random variable. Note that it is the cdf, not the pdf, that converges here; this mode of convergence is called convergence in distribution. [Intermediate steps garbled in extraction.]

Theorem 4.10 (A Central Limit Theorem). Let {X_n} be an iid random process with a finite mean m and variance sigma^2. Then

    n^{-1/2} sum_{i=0}^{n-1} (X_i - m)

converges in distribution to a Gaussian random variable with mean 0 and variance sigma^2.

The Gaussian is an example of an infinitely divisible distribution: for every n it can be written as the sum of n iid random variables.
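The central limit effect is visible even for modest n. A minimal sketch using uniform(0,1) summands (chosen for illustration; m = 1/2 and sigma^2 = 1/12): standardized sums should have roughly unit variance and Gaussian-like tail mass.

```python
import math
import random

random.seed(5)

m, sigma = 0.5, math.sqrt(1 / 12)   # mean and std of uniform(0,1)
n, trials = 50, 20_000

sums = []
for _ in range(trials):
    s = sum(random.random() for _ in range(n))
    sums.append((s - n * m) / (sigma * math.sqrt(n)))   # standardized sum

mean_hat = sum(sums) / trials
var_hat = sum(z * z for z in sums) / trials
frac_within_1 = sum(abs(z) <= 1 for z in sums) / trials  # N(0,1) gives ~0.683
```

The fraction of standardized sums within one standard deviation approaches the Gaussian value of about 0.683.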

4.13

Sample Averages

Define the sample average of n samples of a process {X_n} by

    S_n = n^{-1} sum_{i=0}^{n-1} X_i.

If each X_i has the same mean m, then by linearity

    E[S_n] = n^{-1} sum_{i=0}^{n-1} E[X_i] = m,

so the sample average is an unbiased estimate of the mean. Its variance is

    sigma_{S_n}^2 = n^{-2} sum_i sum_j COV(X_i, X_j),

which for uncorrelated samples with common variance sigma_X^2 collapses to

    sigma_{S_n}^2 = sigma_X^2 / n.
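The sigma_X^2 / n decay of the sample-average variance can be measured directly. A minimal sketch with iid standard normal samples (sigma_X^2 = 1, chosen for illustration):

```python
import random

random.seed(6)

def sample_average_variance(n, trials=20_000):
    # empirical variance of S_n over many independent trials
    vals = []
    for _ in range(trials):
        s = sum(random.gauss(0.0, 1.0) for _ in range(n)) / n
        vals.append(s)
    mean = sum(vals) / trials
    return sum((v - mean) ** 2 for v in vals) / trials

v10 = sample_average_variance(10)   # expect about 1/10
v40 = sample_average_variance(40)   # expect about 1/40
```

Quadrupling n cuts the variance of the sample average by a factor of four.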

4.14

Convergence of Random Variables

A sequence of random variables X_n converges to X pointwise (for every sample point) only rarely; the useful notions weaken this requirement.

X_n converges to X with probability one (w.p. 1), also called almost surely (a.s.) or almost everywhere (a.e.), if

    Pr( lim_{n -> infinity} X_n = X ) = 1.

X_n is said to converge in mean square (or converge in quadratic mean) to X if

    lim_{n -> infinity} E[(X_n - X)^2] = 0;

this is the mode of convergence appearing in the mean ergodic theorem. Mean square convergence does not imply convergence of the individual sample sequences; the name is a faux ami in that respect.

X_n is said to converge in probability to X if for every epsilon > 0

    lim_{n -> infinity} Pr(|X_n - X| > epsilon) = 0.

The Tchebychev Inequality

For a random variable X with mean m_X and variance sigma_X^2, and any epsilon > 0,

    Pr(|X - m_X| >= epsilon) <= sigma_X^2 / epsilon^2.

The proof restricts the sum (or integral) defining sigma_X^2 to the region |x - m_X| >= epsilon, where the integrand is at least epsilon^2 times the probability of that region.

The Markov Inequality.

For a nonnegative random variable U and any alpha > 0,

    Pr(U >= alpha) <= E[U] / alpha.

Proof: E[U] >= E[U 1(U >= alpha)] >= alpha Pr(U >= alpha). Applying the Markov inequality to U = (X - m_X)^2 with alpha = epsilon^2 recovers the Tchebychev inequality.



Lemma 4.3 If X_n converges to X in mean square, then it also converges in probability.

Proof. By the Markov inequality applied to (X_n - X)^2,

    Pr(|X_n - X| > epsilon) <= E[(X_n - X)^2] / epsilon^2 -> 0.
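Both inequalities are simple to verify empirically. A minimal sketch using an exponential(1) variable (chosen for illustration; it has m_X = 1 and sigma_X^2 = 1 and is nonnegative, so both bounds apply):

```python
import random

random.seed(7)
n = 100_000

# X exponential with rate 1: mean 1, variance 1, nonnegative
X = [random.expovariate(1.0) for _ in range(n)]
m, var = 1.0, 1.0

# Tchebychev: Pr(|X - m| >= eps) <= var / eps^2
eps = 2.0
p_emp = sum(abs(x - m) >= eps for x in X) / n
tcheby_bound = var / eps**2       # = 0.25

# Markov: Pr(X >= a) <= E[X] / a for nonnegative X
a = 4.0
p_markov_emp = sum(x >= a for x in X) / n
markov_bound = m / a              # = 0.25
```

Both empirical tail probabilities fall well under their bounds, which are loose but fully general.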

4.15

Weak Law of Large Numbers

A Mean Ergodic Theorem

Theorem 4.11 Let {X_n} be a discrete time uncorrelated random process such that E[X_n] = m is finite and sigma_{X_n}^2 = sigma_X^2 for all n; that is, the mean and variance are the same for all sample times. Then

    lim_{n -> infinity} E[ ( n^{-1} sum_{i=0}^{n-1} X_i - m )^2 ] = 0,

that is, n^{-1} sum_{i=0}^{n-1} X_i converges to m in mean square.

Proof. Since E[S_n] = m and the samples are uncorrelated,

    E[(S_n - m)^2] = sigma_{S_n}^2 = sigma_X^2 / n -> 0.

Theorem 4.12 The Weak Law of Large Numbers.

Let {X_n} be a discrete time process with finite mean E[X_n] = m and variance sigma_{X_n}^2 = sigma_X^2 for all n. If the process is uncorrelated, then the sample average

    S_n = n^{-1} sum_{i=0}^{n-1} X_i

converges to m in probability.

Proof. By Theorem 4.11, S_n converges to m in mean square, and by Lemma 4.3 mean square convergence implies convergence in probability. The law is called weak to distinguish it from the strong law, which asserts convergence with probability one.

4.16

Strong Law of Large Numbers

Lemma 4.4 If X_n converges to X with probability one, then it also converges in probability.

[Proof garbled in extraction.]

A useful sufficient condition for almost sure convergence follows from the Borel-Cantelli lemma:

Lemma 4.5 X_n converges to X with probability one if for any epsilon > 0

    sum_n Pr(|X_n - X| > epsilon) < infinity.

[Proof garbled in extraction.]

To apply Lemma 4.5 to sample averages one needs tail probabilities that decay fast enough to be summable. Bounds of this type follow from the Chernoff inequality,

    Pr(S_n >= a) <= e^{-lambda a} E[e^{lambda S_n}],   lambda > 0,

which for iid samples with suitable moment-generating functions yields exponentially decaying bounds on Pr(|S_n - m| > epsilon). Summability of these bounds combined with Lemma 4.5 gives:

Theorem 4.13 Strong Law of Large Numbers

Given an iid process {X_n} with finite mean m and variance, then

    lim_{n -> infinity} n^{-1} sum_{k=0}^{n-1} X_k = m

with probability 1.

4.17

Stationarity

Stationarity Properties

The mean ergodic theorem required that the mean and variance not vary with time; such requirements are stationarity properties, statements that some probabilistic description of the process is unchanged by shifting time.

A process is first order stationary if the marginal distribution of X_t does not depend on t; in particular the mean E[X_t] = m_X is constant. A process is weakly stationary (stationary in the weak sense) if its mean is constant and its covariance depends on the sample times only through their difference:

    K_X(t, s) = K_X(t - s).

For a weakly stationary discrete time process the covariance matrix of consecutive samples is constant along each diagonal; such a matrix is called a Toeplitz matrix.

A process is second order stationary if the joint distribution of every pair (X_t, X_s) is unchanged by a common shift, i.e., matches that of (X_{t+tau}, X_{s+tau}). More generally, a process is stationary (strictly stationary, stationary in the strict sense) if for every k the joint distribution of (X_{t_0}, ..., X_{t_{k-1}}) equals that of the shifted collection (X_{t_0 - tau}, ..., X_{t_{k-1} - tau}) for all allowed shifts tau; only the relative spacing of the sample times matters. Strict stationarity implies weak stationarity whenever the second moments exist.

[Detailed examples and the discussion of process distributions in this span were garbled in extraction.]
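The Toeplitz structure of a weakly stationary covariance matrix is mechanical to build and check. A minimal sketch with an illustrative covariance function K(lag) = 0.5^|lag|:

```python
import numpy as np

def toeplitz_cov(k_fn, n):
    # n x n covariance matrix with entries K(i, j) = k_fn(i - j)
    return np.array([[k_fn(i - j) for j in range(n)] for i in range(n)])

k_fn = lambda lag: 0.5 ** abs(lag)   # illustrative covariance function
K = toeplitz_cov(k_fn, 5)

# every diagonal is constant (the Toeplitz property)
diag_const = all(
    len({K[i, i + d] for i in range(5 - d)}) == 1 for d in range(5)
)
```

The matrix is also symmetric and positive definite, as a covariance matrix must be.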

4.18

Asymptotically Uncorrelated Processes

A weakly stationary process {X_n} is asymptotically uncorrelated if K_X(k) -> 0 as k -> infinity. The mean ergodic theorem extends to this class:

Theorem 4.14 (A mean ergodic theorem): Let {X_n} be a weakly stationary asymptotically uncorrelated discrete time random process such that E[X_n] = m is finite and sigma_{X_n}^2 = sigma_X^2 for all n. Then

    n^{-1} sum_{i=0}^{n-1} X_i  ->  m

in mean square.

Proof: [garbled in extraction; the variance of the sample average is bounded by an average of the covariance terms K_X(k), which the asymptotic uncorrelation drives to zero by a Cesaro mean argument.]

Theorem 4.15 (A mean ergodic theorem): Let {X_t} be a weakly stationary asymptotically uncorrelated continuous time random process such that E[X_t] = m is finite and sigma_{X_t}^2 = sigma_X^2 for all t. Then

    T^{-1} integral_0^T X_t dt  ->  m

in mean square.

4.19

Problems

[The Chapter 4 problem statements were garbled beyond recovery in extraction; surviving fragments indicate problems on the Cauchy distribution, characteristic functions, correlation coefficients, martingales, random walks, moving averages, minimum variance unbiased estimates, the asymptotic equipartition property, and related topics.]

Chapter 5

Second-Order Moments

5.1

Linear Filtering of Random Processes

A linear filter maps an input process into an output process by superposition. In continuous time, for an impulse response h,

    Y_t = integral h(t - s) X_s ds,

and in discrete time, by the convolution sum,

    Y_n = sum_k h_k X_{n-k}.

For a causal filter with a finite-length impulse response h_0, ..., h_N the sum runs over k = 0, ..., N. [Supporting discussion garbled in extraction.]
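The convolution sum above can be sketched directly. A minimal example with an invented three-tap causal filter: driving it with a unit pulse reproduces the (delayed) impulse response, which is a quick sanity check on the indexing.

```python
def filter_output(h, x, n):
    # causal FIR filter: Y_n = sum_k h_k x_{n-k}, with x = 0 outside its range
    return sum(hk * x[n - k] for k, hk in enumerate(h) if 0 <= n - k < len(x))

h = [1.0, 0.5, 0.25]            # illustrative impulse response
x = [0.0, 1.0, 0.0, 0.0, 0.0]   # unit pulse at n = 1

y = [filter_output(h, x, n) for n in range(5)]
```

The output is the impulse response shifted to start at n = 1.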

5.2

Second-Order Linear Systems I/O Relations

Discrete Time Systems

Let {X_n} be the input to a linear time-invariant filter with impulse response {h_k} and output Y_n = sum_k h_k X_{n-k}. Taking expectations term by term,

    m_Y = E[Y_n] = sum_k h_k E[X_{n-k}] = m_X sum_k h_k,

so a constant input mean is scaled by the DC gain of the filter. Similarly, for a weakly stationary input the output autocorrelation is a double convolution of the input autocorrelation with the impulse response:

    R_Y(k) = E[Y_n Y_{n+k}] = sum_n sum_m h_n h_m R_X(k - n + m).

Continuous Time Systems

The analogous relations hold with sums replaced by integrals:

    m_Y = m_X integral h(t) dt,
    R_Y(tau) = double integral h(s) h(r) R_X(tau - s + r) ds dr.

Transform I/O Relations

Taking Fourier transforms turns the double convolution into multiplication. The power spectral density of a process is the Fourier transform of its autocorrelation function,

    S_X(f) = sum_k R_X(k) e^{-j 2 pi f k}    (discrete time),
    S_X(f) = integral R_X(tau) e^{-j 2 pi f tau} dtau    (continuous time),

and with transfer function H(f) = sum_k h_k e^{-j 2 pi f k} the input and output spectra are related by

    S_Y(f) = |H(f)|^2 S_X(f).

[Derivation details in this span were garbled in extraction.]

5.3

Power Spectral Densities

The power spectral density S_X(f) is real, even (for real processes), and nonnegative. Inverting the transform recovers the autocorrelation; in discrete time, integrating over one period,

    R_X(k) = integral_{-1/2}^{1/2} S_X(f) e^{j 2 pi f k} df,

and in particular the average power is

    R_X(0) = E[X_t^2] = integral S_X(f) df.

Passing the process through an ideal bandpass filter that keeps only frequencies near f_0 produces an output whose power is the integral of S_X over that band, which justifies interpreting the power spectral density as a density of power over frequency. [Supporting derivation garbled in extraction.]
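The relation S_Y(f) = |H(f)|^2 S_X(f) can be verified exactly for a white input and a short FIR filter (the taps are illustrative). Computing S_Y independently as the transform of R_Y(k) = sigma^2 sum_n h_n h_{n+k} should reproduce |H(f)|^2 sigma^2:

```python
import cmath
import math

sigma2 = 1.0                 # white input: S_X(f) = sigma^2
h = [1.0, -0.5, 0.25]        # illustrative FIR taps

def H(f):
    # transfer function H(f) = sum_k h_k e^{-j 2 pi f k}
    return sum(hk * cmath.exp(-2j * math.pi * f * k) for k, hk in enumerate(h))

def R_Y(k):
    # output autocorrelation for white input: sigma^2 sum_n h_n h_{n+k}
    k = abs(k)
    if k >= len(h):
        return 0.0
    return sigma2 * sum(h[n] * h[n + k] for n in range(len(h) - k))

def S_Y(f):
    # transform of the (finitely supported) output autocorrelation
    return sum(R_Y(k) * cmath.exp(-2j * math.pi * f * k)
               for k in range(-3, 4)).real

f = 0.2
lhs = S_Y(f)
rhs = abs(H(f)) ** 2 * sigma2
```

The two sides agree to floating-point precision at every frequency.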

5.4

Linearly Filtered Uncorrelated Processes

A discrete time process with constant mean and covariance K_X(k) = sigma_X^2 delta_k has a flat covariance spectrum; if in addition the mean is zero, S_X(f) = sigma_X^2 for all f and the process is called white, or white noise. Linearly filtering white noise produces a process whose correlation structure is determined entirely by the filter:

    K_Y(k) = sigma_X^2 sum_n h_n h_{n+k}.

[The remainder of this section, covering the sample autocorrelation, autoregressive and moving average examples, and the Wiener process example, was garbled in extraction.]

5.5

Linear Modulation

Modulation shifts a baseband process X_t up to a carrier frequency, for example Y_t = X_t cos(2 pi f_0 t + Theta). With a random phase Theta uniform on [0, 2 pi), the output is weakly stationary with

    R_Y(tau) = (1/2) R_X(tau) cos(2 pi f_0 tau),
    S_Y(f) = (1/4) [ S_X(f - f_0) + S_X(f + f_0) ].

[Derivation garbled in extraction.]

5.6

White Noise

An idealized continuous time white noise has a delta-function autocorrelation and a spectral density flat over all frequencies. Such a process has infinite power and uncorrelated samples no matter how close together in time the samples are, so it is not physically realizable; it is useful as a model because filtering it with any finite-energy filter produces a legitimate process, and a process whose spectrum is flat over the band of interest can be modeled as white. Conversely, representing a process as filtered white noise, subject to the Paley-Wiener criteria for a causal, physically realizable filter, leads to the innovations representation. [Worked examples in this span were garbled in extraction.]

5.7

Time-Averages

Define the time-average autocorrelation estimate from 2N + 1 samples of a process by

    Rhat_X(k) = (2N + 1)^{-1} sum_{n=-N}^{N} X_n X_{n+k},

and similarly the time-average mean and power. For suitably ergodic weakly stationary processes these time averages converge to the corresponding ensemble averages R_X(k), m_X, and E[X_n^2] as N -> infinity. Taking Fourier transforms of finite sample blocks gives an alternative route to the power spectral density: averaging the sample spectrum (the squared magnitude of the Fourier transform of the samples, normalized by the block length) yields

    lim_{N -> infinity} E[ (2N + 1)^{-1} | sum_{n=-N}^{N} X_n e^{-j 2 pi f n} |^2 ] = S_X(f),

connecting the transform-of-autocorrelation definition to measured spectra. [Derivation details garbled in extraction.]

5.8

Differentiating Random Processes

A random process can be differentiated in the mean square sense when the limit of difference quotients converges in mean square, which requires that the mixed derivative of R_X(t, s) with respect to t and s exist. For the derivative Y_t = dX_t/dt of a weakly stationary process,

    R_Y(tau) = - d^2 R_X(tau) / dtau^2.

The Wiener process, whose increments are independent, is a standard example of a process that is mean square continuous but nowhere differentiable!

5.9

Linear Estimation and Filtering

Suppose a process {X_n} is to be estimated from observations of a related process {Y_k} using a linear filter,

    Xhat_n = sum_k h_k Y_{n-k},

with the expected squared error E[(X_n - Xhat_n)^2] as the performance criterion. The optimal filter is characterized by the orthogonality principle:

Theorem 5.1 Suppose that we are given a set of observations Y_k of a zero-mean random process {Y_k} and that we wish to find a linear estimate Xhat_n of a sample X_n of a zero-mean random process {X_n} of the form

    Xhat_n = sum_k h_k Y_{n-k}.

If the estimation error is defined as

    e_n = X_n - Xhat_n,

then for a fixed n no linear filter can yield a smaller expected squared error E[e_n^2] than a filter h (if it exists) that satisfies the relation

    E[e_n Y_{n-k}] = 0, all k,

or, equivalently,

    E[X_n Y_{n-k}] = sum_i h_i E[Y_{n-i} Y_{n-k}], all k.

If R_Y(k, j) = E[Y_k Y_j] is the autocorrelation function of the Y process and R_{X,Y}(k, j) = E[X_k Y_j] is the cross correlation function of the two processes, then (5.54) can be written as

    R_{X,Y}(n, k) = sum_i h_i R_Y(n - i, k), all k.

If the processes are jointly weakly stationary in the sense that both are individually weakly stationary with a cross-correlation function that depends only on the difference between the arguments, then the condition becomes

    R_{X,Y}(k) = sum_i h_i R_Y(k - i), all k.
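For a finite number of taps the stationary condition of Theorem 5.1 is a small linear system in the h_i. A minimal sketch under an invented model (an AR(1)-like signal observed in additive noise; all model parameters are illustrative): solving the system from sample correlations gives a filter whose error beats the unfiltered observation.

```python
import numpy as np

rng = np.random.default_rng(8)
n = 50_000

# signal: X_t = 0.9 X_{t-1} + w_t; observation: Y_t = X_t + noise
X = np.zeros(n)
for t in range(1, n):
    X[t] = 0.9 * X[t - 1] + rng.standard_normal()
Y = X + 1.0 * rng.standard_normal(n)

def xcorr(a, b, k):
    # sample estimate of E[a_t b_{t-k}] for k >= 0
    return np.mean(a[k:] * b[:len(b) - k]) if k > 0 else np.mean(a * b)

taps = 3
R_Y = np.array([[xcorr(Y, Y, abs(i - j)) for j in range(taps)]
                for i in range(taps)])
r_XY = np.array([xcorr(X, Y, k) for k in range(taps)])

h = np.linalg.solve(R_Y, r_XY)   # R_XY(k) = sum_i h_i R_Y(k - i), FIR case
X_hat = sum(h[k] * np.roll(Y, k) for k in range(taps))

mse_filtered = np.mean((X[taps:] - X_hat[taps:]) ** 2)
mse_raw = np.mean((X - Y) ** 2)  # error with no filtering at all
```

Here the unfiltered error is the noise variance (about 1); the three-tap filter reduces it substantially.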

Comments. The optimality condition is again the orthogonality principle: the estimation error is orthogonal to every observation used in forming the estimate.

[The proof of Theorem 5.1, the frequency-domain solution S_{X,Y}(f)/S_Y(f) for the unconstrained (noncausal) filter, the whitening-filter and innovations-process arguments, and the recursive (Kalman-Bucy) development in this span were garbled in extraction.]
n

5.10

Problems

[The Chapter 5 problem statements were garbled beyond recovery in extraction; surviving fragments indicate problems on comb filters, cascade filters, one-step prediction, spectral factorization, binary and convolutional-code filters, preemphasis filters, and whitening filters.]

Chapter 6

A Menagerie of Processes

6.1 Discrete Time Linear Models

[The equations and block diagrams of this section were lost in extraction. Recoverable content: a moving-average filter forms its output as a weighted sum of present and past inputs; an autoregressive filter forms its output from the present input and past outputs weighted by regression coefficients; driving such filters with an iid process yields the basic linear models, the moving-average random process, the autoregressive random process, and the ARMA random process, classified by the order of the filter.]
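As a concrete illustration of the two filter types named in this section, the following sketch simulates first-order moving-average and autoregressive processes driven by an iid Gaussian input. The function names and coefficient choices are illustrative, not from the text.

```python
import random

def moving_average(x, b):
    # Y_n = sum_k b[k] * X_{n-k}; inputs before time 0 are taken as 0.
    return [sum(bk * x[n - k] for k, bk in enumerate(b) if n - k >= 0)
            for n in range(len(x))]

def autoregressive(x, a):
    # Y_n = X_n + sum_{k>=1} a[k-1] * Y_{n-k}; outputs before time 0 are 0.
    y = []
    for n in range(len(x)):
        y.append(x[n] + sum(ak * y[n - k]
                            for k, ak in enumerate(a, start=1) if n - k >= 0))
    return y

random.seed(0)
x = [random.gauss(0.0, 1.0) for _ in range(1000)]
ma = moving_average(x, [1.0, 0.5])   # first-order moving average
ar = autoregressive(x, [0.9])        # first-order autoregression
```

Applying either function to a unit impulse recovers the filter's impulse response, which is a quick sanity check on the recursions.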

6.2

Sums of IID Random Variables

[Section text lost in extraction. It develops the sums of an iid random process, including the mean, variance, and characteristic function of the sum.]
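The basic second-order facts for sums of iid random variables can be checked numerically: for a sum of n iid zero-mean, unit-variance variables, the mean is 0 and the variance is n. This sketch is illustrative and not from the text.

```python
import random

def iid_sum_paths(n, trials, rng):
    # Each draw is S_n = X_1 + ... + X_n with X_i iid fair +/-1 coin flips,
    # so E S_n = 0 and var S_n = n.
    return [sum(rng.choice((-1, 1)) for _ in range(n)) for _ in range(trials)]

rng = random.Random(1)
sums = iid_sum_paths(50, 4000, rng)
mean = sum(sums) / len(sums)
var = sum(s * s for s in sums) / len(sums) - mean ** 2
```

With 4000 trials the sample mean and variance should land close to 0 and 50 respectively.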

6.3

Independent Stationary Increments

The key definitions survive. Consider a random process {Y_t} and any collection of ordered sample times t_1 < t_2 < ... < t_k. The differences Y_{t_i} - Y_{t_{i-1}} are called the increments of the process. The process is said to have independent increments, and is called an independent increment random process, if these increments are mutually independent for every such collection of sample times. It has stationary increments if the distribution of Y_{t+δ} - Y_{s+δ} does not depend on δ, so that the distribution of an increment, the change or jump of the process, depends only on the length t - s of the interval. A process with both properties is called an independent stationary increment (ISI) process.

[The remainder of the section, constructing a discrete time ISI process by summing an iid process, was lost in extraction.]

6.4

Second-Order Moments of ISI Processes

[The derivation was lost in extraction. Its content: for an ISI process {Y_t} with Y_0 = 0, the mean and variance satisfy the linear functional equation g(t + τ) = g(t) + g(τ), forcing E[Y_t] = m t and var(Y_t) = σ² t for constants m and σ²; the covariance then follows as K_Y(t, s) = σ² min(t, s).]

6.5 Specification of Continuous Time ISI Processes

[The general construction was lost in extraction. Its content: since for ordered sample times t_1 < t_2 < ... < t_n the increments Y_{t_i} - Y_{t_{i-1}} are independent, the joint distributions of an ISI process are completely specified by the distributions of its increments, and the joint pdf or pmf factors into a product over the increments. Two standard examples survive:]

[6.1] The Continuous Time Wiener process: the ISI process with Y_0 = 0 and Gaussian increments, for which Y_t has pdf

f_{Y_t}(y) = (2πσ²t)^{-1/2} e^{-y²/(2σ²t)}.

[6.2] The Poisson counting process: the ISI process with Y_0 = 0 and Poisson increments, for which

Pr(Y_t = k) = (λt)^k e^{-λt}/k!, k = 0, 1, 2, ...
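Both ISI examples can be simulated directly from their increments. The sketch below is illustrative and assumes the standard increment distributions: Gaussian increments for the Wiener process and exponential interarrival times for the Poisson counting process.

```python
import math
import random

def wiener_path(sigma2, dt, steps, rng):
    # Independent stationary increments: Y_{t+dt} - Y_t ~ N(0, sigma2 * dt).
    y, path = 0.0, [0.0]
    for _ in range(steps):
        y += rng.gauss(0.0, math.sqrt(sigma2 * dt))
        path.append(y)
    return path

def poisson_path(lam, dt, steps, rng):
    # Counting process sampled on a grid: arrivals come from iid Exp(lam)
    # interarrival times, and N_{k*dt} counts arrivals up to time k*dt.
    t, arrivals = 0.0, []
    while t <= steps * dt:
        t += rng.expovariate(lam)
        arrivals.append(t)
    return [sum(1 for a in arrivals if a <= k * dt) for k in range(steps + 1)]
```

Each path starts at 0, and the Poisson path is nondecreasing, as the counting-process interpretation requires.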

6.6 Moving-Average and Autoregressive Processes

[The development was lost in extraction. Recoverable content: the output of a moving-average filter driven by an iid process, Y_n = Σ_k h_k X_{n-k}, and the conditional and marginal distributions of the first-order autoregressive process defined by a recursion on past outputs.]

6.7 The Discrete Time Gauss-Markov Process

[The section text was lost in extraction. Recoverable content: the process defined by the first-order recursion Y_n = r Y_{n-1} + X_n with {X_n} iid Gaussian is both Gaussian and Markov; unwinding the recursion expresses Y_n as a weighted sum Σ_k r^k X_{n-k} of the inputs, and the joint pdf of Y_1, ..., Y_n factors as a Gaussian pdf in y_1 times a product of Gaussian conditional pdfs with exponents proportional to (y_i - r y_{i-1})²/σ².]
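A minimal sketch (not from the text) of the first-order Gauss-Markov recursion Y_n = r Y_{n-1} + X_n: for |r| < 1 the process settles toward a stationary variance of σ²/(1 - r²), which the simulation checks roughly. The seed and parameter values are illustrative.

```python
import random

def gauss_markov(r, sigma2, n, rng):
    # Y_0 = X_0, then Y_n = r * Y_{n-1} + X_n with X_n iid N(0, sigma2).
    y = [rng.gauss(0.0, sigma2 ** 0.5)]
    for _ in range(n - 1):
        y.append(r * y[-1] + rng.gauss(0.0, sigma2 ** 0.5))
    return y

rng = random.Random(2)
path = gauss_markov(0.9, 1.0, 5000, rng)
# For |r| < 1 the marginal variance settles near sigma2 / (1 - r^2).
tail = path[1000:]
sample_var = sum(v * v for v in tail) / len(tail)
```

The first thousand samples are discarded so the estimate is taken from the roughly stationary part of the path.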

6.8

Gaussian Random Processes

Corollary 6.1 If a Gaussian random process {X_t} is passed through a linear filter, then the output is also a Gaussian random process with mean and covariance given by (5.3) and (5.7).

6.9

The Poisson Counting Process

[The derivation was lost in extraction. Recoverable content: the Poisson counting process {N_t} counts arrivals in (0, t]; for a very small interval the probability of a single count is approximately λ times the interval length, and solving the resulting equations gives

Pr(N_t = k) = (λt)^k e^{-λt}/k!, k = 0, 1, 2, ...,

while the independent stationary increments give, for t > s,

Pr(N_t - N_s = k) = (λ(t - s))^k e^{-λ(t-s)}/k!.]
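The Poisson pmf can be checked by simulation: generating arrivals with iid exponential interarrival times of rate λ and counting those in [0, t] should reproduce the Poisson probabilities. The sketch below is illustrative; the names are not from the text.

```python
import math
import random

def poisson_count(lam, t, rng):
    # Number of arrivals in [0, t] when interarrival times are iid Exp(lam).
    n, s = 0, rng.expovariate(lam)
    while s <= t:
        n += 1
        s += rng.expovariate(lam)
    return n

def poisson_pmf(lam, t, k):
    # Pr(N_t = k) = (lam*t)^k e^{-lam*t} / k!
    return (lam * t) ** k * math.exp(-lam * t) / math.factorial(k)

rng = random.Random(3)
lam, t, trials = 2.0, 1.5, 20000
counts = [poisson_count(lam, t, rng) for _ in range(trials)]
emp = counts.count(3) / trials      # empirical Pr(N_t = 3)
theory = poisson_pmf(lam, t, 3)     # analytic value, about 0.224 here
```

The empirical frequency and the analytic pmf should agree to within sampling error, and the sample mean of the counts should sit near λt.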

6.10

Compound Processes

[The derivation was lost in extraction. Recoverable content: a compound process sums an iid sequence over a random number of terms set by a counting process,

Y_t = Σ_{k=1}^{N_t} X_k.

Conditioning on N_t expresses the characteristic function of Y_t through that of X; when {N_t} is a Poisson counting process the result is

E[e^{juY_t}] = e^{λt (M_X(ju) - 1)},

where M_X(ju) = E[e^{juX}].]
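One consequence of the compound Poisson formula worth checking numerically is the first moment E[Y_t] = λt E[X], obtained by differentiating the characteristic function at u = 0 (it also holds because N_t is independent of the X_k). A rough simulation sketch, with illustrative names:

```python
import random

def compound_poisson(lam, t, sample_x, rng):
    # Y_t = X_1 + ... + X_{N_t}, with N_t ~ Poisson(lam*t) generated from
    # iid Exp(lam) interarrival times and X_k iid draws from sample_x.
    n, s = 0, rng.expovariate(lam)
    while s <= t:
        n += 1
        s += rng.expovariate(lam)
    return sum(sample_x(rng) for _ in range(n))

rng = random.Random(4)
lam, t = 3.0, 2.0
draws = [compound_poisson(lam, t, lambda r: r.uniform(0.0, 1.0), rng)
         for _ in range(20000)]
mean = sum(draws) / len(draws)   # should be near lam * t * E X = 3 * 2 * 0.5
```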

6.11

Exponential Modulation

[The calculations were lost in extraction. Recoverable content: exponential modulation forms Y_t = e^{j(a_1 t + a_2 X_t)}, which includes phase modulation and frequency modulation; modulating by a Poisson counting process N_t produces the random telegraph wave. The autocorrelation follows from characteristic functions as

R_Y(t, s) = e^{ja_1(t-s)} E[e^{ja_2(X_t - X_s)}].

For a zero-mean weakly stationary Gaussian {X_t} this evaluates to

R_Y(τ) = e^{ja_1 τ} e^{-a_2²(R_X(0) - R_X(τ))},

so {Y_t} is weakly stationary. For the random telegraph wave, summing the Poisson pmf against alternating signs gives

R_Y(τ) = e^{-2λ|τ|}.]

6.12

Thermal Noise

[The physical derivation, which models the current as the superposition of nAL independent electron velocity contributions, was lost in extraction. Its conclusion is an autocorrelation of the form R(t, t + τ) proportional to e^{-α|τ|}.]

6.13 Ergodicity and Strong Laws of Large Numbers

[Much of the discussion was lost in extraction. Recoverable context: the section defines invariant events (events unchanged by time shifts) and calls a stationary process ergodic when all such events are trivial; a complete development is given in the book Ergodic Theory and Information.]

Theorem 6.1 The Strong Law of Large Numbers (The Pointwise Ergodic Theorem) Given a discrete time stationary random process {X_n} with finite expectation E|X_n| < ∞, there is a random variable X̂ with the property that

lim_{n→∞} (1/n) Σ_{i=0}^{n-1} X_i = X̂ with probability 1;

that is, the limit exists. If the process is also ergodic, then X̂ = E[X] and hence

lim_{n→∞} (1/n) Σ_{i=0}^{n-1} X_i = E[X] with probability 1.

[The closing discussion concerns stationary processes that are not ergodic: a mixture, or doubly stochastic, process, such as one obtained by flipping a single fair coin and thereafter observing one of two stationary ergodic processes, has sample averages that still converge, but the limit is the conditional expectation given knowledge of which stationary and ergodic random process is actually being observed! This is the substance of the ergodic decomposition theorem.]
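Both halves of the discussion, convergence of sample averages for an ergodic process and convergence to a conditional expectation for a mixture, can be seen in a small simulation. This sketch is illustrative and not from the text.

```python
import random

def sample_average(xs):
    return sum(xs) / len(xs)

rng = random.Random(5)

# An iid (hence stationary and ergodic) process: the sample average
# converges to the expectation E X = 0.25 with probability 1.
xs = [rng.uniform(-0.5, 1.0) for _ in range(100000)]
avg = sample_average(xs)

# A non-ergodic mixture: flip one fair coin, then observe a constant
# process forever after.  The limiting sample average is the conditional
# expectation (0 or 1), not the overall mean 0.5.
bit = rng.choice((0.0, 1.0))
mix_avg = sample_average([bit] * 1000)
```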

6.14 Problems

[The problem statements were lost in extraction. Recoverable topic labels: cross-correlation, a channel with additive noise, the correlation coefficient, a preemphasis filter, sampling and the sampling period, the interarrival times of a Poisson counting process, and the Erlang family of densities.]

Appendix A

Preliminaries: Set Theory,
Mappings, Linear
Algebra, and Linear
Systems

A.1

Set Theory

[The section text was lost in extraction. Recoverable structure and terms: an abstract space (or simply space) of elements or points, with examples [A.0]-[A.10] ranging from binary alphabets (numeric {zero, one} or categorical {heads, tails}) through k-dimensional spaces to the sequence and waveform spaces underlying the notions of random variable, random vector, and random process (discrete time or continuous time), built via product spaces; sets and subsets of a space, with examples [A.11]-[A.21] including one-point or singleton sets; the element inclusion symbol and the set inclusion symbol; the operations of complementation, intersection, and union, pictured with Venn diagrams; disjoint or mutually exclusive sets; and the derived set difference F - G = F ∩ G^c and symmetric difference F Δ G = (F ∩ G^c) ∪ (F^c ∩ G).]

A.2

Examples of Proofs

[The worked proofs of relations (A.8), (A.18), (A.11), (A.12), (A.20), and (A.19), set identities proved by showing that each side of an equality contains the other, were lost in extraction. The surviving close of the section introduces a family or class of sets indexed by a finite or countably infinite index set: the family {F_i} is disjoint (pairwise disjoint, mutually exclusive) if F_i ∩ F_j = ∅ whenever i ≠ j; it is collectively exhaustive if the sets exhaust the whole space, ∪_i F_i = Ω; and it is a partition if it is both.]

A.3

Mappings and Functions

[The section text was lost in extraction. Recoverable terms: a function or mapping with its domain (domain of definition) and range; the image of a set under a mapping and the inverse image or preimage of a set in the range space; and the properties of a mapping being onto and one-to-one.]

A.4

Linear Algebra

[The linear algebra review was lost in extraction. Recoverable terms: vectors x = (x_0, ..., x_{k-1})^t and matrices, including square and Hermitian matrices; eigenvalues and eigenvectors; the quadratic form x^t A x; and nonnegative definite and positive definite matrices, for which the quadratic form is nonnegative (respectively strictly positive for nonzero x).]

A.5

Linear System Fundamentals

[The linear systems review was lost in extraction. Recoverable content: a system maps an input signal into an output signal, in discrete time or continuous time; a linear system obeys superposition, and a linear time-invariant system is a linear filter described by its impulse response h, acting through the convolution integral

y(t) = ∫ h(s) x(t - s) ds

or the discrete time convolution sum

y_n = Σ_k h_k x_{n-k} = Σ_k h_{n-k} x_k,

where in discrete time h is the response to a Kronecker delta input. Complex exponentials e^{j2πft} pass through such a filter scaled by the transfer function, the Fourier transform Σ_k h_k e^{-j2πfk} of the impulse response (or the corresponding integral). A filter is stable if its impulse response is absolutely summable or integrable, and causal if h_k = 0 for k < 0.]
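The discrete time convolution sum can be sketched directly; the helper below is illustrative, not from the text.

```python
def convolve(h, x):
    # y_n = sum_k h[k] * x[n - k], the discrete time convolution sum
    # for finite-length h and x (output length len(h) + len(x) - 1).
    y = [0.0] * (len(h) + len(x) - 1)
    for k, hk in enumerate(h):
        for m, xm in enumerate(x):
            y[k + m] += hk * xm
    return y

# A causal LTI filter is characterized by its Kronecker delta response:
delta = [1.0, 0.0, 0.0, 0.0]
h = [0.5, 0.25, 0.125]
impulse_response = convolve(h, delta)[:len(h)]  # recovers h itself
```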

A.6

Problems

[The problem statements were lost in extraction. Recoverable topic labels: set identities, a quantizer, a waveform communication system with a channel, decision space, and decision rule, frequency responses, and inner product inequalities of Cauchy-Schwarz type relating |(x, y)| to the norms of x and y.]

Appendix B

Sums and Integrals

B.1 Summation

[The proofs of this section were lost in extraction; the results being proved are the standard ones.]

The sum of consecutive integers.

Σ_{k=1}^{n} k = n(n + 1)/2.

The sum of consecutive squares of integers.

Σ_{k=1}^{n} k² = n(n + 1)(2n + 1)/6.

The geometric progression. For r ≠ 1,

Σ_{k=0}^{n-1} r^k = (1 - r^n)/(1 - r),

and for |r| < 1,

Σ_{k=0}^{∞} r^k = 1/(1 - r).

First moment of the geometric progression. For |r| < 1,

Σ_{k=0}^{∞} k r^k = r/(1 - r)².

Second moment of the geometric progression. For |r| < 1,

Σ_{k=0}^{∞} k² r^k = r(1 + r)/(1 - r)³.
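The closed forms in this section are easy to spot-check numerically; the sketch below compares each formula with a direct sum (truncated, for the infinite series).

```python
def geometric_partial(r, n):
    # sum_{k=0}^{n-1} r^k = (1 - r^n) / (1 - r), valid for r != 1
    return (1 - r ** n) / (1 - r)

def geometric_first_moment(r, terms=2000):
    # truncation of sum_{k=0}^inf k r^k, which equals r / (1 - r)^2
    return sum(k * r ** k for k in range(terms))

def geometric_second_moment(r, terms=2000):
    # truncation of sum_{k=0}^inf k^2 r^k, which equals r(1 + r)/(1 - r)^3
    return sum(k * k * r ** k for k in range(terms))
```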

B.2 Double Sums

Lemma B.1 Given a sequence {a_n},

Σ_{k=0}^{N-1} Σ_{l=0}^{N-1} a_{k-l} = Σ_{n=-(N-1)}^{N-1} (N - |n|) a_n.

Proof: The array {a_{k-l}; k, l = 0, ..., N - 1} is Toeplitz: its entries depend on k and l only through the difference n = k - l, and each value n with |n| ≤ N - 1 occurs exactly N - |n| times in the N × N array. Grouping equal entries gives the result.

Lemma B.2 Suppose that {a_n} is an absolutely summable sequence, i.e., that Σ_n |a_n| < ∞. Then

lim_{N→∞} (1/N) Σ_{n=-(N-1)}^{N-1} (N - |n|) a_n = Σ_{n=-∞}^{∞} a_n.

[The comment and proof were lost in extraction; the proof writes the left-hand side as a sum of terms (1 - |n|/N) a_n and uses absolute summability to control the tails as N grows.]
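Lemma B.1 is easy to verify by brute force for small N; the sketch below does so for an illustrative absolutely summable sequence.

```python
def double_sum_direct(a, N):
    # sum_{k=0}^{N-1} sum_{l=0}^{N-1} a(k - l), computed term by term
    return sum(a(k - l) for k in range(N) for l in range(N))

def double_sum_toeplitz(a, N):
    # sum_{n=-(N-1)}^{N-1} (N - |n|) a(n), the right side of Lemma B.1
    return sum((N - abs(n)) * a(n) for n in range(-(N - 1), N))

a = lambda n: 0.5 ** abs(n)   # absolutely summable example sequence
```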

B.3 Integration

[Most of this section was lost in extraction. Recoverable content includes the exponential integrals

∫_0^∞ e^{-αx} dx = 1/α, ∫_0^∞ x^k e^{-αx} dx = k!/α^{k+1}, α > 0,

and the Gaussian integral, evaluated by squaring and changing to polar coordinates:

(∫_{-∞}^{∞} e^{-x²/2} dx)² = ∫∫ e^{-(x²+y²)/2} dx dy = 2π ∫_0^∞ r e^{-r²/2} dr = 2π,

from which the Gaussian pdf (2πσ²)^{-1/2} e^{-(x-m)²/(2σ²)} integrates to 1.]



B.4 The Lebesgue Integral

[Much of this section was lost in extraction; a complete development appears in the book Probability, Random Processes, and Ergodic Properties. The surviving outline: expectation is first defined for a simple function, one taking only finitely many values b_i on events F_i, as E[X] = Σ_i b_i P(F_i); nonnegative random variables are then handled by approximating from below with simple functions (quantizers), and a general random variable is integrable when the expectations of its positive and negative parts are finite. Two limit theorems survive:]

Theorem B.1 If X_n is a sequence of nonnegative random variables that is monotone increasing up to X (with probability 1), then

lim_{n→∞} E[X_n] = E[X].

Theorem B.2 If X_n is a sequence of random variables that converges to X (with probability 1) and if there is an integrable function Y which dominates the sequence in the sense that |X_n| ≤ Y (with probability 1) for all n, then

lim_{n→∞} E[X_n] = E[X].

Appendix C

Common Univariate
Distributions

[The parameter ranges, means, and variances were partially lost in extraction; the standard forms are as follows.]

Binary pmf. p_X(1) = p, p_X(0) = 1 - p, 0 ≤ p ≤ 1.

Uniform pmf. p_X(k) = 1/n, k = 0, 1, ..., n - 1.

Binomial pmf. p_X(k) = (n choose k) p^k (1 - p)^{n-k}, k = 0, 1, ..., n.

Geometric pmf. p_X(k) = p(1 - p)^{k-1}, k = 1, 2, ...; mean 1/p; variance (1 - p)/p².

Poisson pmf. p_X(k) = λ^k e^{-λ}/k!, k = 0, 1, 2, ...; mean λ; variance λ.

Uniform pdf. f_X(r) = 1/(b - a), a ≤ r ≤ b; mean (a + b)/2; variance (b - a)²/12.

Exponential pdf. f_X(r) = λ e^{-λr}, r ≥ 0; mean 1/λ; variance 1/λ².

Doubly exponential (or Laplacian) pdf. f_X(r) = (λ/2) e^{-λ|r|}; mean 0; variance 2/λ².

Gaussian (or Normal) pdf. f_X(r) = (2πσ²)^{-1/2} e^{-(r-m)²/(2σ²)}; mean m; variance σ².

Gamma pdf. f_X(r) = r^{a-1} e^{-r/b}/(Γ(a) b^a), r > 0.

Logistic pdf. f_X(r) = e^{-r/λ}/(λ(1 + e^{-r/λ})²).

Weibull pdf. f_X(r) = (b/a^b) r^{b-1} e^{-(r/a)^b}, r ≥ 0.

Rayleigh pdf. f_X(r) = (r/b) e^{-r²/(2b)}, r ≥ 0.

Appendix D

Supplementary Reading

[The supplementary reading discussion was lost in extraction; among its recoverable terms is asymptotically mean stationary.]

Bibliography

Real Analysis and Probability

A First Course in Integration

Ergodic Theory and Information

Introductory Network Theory

The Fourier Transform and Its Applications

Probability

Introduction to Linear System Theory

Markov Chains with Stationary Transition Probabilities

A Course in Probability Theory

Stationary and Related Stochastic

Processes

Stochastic Processes

Fundamentals of Applied Probability Theory

Inequalities for Stochastic Processes:

How to Gamble If You Must


Markov Processes

An Introduction to Probability Theory and its Applications

IEEE Trans. Inform. Theory

Introduction to Communications Engineering

Introduction to the Theory of

Random Processes

The Theory of Probability

An Elementary Introduction

to the Theory of Probability

isl.stanford.edu

http://www-isl.stanford.edu/~gray/compression.html

Probability, Random Processes, and Ergodic Properties

Fourier Transforms

Ann. Probab.

Statistical Analysis of Stationary

Time Series

Toeplitz Forms and Their Applications

Measure Theory

Lectures on Ergodic Theory


Stationary Stochastic Processes

How to Take a Chance

Linear Systems

Lectures on Wiener and Kalman Filtering

Finite Markov Chains

Foundations of the Theory of Probability

Stochastic Processes: A Survey of the Mathematical Theory

Statistics of Random Processes

Probability Theory

Stochastic Convergence

Probability Theory: A Historical Sketch

Stochastic Integrals

Discrete-Parameter Martingales

The World of Mathematics

The Fourier Integral and Its Applications

Signal Analysis

Probability Measures on Metric Spaces


Stochastic Processes

Markov Processes: Structure and Asymptotic Behavior

Real Analysis

Stationary Random Processes

Principles of Mathematical Analysis

Introduction to Topology and Modern Analysis

An Introduction to Discrete Systems

Problems in Probability Theory, Mathematical Statistics, and Theory of Random Functions

Probability

The Fourier Integral and Certain of Its Applications

Time Series: Extrapolation, Interpolation, and Smoothing of Stationary Time Series with Engineering Applications

Fourier Transforms in the Complex

Domain

Introduction to Random Processes

An Introduction to the Theory of Stationary Random

Functions

Index

[The index entries were lost in extraction.]
