The Binary Symmetric Channel (BSC) is an idealised model of a noisy channel.


• binary: symbols 0 and 1
• symmetric: p(0→1) = p(1→0)

[Figure: BSC transition diagram. Inputs X0, X1 with probabilities P(X0), P(X1); outputs Y0, Y1 with probabilities P(Y0), P(Y1); branches X0 → Y0 with P(Y0|X0), X0 → Y1 with P(Y1|X0), X1 → Y0 with P(Y0|X1), X1 → Y1 with P(Y1|X1).]

Forward transition probabilities of the noisy binary symmetric channel.

Binary Symmetric Channel (BSC).

Transmitted source symbols Xi; i = 0, 1. Received symbols Yj; j = 0, 1.

Given the source probabilities P(Xi) and the forward transition probabilities

P(Yj|Xi) = Pe if i ≠ j; P(Yj|Xi) = 1 - Pe if i = j;

where Pe is the given error rate.
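As a quick illustration of this definition (a minimal sketch in Python; the matrix layout and names such as P_Y_given_X are mine, not from the original), the forward transition probabilities can be tabulated for a given Pe:

```python
# Sketch: forward transition probabilities of a BSC.
# Row i = transmitted symbol Xi, column j = received symbol Yj.
Pe = 0.125  # hypothetical error rate, for illustration only

# P(Yj|Xi) = Pe if i != j, and 1 - Pe if i == j
P_Y_given_X = [[1 - Pe, Pe],
               [Pe, 1 - Pe]]

for i in range(2):
    for j in range(2):
        print(f"P(Y{j}|X{i}) = {P_Y_given_X[i][j]}")
```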

Calculate the average mutual information (the average amount of source information acquired per received symbol, as distinguished from that per source symbol, which is given by the entropy H(X)).

Step 1: P(Yj, Xi) = P(Xi) P(Yj|Xi); i = 0, 1; j = 0, 1.

Step 2: P(Yj) = ΣX P(Yj, Xi); j = 0, 1.

(Logically, the probability of receiving a particular Yj is the sum of the joint probabilities over the range of Xi. For example, the probability of receiving a 1 is the probability of sending a 1 and receiving a 1 plus the probability of sending a 0 and receiving a 1; that is, the probability of sending a 1 and receiving it correctly plus the probability of sending a 0 and receiving it wrongly.)

Step 3: I(Yj, Xi) = log {P(Xi|Yj)/P(Xi)} = log {P(Yj, Xi)/[P(Xi) P(Yj)]}; i = 0, 1; j = 0, 1.

(This quantifies the amount of information conveyed when Xi is transmitted and Yj is received. Over a perfect noiseless channel it is the self-information of Xi, because each received symbol uniquely identifies a transmitted symbol with P(Xi|Yj) = 1. If the channel is very noisy, or communication breaks down, then P(Yj, Xi) = P(Xi) P(Yj) and it is zero: no information has been transferred.)

Step 4: I(X, Y) = ΣX ΣY P(Yj, Xi) log {P(Xi|Yj)/P(Xi)} = ΣX ΣY P(Yj, Xi) log {P(Yj, Xi)/[P(Xi) P(Yj)]}
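The four steps translate directly into a short Python sketch (the function and variable names are illustrative, not from the original; logs are base 2, so the result is in bits):

```python
from math import log2

def average_mutual_information(P_X, Pe):
    """Steps 1-4 for a BSC with source probabilities P_X and error rate Pe."""
    # Forward transition probabilities: P(Yj|Xi) = Pe if i != j, else 1 - Pe
    P_Y_given_X = [[1 - Pe, Pe], [Pe, 1 - Pe]]

    # Step 1: joint probabilities P(Yj, Xi) = P(Xi) P(Yj|Xi)
    P_joint = [[P_X[i] * P_Y_given_X[i][j] for j in range(2)] for i in range(2)]

    # Step 2: marginals P(Yj) = sum over Xi of the joint probabilities
    P_Y = [P_joint[0][j] + P_joint[1][j] for j in range(2)]

    # Steps 3-4: average of log{P(Yj,Xi)/[P(Xi) P(Yj)]} weighted by P(Yj,Xi)
    return sum(P_joint[i][j] * log2(P_joint[i][j] / (P_X[i] * P_Y[j]))
               for i in range(2) for j in range(2) if P_joint[i][j] > 0)

print(average_mutual_information([0.5, 0.5], 0.125))  # ~0.456 bits
```

For equiprobable symbols and Pe = 1/8 this returns about 0.46 bits, consistent with the observer example below.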

• Equivocation

Equivocation represents the destructive effect of noise, i.e. the additional information needed to make the reception correct.

[Figure: transmitter → noisy channel → receiver, carrying x → y; a hypothetical observer compares the transmitted and received digits and reports over a separate noiseless channel.]

The observer looks at the transmitted and received digits; if they are the same, he reports a '1', if different, a '0'.

x y observer
M M 1
M S 0
S S 1
S S 1
M M 1
M M 1
M M 1
S S 1

The information sent by the observer is easily evaluated as -[p(0) log p(0) + p(1) log p(1)] applied to the binary string. The probability of '0' is just the channel error probability.

Example:

A binary system produces Marks and Spaces with equal probabilities, 1/8 of all pulses being received in error. Find the information sent by the observer.

The information sent by the observer is -[(7/8) log(7/8) + (1/8) log(1/8)] = 0.54 bits. Since the input information is 1 bit/symbol, the net information is 1 - 0.54 = 0.46 bits, agreeing with previous results.
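The arithmetic can be checked with a couple of lines (a quick sketch, using base-2 logs):

```python
from math import log2

# Probability the observer reports a '0' (the channel error probability)
p_err = 1 / 8
observer_info = -((7/8) * log2(7/8) + (1/8) * log2(1/8))
print(round(observer_info, 2))      # 0.54 bits sent by the observer
print(round(1 - observer_info, 2))  # 0.46 bits net information transfer
```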

The noise in the system has destroyed 0.54 bits of information; that is, the equivocation is 0.54 bits.

• General expression for equivocation

Consider a specific pair of transmitted and received digits {x, y}:

1. Noisy channel: probability change p(x) → p(x|y).
2. Receiver: probability correction p(x|y) → 1.

The information provided by the observer = -log p(x|y). Averaging over all pairs, weighted by the probability of a given pair, gives the general expression for equivocation:

H(x|y) = -Σx Σy p(xy) log p(x|y)

The information transferred via the noisy channel (in the absence of the observer):

I(xy) = H(x) - H(x|y) = H(y) - H(y|x)

Information transfer = information in the noiseless system (source entropy) - information loss due to noise (equivocation).
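Both routes to I(xy) can be verified numerically. A sketch with a hypothetical joint distribution (the numbers below are illustrative, not from the text):

```python
from math import log2

# Hypothetical joint distribution p(xy) over transmitted x and received y
p_xy = {('0', '0'): 0.45, ('0', '1'): 0.05,
        ('1', '0'): 0.10, ('1', '1'): 0.40}

# Marginals p(x) and p(y)
p_x = {x: sum(p for (xx, _), p in p_xy.items() if xx == x) for x in '01'}
p_y = {y: sum(p for (_, yy), p in p_xy.items() if yy == y) for y in '01'}

H_x = -sum(p * log2(p) for p in p_x.values())  # source entropy
H_y = -sum(p * log2(p) for p in p_y.values())  # receiver entropy
H_x_given_y = -sum(p * log2(p / p_y[y]) for (x, y), p in p_xy.items())  # equivocation
H_y_given_x = -sum(p * log2(p / p_x[x]) for (x, y), p in p_xy.items())  # error entropy

# I(xy) = H(x) - H(x|y) = H(y) - H(y|x): both lines print the same value
print(round(H_x - H_x_given_y, 4))
print(round(H_y - H_y_given_x, 4))
```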

Example:

A binary system produces Marks with probability 0.7 and Spaces with probability 0.3; 2/7 of the Marks are received in error, and 1/3 of the Spaces. Find the information transfer using the expression for equivocation.

A sample sequence of transmitted and received digits:

x: M M M M M M M S S S
y: M M M M M S S S S M

x y   P(x)   P(y)   P(x|y)   P(y|x)   P(xy)   I(xy) term
M M   0.7    0.6    5/6      5/7      0.5      0.126
M S   0.7    0.4    1/2      2/7      0.2     -0.097
S S   0.3    0.4    1/2      2/3      0.2      0.147
S M   0.3    0.6    1/6      1/3      0.1     -0.085

Summing the terms: I(xy) = 0.126 - 0.097 + 0.147 - 0.085 ≈ 0.091 bits per symbol.
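The table can be reproduced row by row (a sketch; the dictionary layout is mine, the values match the fractions above):

```python
from math import log2

P_x = {'M': 0.7, 'S': 0.3}
# Transition probabilities: 2/7 of Marks and 1/3 of Spaces received in error
P_y_given_x = {('M', 'M'): 5/7, ('M', 'S'): 2/7,
               ('S', 'S'): 2/3, ('S', 'M'): 1/3}

# Joint probabilities P(xy) = P(x) P(y|x), then marginals P(y)
P_xy = {(x, y): P_x[x] * p for (x, y), p in P_y_given_x.items()}
P_y = {y: sum(p for (_, yy), p in P_xy.items() if yy == y) for y in 'MS'}

# Per-pair terms P(xy) log2[P(x|y)/P(x)], and their sum I(xy)
total = 0.0
for (x, y), pxy in P_xy.items():
    term = pxy * log2((pxy / P_y[y]) / P_x[x])
    print(x, y, round(term, 3))
    total += term
print("I(xy) =", round(total, 3), "bits")  # ~0.091 bits
```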

• Summary of basic formulae by Venn diagram

[Venn diagram: circles H(x) (source entropy) and H(y) (receiver entropy) overlap in I(xy); the part of H(x) outside the overlap is the equivocation H(x|y); the part of H(y) outside is the error entropy H(y|x).]

Source entropy: H(x) = -Σx p(x) log p(x)
Receiver entropy: H(y) = -Σy p(y) log p(y)
Equivocation: H(x|y) = -Σx Σy p(xy) log p(x|y)
Error entropy: H(y|x) = -Σx Σy p(xy) log p(y|x)

Information transfer: I(xy) = Σx Σy p(xy) log [p(x|y)/p(x)] = H(x) - H(x|y) = H(y) - H(y|x)

Quantity                     Definition
Source information           I(Xi) = -log2 P(Xi)
Received information         I(Yj) = -log2 P(Yj)
Mutual information           I(Xi, Yj) = log2 [P(Xi|Yj)/P(Xi)]
Average mutual information   I(X, Y) = ΣX ΣY P(Xi, Yj) log2 [P(Xi|Yj)/P(Xi)]
                                     = ΣX ΣY P(Xi, Yj) log2 [P(Yj|Xi)/P(Yj)]
                                     = ΣX ΣY P(Xi, Yj) log2 {P(Xi, Yj)/[P(Xi) P(Yj)]}
Source entropy               H(X) = -ΣX P(Xi) log2 P(Xi)
Destination entropy          H(Y) = -ΣY P(Yj) log2 P(Yj)
Equivocation                 H(X|Y) = -ΣX ΣY P(Xi, Yj) log2 P(Xi|Yj)
Error entropy                H(Y|X) = -ΣX ΣY P(Xi, Yj) log2 P(Yj|Xi)

• Channel capacity

C = max I(xy), that is, the maximum information transfer.

Binary Symmetric Channels: if the noise in the system is random, the probabilities of error in '0' and '1' are the same. The channel is then characterised by a single value p, the binary error probability.

Channel capacity of this channel:

I(xy) = H(x) - H(x|y) = H(y) - H(y|x)

where H(y|x) = -Σx Σy p(xy) log p(y|x) = -Σx p(x) Σy p(y|x) log p(y|x), so

I(xy) = H(y) + Σx p(x) Σy p(y|x) log p(y|x)

Channel capacity of the BSC:

I(xy) = H(y) - H(y|x)
      = H(y) + Σx p(x) [p log p + (1 - p) log(1 - p)]
      = H(y) + p log p + (1 - p) log(1 - p)
      = H(y) - H(p)

where H(p) = -[p log p + (1 - p) log(1 - p)].

[Figure: BSC diagram. Transmitted symbols x = 0, 1 with p(0) and p(1) = 1 - p(0) (transmit); received symbols y (receive); each symbol is flipped with crossover probability p and passes correctly with probability 1 - p.]

Mutual information increases as the error rate p decreases. Since H(y) is maximised at 1 bit when the symbols are equiprobable, the capacity of the BSC is C = 1 - H(p).
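A sketch of the resulting capacity curve (assuming, as above, that H(y) is maximised at 1 bit by equiprobable symbols):

```python
from math import log2

def binary_entropy(p):
    """H(p) = -[p log2 p + (1 - p) log2(1 - p)], with H(0) = H(1) = 0."""
    if p in (0.0, 1.0):
        return 0.0
    return -(p * log2(p) + (1 - p) * log2(1 - p))

def bsc_capacity(p):
    """C = max I(xy) = 1 - H(p) for a BSC with error probability p."""
    return 1.0 - binary_entropy(p)

# Mutual information increases as the error rate decreases:
for p in (0.5, 0.25, 0.125, 0.01, 0.0):
    print(f"p = {p}: C = {bsc_capacity(p):.3f} bits")
```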