Distributed Source Coding



Trial Lecture

Fredrik Hekland

1 June 2007


Outline

● Concept of DSC
● Slepian-Wolf coding (lossless)
● Wyner-Ziv coding (lossy)
● Application areas


Distributed Source Coding - Sensor Networks


Correlated Sources

Figure: Venn-diagram view of two correlated sources X and Y, showing the entropy H(X), the conditional entropy H(Y|X), the joint entropy H(X,Y) and the mutual information I(X;Y).

Co-located, Correlated Observations

Block diagram: one source emits the correlated observations X and Y at a single location; a joint encoder compresses them at rate R for a joint decoder.

● X and Y correlated
● Both encoder and decoder know the correlation

R = H(X,Y) = H(Y) + H(X|Y) < H(X) + H(Y)
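As a quick numerical check of this identity, here is a small Python sketch using a hypothetical joint pmf for two correlated binary sources (the specific distribution is my own illustration, not from the lecture):

```python
import numpy as np

# Hypothetical joint pmf of two correlated binary sources X and Y.
# Rows index x in {0,1}, columns index y in {0,1}.
p_xy = np.array([[0.45, 0.05],
                 [0.05, 0.45]])

def H(p):
    """Entropy in bits of a probability vector/matrix (0*log0 treated as 0)."""
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

H_xy = H(p_xy)                 # joint entropy H(X,Y)
H_x  = H(p_xy.sum(axis=1))     # marginal entropy H(X)
H_y  = H(p_xy.sum(axis=0))     # marginal entropy H(Y)
H_x_given_y = H_xy - H_y       # chain rule: H(X|Y) = H(X,Y) - H(Y)

print(f"H(X)   = {H_x:.3f} bit")
print(f"H(Y)   = {H_y:.3f} bit")
print(f"H(X,Y) = {H_xy:.3f} bit")
print(f"H(Y) + H(X|Y) = {H_y + H_x_given_y:.3f} bit  (equals H(X,Y))")
print(f"H(X) + H(Y)   = {H_x + H_y:.3f} bit  (strictly larger for correlated X, Y)")
```

For this distribution the joint coder needs about 1.47 bit per source pair, while coding X and Y separately without exploiting the correlation would cost 2 bit.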


Distributed, but Correlated Observations

Block diagram: X enters Encoder 1, which produces rate R_X; Y enters Encoder 2, which produces rate R_Y; a joint decoder reconstructs X and Y from both bit streams.

● X and Y spatially separated, but still correlated
● Informed encoders. Rate: R = R_X + R_Y = H(X,Y) = H(Y) + H(X|Y)
● Uninformed, naive encoders. Rate: R = R_X + R_Y = H(X) + H(Y) > H(X,Y)

Slepian-Wolf Coding (SWC)


Block diagram: as above, X enters Encoder 1 (rate R_X) and Y enters Encoder 2 (rate R_Y), with a joint decoder; there is no communication between the encoders.

● X and Y spatially separated, but still correlated
● Encoder/decoder designed w.r.t. p(X,Y)
● No communication between encoders!
● R = R_X + R_Y = H(X,Y) = H(Y) + H(X|Y) still possible!!


Achievable Rate Region - SWC

Figure: the Slepian-Wolf rate region in the (R_X, R_Y) plane, bounded by R_X ≥ H(X|Y), R_Y ≥ H(Y|X) and R_X + R_Y ≥ H(X,Y). Rates with R_X ≥ H(X) and R_Y ≥ H(Y) need no errors at all; the rest of the region is achievable with vanishing error probability for long sequences. The corner point (H(X|Y), H(Y)) corresponds to coding X with Y as side-information, the corner point (H(X), H(Y|X)) to coding Y with X as side-information, and the segment between them is reached by time-sharing, source splitting or code partitioning.

Slepian & Wolf, “Noiseless Coding of Correlated Information Sources,” IEEE Trans. Inf. Theory, Jul. 1973

Principle - SWC

Figure: the Y^n sequences are coded conventionally with about 2^{nH(Y)} codewords, while the X^n sequences are assigned one of 2^{nH(X|Y)} colors (bins) at random and only the color index is sent.

R_Y = nH(Y), R_X = nH(X|Y)
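A minimal simulation of this binning principle, under assumed parameters (block length n = 16 and a binary-symmetric correlation with crossover probability 0.1, both my own choices): Y^n is taken to be available losslessly at the decoder, X^n is represented only by a random bin index, and the decoder picks the sequence in the announced bin that is closest to Y^n in Hamming distance. At such a short block length some decoding errors remain; the sketch only illustrates the mechanism.

```python
import itertools
import random

import numpy as np

rng = np.random.default_rng(0)
random.seed(0)

n, p = 16, 0.1                                    # assumed block length and crossover
h = lambda q: -q * np.log2(q) - (1 - q) * np.log2(1 - q)
n_bins = 2 ** int(np.ceil(n * (h(p) + 0.15)))     # slightly more than 2^{nH(X|Y)} bins

# Random binning: every length-n binary sequence gets a random color (bin index).
bin_of = {x: random.randrange(n_bins)
          for x in itertools.product((0, 1), repeat=n)}

errors, trials = 0, 200
for _ in range(trials):
    x = tuple(int(b) for b in rng.integers(0, 2, n))      # source block X^n
    y = tuple(b ^ int(rng.random() < p) for b in x)       # side information Y^n (BSC)
    sent_bin = bin_of[x]                                  # only the bin index is sent
    # Decoder: among the sequences in the announced bin, pick the one closest to y.
    in_bin = (z for z, b in bin_of.items() if b == sent_bin)
    x_hat = min(in_bin, key=lambda z: sum(a != c for a, c in zip(z, y)))
    errors += (x_hat != x)

print(f"{n_bins} bins (~{np.log2(n_bins):.1f} bit) vs. nH(X|Y) = {n * h(p):.1f} bit")
print(f"block error rate over {trials} trials: {errors / trials:.2f}")
```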


Toy Example – Binary Source

● X and Y are each 3 bits
● X and Y differ in at most one bit

1. Make sets (cosets) of X's with Hamming distance 3: {000,111}, {100,011}, {010,101}, {001,110}
2. Send the index of the set (requires 2 bits)
3. Send Y (requires 3 bits)
4. Decode X by using the element in the set which is closest to Y
5. Declare an error if no element with d_H ≤ 1
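The construction above is small enough to check exhaustively. A short Python sketch of it (my own illustration):

```python
from itertools import product

# The four cosets of the repetition code {000, 111}; within each coset the two
# words are at Hamming distance 3.
cosets = [("000", "111"), ("100", "011"), ("010", "101"), ("001", "110")]

def hamming(a, b):
    return sum(ca != cb for ca, cb in zip(a, b))

def encode_x(x):
    """Send only the 2-bit coset index instead of the 3-bit word."""
    return next(i for i, c in enumerate(cosets) if x in c)

def decode_x(index, y):
    """Pick the word in the indexed coset closest to the side-information Y."""
    best = min(cosets[index], key=lambda w: hamming(w, y))
    if hamming(best, y) > 1:
        raise ValueError("correlation assumption violated")
    return best

# Exhaustive check over all X and all Y with d_H(X, Y) <= 1.
for x in ("".join(bits) for bits in product("01", repeat=3)):
    for y in ("".join(bits) for bits in product("01", repeat=3)):
        if hamming(x, y) <= 1:
            assert decode_x(encode_x(x), y) == x

print("X recovered exactly for all pairs with d_H(X, Y) <= 1")
print("rate: 2 + 3 = 5 bits instead of 3 + 3 = 6 bits")
```

The decoder never confuses the two words of a coset: they are at Hamming distance 3 from each other, while Y is within distance 1 of X and therefore at distance at least 2 from the other word.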


SWC design

● Proof in Slepian & Wolf's article “non-constructive”
● Important realization: SWC is a channel coding problem
● “Virtual” correlation channel between X and Y
● A good channel code for this channel can provide a good SW code by using coset codes as bins

Wyner’s Scheme

● Use a linear block code, send the syndrome
● (n,k) block code: a set of 2^(n-k) syndromes, each corresponding to 2^k words of length n
● Each set is a coset code
● Compression ratio of n:(n-k)

Block diagram: X enters a lossless encoder (syndrome former), which sends syndrome bits at rate R ≥ H(X|Y) to a joint decoder that also observes Y.

A. Wyner, “Recent Results in the Shannon Theory,” IEEE Trans. Inf. Theory, Jan. 1974
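A sketch of this syndrome approach using a concrete (7,4) Hamming code (the choice of code is mine; the slide does not fix one). Seven source bits are compressed to n - k = 3 syndrome bits, and the decoder recovers X exactly whenever X and the side-information Y differ in at most one position:

```python
import numpy as np

# Parity-check matrix of the (7,4) Hamming code: column i is the binary
# representation of i+1, so a single-bit difference at position i yields syndrome i+1.
H = np.array([[0, 0, 0, 1, 1, 1, 1],
              [0, 1, 1, 0, 0, 1, 1],
              [1, 0, 1, 0, 1, 0, 1]], dtype=int)

def syndrome(v):
    """Syndrome former: 7 bits in, 3 bits out (compression ratio n:(n-k) = 7:3)."""
    return H @ v % 2

def decode(s_x, y):
    """Joint decoder: given the syndrome of X and side-information Y, recover X."""
    e_syn = (s_x + syndrome(y)) % 2                 # syndrome of the difference X xor Y
    x_hat = y.copy()
    pos = e_syn[0] * 4 + e_syn[1] * 2 + e_syn[2]    # differing position, 1-indexed (0 = none)
    if pos:
        x_hat[pos - 1] ^= 1                         # flip the single differing bit
    return x_hat

rng = np.random.default_rng(1)
for _ in range(1000):
    x = rng.integers(0, 2, 7)
    y = x.copy()
    if rng.random() < 0.8:                          # X and Y differ in at most one bit
        y[rng.integers(0, 7)] ^= 1
    assert np.array_equal(decode(syndrome(x), y), x)

print("7 source bits recovered from 3 syndrome bits plus side information")
```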


Practical SWC Design

● Use more powerful channel codes: LDPC / turbo codes
● Send parity bits: Zhao & Garcia-Frias, “Data compression of correlated non-binary sources using punctured turbo codes,” DCC 2002
● Or send syndrome: Liveris et al., “Compression of binary sources with side-information at the decoder using LDPC codes,” IEEE Commun. Lett., vol. 6, no. 10, 2002

SWC using LDPC codes

Xiong et al., “Distributed Source Coding for Sensor Networks,” IEEE Sig. Proc. Mag., Sept. 2004

Continuous Sources – Wyner-Ziv Coding (WZC)

● Generalizes SWC by introducing a fidelity criterion
● A joint source-channel coding problem
● We need
  - a good source coder to achieve the source coding gains (e.g. TCQ)
  - a good channel code which approaches the Slepian-Wolf limit (LDPC)


Wyner-Ziv Rate-Distortion Function

Block diagram: X passes through a test channel p(z|x) to give Z; the decoder forms the reconstruction X̂ = f(Z, Y) from Z and the side-information Y.

R_WZ(d) = min I(X;Z|Y), under the following conditions:
  - the minimum is over test channels p(z|x) and reconstruction functions f: Z × Y → X̂
  - E[d(X, X̂)] ≤ d, with d ≥ 0
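For the quadratic Gaussian case, which the DISCUS results on the next slide are measured against, the Wyner-Ziv rate-distortion function takes the closed form R_WZ(D) = max(0, (1/2)·log2(Var(X|Y)/D)) bit per sample, and a classical result is that there is no rate loss compared to having Y at the encoder as well. A small sketch evaluating this formula, with an assumed conditional variance of my own choosing:

```python
import numpy as np

def wz_rate_gaussian(D, var_x_given_y):
    """Wyner-Ziv rate-distortion function, quadratic Gaussian case:
    R_WZ(D) = max(0, 0.5*log2(Var(X|Y)/D)) bits per sample."""
    return max(0.0, 0.5 * np.log2(var_x_given_y / D))

var_x_given_y = 1.0          # assumed conditional variance of X given Y
for D in (0.5, 0.25, 0.1, 0.01):
    print(f"D = {D:5.2f}  ->  R_WZ = {wz_rate_gaussian(D, var_x_given_y):.2f} bit/sample")
```

The dB figures quoted for practical designs below are gaps between their measured distortion and this bound at the same rate.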

Distributed Source Coding Using Syndromes (DISCUS)

● First constructive design approach for WZC
● Trellis-based quantization and coset construction
  - 2-5 dB away from the WZ bound
● [Yang et al. '03]: SWC-TCVQ
  - irregular LDPC, n = 10^6
  - 2-D TCVQ
  - quadratic Gaussian: 0.47 dB away for 3.3 bit/sym

Pradhan & Ramchandran, “Distributed Source Coding Using Syndromes (DISCUS): Design and Construction,” Data Compression Conf. (DCC), 1999


Other Approaches to Lossy DSC

● Distributed Karhunen-Loève transform
  - local minima
● Distributed scalar quantizers optimized for noisy channels
  - simpler encoder
  - local minima


Application Areas

● Sensor networks
● Multimedia transmission
● Robust coding for co-located sources
  - digitally enhanced analog TV
  - multiple description coding
● Data hiding / watermarking
● Coding for multiple access channels
● MIMO broadcast channel
● Searchable compression (…)


Sensor Networks

Figure: several remote sensors reporting to a sink.

● Possible rate savings with WZC
● Hard to find the correlation model
  - can be determined through training
  - but what about time-varying correlation?


Wyner-Ziv for Video Compression (1/3)

Figure: a portable device runs a Wyner-Ziv video encoder; inside the network infrastructure a Wyner-Ziv video decoder followed by an MPEG encoder transcodes the stream, so the receiver only needs an MPEG decoder.

● MPEG: high encoder complexity
● Portables: less powerful hardware
● Solution: Wyner-Ziv video coding
  - shifts complexity to the decoder
  - transcoding to MPEG provides a simple decoder for the receiver

Wyner-Ziv for Video Compression (2/3)


Block diagram: a Wyner-Ziv video frame X goes through the intraframe encoder (scalar quantizer followed by a Slepian-Wolf codec built from a turbo encoder and buffer); the interframe decoder runs a turbo decoder and reconstruction, using side-information Y interpolated from the key frames.

Wyner-Ziv for Video Compression (3/3)

Girod et al., “Distributed Video Coding,” Proc. IEEE, Jan. 2005


Digitally Enhanced Analog TV

Watermarking

Block diagram: the encoder embeds the message W in the host X to produce Z = X + W; an attacker turns Z into Y; the decoder recovers the message from Y.

● “Hide” a message W inside a host X
● A dual problem to DSC
  - channel coding with side-information at the encoder
● Attacker tries to remove/destroy the watermark W
  - the source X must be preserved
● For an AWGN attack, knowledge of X only at the encoder is as good as knowing X at both encoder and decoder

Costa, “Writing on Dirty Paper,” IEEE Trans. Inf. Theory, May 1983
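Costa's result can be made concrete with a small calculation (the power values below are illustrative choices of my own): with watermark power P, host interference power Q known only at the encoder, and an AWGN attack of power N, dirty-paper coding achieves (1/2)·log2(1 + P/N), the same as if the host X were absent, whereas an encoder that ignores its knowledge of X sees the host as extra noise:

```python
import numpy as np

def capacity_awgn(P, N):
    """Capacity of a real AWGN channel, 0.5*log2(1 + SNR) bits per use."""
    return 0.5 * np.log2(1 + P / N)

P, Q, N = 1.0, 10.0, 0.5   # illustrative watermark, host, and attack-noise powers

c_dirty_paper = capacity_awgn(P, N)       # host known at encoder: interference costs nothing
c_ignore_host = capacity_awgn(P, N + Q)   # host treated as noise by an uninformed encoder

print(f"dirty-paper coding   : {c_dirty_paper:.2f} bit/use")
print(f"host treated as noise: {c_ignore_host:.2f} bit/use")
```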

MIMO Broadcast Channel

● Non-degraded broadcast channel
  - cannot use superposition coding with successive decoding (complexity at the receiver)
  - related to watermarking: dirty paper coding! (complexity at the transmitter)
● Costa's “writing on dirty paper” scheme
  - adapt to the interference, don't try to cancel it
  - User 1's signal is the host; insert the “watermark” as the message to User 2


Summary

● Distributed Source Coding
  - enables compression of correlated, spatially separated sources
  - Slepian-Wolf coding: lossless
  - Wyner-Ziv coding: lossy
● Other uses
  - multimedia
  - watermarking
  - multiple access / broadcast channels / MIMO


Further Reading

Slepian & Wolf, “Noiseless Coding of Correlated Information Sources,” IEEE Trans. Inf. Theory, Jul. 1973
Wyner & Ziv, “The Rate-Distortion Function for Source Coding with Side Information at the Decoder,” IEEE Trans. Inf. Theory, Jan. 1976
Pradhan & Ramchandran, “Distributed Source Coding Using Syndromes (DISCUS): Design and Construction,” IEEE Trans. Inf. Theory, Mar. 2003
Pradhan et al., “Distributed Compression in a Dense Microsensor Network,” IEEE Sig. Proc. Mag., Mar. 2002
Xiong et al., “Distributed Source Coding for Sensor Networks,” IEEE Sig. Proc. Mag., Sept. 2004
Yang et al., “Wyner-Ziv Coding Based on TCQ and LDPC Codes,” 37th Asilomar Conference on Signals, Systems and Computers, 2004
Girod et al., “Distributed Video Coding,” Proc. IEEE, Jan. 2005
Cox et al., “Watermarking as Communications with Side Information,” Proc. IEEE, Jul. 1999