
Chapter 8

Fuzzy Associative Memories

Li Lin 2004-11-24

CONTENTS

- Review
- Fuzzy systems as between-cube mappings
- Fuzzy and neural function estimators
- Fuzzy Hebb FAMs
- Adaptive FAMs

Review

- Chapter 2 mentioned the BAM theorem.
- Chapter 7 discussed fuzzy sets as points in the unit hypercube.
- What is an associative memory?

Fuzzy systems

Kosko: fuzzy systems are between-cube mappings; each cube is built on a universe of discourse.

Fig.1 A fuzzy system $S: I^n \to I^p$, mapping the unit cube $I^n$ into the unit cube $I^p$.

Continuous fuzzy systems behave as associative memories, or fuzzy associative memories (FAMs).

Fuzzy and neural function estimators

Fuzzy and neural systems estimate sampled functions and behave as associative memories.

Similarities:

1. Both are model-free estimators.
2. Both learn from samples.
3. Both are numerical, unlike symbolic AI.

Differences:

They differ in how they estimate the sampled function:

1. during system construction;
2. the kind of samples used;
3. the applications;
4. how they represent and store those samples;
5. how they perform associative inference.

Fig.2 Function f maps domain X to range Y.

Neural vs. fuzzy representation of structured knowledge: neural networks

Problems:

1. the computational burden of training;
2. system inscrutability: there is no natural inferential audit trail; the system behaves like a computational black box;
3. sample generation.

Neural vs. fuzzy representation of structured knowledge: fuzzy systems

1. directly encode the linguistic sample (HEAVY, LONGER) in a matrix;
2. combine the numerical approach with the symbolic one.

The fuzzy approach does not abandon neural networks; it limits them to unstructured parameter and state estimation, pattern recognition, and cluster formation.

FAMs as mappings

- Fuzzy associative memories are transformations: a FAM maps fuzzy sets to fuzzy sets, unit cubes to unit cubes.
- The associative matrices are accessed in parallel and stored separately.
- Numerical point inputs permit a simplification: binary input-output FAMs, or BIOFAMs.

Fig.3 Three possible fuzzy subsets of the traffic-density space X (Light, Medium, Heavy; density 0–200) and the green-light-duration space Y (Short, Medium, Long; duration 0–40).

Fuzzy vector-matrix multiplication: max-min composition

Max-min composition:

$B = A \circ M$

where $A = (a_1, \ldots, a_n)$, $B = (b_1, \ldots, b_p)$, $M$ is a fuzzy $n \times p$ matrix, and

$b_j = \max_{1 \le i \le n} \min(a_i, m_{ij})$

Fuzzy vector-matrix multiplication: max-min composition

Example: suppose $A = (.3\ \ .4\ \ .8\ \ 1)$ and

$M = \begin{pmatrix} .2 & .8 & .7 \\ .7 & .6 & .6 \\ .8 & .1 & .5 \\ 0 & .2 & .3 \end{pmatrix}$

Then $B = A \circ M = (.8\ \ .4\ \ .5)$.

Max-product composition:

$b_j = \max_{1 \le i \le n} a_i\, m_{ij}$
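A minimal NumPy sketch (not part of the original slides) of both compositions, reproducing the example above:

```python
import numpy as np

def max_min_compose(A, M):
    # b_j = max_i min(a_i, m_ij)
    return np.max(np.minimum(A[:, None], M), axis=0)

def max_product_compose(A, M):
    # b_j = max_i (a_i * m_ij)
    return np.max(A[:, None] * M, axis=0)

A = np.array([.3, .4, .8, 1.0])
M = np.array([[.2, .8, .7],
              [.7, .6, .6],
              [.8, .1, .5],
              [0., .2, .3]])

print(max_min_compose(A, M))      # [0.8 0.4 0.5]
print(max_product_compose(A, M))  # [0.64 0.24 0.4]
```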

Fuzzy Hebb FAMs

Classical Hebbian learning law (signal Hebb law):

$\dot{m}_{ij} = -m_{ij} + S_i(x_i)\, S_j(y_j)$

Correlation-minimum encoding:

$m_{ij} = \min(a_i, b_j)$

In matrix form, the fuzzy Hebb matrix is the fuzzy outer product

$M = A^T \circ B = \begin{pmatrix} a_1 \\ \vdots \\ a_n \end{pmatrix} \circ B = \begin{pmatrix} a_1 \wedge B \\ \vdots \\ a_n \wedge B \end{pmatrix}$

equivalently, the $j$-th column of $M$ is $b_j \wedge A^T$.

Example: with $A = (.3\ \ .4\ \ .8\ \ 1)$ and $B = (.8\ \ .4\ \ .5)$,

$M = A^T \circ B = \begin{pmatrix} .3 & .3 & .3 \\ .4 & .4 & .4 \\ .8 & .4 & .5 \\ .8 & .4 & .5 \end{pmatrix}$
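A short sketch of correlation-minimum encoding and recall in NumPy (illustration only, using the example above):

```python
import numpy as np

A = np.array([.3, .4, .8, 1.0])
B = np.array([.8, .4, .5])

# Correlation-minimum encoding: m_ij = min(a_i, b_j)
M = np.minimum(A[:, None], B[None, :])
# M rows: [.3 .3 .3], [.4 .4 .4], [.8 .4 .5], [.8 .4 .5]

# Max-min recall; since H(A) = 1 >= H(B) = .8, B is recovered exactly
recalled = np.max(np.minimum(A[:, None], M), axis=0)
print(recalled)  # [0.8 0.4 0.5]
```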

The bidirectional FAM theorem for correlation-minimum encoding

The height of fuzzy set $A$:

$H(A) = \max_{1 \le i \le n} a_i$

A fuzzy set $A$ is normal if $H(A) = 1$.

Correlation-minimum bidirectional FAM theorem: if $M = A^T \circ B$, then

(i) $A \circ M = B$ iff $H(A) \ge H(B)$
(ii) $B \circ M^T = A$ iff $H(B) \ge H(A)$
(iii) $A' \circ M \subset B$ for any $A'$
(iv) $B' \circ M^T \subset A$ for any $B'$

The bidirectional FAM theorem for correlation-minimum encoding

Proof. First,

$A \circ A^T = \max_{1 \le i \le n} \min(a_i, a_i) = \max_{1 \le i \le n} a_i = H(A)$

Then

$A \circ M = A \circ (A^T \circ B) = (A \circ A^T) \circ B = H(A) \circ B = H(A) \wedge B$

So $A \circ M = H(A) \wedge B = B$ iff $H(A) \ge H(B)$.
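A quick numeric check of the identity $A \circ M = H(A) \wedge B$ (my own illustration, not from the slides): with a subnormal $A$, recall only returns $B$ clipped at $H(A)$.

```python
import numpy as np

A = np.array([.3, .2])   # subnormal: H(A) = .3
B = np.array([.9, .5])   # H(B) = .9 > H(A)

M = np.minimum(A[:, None], B[None, :])               # correlation-minimum
recalled = np.max(np.minimum(A[:, None], M), axis=0)

print(recalled)                # [0.3 0.3]
print(np.minimum(A.max(), B))  # H(A) ^ B = [0.3 0.3], as the proof predicts
```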

Correlation-product encoding

Correlation-product encoding provides an alternative fuzzy Hebbian encoding scheme:

$M = A^T B$, with $m_{ij} = a_i\, b_j$

Example: with $A = (.3\ \ .4\ \ .8\ \ 1)$ and $B = (.8\ \ .4\ \ .5)$,

$M = A^T B = \begin{pmatrix} .24 & .12 & .15 \\ .32 & .16 & .2 \\ .64 & .32 & .4 \\ .8 & .4 & .5 \end{pmatrix}$
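The same example in NumPy (a sketch, not the book's code): the outer product gives the correlation-product matrix, and max-min recall still reproduces B because A is normal.

```python
import numpy as np

A = np.array([.3, .4, .8, 1.0])
B = np.array([.8, .4, .5])

M = np.outer(A, B)  # correlation-product encoding: m_ij = a_i * b_j

# Max-min recall; H(A) = 1, so B is recovered exactly
recalled = np.max(np.minimum(A[:, None], M), axis=0)
print(recalled)     # [0.8 0.4 0.5]
```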

Correlation-product encoding

Correlation-product bidirectional FAM theorem: if $M = A^T B$, then

(i) $A \circ M = B$ iff $H(A) = 1$
(ii) $B \circ M^T = A$ iff $H(B) = 1$
(iii) $A' \circ M \subset B$ for any $A'$
(iv) $B' \circ M^T \subset A$ for any $B'$

FAM system architecture

(Figure) The FAM rules $(A_1, B_1), \ldots, (A_m, B_m)$ are stored as separate associative matrices. An input $A$ feeds all m FAM rules in parallel: rule k recalls $B'_k$, the recalled vectors are weighted ($w_k$) and summed to $B$, and a defuzzifier converts $B$ to a crisp output $y_j$.

Superimposing FAM rules

Suppose there are m FAM rules or associations $(A_k, B_k)$. The natural neural-network approach superimposes the m associative matrices $M_k$ in a single matrix M by maximum or addition:

$M = \max_{1 \le k \le m} M_k$ or $M = \sum_k M_k$

This superimposition scheme fails for fuzzy Hebbian encoding. The fuzzy approach to the superimposition problem additively superimposes the m recalled vectors $B'_k$ instead of the fuzzy Hebb matrices $M_k$:

$B'_k = A \circ M_k = A \circ (A_k^T \circ B_k) \subset B_k$
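A sketch of this additive superimposition scheme (the rule data here is made up for illustration): each rule keeps its own Hebb matrix, recall runs in parallel, and the recalled vectors, not the matrices, are summed.

```python
import numpy as np

def max_min_compose(A, M):
    return np.max(np.minimum(A[:, None], M), axis=0)

# Two hypothetical FAM rules (A_k, B_k)
rules = [(np.array([.3, .4, .8, 1.0]), np.array([.8, .4, .5])),
         (np.array([1.0, .6, .2, .1]), np.array([.2, .9, .6]))]

A = np.array([.5, .9, .3, .2])  # fit-vector input

# Encode each rule in its own correlation-minimum matrix M_k
Ms = [np.minimum(Ak[:, None], Bk[None, :]) for Ak, Bk in rules]

# Recall each B'_k in parallel (each B'_k is contained in B_k) ...
recalled = [max_min_compose(A, Mk) for Mk in Ms]

# ... and additively superimpose the recalled vectors, not the matrices
B = np.sum(recalled, axis=0)
print(B)
```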

Superimposing FAM rules

Disadvantage: separate storage of FAM associations consumes space.

Advantages:
1. provides an "audit trail" of the FAM inference procedure;
2. avoids crosstalk;
3. provides knowledge-base modularity;
4. a fit-vector input A activates all the FAM rules in parallel, but to different degrees.


Recalled outputs and "defuzzification"

The recalled output B equals a weighted sum of the individual recalled vectors:

$B = \sum_{k=1}^{m} w_k B'_k$

How to defuzzify?

1. Maximum-membership defuzzification:

$m_B(y_{\max}) = \max_{1 \le j \le p} m_B(y_j)$

Simple, but it has two fundamental problems: ① the mode of the B distribution is not unique; ② it ignores the information in the waveform B.

Recalled outputs and "defuzzification"

2. Fuzzy centroid defuzzification:

$\bar{B} = \dfrac{\sum_{j=1}^{p} y_j\, m_B(y_j)}{\sum_{j=1}^{p} m_B(y_j)}$

The fuzzy centroid is unique and uses all the information in the output distribution B.
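A minimal sketch of fuzzy centroid defuzzification, assuming a discretized output universe (the values below are illustrative):

```python
import numpy as np

def centroid_defuzzify(y, mB):
    # Weighted average of the output universe by the fit values of B
    return np.sum(y * mB) / np.sum(mB)

y = np.array([0., 10., 20., 30., 40.])  # output universe, e.g. seconds
mB = np.array([.1, .4, .9, .6, .2])     # combined output fit vector B
print(centroid_defuzzify(y, mB))        # ~21.8, a unique crisp output
```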

Thank you!