Transcript Lecture 6

S. Mandayam/ ANN/ECE Dept./Rowan University
Artificial Neural Networks
ECE.09.454/ECE.09.560
Fall 2010
Lecture 6
October 18, 2010
Shreekanth Mandayam
ECE Department
Rowan University
http://engineering.rowan.edu/~shreek/fall10/ann/
Plan
• Radial Basis Function Networks
• RBF Formulation
• Network Implementation
• Matlab Implementation
• RBF Design Issues
• K-means clustering algorithm
• Adaptive techniques
• Lab Project 3
• Final Project Discussion
RBF Principle
Non-linearly separable classes → transform to a “higher”-dimensional vector space → linearly separable classes
Example: X-OR Problem
 x1   x2 | φ1(x)  φ2(x) |  y   y'
  0    0 | 0.13   1     |  0    0
  0    1 | 0.36   0.36  |  1    1
  1    0 | 0.36   0.36  |  1    1
  1    1 | 1      0.13  |  0    0

[Figure: decision boundary separating the two classes in the (φ1(x), φ2(x)) plane; in the original (x1, x2) plane the classes are not linearly separable.]
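The basis-function values in the table can be reproduced with Gaussian units. The centers c1 = [1, 1] and c2 = [0, 0] are the standard choice for this XOR example; the slide's 0.13 and 0.36 are e⁻² ≈ 0.135 and e⁻¹ ≈ 0.368, truncated:

```python
import numpy as np

# Gaussian basis function phi(x) = exp(-||x - c||^2), with assumed
# centers c1 = [1,1] and c2 = [0,0] for the XOR example.
def phi(x, c):
    return np.exp(-np.sum((np.asarray(x, float) - np.asarray(c, float))**2))

c1, c2 = [1, 1], [0, 0]
for x in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    # each row reproduces one line of the table (phi1, phi2)
    print(x, round(phi(x, c1), 3), round(phi(x, c2), 3))
```

In the (φ1, φ2) plane the four points collapse to three locations, and the two classes become separable by a single straight line.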
RBF Formulation
Problem Statement
• Given a set of N distinct real data vectors
(xj; j=1,2,…,N) and a set of N real numbers
(dj; j=1,2,…,N), find a function that satisfies
the interpolating condition
F(xj) = dj;
j=1,2,…,N
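The interpolation problem can be sketched directly: with one Gaussian basis function centered at each data point, the condition F(xj) = dj becomes a linear system Φw = d. This is a minimal sketch, assuming a Gaussian kernel with width σ = 1 (not specified in the problem statement):

```python
import numpy as np

# Exact RBF interpolation: N Gaussian units centered at the N data
# points; solving Phi @ w = d enforces F(x_j) = d_j at every point.
def rbf_interpolate(X, d, sigma=1.0):
    # Phi[j, i] = exp(-||x_j - x_i||^2 / (2 sigma^2))
    sq = np.sum((X[:, None, :] - X[None, :, :])**2, axis=-1)
    Phi = np.exp(-sq / (2 * sigma**2))
    w = np.linalg.solve(Phi, d)   # weights satisfying the interpolating condition
    return Phi, w

X = np.linspace(0, 4, 9).reshape(-1, 1)   # sample training inputs
d = np.sin(np.pi * X).ravel()             # sample targets
Phi, w = rbf_interpolate(X, d)
print(np.allclose(Phi @ w, d))            # interpolating condition holds: True
```

For distinct data points the Gaussian interpolation matrix Φ is positive definite, so the system always has a unique solution.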
RBF Network
[Figure: RBF network architecture — inputs x1, x2, x3 plus a bias enter the input layer; a hidden layer of radial basis units φ feeds an output layer through weights wij, producing outputs y1 and y2. Inset: the Gaussian profile φ(t), equal to 1 at t = 0 and decaying toward 0 over -5 ≤ t ≤ 5.]

Each hidden unit computes a Gaussian of the distance between the input and its center:

φij = exp( -||xi - cj||² / (2σ²) )
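The forward pass through such a network can be sketched as follows; the centers, output weights, and width σ here are illustrative values, not taken from the lecture:

```python
import numpy as np

# Sketch of an RBF network forward pass: Gaussian hidden layer
# followed by a linear output layer. C, W, sigma are assumed values.
def rbf_forward(x, C, W, sigma=1.0):
    # hidden layer: Gaussian of the distance to each center c_j
    phi = np.exp(-np.sum((C - x)**2, axis=1) / (2 * sigma**2))
    # output layer: linear combination of hidden activations
    return W @ phi

C = np.array([[1.0, 1.0], [0.0, 0.0]])   # two hidden units (XOR centers)
W = np.array([[-1.0, -1.0]])             # illustrative output weights
print(rbf_forward(np.array([0.0, 1.0]), C, W, sigma=np.sqrt(0.5)))
```

Note that only the output weights W are linear parameters; once the centers and widths are fixed, training reduces to linear least squares.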
Matlab Implementation
Matlab Demos
» demorb1
» demorb3
» demorb4
%Radial Basis Function Network
%S. Mandayam/ECE Dept./Rowan University
%Neural Nets/Fall 10
clear;close all;
%generate training data (input and target)
p = [0:0.25:4];
t = sin(p*pi);
%Define and train RBF Network
net = newrb(p,t);
plot(p,t,'*r');hold on;
%generate test data
p1 = [0:0.1:4];
%test network
y = sim(net,p1);
plot(p1,y,'ob');
legend('Training','Test');
xlabel('input, p');
ylabel('target, t')
RBF - Center Selection
[Figure: data points scattered in the (x1, x2) plane, with a smaller set of cluster centers marked among them.]
K-means Clustering Algorithm
• N data points, xi; i = 1, 2, …, N
• At time-index n, define K clusters with cluster centers
cj(n); j = 1, 2, …, K
• Initialization: At n = 0, let cj(0) = xj; j = 1, 2, …, K
(i.e. choose the first K data points as cluster centers)
• Compute the Euclidean distance of each data point from each
cluster center: dij = d(xi , cj(n))
• Assign xi to cluster j if dij = minj {dij};
i = 1, 2, …, N; j = 1, 2, …, K
• For each cluster j = 1, 2, …, K, update the cluster center:
cj(n+1) = mean {xi assigned to cluster j}
• Repeat until ||cj(n+1) - cj(n)|| < ε
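The steps above can be sketched in a few lines; the data, K, and tolerance ε are illustrative:

```python
import numpy as np

# K-means sketch following the steps above: initialize centers to the
# first K points, assign each point to its nearest center, recompute
# centers as cluster means, stop when centers move less than eps.
def k_means(X, K, eps=1e-6, max_iter=100):
    C = X[:K].copy()                          # first K data points as centers
    for _ in range(max_iter):
        # Euclidean distances d_ij between every point i and center j
        d = np.linalg.norm(X[:, None, :] - C[None, :, :], axis=-1)
        labels = d.argmin(axis=1)             # assign x_i to nearest cluster
        C_new = np.array([X[labels == j].mean(axis=0) for j in range(K)])
        if np.linalg.norm(C_new - C) < eps:   # convergence test
            return C_new, labels
        C = C_new
    return C, labels

X = np.array([[0.0, 0.0], [0.2, 0.1], [5.0, 5.0], [5.1, 4.9]])
C, labels = k_means(X, K=2)
print(C)   # cluster centers near [0.1, 0.05] and [5.05, 4.95]
```

In the RBF context, the resulting cluster centers serve as the basis-function centers cj, replacing the exact-interpolation choice of one center per data point.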
Summary