
Slide 1

-Artificial Neural NetworkChapter 3 Perceptron

Chaoyang University of Technology
Department of Information Management
Prof. Li-Hua Li


Slide 2

Outline
•History
•Structure
•Learning Process
•Recall Process
•Solving OR Problem
•Solving AND Problem
•Solving XOR Problem

Chaoyang University of Technology, Prof. Li-Hua Li


Slide 3

History of Perceptron Model
In 1957, Rosenblatt and several other researchers developed the
perceptron, which used a network similar to the one proposed by
McCulloch and Pitts, together with a learning rule for training the
network to solve pattern-recognition problems.
(*) This model was later criticized by Minsky, who proved that it
cannot solve the XOR problem.



Slide 4

Structure
The network structure includes:
Input layer: input variables carry binary-type information.
The number of nodes depends on the problem dimension.
Processing node: uses a linear activation function, i.e.,
net_j = Σ_{i=1..n} W_ij · X_i − θ_j, where the bias θ_j is used.
Output layer: the computed result is
generated through the transfer function.
Transfer function: discrete type,
i.e., a step function.
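In code, the computation of one processing node can be sketched as follows (a minimal illustration in Python; the weight, bias, and input values are hypothetical, not taken from the slides):

```python
# Sketch of one perceptron processing node (hypothetical values).
w = [0.5, 0.5]      # weights W_1j, W_2j
theta = 0.7         # bias θ_j
x = [1, 1]          # binary input vector

# net_j = Σ W_ij * X_i − θ_j
net = sum(wi * xi for wi, xi in zip(w, x)) - theta

# Discrete (step) transfer function
y = 1 if net > 0 else 0
print(y)  # → 1, since net = 0.3 > 0
```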



Slide 5

Perceptron Network Structure
(Figure: a two-layer perceptron. Inputs X1 and X2 feed nodes f1 and f2
through weights W11, W21, W12, W22; f1 and f2 feed the output node f3
through weights W13 and W23, producing the output Y1.)


Slide 6

The training process
The training steps (one layer at a time):
1. Choose the network layers, nodes, and connections.
2. Randomly assign the weights W_ij and the bias θ_j.
3. Input the training sets X_i (preparing T_j for verification).
4. Training computation:
net_j = Σ_i W_ij · X_i − θ_j

Y_j = 1, if net_j > 0
Y_j = 0, if net_j ≤ 0



Slide 7

The training process
5. Training computation:
If (T_j − Y_j) ≠ 0, then:

ΔW_ij = η (T_j − Y_j) X_i
Δθ_j = −η (T_j − Y_j)

where η is the learning rate. Update the weights and bias:

W_ij (new) = W_ij + ΔW_ij
θ_j (new) = θ_j + Δθ_j

6. Repeat steps 3 to 5 until every input pattern is satisfied, i.e.,
(T_j − Y_j) = 0.
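Steps 3 to 6 above can be sketched as a short training loop (a minimal sketch in Python; the function name, the learning rate default η = 0.1, and the epoch limit are assumptions, not part of the slides):

```python
def train_perceptron(patterns, w, theta, eta=0.1, max_epochs=100):
    """Single-node perceptron training: steps 3-5, repeated (step 6)
    until every pattern gives T_j - Y_j = 0."""
    for _ in range(max_epochs):
        all_correct = True
        for x, t in patterns:
            # Step 4: net_j = Σ W_ij * X_i − θ_j, then step transfer function
            net = sum(wi * xi for wi, xi in zip(w, x)) - theta
            y = 1 if net > 0 else 0
            # Step 5: adjust weights and bias when (T_j − Y_j) ≠ 0
            if t - y != 0:
                all_correct = False
                w = [wi + eta * (t - y) * xi for wi, xi in zip(w, x)]
                theta = theta - eta * (t - y)
        if all_correct:  # step 6: every pattern satisfied
            break
    return w, theta
```

Here `patterns` is a list of `(X, T)` pairs; the loop stops as soon as one full pass over the training set produces no errors.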



Slide 8

The recall process
After the network has been trained as described above, any input
vector X can be fed into the perceptron network to derive the
computed output. The ratio of correct outputs to the total number of
outputs is treated as the prediction performance of the network.
The trained weights W_ij and the bias θ_j are used to derive net_j,
and the output Y_j can therefore be obtained for pattern recognition
(or for prediction).
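The recall step can be sketched as follows (a minimal sketch in Python; the function name is an assumption, and any trained weights passed in are hypothetical):

```python
def recall(patterns, w, theta):
    """Recall: feed each input vector X through the trained perceptron
    and return the ratio of correct outputs (prediction performance)."""
    correct = 0
    for x, t in patterns:
        net = sum(wi * xi for wi, xi in zip(w, x)) - theta
        y = 1 if net > 0 else 0
        correct += int(y == t)
    return correct / len(patterns)
```

For example, with the hypothetical trained values w = [1, 1] and θ = 1.5, recall over the four AND patterns returns 1.0 (all patterns classified correctly).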


Slide 9


Slide 10


Slide 11


Slide 12



Slide 13


Slide 14

Example: Solving the AND problem
•This is a problem for recognizing the AND pattern
•Let the following training patterns be used:

X1  X2  T
 0   0  0
 0   1  0
 1   0  0
 1   1  1

(Figure: the four training patterns plotted in the X1-X2 plane, fed
into processing node f1.)
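As a sketch, the training procedure from the previous slides can be run on these four patterns (self-contained Python; the learning rate η = 0.1 and the zero initial weights and bias are assumed starting values, not given in the slides):

```python
# Train a single perceptron node on the AND patterns.
patterns = [([0, 0], 0), ([0, 1], 0), ([1, 0], 0), ([1, 1], 1)]
w, theta, eta = [0.0, 0.0], 0.0, 0.1  # hypothetical initial values

for _ in range(100):                   # step 6: repeat until satisfied
    errors = 0
    for x, t in patterns:
        net = sum(wi * xi for wi, xi in zip(w, x)) - theta
        y = 1 if net > 0 else 0
        if t != y:                     # step 5: adjust on error
            errors += 1
            w = [wi + eta * (t - y) * xi for wi, xi in zip(w, x)]
            theta -= eta * (t - y)
    if errors == 0:
        break

# The learned line w1*x1 + w2*x2 = θ separates (1,1) from the others.
outputs = [1 if sum(wi * xi for wi, xi in zip(w, x)) - theta > 0 else 0
           for x, _ in patterns]
print(outputs)  # → [0, 0, 0, 1]
```

Because AND is linearly separable, the loop converges after a few epochs; for XOR no such line exists, which is exactly Minsky's objection mentioned earlier.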


Slide 15


Slide 16


Slide 17



Slide 18



Slide 19



Slide 20


Slide 21


Slide 22


Slide 23



Slide 24



Slide 25



Slide 26
