A Comparative Study of Some Multiple Expert Recognition Decision Combination


Combination of Multiple Classifiers for Pattern Classification: Hybridization of Majority Voting and Divide and Conquer Techniques
A. F. R. Rahman, BCL Computers Inc., Santa Clara, Calif., USA
M. C. Fairhurst, University of Kent, Canterbury, Kent, UK
Presentation Outline
• Multiple Expert Classification
• Majority Voting Technique
• Divide and Conquer Technique
• Concept of Hybridization
• Problem Selection (Database/Experts)
• Performance
• Discussion and Conclusion
Basic Problem Statement
• Given a number of experts working on the same problem, is a group decision superior to individual decisions?
Ghosts from the Past…
• Jean-Charles de Borda (1781)
• N. C. de Condorcet (1785)
• Laplace (1795)
• Isaac Todhunter (1865)
• C. L. Dodgson (Lewis Carroll) (1873)
• M. W. Crofton (1885)
• E. J. Nanson (1907)
• Francis Galton (1907)
Is Democracy the answer?
• Infinite Number of Experts
• Each Expert Should be Competent
How Does It Relate to Pattern Classification?
Each Expert has its own:
• Strengths and Weaknesses
• Peculiarities
• Fresh Approach to Feature Extraction
• Fresh Approach to Classification
• But NOT 100% Correct!
Practical Resource Constraints
Unfortunately, We Have Limited
• Number of Experts
• Number of Training Samples
• Feature Size
• Classification Time
• Memory Size
Solution
• Clever Algorithms to Exploit Experts:
– Complementary Information
– Redundancy: Checks and Balances
– Simultaneous Use of Arbitrary Features and Classification Routines
Majority Voting
[Diagram: Classifier #1, Classifier #2, …, Classifier #n feed a common Final Decision Combination stage]
Decision Fusion: Counting Individual Votes to Support a Classification
At least k classifiers have to agree, where
k = n/2 + 1 (n even)
k = (n+1)/2 (n odd)
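A minimal sketch of this voting rule, assuming each classifier exposes a hypothetical predict() method returning a class label; the label is accepted only when at least k classifiers agree, otherwise the pattern is rejected:

from collections import Counter

def majority_vote(predictions):
    """Fuse individual class labels by majority voting.

    predictions: one label per classifier.
    Returns the winning label, or None (reject) when no label
    reaches the required majority k.
    """
    n = len(predictions)
    # k = n/2 + 1 for even n, (n+1)/2 for odd n (as on the slide)
    k = n // 2 + 1 if n % 2 == 0 else (n + 1) // 2
    label, votes = Counter(predictions).most_common(1)[0]
    return label if votes >= k else None

# Example: four experts, three agree on '8' -> accepted as '8'
print(majority_vote(['8', '8', '3', '8']))   # '8'
print(majority_vote(['8', '3', '5', '1']))   # None (rejected)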
Majority Voting: Analysis
• Probability that x classifiers arrive at the correct decision (and y classifiers at the wrong one):
  $P_{\mathrm{correct}} = \frac{(x+y)!}{x!\,y!}\, P_c^{x} P_e^{y}$
  and the probability of arriving at the wrong decision is:
  $P_{\mathrm{wrong}} = \frac{(x+y)!}{x!\,y!}\, P_c^{y} P_e^{x}$
• The Precondition of Correctness (Condorcet): if $x > y$ and $P_c > P_e$, then $P_c^{x} P_e^{y} > P_c^{y} P_e^{x}$.
Majority Voting: Analysis (cont.)
Assuming x and y to be constant, and writing the ratio of correct to wrong decision probability as $\zeta = P_{\mathrm{correct}} / P_{\mathrm{wrong}} = (P_c / P_e)^{x-y}$ with $P_e = 1 - P_c$,
$\frac{\partial \zeta}{\partial P_c} = \frac{x-y}{P_e^{2}} \left(\frac{P_c}{P_e}\right)^{x-y-1}$
Since $x - y - 1 \ge 0$,
$\frac{\partial \zeta}{\partial P_c} > 0$
So when x and y are given, as $P_c$ increases, $\zeta$ increases continuously from 0 to infinity.
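A short sketch of this monotonicity claim, assuming Pe = 1 − Pc; it simply prints the ratio ζ for a few values of Pc:

# zeta = P_correct / P_wrong = (Pc/Pe)^(x-y), with Pe = 1 - Pc (assumption)
x, y = 3, 2
for Pc in (0.5, 0.6, 0.7, 0.8, 0.9, 0.99):
    Pe = 1.0 - Pc
    zeta = (Pc / Pe) ** (x - y)
    print(f"Pc={Pc:.2f}  zeta={zeta:.2f}")
# zeta rises from 1.0 at Pc=0.5 toward infinity as Pc -> 1:
# the better the individual experts, the more the correct majority dominates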
Divide and Conquer
[Diagram: the Initial Problem is split into several Smaller Problems; each smaller problem gets an Individual Solution, and the individual solutions are merged by a Final Decision Combination into the Final Solution]
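A minimal sketch of the divide-and-conquer idea, assuming a hypothetical group_filter() that maps a pattern to a sub-problem and per-group experts with a predict() method; none of these names come from the paper:

def divide_and_conquer(pattern, group_filter, group_experts, recovery_expert):
    """Route a pattern to the expert specialised for its sub-problem.

    group_filter:     maps the pattern to a group id, or None if undecided.
    group_experts:    dict group_id -> expert specialised for that group.
    recovery_expert:  handles patterns the filter cannot assign.
    Every expert is assumed to expose predict(pattern) -> class label.
    """
    group = group_filter(pattern)
    if group in group_experts:
        return group_experts[group].predict(pattern)
    return recovery_expert.predict(pattern)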
Divide and Conquer: Analysis
[Diagram: on the input side, each pattern passes through a bank of m specialized filters that assign it to a specific group or reject it; if the group assignment is judged accurate, the pattern is accepted and sent to the corresponding specialized classifier for groupwise reclassification, otherwise it goes to a reject-recovery classifier for reclassification; on the output side the n individual decisions are combined to give the Final Classification Results]
Combined Structure: Divide and Conquer with Consensus
[Diagram: the Initial Problem is split into Smaller Problems; each smaller problem is solved by multiple experts, whose Multiple Solutions are reduced to a Consensus Solution; the consensus solutions are then merged by a Decision Combination stage into the Final Solution]
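A sketch of the hybrid structure above, reusing the majority_vote() helper from the majority-voting slide; the group filter and expert teams are assumed interfaces, not the study's actual components:

def divide_and_conquer_with_consensus(pattern, group_filter,
                                      group_expert_teams, recovery_team):
    """Hybrid of divide-and-conquer and majority voting.

    group_expert_teams: dict group_id -> list of experts for that sub-problem.
    recovery_team:      experts used when the filter cannot assign a group.
    Each team's individual decisions are fused by majority_vote().
    """
    group = group_filter(pattern)
    team = group_expert_teams.get(group, recovery_team)
    predictions = [expert.predict(pattern) for expert in team]
    return majority_vote(predictions)    # None means the pattern is rejected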
Selection of a Database
• Handwritten Characters (NIST)
• Collected off-line
• Total samples of over 10,000 characters
• Size normalized to 32×32
• Numeral classes 0–9
Selected Classifiers
• Binary Weighted Scheme (BWS)
• Frequency Weighted Scheme (FWS)
• Multi-layered Perceptrons (MLP)
• Moment-based Pattern Classifier (MPC), using the Maximum Likelihood Method
Performance of Individual Classifiers

Classifier   Accepted (%)   Recognized (%)   Error (%)   Rejected (%)
FWS          97.35          78.76            18.59       2.65
MPC          97.62          85.78            11.84       2.38
BWS          95.50          72.31            23.19       4.50
MLP          95.13          82.31            12.82       4.87
Performance of Decision Combination Methods

Decision Combination Approach   Accepted (%)   Recognized (%)   Error (%)   Rejected (%)
Majority Voting                 96.59          90.59            6.00        3.41
Divide and Conquer              97.08          91.23            5.85        2.92
Implementation of Divide and Conquer with Consensus
[Diagram: in the 10-class problem, FWS performs the primary classification; classes '2/4/5/6/8/9' are directly classified. Patterns rejected by the primary stage pass to two MLP-based specialized filters, a '1/7' filter and a '3/8' filter. Patterns accepted by either filter are reclassified by a group of four experts (MPC, MLP, FWS, BWS) fused by majority consensus; patterns rejected by both filters go to a further four-expert group (MPC, MLP, FWS, BWS), again fused by majority consensus. A Decision Summation stage merges these outcomes into the Final Decision.]
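A rough sketch of the pipeline in the diagram above; the control flow follows the slide, but the classifier and filter objects (with predict() methods) are illustrative assumptions, and majority_vote() is the helper from the earlier slide:

DIRECT_CLASSES = {'2', '4', '5', '6', '8', '9'}

def classify_numeral(pattern, fws, filter_17, filter_38,
                     team_17, team_38, recovery_team):
    """Sketch of the Divide and Conquer with Consensus pipeline.

    fws:                   primary classifier (predict() -> label or None).
    filter_17 / filter_38: MLP-based filters for the '1/7' and '3/8' groups
                           (predict() -> True if the pattern fits the group).
    team_17 / team_38 / recovery_team: lists of experts (MPC, MLP, FWS, BWS
                           on the slide) whose votes are fused by consensus.
    """
    label = fws.predict(pattern)
    if label in DIRECT_CLASSES:            # directly classified by FWS
        return label
    if filter_17.predict(pattern):         # groupwise reclassification: '1' vs '7'
        return majority_vote([e.predict(pattern) for e in team_17])
    if filter_38.predict(pattern):         # groupwise reclassification: '3' vs '8'
        return majority_vote([e.predict(pattern) for e in team_38])
    # Rejected by both filters: reject recovery, again by majority consensus
    return majority_vote([e.predict(pattern) for e in recovery_team])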
Performance of the Proposed Method

Decision Combination Approach       Accepted (%)   Recognized (%)   Error (%)   Rejected (%)
Divide and Conquer with Consensus   97.89          92.41            5.48        2.11
Comparison of Throughput
Classifier   Throughput (cps)
BWS          678.43
FWS          649.77
MPC          113.07
MLP          345.71
Throughput of Combination Methods

Decision Combination Approach       Throughput (cps)
Majority Voting                     20.29
Divide and Conquer                  52.96
Divide and Conquer with Consensus   36.79
Conclusion
• Group Decisions are often SUPERIOR to Individual Decisions
• Multiple Expert Solutions can be made more Robust by incorporating a priori information about the task domain
• Multiple Expert Solutions do NOT automatically mean a Slower System!