Neuro-Fuzzy Data Analysis - ICAR-CNR
Industrial Applications of
Neuro-Fuzzy Networks
Prof. Dr. Rudolf Kruse
University of Magdeburg
Faculty of Computer Science
Magdeburg, Germany
[email protected]
Example: Continuously Adapting Gear Shift Schedule in the VW New Beetle
Classification of the driver / driving situation by fuzzy logic.
[Block diagram: the accelerator pedal position, the filtered speed of the accelerator pedal, and the number of changes in pedal direction are fuzzified, processed by an inference machine with a rule base, and defuzzified into a sport factor [t] (fed back as sport factor [t-1]). The gear shift computation determines, by interpolation, the speed limits for shifting into a higher or lower gear depending on the sport factor, and performs the gear selection.]
March 2001
Rudolf Kruse
Continuously Adapting Gear Shift Schedule: Technical Details
Mamdani controller with 7 rules.
Optimized program for the AG4 (on the Digimat): 24 byte RAM, 702 byte ROM, runtime 80 ms.
12 times per second a new sport factor is assigned.
How to generate knowledge automatically from data?
Learning from Examples (Observations, Databases)
Statistics: parameter fitting, structure identification, inference methods, model selection
Machine Learning: computational learning (PAC learning), inductive learning, learning decision trees, concept learning, ...
Neural Networks: learning from data
Cluster Analysis: unsupervised classification
The learning problem is transformed into an optimization problem.
How to use these methods in fuzzy systems?
Function Approximation with Fuzzy Rules
if x is large then y is large
[Figure: the rule covers a region of the x-y plane; for the current input value x, an output value y is read off.]
How to Derive a Fuzzy Controller Automatically from Observed Process Data
• Function approximation of the observed input-output behaviour (for the current input value, an output value is produced)
• Perform fuzzy cluster analysis of the input-output data (FCM, GK, GG, ...)
• Project the clusters
• Obtain fuzzy rules of the kind: "If x is small then y is medium"
Fuzzy Cluster Analysis
Classification of a given data set X = {x_1, ..., x_n} ⊂ ℝ^p into c clusters.
The membership degree of datum x_k to cluster i is u_{ik}.
Cluster i is represented by a prototype v_i ∈ ℝ^p.
Formally: minimisation of the functional
  J(X, U, v) = \sum_{i=1}^{c} \sum_{k=1}^{n} u_{ik}^m \, d^2(v_i, x_k)
under the constraints
  \sum_{k=1}^{n} u_{ik} > 0 for all i = 1, ..., c, and \sum_{i=1}^{c} u_{ik} = 1 for all k = 1, ..., n.
Simplest Algorithm: Fuzzy c-Means (FCM)
  d^2(v_i, x_k) = \lVert v_i - x_k \rVert^2
Iterative procedure (with random initialisation of the prototypes v_i):
  u_{ik} = 1 \Big/ \sum_{j=1}^{c} \left( \frac{d^2(v_i, x_k)}{d^2(v_j, x_k)} \right)^{\frac{1}{m-1}}
and
  v_i = \frac{\sum_{k=1}^{n} u_{ik}^m \, x_k}{\sum_{k=1}^{n} u_{ik}^m}
FCM searches for equally large clusters in the form of (hyper-)balls.
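The two update formulas can be sketched in NumPy. This is a minimal illustration (random prototype initialisation, a fixed iteration count, and the toy data are my own choices), not the DataEngine or FCLUSTER implementation:

```python
import numpy as np

def fcm(X, c, m=2.0, iters=100, seed=0):
    """Fuzzy c-means: alternate the membership and prototype updates."""
    rng = np.random.default_rng(seed)
    V = X[rng.choice(len(X), size=c, replace=False)]             # random prototypes v_i
    for _ in range(iters):
        d2 = ((X[:, None, :] - V[None, :, :]) ** 2).sum(axis=2)  # d^2(v_i, x_k)
        d2 = np.maximum(d2, 1e-12)                               # guard against x_k == v_i
        inv = d2 ** (-1.0 / (m - 1.0))
        U = inv / inv.sum(axis=1, keepdims=True)                 # u_ik; each row sums to 1
        W = U ** m
        V = (W.T @ X) / W.sum(axis=0)[:, None]                   # weighted cluster means
    return U, V

# Two well-separated point clouds are recovered as two clusters.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0.0, 0.1, (20, 2)), rng.normal(5.0, 0.1, (20, 2))])
U, V = fcm(X, c=2)
```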
Examples
Fuzzy Cluster Analysis
Fuzzy c-Means: simple, looks for spherical clusters of the same size, uses the Euclidean distance
Gustafson & Kessel: looks for hyper-ellipsoidal clusters of the same size, distance via matrices
Gath & Geva: looks for hyper-ellipsoidal clusters of arbitrary size, distance via matrices
Axis-parallel variants exist that use diagonal matrices (computationally less expensive, and less loss of information when rules are created)
Fuzzy Cluster Analysis with DataEngine
Construct Fuzzy Sets by Cluster Projection
[Figure: histogram μ(x) of the projected degrees of membership, the connection of the discrete degrees, their convex hull, and an approximation by a triangular fuzzy set.]
Projecting a cluster means projecting the degrees of membership of the data onto the individual dimensions: histograms are obtained.
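The projection step can be sketched as follows; the function name, the bin count, and the 5% support threshold are illustrative assumptions, not taken from the lecture:

```python
import numpy as np

def project_cluster(xs, u, bins=10, support=0.05):
    """Project membership degrees u of one cluster onto the dimension xs."""
    edges = np.linspace(xs.min(), xs.max(), bins + 1)
    centers = 0.5 * (edges[:-1] + edges[1:])
    hist = np.zeros(bins)
    idx = np.clip(np.digitize(xs, edges) - 1, 0, bins - 1)
    for i, m in zip(idx, u):                 # histogram: max membership degree per bin
        hist[i] = max(hist[i], m)
    keep = xs[u > support]                   # crude support of the projected fuzzy set
    a, c = keep.min(), keep.max()
    b = centers[np.argmax(hist)]             # histogram peak -> apex of the triangle
    return hist, (a, b, c)

# Memberships peaked around x = 2 yield a triangular fuzzy set with apex near 2.
xs = np.linspace(0.0, 4.0, 41)
u = np.exp(-((xs - 2.0) ** 2))
hist, (a, b, c) = project_cluster(xs, u)
```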
FCLUSTER: Tool for Fuzzy Cluster Analysis
Introduction
Building a fuzzy system requires
- prior knowledge (fuzzy rules, fuzzy sets)
- manual tuning: time-consuming and error-prone
Therefore: support this process by learning
- learning fuzzy rules (structure learning)
- learning fuzzy sets (parameter learning)
Approaches from neural networks can be used.
Learning Fuzzy Sets: Problems in Control
Reinforcement learning must be used to compute an error value (note: the correct output is unknown).
Once an error has been computed, any fuzzy set learning procedure can be used.
Example: GARIC (Berenji/Khedkar 1992): online approximation to gradient descent.
Example: NEFCON (Nauck/Kruse 1993): online heuristic fuzzy set learning using a rule-based fuzzy error measure.
Neuro-Fuzzy Systems in Data Analysis
Fuzzy system: a system of linguistic rules (fuzzy rules); not rules in a logical sense, but a function approximation. A fuzzy rule is a vague prototype / sample.
Neuro-fuzzy system: a fuzzy system with an added learning algorithm inspired by neural networks. Feature: local adaptation of the parameters.
Example: Prognosis of the Daily Proportional Changes of the DAX at the Frankfurt Stock Exchange (Siemens)
Database: time series from 1986 to 1997:
DAX
Composite DAX
German 3 month interest rates
Return Germany
Morgan Stanley index Germany
Dow Jones industrial index
DM / US-$
US treasury bonds
Gold price
Nikkei index Japan
Morgan Stanley index Europe
Price earning ratio
Fuzzy Rules in Finance
Trend Rule:
IF DAX = decreasing AND US-$ = decreasing
THEN DAX prediction = decrease
WITH high certainty

Turning Point Rule:
IF DAX = decreasing AND US-$ = increasing
THEN DAX prediction = increase
WITH low certainty

Delay Rule:
IF DAX = stable AND US-$ = decreasing
THEN DAX prediction = decrease
WITH very high certainty

In general:
IF x1 is m1 AND x2 is m2
THEN y = h
WITH weight k
Classical Probabilistic Expert Opinion Pooling Method
The DM (decision maker) analyzes each source (human expert, data + forecasting model) in terms of (1) statistical accuracy and (2) informativeness, by asking the source to assess quantities (quantile assessment).
The DM obtains a "weight" for each source.
The DM "eliminates" bad sources.
The DM determines the weighted sum of the source outputs.
Determination of the "return on investment".
E experts, R quantiles for N quantities; each expert has to assess R·N values.
Statistical accuracy (calibration) of expert e:
  c_e = 1 - \chi^2_R\!\left(2N \, I(s, p)\right), \quad where \quad I(s, p) = \sum_{i=0}^{R} s_i \ln \frac{s_i}{p_i}
Information score of expert e:
  I_e = \frac{1}{N} \sum_{i=1}^{N} \left[ \ln\left(v_{i,R+1} - v_{i,0}\right) + \sum_{r=1}^{R+1} p_r \ln \frac{p_r}{v_{i,r} - v_{i,r-1}} \right]
Weight for expert e (with cutoff d):
  w_e = \frac{c_e \, I_e \, 1_{[d,1]}(c_e)}{\sum_{e'=1}^{E} c_{e'} \, I_{e'} \, 1_{[d,1]}(c_{e'})}
Combined output and return on investment:
  output_t^{DM} = \sum_{e=1}^{E} w_e \, output_t^e, \qquad roi = \sum_{t=1}^{T} y_t \, \mathrm{sign}\!\left(output_t^{DM}\right)
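A minimal sketch of the pooling step, assuming the calibration scores c_e, the information scores I_e, and the cutoff d have already been computed; the function and variable names are illustrative:

```python
import numpy as np

def pool(cal, info, outputs, d=0.05):
    """Weighted opinion pool: w_e proportional to c_e * I_e * 1[c_e >= d]."""
    cal, info = np.asarray(cal), np.asarray(info)
    raw = cal * info * (cal >= d)          # "eliminate" badly calibrated sources
    w = raw / raw.sum()                    # normalise the weights
    return w, w @ np.asarray(outputs)      # combined output per time step

# Three sources; the third falls below the calibration cutoff and gets weight 0.
cal, info = [0.8, 0.4, 0.01], [1.0, 2.0, 5.0]
outputs = [[1.0, -1.0], [1.0, 1.0], [-9.0, -9.0]]
w, combined = pool(cal, info, outputs)
```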
Formal Analysis
Sources of information:
  R1: rule set given by expert 1
  R2: rule set given by expert 2
  D: data set (time series)
Operator schema:
  fuse(R1, R2): fuse two rule sets
  induce(D): induce a rule set from D
  revise(R, D): revise a rule set R by D
Formal Analysis
Strategies:
fuse(fuse (R1, R2), induce(D))
revise(fuse(R1, R2), D)
fuse(revise(R1, D), revise(R2, D))
Technique: Neuro-Fuzzy Systems
  Nauck, Klawonn, Kruse: Foundations of Neuro-Fuzzy Systems, Wiley, 1997
  SENN (commercial neural network environment, Siemens)
From Rules to Neural Networks
1. Evaluation of the membership degrees
2. Evaluation of the rules (rule activity):
  \mu_l: \mathbb{R}^n \to [0,1], \quad \mu_l(x) = \prod_{j=1}^{n} \mu_j^{(l)}(x_j)
3. Accumulation of the rule outputs and normalization:
  NF: \mathbb{R}^n \to \mathbb{R}, \quad NF(x) = \sum_{l=1}^{r} w_l \, \frac{k_l \, \mu_l(x)}{\sum_{j=1}^{r} k_j \, \mu_j(x)}
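A sketch of the network function NF for a single output, with triangular antecedent fuzzy sets; the rule encoding (antecedent sets, consequent value w_l, certainty k_l) and all concrete numbers are illustrative assumptions:

```python
import numpy as np

def tri(a, b, c):
    """Triangular membership function with support [a, c] and apex b."""
    return lambda x: max(0.0, min((x - a) / (b - a), (c - x) / (c - b)))

def nf(rules, x):
    """NF(x) = sum_l w_l * k_l mu_l(x) / sum_j k_j mu_j(x)."""
    act = [k * np.prod([m(xj) for m, xj in zip(sets, x)])  # k_l * mu_l(x)
           for sets, _, k in rules]
    w = np.array([wl for _, wl, _ in rules])
    return float(w @ act / sum(act))

small, large = tri(-1.0, 0.0, 1.0), tri(0.0, 1.0, 2.0)
rules = [
    ([small, small], -0.5, 1.0),   # if x1 is small and x2 is small then y = -0.5
    ([large, large],  0.5, 1.0),   # if x1 is large and x2 is large then y =  0.5
]
```

Where only one rule fires, its consequent is returned exactly; between the rules the output is interpolated.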
Neuro-Fuzzy Architecture
The Semantics-Preserving Learning Algorithm
Reduction of the dimension of the weight space:
1. Membership functions of different inputs share their parameters, e.g. m_dax^stable = m_cdax^stable.
2. Membership functions of the same input variable are not allowed to pass each other; they must keep their original order, e.g. m_decreasing < m_stable < m_increasing.
Benefits:
- the optimized rule base can still be interpreted
- the number of free parameters is reduced
Return-on-Investment Curves of the Different Models
Validation data from March 01, 1994 until April 1997
A Neuro-Fuzzy System
is a fuzzy system trained by heuristic learning techniques derived from neural networks
can be viewed as a 3-layer neural network with fuzzy weights and special activation functions
is always interpretable as a fuzzy system
uses constrained learning procedures
is a function approximator (classifier, controller)
Learning Fuzzy Rules
Cluster-oriented approaches
=> find clusters in data, each cluster is a rule
Hyperbox-oriented approaches
=> find clusters in the form of hyperboxes
Structure-oriented approaches
=> use predefined fuzzy sets to structure the data space, pick rules from grid cells
Hyperbox-Oriented Rule Learning
Search for hyperboxes in the data space.
Create fuzzy rules by projecting the hyperboxes.
Fuzzy rules and fuzzy sets are created at the same time.
Usually very fast.
[Figure: hyperboxes covering the data in the x-y plane.]
Hyperbox-Oriented Rule Learning
[Figure: hyperboxes detected in the data; example: the XOR function.]
Advantages over fuzzy cluster analysis:
- no loss of information when hyperboxes are represented as fuzzy rules
- not all variables need to be used; don't-care variables can be discovered
Disadvantage: each fuzzy rule uses individual fuzzy sets, i.e. the rule base is complex.
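To illustrate the idea, here is a deliberately naive sketch: one axis-parallel bounding box per class and no splitting of overlapping boxes, which a real hyperbox learner (e.g. Fuzzy RuleNet) would of course perform:

```python
import numpy as np

def learn_boxes(X, y):
    """One hyperbox (per-dimension min/max) per class label."""
    return {c: (X[y == c].min(axis=0), X[y == c].max(axis=0)) for c in set(y)}

def classify(boxes, x):
    """Return the label of the first hyperbox containing x, else None."""
    for c, (lo, hi) in boxes.items():
        if np.all(lo <= x) and np.all(x <= hi):
            return c
    return None

# Two separated clouds; points outside every box stay unclassified.
X = np.array([[0.0, 0.0], [1.0, 1.0], [5.0, 5.0], [6.0, 6.0]])
y = np.array([0, 0, 1, 1])
boxes = learn_boxes(X, y)
```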
Structure-Oriented Rule Learning
Provide initial fuzzy sets for all variables.
The data space is partitioned by a fuzzy grid.
Detect all grid cells that contain data (approach by Wang/Mendel 1992).
Compute the best consequents and select the best rules (extension by Nauck/Kruse 1995, NEFCLASS model).
[Figure: fuzzy partitions small, medium, large on both x and y form the grid.]
Structure-Oriented Rule Learning
Simple: the rule base is available after two cycles through the training data.
1st cycle: discover all antecedents.
2nd cycle: determine the best consequents.
Missing values can be handled.
Numeric and symbolic attributes can be processed at the same time (mixed fuzzy rules).
Advantage: all rules share the same fuzzy sets.
Disadvantage: the fuzzy sets must be given.
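The two cycles can be sketched as one pass with a majority-vote consequent; the fuzzy partition and the helper names are illustrative assumptions:

```python
from collections import Counter, defaultdict

def tri(a, b, c):
    """Triangular membership function with support [a, c] and apex b."""
    return lambda x: max(0.0, min((x - a) / (b - a), (c - x) / (c - b)))

# Predefined fuzzy partition per variable (structure-oriented: the sets are given).
partition = {"x": {"small": tri(-0.5, 0.0, 0.5), "medium": tri(0.0, 0.5, 1.0),
                   "large": tri(0.5, 1.0, 1.5)}}

def learn_rules(data, partition):
    votes = defaultdict(Counter)
    for pattern, label in data:
        # Cycle 1: antecedent = the fuzzy set with maximal membership per variable.
        ante = tuple(max(sets, key=lambda s: sets[s](pattern[v]))
                     for v, sets in partition.items())
        # Cycle 2 (merged here): count class labels to find the best consequent.
        votes[ante][label] += 1
    return {a: c.most_common(1)[0][0] for a, c in votes.items()}

data = [({"x": 0.1}, "c1"), ({"x": 0.15}, "c1"), ({"x": 0.9}, "c2"), ({"x": 0.95}, "c2")]
rules = learn_rules(data, partition)
```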
Learning Fuzzy Sets
Gradient descent procedures: only applicable if differentiation is possible, e.g. for Sugeno-type fuzzy systems.
Special heuristic procedures that do not use gradient information.
The learning algorithms are based on the idea of backpropagation.
Learning Fuzzy Sets: Constraints
Mandatory constraints:
Fuzzy sets must stay normal and convex
Fuzzy sets must not exchange their relative
positions (they must not „pass“ each other)
Fuzzy sets must always overlap
Optional constraints
Fuzzy sets must stay symmetric
Degrees of membership must add up to 1.0
The learning algorithm must enforce these
constraints.
Different Neuro-Fuzzy Approaches
ANFIS (Jang, 1993)
no rule learning, gradient descent fuzzy set learning, function approximator
GARIC (Berenji/Khedkar, 1992)
no rule learning, gradient descent fuzzy set learning, controller
NEFCON (Nauck/Kruse, 1993)
structure-oriented rule learning, heuristic fuzzy set learning, controller
FuNe (Halgamuge/Glesner, 1994)
combinatorial rule learning, gradient descent fuzzy set learning, classifier
Fuzzy RuleNet (Tschichold-Gürman, 1995)
hyperbox-oriented rule learning, no fuzzy set learning, classifier
NEFCLASS (Nauck/Kruse, 1995)
structure-oriented rule learning, heuristic fuzzy set learning, classifier
Learning Fuzzy Graphs (Berthold/Huber, 1997)
hyperbox-oriented rule learning, no fuzzy set learning, function approximator
NEFPROX (Nauck/Kruse, 1997)
structure-oriented rule learning, heuristic fuzzy set learning, function approx.
Example: Medical Diagnosis
Results from patients tested for breast cancer
(Wisconsin Breast Cancer Data).
Decision support: Do the data indicate a malignant or a benign
case?
A surgeon must be able to check the classification for
plausibility.
We are looking for a simple and interpretable classifier:
knowledge discovery.
Example: WBC Data Set
699 cases (16 cases have missing values).
2 classes: benign (458), malignant (241).
9 attributes with values from {1, ... , 10}
(ordinal scale, but usually interpreted as a numerical
scale).
Experiment: x3 and x6 are interpreted as nominal attributes.
x3 and x6 are usually seen as "important" attributes.
Applying NEFCLASS-J
Tool for developing Neuro-Fuzzy Classifiers
Written in Java
Free version for research available
Project started at Neuro-Fuzzy Group of University of
Magdeburg, Germany
NEFCLASS: Neuro-Fuzzy Classifier
Output variables (class labels)
Unweighted connections
Fuzzy rules
Fuzzy sets (antecedents)
Input variables (attributes)
NEFCLASS: Features
Automatic induction of a fuzzy rule base from data
Training of several forms of fuzzy sets
Processing of numeric and symbolic attributes
Treatment of missing values (no imputation)
Automatic pruning strategies
Fusion of expert knowledge and knowledge obtained
from data
Representation of Fuzzy Rules
Example: 2 rules
R1: if x is large and y is small, then class is c1.
R2: if x is large and y is large, then class is c2.
The connections x → R1 and x → R2 are linked: the fuzzy set large is a shared weight, i.e. the term large always has the same meaning in both rules.
[Figure: network with inputs x and y, rule units R1 and R2, and class outputs c1 and c2; the antecedent fuzzy sets small and large label the connections.]
1. Training Step: Initialisation
Specify initial fuzzy partitions for all input variables.
[Figure: data of the classes c1 and c2 in the x-y plane, with initial fuzzy partitions small, medium, large on both variables.]
2. Training Step: Rule Base
Algorithm:
for (all patterns p) do
    find the antecedent A such that A(p) is maximal;
    if (A ∉ L) then add A to L;
end;
for (all antecedents A ∈ L) do
    find the best consequent C for A;
    create the rule base candidate R = (A, C);
    determine the performance of R;
    add R to B;
end;
select a rule base from B;
Variations: fuzzy rule bases can also be created by using prior knowledge, fuzzy cluster analysis, fuzzy decision trees, genetic algorithms, ...
Selection of a Rule Base
Performance of a rule:
  P_r = \frac{1}{N} \sum_{p=1}^{N} c \cdot R_r(x_p), \quad with \quad c = \begin{cases} 0 & \text{if } class(x_p) \neq con(R_r) \\ 1 & \text{otherwise} \end{cases}
• Order the rules by performance.
• Either select the best r rules or the best r/m rules per class.
• r is either given or determined automatically such that all patterns are covered.
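The performance measure and the best-r selection in code; the rule encoding and the toy data are illustrative:

```python
def performance(rule, data):
    """P_r = (1/N) * sum_p c * R_r(x_p), with c = 1 iff the class matches con(R_r)."""
    activation, consequent = rule
    return sum(activation(x) * (1 if label == consequent else 0)
               for x, label in data) / len(data)

def select(rules, data, r):
    """Order the rules by performance and keep the best r."""
    return sorted(rules, key=lambda rule: -performance(rule, data))[:r]

# Two toy rules over a 1-D input: r1 matches the low class, r2 predicts the wrong class.
data = [(0.1, "low"), (0.2, "low"), (0.9, "high"), (1.0, "high")]
r1 = (lambda x: 1.0 - x, "low")    # active for small x, predicts "low"
r2 = (lambda x: 1.0 - x, "high")   # active for small x, predicts "high"
best = select([r2, r1], data, r=1)
```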
Rule Base Induction
NEFCLASS uses a modified Wang-Mendel procedure.
[Figure: the fuzzy grid (small, medium, large on x and y) over the data of classes c1 and c2; the occupied grid cells yield the rules R1, R2, R3.]
Computing the Error Signal
Fuzzy error (jth output):
  E_j = \mathrm{sgn}(d)\,(1 - \varepsilon(d)), \quad with \quad d = t_j - o_j, \quad \varepsilon: \mathbb{R} \to [0,1], \quad \varepsilon(d) = e^{-\left(\frac{a\,d}{d_{max}}\right)^2}
  (t: correct output, o: actual output)
Rule error:
  E_r = \tau_r (1 - \tau_r)\, E_{con(R_r)}, \quad with \quad 0 \le \tau_r \le 1
[Figure: the error signal is propagated from the outputs c1, c2 back through the rules R1, R2, R3 to the inputs x and y.]
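A sketch of the fuzzy error; the scaling constant a = 2 is an arbitrary illustrative choice:

```python
import math

def fuzzy_error(t, o, d_max, a=2.0):
    """E = sgn(d) * (1 - eps(d)) with d = t - o and eps(d) = exp(-(a*d/d_max)^2)."""
    d = t - o
    sgn = (d > 0) - (d < 0)
    return sgn * (1.0 - math.exp(-((a * d / d_max) ** 2)))
```

The error is zero for a correct output, grows smoothly with |d|, and carries the sign of the deviation.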
3. Training Step: Fuzzy Sets
Example: triangular membership function
  \mu_{a,b,c}: \mathbb{R} \to [0,1], \quad \mu_{a,b,c}(x) = \begin{cases} \frac{x-a}{b-a} & \text{if } x \in [a,b) \\ \frac{c-x}{c-b} & \text{if } x \in [b,c] \\ 0 & \text{otherwise} \end{cases}
Parameter updates for an antecedent fuzzy set:
  f = \begin{cases} \sigma\,\mu(x) & \text{if } E > 0 \\ \sigma\,(1 - \mu(x)) & \text{otherwise} \end{cases}
  \Delta b = f\,E\,(c - a)\,\mathrm{sgn}(x - b)
  \Delta a = -f\,E\,(b - a) + \Delta b
  \Delta c = f\,E\,(c - b) + \Delta b
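Applying the updates once (with a learning rate σ; since some signs had to be reconstructed, treat this as a sketch): a positive error enlarges the degree of membership of x, a negative error reduces it.

```python
def mu(a, b, c, x):
    """Triangular membership function mu_{a,b,c}."""
    if a <= x < b:
        return (x - a) / (b - a)
    if b <= x <= c:
        return (c - x) / (c - b)
    return 0.0

def update(a, b, c, x, E, sigma=0.1):
    """One heuristic parameter update for an antecedent fuzzy set."""
    m = mu(a, b, c, x)
    f = sigma * m if E > 0 else sigma * (1.0 - m)
    sgn = (x > b) - (x < b)
    db = f * E * (c - a) * sgn          # move the apex towards / away from x
    da = -f * E * (b - a) + db          # widen / shrink the left support
    dc = f * E * (c - b) + db           # widen / shrink the right support
    return a + da, b + db, c + dc

a, b, c = 0.0, 1.0, 2.0
x = 1.4
up = update(a, b, c, x, E=1.0)          # positive error: membership should grow
down = update(a, b, c, x, E=-1.0)       # negative error: membership should shrink
```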
Training of Fuzzy Sets
[Figure: an initial fuzzy set μ(x) is enlarged or reduced, changing the degree of membership of the current input x, e.g. from 0.55 to 0.85 or 0.30.]
Heuristics: a fuzzy set is moved away from x (towards x) and its support is reduced (enlarged), in order to reduce (enlarge) the degree of membership of x.
Training of Fuzzy Sets
Algorithm:
repeat
    for (all patterns) do
        accumulate the parameter updates;
        accumulate the error;
    end;
    modify the parameters;
until (no change in error);
Variations:
• adaptive learning rate
• online / batch learning
• optimistic learning (n-step look-ahead)
• observing the error on a validation set
[Figure: error curve with a local minimum.]
Constraints for Training Fuzzy Sets
• valid parameter values
• non-empty intersection of adjacent fuzzy sets
• keep the relative positions
• maintain symmetry
• complete coverage (degrees of membership add up to 1 for each element)
[Figure: correcting a partition in three steps after modifying the parameters.]
4. Training Step: Pruning
Goal: remove variables, rules and fuzzy sets, in order to
improve interpretability and generalisation.
Pruning
Algorithm:
repeat
    select a pruning method;
    repeat
        execute a pruning step;
        train the fuzzy sets;
        if (no improvement) then undo the step;
    until (no improvement);
until (no further method);
Pruning methods:
1. Remove variables (use correlations, information gain, etc.)
2. Remove rules (use the rule performance)
3. Remove terms (use the degree of fulfilment)
4. Remove fuzzy sets (use the fuzziness)
WBC Learning Result: Fuzzy Rules
R1: if uniformity of cell size is small and bare nuclei is fuzzy0 then benign
R2: if uniformity of cell size is large then malignant
WBC Learning Result: Classification Performance
                   predicted malign   predicted benign   not classified   sum
actual malign      228 (32.62%)       13 (1.86%)         0 (0%)           241 (34.99%)
actual benign      15 (2.15%)         443 (63.38%)       0 (0%)           458 (65.01%)
sum                243 (34.76%)       456 (65.24%)       0 (0%)           699 (100.00%)

Estimated performance on unseen data (cross validation):
  NEFCLASS-J: 95.42%
  NEFCLASS-J (numeric): 94.14%
  Discriminant Analysis: 96.05%
  Multilayer Perceptron: 94.82%
  C 4.5: 95.40%
  C 4.5 Rules: 95.10%
WBC Learning Result: Fuzzy Sets
[Figure: the learned fuzzy sets sm and lg for "uniformity of cell size" and for "bare nuclei", plotted over the attribute range 1.0 to 10.0.]
NEFCLASS-J
Resources
Detlef Nauck, Frank Klawonn & Rudolf Kruse:
Foundations of Neuro-Fuzzy Systems
Wiley, Chichester, 1997, ISBN: 0-471-97151-0
Neuro-Fuzzy Software (NEFCLASS, NEFCON, NEFPROX):
http://www.neuro-fuzzy.de
Beta-Version of NEFCLASS-J:
http://www.neuro-fuzzy.de/nefclass/nefclassj
Conclusions
Neuro-Fuzzy-Systems can be useful for knowledge discovery.
Interpretability enables plausibility checks and improves
acceptance.
(Neuro-)Fuzzy systems exploit tolerance for sub-optimal
solutions.
Neuro-fuzzy learning algorithms must observe constraints in
order not to jeopardise the semantics of the model.
Not an automatic model creator: the user must work with the tool.
Simple learning techniques support explorative data analysis.
Download NEFCLASS-J
Download the free version of NEFCLASS-J at
http://fuzzy.cs.uni-magdeburg.de
Fuzzy Methods in Information Mining: Examples
here: Exploiting quantitative and qualitative
information
Fuzzy Data Analysis (Projects with Siemens)
Information Fusion (EC Project)
Dependency Analysis (Project with Daimler/Chrysler)
Analysis of Daimler/Chrysler Database
Database: ~18,500 passenger cars, > 100 attributes per car.
Analysis of the dependencies between special equipment and faults.
The results are used as a starting point for technical experts looking for causes.
Learning Graphical Models
data + prior information → Inducer → graphical model with local models
[Figure: the inducer turns the data and prior information into a network over the variables A, B, C with attached local models.]
The Learning Problem
known structure, complete data (e.g. <a4, b3, c1>, <a3, b2, c4>):
    statistical parametric estimation (closed-form equations): statistical parameter fitting, ML estimation, Bayesian inference, ...
known structure, incomplete data (missing values, hidden variables, e.g. <a4, ?, c1>, <a3, b2, ?>):
    parametric optimization: EM, gradient descent, ...
unknown structure, complete data:
    discrete optimization over structures (discrete search): likelihood scores, MDL
    problem: search complexity → heuristics
unknown structure, incomplete data:
    combined methods: structured EM; only few approaches
    problems: criterion for fit? new variables? local maxima? fuzzy values?
Possibility Theory
A fuzzy set m induces a possibility measure: \Pi(A) = \sup_{x \in A} m(x)
[Figure: membership function m_cloudy over the range 50 to 100; Π of an event such as [55, 60] is the supremum of m_cloudy on that interval.]
Axioms:
  \Pi(\emptyset) = 0, \quad \Pi(\Omega) = 1
  \Pi(A \cup B) = \max(\Pi(A), \Pi(B))
  \Pi(A \cap B) \le \min(\Pi(A), \Pi(B))
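The induced measure and the maximum axiom in code; the finite universe and the membership values of m_cloudy are illustrative stand-ins for the continuous figure:

```python
def possibility(m, A):
    """Pi(A) = sup over x in A of m(x), for a fuzzy set m on a finite universe."""
    return max((m[x] for x in A), default=0.0)

# Illustrative fuzzy set "cloudy" on a small universe of cloudiness percentages.
m_cloudy = {50: 0.0, 55: 0.3, 60: 0.6, 65: 1.0, 85: 1.0, 100: 0.8}
A, B = {50, 55}, {60, 65}
```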
General Structure of (most) Learning Algorithms for Graphical Models
Use a criterion to measure the degree to which a
network structure fits the data and the prior
knowledge
(model selection, goodness of hypergraph)
Use a search algorithm to find a model that
receives a high score by the criterion
(optimal spanning tree, K2: greedy selection of
parents, ...)
Measuring the Deviation from an Independent Distribution
Probability- and Information-based Measures
information gain *
identical with mutual information
information gain ratio *
g-function (Cooper and Herskovits)
minimum description length
gini index *
Possibilistic Measures
expected nonspecificity
specificity gain
specificity gain ratio
(Measures marked with * originated from decision tree learning)
Data Mining Tool Clementine
Analysis of Daimler/Chrysler Database
[Figure: dependency network with the nodes electrical roof top, air conditioning, type of engine, type of tyres, slippage control, faulty battery, faulty compressor, faulty brakes.]
Fictitious example: there are significantly more faulty batteries if both air conditioning and an electrical roof top are built into the car.