Knowledge Engineering for
Bayesian Networks
Ann Nicholson
School of Computer Science
and Software Engineering
Monash University
Overview
• Summary of my BN-related projects
• Thoughts on the BN knowledge engineering process
• Case study: Intelligent Tutoring System for decimal misconceptions

BN-related projects
• DBNs for discrete monitoring
• Approximate BN inference algorithms based on a mutual information measure for relevance (with Nathalie Jitnah; ICONIP'97, ECSQARU'97, PRICAI'98, AI'99)
• Plan recognition: DBNs for predicting users' actions and goals in an adventure game (with David Albrecht, Ingrid Zukerman; UM'97, UMUAI 1999, PRICAI 2000)
• DBNs for ambulation monitoring and fall diagnosis (with biomedical engineering, PRICAI'96)
• Autonomous aircraft monitoring and replanning (with Tim Wilkin, PRICAI 2000)

BN-related projects
• Bayesian Poker (with Kevin Korb, UAI'99)
• Seabreeze prediction: joint project with the Bureau of Meteorology (with Russell Kennett and Kevin Korb, PAKDD 2001)
  » comparison of an existing simple rule, an expert-elicited BN, and BNs from two automated learners: Tetrad-II (Spirtes et al. 1993) and CaMML (Wallace and Korb, 1999)
• Intelligent tutoring system for decimal misconceptions (UAI 2001)

Other related research at Monash
• Machine learning
  » Minimum Message Length (Wallace, Dowe)
  » BN learning
    – CaMML (Causal MML) (Wallace, Korb)
    – GAs for search (Neil, Korb, UAI'99)
• BNs for argument generation (Zukerman, Korb)

Elicitation from experts
• Variables
  » which variables are important? what values/states?
• Structure
  » causal relationships?
  » dependencies/independencies?
• Parameters (probabilities)
  » quantify relationships and interactions?
• Preferences (utilities), for decision networks
(a toy sketch of these four ingredients follows)
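
To make the four elicitation products concrete, here is a minimal Python sketch. It is my own illustration, not from the talk: the variable names, states, and numbers are all invented for the example.

```python
# Minimal sketch (not from the talk): the four elicitation products for a
# toy two-node network, Misconception -> Answer, as plain Python data.

# Variables: names and their values/states
variables = {
    "Misconception": ["none", "longer_is_larger"],
    "Answer": ["correct", "wrong"],
}

# Structure: each variable's parents (causal direction: class -> answer)
parents = {
    "Misconception": [],
    "Answer": ["Misconception"],
}

# Parameters: CPTs, i.e. P(variable = value | parent values); numbers invented
cpts = {
    "Misconception": {(): {"none": 0.8, "longer_is_larger": 0.2}},
    "Answer": {
        ("none",): {"correct": 0.95, "wrong": 0.05},
        ("longer_is_larger",): {"correct": 0.2, "wrong": 0.8},
    },
}

# Preferences (utilities) would be added only for a decision network,
# e.g. the utility of presenting help given the student's true class.
```
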
Expert Elicitation Process
• These stages are done iteratively
• Stops when further expert input is no longer cost effective
• Process is difficult and time consuming
• Current BN tools
  » inference engine
  » GUI
• Next generation of BN tools?
[Diagram: the BN expert and the domain expert interacting iteratively through the BN tools]

Knowledge discovery
• There is much interest in automated methods for learning BNs from data
  » parameters (see the counting sketch below), structure (causal discovery)
• Computationally complex problem, so current methods have practical limitations
  » e.g. limit the number of states, require variable ordering constraints, do not specify all arc directions, don't handle hidden variables
• Evaluation methods
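
As a concrete instance of the easier half of the problem, here is a minimal sketch (my own illustration; the data and names are invented) of parameter learning by frequency counting, i.e. maximum-likelihood estimation of a CPT when the structure is already known:

```python
from collections import Counter

# Minimal sketch (not from the talk): learn the parameters of the known
# structure Misconception -> Answer from complete data by frequency counts
# (maximum likelihood). Structure learning is the much harder problem.

data = [  # (misconception, answer) pairs; illustrative only
    ("none", "correct"), ("none", "correct"), ("none", "wrong"),
    ("longer_is_larger", "wrong"), ("longer_is_larger", "correct"),
]

joint = Counter(data)                  # counts of (parent, child) pairs
parent = Counter(m for m, _ in data)   # counts of the parent value alone

# P(Answer = a | Misconception = m) estimated as count(m, a) / count(m)
cpt = {(m, a): joint[(m, a)] / parent[m] for (m, a) in joint}
print(cpt)
```
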
The knowledge engineering process
1. Building the BN
   » variables, structure, parameters, preferences
   » combination of expert elicitation and knowledge discovery
2. Validation/evaluation
   » case-based evaluation, sensitivity analysis (sketched below), accuracy testing
3. Field testing
   » alpha/beta testing, acceptance testing
4. Industrial use
   » collection of statistics
5. Refinement
   » updating procedures, regression testing
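
To illustrate one validation step, here is a minimal sensitivity-analysis sketch (my own construction on the toy network from earlier; all numbers invented): sweep one parameter and watch how a posterior of interest responds.

```python
# Minimal sketch (not from the talk): sensitivity analysis on the toy
# network Misconception -> Answer. We sweep P(wrong | misconception) and
# watch how the posterior P(misconception | answer = wrong) responds.

prior = 0.2                 # P(Misconception = longer_is_larger); illustrative
p_wrong_given_none = 0.05   # careless-mistake-style parameter; illustrative

for p_wrong_given_mis in (0.5, 0.7, 0.8, 0.9):
    # Bayes' rule: P(mis | wrong) = P(wrong | mis) P(mis) / P(wrong)
    evidence = p_wrong_given_mis * prior + p_wrong_given_none * (1 - prior)
    posterior = p_wrong_given_mis * prior / evidence
    print(f"P(wrong|mis)={p_wrong_given_mis:.2f} -> "
          f"P(mis|wrong)={posterior:.3f}")
```

A stable posterior across the sweep suggests the experts need not agonise over that parameter; a swinging one flags it for more careful elicitation.
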
Case Study: Intelligent tutoring
• Tutoring domain: primary and secondary school students' misconceptions about decimals
• Based on the Decimal Comparison Test (DCT)
  » the student is asked to choose the larger of each pair of decimals
  » different types of pairs reveal different misconceptions (illustrated below)
• The ITS involves computer games involving decimals
• This research also looks at a combination of expert elicitation and automated methods
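
To see how a pair of decimals can expose a misconception, here is an illustrative sketch. The "longer is larger" reading of the L-type classes is my gloss from the decimal-misconception literature, not something these slides spell out:

```python
# Illustrative sketch (semantics are my reading, not stated in the slides):
# a "longer is larger" student treats the decimal with more digits as the
# bigger number, so (0.4, 0.35) exposes the error while (5.736, 5.62),
# where the longer decimal happens to be larger, does not.

def longer_is_larger(a: str, b: str) -> str:
    """Answer as a student who thinks more digits means a bigger number."""
    return a if len(a) > len(b) else b

def correct(a: str, b: str) -> str:
    return a if float(a) > float(b) else b

for pair in [("0.4", "0.35"), ("5.736", "5.62")]:
    print(pair, "misconception:", longer_is_larger(*pair),
          "correct:", correct(*pair))
```
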
Expert classification of Decimal
Comparison Test (DCT) results
expert
class
ATE
AMO
MIS
AU
LWH
LZE
LRV
LU
SDF
SRN
SU
UN
1
0.4
0.35
H
H
L
H
L
L
L
L
H
H
H
-
2
5.736
5.62
H
H
L
H
H
H
H
H
L
L
L
-
Item Type
3
4
4.7
0.452
4.08
0.45
H
H
H
L
L
L
L
H
H
H
L
H
H
L
H
L
-
5
0.4
0.3
H
H
L
H
H
H
H
L
-
6
0.42
0.35
H
H
L
H
H
L
H
L
11
The ITS architecture
[Diagram: optional inputs (information about the student, e.g. age; classroom diagnostic test results; Decimal Comparison Test answers) feed an adaptive Bayesian network, a generic BN model of the student that diagnoses the misconception, predicts outcomes, and identifies the most useful information. A system controller module applies sequencing tactics: select the next item type, decide whether to present help, decide when to change to a new game, and identify when expertise has been gained. The computer games (Flying Photographer, Decimaliens, Hidden Number, Number Between, ...) present items to the student and receive answers, giving feedback; a report on the student goes to the teacher, informing classroom teaching activities.]

Expert Elicitation
• Variables
  » two classification nodes: fine and coarse (mutually exclusive states)
  » item types: (i) H/M/L (ii) 0-N
• Structure
  » arcs from classification to item type
  » item types independent given classification (a naive-Bayes-style diagnosis sketch follows below)
• Parameters
  » probability of a careless mistake (3 different values)
  » expert ignorance: '-' in table (uniform distribution)
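
Because the item types are independent given the classification, diagnosis reduces to a naive-Bayes computation. Here is a minimal sketch of that structure (my own construction: the class names, profiles, and the way an 'L' class answers are invented for illustration; only the careless-mistake idea comes from the slide):

```python
# Minimal sketch (my construction, following the slide's structure):
# P(class | answers) is proportional to P(class) * product over item types
# of P(answer | class). H/L say whether a class is expected to get that
# item type right; CARELESS is the careless-mistake probability.

CARELESS = 0.11  # one of several values could be tried; illustrative here

profile = {  # expected performance per class on two item types; invented
    "expert":           {"type1": "H", "type2": "H"},
    "longer_is_larger": {"type1": "L", "type2": "H"},
}
prior = {"expert": 0.5, "longer_is_larger": 0.5}

def p_correct(expected: str) -> float:
    # An 'H' class errs only by careless mistake; here an 'L' class is
    # assumed to succeed only by accident (a modelling choice of mine).
    return 1.0 - CARELESS if expected == "H" else CARELESS

def posterior(answers: dict) -> dict:
    # answers maps item type -> True (correct) / False (wrong)
    score = {}
    for c, pr in prior.items():
        p = pr
        for item, ok in answers.items():
            pc = p_correct(profile[c][item])
            p *= pc if ok else 1.0 - pc
        score[c] = p
    z = sum(score.values())
    return {c: p / z for c, p in score.items()}

print(posterior({"type1": False, "type2": True}))
```
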
Expert Elicited BN
[Figure: the expert-elicited network]

Evaluation process
• Case-based evaluation
  » experts checked individual cases
  » sometimes, if the prior was low, the 'true' classification did not have the highest posterior (but usually had the biggest change in ratio)
• Adaptiveness evaluation
  » priors change after each set of evidence (see the updating sketch below)
• Comparison evaluation
  » differences in classification between the BN and the expert rule
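
A minimal sketch of the adaptiveness idea (my own construction; all numbers invented): after each set of evidence the posterior becomes the new prior, so repeated evidence steadily shifts the classification.

```python
# Minimal sketch (not from the talk): sequential updating where each
# posterior becomes the prior for the next set of evidence. Three wrong
# answers in a row shift belief toward the misconception.

p_wrong = {"expert": 0.11, "longer_is_larger": 0.89}  # illustrative
belief = {"expert": 0.5, "longer_is_larger": 0.5}

for step in range(3):
    unnorm = {c: belief[c] * p_wrong[c] for c in belief}
    z = sum(unnorm.values())
    belief = {c: p / z for c, p in unnorm.items()}  # posterior -> new prior
    print(step + 1, belief)
```
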
Comparison evaluation
• Development of a measure: same classification, desirable re-classification, undesirable re-classification (a counting sketch follows below)
• Use of item type predictions (not yet undertaken)
• Investigation of the effect of item type granularity and of the probability of a careless mistake
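
Here is a minimal sketch of such a measure (my own construction: what makes a re-classification "desirable" is domain knowledge, so a toy ranking stands in for it here):

```python
# Minimal sketch (not from the talk): count how often two classifiers
# agree, and whether each disagreement moves a student to a better
# ("desirable") or worse ("undesirable") diagnosis.

rank = {"expert": 0, "misconception_A": 1, "misconception_B": 2}  # invented

def compare(old_labels, new_labels):
    same = desirable = undesirable = 0
    for old, new in zip(old_labels, new_labels):
        if old == new:
            same += 1
        elif rank[new] < rank[old]:   # toy notion of a better diagnosis
            desirable += 1
        else:
            undesirable += 1
    n = len(old_labels)
    return same / n, desirable / n, undesirable / n

print(compare(["expert", "misconception_A", "misconception_B"],
              ["expert", "expert", "misconception_A"]))
```
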
Investigation by automated methods
• Classification (using the SNOB program, based on MML)
• Parameters
• Structure (using CaMML)

Results
[Table: Match, average Desirable change, and average Undesirable change for the methods labelled Expert, 30 DCT, EBN learned, CaMML contr., CaMML uncontr., and SNOB, each at both item-type granularities (0-N and H/M/L), with careless-mistake values 0.22, 0.11, and 0.03. The flattened numbers in this transcript cannot be reliably realigned to rows and columns.]

Open Research Questions
• Methodology for combining expert elicitation and automated methods
  » expert knowledge used to guide search
  » automated methods provide alternatives to be presented to experts
• Evaluation measures and methods
  » may be domain dependent
• Improved BN tools
  » e.g. visualisation of d-separation (a checker sketch follows below)
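
As a starting point for such a tool, here is a minimal sketch (my own construction, following the standard "reachable"/Bayes-ball algorithm) that decides whether two nodes are d-separated given evidence; a visualiser could highlight the reachable set it computes.

```python
# Minimal sketch (not from the talk): d-separation via the standard
# reachability algorithm. parents maps each node to its parent list.

def d_separated(parents, x, y, z):
    children = {v: [] for v in parents}
    for v, ps in parents.items():
        for p in ps:
            children[p].append(v)

    # Phase 1: the evidence nodes and all their ancestors
    anc, stack = set(), list(z)
    while stack:
        v = stack.pop()
        if v not in anc:
            anc.add(v)
            stack.extend(parents[v])

    # Phase 2: traverse (node, direction) pairs starting from x
    visited, reachable = set(), set()
    agenda = [(x, "up")]
    while agenda:
        v, d = agenda.pop()
        if (v, d) in visited:
            continue
        visited.add((v, d))
        if v not in z:
            reachable.add(v)
        if d == "up" and v not in z:
            agenda += [(p, "up") for p in parents[v]]
            agenda += [(c, "down") for c in children[v]]
        elif d == "down":
            if v not in z:
                agenda += [(c, "down") for c in children[v]]
            if v in anc:  # v-structure activated by observed descendant
                agenda += [(p, "up") for p in parents[v]]
    return y not in reachable

# Collider A -> C <- B: A and B are d-separated until C is observed.
g = {"A": [], "B": [], "C": ["A", "B"]}
print(d_separated(g, "A", "B", set()))   # True
print(d_separated(g, "A", "B", {"C"}))   # False
```
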