Diagnosing a Team of Agents: Scaling-Up


Assumption-Based Truth
Maintenance Systems
Meir Kalech
Outline

Last lecture:
1. Consistency-based diagnosis
2. GDE – general diagnosis engine
3. Conflict generation using ATMS
4. Candidate generation

Today’s lecture:
1. What is TMS
2. TMS architecture
3. Justification-based TMS
4. Assumption-based TMS
What is TMS?
A Truth Maintenance System (TMS) is a Problem Solver module responsible for:
- Enforcing logical relations among beliefs.
- Generating explanations for conclusions.
- Finding solutions to search problems.
- Supporting default reasoning.
- Identifying causes for failures and recovering from inconsistencies.
1. Enforcement of logical relations
- AI problem -> search.
- Search utilizes assumptions.
- Assumptions change.
- Changing assumptions -> updating the consequences of beliefs.
- TMS: a mechanism to maintain and update relations among beliefs.
1. Enforcement of logical relations
Example:
If (cs-501) and (math-218) then (cs-570).
If (cs-570) and (CIT) then (TMS).
If (TMS) then (AI-experience).
The following are the relations among beliefs:
(AI-experience) if (TMS).
(TMS) if (cs-570), (CIT).
(cs-570) if (cs-501), (math-218).
- Beliefs are propositional variables.
- A TMS is a mechanism for processing large collections of logical relations on propositional variables.
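These relations can be run mechanically. A minimal Python sketch (an illustration, not the lecture's implementation) that forward-chains the course rules to a fixed point and recomputes the consequences whenever the assumption set changes:

```python
# Beliefs as propositional atoms, relations as Horn rules (antecedents, consequent).
RULES = [
    ({"cs-501", "math-218"}, "cs-570"),
    ({"cs-570", "CIT"}, "TMS"),
    ({"TMS"}, "AI-experience"),
]

def consequences(assumptions):
    """Forward-chain the rules to a fixed point and return all beliefs."""
    beliefs = set(assumptions)
    changed = True
    while changed:
        changed = False
        for antecedents, consequent in RULES:
            if antecedents <= beliefs and consequent not in beliefs:
                beliefs.add(consequent)
                changed = True
    return beliefs

print(consequences({"cs-501", "math-218", "CIT"}))  # includes cs-570, TMS, AI-experience
print(consequences({"math-218", "CIT"}))            # nothing new is derivable
```

Retracting cs-501 retracts cs-570, TMS and AI-experience as well; a real TMS does exactly this bookkeeping, but incrementally rather than by recomputation.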
2. Generation of explanations
- Solving problems is what Problem Solvers do.
- However, solutions alone are often not enough.
- The PS is expected to provide an explanation.
- The TMS uses cached inferences for that aim.
- The TMS is efficient: generating cached inferences once is more beneficial than running the inference rules that generated those inferences more than once.
2. Generation of explanations
Example:
Q: Shall I have an AI experience after completing the CIT program?
A: Yes, because of the TMS course.
Q: What do I need to take a TMS course?
A: CS-570 and CIT.

- Different types of TMSs provide different ways of explaining conclusions (JTMS vs ATMS).
- In this example, explaining conclusions in terms of their immediate predecessors works much better.
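This question-answering style can be sketched by caching, for each derived belief, the justification that produced it; "why?" then becomes a lookup rather than a re-proof. The rule set and helper names below are illustrative assumptions, not the deck's code:

```python
# Horn rules as (antecedents, consequent) pairs.
RULES = [
    ({"cs-570", "CIT"}, "TMS"),
    ({"TMS"}, "AI-experience"),
]

def derive(assumptions):
    """Forward-chain and remember, per consequent, the antecedents that justified it."""
    beliefs = set(assumptions)
    support = {}  # consequent -> set of antecedents (the cached justification)
    changed = True
    while changed:
        changed = False
        for ants, cons in RULES:
            if ants <= beliefs and cons not in beliefs:
                beliefs.add(cons)
                support[cons] = ants
                changed = True
    return support

support = derive({"cs-570", "CIT"})

def explain(belief):
    ants = support.get(belief)
    return f"{belief} because of {sorted(ants)}" if ants else f"{belief} is assumed"

print(explain("AI-experience"))  # AI-experience because of ['TMS']
print(explain("CIT"))            # CIT is assumed
```

Explaining in terms of immediate predecessors, as the slide notes, is exactly what this cached `support` table gives for free.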
3. Finding solutions to search problems
[Figure: constraint graph over the nodes A, B, C, D, E; the edges, read off the constraints below, are A-B, A-C, B-D, D-E, C-E.]
Color the nodes: red (1), green (2), yellow (3).
Adjacent nodes must be of different colors.
The following set of constraints describes this problem:
A1 or A2 or A3
B1 or B2 or B3
C1 or C2 or C3
D1 or D2 or D3
E1 or E2 or E3
not (A1 and B1)
not (A2 and B2)
not (A3 and B3)
not (A1 and C1)
not (A2 and C2)
not (A3 and C3)
not (B1 and D1)
not (B2 and D2)
not (B3 and D3)
not (D1 and E1)
not (D2 and E2)
not (D3 and E3)
not (C1 and E1)
not (C2 and E2)
not (C3 and E3)
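The problem is small enough to solve by brute force. A minimal Python sketch (the edge list is read off the pairwise "not" constraints above):

```python
from itertools import product

NODES = "ABCDE"
EDGES = [("A", "B"), ("A", "C"), ("B", "D"), ("D", "E"), ("C", "E")]
COLORS = (1, 2, 3)  # red, green, yellow

# Enumerate all color assignments and keep the ones satisfying every edge constraint.
solutions = []
for assignment in product(COLORS, repeat=len(NODES)):
    color = dict(zip(NODES, assignment))
    if all(color[u] != color[v] for u, v in EDGES):
        solutions.append(color)

print(len(solutions))  # 30
```

The count 30 matches the chromatic polynomial of this 5-cycle with 3 colors: (3-1)^5 + (-1)^5 (3-1) = 30.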
3. Finding solutions to search problems
To find a solution we can use search:
[Figure: search tree that assigns a color to A, then B, C, D and E in turn, branching over red, green and yellow at each level.]
4. Default reasoning and TMS
- The PS must make conclusions based on incomplete information.
- "Closed-World Assumption" (CWA): X is true unless there is evidence to the contrary.
- The CWA helps us limit the underlying search space.
- The reasoning scheme that supports the CWA is called "default (or non-monotonic) reasoning".
4. Default reasoning and TMS
Example: Consider the following knowledge base:
Bird(tom) and ¬Abnormal(tom) -> Can_fly(tom)
Penguin(tom) -> Abnormal(tom)
Ostrich(tom) -> Abnormal(tom)
Bird(tom)
Under the CWA, we assume ¬Abnormal(tom), and therefore we can derive Can_fly(tom).
A non-monotonic TMS supports this type of reasoning.
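The CWA step can be sketched in a few lines of Python (an illustration, not a full non-monotonic TMS): conclude Can_fly(tom) only while Abnormal(tom) is not derivable, and retract the conclusion when new evidence arrives:

```python
FACTS = {"Bird(tom)"}
# Each of these facts implies Abnormal(tom) in the knowledge base above.
ABNORMALITY_CAUSES = ["Penguin(tom)", "Ostrich(tom)"]

def can_fly(facts):
    abnormal = any(cause in facts for cause in ABNORMALITY_CAUSES)
    # CWA: with no evidence of abnormality, assume ¬Abnormal(tom).
    return "Bird(tom)" in facts and not abnormal

print(can_fly(FACTS))                     # True
print(can_fly(FACTS | {"Penguin(tom)"}))  # False: the conclusion is retracted
```

Adding a fact makes a previous conclusion disappear, which is precisely why this reasoning is non-monotonic.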
5. Identifying causes for failures and recovering from inconsistencies
Inconsistencies among beliefs in the KB are always possible:
- Wrong data (example: "Outside temperature is 320 degrees.")
- Impossible constraints (example: Big-house and Cheap-house and Nice-house).
- The TMS helps identify the reason for an inconsistency.
- "Dependency-directed backtracking" allows the TMS to recover.
TMS applications
Constraint Satisfaction Problems (CSP):
- Set of variables
- Domain for each variable
- Constraints between the variables' domains
- Goal: find a "solution": assignments to the variables that satisfy the constraints

Scenario and Planning Problems:
- Find a path of state transitions leading from the initial to the final state (games, strategies).
- The TMS identifies the applicable rules.
CSP example
Allocation problem:
- Three tasks: {t1,t2,t3} (variables)
- Two hosts: {h1,h2} (domain)
- Two constraints:
  - t1 before t2 on the same host
  - t1 cannot run on the same host as t3
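Ignoring the order in which tasks are placed, the host choices can be enumerated directly. A minimal sketch (only the host-related constraints are checked; the search tree on the next slide also branches on placement order, which is presumably why it counts more nodes and solution leaves):

```python
from itertools import product

HOSTS = ("h1", "h2")

# Assign a host to each of t1, t2, t3 and keep the consistent assignments.
solutions = [
    {"t1": a, "t2": b, "t3": c}
    for a, b, c in product(HOSTS, repeat=3)
    if a == b      # t1 and t2 must share a host (t1 is scheduled first)
    and a != c     # t1 and t3 must run on different hosts
]
print(solutions)
```

Only two distinct host assignments survive: t1 and t2 on h1 with t3 on h2, or the mirror image.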
CSP example
[Figure: search tree over task-to-host assignments (t1-h1, t2-h1, t3-h1, t1-h2, t2-h2, …). The tree has 48 nodes, 6 of which are solutions.]
Outline

Last lecture:
1. Consistency-based diagnosis
2. GDE – general diagnosis engine
3. Conflict generation using ATMS
4. Candidate generation

Today’s lecture:
1. What is TMS
2. TMS architecture
3. Justification-based TMS
4. Assumption-based TMS
Problem Solver Architecture
The TMS / PS relationship is the following:
[Figure: the Problem Solver passes justifications and assumptions to the TMS; the TMS passes beliefs and contradictions back to the Problem Solver.]
How do the TMS and the PS communicate?
- The PS works with:
  - assertions (facts, beliefs, conclusions, hypotheses)
  - inference rules
  - procedures
- Each of these is assigned a TMS node.
Example:
N1: (rule (student ?x)
      (assert (and (underpaid ?x) (overworked ?x))))
N2: (student Bob)
Given N1 and N2, the PS can infer N3:
N3: (and (underpaid Bob) (overworked Bob))
The PS treats nodes as logical formulas, while the TMS treats nodes as propositional variables.
TMS nodes
- Different types of TMS support different types of nodes:
  - Premise nodes. These are always true.
  - Contradiction nodes. These are always false.
  - Assumption nodes. The PS believes these regardless of whether they are supported by the existing evidence.
- Each node has a label associated with it. The contents and structure of the label depend on the type of TMS.
- Other node properties are the node type (premise, assumption, etc.), the node support (justifications, antecedents), the node consequences, etc.
TMS justifications
- When N3 is created by the PS, it is reported to the TMS together with the fact that it follows from N1 and N2, as the justification:
(N3 N1 N2)
- Here N3 is called the consequent, and N1 and N2 are the antecedents of the justification.
- Justifications record relations among beliefs, and are used for explaining consequents and for identifying causes of inconsistencies.
- The general format of justifications is the following:
(<consequent> <antecedents>)
Propositional specification of a TMS
- TMS nodes are propositional variables.
- TMS justifications are propositional formulas:
N1 & N2 & … & Ni -> Nj
- Here N1, N2, …, Ni, Nj are positive literals; therefore this implication is a Horn formula.
- A TMS can be viewed as a set of Horn formulas.
PS / TMS interaction
Responsibilities of the PS:
1. Adds assertions and justifications.
2. Makes premises and assumptions.
3. Retracts assumptions.
4. Provides advice on handling contradictions.

Responsibilities of the TMS:
1. Caches beliefs and consequences and maintains labels.
2. Detects contradictions.
3. Performs belief revision.
4. Generates explanations.
Outline

Last lecture:
1. Consistency-based diagnosis
2. GDE – general diagnosis engine
3. Conflict generation using ATMS
4. Candidate generation

Today’s lecture:
1. What is TMS
2. TMS architecture
3. Justification-based TMS
4. Assumption-based TMS
Justification-based TMS
Justifications are used for:
- Belief update, when the belief state of a node changes.
- Contradiction handling:
1. The justification is added to the dependency-directed backtracking system.
2. The system then searches through the dependency network for the assumptions underlying the contradiction.
3. The contradiction is removed.
Justification-based TMS
- A justification contains an inlist and an outlist for a justified node to be believed:
  - inlist – a set of nodes that must be in
  - outlist – a set of nodes that must be out
- Syntax: {(inlist),(outlist)}
- Premises hold universally: empty inlist and outlist.
- There is only one context: the set of assumptions currently believed.
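The inlist/outlist check can be written directly. A sketch under the assumption that we already know which nodes are currently IN (real JTMS label propagation also handles cycles and retraction; this checks a single node only):

```python
def node_in(justifications, believed):
    """A node is IN if some justification has every inlist node IN
    and no outlist node IN.

    justifications: list of (inlist, outlist) tuples; believed: set of IN nodes.
    """
    return any(
        set(inlist) <= believed and not (set(outlist) & believed)
        for inlist, outlist in justifications
    )

believed = {"A", "C"}  # temperature >= 25, not raining

print(node_in([(("A", "C"), ())], believed))  # True: G "nice weather" is IN
print(node_in([((), ("B",))], believed))      # True: A's justification holds, B is OUT
```

The empty-inlist, empty-outlist case {(),()} is always satisfied, which is exactly why premises hold universally.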
Justification-based TMS – Example

Propositions:           Justifications:
A: Temperature >= 25    {(),(B)}
B: Temperature < 25
C: Not raining          {(),(D)}
D: Raining
E: Day                  {(),(F)}
F: Night

The PS concludes "nice weather" from A and C.
Justification-based TMS – Example

Propositions:           Justifications:
A: Temperature >= 25    {(),(B)}
B: Temperature < 25
C: Not raining          {(),(D)}
D: Raining
E: Day                  {(),(F)}
F: Night
G: Nice weather         {(A,C),()}

A new node in the JTMS.
Justification-based TMS – Example

Propositions:           Justifications:
A: Temperature >= 25    {(),(B)}
B: Temperature < 25
C: Not raining          {(),(D)}
D: Raining
E: Day                  {(),(F)}
F: Night
G: Nice weather         {(A,C),()}

The PS concludes "swim" from E and G.
Justification-based TMS – Example

Propositions:           Justifications:
A: Temperature >= 25    {(),(B)}
B: Temperature < 25
C: Not raining          {(),(D)}
D: Raining
E: Day                  {(),(F)}
F: Night
G: Nice weather         {(A,C),()}
H: Swim                 {(E,G),()}
Justification-based TMS – Example

Propositions:           Justifications:
A: Temperature >= 25    {(),(B)}
B: Temperature < 25
C: Not raining          {(),(D)}
D: Raining
E: Day                  {(),(F)}
F: Night
G: Nice weather         {(A,C),()}
H: Swim                 {(E,G),()}
I: Contradiction        {(C),()}
Justification-based TMS – Example

Propositions:           Justifications:
A: Temperature >= 25    {(),(B)}
B: Temperature < 25
C: Not raining          {(),(D)}
D: Raining
E: Day                  {(),(F)}
F: Night
G: Nice weather         {(A,C),()}
H: Swim                 {(E,G),()}
I: Contradiction        {(C),()}
X: Handle               {(),()}     // premise
D: Raining              {(X),()}
Justification-based TMS – Example

Propositions:           Justifications:
A: Temperature >= 25    {(),(B)}
B: Temperature < 25
C: Not raining          {(),(D)}
D: Raining
E: Day                  {(),(F)}
F: Night
G: Nice weather         {(A,C),()}
H: Swim                 {(E,G),()}
I: Contradiction        {(C),()}
X: Handle               {(),()}     // premise
D: Raining              {(X),()}
J: Read                 {(D,E),()}
K: Contradiction        {(J),()}    // becomes tired
Justification-based TMS – Example

Propositions:           Justifications:
A: Temperature >= 25    {(),(B)}
B: Temperature < 25
C: Not raining          {(),(D)}
D: Raining
E: Day                  {(),(F)}
F: Night
G: Nice weather         {(A,C),()}
H: Swim                 {(E,G),()}
I: Contradiction        {(C),()}
X: Handle               {(),()}     // premise
D: Raining              {(X),()}
J: Read                 {(D,E),()}
K: Contradiction        {(J),()}    // becomes tired
F: Night                {(X),()}
Justification-based TMS – Example

Propositions:           Justifications:
A: Temperature >= 25    {(),(B)}
B: Temperature < 25
C: Not raining          {(),(D)}
D: Raining
E: Day                  {(),(F)}
F: Night
G: Nice weather         {(A,C),()}
H: Swim                 {(E,G),()}
I: Contradiction        {(C),()}
X: Handle               {(),()}     // premise
D: Raining              {(X),()}
J: Read                 {(D,E),()}
K: Contradiction        {(J),()}    // becomes tired
F: Night                {(X),()}
L: Sleep                {(F),()}
Outline

Last lecture:
1. Consistency-based diagnosis
2. GDE – general diagnosis engine
3. Conflict generation using ATMS
4. Candidate generation

Today’s lecture:
1. What is TMS
2. TMS architecture
3. Justification-based TMS
4. Assumption-based TMS
Assumption-based TMS: Motivation
- Problem solvers need to explore multiple contexts at the same time, instead of a single one (the JTMS case):
  - Alternate diagnoses of a broken system
  - Different design choices
  - Competing theories to explain a set of data
- Problem solvers need to compare contexts, switching from one context to another:
  - In a JTMS, this can be done by enabling and retracting assumptions.
  - In an ATMS, alternative contexts are explicitly stored.
The idea behind ATMS
- The assumptions underlying conclusions are important in problem solving:
  - Solutions can be described as sets of assumptions.
  - States of the world can be represented by sets of assumptions.
- Identify sets of assumptions, called here environments.
- Organize the problem solver around manipulating environments.
- This facilitates reasoning with multiple hypotheses.
Assumptions and Justifications
- The ATMS keeps and manipulates sets of assumptions rather than sets of beliefs.
- Three types of nodes:
  - Premise nodes. These are always true, but they are of no special interest to the ATMS.
  - Assumption nodes. Once made, assumptions are never retracted.
  - Contradictions. These are defined by means of the assumptions that originate them. Such sets of assumptions are called nogoods.
- ATMS justifications are Horn formulas of the form:
Jk: I1, I2, …, In -> Ck,
where I1, I2, …, In are the antecedents, and Ck is the consequent of justification Jk.
Basic ATMS terminology
- The ATMS answers queries about whether a node holds in a given set of beliefs.
- Definition. A set of assumptions upon which a given node depends is called an environment. Example: {A,B,C}
- Definition. A label is a set of environments. Example: {{A,B,C}, … ,{D,F}}
That is, the label is the set of assumptions upon which the node ultimately depends – a major difference from the JTMS, where labels are simply :IN or :OUT.
- Definition. An ATMS node Nk is a triplet <datum, label(status), justifications>.
Basic ATMS terminology
Definition. A node n holds in a given environment E iff it can be derived from E given the set of justifications J: E,J ⊢ n.
An environment is inconsistent if false can be derived from it: E,J ⊢ ⊥.
Definition. Let E be a (consistent) environment, and let N be the set of nodes derived from E. Then E ∪ N is called the context of E.
Definition. A characterizing environment is a minimal consistent environment from which a context can be derived.
Each context is completely specified by its characterizing environment.
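The "context of E" definition can be sketched as a Horn-clause fixed point: start from the environment's assumptions and apply justifications until nothing new is derivable. The justification set J below is illustrative, not taken from a particular slide:

```python
def context(environment, justifications):
    """Context of E: E together with every node derivable from E (E ∪ N)."""
    derived = set(environment)
    changed = True
    while changed:
        changed = False
        for antecedents, consequent in justifications:
            if set(antecedents) <= derived and consequent not in derived:
                derived.add(consequent)
                changed = True
    return derived

# Two illustrative justifications: A,C -> E and E,D -> F.
J = [(("A", "C"), "E"), (("E", "D"), "F")]

print(context({"A", "C", "D"}, J))        # contains both E and F
print("F" in context({"A", "C"}, J))      # False: F needs D as well
```

A node n then holds in E exactly when n is a member of context(E, J).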
ATMS efficiency
- The ATMS is provided with a set of assumptions and justifications.
- The task of the ATMS is to determine the contexts efficiently:
  - Incrementally update only the changed contexts.
  - Use data structures that make context-consistency checking and node inclusion very fast.
Relations between environments
Because environments are monotonic, set inclusion
between environments implies logical subsumption
of consequences.
Example:
E1 = {C}
E2 = {C, D}
E3 = {D, E}
E1 subsumes E2.
E2 is subsumed by E1.
E1 neither subsumes nor is subsumed by E3.
How ATMS answers queries
How does the ATMS answer queries about whether a node holds in a given environment?
- Easiest way: associate with each node all of its environments.
- Better way: record only those environments which satisfy the following four properties:
1. Soundness: the node holds in any of the environments associated with it.
2. Consistency: no environment is a nogood.
3. Completeness: every consistent environment in which the node holds is either associated with the node or is a superset of an associated environment.
4. Minimality: no environment is a subset of any other.
ATMS labels
Example dependency network:
[Figure: assumptions A, B, C, D feed intermediate nodes with labels {{A}}, {{B,C}} and {{C,D}}; these in turn justify H, whose label is {{A},{B,C,D}}.]
Is H believed?
Yes, because its label is non-empty.
Is H believed under {B, C, D, Z, X}?
Yes, because {B, C, D} ⊆ {B, C, D, Z, X}.
Is H believed under {C, D}? No.
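The three queries above reduce to subset tests against H's label. A minimal sketch:

```python
def believed_under(label, query_env):
    """A node is believed under an environment Q iff some environment
    in its label is a subset of Q."""
    return any(env <= set(query_env) for env in label)

label_H = [frozenset({"A"}), frozenset({"B", "C", "D"})]

print(len(label_H) > 0)                                   # True: H is believed
print(believed_under(label_H, {"B", "C", "D", "Z", "X"}))  # True
print(believed_under(label_H, {"C", "D"}))                 # False
```

Because labels are kept minimal, each query costs only a few subset checks.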
Contradictions
- Certain nodes can be declared to be contradictions:
  - Every environment which allows a contradiction is inconsistent.
  - Inconsistent environments are called nogoods.
- Example:
[Figure: nodes F and G with labels {{A,B}} and {{B,C}}; together they support a contradiction, making the combined environment {A,B,C} a nogood.]
Special labels in ATMS
Case 1: Label = { } (empty label)
This means that there is no known consistent
environment in which the node is believed, i.e. either
there is no path from assumptions to it, or all
environments for it are inconsistent.
Case 2: Label = {{}} (empty environment)
This means that the node is believed in every consistent
environment, i.e. the node is either a premise or can be
derived strictly from premises.
Label propagation
[Figure: dependency network over assumptions A, B, C, D and derived nodes R, G, H, I, K, L; no labels are computed yet.]
Label propagation: enable A
[Figure: A is labeled {{A}}, and the label {{A}} propagates to its consequence G.]
Label propagation: enable B
[Figure: with B enabled, G's label becomes {{A}, {B}}.]
Label propagation: enable C
[Figure: with C enabled, I and K are labeled {{C}}, and H, justified by G and I, gets the label {{A,C}, {B,C}}.]
Label propagation: enable D
[Figure: with D enabled, K's label becomes {{C}, {D}}; H keeps the label {{A,C}, {B,C}}.]
Example:
For a worked example of a table of datum, environments and justifications, see Franz Wotawa’s slides, pages 7-8.
Properties of ATMS
Environment Lattice
[Figure: the lattice of environments ordered by set inclusion.]

Comments to lattice
- If an environment is nogood, then all of its superset environments are nogood as well.
  - All nogoods here are the result of the nogood {A, B, E}.
- The ATMS associates every datum with its contexts. If a datum is in a context, then it is in every superset as well (the inconsistent supersets are ignored).
Comments to lattice
- The circled nodes indicate all the contexts of one datum, and the square nodes all the contexts of another.
- If the PS infers y=0 from x+y=1 and x=1, then the context for y=0 is the intersection of the contexts of the above.
Comments to lattice
- One sound and complete label for the consequent is the set whose elements are the unions of all possible combinations of picking one environment from each antecedent node label.
- The environment {A, B, C, D} is removed because it is subsumed by {A, B, C}.
- The environment {A, B, D, E} is not included because it is a superset of the inconsistent {A, B, E}.
ATMS algorithms
Logical specification of the ATMS:
- The ATMS does propositional reasoning over nodes.
- ATMS justifications are Horn clauses.
- Contradictions are characterized by nogoods.
Every ATMS operation which changes a node label can be viewed as adding a justification, i.e. the only operation we have to be concerned with here is the label update that results from adding a justification.
ATMS algorithms
- Step 1: Compute a tentative new (locally correct) label for the affected node as follows. Let J_ik be the label of the i'th antecedent node of the k'th justification for consequent node n. Then a complete label for node n is:
L_new = ∪_k { x | x = ∪_i x_i, where x_i ∈ J_ik }
- Step 2: All nogoods and subsumed environments are removed from L_new to achieve global correctness.
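Both steps can be sketched directly on sets. The two antecedent labels and the nogood below are illustrative, not taken from a particular slide:

```python
from itertools import product

def combine_labels(antecedent_labels, nogoods):
    # Step 1: union one environment picked from each antecedent label,
    # over all possible combinations of picks.
    tentative = {
        frozenset().union(*pick)
        for pick in product(*antecedent_labels)
    }
    # Step 2a: drop environments containing a nogood.
    consistent = {e for e in tentative
                  if not any(set(n) <= e for n in nogoods)}
    # Step 2b: drop environments subsumed by a strictly smaller one.
    return {e for e in consistent
            if not any(other < e for other in consistent)}

label_x = [frozenset({"A", "B"}), frozenset({"C"})]
label_y = [frozenset({"B", "C"}), frozenset({"D"})]
print(combine_labels([label_x, label_y], nogoods=[{"A", "D"}]))
```

Here {A,B,D} is removed as a superset of the nogood {A,D}, and {A,B,C} is removed because it is subsumed by the smaller {B,C}, leaving {{B,C},{C,D}}.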
Propagating label changes
- To update node Ni, compute its new label as described.
- If the label has not changed: DONE.
- Else:
  - If Ni is a contradiction node:
    - Mark all environments in its label as nogoods.
    - For every node in the network, check its label for environments marked as nogoods and remove them from that node's label.
  - Else:
    - Recursively update all of Ni's consequences (other nodes having justifications which mention Ni).
Example

Assumptions:
#: Proposition:         Label:   Justifications:
A: Temperature >= 25    {{A}}    {(A)}
B: Temperature < 25     {{B}}    {(B)}
C: Not Raining          {{C}}    {(C)}
D: Day                  {{D}}    {(D)}

Derived facts:
E: Nice weather
F: Swim
G: Read
H: Sleep

Rules:
1. A and C -> E
2. E and D -> F
3. D and out(C) -> G
4. out(D) -> H
5. A and B -> ⊥
Example – empty environment

Assumptions:
#: Proposition:         Label:        Justifications:
A: Temperature >= 25    {{A}}         {(A)}
B: Temperature < 25     {{B}}         {(B)}
C: Not Raining          {{C}}         {(C)}
D: Day                  {{D}}         {(D)}
H: Sleep                {{Out(D)}}    {(Out(D))}

The empty environment caused the PS to provide out(D) as a justification for H.
The problem solver applies a breadth-first search strategy to provide the assumptions: the empty environment is provided first, then {A}, {B}, …, then {A,B}, {A,C}, …, then {A,B,C}, etc. The rules are fired by the PS as justifications for the assumptions.
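The breadth-first enumeration described above (empty environment first, then singletons, then pairs, and so on) can be sketched with itertools:

```python
from itertools import combinations

ASSUMPTIONS = ["A", "B", "C", "D"]

# Environments in breadth-first order: by increasing size, then lexicographic.
envs = [set(c)
        for size in range(len(ASSUMPTIONS) + 1)
        for c in combinations(ASSUMPTIONS, size)]

print(envs[:7])  # empty environment, then {A}, {B}, {C}, {D}, then the pairs
```

With 4 assumptions this yields all 16 environments; in practice the ATMS prunes this lattice by discarding nogoods and their supersets as soon as they appear.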
Example – {D}

Assumptions:
#: Proposition:         Label:          Justifications:
A: Temperature >= 25    {{A}}           {(A)}
B: Temperature < 25     {{B}}           {(B)}
C: Not Raining          {{C}}           {(C)}
D: Day                  {{D}}           {(D)}
H: Sleep                {{Out(D)}}      {(Out(D))}
G: Read                 {{D,Out(C)}}    {(D,Out(C))}

The environments {A}, {B} and {C} have not changed.
Example – {A,B}

Assumptions:
#: Proposition:         Label:          Justifications:
A: Temperature >= 25    {{A}}           {(A)}
B: Temperature < 25     {{B}}           {(B)}
C: Not Raining          {{C}}           {(C)}
D: Day                  {{D}}           {(D)}
H: Sleep                {{Out(D)}}      {(Out(D))}
G: Read                 {{D,Out(C)}}    {(D,Out(C))}
⊥:                      {{A,B}}         {(A,B)}

The PS will fire the fifth rule (A and B -> ⊥).
The ATMS will add this environment to the nogood DB.
Example – {A,C}

Assumptions:
#: Proposition:         Label:          Justifications:
A: Temperature >= 25    {{A}}           {(A)}
B: Temperature < 25     {{B}}           {(B)}
C: Not Raining          {{C}}           {(C)}
D: Day                  {{D}}           {(D)}
H: Sleep                {{Out(D)}}      {(Out(D))}
G: Read                 {{D,Out(C)}}    {(D,Out(C))}
E: Nice weather         {{A,C}}         {(A,C)}

The PS will fire the first rule (A and C -> E).
{A,D} and {B,D} could fire the third rule, but it has already been fired.
Example – {A,C,D}

Assumptions:
#: Proposition:         Label:          Justifications:
A: Temperature >= 25    {{A}}           {(A)}
B: Temperature < 25     {{B}}           {(B)}
C: Not Raining          {{C}}           {(C)}
D: Day                  {{D}}           {(D)}
H: Sleep                {{Out(D)}}      {(Out(D))}
G: Read                 {{D,Out(C)}}    {(D,Out(C))}
E: Nice weather         {{A,C}}         {(A,C)}
F: Swim                 {{A,C,D}}       {(E,D)}

- The PS will fire the second rule (E and D -> F).
- {A,B,C}, {A,B,D} and {A,B,C,D} are supersets of {A,B} and so are not fired.
- If Label(E) = {{A,C},{X,Y}}, then Label(F) = {{A,C,D},{X,Y,D}}.
Example – environment lattice
When does sleep (H) hold?
[Figure: environment lattice in which square nodes mark the environments where H holds, circled nodes those where G holds, and rhombus nodes those where F holds.]
Back to Diagnosis…
[Figure: the polybox circuit. Multipliers M1 (x = A*C), M2 (y = B*D), M3 (z = C*E) feed adders A1 (F = x+y) and A2 (G = y+z).]

Premises: A=3, B=2, C=2, D=3, E=3, each with label {{}}. Observations: F=10, {{}} and G=12, {{}}.

Derived values and their labels:
x=6, {{M1}}               x=4, {{A1,M2},{A1,A2,M3}}
y=6, {{M2},{A2,M3}}       y=4, {{A1,M1}}
z=6, {{M3},{A2,M2}}       z=8, {{A1,A2,M1}}
F=12, {{A1,M1,M2}}        G=10, {{A1,M1,M3,A2}}        G=12, {{A2,M2,M3}}

NOGOODS: {A1,M1,M2}, {A1,M1,M3,A2}
Bibliography
1. Kenneth D. Forbus and Johan de Kleer, Building Problem Solvers, The MIT Press, 1993.
2. Johan de Kleer, An Assumption-based Truth Maintenance System, Artificial Intelligence 28, 127-162, 1986.
3. Johan de Kleer, Problem Solving with the ATMS, Artificial Intelligence 28, 197-224, 1986.
4. Johan de Kleer, Extending the ATMS, Artificial Intelligence 28, 163-196, 1986.
5. Mladen Stanojevic, Sanja Vranes and Dusan Velasevic, Using Truth Maintenance Systems: A Tutorial, IEEE Expert: Intelligent Systems and Their Applications 9(6), 45-56, 1994.